GG with SQL - GoldenGate

Hi,
I'm using GG with the FILTER command and I would like to know the internal process of filtering. If I use FILTER or a SQLEXEC command in an Extract, is that particular filter condition executed against the database table, or is it run against the log file itself?
Also, is it possible to manipulate the data in the redo log file itself? If not, does the particular filter condition run for every commit? In that case the command would always run against the DB. Could you please clarify how filter, sqlexec, or sqlprdicat run inside of the database?
Will there be any performance issue?
Thanks in Advance,
Annamalai 

Annamalai,
This appears to be a double post so I'll only answer your new questions here.
The SQLEXEC query is executed in the database; there is no method (that I know of) to query the transaction logs, and OGG definitely doesn't do it. We simply read them in chunks and parse the LCRs.
FILTER's simpler cousin, WHERE, sometimes confuses first-timers because it is not a database WHERE clause but simply a conditional filtering construct applied to the change data.
There is no "sqlprdicat", but if you were trying to save a second with shorthand, it cost me 10 seconds to look it up to make sure. Please do not shorten words, especially when they relate to syntax and parameters.
SQLPREDICATE is used during initial loads, which essentially issue a SELECT * and convert the results into insert statements; it allows you to add a database WHERE clause to that SELECT * query.
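For example, a minimal initial-load TABLE statement (table name and predicate below are hypothetical):

TABLE scott.emp, SQLPREDICATE "WHERE deptno = 10";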
To answer your final question: yes, any time replication stops to issue a database query via SQLEXEC, it will be slower than not stopping to issue a query. You can run two Extracts side by side, one with the SQLEXEC and one without, to determine the delta on your particular systems.
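As a rough sketch of that comparison (all object, trail, and login names here are hypothetical):

-- Extract A: filters via a SQLEXEC lookup against the database
EXTRACT exta
USERID ogguser, PASSWORD oggpass
EXTTRAIL ./dirdat/aa
TABLE myschema.orders, &
SQLEXEC (ID chk, QUERY "select status from myschema.order_status where order_id = :p1", PARAMS (p1 = order_id)), &
FILTER (@STRFIND (@GETVAL (chk.status), "OPEN") > 0);

-- Extract B: identical but with no SQLEXEC, as the baseline
EXTRACT extb
USERID ogguser, PASSWORD oggpass
EXTTRAIL ./dirdat/bb
TABLE myschema.orders;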
Have fun,
-joe 

Thanks a lot Joe, for your valuable information.


Related

Data Filtering, Transformation & Manipulation

Dear OGG Experts, I am getting a bit confused regarding data transformation and manipulation in OGG. I am searching for good examples to understand it efficiently. Can anyone help me get detailed information about filtering, transformation and manipulation at the schema, table, column and row level? Thanks in advance.
For a general description, see https://docs.oracle.com/goldengate/1212/gg-winux/GWUAD/wu_datainteg.htm#GWUAD391. See the reference guide for the MAP and TABLE parameters in OGG to do nearly all transformations.
OGG allows you to capture a table from any supported database and replicate it to any supported database, irrespective of whether the data structure is the same or otherwise. So a simple 'default' transformation, with for example tablex having more columns than tabley, is this parameter:
MAP myschema.tablex, TARGET otherschema.tabley;
This basically replicates one schema and table to another, and OGG simply ignores the extra columns for you. Just an example of such a data transformation. You can also choose to, say, insert all the records, whether the operation is an insert, update or delete:
MAP myschema.tablex, TARGET otherschema.tabley, INSERTALLRECORDS;
The possibilities are endless. See also the section of the reference on 'column conversion functions', where one can concatenate strings, compare them and so on before doing anything.
Cheers
Kee Gan
Hi K.Gan, I do not have much idea regarding filtering and transformation. I have read some docs. Some links recommend using a data pump and some do not.
1) If we use a data pump, by default pass-through mode is enabled - am I right? The PASSTHRU parameter isn't supported with OGG. For the data pump's pass-through mode, both source and target table names and structures must be the same, and no filtering operations can be performed on the data. If data filtering or transformation is required, it's a good idea to do this with the data pump to reduce the amount of data sent across the network.
2) Can we use the data pump option (without passthrough) to perform data filtering and transformation?
1) If we use a data pump, by default pass-through mode is enabled - am I right?
> There is no default; an extract pump decides whether it is using passthru or otherwise. If you do any transformation you first need to use the USERID parameter, which enables the extract pump to create a session with the database. You should get a baseline as to how much data is captured. Just capture what you need without any change and see what sort of volumes you get. Filtering is reasonably fast, like using a WHERE clause. Data transformation, for example concatenating strings, takes more time. So you trade volume against CPU time. You will find that complex transformation is easier if done at the replicat side.
2) Can we use the data pump option (without passthrough) to perform data filtering and transformation?
> Yes, you can do this. However, you can also do that in the main extract. The things you can do in an extract pump can be done in the main extract. Maybe if you describe what you are doing with OGG then I may be able to give you some suggestions.
Cheers
Kee Gan
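As a minimal sketch of the two pump styles discussed above (host, trail, table and login names are hypothetical):

-- pass-through pump: no database session, no filtering or transformation
EXTRACT pmp1
PASSTHRU
RMTHOST targethost, MGRPORT 7809
RMTTRAIL ./dirdat/rb
TABLE myschema.*;

-- pump that filters: needs a database session via USERID
EXTRACT pmp2
USERID ogguser, PASSWORD oggpass
NOPASSTHRU
RMTHOST targethost, MGRPORT 7809
RMTTRAIL ./dirdat/rc
TABLE myschema.orders, WHERE (STATUS = "OPEN");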
Hi K.Gan,
> Maybe if you describe what you are doing with OGG then I may be able to give you some suggestions.
I am trying to understand / learn data selection, filtering and transformation. For example, I do not understand what is given below:
To transform data: native Oracle GoldenGate conversion functions, or a user exit from the Extract or Replicat process that applies rules from an external transformation solution, then returns the manipulated data to Oracle GoldenGate.
To filter data: a SQLEXEC query or procedure, or user exits.
Without knowing the basics (transformations), I am struggling to understand it. If you wish to share some good links or a good doc id, please share it.
Which parts of https://docs.oracle.com/goldengate/1212/gg-winux/GWUAD/wu_datainteg.htm#GWUAD391 do you need clarification on? This section explains it reasonably well. The user exit is a really advanced way to configure OGG. Where did you get these statements from?
Cheers
Kee Gan
Hi,
> Do you need clarification?
Yes. I am much confused about filtering and transformations. Which part is essential to concentrate on initially? Do I need end-to-end knowledge of filtering and transformations? I need some suggestions.
> Where did you get these statements from?
Mastering Oracle GoldenGate Real-time Replication and Data Transformation by Ravinder Gupta, and http://apps2fusion.com/training_demo/golden-gate-training/Oracle-Golden-Gate-Learning-Guide.pdf (Data selection, filtering and transformation). Thanks
Ok, you can ignore the user exit bits. User exit means you can embed your own C++ code into OGG to do your customized processing, and it does not stop at data transformation. It's code, so you can do virtually anything. You don't need end-to-end knowledge for data transformation and filtering. For example:
A simple WHERE clause is filtering. Actual OGG parameter, e.g.:
TABLE MYSCHEMA.MYTABLE, WHERE (ID_NUM > 2000);
A simple concatenation of two strings into a target column is transformation, e.g. this maps two different tables, and a column in TABLEB is made of two columns in TABLEA:
MAP SCHEMX.TABLEA, TARGET SCHEMAY.TABLEB, COLMAP (USEDEFAULT, newcol = @STRCAT (col3, col4));
This is basically what it means. You can do the simplest to crazy off-the-chart stuff. In fact, by default OGG already does a lot of filtering and transformation for you. Simply saying which table you want is filtering, and the replicat automatically matches column names and datatypes. And if you capture from Oracle to DB2, for example, the configuration looks very simple but obviously OGG will need to do a lot of work. If you want to learn OGG, start by installing it and doing some replication. Then try different table structures on both ends.
Cheers
Kee Gan
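Putting the two together, a minimal replicat sketch might look like this (all names hypothetical; since the source and target structures differ, assume a definitions file generated with DEFGEN):

REPLICAT rep1
USERID ogguser, PASSWORD oggpass
SOURCEDEFS ./dirdef/tablea.def
MAP SCHEMX.TABLEA, TARGET SCHEMAY.TABLEB, &
WHERE (ID_NUM > 2000), &
COLMAP (USEDEFAULT, newcol = @STRCAT (col3, col4));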
Hi K.Gan, I am extremely sorry for the very late reply. I was out of station for a week. Thanks for your suggestion.

GG

Hi,
I'm using GG with a broadcast mechanism. My doubt here is related to DB performance for GG filtering.
If I'm using SQLEXEC or FILTER with STRFIND in GG, will the particular command run for every commit operation, or is there any option for manipulating the data in the redo log file itself? If the filter runs for every operation, there should be some performance impact, since the filtering query always runs against the particular DB tables.
To make it simple: if we are going to use a filter inside the extract or pump, will the particular filter run against DB tables, or is there any other way for GG to filter from the log file?
Can anyone please clarify?
Hi annamalai.
The only way to "change" the data in the redo log files is to do database operations.
As for the filtering, seeing that you're using STRFIND it seems you may already have a pretty good idea of how we filter at the operation level, which is the only way right now. However, you can store the result of that filter as a user token and it will be replicated along with the row change data. You can then filter on the token in your pump - much faster than running the SQLEXEC query again. And you can also filter on the replicat with the token if needed.
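A minimal sketch of that token pattern (table, column and token names are hypothetical):

-- Extract: evaluate the condition once and stash the result as a token
TABLE myschema.orders, TOKENS (TK_MATCH = @IF (@STRFIND (notes, "URGENT") > 0, "Y", "N"));

-- Pump (or Replicat): filter on the token instead of re-evaluating
TABLE myschema.orders, FILTER (@STRCMP (@TOKEN ("TK_MATCH"), "Y") = 0);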
Hope this answers your question.
-joe 
Yes, correct, this is what I was expecting. Once again, thanks Joe.

Runtime auto-increment sequence

I am currently facing two problems:
1) In my backend, I want to make a trigger which, on insert of a record, creates an auto-increment sequence, i.e. initializes it.
2) Another trigger: on insert of a record, it increments the auto-increment sequence that was created earlier.
These two triggers work on different tables (FYI).
Now how can I make the auto-increment sequence which is set up and initialized at run time in a trigger at the backend?
PS: I am using Oracle 10g
Please do not crucify me for posting this query here in this section. :D
Thanks 
Oracle provides something called a sequence for this purpose, but it is not recommended to create the sequence inside a trigger. A sequence is a database object (in Oracle vocabulary), so creating one is a DDL command, and DDL implicitly commits all the instructions that were executed earlier in the transaction.
So you will lose the rollback option for the record which has been inserted.
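A minimal sketch of the usual 10g pattern instead: create the sequence once up front, then read it in a BEFORE INSERT trigger (object names here are hypothetical):

CREATE SEQUENCE my_seq START WITH 1 INCREMENT BY 1;

CREATE OR REPLACE TRIGGER my_table_bi
BEFORE INSERT ON my_table
FOR EACH ROW
BEGIN
  -- 10g has no identity columns, so populate the key from the sequence
  SELECT my_seq.NEXTVAL INTO :NEW.id FROM dual;
END;
/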
Can you please provide some more details? I couldn't follow you. I haven't read much about runtime sequences...
Thanks

Replication based on condition for whole Schema

Hi There
Oracle 11.1.0.7
We are using Oracle Spatial and we have huge tables, more than 50 GB in size. I just started configuring GG on test and I would like to know if it suits our requirement.
In this regard I have certain queries:
1) Can I configure GG for a whole schema based on conditions? For example, in the schema there are 300 tables; out of those I want 250 to replicate to the target as-is, and the other 50 should replicate only where approved = 'YES'. Is this possible?
2) If it's possible to filter data during replication, I would like to know if we can filter using a query which has a JOIN.
3) Should the target database (Oracle 11.1.0.7) have the tables with the data as of before starting GG, so that when we start replicating, both source and target are exactly the same?
I would appreciate it if someone could post a response.
Thanks 
You can filter data using a WHERE clause, or even using SQLEXEC, mapping OUT values to target columns. A join could be handled in the output from a stored procedure.
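For instance, a hedged sketch of the stored-procedure approach (procedure, parameter and column names are hypothetical; the procedure would perform the join and return the result through an OUT parameter):

MAP src.orders, TARGET tgt.orders, &
SQLEXEC (SPNAME lookup_proc, PARAMS (p_code = code)), &
COLMAP (USEDEFAULT, descr = @GETVAL (lookup_proc.p_descr));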
You can do all tables or a portion thereof. If names are similar, they can be wildcarded. Otherwise, list them.
Not sure what you mean by APPROVED.
For initial load, the stock answer is this:
GoldenGate recommends using native RDBMS tools for the initial load - restore a backup, that kind of drill. You can use one of the four GG initial load options, but given the size you mentioned, it would probably be much faster to create the target via Oracle Data Pump or RMAN.
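As a hedged sketch of the common Data Pump flow (schema name, file name and SCN below are placeholders): export at a known SCN, import on the target, then start the Replicat for changes after that point only:

expdp system SCHEMAS=myschema DUMPFILE=init.dmp FLASHBACK_SCN=1234567
-- after impdp completes on the target, in GGSCI:
START REPLICAT rep1, AFTERCSN 1234567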
Hi Steven
Thanks for the information!
I tried using WHERE and it worked perfectly for the condition APPROVED = "YES". However, there are many tables that have to be filtered on certain other conditions.
I tried using the wildcard for tables starting with MAJ*. However, I would like to know how I can specify the list of tables in the Extract parameter file and use a different WHERE clause for each of those tables. For example:
TABLE scott.emp, WHERE (EMPNO = 123);
TABLE scott.dept, WHERE (DEPTNO = 345);
Thanks 
You'll have to code out each WHERE clause in the TABLE parameter. A different filter for each table means a separate TABLE parameter with its corresponding WHERE clause.
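If it helps, a minimal sketch of combining per-table conditions with a wildcard for the rest (assuming, hypothetically, that the conditional tables are kept out of the wildcard via TABLEEXCLUDE):

TABLE scott.emp, WHERE (EMPNO = 123);
TABLE scott.dept, WHERE (DEPTNO = 345);
-- everything else in the schema replicates as-is
TABLEEXCLUDE scott.emp;
TABLEEXCLUDE scott.dept;
TABLE scott.*;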
I think you may try this way:
TABLE SCOTT.EMP, FILTER (@STRFIND (EMPNO, "123") > 0);
TABLE SCOTT.DEPT, FILTER (@STRFIND (DEPTNO, "456") > 0);
Thanks. 
Thanks

Create replication without copying the initial data (empty target)

Hello
I have a 100 GB table that I would like to replicate. I don't need the current data and I would like to start with an empty table on the target site.
Inserts are successfully replicated, while updates fail with "no_data_found", and then I cannot recover the replication.
Is there a GG parameter that allows that situation and will not break the replication?
Thanks in advance
Dani 
943325 wrote:
Hello
I have a 100 GB table that I would like to replicate. I don't need the current data and I would like to start with an empty table on the target site.
Inserts are successfully replicated, while updates fail with "no_data_found", and then I cannot recover the replication.
Is there a GG parameter that allows that situation and will not break the replication?
Of course it is going to abend. Now it all depends upon what exactly you want to do with those updates/deletes. If you don't need them, you may consider using IGNOREDELETES & IGNOREUPDATES on the source side. In that case Extract will not capture updates and deletes.
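A minimal extract-side sketch of that (table name hypothetical; GETUPDATES and GETDELETES restore the default for any tables listed afterwards):

IGNOREUPDATES
IGNOREDELETES
TABLE myschema.bigtable;
GETUPDATES
GETDELETES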
I am assuming in this case he only wants updates and deletes for rows he has already replicated as inserts, so these wouldn't be helpful.
#OP: Assuming I am right in my above statement, you can use HANDLECOLLISIONS on the replicat side to ignore these.
Else you can use
REPERROR (1403, IGNORE) to ignore SQL error 1403.
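A minimal replicat-side sketch of those two options (the MAP is hypothetical; pick one approach rather than stacking both):

REPLICAT rep1
USERID ogguser, PASSWORD oggpass
-- option 1: resolve missing-row collisions automatically
HANDLECOLLISIONS
-- option 2: ignore ORA-01403 (no data found) instead
-- REPERROR (1403, IGNORE)
MAP myschema.bigtable, TARGET myschema.bigtable;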
Greetings,
N K 
N K wrote:
I am assuming in this case he only wants updates and deletes for rows he has already replicated as inserts, so these wouldn't be helpful.
Aaah, totally, absolutely missed that! Stupid of me! Having a gun doesn't mean one should start shooting immediately! :P
