GG - GoldenGate

Hi,
I'm using GG with a broadcast mechanism. My doubt is related to DB performance for GG filtering.
If I'm using SQLEXEC or FILTER with STRFND in GG, will that particular command run for every commit operation, or is there an option for manipulating the data in the redo log file itself? If the filter runs for every operation there should be some performance impact, since the filtering query would always run against the particular DB tables.
To put it simply: if we use a filter inside the extract or pump, will that filter run against the DB tables, or is there another way for GG to filter from the log file?
Can anyone please clarify?

Hi annamalai.
The only way to "change" the data in the redo log files is to do database operations.
As for the filtering, seeing that you're using STRFND it seems you may already have a pretty good idea of how we filter at the operation level, which is the only way right now. However, you can store the results of that filter as a user token and it will be replicated along with the row change data. You can then filter on the token in your pump - much faster than running the SQLEXEC query again. And you can also filter on the replicat with the token if needed.
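To make Joe's token approach concrete, here is a minimal sketch under assumed names; the table src.orders, the lookup query, and the token tk_region are all hypothetical:

-- primary extract: run the lookup once and store the result as a user token
TABLE src.orders,
  SQLEXEC (ID lookup, QUERY 'SELECT region FROM src.order_region WHERE order_id = :oid',
           PARAMS (oid = order_id)),
  TOKENS (tk_region = @GETVAL(lookup.region));

-- pump (or replicat): filter on the token instead of re-running the query
TABLE src.orders, FILTER (@STRFND(@TOKEN('tk_region'), 'EAST') > 0);

Because the token travels in the trail record, the pump can filter without re-running the query against the database.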
Hope this answers your question.
-joe 

Yes, correct, this is what I was expecting. Once again, thanks Joe.



Weird Problem? Replicating Triggers

Hello
We are having a strange issue with DDL replication.
Version 11.1.1.1.2_03 13555424 OGGCORE_11.1.1.1.3_PLATFORMS_120104.0600
Release 11.2.0.3.0 Production
The problem is that when we create triggers on the source schema, they are only sometimes replicated, and we are unable to find the cause.
On the source we have the option DDL INCLUDE ALL, and on our replicat the configuration is as below:
DDL INCLUDE MAPPED
DDLERROR DEFAULT IGNORE RETRYOP
MAP SOURCE.*, TARGET.*;
We do have DBOPTIONS SUPPRESSTRIGGERS, but I would not expect this to be a problem at all.
Tables work fine; views, sequences and other kinds of objects work properly as well.
Any ideas?
Thanks in advance,
N K 
MAPPED as a scope would be a subset of ALL, so that is weird. Can you try using MAPPED in the extract so you have MAPPED --> MAPPED instead of ALL --> MAPPED? 
stevencallan wrote:
MAPPED as a scope would be a subset of ALL, so that is weird. Can you try using MAPPED in the extract so you have MAPPED --> MAPPED instead of ALL --> MAPPED?
We had this in our previous configuration, but it didn't replicate any DDL other than tables, so we changed to this one.
Let me give a clear example.
If I run a CREATE TRIGGER statement in SQL*Plus, it WILL NOT be captured.
If I do it from Toad, the only way to see it replicate is to do "Execute as a Script" and then immediately "Execute Statement" (yes, two times!!).
Any other way will not replicate the trigger.
Still looking for ideas on the cause of this!
On the source, put this in the extract parameter file:
DDL INCLUDE ALL EXCLUDE OBJTYPE 'TRIGGER' 
Have you even read my problem?
I WANT to replicate triggers (CREATE/DROP/ALTER); it's just that sometimes it doesn't work. How would excluding these operations help me?
The question was not very clear as to what you want as the end result (to replicate triggers or not to replicate them).
"The problem is that when we create triggers on the source schema, they are only sometimes replicated, and we are unable to find the cause."
So, now that I know you really want to replicate the triggers, let me know which user you were connected as when running CREATE TRIGGER from SQL*Plus and Toad. Also, please paste your complete extract parameter file.
I posted all the information that I know is necessary to troubleshoot this issue. Other options will not impact this, since this is not a "works/doesn't work" situation; it is an intermittent issue.
I execute this DDL with a user which has DBA privileges. Any other DDL operation works just fine, as already said (views, tables, sequences... all of them work!). But triggers won't.
As requested, post the full extract parameter file for us to diagnose. This is something Oracle Support would have asked for first.
A small confusion: the only way they get replicated is from Toad, right? There is no way to make it happen if you are creating the trigger from SQL*Plus, right?
Also check whether a parameter like this is defined in the extract parameter file:
TRANLOGOPTIONS EXCLUDEUSER
You can also check the ggserr.log to see how the trigger DDLs are handled on the target. It will show whether the trigger DDL was received on the target and whether it was excluded, included, or errored out.
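For reference, a quick sketch of two standard places to look from GGSCI; the group name repddl is hypothetical:

GGSCI> VIEW REPORT repddl   -- the process report, which records DDL operations applied or skipped
GGSCI> VIEW GGSEVT          -- shows the ggserr.log event log from inside GGSCI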
amardeep.sidhu wrote:
A small confusion: the only way they get replicated is from Toad, right? There is no way to make it happen if you are creating the trigger from SQL*Plus, right?
This is correct, Amardeep. I've tried several times from SQL*Plus to no avail.
The only excluded user is the GG user, which is not the one I am using.

Data Filtering, Transformation & Manipulation

Dear OGG Experts,
I am getting a bit confused regarding data transformation and manipulation in OGG, and I am searching for good examples in order to understand it efficiently. Can anyone help me get detailed information about filtering, transformation and manipulation at the schema, table, column and row level?
Thanks in advance.
For a general description, see https://docs.oracle.com/goldengate/1212/gg-winux/GWUAD/wu_datainteg.htm#GWUAD391 and see the reference guide for the MAP and TABLE parameters in OGG, which cover nearly all transformations.
OGG allows you to capture a table from any supported database and replicate it to any supported database, irrespective of whether the data structure is the same or otherwise. So a simple 'default' transformation, with for example tablex having more columns than tabley, is just:
MAP myschema.tablex, TARGET otherschema.tabley;
This replicates one schema and table to another, and OGG simply ignores the extra columns for you. That is one example of a data transformation. You can also choose to insert all the records, whether the operation is an insert, update or delete:
MAP myschema.tablex, TARGET otherschema.tabley, INSERTALLRECORDS;
The possibilities are endless. See also the section of the reference on 'column conversion functions', where one can concatenate strings, compare them before doing anything, and so on.
Cheers
Kee Gan
Hi K.Gan,
I do not have much of an idea regarding filtering and transformation. I have read some docs; some links recommend using a data pump and some do not.
1) If we use a data pump, is pass-through mode enabled by default - am I right? The PASSTHRU parameter cannot be used if filtering or transformation is required: in pass-through mode both source and target table names and structures must be the same, and no filtering operations can be performed on the data. If data filtering or transformation is required, it's a good idea to do it in the data pump to reduce the amount of data sent across the network.
2) Can we use the data pump (without pass-through) to perform data filtering and transformation?
> 1) If we use a data pump, is pass-through mode enabled by default - am I right?
There is no default; an extract pump decides whether it is using passthru or otherwise. If you do any transformation you first need to use the USERID parameter, which enables the extract pump to create a session with the database. You should get a baseline of how much data is captured: just capture what you need without any change and see what sort of volumes you get. Filtering is reasonably fast, like using a WHERE clause. Data transformation, for example concatenating strings, takes more time; so you trade volume against CPU time. You will find that complex transformation is easier if done on the replicat side.
> 2) Can we use the data pump (without pass-through) to perform data filtering and transformation?
Yes, you can do this. However, you can also do that in the main extract; the things you can do in an extract pump can be done in the main extract. Maybe if you describe what you are doing with OGG then I may be able to give you some suggestions.
Cheers
Kee Gan
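To make the distinction concrete, here is a minimal sketch of the two pump styles; the group names, host, trail paths and tables are all hypothetical:

-- pass-through pump: no DB session, and no filtering or transformation allowed
EXTRACT pmp1
PASSTHRU
RMTHOST tgthost, MGRPORT 7809
RMTTRAIL ./dirdat/rt
TABLE src.*;

-- filtering pump: USERID opens a database session so table metadata is available
EXTRACT pmp2
USERID ogg, PASSWORD ogg
RMTHOST tgthost, MGRPORT 7809
RMTTRAIL ./dirdat/rt
TABLE src.orders, WHERE (amount > 1000);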
Hi K.Gan,
> Maybe if you describe what you are doing with OGG then I may be able to give you some suggestions.
I am trying to understand / learn data selection, filtering and transformation.
EXAMPLE: I do not understand what is given below.
To transform data:
- Native Oracle GoldenGate conversion functions
- A user exit from the Extract or Replicat process that applies rules from an external transformation solution, then returns the manipulated data to Oracle GoldenGate.
To filter data:
- A SQLEXEC query or procedure
- User exits
Without knowing the basics (transformations), I am struggling to understand it. If you wish to share some good links or a good doc ID, please share them.
Which parts of https://docs.oracle.com/goldengate/1212/gg-winux/GWUAD/wu_datainteg.htm#GWUAD391 do you need clarification on? This section explains it reasonably well. The user exit is a really advanced way to configure OGG. Where did you get these statements from?
Cheers
Kee Gan
Hi,
> do you need clarification
Yes. I am very confused about filtering and transformations. At the very start, which part is essential to concentrate on? Do I need end-to-end knowledge of filtering and transformations? I need some suggestions.
> where did you get these statements from?
Mastering Oracle GoldenGate Real-time Replication and Data Transformation by Ravinder Gupta, http://apps2fusion.com/training_demo/golden-gate-training/Oracle-Golden-Gate-Learning-Guide.pdf (Data selection, filtering and transformation).
Thanks
OK, you can ignore the user exit bits. A user exit means you can embed your own C++ code into OGG to do your customized processing, and it does not stop at data transformation; with code you can do virtually anything. You don't need end-to-end knowledge for data transformation and filtering. For example, a simple WHERE clause is filtering. Actual OGG parameter, e.g.:
TABLE MYSCHEMA.MYTABLE, WHERE (ID_NUM > 2000);
A simple concatenation of two strings into a target column is transformation. For example, this maps two different tables, and a column in TABLEB is made of two columns from TABLEA:
MAP SCHEMAX.TABLEA, TARGET SCHEMAY.TABLEB, COLMAP (USEDEFAULT, newcol = @STRCAT(col3, col4));
This is basically what it means. You can do the simplest to crazy off-the-chart stuff. In fact, by default OGG already does a lot of filtering and transformation for you: simply saying which table you want is filtering, and the replicat automatically matches column names and datatypes. And if you capture from Oracle to DB2, for example, the configuration looks very simple but obviously OGG needs to do a lot of work. If you want to learn OGG, start by installing it and doing some replication, then try different table structures on both ends.
Cheers
Kee Gan
Hi K.Gan,
I am extremely sorry for the very late reply; I was out of station for a week. Thanks for your suggestions.

GG with SQL

Hi,
I'm using GG with the FILTER command, and I would like to know the internal process of the filter. If I'm using the FILTER or SQLEXEC command in the extract, is that particular filter condition executed against the database table, or does it run against the log file itself?
Also, is it possible to manipulate the data in the redo log file itself? If not, will the particular filter condition run for every commit? In that case the command always runs against the DB. Could you please clarify how the filter, sqlexec, or sqlprdicat run inside the database?
Will there be any performance issue?
Thanks in Advance,
Annamalai 
Annamalai,
This appears to be a double post so I'll only answer your new questions here.
The SQLEXEC query is executed in the database; there is no method (that I know of) to query the transaction logs, but OGG definitely doesn't do it. We simply read the log in chunks and parse the LCRs.
FILTER's simpler cousin, WHERE, sometimes confuses first timers because it is not a database where clause but simply a conditional filtering construct against the change data.
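For instance, in this minimal sketch (the table and column names are hypothetical), the WHERE is evaluated against each captured change record, never issued as SQL to the database:

TABLE src.accounts, WHERE (status = "CLOSED");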
There is no "sqlprdicat" but if you were trying to save a second with shorthand it cost me 10 seconds to look it up to make sure. Please do not shorten words especially when related to syntax and parameters.
SQLPREDICATE is used during initial loads, which essentially issue a select * and convert the results into insert statements, but it allows you to add a database where to the select * query.
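As a rough sketch of where SQLPREDICATE fits, here is a hypothetical initial-load extract (the group, host, trail file, and table names are assumptions):

-- initial-load extract (created with: ADD EXTRACT initload, SOURCEISTABLE)
EXTRACT initload
USERID ogg, PASSWORD ogg
RMTHOST tgthost, MGRPORT 7809
RMTFILE ./dirdat/il
-- the initial-load SELECT * picks up this extra predicate
TABLE src.orders, SQLPREDICATE "WHERE order_date > SYSDATE - 30";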
To answer your final question: yes, any time replication stops to issue a database query via SQLEXEC, it will be slower than not doing so. You can run two extracts side by side, one with the SQLEXEC and one without, to determine the delta on your particular systems.
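One way to compare the two side-by-side runs (the group names ext_sqlx and ext_plain are hypothetical) is with standard GGSCI commands:

GGSCI> STATS EXTRACT ext_sqlx, TOTAL
GGSCI> STATS EXTRACT ext_plain, TOTAL
GGSCI> LAG EXTRACT ext_sqlx
GGSCI> LAG EXTRACT ext_plain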
Have fun,
-joe 
Thanks a lot Joe, for your valuable information.

How does the DDL extract work?

Hi,
I wonder how the extract captures the DDLs; I didn't find anything about it in the PDFs.
Because it is still a statement in the redo logs. 
So why the extra scripts? I thought GG handled it somehow through triggers.
It does. DML is changes to data in existing objects; DDL involves the objects themselves, so how would that be tracked? DDL impacts the data dictionary, not necessarily the data within a table. Read through the ddl_setup.sql script - you'll see all the things that are checked in the trigger (which fires BEFORE DDL ON DATABASE).
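This is not GG's actual trigger, just a minimal sketch of the mechanism it relies on; the ddl_history table is hypothetical:

CREATE OR REPLACE TRIGGER ddl_capture_demo
  BEFORE DDL ON DATABASE
BEGIN
  -- record who did what to which object; GG's real trigger captures far more detail
  INSERT INTO ddl_history (ddl_time, login_user, obj_owner, obj_name, obj_type, ddl_event)
  VALUES (SYSDATE, ora_login_user, ora_dict_obj_owner, ora_dict_obj_name,
          ora_dict_obj_type, ora_sysevent);
END;
/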

Help with using Golden Gate and design

Hi there,
This would be our first attempt at installing and using GG, but before that I had a question. We have an Oracle CRM database from which we are picking up data (the source) and need to send this data in real time to a target system which is on another instance and another DB, namely SQL Server. My questions are:
1. Can GG be used to have real-time data sync between an Oracle DB and SQL Server?
2. Can GG be used to load SQL Server in the first place?
3. Are there any issues that you see with this approach?
4. Is GG an ETL tool? As in, when we pick up data from the table, we would be doing so using Oracle SQL. Can that SQL be stored in GG?
The data picked up from the source tables would be purely read-only, with no updates on the source table. We are totally new to this, so we just thought of checking with the folks who have been working with GG for a while.
thanks!
L 
> 1. Can GG be used to have real-time data sync between an Oracle DB and SQL Server?
Almost. As Oracle puts it, "with sub-second latency".
> 2. Can GG be used to load SQL Server in the first place?
This should be helpful: http://www.oracle.com/webfolder/technetwork/tutorials/obe/fmw/goldengate/11g/GGS_Sect_Config_WIN_MSS_2008_to_WinUX_ORA.pdf
> 3. Are there any issues that you see with this approach?
Sounds fine to me.
> 4. Is GG an ETL tool? As in, when we pick up data from the table, we would be doing so using Oracle SQL. Can that SQL be stored in GG?
GG scans the redo logs/archives for committed transactions, writes the information to trail files, the trail files are transferred to the target, and the GG process on the target side reads those trail files and applies the DML in the target database. Now what exactly are you looking for?
> The data picked up from the source tables would be purely read-only with no updates on the source table.
If there are no changes then why do you need GG? Capturing changes is what GG is for!
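As a minimal sketch of that pipeline (the group names, host, DSN, defs file, and schemas are all hypothetical):

-- source side: extract reads the redo logs and sends a trail to the target
EXTRACT extcrm
USERID ogg, PASSWORD ogg
RMTHOST tgthost, MGRPORT 7809
RMTTRAIL ./dirdat/rt
TABLE crm.*;

-- SQL Server side: replicat reads the trail and applies the DML
REPLICAT repmss
TARGETDB mssqldsn
SOURCEDEFS ./dirdef/crm.def
MAP crm.*, TARGET dbo.*;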
Thanks Amardeep,
Regarding points 3 and 4:
3 - Basically, not sure I understood. Let's say I fire a SQL to pick up data from the Oracle table: a set of columns based on a set of filter conditions (where clause). I want that data to be sent to the target system, not the whole source table. So are you saying that GG only transfers the whole table as-is, and we cannot pick up data in the form that we need it in?
4 - Yes, initially it will be read-only. The input is from SQL Server, which passes that to, maybe, GG, and then GG gets the data in real time from the CRM database. The point is that SQL Server is not allowed to connect directly to CRM; hence we are exploring the option of GG.
thanks 
> 3 - Basically, not sure I understood. Let's say I fire a SQL to pick up data from the Oracle table... I want that data to be sent to the target system, not the whole source table. So are you saying that GG only transfers the whole table as-is, and we cannot pick up data in the form that we need it in?
GG replicates DML. Whatever DML happens on the tables you choose (to replicate) will be replicated to the target. What is your scenario?
So Amardeep, taking a very simple example, let's say I run a SQL:
1. SELECT emp, sal FROM employee - and I want to send this data to the target system, which is, let's say, in SQL Server. Will SQL Server then get only the emp and sal information into its table?
2. Where would we define the query SELECT emp, sal FROM employee for the source object in GG? Do we define it in GG, which then uses it to refresh the data in SQL Server?
3. Is the configuration of source and target systems a complex part of GG?
Thanks
Either you are getting it absolutely wrong or I am not getting what you are trying to convey.
GG is a replication tool for replicating DML from source to target.
You are talking about running a SELECT and sending that data to the target. Not sure where GG fits in such a scenario.
Or maybe what you can do is:
insert into <some_table>  -- this table is configured for GG replication
select * from <some_other_table>;
Now the same data will be inserted into <some_table> on the target.
As Amardeep said, GG will replicate changes to data. It will help you keep data in sync when you modify it. A SELECT statement won't do anything.
It sounds like you want to replicate a subset of the columns from the CRM.EMPLOYEE table to the SQLSERVER.EMP table. For that, take a look at the MAP statement. You can explicitly include or exclude columns. MAP also allows mapping data between columns of different names (EMP to Employee, etc.).
Or do you want to replicate a subset of the table's records? You can do this, though I've never done it myself. You do it by using a function to test a column value to determine whether or not to replicate the row. You may be on the right track with FILTER conditions and the WHERE clause.
You could also replicate a subset of columns, and a subset of records.
However, if you want to trigger GG to replicate records from CRM to SQL Server based on a select query to SQL Server... then no, it doesn't work that way. GG does not query the source in real time based on a query to the target. Data is loaded, then kept up to date whenever DML occurs on the source.
HTH,
Kevin
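To make the column-subset and row-subset ideas above concrete, here is a minimal sketch; crm.employee, dbo.emp, the excluded columns, and the region value are all hypothetical:

-- extract side: capture only the columns you need
TABLE crm.employee, COLSEXCEPT (ssn, photo);

-- replicat side: rename a column and keep only a subset of the rows
MAP crm.employee, TARGET dbo.emp,
  COLMAP (USEDEFAULT, employee = emp),
  WHERE (region = "WEST");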
