OCDM - How to start with implementation - Communications Data Model

Hello,
I have installed OCDM and the required Oracle components on a Linux Red Hat 5 box. As I am running out of time, could someone guide me on the exact steps/tools I need to start using to implement my BI solution?
E.g. do I need to design my warehouse first? Am I supposed to use the template base tables or design my own tables? How do I go about designing the ETLs?
Regards,
Bhavesh 

The next (easy) step is to install OBIEE, and link it to the OCDM DB.
I assume you have installed OWB, Mining and OLAP.
Before designing anything, please look at whatever is already available!
All derived and aggregated tables, as well as the OLAP cubes and mining models, are automatically filled by the intra-ETLs provided.
The point is not to spend time on whatever is already there but on what could be adapted or extended.
So my suggested steps are as follows:
1/ Define which of the 8 business areas you want to concentrate on, and within them, the business questions you want to answer/the measures you want to see
2/ Look at the delivered out-of-the-box (OOB) KPIs and check which tables are needed (using the default metadata report available in OBIEE) - you may need the full, paid, customer-only documentation for this.
3/ Define which data sources you want to use (the fewer the better to start with - limit the scope) and the data you need out of them
4/ Gap analysis: check what you cannot do OOB with OCDM and see whether you really need it => if yes, design and adapt OCDM...
5/ Get the data (cleaned) to fill those OCDM tables (3NF layer) => Define the ETL from your data source to OCDM (with OWB since you have it or with whatever tool you prefer)
6/ Configure the parameters of the intra-ETLs you use so they run when you need them (and possibly how you need them, in case of adaptation)
7/ Test the data transfer
8/ Once in production, start the iteration again in your test environment with the next area/business domain
In any case, Oracle Consulting will be pleased to help you further. Please contact your nearest Oracle office.
Some tips:
- Do not develop before checking what is already there.
- Tables in the foundation layer (base, reference and lookup) shall all be 3NF.
- All codes should be defined in lookup tables.
- Take inspiration from the optimized ETLs - extend them rather than developing from scratch...
Hope this helps.
Regards
Axel.

Related

Question on Avro schema management

Hi,
in our use case, we need to generate Avro schemas on the fly (given some business object structure) and install them with Oracle NoSQL. I have the following questions:
1. Is there an API to install the Avro schema in NoSQL? Ideally we'd like to avoid using the command line tool for this (and also creating a .avsc file).
2. Any recommended way to make the Avro schema available on the client? Ideally we don't want to use any file system operations for this. Would it be a reasonable way to store the Avro schema itself as a String in NoSQL, and then, when the client connects to NoSQL, the first thing it does is read the schemata stored in NoSQL and parse them?
On another topic, are there any performance penalties using JsonAvroBinding vs. GenericAvroBinding? Our objects are all JSON, so we'd like to use JsonAvroBinding; however, we'd go the extra mile and use GenericAvroBinding if that performs better.
Best Regards and thanks in advance for your answer,
Ralf 
Hello Ralf,
1. Is there an API to install the Avro schema in NoSQL? Ideally we'd like to avoid using the command line tool for this (and also creating a .avsc file).

No, there is no administrative API currently available. So currently, I think this would have to be done using a script that is invoked by your application and that uses the NoSQL DB admin CLI.
2. Any recommended way to make the Avro schema available on the client? Ideally we don't want to use any file system operations for this. Would it be a reasonable way to store the Avro schema itself as a String in NoSQL, and then, when the client connects, first read the schemata stored in NoSQL and parse them?

This begs the question of how you'll do schema evolution. Will there ever be multiple versions of the same Avro schema (more than one version having the same schema name) in your application? If so, there will be additional complications with the approach you're taking.
If not (if there will only ever be one version of a given schema), you can use the latest (only) version of a schema on the client. In that case you can call AvroCatalog.getCurrentSchemas to get the schemas you need. If a schema you recently added is not in the returned map, call AvroCatalog.refreshSchemaCache followed by AvroCatalog.getCurrentSchemas.
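If you do go the route of storing the schema text in the store itself (and stick to one version per schema name), the client side is essentially a small name-to-schema cache built at connect time. Here is a rough, self-contained sketch of that pattern; note that a plain Map stands in for the KV store, and the reserved key prefix is an invented convention, not part of the NoSQL DB API:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the "store the schemas in the store itself" idea.
// SchemaCache and the "/_schemas/" prefix are hypothetical names;
// a real application would use the oracle.kv key/value APIs instead
// of the Map passed in here.
public class SchemaCache {
    private static final String SCHEMA_PREFIX = "/_schemas/";

    private final Map<String, String> store;                    // stands in for the KV store
    private final Map<String, String> cache = new HashMap<>();  // client-side copy

    public SchemaCache(Map<String, String> store) {
        this.store = store;
    }

    // Writer side: publish a schema's JSON text under a reserved key space.
    public void publishSchema(String name, String avroSchemaJson) {
        store.put(SCHEMA_PREFIX + name, avroSchemaJson);
    }

    // Client side: read every published schema once at connect time.
    public void refresh() {
        cache.clear();
        for (Map.Entry<String, String> e : store.entrySet()) {
            if (e.getKey().startsWith(SCHEMA_PREFIX)) {
                cache.put(e.getKey().substring(SCHEMA_PREFIX.length()), e.getValue());
            }
        }
    }

    public String getSchema(String name) {
        return cache.get(name);
    }
}
```

The JSON strings pulled from the cache would then be fed to Avro's schema parser on the client. Once schema evolution enters the picture, the key convention would also need a version component.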
On another topic, are there any performance penalties using JsonAvroBinding vs. GenericAvroBinding?

They should have similar performance characteristics -- they do essentially the same thing -- but I would guess that in general JsonAvroBinding is slightly faster than GenericAvroBinding, based on my knowledge of the Avro and Jackson code. However, we have not done any performance testing to compare them. It is possible that the answer depends on the schema you're using. So you should do your own testing, if you're concerned about minor performance differences.
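For the "do your own testing" part, a tiny hand-rolled harness is enough to compare the two bindings. This is a generic sketch (BindingBench is a made-up name, not part of the NoSQL DB API); each Runnable passed in would wrap a serialize/deserialize round-trip with one binding:

```java
// Minimal micro-benchmark helper for comparing two code paths,
// e.g. a JsonAvroBinding round-trip vs. a GenericAvroBinding
// round-trip. Hypothetical helper, kept deliberately simple.
public class BindingBench {
    // Returns the average nanoseconds per iteration for the given task.
    public static long timeNanos(Runnable task, int iterations) {
        task.run(); // one warm-up run so JIT compilation skews the numbers less
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            task.run();
        }
        return (System.nanoTime() - start) / iterations;
    }
}
```

You would call it once per binding with the same payload and a large iteration count, and compare the two averages; for numbers you intend to act on, a proper harness such as JMH is the safer choice.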
--mark

Is it possible to build a transaction tracker using Oracle JET?

Hi All, I have a requirement where I need to visually show a transaction tracker which originates from one system and traverses through multiple systems to reach a target system. For example, an order is created in system A, goes through systems B and C, and ends at system D. When the user logs into a dashboard portal, they should be able to see where the transaction currently is. Any ideas are welcome. Thanks, Kunal
From the visualization point of view you could either use Timeline: http://www.oracle.com/webfolder/technetwork/jet/jetCookbook.html?component=timeline&demo=basicTimeline or Diagram: http://www.oracle.com/webfolder/technetwork/jet/jetCookbook.html?component=diagram&demo=default Diagram has the most power and would be best for representing a process flow; however, it needs more work from you to create the layout.
Thanks Duncan! I will take a look at both of them
I used the diagram layout for exactly the same scenario. Check the "containers" example in the cookbook: http://www.oracle.com/webfolder/technetwork/jet/jetCookbook.html?component=diagram&demo=containers Each "container" can represent a system, which can then even include some extra nodes to present different sub-systems or workflow steps. If you want to rebuild the cookbook example in your own system, be aware that it uses a special layout JS file (http://www.oracle.com/webfolder/technetwork/jet/cookbook/dataVisualizations/diagram/layouts/DemoContainerLayout.js) which is not included in the base distribution.
Sven, Thanks a lot for the information, it was really helpful
Sven, is it possible to use websockets to get real-time updates of the transaction? Can you share an example if you have one?
This certainly is possible. I used websockets for other purposes, but it is an excellent idea to use them for such a display. This is outside the scope of this thread (and possibly the whole forum). Just to give you some ideas: you would need to set up your own websocket server (based upon node.js), then connect as a websocket client from your JET web page to that server. You would need a websocket connection that has some "channel" ability, where each channel can represent an order (if you only want to display one order at a time). Some DB process can send information to that server whenever the status of an order changes. It depends a lot on the different systems involved.
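The per-order "channel" dispatch described above can be sketched independently of the websocket transport. The following is a hypothetical helper (shown in Java for illustration; on a node.js server the equivalent would be a map of channel names to connected sockets), where each subscriber callback stands in for a connected dashboard client:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;

// Sketch of channel-based fan-out: one channel per order ID.
// OrderChannelBroker is a made-up class name; in a real setup the
// Consumer callbacks would push messages down websocket connections.
public class OrderChannelBroker {
    private final Map<String, List<Consumer<String>>> channels = new ConcurrentHashMap<>();

    // A dashboard client subscribes to the order it is displaying.
    public void subscribe(String orderId, Consumer<String> client) {
        channels.computeIfAbsent(orderId, k -> new CopyOnWriteArrayList<>()).add(client);
    }

    // A backend process publishes a status change for one order;
    // only subscribers of that order's channel receive it.
    public void publish(String orderId, String statusUpdate) {
        for (Consumer<String> client : channels.getOrDefault(orderId, List.of())) {
            client.accept(statusUpdate);
        }
    }
}
```

The DB-side process mentioned in the post would call publish whenever an order moves from one system to the next, and the JET page's diagram or timeline would update from the messages it receives on its channel.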

How to copy database changes to production database?

Hello Experts, I'm using DB 11g XE in my development environment and 11.2.0.4 Standard Edition in my production environment (cloud service). First you must know I'm an ADF developer, not a DBA, so I have very little knowledge of DBA operations. I made some changes in my development database. Those changes include:
- Creating a new table.
- Adding data to this new table.
- Updating some data in other tables.
The only way I know to move these changes to the production database is to use a .dmp file. I had to drop the user (schema), create it again and run imp. I know this is of course not the right way to do it, especially the dropping part in a production DB. So what is the correct way to do this? Thank you, Gado
Gado wrote: So what is the correct way to do this?
The correct way would be to make your changes in dev with a script that can also be executed in prod.
Thank you for your response EdStevens. Yes, but this means that I'll have to write it. Is there another solution for lazy people? No, I'm kidding. I totally agree using scripts is the best way. But I have some questions: what if there are users connected to the database while running the scripts? I mean, are there any precautions for performing such a thing? You know, performing updates to a database in production is kinda scary, so I just wanna make sure I'll do it safely. Thank you, Gado
Gado wrote: What if there are users connected to the database while running the scripts?
Databases are expected to have multiple users simultaneously connected, each doing their own thing. And developers are expected to bear this in mind at all times.
Gado wrote: I mean, are there any precautions for performing such a thing?
The necessary precautions would depend on what is being done. If I'm creating and loading a new table, my expectation would be that, since no one as yet has code that accesses the table, I really wouldn't need to worry. If I'm modifying a subset of data in one or more tables, I'd test the h**** out of it in dev to make sure I modified every bit I intended and not a bit more. One thing to keep in mind is that no matter what you do, no other session can see the results until you issue a COMMIT. And keep in mind that DDL performs an automatic commit. And that a COMMIT only commits what the committing session changed.
Gado wrote: You know, performing updates to a database in production is kinda scary, so I just wanna make sure I'll do it safely.
Then it is up to you to make sure you do it safely. Script it. Have a backup/backout position. There are no shortcuts for lazy people.
What if I want to do all of those things in one script? For example, the task at hand: I have two tables, COMPANIES and PRODUCTS, and now based on a new requirement I need to do this:
1) Rename the PRODUCTS table to COMPANY_PRODUCTS
2) Create a new table DOMAIN_PRODUCTS
3) Copy data from the renamed table COMPANY_PRODUCTS to DOMAIN_PRODUCTS
4) Change the renamed table COMPANY_PRODUCTS's PK (PROD_ID) to be an FK referencing DOMAIN_PRODUCTS
5) Add a new PK column to COMPANY_PRODUCTS and remove some columns
You see, I want to change the PRODUCTS table to become an intermediary table between COMPANIES and DOMAIN_PRODUCTS (many-to-many). I think I can create a script that does this, but my question is: is it OK to run this script on production while users are connected and performing operations on the PRODUCTS table? Thank you, Gado
Gado wrote: Is it OK to run this script on production while users are connected and performing operations on the PRODUCTS table?
For something like that I'd schedule a maintenance window, where all users are off the system.
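For reference, the five steps translate into a fixed, ordered list of statements that a deployment script would run inside such a maintenance window. This sketch just collects them as strings (column names like prod_name and cp_id are invented for illustration, and the "remove some columns" step is left out because the actual columns aren't named):

```java
import java.util.List;

// The five steps from the post as ordered Oracle DDL/DML.
// Table names come from the question; column names are assumptions.
public class ProductsMigration {
    public static List<String> statements() {
        return List.of(
            // 1) rename PRODUCTS to COMPANY_PRODUCTS
            "ALTER TABLE products RENAME TO company_products",
            // 2) create the new DOMAIN_PRODUCTS table
            "CREATE TABLE domain_products (prod_id NUMBER PRIMARY KEY, prod_name VARCHAR2(100))",
            // 3) copy the data across
            "INSERT INTO domain_products (prod_id, prod_name) SELECT prod_id, prod_name FROM company_products",
            // 4) turn the old PK into an FK referencing DOMAIN_PRODUCTS
            "ALTER TABLE company_products DROP PRIMARY KEY",
            "ALTER TABLE company_products ADD CONSTRAINT fk_cp_prod FOREIGN KEY (prod_id) REFERENCES domain_products (prod_id)",
            // 5) add a surrogate key column, populate it for existing rows,
            //    then promote it to the new PK (a plain ADD ... PRIMARY KEY
            //    would fail while the column is still NULL on existing rows)
            "ALTER TABLE company_products ADD (cp_id NUMBER)",
            "UPDATE company_products SET cp_id = ROWNUM",
            "ALTER TABLE company_products ADD CONSTRAINT pk_company_products PRIMARY KEY (cp_id)"
        );
    }
}
```

Since each DDL statement commits implicitly in Oracle, every step is a point of no return once it runs, which is one more argument for running the whole script in a maintenance window with a tested backout plan.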

Perst Lite, PointBase Micro, Oracle, OpenBaseMovil etc. - which one is best?

Hello there,
I have following points before choosing embedded DB for J2ME (MIDP 2.0 and CLDC 1.0/1.1):
1. capable of CRUD (create, read, update, delete) on rows with minimal effort.
2. capable to store data in rows and columns.
3. capable to sort data.
4. capable of creating relationships between tables.
5. capable of easy synchronization with remote DB. At least should be capable to keep track/log of changes made and give us the same for syncing with remote DB.
6. fully compatible with J2ME.
7. light footprint and good memory management.
8. no restrictions on data storage limit and column data lengths.
I appreciate any ideas about the best-suited embedded DB (not RMS please).
Thanks 
Guys any replies?

ORACLE resource connector using multiple tables

The Oracle connector is using a single table to exchange data.
Has someone already implemented a connector working with multiple tables in read/write bi-directional mode?
(select * from Table A, Table B where A.key = B.key...)
Can stored procedures be used as well?
It is somewhat cumbersome to map a relational db using a flat model ;-?
Thanks for your help
Regards 
I don't think that stored procedures are possible with the adapters
But I am sure SUN has already implemented custom adapters working on multiple tables with sql statements
Regds
Ed, 
SUN experts, have you done this ?
Do we need to use an API to do this? Which API?
Your help is very much appreciated
;-) ;-) ;-) ;-) ;-) ;-) ;-) 
I believe that there are two approaches you can take for this:
1) Create views on top of the database tables you need to access and continue to use the Database Table adapter (that's what you're using, right?).
2) Write your own custom adapter starting from ExampleTableResourceAdapter.java that is in the REF kit. See the Adapter Development chapter in the Technical Deployment guide.
sbr 
An educated guess would be to go with the view in Oracle, since this is what Oracle is best at!
//L 
Well, to me a view is only a flat aggregation of several tables for read-only purposes. It is OK to use with a simple adapter to read data.
However, it cannot cater for:
* DB writes on the relational model
* advanced queries
In my case, we want the IDM workflow to:
create new objects in the database, update them along with the workflows or provisioning events...
+
make queries on the DB data model, to parse objects if necessary, deduce an approver name, compute escalation paths, etc...
Is that really possible with solution 1), views?
Thanks for your help
