Thursday, September 29, 2016

DRM Batch Client Refresh Current and Previous Day Versions

The Oracle DRM Batch Client is one of the scripting tools that makes this Master Data Management application stand out. It is nothing complex, just a Windows command-line interface to the DRM operations, yet there's almost nothing I cannot do with the Data Relationship Management Batch Client.

One of the most common practices in a DRM application is finding out the latest changes made over the day - new additions, moves, property changes and so on. For this, I need a Compare Export between a historical snapshot of the DRM version taken before the changes and a snapshot taken after them.

The following design using the DRM Batch Client automates this process with minimal long-term maintenance effort. Let's say the version where all users make their changes is called the "Gold Version". There will be two more versions - "Current Day Version" and "Previous Day Version".

Now using DRM Batch Client, just before running the Compare export, I will do the below steps daily:

1. Delete the "Previous Day Version"
2. Export the "Current Day Version" to a file, say X
3. Delete the "Current Day Version"
4. Export the "Gold Version" to a file, say Y
5. Restore the "Previous Day Version" from X
6. Restore the "Current Day Version" from Y

Since this sequence runs daily at the end of the day, once the steps complete I have a snapshot of yesterday's data in "Previous Day Version" and today's changes in "Current Day Version". The whole activity can be accomplished using the DRM Batch Client, and a sample batch client script is given below for reference. I prefer to use parameters wherever possible - it gives me peace of mind knowing there are fewer places to maintain - so wherever you see the % symbol, that is a parameter whose value needs to be set.

set DRM_URL=net.tcp://DRM-DEV:5210/Oracle/Drm/ProcessManager
set DRM_BATCH_CLIENT=C:\Oracle\Middleware\EPMSystem11R1\products\DataRelationshipManagement\client\batch-client\drm-batch-client.exe
set DRM_LOG_HOME=C:\PROD\DRM\Batch_Log_Files
set SOURCE_VERSION_FOR_PREVIOUS_DAY_VERSION=Current Day Version
set TARGET_VERSION_FOR_PREVIOUS_DAY_VERSION=Previous Day Version
set SOURCE_VERSION_FOR_CURRENT_DAY_VERSION=Gold Version
set TARGET_VERSION_FOR_CURRENT_DAY_VERSION=Current Day Version


%DRM_BATCH_CLIENT% /op=DeleteVersion /url=%DRM_URL% /vabbrev="%TARGET_VERSION_FOR_PREVIOUS_DAY_VERSION%" /log="%DRM_LOG_HOME%\Current_and_Previous_Versions\Refresh_Current_and_Previous_Day_Versions_Log.txt"

%DRM_BATCH_CLIENT% /op=BackupVersionToFile /url=%DRM_URL% /conn="BackupVersionToFile" /objectaccess=System /vabbrev="%SOURCE_VERSION_FOR_PREVIOUS_DAY_VERSION%" /filename="%DRM_LOG_HOME%\Current_and_Previous_Versions\Current_Day_Version_Backup.txt" /delim="|" /log="%DRM_LOG_HOME%\Current_and_Previous_Versions\Refresh_Current_and_Previous_Day_Versions_Log.txt"

%DRM_BATCH_CLIENT% /op=DeleteVersion /url=%DRM_URL% /vabbrev="%TARGET_VERSION_FOR_CURRENT_DAY_VERSION%" /log="%DRM_LOG_HOME%\Current_and_Previous_Versions\Refresh_Current_and_Previous_Day_Versions_Log.txt"

%DRM_BATCH_CLIENT% /op=BackupVersionToFile /url=%DRM_URL% /conn="BackupVersionToFile" /objectaccess=System /vabbrev="%SOURCE_VERSION_FOR_CURRENT_DAY_VERSION%" /filename="%DRM_LOG_HOME%\Current_and_Previous_Versions\Gold_Backup.txt" /delim="|" /log="%DRM_LOG_HOME%\Current_and_Previous_Versions\Refresh_Current_and_Previous_Day_Versions_Log.txt"

%DRM_BATCH_CLIENT% /op=RestoreVersionFromFile /url=%DRM_URL% /conn="BackupVersionToFile" /objectaccess=System /sver="%SOURCE_VERSION_FOR_PREVIOUS_DAY_VERSION%" /tver="%TARGET_VERSION_FOR_PREVIOUS_DAY_VERSION%" /save=Y /filename="%DRM_LOG_HOME%\Current_and_Previous_Versions\Current_Day_Version_Backup.txt" /delim="|" /log="%DRM_LOG_HOME%\Current_and_Previous_Versions\Refresh_Current_and_Previous_Day_Versions_Log.txt"

%DRM_BATCH_CLIENT% /op=RestoreVersionFromFile /url=%DRM_URL% /conn="BackupVersionToFile" /objectaccess=System /sver="%SOURCE_VERSION_FOR_CURRENT_DAY_VERSION%" /tver="%TARGET_VERSION_FOR_CURRENT_DAY_VERSION%" /save=Y /filename="%DRM_LOG_HOME%\Current_and_Previous_Versions\Gold_Backup.txt" /delim="|" /log="%DRM_LOG_HOME%\Current_and_Previous_Versions\Refresh_Current_and_Previous_Day_Versions_Log.txt"

It's worth noting that the versions take some time to save, so this entire process can take 5-10 minutes or more to complete successfully.

Monday, September 26, 2016

Load Segment Values and Hierarchies - DRM EBS Integration

Oracle Data Relationship Management is one of the best-in-class master data management tools from Oracle. It is often used along with Oracle E-Business Suite to maintain EBS master data, thanks to the benefits and flexibility it offers.

To start with the integration, there is a list of profile options that needs to be configured in Oracle EBS; these help DRM and EBS interact with each other as required:

  • GL: DRM Username
  • GL: DRM Property for Value Set Name
  • GL: DRM Property to Allow Version Export
  • GL: DRM Property to Allow Hierarchy Export
  • GL: DRM API Adapter URL
  • GL: DRM WSDL URL
  • GL: DRM Template for Hierarchy Export
  • GL: DRM Template for Version Export
Out of these, the API Adapter URL and the WSDL URL are the first pre-requisites to check. If they are not returning any data, it is time to bring them up before anything else. The Value Set name refers to the names of the different segments present in my system; since I have multiple segment hierarchies to integrate, I have multiple Value Set Names configured in EBS.

The program 'Load Segment Values and Hierarchies', when triggered with a specific Value Set name, calls the DRM exports EBSValueSetExport and EBSHierarchyExport using the Data Relationship Management Web Service (which is why this needs to be up and running as the first sanity check); the username used to log in is the value set in the parameter "DRM Username". On successful completion of the program, the data is loaded into the staging interface tables GL.GL_DRM_SEGVALUES_INTERFACE and GL.GL_DRM_HIERARCHY_INTERFACE and subsequently into the EBS tables. Here's a sample white paper describing how the process works.

For smooth integration, I would like to summarize the pre-requisites in EBS for the process to work:

1. Apply the DRM integration patches 10632813 and 11659733
2. Set up the values for all the GL profile options listed above
3. Store the password for the DRM user in FND_VAULT using the syntax below:
sqlplus apps/apps_pwd @$FND_TOP/sql/afvltput.sql ModuleName DRMUserName DRMUserPassword

Also here are the pre-requisites in DRM for the process to work:

1. Verify the WSDL and API Adapter URLs
2. Verify that the version to be exported has the "Allow Export" property set to "True"
3. Verify the version to be exported has the "Value Set Name" populated correctly
4. Verify that the DRM property name used for "Value Set Name" is the same as the value configured in EBS for the parameter "GL: DRM Property for Value Set Name"
5. Make sure the 3 users exist as per this document
6. Set up the external DB connection with the correct credentials having read-write access to the tables GL.GL_DRM_SEGVALUES_INTERFACE and GL.GL_DRM_HIERARCHY_INTERFACE
7. Check that the DRM exports EBSValueSetExport and EBSHierarchyExport exist and run fine
8. Make sure the exports used in point 7 above are the same ones tagged against the EBS parameters "GL: DRM Template for Hierarchy Export" and "GL: DRM Template for Version Export"

Now, when I run the program "Load Segment Values and Hierarchies" for a specific value set name, it is important to note that I can only upsert (update/insert) segment values and hierarchy data accessible through my General Ledger data access setting. Without the right access, I will not be able to provide the correct parameter to the program at all. The profile option that controls this access, and makes sure the parameters appear correctly, is GL: Data Access Set. Additionally, as a pre-requisite, the value sets must already be configured correctly in the table APPS.FND_FLEX_VALUE_SETS, in the field FLEX_VALUE_SET_NAME.
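
A quick check along the lines below confirms the value set is registered before running the program; the value set name here is just a placeholder for your own:

SELECT FLEX_VALUE_SET_ID, FLEX_VALUE_SET_NAME
  FROM APPS.FND_FLEX_VALUE_SETS
 WHERE FLEX_VALUE_SET_NAME = 'MY_SEGMENT_VALUE_SET';  -- placeholder: use your own value set name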

Finally, once the program "Load Segment Values and Hierarchies" completes, here are the lines in the log file that indicate a successful run.

2016.10.07 16:35:55:-- Running DRM version export...
2016.10.07 16:36:03:-- DRM Version Export finished successfully.
2016.10.07 16:36:19:-- DRM Hierarchy Export finished successfully.
2016.10.07 16:36:19:-- Calling PL/SQL API to import values from interface tables to FND ...
2016.10.07 16:36:19:-- Logs from GL_DRM_IMPORT_PKG.gl_drm_import_process in FND_FILE section
2016.10.07 16:36:23:-- Import process completed successfully

Once the program completes, the data flows to Oracle GL - the data flow is explained in detail here. That's all there is to this awesome integration process. Do you do your integration differently? Is there an easier or more flexible approach?

DRM EBS Integration - Find Orphans in EBS

The Oracle Data Relationship Management application is integrated with Oracle E-Business Suite via the Oracle integration kit. However, it is often observed during integration with different EBS environments that orphans are sometimes created in EBS. An orphan gives rise to issues when reporting on transactions for that specific segment value, so it becomes necessary to track the orphans and keep them identified for troubleshooting purposes.

The below query, when executed in the APPS schema, helps identify the orphans for the different segments. It takes the segment values that exist in EBS and excludes those that are tagged to a parent segment value belonging to a hierarchy - whatever remains are the orphan segment values.

SELECT FLEX_VALUE_SET_ID, FLEX_VALUE FROM APPS.FND_FLEX_VALUES_VL
WHERE FLEX_VALUE_SET_ID IN ('1014867')
MINUS
SELECT FLEX_VALUE_SET_ID, CHILD_FLEX_VALUE_LOW FROM APPS.FND_FLEX_VALUE_NORM_HIERARCHY
WHERE FLEX_VALUE_SET_ID IN ('1014867')

Here the value 1014867 represents one of the segment hierarchies, and multiple values can be used in the IN clause to check all the hierarchies, thus identifying all the orphans in EBS. Once the orphans are identified, they need to be re-configured in DRM and the DRM EBS integration needs to be executed one more time to bring the segment values back under a hierarchy in EBS.
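
If it is more convenient to work with value set names than with internal IDs, a variant like the one below does the same check; the names in the IN lists are placeholders for your own value sets:

SELECT V.FLEX_VALUE_SET_ID, V.FLEX_VALUE
  FROM APPS.FND_FLEX_VALUES_VL V
 WHERE V.FLEX_VALUE_SET_ID IN
       (SELECT S.FLEX_VALUE_SET_ID FROM APPS.FND_FLEX_VALUE_SETS S
         WHERE S.FLEX_VALUE_SET_NAME IN ('COMPANY_VALUE_SET', 'ACCOUNT_VALUE_SET'))  -- placeholder names
MINUS
SELECT H.FLEX_VALUE_SET_ID, H.CHILD_FLEX_VALUE_LOW
  FROM APPS.FND_FLEX_VALUE_NORM_HIERARCHY H
 WHERE H.FLEX_VALUE_SET_ID IN
       (SELECT S.FLEX_VALUE_SET_ID FROM APPS.FND_FLEX_VALUE_SETS S
         WHERE S.FLEX_VALUE_SET_NAME IN ('COMPANY_VALUE_SET', 'ACCOUNT_VALUE_SET'));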

Sunday, September 25, 2016

Oracle DRM Version Level Specific Access Security for Users

Security in Oracle Data Relationship Management is of immense significance in situations where I do not want to give all the application users full visibility of all the data present. DRM provides a feature to restrict visibility of the existing DRM data at user level, which is amazing and gives the admin a lot of flexibility to work with.

The feature that enables this is called DRM Node Access Groups. Say I want to give User U1 access to the DRM Version V1, but not to Versions V2 and V3. My first step is to create a new Node Access Group, say AG1. I add U1 to AG1, save and close.

Then, after opening version V1, I navigate to Nodes -> Assign -> Node Access, where I can see the name AG1 under the category 'Limb Access'; I set this to the level of access I want to give (for example Read, Edit, Insert, etc.), and do the same for 'Leaf Access'. After saving the changes, it is time to test how it works for User U1.

Now, if I log in with the credentials for User U1, despite the presence of 3 different versions, I can only see Version V1. This achieves my goal of implementing user-level, version-level, leaf/limb-level security with a tremendous amount of flexibility.

DRM Batch Client Credentials Configuration

The Oracle DRM Batch Client is one of the most interesting features of this master data management tool, enabling a range of activities without manual intervention. Many DRM activities can be set up to run daily using the DRM Batch Client.

The utility that stores the credentials is named drm-batch-client-credentials.exe - it stores the credentials for the specific Windows account that runs it. Interestingly, if I log in with a different Windows account and try to run the DRM Batch Client, it will not run, because the stored credentials belong to the other Windows account. So I need to log in using that Windows account for my DRM Batch Client to work successfully; otherwise I need to configure the DRM Batch Client credentials for every Windows account that will run it, so that all of them can use the DRM Batch Client without issues.

While setting up the DRM Batch Client, I am interactively prompted with a few options, such as whether I want to erase the existing credentials or update them. On selecting the correct option and entering the credentials as requested, the DRM Batch Client is configured successfully. All future DRM Batch Client tasks will use these credentials, so this username can be found in the audit logs going forward for all the activities performed by the DRM Batch Client.

DRM Batch Client Scripting for Automation

The DRM Batch Client is one of the most useful tools that complements the flexibility and features of Oracle Data Relationship Management. It has numerous features and benefits:
  • Perform activities by running a single DRM batch file
  • Helps in combining multiple DRM tasks into one
  • Scheduling of DRM jobs
  • Set dependency on different DRM jobs
  • Error capturing and logging of DRM processes
  • Archiving of DRM logs and email notifications
The DRM batch file integrates closely with Windows batch scripting as well, so we can make it even more powerful by using parameter files that provide a central point of configuration control instead of making changes in individual files; I will come to that later.


There are DRM activities like Delete Version, Export, Import, Create Version, Restore Version, Add Node, Delete Node and Run Action Script which can be part of a daily routine. Say the admin makes changes on the main version, and for each downstream system we need to maintain a separate version according to its specific requirements. These activities can all be automated using the DRM Batch Client commands. Once the script is ready, it is a .bat file which can be run in Windows and hence can easily be scheduled with the Windows Task Scheduler like any other task.
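
For reference, a one-line command along these lines registers such a script as a daily task; the task name, script path and start time here are only placeholders:

schtasks /Create /TN "DRM Daily Version Refresh" /TR "C:\PROD\DRM\Scripts\drm_daily_refresh.bat" /SC DAILY /ST 22:00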

This process is also very robust in the sense that, in the integrated script, the next DRM task does not proceed until the previous one has completed. Errors in DRM can also be logged to a file using an error control mapping file - the batch client returns error values such as 0, 1, 2, 3, 100, 200, 210, 220 and 230. Thus a descriptive error message can be written to the log file instead of only the error code. For example, I can make the log file contain "Error occurred during initialization of the Batch Client" instead of error code 200 by using the custom mapping file, which makes it more user-friendly.
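
Here is a minimal sketch of what that return-code handling could look like inside the wrapper batch file. It reuses the %DRM_BATCH_CLIENT%, %DRM_URL% and %DRM_LOG_HOME% parameters from the refresh script above, the operation, version name and log files are only illustrative, and I am assuming the return code is picked up through ERRORLEVEL:

%DRM_BATCH_CLIENT% /op=DeleteVersion /url=%DRM_URL% /vabbrev="Current Day Version" /log="%DRM_LOG_HOME%\Delete_Version_Log.txt"
set RC=%ERRORLEVEL%
REM Map the return code to a readable message (only two codes mapped here for illustration)
set MSG=Unknown return code %RC%
if %RC% EQU 0 set MSG=Operation completed successfully
if %RC% EQU 200 set MSG=Error occurred during initialization of the Batch Client
echo %DATE% %TIME% RC=%RC% - %MSG% >> "%DRM_LOG_HOME%\Batch_Run_Status.txt"
REM Stop the integrated script here if the previous step failed
if %RC% NEQ 0 exit /b %RC%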

Archiving the Batch Client logs is always preferred to keep track of the changes being made to a sensitive application such as DRM. It can be done with Windows commands that rename the Batch Client generated logs to append the date and time, and then move each log file to an Archive directory. This complete process of creating integrated scripts for the various DRM Batch Client activities and then preserving the logs provides a very efficient way of managing the application.
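
A small snippet along these lines, placed at the end of the wrapper batch file, can handle that archiving step; the folder layout under %DRM_LOG_HOME% is assumed, and the exact %DATE%/%TIME% format depends on the machine's regional settings:

REM Build a file-name-safe timestamp from the system date and time
set STAMP=%DATE:/=-%_%TIME::=-%
set STAMP=%STAMP: =_%
set STAMP=%STAMP:.=-%
REM Create the archive folder if it does not exist, then move and rename each generated log
if not exist "%DRM_LOG_HOME%\Archive" mkdir "%DRM_LOG_HOME%\Archive"
for %%F in ("%DRM_LOG_HOME%\*.txt") do move "%%F" "%DRM_LOG_HOME%\Archive\%%~nF_%STAMP%%%~xF"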

For more details on Oracle DRM Batch Client, here's the documentation.

Thursday, September 22, 2016

DRM EBS Integration - FND Flex Loader Value Updates

Related Article: Load Segment Values and Hierarchies

The Oracle DRM to EBS integration is one of the most widely used programs to maintain and refresh master data easily. The EBS concurrent program pulls the data from DRM and refreshes the existing values and hierarchies with the new information. There's a specific list of attributes which demands special attention for every integration; they are as below:
  • FLEX_VALUE_SET_NAME
  • PARENT_FLEX_VALUE
  • CHILD_FLEX_VALUE_LOW
  • CHILD_FLEX_VALUE_HIGH
Once the integration is completed, the values are expected to show up correctly in EBS, more specifically in FND_FLEX_VALUES and FND_FLEX_VALUE_NORM_HIERARCHY. Any new values, hierarchy changes and description changes can be verified using these and a few other FND tables under the APPS or APPLSYS schemas.

Understanding how the data flows from DRM to EBS is useful and helps troubleshoot various scenarios quickly. Once the concurrent program Load Segment Values and Hierarchies completes successfully, the values are updated and inserted in FND_FLEX_VALUES, i.e. the loading is done in upsert mode. The table FND_FLEX_VALUE_NORM_HIERARCHY is truncated and reloaded for that specific segment; data for the other segments stays as-is.
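
After a run, a spot check like the one below shows the reloaded hierarchy rows for a segment; the value set ID is only a sample and should be replaced with your own:

SELECT PARENT_FLEX_VALUE, CHILD_FLEX_VALUE_LOW, CHILD_FLEX_VALUE_HIGH
  FROM APPS.FND_FLEX_VALUE_NORM_HIERARCHY
 WHERE FLEX_VALUE_SET_ID = 1014867  -- sample value set ID
 ORDER BY PARENT_FLEX_VALUE, CHILD_FLEX_VALUE_LOW;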

The below diagram explains schematically how the data flows, focused mainly on the hierarchy tables in EBS. The interface tables, which are truncated and loaded, are GL_DRM_SEGVALUES_INTERFACE and GL_DRM_HIERARCHY_INTERFACE in the GL schema. The FND tables in the APPS or APPLSYS schemas are refreshed after that.


For all the details regarding the integration, please refer to the Oracle documentation.

Tuesday, September 20, 2016

DRM Data Validation Techniques for Better Quality and Control

The data in DRM is of extreme importance for any organization. Hence there is a natural need for robust control over the data, making sure there is no invalid content in there. There are quite a few ways to achieve this, depending on what sort of validations and control I want to put in place.

Validations Required:

Let's assume I have a few types of validations as per my requirement to put in place for a specific hierarchy:
  1. The prefix of a node must be a specific sequence of letters
  2. Leaf nodes must be 6 characters in length
  3. Limb nodes must not have descriptions containing hyphens
  4. Leaf nodes cannot have a null description
Approach 1:

The DRM repository tables provide a range of options to find and format the relevant data easily. The main tables needed to gather the validation information above are listed below; by joining them using SQL we can retrieve the required data.
  • RM_VERSION
  • RM_HIERARCHY
  • RM_NODE
  • RM_RELATIONSHIP
  • RM_NODE_PROP_LOCAL
  • RM_NODE_PROP_GLOBAL
  • RM_TRANSACTION_HISTORY
We have a lot of options after identifying the nodes which do not meet the 4 validation criteria above. We can use Windows batch scripting to extract the required data using SQL, dump it into a txt file and send it via mail using Windows Mail. This activity can be scheduled to run at a specific interval (say every 30 minutes) to make it near real-time data monitoring. Note how we are completely bypassing the inbuilt DRM Exports functionality here; it is totally driven by the repository tables, SQL and Windows scripting.

Approach 2:

DRM exports can be written to a database table after setting up a connection string to the database. My preferred method is to immediately create a batch script to call this export via the DRM Batch Client, so that we can leverage the power of Windows scripting in our design. The required columns (mostly from the System property category) need to be added to this export, which is then saved to a database table (say X).

Once the data is in the Oracle database table, we can leverage the full flexibility of SQL to capture the required information. If I am using any data extraction tools in my instance, like ODI, Informatica PowerCenter, etc., it becomes all the more useful: I can use the tools to create sample reports, format data extracts, and even send notification mails to the relevant people with this information and request corrective action. Such ETL jobs can be scheduled to run in tandem with the batch script that refreshes table X, so the data in the DB table is always the latest.
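
As an illustration of this approach, a query along these lines could flag the offending nodes; the table name X and its columns (NODE_NAME, NODE_DESCR, LEAF_FLAG) are assumptions about how the export might be defined, and the prefix 'AB' is only an example:

-- Illustrative sketch only: assumes the DRM export was saved to table X with these columns
SELECT NODE_NAME, NODE_DESCR, LEAF_FLAG
  FROM X
 WHERE NODE_NAME NOT LIKE 'AB%'                       -- rule 1: node prefix check
    OR (LEAF_FLAG = 'Y' AND LENGTH(NODE_NAME) <> 6)   -- rule 2: leaf nodes must be 6 characters
    OR (LEAF_FLAG = 'N' AND NODE_DESCR LIKE '%-%')    -- rule 3: limb descriptions must not contain hyphens
    OR (LEAF_FLAG = 'Y' AND NODE_DESCR IS NULL);      -- rule 4: leaf descriptions cannot be null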

Approach 3:

If I do not want to touch the DRM repository tables, and also do not want to create any new database table, there is still another way to generate notifications when offending data is found. Create a DRM export which generates a delimited flat file containing the data that violates the 4 validation rules, read this file directly via an ETL tool like ODI, and send it via mail.

This approach assumes that the DRM export can handle all the validation rules, which it usually can. Sending mail using ODI is pretty straightforward, but it is essential that I have access to the path where the DRM export file is generated, and ODI must read the correct file from that path.

Monday, September 19, 2016

Oracle BI Apps ODI integration with DRM - Single Child Multiple Parents Issue

Oracle Business Intelligence Applications is a very useful product for many organizations, and Oracle Hyperion DRM is often the master data management tool used for managing the master data, integrated through Oracle EBS.

So the master data flows from DRM --> EBS --> BI. But there is an interesting issue that crops up sometimes. The scenario: within the same hierarchy, the same segment value can belong to multiple parents in DRM and EBS (which is logically incorrect, but technically feasible). When such a scenario is integrated with BI Apps using ODI, the entire tree gets deleted (say X is the duplicate member, and the two occurrences of X roll up to a common node Y, where Y may or may not be the top node of the hierarchy), and users will not be able to see any segment values under that tree (i.e. under Y), since none of those exist at all in the Dimension Hierarchy table.

To handle this scenario as well as possible, I need to understand how the hierarchy is maintained in BI Apps. The flow of the data is as below:

EBS -> DW.W_GL_SEGMENT_HIER_PS -> DW.W_GL_SEGMENT_DHS -> DW.W_GL_SEGMENT_DH

Here DW represents the BI data warehouse schema. The table W_GL_SEGMENT_HIER_PS is not of much use to me, since the fields needed to capture the issue are not yet generated there. However, in W_GL_SEGMENT_DHS I get all the fields required to find the issue. The below query helps capture the error data (i.e. the value of X):

SELECT *
  FROM W_GL_SEGMENT_DHS DHS
 WHERE (INTEGRATION_ID, HIERARCHY_ID, HIERARCHY_VERSION, DATASOURCE_NUM_ID) IN
       (SELECT INTEGRATION_ID, HIERARCHY_ID, HIERARCHY_VERSION, DATASOURCE_NUM_ID
          FROM W_GL_SEGMENT_DHS
         GROUP BY INTEGRATION_ID, HIERARCHY_ID, HIERARCHY_VERSION, DATASOURCE_NUM_ID
        HAVING COUNT(1) > 1);

The table W_GL_SEGMENT_DHS is truncated and loaded every time during the loading process, and the deletion of the erroneous tree (i.e. Y) happens at the end. So I need to capture the error records after the load completes and before the deletion happens in W_GL_SEGMENT_DHS.

I have two options for this:

    1. Modify the ODI load plan to add a new step before the deletion happens - this step will save the duplicated error data in some table or send out an email notification with that information

    2. Create a database trigger which will execute before any deletion happens on W_GL_SEGMENT_DHS - this step will save the duplicated error data in some table (a sketch is given below)

This will help troubleshoot the hierarchy issues faster and makes it easier to follow up with the DRM team on any necessary corrective action.
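
For option 2, a trigger along the lines of the sketch below could preserve the duplicated rows before they disappear. The audit table W_GL_SEGMENT_DHS_DUP_AUDIT is an assumed name that would need to be created beforehand with the same structure as W_GL_SEGMENT_DHS, and the trigger fires on DELETE statements only (a TRUNCATE would bypass it):

-- Illustrative sketch only; the audit table name and structure are assumptions
CREATE OR REPLACE TRIGGER TRG_W_GL_SEGMENT_DHS_BDEL
BEFORE DELETE ON W_GL_SEGMENT_DHS
BEGIN
  INSERT INTO W_GL_SEGMENT_DHS_DUP_AUDIT
  SELECT *
    FROM W_GL_SEGMENT_DHS
   WHERE (INTEGRATION_ID, HIERARCHY_ID, HIERARCHY_VERSION, DATASOURCE_NUM_ID) IN
         (SELECT INTEGRATION_ID, HIERARCHY_ID, HIERARCHY_VERSION, DATASOURCE_NUM_ID
            FROM W_GL_SEGMENT_DHS
           GROUP BY INTEGRATION_ID, HIERARCHY_ID, HIERARCHY_VERSION, DATASOURCE_NUM_ID
          HAVING COUNT(1) > 1);
END;
/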

Sunday, September 18, 2016

3 Essential Sanity Checks for DRM EBS Integration

The integration between Oracle's amazing master data management tool DRM and Oracle EBS is a pretty stable process, provided the configurations and pre-requisites have been taken care of properly. But sometimes the concurrent program gives an authentication error or invalid credentials error, which can be exhausting to debug.

So we'll focus on the 3 most important places to check to make sure the setup is fine. This sanity check assumes SSO with Mixed Mode Authentication enabled. Let's say the name of the user is EBSIntegrationUser.

1. The user should be present in Weblogic Administration Server Console (port 7001 usually) in the below path:

        Home -> Summary of Security Realms -> myrealm -> Users and Groups -> EBSIntegrationUser

2. Let's say the name of the pool created on the authentication side is XYZ. The user should be present in Workspace (port 9000 usually) in the below path:

        Shared Services -> User Directories -> XYZ -> Users

Search for the value EBSIntegrationUser in the field 'User Filter'; the user should be present.

3. Finally, in the DRM Web Client, the EBSIntegrationUser user should be present in the below path:

        Administer -> Users

Once the user is present in all three places, the integration should work perfectly without any hiccups.