Tuesday, July 10, 2018

SNP_SESSION - The Data Analyst’s Dream Table of Oracle Data Integrator

The Oracle Business Intelligence Applications stack provides an array of tools during implementation, each with its own rich set of features. The real excitement comes when we work through several business use cases and scenarios, analyze the metrics and data, interpret them along the lines of the business process, and ultimately encounter product limitations - and then discover ideas that make our lives easier. In such cases, making breakthroughs through innovative, complex ideas becomes almost synonymous with happiness.

All of us are aware of the repository table SNP_SESSION in ODI - the unattractive component full of rows, numbers, and dates that often only helps us find a specific piece of information before we are done with it. In an environment where several overnight incremental loads consume around 6 hours daily, ODI generates a lot of log data and writes to all of the repository tables, including session-level details in SNP_SESSION. Every session's information - rows processed, rows inserted, rows updated, the period and filter variables used, duration, start time, and end time - is logged in SNP_SESSION.

To understand the prowess of SNP_SESSION, we need to start with a few questions, and then the train of thoughts and discoveries can follow. For a session I am interested in, what has its behavior been over the last 4 months? Does it show any pattern during specific periods? Does it have any relation to other sessions' attributes? Does the data volume or duration of a different session influence it? Since rows processed do not always impact session durations proportionately, does a % variance in a different session affect the session I am interested in, and to what extent? In today's load, can I find which scenarios from the past repeated today - say, with similar data volume or % variance in duration? Can I foresee untold information, or what is going to happen as the loads progress, through real-time analysis of the data?

It has been very exciting to learn over the last few weeks that all of the above questions can be answered using SNP_SESSION, with some help from SNP_LP_INST and SNP_LPI_STEP. We have implemented a solution that is now in its final testing phase with live data, and it will heavily complement manual human monitoring activities - by surfacing root causes before the impact happens, and by providing additional insights into the application that otherwise often get overlooked due to the vastness of the system.

We calculated the weighted average of each session's duration, load plan wise, over the last several months, with an algorithm that took almost a week to develop after a lot of brainstorming. Then came the perilous task of calculating the standard deviation of the weighted samples to help gauge the accuracy of our analysis, but it finally happened! Next came analyzing the data volume - with NB_ROW, NB_INS, and NB_UPD already available in SNP_SESSION and waiting for us. Comparing today's volume with the weighted average for the corresponding session started giving insights by itself, but we wanted more. We asked what next, what if, why now, what then - and each metric opened up new paths before us to explore.
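As a rough sketch of the idea (the actual weighting algorithm took a week to develop and is not reproduced here; the sample durations and weights below are made up purely for illustration):

```python
# Weighted mean and weighted standard deviation of session durations,
# as pulled from SNP_SESSION. Column semantics (durations in seconds)
# and the weighting scheme are assumptions for this sketch.
from math import sqrt

def weighted_stats(durations, weights):
    """Return (weighted mean, weighted standard deviation)."""
    total_w = sum(weights)
    mean = sum(d * w for d, w in zip(durations, weights)) / total_w
    var = sum(w * (d - mean) ** 2 for d, w in zip(durations, weights)) / total_w
    return mean, sqrt(var)

# Recent runs weighted more heavily than older ones (hypothetical weights).
durations = [3600, 3700, 3500, 5200]   # seconds, oldest to newest
weights = [1, 2, 3, 4]
mean, sd = weighted_stats(durations, weights)   # mean = 4230.0 seconds
```

The weighted mean then serves as the per-session baseline against which each new run is compared.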

Each field of SNP_SESSION gave rise to almost 3-4 metrics of its own, enabling real-time daily calculation and analysis of the datasets during executions, and of the impact they cause to the parent load plan. During the execution of each load, we receive insightful emails containing a detailed analysis of the load - and, if today's scenario matched a similar day in history, what happened then and how that day unfolded with the consequent activities.
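A minimal illustration of the kind of per-metric comparison involved - the numbers and the alert threshold here are hypothetical, not the tuned values from the actual solution:

```python
# Signed percentage deviation of today's metric from its baseline,
# e.g. today's NB_ROW against the weighted-average row count.
def pct_variance(today, baseline):
    """Percentage by which today's value deviates from the baseline."""
    return (today - baseline) / baseline * 100.0

today_rows, avg_rows = 1_250_000, 1_000_000   # sample figures
variance = pct_variance(today_rows, avg_rows)  # +25.0 %
flagged = abs(variance) > 20                   # threshold is an assumption
```

The same comparison applies equally to NB_INS, NB_UPD, and session duration, which is how a handful of SNP_SESSION columns fans out into many derived metrics.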

But again, we needed a single indicating factor, which gave rise to the calculation of probability - a single figure indicating, for each prediction, how likely our real-time prediction is to actually materialize. More brainstorming followed, and now every email is tagged with a probability that makes it so much more meaningful. It rings so true in today's world: data is only useful when we know how to process it to our benefit - and that will only become more the case in future!
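One simple way to produce such a probability figure - assuming session durations are roughly normal around the weighted average, which is an assumption of this sketch rather than the actual scoring model - is a two-sided tail probability:

```python
# Probability that a run deviates from the baseline at least as much as
# today's run, under a normal model built from the weighted statistics.
from math import erf, sqrt

def anomaly_probability(today, mean, sd):
    """Two-sided P(|Z| >= z) for z = |today - mean| / sd."""
    z = abs(today - mean) / sd
    return 1.0 - erf(z / sqrt(2))

# Using the sample weighted mean/sd from earlier: a 5200 s run against a
# 4230 s baseline with sd ~ 795 s is unusual but not extreme.
p = anomaly_probability(5200, 4230, 795.0)
```

A small value of p marks the run as a genuine outlier worth flagging, while a value near 1 means today looks like business as usual.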

Saturday, June 09, 2018

Sensational Sequoia

The exquisite Sierra Nevada, on the magnificent slopes of California, is home to some of the oldest living things on earth – the dazzling Giant Sequoia trees, some as old as 3,500 years. The stunning facts about these exquisite trees were as overwhelming to me as a 10-year-old as they are today – and the long-held dream of experiencing these breathtaking wildernesses came true a few days back.

Driving through Visalia and Three Rivers along the wild, fascinating roads uphill to the Giant Forest is an experience in itself that takes some time to sink in. The gradual change in the landscape around us as we approach is magnificent; the appearance of the trees, their heights, the feeling of suddenly shrinking is so electrifying – one just gets dazed.

The role of forest fires in the lifecycle of the Sequoia trees is interesting. With fewer natural forest fires today, the Park Service conducts controlled burns to remove competing vegetation and allow Sequoia seedlings to germinate – something they have difficulty doing otherwise. The Giant Forest Museum is a storehouse of gripping figures and facts – the Sentinel tree greets all visitors at the museum entrance, a statuesque giant over 250 feet high.

As we walked down from the parking lot via the Sherman Tree Trail, we could feel the energy and excitement around us – the weather felt perfect for visitors, not a surprise given that the Sequoias thrive best in humid climates. When we finally stood in front of the General Sherman Tree, the magnificence and awe were too much to absorb, and it took an hour of lingering around to really believe it was happening. A part of me felt complete. For me, it was always more than a tree. That day I realized it for real. The largest living tree – the familiar, impressive artistry we had craved to see for so long after reading about it in books – right before us.

The enchanting walks around the Sequoia forest, the rousing emotions, the captivating views, and the grace and elegance of trees surviving thousands of years while withstanding fires and storms turned it into a spellbinding, magical place – and an indelible weekend.

The wildlife comprises hundreds of black bears, rattlesnakes, mountain lions, and some friendly rodents. The visitor guide covers all the different situations our curious minds wandered to, and it was interesting to read.

Saturday, January 06, 2018

wss_username_token_service_policy use case for Oracle EBS and DRM Integration

The year 2017 was an incredible year, with tremendous ups and equal downs at both a professional and personal level. However, it has again helped me garner some very fruitful insights about everything around me and to plan a few things better ahead. This is not to undermine other years like 2016 or 2015; it is just that the impact of some of the incidents and decisions of 2017 will change my life forever. Let's see what 2018 has in store! Wishing you a very happy new year ahead!

Web services play an important role in the authentication process for the EBS and DRM metadata integration. A few months back, during the DRM repository movement, we came across a few challenges with the MDS schema database host info, which illuminated a few areas and paved the way for some more personal study. After the initial setup, once the oracle-epm-drm-webservices WSDL is up and running fine, we need to attach a security policy to this application. This ensures that clients like the program "Load Segment Values and Hierarchies" request a system-generated token from the WebLogic Server for the user (say EbsIntegrationUser), which can be passed to DRM. DRM can then validate that token with OID to verify authentication instead of requiring a username/password.
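For reference, the policy's name comes from the WS-Security UsernameToken profile. A minimal version of the SOAP header such a policy expects can be sketched as follows - the namespace URI is the standard WSS 1.0 one, and the credentials are placeholders, not real configuration:

```python
# Build a minimal WS-Security UsernameToken header, purely to illustrate
# the structure that wss_username_token_service_policy works with.
import xml.etree.ElementTree as ET

WSSE = ("http://docs.oasis-open.org/wss/2004/01/"
        "oasis-200401-wss-wssecurity-secext-1.0.xsd")

def username_token_header(user, password):
    security = ET.Element(f"{{{WSSE}}}Security")
    token = ET.SubElement(security, f"{{{WSSE}}}UsernameToken")
    ET.SubElement(token, f"{{{WSSE}}}Username").text = user
    ET.SubElement(token, f"{{{WSSE}}}Password").text = password
    return ET.tostring(security, encoding="unicode")

header = username_token_header("EbsIntegrationUser", "placeholder-secret")
```

In practice the SOAP client (not hand-rolled XML) injects this header, and OWSM enforces the policy on the service side.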

Oracle Web Services Manager (OWSM) needs to be deployed first in the same EPM server and domain where the DRM Web Service is deployed. The database repository schema name for OWSM is set to a different value and usually ends with *_MDS, which corresponds to the Metadata Schema.

Once done, the new policy set needs to be created in WebLogic under Farm_EPMSystem > WebLogic Domain > EPM System > Web Services > Policy Sets. Then, in Step 3, "Add Policy References", we need to select and assign wss_username_token_service_policy.

The detailed steps can be referred to here. Other policies can also be used depending on the scenario, but for this specific integration an authentication token suffices. Here are some more details related to authentication and the uses of web services.

The ultimate test is to make sure the token policy name is visible in the WSDL URL. If the policy attachment has gone through fine, it will be reflected there. Otherwise, there is another approach to attach the policy manually - a workaround used only in exceptional scenarios, which we faced a few months back.
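A quick sanity-check sketch of that test: fetch the deployed WSDL (for example with urllib) and look for the policy name in its text. The check itself is just a string match; the sample snippet below is illustrative, not taken from a real environment:

```python
# Confirm the OWSM policy reference shows up in the deployed WSDL.
# In a live check, wsdl_text would come from something like:
#   urllib.request.urlopen(wsdl_url).read().decode()
# where wsdl_url points at the oracle-epm-drm-webservices endpoint.
def policy_in_wsdl(wsdl_text, policy="wss_username_token_service_policy"):
    """True if the policy name appears anywhere in the WSDL document."""
    return policy in wsdl_text

sample = '<wsp:PolicyReference URI="oracle/wss_username_token_service_policy"/>'
attached = policy_in_wsdl(sample)   # True for this sample snippet
```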