Monday, August 07, 2017

A Trip to the 1850s

The last few months have been incredible. It isn't unusual to come across a great movie every now and then, but the impact of a great book is so much deeper. It all started with the sci-fi novel 11.22.63, which held me spellbound for days until the last line. Time travel has always fascinated me; I've lost count of how many times I've watched Back to the Future whenever it's on.

Then came a flood of amazing books on Native American history, along with visits to Dakota last year. Witnessing the struggle of man over time, it all starts to feel like so much more than mere facts. Every trip to a museum, every detail about the significance of each antique, says far more about the nature of man than about that period alone. It is how we suffered and how we understood our surroundings, cause and effect repeated over time, that shaped us into who we are today. We are blessed to have documented collections of historical events that can enrich our understanding of who we are, what we have done, and what we can do. It is not about the landscapes, monuments, attractions, or sceneries; it is all about the people and the cultures that have played a role in shaping today's world.

A recent visit to the historic site of Abraham Lincoln's home sparked several realizations, especially after reading the book Team of Rivals. It is amazing to learn how much one person could endure and tolerate in a single lifetime, and without letting it change the flow of life and the challenges it brings. The visit felt like icing on the cake after finishing a book with more detail than one could ever ask for.

It is always a pleasure to learn what we could never have known had we not lived in that time. Maybe this sounds like too much of an obsession with the past, but perhaps we really do need to know the past well to understand ourselves and create a better future. I really do not want some things in history to repeat themselves!

Wednesday, May 24, 2017

OBIEE Dashboard Export to Excel PDF HTML using Custom Links

Oracle Business Intelligence Applications, in its out-of-the-box form, does not provide any way to export a dashboard from another dashboard using custom links. But with the available resources, it is possible to create custom links that export a dashboard in Excel, PDF, or HTML format. To do this we need a few snippets of HTML and JavaScript, and we can add some cool animations as well if need be.

To achieve our goal, we first need the name of the dashboard and of the tab we will use to create our extracts. Once that is ready, the HTML and JavaScript below will create a few export links that extract exactly what we want.

<div id="exports_hdr" onmouseover="document.getElementById('exports_dtls').style.display = 'block'; document.getElementById('exports_hdr').style.display = 'none'">
<font color = "mediumblue" size = "2"><u>Export</u></font>
</div>
<div id="exports_dtls" style="display:none">
<img style="vertical-align:middle; margin:0px 0px 0px 5px" alt="" src="res/sk_blafp/catalog/exporttoexcel_ena.png"><a name="ReportLinkMenu" title="Export to different format" href="javascript:void(null)" onclick="return saw.dashboard.exportToExcel('<<dashboard_name>>', '<<dashboard_to_be_exported>>', true);"><font color = "mediumblue" size = "2">Excel 2007+</font></a>
<img style="vertical-align:middle; margin:0px 0px 0px 5px" alt="" src="res/sk_blafp/catalog/printtopdf_ena.png"><a name="ReportLinkMenu" title="Export to different format" href="javascript:void(null)" onclick="return PortalPrint('<<dashboard_name>>', '<<dashboard_to_be_exported>>', true, 'pdf');"><font color = "mediumblue" size = "2">PDF</font></a>
<img style="vertical-align:middle; margin:0px 0px 0px 5px" alt="" src="res/sk_blafp/catalog/printtohtml_ena.png"><a name="ReportLinkMenu" title="Export to different format" href="javascript:void(null)" onclick="return PortalPrint('<<dashboard_name>>', '<<dashboard_to_be_exported>>', true, 'html');"><font color = "mediumblue" size = "2">HTML</font></a>
</div>

Once this is working fine, it will show a cool 'Export' link, and on mouse hover it will instantly reveal three export links: Excel, PDF, and HTML. Note that the exports_dtls div must be a sibling of the exports_hdr div, not nested inside it; otherwise hiding the header would hide the links as well.

Let me know if you have any other ideas to export in different formats!

Sunday, January 08, 2017

DRM Data Loading Automation using ODI

The Oracle Hyperion Data Relationship Management application is a pretty flexible tool, and most activities that can be done manually can also be automated (using ETL tools like ODI, Informatica PowerCenter, etc.). Recently one of my readers presented me with a business scenario that is pretty interesting yet tricky. Yes, as you might have guessed, it involves a request to automate a manual process.

The requirement goes as below:

The source systems are DWH and HRMS; the data/master data will arrive through staging tables into DRM. The source systems will send the full data set every month to the staging table, from where DRM has to pick it up. The changes detected in the table (interim changes) should first go to the business user in an email, and once the reply is OK, they should be incorporated into DRM and then published to downstream applications.

To proceed with this activity, first and foremost we need to be familiar with DRM action scripts, which here means generating Add, AddInsert, and Move scripts. It also helps to have an ETL tool in our environment (like ODI, Informatica PowerCenter, etc.); otherwise we can use plain SQL to achieve our purpose, as we will see next.

Approach 1:

Since the source systems DWH and HRMS send the full dataset every month, there needs to be a mechanism to detect the changes arriving at the staging tables on the DRM end. This is where the Changed Data Capture (CDC) feature of ODI, or the equivalent feature of any other ETL tool, comes in handy. Change detection can also be achieved with standard SQL, flagging the records that have changed or been inserted (say U/I) accordingly.
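The SQL-based change detection described above can be sketched as follows. This is a minimal illustration using an in-memory SQLite database; the table and column names (stg_prev, stg_curr, member, parent) are assumptions for the example, not the actual staging schema:

```python
import sqlite3

# Illustrative schema: current full load vs. previous month's snapshot.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_prev (member TEXT PRIMARY KEY, parent TEXT)")
cur.execute("CREATE TABLE stg_curr (member TEXT PRIMARY KEY, parent TEXT, flag TEXT)")
cur.executemany("INSERT INTO stg_prev VALUES (?, ?)",
                [("N1", "P1"), ("N2", "P1")])
cur.executemany("INSERT INTO stg_curr (member, parent) VALUES (?, ?)",
                [("N1", "P2"),   # moved under a new parent -> flag U
                 ("N2", "P1"),   # unchanged -> no flag
                 ("N3", "P1")])  # brand new member -> flag I

# Flag inserts: members present now but absent in last month's snapshot.
cur.execute("""
    UPDATE stg_curr SET flag = 'I'
    WHERE member NOT IN (SELECT member FROM stg_prev)
""")
# Flag updates: members whose parent changed since last month.
cur.execute("""
    UPDATE stg_curr SET flag = 'U'
    WHERE member IN (SELECT p.member FROM stg_prev p
                     JOIN stg_curr c ON c.member = p.member
                     WHERE c.parent <> p.parent)
""")
conn.commit()

flags = dict(cur.execute("SELECT member, flag FROM stg_curr"))
print(flags)  # {'N1': 'U', 'N2': None, 'N3': 'I'}
```

In a real environment, the same two UPDATE statements would run against the DRM staging tables, or be replaced entirely by the ODI CDC journalizing mechanism.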

This set of records can be sent as an email (with an attachment if the data volume is too big) to the respective business users for approval. Let's say we have a field called "APPROVED" in our table, defaulted to N for all these records. Once approval comes from the users, the IT team manually sets this field to Y for the approved changes. Until then, these records sit idle in the staging table without propagating to DRM. Unfortunately, this manual flagging step cannot be avoided, since there is no integration yet between DRM and the email server.
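The manual approval flagging amounts to a simple UPDATE against the staging table. A sketch of that step, again with illustrative table and column names (stg_curr, APPROVED), and assuming 8 of 10 members were signed off:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# APPROVED defaults to 'N' so nothing flows to DRM without sign-off.
cur.execute("""CREATE TABLE stg_curr
               (member TEXT PRIMARY KEY, flag TEXT,
                approved TEXT DEFAULT 'N')""")
cur.executemany("INSERT INTO stg_curr (member, flag) VALUES (?, ?)",
                [("N%d" % i, "I") for i in range(1, 11)])

# The IT team flags only the members the business approved by email.
approved_members = ["N1", "N2", "N3", "N4", "N5", "N6", "N7", "N8"]
cur.executemany("UPDATE stg_curr SET approved = 'Y' WHERE member = ?",
                [(m,) for m in approved_members])
conn.commit()

counts = dict(cur.execute(
    "SELECT approved, COUNT(*) FROM stg_curr GROUP BY approved"))
print(counts)  # {'N': 2, 'Y': 8}
```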

Now, once the records are flagged Y (say 8 out of 10 got approved, so we have 8 Y's and 2 N's), they are considered for the next steps of processing: creating the action scripts for DRM, i.e. the Add and Move scripts. Assuming the DRM version and hierarchy already exist, once the scripts are formatted as required they are ready to be loaded into DRM. If the target txt/csv action script file is not on a server DRM can read from, it has to be SFTPed to that location. Alternatively, we can use SQL Loader to fetch the data from the action script table to the DRM server and then schedule Windows batch scripts to load the action script. There are quite a few ways this can be done, and it all depends on the environment setup we have in place.
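The action script generation step can be sketched as below. Note that the pipe-delimited column layout (Action|Version|Hierarchy|Parent|Node) is only illustrative; the exact columns an action script needs vary by DRM version and property setup, so this should be matched against the format your installation expects:

```python
# Turn approved staging records into pipe-delimited DRM action script rows.
# New members (flag I) become Add actions; reparented members (flag U)
# become Move actions. Unapproved rows stay behind in staging.
def build_action_script(records, version, hierarchy):
    lines = []
    for rec in records:
        if rec["approved"] != "Y":
            continue  # still awaiting business approval
        action = "Add" if rec["flag"] == "I" else "Move"
        lines.append("|".join([action, version, hierarchy,
                               rec["parent"], rec["member"]]))
    return lines

records = [
    {"member": "N1", "parent": "P2", "flag": "U", "approved": "Y"},
    {"member": "N3", "parent": "P1", "flag": "I", "approved": "Y"},
    {"member": "N4", "parent": "P1", "flag": "I", "approved": "N"},
]
script = build_action_script(records, "Current", "MainHierarchy")
print("\n".join(script))
# Move|Current|MainHierarchy|P2|N1
# Add|Current|MainHierarchy|P1|N3
```

The resulting lines would then be written to the txt/csv file (or action script table) that the downstream load step picks up.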

Approach 2:

If we want to keep the DRM staging table untouched by manual updates of the Y and N flags, we can fine-tune our approach to a control-file-based design, where the approved member names are kept in a file that is used for lookup. So, say N1 and N2 are approved; the file will contain the data below:

X,N1
X,N2

Here X denotes the batch id, which should be unique for every load. (Yes, we need to create a "BATCH_ID" field in the DRM staging table, populated from a sequence-generator type of ID for every record flagged U/I in that specific load run.)

Let's see why we need this BATCH_ID field. Suppose our load runs tomorrow and detects a change for the same member N1; if it finds N1 already in the file, that change would get processed without waiting for approval. Since the batch id in the DRM staging table changes for every run, unapproved changes to the same member cannot flow through: the next steps check that both the BATCH_ID and the member name are present in the control file. Instead of a file, this can also be a small table with only two columns, on which the IT team has update/delete privileges to manage it daily.
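The batch-aware lookup described above can be sketched as a small function. The control file format (batch id, comma, member name) follows the sample file shown earlier; the function names are illustrative:

```python
# Approve a change only when BOTH the current batch id and the member name
# appear in the control file, so an approval from a previous batch cannot
# let a new, unapproved change of the same member flow through.
def load_control_file(lines):
    approved = set()
    for line in lines:
        line = line.strip()
        if not line:
            continue
        batch_id, member = line.split(",", 1)
        approved.add((batch_id.strip(), member.strip()))
    return approved

def is_approved(batch_id, member, approved):
    return (batch_id, member) in approved

control = load_control_file(["X,N1", "X,N2"])
print(is_approved("X", "N1", control))   # True: approved in this batch
print(is_approved("Y", "N1", control))   # False: same member, new batch
```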

This approach gives the IT team safer control and avoids the risk of accidentally modifying sensitive application tables or objects. In any case, working with DRM is an extremely cautious activity day in and day out, where a simple typo can have widespread implications for multiple downstream systems, so minimizing risk and avoiding manual errors should be part of any design.

How do you prefer to automate your DRM data loading processes?