Friday, March 25, 2011

Simplified Essbase Administration - Part III

Have multiple applications? Not sure how to transfer or import data between applications that contain different members/dimensions?

If you need to map multiple members from a source application to an equivalent member in a target application, you must first define the source members.

I. Define Source Members

Option 1 – Shared alternate hierarchy

       a. Create a new parent member with the same member name as the target application's member, and set its consolidation property to ignore to prevent duplication of data
       b. Add the existing members as shared children of the new parent member

- Requires additional maintenance of the dimension structure
- Does not compromise data retrieval
- May increase calculation time

We all want our end users to be self-sufficient, and I think they can figure out the mapping once they drill down the hierarchy. I have used alternate hierarchies in large applications and have yet to see them compromise calculation time.
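A sketch of what the shared alternate hierarchy might look like in the source outline (the member names here are hypothetical):

```
Entity
|- East_Region              <- existing primary hierarchy
|  |- 1234
|  |- 1235
|- Target_4567 (~)          <- new parent named as in the target app, consolidation set to ignore
   |- 1234 (Shared)         <- existing members added as shared children
   |- 1235 (Shared)
```

Because the new parent is tagged ignore, the shared children roll up only through the primary hierarchy, so no data is double-counted.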

Option 2 – User Defined Attributes

       a. Assign UDAs to the source members
       b. Create a new member and set its consolidation property to ignore to prevent duplication of data
       c. Develop a calc script that fixes on the UDA and populates the sum of the UDA-tagged members into the new member

- Requires additional member property maintenance
- Does not compromise data retrieval
- Does not compromise calculation time

If data is constantly restated, reclassed, or adjusted, you must remember to recalculate. Throw in a batch script and call it a day. I'm just saying.

Now, the sophisticated/advanced Essbase user will cleverly filter their retrievals on UDAs and magically list the accounts that make up the mapping.
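A minimal calc script along these lines (the UDA name "MAP_4567" and the member names are hypothetical):

```
/* Sum all accounts tagged with the MAP_4567 UDA into the new member */
FIX (&Cur_Month, "Actual")
   "Target_4567" = @SUM(@UDA("Account", "MAP_4567"));
ENDFIX
```

Because the calculation is confined to one member, re-running it after a restatement is cheap and easy to batch.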

Option 3 – Attribute Dimension

        a. Create attribute dimension property

The attribute dimension gives you the same results as the shared alternate hierarchy and behaves similarly to UDAs. What is the difference? Attributes may significantly slow report retrievals.

Which one is the best option?

Every environment is different. Performance is significantly dependent on the application configuration and I strongly recommend you test.

II. Data transfer and Data Export Options

Now, we are ready to explore options on how to populate data from the source application to the target application. The options are as follows:

Option 1 – @XREF – built-in function that transfers data from the source to the target application via calc script

- Excellent option for consistent structured hierarchies
- Leverages built-in calculation functions
- Data flows seamlessly if the structures are one and the same
- Can be easily batched

If you fix at level 0 in a common dimension, i.e. @LEVMBRS("Entity",0), and it is not an exact mirror of the source application, data will be duplicated at the levels where the dimension hierarchies differ. Therefore, the member names and levels must be identical.

The more granular the FIX statement, the faster the performance.
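For example, a calc script in the target application might pull Actual data from the source cube like this (the location alias "RPTG_Fin" and the member names are hypothetical; the alias must first be defined with "create location alias" in MaxL):

```
/* Pull level-0 Actuals from the source cube via a location alias */
FIX (@LEVMBRS("Entity", 0), @LEVMBRS("Account", 0), &Cur_Month)
   "Actual" = @XREF(RPTG_Fin, "Actual");
ENDFIX
```

Keeping the FIX at level 0 of identical dimensions avoids the duplication described above and keeps the transfer fast.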

Option 2 – Report Script – Exports data to a flat file (see Simplified Essbase Administration Part II)

- Export data for specific alias tables, UDAs, or levels
- Specify export column/row format
- Exclude members from a dimension
- Rename members
- Can be easily batched

Report scripts perform efficiently when focused on smaller data sets. Larger data sets may take longer to process.

The built-in rename command works brilliantly if you need to map specific members to a different dimension member in the target application, i.e. source member 1234 must map specifically to target member 4567 and COLA: {"RENAME" "1234"} "4567 COLA". This option is best when the mapping is consistent yet unique to specific members in a different dimension. The resulting data file can be easily parsed using the Split function within a data load rule.

Option 3 - DataExport – Exports data to a flat file

- Exports data to text file extremely fast
- Leverages built-in calculation functions
- Can be easily batched

DataExport outperforms report scripts. It is engineered to export data in the order it is organized in the database, whereas report script performance depends on coding and may retrieve the same blocks multiple times.

DataExport does not support attributes, aliases, member renaming, or data format options.
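A minimal DATAEXPORT calc script might look like this (the file path and member names are hypothetical):

```
/* Export level-0 Actual data for the current month to a comma-delimited file */
SET DATAEXPORTOPTIONS
{
   DataExportLevel "LEVEL0";
   DataExportColFormat ON;
   DataExportOverwriteFile ON;
};
FIX ("Actual", &Cur_Month)
   DATAEXPORT "File" "," "C:\Hyperion\Exports\Finance_Lev0.txt" "#MI";
ENDFIX
```

Like any calc script, this can be fired from MaxL in a batch, which makes it a natural fit for scheduled month-end exports.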

Every environment is different. Performance is significantly dependent on the application configuration and I strongly recommend you test.

This is the beauty of Essbase: the options are at your disposal.
Ponder it; it is up to the individual to creatively leverage the tool.

Saturday, January 22, 2011

Simplified Essbase Administration – Part II

Data export via Essbase Report Script Object

Not sure how to export data for a specific alias table or UDA, exclude members from a dimension, or rename a member for transformation purposes?

Below is a simplified "Basic Master Report Script". It is as simple as it looks.
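A sketch of such a report script (the commands are standard report script commands; the member names are placeholders):

```
// Basic master report script - exports tab-delimited level-0 data
{TABDELIMIT}            // tab-delimited output
{SUPMISSINGROWS}        // suppress #Missing rows
{ROWREPEAT}             // repeat row member names on every row
{NOINDENTGEN}           // no indentation by generation
<SUPSHARE               // exclude shared members
{OUTALTNAMES}           // output alias names instead of member names
<COLUMN ("Period")
<ICHILDREN "YearTotal"
<ROW ("Account", "Entity")
<DIMBOTTOM "Account"    // level-0 accounts
<DIMBOTTOM "Entity"     // level-0 entities
!
```

Swap {OUTALTNAMES} for {OUTMBRNAMES} to export member names, or add {SUPSHARE} variants and member selections as your mapping requires.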

It is all in the DBAG. The DBAG is your friend, and remember to keep it simple.

Saturday, October 16, 2010

Simplified Essbase Administration - Part I

Why is it that many organizations have invested millions in financial analytical tools that not only offer visibility into their data but also promise business process automation, yet I hear the term "MANUAL" frequently these days, e.g. manual sourcing and transformation, manual loads, manual validation, manual e-mail notification of errors, manual migration, manual report delivery, and manual database object updates? I am exhausted from listening to these manual tasks... are you?

I'd like to believe this unfortunate state of manual business process administration is due to the current economic conditions, which have forced organizations to eliminate or consolidate functional roles and cut back on training and development, resulting in poor business system administration, poor data analysis, and poor reporting.

As a developer and system administrator, the good news is that all of these manual processes can be automated. The key to successful business application automation is as easy as going back to the basics:

- Business Acumen - What is the nature of the business?
- Data Process Flow - Where does the data source reside and what are the dependencies?
- Customer Needs - What do customers need to make effective business decisions?
- Resources - What are their strengths and where can you leverage their talent?
- Oracle EPM Analytical and Reporting tools – What are the strengths of the application?

It is easy to lose sight of the basics when one is wearing many hats. Understanding the strengths of the Oracle EPM applications will give you a clearer view and understanding of:

- How you can align business knowledge to begin the build of a database
- How you can leverage the tools to source and transform data
- How you can align your resources
- How you can leverage the strengths of the application and minimize the burden of administration
- How you can deliver data and reporting in a timely manner to meet your customers' needs

If you are using multiple Essbase and Planning applications, you can significantly reduce the burden of administration by automating database substitution variables, data loads, and data transfers between multiple cubes, leveraging the @XREF function, report scripts, and/or DataExport via MaxL:

I. Automating Substitution Variable Updates
  a. Define Global Variables - Create a batch file that contains the names of your analytic servers, databases, log directories, and load rules as follows:

REM        This file sets Global variables to be used for all Month End administration functions
REM        i.e Sub Variable, Data Load, Dimension Build, Calculation Scripts and Report Scripts

Set Log_Dir=C:\Hyperion\Automation\Admin\Jobs\RPTG\Logs\
Set Log_Dir2=C:\Hyperion\Automation\Admin\Jobs\Plan\Logs\
Set RPTG_Process_Dir=C:\Hyperion\Automation\Admin\Jobs\RPTG\Bat
Set PLAN_Process_Dir2=C:\Hyperion\Automation\Admin\Jobs\Plan\Bat
Set RPTG_LoadBat_Dir=C:\Hyperion\Automation\Admin\Jobs\RPTG\Bat\LoadFile_Bats\
Set PLAN_LoadBat_Dir=C:\Hyperion\Automation\Admin\Jobs\PLAN\Bat\LoadFile_Bats\
Set Fin_Server=
Set Plan_Server=
Set Database=Finance

II. Substitution Variable Batch Script –

a. Create a central batch file that contains all of your substitution variables for all applications as follows:

REM *********************************************************************
REM     UPDATE FORECAST SUB VARIABLE
REM *********************************************************************

Set Load_Year=FY10

b. Create a MaxL file directory that contains all of your global MaxL scripts, i.e. c:\Hyperion\Automation\Mxl\Set_SubVar.mxl

/* Global script - can be used for any application */
/* Positional parameters (passed from the batch script):            */
/*   $1 = log file   $2 = user   $3 = password   $4 = server        */
/*   $5 = application   $6 = database   $7 = variable   $8 = value  */

spool stderr on to "$1";
set message level error;
login $2 $3 on $4;
iferror 'ErrHandler';
/* Drop and re-add the variable so the set below starts clean */
alter database $5.$6 drop variable "$7";
alter database $5.$6 add variable "$7";
iferror 'ErrHandler';
alter database $5.$6 set variable "$7" "$8";
iferror 'ErrHandler';
spool off;
define label 'ErrHandler';
logout;
exit;

c. Create a batch script to update substitution variable as follows:

REM **********************************************************************
REM    SetDB_SubVars.bat
REM    Special Instructions:
REM    Update Monthly_SubVariable.bat with current month
REM    variables prior to execution
REM **********************************************************************

REM *********************** SET VARIABLES SECTION *************************

Call C:\Hyperion\Admin\Jobs\Bat\Monthly_SubVariable.bat
Call C:\Hyperion\Admin\Jobs\Bat\Global_System_Variables.bat
Set Load_Dir=C:\Hyperion\Admin\Jobs\RPTG_Load\Data\%Load_Year%_%Load_Cur_Month%\

REM *********************** UPDATE RPTG APP SUB VARIABLE SECTION *********

Set Application=%App1%

REM ***************************************
REM     Update Current Month Sub Variable
REM ***************************************

Set SubVar=Cur_Month
Set SubVal=%Load_Cur_Month%

C:\Hyperion\AnalyticServices\Bin\essmsh.exe C:\Hyperion\Automation\MXL\set_subvar.mxl %Load_Dir%Set%Application%_%SubVar%.txt  %USER% %PSWD% %Server% %Application% %Database% %SubVar% %SubVal%

REM ***************************************
REM     Update Forecast Sub Variable
REM ***************************************

Set SubVar=FCST
Set SubVal=%FCST_SCEN%

C:\Hyperion\AnalyticServices\Bin\essmsh.exe C:\Hyperion\Automation\MXL\set_subvar.mxl %Load_Dir%Set%Application%_%SubVar%.txt %USER% %PSWD% %Server% %Application% %Database% %SubVar% %SubVal%

III. Audit Trail –

a. Not certain whether the substitution variables were updated? You can either log on to the AAS console or check the log file in the Load Directory initiated in the SetDB_SubVars.bat file to verify that the substitution variable updated successfully. The results should appear in the log file as follows:

MAXL> display variable on database RPTG.Finance;

application  database  variable  value      (the log file displays pre- and post-update values)

RPTG         Finance   Cur_Mo    Jul
RPTG         Finance   Cur_Yr    FY10
RPTG         Finance   FCST      FCT_AUG

MAXL> alter database RPTG.Finance set variable Cur_Mo "'Aug'";

MAXL> display variable on database RPTG.Finance;

application  database  variable  value

RPTG         Finance   Cur_Mo    Aug
RPTG         Finance   Cur_Yr    FY10
RPTG         Finance   NEW_FCST  FCT_SEP

Tuesday, August 3, 2010

Dodeca Review

It is clear that there is still an outcry and a desperate need to reduce cost, streamline processes, implement robust controls and, most importantly, leverage existing technology to facilitate the demands of the finance organization.

If we conducted a survey to identify where most organizations continue to invest their dollars, my guess would be IT. The question is: has it grown so out of control that it hampers the financial analyst team, who now have to rely on additional resources to provide the information they need for forecasting, budgeting, audits, special requests, etc.?

Have you analyzed how much time your resources spend gathering data?

What medium do analysts use to gather information from Oracle EPM, ERP, and other relational databases once the data has finally become available to them?

The frequent answer is: EXCEL   :-O

Why do financial organizations spend so much of their time gathering data in this advanced era of information technology, and not enough time analyzing the data thoroughly? So now the data finally becomes available; unfortunately, month-end reporting, forecasting, or budgeting season creeps in, and the "analyst" puts the analysis on hold until after the madness is over. But wait: they can't complete the analysis, because it is now quarter- or year-end reporting. Their time is not utilized efficiently, and definitely not effectively.

Who is accountable for the integrity of the data? Is there an opportunity to improve controls within both the IT and finance organizations? There is too much room for error, and what, or how much, would that error cost you?

I have been in financial reporting and financial systems roles for over a decade, and I can relate to both IT and finance organizations' frustrations. I am alerting you that this is not a sales pitch, and I am not one who is easily convinced; those who know me will strongly agree.

Tim Tow from Applied OLAP, Inc. was kind enough to present Dodeca to me. A one-hour presentation turned into a two-hour webcast meeting. Dodeca is designed to connect to any relational database via JDBC, and to Essbase, through a front-end spreadsheet that "looks and acts" just like Excel. Tim's team designed front-end spreadsheet software with drill-down capabilities to multiple relational sources, including Essbase and other third-party applications. If you want to take it to another level, you can go as far as drilling down to view approved invoices. This was the highlight of the presentation in my mind. Needless to say, the data retrieval was seamless.

As I asked questions and discussed the real challenges finance organizations face today, I became highly convinced that Dodeca will solve most of the challenges your finance organization faces in gathering and analyzing data from multiple sources. Below are a few examples that I have encountered or witnessed within the finance organization:

Time-Consuming Process                          Dodeca Benefit
----------------------------------------------  ---------------------------------------------------------
Gathering data for audits                       Drill back to the source system instantly via spreadsheet
Providing hard copies of invoices               Drill back to the source system instantly via spreadsheet
Validating analytic, planning, and reporting    Drill back to the source system instantly via spreadsheet
data against the source system
Commentary notes                                Commentary notes are stored within relational/Essbase
                                                cubes and can be retrieved instantly via Excel

In summary, Dodeca is cost-effective, leverages existing technology, and, in my opinion, addresses the concerns and frustrations that most finance organizations face today. I am an advocate for streamlining processes, implementing controls, leveraging existing systems, and reducing cost; then again, who isn't? ROI should not consider solely the business's bottom line; most importantly, it should account for your most valuable investment, and that is people. Your analysts will appreciate the investment, and they will be able to utilize their time effectively.

See the Cantor Fitzgerald Dodeca case study.
