Wednesday, December 23, 2009
What's the right level of detail to include in Hyperion Planning?
Often, a client's first instinct is to include the level of detail that exists within their ERP system for actual data. When confronted with this desire/requirement, I like to encourage spirited conversation between project sponsors and subject matter experts by having them address the following two items.
1) Does the majority of the user base actually want to plan/budget at the same level of detail at which actual data is captured?
2) Does this level of detail for formulating a plan/budget coincide with management’s performance management objectives?
I have found that the best practice for the level of detail contained within a planning application (or applications) is for it to reflect management's performance management objectives. Detail for detail's sake wastes resources, but arbitrarily limiting the level of detail is not a best practice either, if it prevents analysis of important factors that must be controlled.
If the driving desire to include this level of detail in the Planning application is only to facilitate analysis against actual data, and business owners have no desire to enter data against each and every travel expense account, for example, then other alternatives exist that will not encumber the Planning application(s).
Successful Planning implementations provide the level of detail within the planning model that business owners are accustomed to and desire in order to formulate their Plans/Budgets. Meaningful reporting and analysis may often require more detail than users wish to plan at. This disconnect can be addressed through properly architected reporting applications, partitioning, or drill-through.
When addressing the level of detail to include within a Planning application, answering the two fundamental questions above has led me to successfully architect and implement in excess of 30 Planning applications. More often than not, clients who are really pushed to consider what is necessary from a Plan/Budget perspective versus what is necessary from a Reporting/Analysis perspective will arrive at a Planning application that is more summary level.
While meeting management's performance management objectives for Reporting and Analysis shouldn't be ignored, encumbering users with unnecessary levels of detail within a Plan/Budget only introduces potential issues with performance, maintenance, and end-user satisfaction.
The mantra that I, along with many other seasoned EPM professionals, subscribe to when asked to architect a Planning application is this: “Provide the level of detail necessary to formulate the Plan. Planning should be used as a means to facilitate Plan/Budget vs. Actual analysis, but if the level of detail at which this analysis occurs is beneath the level of Plan formulation, then the analysis should be done outside of Planning (i.e., Essbase, partitioning to a reporting app, drill-through).”
Wednesday, December 16, 2009
Oracle partners with BICG on BI Applications Hands-on Workshop
BICG partnered up with Oracle to conduct a hands-on Oracle Business Intelligence Applications workshop for Oracle clients.
Oracle provided the meeting space along with a presentation covering the latest and greatest view of the technology and the business value gained when implementing it. This allowed all attendees to see firsthand what is happening and what is coming. You had to be there!
As the only recipient of the coveted Oracle BI/EPM Partner of the Year designation, awarded in association with the Oracle Innovation Awards at this year's Oracle OpenWorld, Minneapolis-based BICG was selected to provide the "hands-on" training. BICG has long been recognized as a top performer when it comes to delivering "experience based" success in both product training and project design, development, and implementation. BICG also presented to the group on "Developing a Corporate Business Intelligence Strategy"; the presentation directly addressed potential pitfalls and recommendations surrounding Business Intelligence projects. BICG chose 30-year BI veteran Steve Tinari to lead the training and make the BI Strategy presentation.
Registration for the workshop was required and limited to 40 seats, all the space there was in the meeting room. Registration sold out in days, and a wait-list quickly developed; at last look, 25 clients were waiting to claim a seat.
BICG has been asked to continue conducting this hands-on Oracle BI Apps workshop at strategic locations across the US and abroad. Registration for upcoming workshops can be found at http://www.biconsultinggroup.com .
BICG Workshop Advice: register early for the upcoming events and don't forget to bring your laptop. BICG uses a virtual training method that allows you to connect to their environment using only a web browser!
Wednesday, December 9, 2009
Oracle OpenWorld Video: Spotlight on Oracle BI at LCRA
During Oracle OpenWorld, Keith Niesner from the Lower Colorado River Authority (LCRA) provided an overview of business intelligence at his organization.
Niesner highlights the use of Oracle Business Intelligence at LCRA while detailing the challenges faced during deployment and lessons learned.
Friday, December 4, 2009
GL Segments in the BI 7.9.6 GL Balance Sheet subject area
The out-of-the-box OBIEE 7.9.6 GL Balance Sheet subject area provides for segment reporting. A segment reporting hierarchy gives users a drag-and-drop way to produce segment reports. Segments can be any subset of a dimension, but in our example let's use the set of accounts in the Gross Margin report.
The value of managing the hierarchies for the segment is that multiple levels of the segment can be created, from most aggregated to most granular, and can be saved and exposed for drag-and-drop from the presentation layer in Answers and the Office Add-Ons. For example, a user would be able to take Level3 of the Gross Margin dimension and the Ending Amount from the fact table and get the Gross Margin report. Level4 can be just slightly different from Level3 for some reason specific to Gross Margin reporting, and that is perfectly fine, since this hierarchy is dedicated specifically to Gross Margin reporting. The user should also be able to take Level1 and then drill down through the levels to see further breakouts of the gross margin lines. Expense might break down at lower levels of the hierarchy into Indirect and Direct, and then, at further levels, into more granular groupings of accounts.
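The level-based drill idea above can be sketched in a few lines of Python. This is a minimal illustration only; the account codes, group names, and amounts are invented, not taken from the BI Apps model.

```python
# A sketch of a dedicated Gross Margin hierarchy: each granular GL account
# maps up through named levels, so dragging any level into a report yields
# consistent aggregates. All codes and figures are hypothetical.

GROSS_MARGIN_HIERARCHY = {
    # account: (Level3 group, Level2 group, Level1 group)
    "4000-Product-Revenue": ("Product Revenue", "Revenue", "Gross Margin"),
    "4100-Service-Revenue": ("Service Revenue", "Revenue", "Gross Margin"),
    "5000-Direct-COGS":     ("Direct Cost",     "Expense", "Gross Margin"),
    "5100-Indirect-COGS":   ("Indirect Cost",   "Expense", "Gross Margin"),
}

def rollup(balances, level_index):
    """Aggregate account ending amounts to the chosen hierarchy level
    (0 = Level3, most granular grouping; 2 = Level1, most aggregated)."""
    totals = {}
    for account, amount in balances.items():
        group = GROSS_MARGIN_HIERARCHY[account][level_index]
        totals[group] = totals.get(group, 0) + amount
    return totals

ending_amounts = {
    "4000-Product-Revenue": 500_000,
    "4100-Service-Revenue": 200_000,
    "5000-Direct-COGS":    -300_000,
    "5100-Indirect-COGS":  -100_000,
}
print(rollup(ending_amounts, 1))  # Revenue vs. Expense breakout
print(rollup(ending_amounts, 2))  # the Gross Margin total
```

Because the hierarchy is dedicated to this one segment, its level definitions can be tailored to Gross Margin reporting without disturbing the main Account hierarchy.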
It is interesting to see that the GL Balance Sheet subject area delivers 10 different account hierarchies, on the assumption that there will be a need for many hierarchies for various segments in reporting. The hierarchies are labeled GL Segment1 through GL Segment10.

What's the value of hierarchy management?
Hierarchy management introduces the notion that hierarchies need be neither fixed, as is often seen in OLTP systems, nor unmanaged, as ad-hoc end-user reporting needs might suggest. In fact, hierarchies are so important that they deserve any number of iterative “what-if” scenarios to quickly get at the truth hidden behind their aggregates. To accomplish this, the hierarchy is abstracted away from both the operational and analytical (i.e., OLAP) systems. Since most data distills into groups with lists of members, the art and science of manipulating these taxonomies becomes its own managed system, improves analysis of business processes, and also enforces data quality. Hierarchy management has become more widely recognized as expectations have increased around consolidated reporting, service-oriented architecture, and data integrations. In these environments, not only do different account numbers in underlying systems need to be resolved, but apples-to-apples reporting is needed, where accounts are segmented using hierarchies. Different roles in the organization need different hierarchies, a point Eric Kavanagh makes in a TDWI Research article.
Implementation Implications
Exposing multiple hierarchies helps unlock the advantages of managed hierarchies in OBIEE. The 11g release will bring some improvements in the presentation of hierarchy levels, but this does not solve the whole problem. The challenge I have found in implementation is twofold. If hierarchy management is not effective, it is difficult to arrive at good definitions of the levels for segments such as a Gross Margin segment. But in the past the bigger problem has been the decision of whether to expose a hierarchy at all in OBIEE. It was difficult to convince an organization to expose a hierarchy called Gross Margin, since all the accounts were already in the Account hierarchy. The problem was serious, though: the levels were not tailored specifically for Gross Margin segment reporting, which made easy drag-and-drop reporting impossible, and OBIEE reporting was not seen as a viable option compared to other tools. Now that BI Apps 7.9.6 comes with 10 segment hierarchies out of the box, it is clearer that multiple segment hierarchies are valuable to the organization, and that many hierarchies need to be exposed.
Another important tool to help with hierarchies is the BICG product IMPACT(c), because hierarchies need to be analyzed at many levels. BICG IMPACT(c) provides a great way to analyze which hierarchies are being used where, how often, and by whom. When integrated with the ETL source, the legacy source can be exposed for informational purposes using the BICG IMPACT(c) subject area to create OBIEE iBots, Answers requests, and dashboards.
Thursday, December 3, 2009
CAF = Migration Utility? Use Caution!
Content Accelerator Framework V1 (CAF V1) is designed to help OBI EE power users deploy template reports and RPD constructs from any source environment into their own OBI EE environment. The key functionality of CAF V1 allows easy duplication of any existing report or logical RPD construct from one OBI EE environment to another. The source and target environments can have nothing in common and be completely different; the only prerequisite is that the target environment already has at least a basic logical model designed within its RPD.
CAF V1 clones any report existing in any Webcat; there is no specific property that makes a report a template eligible for cloning by CAF V1. From a list of existing reports or dashboards (any kind of report, including highly formatted layouts with different views and various Webcat calculations), a functional user can select an analysis of interest and clone it to his or her own environment.
That said, CAF V1 comes with a number of caveats:
- It's still not automated
- It's more labor intensive than the other ways
- It can only migrate certain content
- It's not a supported product
- Cannot clone reports using set operations (UNION, MINUS, INTERSECT)
- Cannot clone reports with aliases
- Does not carry forward saved filters that are used to filter reports
- Only carries forward the default column in a column selector
- If a dashboard that contains reports from multiple catalog folders is cloned, they will all be cloned into a single folder
- Any link to a report will reproduce the link, but not clone the target report
- Cannot clone any report if the source or target RPD contains Essbase metadata
- Command line utilities may cause problems with parsing a specific RPD syntax
- Does not migrate any object security, groups, or privileges (!!!)
- Does not move reports or dashboards to another folder location
- Does not delete reports or dashboards from the catalog
CAF V1 is free utility code; it is not maintained by Oracle as a licensed product.
Wednesday, December 2, 2009
Integrating Testing in the System Development Life Cycle
1.) The data on the Reports and Dashboards do not agree with the data in my Excel spreadsheet
2.) The data on the Reports and Dashboards do not agree with the data in our source system
3.) It is difficult to navigate between the Dashboard and Reports to see the necessary information
4.) We do not like the gauges and look and feel of the Reports and Dashboards
5.) The Dashboard Prompts do not give us the selection criteria we need to see the reports
6.) We cannot see the information that we need on the Reports and Dashboards
7.) We need to look at data to follow up on the out-of-bounds metrics on the Reports and Dashboards
How many times have you encountered this on a BI Applications or OBIEE project? If you have been on many of the projects that I have worked on, it has occurred too frequently. Instead of moving on to the next project in the BI Program, you have to spend many hours trying to meet the user needs found during User Acceptance Testing.
So what has contributed to this situation? There are many contributing factors. Some of them are:
1.) Poor Requirements gathering
2.) Poor Design
3.) Not involving the user in all phases of the system development life cycle
4.) Poor or no Testing Processes
Of the above, one of the biggest contributors is poor or nonexistent testing processes. So where do we start the testing process for a project? Bill Hetzel, in his book “The Complete Guide to Software Testing,” states that testing processes need to start with the Project Initiation phase. He recommends the following steps for integrating testing within the System Development Life Cycle:
Project Initiation:
Develop Broad Test Strategy
Establish the overall test approach and effort
Requirements:
Establish the testing requirements
Assign testing responsibilities
Design preliminary test procedures and requirements-based tests
Test and validate the requirements
Design:
Prepare preliminary system test plan and design specifications
Complete acceptance test plans and design specification
Complete design-based tests
Test and validate the design
Development:
Complete the system test plan
Finalize test procedures and any code-based tests
Complete module or unit test designs
Conduct the tests
Integrate and test subsystems
Conduct the system test
Implementation:
Conduct the acceptance test
Test changes and fixes
Evaluate testing effectiveness
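The phase-by-phase activities above lend themselves to a simple checklist a project team can track against. The sketch below shows the idea for the first two phases; the phase names and activities follow Hetzel's outline, but the tracking mechanics are my own illustration, not something from the book.

```python
# A sketch of tracking phase-based testing activities as a checklist.
# Only two phases are shown; the remaining phases would follow the
# same pattern. The tracking code itself is illustrative only.

SDLC_TEST_ACTIVITIES = {
    "Project Initiation": [
        "Develop broad test strategy",
        "Establish the overall test approach and effort",
    ],
    "Requirements": [
        "Establish the testing requirements",
        "Assign testing responsibilities",
        "Design preliminary test procedures and requirements-based tests",
        "Test and validate the requirements",
    ],
}

def incomplete_phases(completed_activities):
    """Return the phases that still have unfinished testing activities."""
    return [
        phase
        for phase, activities in SDLC_TEST_ACTIVITIES.items()
        if any(a not in completed_activities for a in activities)
    ]

done = {
    "Develop broad test strategy",
    "Establish the overall test approach and effort",
}
print(incomplete_phases(done))  # only Requirements work remains
```

Keeping the checklist explicit makes it easy to see, at any point in the project, which phases have skipped their testing steps.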
As Bill Hetzel states, testing needs to be involved in all phases of the System Development Life Cycle. One of the arguments against testing is that it takes a long time to plan and conduct. However, without following at least the major steps, projects are prone to the same symptoms we initially discussed. A project that exceeds its cost and schedule, or that does not provide the client the information needed, is a failed project; at least, that is the perception of the business users if the project does not meet their perceived needs and requirements.
It has been said that projects are 80% communication and 20% technical. If we fail to communicate the testing needs and implement projects that users perceive do not meet their needs and requirements, then we are contributing to a failed project. Integrating testing into the System Development Life Cycle greatly improves communication on a project and results in users perceiving that the project meets their needs and requirements. There is no great secret to integrating testing within the System Development Life Cycle; it simply requires that testing be included, and the users be involved, in all phases. If the proper communication and testing processes are in place, User Acceptance Testing should be only a minor effort for the project, because the users are merely validating what has already been defined and tested.
Tuesday, December 1, 2009
Are your Performance Management metrics useful?
Metric Background
Metrics are intended as a simple measurement of activity. By developing and publishing organizational metrics to decision makers, the labor-intensive effort of collecting source data, manipulating or adjusting it, and ultimately calculating the metrics is eliminated. Unfortunately, this also eliminates a large part of the knowledge capital necessary to understand how to influence the metrics.
- Key decision makers expected to act upon metrics must understand the full formula for deriving the metric. Only with this information can the best decisions regarding how to improve a business's performance be made.
Relevance of a change in a Metric's value
Just as they must understand the makeup of a metric, an organization's decision makers must understand the relevance of a change in the value of a metric. Often, the formulas producing metric results mask exponential changes in source values, making them appear to be linear changes. Over time, this can result in less and less efficiency from additional attempts to improve a measurement.
- Organizations must understand the relevance of a change in a key metric at different points. A thorough understanding of the historical change, as well as an analysis of the point at which further improving a metric yields materially diminishing returns, will let the organization's decision makers base their decisions on the most accurate footing at a point in time.
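As a concrete, and entirely hypothetical, illustration of diminishing returns: consider an inventory turnover metric defined as COGS divided by average inventory. The metric climbs in even-looking steps, yet each additional point of turnover frees less working capital than the one before.

```python
# Hypothetical figures: each one-point gain in inventory turnover
# (turnover = COGS / average inventory) frees less cash than the last,
# even though the metric improves linearly.

def inventory_for_turnover(cogs, turnover):
    """Average inventory required to achieve a target turnover ratio."""
    return cogs / turnover

cogs = 1_200_000
for target in (5, 6, 7):
    freed = (inventory_for_turnover(cogs, target - 1)
             - inventory_for_turnover(cogs, target))
    print(f"Turnover {target - 1} -> {target}: "
          f"frees {freed:,.0f} in working capital")
```

A decision maker who sees only the turnover number, without the underlying formula, could easily conclude that the seventh point of turnover is worth as much effort as the fifth.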
Over time, all organizations undergo changes to their business model, whether due to internal factors, external factors, or a combination of the two. Flexible metrics can enhance an organization's ability to respond to changing market dynamics by quickly refocusing key decision makers on business model changes.
- Frequent review of Performance Metrics for relevance to the current business climate allows for improved competitive advantage by reducing the time necessary to refocus an organization.
In conclusion, Performance Metrics are an invaluable tool for improving the decision-making intelligence of an organization. However, leveraging maximum long- and short-term benefit requires a commitment to supporting the organization with the best possible guidance. Organizations that embrace their Performance Metrics as an avenue for executing strategy can realize significant gains in competitive advantage.