Wednesday, December 23, 2009

What's the right level of detail to include in Hyperion Planning?

When facilitating design sessions with clients, the question that invariably comes up is what level of detail should be included within their Hyperion Planning application.

Often, a client's first instinct is to include the level of detail that exists within their ERP system for actual data. When confronted with this desire/requirement, I like to encourage spirited conversation between project sponsors and subject matter experts by having them address the following two questions.
1) Does a desire exist across the majority of the user base to plan/budget at the same level of detail at which actual data occurs?
2) Does this level of detail for formulating a plan/budget coincide with management’s performance management objectives?

I have found that the level of detail contained within a Planning application (or applications) should reflect management's performance management objectives. Detail for detail's sake wastes resources, but arbitrarily limiting the level of detail is not a best practice either, if it prevents analysis of important factors that must be controlled.

If the driving desire to include this level of detail in the Planning application is only to facilitate analysis against actual data, and business owners have no desire to perform data entry against each and every travel expense account, for example, then other alternatives exist that will not encumber the Planning application(s).

Successful Planning implementations provide the level of detail within the planning model that business owners are accustomed to and desire for formulating their Plans/Budgets. Meaningful reporting and analysis may often require more detail than users want to plan at. This disconnect can be addressed through properly architected reporting applications, partitioning, or drill-through.

When addressing the level of detail to include within a Planning application, answering the two fundamental questions above has led me to successfully architect and implement in excess of 30 Planning applications. More often than not, clients who are really pushed to consider what is necessary from a Plan/Budget perspective versus what is necessary from a Reporting/Analysis perspective will arrive at a Planning application that is more summary level.

While meeting management's performance management objectives for Reporting and Analysis shouldn't be ignored, encumbering users with unnecessary levels of detail within a Plan/Budget only introduces potential issues with performance, maintenance, and end user satisfaction.

The mantra that I, along with many other seasoned EPM professionals, subscribe to when asked to architect a Planning application is: “Provide the level of detail necessary to formulate the Plan. Planning should be used as a means to facilitate Plan/Budget vs. Actual analysis, but if that analysis occurs at a level beneath the level of Plan formulation, then it should be done outside of Planning (i.e., Essbase, partitioning to a reporting application, drill-through).”
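
As one illustration of that mantra, here is a minimal Python sketch (the accounts, the plan-level mapping, and the amounts are all hypothetical) of Plan vs. Actual analysis performed outside the Planning application, with detailed actuals rolled up to the summary accounts at which the plan was formulated:

```python
# Hypothetical detail-level actuals from the ERP/GL.
actuals = {"Airfare": 12000, "Hotel": 8000, "Meals": 3000,
           "Salaries": 250000, "Bonuses": 40000}

# Mapping of detail accounts to the summary accounts used in Planning.
plan_level = {"Airfare": "Travel", "Hotel": "Travel", "Meals": "Travel",
              "Salaries": "Compensation", "Bonuses": "Compensation"}

# Summary-level budget as entered by business owners in Planning.
budget = {"Travel": 25000, "Compensation": 280000}

# Roll detailed actuals up to the level at which the plan was formulated.
rollup = {}
for account, amount in actuals.items():
    rollup[plan_level[account]] = rollup.get(plan_level[account], 0) + amount

# Plan vs. Actual variance at the plan level, not the detail level.
for plan_account, planned in budget.items():
    actual = rollup.get(plan_account, 0)
    print(f"{plan_account}: actual {actual}, budget {planned}, "
          f"variance {actual - planned}")
```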

Wednesday, December 16, 2009

Oracle partners with BICG on BI Applications Hands-on Workshop

Pleasanton, California - December 09, 2009

BICG partnered with Oracle to conduct a hands-on Oracle Business Intelligence Applications workshop for Oracle clients.

Oracle provided the meeting space, the latest and greatest view of the technology, and a presentation on the business value gained when implementing it. This allowed all attendees to see firsthand what is happening and what is coming. You had to be there!

As the only recipient of the coveted Oracle BI/EPM Partner of the Year designation, awarded in association with the Oracle Innovation Awards at this year's Oracle OpenWorld, Minneapolis-based BICG was selected to provide the "hands-on" training. BICG has long been recognized as a top performer in delivering "experience based" success in both product training and project design, development, and implementation. BICG also presented to the group on "Developing a Corporate Business Intelligence Strategy," a presentation that directly addressed potential pitfalls and recommendations surrounding Business Intelligence projects. BICG chose 30-year BI veteran Steve Tinari to lead the training and deliver the BI Strategy presentation.

Registration for the workshop was required and limited to 40 seats, which was all the space the meeting room allowed. Registration sold out in days, and a wait-list quickly developed; at last look, 25 clients were waiting to claim a seat.

BICG has been asked to continue conducting this hands-on Oracle BI Apps workshop at strategic locations across the US and abroad. Registration for upcoming workshops can be found at http://www.biconsultinggroup.com.

BICG Workshop Advice: register early for the upcoming events and don't forget to bring your laptop. BICG uses a virtual training method that allows you to connect to their environment using only a web browser!

Wednesday, December 9, 2009

Oracle OpenWorld Video: Spotlight on Oracle BI at LCRA

During Oracle OpenWorld, Keith Niesner from the Lower Colorado River Authority (LCRA) provided an overview of business intelligence at his organization.

Niesner highlights the use of Oracle Business Intelligence at LCRA while detailing the challenges faced during deployment and the lessons learned.

More Video from Oracle OpenWorld and BICG >>

Friday, December 4, 2009

GL Segments in the BI 7.9.6 GL Balance Sheet subject area

The BI Applications 7.9.6 out-of-the-box GL Balance Sheet subject area provides for segment reporting. A segment reporting hierarchy gives users a drag-and-drop way to produce segment reports. Segments can be any subset of a dimension, but in our example let's use the set of accounts in the Gross Margin report.

The value of managing the hierarchies for a segment is that multiple levels of the segment can be created, from most aggregated to most granular, then saved and exposed for drag-and-drop from the presentation layer in Answers and the Office add-ins. For example, a user could take Level3 of the Gross Margin dimension and the Ending Amount from the fact table, and get the Gross Margin report. Level4 can be slightly different from Level3 for some reason specific to Gross Margin reporting, and that is perfectly fine, since this hierarchy is dedicated specifically to Gross Margin reporting. The user should also be able to take Level1 and drill down through the levels to see further breakouts of the gross margin lines. Expense might break down into Indirect and Direct at the next level, and then into more granular groupings of accounts at lower levels.

It is interesting to see that the GL Balance Sheet subject area delivers 10 different account hierarchies, on the assumption that there will be a need for many hierarchies for various segments in reporting. The hierarchies are labeled GL Segment1 through GL Segment10.
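
To make the idea of levels concrete, here is a minimal Python sketch (the accounts and the hierarchy are hypothetical) of flattening a parent-child account hierarchy into the fixed Level1, Level2, ... columns that drag-and-drop segment reporting relies on:

```python
# Hypothetical parent-child account hierarchy for a Gross Margin segment.
parent_of = {
    "Revenue": "Gross Margin",
    "COGS": "Gross Margin",
    "Direct": "COGS",
    "Indirect": "COGS",
    "Raw Materials": "Direct",
    "Freight": "Direct",
}

def path_from_root(member):
    """Return the chain from the top of the hierarchy down to member."""
    chain = [member]
    while chain[-1] in parent_of:
        chain.append(parent_of[chain[-1]])
    return list(reversed(chain))

# Flatten each leaf account into Level1..LevelN columns.
leaves = set(parent_of) - set(parent_of.values())
for leaf in sorted(leaves):
    row = {f"Level{i + 1}": name for i, name in enumerate(path_from_root(leaf))}
    print(row)
```

Because each segment hierarchy can flatten the same accounts in its own way, ten separate GL Segment hierarchies give ten independent sets of levels to drag and drop from.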

What's the value of hierarchy management?

Hierarchy management introduces the notion that hierarchies need be neither fixed, as is often seen in OLTP systems, nor unmanaged, as the needs of ad-hoc end-user reporting might suggest. In fact, hierarchies are so important that they deserve any number of iterative “what-if” scenarios to quickly get at the truth hidden behind their aggregates. To accomplish this, the hierarchy is abstracted away from both the operational and analytical (i.e., OLAP) systems. Since most data distills into groups with lists of members, the art and science of manipulating these taxonomies becomes its own managed system, improves analysis of business processes, and enforces data quality. Hierarchy management has become more widely recognized as expectations have increased for consolidated reporting, service-oriented architecture, and data integration. In these environments, not only do different account numbers in underlying systems need to be resolved, but reporting needs to be apples-to-apples, with accounts segmented using hierarchies. Different roles in the organization need different hierarchies. According to Eric Kavanagh of TDWI:

“Hierarchy management serves as a translation layer between values in a data warehouse or other business intelligence environment, and the reporting or analysis view that business users see. By abstracting this translation layer, individual business users can manipulate variables within that layer without overwriting data in source systems or requiring hard-coded changes to those systems. This abstraction opens up an entire realm of analysis that can help organizations maintain competitive advantage in an increasingly complex and demanding marketplace” (Eric Kavanagh, “The Magic of Abstraction: Hierarchy Management and Decision-Making,” TDWI Research article).
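
In that spirit, here is a minimal Python sketch (the source systems, account codes, and groupings are all hypothetical) of a managed hierarchy acting as a translation layer, resolving different account numbers from two systems into one apples-to-apples rollup without touching either source:

```python
# Hypothetical account codes from two source systems that mean the same thing.
source_balances = [
    ("ERP_A", "40100", 500.0),
    ("ERP_A", "40200", 250.0),
    ("ERP_B", "REV-US", 300.0),
    ("ERP_B", "REV-EU", 150.0),
]

# The managed hierarchy: a translation layer maintained outside the sources,
# so it can be revised iteratively without hard-coded changes to any system.
translation = {
    ("ERP_A", "40100"): "Product Revenue",
    ("ERP_A", "40200"): "Service Revenue",
    ("ERP_B", "REV-US"): "Product Revenue",
    ("ERP_B", "REV-EU"): "Product Revenue",
}

# Apples-to-apples rollup across both systems.
rollup = {}
for system, account, amount in source_balances:
    group = translation[(system, account)]
    rollup[group] = rollup.get(group, 0.0) + amount
print(rollup)  # {'Product Revenue': 950.0, 'Service Revenue': 250.0}
```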

Implementation Implications

Exposing multiple hierarchies helps unlock the advantages of managed hierarchies in OBIEE. The 11g release is going to have some improvements in the presentation of hierarchy levels, but this does not solve the whole problem. The challenge I have found in implementation is twofold. First, if hierarchy management is not effective, it is difficult to arrive at effective level definitions for segments such as a Gross Margin segment. Second, and historically the bigger problem, is the decision whether to expose a hierarchy at all in OBIEE. In the past it was difficult to convince an organization to expose a hierarchy called Gross Margin, since all the accounts were already in the Account hierarchy. The problem was serious, though: the levels were not tailored specifically for Gross Margin segment reporting, which made easy drag-and-drop reporting impossible, and OBIEE reporting was not seen as a viable option compared to other tools. Now that BI Apps 7.9.6 comes with 10 segment hierarchies out of the box, it is clearer that multiple segment hierarchies are valuable to the organization, and that many hierarchies need to be exposed.

Another important tool to help with hierarchies is the BICG product IMPACT©, because hierarchies need to be analyzed at many levels. BICG IMPACT© provides a great way to analyze which hierarchies are being used where, how often, and by whom. When integrated with the ETL source, the legacy source can be exposed for informational purposes using the BICG IMPACT© subject area to create OBIEE iBots, Answers requests, and dashboards.

Thursday, December 3, 2009

CAF = Migration Utility? Use Caution!

There has been a lot of chatter recently in the OBIEE blogosphere regarding the Content Accelerator Framework (CAF) and its use in migrating OBIEE content from dev/test to production. Despite CAF having been available for quite a while now, it is suddenly being heralded as the new way to migrate OBIEE content to production. In this blog post, I'd like to discuss why this may not be a good idea and why caution should be used before adopting CAF as the crux of this critical process.

Background

Before we get into that explanation, though, let's talk about what CAF is really meant to be/do. In the documentation for CAF, the utility is described in the following way:

Content Accelerator Framework V1 (CAF V1) is designed to help OBI EE power users deploy template reports and RPD constructs from any source environment into their own OBI EE environment. The key functionality of CAF V1 allows easy duplication of any existing reports or logical RPD construct from one OBI EE environment to another one. Both, source and target environment could have nothing in common and be completely different. The only prerequisite is that the target environment already has at least a basic logical model designed within its RPD.

CAF V1 clones any report existing in any Webcat, there is no specific property that makes a report be a template eligible to cloning by CAF V1. From a list of existing reports or dashboards (any kind of reports, including a highly formatted layout with different views, including various Webcat calculations), a functional user is able to select analysis of his interest and can clone this analysis to his environment.

In those two paragraphs, the concept of "template reports" is mentioned. This concept is the thrust of CAF and explains what CAF is really meant to do.

Back when OBIEE 10.1.3.4 was released, Oracle switched the out-of-the-box demo content from "Paint" to "Sample Sales". Sample Sales also included brand new sample reports and dashboards that really showcased the capabilities of OBIEE in a way not shown before. The reaction to the Sample Sales content was very positive, with many customers asking Oracle, "How can I get reports like those in my own environment?" CAF is the answer to that question. It allows a customer to take the Sample Sales content and "map" it into their own environment, replacing Sample Sales columns with their own, while all the rich formatting, report structure, and logical metadata comes along for the ride without the user having to do any development at all.

I was actually shown a very early version of CAF when it was an Excel-based tool that leveraged macros to do its thing. The thrust was all about accelerating content development (hence the name) for users, with Sample Sales as the template for the new content. It was thought, at the time, that this might somehow be the beginning of template capabilities in OBIEE, but ultimately it was all about taking a desired Sample Sales report and pushing it into a customer's environment without doing all of the manual development work. Since then, the tool has evolved into the Catalog Manager add-in that it is today.

Why Does OBIEE Need A Migration Utility?

Now that we understand the background behind why CAF was developed and what its intended use is, it's important to understand why this buzz about CAF as a migration utility is even occurring. The only reason for people to get excited about a new option for migrating OBIEE content is that the current options are less than desirable at an enterprise level. For the customer supporting a lot of content in OBIEE across hundreds or thousands of users, the existing options for migration to production are difficult to swallow. I'm not going to get into those options here and discuss why they are inadequate, but it's safe to say that experienced implementers know the challenges, experienced customers know the challenges, and even Oracle understands the challenges (and hopefully 11g offers some improvements as a result).

Why CAF Shouldn't Be The New Best Way To Migrate

So now that I have all of that background material out of the way, let's get to the heart of the matter. CAF was not designed as a full-blown migration utility. Nowhere in the documentation will you see "migrating to production" as one of its purposes. While I love applying creativity and using software in outside-the-box ways, there are some very good reasons why CAF should not be considered the new best way to migrate OBIEE content to production. They are:
  1. It's still not automated
  2. It's more labor intensive than the other ways
  3. It can only migrate certain content
  4. It's not a supported product
So let's look at these in more detail.

1. It's Still Not Automated

This is the weakest of my reasons, but I'm going to list it anyway. The biggest challenge I see customers having with OBIEE production migrations is the fact that it is very, very difficult to automate the migration of catalog content to production. Again, I'm not going to go into the reasons why, but the bottom line is that the thing that I hear customers wanting most is not present in this solution. That's not a reason by itself to forego CAF as your migration solution, but it's worth mentioning.

2. It's More Labor Intensive Than The Other Ways

One of the selling points of CAF as a report template mechanism is that you can take a report you like, clone it, and quickly map the columns on the report template to the columns you want to use in another subject area instead. This is a great feature, until you want to do it for dozens if not hundreds of reports, as you might in a production migration. While CAF offers the ability to select multiple reports or entire dashboards, you still have to map all of the distinct columns across all of the reports to the same columns in the target RPD (because presumably, in a production migration, the source RPD and target RPD will be the same). But wait, it gets even harder than that, because to do this mapping, you first have to select the target subject area, which means that if you have dozens or hundreds of reports spanning multiple subject areas, you have to do this needless mapping step subject area by subject area, one batch at a time.

As if all that extra mapping mess weren't enough, there are other extra steps to consider. If there are columns used in filters that aren't displayed in the criteria, you have an extra mapping step for that. If there are columns that require level definitions (AGO, TODATE), you have to map the levels for them (even if they are the same). You have to specify the target catalog folder path, which implies you can realistically only clone one parent folder at a time. You have to specify whether to create a dashboard or not, implying you can only clone one dashboard at a time. All of these steps add up to extra effort that far exceeds the current migration techniques. While it's true you can save mappings for future re-use, that only helps if you're going to migrate new content against the same unchanged metadata, which covers only a portion of all possible production migrations.

3. It Can Only Migrate Certain Content

The CAF documentation lays out the known limitations of CAF:
  • Cannot clone reports using set operations (UNION, MINUS, INTERSECT)
  • Cannot clone reports with aliases
  • Does not carry forward saved filters that are used to filter reports
  • Only carries forward the default column in a column selector
  • If a dashboard is cloned that contains reports from multiple catalog folders, they will all be cloned into a single folder
  • Any link to a report will reproduce the link, but not clone the target report
  • Cannot clone any report if the source or target RPD contains Essbase metadata
  • Command line utilities may cause problems with parsing a specific RPD syntax
While each of these is a show-stopper in its own right, there are other undocumented limitations to consider when evaluating CAF as a migration utility:
  • Does not migrate any object security, groups, or privileges(!!!)
  • Does not move reports or dashboards to another folder location
  • Does not delete reports or dashboards from the catalog
All of these limitations narrow the scope of migration scenarios very quickly, making it impossible to recommend using CAF for all production migrations.

4. It's Not A Supported Product

If I still haven't swayed you, the coup de grâce is the fact that this utility is not an officially licensed product, which means no maintenance or support. The very first page of the CAF documentation spells it out plainly:

CAF V1 is a free utility code, not maintained by Oracle as a licensed product.

A migration to production is a critical event in the administration and use of OBIEE. If migrations don't happen successfully, accurately, and on time, that can be a very big deal. While free utilities have their place, I would not want to rely on one to carry the weight of something as heavy as a migration to production on a regular basis. With no options for support if something goes wrong, and no guarantee that Oracle will continue to enhance the utility, it's hard to recommend, regardless of whether it does the job or not.

Conclusion

So as you read all of the chatter in the blogosphere advocating CAF for your mission-critical migrations to production, make sure you proceed with caution. Perform plenty of testing, consider the risks, and make an educated decision. If you find that CAF works as a migration solution for you, drop me a line. I'd love to hear that I'm wrong on this, as we're all looking for a better way to migrate.

Wednesday, December 2, 2009

Integrating Testing in the System Development Life Cycle

After you have spent three to six months gathering requirements, designing, and building an OBIEE application, you start User Acceptance Testing with the business users. When the business users begin acceptance testing, we often hear the following:

1.) The data on the Reports and Dashboards do not agree with the data in my Excel spreadsheet
2.) The data on the Reports and Dashboards do not agree with the data in our source system
3.) It is difficult to navigate between the Dashboards and Reports to see the necessary information
4.) We do not like the gauges and the look and feel of the Reports and Dashboards
5.) The Dashboard Prompts do not give us the selection criteria we need to see the reports
6.) We cannot see the information that we need on the Reports and Dashboards
7.) We need to look at detail data to follow up on the out-of-bounds metrics on the Reports and Dashboards

How many times have you encountered this on a BI Applications or OBIEE project? On many of the projects I have worked on, it has occurred too frequently. Instead of moving on to the next project in the BI program, you have to spend many hours trying to meet the users' needs uncovered during User Acceptance Testing.

So what has contributed to this situation? There are many factors. Some of them are:

1.) Poor Requirements gathering
2.) Poor Design
3.) Not involving the user in all phases of the system development life cycle
4.) Poor or no Testing Processes

Of these, one of the biggest contributors is poor or no testing processes. So where do we start the testing process for a project? Bill Hetzel, in his book “The Complete Guide to Software Testing,” states that testing needs to start in the Project Initiation phase. He recommends the following steps for integrating testing within the System Development Life Cycle:

Project Initiation:

Develop Broad Test Strategy
Establish the overall test approach and effort

Requirements:

Establish the testing requirements
Assign testing responsibilities
Design preliminary test procedures and requirements-based tests
Test and validate the requirements

Design:

Prepare preliminary system test plan and design specifications
Complete acceptance test plans and design specification
Complete design-based tests
Test and validate the design

Development:

Complete the system test plan
Finalize test procedures and any code-based tests
Complete module or unit test designs
Conduct the tests
Integrate and test subsystems
Conduct the system test

Implementation:

Conduct the acceptance test
Test changes and fixes
Evaluate testing effectiveness
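
One of the "code-based tests" called for in the Development phase above can be as simple as an automated reconciliation check. Below is a minimal, self-contained Python sketch; the table names, accounts, and amounts are hypothetical stand-ins, and an in-memory SQLite database takes the place of the real source system and warehouse:

```python
import sqlite3

# Hypothetical stand-ins for a source GL and a warehouse balance fact.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_gl (account TEXT, amount REAL);
    CREATE TABLE w_gl_balance_f (account TEXT, amount REAL);
    INSERT INTO src_gl VALUES ('4000', 100.0), ('4000', 50.0), ('5000', 75.0);
    INSERT INTO w_gl_balance_f VALUES ('4000', 150.0), ('5000', 75.0);
""")

def reconcile(table_a, table_b, tolerance=0.01):
    """Compare per-account totals between two tables; return mismatches."""
    query = f"""
        SELECT a.account, a.total, b.total
        FROM (SELECT account, SUM(amount) AS total FROM {table_a} GROUP BY account) a
        JOIN (SELECT account, SUM(amount) AS total FROM {table_b} GROUP BY account) b
          ON a.account = b.account
    """
    # A fuller test would also flag accounts present on only one side.
    return [(acct, src, dw) for acct, src, dw in conn.execute(query)
            if abs(src - dw) > tolerance]

mismatches = reconcile("src_gl", "w_gl_balance_f")
assert not mismatches, f"Reconciliation failed: {mismatches}"
print("Reconciliation passed")
```

Running a check like this automatically during Development, rather than waiting for business users to compare dashboards to spreadsheets, catches the first two complaints on the list at the top of this post before User Acceptance Testing ever begins.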


As Bill Hetzel states above, testing needs to be involved in all phases of the System Development Life Cycle. One of the arguments against testing is that it takes a long time to plan and conduct. However, without following at least the major steps above, projects are prone to the same symptoms we discussed at the outset. A project that exceeds its cost and schedule, or that does not give the client the information needed, is a failed project, at least in the perception of the business users when it does not meet their needs and requirements.

It has been said that projects are 80% communication and 20% technical. If we fail to communicate the testing needs and implement projects that users perceive as not meeting their needs and requirements, then we are contributing to a failed project. Integrating testing into the System Development Life Cycle greatly improves communication on a project and results in users perceiving that the project meets their needs and requirements. There is no major secret to integrating testing within the System Development Life Cycle; it simply requires that testing be included, and that users be involved, in all phases. If the proper communication and testing processes are included in the System Development Life Cycle, User Acceptance Testing should be only a minor effort for the project, because the users are merely validating what has already been defined and tested.

Tuesday, December 1, 2009

Are your Performance Management metrics useful?

Performance Metrics are the heart of any BI/EPM implementation, but often the ability to effect change in the metrics is not seamless. In this blog post, I'll explore three key factors behind less-than-optimal use of Performance Management metrics and suggest practices for applying metrics to an organization's benefit.

Metric Background
Metrics are intended as a simple measurement of activity. By developing and publishing organizational metrics to decision makers, the labor-intensive effort of collecting source data, manipulating or adjusting the data, and ultimately calculating the metrics is eliminated. Unfortunately, this also eliminates a large part of the knowledge capital necessary to understand how to influence the metrics.
  • Key decision makers expected to act upon metrics must understand the full formula for deriving the metric. Only with this information can the best decisions regarding how to improve a business's performance be made.
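
As a minimal, hypothetical illustration in Python (the metric choice and the numbers are mine, not drawn from any client), publishing the full derivation alongside the metric makes it clear which inputs a decision maker can actually influence:

```python
def days_sales_outstanding(accounts_receivable, total_credit_sales, days=365):
    """DSO = (accounts receivable / total credit sales) * days in period.

    Publishing only the final number hides the fact that the metric can be
    influenced from two directions: collecting receivables faster, or
    growing credit sales.
    """
    return accounts_receivable / total_credit_sales * days

# Hypothetical inputs: $1.2M in receivables against $10M in annual credit sales.
print(days_sales_outstanding(1_200_000, 10_000_000))  # 43.8 days
```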

Relevance of a change in a Metric's value
Similar to understanding the makeup of a metric, an organization's decision makers must understand the relevance of a change in the value of a metric. Often, the formulas that produce metric results mask nonlinear, even exponential, changes in source values, making them appear to be linear changes. Over time, this can mean diminishing returns from additional attempts to improve a measurement.
  • Organizations must understand the relevance of a change in a key metric at different points. A thorough understanding of the historical change, as well as an analysis of the point at which further improving a metric yields materially diminishing returns, will give the organization's decision makers the most accurate basis for their decisions at any point in time.
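
A tiny, hypothetical Python sketch of that effect: each one-point gain in a percentage metric reads as the same linear step, yet requires eliminating an ever-larger share of the remaining failures:

```python
# Each one-point gain in a percentage metric looks like the same linear step,
# but the share of remaining failures that must be eliminated grows sharply.
for current in (90, 95, 98, 99):
    failures_remaining = 100 - current
    share_to_eliminate = 1 / failures_remaining
    print(f"{current}% -> {current + 1}%: "
          f"eliminate {share_to_eliminate:.0%} of remaining failures")
```
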
Industry Trends
Over time, all organizations undergo changes to their business model, whether due to internal factors, external factors, or a combination of the two. Flexible metrics enhance an organization's ability to respond to changing market dynamics by quickly refocusing key decision makers on business model changes.
  • Frequent review of Performance Metrics for relevance to the current business climate allows for improved competitive advantage by reducing the time necessary to refocus an organization.

In conclusion, Performance Metrics are an invaluable tool for improving the decision-making intelligence of an organization. However, leveraging the maximum long- and short-term benefit requires a commitment to supporting the organization with the best possible guidance. Organizations that embrace their Performance Metrics as an avenue for executing strategy can realize significant gains in competitive advantage.