Monthly Archives: January 2014

Important Information About the Latest OBIEE Bundle Patch for the New Year

January 30, 2014

Oracle recently released the latest bundle patch for OBIEE, version 11.1.1.7.140114. Oracle changed its versioning policy for OBIEE last year; OBIEE 11.1.1.7.3 was the last release to use the old nomenclature. This patch applies to both the Enterprise and Exalytics versions of OBIEE.

One of the biggest surprises about this release is that it contains no new features, just bug fixes; it is the first bundle patch I can remember with absolutely no new features. This is a pleasant surprise, and I applaud Oracle for addressing bugs from earlier versions before introducing new functionality. What is a little disappointing is how many restrictions apply to qualifying for this patch. Please see the items to note below.

The following link to the official Oracle “Business Analytics – Proactive Support” page provides all of the detailed information and links:

https://blogs.oracle.com/proactivesupportEPM/entry/obiee_11_1_1_4

Please note the following items:

  1. You will also need to download and install Dynamic Monitoring Service patch 16569379 (as noted on the blog)
  2. Oracle Business Intelligence Enterprise Edition is not supported on 32-bit operating systems
  3. The Simple Install option of Oracle Business Intelligence Enterprise Edition is not supported
  4. IBM WebSphere Application Server is not supported
  5. BI Apps installations require Oracle WebLogic Application Server 10.3.6 (note: version 10.3.5 is not supported for BI Apps 11.1.1.7.1 and above)
  6. The patch is certified with Oracle Data Integrator 11.1.1.7.0
  7. The patch is certified with Oracle GoldenGate 11.2.1.0.1 (optional software)
  8. Oracle Business Intelligence Applications 11.1.1.7.1 requires supporting patches, which are delivered as part of the Oracle Business Intelligence Applications 11.1.1.7.1 Media Pack

See the certification matrix for more details on BI Apps:

http://www.oracle.com/technetwork/middleware/bi/biapps-11-1-1-7-1-cert-matrix-1943413.xls

See the certification matrix for more details on OBIEE:

http://www.oracle.com/technetwork/middleware/bi/bi-11gr1certmatrix-ps6-1928219.xls

If you would like to receive further information about patching, please contact us at the following email address: communications@performancearchitects.com

Recommended Reading:

  1. OBIEE 11g: Required and Recommended Bundle Patches and Patch Sets (Doc ID 1488475.1): https://support.oracle.com/rs?type=doc&id=1488475.1
  2. Business Intelligence Enterprise Edition Suite Bundle Patch 11.1.1.7.140114 (Patch Guide): https://updates.oracle.com/Orion/Services/download?type=readme&aru=17155497


Author: John McGale, Performance Architects



What is Hadoop and How Does It Compare to Relational Databases?

January 29, 2014

In a previous post, we discussed the case for getting started with a Big Data initiative. One of the fundamental technologies underlying Big Data initiatives is Hadoop. So, what is Hadoop, and where does the technology fit into the traditional relational database environment?

Hadoop is an open source platform for acquiring and processing Big Data sets using distributed clusters of commodity servers. The technology was originally developed by the Apache project in the mid-2000s, but has since been incorporated by several vendors (such as Cloudera and Hortonworks) for wider distribution. The basic Hadoop platform consists of two primary components:

  1. Storage: Hadoop Distributed File System (HDFS)
  2. Transformation: MapReduce Engine

The Hadoop Distributed File System, or HDFS, is the storage system used by the Hadoop platform. Files stored in HDFS are split into blocks spread across multiple DataNodes (which hold the data itself), while a NameNode keeps the metadata describing where everything lives. At this point, however, the data is not query-able; Hadoop must now group and make sense of the data, which is done through the MapReduce engine.

MapReduce is a transformation mechanism that sits on top of HDFS. It splits data apart for parallel processing and recombines it into more understandable output. The MapReduce model was first published by Google in 2004, and MapReduce programs can be written in a number of languages (such as Java, Python, and C++). Often, the output of a MapReduce job is an understandable, query-able data set that can then be examined using a number of SQL-like tools.
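To make the map and reduce stages concrete, below is a minimal word-count sketch written in Python. It simulates the map, shuffle/sort, and reduce phases locally in a single script; on a real cluster the mapper and reducer would run as separate tasks (for example, via Hadoop Streaming), so treat this as an illustration of the programming model rather than production code.

#!/usr/bin/env python3
# Minimal word-count sketch of the MapReduce programming model.
# The mapper emits (word, 1) key-value pairs; the shuffle/sort step
# is simulated with sorted(); the reducer sums the counts per word.
import sys
from itertools import groupby

def mapper(lines):
    """Split each input line into words and emit (word, 1) pairs."""
    for line in lines:
        for word in line.strip().split():
            yield word.lower(), 1

def reducer(pairs):
    """Group the mapped pairs by word and sum the counts."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Locally simulate map -> shuffle/sort -> reduce on stdin.
    for word, total in reducer(mapper(sys.stdin)):
        print(word + "\t" + str(total))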

Much like ETL development is at the heart of data warehousing skill sets, writing and understanding MapReduce programs is a key skill set when investing in Hadoop. That said, the vendors providing packaged Hadoop solutions often include GUI-based tools that assist in creating these programs. Many of these vendors also offer pre-packaged virtual machines that make it easier to explore the Hadoop world.

So, now that we have a basis for the Hadoop technology, how does it compare to traditional relational databases? For starters, the storage mechanisms are fundamentally different: relational databases store information in tables defined by a schema, whereas Hadoop uses key-value pairs as its fundamental unit.

The second major difference is how the data are queried. With relational databases, users write SQL to specify what data they want rather than how to obtain it. Hadoop uses MapReduce programs, which can also be initiated via SQL-like commands, to specify both what data to retrieve and how to retrieve it.

A third major difference between Hadoop and relational databases is how they scale. Relational databases typically scale up by adding horsepower (RAM and CPU) to a single database-class server or a small set of them. Hadoop scales out by adding far more machines (often hundreds), each with less power, that work in parallel.

Stay tuned to Performance Architects’ blog for more posts on getting started with Hadoop and Big Data.

Author: Michael Bender, Performance Architects



Date error in OBIEE? Quick tip

January 23, 2014

If NULL dates suddenly show up formatted as 0/0/0 in a report within OBIEE, it’s most likely because you forgot to check the “Nullable” flag on the column in the physical layer.

When you use the import wizard, it sets this flag for you automatically. When you create columns by hand, the flag is only set if you remember to check it.

Just a quick note that may help save some time!

Author: John McGale, Performance Architects



HPCM: A Hidden Gem in the Oracle EPM (Hyperion) Product Suite

January 20, 2014

Each New Year, we all do some soul searching and goal setting. That process is ingrained in our culture and reinforced by all of the social media and advertising that hits us each day. For many of us, it goes beyond the personal and extends into our professional lives: we set goals for our professional environment that will inform and shape our strategy for the New Year. One improvement area that sits on many strategy lists for this year is “better insights” supported by “better analytics.” It’s no longer enough to simply provide data or analytics; many organizations are striving to do both better and faster than ever before.

Hidden within the Oracle EPM product suite is a wonderful product called Hyperion Profitability and Cost Management (HPCM) that actually helps provide improved insight and analytics.  “What is it?” you ask?  That is a very good question. This is a mature product that has been available for a while and simply hasn’t gotten the attention it deserves.  But like the “little engine that could,” this one has chugged along until we were ready to notice it. And it looks to me like its time may have come!

Raise your hand (at least figuratively) if your organization has developed allocation models to better understand your customer or product P&Ls. Do you run allocation logic to predict customer response to a sales or marketing initiative? Perhaps you are in healthcare or higher education and would like to better understand the cost per patient or student. For many organizations, those models are a “black box” to everyone except the person who created them. Changing or tweaking the allocation logic to run various ‘what-if’ scenarios can be a painful process involving iteration between IT and the business analyst desiring the change. Furthermore, the process of testing the model and establishing trust in its logic can be frustrating for everyone involved.

Can you imagine a world where that black box is ‘open’ and the end user can actually change variables and ‘see’ the impact on the data? Can you imagine running not just ‘best and worst’ case models, but working through the variations in between, because the product is controlled by an end user? HPCM, combined with the power of the Oracle EPM (Hyperion) product suite, provides just that experience. Suddenly business users (not just analysts and the IT group) can view a traceability matrix to understand how costs are allocated, which variables are inputs, and how they themselves can effect change to the expense lines on the P&L. HPCM provides both multi-stage costing to support multi-step allocations and detailed profitability analysis through single-step allocation logic.

Organizations can also leverage their existing Oracle Hyperion Planning and Oracle Business Intelligence (OBIEE) products in conjunction with HPCM. OBIEE (or the BI Foundation Suite) provides report distribution, interactive dashboards, and mobile capabilities against HPCM for those seeking a graphically rich interface. For the number crunchers in the organization, Smart View is also enabled against HPCM. EPM applications can serve as a source and/or a target for HPCM results, as can any reporting and analysis solution. The real focus of this product is that all-too-critical niche of better insight and analytics into corporate resource mapping. And who doesn’t want a little better insight in the New Year?

Author: Kelli Pircio, Performance Architects



Oracle Business Intelligence Enterprise Edition (OBIEE) BI Analysis & Reporting Versus BI Published Reporting

January 15, 2014

Whether you are new to Oracle Business Intelligence Enterprise Edition (OBIEE) or have been using the product for a while, once you log into your OBIEE application you will see options to use both the BI Analysis & Reporting capabilities (formerly known as BI Answers, referred to in this post as “BI Analysis”) and the BI Published Reporting capabilities (formerly known as BI Publisher).

[Screenshot: the OBIEE home page, showing the BI Analysis and BI Published Reporting options]

BI Analysis displays and visualizes data using charts, tables and reports pulling data from multiple data sources, while BI Published Reporting generates highly-formatted enterprise reports using multiple data sources.

Determining whether to use a BI Analysis report or a BI Published Reporting report depends on what you are trying to deliver or produce for your end users.

BI Analysis [1]:

  • Full ad hoc analysis, pivot tables, and report creation
  • Simple point and click interface
  • Users are shielded from the complexity of the underlying data structures
  • Explore and interact with results
  • Save, organize and share reports
  • Integrate reports back into Intelligence Dashboards

BI Published Reporting [2]:

  • True “pixel-perfect” reporting and publishing
  • Report layout using familiar tools like Microsoft Word and Adobe Acrobat
  • Single product for all document needs, e.g., invoices, checks, financial statements, government forms, etc.

To create a new Analysis report, click on ‘Analysis’ in the ‘Analysis and Interactive Reporting’ section.  Users will see a list of available subject areas that were created from the existing RPD development.

[Screenshot: the subject area list shown when creating a new Analysis]

To create a new BI Published Reporting Report, click on ‘Report’ in the ‘Published Reporting’ section.  Users will need to search for an existing data model or quickly create one before accessing a BI Published Reporting report.

[Screenshot: the data model selection shown when creating a new Published Reporting report]

Both options let users generate reports simply by dragging and dropping facts and dimensions from the existing data model(s).

Even without in-depth knowledge of how the RPD or data model was created, any organization can use both BI Analysis and BI Published Reporting reports to provide answers to business questions.

Finally, by utilizing either the BI Analysis or the BI Published Reporting features within OBIEE 11g, you can save many hours and resources, and therefore focus more on analyzing information to make better business decisions for your organization.

Sources: [1], [2] Oracle Business Intelligence Enterprise Edition 11g. http://www.oracle.com/us/industries/media-entertainment/oracle-obiee-datasheet-203409.pdf

Author: Jon Kim, Performance Architects 



How to Apply Lean Practices to the Planning Process

January 8, 2014

The 80-20 rule (based on the Pareto principle) has been applied in numerous ways to identify and solve many management issues. In the world of financial planning and analysis, this cause-and-effect methodology unearths an issue where 80% of resources and time are spent creating the budget and forecast when in fact it should be the other way around: 80% of the time should be spent analyzing information and determining its impact on the organization. This post discusses how we can use lean practices in conjunction with a tool like Oracle Hyperion Planning to overcome this issue that plagues many organizations; we will use expense planning as an example:


Figure 1: Illustration of a lean expense planning budgeting cycle execution

Imagine a scenario in which we can identify expenses (typically operating expenses) for which the following year’s budget can be pre-populated via Hyperion and approved without too much iteration. For example, if rent is expected to increase by 5% and Social Security taxes by 2% in the following year, the budget for these expenses can easily be pre-populated by setting up calculation scripts to copy the data over with the associated increase. As long as the budget approvers can see that the budget increase over the prior year’s actuals is close to 5% for rent and 2% for Social Security, they can approve the budget without too much iteration. Using this example, we will look at how to extend this approach to 80% of the budget.
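In Hyperion this pre-population would be implemented with a calculation script; the short Python sketch below only illustrates the copy-with-uplift logic, with the account names and uplift rates invented for the example.

# Illustrative copy-with-uplift logic; in Hyperion Planning this would
# be a calculation script. Account names and rates are invented.
PRIOR_YEAR_ACTUALS = {"Rent": 120000.00, "Social Security Tax": 45000.00}
UPLIFT_RATES = {"Rent": 0.05, "Social Security Tax": 0.02}  # approved guidance

def prepopulate_budget(actuals, uplifts):
    """Seed next year's budget by applying each account's approved uplift."""
    return {account: round(amount * (1 + uplifts.get(account, 0.0)), 2)
            for account, amount in actuals.items()}

print(prepopulate_budget(PRIOR_YEAR_ACTUALS, UPLIFT_RATES))
# -> {'Rent': 126000.0, 'Social Security Tax': 45900.0}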

Most organizations start expense planning by looking at prior-year actuals. The goal is for your team to meaningfully pre-populate the following year’s budget and then spend its time wisely on the remaining activities, as follows:

1. Classify prior-year actuals by categorizing each expense into one of the following expense categories: run; one-time; one-time grow; and recurring grow. Once tagged accurately, you will notice that the expenses tagged as run and recurring grow typically make up 80% of your total expenses.

• Run expenses are the equivalent of “keeping the lights on” expenses that are recurring, e.g., rent, electricity, and full-time employees’ salaries.

• One-time expenses are just that – they are one-time in nature and will not recur in the future, e.g., roof replacement at an office location.
• One-time grow expenses are unexpected expenses that apply only to that calendar year, e.g., costs related to hiring consultants on specific projects.
• Recurring grow expenses are ones that will recur into the future, e.g., maintenance costs on newly acquired hardware.

2. Best practice is to tag expenses with the above categories as the business reviews actuals on a monthly basis. Alternatively, you can do this as a one-time activity at the beginning of your budgeting cycle, but this is not the recommended approach.

3. Forecast the months for which you do not have actuals data. For example, if you have only 10 months of actuals, you can run a calculation script that pre-populates the two missing months based on run rate, the last month’s actuals, or another approach that fits your organization’s expense patterns.
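As an illustration of the run-rate option, the Python sketch below averages ten months of invented actuals and uses that average for the two missing months; a Hyperion calculation script would apply the same arithmetic.

# Run-rate fill for missing months (figures invented for illustration).
actuals = [10200, 9800, 10500, 10100, 9900,
           10300, 10000, 10400, 9700, 10100]  # 10 months of actuals

run_rate = sum(actuals) / len(actuals)          # average monthly spend
forecast = actuals + [round(run_rate, 2)] * (12 - len(actuals))

print("Run rate per month:", run_rate)                  # 10100.0
print("Full-year actuals + forecast:", sum(forecast))   # 121200.0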

4. Once you have 12 months of actuals and/or forecasts, copy the run and recurring grow expenses as the base for the next budgeting cycle (the 80% in the next step). Don’t copy over the expenses tagged as one-time in nature, as they are not expected to recur. Alternatively, you can copy the entire budget and delete the expense items tagged as one-time.

5. Follow the 80/20 rule on operating expenses: 80% of budgets and forecasts are typically run expenses (keeping the lights on), adjusted for increases or decreases based on inflation, tax changes, etc. This 80% of the budget/forecast is usually operating expense and can be set up automatically via calculation scripts in Hyperion. As long as senior management and the budget approvers sign off on the guidance for inflation percentages, tax increases, etc., this 80% of the budget can be set up easily without manual input, reviewed by departments and budget owners, and approved by senior management, since the increases or decreases will match their guidance.

6. Some specifics for operating expenses related to capital are as follows (a simple depreciation sketch follows the list):

• On existing capital that is already live or currently depreciating, the depreciation schedule can be set up as a calculation and needs no manual user input.
• On existing capital that is on the books but not yet live, the depreciation schedule can be set up via a form or calculation script and needs no manual user input.
• On new capital that is not yet purchased or delivered but is expected to go live in the following year’s budgeting cycle, the depreciation schedule can be set up via a form and calculation and needs no manual user input.
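As a simple illustration of the schedules described above, the Python sketch below builds a 12-month straight-line depreciation schedule for a new asset expected to go live mid-year; the cost, life, and go-live month are invented for the example.

# Straight-line depreciation schedule for one budget year; months
# before go-live carry no expense. All figures are invented.
def straight_line_schedule(cost, salvage, life_months, live_month=1):
    """Return a 12-month expense schedule; months before go-live are 0."""
    monthly = (cost - salvage) / life_months
    return [round(monthly, 2) if m >= live_month else 0.0
            for m in range(1, 13)]

# New capital going live in month 4 of the budget year:
print(straight_line_schedule(cost=60000, salvage=0, life_months=60,
                             live_month=4))
# -> months 1-3 are 0.0, months 4-12 are 1000.0 each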

By combining this sound process with Hyperion’s capabilities, your organization’s time and resources can be better spent focusing on the remaining 20% of expenses, which typically concern the future growth of your organization (new projects, capital expenses, etc.). The time spent reviewing and analyzing these items is time and money better spent for you, your organization, and all of your stakeholders, both internal and external.

Author: Sreekanth Kumar, Performance Architects

 


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.