Monthly Archives: October 2013

Big Data: What’s the Big Deal?

October 30, 2013

The case for gathering and analyzing big data streams is growing. A recent study by the MIT Center for Digital Business found that the more a company characterizes itself as data-driven, the better the company performs in terms of productivity and profitability [1]. So what is “Big Data”?

To start, we at Performance Architects define “Big Data” as a broad category of data that is of a higher capacity and more unstructured in nature than the data typically found in traditional financial reports and IT relational databases. Big Data size and subject matter vary by industry.

So what characterizes Big Data? There are examples in nearly every industry, but the common thread is the “3 Vs” of Big Data:

  • Volume: Big Data volumes usually start in the terabytes (1 TB = ~1,000 GB). However, it is not uncommon for companies to process petabytes (1 PB = ~1,000,000 GB) or more. It is estimated that Walmart collects more than 2.5 PB of data every hour from customer transactions [2]. 
  • Velocity: Big Data accumulates in organizations at a rapid pace, far faster than traditional data sets. The most frequently cited example of Big Data velocity is the stream of mobile-based social media opinions. 
  • Variety: Big Data are uniquely varied. Streams like social media and GPS tracking have only been ubiquitous for a few years, and they differ greatly from traditional Chart of Accounts (COA)-based views of a business. 

So what will adopting Big Data mean for an organization? As with many IT paradigm shifts, adopting Big Data means changes in hardware, software, and expertise. This does not necessarily translate into a multi-year, multi-million dollar investment, however.

In terms of hardware, the new trend is to break the incoming stream into parts and distribute processing across many smaller-scale nodes at once. So, instead of purchasing a few very expensive high-end servers, Big Data streams can be processed with clusters of off-the-shelf hardware.

On the software side, many of the offerings are open source and in use by some of the most advanced technology companies. Hadoop, for example, is an open source software framework that firms such as Facebook and Yahoo use to process enormous Big Data sets. The Hadoop framework also offers a platform to process streams through MapReduce programs, as well as to query them through tools like Hive.

And finally, processing and interpreting these Big Data sets will require some new skill sets, but these skills should be complementary to many of those already found in organizations today. Identify the people in your organization who best fit the Data Architect role, and let them experiment with the Hadoop framework to better understand the technology behind Big Data. Interpreting and correlating these data streams is yet another skill set, one that is being defined under the emerging Data Scientist role.

The bottom line: getting started with a Big Data initiative should be thought of in terms of weeks, rather than months or years. Start with the hypothesis that capturing and storing Big Data streams, and analyzing them against existing structured and newer unstructured information streams, may reveal new insights into your organization.

Author: Michael Bender, Performance Architects

Sources: [1], [2] “Big Data: The Management Revolution,” Andrew McAfee and Erik Brynjolfsson, Harvard Business Review, October 2012



ERPi & EPMA in Oracle EPM (Hyperion) Versions 11.1.2.1 and 11.1.2.2: Pure Synergy

October 23, 2013

Enterprise Resource Planning Integrator (ERPi) is an Oracle EPM (Hyperion) solution that allows Hyperion applications to interact with enterprise resource planning (ERP) systems. These interactions include elements such as drill-through, data sourcing, and metadata sourcing.

Enterprise Performance Management Architect (EPMA) is a Hyperion solution that maintains hierarchies and applications in a user-friendly, central location. In addition to its primary objective of application and metadata maintenance, EPMA also permits easy data synchronization across applications and databases.

As previously mentioned, ERPi can source metadata for applications by pulling members out of the ERP system. With this functionality, a user can send ERP members from the source system into EPMA. Once the members become available within EPMA, they can be distributed across applications, and EPMA then allows the user to specify application-specific properties and hierarchies. In addition to sourcing hierarchies, data can also be sourced from the ERP system into select Hyperion applications.

The integration of these two applications offers a lot of potential:

1. Seamless Application Management

Historically, the act of adding a new application to an existing process could be cumbersome. However, EPMA can easily add a new application into the mix. Previously created dimensions, hierarchies, and properties can be used to create a new application in a fraction of the usual time.

2. Synchronized Actual Data

In addition to keeping ERP members in sync, ERPi also provides the ability to keep data synchronized. By using the EPMA Batch Client (an out-of-the-box automation component) in addition to the ERPi automation tools, the process of updating all Hyperion applications with current actuals data can be completely automated.

The methods described in this article cover only a small piece of the Oracle EPM capabilities in this arena. For example, in 11.1.2.1 or 11.1.2.2, Hyperion Financial Data Quality Management (FDM or FDQM) works directly with ERPi to add even more data processing power (starting in 11.1.2.3, these two have been combined into Financial Data Quality Management Enterprise Edition or FDMEE). As with all out-of-the-box applications, there are limitations if the business requires a substantial amount of customization. However, there are excellent opportunities to implement simple and effective solutions if the out-of-the-box principles can be adopted.

Interested in learning more? Register for our “How to Implement Oracle EPM (Hyperion) Financial Data Quality Management (FDM): Utility Customer Case Study” or access the slides and webinar replay on the Performance Architects Learning Center.

Author: Tyler Feddersen, Performance Architects

 



The Top Five Things to Avoid When Implementing a New Enterprise Performance Management (EPM) Solution

October 9, 2013

We have all heard and read volumes of content on how to implement an EPM system. However, I don’t think anyone has ever explained how not to implement an EPM solution. This blog post discusses the top five things to avoid when implementing your new EPM solution.

5. Reinventing Bad Processes

Too many times (and often under protest), I have implemented the same old broken planning logic in a nice, shiny new system. This is certainly a low-risk way to implement a new solution; however, what is the real value here? The ideal time to challenge methodologies and fix broken methods is during an implementation. Service providers can help here, and can often share leading practices from both inside and outside your industry or vertical. Concepts that are important to explore include the use of drivers, inflation factors, spread methodologies, and allocations. The idea of linking operational and strategic planning into a single or rolling process continues to be a method that many organizations consider and adopt as well.

4. Stretching Beyond Technical Limits

We have all become spoiled over the past decade because we are accustomed to making computer-based systems do what we want. That being said, when you purchase ‘packaged’ software, it is typically designed to do certain tasks in a certain way. Don’t fight it! If you can nudge your requirements to fit the tool, then you may have a more efficient and useful solution at the end of the day. In addition, there may be other tools that better fit your requirements.

While it is my job as a consultant to stretch the capabilities of a software package to fit the needs of my customer, it is also my job to make my customer aware of what the product does well, and what the product does not do well. Put your service provider on the spot. Make them tell you where the efficiencies are, what the tool does best, and what other tools are available; and then make an educated decision on how/when best to stretch this functionality.

3. The “Apple Pie” Method

When you cook an apple pie, you stick it in the oven and then check on it every so often. In the system implementation world, you do NOT want to stick a bunch of consultants in a room and check on them from time to time. The result can be a system that does not meet expectations, and a group of employees who do not know how to maintain their new system.

Instead, work with your implementers! Dedicate professionals from your organization to work with the consultants for the full lifecycle of the project. This will not only allow the consultants to receive information more readily, but will also guarantee that the product you purchased is what was expected, with no surprises. In addition, your employees will be fully trained by the time you are ready for go-live, and can then train others without the assistance of consultants.

2. Spreadsheet Hell

Oftentimes, an organization reaps the benefits of a smooth and successful EPM implementation: an optimal system and a functional data repository containing actual, budget, and forecast data. However, when it comes to reporting, users once again find themselves in spreadsheet hell. It does not take long for users to pick up an ad-hoc reporting tool and create hundreds of conflicting and confusing spreadsheets, links, and formulas.

Instead, make a “reporting roadmap” part of your implementation. Utilize enterprise reporting tools to standardize reporting templates, and deliver them from a reporting repository where these reports are created and maintained by a reports administrator.

1. Double Dipping

I had to put a “Seinfeld” reference in somewhere! “Double dipping” here refers to asking your EPM system to do other tasks (sometimes several other tasks) beyond its true purpose. Through my work in the Oracle/Hyperion space, I often see giant Hyperion Planning applications providing several different business functions, such as financial consolidation, cash flow reporting, and reporting of historical actual data. If your model can successfully function this way in the long term, then you have successfully double-dipped and should be congratulated. However, most of the time, this methodology results in a solution that is not sustainable, performs badly, and does not scale.

The alternative? No one is asking you to sacrifice anything. Instead, utilize appropriate system architecture practices to design a full EPM and reporting solution. In the past several years there have been huge improvements in the manner in which EPM, BI, and reporting components integrate. In the Oracle/Hyperion world, systems that seemed somewhat disparate in the past (e.g., Planning, Essbase, HFM, HSF, OBIEE) are now closely integrated through provided tools (such as ERPi, FDM, DRM, ODI, EPMA, etc.). So have your cake, and eat it too! But do it through the proper design and architecture instead of a single “super application.”

Stay tuned for more on “double dipping” and specific Oracle/Hyperion architecture strategies.

Author: Chuck Persky, Performance Architects



How and When to Use the Oracle Hyperion Planning @RETURN Function

October 2, 2013

The @RETURN function was introduced with the 11.1.2.1 release of Oracle Hyperion Planning. Currently, @RETURN remains widely underutilized as a result of its relative obscurity within the EPM realm. However, the function can be put to several very useful ends: debugging during business rule development, validating data integrity, and simply informing a user of an invalid entry.

The syntax of this function in a calculation block looks like this: @RETURN(“Message”,ERROR). An important note: the Oracle documentation allows for WARNING, INFO, and ERROR as the message type; however, only ERROR actually works.
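As a minimal sketch of the typical placement, the function sits inside a member calculation block, usually within an IF branch (the member name "Units" is hypothetical):

    /* Minimal sketch: halt the business rule with a custom message
       when the branch is reached. "Units" is a hypothetical member. */
    "Units" (
        IF ("Units" < 0)
            @RETURN("Units cannot be negative.", ERROR);
        ENDIF
    );

When the @RETURN branch executes, the calculation stops, and the message is both displayed to the user and recorded in the job console.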

Developing and Debugging Business Rules

Whether created by an experienced developer or a beginner, Essbase calculations can cause headaches! Common miscues include failing to create expected blocks, misusing nested IF statements, and incorrect use of member formulas. By using @RETURN, checkpoints can be set up along the way to help debug code within the script. Here’s an example:

The intention of the calculation is to accept a “YearTotal” data input from the user and then spread the data evenly across the twelve months. However, the data is not appearing as expected.

Placing an @RETURN checkpoint after each step showed that the first FIX statement was being skipped because its target blocks did not yet exist. After telling the calculation to create blocks, the first FIX statement is processed correctly, as in the sketch below.
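A sketch of the checkpoint pattern follows; the scenario and year members ("Budget", "FY14") and the Calculation Manager runtime prompt {TotalInput} are assumptions for illustration:

    /* Debugging sketch; member names and the {TotalInput} prompt are
       hypothetical. */
    SET CREATENONMISSINGBLK ON;   /* create the blocks the first FIX needs */

    FIX ("Budget", "FY14")
        "Jan" (
            /* Checkpoint: if this message appears, the first FIX is
               being reached. Remove the checkpoint once verified. */
            @RETURN("Checkpoint 1: first FIX reached.", ERROR);
        );
    ENDFIX

    FIX ("Budget", "FY14")
        /* Spread the prompted annual total evenly across the months. */
        "Jan" = {TotalInput} / 12;
        "Feb" = {TotalInput} / 12;
        /* ...and so on through "Dec" */
    ENDFIX

Moving the checkpoint from step to step narrows down exactly where a rule stops behaving as expected.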


Validating Data Integrity

Hyperion Planning contains many techniques for keeping data integrity intact. However, prior to 11.1.2.1, Hyperion Planning lacked a streamlined method to outright prevent invalid data from entering the system. For example:

A user prompt will be utilized to accept a “YearTotal” data input. The business rule will then spread the data evenly across all months. However, the total planned amount for the specified entity should not surpass the total budgeted amount.

By using @RETURN, the system can not only notify the user of the invalid amount, but also prevent that data from ever entering the system.
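A sketch of that validation follows, assuming a budgeted "YearTotal" of 2,400 and reusing the hypothetical {TotalInput} prompt (the "Plan" and "Budget" scenario names are also assumptions):

    /* Validation sketch; member names and the prompt are hypothetical. */
    FIX ("Plan", "FY14")
        "Jan" (
            /* Validate first: halt before any data is written if the
               prompted total exceeds the budgeted annual amount. */
            IF ({TotalInput} > "Budget"->"YearTotal")
                @RETURN("Total plan exceeds the budgeted amount.", ERROR);
            ENDIF
        );
        /* Validation passed: spread the total evenly across the months. */
        "Jan" = {TotalInput} / 12;
        "Feb" = {TotalInput} / 12;
        /* ...and so on through "Dec" */
    ENDFIX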


Entering “2500” returns an error, as it is greater than the budgeted amount.


Entering “2400” successfully spreads data across all months, creating a “YearTotal” of 2400.


User Invalid Entry Guidance

In addition to protecting data integrity, the @RETURN function can deliver customized error messages to the user when data is rejected. If the function is stretched even further, it can deliver a message on either success or failure; however, because the function only accepts the ERROR type, all such jobs will show up in the job console as having an error. For example:

The message from the previous example has been further developed to display a more informative message. Users can then use the displayed information to assist in the data input process.
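One way to build the richer message is sketched below; it assumes @CONCATENATE plus the @HspNumToString custom-defined function that ships with Planning for converting a number to text, with the same hypothetical member names:

    /* Sketch: include the budgeted amount in the rejection message. */
    "Jan" (
        IF ({TotalInput} > "Budget"->"YearTotal")
            @RETURN(@CONCATENATE(
                "Input rejected: the total plan cannot exceed the budgeted amount of ",
                @HspNumToString("Budget"->"YearTotal")), ERROR);
        ENDIF
    );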


While not all Hyperion Planning projects will call for the use of @RETURN, the function can be a very powerful asset in the delivery of EPM solutions, whether it is used for debugging, data integrity, or simply to better communicate with users.

Author: Tyler Feddersen, Performance Architects


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.