Monthly Archives: February 2016

Data Discovery versus Business Intelligence (BI)

February 24, 2016

Author: Andy Tauro, Performance Architects

In the ever-changing landscape of industries large and small, a question that arises in the minds of financial decision makers is, “What is data discovery, and how is it changing the approach to business intelligence (BI)?” At Performance Architects, we are also often asked, “Why do I need a solution for data discovery? Isn’t that what my BI solution is for? Isn’t my BI solution supposed to help me parse my data, gauge the pulse of the company or organization, and give me the answers that I need?”

While legacy BI solutions are very good at answering the questions that have already been asked, what about questions that arise from unexpected market factors, such as the introduction of a new product in a peripheral market that suddenly changes consumer behavior? That is what happened to the publishing industry with the advent of the modern smartphone. Suddenly, your customers expect more from a product they were perfectly comfortable with just a short while ago. At times such as these, unless you have a crystal ball that actually works, you are probably wishing you had a way to spot new trends as they evolve, giving you as much advance warning as possible so you can stay ahead of the curve.

(image: oracle.com)

Good purpose-built data discovery tools like Oracle Endeca Information Discovery (OEID) can incorporate new data sets (often of vastly different structure) into existing data, and can parse the updated data sets to pull out characteristics like sentiment and objects like products, places, and names from textual data. These can be quickly tabulated into existing reports, or visualized using graphing tools present in the toolset itself. This informs a data modeler, or data scientist, about data dimensions that may not have been visible before, and allows for updating BI models to ask new questions. Even existing BI tools like Oracle Business Intelligence Enterprise Edition (OBIEE) 12c or Oracle Business Intelligence Cloud Service (BICS) have add-on tools like Visual Analyzer that allow for the visualization of data sets with little to no involvement from infrastructure (IT) teams.

For this reason, data discovery tools (either purpose-built or add-on) can be of tremendous value to decision makers, allowing them to become self-sufficient by making relevant decisions that won’t become obsolete before they’re implemented. It is important to pick a solution that is easy to use, yet powerful enough to incorporate new data sets into existing models quickly and to efficiently parse them to provide the decision points that matter.

Need help determining what these tools are, or how they can help you? Give us a shout, and we will be glad to help.


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

EPM Automate in Oracle Planning and Budgeting Cloud Service (PBCS)

February 17, 2016

Author: Tom Blakeley, Performance Architects

EPM Automate is the utility used in Oracle Planning and Budgeting Cloud Service (PBCS) to manage automation routines, including uploading data files; running Oracle Hyperion Financial Data Quality Management Enterprise Edition (FDMEE) data rules; building dimensions; and managing the application. As part of most projects, we spend a good chunk of time with this utility, setting up business users for success with automation routines that refresh key components on a nightly basis. While batch scripting can get fairly complex quickly, there are a series of basic commands that any administrator can pull together to start their own automation routines. I’ve included some of these here, along with likely use cases, to help folks get started.

Case #1: On a nightly basis an aggregation business rule runs to calculate the Planning application, and then a map reporting job pushes data from one plan type to another.

Script Steps: Login, Run Business Rule, Run Map Reporting Application, Logout

As you can see in the screenshot below, I’ve created a very basic script that logs into the EPM environment, and then launches the Business Rule followed by the Map Reporting application. Nothing fancy here; the script moves to the next step, even if the prior step fails.

(screenshot)
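In case the screenshot is hard to read, the script amounts to just a few EPM Automate calls. The following is a rough sketch (command names per the EPM Automate documentation; the URL, credentials, rule name, and map name are placeholders):

```shell
#!/bin/sh
# Case 1: nightly aggregation, then a map reporting push.
# Hypothetical values -- substitute your own service URL, identity
# domain, credentials, business rule name, and map reporting definition.
URL="https://planning-mydomain.pbcs.us2.oraclecloud.com"

epmautomate login serviceadmin MyPassword "$URL" mydomain
epmautomate runbusinessrule AggAll                   # aggregate the Planning application
epmautomate runplantypemap FinToRpt clearData=false  # push data between plan types
epmautomate logout
```

As written, each command runs regardless of whether the previous one succeeded, matching the behavior described above; EPM Automate does return exit codes that a more defensive script can check.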

Case #2: Each weekend an automation script needs to run that uploads a new metadata file, runs a cube refresh, and runs a business rule to prep the cube for the next week.

Script Steps: Login, Run File Upload, Run Metadata Import, Run Cube Refresh, Run Business Rule, Logout

As you can see in the screenshot below, I’ve created a script that performs the initial login to the EPM environment and then initiates a file transfer from the on-premise file system up to the Cloud Inbox. From there, I’ve launched a metadata import using the file I previously placed in the inbox, followed by a cube refresh to push my metadata changes down to the underlying Essbase database. Finally, I’ve executed the cube aggregation and logged out.

(screenshot)
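A sketch of this weekend script follows (again assuming the documented EPM Automate commands; the file, job, and rule names below are placeholders, not values from the screenshot):

```shell
#!/bin/sh
# Case 2: weekly metadata refresh.
# Hypothetical values -- substitute your own URL, credentials, file,
# metadata import job, and business rule names.
URL="https://planning-mydomain.pbcs.us2.oraclecloud.com"

epmautomate login serviceadmin MyPassword "$URL" mydomain
epmautomate uploadfile Entities.csv                     # on-premise file to the Cloud Inbox
epmautomate importmetadata ImportEntities Entities.csv  # metadata import job
epmautomate refreshcube                                 # push changes to the Essbase database
epmautomate runbusinessrule PrepForNextWeek             # prep the cube for the next week
epmautomate logout
```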

Case #3: Each night we need to run a FDMEE data load, and then run an aggregation script on the cube. This refreshes the system with updated financial data, and preps it for the next day of planning and reporting.

Script Steps: Login, Run File Upload to the FDMEE Data Load Folder, Run FDMEE Data Rule, Run Business Rule, Logout

In the script below, I’ve first uploaded my data file to the Oracle Cloud instance from my local on-premise environment. Once it is up in the cloud, I process the data file using an FDMEE data rule called “Monthly Financials.” From there, I launch the cube aggregation and log out.

(screenshot)
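A sketch of the nightly data load might look like this. The data rule name “Monthly Financials” comes from the post; the file name, data load folder, periods, and load modes are placeholder assumptions based on the documented `rundatarule` syntax (rule, start period, end period, import mode, export mode, file):

```shell
#!/bin/sh
# Case 3: nightly FDMEE data load, then aggregation.
# Hypothetical values except the rule name -- substitute your own URL,
# credentials, data file, load folder, and periods.
URL="https://planning-mydomain.pbcs.us2.oraclecloud.com"

epmautomate login serviceadmin MyPassword "$URL" mydomain
epmautomate uploadfile Actuals.csv inbox/MonthlyFinancials   # FDMEE data load folder
epmautomate rundatarule "Monthly Financials" Jan-16 Jan-16 REPLACE STORE_DATA Actuals.csv
epmautomate runbusinessrule AggAll                           # aggregate the cube
epmautomate logout
```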

In the three examples above, I am just using basic commands to move data around and run rules. Typically, we would then want to wrap these commands in error detection and logging components. We might also integrate them with other local routines to provide end-to-end automation. I’ll cover this in another post – so for now, happy scripting.




Why Do Rolling Forecasts Matter?

February 10, 2016

Author: Andy Tauro, Performance Architects

Are you happy with your current forecasting process? Is it producing guidance that can actually be implemented? Or do you find yourself increasingly abandoning plans that were finalized with great effort, only to start over again? Are your goals being met? Is your current process flexible enough to incorporate new market factors that could impact your business, and does it allow a sufficient look-ahead window for planning? If you currently use an annual forecast whose forecasting window shrinks as the fiscal year progresses, you may want to keep reading.

A fiscal-year-based forecasting process typically forecasts into a “wall”: the fiscal year boundary. As the fiscal year progresses, the forecasting window shrinks, and any goals not met in past periods need to be shoe-horned into whatever duration is left. The end of the fiscal year usually brings with it a rush to “use it or lose it.” In other words, whatever resources you were given to accomplish your goals at the beginning of the year must be used, and the goals met, by the end of the fiscal year…otherwise you may end up losing “it” (resources, your function, your job, etc.). Furthermore, while you were building your guidance for the year, your competitors (or some other business entity that in the past did not affect your business) may have made a move mid-year that renders your fiscal plan nearly irrelevant.


An example would be a natural disaster, or an employee action, shutting down a manufacturing plant just before you were planning to ramp up production to meet holiday demand for your most profitable product. You now have only the rest of the year to rewrite your plan. If it took you a whole year to put your plan into action the first time, what are the chances that you can do it in six months? Will you be able to show your stockholders a plan that answers their concerns?


Using a fixed-size rolling window for forecasting can solve some of these problems. Such a model moves the forecasting boundaries forward during the calendar year rather than keeping them fixed, so you always get the same number of periods in which to build your plan. For example, if your fiscal year begins in April and ends in March of the following calendar year, and you are working on a plan that begins in July, you still get twelve months instead of nine. This allows the same amount of time to implement the plan with every iteration. Moreover, since the plan can cover steps across fiscal year boundaries, resources that could not be adequately utilized need not be lost; instead, the new plan can lay out how to use them to achieve the intended goal.
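The mechanics are easy to see with a quick sketch: from any starting period, a twelve-month rolling window simply extends twelve periods forward, crossing the fiscal year boundary when needed (GNU `date` assumed; the July 2016 start date is an illustrative example):

```shell
#!/bin/sh
# A rolling 12-month forecast window starting from a given period.
# A July 2016 start yields periods through June 2017, crossing an
# April-March fiscal year boundary instead of stopping at it.
start="2016-07-01"
for i in $(seq 0 11); do
  date -d "$start +$i month" +%Y-%b
done
```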

While such an approach can have some challenges, the process can be made less effort-intensive with scripted seeding and extrapolation methods for developing initial plans. This way, financial planners can focus on doing what they do best, tweaking plans based on human skill and experience, rather than recreating plans from scratch every time.

Intrigued? Looking for more information? Want to figure out where to get started? Drop us a note, and we will be happy to help you explore how moving to a rolling forecast could benefit your financial planning and analysis process.



Demystifying Oracle’s Business Analytics Cloud Offerings

February 3, 2016

Author: Tom Blakeley, Performance Architects

Cloud, Cloud, Cloud, Cloud! …

It is all we have heard about for the past year in the business analytics space, particularly as Oracle moves functionality traditionally found in on-premise applications into software-as-a-service (SaaS) offerings.

Cloud.

Okay – you get the picture. I thought I might take a few minutes to write about all of the different offerings to help folks wrestling with all the acronyms understand what is out there. If you are interested in more information, feel free to shoot me an email or comment below. We also have a host of content available with more details on the Performance Architects blog.

Oracle Planning and Budgeting Cloud Service (PBCS):

Oracle Planning and Budgeting Cloud Service, or PBCS, is Oracle’s SaaS offering for an organization’s forecasting and budgeting processes. It provides a single platform to build out an operational budget; quarterly, yearly, or rolling forecast models; and reports and analysis. PBCS is built on the same code base as Oracle’s on-premise EPM planning solution, Hyperion Planning, which has long been the strongest offering in the space. PBCS provides both a web-based platform and a Microsoft Office-based one (called “Smart View”), and both are rather intuitive. Customers are provided with two environments: a Development/Test instance and a Production instance.

Oracle Enterprise Performance Reporting Cloud (EPRC):

Enterprise Performance Reporting Cloud, or EPRC, is Oracle’s newest SaaS offering for an organization’s narrative reporting process. This tool provides a collaborative workspace geared towards organizations that go through an internal process to build out financial and management reporting packs. Users are able to divide up the work and collaborate on the development, review, and final publication of a reporting pack, using data sourced from a variety of different systems ranging from on-premise applications to Excel spreadsheets. EPRC complements the forecasting, budgeting, reporting, and analysis content in PBCS.

Oracle Business Intelligence Cloud Service (BICS):

Business Intelligence Cloud Service, or BICS, is Oracle’s new BI solution built for the cloud. This is more than just a rehash of Oracle Business Intelligence Enterprise Edition (the on-premise Oracle BI offering). Instead, BICS is a BI tool that includes traditional reporting; data traditionally found within spreadsheets; and unstructured and semi-structured data analysis. BICS bridges the gap between the traditional on-premise solutions built over the past ten years and the newer departmental offerings that typically do not scale well, though they do offer a strong data visualization layer. BICS sources data both from on-premise data sources and from data that users upload themselves into the cloud.

Cloud.

Look for new blog posts in the next few months on additional Oracle solutions launching in the business analytics cloud arena, and check out my recent ‘When Oracle Cloud EPM and BI Solutions are the Answer’ webinar, where I covered these cloud offerings in detail and provided some insight into the strengths and weaknesses of each. Email us for a copy of the slides and the replay!


