Monthly Archives: April 2013

The Need for Service Accounts to Manage Server Software during Upgrades and Migrations

April 29, 2013

Now that enterprise performance management (EPM) and business intelligence (BI) systems are easier to install and maintain, we are seeing our customers try their hand at installing or upgrading environments like Oracle Hyperion EPM or Oracle Business Intelligence Enterprise Edition (OBIEE) on their own. While the installation and setup of sophisticated software like this has become much more user-friendly than in the past, there are still some best practices we recommend that you follow: practices that seasoned system administrators have developed over years of experience supporting similar systems. One such practice is the use of “service accounts” to set up and manage server software.

Corporate security policies usually dictate that a network account should be assigned to, and controlled by, a single individual. This helps with traceability and accountability, not to mention the ability to set up customized security at a highly granular level. While this works well for user workstations, it does not lend itself well to a server environment.

The reason is that while the work performed on a workstation is mostly siloed and can be reproduced elsewhere, the work that resides on a server usually represents the work of multiple individuals who could be using the machine around the clock. If the deployed tools on such a machine are tied to an individual’s account, the machine becomes dependent on that individual. This is because software such as EPM or OBIEE uses wallets, inventories and other keystores to set up secure communication streams and to look up the lists that allow the software to function and be maintained. The security of these items is often entrusted to the operating system (OS) of the host machine, and they get placed in the most secure areas of the machine: the areas assigned to the user who set up the software.

In this case, these items are not available to another account. While some operating systems offer features that can overcome some of these limitations, the whole purpose of such restrictions is to make the system secure; allowing them to be bypassed could allow the system to be compromised.

To get around this, “service” accounts are created that are tied to a specific toolset or “service.” Access to, and responsibility for, such an account then rests with the support team in charge of the “service.” One major advantage of this is that the system is not held hostage to the availability of one user, and can be supported around the clock on a rotational schedule. Also, if a user leaves the team or the organization, someone else can take over without the concerns that arise from sharing passwords, such as access to another user’s “private” spaces like email accounts. “Service” accounts do not usually have such private spaces within an organization’s systems.

For these reasons, this is one of the prerequisites we insist on before we set up a new EPM or OBIEE environment. Customers whose support and security teams are experienced with such systems usually already have this in place, or require very little justification to do it. If you are in an organization where this is a harder sell, hopefully this blog post will help you justify the need for this type of solution.

Author: Andy Tauro, Performance Architects


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

Requirements Traceability Matrix Definition: Part 2 of 3

April 25, 2013

I generally think of the Requirements Traceability Matrix (RTM) as a document as well as an associated process. Process steps are tied to groups of columns in the matrix. In this post, we’ll explore the first two steps in the process.


Capture Business Requirements. In this step, requirements are catalogued during discovery sessions, one row per requirement. The matrix devotes a group of columns to this step, covering items such as a unique identifier, the requirement description, its source, and its current status.


Capturing good business requirements takes some practice. The first time through, using an RTM will be a bit slow. Fortunately, after a few tries, cataloging requirements gets easier and the final output will be of higher quality.

The unique identifier is particularly important as it builds some element of control into the spreadsheet. I wrote a simple macro, attached to a button at the top of the spreadsheet, that adds a new row and assigns the next ID number.


The macro also copies all the formatting, validation and lookups from the prior row. A minimal sketch of what such a macro can look like follows (this version assumes numeric IDs in column A of a worksheet named “Requirements,” with at least one data row already present):

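Sub AddRequirementRow()
    ' Illustrative sketch only: appends a new requirement row,
    ' assigns the next sequential ID, and carries over formatting,
    ' data validation and lookup formulas from the row above.
    ' Assumes numeric IDs in column A of a sheet named "Requirements"
    ' and at least one existing data row.
    Dim ws As Worksheet
    Dim lastRow As Long

    Set ws = ThisWorkbook.Worksheets("Requirements")
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    ' Duplicate the last data row; inserting the copied row brings
    ' its formatting, validation rules and formulas along with it.
    ws.Rows(lastRow).Copy
    ws.Rows(lastRow + 1).Insert Shift:=xlDown
    Application.CutCopyMode = False

    ' Clear the typed-in values (keeping formulas) in the new row,
    ' then assign the next ID.
    On Error Resume Next   ' nothing to clear is fine
    ws.Rows(lastRow + 1).SpecialCells(xlCellTypeConstants).ClearContents
    On Error GoTo 0
    ws.Cells(lastRow + 1, "A").Value = ws.Cells(lastRow, "A").Value + 1
End Sub

Attached to a button, a macro like this keeps the ID sequence intact without anyone having to touch the ID column by hand.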

Prioritization and Acceptance. This step should take place after the team has had an opportunity to review and revise the requirements. The matrix’s next group of columns records the team’s vote on the initial priority of each item, along with a status such as accepted, rejected or deferred; the initial ratings can be revised during review.


Each item logged in the matrix should remain in the document; the rejected and deferred items can easily be filtered out of the displayed records.
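For instance, a small macro in the same hedged vein as the one above could apply that filter (illustrative only; it assumes the status values live in column E of the “Requirements” sheet, with headers in row 1):

Sub HideRejectedAndDeferred()
    ' Illustrative sketch: hides rejected and deferred rows,
    ' assuming statuses sit in the fifth column (E) and row 1
    ' holds the column headers.
    With ThisWorkbook.Worksheets("Requirements")
        .AutoFilterMode = False        ' reset any existing filter
        .Range("A1").CurrentRegion.AutoFilter _
            Field:=5, Criteria1:="<>Rejected", _
            Operator:=xlAnd, Criteria2:="<>Deferred"
    End With
End Sub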

Author: Ron Woodlock, Performance Architects


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

Master Your Metadata: Oracle Data Relationship Management (DRM)

April 24, 2013

Many organizations maintain a host of disparate enterprise performance management (EPM) and business intelligence (BI) solutions, in addition to various transactional systems. Because these systems are disparate, they often handle information differently, and the information they output differs as well. Furthermore, such systems are often managed by different teams, such as Finance for EPM and IT for BI. As a result, many teams are starting to realize the need for data governance processes and data integration tools.

We advocate that you control the metadata structures at the source. What is metadata? It is the template into which data gets formatted: a structure that is not just easy to read, but much easier to understand and incorporate into decision making. This usually takes the form of a hierarchy of information (e.g., a Chart of Accounts).
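To make that concrete, here is an invented slice of a Chart of Accounts hierarchy, expressed in the simple parent-child delimited form that metadata tools commonly exchange (the account names and codes are purely illustrative):

Parent,Child,Description
Net Income,Revenue,Total Revenue
Revenue,4000,Product Revenue
Revenue,4100,Services Revenue
Net Income,Expenses,Total Expenses
Expenses,5000,Salaries
Expenses,5100,Travel

When every downstream system consumes the same hierarchy, the leaf accounts roll up the same way everywhere, which is exactly the consistency that controlling metadata at the source is meant to guarantee.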

Doesn’t this then add an additional step? Perhaps at first, but as it is incorporated into the processes that source the data for these information systems, the overhead is next to nothing.

To understand this, consider the capabilities of the industry-leading Master Data Management (MDM) tool known as Oracle Data Relationship Management (DRM).


The capabilities of this tool are highlighted in four key functionality areas:

  • Consolidate. Provides the tools to import and “blend” metadata from multiple sources, through both bulk and incremental updates. This makes it easy not just to set up, but to keep using on a periodic basis. Just prior to a close period, it is not uncommon to gather a list of accounts and hierarchies that must be cleansed and approved for downstream systems; DRM simplifies this process with its built-in tools.
  • Rationalize. By applying pre-built business rule logic, metadata can be cleansed and validated. In addition, properties can be added or derived, setting up the various nodes for consumption.
  • Share. Once cleansed, DRM can export information in the format required for integration with downstream systems. This makes the handover seamless, without manual involvement, reducing errors and time to deliver.
  • Govern. No system is worth much if it does not provide data security and auditing capabilities. DRM allows for a very granular level of access control, and supports auditing as well as approvals. Furthermore, by using “versions,” DRM can save off history and perform impact analysis.

The hallmark of a good MDM tool is the ability to work with any, and many, source systems to deliver information that can be easily fed into any downstream system. Examples of such systems include ERP (Enterprise Resource Planning), CRM (Customer Relationship Management) or Business Intelligence (BI) systems like multidimensional databases. That is where DRM shines, delivering ease of use with the consistency and auditability that corporate information systems badly need.

Author: Andy Tauro, Performance Architects

 


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

Requirements Traceability Matrix Definition: Part 1 of 3

April 23, 2013

A requirements traceability matrix (RTM) is a single document where business requirements are linked to the solution design and the testing strategy. Some of the benefits of using an RTM include:

  • Bi-directional tracing (e.g., requirements to solution elements and testing bugs to solution design)
  • Reduces project risk by systematically cataloging requirements
  • Encourages greater precision in writing requirements by linking design elements and testing strategy
  • Improves transparency and accountability

Some of the challenges of using an RTM include:

  • Requires a greater upfront effort to properly catalog requirements
  • Requirements often need to be rewritten to adjust to the various solution and test elements
  • Can become too cumbersome and time consuming

A good strategy for overcoming the challenges is to be flexible and open-minded when setting up the matrix. Every project is different and presents unique challenges. Adapt the matrix to the project rather than applying a one-size-fits-all approach.

I generally think of the RTM as a document as well as an associated process. The process steps are tied to groups of columns in the matrix.

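As a representative sketch (the column names here are illustrative, not the exact ones from the matrix), the column groups line up with the process steps roughly like this:

  Capture:                        Req ID | Requirement | Requestor | Date Logged
  Prioritization and Acceptance:  Priority | Status (Accepted / Rejected / Deferred)
  Design:                         Design Element | Build Status
  Testing:                        Test Case | Test Result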

In future posts I’ll provide more detail on each of these steps.

Author: Ron Woodlock, Performance Architects

 


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

Virtualizing the EPM Environment

April 17, 2013

Increasingly, we see clients embracing virtual machines (VMs) as the hardware platform of choice. The reasons for this are many:

  • Consolidating small machines onto a large machine to reduce hardware maintenance costs
  • Reduced time to set up the environment
  • High availability (if the host machine fails, the VM can be moved to another host and resume processing)
  • Prototyping (when a new solution is being developed with no comparable system to size against, it is better to start off with a VM, because resources like central processing unit (CPU) cores and/or RAM can be added to a VM fairly easily and quickly; with a physical machine, the hardware must first be acquired, and in some cases the machine is not upgradable and must be replaced)
  • An increase in available talent for setting up and administering VMs
  • Improvements in virtualization software (it has improved greatly in the last year or so, to the point that the performance of machines built on it is close to that of physical machines)

However, not all components of Oracle EPM (Hyperion) behave the same way in virtual environments. Web applications, like Oracle Hyperion Planning or Hyperion Financial Reporting, are not very CPU-intensive and work equally well on virtual or physical hardware. On the other hand, data-intensive applications like Oracle Hyperion Essbase and Hyperion Financial Management lean heavily on CPU as well as disk I/O. This usually means top-tier disk volumes, because disk throughput directly affects the performance of solutions built on these applications. Server CPU chips are also getting much better at crunching numbers, and a physical machine lets these applications exploit that power without a virtualization layer in the way. For these reasons, we generally do not recommend using VMs in a production environment for Oracle EPM (Hyperion) solutions.

For a non-production environment, like a development environment, we do suggest starting off with VMs. This allows for a “quick start” to the development process with minimal resources, scaling up as the prototype takes the form of a fully developed solution. At that stage, it should not be too difficult to determine the permanent sizing needed for the physical machines to be deployed in the production environment.

At this point, it would not be fair if I didn’t mention the Oracle VM software. Its hypervisor gives a guest (“client”) VM more direct access to the hardware of the host machine, so that the client VM’s OS can use its hardware-optimization features. However, very few operating systems are designed for this; the main exceptions are those based on the Linux OS, including Oracle’s flavor of Linux with its Unbreakable Enterprise Kernel. All of Oracle Hyperion EPM is fully certified on Oracle VM software, but not on any other VM software at this time.

“Going virtual” is a decision that takes significant commitment. Corporations going the VM route build out significant infrastructure, consisting of human as well as hardware resources, to deploy and support VMs optimally. A way out for companies without such resources is an enterprise cloud, which will hopefully be covered in this forum in the near future.

Author: Andy Tauro, Performance Architects

 


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

Another COLLABORATE 13 Newsflash: Oracle Business Intelligence Foundation Suite 11.1.1.7 Release and 12-Month Technology Roadmap

April 15, 2013

Last week we saw the release of Oracle Business Intelligence Foundation Suite 11.1.1.7, which includes an incredible number of new features and functions. During several sessions at COLLABORATE 13, Oracle highlighted the following new and improved visualization options:

• Performance Tiles. A new type of report that you can embed in a dashboard. Performance tiles are intended to illustrate high-level KPIs and metrics. These look really nice on a dashboard and are fully “drillable” – they’re not just decoration

• Freeze headers. A great solution for hierarchical columns that cause reports to extend beyond the edge of the screen. As a report consumer, you can now choose to freeze the header columns in a report in order to maintain the context

• Waterfall charts. This new charting option gives you a great way to visualize positive and negative contributions

• View suggestions. A new option you can select when creating reports in OBIEE, and a feature that has been part of the Exalytics platform for a while now. This feature gives OBIEE the ability to suggest the best type of graphical visualization for your data

Looking forward in the next 12 months, Oracle representatives mentioned that we can expect the following new features from Oracle Business Intelligence Foundation Suite:

1. Oracle BI Mobile Designer
• Pre-built reporting applications
• High-fidelity design and layout
• Multi-source, cloud, and task-oriented
• Support for Android and other mobile platforms

2. Oracle BI Applications will be able to use Oracle Data Integrator (ODI) instead of ONLY Informatica (finally).

3. Two entirely new BI analytic applications will be introduced:
a. Oracle Student Information Analytics
b. Oracle Indirect Spend Planning
• New optional module on top of Oracle Procurement and Spend Analytics
• Gives you the ability to forecast indirect spending
• Reduces spend with “what if” modeling
• Enables you to make better decisions about supplier allocations

Author: John McGale, Performance Architects


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

How to Shorten the Monthly Close with BI Alerts

April 11, 2013

Business intelligence tools can be used to shorten the monthly accounting close. As anyone who has been involved in closing the books can attest, much of the hard work begins after the journal entries are posted. Traditionally, accountants and analysts build spreadsheets or run reports to identify variances and try to make sense of them. The work involved isn’t trivial; the effort is akin to a treasure hunt in which one variance simply leads to another and another.

This is where business intelligence tools can add significant value. Rather than hunting through variances, you can set up alerts that identify only the variances meeting specific thresholds. For example, an analyst can review all variances that are greater than $100K and represent a change from the prior period of more than 10%. Under those thresholds, a $150K swing on a $1M account (a 15% change) would surface for review, while the same $150K swing on a $10M account (only 1.5%) would not. This approach sets up a “reverse treasure hunt”: the analyst starts with the treasure and broadens perspective (a drill up) as needed, saving a lot of time otherwise spent downloading data into spreadsheets or running reports.

Author: Ron Woodlock, Performance Architects


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

Live EPM Backups using Oracle Hyperion Lifecycle Management (LCM)

April 10, 2013

Enterprise Performance Management (EPM) systems house critical financial data for the consolidation, budgeting and forecasting processes, and therefore need to be backed up regularly: not just to restore the system in case of hardware failure, but also to address data corruption caused by human error. For these reasons, it is necessary not only to have a reliable backup of such a system, but to keep the window since the last backup as short as possible. This means the EPM system should be backed up often, and through tried and tested means, since it is not possible to test each and every backup to ensure that it can restore the system correctly and consistently.

While there are many off-the-shelf mechanisms for storing such backups, the means to reliably make them is still in question. For Oracle Hyperion EPM systems, we recommend incrementally backing up files on servers, as well as relational repositories, using “cold” backups.

The downside to this methodology is that backups are best made when the objects are not being accessed by either the software or the users of the system; otherwise, it is difficult to gauge the integrity of the backups. The best practice used to be to schedule downtime for EPM applications to allow for the backup. Unfortunately, there are many issues with scheduled downtime. First, software tools that work around such “file-being-accessed” situations add a time overhead that can result in system instability. Second, it is difficult to schedule other operational processes around the downtime, since such processes usually carry very costly service level agreements (SLAs) and need to finish in a certain time window. Finally, with systems going global, downtime in one geographical region may be peak time for another.

As a result, we maintain that a live backup is, eventually, unavoidable. Clients often ask us for a way to create a process that is not only supported, but consistent in backup integrity and duration. In such cases, we recommend a tool that Oracle includes with the Hyperion EPM suite: Lifecycle Management (LCM). LCM is intended as a utility to migrate objects between environments, such as application definitions or structures of solutions built using Hyperion Planning, Essbase, Financial Management (HFM), Hyperion Financial Reporting, etc. In a typical migration from a development to a QA and/or production environment, LCM exports from one and imports into the other. It also works just fine to re-import the exported objects right back into the same environment. Best of all, it comes with command-line utilities to perform these tasks, which can be scripted and incorporated into scheduled operational processes using any existing scheduling tool like CA Workload Automation AE (formerly AutoSys), cron, or Windows Task Scheduler.
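As a rough sketch of what that scripting can look like (the paths and file name here are invented, and this assumes an 11.1.2-era installation), a nightly backup job would rely on a migration definition XML saved from a Shared Services export, and simply invoke the LCM command-line utility, Utility.bat, against it:

call E:\Oracle\Middleware\user_projects\epmsystem1\bin\Utility.bat E:\backups\lcm\PlanningAppExport.xml

Credentials can be embedded in the definition file so the job runs unattended, and because the same definition is re-run on every schedule, the backup’s duration and contents stay consistent from night to night.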

With exports and imports using LCM fully supported by Oracle, this provides a reliable “tried-and-tested” tool with consistent processes and run times.

Author: Andy Tauro, Performance Architects


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

More News from COLLABORATE 13: Endeca as an OBIEE Data Source!

April 9, 2013

While attending COLLABORATE 13, I sat in on the “Oracle Business Intelligence Product and Technology Roadmap” session presented by Paul Rodwick, the VP of Product Management for BI at Oracle.

The major theme of the presentation was OBIEE’s new ability to natively connect to Endeca 3.0 (officially released yesterday). Surprisingly, the reverse is also true: you can connect Endeca to OBI as a data source.

The “secret sauce” of Endeca plus OBIEE, according to Paul, is exploration combined with search.  In other words, OBIEE’s rich exploratory model can be combined with the powerful search capabilities of Endeca.

So why is this all of a sudden so important?  Unstructured data is difficult to model in a traditional relational database because it’s basically a big block of text.  Unstructured data is also being created at such a rapid pace that it is nearly impossible to keep up with.

As a result, we need programs like Endeca that can “tag” data using a Key:Value metaphor known as a “faceted data model,” so everything can be loaded in one place and searched. Search examples could include:

  • What are our customers saying about our new service fare increase?
  • Why should I pay a checking fee if other banks don’t charge one?
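To make the Key:Value idea concrete, a single customer comment might carry a handful of facets along these (invented) lines:

Source: customer survey
Product: checking account
Topic: fees
Sentiment: negative

A question like the second one above then becomes a filter on the Product and Topic facets rather than a scan of raw text.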

The key takeaway is that Endeca is really meant for data discovery, not reporting, which makes it an excellent complementary tool for OBIEE as far as I’m concerned, because OBI has never really been admired for its search capabilities. That is ironic in a way, since nQuire (the predecessor of Siebel Analytics and OBIEE) started out as a search tool for networked content.

On a final note, you will be glad to know that Endeca can also be loaded with structured data from an Oracle RDBMS in order to take advantage of Endeca’s search engine. I’ll be curious to see how Endeca can be used to federate data so that users can effectively drill into Endeca at the bottom of their discovery path or click stream.

Author: John McGale, Performance Architects


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

How to Make Tax Season Easier: Oracle Hyperion Tax Provision

April 8, 2013

With the introduction of many regulatory requirements after the recent recession, corporate tax accounting and reporting is receiving more scrutiny and priority than ever before. And appropriately so, since it happens to be a major expense item on a company’s income statement.

Figure: Parallel processes for financial close and tax close (Source: Oracle Hyperion Tax Provision Whitepaper)

Gathering tax information during a tax close usually occurs independently of gathering financial information for management and financial reporting. It involves either duplicating parts of the financial management and consolidation process, like the posting of journals, or waiting until the period close is complete. As a result, the work effort is duplicated, or the filing is delayed because the tax close has to wait until the financial data is ready. Either outcome adds cost: from the delay, or from the additional resources the duplication requires, along with the significant overhead of coordinating, supervising and overseeing duplicate efforts.


Built on the popular financial consolidation tool Oracle Hyperion Financial Management (HFM), Hyperion Tax Provision (HTP) comes with pre-built packages to automate the gathering of financial data for tax reporting purposes.


Starting from a strong calculation and translation engine that translates data across multiple tax and currency rates, to multiple custom dimensions that let you customize data aggregation, to pre-built web data entry forms, HTP builds on the strong fundamentals of HFM to provide a solution that gathers tax information not only accurately, but also quickly and in a highly controlled and automated fashion.


HTP leverages the built-in controls for metadata and data that come with HFM, while providing an interface that is familiar to the finance department. Additionally, since it has its own application space and metadata, it gives the tax department the independence to run the process in the way best suited to them.

HTP also comes with an extensive set of reports built on the Oracle Hyperion Financial Reporting toolset. These highly formatted, print-ready reports are suitable for executive reporting as well as for detailed analysis by tax managers. Through Oracle Hyperion Smart View for Office, these reports can be integrated into Microsoft Office documents and presentations. Additionally, since HTP builds on the HFM platform, it can serve as a source of data for business analytics, whether through multidimensional databases like Oracle Hyperion Essbase or through the Oracle Business Intelligence Enterprise Edition (OBIEE) suite. Furthermore, the data gathered can be used to seed budgeting and forecasting solutions through tools like Oracle Hyperion Planning.

To summarize, HTP provides the tax department with the same strong data collection and management tools that the finance department is accustomed to. Putting the two departments on the same platform also improves coordination between them, and with it the financial close process, since the financial close cannot be complete without the tax close.

If you would like more information about how Performance Architects can help you implement Oracle Hyperion Tax Provision in your organization, please send us a note at communications@performancearchitects.com or comment below.

Author: Andy Tauro, Performance Architects


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.