Monthly Archives: June 2013

Strange Bedfellows – Oracle’s New Partnerships

June 27, 2013

In Shakespeare’s The Tempest, Trinculo is forced to shelter from a storm with someone he scorns, and says of it, “misery acquaints a man with strange bedfellows.” The storm facing today’s technology titans is cloud computing. Oracle has recently announced a number of cloud computing deals with bedfellows it once scorned.

Strange bedfellows #1: Oracle and Dell announced on June 4, 2013 an alliance that will enable Dell customers to deploy Oracle’s enterprise software on a joint infrastructure solution. The press release stresses that the combined solution would be high-performance and low-cost and would have superior customer support. What makes this alliance strange is that Dell and Sun (now part of Oracle) have traditionally been competitors.

Strange bedfellows #2: Oracle and Microsoft announced on June 24, 2013 a partnership under which Oracle’s software will be deployed on Microsoft’s Hyper-V and Windows Azure. The press release tag line describes the benefits: “Deal will help customers embrace cloud computing by providing greater choice and flexibility.” Larry Ellison was once quoted as saying, “It’s Microsoft versus mankind, with Microsoft having only a slight lead.” That quote is a few years old, and I think Microsoft’s lead has shrunk considerably since then, making this deal possible.

Strange bedfellows #3: Oracle and Salesforce announced on June 25, 2013 a nine-year partnership that will combine their cloud computing offerings. For its part, Salesforce will standardize on Oracle’s Linux operating system, Database, Java Middleware platform and the new Exadata engineered systems. This announcement comes less than a year after Salesforce announced a strategy of moving away from Oracle’s database in favor of PostgreSQL, an open-source competitor. The CEOs of these two companies have had a hot-and-cold relationship. I guess it is hot again, for now.

In twenty-two days, Oracle has announced deals with three rivals – to what end?

All three deals increase Oracle’s reach with small and mid-sized companies, something its traditional sales approach isn’t addressing. The Microsoft and Dell agreements provide Oracle the opportunity to grow market share with customers that would have otherwise been inaccessible. The deal with Salesforce not only eliminates PostgreSQL as competition for Salesforce’s business, but also opens up a channel to sell to Salesforce’s customer base.

Cloud computing may be the latest fashion or a new paradigm that is here to stay. Either way, companies big and small are looking to it as a way of:
• Cutting costs – of in-house hosting of applications
• Reducing risk – technical complexity, disaster preparedness / recovery
• Increasing flexibility – to scale environments and applications up and down without significant lead times and capital investment
These three deals broaden Oracle’s options with regard to offering cloud based solutions.

All three deals depart from Oracle’s traditional do-it-yourself approach. For this new approach to be successful, each partner will need to invest significantly to ensure the delivered solutions are tightly integrated and well supported. Otherwise, the cloud could turn into a tempest that no strange bedfellow can save them from.

Author: Ron Woodlock, Performance Architects


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

Financial Data Quality Management Enterprise Edition (FDMEE): An Old Favorite Gets a Makeover, with More Horsepower

June 26, 2013

With the release of Oracle Hyperion Enterprise Performance Management (EPM) 11.1.2.3, the Financial Data Quality Management (FDM) toolset got a much-needed overhaul. This tool has been popular with administrators and users of the Hyperion Financial Management (HFM) application since it was known as Upstream. As its capabilities expanded (and after it was renamed FDM), it became a leading solution for cleansing and loading data into Hyperion applications such as Hyperion Planning, HFM or Essbase, thanks to its strong mapping and drill-back capabilities. With the introduction of ERP Integrator (ERPi), the tool gained the capability to connect to a wide range of source systems like Oracle E-Business Suite (EBS), SAP BW or PeopleSoft.

However, FDM has always been restricted to a 32-bit Windows architecture, which limited scalability and throughput (the rate at which it could process data). With applications getting larger, and data refresh rates getting more and more into the realm of real-time availability, this was proving to be a significant bottleneck.


Enter FDM Enterprise Edition (FDMEE). It brings all of FDM’s core functionality onto the same J2EE platform that ERPi runs on, and it integrates Oracle Data Integrator (ODI) into one toolset. In previous versions, FDM, ERPi and ODI needed to be separately installed, configured and maintained, which made the solution more complicated to set up, not to mention enhance and maintain. With a single 64-bit, platform-independent J2EE platform, the tool is scalable and much easier to install alongside the rest of the Oracle EPM system. This also makes it easy to update via patches applied with Oracle’s OPatch tool.


What’s more, FDMEE comes with full Life Cycle Management (LCM) support that makes it much easier to build the solution once and then migrate it to Production or Disaster Recovery (DR) environments. Previously, the most commonly used approach was to recreate the solution in those environments.

So then what’s not to like? For new implementations, not much, since everything will be created fresh. For migrations from existing FDM solutions, VBA-based and VBScript-based custom scripting and adapter enhancements will need to be translated into the newer Java-based scripting tools. This is not a huge hurdle, as a large talent pool is available for Java-based scripting tools like Jython; it is just something that needs to be kept in mind and planned for when considering the move from FDM to FDMEE. Also, a lot of the custom scripting for mapping will be reduced, now that FDMEE supports multi-dimension mapping (previously this necessitated getting creative with the “Import Formats” and a whole lot of VBScript-based custom scripts). Less code leads to a more robust solution that can be maintained and extended by less technically-oriented personnel, like Finance managers who have a better understanding of the data.
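To give a feel for what such a translation involves, here is a hypothetical sketch of a simple conditional mapping rewritten from VBScript into Jython-compatible Python. The account ranges and target names are invented, and no actual FDMEE API objects are used:

```python
# Hypothetical sketch: a VBScript-style conditional mapping translated
# into Jython-compatible Python. The account ranges and target names
# are invented, and no real FDMEE API calls are shown here.

def map_account(source_account):
    """Return a made-up target account for a source account code."""
    code = int(source_account)
    if 1000 <= code < 2000:
        return "Assets"
    elif 2000 <= code < 3000:
        return "Liabilities"
    return "Other"

mapped = map_account("1500")  # "Assets"
```

The logic itself rarely changes in a migration like this; the effort is mostly in the mechanical translation and retesting.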

With promised capabilities like integration with Oracle Data Relationship Management’s (DRM’s) Data Governance module, or the ability to source data from EPM applications, there are many good things on the horizon for this tool. All this makes now a great time to make the move. Just be sure to plan, to allow sufficient time to do it right, and to take advantage of the pluses that FDMEE can bring to your solution.

Author: Andy Tauro, Performance Architects



Oracle’s BI Publisher versus Oracle Business Intelligence Enterprise Edition (OBIEE): Strengths and Weaknesses

June 18, 2013

BI Publisher has long been a companion application to Oracle Business Intelligence Enterprise Edition (OBIEE), since the early days of OBIEE version 10g, and is now part of the Oracle Business Intelligence Foundation Suite (BIF). This often causes confusion, since BIF refers to OBIEE and its companion products, while OBIEE refers to Oracle Business Intelligence Enterprise Edition itself. That tidbit of clarification aside, BI Publisher is often underutilized in an OBIEE implementation and many times is ignored. While I can’t say I’ve had the most memorable experiences with this tool, it should be considered more often as a viable reporting solution in an Oracle-based BI environment.

Oracle positions BI Publisher as the go-to tool for “pixel-perfect” reporting. If your report requires data to be arranged in a particular way, this is your tool. BI Publisher is bundled with a myriad of Oracle products under the names “XML Publisher” and “BI Publisher,” and it varies in product version and feature set depending on the product it ships with.

For instance, the version bundled with Oracle’s E-Business Suite (EBS) Release 12 (R12) is XML Publisher 5.6.3, whereas the BI Publisher version bundled with Oracle Business Intelligence Foundation Suite 11.1.1.7 is 11.1.1.7. We have been told through reliable sources at Oracle Support that the EBS version will soon be listed as 10g. BI Publisher can be purchased as a stand-alone product as well, and the standalone version still tracks with OBIEE’s version number.

BI Publisher’s strengths include:

  • Formats output on a form in any way
  • Creates custom data models based on any query
  • Supports multiple types of output
  • Offers a scheduling tool
  • Allows output bursting (the process of generating multiple documents from a single data set)
  • Runs as a lightweight J2EE application with a simple architecture
  • Supports multiple authentication protocols and single sign-on methodologies
  • Is ubiquitous within the Oracle technology stack
  • Allows true ad-hoc reporting

BI Publisher’s areas for improvement include:

  • Data models can only be leveraged within BI Publisher
  • Multiple data models tend to proliferate, since each can be based on any number of SQL statements
  • Offers only primitive prompting and user interfaces
  • Is intended mostly for delivered reports (unless served up through OBIEE dashboards)
  • Provides limited to no user interactivity (unless served up through OBIEE dashboards)
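To make the output-bursting strength above concrete, here is a conceptual sketch in plain Python (with invented data; the grouping and “render” steps stand in for BI Publisher’s actual bursting engine) of splitting one result set into per-group documents:

```python
# Conceptual sketch of output bursting: one query result is split on a
# burst key (here "region"), and each group becomes its own document.
# The data is invented and the "render" step is just a per-group total.
from collections import defaultdict

rows = [
    {"region": "East", "sales": 100},
    {"region": "West", "sales": 250},
    {"region": "East", "sales": 175},
]

def burst(rows, key):
    groups = defaultdict(list)
    for row in rows:
        groups[row[key]].append(row)
    # In real bursting, each group would be rendered to its own output
    # (e.g., one PDF per region) and delivered separately.
    return {k: sum(r["sales"] for r in v) for k, v in groups.items()}

documents = burst(rows, "region")  # one "document" per region
```

The value of bursting is exactly this: run the query once, deliver many tailored documents.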

The main takeaway here for me is that BI Publisher is clearly a tool meant for operational reporting, not analytical reporting. You can create reports based on a data mart and deliver analytical reports, but there is no concept of real user interactivity without OBIEE. Unlike OBIEE, which imposes a Kimball-style dimensional methodology on its central model, BI Publisher will let you use any old SQL as the basis for a “data model.” In the hands of non-technical users, this is a recipe for disaster in terms of maintenance costs and the overall durability of those reports over time. So BI Publisher is really still a report-based solution for operational-type reporting.

If your organization is upgrading from Brio, SQR, Oracle Reports, or the like, and you just want to be able to quickly port your reports “as is,” then BI Publisher is for you. It does a great job of letting you replicate any report, based on any SQL, with any inputs, and produce output just the way it should look.

Author: John McGale, Performance Architects



ETL vs. ELT: What’s the Difference?

June 13, 2013

This is a question that we get often from clients and potential clients. What’s the fuss with the sequencing of these three words? After all, there will be an Extract (E) from the source system; the extracted data will undergo a Transformation (T) process; and all this will undergo a Load (L) process into the target system. So why does it matter if the L goes before the T? The answer lies in the setup of the infrastructure to accomplish this process.


In a typical reporting system implementation, the source system is built before the reporting solution. As a result, the source system is optimized to perform the tasks it was meant to perform, and not much else. This is mainly because reporting requirements either were not considered at the time of system creation, or had not evolved enough to build out the reporting system. Examples of such source systems are Customer Relationship Management (CRM) or online transaction processing (OLTP) systems. In other words, when a new system is added to the environment, the source system cannot be touched, lest the entire environment end up rebuilt from scratch. As a result, the reporting or analysis system is an add-on that adds no load to the source system. Tools like Oracle GoldenGate are popular in the extraction stage of this kind of traditional environment, since they essentially read the log files of the source system to glean the data to be extracted, putting no load on the source system whatsoever.

Until a few years ago, it was normal to stage the data in an intermediate system before it was pushed into the target. This was simply because the target could not be taken offline to perform the load, or because the target system was optimized to retrieve and report (and did not have the resources to perform hard crunching of numbers or data). As a result, the intermediate stage would be optimized to perform calculations and transformations on the data, which led to the stage being called ‘Transform,’ since the data underwent a transformation there. This approach also kept the target reporting system independent of the implementation method used during the transform stage. As a result, many organizations implemented three separate systems to satisfy the requirements of each stage, each usually requiring its own hardware. This is the typical ETL – Extraction, Transformation, and Loading – system.


More recently, computing hardware has become more capable, and database management systems (DBMSs) are now more powerful, often working in concert with purpose-built machines like Oracle’s Exadata. As a result, systems that perform reporting can now also perform data calculations. In some cases, the transformation technology ties closely into the technology or platform used for reporting. For example, software like Financial Data Quality Management Enterprise Edition (FDMEE) ties closely into Enterprise Performance Management (EPM) applications, because they are often built on the same system and are integral to the solution. In this case, the ‘Load’ into the target system is performed before the data undergoes the ‘Transform’ on the same system for reporting. Among other things, this allows the reporting applications to drill back into the source system, allowing a data point to be traced all the way back to the source transactions that created it.

So is the difference between ETL and ELT purely semantics? Not necessarily. It is more a rethinking of the approach to transferring transaction data into reporting systems, to take advantage of changes in technology. And along the way, it has enhanced the reporting solution with added value like the tracing of data points. Traditional ETL was one-way only – transactional system to reporting. With ELT, it is possible to trace back from the reporting to the transactional systems as needed, while retaining the original requirement that the reporting system add no load to the transactional system.
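The ordering difference can be sketched with a toy pipeline in Python (data and the “transform” step are invented; a real implementation would stage on separate hardware for ETL, or push the transform into the target database engine for ELT):

```python
# Toy illustration of ETL vs. ELT ordering. The "transform" is just
# rounding an amount; the lists stand in for staging/target systems.

source_rows = [{"id": 1, "amount": 10.567}, {"id": 2, "amount": 3.141}]

def transform(rows):
    return [{"id": r["id"], "amount": round(r["amount"], 2)} for r in rows]

def etl(source):
    staged = transform(source)   # transform on an intermediate system...
    return list(staged)          # ...then load into the target

def elt(source):
    target = list(source)        # load raw data into the target first...
    return transform(target)     # ...then transform inside the target

# Both orderings yield the same reportable data; the difference is
# where the transformation work happens.
assert etl(source_rows) == elt(source_rows)
```

Note that in the ELT case the raw source rows exist in the target system, which is what makes drill-back to source transactions possible.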

Author: Andy Tauro, Performance Architects



Oracle Business Intelligence Enterprise Edition (OBIEE) 11.1.1.7 Sample Application Now Available

June 12, 2013

The new Oracle Business Intelligence Enterprise Edition (OBIEE) sample application (V305) is available as of June 12, 2013. The new sample application does a good job of showcasing all the latest and greatest features of OBIEE 11.1.1.7.0. As with the prior release, the sample application is deployed only on Oracle VirtualBox (OVB) – there is no longer an option to deploy directly on your machine (well, not without a lot of manual effort!). Having said that, this blog post summarizes the highlights of what is available in the OVB deployment of the OBIEE Sample App.

Application Versions

  • OEL 5.9 64-bit
  • OBIEE 11.1.1.7 Enterprise install
  • Oracle Database 11.2.0.3, AWM 11.2.0.3.0
  • APEX 4.2 & APEX Listener 2.0
  • ORE 1.3 & R 2.15.1
  • ENDECA 3.0, EID Studio
  • EPM 11.1.2.2 plug-and-play companion
  • Utils: Start Scripts, Map Builder 11ps5
  • SQLDev 3.2.20

Deployment and Scalability

  • A 64-bit enterprise install (finally!) 
  • Auto IP recognition

New Display Features

  • Interactions
  • Breadcrumbs
  • Smart View Integration
  • All new standard Views (Performance Tiles, Waterfall Charts)
  • New print and dashboard layout options (“freeze panes” for dashboards)
  • Map lines, map targets, non-BI layers…and more
  • Blob images content integration
  • Mobile-optimized pages


BI + Essbase Integration

  • Provides the ability to spin off Essbase cubes from a dashboard
  • Direct calc script/MaxL running and creation using Action Framework
  • Direct contextual write-back to an Essbase cube from OBIEE using Action Framework
  • Essbase as a target for the Aggregate Persistence wizard
  • Dynamic selection of Essbase aliases in the RPD

OBIEE Custom Function Interactions

  • JS additions to Userscripts.js
    – AF Launch of Contextual LSQL or ODBC procedure (clear cache, write back, etc.)
    – Popup contextual drilling from any report
    – Fully source OBIEE from any web service
    – Leverage UTL_HTTP database, within/outside same Origin Policy
  • Actions leveraging OBIEE Go URLs
    – Running BI contextual searches from any report (what to search, where to search, etc.)
  • Other custom dashboard JSs
    – Incrementally update/edit presentation variables
    – Collapse or expand all dashboard sections on a page
  • Custom Java program utilities
    – Concurrency simulation (monitor system while large volume of users/sessions/reports are multithreaded on it)
    – Physical SQL Generator (generate all your physical SQL without hitting your databases)

BI + Db Advanced Analytics

  • OBIEE end users can seamlessly leverage database capabilities
  • Frequent Itemset (native market basket analysis capability)
  • ‘Model’ clause calculations to project future values
  • Source OBIEE from any web service using Db UTL WS capability
  • OBIEE interacts with Spatial NDM (Network Data Model) via WLS servlet
  • OBIEE explores huge data sets with Db Descriptive Analytics, including configuring report-to-DB interaction for interactive and performant analysis on billions of rows (granular source)

BI + Oracle R Enterprise (ORE) Integration

  • OBIEE CEIM consumes ORE result sets as a data source
  • OBIEE natively exposes ORE formats such as tabular, png, BLOB and XML
  • Author ORE scripts directly from the OBIEE dashboard with no RPD interaction
  • Dashboard end user transparently consumes ORE analysis and visuals
  • ORE runs analysis on OBIEE Common Enterprise Information Model (CEIM) via JDBC

BI + Endeca Integration

  • Integrates Endeca + Dashboard, with OBIEE interacting/drilling directly into Endeca
  • Endeca directly sources OBIEE WebCat and RPD
  • Includes Endeca 6M Rows+ Airline demo application
  • OBIEE sends EQL results via JS connection to Endeca WS

BI + EPM/FR Integration

  • Installs default EPM
    – Includes Essbase Server, EAS, EIS, APS, Studio, FR, EPM Workspace
    – Starts automatically with OBIEE startup, with minimal system overhead
  • Financial Reporting Studio enabled to:
    – Run against OBIEE Essbase installation
    – Webcat-stored FR reports example, with an embedded dashboard
  • Standalone EPM “tar” available as plug and play into SampleApp V305
    – Includes Foundation Services, full Workspace, Oracle Planning, FR, APS and Essbase
    – Can easily be expanded to more EPM web applications

BI 11.1.1.7 Large Dataset-Enabled

1. Medium-sized schema BISAMPLE_EXA

  • Generic 5M rows fact, ~800MB
  • Dense dimensionality: 10k Cust x 200 Prods x 200 Empl x 20 Offices x 3 Yrs
  • OOB comes with:
    – ORCL Granular Schema,
    – ORCL Aggregate Schema (and scripts)
    – Essbase Aggregate Cube (and scripts)
    – Enhanced Turnkey Data Inflator Scripts to massively grow the data

2. Medium-sized schema BI_AIRLINES

  • “Real life” 6.3M fact rows, 5.5GB
  • Dimensionality: 7k Flight# x 5k Air Routes x 20 Carriers x 1 Yr
  • OOB comes with:
    – ORCL Granular Schema
    – ORCL Agg Schema (and scripts)
    – TT Agg Schema (and scripts)
    – ESSB Agg Cube (and scripts)
  • Can be swapped with actual Exalytics Airline demo dataset (130MRows, 15 GB)

Author: John McGale, Performance Architects



Oracle Exalytics Evaluation Metrics

June 10, 2013

It may seem self-evident that porting an existing application onto an Exalytics server will improve performance. However, anecdotal evidence of a possible performance improvement is generally not enough to justify an organization’s adoption of new technology. What is needed is a comparative set of metrics on performance and cost between Exalytics and the legacy platform.

As a result, we’ve put together a set of measures and sample metrics for an Oracle Business Intelligence Enterprise Edition (OBIEE) application to assist in the identification of high quality candidate applications for migration to Exalytics, as well as for evaluating the performance and cost of those pilot applications once migrated.


Possible performance metrics include:

  • Data set size. Given the capabilities of the Exalytics environment, it would be best to select a large data set. Metrics in this category might include current footprint and average growth in data set size over a specific period of time.
  • Data usage. A large data set that isn’t used isn’t going to challenge the Exalytics environment. Candidate applications should have significant usage patterns (e.g., high peaks and breadth of data queried).
  • Peak number of concurrent users. The number of concurrent users is a general indicator of the volume of data being accessed and somewhat overlaps with other volume and size metrics, but is a good comparative measure.
  • BI Answers response times. Select a subset of BI Answers queries using some predefined criteria such as run time, most used, largest number of run-time calculations, etc. These measures should be consistent across evaluated applications. 
  • BI Dashboard response times. Since dashboards often have a high profile, it may be helpful to include dashboards with metrics that are used by decision makers. Again, select dashboards consistently across the population of applications using predefined criteria. 
  • Delivers response times. The response times for Delivers are less visible to the end-user. However, these jobs may have been set up using Delivers because they have long run times. In this case, there may be an opportunity to redevelop these as Answers or Dashboards and provide a quick win. Regardless, predefined criteria should be developed for scoring across applications.

Unlike the performance measures, possible Total Cost of Ownership (TCO) analysis criteria will require some creativity to compare the legacy and Exalytics environments effectively. Installation comparison criteria are likely to be difficult to develop. If specific time or cost measures are unavailable, some quasi-anecdotal measures can be adopted. To do this effectively, put language in the scoring (e.g., installation requires significant troubleshooting = score of n). For each of the measures, keep in mind the objective of determining the relative cost of each platform compared to the Exalytics environment.

Here is an example of how you could score your existing environment versus your prospective Exalytics environment:

[Sample scoring table comparing the existing environment to a prospective Exalytics environment]

Measures and metrics are going to include some subjectivity and could be open to critique. To mitigate this risk, include key stakeholders in the development or approval of the measures in advance. Developing measures and metrics, as well as capturing data for analysis, will support the overall Exalytics evaluation and adoption strategy.
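As a purely illustrative sketch of such a comparison (the criteria, weights, and 1–5 scores below are all invented), a weighted score could be computed like this:

```python
# Hypothetical weighted scoring of a legacy platform vs. Exalytics.
# Criteria, weights, and 1-5 scores are invented for illustration only.

weights = {"install_effort": 0.2, "query_response": 0.5, "admin_cost": 0.3}

legacy_scores    = {"install_effort": 2, "query_response": 2, "admin_cost": 3}
exalytics_scores = {"install_effort": 4, "query_response": 5, "admin_cost": 4}

def weighted_score(scores, weights):
    """Sum each criterion's score multiplied by its weight."""
    return sum(scores[k] * w for k, w in weights.items())

legacy_total = weighted_score(legacy_scores, weights)        # ~2.3
exalytics_total = weighted_score(exalytics_scores, weights)  # ~4.5
```

Agreeing the weights with stakeholders up front is what keeps a subjective exercise like this defensible.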

Author: Ron Woodlock, Performance Architects



The Myths and Realities of Oracle Business Intelligence (BI) Applications

June 6, 2013

The question of build versus buy is an old and much-debated topic in the field of information systems and data-related technologies. Oracle has clearly been selling the “buy” proposition when it comes to the metadata model in OBIEE, in the form of the Oracle Business Intelligence Applications (OBIA), and there are many great and valid reasons for this that I would like to discuss.

Back in 2006, I began my career in what would become the Oracle Business Intelligence Enterprise Edition (OBIEE) solution arena. Prior to that, I had been a Siebel CRM transactional system developer, so I definitely had a major advantage when it came to understanding the Siebel Sales BI application that my company purchased as part of Siebel Analytics: I knew all the Siebel CRM “base” tables. In addition, I could easily understand what the Informatica jobs were doing in terms of pulling together information from Siebel, and what the “core” business model was doing in terms of further consolidating and organizing this information. This was really the ground floor of what was to become Oracle’s present-day “BI applications.”

Since those days, very little has changed about what comprises a BI application. BI application building blocks include:

  • A pre-built set of real physical tables that are organized into a star schema
  • A set of pre-built ETL code to load those tables
  • A set of pre-built tables and triggers in the transaction system to pick up incremental changes to data
  • A pre-built set of meta-data based on the star schema structure
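The star-schema building block above can be illustrated with a minimal, self-contained sqlite3 example (the table and column names here are invented; a real OBIA schema has many conformed dimensions and prebuilt ETL behind it):

```python
# Tiny star schema: one fact table joined to one dimension table.
# Names and data are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales  VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# A typical BI query: aggregate the fact table, grouped by a dimension
# attribute -- the kind of SQL a prebuilt metadata layer generates.
rows = conn.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.name ORDER BY d.name
""").fetchall()  # [('Gadget', 75.0), ('Widget', 150.0)]
```

The "buy" value of OBIA is that thousands of such tables, loads, and mappings arrive already built.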

These BI applications aren’t too complicated if you think about them this way, are they? Now, all you need to do is vary the transaction system and – voilà – you have what are known today as the Oracle BI Applications (OBIA):

  • Oracle Sales Analytics (my favorite!)
  • Oracle Service Analytics
  • Oracle Financial Analytics
  • Oracle HR Analytics
  • Oracle Marketing Analytics
  • Oracle Order Management Fulfillment Analytics
  • Oracle Vertical (Industry Specific) Analytics 
  • Oracle Contact Center Analytics
  • Oracle Supply Chain Analytics

If you vary the ETL tools as well, you will now arrive at Version 11.1.1.7 of the BI applications – where Oracle Data Integrator (ODI) becomes the centerpiece (and only!) ETL solution. Don’t worry…support for Informatica will be back soon! I can imagine there will be many customers skipping the ODI-only version of BI applications – especially if they have invested in an excellent ETL tool like Informatica.

So why purchase Oracle’s BI applications?

Here are some of the reasons we discuss with our clients when they are evaluating “build versus buy:”

1. Lower Total Effort.

Given what I know about Oracle’s E-Business Suite (EBS), or even the transactional tables of Siebel CRM, I would never want to embark on building a data warehouse or an OBIEE metadata model from scratch – given that this has already been done by Oracle. The man-hours alone to create a data warehouse would far exceed a year – and even then, you would not be in lock-step with Oracle’s team, which changes base tables quite often. With the BI applications, you can deliver not only a data warehouse, but a metadata model, along with multiple prebuilt applications, in less than half the time it would take you to build it all from scratch.

2. Dollar Cost.

The caliber of technical personnel you would need to undertake such a project would be expensive. Whether you outsourced or hired staff, it would cost a small fortune to fund a “from-scratch” data warehousing and integration project that would even have a chance of being done within a single year.

3. You Can Customize It Anyway.

No organization today has needs so generic that its transaction system requires no customization of a BI application (aka: there are always customizations based on the realities of the business you are in). Oracle does a great job of getting you most of the way there – you just need to fill in the gaps.

4. Pace of Technology.

Oracle maintains a small army dedicated to keeping the BI applications in sync with the latest and greatest changes that occur within its transaction systems. I’m not saying there are never bugs or issues, but by and large, Oracle does a better job than any company could manage in-house of keeping things in sync – especially given the secret and proprietary nature of future versions of its tools. Internally, Oracle will already be developing the next version of the BI applications against versions of transactional systems that are not even out yet.

Author: John McGale, Performance Architects

