
Yes, Master! Creating a Master-Detail Visualization in Oracle BI Cloud Service (BICS)

June 22, 2016

Author: Doug Ross, Performance Architects

Oracle’s BI Cloud Service (BICS) offers data analysts a wide range of visualization options through its Visual Analyzer component. One of the features of Visual Analyzer that may not be as widely known is the ability to create master-detail relationships between visualizations to help make analyzing data even more interactive.

Creating a master-detail view begins with logging into BICS and choosing “Visual Analyzer” from the home screen. From there, select “Create a New Project” and the list of available “Subject Areas” and “Datasets” will be displayed.

Note: All of the steps that describe the master-detail views are consistent across all of Oracle’s data visualization offerings:  BI Cloud Service, Data Visualization Cloud Service, and the Data Visualization Desktop application for Windows.

For this example, we will be using data from Performance Architects’ “A Higher Education Analytics Dashboard” (AHEAD) Higher Education QuickStart Program for Oracle BICS. The AHEAD offering includes rapid training, knowledge transfer, and implementation of a predefined BICS analytic environment that provides government data from the National Center for Education Statistics (NCES) and the Council for Aid to Education (CAE). For more information, visit Performance Architects’ AHEAD press release.

[Screenshot 1]

Once the “Subject Area” has been selected, click the “Create Project” button and the application will display a blank Visual Analyzer canvas.

[Screenshot 2]

For the initial visualization, hold the CTRL key and select one attribute and two metrics (“Institution State,” “Full Time Enrollment Count,” and “Average Professor Salary”). Then right-click on any of the selected columns and choose the “Create Best Visualization” option:

[Screenshot 3]

The resulting visualization is a scatter chart of the two metrics, with one dot per state.

[Screenshot 4]

Next, we will add a second visualization to the canvas. This visualization is a table view with columns showing details about individual institutions:

[Screenshot 5]

The two visualizations are displayed side by side on the canvas; we will then arrange them with the table view below the scatter chart. The initial view:

[Screenshot 6]

After moving the table to the bottom, the view changes as follows:

[Screenshot 7]

Next, we can add filters to the canvas to limit the results display. If we right-click on the “Control of Institution” attribute and select “Create Filter,” the attribute is added to the filter bar. From there, we will select the “Public” value from the filter to limit the displayed values:

[Screenshot 8]

[Screenshot 9]

If we add a second filter to the “Carnegie Classification,” we can select “High” and “Very High” research activity institutions.

[Screenshot 10]

At this point, the visualizations on the canvas are both controlled by the filter bar selections.

[Screenshot 11]

We can now see how to implement the master-detail option. First, click on the “Canvas Settings” icon in the upper right corner of the canvas.  Uncheck the “Synchronize Visualizations” option.

[Screenshot 12]

When “Synchronize Visualizations” is checked for a project canvas, all filters are applied to every visualization on the canvas, and any actions that affect the filters, such as “Keep Selected,” “Remove Selected,” and “Drill,” apply across the entire canvas as well.

If “Synchronize Visualizations” is turned off, then actions like “Keep Selected,” “Remove Selected,” and “Drill” only modify the visualization on which the action is executed. In unsynchronized mode, each visualization gets its own small filter bar above it.

The next step in creating the master-detail relationship between views is to identify one of the visualizations as the master for the canvas. In this example, the scatter chart will be established as the master by clicking on the gear icon in the upper right of the visualization and then selecting the “Use as Master” option.

[Screenshot 13]

Notice that after checking the “Use as Master” box, the scatter chart now displays the letter “M” in a blue circle next to the visualization title. This signifies that the visualization is the master. Also, be aware that there can be at most one master visualization per canvas.

[Screenshot 14]

Now that the scatter chart is the master view, clicking on any of the dots in the chart will impact the table view and filter the data based on the selected state value. For example, clicking on the dot for “Texas (TX)” shows institutions from Texas in the table view that match the filter criteria.

[Screenshot 15]

Adding other views to the canvas makes them responsive to the master visualization as well. Here we see a bar chart sorted by “Institution Locale” that also reflects the selected value from the scatter chart.

[Screenshot 16]

In conclusion, using the master-detail option in Oracle’s Visual Analyzer can enhance your data analysis by adding interactivity and insight.


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

Entropy on a Linux Server

June 8, 2016

Author: Andy Tauro, Performance Architects

A quick “Google” search for the word “entropy” yields this:
[Screenshot: Google search result defining “entropy”]

Recently, however, we ran into a completely different meaning of the word, one I had not heard since I was a senior in high school. It turns out the word has pretty significant relevance to Linux machines.

While building an Oracle Business Intelligence (BI) 12c environment from scratch on Red Hat Linux 6.x on a virtual machine, we hit a serious slowdown. Forget the “grab a cup of coffee while you wait” thing; this was more like “let’s go grab lunch, and maybe take the scenic route back.”

This didn’t really make sense, since we did not seem to be hitting network latencies, and the “ulimit” for open files was set to what we needed. Pretty soon the situation devolved into the “Config” tool repeatedly failing on different steps, apparently related to timeouts at various points, including setting up JDBC connections and WebLogic deploying the domain. After what seemed like the millionth time we backed everything out and restarted the process, we stumbled upon an article in the Oracle KnowledgeBase on My Oracle Support about an I/O connection error to the database. We didn’t get that specific error, but we had ruled out the usual suspects. The article said the Linux machine was running low on ‘entropy,’ which should usually be in the hundreds and ideally in the thousands.

After some more research on Google, we found an article that showed how to check available entropy, and ours was in the 30-50 range. Once we implemented the fix suggested in the article, things just flew and we got the performance we were expecting.
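The check itself is a one-liner against the kernel’s /proc interface. A minimal sketch follows; the entropy-daemon remedy mentioned in the comments is a commonly cited fix on RHEL-family systems, not necessarily the exact one from the article we found:

```shell
# Read the kernel's available entropy pool. Values below a few hundred
# can stall anything that reads from the blocking /dev/random device;
# values of 1000+ are generally considered healthy.
cat /proc/sys/kernel/random/entropy_avail

# Common remedies are entropy-gathering daemons such as rngd (from
# rng-tools) or haveged, e.g. on RHEL 6:
#   yum install rng-tools && service rngd start
```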

We researched those commands further, and a whole host of websites that discuss this setting suggest a JVM option that can be added to the script that runs during WebLogic startup, which accomplishes something similar: -Djava.security.egd=file:/dev/./urandom
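As a sketch, the flag can be folded into the JVM options that WebLogic’s domain startup scripts pick up; the variable name below follows the standard setDomainEnv.sh convention, so verify it against your own domain’s scripts:

```shell
# Point the JVM at the non-blocking /dev/urandom device instead of the
# blocking /dev/random. The extra "/./" is a well-known workaround for
# the JVM treating a plain "file:/dev/urandom" value as a special case.
EGD_FLAG="-Djava.security.egd=file:/dev/./urandom"

# In a WebLogic domain, a line like this would typically be added to
# bin/setUserOverrides.sh (or setDomainEnv.sh) so every managed server
# inherits the flag at startup:
JAVA_OPTIONS="${JAVA_OPTIONS:-} ${EGD_FLAG}"
echo "${JAVA_OPTIONS}"
```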

Wikipedia describes entropy in the computing arena as “the randomness collected by an operating system or application for use in cryptography or other uses that require random data.” In layman’s terms, this refers to the operating system’s available capacity to generate the random numbers that drive functionality, most notably encryption such as SSL communication, which is vital to enterprise applications today.

One last point: use caution with this setting in a production environment, because changing the source of this ‘randomness’ apparently can open the machine to vulnerabilities.

All things considered, this was certainly something wild and crazy that we were not expecting, though interesting nevertheless. It is not a silver bullet for all performance problems, but it is one more avenue to explore when the usual options run out. In our case, it made a night-and-day difference in the environment we were working in.



When to Run a Financial Report in Oracle BI Cloud Service (BICS) versus Planning and Budgeting Cloud Service (PBCS)

June 1, 2016

Author: Ron Woodlock, Performance Architects

Many organizations have more than one system that can be used to run financial reports because both the enterprise performance management (EPM) and business intelligence (BI) application markets are fairly mature and “grew up” separately. This is often a point of discussion during requirements and design as there isn’t one correct answer.

Since Performance Architects maintains a strong Oracle practice in both the EPM and BI arenas, we wanted to address the questions we’ve recently received about when to use Oracle Planning and Budgeting Cloud Service (PBCS) versus Oracle BI Cloud Service (BICS) to run financial reports.

We’ve listed the main (generic) items we ask our clients to consider when evaluating these options below. These elements also apply broadly when comparing EPM and BI solutions, although there are some nuances when talking about specific vendors or solutions. If you’re going through an evaluation or architecture process right now, please contact us directly so that we can help you address your specific situation in more detail.

Timing

Typically, data is loaded into a BI tool at regular intervals (e.g., nightly or twice daily). A BI solution almost never accesses data directly from source systems, and therefore is always somewhat out of date. Conversely, budget systems are often constructed to accept frequent actual data feeds to support budget creation and updating, which can provide an opportunity for more up-to-date reports.

Reports run during the budget process are therefore typically run out of the budgeting application (PBCS). The primary reason is that PBCS captures user-entered data and calculates other dependent values, which makes PBCS the “system of record” for planning and budgeting data, in the same way the general ledger (GL) is the “system of record” for actual data.

The dynamic nature of the budgeting process requires users to adjust their detailed line items and to immediately assess the overall budget after calculations have run.  This real-time feedback requirement is the primary reason there are timing differences between reports run out of PBCS and BICS.  The timing differences eventually get resolved once the budget process completes and final budget values can be passed to the BI environment.

Security

Details used to calculate values for the budget are generally “sensitive” information, and management prefers that this information is not released to a broader audience. In these situations, reports are created in the budgeting application to provide access to the data. A good example is staff planning, which often includes sensitive salary data and, at times, other human resources data.

Level of Detail

Similar to the security discussion above, there are often reasons for summarizing data in a planning application prior to exporting it to the BI environment for reporting. Individual salaries, or details about the specific trips driving travel expenses, are good examples of budget data that should be summarized for general (BI) reporting. Conversely, actual data used within a budgeting system is often summarized and would not be useful for certain types of analysis.

Data Scope

BI tools typically have an advantage when reporting includes information beyond budget and actual data.  The data storage structures supporting BI reports are constructed to allow for a broader scope of reporting and analysis.  The ability to represent complex data structures in an easily used format is a key differentiator for tools like BICS.

Complications

Reports and dashboards that are high density or have very specific formatting requirements tend to be a better fit for a BICS environment. BICS provides greater flexibility in formatting and presenting data and information, and there are more resources to draw on to address reporting performance challenges.

