Monthly Archives: January 2016

Modeling and Analysis in the Cloud – Defying Gravity to Gravitate Towards the Clouds

January 27, 2016

Author: Sree Kumar, Performance Architects

A cloud-based analytics solution can really change the way you view things, from the tiniest bit of data to the most complex business problem you are trying to solve; however, you’ll want to proceed with caution, as this can impact the people, processes, and technology within your organization. There are several complexities to understand before you embark on this gravity-defying experience, so to speak. In this post, I hope to help you understand some of the challenges, benefits, and risks of moving your modeling and analysis to the cloud.

One of the things you have to consider from an organizational perspective is whether you already have an on-premise (non-cloud-based) solution. In such cases, migrating all of your processes, data, models, and analytics to a cloud-based solution is not straightforward given the current evolutionary state of the cloud. Needless to say, this only matters with a product that supports both options; if you are starting with a cloud-based or software-as-a-service (SaaS) solution in mind, migration is not an issue you need to consider.

Most cloud-based products tout the collaborative benefits of their solutions. While this is accurate from a product standpoint, the collaborative benefit also extends through the design and build stages. It’s not that on-premise solutions prevent a collaborative approach; it’s that cloud-based solutions force you into an agile mode of thinking, designing, and building. The benefits of an agile methodology were well documented long before cloud-based solutions popped up – the cloud just makes an agile approach easier to follow.

Let’s now take a look at the cloud from a design perspective. The key thing to remember is that a cloud-based solution still doesn’t absolve you of the need to design your process and application correctly. A bad design is still just that – a bad design. Bad designs will continue to plague your organization with the same bottlenecks that existed before you decided to move to a cloud-based solution. In such cases, the cloud will most likely bring those bottlenecks to the surface much faster because of its inherent flexibility and the transparency it gives to all processes and data.

Security is another area that plagues analytics and modeling. It’s easy for organizations and employees to protect archaic ways of running their business in the name of security. While this may be justified in very specific circumstances, it is not the case for most of the analytics and modeling we see. Moreover, when your solution is designed with the right level of access and security from the start, you shouldn’t run into situations where security is playing catch-up. In addition, cloud-based solutions give you more flexibility in designing security around the different layers of your business – applications, processes, models, tasks, or data.

Change management is sometimes put on the back burner when organizations embark on a cloud-based solution. Change management is critical to any change an organization makes, but it becomes an especially critical success factor when moving to the cloud. Cloud-based solutions are certainly adopted quickly, but that speed most likely comes at the cost of losing some control. For example, data hoarders who reveled in a non-cloud environment lose control of the data they used to manage as a result of its increased transparency. Resistance to change thus sometimes amplifies with the move to the cloud, and it reverberates through the organization at a much faster pace than with non-cloud solutions, primarily due to the transparent and open nature of the cloud. Just remember that managing the change is as important as, if not more important than, actually making the change to the cloud.

With the cloud, we sometimes forget to evaluate things we used to take for granted. For example, a simple task like printing a report may not be as straightforward as you think in the cloud. Cloud-based applications are often designed with the future, and therefore without paper, in mind. We might have gone paperless in a lot of areas, but for many analysis and modeling tasks you may still need printed reports rather than viewing the cloud on a tablet. Again, this is just one simple example – there could be several tasks you need to evaluate that specifically impact your business.

Just remember that while gravitating towards cloud-based solutions can transform the way you analyze, model, and transact, the cloud remains an enabler that helps you manage your business better. If a process was cloudy to begin with and fraught with bottlenecks, transporting it to the cloud will leave it just as cloudy – but you may be able to identify those bottlenecks a little earlier and more easily. Process can indeed be your friend, and moving to the cloud can make the friendship a little closer if designed right.


© Performance Architects, Inc. and Performance Architects Blog, 2006 - present. Unauthorized use and/or duplication of this material without express and written permission from this blog's author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Performance Architects, Inc. and Performance Architects Blog with appropriate and specific direction to the original content.

Installing Oracle Database 12c

January 20, 2016

Author: Tom Blakeley, Performance Architects

At Performance Architects, we have a lovely sandbox environment where many of us work on pilot programs, demos, and code snippets. It’s a nice central server that we all share access to, with plenty of horsepower and space (to blow things up). Recently, we went on a journey to revamp our installs on this server, including moving from Oracle Database 11g to Oracle Database 12c, upgrading Oracle EPM (Hyperion) to 11.1.2.4, and moving Oracle Business Intelligence to 12c. As I sat down to do the Oracle 12c install, I thought it might be a good idea to throw together a simple guide – so here you go! FYI: this install is for Microsoft Windows x64 (Windows Server 2008 R2).

Downloading the Assemblies:

  1. Obviously the first step is to locate and download the 12c assemblies/installers from Oracle. To begin, navigate to the following site: http://www.oracle.com/technetwork/database/enterprise-edition/downloads/index-092322.html
  2. We are installing on Microsoft Windows x64, so I’ll be downloading both File 1 and File 2. I’ve gone ahead and confirmed my Oracle credentials, and downloaded the two files to a DB12C Installers directory. This will serve as the home for my download files, and eventually the Installer I will unpack from these downloads.
  3. With the downloads complete, it is now time to begin the unzipping process.
  4. The first thing to do is to create a folder to house the Installer; I’ve created a folder called “12C_Installer”.
  5. Next, I’ll unzip both of the Oracle assemblies into this 12C_Installer directory. I’m using 7-Zip as my utility, as I find it quick at this. I’ll unzip the first file into 12C_Installer, and then the second into the subfolder the first unzip created – winx64_12102_database_1of2.
  6. Once the files are unzipped, the installation folder should contain the unpacked database files.
  7. With the files successfully unzipped, I’ll navigate into the “winx64_12102_database_1of2” folder, down to the “database” folder.
  8. Once here, I’ll right-click and run “setup.exe” as an Administrator. The 12c Installer should launch.
  9. From here we can begin the Oracle 12c installation.
  10. First, I’ll go ahead and uncheck the tick box next to “I wish to receive security updates via My Oracle Support,” as I prefer to check for updates manually. Then I’ll click “Next.” Note: confirm the warning message.
  11. Here we are presented with a variety of installation options. I am going to create and configure a database. I’ll leave this checked, and click “Next.”
  12. On the next screen I’ve selected to perform a “Server Class” installation, as this yields the most configuration options.
  13. I am going to be performing a “Single Instance database installation” of Oracle Database 12c, so I’ll just go ahead and select the first option. Then I’ll click “Next.”
  14. I am going to perform an “Advanced Install” as I want to be able to make a few changes to passwords, and review my character sets.
  15. I am going to take the “Default Language” selection.
  16. I am going to be performing an “Enterprise Edition” installation.
  17. On the next screen I’ve chosen to use the built-in account as opposed to a separate user, as I’ll be managing the services with the admin-level account. I’ll also confirm that I do want to use a built-in account.
  18. On the next screen I’ve specified my installation directory. Here I am going to install into a new folder called “oracle_12c_db”, clearly denoting where it is located.
  19. I am going to perform a “General Purpose/Transaction Processing Database” install, as the database is likely going to have a mix of everything installed.
  20. Now I’ll give my database a name; the installer provides a default. I’m also going to include the “Container” database option, introduced as part of the 12c release.
  21. Next I’ll allocate an appropriate amount of memory for the database.
  22. I’m also going to select the “Unicode” character set on the “Character sets” tab.
  23. I’m not including the Sample schemas.
  24. I’ll go ahead and choose to use “File system” as my storage location.
  25. I won’t be using Oracle’s “Enterprise Manager Cloud Control” for this database install.
  26. At this point I could elect to enable a recovery area for my database; I am choosing not to.
  27. From here I’ll go ahead and specify the schema passwords I want to use.
  28. Once those passwords are specified, the installer will perform some prerequisite checks.
  29. After the checks are performed, I have a chance to review my install. After I’ve reviewed my setup, I’ll save off the response file for record keeping.
  30. Now the install begins! You will likely see the database configuration window pop up towards the end of the install. It’s a good time to get a cup of coffee and relax.
  31. Once the install is complete, you’ll have a working instance of Oracle Database 12c!
  32. You should now be able to access your install via SQL Developer or Toad for Oracle.
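As a side note, the response file saved off during the review step can be reused later to script an unattended install of the same configuration. Here is a minimal Python sketch of that idea; the `-silent`, `-waitforcompletion`, and `-responseFile` switches are standard Oracle Universal Installer options, while the directory and file paths below are placeholders for wherever you unpacked the installer and saved your response file:

```python
import subprocess
from pathlib import Path


def build_silent_install_cmd(installer_dir: str, response_file: str) -> list:
    """Build the command line for an unattended Oracle installer run.

    Uses the standard Oracle Universal Installer switches -silent and
    -responseFile; the paths passed in are illustrative, not prescriptive.
    """
    setup = str(Path(installer_dir) / "database" / "setup.exe")
    return [setup, "-silent", "-waitforcompletion",
            "-responseFile", response_file]


def run_silent_install(installer_dir: str, response_file: str) -> int:
    """Launch the installer and return its exit code (0 = success)."""
    cmd = build_silent_install_cmd(installer_dir, response_file)
    return subprocess.call(cmd)
```

This is just a sketch of the approach, not a supported install method; always review the generated response file before reusing it, since it captures passwords and environment-specific paths.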

The Oracle 12c installation and configuration is really straightforward. I hope this helps you get started!

Still stuck on install or want additional help with Oracle applications or databases? Send us a note at sales@performancearchitects.com or leave us a note in the comments below, and we’ll be sure to respond.



Data Integration in Oracle Planning and Budgeting Cloud Service (PBCS)

January 14, 2016

Author: Tom Blakeley, Performance Architects

In the past year, we have all watched the explosive growth in cloud offerings from Oracle – starting with Planning and Budgeting Cloud Service (PBCS) and most recently with the release of Enterprise Performance Reporting Cloud Service (EPRCS). These are attractive offerings for organizations looking to reduce infrastructure costs, simplify the technology stack, and provide users with robust performance management tools.

At Performance Architects, I’ve participated in a few of these implementations, primarily providing support on the data integration side. Data integration during these projects generally means moving financial data, dimensions, and metrics back and forth from the cloud. This is either done manually by an administrator – or, in an ideal state, automatically via automation routines.

In an on-premise environment, this would be par for the course. I would typically create a series of routines that automatically pull financial data and dimensions from a source system, transform it if required, and then load it directly to the target enterprise performance management (EPM) system. We might accomplish this using ETL/ELT tools such as Informatica, Oracle Data Integrator (ODI), or Financial Data Management Enterprise Edition (FDMEE). In other cases, we might use PL/SQL to manage the data transformation before the final load to EPM.

In all of these examples, we can provide not only direct integration between source and target, but also advanced logging, email notifications, and clear visibility into the process for administrators and users. I consider this a requirement for any enterprise system – and a clear measure of success. Is financial data delivered in a timely fashion? Is the transformation process clearly understood by the business? Do users have a reliable process that gives them clear insight into its status?

With these new cloud offerings, data integration is a little different. We don’t have direct access to the target EPM environment, as we would on-premise – and there are no “cloud adapters” for ODI and Informatica. Instead, the migration of financial data and the management of metadata are accomplished using a basic command-line utility called EPM Automate. This utility allows administrators to move components back and forth using flat files (!), and to launch processes within PBCS. This is a bit of a deviation from traditional on-premise solutions: we’ve lost that direct injection into EPM, and now have to manage the creation, size, and transfer of files from the local environment to the cloud.

All is not lost, however, as the EPM Automate utility comes packed with core functionality that helps provide some of the features needed to develop a proper data integration solution. First and foremost, EPM Automate helps with the movement of data files from the local environment up to the cloud. Without this, administrators would be forced to manually upload a data file through the front end. With the files uploaded, EPM Automate can then launch FDMEE rules in the cloud to assist with data transformation and the load to the target application. EPM Automate can also launch Planning business rules to perform data calculations. After each step, EPM Automate provides a basic success/failure status, so scripts can be written to evaluate this before proceeding to the next step. Let’s take a look at how a routine might play out using EPM Automate:

  1. The locally hosted automation script first calls an email utility to send a notification that the routine has started.
  2. EPM Automate then collects a text file that had previously been dropped off by Oracle Data Integrator (or Informatica). EPM Automate uploads this file to the Cloud Inbox.
  3. EPM Automate launches a FDMEE data rule that picks the file up from the inbox, performs data transformation, and then loads the data to PBCS.
  4. EPM Automate receives a success status, and then proceeds to launch a business rule that performs an aggregation of data and performs a required allocation calc.
  5. EPM Automate receives a success status, and subsequently fires off a closure email to the administrators that the job completed successfully.
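The routine above is typically wrapped in a script on the local automation server. Below is a minimal Python sketch of that orchestration logic, not a production implementation: the EPM Automate verbs (login, uploadfile, rundatarule, runbusinessrule, logout) are the utility’s actual commands, but the rule names, file names, periods, credentials, and notification callback are all hypothetical placeholders, and the exact rundatarule arguments vary by release, so check the EPM Automate documentation for your version:

```python
import subprocess

EPM_AUTOMATE = "epmautomate.bat"  # path to the EPM Automate client (placeholder)


def build_routine(data_file, fdmee_rule, biz_rule, user, password, url, domain):
    """Return the ordered list of EPM Automate command lines for the routine.

    The command names come from the EPM Automate utility; every other value
    here is a placeholder for your own environment.
    """
    return [
        [EPM_AUTOMATE, "login", user, password, url, domain],
        [EPM_AUTOMATE, "uploadfile", data_file],
        [EPM_AUTOMATE, "rundatarule", fdmee_rule, "Jan-16", "Jan-16",
         "REPLACE", "STORE_DATA", data_file],
        [EPM_AUTOMATE, "runbusinessrule", biz_rule],
        [EPM_AUTOMATE, "logout"],
    ]


def run_routine(commands, notify):
    """Run each step in order, stopping (and notifying) on the first failure.

    EPM Automate returns a non-zero exit code on error, so the script can
    evaluate each step before proceeding to the next one.
    """
    notify("Routine started")
    for cmd in commands:
        if subprocess.call(cmd) != 0:
            notify("Step failed: " + " ".join(cmd[:2]))
            return False
    notify("Routine completed successfully")
    return True
```

The `notify` callback stands in for whatever email utility the automation server uses for the start and closure notifications described in steps 1 and 5.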

This routine moves data accordingly from source to target and provides visibility into the status of the job. EPM Automate provides us with the functionality we need to build our end-to-end process, though we do give up that direct injection we find in on-premise solutions.

I could continue to cover this in far more detail, but I’ll reserve that for another blog post. If you’re interested in hearing more about the differences between on-premise and cloud solutions, and want to hear about what the cloud does best, then please join my upcoming webinar, “When Oracle Cloud EPM and BI Solutions are the Answer.”

