Configuring Custom Skin & Style for OBIEE 12c




Each version of OBIEE ships with a set of styles and skins that control how data is displayed, and end users can select the style that suits their needs. Each OBIEE version also has a default style: in OBIEE 12c, the default UI skin and style is called ‘Alta’, while in 11g it is ‘blafp’.

Clients often want to see their organization’s BI data with their company logo, colours, styles etc. Not surprisingly, customizing the OBIEE user interface is one of the most frequent activities undertaken while implementing OBIEE in an organization.

In this article, we will describe how to customize the Skin/Style in OBIEE 12c as per the customer requirements below:

Replace the Oracle logo with the Client logo
Change the header bar color in the analytics home page
Customize the OBIEE charts with a custom color palette
Change the default style/skin


1. Unarchiving the Analytics EAR File:

OBIEE analytics components and configuration files are bundled together in an .ear file named ‘analytics.ear’, which is deployed to WebLogic as part of the installation and configuration. We are going to add our customizations to this file and then redeploy it to WebLogic.

It is always best practice to make a back-up copy of the files and folders before starting any customization; this lets us revert the changes if anything goes wrong. The recommended files to be backed up are:


To begin the unarchiving process, please follow the below steps:

1 Navigate to C:\Oracle\Middleware\Oracle_Home\bi\bifoundation\jee to find the analytics.ear file, which is already deployed in weblogic.
2 Copy analytics.ear and paste it in your local folder D:\OBIEE.
3 Unzip analytics.ear into D:\OBIEE\analytics ear with your preferred zip tool.


4 Further unzip the analytics.war file in another folder to avoid conflicts,
D:\OBIEE\analytics ear\analytic war.


Note: Don’t touch or modify the META-INF & WEB-INF folders.


2. Replacing the Oracle Logo:

To replace the Oracle logo with the client logo on the Analytics home page:

Navigate to ‘res’ in D:\OBIEE\analytics ear\analytic war\res.
In this folder, we can find separate folders for each Skin and Style:
»  Folders starting with ‘s_’ are styles
»  Folders starting with ‘sk_’ are skins
Decide which out-of-the-box style we are going to extend with our new customization. We can either add our customization to an existing style, or make a copy of a style and skin folder and rename the copies, e.g. s_CPcustom, sk_CPcustom.
Navigate to ‘D:\OBIEE\analytics ear\analytic war\res’ and open the filemap.xml file. Make the change below only if you have created the custom folder (s_CPcustom); skip this step if you are customizing an existing style (s_Alta). Change the code to extend the Alta style to the new custom style and skin, ‘s_CPcustom’ and ‘sk_CPcustom’.
 Figure 3
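Conceptually, the filemap.xml entry maps the new style and skin back to Alta, so that any file you don't override is picked up from the base style. The markup below is illustrative only; the exact element and attribute names vary between releases, so follow the syntax of the entries already present in your filemap.xml:

```xml
<!-- Hypothetical sketch: map s_CPcustom / sk_CPcustom onto the Alta
     folders so that unmodified resources fall back to the base style.
     Check the existing entries in filemap.xml for the real schema. -->
<fileMap>
  <style name="s_CPcustom"  baseStyle="s_Alta"/>
  <skin  name="sk_CPcustom" baseSkin="sk_Alta"/>
</fileMap>
```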


Here I’m doing my customization under the s_Alta folder after backing up the original folder, so I simply skipped the step above.
Move to the location ‘D:\OBIEE\analytics ear\analytic war\res\s_Alta\master’.
Rename oracle_logo to some other name (e.g. oracle_bkp) and paste in the custom picture with the same dimensions as the Oracle logo. Name the custom picture oracle_logo.

 Figure 4



3. Changing the Header Bar Color in the Analytics Home Page:

By default, the header bar of the OBIEE Analytics web page is grey. To customize this color:

Navigate to D:\OBIEE\analytics ear\analytic war\res\s_Alta\master.
Find and open the file ‘custom.css’.
The custom.css file is an empty file that is imported by the master.css file. You can use the custom.css file to add your own CSS classes, which will override the classes in the master.css file.
Keeping the custom code in custom.css file enables you to take advantage of future improvements to the master.css file that are applied by patches and upgrades.
To find the relevant classes/IDs, log in to Analytics through Chrome, then right-click and choose "Inspect" to examine the elements whose styles you want to change.


Add your custom overriding rules to the custom.css file.
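For illustration, an override might look like the following. The selector here is hypothetical: use whatever class or id the Inspect tool shows for the header bar in your environment.

```css
/* Hypothetical selector -- replace .HeaderContainer with the class/id
   found via Chrome's Inspect tool. !important makes the rule win over
   the equivalent rule in master.css. */
.HeaderContainer {
    background-color: #004a7c !important; /* example corporate blue */
}
```

Because master.css imports custom.css, rules placed here override the defaults without touching Oracle's own files.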


Save and close the file.


4. Customizing the Default Color Palette:

In OBIEE 12c, the initial colors of the default color palette are blue and green, so graphs and charts are generated in those colors by default. To modify the color palette and use the standard colors of your organization, follow the steps below.

Navigate to D:\OBIEE\analytics ear\analytic war\res\s_Alta\master.
Open the ‘graph.xml’ file with a text editor. Here you will find the list of SeriesItems just below the SliceLabel tag.
 Figure 7
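The SeriesItems block lists one entry per chart series; changing the hex codes changes the default chart colors. A hedged sketch with illustrative corporate colors (the exact attributes may vary slightly between releases):

```xml
<SeriesItems>
  <!-- id is the series position; color is the default hex fill used
       for that series in graphs and charts. -->
  <Series id="0" color="#004a7c"/>
  <Series id="1" color="#e87722"/>
  <Series id="2" color="#6ba539"/>
</SeriesItems>
```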


Change the color code as per your need and save the file.


5. Set OBIEE Default Style & Skin:

The default Skin and Style for each version of OBIEE is set in the file named ‘instanceconfig.xml’. The default style for OBIEE 12c is ‘Alta’. Since I made my customizations under s_Alta, I don't need to modify the style name in instanceconfig.xml.

In case you are doing the customization under your own custom folders (s_CPcustom/sk_CPcustom) then follow the below steps.

Navigate to C:\Oracle\Middleware\Oracle_Home\user_projects\domains\bi\config\fmwconfig\biconfig\OBIPS.
Open the instanceconfig.xml file with a text editor.
Change the DefaultSkin and DefaultStyle to CPcustom.
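The style settings live inside the UI element of the ServerInstance section. A minimal sketch of the change, assuming the custom folders are named s_CPcustom and sk_CPcustom:

```xml
<ServerInstance>
  <UI>
    <!-- Names are given without the s_/sk_ folder prefixes -->
    <DefaultStyle>CPcustom</DefaultStyle>
    <DefaultSkin>CPcustom</DefaultSkin>
  </UI>
</ServerInstance>
```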


6. Archiving the Analytics EAR file:

Now we are done with the customization part, so bundle the files back into analytics.war and then into analytics.ear, preserving the original structure.

Analytics.war should be archived with the below files.
 Figure 9


Analytics.ear should contain the below files.
 Figure 10


Copy the ‘analytics.ear’ file back to the source location: C:\Oracle\Middleware\Oracle_Home\bi\bifoundation\jee.

Now ‘analytics.ear’ file is ready to deploy with the custom styles.


7. Deploy the analytics.ear File:

To deploy the analytics.ear file with custom styles:

Login to Weblogic Server Administration Console with Administrator credentials.
Navigate to Domain Structure -> Deployments -> Control tab.
Select analytics from the deployment list and click on Stop -> Force Stop Now.


Click on 'Lock & Edit' in the left-side panel.
Go to the Configuration tab, where you can see that the status of analytics has changed.
Now select 'analytics' and click on ‘Update’ to pick up the changes from the analytics.ear file.
Browse to the source path where the analytics.ear file is saved and click on 'Finish'.


Once the deployment is successfully updated, click on the 'Activate the Changes' tab.


Now move to the 'Control' tab and start analytics by selecting 'Servicing all requests'.


Now the status should be Active.

To reflect all the changes made in the analytics deployment, perform a full restart of OBIEE by using the stop and start scripts or by restarting the services from Oracle Enterprise Manager (EM).


8. Restarting the Services From Enterprise Manager:

Login to Oracle Enterprise Manager with administrator credentials.
Navigate to Target Navigation tree -> Business Intelligence -> biinstance -> Availability.
Click on the 'Restart All' button to restart all the services.


Note: As a best practice, clear the browser cache before logging in to OBIEE Analytics.


9. Testing the Customization:

Login to OBIEE Analytics and check the default style and logo in OBIEE analytics home.
Navigate to any report saved in the catalog and check whether the default color palette for OBIEE graphs has changed in the report.




Every organization likes to have its business reports and charts aligned with its own standards for skins and styles. OBIEE offers us the flexibility to modify the text fonts for reports and to apply a custom color palette for graphs, logos and skins as per the standards of the organization.

Further customization can be made to the CSS files referred to in this article to improve the look and feel of the OBI Analytics home page, reports and charts.

If you want to know more about how to customize your reports and charts and enhance your visualizations you can always contact us!


ClearPeaks´ Visualisation Plugin




If you’ve worked with any kind of data, you know how harrowing reading rows and rows of numbers can be. It isn’t easy to wade through all of those numbers and figure out what they mean. That’s where visualisation comes to the rescue: visualisation de-mystifies the data and helps decision-makers derive actionable insights from it. The whole idea of this blog is to delve into the blossoming world of data visualisation.

It is hard to ignore the weight that more advanced and more interactive visualisations carry in today’s data-centric world. It is common to get requests to add visualisations or other improvements to a dashboard in order to provide a flashier look or design. Typically, these kinds of requests are handled by embedding HTML and JavaScript to provide data-driven, custom visualisations. At ClearPeaks, we have developed a set of custom visualisations that allow configurable visualisations to work natively in modern browser technologies.

1. Motivation

The following are the main factors driving the need for custom visualisation plugins:

Add new capabilities to existing chart types in modern reporting tools.
Add brand-new visualisation types.

Let us discuss each one of these points in detail.

Add new capabilities to existing chart types.
How many times have you received a request that you just could not meet with native chart types because there was not enough scope to customize the chart? For example, imagine that you want to build a line/bar combo chart with a third distinct Y-axis. Sure, you can map three metrics, but if the metrics do not have the same unit of measurement or the same scale, then you are stuck. To handle these kinds of scenarios, we can use pre-built custom plugins.

Add brand-new visualisation types.
There is huge scope for improvement in interactive visualisations compared to traditional chart libraries: with JavaScript we can build many new visualisation patterns, such as donuts, meters, pokers or animated donuts, which are hard to achieve with traditional reporting tools.


2. How does it work?

ClearPeaks’ plugin is based solely on native browser technologies and doesn't require client-side plugins like Flash or Java. The plugin is built with jQuery and SVG. With the ClearPeaks plugin referenced in your webpage, a wide range of highly customizable charts is ready to use. In Oracle BI implementations where the servers are isolated from the Internet and protected in internal LAN segments, referencing these resources from Oracle BI means having to first deploy them in the WebLogic Server as static resources.
ClearPeaks’ plugin allows us to easily customize the design of a chart, like its size, colour and fonts. Also, it is possible to include or exclude parts of the charts.
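To give a flavour of the approach, a data-driven donut chart can be generated with a few lines of JavaScript that emit SVG arc paths. This is an illustrative sketch, not the actual ClearPeaks plugin API; all names are hypothetical.

```javascript
// Minimal, self-contained sketch of a data-driven SVG donut chart.
// Illustrative only -- NOT the ClearPeaks plugin API.
function donutChart(values, colors, size = 100) {
  const total = values.reduce((a, b) => a + b, 0);
  const cx = size / 2, cy = size / 2;
  const r = size * 0.4, stroke = size * 0.15;
  let angle = -Math.PI / 2; // start at 12 o'clock
  const paths = values.map((v, i) => {
    const sweep = (v / total) * 2 * Math.PI;
    const x1 = cx + r * Math.cos(angle), y1 = cy + r * Math.sin(angle);
    angle += sweep;
    const x2 = cx + r * Math.cos(angle), y2 = cy + r * Math.sin(angle);
    const largeArc = sweep > Math.PI ? 1 : 0;
    // Each slice is an SVG arc stroked with the slice colour
    return `<path d="M ${x1} ${y1} A ${r} ${r} 0 ${largeArc} 1 ${x2} ${y2}" ` +
           `fill="none" stroke="${colors[i % colors.length]}" stroke-width="${stroke}"/>`;
  });
  return `<svg xmlns="http://www.w3.org/2000/svg" width="${size}" height="${size}">${paths.join('')}</svg>`;
}
```

In an OBIEE context, a snippet like this would typically be embedded via a narrative or static text view, with the generated SVG string injected into the page.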

3. Data Visualization Graphs

In this section, we present some of the visualization graphs developed by our team:

Gauge Chart: the Gauge provides a rich set of configurable items, which can be set via the options we pass through the plugin invocation.
Figure 1 - Gauge Chart
Donut Chart: A Donut Chart is a circle chart that shows the percentage of an activity.
Figure 2 - Donut Chart
Pyramid: A Pyramid chart displays a single series of data in progressively decreasing or increasing proportions, organized in segments, where each segment represents the value for the particular item from the series.
Figure 3 - Pyramid
TimeLine: the Timeline allows you to visualise the start and end of an activity.
Figure 4 - Timeline

Visualisation also includes the below charts.

Horizontal Bar Chart:
Figure 5 - Horizontal Bar Chart
Vertical Bar Chart:
Figure 6 - Vertical Bar Chart
Bubble Chart:
Figure 7 - Bubble Chart
Percentage Bar:
Figure 8 - Percentage Bar



ClearPeaks’ custom visualisation plugin provides some of the best and most unique visualisations available. The charts are highly customizable in terms of size, colour and design. Better yet, the visualisations can be implemented in OBIEE or any custom html reports or dashboards.

Click here if you would like to know more about this innovative solution!

Unleash the power of R to forecast data from your OLAP Cubes with Tableau



R is a programming language and software environment for statistical computing, the leading tool amongst statisticians, data miners and data scientists for performing sophisticated statistical analysis, data discovery and predictive analytics on any set of data, such as linear/non-linear modelling, classification, clustering, time-series analysis and so on.

R offers a wealth of functions and, as an open source project, its capabilities are constantly enhanced and extended through user-created packages; this is why it is so popular and just keeps growing.

In this article, we will see how R can be connected to Tableau to perform highly-customizable forecasting of your data from OLAP Cubes. Your users will benefit without the need for an advanced coding background, and visualize the results in the Tableau charts that they like the most, with the help of just a few neat tricks.


1. Forecasting with Tableau

Tableau already comes with a forecasting feature; it uses a technique known as exponential smoothing which tries to find a regular pattern in the measures that can be continued into the future.

To access this feature, simply drag a date and a measure to your sheet and right-click on the canvas. In this example, we are using Tableau’s Superstore sample dataset.

Figure 1: Tableau’s Superstore sample dataset.

Select Forecast ➜ Show Forecast, and the estimated line will appear. As you can see in the picture below, it’s very simple: Tableau fitted its exponential smoothing model and extended the line into the future, along with the confidence interval. Right-click on the canvas again to access the Forecast Options, and an options pane will pop up.

Figure 2: Forecast Options

In the options pane (below, 1), you can set a few parameters, such as forecast length, aggregation type, and you can change or remove the confidence intervals (called prediction intervals) and adjust the forecast model: it can be automatic, automatic without seasonality, or custom. If you choose custom, you will be able to select the type of Trend and Season: None, Additive or Multiplicative.

Figure 3: Forecast Options, Summary and Models

By right-clicking again and selecting Describe Forecast, you will be shown a pane that gives you additional forecast information. You can see this in 2 and 3.


2. Why use R instead?

What we have described so far is basically everything that Tableau can offer when it comes to forecasting. It works pretty well, but it’s not really customizable and, above all, it doesn’t work with OLAP Cubes!

OLAP Cubes have many advantages, but unfortunately Tableau doesn’t allow us to do all the things we can do with a “traditional” data source, and forecasting is one of them (you can find a list of other OLAP-related limits here).

Luckily for us, Tableau is such a great tool that, despite not being able to perform forecasting on cubes, it offers us an even more powerful alternative: a smooth integration with R.


3. Integrating R in Tableau

There are 4 functions that allow Tableau to communicate with R: SCRIPT_REAL, SCRIPT_STR, SCRIPT_INT, SCRIPT_BOOL. They only differ in the return type, as suggested by their names.

They work very simply, and take at least 2 parameters. The first one is an actual R script; it is pure R code that is passed on to the R engine to be executed, and that will return the last element that is called. From the second argument onwards, you specify the values to pass in. In the script, you will refer to them as .arg1, .arg2, and so on, depending on the order in which they are declared.

R will take these values, interpret them as vectors (R basically only works with vectors and their derivatives), and return a vector of data that Tableau can use just like any other calculated field.
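For instance, a calculated field that asks R for the mean of a measure could look like this (the [Sales] field name is illustrative):

```
SCRIPT_REAL("mean(.arg1)", SUM([Sales]))
```

Tableau sends the aggregated SUM([Sales]) values to R as .arg1, R computes the mean, and the result comes back as a regular calculated field.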


4. Connecting R to Tableau

In order to communicate, R and Tableau must be connected, and this is really easy to do:

1. Download R from here or R Studio from here, and install it

2. From the R console, download the Rserve package and run it:

install.packages("Rserve")
library(Rserve)
Rserve()

3. In Tableau, select Help ➜ Setting and Performance ➜ Manage External Service Connections

4. Enter a server name or an IP (on the same machine, use localhost or 127.0.0.1) and select port 6311

5. Test the connection by clicking on the “Test Connection” button, and press OK if you get a positive message

You can also automate the start of Rserve by creating an R script with the code mentioned in point 2 (without repeating the installation), saving it as a .R file, and calling it from a .bat file with “start Rscript myRserveScript.R”. You’ll find this pretty useful, especially in a server environment, because you can schedule its execution when the machine starts via the Task Scheduler, or by copying a shortcut to %AppData%\Microsoft\Windows\Start Menu\Programs\Startup (in Windows).

After Rserve has started and Tableau is successfully connected, you can start using the 4 Script_* functions to exploit R’s power.


5. Forecasting in R

To perform forecasting in R, you have to download a user-made package called “forecast”.

To do so, simply write install.packages("forecast") in your R console, and load it by writing library(forecast) any time you need it.

It offers various types of fully-customizable forecast methods, like Arima, naïve, drift, Exponential Smoothing (ETS), Seasonal-Trend decomposition by Loess (STL) and so on, seasonable or not, with a number of different parameters.

For more details about the package, check the author’s slides, the help package pdf , or each function’s help page by writing ?func_name in the console (ex: ?stlf).


6. Use case

In our use case, we will take a series of monthly measures from the last 3 years and forecast them with different frequency/period combinations. We will use the function called stlf, which, without getting bogged down in details, gives us more freedom when playing with periods and frequencies whilst still providing a solid result.

In the picture below, you can see our data, from January 2014 to October 2016.

Figure 4: Data, from January 2014 to October 2016.

Figure 4: Data, from January 2014 to October 2016.

We dragged our date dimension (Year Number Month Number) to the column field, and put our desired measure in the row field. The trend is not regular, but we can see how the line has a few peaks and generally tends to grow.


7. Writing the R scripts

Let’s see how to write the R scripts that perform the forecast.

Firstly, we need to know what values we will be passing: we have our measure (for R to accept it, it must be aggregated, but coming from the Cube it already is, and if not, the ATTR function can be called on to help) and the two integer parameters (create them in Tableau) for the number of periods to be forecasted and for the frequency of the season.

Let’s use them in this order, calling them ForecastMeasure, ForecastPeriods and Freq. The calculated field should initially look something like this:

data <- .arg1;
nulls <- length(data[is.na(data)]);
data <- data[!is.na(data)];
freq <- .arg3[1];
periods <- .arg2[1];
[ForecastMeasure],[ForecastPeriods],[Freq]

In the first argument, we have already set up the R variables that collect the values passed to the script (this is not mandatory, but keeps the code cleaner). Remember that R works with vectors: notice how the first one (data) takes the entire .arg1, because we actually want the whole array of measures, while the parameters (simple integers) need to be taken from the first value of a 1-sized array. We then filter this vector to remove any null values, and we save the number of them in a variable called nulls (we will need it later).

We now perform the forecasting. Add the following lines to the script:

library(forecast);
time <- ts(data,frequency=freq);
fcast <- stlf(time, h=periods);

The first line simply calls the forecast library; we then create a time series object, by passing the measure and the frequency parameter (for more customizable details, check the resources already mentioned). Finally, we produce our forecasting result, by feeding the stlf function with the time series object and the periods parameter.

Our forecast is ready. If we use the same data and execute the script in R, the console will look like this:

The interesting columns are Forecast, Lo80, Hi80, Lo95 and Hi95. The first one contains the actual forecasted values; there are 12 of them because we used periods = 12 (and freq = 12, too). The others are the confidence interval bounds (the limits can be changed when calling the function).

Now let’s complete our script. There is one thing we need to remember: Tableau expects to receive the same number of measures that it sent, so we append our forecasted array to the original one, properly shifted, along with rep(NaN, nulls), an array of nulls (we know how many there are because we saved them). Add these lines to the script:

n <- length(data);
result <- append(data[(periods+1):n],fcast$mean);
result <- append(result,rep(NaN,nulls));

The result that we want is fcast$mean, which corresponds to the column called “Forecast” in the R matrix that we saw. We append it, along with any null values, to the segment of original data that starts at position periods+1, in order to get an array of the original size (note: by doing this, we won’t visualize the first periods elements).

That’s it! Your new calculated field should look like the one below. Drag it to the row field, and you get your forecast.

data <- .arg1;
nulls <- length(data[is.na(data)]);
data <- data[!is.na(data)];
freq <- .arg3[1];
periods <- .arg2[1];

library(forecast);
time <- ts(data,frequency=freq);
fcast <- stlf(time, h=periods);
n <- length(data);
result <- append(data[(periods+1):n],fcast$mean);
result <- append(result,rep(NaN,nulls));
[ForecastMeasure],[ForecastPeriods],[Freq]

Figure 5: Complete R script and result


8. Shifting time measures

Now we start getting the first issues. Tableau received a vector from R the same size as the one that we passed it, so each date-measure value is substituted for the new one coming from the calculated field. The problem is that the date hasn’t changed!

Normally, this could be fixed by creating a new calculated date with the help of the DATEADD function, but as we can’t add a dimension in cubes, we have to overcome this issue by getting dimension value as a measure with an MDX formula, and then applying DATEADD to it.

Create a new calculated member, and set it as:


Then create a calculate field like this:


Where [YearNumberMonthNumber] is (for example) the name of your newly calculated member. This will get you a measure with the date shifted accordingly to the forecasted period that you selected.

Figure 6: Tableau's Error

Drag this new field on top of the original date column and… your chart will crash!

Why? You’ve just created a new shifted date for every original date value; your data is now “split” into “columns” of one value each, one for each shifted month, and R will receive them one by one instead of getting an entire vector to process. This explains the error that says “series is not periodic or has less than two periods”: what R is actually seeing is a series of independent 1-sized vectors.

To fix this, there’s a simple but neat trick: add your original data field to “Detail”, so that it is available in the sheet, and edit your calculated forecast field like this.

Figure 7: Table Calculation - Forecast (Months)

This will make Tableau process the calculated field at the correct level, and help R see the received data in the correct way, thus producing the desired result.

That’s it! You’ve just forecasted your OLAP cube!

Figure 8: OLAP cube forecasted


9. Adding confidence intervals

You can also add confidence intervals to improve your visualization: just create two more metrics, copying the one you created for the forecast, and change fcast$mean to fcast$lower[,1], fcast$lower[,2], fcast$upper[,1], or fcast$upper[,2], depending on the limit that you want to show (remember that the defaults are 80 and 95, but these are editable in the R function).

In the picture below you can see the forecast with the 95% bounds (fcast$upper[,2] and fcast$lower[,2]), and the trend line (calculated by Tableau).

Figure 9: Forecast final visualization


10. Conclusions

In this guide we have shown how R can help you build great forecast charts for your OLAP Cube data, but this isn’t the only benefit: we got a much higher degree of customization, and overcame the limitations of Tableau when it comes to defining parameters, methods or entire algorithms.

Once Tableau can connect to R, its potential is virtually unlimited, and it can enhance your charts in an infinity of ways: from any source (not necessarily OLAP Cubes), or with any goal, just send the data to R and let it do the job, whatever it is! Forecasting, clustering, classification, or anything else with an R package written for it – you could even write one yourself! Then visualize the result in Tableau, and adjust the look and feel of your dashboard with the powerful features that you love.

Click here if you would like to receive more information about the topic!

Oracle Essbase End-user Tools


Turn around your Excel tables with the SmartView plug-in and discover further insights with Tableau

Excel is the most widespread tool across organizations for dealing with data, and whilst it is clear that it puts a big package of features and capabilities in the users’ hands, it is also worth noting that there are more options to enhance its performance and capabilities.

In this Oracle Essbase End-user Tools article, we'll explore how Essbase and its specific add-in for Excel can bring the performance of your staff to another level when a support tool for data-driven decision-making or data analysis is required. Not only will we present the Excel add-in for Essbase, but we will also introduce Tableau as a complementary tool to analyze data from Essbase, thanks to its built-in connector.


1. Main Motivation

Excel is the world’s favorite data analysis tool, but we all too often see how a BI solution built with a particular tool is then exported to Excel so that business users can carry on working in a comfortable environment.

Instead of trying to fight the end-users’ natural instincts, setting Excel against the BI system, a more suitable approach would perhaps be the following: Why shouldn't the BI system use Excel as its final consumption tool?

You might think that a regular Excel sheet is not fancy or powerful enough to meet the requirements of a BI report tool, but suppose that the BI system is going to use an enhanced version of Excel, specifically designed to gain crucial capabilities in two main aspects:

• Lightning-fast access to corporate data

• User interactivity with data. Drill-downs, drill-throughs, and other capabilities to explore and navigate the data


2. Essbase Technical Overview

Before starting a more detailed description of how Essbase could enhance your BI stack, let’s review what Essbase actually is from a technical perspective.

Essbase is an On-Line Analytical Processing (OLAP) technology consisting of a Multidimensional Database Management System (MDBMS) for storing and analyzing data using a multidimensional approach. The applications developed in Essbase are called cubes (or hypercubes).

In contrast with On-Line Transaction Processing (OLTP), OLAP is focused and optimized for processing queries instead of transactions.

It is also interesting to mention the etymology of the product name: Essbase stands for Extended Spread Sheet Database, so it’s easy enough to see that it is optimized for analyzing Excel-like data and enhancing Microsoft Excel for analytic purposes.

In particular, Essbase shines at financial reporting, scenario analysis (such as actual, budget, forecast, etc.) and, in general, at analyzing highly hierarchical data and complex formula calculations.


3. Benefits and enhancements of empowering Excel with Essbase

So far, we have introduced Essbase as a tool that could empower Excel in order to provide an enhanced experience to Business users working with data in spread sheets, and we have also outlined the technology behind it. But what exactly are the specific features and capabilities that this OLAP system can really offer Excel?

Let’s have a look at some of what Essbase can do for Excel users:

• Build your own ad hoc analysis in a user-friendly way.
And remember you will still be using your favorite spreadsheet program!

• Navigate the data.
Once a cube is accessed from Excel, exploring the data through all its dimensions is both fast (we are talking about sub-second query times) and intuitive. After a quick training session on the new features offered by this tool, users will find themselves slicing and dicing the cube according to the specific information they are looking for.

• Drill-down through the hierarchies of your dimensions.
Cube dimensions are designed in a hierarchical fashion in Essbase, and aggregation rules are always defined for its measures. This provides built-in drill-down functionality, allowing the end-user to retrieve data with the best-fitting detail for each analysis.

• Access pre-built reports directly in Excel.
Although ad hoc analyses may be the jewel in the crown, SmartView for Excel also provides functionality to develop reports so that they can be accessed directly by end-users.

• No exporting and importing data.
There is no need to export the requested data from any BI presentation tool for subsequent import to Excel; the data is already in Excel, and you only need to access the cube, look for the required data and build your own custom analysis. Bear in mind that once the data is in your spreadsheet, no one is going to prevent you from applying formulas to get new calculations to support your analysis.


4. Hands On: SmartView add-in for Excel

Now everything’s ready for show-time! Let’s proceed to a brief demonstration of how to access your Essbase applications (cubes) with the SmartView plug-in for Excel.

Once the add-in (kindly provided by Oracle) is installed in the user’s local machine, the SMART VIEW tab will be available right after the default ones.

The very first time, it will be necessary to set up the connection to Essbase. Once this is done, connecting to the server is going to be as easy as:

➜ Open the SMART VIEW tab

➜ Click the Panel button

➜ Access Shared Connections

➜ Fill in your user name and password

Figure 1: Applications (cubes) with the SmartView plug-in for Excel

Note that all the security settings for data governance are managed through user accounts in Oracle’s WebLogic domain.

Once logged in, we will be able to select the Oracle® Essbase server, where the developed applications will be shown; expanding the desired application will show the cube deployed.

Clicking the cube (in this example we are selecting TBC_PO) will display the different options available to access and manage it.

Figure 2: Excel SmartView Shared Connections


In this case, we select Ad hoc analysis, thus loading the cube and making the ESSBASE tab available:

Figure 3: Excel Ad hoc analysis


Here we can see the aggregation of total profit across all the dimensions. We can move dimensions in and out of the spreadsheet, set filters and drill down to the required level of detail. In this article, we will perform two different ad hoc analyses:


Analysis 1

We want to look at the Sales per region across the year 2000. We will need both Market and Time dimensions:

Figure 4: Analysis 1


Double-clicking the Market dimension drills down to the next level of the Market hierarchy:

Figure 5: Analysis 1 - Market hierarchy and Time


In a few steps, we obtain the desired dataset (by drilling down and pivoting the dimensions):

Figure 6: Analysis 1 - Sales by Market and Time

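At heart, this drill-down-and-pivot sequence is a cross-tabulation of Sales by Market and Time. A minimal Python sketch of that pivot, with invented sample figures (the real values come from the cube):

```python
from collections import defaultdict

# Hypothetical leaf rows retrieved from the cube: (market, month, sales).
# All figures are invented for illustration.
rows = [
    ("East", "Jan", 120), ("East", "Feb", 135),
    ("West", "Jan", 90),  ("West", "Feb", 110),
]

# Pivot: markets on rows, months on columns, summing Sales
pivot = defaultdict(dict)
for market, month, sales in rows:
    pivot[market][month] = pivot[market].get(month, 0) + sales

print(dict(pivot))
# {'East': {'Jan': 120, 'Feb': 135}, 'West': {'Jan': 90, 'Feb': 110}}
```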

Analysis 2

Now let’s say that we are interested in the Sales by Product type and quarter, across the year 2000.

We will need both Product and Time dimensions. Then, as we did in the other analysis, we will drill down until we get the desired detail of data.

Let’s have a look at the final dataset:

Figure 7: Analysis 2 - Sales by Product and Time


Now that the desired data has been gathered, it is up to the user whether there are going to be further steps in the analysis or not. In any case, the good news is that the data is already in Excel, so it is 100% ready to be played with.

As you can see, in a few steps we can build our own ad hoc analyses by querying the Essbase cube, which provides us with fast access to corporate data. And there is no need for the end-user to work in any BI-specific client tool; everything is done in the friendly Excel domain.


5. Hands On: Tableau with Essbase data sources

Now that we have seen the classic end-user tool for accessing Essbase applications, we are going to introduce an alternative, more oriented to data discovery and visualization, in a modern-day BI fashion. There are a number of similar tools covering this market, such as QlikView or Microsoft Power BI.

In this article we will introduce Tableau, which comes with a pre-built connector for Essbase. In particular, we will show how to perform the same analyses as above with SmartView, but taking advantage of the advanced visualizations to give them a special twist.

First of all, we need to make the connection to the cube. In order to do so, we have to:

➜ Click More Servers

➜ Select Oracle Essbase

➜ Fill in the connection properties

Figure 8: Tableau with Essbase data sources connection properties


Figure 9: Oracle Essbase: Connection


Note that a specific driver is required for this connection; if it is not already installed in the client computer, Tableau will provide a link to download it.

Once the connection with the server is set up, we need to select the application and the database:

Figure 10: Tableau Database


Clicking Sheet 1 will open the default blank sheet for the selected cube. Note that both Dimensions and Measures come with pre-built hierarchies (those we set up in Essbase):

Figure 11: Tableau Clicking Sheet 1


Now we can proceed with the two analyses:


Analysis 1

We want to look at the Sales per region across the year 2000. In this case, we can take advantage of Tableau’s map visualization, and also use an animation to represent the Time dimension:


Figure 12: Oracle Essbase: Analysis 1 - Sales per region across the year 2000

It’s a rather different output from before, isn’t it? Of course, there is no good or bad output; it all comes down to each analysis’s requirements. This one is much more useful in a scenario in which we are attempting to gain some extra insight from our data, while in the other, we might be more interested in the specific amount ($) of Sales for each region and month, or perhaps plan to use it as the input data for further calculations.


Analysis 2

We are interested in the Sales by Product type and quarter, across the year 2000. It is worth noting that we have the functionality to drill down into the Time dimension, so that we can display the information by quarter instead of by year:


Figure 13: Oracle Essbase: Analysis 2 - Sales by Product type and quarter, across the year 2000

Again, the outputs of an equivalent analysis in Tableau and in SmartView are quite different; nevertheless, we maintain that both tools are complementary, each one better suited to different analytic scenarios.

Finally, I would like to provide a (non-exhaustive) summary list of extra features and limitations:

• Extra features
- Dynamic hierarchies inherited from the database (no need to build them in Tableau)
- Alias file support, in order to display the aliases created in Essbase
- Creating Calculated Members for the Measures, through MDX queries to the cube
- Security settings and functionality preserved from the Essbase configuration (within the WebLogic domain)

• Limitations
- Extracts are not allowed, only direct connections to the cube
- Drill-down is not available for Measures
- It is not possible to create Dimension Calculated fields
- It is not possible to create Constraints for Level of Detail Expressions



In this article, we have been looking at how Essbase can empower your spreadsheets and take them to a new level. We have also introduced Tableau as a complementary alternative to SmartView for Excel, which may be better suited to certain analytic situations.

So far we have only spoken about end-user applications and how they can be empowered by Essbase, so now let’s have a brief look at some of the generic benefits of using Essbase:

• Centralized and consistent corporate information, with security settings governed by the WebLogic server

• Outstanding performance and features for financial analytics

• Built-in writeback capabilities

• Fast, in-memory database, with query response times much lower than an RDBMS

It is also worth noting that we have only mentioned a few end-user tools, but these are not the only ones. The different tools can be placed in three main groups:

• Specific tools for Essbase: like SmartView for Office or EAS Console

• Traditional BI tools: like OBIEE, which can have Essbase as one of its data sources

• Modern BI tools more focused on data discovery: like Tableau or QlikView

To summarize, Essbase is a data storage tool that excels in fields where RDBMS can’t match its performance in analytics with highly hierarchical information and complex formula calculations. Besides its core technical features, it performs really well as a database system for spreadsheet data, both because of its design and its SmartView client tool. And it’s not limited to spreadsheets - there are also more data-discovery-oriented tools offering built-in connectors.

So, if you are looking for a way to enhance your company’s analytics, especially if you want to boost Excel or if you are in a financial environment, Essbase may just be the back-end tool you are looking for!


Cross-Database Join functionality with Tableau 10


Cross-Database Join, one of the most anticipated features in Tableau 10

Tableau 10 comes with a great new feature called Cross-Database Join, which allows us to cross data between different sources easily and intuitively.

In previous Tableau versions, you needed the Data-Blending solution to join data from different databases. This feature works well enough for one-to-one relationships, but unwanted asterisks pop up when we want to perform a join on one-to-many relationships. JOIN Data from Different Sources was one of the most voted-for ideas in the Tableau community precisely to avoid this scenario, and at last we got this great feature in August with Tableau 10.

In this article we are going to reproduce these painful asterisks by applying Data-Blending and then explain how to use Cross-Database Join functionality to escape such limitations.


1. Data-Blending

Imagine that we want to analyse sales by employee and region and that the data comes from different sources:

➜ Oracle table: contains region details

➜ Excel file: contains region details and sales by employee

Notice that in the second data source there are multiple employees by region, so in order to cross data between the two data sources we use the Region Name field, whose relationship is one-to-many.


Figure 1: Primary Data source


Figure 2: Secondary Data source


As we mentioned before, when we apply Data-Blending in one-to-many relationships we get asterisks in those cases where a specific region contains more than one employee.

Figure 3: Data-Blending


Until now, there was no quick way to avoid these asterisks, and technical IT knowledge was needed to apply database federation between connections.
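Why does blending produce those asterisks? Tableau aggregates the secondary source per value of the linking field; a dimension such as Employee with several distinct values for one region cannot be collapsed into a single cell, so an asterisk is shown instead. A rough Python sketch of that behaviour (all names and figures are invented):

```python
# Hypothetical sketch of why Data-Blending shows asterisks: the secondary
# source is aggregated per linking field, and a dimension (Employee) with
# several distinct values per region cannot be collapsed into one cell.
regions = ["Americas", "Europe"]              # primary source ("Oracle")
employee_sales = [                            # secondary source ("Excel")
    ("Americas", "Smith", 500),
    ("Americas", "Jones", 300),
    ("Europe", "Koch", 400),
]

blended = {}
for region in regions:
    names = {emp for reg, emp, _ in employee_sales if reg == region}
    # One distinct employee -> show it; more than one -> asterisk
    blended[region] = names.pop() if len(names) == 1 else "*"

print(blended)  # {'Americas': '*', 'Europe': 'Koch'}
```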


2. Cross-Database Join

Cross-Database Join is the new feature that Tableau 10 provides to cross data between different sources much faster and without any additional technical knowledge. Let’s explain how to perform a Cross-Database Join, step by step, using the same example and data sources as before.

First, we need to include the Oracle DB table as a primary source and the Excel file as a secondary source. Once both data sources are available, we need to carry out the following steps to apply Cross-Database Join:

1. Click on the localhost connection (the HR Oracle Schema data source).

2. In the table area, double-click on REGIONS to use this entity.

3. Perform the same steps to include the Employee Sales entity from the Employee connection (Excel file). You will now see that Tableau tries to join both entities automatically.

4. Click on the circles icon; a Join set-up window will appear.

5. Select which kind of join you want to perform.

6. Select the specific fields of each entity that will be used to apply the join. In our example, they are the “Region Name” field for the REGIONS entity and the “region Name1” field for Employee Sales.

7. After following the above steps, just click on “Update Now” to display the join results.

Figure 4: Cross-Database Join, step by step with Tableau 10


If we reproduce the same analysis as before (sales by employee and by region), Tableau 10 aggregates fields from the secondary data source without any issues, and no asterisks appear in the analysis.

Figure 5: Data-Blending Tableau 9.3


Figure 6: Cross-Database Join Tableau 10

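Conceptually, the Cross-Database Join configured above is just a relational join across two separate databases. A minimal sketch using Python's built-in sqlite3 module, with two in-memory databases standing in for the Oracle table and the Excel sheet (all table names and figures are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")             # stands in for the Oracle side
con.execute("CREATE TABLE regions (region_name TEXT)")
con.executemany("INSERT INTO regions VALUES (?)",
                [("Americas",), ("Europe",)])

con.execute("ATTACH ':memory:' AS excel")     # stands in for the Excel side
con.execute("CREATE TABLE excel.employee_sales "
            "(region_name TEXT, employee TEXT, sales INT)")
con.executemany("INSERT INTO excel.employee_sales VALUES (?, ?, ?)",
                [("Americas", "Smith", 500),
                 ("Americas", "Jones", 300),
                 ("Europe", "Koch", 400)])

# Inner join on the linking field: one row per employee, no asterisks
result = con.execute("""
    SELECT r.region_name, e.employee, e.sales
    FROM regions r
    JOIN excel.employee_sales e ON e.region_name = r.region_name
    ORDER BY r.region_name, e.employee
""").fetchall()

print(result)
# [('Americas', 'Jones', 300), ('Americas', 'Smith', 500), ('Europe', 'Koch', 400)]
```

Because the join works at row level, each employee keeps its own row, which is exactly why the asterisks disappear.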

It’s easy to see the benefits of this new feature. Cross-Database Join functionality will allow us to cross data between different data sources and types in an easier and more intuitive way (avoiding those painful asterisks when using Data-Blending). It is a very interesting improvement that many Tableau users will welcome when creating their daily scorecards.

If you want to know the latest news about Tableau 10, check out our previous post and stay updated!

