OBIEE12c Integration with Oracle EBS Security


Integration of Oracle Business Intelligence Enterprise Edition and Oracle E-Business Suite provides a seamless, controlled flow between the systems, allowing for drill-down and drill-back from key metrics to the underlying detail. In principle, this can be done between any OBIEE form and any EBS form, with pass-through to any connected EBS subsystem.

If you are integrating OBIEE with EBS, you are likely to be using OBIA, Oracle Business Intelligence Applications, although this is certainly not a requirement. OBIA is a pre-built, pre-packaged BI solution that delivers role-based intelligence to the organization. It is a set of OBIEE dashboards and reports that run from a pre-built warehouse, historically loaded by Informatica/DAC; the next generation of the OBIA warehouse uses Oracle's Data Integrator (ODI), which runs high-volume batch load plans, event-driven load plans, and even SOA data services.

1. OBIEE 12c Configuration

While configuring an initialization block to retrieve data from EBS, make sure that Row-wise initialization is checked: this allows multiple results to be stored in the variable, regardless of whether the variable is static or dynamic; otherwise you will only be able to retrieve the last item in the result set. Also set the Execution Precedence of the EBS Integration init block that attaches the session through the session cookie, so that it executes before any attempt is made to retrieve security information.

Figure 1: EBS Configuration

Two files must be modified in order for WebLogic to find, accept and attach to the EBS session.

• instanceconfig.xml

• authenticationschema.xml

To configure external authentication, you will need to modify instanceconfig.xml as follows:
Path to instanceconfig.xml:

$BI_HOME/config/fmwconfig/biconfig/OBIPS

Note: Take a backup of the file before editing.

Add “EBS-ICX” to the EnabledSchemas XML tag:

<Authentication>
<!--This Configuration setting is managed by Oracle Enterprise Manager Fusion 
Middleware Control-->
<EnabledSchemas>UidPwd,Impersonate,UidPwd-soap,Impersonate-soap,EBS-ICX</EnabledSchemas>
</Authentication>

The middleware already recognizes "EBS-ICX" as the internal schema name for the EBS interface; Oracle has done that work for you. It is only necessary to tell the system that this schema is to be used, and then to give it the name of the session cookie that EBS writes.
To configure authenticationschema.xml, you need to know the cookie name of your Oracle EBS instance. Follow these steps to get it.

1. Login to Oracle EBS

2. If you are using Google Chrome or Firefox, open Dev Tools ➜ Web Console and run one of the following commands:

javascript:alert(document.cookie)

or

javascript:document.write(document.cookie)

The alert command will pop up a dialog box as follows, while document.write will display the cookie information in the browser window.

Figure 2: Alert command pop up

Notice the key-value pair of the cookie: ERPU1 is the cookie name of this Oracle EBS test instance, and the value is dynamically generated for each user at each login. We only need the key, which is “ERPU1”. We will now use this value in the authenticationschema.xml file.

Path to authenticationschema.xml file:

Obiee/bi/bifoundation/web/display

Note: Take a backup of the file before editing.

Edit the following tags in the file:

<SchemaKeyVariable source="cookie" forceValue="EBS-ICX" nameInSource="ERPU1" />

 

<RequestVariable source="cookie" type="auth" nameInSource="ERPU1" 
biVariableName="NQ_SESSION.ICX_SESSION_COOKIE" />

As per Oracle Doc ID 2141505.1 ("Access Prohibited" When Logging In To Analytics In EBS Integrated Environment), the following tag needs to be added to the file under the “AuthenticationSchema” element.

<RequestVariable source="constant" type="auth" nameInSource="ssi" 
biVariableName="NQ_SESSION.SERVICEINSTANCEKEY" />

That’s it for OBIEE configuration!

 

2. RPD Changes using Administration Tool

Here comes the part that is familiar to every OBIEE administrator: the RPD modifications. If you are following the Oracle document, you can use the sample EBS connection pool, or create a new one just for the initialization process and the retrieval of security information.
Create database objects and connection pools for the Oracle EBS database.

Figure 3: EBS Connection Pool

Note: The APPS user should have the READ and EXECUTE permissions needed to run the PL/SQL queries. If not, grant the privileges to the APPS user.
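If your connection pool connects with a dedicated account rather than APPS itself, grants along the following lines would be needed (a sketch: bi_reader is a hypothetical grantee; use the account configured in your connection pool):

-- bi_reader is a hypothetical account; substitute your connection pool user
GRANT EXECUTE ON APPS.FND_GLOBAL TO bi_reader;
GRANT SELECT ON APPS.FND_USER TO bi_reader;
GRANT SELECT ON APPS.FND_USER_RESP_GROUPS TO bi_reader;
GRANT SELECT ON APPS.FND_RESPONSIBILITY_VL TO bi_reader;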

Now, create an init block that will use this connection pool to retrieve the EBS context and set it into OBIEE session variables. The init block will use the newly defined connection pool and will send a data source query to the EBS database:

SELECT
FND_GLOBAL.RESP_ID,
FND_GLOBAL.RESP_APPL_ID,
FND_GLOBAL.SECURITY_GROUP_ID,
FND_GLOBAL.RESP_NAME,
FND_GLOBAL.USER_ID,
FND_GLOBAL.EMPLOYEE_ID,
FND_GLOBAL.USER_NAME
FROM DUAL

Figure 4: EBS Security Context

Referring to another Oracle document, Doc ID 1539742.1, create these session variables to hold the context:

EBS_RESP_ID
EBS_RESP_APPL_ID
EBS_SEC_GROUP_ID
EBS_RESP_NAME
EBS_USER_ID
EBS_EMPLOYEE_ID
USER
ROLES

Note: You have to create another init block, named “EBS Security Context – ROLES – Row wise”, just for ROLES, as a user may have more than one Role in Oracle EBS; this init block is set for row-wise initialization.

Figure 5: EBS Security Context - Roles

Figure 6: EBS Security Context - Roles - Row Wise

The following query fetches all the Responsibilities of the logged-in user and assigns them to the variable ROLES. With row-wise initialization, each returned row is a (variable name, value) pair, so a user with three responsibilities produces three rows, and ROLES ends up holding all three values.

SELECT DISTINCT 'ROLES',
RESPONSIBILITY_NAME
FROM FND_USER,
FND_USER_RESP_GROUPS,
FND_RESPONSIBILITY_VL
WHERE FND_USER.USER_ID = FND_USER_RESP_GROUPS.USER_ID
AND FND_USER_RESP_GROUPS.RESPONSIBILITY_ID = FND_RESPONSIBILITY_VL.RESPONSIBILITY_ID
AND FND_USER_RESP_GROUPS.RESPONSIBILITY_APPLICATION_ID = FND_RESPONSIBILITY_VL.APPLICATION_ID
AND FND_USER_RESP_GROUPS.START_DATE < SYSDATE
AND (CASE WHEN FND_USER_RESP_GROUPS.END_DATE IS NULL THEN SYSDATE
ELSE TO_DATE (FND_USER_RESP_GROUPS.END_DATE)
END) >= SYSDATE
AND FND_USER.USER_ID = (SELECT USER_ID
			FROM FND_USER
			WHERE UPPER (USER_NAME) = UPPER('VALUEOF(NQ_SESSION.USER)')
			)

 

3. Oracle EBS Configuration

Now we need to introduce a responsibility for OBIEE through which a user can navigate to OBIEE from Oracle EBS.

1. Create a Function in EBS, using the Application Developer responsibility:

Figure 7: Oracle EBS Configuration - Form Functions - Description

2. In the Properties Tab, add as follows:

Function: OBIEE
Type: SSWA jsp function
Maintenance Mode Support: None
Context Dependence: Responsibility

Figure 8: Oracle EBS Configuration - Form Functions - Properties

3. In the Web HTML tab, add the following link:

Function: OBIEE
HTML Call: OracleOasis.jsp?mode=OBIEE&function=Dashboard

Figure 9: Oracle EBS Configuration - Form Functions - Web HTML

4. Create a Menu in Oracle EBS named “OBIEE Dashboard” and add the Function created in step 1:

Figure 10: Oracle EBS Configuration - Menus

Note: Create the Menu only for the OBIEE Dashboard.

5. Assign Menu to the relevant responsibility:

Figure 11: Oracle EBS Configuration - Users

6. Set Profile

You need to enter the URL of the Oracle BI Server as part of a profile. You can set up a profile for a responsibility, a user, or a site. The following procedure shows how to set profile options for a responsibility:

Figure 12: Oracle EBS Configuration - Find System Profile Values

You should use a fully qualified server.domain name rather than an IP address or just a host name. The OBIEE domain must be the same as the Oracle EBS domain, so that the EBS-ICX cookie is visible to OBIEE from the user's browser.

References:
OBIEE 12c: Integrating OBIEE 12c with Oracle E-Business Suite (EBS) Security (Doc ID 2174747.1)
Chapter 9: Oracle® Fusion Middleware Integrator's Guide for Oracle Business Intelligence Enterprise Edition.

Click here if you would like to receive more information about the topic or if you need help with your EBS-OBIEE configuration!

Unleash the power of R to forecast data from your OLAP Cubes with Tableau


R is a programming language and software environment for statistical computing, and the leading tool amongst statisticians, data miners and data scientists for performing sophisticated statistical analysis, data discovery and predictive analytics on any set of data: linear and non-linear modelling, classification, clustering, time-series analysis and so on.

R offers a wealth of functions and, as an open source project, its capabilities are constantly enhanced and extended through user-created packages; this is why it is so popular and just keeps growing.

In this article, we will see how R can be connected to Tableau to perform highly-customizable forecasting of your data from OLAP Cubes. Your users will benefit without the need for an advanced coding background, and visualize the results in the Tableau charts that they like the most, with the help of just a few neat tricks.

 

1. Forecasting with Tableau

Tableau already comes with a forecasting feature; it uses a technique known as exponential smoothing which tries to find a regular pattern in the measures that can be continued into the future.

To access this feature, simply drag a date and a measure to your sheet and right-click on the canvas. In this example, we are using Tableau’s Superstore sample dataset.

Figure 1: Tableau’s Superstore sample dataset.

Select Forecast ➜ Show Forecast, and the estimated line will appear. As you can see in the picture below, it’s very simple: Tableau fits the series and extends it into the future, here as an almost straight line, and it also shows the confidence interval. Right-click on the canvas again to access the Forecast Options, and an options pane will pop up.

Figure 2: Forecast Options

In the options pane (1 in the figure below), you can set a few parameters, such as the forecast length and the aggregation type; you can also change or remove the confidence intervals (called prediction intervals) and adjust the forecast model: it can be automatic, automatic without seasonality, or custom. If you choose custom, you will be able to select the type of Trend and Season: None, Additive or Multiplicative.

Figure 3: Forecast Options, Summary and Models

By right-clicking again and selecting Describe Forecast, you will be shown a pane with additional forecast information. You can see this in panes 2 and 3.

 

2. Why use R instead?

What we have described so far is basically everything that Tableau can offer when it comes to forecasting. It works pretty well, but it’s not really customizable and, above all, it doesn’t work with OLAP Cubes!

OLAP Cubes have many advantages, but unfortunately Tableau doesn’t allow us to do all the things we can do with a “traditional” data source, and forecasting is one of them (you can find a list of other OLAP-related limits here).

Luckily for us, Tableau is such a great tool that, despite not being able to perform forecasting on cubes, it offers us an even more powerful alternative: a smooth integration with R.

 

3. Integrating R in Tableau

There are 4 functions that allow Tableau to communicate with R: SCRIPT_REAL, SCRIPT_STR, SCRIPT_INT, SCRIPT_BOOL. They only differ in the return type, as suggested by their names.

They work very simply, and take at least 2 parameters. The first one is an actual R script: pure R code that is passed to the R engine to be executed, and whose last evaluated element is returned. The remaining arguments specify the values that are passed on; in the script, you refer to them as .arg1, .arg2, and so on, in the order in which they are declared.

R will take these values, interpret them as vectors (R basically only works with vectors and their derivatives), and return a vector of data that Tableau can use just like any other calculated field.
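For instance, a minimal calculated field (the measure names are purely illustrative) that asks R for the correlation between two aggregated measures would look like this:

SCRIPT_REAL("cor(.arg1, .arg2)", SUM([Sales]), SUM([Profit]))

Here cor is plain R code executed by the R engine, while .arg1 and .arg2 are the vectors of SUM([Sales]) and SUM([Profit]) values that Tableau sends, one entry per mark in the view.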

 

4. Connecting R to Tableau

In order to communicate, R and Tableau must be connected, and this is really easy to do:

1. Download R from here or R Studio from here, and install it

2. From the R console, download the Rserve package and run it:

install.packages("Rserve");
library(Rserve);
Rserve();

3. In Tableau, select Help ➜ Settings and Performance ➜ Manage External Service Connections

4. Enter a server name or an IP (on the same machine, use localhost or 127.0.0.1) and select port 6311

5. Test the connection by clicking on the “Test Connection” button, and press OK if you get a positive message

You can also automate the start of Rserve by creating an R script with the code mentioned in point 2 (without repeating the installation), saving it as a .R file, and calling it from a .bat file with “start Rscript myRserveScript.R”. You’ll find this pretty useful, especially in a server environment, because you can schedule its execution when the machine starts via the Task Scheduler, or by copying a shortcut to %AppData%\Microsoft\Windows\Start Menu\Programs\Startup (in Windows).
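As a minimal sketch, myRserveScript.R would contain only the lines needed to start the service (this assumes Rscript is on the system PATH when the .bat file calls it):

library(Rserve);  # assumes the Rserve package is already installed
Rserve();         # starts the daemon, listening on port 6311 by default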

After Rserve has started and Tableau is successfully connected, you can start using the 4 SCRIPT_* functions to exploit R’s power.

 

5. Forecasting in R

To perform forecasting in R, you have to download a user-made package called “forecast”.

To do so, simply write install.packages("forecast") in your R console, and load it by writing library(forecast) any time you need it.

It offers various types of fully-customizable forecast methods, like ARIMA, naïve, drift, Exponential Smoothing (ETS), Seasonal-Trend decomposition by Loess (STL) and so on, seasonal or not, with a number of different parameters.

For more details about the package, check the author’s slides, the package help PDF, or each function’s help page by writing ?func_name in the console (e.g. ?stlf).
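As a quick illustration, here is stlf at work in the R console on AirPassengers, a monthly dataset that ships with R (a sketch, assuming the forecast package is installed):

library(forecast);
fit <- stlf(AirPassengers, h = 12);  # STL decomposition + 12-month forecast
summary(fit);                        # point forecasts with 80% and 95% intervals
plot(fit);                           # plots the history, forecast and intervals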

 

6. Use case

In our use case, we will take a series of monthly measures from the last 3 years and forecast them with different frequency/period combinations. We will use the function called stlf, which, without getting bogged down in details, gives us more freedom when playing with periods and frequencies whilst still providing a solid result.

In the picture below, you can see our data, from January 2014 to October 2016.

Figure 4: Data, from January 2014 to October 2016.

We dragged our date dimension (Year Number Month Number) to the column field, and put our desired measure in the row field. The trend is not regular, but we can see how the line has a few peaks and generally tends to grow.

 

7. Writing the R scripts

Let’s see how to write the R scripts that perform the forecast.

Firstly, we need to know what values we will be passing: we have our measure (for R to accept it, it must be aggregated, but coming from the Cube it already is, and if not, the ATTR function can be called on to help) and the two integer parameters (create them in Tableau) for the number of periods to be forecasted and for the frequency of the season.

Let’s use them in this order, calling them ForecastMeasure, ForecastPeriods and Freq. The calculated field should initially look something like this:

SCRIPT_REAL("
data <- .arg1;
nulls <- length(data[is.na(data)]);
data <- data[!is.na(data)];
freq <- .arg3[1];
periods <- .arg2[1];
",
[Forecast Measure],[ForecastPeriods],[Freq]
)

In the first argument, we have already set up the R variables that collect the values passed to the script (this is not mandatory, just a way to keep the code cleaner). Remember that R works with vectors: notice how the first one (data) takes the entire .arg1, because we actually want the whole array of measures, while the parameters (simple integers) need to be taken from the first value of a 1-sized array. We then filter this vector to remove any null values, and we save the number of nulls in a variable called nulls (we will need it later).

We now perform the forecasting. Add the following lines to the script:

library(forecast);
time <- ts(data,frequency=freq);
fcast <- stlf(time, h=periods);

The first line simply calls the forecast library; we then create a time series object, by passing the measure and the frequency parameter (for more customizable details, check the resources already mentioned). Finally, we produce our forecasting result, by feeding the stlf function with the time series object and the periods parameter.

Our forecast is ready. If we use the same data and execute the script in R, the console will look like this:

The interesting columns are Forecast, Lo80, Hi80, Lo95 and Hi95. The first one contains the actual forecasted values; there are 12 of them because we set periods = 12 (and freq = 12, too). The others are the confidence interval bounds (the limits can be changed when calling the function).

Now let’s complete our script. There is one thing to remember: Tableau expects to receive the same number of values that it sent, so we append our forecasted array to the original one, properly shifted, along with rep(NaN, nulls), an array of nulls (we know how many there are because we saved the count). Add these lines to the script:

n <- length(data);
result <- append(data[(periods+1):n],fcast$mean);
result <- append(result,rep(NaN,nulls));
result

The result that we want is fcast$mean, which corresponds to the column called “Forecast” in the R output that we saw. We append it, along with any null values, to the segment of original data that starts from position periods+1, in order to get an array of the original size (note: by doing this, we won’t visualize the first periods elements).

That’s it! Your new calculated field should look like the one below. Drag it to the row field, and you get your forecast.

SCRIPT_REAL("
data <- .arg1;
nulls <- length(data[is.na(data)]);
data <- data[!is.na(data)];
freq <- .arg3[1];
periods <- .arg2[1];

library(forecast);
time <- ts(data,frequency=freq);
fcast <- stlf(time, h=periods);
n <- length(data);
result <- append(data[(periods+1):n],fcast$mean);
result <- append(result,rep(NaN,nulls));
result",
[Forecast Measure],[ForecastPeriods],[Freq]
)

Figure 5: Complete R script and result

 

8. Shifting time measures

Now the first issues appear. Tableau received a vector from R the same size as the one that was passed to it, so each date’s measure value is replaced by the new one coming from the calculated field. The problem is that the date hasn’t changed!

Normally, this could be fixed by creating a new calculated date with the help of the DATEADD function, but as we can’t add a dimension in cubes, we have to overcome this issue by getting the dimension value as a measure with an MDX formula, and then applying DATEADD to it.

Create a new calculated member, and set it as:

[Cube_name].[Dimension_name].MemberValue

Then create a calculated field like this:

DATE(DATEADD('month',[ForecastPeriods],DATE([YearNumberMonthNumber])))

Where [YearNumberMonthNumber] is (for example) the name of your newly created calculated member. This will give you a measure with the date shifted according to the forecast period that you selected (for example, with [ForecastPeriods] = 12 on monthly data, January 2014 is shifted to January 2015).

Figure 6: Tableau's Error

Drag this new field on top of the original date column and… your chart will crash!

Why? You’ve just created a new shifted date for every original date value, so your data is now “split” into “columns” of 1 value, one for each shifted month, and R will receive them one by one instead of getting an entire vector to process. This explains the error that says “series is not periodic or has less than two periods”: what R is actually seeing is a series of independent 1-sized vectors.

To fix this, there’s a simple but neat trick: add your original date field to “Detail”, so that it is available in the sheet, and edit your calculated forecast field as shown below.

Figure 7: Table Calculation - Forecast (Months)

This will make Tableau process the calculated field at the correct level, and help R see the received data in the correct way, thus producing the desired result.

That’s it! You’ve just forecasted your OLAP cube!

Figure 8: OLAP cube forecasted

 

9. Adding confidence intervals

You can also add confidence intervals to improve your visualization: just create two more metrics, copying the one you created for the forecast, and change fcast$mean to fcast$lower[,1], fcast$lower[,2], fcast$upper[,1], or fcast$upper[,2], depending on the limit that you want to show (remember that the defaults are 80 and 95, but these are editable in the R function).
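As a sketch, the field for the lower 95% bound keeps exactly the same script as the forecast field and only changes the series that is appended:

SCRIPT_REAL("
data <- .arg1;
nulls <- length(data[is.na(data)]);
data <- data[!is.na(data)];
freq <- .arg3[1];
periods <- .arg2[1];

library(forecast);
time <- ts(data,frequency=freq);
fcast <- stlf(time, h=periods);
n <- length(data);
result <- append(data[(periods+1):n],fcast$lower[,2]);
result <- append(result,rep(NaN,nulls));
result",
[Forecast Measure],[ForecastPeriods],[Freq]
)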

In the picture below you can see the forecast with the 95% bounds (fcast$upper[,2] and fcast$lower[,2]), and the trend line (calculated by Tableau).

Figure 9: Forecast final visualization

 

10. Conclusions

In this guide we have shown how R can help you build great forecast charts for your OLAP Cube data, but this isn’t the only benefit: we got a much higher degree of customization, and overcame the limitations of Tableau when it comes to defining parameters, methods or entire algorithms.

Once Tableau can connect to R, its potential is virtually unlimited, and it can enhance your charts in an infinity of ways: from any source (not necessarily OLAP Cubes), or with any goal, just send the data to R and let it do the job, whatever it is! Forecasting, clustering, classification, or anything else with an R package written for it – you could even write one yourself! Then visualize the result in Tableau, and adjust the look and feel of your dashboard with the powerful features that you love.

Click here if you would like to receive more information about the topic!

Oracle Essbase End-user Tools


Turn around your Excel tables with the SmartView plug-in and discover further insights with Tableau

Excel is the most widespread tool across organizations for dealing with data, and whilst it is clear that it puts a big package of features and capabilities in the users’ hands, it is also worth noting that there are more options to enhance its performance and capabilities.

In this Oracle Essbase End-user Tools article, we'll explore how Essbase and its specific add-in for Excel can take the performance of your staff to another level when a support tool for data-driven decision-making or data analysis is required. Not only will we present the Excel add-in for Essbase, but we will also introduce Tableau as a complementary tool to analyze data from Essbase, thanks to its built-in connector.

 

1. Main Motivation

Excel is the world’s favorite data analysis tool, but all too often we see how a BI solution built with a particular tool is then exported to Excel so that business users can carry on working in a comfortable environment.

Instead of trying to fight the end-users’ natural instincts, setting Excel against the BI system, a more suitable approach would perhaps be the following: Why shouldn't the BI system use Excel as its final consumption tool?

You might think that a regular Excel sheet is not fancy or powerful enough to meet the requirements of a BI report tool, but suppose that the BI system is going to use an enhanced version of Excel, specifically designed to gain crucial capabilities in two main aspects:

• Lightning-fast access to corporate data

• User interactivity with data. Drill-downs, drill-throughs, and other capabilities to explore and navigate the data

 

2. Essbase Technical Overview

Before starting a more detailed description of how Essbase could enhance your BI stack, let’s review what Essbase actually is from a technical perspective.

Essbase is an On-Line Analytical Processing (OLAP) technology consisting of a Multidimensional Database Management System (MDBMS) for storing and analyzing data using a multidimensional approach. The applications developed with Essbase are called cubes (or hypercubes).

In contrast with On-Line Transaction Processing (OLTP), OLAP is focused and optimized for processing queries instead of transactions.

It is also interesting to mention the etymology of the product name: Essbase stands for Extended Spread Sheet dataBASE, so it’s easy enough to see that it is optimized for analyzing Excel-like data and enhancing Microsoft Excel for analytic purposes.

In particular, Essbase shines at financial reporting, scenario analysis (such as actual, budget, forecast, etc.) and, in general, at analyzing highly hierarchical data and complex formula calculations.

 

3. Benefits and enhancements of empowering Excel with Essbase

So far, we have introduced Essbase as a tool that could empower Excel in order to provide an enhanced experience to Business users working with data in spread sheets, and we have also outlined the technology behind it. But what exactly are the specific features and capabilities that this OLAP system can really offer Excel?

Let’s have a look at some of what Essbase can do for Excel users:

• Build your own ad hoc analysis in a user-friendly way.
And remember you will still be using your favorite spreadsheet program!

• Navigate the data.
Once a cube is accessed from Excel, exploring the data through all its dimensions is both fast (we are talking about sub-second query times) and intuitive. After a quick training session on the new features offered by this tool, users will find themselves slicing and dicing the cube according to the specific information they are looking for.

• Drill-down through the hierarchies of your dimensions.
Cube dimensions are designed in a hierarchical fashion in Essbase, and aggregation rules are always defined for its measures. This provides built-in drill-down functionality, allowing the end-user to retrieve data with the best-fitting detail for each analysis.

• Access pre-built reports directly in Excel.
Although ad hoc analyses may be the jewel in the crown, SmartView for Excel also provides functionality to develop reports so that they can be directly accessed by end-users.

• No exporting and importing data.
There is no need to export the requested data from any BI presentation tool for subsequent import to Excel; the data is already in Excel, and you only need to access the cube, look for the required data and build your own custom analysis. Bear in mind that once the data is in your spreadsheet, no one is going to prevent you from applying formulas to get new calculations to support your analysis.

 

4. Hands On: SmartView add-in for Excel

Now everything’s ready for show-time! Let’s proceed to a brief demonstration of how to access your Essbase applications (cubes) with the SmartView plug-in for Excel.

Once the add-in (kindly provided by Oracle) is installed in the user’s local machine, the SMART VIEW tab will be available right after the default ones.

The very first time, it will be necessary to set up the connection to Essbase. Once this is done, connecting to the server is going to be as easy as:

➜ Open the SMART VIEW tab

➜ Click the Panel button

➜ Access Shared Connections

➜ Fill in your user name and password

Figure 1: Applications (cubes) with the SmartView plug-in for Excel

Note that all the security settings for data governance are managed through user accounts in the Oracle WebLogic domain.

Once logged in, we will be able to select the Oracle® Essbase server, where the developed applications will be shown; expanding the desired application will show the cube deployed.

Clicking the cube (in this example we are selecting TBC_PO) will bring up different options to access and manage it.

Figure 2: Excel SmartView Shared Connections

In this case, we select Ad hoc analysis, thus loading the cube and making the ESSBASE tab available:

Figure 3: Excel Ad hoc analysis

Here we can see the aggregation of total profit across all the dimensions. We can move dimensions in and out of the spreadsheet, set filters and drill-down to the level of detail required. In this article, we will perform two different ad hoc analyses:

 

Analysis 1

We want to look at the Sales per region across the year 2000. We will need both Market and Time dimensions:

Figure 4: Analysis 1

Double-clicking the Market dimension will drill-down to the next level of the Market hierarchy:

Figure 5: Analysis 1 - Market hierarchy and Time

In a few steps, we obtain the desired dataset (by drilling-down and pivoting the dimensions):

Figure 6: Analysis 1 - Sales by Market and Time

Analysis 2

Now let’s say that we are interested in the Sales by Product type and quarter, across the year 2000.

We will need both Product and Time dimensions. Then, as we did in the other analysis, we will drill-down until we get the desired detail of data.

Let’s have a look at the final dataset:

Figure 7: Analysis 2 - Sales by Product and Time

Now that the desired data has been gathered, it is up to the user whether there are going to be further steps in the analysis or not. In any case, the good news is that the data is already in Excel, so it is 100% ready to be played with.

As you can see, in a few steps we can build our own ad hoc analyses by querying the Essbase cube, which provides us with fast access to corporate data. And there is no need for the end-user to work in any BI-specific client tool; everything is done in the friendly Excel domain.

 

5. Hands On: Tableau with Essbase data sources

Now that we have seen the classic end-user tool for accessing Essbase applications, we are going to introduce an alternative, more oriented to data discovery and visualization, in a modern-day BI fashion. There are a number of similar tools covering this market, such as QlikView or Microsoft Power BI.

In this article we will introduce Tableau, which comes with a pre-built connector for Essbase. In particular, we will show how to perform the same analyses as above with SmartView, but taking advantage of the advanced visualizations to give them a special twist.

First of all, we need to make the connection to the cube. In order to do so, we have to:

➜ Click More Servers

➜ Select Oracle Essbase

➜ Fill in the connection properties

Figure 8: Tableau with Essbase data sources connection properties

Figure 9: Oracle Essbase: Connection

Note that a specific driver is required for this connection; if it is not already installed in the client computer, Tableau will provide a link to download it.

Once the connection with the server is set up, we need to select the application and the Database:

Figure 10: Tableau Database

Clicking Sheet 1 will open the default blank sheet for the selected cube. Note that both Dimensions and Measures come with pre-built hierarchies (those we set up in Essbase):

Figure 11: Tableau Clicking Sheet 1

Now we can proceed with the two analyses:

 

Analysis 1

We want to look at the Sales per region across the year 2000. In this case, we can take advantage of Tableau’s map visualization, and also use an animation to represent the Time dimension:

Figure 12: Oracle Essbase: Analysis 1 - Sales per region across the year 2000

It’s a rather different output from before, isn’t it? Of course, there is no good or bad output; it all comes down to each analysis’s requirements. This one is much more useful in a scenario in which we are attempting to gain some extra insight from our data, while in the other, we might be more interested in the specific amount ($) of Sales for each region and month, or perhaps plan to use it as the input data for further calculations.

 

Analysis 2

We are interested in the Sales by Product type and quarter, across the year 2000. It is worth noting that we have the functionality to drill-down in the Time dimension, so that we can display the information by quarter instead of by year:

Figure 13: Oracle Essbase: Analysis 2 - Sales by Product type and quarter, across the year 2000

Again, the outputs of an equivalent analysis in Tableau and in SmartView are quite different; nevertheless, we maintain that both tools are complementary, each one better suited to different analytic scenarios.

Finally, I would like to provide a (non-exhaustive) summary list of extra features and limitations:

• Extra features
- Dynamic hierarchies inherited from the database (no need to build them in Tableau)
- Alias file support, in order to display the aliases created in Essbase
- Creating Calculated Members for the Measures, through MDX queries to the cube
- Security settings and functionality preserved from the Essbase configuration (within Weblogic domain)

• Limitations
- Extracts are not allowed, only direct connections to the cube
- Drill-down is not available for Measures
- It is not possible to create Dimension Calculated fields
- It is not possible to create Constraints for Level of Detail Expressions

 

Conclusion

In this article, we have been looking at how Essbase can empower your spreadsheets and take them to a new level. We have also introduced Tableau as a complementary alternative to SmartView for Excel, which may be better suited to certain analytic situations.

So far we have only spoken about end-user applications and how they can be empowered by Essbase, so now let’s have a brief look at some of the generic benefits of using Essbase:

• Centralized and consistent corporate information, with security settings governed by Weblogic server

• Outstanding performance and features for financial analytics

• Built-in writeback capabilities

• Fast, in-memory database. Response query times much lower than RDBMS

It is also worth noting that we have only mentioned a few end-user tools, but these are not the only ones. The different tools can be placed in three main groups:

• Specific tools for Essbase: like SmartView for Office or EAS Console

• Traditional BI tools: like OBIEE, which can have Essbase as one of its data sources

• Modern BI tools more focused on data discovery: like Tableau or QlikView

To summarize, Essbase is a data storage tool that excels in fields where an RDBMS can’t match its performance: analytics with highly hierarchical information and complex formula calculations. Besides its core technical features, it performs really well as a database system for spreadsheet data, both because of its design and its SmartView client tool. And it’s not limited to spreadsheets: there are also more data-discovery-oriented tools offering built-in connectors.

So, if you are looking for a way to enhance your company’s analytics, especially if you want to boost Excel or if you are in a financial environment, Essbase may just be the back-end tool you are looking for!

Click here if you would like to receive more information about the topic!

Cross-Database Join functionality with Tableau 10


Cross-Database Join, one of the most anticipated features in Tableau 10

Tableau 10 comes with a great new feature called Cross-Database Join, which allows us to cross data between different sources easily and intuitively.

In previous Tableau versions, you needed the Data-Blending solution to join data from different databases. This feature works well enough for one-to-one relationships, but unwanted asterisks pop up when we want to perform a join across one-to-many relationships. JOIN Data from Different Sources was one of the most voted-for ideas in the Tableau community for avoiding this scenario, and at last we got this great feature in August with Tableau 10.

In this article we are going to reproduce these painful asterisks by applying Data-Blending and then explain how to use Cross-Database Join functionality to escape such limitations.

 

1. Data-Blending

Imagine that we want to analyse sales by employee and region and that the data comes from different sources:

➜ Oracle table: contains region details

➜ Excel file: contains region details and sales by employee

Notice that in the second data source, there are multiple employees by region, so in order to cross data between both data sources we use the Region Name field whose relationship is one-to-many.

 

Figure 1: Primary Data source

Figure 2: Secondary Data source

As we mentioned before, when we apply Data-Blending in one-to-many relationships we get asterisks in those cases where a specific region contains more than one employee.

Figure 3: Data-Blending

Until now, there was no quick way to avoid these asterisks, and technical IT knowledge was needed to apply database federation between connections.

 

2. Cross-Database Join

Cross-Database Join is the new feature that Tableau 10 provides to cross data between different sources much faster and without any additional technical knowledge. Let’s explain how to perform a Cross-Database Join, step by step, using the same example and data sources as before.

First, we need to include the Oracle DB table as a primary source and the Excel file as a secondary source. Once both data sources are available, we need to carry out the following steps to apply Cross-Database Join:

1. Click on the localhost connection (the HR Oracle schema data source).

2. In the table area, double-click on REGIONS to use this entity.

3. Perform the same steps to include the Employee Sales entity from the Employee connection (Excel file). Now you can see that Tableau tries to join both entities automatically.

4. Click on the circles icon; a Join set-up window will appear.

5. Select which kind of join you want to perform.

6. Select the specific fields of each entity that will be used to apply the join. In our example, they are the “Region Name” field for the REGIONS entity and the “Region Name1” field for Employee Sales.

7. After following the above steps, just click on “Update Now” to display the join results.

Figure 4: Cross-Database Join, step by step with Tableau 10

If we reproduce the same analysis as before (sales by employee and by region), Tableau 10 aggregates fields from the second data source without any issues, and asterisks do not appear in the analysis.

Figure 5: Data-Blending Tableau 9.3

Figure 6: Cross-Database Join Tableau 10

It’s easy to see the benefits of this new feature. Cross-Database Join functionality will allow us to cross data between different data sources and types in an easier and more intuitive way (avoiding those painful asterisks when using Data-Blending). It is a very interesting improvement that many Tableau users will welcome to create their daily scorecards.

If you want to know the latest news about Tableau 10, check out our previous post and stay updated!

Click here if you would like to receive more information about the topic!

Tableau 10 Release!


Introduction

The Tableau 10 release is already here, and we’d like to highlight the most important new features available in this version. Some of them will revolutionize how we develop with the tool, whilst others will make our lives much easier - just check out the cross-database join functionality as an example. It also presents some new functionalities for Tableau end users, such as the highlighter; and, of course, Tableau is still improving the look and feel of the tool and its amazing visualizations, and it is now more flexible for mobile and tablet designs, allowing you to consume your reports anywhere.


 

1. Analytics

We can see some interesting new features in analytics. One of the most relevant is the cross-data-source filter: with this capability we can easily set a filter across the data sources feeding our visualizations. We can also build a cluster analysis by just dragging and dropping the functionality onto our report, thus allowing the detection of patterns in the data. Finally, with the highlighter function, the dashboard consumer can mark the desired data. Other interesting features are custom territories, new bar chart options and more advanced analytics.

 

2. Data

In Tableau 10 we can see some big improvements in the way we interact with our data sources. The most important is the cross-database join feature, which allows us to make joins between tables coming from different data sources. This will avoid data-blending in many situations and thus multiply the analysis possibilities. In this version of Tableau you can also get email alerts and notifications if there is any issue when refreshing a published extract. In terms of data, there are some other new features, like a connector to Google Sheets, enhanced map data, and more Oracle options in the connection.

 

3. Beauty

Tableau has also invested heavily in developing the look and feel of the tool and the resulting creations of the users. The workbook formatting feature allows you to apply a format to the entire workbook, thus saving time, and global changes are quicker. We can also see enhancements in the predefined fonts and titles.

 

4. Mobile

Here we can see another big improvement: the device designer feature can make a real difference as you can now define how you want to display your dashboards depending on the client’s device. You can design a different disposition of the visualization for mobile, tablet and desktop; and needless to say, there is an Android version of the mobile app.

 

5. Web

Here there are some improvements that offer more flexibility to web users: with web authoring it is possible to create new dashboards and add new data sources in the Tableau Server editor; with the revision history feature, the admin will be able to access the history of the published workbooks and data sources and restore or download an older version if desired. Furthermore, there’s more information on the statistics of your data sources and the publish flow to the server is now more user-friendly.

 

6. Enterprise

In Tableau 10 there is also space for improvements from an IT perspective. One of the most important is the subscribe-others function, allowing you to subscribe other Tableau users to your visualizations so that they automatically receive the view via email. There are other new features for admins, like the licensing views, new API functionalities for developers, SAML support, and noticeable performance improvements.

 

Click here if you would like to receive more information about Tableau 10 or if you need any guidance with the migration process!

 
