As the dust begins to settle, the hype around Big Data is slowly giving way to more realistic thinking about how to actually make it work. At this point, almost everyone has heard of Big Data and has a clear idea of what it is and what the benefits of putting a Big Data strategy in place could be. However, one of the main adoption barriers is still the technical complexity involved in envisioning and deploying a Big Data solution.

These kinds of projects are usually driven by IT and the CIO in the early stages. The key at this point is to identify a use case that proves the investment in a Big Data project is worthwhile; this use case has to clearly demonstrate that, thanks to the new data analysis strategy, unprecedented insights can be unleashed, enabling game-changing decisions for the company.

For many companies this is actually the easy part: many CIOs were already aware of the huge value of the data being produced, and yet they were unable to bring that data into their BI systems because of its size, speed, lack of structure, and so on. Now Big Data seems to make that possible, but one question remains: how are we going to do it?

As part of the Oracle Engineered Systems portfolio, Oracle Exalytics is today on the radar of every company looking for a high-performance, state-of-the-art BI platform.

In fact, Exalytics blends specifically chosen hardware components with unique versions of Oracle Business Intelligence software to deliver best-in-class performance by leveraging in-memory database techniques. If you want to read more about Exalytics, click here.

We have recently been working with customers to test the appliance's capabilities on their actual OBIEE project implementations. The purpose of this blog entry is to share with you the main challenges and achievements of a realistic, real-life Exalytics deployment scenario.

Last week the annual Oracle OpenWorld convention took place in San Francisco, and as always there were some interesting announcements about the Oracle analytics suite of products. As we could see in Mark Hurd's keynote on the 4th of October, one of the main problems that organizations are going to face in the near future is the growth of data.

GoldenGate is a tool that Oracle acquired in 2009; it allows us to capture, route, transform and deliver data between heterogeneous systems in real time, with very low impact on the source systems.

So basically Oracle GoldenGate (OGG from now on) moves data from point A to point B, but it does so in a brilliant way. Let's take a detailed look at its main features (a minimal configuration sketch follows the list):

Very low impact on data sources – OGG does not disturb the source systems when fetching new data. To achieve this, it does not actually query the database layer; instead, it continuously mines the redo log files (or the equivalent technology on other platforms) to detect when a transaction has been committed.

Heterogeneous systems – Thanks to ODBC technology and the Vendor Access Modules (VAMs), OGG can interconnect a good number of database vendors with each other, although only those that offer referential integrity can be chosen as source systems.

Real time – Maybe you've heard that OGG can transfer data with sub-second latency. That is true if your integration scenario meets certain conditions and the physical distance between source and destination is not huge; even in the worst circumstances, though, we are talking about really low latencies.
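To make the low-impact, log-based capture more concrete, here is a minimal sketch of an OGG setup. The process names (ext1, rep1), the hr.employees table, the trail paths and the credentials are hypothetical placeholders; a real deployment would also involve a Manager process on each side and, typically, a data pump to ship the trail to the remote host.

    -- On the source, register a log-based (redo-mining) Extract from GGSCI:
    GGSCI> ADD EXTRACT ext1, TRANLOG, BEGIN NOW
    GGSCI> ADD EXTTRAIL ./dirdat/lt, EXTRACT ext1

    -- dirprm/ext1.prm: capture committed changes for one table into a trail
    EXTRACT ext1
    USERID gguser, PASSWORD ********
    EXTTRAIL ./dirdat/lt
    TABLE hr.employees;

    -- On the target, register a Replicat that reads the trail:
    GGSCI> ADD REPLICAT rep1, EXTTRAIL ./dirdat/lt

    -- dirprm/rep1.prm: apply the changes, assuming identical table structures
    REPLICAT rep1
    USERID gguser, PASSWORD ********
    ASSUMETARGETDEFS
    MAP hr.employees, TARGET hr.employees;

The key detail is the TRANLOG clause: Extract reads the redo stream rather than querying the user tables themselves, which is why the impact on the source database stays so low.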

