When we say something is in-memory, we mean that it resides in RAM (Random Access Memory) rather than on disk. For a CPU to process data, that data has to be brought into memory; if it is already in memory, the usual slow disk I/O is avoided, which is what makes in-memory computing so fast.
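As a rough illustration of that gap, the Python sketch below (with entirely made-up records) reads the same data repeatedly from a small file on disk and from a list already resident in RAM. The operating system's page cache narrows the difference for a file this small, but the principle is the same.

```python
import os
import time

# Hypothetical micro-benchmark: disk-backed access versus data that is
# already resident in memory. Records and counts are invented.
records = [f"record-{i},{i * 2}" for i in range(200_000)]
with open("records.txt", "w") as f:
    f.write("\n".join(records))

start = time.perf_counter()
for _ in range(20):
    with open("records.txt") as f:        # disk-backed: re-read each pass
        rows = f.read().splitlines()
disk_s = time.perf_counter() - start

start = time.perf_counter()
for _ in range(20):
    rows = list(records)                  # in-memory: already resident in RAM
memory_s = time.perf_counter() - start

print(f"disk: {disk_s:.4f}s  memory: {memory_s:.4f}s")
os.remove("records.txt")
```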

The fall in RAM prices and the advent of 64-bit operating systems have enabled manufacturers to build machines with a few terabytes of main memory. For most practical purposes, that much memory can potentially hold your entire data warehouse. In simple terms, this means you don't need to store your data in tables on disk, and you don't have to worry about indexing, query optimization and so on. This is also a step beyond caching, a widely used technique for speeding up query performance, because a cache holds only a subset of pre-defined, organized data produced by a specific query. When the underlying data changes, or the query itself is modified, a new subset has to be fetched from disk and cached all over again.
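To make the contrast concrete, here is a minimal Python sketch, assuming a made-up sales table held in a pandas DataFrame. Once the whole table is resident in memory, a brand-new question is just another computation over it; there is no index to build, no cache to invalidate and no re-fetch from disk.

```python
import numpy as np
import pandas as pd

# Hypothetical "warehouse" table held entirely in RAM. Column names
# and values are invented for illustration.
rng = np.random.default_rng(0)
n = 1_000_000
sales = pd.DataFrame({
    "region":  rng.choice(["North", "South", "East", "West"], size=n),
    "product": rng.choice(["A", "B", "C"], size=n),
    "amount":  rng.uniform(10.0, 500.0, size=n),
})

# Two different questions answered from the same in-memory table;
# changing the question does not require fetching and caching anew.
print(sales.groupby("region")["amount"].sum())
print(sales.groupby("product")["amount"].mean())
```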

Business Intelligence (BI)

BI has two main components: Reporting and Analytics. Analytics is the difficult part, which involves parsing huge datasets and performing comparisons and computations at an organization level to understand shopping patterns, usage patterns and so on. How fast this analysis can be performed depends on how quickly each data record can be accessed and how quickly a computation can be done. This is where in-memory processing can revolutionize Analytics: the entire set of records is available in memory, so no time-consuming disk fetch is involved.
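As a toy illustration, the sketch below uses an invented purchase table in pandas to show how such a pattern question becomes a single pass over records that are already in memory.

```python
import pandas as pd

# Hypothetical purchase history resident in memory; customers,
# categories and amounts are made up for illustration.
purchases = pd.DataFrame({
    "customer": ["c1", "c1", "c2", "c2", "c3", "c3", "c3"],
    "category": ["grocery", "electronics", "grocery", "grocery",
                 "clothing", "grocery", "electronics"],
    "spend":    [40.0, 250.0, 55.0, 32.0, 80.0, 25.0, 300.0],
})

# Shopping pattern: how spend splits across categories per customer,
# computed directly over the in-memory rows.
pattern = purchases.pivot_table(index="customer", columns="category",
                                values="spend", aggfunc="sum", fill_value=0)
print(pattern)
```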

Scenario

Let’s look at a scenario where the entire dataset needs to be made available to the user in a report, and the user can drill through it using filters. A conventional BI tool rarely queries the entire dataset from the database, because that would take far too long. Instead, it pulls data at a high level and fetches the detailed records only when the user drills down, and each such fetch involves a time-consuming query. An in-memory BI solution, on the other hand, can make the entire dataset available right at the start, so the initial load and every subsequent drill-down happen at the speed of thought, with no disk queries required.
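The pandas sketch below is a simplified stand-in for such a tool, using invented rows. The detail is loaded once (here built inline; a real tool would load it from the source system up front), and each drill-down is just an in-memory filter rather than a fresh database query.

```python
import pandas as pd

# Hypothetical detailed sales rows held in memory; all values invented.
detail = pd.DataFrame({
    "region":  ["North", "North", "North", "South", "South", "West"],
    "quarter": ["Q1", "Q1", "Q2", "Q1", "Q2", "Q1"],
    "product": ["A", "B", "A", "A", "C", "B"],
    "amount":  [120.0, 80.0, 150.0, 90.0, 60.0, 200.0],
})

# High-level view shown to the user first.
print(detail.groupby("region")["amount"].sum())

# User drills into North / Q1: an in-memory filter, not a new query.
north_q1 = detail[(detail["region"] == "North") & (detail["quarter"] == "Q1")]
print(north_q1.groupby("product")["amount"].sum())
```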

This kind of immediate, interactive analysis is particularly important when people are trying to uncover unknown patterns or discover new opportunities. With conventional reporting tools, users typically come with a specific question, such as “What are my sales for this quarter?” With in-memory visualization tools, the question can be less specific and more exploratory: “Show me the data and the patterns in the data.” Another feature of in-memory tools is the ability to perform what-if analysis on the fly. For example, users might want to know how this quarter’s profits would change if the prices of several items were increased or transit times were reduced. A conventional OLAP tool would require the entire database to be recalculated overnight, whereas with in-memory technology the results are computed immediately and held in memory.
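The following sketch, with made-up items and figures, shows what such a what-if calculation can look like when the line items are already in memory: the price change is applied to a copy and profit is recomputed at once.

```python
import pandas as pd

# Hypothetical line items held in memory; names and numbers invented.
items = pd.DataFrame({
    "item":      ["widget", "gadget", "gizmo"],
    "units":     [1200, 800, 500],
    "price":     [9.99, 24.50, 54.00],
    "unit_cost": [6.00, 15.00, 40.00],
})

def profit(df: pd.DataFrame) -> float:
    # Profit = (price - unit cost) * units, summed over all items.
    return float(((df["price"] - df["unit_cost"]) * df["units"]).sum())

baseline = profit(items)

# What if the prices of several items were increased by 5%?
scenario = items.copy()
scenario.loc[scenario["item"].isin(["widget", "gadget"]), "price"] *= 1.05
print(f"baseline: {baseline:.2f}, scenario: {profit(scenario):.2f}")
```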

Many of the popular analytics tools out there, such as QlikView, Tableau, TIBCO Spotfire, SAP HANA and Oracle Exalytics, make use of in-memory processing to varying degrees. These should definitely be looked at as the industry trend, and any custom analytics application should be written to exploit the vast memory available in present-day computers.

Case study on QlikView upgrade

We improved performance and scalability for customers by migrating their existing QlikView installations from version 8.5 to 9.0.