Business intelligence (BI) refers to computer-based techniques used in identifying,
extracting, and analyzing business data, such as sales revenue by product or department, or associated costs and income.
In-memory computing involves processing data stored in computer memory rather than on a hard disk. Because of the speed at which data can be accessed from computer memory, and other technical innovations such as the use of parallel computer processors to carry out data analysis, in-memory computing can analyze data in a fraction of the time it takes with data stored in traditional relational databases on hard disks.
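As a toy illustration (the records and product names below are made up, not from any real system), the kind of query BI tools run all the time, summing sales revenue by product, is trivially fast once the data already sits in memory:

```python
# Minimal sketch: aggregating sales revenue by product entirely in memory.
# Hypothetical sample data for illustration only.
from collections import defaultdict

sales = [
    ("widgets", 120.0),
    ("gadgets", 75.5),
    ("widgets", 30.0),
    ("gizmos", 200.0),
]

revenue_by_product = defaultdict(float)
for product, amount in sales:
    # No disk I/O on the query path: every lookup and update
    # touches RAM only.
    revenue_by_product[product] += amount

print(dict(revenue_by_product))
# → {'widgets': 150.0, 'gadgets': 75.5, 'gizmos': 200.0}
```

In a disk-based relational database the same aggregation would pull pages from the hard disk, which is orders of magnitude slower than a RAM access.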
With the emergence of multi-core processors and the sharp decline in the prices of processors and memory, software makers developed technology that made it possible for even large enterprises to dispense with hard disks and store data and perform all operations in main memory. This boosted performance enormously compared with systems based on retrieving data from hard drives.
The concept of in-memory business intelligence is not new; it has been around for many years. It became widely known only recently because it was not feasible before 64-bit computing became commonly available. Before 64-bit processors, the maximum amount of RAM a computer could utilize was barely 4 GB, which is hardly enough to accommodate even the simplest of multi-user BI solutions. Only when 64-bit systems became cheap enough did it become possible to consider in-memory technology as a practical option for BI.
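The 4 GB figure follows directly from 32-bit addressing: a 32-bit pointer can distinguish at most 2^32 byte addresses. A quick check of the arithmetic:

```python
# A 32-bit address space spans 2**32 distinct byte addresses.
max_bytes_32bit = 2 ** 32
print(max_bytes_32bit // 2 ** 30)   # → 4 (GiB): the ceiling mentioned above

# A 64-bit address space is 2**32 times larger, which is what made
# holding entire multi-user BI datasets in RAM practical.
max_bytes_64bit = 2 ** 64
print(max_bytes_64bit // 2 ** 30)   # → 17179869184 (GiB)
```

In practice, operating systems and hardware reserve part of the 32-bit address space, so the usable limit was often below 4 GB, but the addressing ceiling is the fundamental constraint.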
In-memory computing is driven by two needs:
- The volume of information is growing rapidly, so we need new ways of analyzing these huge amounts of data; traditional approaches take too long.
- Companies are moving from annual budgets and quarterly reviews to instant responses.
Well-known in-memory BI products include:
- QlikView from QlikTech
- PowerPivot from Microsoft
- HANA from SAP