Kognitio Analytical Platform

In-memory analytics company Kognitio will base the pricing of its Analytical Platform solely on the amount of memory employed by the servers hosting the platform. The Kognitio Analytical Platform exploits the speed of data held in memory to answer queries quickly. Data that must be instantly available stays in memory, while the rest can remain on cost-effective standard mechanical hard disks.

The Massively Parallel Processing (MPP) scale-out architecture allows platforms to be built with memory sizes ranging from half a terabyte to hundreds of terabytes. Customers are free to attach as much disk as they like to these systems, insulating them from rising software licence costs as data volumes grow.

Regardless of the amount of data put on a Kognitio Analytical Platform, users will be charged only for what they actually put into memory. For example, if only 10 per cent of a data set the size of the US Library of Congress – around 200 terabytes – were regularly needed, only that portion would reside in memory, and under the new pricing structure the Library would pay only for the 20TB of regularly used data.
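The arithmetic behind the Library of Congress example can be sketched as follows; the per-terabyte rate is a made-up illustrative figure, not an actual Kognitio price:

```python
def licence_cost(total_data_tb: float, in_memory_fraction: float,
                 price_per_tb: float) -> float:
    """Memory-based pricing: charge only for data actually held in memory,
    regardless of how much sits on disk."""
    in_memory_tb = total_data_tb * in_memory_fraction
    return in_memory_tb * price_per_tb

# Library of Congress example: ~200TB total, 10% of it regularly needed.
hot_tb = 200 * 0.10                                  # 20TB resides in memory
cost = licence_cost(200, 0.10, price_per_tb=1000.0)  # hypothetical $1,000/TB rate
print(hot_tb, cost)                                  # 20.0 20000.0
```

The key point of the model is that `total_data_tb` can grow without affecting the charge, since only `in_memory_tb` is billed.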
