
APAX HDF

Samplify, a leading intellectual property company addressing memory, storage, and I/O bottlenecks, has released its APAX HDF (Hierarchical Data Format) Storage Library for high-performance computing (HPC), big data, and cloud computing applications.

With APAX HDF, HPC users can accelerate disk throughput by three to eight times and reduce the storage requirements of their HDF-enabled applications without modifying their application software. The library works with Samplify’s APAX Profiler tool, which analyses the inherent accuracy of each dataset being stored and applies the recommended encoding rate, maximising acceleration of algorithms with no effect on results.

‘Our engagements with government labs, academic institutions, and private data centres reveal a continuous struggle to manage an ever-increasing amount of data,’ said Al Wegener, founder and CTO of Samplify. ‘We have been asked for a simpler way to integrate our APAX encoding technology in big data and cloud applications. By using plug-in technology for HDF, we enable any application that currently uses HDF as its storage format to get the benefits of improved disk throughput and reduced storage requirements afforded by APAX.’
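For readers unfamiliar with the mechanism, the sketch below shows how HDF5’s dynamically loaded filter interface is typically used to attach a compression filter to a dataset; this is the general plug-in route a library such as APAX HDF can take, not Samplify’s actual API. The filter ID, its single parameter, and the dataset layout here are illustrative assumptions (510 sits in the 256–511 range HDF5 reserves for testing; shipping plug-ins use IDs registered with The HDF Group).

    #include "hdf5.h"
    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical filter ID for illustration only. */
    #define EXAMPLE_FILTER_ID 510

    int main(void) {
        const hsize_t n = 1000000;
        hsize_t dims[1]  = { n };
        hsize_t chunk[1] = { 100000 };   /* filters operate per chunk */

        float *data = malloc(n * sizeof *data);
        for (hsize_t i = 0; i < n; i++)
            data[i] = (float)i;

        hid_t file  = H5Fcreate("samples.h5", H5F_ACC_TRUNC,
                                H5P_DEFAULT, H5P_DEFAULT);
        hid_t space = H5Screate_simple(1, dims, NULL);

        /* Chunked layout is required before any filter can be attached. */
        hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 1, chunk);

        /* Attach the filter by ID. At read time HDF5 searches
           HDF5_PLUGIN_PATH for a matching shared library, which is why
           the application code itself never has to change. The OPTIONAL
           flag lets the write succeed (uncompressed) if no plug-in with
           this ID is installed. */
        unsigned cd_values[1] = { 0 };   /* stand-in for an encoding-rate parameter */
        H5Pset_filter(dcpl, EXAMPLE_FILTER_ID, H5Z_FLAG_OPTIONAL, 1, cd_values);

        hid_t dset = H5Dcreate2(file, "samples", H5T_NATIVE_FLOAT, space,
                                H5P_DEFAULT, dcpl, H5P_DEFAULT);
        H5Dwrite(dset, H5T_NATIVE_FLOAT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);

        H5Dclose(dset);
        H5Pclose(dcpl);
        H5Sclose(space);
        H5Fclose(file);
        free(data);
        printf("Wrote samples.h5 through filter %d\n", EXAMPLE_FILTER_ID);
        return 0;
    }

In this scheme a user would simply point the HDF5_PLUGIN_PATH environment variable at the directory containing the vendor’s filter library; existing HDF-based applications then pick up the compression without recompilation, consistent with the no-modification claim above.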
