Cosmological Particle Data Compression in Practice
Workshop: ISAV 2017: In Situ Infrastructures for Enabling Extreme-Scale Analysis and Visualization
Authors: James Ahrens (Los Alamos National Laboratory)
Abstract: In cosmological simulations, trillions of particles are tracked, and each time step generates several terabytes of particle data. Writing this data from memory to disk uncompressed places a massive load on I/O and storage systems. Domain scientists therefore aim to compress the data before storing it while minimizing the loss of information. In this in situ scenario, the time available to compress one time step is limited, so the evaluation of compression techniques has shifted from focusing solely on compression rates to also including throughput and scalability. This study evaluates and compares state-of-the-art compression techniques applied to particle data. For the investigated techniques, quantitative performance indicators such as compression rate, throughput, scalability, and reconstruction error are measured.
Based on these factors, the study offers a comprehensive analysis of the individual techniques and discusses their applicability to in situ compression. From these results, future challenges and directions in the compression of cosmological particle data are identified.
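As a rough illustration of the performance indicators named in the abstract, the sketch below measures compression rate, throughput, and reconstruction error for one synthetic "time step" of particle positions. This is not the paper's methodology: zlib merely stands in for the codecs the study evaluates, and the particle count and array layout are invented for the example.

```python
import time
import zlib
import numpy as np

# Hypothetical setup: one time step of synthetic particle positions.
# The size (1M particles, float32 x/y/z) is made up for illustration.
rng = np.random.default_rng(42)
positions = rng.random((1_000_000, 3), dtype=np.float32)

raw = positions.tobytes()
start = time.perf_counter()
# A fast compression level, reflecting the limited in situ time budget.
compressed = zlib.compress(raw, level=1)
elapsed = time.perf_counter() - start

# Quantitative indicators as described in the abstract.
compression_rate = len(raw) / len(compressed)      # uncompressed / compressed size
throughput_mb_s = (len(raw) / 1e6) / elapsed       # input megabytes per second

# Reconstruction error: zero here, since zlib is lossless; lossy codecs
# evaluated in such studies would report a nonzero error bound instead.
restored = np.frombuffer(zlib.decompress(compressed), dtype=np.float32).reshape(-1, 3)
max_error = float(np.abs(restored - positions).max())

print(f"rate {compression_rate:.2f}x, {throughput_mb_s:.0f} MB/s, max err {max_error}")
```

Scalability, the remaining indicator, would be assessed by running such a measurement across increasing process counts and data volumes rather than on a single buffer.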