Beyond the Delta: Compression is a Must for Big Data

October 08, 2018 / insideBIGDATA

In the era of big data, fast, reliable, affordable, and scalable databases are no longer a luxury. Our friends over at SQream Technologies invest a lot of time and effort into providing their customers with the best performance at scale. To that end, SQream DB (the GPU data warehouse) uses state-of-the-art HPC techniques. Some of these involve adapting existing algorithms to new technological advances; others are home-brewed. Dr. Benjamin C. van Zuiden of SQream wrote a special report, “Beyond the Delta: Compression is a Must for Big Data,” that focuses on the compression algorithms that make big data at scale possible.

In data and signal processing, data compression is the process of encoding information using fewer bits than the original representation. Compression is useful for saving disk space and for reducing the I/O or bandwidth consumed when moving data (e.g., over the internet, or from storage to RAM).
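To make the idea concrete, below is a minimal Python sketch of the delta encoding that the report's title alludes to. It is a generic illustration, not SQream DB's actual implementation: the encoder stores the first value and then only the differences between consecutive values, which stay small (and therefore compress well) for sorted or slowly-changing columns such as timestamps.

    # Minimal sketch of delta encoding: keep the first value, then store
    # only the differences between consecutive values. Slowly-changing
    # columns (timestamps, sequential IDs) yield small, compressible deltas.

    def delta_encode(values):
        """Encode a sequence as [first value, successive differences]."""
        if not values:
            return []
        return [values[0]] + [b - a for a, b in zip(values, values[1:])]

    def delta_decode(deltas):
        """Reverse delta encoding by cumulatively summing the differences."""
        if not deltas:
            return []
        out = [deltas[0]]
        for d in deltas[1:]:
            out.append(out[-1] + d)
        return out

    timestamps = [1696118400, 1696118401, 1696118401, 1696118405]
    encoded = delta_encode(timestamps)   # [1696118400, 1, 0, 4]
    assert delta_decode(encoded) == timestamps

Note how the large, repetitive absolute values become a short run of tiny integers, which a general-purpose compressor (or a bit-packing scheme) can then shrink far more effectively than the raw column.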