Why Data Gravity Will Grow Stronger

January 14, 2019 / Matthew Wallace

The term “data gravity” refers to the tendency of applications and data on a network to attract more applications and data. The idea borrows from Newtonian gravity: the larger the mass, the stronger the attraction. Dave McCrory coined the term in 2010. One of the first things you discover with large data sets is that they are hard to move, and that difficulty, the need for low latency and high throughput, is what drives data gravity.

AWS famously rolled out its Snowmobile service to help customers move up to 100 petabytes of data per truck. It is literally a storage data center in a box, delivered by a semitrailer. Even with a dedicated 10 Gbps connection to the cloud running at full throughput, transferring that much data would take nearly three years. That’s the throughput problem in a nutshell.

Latency is the other half of the problem: applications that access data want that access to be fast. If you have an application that runs in a data center in Chicago and it needs to access...
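
As a sanity check on that figure, here is the back-of-envelope arithmetic, a minimal sketch assuming binary petabytes (2^50 bytes) and a link that stays fully saturated the entire time, which real transfers never do:

    # Transfer time for one Snowmobile's worth of data (100 PB)
    # over a dedicated 10 Gbps link. Assumes binary petabytes
    # (2**50 bytes) and 100% sustained link utilization.

    data_bytes = 100 * 2**50      # 100 PB per Snowmobile truck
    link_bps = 10 * 10**9         # 10 Gbps connection

    seconds = data_bytes * 8 / link_bps
    days = seconds / 86_400
    print(f"{days:,.0f} days (~{days / 365:.1f} years)")
    # -> 1,042 days (~2.9 years)

With protocol overhead, retransmissions, and contention on the link, the real-world number would be worse, which is exactly why trucking the disks wins at this scale.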