No black holes — but a data tsunami

The Large Hadron Collider will produce roughly 15 petabytes (15 million gigabytes) of data annually – enough to fill more than 1.7 million dual-layer DVDs every year!
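The DVD figure is easy to sanity-check. A minimal sketch, assuming a nominal dual-layer DVD capacity of 8.5 GB (the exact capacity CERN used is not stated in the text):

```python
# Back-of-the-envelope check of the "1.7 million DVDs" claim.
# The 8.5 GB dual-layer capacity is an assumed nominal value.
annual_data_gb = 15_000_000   # 15 petabytes expressed in gigabytes
dvd_capacity_gb = 8.5         # nominal dual-layer DVD capacity

dvds_per_year = annual_data_gb / dvd_capacity_gb
print(f"{dvds_per_year:,.0f} DVDs per year")
```

This works out to roughly 1.76 million discs, consistent with the "more than 1.7 million" quoted above.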

Thousands of scientists around the world want to access and analyse this data, so CERN is collaborating with institutions in 33 different countries to operate a distributed computing and data storage infrastructure: the LHC Computing Grid (LCG).

Data from the LHC experiments is distributed around the globe, with a primary backup recorded on tape at CERN. After initial processing, this data is distributed to eleven large computer centres – in Canada, France, Germany, Italy, the Netherlands, the Nordic countries, Spain, Taipei, the UK, and two sites in the USA – with sufficient storage capacity for a large fraction of the data, and with round-the-clock support for the computing grid.

These so-called “Tier-1” centres make the data available to over 120 “Tier-2” centres for specific analysis tasks. Individual scientists can then access the LHC data from their home country, using local computer clusters or even individual PCs…
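The fan-out described above can be summarised in a toy model. This is an illustrative sketch of the tier hierarchy, not the real grid topology; only the site list and counts given in the text are used:

```python
# Toy model of the LCG tier hierarchy: CERN (Tier-0) holds the primary
# tape backup, eleven Tier-1 centres hold large fractions of the data,
# and over 120 Tier-2 centres run specific analysis tasks.
tier0 = "CERN"
tier1 = ["Canada", "France", "Germany", "Italy", "Netherlands",
         "Nordic countries", "Spain", "Taipei", "UK",
         "USA (site 1)", "USA (site 2)"]
tier2_count = 120  # ">120" in the text; exact number not given

print(f"{tier0} -> {len(tier1)} Tier-1 centres -> {tier2_count}+ Tier-2 centres")
```

Scientists' local clusters and PCs sit below the Tier-2 layer, pulling data down the same chain.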

Hopefully, all of this is not orchestrated by Windows servers.