Constructing a grid for billions of pieces of data

CERN has proved particularly significant in the technological world. Two decades ago, Tim Berners-Lee developed the World Wide Web there. Today two Greek scientists, Maria Dimou and Yiannis Papadopoulos, are following in his footsteps. "Berners-Lee just wanted to solve the problem of sharing the huge amount of data produced here with collaborators all over the globe," said Papadopoulos, who is creating applications to analyze the deluge of data produced by the experiment.

As soon as the Large Hadron Collider is switched on, it will produce 2 percent of the world's data. The accelerator's proton collisions will generate 40 billion pieces of information per second from 2,000 particle collisions. Of these, only 100 will be analyzed and recorded every second.

"The processing power needed to analyze all the data does not exist," said Dimou, who is working on a revolutionary computing grid, a supercomputer consisting of thousands of smaller computers around the world. "We send the data to computers all over the world. Ten large computer centers at institutions worldwide access the data and distribute it to computer 'farms'; 150,000 state-of-the-art processors form the largest functioning grid. There is no other way to analyze so much data." The grid will enable researchers to use the stored information much more efficiently.