Big Bang collider experiment rests on power of grid computing

The world's biggest physics experiment has gone live, with scientists from around the world hoping to use grid computing to shed light on the origins of the universe.

The Large Hadron Collider (LHC), which has been under construction for 20 years, will shoot its first beam of protons around a 17-mile, vacuum-sealed loop at a facility on the French-Swiss border. The test run of the largest, most powerful particle accelerator in the world is a precursor to the coming experiments in which scientists will accelerate two particle beams toward each other at 99.9% of the speed of light.

Smashing the beams together will create showers of new particles that should recreate conditions in the universe just moments after its conception.

Today's test run is a critical milestone in getting to that ultimate test. And a worldwide grid of servers and desktops will help the scientific team make sense of the information that they expect will come pouring in.

"This will move the limit of our understanding of the universe," said Ruth Pordes, executive director of the Open Science Grid, which was created in 2005 to support the LHC project. "I'm very excited about the turning on of the accelerator. Over the next two years, our grids will be used by thousands of physicists at LHC to make new scientific discoveries. That's what it's all for."

Pordes said the U.S. portion of the global grid is a computational and data storage infrastructure made up of more than 25,000 computers and 43,000 CPUs. The mostly Linux-based machines are linked into the grid from universities, the U.S. Department of Energy, the National Science Foundation and software development groups. Pordes also said the U.S. grid offers up about 300,000 compute hours a day, with 70% of that going to the particle collider project.
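Taken at face value, those figures put the collider's share at roughly 210,000 compute hours a day. The short sketch below works through that back-of-the-envelope arithmetic using only the numbers quoted above (300,000 hours per day, a 70% share, 43,000 CPUs); the implied utilization figure is an inference from those estimates, not something stated in the article.

```python
# Back-of-the-envelope arithmetic using the figures quoted in the article.
# All inputs are reported estimates, not measured values.
daily_compute_hours = 300_000   # compute hours the U.S. grid offers per day
lhc_share = 0.70                # fraction reported as going to the LHC project
cpus = 43_000                   # CPUs linked into the U.S. grid

lhc_hours_per_day = daily_compute_hours * lhc_share
# Fraction of the theoretical maximum (every CPU busy 24 hours a day)
avg_utilization = daily_compute_hours / (cpus * 24)

print(f"LHC compute hours per day: {lhc_hours_per_day:,.0f}")      # ~210,000
print(f"Implied average CPU utilization: {avg_utilization:.0%}")   # ~29%
```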

Harvey Newman, a physics professor at the California Institute of Technology, told Computerworld that there are about 30,000 servers and more than 100,000 cores around the world hooked into grids that support the LHC project.

"The distributed computing model is essential to doing the computing, storage and hosting of the many Petabytes of data from the experiments," said Newman. "Coordinating data distribution, processing and analysis of the data collaboratively by a worldwide community of scientists working on the LHC are key to the physics discoveries. Only a worldwide effort could provide the resources needed."

"Recommended For You"

CERN says EU data protection laws are hindering cloud adoption Big Bang machine in fresh delay