Big data vital for Large Hadron Collider project: CTO

European Organization for Nuclear Research (CERN) Openlab’s Sverre Jarp says the Collider generated 30 terabytes of data in 2012
The Large Hadron Collider

When you are trying to learn more about the universe with the Large Hadron Collider (LHC), which generated 30 terabytes of data this year, big data technology is vital for analysing that information, says CERN Openlab CTO Sverre Jarp.

Speaking at the Big Data Warehousing and Business Intelligence 2012 conference in Sydney this week, European Organization for Nuclear Research (CERN) Openlab’s Jarp told delegates that physics researchers need to measure electrons and other elementary particles inside the LHC near Geneva, Switzerland.

“These particles fly at practically the speed of light in the LHC so you need several metres in order to study them,” he said. “When these collide, they give tremendous energy to the secondary particles that come out.”
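Why several metres? A rough back-of-the-envelope sketch (the numbers below are illustrative assumptions, not figures from Jarp's talk): a charged particle with transverse momentum $p_T$ in a magnetic field $B$ bends along a circle of radius $R$, following the standard tracking relation

\[ p_T\,[\mathrm{GeV}/c] \approx 0.3\, B\,[\mathrm{T}]\, R\,[\mathrm{m}]. \]

For an assumed $p_T = 100\ \mathrm{GeV}/c$ track in a $4\ \mathrm{T}$ field, $R \approx 83\ \mathrm{m}$, so the track is nearly straight, and over a chord of length $L$ the measurable bow (the sagitta) is only

\[ s \approx \frac{L^2}{8R}, \]

about $1.5\ \mathrm{mm}$ for $L = 1\ \mathrm{m}$ but roughly $14\ \mathrm{mm}$ for $L = 3\ \mathrm{m}$. Because the sagitta grows with the square of the lever arm, a detector spanning several metres makes these near-light-speed particles far easier to measure.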
