Big data vital for Large Hadron Collider project: CTO
- 26 November, 2012 16:22
The Large Hadron Collider
When you’re trying to learn more about the universe with the Large Hadron Collider (LHC), which generated some 30 petabytes of data this year, big data technology is vital for analysing that information, says CTO Sverre Jarp.
Speaking at the Big Data Warehousing and Business Intelligence 2012 conference in Sydney this week, Jarp, of the CERN (European Organization for Nuclear Research) openlab, told delegates that physics researchers need to measure electrons and other elementary particles inside the LHC at Geneva, Switzerland.
“These particles fly at practically the speed of light in the LHC so you need several metres in order to study them,” he said. “When these collide, they give tremendous energy to the secondary particles that come out.”