iVEC acquires Fornax supercomputer
- 28 September, 2011 11:30
Australian academic supercomputing consortium iVEC has acquired another major supercomputer, Fornax, to be based at the University of Western Australia (UWA), to further the country’s ability to conduct data-intensive research.
The SGI GPU-based system, also known as iVEC@UWA, is made up of 96 nodes, each containing two 6-core Intel Xeon X5650 CPUs, an NVIDIA Tesla C2050 GPU, 48GB of RAM and 7TB of storage. All up, the system has 1152 cores, 96 GPUs and an additional dedicated 500TB fabric-attached storage (FAS)-based global filesystem.
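The aggregate figures follow directly from the per-node specification; a quick sanity check of the quoted totals (node-local storage is not stated as a total in the article, so that line is simply the implied sum):

```python
# Sanity-check Fornax's aggregate figures from the per-node spec:
# 96 nodes, each with two 6-core CPUs, one GPU and 7TB of local disk.
nodes = 96
cores_per_node = 2 * 6   # two 6-core Intel Xeon X5650 CPUs
gpus_per_node = 1        # one NVIDIA Tesla C2050
local_tb_per_node = 7    # node-local storage, in TB

total_cores = nodes * cores_per_node
total_gpus = nodes * gpus_per_node
total_local_tb = nodes * local_tb_per_node

print(total_cores)     # 1152 -- matches the quoted core count
print(total_gpus)      # 96   -- matches the quoted GPU count
print(total_local_tb)  # 672  -- implied node-local storage, in TB
```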
According to iVEC, the machine also features two InfiniBand networks with the first allowing each node to access the global filesystem and the second allowing nodes to access the local disks on neighbouring nodes.
A combination of Cisco Nexus 7009 and Cisco Nexus 5548 switches, along with passive 8-channel coarse and dense wave division multiplexers providing multiple 1Gbps and 10Gbps connections, is used for networking.
The purchase of Fornax follows the acquisition of a Linux-based high performance computing system, known as iVEC@Murdoch, earlier this year and housed at Murdoch University. The system underpins research in nanotechnology, marine science, bioinformatics, resources and radio astronomy fields. The machine is capable of speeds of up to 100 teraflops. The Fornax machine is capable of about 40 teraflops.
According to Pawsey Centre systems architect, Guy Robinson, the ability to quickly and effectively move data around the machine was a central requirement for Fornax’s users.
“One of the things we are looking at doing with this machine is experimenting more with the moving of data around the machine and inputting it, doing some work, and outputting it,” he told Computerworld Australia.
“In comparison, the Murdoch system has less storage available per node. The 7TB per node [of Fornax] allows a data set to be loaded on each node and worked on quite intensively.”
Robinson said the networking — currently a mixture of connections to the Australian Academic Research Network (AARNet) and its own dedicated links — was also a challenge due to iVEC’s distributed machines and users.
“One of the interesting things about how iVEC is set up is that it is a partnership between the main universities in Western Australia, and that has led to all of our sites being distributed across various universities,” Robinson said.
“There are also other universities who are partners even though they don’t have machines based at their sites. There are also national and international users. That leads directly to the challenges of moving data backwards and forwards between the machines and storing it.
“We will also have to upgrade all our connections to 10gig for ASKAP (the Australian Square Kilometre Array Pathfinder telescope). We need that to keep users happy. The faster the network can go, the faster researchers can work with the data. The ability to churn through it and use it as an interactive, or near interactive, resource can allow you to explore the data.”
iVEC is a joint venture between CSIRO and Murdoch University, Edith Cowan University, Curtin University of Technology and the University of Western Australia.
Robinson would not reveal the exact cost of the new Fornax machine, but said that whereas the initial purchase price of supercomputers was once the biggest expense, powering and cooling supercomputers was now the major cost.
“We have been looking at a lot of novel solutions [to energy costs],” he said. “If you look around the world, how people address that often depends on their local environment. People look at local lakes for cooling or groundwater cooling. Some people look for local industries who might like a couple of megawatts of hot water.
“We have a whole range of things. In the Perth area we are looking at the amount of time we can use free-air cooling and some groundwater solutions such as shallow water and deep water aquifers.”
Robinson said iVEC had also recently doubled the performance of its storage, which is used, among other things, to support an archive and library resource for data sets from multiple fields of science.
The petabyte-sized library allows national and international researchers from fields such as astronomy, climate science, and biotechnology to bank and borrow data.
In light of researchers’ need to not only store data but to be able to find data relevant to their field, iVEC had also been working on improving the search function of its data library, Robinson said.
“A lot of the e-research activity ongoing in iVEC is actually about the management of, and setting up of, data repositories so that people in a different, or similar, discipline can find the items in your data set without having to ask you all the time,” he said.
“It’s making sure data is tagged correctly, can be found by search engines, can have things extracted from it without having to learn a whole new suite of software tools.”
Earlier this month, PC users around the world were asked to contribute spare CPU cycles as part of the theSkyNet project to further the science of radio astronomy.
In May the Commonwealth Scientific and Industrial Research Organisation (CSIRO) said it would begin the procurement phase of its multi-stage petascale and real-time supercomputing Pawsey Centre project.
The Pawsey Centre would act as a real-time facility for the processing, storage and analysis of data products from the Murchison Radio-astronomy Observatory (MRO) and other data used by researchers in the fields of geoscience, nanotechnology and biotechnology.
Follow Tim Lohman on Twitter: @Tlohman
Follow Computerworld Australia on Twitter: @ComputerworldAU