The University of Sydney has launched its first high-performance computing service, aimed at helping researchers leverage existing data for research and cross-disciplinary collaboration.
Artemis, which has been developed by the university in partnership with Dell Australia, consists of 1,512 cores of compute capacity, 56 standard compute nodes, two high memory compute nodes, and five GPU compute nodes.
NHMRC Australia fellow professor Edward Holmes from the Charles Perkins Centre, who first proposed the idea of an HPC service, said he wanted to give researchers access to unified high-performance computing, something the university had never had before.
"I realised, although there was a lot of talent, we were deficient in the computing capabilities to analyse data. So I started fact-finding what we needed to put that right," he said.
Prior to the launch of Artemis, Holmes said, research data at the university was highly fragmented.
"Some people had access to data in computers elsewhere, and it was very much a fragmented system. You could still do good work, but you had to be in the know to get things done," he said.
He added that since the installation of Artemis, researchers at the university can now complete tasks at least 10 times faster than they could before. For instance, as part of his own research, Holmes is now able to examine the DNA sequences of potential Ebola outbreaks in real time.
Artemis is expected to serve as a key tool for researchers in fields including molecular biology, economics, mechanical engineering, and physical oceanography.
"The HPC solutions will enable researchers to perform complex calculations to provide fast and broad data analysis," said John McCloskey, enterprise general manager at Dell ANZ.
"HPC is a highly effective way to help analyse complex data, and it's exciting to see it used in research that could potentially impact the world we live in."