Network, storage are key components in supercomputers

Sun exec says network and storage are fast becoming key considerations in improving the performance of supercomputers.
Written by Lynn Tan @ Redhat, Contributor

Network and storage are just as critical as processor speed in boosting supercomputer performance, according to a senior executive at Sun Microsystems.

"It's not only important to make the CPU (central processing unit) [run] faster, it's [also] important to make the network faster [and] it's important to make the storage faster," Marc Hamilton, Sun's director of technology of high-performance computing, said in an interview with ZDNet Asia.

Hamilton explained that the demand for new storage is "just overwhelming". In fact, CERN, the world's largest particle physics laboratory, based in Switzerland, faced a massive storage challenge.

"[CERN researchers] actually generate so much data that there's not enough [storage capacity] in all the computers in Switzerland," he said. They then started working with universities all over the world, sending pieces of data to each of these institutions to process. "CERN, with their new experiment that's [in the pipeline] this year, will generate 15 million gigabytes of [data] a year," he added.

Hamilton said: "One of the trends in supercomputing [is that] networking and storage requirements will become more and more important." Rather than focus simply on the speed of the CPU, users will start looking at how fast the network is running, he said.

This same trend swept through the computing space eight years ago, when personal computers gained the ability to deliver computing power similar to that of mainframes. Hamilton likened today's large storage subsystems to those mainframes.

Asia's largest supercomputer
The largest supercomputer in the region, according to Hamilton, is housed at Japan's Tokyo Institute of Technology. Dubbed the TSUBAME Grid Cluster, the system was installed by Sun last year and ranks as the world's ninth fastest supercomputer on the Top 500 project's list. The Top 500 project ranks the world's fastest supercomputers using the Linpack benchmark, which measures performance in solving systems of linear equations.
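The Linpack benchmark mentioned above times how quickly a machine solves a dense system of linear equations (Ax = b). As a rough illustration only (real Linpack runs use highly tuned parallel libraries such as HPL, not pure Python), the core computation can be sketched as a toy Gaussian-elimination solver:

```python
def solve_linear_system(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.

    A is a list of rows, b a list of values; returns the solution x.
    This is an illustrative sketch of the Linpack-style workload,
    not the benchmark's actual implementation.
    """
    n = len(A)
    # Build the augmented matrix [A | b] so we can eliminate in place.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]

    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in
        # this column to keep the arithmetic numerically stable.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate this column from all rows below the pivot row.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]

    # Back substitution on the now upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

# Tiny 2x2 example: 4x + y = 1, x + 3y = 2.
x = solve_linear_system([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
```

Linpack scores (in floating-point operations per second) come from running exactly this kind of solve at very large matrix sizes, which is why both processor speed and the memory/network subsystems feeding it matter.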

Hamilton said that traditionally, the three big players in supercomputing were the United States, Japan and Western Europe, and Japan housed the largest supercomputer in the world from the late 1990s to 2004.

"[Japan] was ahead of the United States in [the supercomputing] space, but it dropped off [the supercomputing curve around] 2004," he said. "The reason [for that was] because Japan had focused on working with Japanese vendors [that] were building custom computers out of their own computer chips and were not keeping up with the commoditization of processors."

"Every year, as [commoditized] microprocessors that go into supercomputers get faster [and vendors] build faster systems, Japan was actually [starting to] fall off that curve," Hamilton said. "It's more and more difficult for specialized companies, such as the Japanese companies, to build their own supercomputers [based on] their own chips."

"Today, over half of the supercomputers in the world are built with AMD or Intel processors--the same processor [that] you have in your PC or laptop," he said.
