
Larry Smarr: Pumping the Net up for gigabyte images

Written by Dan Farber

Larry Smarr believes that the emerging Internet information grid is going to be far more pervasive than the electric power grid is today. He is the Harry E. Gruber Professor in the Department of Computer Science and Engineering at UCSD and director of the California Institute for Telecommunications and Information Technology, and in 1985 he founded the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

In my audio interview with Smarr (available as a downloadable MP3; if you're already subscribed to ZDNet's IT Matters series of audio podcasts, it will show up on your system or MP3 player automatically. See ZDNet's podcasts: How to tune in), he discussed his vision of the meta-computer and its intersection with grid computing and Web services; the evolution of data-intensive science and the Internet; and how overbuilding by the telecoms during the dotcom era created fiber optic network capacity that is now being leased to deliver the bandwidth required to study large data objects, such as high-definition images of several hundred million pixels.

For example, medical images pumped over a dedicated 10-gigabit fiber optic network can be displayed through custom software on a large array of tiled LCD screens driven by Linux clusters. "If the PC has a limited number of pixels, it fundamentally limits anything they can do with data that's out on the Net," Smarr said. "We've got to put the Net on steroids."
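
To make the scale concrete, here is a rough, purely illustrative back-of-envelope sketch; the image size, pixel depth, panel resolution, and link speeds are assumptions for the sake of the arithmetic, not figures from the interview. It estimates how long one several-hundred-megapixel image takes to move over different links, and roughly how many commodity LCD panels a tiled wall needs to show it at full resolution.

```python
# Illustrative, assumed numbers (not from the article):
MEGAPIXELS = 400                 # "several hundred million pixels"
BYTES_PER_PIXEL = 3              # uncompressed 24-bit RGB
PANEL_W, PANEL_H = 1600, 1200    # resolution of one tiled LCD panel

image_bits = MEGAPIXELS * 1_000_000 * BYTES_PER_PIXEL * 8

# Transfer time for one uncompressed image over each link.
for name, bits_per_sec in [("100 Mb/s office LAN", 100e6),
                           ("1 Gb/s link", 1e9),
                           ("10 Gb/s dedicated lambda", 10e9)]:
    print(f"{name:>26}: {image_bits / bits_per_sec:6.1f} s per image")

# Panels needed to display every pixel without downsampling.
panels = (MEGAPIXELS * 1_000_000) / (PANEL_W * PANEL_H)
print(f"Panels needed for full resolution: ~{panels:.0f}")
```

Under these assumptions a single image is roughly a gigabyte and a half on the wire: about a minute and a half on a 100 Mb/s connection, but around a second on a dedicated 10 Gb/s lambda, which is the gap Smarr's "Net on steroids" argument is about.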
