Larry Smarr: Pumping the Net up for gigabyte images

Summary: Larry Smarr believes that the emerging Internet information grid is going to be far more pervasive than the electric power grid is today.

Larry Smarr believes that the emerging Internet information grid is going to be far more pervasive than the electric power grid is today. He is the Harry E. Gruber Professor in the Department of Computer Science and Engineering at UCSD and director of the California Institute for Telecommunications and Information Technology, and in 1985 he founded the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

In my audio interview with Smarr (available as an MP3 that can be downloaded or, if you’re already subscribed to ZDNet’s IT Matters series of audio podcasts, will show up on your system or MP3 player automatically; see ZDNet’s podcasts: How to tune in), he discussed his vision of the meta-computer and its intersection with grid computing and Web services; the evolution of data-intensive science and the Internet; and how overbuilding by the telecoms during the dotcom era created fiber-optic network capacity that is now being leased to deliver the bandwidth required to study large data objects, such as high-definition images of several hundred million pixels. For example, medical images pumped over a dedicated 10-gigabit fiber-optic network can be displayed through custom software on a large number of tiled LCD screens driven by Linux clusters. "If the PC has a limited number of pixels, it fundamentally limits anything they can do with data that's out on the Net," Smarr said. "We've got to put the Net on steroids."
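To make the tiled-display idea concrete, here is a minimal, hypothetical Python sketch (not Smarr's actual software; the image size and wall geometry are invented for illustration) of how a several-hundred-megapixel image could be carved into per-panel regions, so that each Linux cluster node fetches and renders only its own tile over the fast network:

```python
# Hypothetical sketch of how a tiled display wall might divide one very large
# image across its panels; the tile size and wall geometry are illustrative.

def tile_regions(image_w, image_h, cols, rows):
    """Yield (col, row, x0, y0, x1, y1) pixel regions, one per display panel."""
    tile_w = image_w // cols
    tile_h = image_h // rows
    for row in range(rows):
        for col in range(cols):
            x0, y0 = col * tile_w, row * tile_h
            # The last column/row absorbs any remainder pixels.
            x1 = image_w if col == cols - 1 else x0 + tile_w
            y1 = image_h if row == rows - 1 else y0 + tile_h
            yield col, row, x0, y0, x1, y1

# A 300-megapixel image (20,000 x 15,000 pixels) spread over a 5 x 4 wall of LCDs:
for col, row, x0, y0, x1, y1 in tile_regions(20_000, 15_000, cols=5, rows=4):
    # In a real system each Linux cluster node would pull only its region
    # over the high-speed network and render it on its attached panel.
    print(f"panel ({col},{row}) shows pixels [{x0}:{x1}, {y0}:{y1}]")
```

The point of the split is bandwidth: each node moves only its fraction of the pixels, which is what makes a dedicated multi-gigabit link to the display cluster worthwhile.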

Talkback

  • We who? Who has to do this?

    I mean, it's easy to say something like this, but who is supposed to do it or pay for it? It seems like a ton of cash goes into the Internet every year as it is.
    No_Ax_to_Grind
  • Web-based everything is the ultimate goal.

    Who will need or want a PC with a net cost of $2000 for PC, monitor, and shrink-wrapped apps when paying a monthly extortion - whoops, usage - fee for web-based applications and CPU time will exterminate piracy for the corporations? (Though they'll still allow China and India to freely take what they want; I mean, how else can they educate themselves to the point where we can exploit them? Until the student outwits the master, of course...)

    Microsoft, et al, will be rolling in dough for years. Pity the dollar is destabilized and set to crash if things get worse, but nobody's perfect...

    Of course, broadband right now is $50/month for 1.5 Mbps down and 1.0 Mbps up. That's just to access Internet1 at fast speeds (unless Internet2 is going to phase out Internet1...). So any monthly cost is either going to be very high as a flat rate, or pro-rated (which would be pointless).
    HypnoToad
  • How Robust is the Internet?

    I was concerned to read the other day that the vast majority of Internet bandwidth crosses the Mississippi over two bridges. If those were hit by terrorists, there would be a huge east-west breakdown of Internet traffic. The same article said that there is a single (unidentified) building in New York that controls a huge portion of all east coast traffic. Again, if this were hit by terrorists, much of the east coast Internet would be out for months.
    These things make me wonder how much we can bet our company on the Internet. We do an awful lot over the Net these days, but as we develop more robust disaster recovery plans, we are seriously re-thinking our strategy.
    Any thoughts on this?
    ted15
    • Well...

      It is both more fragile and more robust than you'd think. I would say it is extremely hard to cripple the internet through physical destruction. A major node going down can cause slowdowns, but packets will just find another path. At the same time, large sections of poorly maintained networks can go down, depriving people of access. Comcast comes to mind, with their outages affecting many Northeast users. But even then, that was a fixable software problem with their DNS and had little to do with hardware.

      My thought is that the net is not very prone to hardware failure but is rather vulnerable to software problems. Trying to bomb the internet would be like trying to kill a worm by cutting it in half. It'll just regrow.
      Zinoron
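As a rough, hypothetical illustration of the "packets will just find another path" point in the comment above (the routers and links below are invented, not a map of any real backbone), a few lines of Python show a route being recomputed around a failed node, and the network only partitioning when every crossing is gone:

```python
# Toy illustration: remove a node from a small made-up network graph
# and watch traffic route around it.
from collections import deque

def shortest_path(graph, src, dst, down=frozenset()):
    """Breadth-first search that ignores any routers in the 'down' set."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# Hypothetical backbone with two independent east-west crossings.
net = {
    "east": ["stlouis", "memphis"],
    "stlouis": ["east", "west"],
    "memphis": ["east", "west"],
    "west": ["stlouis", "memphis"],
}
print(shortest_path(net, "east", "west"))                              # via stlouis
print(shortest_path(net, "east", "west", down={"stlouis"}))            # reroutes via memphis
print(shortest_path(net, "east", "west", down={"stlouis", "memphis"})) # None: partitioned
```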