Using Cell: a modest proposal

IBM's Cell processor is absurdly fast now, and the rewards to making it smaller are disproportionate to current performance because its grid-on-a-chip design is fundamentally all about reducing the power and time costs of distance. In other words, when they hit 32nm manufacturing they can expect to gain another order of magnitude in performance - and it's already the fastest thing around: nearly two orders of magnitude better than Intel on floating point and ten times faster on character pushing.

So what to do with all that power? Well, I have an idea....

Video conferencing continues to disappoint. Even people willing to fork over $12,000 a month in communications fees at each end of a link anchored by HP's $500,000 split-room system aren't happy, and Microsoft's PC offerings are, well, pathetic - but it doesn't have to be that way.

In fact we've traditionally gone about this backwards - attempting to reduce an enormous amount of visual and aural data in order to facilitate transmission and display. With Cell we can reverse this: build a display client that accepts a couple of still images of the source environment and then animates that imagery according to instructions sent from the server.
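The arithmetic behind reversing the pipeline is striking. As a rough sketch - frame sizes and parameter counts below are illustrative assumptions, not measurements - compare pushing compressed frames against pushing animation instructions for a locally held still image:

```python
# Rough bandwidth comparison: streaming video frames vs. streaming
# animation parameters against a still image the client already holds.
# All numbers are illustrative assumptions, not measurements.

FPS = 30

# Traditional approach: push a compressed frame for every tick.
jpeg_frame_bytes = 40_000                  # a modest 640x480 JPEG
video_bps = jpeg_frame_bytes * 8 * FPS     # bits per second

# Proposed approach: push animation parameters only.
# Assume ~70 facial/gesture parameters per frame, 2 bytes each.
params_per_frame = 70
param_bytes = params_per_frame * 2
param_bps = param_bytes * 8 * FPS

print(f"frame streaming : {video_bps / 1e6:.1f} Mbit/s")
print(f"parameter stream: {param_bps / 1e3:.1f} kbit/s")
```

Even with generous assumptions the parameter stream fits in a sliver of a DSL line, leaving room for compressed audio - the display-side computation is where the real cost moves, which is exactly where Cell's strength lies.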

Remember X-Windows vs. NeWS? X won because Adobe wouldn't release Display PostScript royalty-free, but it was brain-damaged from the get-go because it transmitted bitmaps where the Network extensible Window System transmitted only the information needed for PostScript to draw those bitmaps on the display device.

We can do the same with teleconferencing - build a display client that combines a few initial JPEGs, a compressed sound file, and the Cell processor's awesome computing power to generate lifelike imagery, reproducing facial movement, hand gestures, and seat squirming in real time at the display end at a two-way communications density achievable with nothing more than a 1MB DSL link.
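In outline, the display client's job is simple. The sketch below is hypothetical - `Packet`, `warp`, and `run_client` are names I'm inventing for illustration, and a real implementation would replace the warp placeholder with a Cell-tuned rendering kernel:

```python
# Sketch of a parameter-driven conferencing client.
# warp() stands in for the Cell-accelerated image-deformation kernel;
# in practice it would reshape the reference stills per the parameters.

from dataclasses import dataclass

@dataclass
class Packet:
    params: list        # facial/gesture parameters for this frame
    audio: bytes        # compressed audio chunk for this frame

def warp(reference_image, params):
    # Placeholder: deform the still image according to the parameters.
    # A real kernel would run on the Cell's SPEs.
    return reference_image

def run_client(reference_image, packets):
    """Render one frame per incoming parameter packet."""
    frames = []
    for pkt in packets:
        frames.append(warp(reference_image, pkt.params))
        # decode and play pkt.audio in sync here
    return frames
```

The point of the sketch is the division of labor: the server sends only instructions, and all the heavy lifting - warping, shading, lip-sync - happens on the client, which is exactly the workload profile Cell was built for.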

There's code out there now, in the games and movies industries, that will do most of what's needed - and which adequately demonstrates technologies like those needed to keep sound and shading realistic - but, of course, there's the question of who's going to pay for the initial R&D needed to make it happen commercially.

And that's the crux of my modest little proposal. A surprising amount of the innovation we've seen on the web in the last ten years has been driven by the porn industry. From their perspective, a couple of million Sony PlayStations in the hands of mostly male teenagers is a market to conjure with - and it's a short step from video conferencing with real people at both ends to on-demand visual and aural stimulation through imaginary interactions controlled, at the seller's end, via a server and delivered to the customer via a PS3 "conferencing" client.