Light Peak: black hole

Light Peak is a high-speed optical I/O interconnect - starting at 10 Gbit/sec and scaling to 100 - whose parents are three of the biggest I/O screwups in high tech: Sony, Apple and Intel. Can you spell doomed?
Written by Robin Harris, Contributor

And that is too bad for us, because Light Peak - or something like it - is a Very Good Thing.

The need

The Moore's Law-driven merry-go-round of CPU, storage and bandwidth growth needs another push - this time from bandwidth. USB 3.0 claims 5 Gbit/sec of bandwidth - the way USB 2.0 claims 480 Mbit/sec, HA! - but leading-edge consumers are already getting that from eSATA.

For these folks, USB 3.0 is too little, too late.
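
To put numbers on it, here's a quick back-of-the-envelope sketch in Python. The efficiency fractions are my own rough assumptions - real links never deliver their full signaling rate - not benchmarks:

    # Rough transfer times for a 25 GB file (a Blu-ray-sized image) over
    # various links. Efficiency fractions are assumptions, not measurements.
    size_bits = 25 * 8e9  # 25 decimal gigabytes, in bits

    links = {
        "USB 2.0 (480 Mbit/sec claimed)": (480e6, 0.6),
        "eSATA (3 Gbit/sec)":             (3e9,   0.8),
        "USB 3.0 (5 Gbit/sec claimed)":   (5e9,   0.6),
        "Light Peak (10 Gbit/sec)":       (10e9,  0.8),
    }

    for name, (rate, efficiency) in links.items():
        minutes = size_bits / (rate * efficiency) / 60
        print(f"{name}: {minutes:.1f} minutes")

At those assumed efficiencies, the 25 GB copy drops from roughly 12 minutes on USB 2.0 to under half a minute on Light Peak - and eSATA already lands within shouting distance of USB 3.0.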

But USB 3.0 faces a bigger problem: stricter EMI regulations. The 1 GHz-and-above band has been lightly regulated, but as more services - Wi-Fi among them - move into that range, standards groups are clamping down.

Each USB 3.0 cable and port is a gigahertz radio broadcasting in your living room. The cables can be shielded, but the emission problem will only get worse as devices proliferate.

That's where Light Peak comes in. An optical interconnect cable doesn't radiate EMI. It's immune to problems like ground loops that can afflict copper cables. High-speed optical interconnects are cleaner.

Every interconnect takes up device space and power, requires driver updates and leaves customers with the limitations of older technology. Yet the longer a popular interconnect ships, the better its drivers can be tuned for both reliability and performance.

The players

Intel, Sony and Apple hope to make Light Peak a universal optical interconnect. Intel has been driving hard to reduce the cost of the optical transceivers and has made tremendous progress. Apple wants to limit the number of interconnects on its systems. And Sony and the rest of Hollywood dream of a consumer paradise where we spend all day watching carefully DRM'd 3D super-high-def video on an array of costly displays, speakers and servers.

And everyone wants a physically small and fast I/O port for mobile devices.

Sadly, none of them get I/O.

  • Apple. Arguably the worst offender, Apple's penchant for tiny I/O ports - mini-VGA, mini-DVI, mini-DisplayPort and the ghastly ADC - has inflicted more pain on Mac users than even FireWire's high license fees.
  • Intel. Who remembers that Intel pushed InfiniBand to replace PCI? Great I/O tech with no sense of market realities like cost.
  • Sony. They've pooched Blu-ray. Last consumer hits: PS2, Walkman and Trinitron. And they love DRM.

Intel claims all Light Peak components will ship next year. But they need a launch customer, preferably one with a large following among bandwidth-intensive users such as scientists and video pros. Apple fits the bill.

The consumer wild card

Consumer acceptance - even from Apple fans - is not a given. One big issue: the cost of critical bits like hubs, disk interfaces and cables.

When USB was new, the extra bits were costly too. The key is how fast Intel can drive optical chip prices down. They tend to be way optimistic - so I'm not.

The Storage Bits take

Light Peak is a great idea, and doomed. Between obnoxious DRM, costly optical hubs and switches, Blu-ray-style licensing fees, Intel over-engineering and Apple's penchant for twee little I/O ports, Light Peak is almost certain to fail.

Which is too bad. USB 2.0 shows the value and power of a mature general-purpose interconnect. Plug in a thumb drive, digital camera, media player, disk drive, scanner, printer, hub or rocket launcher - and they just work.

That takes years, so an interconnect designed to scale to 100 Gbit/sec as Moore's Law drives down interface costs is a Very Good Thing. Let's hope our corporate overlords surprise us this once.
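
How fast could it get there? A toy calculation, assuming - and it's purely an assumption - a Moore's-Law-style doubling of link bandwidth every two years:

    from math import log2

    # Assumed cadence: bandwidth doubles every 2 years (illustrative, not a roadmap).
    start, target = 10, 100           # Gbit/sec
    years_per_doubling = 2
    doublings = log2(target / start)  # about 3.3 doublings
    print(f"~{doublings * years_per_doubling:.1f} years")  # prints "~6.6 years"

Call it seven years, give or take - which is exactly why the headroom has to be designed in from day one.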

Comments welcome, of course. How about giving thanks this week for driver writers whose hard work is among the least appreciated in all of techdom?
