The Large Hadron Collider (LHC), under construction outside Geneva, is scheduled to come online this summer. Run by CERN, the European Organization for Nuclear Research, the collider will be the largest facility of its kind, designed to detect new subatomic particles, generate previously unseen nuclear phenomena, possibly create microscopic black holes, and perhaps even reveal evidence of extra dimensions.
While this is all very cool, what does it have to do with the Internet? Because of the nature of the physics being explored, terabytes of data must be analyzed to "see" what is happening inside the LHC. These data are being transmitted over Internet2 and DANTE, the second-generation high-speed academic networks in the US and Europe, respectively. The data are stored at several facilities in the US and Europe and then made available in near-real time to other academic institutions such as Caltech and UCSD.
This will, in fact, serve as a major test of the capacity of these networks (largely leased dark fiber) and of the networks at the universities themselves. As Dai Davies of the European DANTE network explained to Ars Technica:
The biggest barrier, however, is simply the challenge of getting things to work properly. "This isn't exactly plug-and-play," Davies said, noting that even in cases where high-speed connections are in place, it's often the case that "you connect it to the computer systems at either end, and it doesn't work anymore."
Because LHC data travel over a hybrid of dedicated research lines and ordinary packet-switched lines, considerable human intervention is required to manage the flow. As CERN data begin pouring over the network and other researchers begin to take advantage of the serious capacity afforded by DANTE and Internet2, the focus will turn toward automating much of the process, ultimately allowing these technologies to trickle down to businesses and consumers.