
Between the cloud and the corporate data center, there is fog computing

OpenFog proponents say computing should take place across networks, not just in faraway data centers.
Written by Joe McKendrick, Contributing Writer

As cloud computing and the Internet of Things have taken hold of corporate imaginations, an underlying assumption has come with them: vast numbers of devices will feed raw data to centralized enterprise or cloud-based data centers for storage and analysis. Hence the concern about how quickly enterprises and cloud companies can build or rent the massive amounts of horsepower and storage needed to handle these growing workloads.


A new consortium, however, thinks things will fall out differently. Rather than devices generating tons of data that are shipped to centralized locations, the data will be handled and applications will run somewhere on the network, in a highly distributed manner -- as "fog computing," which may sit anywhere along the spectrum from centralized data centers to the edge of the network, or at points in between.

Fog computing technology distributes the resources and services of computation, communication, control, and storage closer to devices and systems at or near the users. A fog computing architecture performs analytics or runs applications anywhere from the network center to the edge -- wherever it makes the most sense, economically and technically.
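To make that placement decision concrete, here is a minimal sketch of the idea -- not drawn from any OpenFog specification; the class names, latency figures, and decision rule are all illustrative assumptions -- in which a fog node runs a task at the edge whenever handing it to the cloud would miss the task's deadline:

```python
# A minimal, hypothetical sketch of fog-style task placement -- not from any
# OpenFog specification. A fog node runs work at the edge when a cloud round
# trip would miss the deadline; otherwise it lets the cloud handle the work.
# All names and figures here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Task:
    payload_kb: float   # size of the raw data the task would ship upstream
    deadline_ms: float  # how quickly a result is needed

class FogNode:
    def __init__(self, cloud_round_trip_ms=120.0, uplink_kb_per_ms=1.0):
        self.cloud_round_trip_ms = cloud_round_trip_ms  # assumed WAN latency
        self.uplink_kb_per_ms = uplink_kb_per_ms        # assumed uplink rate

    def place(self, task: Task) -> str:
        # Time to hand the task to the cloud: payload transfer plus round trip.
        cloud_ms = self.cloud_round_trip_ms + task.payload_kb / self.uplink_kb_per_ms
        return "edge" if cloud_ms > task.deadline_ms else "cloud"

node = FogNode()
print(node.place(Task(payload_kb=4, deadline_ms=50)))     # -> edge (tight deadline)
print(node.place(Task(payload_kb=512, deadline_ms=2000))) # -> cloud (deadline is loose)
```

A real deployment would also weigh node load, bandwidth cost, and data sensitivity, but the trade-off is the same: compute lands wherever it makes the most sense.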

The notion of fog computing was first promulgated by Cisco, which, as a network provider, would have a natural stake in seeing more intelligence going out across the network. Now, Cisco has been joined by several other companies -- as well as an Ivy League university -- in promoting the idea of fog computing.

The OpenFog Consortium was launched in November by Cisco, along with ARM, Dell, Intel, Microsoft Corp., and the Princeton University Edge Laboratory. The consortium's stated goal is to accelerate the deployment of "fog" technologies, mainly by facilitating the growth and promise of IoT. The founding members will build initial frameworks and architectures that reduce the time required to deliver end-to-end IoT scenarios.

Additional organizations lending support to the consortium since the initial launch include Arizona State University, FogHorn Systems, Fujitsu, GE Digital, Georgia State University, IEEE, MARSEC Inc., National Chiao Tung University, Nebbiolo Technologies, PrismTech, Real-Time Innovations, Schneider Electric, and Toshiba.

As part of my work with RTInsights.com, I had the opportunity to learn more about the consortium's work through Helder Antunes, senior director of Cisco's Corporate Strategic Innovation Group (and chairman of the consortium), as well as Matt Vasey, director of IoT business development at Microsoft and the consortium's VP of business development and treasurer.

Antunes and his colleagues hope to build a thriving ecosystem of developers who build solutions based on the OpenFog model. These members will come from all over the place, "from technology suppliers, academia as well as technology users -- ISPs, auto companies, robotics." Such a mix of tech and traditionally non-tech enterprises "will provide an environment where the different perspectives and requirements will foster innovation and wide interoperability within the industry." The application directory or online marketplace that comes out of this may resemble Apple's iTunes store, he adds.

Ultimately, today's cloud developers may become part of the ecosystem, as "fog computing is actually the continuum of cloud towards the edge of the network," Antunes pointed out. The OpenFog Consortium will also encourage the creation of open APIs at several levels of the fog software stack, along with SDKs to help developers produce software packages, Antunes says.

I asked Antunes whether fog computing architectures may actually introduce more complexity into cloud and IoT networks, since intelligence and capacity get inserted at various points, which could even be geographically separated and which must also be financed and maintained. While some increase in complexity is inevitable, fog computing is the best way to address the inadequacies of cloud-only models, which rely on faraway, centralized applications, he said. Such distances between data sources and processing facilities create serious challenges with latency, network bandwidth, geographic focus, reliability, and security.

Any additional complexity "is sometimes necessary to help enable the functions of essential IoT applications," Antunes says. "OpenFog is taking steps to manage the incremental capacity increase in networks due to fog, by creating common hardware and software platforms, and very sophisticated orchestration, management, configuration, and network analytics capabilities to largely automate the operation of fog networks." He adds that "in some cases, by aggregating at the fog level, you may actually save money by using dumber sensors and in essence offload those capabilities to the fog nodes. In fact, fog will help more devices to be dumb or even dumber than they are today."
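A rough sketch of that aggregation idea follows -- the function name, window size, and summary fields are hypothetical, not drawn from the consortium's frameworks. Dumb sensors stream raw readings, and a fog node collapses each window into a single compact record before anything crosses the WAN:

```python
# Hypothetical sketch of the "dumber sensors" idea: sensors emit raw readings
# and a fog node aggregates them, so only a small summary travels to the cloud.
# Function names, the window size, and the alert threshold are illustrative.
from statistics import mean

def aggregate_window(readings, threshold=75.0):
    """Collapse a window of raw sensor readings into one summary record."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": sum(1 for r in readings if r > threshold),  # out-of-range readings
    }

# 100 raw readings from a sensor become a single upstream message.
raw = [70 + (i % 10) for i in range(100)]
print(aggregate_window(raw))
# -> {'count': 100, 'mean': 74.5, 'max': 79, 'alerts': 40}
```

The bandwidth saving is the point: 100 messages leave the sensors, but only one leaves the fog node.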
