
Facebook open sources AI tools, possibly turbocharging deep learning

The plain English takeaway is that faster training of neural networks will now be widely available via the open source project Torch.
Written by Larry Dignan, Contributor

Facebook on Friday said it will open source deep learning modules for the Torch artificial intelligence project, a move that could accelerate the technology and make it more accessible to developers and, ultimately, companies.

The social networking giant has a long history of open source contributions and has even released documentation on its data center designs and building blocks. By contributing its optimized deep learning modules and other tools, Facebook could turbocharge numerics, machine learning and computer vision work built on Torch. Facebook has already used the technology internally to train neural networks faster.

Specifically, Facebook is contributing the following:

  • GPU-optimized modules for so-called convolutional networks, as well as tools used in natural language processing and speech recognition. These modules are built around Nvidia's GPU library.
  • Containers to use multiple GPUs at the same time to train networks in parallel.
  • An optimized lookup table.
  • A model to speed training over various objects and data classes.
  • A technique called cross-map pooling to create visual and text modules.

Of those contributions, the biggest is the GPU layer code that speeds up the training of convolutional networks, also known as ConvNets. Facebook explained:

Since improving training time of these models translates to faster R&D, we've spent considerable engineering effort to improve the GPU convolution layers. The work has produced notable results, achieving speedups of up to 23.5x compared to the fastest publicly available code. As far as we can tell, our code is faster than any other publicly available code when used to train popular architectures such as typical deep ConvNets for object recognition on the ImageNet data set.
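To see why those convolution layers dominate training time, it helps to look at the operation itself. Here is a minimal, deliberately unoptimized sketch of the 2D convolution at the heart of a ConvNet layer, in plain Python; the function and variable names are illustrative, not taken from Facebook's or Torch's code. The nested inner loops are exactly the arithmetic that GPU-optimized kernels parallelize.

```python
def conv2d_valid(image, kernel):
    """Slide `kernel` over `image` with no padding and stride 1.

    A naive reference implementation: for each output position, sum the
    element-wise products of the kernel and the image patch under it.
    """
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = ih - kh + 1, iw - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for y in range(oh):                # every output row...
        for x in range(ow):            # ...and every output column
            s = 0.0
            for ky in range(kh):       # these inner loops are the
                for kx in range(kw):   # work GPU kernels parallelize
                    s += image[y + ky][x + kx] * kernel[ky][kx]
            out[y][x] = s
    return out

# A toy 2x2 difference kernel applied to a 4x4 image yields a 3x3 output.
image = [[1, 2, 3, 0],
         [4, 5, 6, 1],
         [7, 8, 9, 2],
         [1, 0, 1, 3]]
kernel = [[1, 0],
          [0, -1]]
result = conv2d_valid(image, kernel)
```

Real ConvNet training repeats this over many images, channels, and filters per layer, which is why Facebook's reported speedups on the GPU convolution layers translate so directly into faster R&D.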

Facebook provided links to various papers about its technology, but the plain English takeaway is that faster training of neural networks will now be widely available. That move should make Torch more relevant to more enterprises. Today, Torch is used by academic labs and by companies with a stake in AI such as Google, Twitter, Nvidia, AMD and Intel, to name a few.

In the enterprise, companies such as IBM have touted AI as a way to improve analytics. While Watson is billed as cognitive computing rather than AI in the strict sense, the upshot is the same: computers will learn and transform multiple industries. Facebook may have helped accelerate that AI movement a bit.
