
Wolfram Research goes for Software 2.0, releases neural net repository

Wolfram, having been into AI before it was cool, now gets a piece of the deep learning hype in its own sui generis way. Where does it stand compared to the competition, how easy is it to use, and how well does it integrate with the rest of the world?
Written by George Anadiotis, Contributor


Wolfram Research is something of a legend. Founded by Stephen Wolfram in 1987, the company was working on reasoning, question answering, and an array of other advanced services that now go under the AI moniker long before AI was as cool and mainstream as it is today.


Yet you do not see Wolfram in the news as much as you see others in the field. Wolfram is a sui generis kind of firm, and that may simply be part of how things are done there. But Wolfram does do conferences, and today, at its European Wolfram Technology Conference, it is announcing its neural net repository, which it says offers immediately computable access to pre-trained models.

Repackaging neural networks, with batteries included

"Machine learning is a field in hypergrowth right now, with interesting new results being published every week. Our goal with the Wolfram Neural Net Repository is to let people immediately integrate the latest research neural nets into their work," said Stephen Wolfram, founder and CEO of Wolfram Research.

"Like everything we do with Wolfram Language, our goal is to make everything as smooth and automated as possible, so it's immediate to include a new neural net from the repository, with all encoding, decoding, etc. handled automatically," Wolfram added.

Neural nets (NNs) are at the core of deep learning. They are loosely inspired by the way neurons in the human brain operate -- connecting with other neurons and processing input in a networked way -- and "deep learning" describes algorithms built from many layers of such neurons. Interestingly, Wolfram chose not to go for the deep learning buzzword. But that's not the only interesting thing about Wolfram's NNs.
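As a rough illustration of "many layers" -- a sketch, not an example from Wolfram's announcement -- a deep net in the Wolfram Language is literally a stack of layers, with sizes chosen arbitrarily here:

    (* a small fully connected network; layer sizes and input dimension are illustrative *)
    net = NetChain[{
        LinearLayer[128], ElementwiseLayer[Ramp],
        LinearLayer[64],  ElementwiseLayer[Ramp],
        LinearLayer[10],  SoftmaxLayer[]},
      "Input" -> 784]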


When discussing what is actually in this repository so far, the Wolfram team has an unexpected answer. They explain they have invested much effort in converting publicly available models from other neural net frameworks, such as Caffe, Torch, MXNet, and TensorFlow, into the Wolfram Neural Net format. In addition, they have trained a number of NNs themselves.

So, what Wolfram is essentially announcing is the repackaging of existing models into its own framework. How does that make sense -- where is the added value?

Object detection is one of the things Wolfram's neural nets can be used for. (Image: Wolfram Research)

The announcement mentions that Wolfram's NNs are curated by its researchers and scientists, and that the repository provides a uniform system for storing and deploying neural network models in an immediately computable form. But here is what Sebastian Bodenstein, a consultant with Wolfram Advanced Research Group, had to say when asked point blank:

"It's a single format. Sure, we've imported things from PyTorch, from Lua Torch, PaddlePaddle, a whole bunch of these. The value is that it's a single format. You could run things in other frameworks, store PaddlePaddle on your machine, but it's annoying, and there's the whole dependency issues.

There is also value in curation. We've curated all these things from these disparate places and made them uniformly available to users. Plus, we do all sorts of things with testing to make sure the import was correct, things the users don't necessarily want to deal with.

And it's in a form that's just easy to use. We also do pre-processing -- often handled by separate scripts that are passed around in other frameworks -- but in our framework the nets just work, and you can start applying them to data right away.

It's a 'batteries included' kind of philosophy," Bodenstein said.
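A hedged sketch of what "batteries included" means in practice: a repository net carries its own NetEncoder and NetDecoder, so the preprocessing Bodenstein refers to is attached to the model itself. The model name below is one example entry.

    net = NetModel["Wolfram ImageIdentify Net V1"];   (* one repository entry among many *)
    NetExtract[net, "Input"]    (* the attached NetEncoder: image resizing, color handling, etc. *)
    NetExtract[net, "Output"]   (* the NetDecoder that maps raw outputs back to class labels *)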

Why choose Wolfram's neural nets over the competition?

Bodenstein sees the amount of automation as the single most tempting reason to use Wolfram NNs over the alternatives. He mentioned, for example, how Wolfram's NNs handle variable-length sequences: "We don't require any masking from the user, we compile it in the backend, and that is a tricky thing to do in general."
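To make the variable-length point concrete, here is a minimal, hypothetical sketch -- the layer sizes, classes, and training pairs are made up -- in which the net accepts strings of any length and no padding or mask tensors appear in user code:

    (* a toy text classifier over variable-length token sequences; all details are illustrative *)
    net = NetChain[{
        EmbeddingLayer[64],              (* token ids -> 64-dimensional vectors *)
        LongShortTermMemoryLayer[32],    (* consumes sequences of any length *)
        SequenceLastLayer[],             (* keep only the final state *)
        LinearLayer[2], SoftmaxLayer[]},
      "Input"  -> NetEncoder[{"Tokens"}],
      "Output" -> NetDecoder[{"Class", {"negative", "positive"}}]];

    trained = NetTrain[net, {"great movie" -> "positive", "awful plot" -> "negative"}];
    trained["what a wonderful film"]   (* no masking handled by the user *)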


Bodenstein added how this makes things easy for the user:

"They can focus on building the net instead of something like esoteric details like converting and preprocessing the input data into the right form, tracking batch dimensions, applying sequence masking, and the like, which is a difficulty in other frameworks that is often foisted onto developers.

Because the nets are part of our language, they are completely cross-platform. There are no dependencies on externally maintained packages -- things work out of the box very nicely. With a language like Python, you're relying on a lot of dependencies and libraries, and getting it to work on different platforms can be a nightmare.

For example, packages for processing audio files often have difficulties cross-platform. TensorFlow has historically had issues working smoothly on Windows, and this applies even more so to extra modules that one needs to do specific tasks like speech recognition. The network repository is also something that's nicer compared to something like Keras. It's much richer."

That sounds great, but there's a catch.

What is Wolfram's neural net repository target audience?

Did you notice the reference to Wolfram's language? Mathematica is the foundation on which Wolfram was built. There's a lot to be said about it, but let's start with this one thing: Considering the effort it would take for someone to become familiar with Wolfram, is the NN repository primarily aimed at existing users, or is it looking to attract new ones?


Bodenstein said they are catering to two different types of people -- researchers and those interested in developing applications:

"We're more geared toward application development, and we think there's a huge market for that. We think people can be much more productive this way. We are, of course, also trying to attract new users. For example, people focused on application development -- it's a very friendly framework to use.

The No. 1 feature that makes application developers productive is having an incredibly rich repository of nets and being able to manipulate them easily and symbolically.

Many frameworks are developed by researchers for researchers, and we're trying to make people who are not experts more productive, but also cater to experts, while making the nets easy to use by automating as much as possible. We believe that to be productive in a framework like this, having pre-trained nets is absolutely essential."

What Wolfram sees as an advantage may be a barrier to adoption for others. (Image: Wolfram Research)

Bodenstein is not the only one making the point that many of the frameworks out there are hard to use and intended mainly for researchers. But speaking of ease of use, it's time to address the elephants in the room.

Speaking Wolfram

We already mentioned how Wolfram does things in its own way, including having developed Mathematica. It is built around a symbolic programming language, in the same family as LISP and Prolog. Such languages are not very popular, and learning to program in them requires a certain rewiring for people versed in, say, Java or Python.

So, what Wolfram sees as an advantage may well be a barrier. It all comes down to how easy it would be for people to learn and adopt Mathematica, and whether there would be enough return on investment in doing so. Can this be a self-service process?


Swede White, lead communications strategist for Wolfram, pointed to a number of freely available educational resources, such as Wolfram U, Stephen Wolfram's Elementary Introduction to the Wolfram Language, and the Community and Challenges websites.

Mathematica, White said, has been a staple on college campuses and in research facilities for about three decades. And the Wolfram Language, the programming language used in Mathematica, is at the core of Wolfram's tech stack and one important way in which people become familiar with Wolfram's technologies:

"Our language largely uses a functional programming paradigm, and the syntax is not terribly foreign to most people, and it also has natural language input for many things. We do offer a fast introduction for programmers that actually has tracks for those familiar with Java or Python.

We also offer summer programs for high school students, undergraduate and graduate college students, and professionals. One thing we often hear is that programming in Wolfram Language is actually fun, in large part due to the superfunctions we've baked into the language, and we have several superfunctions for machine learning."
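White doesn't name the superfunctions in the quote, but the ones Wolfram points to elsewhere in this piece, ImageIdentify and FindTextualAnswer, give a feel for the style; the image file below is a placeholder.

    ImageIdentify[Import["photo.jpg"]]   (* returns a best-guess identification of the image *)
    FindTextualAnswer[
      "Wolfram Research was founded by Stephen Wolfram in 1987.",
      "When was Wolfram Research founded?"]   (* should return "1987" *)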

Learning to speak Wolfram can be a barrier to adoption. (Image: Wolfram Research)

There's no easy way to answer this question, admittedly. Katie Drenstein, a Wolfram camp alum, said that wrapping their head around Mathematica was hard, despite their experience in Java and Python. Experience is a relative term here, considering that Drenstein and others in that camp were mostly overachieving teens. As Drenstein said, not having any previous experience with mainstream programming languages may turn out to be an advantage.

Connecting Wolfram to the outside world

So, if you want to go for Wolfram, should you go all-in, and just port everything to Wolfram? If you have a pipeline, or an algorithm, in some other framework, would it be possible to migrate those, or call them from within Wolfram? And what about training and reusing Wolfram artifacts beyond Wolfram?


White said that it is possible to call Python, Java, or R from within a Wolfram Notebook and then do computations on the results in Wolfram's language or vice versa. There's also the ability to call APIs and external services from a Wolfram Notebook.
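As a minimal sketch of that interoperability, assuming a local Python installation is registered with the system, ExternalEvaluate runs a snippet and hands the result back as a Wolfram expression:

    ExternalEvaluate["Python", "sum(range(10))"]   (* returns 45 *)

    (* results can feed straight into further Wolfram Language computation *)
    Total[ExternalEvaluate["Python", "[x**2 for x in range(5)]"]]   (* returns 30 *)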

Bodenstein added that pre-trained nets can be deployed in Wolfram Cloud with a REST API, but third-party libraries or executables are generally not permitted for security reasons. But there is also Wolfram's Enterprise Private Cloud (EPC), which customers may configure as they like.
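A hedged sketch of that deployment route -- the model name, endpoint name, and permissions are illustrative:

    net = NetModel["Wolfram ImageIdentify Net V1"];            (* example repository entry *)
    api = APIFunction[{"image" -> "Image"}, ToString[net[#image]] &, "JSON"];
    CloudDeploy[api, "identify", Permissions -> "Public"]      (* returns a REST endpoint URL *)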

That's what processing in Wolfram looks like. (Image: Wolfram)

As for NNs, there is a process through which the models can be exported to MXNet, which Wolfram has chosen as an interface to the outside world. There is also the option of exposing Wolfram functionality via APIs, and that applies to NNs as well. Bodenstein noted that (upcoming) version 12 of Wolfram's language will have support for ONNX, an open ecosystem for interchangeable AI models.
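In code, the MXNet route looks roughly like this (file names are placeholders); exporting in the MXNet format writes the architecture as JSON along with a companion file holding the weights, which MXNet can then load:

    net = NetModel["LeNet Trained on MNIST Data"];   (* example repository entry *)
    Export["lenet.json", net, "MXNet"]               (* also writes a parameters file with the weights *)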

ONNX models can be run directly on several inference backends, such as CoreML (iOS) or TensorRT (NVIDIA). They can also be exchanged with other frameworks like PyTorch, Microsoft Cognitive Toolkit, and Caffe2, and there is third-party support for TensorFlow.

Things are looking less bright when it comes to ingesting data to train those NNs within Wolfram, though. When discussing this, Bodenstein mostly pointed toward Wolfram-specific tools, such as ImageIdentify and FindTextualAnswer. There is also support for MongoDB and for consuming streaming data, he said. Support for cloud storage such as S3 is missing for the time being.

Software 2.0

These NNs are not the first algorithms Wolfram has made available. Algorithms have been part of Wolfram Algorithmbase for a while now, but Bodenstein said the NNs are an important and ever-growing part of it:

"We're in the process of building a lot of functionality based on neural networks -- they provide an alternative way to do software development (example-driven), Software 2.0 so to speak.

We see it as a way to build this paradigm into the Wolfram Language. There's over fifty functions we want to build with this example-driven approach that have been impossible before this, e.g. speech recognition, FindTextualAnswer, etc. We want to integrate AI deeply into the Wolfram Language, which is quite different from other languages."

Software 2.0 is a term used to describe ways of building non-deterministic, adaptive software that relies on data rather than hard-coded rules to operate, and it's an emerging notion. Neural networks are a big part of this, and with legacy vendors like CA adopting such principles, it should not come as a surprise to see a vendor like Wolfram in that camp as well.
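The example-driven idea is easiest to see with a toy sketch -- the data here is made up -- where behavior is specified by labeled examples rather than hand-written rules:

    (* Software 2.0 in miniature: a sentiment function defined by examples, not rules *)
    sentiment = Classify[{
        "I loved this" -> "positive",
        "what a mess" -> "negative",
        "really enjoyed it" -> "positive",
        "total waste of time" -> "negative"}];

    sentiment["I enjoyed it a lot"]   (* the learned behavior generalizes from the examples *)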


This discipline is still in its infancy, and it makes much sense for Wolfram to go for it. Whether it makes sense for you to go for Wolfram is something you'll have to evaluate.
