A computer science undergrad has released Deep TabNine, his autocomplete tool for multiple programming languages that suggests the finished line of code from a few keystrokes.
The TabNine 'autocompleter' has been under development for the past year, but has now been refined with a deep-learning model to improve the quality of its suggested code.
The new and improved TabNine is based on GPT-2, the predictive-text deep-learning language model from OpenAI, the organization Microsoft just announced it will invest $1bn in as its "preferred partner for commercializing new AI technologies".
The autocomplete tool was written by Jacob Jackson, an undergraduate at the University of Waterloo in Canada, who says in a blog post that the aim of the tool is to help developers write code faster.
Deep TabNine was trained on two million files from GitHub and is built to "predict each token given the tokens that come before it". That's essentially the same objective GPT-2 was trained on, but Deep TabNine predicts the building blocks of code rather than human-written sentences.
There are similar tools to Deep TabNine, such as Microsoft's IntelliSense for Visual Studio; however, rather than suggesting a single token, Deep TabNine suggests multiple tokens at once, as you can see in action here.
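The two ideas above — predicting each token from the tokens before it, then chaining those predictions into a multi-token suggestion — can be sketched in a toy example. This is not TabNine's actual model: a simple bigram frequency table stands in for GPT-2, and the tokenized "training corpus" is hypothetical.

```python
# Toy sketch of next-token prediction and multi-token code completion.
# A bigram frequency table stands in for the real GPT-2-based model.
from collections import Counter, defaultdict

def train(token_sequences):
    """For each token, count which tokens follow it in the corpus."""
    follows = defaultdict(Counter)
    for tokens in token_sequences:
        for prev, nxt in zip(tokens, tokens[1:]):
            follows[prev][nxt] += 1
    return follows

def suggest(follows, prev_token, max_tokens=5):
    """Greedily chain next-token predictions into a multi-token suggestion,
    the way Deep TabNine completes a line rather than a single token."""
    suggestion = []
    token = prev_token
    while len(suggestion) < max_tokens and token in follows:
        token = follows[token].most_common(1)[0][0]
        suggestion.append(token)
    return suggestion

# Hypothetical tokenized snippets standing in for the GitHub training data.
corpus = [
    ["for", "i", "in", "range", "(", "n", ")", ":"],
    ["for", "i", "in", "range", "(", "n", ")", ":"],
    ["for", "x", "in", "items", ":"],
]
model = train(corpus)
print(suggest(model, "for"))  # → ['i', 'in', 'range', '(', 'n']
```

Typing just `for` yields the whole most-likely continuation, which is the qualitative difference from a single-token suggester like IntelliSense.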
One catch for now is that Deep TabNine is probably too intensive to run on a laptop, so it wouldn't deliver suggestions as fast as the standard TabNine. Until a more pared-back model is created, Jackson is offering a TabNine Cloud beta service that uses GPUs to speed up autocomplete suggestions.
Jackson says he is working on a model that can run on a laptop with "reasonable latency", while enterprise customers are being offered a license to run the model on their own hardware.