
AI will change software development in massive ways, says MongoDB CTO

AI, especially the generative sort, is going to "let developers write code at the quality and the speed and the completeness that we've always wanted to," says Mark Porter.
Written by Tiernan Ray, Senior Contributing Writer

"There's this stereotype of how long it takes to write computer software and how long it takes to get it right," says MongoDB CTO Mark Porter. "I think generative AI is gonna change all that in massive ways."

Tiernan Ray

Artificial intelligence, including the most popular form at the moment, generative AI such as OpenAI's ChatGPT, is going to provide tremendous leverage to software developers and make them vastly more productive, according to the chief technologist of MongoDB, the document database maker. 

"One of the things that I strongly believe is that there's all this hype out there about how generative AI may put developers out of business, and I think that's wrong," said Mark Porter, MongoDB's CTO, in an interview with ZDNET.

Also: More developers are coding with AI than you think 

"What generative AI is doing is helping us with code, helping us with test cases, helping us with finding bugs in our code, helping us with looking up documentation faster," said Porter.

"It's gonna let developers write code at the quality and the speed and the completeness that we've always wanted to."

Not just generative AI, said Porter, "but models and all the other stuff that's been around for 15 to 20 years that's now really solid" will mean that "we can do things which transform how developers write code."

Porter met with ZDNET last week during MongoDB.local, the company's developer conference in New York. The conference is one of 29 such developer events MongoDB is hosting this year in various cities in the US and abroad. 

Prior to becoming CTO of MongoDB three and a half years ago, Porter held numerous key database roles: he ran relational database operations for Amazon AWS RDS, led core technology development as CTO of Grab, the Southeast Asia ride-hailing service, and spent more than a decade in a variety of roles at Oracle, including a stint as one of the original database kernel developers. 

AI is "an acceleration of the developer ecosystem," added Porter. "I think more apps are going to be written."

Also: Serving Generative AI just got a lot easier with OctoML's OctoAI

"There's this stereotype of how long it takes to write computer software and how long it takes to get it right," said Porter. "I think generative AI is going change all that in massive ways, where we're going to be able to write the apps we want to write at the speed we want to write them, at the quality we want to have them written."

A big element of MongoDB's one-day event was the company's discussion of new AI capabilities for the MongoDB database. 

"MongoDB is actually the foundation of hundreds of companies building AI," said Porter. Indeed, the show floor, at Jacob Javits convention center in Manhattan, featured numerous booths from the likes of Confluent, Hashicorp, IBM, and Amazon AWS, where presenters explained the use of MongoDB with their respective software technologies. 


Crowds at MongoDB's New York local conference for developers.

Tiernan Ray

Porter emphasized new functionality in MongoDB that makes vector values a native data type of the database. With vector support, a developer can take the embedding vectors produced by a large language model, which encode the approximate meaning of a query or a document, store them in the database alongside the rest of the data, and later retrieve the most relevant records with a similarity search tuned for the precision and recall the application needs. 
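To make that concrete, here is a minimal sketch in Python using the pymongo driver against MongoDB Atlas Vector Search. The collection name, index name, field names, and vector values are illustrative assumptions, not details from the article, and the vector search index itself is assumed to have been created separately in Atlas on the "embedding" field.

# Minimal sketch: storing an embedding alongside a document and querying it
# with MongoDB Atlas Vector Search. Assumes a vector search index named
# "article_index" already exists on the "embedding" field; names and values
# here are illustrative only.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>/")
articles = client["newsroom"]["articles"]

# The embedding (a plain list of floats) lives in the same document as the
# core data and metadata, so there is no separate vector store to keep in sync.
articles.insert_one({
    "title": "MongoDB adds vector search",
    "body": "Full article text goes here.",
    "embedding": [0.021, -0.103, 0.577],  # produced by an embedding model
})

# Later: find the documents whose vectors are nearest to a query vector.
query_vector = [0.018, -0.099, 0.581]  # embedding of the user's question
results = articles.aggregate([
    {
        "$vectorSearch": {
            "index": "article_index",
            "path": "embedding",
            "queryVector": query_vector,
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$project": {"title": 1, "body": 1, "_id": 0}},
])
for doc in results:
    print(doc["title"])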

Also: AMD unveils MI300x AI chip as 'generative AI accelerator'

When a user asks ChatGPT or another LLM a question, explained Porter, "I'm going to get a vector of that question, and then I'm going to put that vector into my database, and I'm then going to ask for vectors near it," which will produce a set of relevant articles, for example. 

"Then I'm going to take those articles and prompt my LLM with all those articles, and I'm going to say, you may not say anything that is not in these articles, please answer this question with these articles."

The LLM can then perform functions such as summarizing a long article, offered Porter. "I love to use LLMs to take an article and make it shorter."

In that way, AI and the database have a division of labor. 

Also: Microsoft unveils Fabric analytics program, OneLake data lake to span cloud providers

"You would never want to put an LLM in an online transaction processing system," said Porter. "I think you want to use the LLMs where they belong, and you want to use database technology and matrix technology where it belongs."

While there are standalone vector databases from other vendors, Porter told ZDNET that incorporating the functionality will reduce the burden for application developers. "It means that you don't have to have pipelines between the two [databases], copying data around," said Porter. "You don't have to manage two different systems, it's all in one system, your core data, your metadata, and your vectors all sit in one data store."

No matter what comes next with AI, said Porter, "It ain't going to put developers out of business. 

"Developers are still going to be the ones who listen to their customers, listen to their leaders, and decide what to write." 

Also: These are my 5 favorite AI tools for work
