Zoom says AI features should come at no additional cost. Here's why

The company wants to give its video-conferencing customers more data-led features using a federated approach to AI. Here's what that means.
Written by Eileen Yu, Senior Contributing Editor

Zoom is pledging to provide artificial intelligence (AI) features at no additional cost to paid customers on its video-conferencing platform.

The technology company believes bundling these extra functions as part of its paid platform service will prove a key advantage as businesses begin considering the price tags of other offerings in the market. Zoom also advocates the merits of a federated multi-model architecture, which it says will allow for better efficiencies. 

Also: Five ways to use AI responsibly

Noting that customer concerns have surfaced over the potential cost of using generative AI, especially for larger enterprises, Zoom's Asia-Pacific head Ricky Kapur said: "At $30 per user a month? That's a significant cost factor."

Large organizations are not going to want to give access to every employee if it is too costly to do so, Kapur said in an interview with ZDNET. Executives have to decide who should and should not get access to generative AI tools, which can be a complex decision.

Kapur said there have been "accelerated" projects around generative AI among Zoom's paying customers because these features are provided at no additional cost.

Also: These 5 major tech advances of 2023 were the biggest game-changers

The video-conferencing platform has introduced several AI-powered tools over the past year, including AI Companion and Zoom Docs, the latter of which is slated for general availability next year. Zoom Docs is described as a next-generation document workspace with "modern collaboration tools". It is integrated into the Zoom interface and accessible in Meetings and Team Chat, as well as via the web and mobile apps.

AI Companion, formerly called Zoom IQ, is a generative AI assistant for the video-conferencing service that helps automate time-consuming tasks. The tool can compose chat responses with a customizable tone and length based on the user's prompts, and it can summarize unread chat messages. It can also summarize meetings, providing an account of what was said and who said it, and highlighting key topics.

AI Companion is available at no additional cost for customers signed up to Zoom's subscription plans. The Pro plan is priced at $149.90 per user per year, and the Business plan at $219.90 per user per year. Two other plans, Business Plus and Enterprise, are priced according to the customer's requirements.

The integration of Zoom Docs and AI Companion means users will be able to receive a summary of their last five meetings and a list of action items, said Zoom's chief growth officer Graeme Geddes. More than 220,000 users have tapped AI Companion since its launch in September. The AI tool now supports 33 languages, including Chinese, Korean, and Japanese. 

Also: AI in 2023: A year of breakthroughs that left no human thing unchanged

Pointing to Zoom's move to include AI Companion at no additional cost for paying customers, Geddes said the vendor believes these data-led features are fundamental capabilities that everyone in the organization should be able to access.

Geddes said Zoom's federated approach to its AI architecture is crucial. Rather than anchoring its build on a single AI provider, as some tech vendors have opted to do, Zoom has chosen to incorporate multiple large language models (LLMs). These include its own LLM, as well as third-party models such as Meta's Llama 2, OpenAI's GPT-3.5 and GPT-4, and Anthropic's Claude 2.

This broad approach means AI Companion can evolve and be optimized for quality and performance, Geddes said, adding that this strategy leads to reduced latency and cost efficiencies for customers. 

"With our federated approach to AI, according to our own internal testing, our team has improved the relative quality of AI Companion over single-model approaches, such as OpenAI GPT-3.5 Turbo or several other state-of-the-art LLMs," Zoom CTO Xuedong Huang wrote in a post. "The relative difference is 99% vs 93% quality rating, per our proprietary quality evaluation methodology."

Also: Two breakthroughs made 2023 tech's most innovative year in over a decade

Huang also said: "We're measuring performance as a combination of lower cost, faster response time, and higher-quality outputs. In comparison to OpenAI's GPT-4-32k model as the proxy of Microsoft Copilot, Zoom AI Companion's meeting questions capability offers reduced cost and faster response time while maintaining comparable AI quality."

"Our true north to democratizing access to these technologies is making sure the economics work," Geddes noted. He also said privacy is a top priority, stating that Zoom will not train its own or third-party AI models on customers' meeting data.

No customer data will be used to train AI models

Data privacy became a sore point for the vendor in August, when news emerged that changes to Zoom's terms of service could give the company the right to use customers' video and chat data for its AI initiatives.

Following public outcry, Zoom eventually reconsidered the move and declared that data generated by users of its free and paid services would not be used to train its AI models.

The episode provided some important lessons, Geddes said, with customer feedback the key driving force behind Zoom's blanket decision not to train its LLMs on data from any of its customers.

Also: The best AI chatbots: ChatGPT and other noteworthy alternatives

This approach means the company does not use any customer audio, video, chat, screen-sharing, attachments, or other customer content -- such as poll results, whiteboards, and reactions -- to train its AI models.

"We have been delivering our new generative AI-powered features without using customer content to train Zoom's or our third-party models," he said. "Instead, we use a combination of public domain data -- data we purchased, and data we created -- to train these models."
