Audience unveils always-listening, multi-sensory chipset for mobile devices

Audience's newest chipset brings always-on listening and context awareness to phones, wearables, and tablets.
Written by James Kendrick, Contributor
(Image: Audience)
At the Mobile World Congress (MWC), Audience, a maker of audio components for mobile devices, unveiled the N100. The new chipset goes beyond audio, interacting with multiple sensors in a mobile device to determine what the user is doing at any given moment. It does this hands-free, by continuously monitoring the device's sensors.

Audience is a mobile tech company you've probably never heard of, because it makes components for phones and tablets rather than finished devices. It produces chipsets that sit between a phone's microphone and its processor, providing clean audio for better speech recognition. It is a growing company, with components installed in more than half a billion devices and partnerships with China Mobile and ZTE, among others.

Always-on operation like the N100's is nothing new, but in other products the continual monitoring takes a heavy toll on battery life. The N100 addresses this cleverly: it shuts down individual monitoring functions based on what it detects is happening.

Audience gave ZDNet an example of how this works: a mobile device with the N100 installed keeps the GPS powered so it can provide location features while the owner is walking or running. When the user stops, the GPS is shut down to reduce power consumption.
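
Audience hasn't published the N100's firmware interface, but the behavior it describes amounts to a policy table that maps each detected activity to the set of sensors worth keeping powered. Here is a minimal sketch of that idea in Python; every name in it is hypothetical.

```python
# Hypothetical sketch of activity-based sensor power gating, in the
# spirit of Audience's GPS example. The N100's real interface is not
# public; the sensor names and activities here are illustrative only.

SENSORS_FOR_ACTIVITY = {
    "walking":    {"gps", "accelerometer", "microphone"},
    "running":    {"gps", "accelerometer", "microphone"},
    "stationary": {"microphone"},  # GPS shut down to save power
}

def apply_power_policy(activity: str, powered: set) -> set:
    """Power sensors up or down to match the detected activity."""
    wanted = SENSORS_FOR_ACTIVITY.get(activity, {"microphone"})
    for sensor in powered - wanted:
        print(f"powering down {sensor}")
    for sensor in wanted - powered:
        print(f"powering up {sensor}")
    return wanted

powered = apply_power_policy("walking", set())       # GPS comes up
powered = apply_power_policy("stationary", powered)  # GPS shuts down
```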

The company calls this NUE, or Natural User Experience. As explained to us, NUE uses multi-sensory input not only to reduce power consumption as described above, but also to let the mobile device listen continuously for a keyword that triggers an action.

Audience calls this VoiceQ, and it works in a similar fashion to the Amazon Echo, which is always listening for a keyword before it will accept a command or question.

In the case of the N100, keyword triggers can invoke any function the mobile device is programmed to perform, without the user touching the device. It is a very low-power process, since only the microphone and the N100 need to be powered for this always-on listening.
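
VoiceQ's interface isn't public either, but what Audience describes is a standard wake-word pattern: a tiny, always-powered detector watches the microphone stream, and only a keyword match wakes the host processor. A rough sketch under that assumption, with the keywords and function names invented:

```python
# Hypothetical wake-word dispatch in the style Audience describes for
# VoiceQ: only the microphone and the keyword spotter stay powered,
# and the host processor wakes only on a match. Names are invented.

TRIGGERS = {
    "ok phone": "open_assistant",
    "take a photo": "launch_camera",
}

def spot_keyword(audio_frame: str) -> str | None:
    """Stand-in for the chipset's low-power keyword spotter."""
    return TRIGGERS.get(audio_frame.strip().lower())

def listen(frames):
    for frame in frames:  # the only parts running are the mic and N100
        action = spot_keyword(frame)
        if action:
            print(f"waking host processor to run: {action}")

listen(["background noise", "OK phone", "more noise"])
```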

The context awareness of the N100 is called MotionQ, and it uses multiple sensors both to reduce power consumption and to trigger various actions. Through the device's GPS, accelerometer, and other sensors, the N100 can determine whether the user is walking, running, riding in a car, standing up, or sitting down. It knows whether the phone or other device is lying on a table or in the owner's hand.
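
Audience hasn't said how MotionQ tells these states apart. A common generic approach, and purely an assumption here, is to threshold simple statistics of the accelerometer signal: a device on a table shows almost no variation, walking shows moderate periodic motion, and running shows more still. A toy illustration:

```python
# Toy illustration of accelerometer-based activity inference. This is
# a generic heuristic, not Audience's unpublished MotionQ algorithm.
import statistics

def classify_motion(accel_magnitudes: list[float]) -> str:
    """Guess activity from the spread of accelerometer magnitudes (in g)."""
    spread = statistics.pstdev(accel_magnitudes)
    if spread < 0.02:
        return "on a table"        # essentially no movement
    if spread < 0.15:
        return "in hand / sitting"
    if spread < 0.6:
        return "walking"
    return "running"

print(classify_motion([1.00, 1.01, 1.00, 0.99]))   # -> on a table
print(classify_motion([0.8, 1.4, 0.7, 1.5, 0.9]))  # -> walking
```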

The N100 can detect when a phone is put in a pocket and turn the screen off to save battery. When the device is idle for a while, it shuts down as many sensors as possible, leaving only the microphone active. That makes an already low-power component even more efficient.
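
Pocket detection wasn't detailed either, but it is plausibly a combination of cheap signals, such as the proximity sensor being covered while ambient light reads near zero. A hedged sketch of that kind of rule, with the threshold invented for illustration:

```python
# Hypothetical pocket-detection rule of the kind the article describes.
# The sensor inputs and the 5-lux threshold are invented.

def in_pocket(proximity_covered: bool, ambient_lux: float) -> bool:
    """Infer 'in a pocket' from two cheap sensor readings."""
    return proximity_covered and ambient_lux < 5.0

def on_sensor_update(proximity_covered: bool, ambient_lux: float):
    if in_pocket(proximity_covered, ambient_lux):
        print("screen off")  # save battery while pocketed

on_sensor_update(True, 1.2)     # pocketed -> screen off
on_sensor_update(False, 300.0)  # out in the open -> no change
```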

The chipset's low power draw, combined with the N100's tiny size, makes it a good fit for wearables.

Audience doesn't make devices; it sells the N100 to device makers, and it will be up to the OEMs to implement the functions the component enables. The first devices with the chipset are expected in mid-2015.

When asked about the privacy concerns some might have with a mobile device that is always listening and tracking its user, Audience pointed out that it only sells the N100 to OEMs and doesn't itself provide the functions the chip enables. Each OEM will be responsible for addressing customer concerns in whatever way it feels adds the most value.
