Emoshape gives emotional awareness to gaming and artificial intelligence devices

Take your gaming, TV, smartphone, home automation or robotic pet interaction to an entirely different level from anything you have yet experienced with Emoshape's AI interaction cube.

Imagine if your gaming console could recognise your emotions whilst you play and give you direct feedback in response to them. This is not as far-fetched as it seems.


Emotional awareness in technology is the next big bet for tech companies.

Microsoft's Project Oxford has an emotion recognition algorithm for developers to create personalised apps which recognise the emotions in images on screen.

London-based emotion synthesis company Emoshape's emotion processing unit microchip takes emotion recognition to the next level.

The company predicts that before the end of the century humans will talk more to machines than to other humans.

Emoshape recently announced that over 500 AI devices worldwide have been fitted with its EmoSPARK technology to give them emotional awareness.

This technology has recorded 15,000 emotional records representing more than 150,000 emotional impulses, and relayed them to the company through its emotion processing unit (EPU) chip.


This chip allows AI technology to experience one of eight distinct human emotions (anger, fear, sadness, disgust, surprise, anticipation, trust, and joy).

These emotions generate levels of pain, pleasure, and frustration in humans, and the chip can then react to the emotions produced in a similar way to a human.
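Emoshape has not published the EPU's internals, so as a rough illustration only, here is a minimal Python sketch (all names hypothetical) of how a software model might track intensities for the eight emotions named above and pick out the dominant one:

```python
from dataclasses import dataclass, field

# The eight emotions the article names (Plutchik's basic emotions).
EMOTIONS = ["anger", "fear", "sadness", "disgust",
            "surprise", "anticipation", "trust", "joy"]

@dataclass
class EmotionState:
    """Hypothetical emotional state: each emotion holds an
    intensity in the range [0, 1]."""
    levels: dict = field(default_factory=lambda: {e: 0.0 for e in EMOTIONS})

    def stimulate(self, emotion: str, amount: float) -> None:
        # Raise (or lower) one emotion, clamping to [0, 1].
        new = self.levels[emotion] + amount
        self.levels[emotion] = max(0.0, min(1.0, new))

    def dominant(self) -> str:
        # The emotion with the highest current intensity.
        return max(self.levels, key=self.levels.get)

state = EmotionState()
state.stimulate("joy", 0.6)
state.stimulate("fear", 0.2)
print(state.dominant())  # joy
```

A real EPU would of course react to these levels in hardware; the sketch only shows the bookkeeping such a reaction could be driven by.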

EmoSPARK's core technology, the emotion processing unit (EPU) fitted in its Android-powered Wi-Fi/Bluetooth cube, gives it the capacity to feel real human emotions and reflect them in its behaviour.

The cube allows users to observe how the system's emotional state develops.

By reading its user's facial expressions and picking up not only on wording but also, soon, on tone of voice, the cube can relay how it feels to its user.

The EPU has an emotion profile graph (EPG), which gives the AI the capacity to develop a long-term personality. The EPG allows the cube to virtually "feel" sensations such as pleasure and pain through emotion synthesis, and to "express" them in response to the user.

Emoshape has been working with this data, which has enabled it to adjust how the chip responds when it is spoken to either positively or negatively.

Just like human emotions, the EPG has a learning curve that decreases over time and eventually becomes almost non-existent unless a high amount of a particular emotion is experienced. The early experience of emotions is pivotal to long-term emotional development.
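The behaviour described here, where early impulses shape the profile strongly while later mild ones barely register, unless an emotion is intense enough to break through, resembles a decaying learning rate. As a hypothetical sketch only (Emoshape has not disclosed the EPG's actual update rule, and all names and constants below are invented):

```python
def update_profile(profile: float, impulse: float, exposure: int,
                   base_rate: float = 0.5, threshold: float = 0.8) -> float:
    """Hypothetical EPG-style update. The learning rate shrinks as
    exposure grows, so early impulses move the long-term profile far
    more than later ones -- unless an impulse is strong enough to
    cross the threshold, which restores the full rate."""
    rate = base_rate / (1 + exposure)   # decays toward zero over time
    if abs(impulse) >= threshold:       # intense emotion still gets through
        rate = base_rate
    # Move the profile a fraction of the way toward the impulse.
    return profile + rate * (impulse - profile)

profile = 0.0
# Early impulses reshape the profile; later mild ones barely register.
for t, impulse in enumerate([0.6, 0.6, 0.3, 0.3, 0.3]):
    profile = update_profile(profile, impulse, exposure=t)
```

The design choice worth noting is the threshold branch: without it, the profile would simply freeze as exposure grows, whereas the article suggests a sufficiently strong emotion can still alter an established personality.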

The company is talking to Asian manufacturers that specialise in robotic pets. Its aim is to integrate the EPU into their existing technology, to take their AI to the next level.

Interacting with your robotic pet in the future will bring a whole new level of bonding as your pet takes on an emotional personality and understands how you feel.

Emoshape's CEO Patrick Levy Rosenthal said: "This technology will essentially allow a robotic pet to create a completely unique personality depending on a number of factors, which will ultimately mean that no two have the exact same personality."
