Microsoft Reveals New AI That Can Detect Human Emotions

If there’s anything that separates humans from machines, it’s our feelings and emotions, which machines supposedly cannot understand because they can’t be reduced to binary code. Microsoft’s Project Oxford suggests otherwise. Thanks to Microsoft’s new Artificial Intelligence (AI) API, machines can now estimate how people are feeling by analyzing photos of their faces.

How it Works

The team behind the project demonstrated the AI at Microsoft’s Future Decoded event on Wednesday. Similar to what Google did with TensorFlow, Microsoft has released the new API to the public, and developers can use a beta version in their own applications. The API allows programs to recognize eight basic human emotional states: anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise.
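As a rough sketch of how a developer might call the beta API from Python: the endpoint URL, header name, and request shape below are assumptions based on the Project Oxford beta documentation, not details confirmed by this article, so verify them before use.

```python
import json
import urllib.request

# Assumed beta endpoint and header names; confirm against the
# current Project Oxford / Cognitive Services documentation.
EMOTION_ENDPOINT = "https://api.projectoxford.ai/emotion/v1.0/recognize"

# The eight emotional states the article says the API recognizes.
EMOTIONS = ["anger", "contempt", "disgust", "fear",
            "happiness", "neutral", "sadness", "surprise"]

def recognize(image_bytes: bytes, subscription_key: str) -> list:
    """POST raw image bytes to the Emotion API and return its parsed
    JSON: one entry per detected face, each scoring every emotion."""
    req = urllib.request.Request(
        EMOTION_ENDPOINT,
        data=image_bytes,
        headers={
            "Content-Type": "application/octet-stream",
            "Ocp-Apim-Subscription-Key": subscription_key,
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A caller would pass the bytes of a JPEG or PNG plus their own subscription key; the function itself performs no work until invoked, so no request is sent just by importing it.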

Microsoft’s Chief Executive, Satya Nadella, said at the event that he wants new kinds of inputs and outputs to emerge and pave the way for new types of personal computers. The emotion-sensing tool is an example of what these new inputs could look like: with the ability to understand a user’s emotions, devices can learn new ways to interact with users.



The tool is still in beta, so there is clearly room for improvement: many human emotions are not yet covered, and faked expressions remain a problem. Still, the basics work, and there is great potential here.

The Potential for Machine Learning AI

Head of Microsoft Research Cambridge, Chris Bishop, showcased how the new AI can detect multiple faces and different emotions at the same time. The demonstration also showed that each emotion is registered on a scale of zero to one, with every detected face receiving a separate score for each emotion category.
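The per-face, zero-to-one scoring described above can be illustrated with a short sketch. The JSON shape and the numbers here are illustrative assumptions (a list of faces, each with a `scores` map over the eight emotions), not output captured from the real service:

```python
# Hypothetical sample response for a photo containing two faces.
sample_response = [
    {"faceRectangle": {"left": 40, "top": 50, "width": 90, "height": 90},
     "scores": {"anger": 0.01, "contempt": 0.002, "disgust": 0.003,
                "fear": 0.001, "happiness": 0.95, "neutral": 0.03,
                "sadness": 0.002, "surprise": 0.002}},
    {"faceRectangle": {"left": 300, "top": 45, "width": 85, "height": 85},
     "scores": {"anger": 0.02, "contempt": 0.01, "disgust": 0.01,
                "fear": 0.30, "happiness": 0.01, "neutral": 0.05,
                "sadness": 0.05, "surprise": 0.55}},
]

def dominant_emotion(face: dict) -> str:
    """Return the emotion category with the highest 0-1 score."""
    return max(face["scores"], key=face["scores"].get)

for i, face in enumerate(sample_response):
    print(f"face {i}: {dominant_emotion(face)}")
```

Because every category gets its own score, an app can do more than pick the single strongest emotion: it could, for instance, flag faces where two categories score closely, such as the fear/surprise mix on the second face above.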

Microsoft gave several examples of what this new project could bring to the tech world. In a real-world context, marketers could use it to gauge customer reactions, and messaging apps could use smartphone cameras to detect a user’s emotion and share it automatically.

Project Oxford is not limited to reading human emotions. Several other tools capable of understanding words, sounds, or images will be released to developers in the coming months, including Spell Check, Video Analysis and Editing, Speaker Recognition, and Custom Recognition Intelligent Services. All of the Face APIs, such as age detection, emotion detection, and facial hair detection, will also be updated and released under the Project Oxford set of tools.

Microsoft hopes that developers unable to develop their own machine learning or artificial intelligence systems will use these tools to bring new features into their apps.

He is the Chief Content Officer at ProPakistani. Reach out at aadil.s[at]
