It is no surprise that with the increased use of computer systems and applications, interdisciplinary fields have become necessary for moving forward. One such field is affective computing, which combines computer science, psychology, and cognitive science. Affective computing enables machines to recognize and interpret human affect. The fictional stereotype of computers, and by extension robots, responding to emotion with “does not compute” simply is not true anymore.


“Emotional computers” may sound like a scary concept, but stick with us: affective computing is becoming an important part of how we approach artificial intelligence.

The use of artificial intelligence has increased across several industries, and it appears in consumer products as well, such as voice assistants and video games. AI as a Service (AIaaS) has also become an industry of its own.

Before going further into the relationship between affective computing and artificial intelligence, here is a short lesson on the history behind affective computing.

Affective computing history

In 1997, Rosalind W. Picard published Affective Computing. Picard holds a doctorate in electrical engineering and computer science, and she is credited with founding the study of affective computing.

Within it, she highlighted the fact that:

Emotion is not only necessary for creative behavior in humans, but neurological studies indicate that decision-making without emotion can be just as impaired as decision-making with too much emotion. Based on this evidence, to build computers that make intelligent decisions may require building computers that “have emotions.”

Machines have become both increasingly intelligent and more heavily relied upon since the 1990s. This is mainly due to the rise of mobile computing and the influx of available information.

Having briefly covered the beginning of affective computing, it is time to look at how it is being applied today and how artificial intelligence is involved. If you are interested in how it all got started, Dr. Picard’s book is available on Amazon.

Affective computing applications

Until recently, historically speaking, artificial intelligence was the stuff of science fiction. Nowadays, AI is being built for a variety of everyday applications such as self-driving cars, facial recognition, and people analytics. While these applications involve creating advanced AI, they also require an understanding of emotion.

This is where affective computing comes into play. Simplifying processes through technology can only go so far before an emotional barrier arises.

Picturing a computer that understands emotion can be difficult, yet we humans do it every day through our own grasp of emotion and the cultural nuances behind expressions.

One way to understand affective computing is through emotion-tracking software. Since 2009, Affectiva has been using its Emotion AI to identify emotions. Below is a description of its product:

Our technology first identifies a human face in real time or in an image or video. Computer vision algorithms identify key landmarks on the face – for example the corners of your eyebrows, the tip of your nose, the corners of your mouth. Machine learning algorithms then analyze pixels in those regions to classify facial expressions. Combinations of these facial expressions are then mapped to emotions.
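To make that pipeline a little more concrete, here is a minimal sketch of the same general idea in Python: detect a face, crop the facial region, classify the pixels with an expression model, and map expressions to emotions. This is not Affectiva’s code or API; the face detector comes from OpenCV, and `expression_model` and the emotion-mapping rules are hypothetical placeholders.

```python
# Illustrative sketch only -- NOT Affectiva's implementation. It mirrors the
# pipeline described above: face detection -> facial regions -> expression
# classification -> emotion mapping.
import cv2

# Step 1: detect a human face (Haar cascade bundled with OpenCV).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_faces(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Steps 2-3: crop each face and let an expression classifier analyze the pixels.
# `expression_model` is a placeholder for any trained model (e.g. a small CNN)
# that returns scores for expressions such as "smile" or "brow_furrow".
def classify_expressions(image_bgr, expression_model):
    results = []
    for (x, y, w, h) in detect_faces(image_bgr):
        face_pixels = image_bgr[y:y + h, x:x + w]
        results.append(expression_model.predict(face_pixels))
    return results

# Step 4: map combinations of expression scores to coarse emotion labels.
def map_to_emotion(scores):
    # Toy rules for illustration; real systems use far richer mappings.
    if scores.get("smile", 0) > 0.7:
        return "joy"
    if scores.get("brow_furrow", 0) > 0.7:
        return "anger"
    return "neutral"
```

Real products also locate facial landmarks (eyebrow corners, nose tip, mouth corners) rather than working from raw crops, but the overall flow is the same.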


Affectiva is not alone in applying affective computing. Below are some other companies using this technology to shape the future of AI.

AI these days is being built for specific purposes, in contrast to the know-it-all AI found in science fiction. In the following companies’ case studies, the AI is built around voice and productivity, respectively.

Beyond Verbal

Beyond Verbal is an emotion analytics company. Our voices naturally convey emotions, and Beyond Verbal is tapping into the analytics behind them. The company describes its mission as follows:

Emotions drive everything we do, yet voice-driven emotions analytics remain the most important, unexplored interface today. Beyond Verbal is committed to change that.

Language is not a barrier either, as the company demonstrated by running a live video analysis of the emotions of Jack Ma, founder and executive chairman of Alibaba Group.

The company has built a virtual private assistant as well as an API for developers.
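To give a feel for what voice-driven emotion analytics starts from, here is a minimal sketch of acoustic feature extraction in Python using the librosa library. It is not Beyond Verbal’s API or model; the features shown (pitch, energy, MFCCs) are simply common inputs that such systems tend to build on.

```python
# Illustrative sketch only -- NOT Beyond Verbal's implementation.
# Extracts a few acoustic features that voice emotion analytics commonly uses.
import numpy as np
import librosa

def extract_voice_features(audio_path):
    # Load the recording (librosa resamples to 22,050 Hz by default).
    y, sr = librosa.load(audio_path)

    # Pitch contour: how high/low and how variable the voice is,
    # a common cue for arousal and excitement.
    f0, voiced_flag, voiced_prob = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7")
    )

    # Short-term energy: louder, more energetic speech often signals stronger emotion.
    energy = librosa.feature.rms(y=y)[0]

    # MFCCs: a compact summary of vocal timbre, widely used as classifier input.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

    return {
        "mean_pitch_hz": float(np.nanmean(f0)),
        "pitch_variability": float(np.nanstd(f0)),
        "mean_energy": float(energy.mean()),
        "mfcc_means": mfcc.mean(axis=1),
    }

# A real system would feed features like these into a trained emotion classifier.
```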

Humanyze

Performance at work comes with its own set of analytics. Every company hopes for satisfied and inspired employees, and understanding people analytics can help achieve that. Although it is reminiscent of George Orwell’s 1984, Humanyze’s badge module is essentially an affective computing wearable. It allows managers to gather data on collaboration and communication networks in order to find strong interactions, along the lines of the sketch below.
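As a rough illustration of how wearable interaction data could become a communication network, here is a small sketch in Python using networkx. This is not Humanyze’s platform; the interaction log, the 30-minute threshold, and the scoring are hypothetical.

```python
# Illustrative sketch only -- NOT Humanyze's implementation. Turns a hypothetical
# log of badge-recorded interactions into a weighted communication graph and
# surfaces the strongest ties.
import networkx as nx

# Hypothetical interaction log: (employee_a, employee_b, minutes of conversation).
interactions = [
    ("alice", "bob", 35),
    ("alice", "carol", 5),
    ("bob", "carol", 50),
    ("bob", "dave", 12),
]

# Build a weighted, undirected communication graph.
graph = nx.Graph()
for a, b, minutes in interactions:
    if graph.has_edge(a, b):
        graph[a][b]["weight"] += minutes
    else:
        graph.add_edge(a, b, weight=minutes)

# "Strong interactions": pairs whose accumulated conversation time passes a threshold.
strong_ties = [
    (a, b, data["weight"])
    for a, b, data in graph.edges(data=True)
    if data["weight"] >= 30
]

# Weighted degree as a rough score of who collaborates the most overall.
collaboration_score = dict(graph.degree(weight="weight"))

print(strong_ties)        # e.g. [('alice', 'bob', 35), ('bob', 'carol', 50)]
print(collaboration_score)
```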


Thanks for reading!

We hope you enjoyed our brief dive into affective computing! Want more tech trends? We’ve got you covered.
