Right now, being angry at your computer does nothing, but what if your software could take your mood into account? Affective computing lets a computer detect and interpret your emotional state (affect) and use it as a form of input.

Artificial (Emotional) Intelligence

In 1995, MIT researcher Rosalind Picard published a paper outlining the fundamentals of affective computing, followed in 1997 by a book of the same name. The idea is to imbue computers with emotional intelligence (EQ) in addition to the analytical intelligence that makes them so useful.

Affective computing allows a computer system to scan a person's emotional indicators, such as facial expression, vocal tone, body language, and word choice, for information about their mental state.

Once the computer is confident about what its user is feeling, it reacts in a way that's (hopefully) beneficial to the user. There are plenty of ways computers can use this information.

Related: Clippy Is Back in Windows 11's Latest Update

Remember Clippy the Microsoft Office assistant? Imagine that Clippy could tell when you were actually frustrated and only popped up when you really needed help, instead of when you were just trying to get your work done.

Affective computing could also be used to great effect in games, in virtual reality applications, and in natural computer interfaces such as Siri.

Computers Are Getting Good at Faces

Humans display emotions in a variety of ways, but our faces are the main canvas where we paint our feelings for the world to see. Even the best poker face can't hide tiny microexpressions, although it's still not clear how those should be interpreted.

Related: How Does Facial Recognition Work?

When the original paper on affective computing was written, the challenge of getting a computer to recognize and interpret a human face was truly daunting. Now we have efficient machine learning hardware in our gadgets that can recognize and map a face in fractions of a second.

Of course, recognizing and mapping a face isn't enough on its own to extract affective information from it, but at least the raw facial data is now easy to capture. The same machine learning technology, fed with heaps of labeled facial data, will probably tease out the emotional cues needed to make affective computing work well.
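To give a rough sense of how little code the face-detection step takes today, here's a minimal Python sketch using OpenCV's bundled Haar-cascade detector. The emotion-classification step (`classify_emotion`) is a hypothetical placeholder, not a real API, since that's exactly the part researchers are still working out.

```python
# Minimal sketch: detect faces with OpenCV, then hand each face crop to a
# (hypothetical) emotion classifier. Requires: pip install opencv-python
import cv2

# OpenCV ships with pre-trained Haar-cascade face detectors
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

def classify_emotion(face_image):
    """Hypothetical placeholder for an emotion model (e.g., a network
    trained on labeled facial-expression data). Not a real API."""
    return "neutral"

def read_affect(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Returns a list of (x, y, w, h) rectangles, one per detected face
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [classify_emotion(frame[y:y + h, x:x + w]) for (x, y, w, h) in faces]

# Example: grab one frame from the default webcam and print detected moods
camera = cv2.VideoCapture(0)
ok, frame = camera.read()
camera.release()
if ok:
    print(read_affect(frame))
```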

We're Treating Our Computers More Like People

Computer interfaces are becoming more like us every day. Living beings like humans take millions of years to change, but our computers are changing and improving at lightning speed.

In the early days, simple computers needed us to adapt to them: punched cards, cryptic programming languages, and command prompts, which eventually gave way to the graphical user interfaces of today. Touch screens have made computers easier for everyone to pick up and use because they translate our innate spatial intelligence into a digital format.

Today, computers are powerful enough to understand natural speech, voice assistants are everywhere, and you're increasingly likely to deal with a virtual agent when you ask for help or information.

As computer interfaces get ever more intuitive and natural, adding emotional information to the interaction could transform how well those interfaces work.

Related: How to Use a Voice Assistant Without It "Always Listening"

Emotions Are Hard for People Too

Even though we've evolved to understand and express emotions, humans get it wrong all the time. While some people seem to have an almost supernatural level of emotional intelligence, for most of us reading emotions remains a complex task.

So while affective computing sounds like a great idea on paper, in practice it's not so simple, even with all the amazing new technology we have. It's reasonable to expect that the first mainstream systems to use this approach will focus on a small set of crude emotional expressions.

If your computer knows you're exhausted, it might suggest taking a break. If it knows that certain pictures in your wallpaper slideshow make you happier than others, it could put those on high rotation or add more images that are similar.
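To make the wallpaper example concrete, here's a tiny sketch of how a slideshow might favor images according to a hypothetical happiness score learned from your reactions; the scores and the `pick_wallpaper` helper are invented purely for illustration.

```python
import random

# Hypothetical happiness scores (0 to 1) the system has learned per wallpaper
happiness_scores = {
    "beach.jpg": 0.9,
    "mountains.jpg": 0.7,
    "office_carpet.png": 0.2,
}

def pick_wallpaper(scores):
    """Pick the next wallpaper, favoring images that scored higher."""
    images = list(scores)
    weights = [scores[name] for name in images]
    return random.choices(images, weights=weights, k=1)[0]

print(pick_wallpaper(happiness_scores))
```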

Clearly, there are many ways affective computing could benefit us, but don't expect it to be perfect from day one!

The Dark Side of Affective Computing

Affective computing represents an important leap in how people interact with machines, but it also opens users up to new forms of exploitation.

Marketing psychology is already adept at manipulating our emotions to change our buying behavior. That's why a car advertisement focuses on how a car will make you feel rather than how much horsepower it has or how fuel-efficient it is.

A good chunk of our decision-making is driven by emotion, so imagine if social media companies could read your emotional reaction to posts or ads. One day you might have to grant an "emotional scanning" permission alongside the ones for your camera and microphone.

Related: 5 Psychological Tricks in Free-To-Play Games (and How to Avoid Them)