Should Technology Be Able To Harness Our Emotions?

5 min read

I overheard someone recently say that we’ve reached the ‘Age of Emotional Machines’. Quite a sweeping statement perhaps, but it’s hard to deny that these past couple of years have seen intense advancements in the way that machines react to human needs. Affective computing has taken off, and it’s clever. Very clever; overconfident-University-Challenge-contestant kind of clever. For those of you who haven’t heard of it before, affective computing – a phrase coined in the 1990s – refers to the development and creation of technology that reads and responds to human emotion. Yep. Your computer or phone might soon be able to gauge exactly when you’ve had a good day at work, or sense when you’ve arrived home and want to hit something. Which is more than a lot of other people can do, to be honest.

So how does it work? Good question. In brief, affective computing uses webcams, microphones and other sensors, together with analysis software, to recognise and interpret human physical signals – from body language to facial expressions and vocal cues. It then uses this data to monitor, mimic or manipulate its behaviour accordingly. Medical technology, for example, could detect early warning signs of a seizure or suicide attempt and call for assistance before the situation escalates. Early versions of the software have been used to help people with autism interact more easily with others, and to give counsellors and tutors working online a more accurate read on how their patients or students are responding to the content.
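To make that sense–analyse–respond loop concrete, here is a deliberately toy sketch in Python. Every signal name, threshold and response in it is a hypothetical illustration of the idea, not any real product's API – actual systems would feed in measurements from face trackers, microphones and other sensors.

```python
# Toy sketch of the affective-computing loop: sense -> analyse -> respond.
# All signal names, thresholds and responses below are hypothetical.

from dataclasses import dataclass


@dataclass
class Reading:
    """One snapshot of sensed signals (all values are illustrative)."""
    smile_intensity: float  # 0.0-1.0, e.g. from a webcam face tracker
    voice_pitch_var: float  # normalised pitch variance from a microphone
    typing_speed: float     # keystroke rate relative to the user's baseline


def classify_mood(r: Reading) -> str:
    """Crude rule-based mood estimate from the sensed signals."""
    if r.smile_intensity > 0.6:
        return "happy"
    if r.voice_pitch_var > 0.7 and r.typing_speed > 1.3:
        return "agitated"
    if r.smile_intensity < 0.2 and r.typing_speed < 0.7:
        return "low"
    return "neutral"


def respond(mood: str) -> str:
    """The 'react accordingly' step: map an estimated mood to an action."""
    responses = {
        "happy": "no action needed",
        "agitated": "suggest taking a break",
        "low": "offer support resources",
        "neutral": "keep monitoring",
    }
    return responses[mood]
```

For instance, `respond(classify_mood(Reading(0.1, 0.8, 1.5)))` would suggest taking a break. Real systems replace these hand-written rules with trained models, but the overall shape of the loop is the same.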

The implications and potential for such advanced systems are staggering, and it's not just scientific fields that stand to benefit from affective computing – this could be a breakthrough for the world of advertising too. Businesses spend billions of pounds every year attempting to add personality to their brands, and for good reason – it works. There's a huge payoff for companies that make an effort to build emotional connections with their customers. A study from the Harvard Business Review listed the need for consumers to "feel a sense of belonging" and "thrill" as two extremely important factors that drive behaviour. Since brands need to sustain our emotional interest to gain an edge over their competitors, this new software – capable of mood-tracking and personalising digital adverts en masse – would be able to tick many of those boxes, carving out a captivated and involved audience.

But as with any situation where technology starts to get a little big for its mechanical boots, a whole host of ethical issues tag along for the ride. One particular risk with affective computing lies with the wellbeing of those who get targeted. Body image is a big thing for so many people, and certain industries stand to cash in if their marketing manipulates those deemed emotionally vulnerable. A girl with confidence issues, for example, might be more susceptible to a beauty brand that has captured her data and specifically shows her adverts playing on her insecurities. And what about when it comes to important democratic decisions, such as another election or Brexit scenario?

According to leading company Affectiva, there have already been attempts by government agencies to use this technology to monitor our emotional responses to campaigns. On a larger scale, this could give parties the potential to sway voters like never before. We've only got to look back over 2016, with its catty presidential campaigns and unstable politics, to see that using technology to influence people's emotions might be a step too far.

The other primary problem with affective computing is its intrusive qualities. Should a computer be able to read you like it does a line of code, for example? Emotion is exactly what makes us human, and it plays into every aspect of our lives; we are incapable of doing very much without an emotive reason or response. It therefore seems bizarre to start creating emotional ‘databases’ as though our feelings are just another logical field of science we have yet to properly make use of. Surely emotion, something so fluid and hard to quantify, can’t be analysed so simply.

As pronounced as some of the benefits of affective computing are (and for medical advancements I fully support its uses), I have a hard time imagining this new technology working effectively for everyone. If computers are always doing everything in their power to make our lives stress-free and happy, are we not going to end up missing out on important information, trapped in a fantasy world that aims to smooth over the broad range of emotions that make us human?

Affective computing also runs a huge risk of interpretive errors. Machines are rational, programmed logically; emotions are quite the opposite. Memories and associations can make the most mundane of images or information mean a little or a lot to different people, so it seems crazy that a computer could ever be intelligent enough to comprehend this. You can’t always explain why sometimes you love a song or book that your friend can’t stand, or why some things just strike you as funny – they just do. I once cried with laughter for at least five minutes at a picture of a fish.

Does that mean that I'll collapse into giggles as soon as I come across another? Of course not. But if I'd been emotionally analysed on that particular day, my computer might well have spent weeks showing me pictures of salmon or cod in an attempt to make me smile. My point here is that without context, emotional analysis actually counts for very little. A laptop might be able to recognise that you reacted badly to a joke about money, for example, but it doesn't know that only last week you and your partner were arguing about the rent, or that you still need to pay your mate back for a pint or three. As humans we are still often incapable of correctly reading someone else's body language, so who's to say that a computer or phone, designed by the same humans, is going to do a better job?

There still remains a huge disconnect between humans and machines, and for good reason. 2017 is all set to be another year of advancements in this field, so it will be fascinating to see where this technology goes next. Bringing emotional intelligence into the digital world is a bold move, and certainly a life-changing one, but not one that I can see going smoothly.