
When life imitates life, what makes us human?

We can teach machines emotional intelligence by exploring the way humans understand each other.

In 1896, Russian author Maxim Gorky wrote of his first experience watching a motion picture:

Last night I was in the Kingdom of Shadows. If you only knew how strange it is to be there. It is a world without sound, without colour. Everything there — the earth, the trees, the people, the water and the air — is dipped in monotonous grey. Grey rays of the sun across the grey sky, grey eyes in grey faces, and the leaves of the trees are ashen grey. It is not life but its shadow. It is not motion but its soundless spectre.

Gorky was experiencing the dual repulsion and attraction of the uncanny: the sourness of a burgeoning technology that was able to capture the human experience in part, yet not truly replicate it. 120 years later, it’s the same feeling we get when we hear Alexa blundering through casual niceties or Google Assistant booking a hair appointment with perfectly imperfect intonation. It’s the slightly repellent feeling that something isn’t quite what it seems. In the pursuit of making artificial intelligence less artificial, what Gorky was missing still eludes us today: a recognition and expression of human emotion.

Moving pictures have come a long way since the silent, flickering images of the early twentieth century: they convey and elicit emotion through a host of techniques merging the aural and visual. For two hours, we enter a different world and ride an emotional roller coaster, but then we step off the ride and back into our reality. We never quite fully let go of the knowledge that it’s only a movie.

We now face an unprecedented paradigm shift, and it’s no longer a matter of art imitating life. This is life imitating life. Researchers at PwC recently predicted that by the mid-2030s, 30% of jobs will be at risk of complete automation: 45% of jobs in manufacturing, 39% in construction, and 34% in retail could be replaced by machines. These are only predictions, of course, but the inexorable onward march of artificial intelligence is forcing us to grapple with deep philosophical questions as we slowly yawn ourselves awake into the dawn of the autonomous machines.

What does it mean to be human? How do we keep hold of our humanity as machines take over the more fundamental functions of life? In a world of increasing automation, where do we find serendipity, empathy, and compassion? I’ve been following the Frankenstein AI project at my alma mater, Columbia University, as it explores the intersection of machine intelligence and human emotion using a range of media, experiences, and AI. It’s a fascinating attempt to come to terms with this brave new world, with storytelling at its foundation and a desire to understand human needs as the basis for any technological blueprint.

Frankenstein AI is challenging dystopian visions of an automated future by exploring the ways humans understand each other, and how we might use this knowledge to teach machines emotional intelligence. The point is clear: a better future is one in which we augment the human experience, not eliminate it.

As product designers, we have a responsibility to use our powers for good. To inform data with the most human of things: emotion. This will enable AI to do what it’s best suited for, the things humans can’t do on their own, and make their lives ever better. (As designers we’re lucky: creativity is likely one of the last human skills machines will be able to replace.) This is already having a profound impact on the process of research and design.

For almost twenty years, designers have talked about being “user-centred”; we’ve waxed eloquent about how data sits at the heart of our understanding of people; we’ve gently nudged all sorts of metrics by delivering “user value”. But one fundamental thing has always been missing: a real-time triangulation of individual emotion, context, and desire.

At Constellation AI, we’re using the full research and design toolkit in new and wonderful ways to build a picture of what it means to be human — and how our AI can learn from this. We’re investigating how context changes expectations and desires by building out real-time, data-driven mental models. We’re using digital diary studies to monitor changes in emotion as people interact with others throughout their day, and we’re exploring ways to visualise this data. We’re building incredibly detailed maps of the human experience, charting life’s pain and pleasure points, and we’re training our AI to use a complex range of factors — including emotion, context, and desire — to help individuals uncover insights. We’re learning from subject matter experts across a range of fields — from therapists to biographers — about how to design conversation that allows the user to easily engage with their own narrative; however, whenever, and wherever they like.
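To make the diary-study idea concrete, here is a minimal sketch of what such data might look like and how a first pass at linking emotion to context could work. Everything here is hypothetical — the entry fields, the valence scale, and the aggregation are illustrative assumptions, not a description of Constellation AI’s actual tooling.

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import mean

@dataclass
class DiaryEntry:
    """One self-reported moment from a digital diary study (illustrative)."""
    timestamp: datetime
    context: str        # e.g. "commuting", "lunch with friend" (hypothetical labels)
    valence: float      # assumed scale: -1.0 (negative) .. 1.0 (positive)
    note: str = ""

def mean_valence_by_context(entries):
    """Average reported emotion per context: a crude first step toward
    triangulating emotion with context."""
    by_context = {}
    for entry in entries:
        by_context.setdefault(entry.context, []).append(entry.valence)
    return {ctx: mean(vals) for ctx, vals in by_context.items()}

# A tiny, invented sample of one person's day.
entries = [
    DiaryEntry(datetime(2019, 3, 4, 8, 30), "commuting", -0.4),
    DiaryEntry(datetime(2019, 3, 4, 12, 0), "lunch with friend", 0.7),
    DiaryEntry(datetime(2019, 3, 4, 17, 15), "commuting", -0.2),
]
print(mean_valence_by_context(entries))
```

In practice the interesting work starts where this sketch ends — folding in desire and moment-to-moment context rather than coarse labels — but even this simple aggregation is enough to surface patterns worth visualising.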

We have the opportunity to use data in ways we could only dream of a few years ago: to understand a person’s mental profile and then deliver a highly individualised experience based on that understanding. The days of recommendations built on aggregate assumptions are fading. Soon, we’ll be able to positively augment any aspect of our lives with intimate understanding; from relationships to health, from career to finance.

Imagine Gorky sitting down to watch his film, but this time it’s created for him. Frame by frame, its look and feel is designed to augment his mood, gauging his positive responses. AI composes the music on the fly based on an intimate knowledge of Gorky’s unique mental state at the time, monitoring his physical reaction and adjusting accordingly. The story unfolds with individualised knowledge of Gorky’s needs at each and every revelation of the narrative arc. It would be a Gesamtkunstwerk in the truest sense of the word, created just for him. Farewell, he’d doubtless say, to the Kingdom of Shadows.

Guest blog by Christopher Lee Ball — VP, Research & Design, Constellation AI

Christopher has designed experiences for Thomson Reuters, BBC, Samsung, Volvo, BNP Paribas and Digitas, working on notable projects such as BBC iPlayer and the Virgin Atlantic website. He has led teams to win multiple awards, sits on the awards panel for D&AD, and teaches user experience design at D&AD, Hyper Island, and the University of Graz. Watch Christopher talk about creativity in the age of the 4th industrial revolution here.