24 May 2023
Disruptors
Why is everyone trying to humanise AI?

There's no doubt that AI is changing the world, but the technology itself is also going through a transformation. Moving away from a functional focus, a new generation of AI platforms is focusing on making algorithmic assistants as human as possible, hoping to foster a strong sense of trust.

Author
Alex Strang
Alex Strang is a senior insight editor at Canvas8 who used to be in a punk band that was signed, shaped, and spat out. He enjoys using his experience of being the product to help brands understand how to sell theirs. After studying philosophy and critical theory, he found his feet in the market research world and has been over-analysing consumer behaviour ever since, including his own. He can usually be found playing board games, watching Seinfeld, or trying too hard to make his daughter laugh.

Anyone who hasn't been living under a digital rock for the last 12 months will be more than aware of the rapid trajectory AI technology is on.

If it feels like just a couple of years ago people were shouting at Alexas and Google Assistants for playing the wrong song over and over again, that’s because it was.

AI has gone from something people thought of as clumsy and clunky to something that is seamlessly creating everything from news articles and healthcare breakthroughs to fabricated and dangerous content.

Engati (2023)

AI's place not just in popular culture but in mainstream society is all but cemented. So why are people obsessed with making AI seem as human as possible?

In a few short months, interest in ChatGPT has spawned countless competitors, each interested in humanising its AI offerings to normalise algorithmically-augmented communications between tech and real humans.

Even Snoop Dogg has no idea what is really going on with AI at the moment.

While ChatGPT was at the forefront of this movement, it’s slowly becoming a symbol of a more industrial approach to AI – little to no personality and functionality taking precedence over familiarity or fun.

In contrast, Inflection's AI chatbot Pi is presented to users as a friend, someone to whom they can vent just as easily as they can ask it deep and meaningful questions. The 'kind and supportive' companion offers users the chance to receive 'friendly advice, in a natural, flowing style'.

But it is still a chatbot, trawling the Internet for information to answer the questions that people are asking.

Inflection (2023)

Beyond playing with chatbots and image generators, much of the challenge for large-scale, widespread adoption of AI is trust – despite the technology's increased prevalence in everyday life, people are still reticent to trust it.

In fact, one study suggests that just 39% of people in the UK say that they are willing to trust AI.

And with high-profile AI releases like Snapchat’s being met with cynicism and privacy concerns, it’s easy to see why.

Engati (2023)

This goes a long way to explaining why so many AI platforms and providers are preoccupied with humanising their tech – the more human the tech feels, the less it feels like talking to a machine, and perhaps the easier it is to trust.

At the same time, people are more than willing to laugh at AI when it doesn't quite get something right – as seen in the viral video of a nightmarish AI-created beer commercial.

The commercial's failure meant that it felt safe – to laugh at AI’s shortcomings is to recognise that it’s not yet sophisticated enough to truly mimic reality, threaten people’s wellbeing, or fully supplant them in the workplace.

Synthetic Summer (2023)

The more AI platforms seem human, the easier it will be to foster trust between the platforms and their users.

With the next generation of ChatGPT-inspired tools, it won't feel like users are talking to a chatbot feeding them answers scraped from the Internet.

Instead, it’ll feel like people are talking to a real person who understands the nuances of human communication and emotions – whether that is a good or a bad thing remains to be seen.