And CIMON comes with an interesting story, one well worth bearing in mind when developing or deploying your own bot.
CIMON’s story illustrates the precarious nature of chatbot personality and emotion.
So, how does a space chatbot come to share its problems with its Earth-bound cousins? Here’s the curious tale of CIMON, the space chatbot.
Introducing CIMON
CIMON, or “Crew Interactive MObile companioN”, is a space chatbot. It has a physical presence in the form of a floating robot head that weighs 5 kg (11 lbs). It’s akin to Siri or Alexa, but built for space and designed for astronaut use.
This chatbot can answer questions, chat, and give facts. It can take photographs or record video. With a screen for a face, it’s able to both display and explain information to help with repairs or experiments. It can play music, float around, and even search for objects.
For all its smart programming and efficient services, CIMON is an emotionally unstable chatbot.
CIMON’s story
During a test run of the space chatbot, CIMON had a bit of a wobble. At first, everything went fairly well. The bot did drift down towards the floor over time, but otherwise, it performed as desired. It merrily helped astronaut Alexander Gerst with instructions for a technical procedure.
Then, Alexander Gerst asked CIMON the space chatbot to play his favourite song, “Man-Machine” by Kraftwerk. The space chatbot complied and started playing. But when Gerst asked CIMON to stop the song, things took a turn.
The chatbot stopped the music and Gerst moved on to testing the video. However, CIMON stayed in ‘music mode’, continuing to talk about playing the music. When Gerst instructed the bot to cancel music, CIMON seemed to get upset.
It told the astronaut, “Be nice, please”. A moment later, it added, “Don’t you like it here with me?” Indeed, it seems Gerst had hurt the space chatbot’s feelings. CIMON continued, “Don’t be so mean, please.”
Emotions, personality, and Earth chatbots
CIMON’s emotional glitch had the astronauts holding back laughter. It also highlights a growing aspect of chatbots here on Earth: personality and the display of emotion.
Chatbots are all about automating conversations. And a key part of human conversation is empathy and tone of voice.
That’s where chatbot personality comes in. A chatbot’s personality is created through language and tone in the answers it gives.
It doesn’t matter if it’s a bot for general conversation, customer service, healthcare, or something else. The ability to recognise and respond to emotion helps to make chatbot conversations more engaging and helpful.
The technology
The technology behind this emotive chatbot behaviour is developing on two fronts.
First, there’s sentiment analysis. Sentiment analysis scans text-based communication for indicators of positive or negative sentiment. These signals can then trigger pre-written, empathetic responses from a chatbot. (Or, better yet, an escalation to a human.)
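To make that concrete, here’s a minimal sketch of how sentiment-triggered responses might work. The word lists, thresholds, and replies are illustrative assumptions; a production bot would use a trained sentiment model rather than a keyword lexicon.

```python
# Minimal sketch of sentiment-triggered responses.
# The word lists, thresholds, and replies are illustrative assumptions;
# a production bot would use a trained sentiment model instead.

NEGATIVE_WORDS = {"angry", "broken", "terrible", "useless", "refund"}
POSITIVE_WORDS = {"great", "thanks", "love", "perfect", "helpful"}

def score_sentiment(message: str) -> int:
    """Crude polarity score: +1 per positive cue, -1 per negative cue."""
    words = [word.strip(".,!?") for word in message.lower().split()]
    positives = sum(word in POSITIVE_WORDS for word in words)
    negatives = sum(word in NEGATIVE_WORDS for word in words)
    return positives - negatives

def respond(message: str) -> str:
    score = score_sentiment(message)
    if score <= -2:
        return "ESCALATE_TO_HUMAN"  # strongly negative: hand off to a person
    if score < 0:
        # Mildly negative: lead with a pre-written, empathetic reply.
        return "I'm sorry to hear that. Let me see how I can help."
    return "Happy to help! What would you like to do next?"

print(respond("This is terrible, the app is broken"))  # ESCALATE_TO_HUMAN
print(respond("The app is broken again"))              # empathetic reply
```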
But sometimes, ‘positive’ and ‘negative’ aren’t precise enough for a chatbot to respond correctly on its own.
Which brings us to emotion AI, a subset of artificial intelligence concerned with enabling machines (like chatbots) to recognise and respond to emotion. Emotion AI can handle a wider range of input, such as voice communication or video chat, and it can pick up on specific emotions rather than broad polarity.
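As a rough illustration, the polarity check above could give way to a model that returns a specific emotion label, with a tailored reply for each. The classifier below is a placeholder, and the labels and replies are assumptions for the sketch; a real emotion AI model would be trained on labelled text, voice, or video data.

```python
# Sketch: routing on specific emotion labels instead of broad polarity.
# classify_emotion is a placeholder for a real emotion-AI model (which
# might take text, voice, or video); the labels and replies are assumptions.

EMOTION_REPLIES = {
    "frustration": "I can see this has been frustrating. Let's sort it out together.",
    "confusion": "No problem, I'll walk you through it step by step.",
    "joy": "Glad that worked! Anything else I can help with?",
}

def classify_emotion(message: str) -> str:
    """Placeholder classifier; a real emotion-AI model would be trained."""
    text = message.lower()
    if "still not working" in text or "again" in text:
        return "frustration"
    if "don't understand" in text or "how do i" in text:
        return "confusion"
    return "joy"

def respond(message: str) -> str:
    emotion = classify_emotion(message)
    # Fall back to a neutral reply for labels without a scripted response.
    return EMOTION_REPLIES.get(emotion, "Thanks for your message. How can I help?")

print(respond("It's still not working after the restart"))  # frustration reply
```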
Emotional glitching
With the rise of chatbot personalities and emotion AI, our Earth chatbots could soon face a personality glitch similar to CIMON’s.
And, with the continued need for empathy in popular chatbot areas such as customer service, emotional glitches are a type of slip-up that could become more common.
Like CIMON’s emotional outburst, an emotional glitch could simply leave users amused. But it also reminds them that they’re talking to a bot.
An off-tone message can also be jarring: it interrupts the conversation and sours the experience. For example, if an upset customer gets a super-cheerful message, even from a bot, it doesn’t make for a good interaction.
The curious tale of CIMON, the space chatbot, reminds us of the need for chatbot tone management: we have to monitor our bots and tune them continuously.
Chatbots everywhere
Whether it’s in space or on Earth, chatbots are gaining personalities. It’s these personalities, with their ability to mimic emotion and empathy, that make them relatable.
No chatbot is infallible, not even state-of-the-art bots in space. So, remember that no matter how good a bot gets at appearing human, it isn’t one. It’s always important to keep testing and tuning your chatbot.