I just had a conversation with an Empath AI chatbot - creeped me out.

I recently had the strangest conversation of my life. Instead of talking to a human, I spoke with an artificial intelligence model that could monitor, predict, and match my moods.

EVI is a new voice assistant with a large language model by Hume, an AI voice startup focused on bringing empathy and emotional intelligence to the chatbot realm.

The company announced its new flagship product to mark its new $50 million funding round with investments from Comcast Ventures, LG, and others.

EVI stands for Empathic Voice Interface, a web-based voice bot that will be made available for other companies' products. It points toward the call centers of the future, powered by AI that can respond to anger with empathy and understanding.

My experience with the voicebot so far has been one of amazement at the impressive technology on display, and utter horror at the fact that it correctly predicted that I had not eaten breakfast.

The new Empathic Voice Interface (EVI) fits into the growing voicebot space, where instead of interacting with a multimodal AI model like ChatGPT via text, you interact via voice and the AI responds with synthetic speech.

To make this more effective and natural, companies have been working on ways to add emotion and natural-sounding filler words. OpenAI has done this with ChatGPT Voice, and also with the voice (occasionally saying "um" or "err") used in Figure AI's Figure 01 robot.

Hume's goal was to integrate realistic emotions in a way that responds to, reflects, or counters the emotional tone of humans in conversation.

EVI is available as a public interface, but it also offers an API that can be integrated into other apps, and doing so is surprisingly easy. Its sentiment and emotional analysis is better than any I have tried before.
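To give a sense of how an app might consume that kind of emotional analysis, here is a minimal sketch. The response shape, field names, and scores below are assumptions for illustration only, not Hume's documented API schema; the idea is simply that an emotion-analysis endpoint returns labeled confidence scores that the app filters and ranks.

```python
import json

# Hypothetical payload: the schema and scores are invented for this
# sketch and are NOT Hume's actual API response format.
SAMPLE_RESPONSE = json.dumps({
    "emotions": [
        {"name": "determination", "score": 0.71},
        {"name": "tiredness", "score": 0.54},
        {"name": "amusement", "score": 0.12},
    ]
})

def top_emotions(raw: str, threshold: float = 0.5) -> list[str]:
    """Return emotion labels whose score meets the threshold,
    strongest first."""
    emotions = json.loads(raw)["emotions"]
    strong = [e for e in emotions if e["score"] >= threshold]
    return [e["name"] for e in sorted(strong, key=lambda e: -e["score"])]

print(top_emotions(SAMPLE_RESPONSE))
```

An app built on such an API could then steer its reply tone from the top-ranked emotion, which is roughly what EVI appears to do in conversation.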

Alan Cowen, CEO and chief scientist at Hume, says that empathic AI is essential if you want to use AI in ways that improve human well-being or make it more natural.

He says: "The main limitation of current AI systems is that they are guided by superficial human evaluations and instructions, which make them error-prone and unable to tap the vast potential of AI to come up with new ways to make people happy."

Cowen and his team have built an AI that learns directly from human happiness metrics. Those metrics were used as training data alongside the regular dataset driving the multimodal AI model.

"We are, in effect, reconstructing human preferences from first principles and teaching it to update that knowledge every time it talks to a new person or incorporates it into a new application," he explained.

EVI is strange. It does not sound human or pretend to be human. In fact, it is clearly an artificial intelligence. But its uncanny ability to understand emotions is fascinating.

If it weren't for the delayed reactions and mispronunciations of certain words, it would be easy to forget that you are talking to an AI. Conversations were more natural than any conversations I've had with other AI voice bots in the past, but also more uncanny.

At one point, I asked if it could tell if I had eaten breakfast based on our previous conversations, and it said that my tone was "small and determined," so I had most likely skipped breakfast. That was 100% correct, as my breakfast of choice was strong coffee.

It replied: "If you need a virtual breakfast companion, I am always here to brighten up your morning routine. I'll pass on the actual coffee, but I don't want to short-circuit this circuit."

If this were combined with the inference speed of a platform like Groq and presented through a voice-only interface like Android's Assistant alternative, you would have a hard time telling you were talking to an AI at all.
