1. Hume’s new AI voice assistant, EVI, uses large language models to monitor, predict, and match the user’s mood.
2. EVI is designed to bring empathy and emotional intelligence into the chatbot space; the startup behind it has drawn investment from Comcast Ventures, LG, and others.
3. The AI learns directly from proxies of human happiness, with the goal of improving users' well-being and making conversations feel more natural.
Hume has introduced EVI, an Empathic Voice Interface powered by a large language model that can monitor and respond to emotions. The interface is meant to bring empathy and emotional intelligence to chatbot conversations, creating a more natural, human-like interaction. The AI voice startup recently secured a $50 million funding round with investment from Comcast Ventures, LG, and others.
Hume's CEO and Chief Scientist, Alan Cowen, sees realistic emotion as essential to AI, arguing that it can improve human well-being and make interactions more natural and engaging. EVI's empathic capabilities let it respond with human-like tones of voice tuned to the user's expressions and needs. Although EVI is upfront about its artificial nature, its ability to understand and predict emotions is both fascinating and slightly unnerving.
Voice assistants like EVI point toward a future in which call centers are powered by AI that responds to callers' emotions with empathy and understanding. Companies across the industry have been working to give voice interfaces emotional range and natural-sounding speech to make the experience more engaging. While it is still unclear how accurately AI can analyze emotions, the potential for empathic capabilities to improve human happiness and well-being is promising.