Amelia’s 3D human avatar, based on real-life model Lauren Hayes, was recently highlighted on ServiceNow’s Workflow blog. As the public becomes more comfortable interacting with AI, the ability to mimic wordless communication will become increasingly vital.
One of the most exciting frontiers in Artificial Intelligence (AI) is how AI-powered cognitive agents like Amelia allow humans to interact with technology using natural language, just as they would when talking to another human. However, there is much more to human communication than understanding and processing words and phrases. The human face is just as important in conveying information and emotion. That is why IPsoft has developed a 3D human avatar for Amelia (based on real-life model Lauren Hayes) that fosters natural, human-like interactions.
Amelia’s human avatar was highlighted on ServiceNow’s Workflow site in a blog post titled “The changing faces of AI.” IPsoft customers who utilize Amelia’s 3D avatar can open new dimensions of interactivity. Because IPsoft serves more than a dozen industries across a wide range of use cases, we’ve adopted a “mix-and-match” approach, according to IPsoft Chief Commercial Officer Jonathan Crane, who is quoted in the piece.
AI-powered virtual assistants, as the piece highlights, are growing at an astounding 25% annual rate and are predicted by Juniper Research to save businesses $8 billion per year in overhead costs by 2022. Not only can Amelia understand human speech, but her industry-leading sentiment analysis allows her to discern human emotional states, to which the avatar can be trained to react. As users become more comfortable interacting with cognitive agents, this additional dimension of communication will become an increasingly vital part of the user experience.