
Are you using the appropriate tone in your conversational AI?

Posted by Marbenz Antonio on September 28, 2022


Conversational AI doesn't have to sound robotic. Let your virtual agents express human emotion.

Conversational AI is too artificial

Nothing is more annoying than dialing a customer service number and being greeted by a robotic, monotone voice. Waiting for the voice on the other end of the line to read out the menu options is painful. Within seconds you're tempted to hang up, yell "representative" into the phone, or mash the zero button until a human agent answers. This is the issue with many current IVR solutions: conversational AI is overly artificial. Because customers feel they aren't being heard or listened to, they simply want to speak with a human agent.

IBM Watson Expressive Voices 

Fortunately, there is a solution that addresses this issue and improves the customer experience. With IBM Watson's latest expressive voice technology, you'll no longer feel like you're speaking to a conventional robot; instead, you'll feel like you're speaking to a real human agent, without the wait. These natural-sounding voices offer conversational abilities including expressive styles, emotions, word emphasis, and interjections. They not only spare customers the annoyance of talking to a robot but also help achieve the goal of diverting calls away from human agents. Both customers and businesses benefit.

By default, the voices use a conversational style, which makes them best suited to the customer service domain. They also support a neutral style that may be more appropriate for other use cases (newscasting, e-learning, audiobooks, etc.).

Emotions, Emphasis, Interjections

Whether we are aware of it or not, humans communicate emotion through speech. When we apologize, we usually show sympathy. When we don't know the answer to something, we sound uncertain, and when we do, we sound confident. The ability to express emotion is part of what makes us human. IBM Watson's expressive voices can convey emotion to better communicate the meaning behind the words, sparing customers much of the irritation of today's phone interactions. Your voice bot can sound sympathetic when informing a customer that their item has been delayed, or positive when it successfully helps them book an airline ticket.
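As a sketch of how this looks in practice, IBM Watson Text to Speech accepts SSML with an `<express-as>` element to set an expressive style. The style names below ("empathetic", "cheerful") are assumptions based on IBM's SSML documentation for expressive voices; confirm the exact values against the current service docs.

```python
# Minimal sketch: wrapping reply text in SSML <express-as> to convey emotion.
# Style names here are assumptions; check IBM's Text to Speech SSML docs.

def express(text: str, style: str) -> str:
    """Wrap text in an SSML <express-as> element with the given style."""
    return f'<speak><express-as style="{style}">{text}</express-as></speak>'

# A delayed-order apology delivered with empathy:
apology = express("I'm sorry, your order has been delayed.", "empathetic")

# A successful booking confirmation delivered cheerfully:
confirmation = express("Great news, your flight is booked!", "cheerful")
```

The resulting SSML string is what you would pass as the `text` parameter of a synthesize request.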

Emphasis is another important component of human communication. Did you say August or Austin? Did you say you lost the card ending in 4876? IBM expressive voices support word emphasis so that your bot can more effectively convey the intended meaning of the text. You can choose between four levels of emphasis: strong, moderate, reduced, and none.
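Word emphasis uses the standard SSML `<emphasis>` element. A small sketch of how the August/Austin question above might be marked up (the helper function is illustrative, not part of any IBM SDK):

```python
# Sketch: marking individual words for emphasis with standard SSML.
# The four SSML emphasis levels are "strong", "moderate", "reduced", "none".

def emphasize(word: str, level: str = "strong") -> str:
    """Wrap a single word in an SSML <emphasis> element."""
    return f'<emphasis level="{level}">{word}</emphasis>'

ssml = (
    "<speak>Did you say "
    f"{emphasize('August')} or {emphasize('Austin')}?"
    "</speak>"
)
```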

Interjections such as "hmm," "um," "oh," "aha," or "huh" are another aspect of human speech that IBM expressive voices now support, creating conversations that feel more natural and human-like. The new expressive voices automatically recognize these interjections in the text and render them as such, without any SSML (Speech Synthesis Markup Language) markup. You can also disable interjections when they are inappropriate (for example, "oh" can be an interjection or a way of reading the digit 0).
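Because interjections are detected automatically, plain text is enough to trigger them. Where a token like "oh"/"0" is ambiguous, standard SSML `<say-as>` can make digits explicit; this is a general SSML sketch, not necessarily the specific interjection-disabling switch IBM provides, so check the service docs for that option.

```python
# Interjections like "hmm" need no markup; plain text is sufficient:
reply = "Hmm, let me check that for you."

# Where a token could be misread, standard SSML <say-as> makes digits explicit:
def as_digits(number: str) -> str:
    """Wrap a numeric string so it is read digit by digit."""
    return f'<say-as interpret-as="digits">{number}</say-as>'

ssml = f"<speak>Your card ends in {as_digits('4876')}.</speak>"
```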

How to Get Started with Expressive Voices

Expressive voices and features became available first in US English in September 2022, with other languages following in early 2023. The American English expressive voices are Michael, Allison, Lisa, and Emma. Customers currently using the V3 versions of Michael, Allison, or Lisa shouldn't experience any disruption when switching to the expressive voices: each still sounds like the same speaker, just more natural and conversational. The new voices are simple to use; just specify the voice name in the API request, as you would for any other voice.
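A sketch of what switching looks like in code. The expressive voice IDs below follow IBM's published naming pattern (e.g. `en-US_MichaelExpressive`) but should be confirmed against the service's `GET /v1/voices` listing; the commented SDK call shows where the voice name goes.

```python
# Sketch: mapping speaker names to assumed expressive voice IDs.
# Verify exact IDs via the Text to Speech service's /v1/voices endpoint.
EXPRESSIVE_VOICES = {
    "Michael": "en-US_MichaelExpressive",
    "Allison": "en-US_AllisonExpressive",
    "Lisa": "en-US_LisaExpressive",
    "Emma": "en-US_EmmaExpressive",
}

def voice_for(speaker: str) -> str:
    """Look up the expressive voice ID for a given speaker name."""
    return EXPRESSIVE_VOICES[speaker]

# With the official Python SDK (ibm-watson), only the voice name changes:
# from ibm_watson import TextToSpeechV1
# audio = tts.synthesize(
#     "Hello!", voice=voice_for("Michael"), accept="audio/mp3"
# ).get_result()
```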



Here at CourseMonster, we know how hard it can be to find the right time and budget for training. We provide effective training programs that let you select the option that best meets your company's needs.

For more information, please get in touch with one of our course advisers today or contact us at training@coursemonster.com
