Exploring Conversational UX

I don’t like talking. I’m the type of person who can happily sit silently in a group. So I felt quite out of place when “Voice” was being touted as THE next big mode in interaction design. To me, speaking to an app sounded like a chore, but after jumping in to explore this newer form of interaction design and designing Astrology Zone for Alexa, I’m excited about what I’ve learned and what can be done.
Having a dialogue
Part of a designer’s job is to tell a story to the user. Conversational design is a way to have that dialogue with the user directly, to tell them your story through a conversation.
It’s not one-sided, though: your user needs some room to vary how they respond. How much natural language is recognized will depend greatly on the technology being used: Alexa is limited in her structure and recognition, the Assistant understands different phrasings more easily, and Siri falls somewhere in the middle. Not everyone speaks the same way, so you should anticipate that.
Think about how a conversation is structured and how people normally speak; that is what you should follow when designing: the natural back and forth of conversation, and “threading,” the way a conversation and its context are woven together. Gone are the days of touch-tone phone-tree interactions; there is no one path to follow.
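To make both ideas concrete, here is a minimal, platform-agnostic Python sketch (the intent name, phrasings, and horoscope lookup are all made up for illustration): several phrasings resolve to the same intent, and the context of the previous turn is threaded into the next so a follow-up like “what about tomorrow?” still makes sense.

    # Minimal, platform-agnostic sketch: several phrasings resolve to one
    # intent, and context from the previous turn is "threaded" into the next.
    # Intent names, phrases, and the horoscope lookup are all hypothetical.
    PHRASINGS = {
        "what's my horoscope": "GetHoroscope",
        "give me my horoscope": "GetHoroscope",
        "how does my day look": "GetHoroscope",
    }

    def get_horoscope(sign, day):
        # Stand-in for a real content lookup.
        return f"{sign.title()}, {day} looks like a good day to speak up."

    def handle_turn(utterance, context):
        text = utterance.lower().strip("?!. ")
        if text in PHRASINGS:
            context["intent"] = PHRASINGS[text]
            context.setdefault("day", "today")
            return get_horoscope(context["sign"], context["day"]), context
        # Follow-up turn: reuse the threaded intent, only the day changes.
        if text == "what about tomorrow" and context.get("intent") == "GetHoroscope":
            context["day"] = "tomorrow"
            return get_horoscope(context["sign"], "tomorrow"), context
        return "Sorry, I didn't catch that. You can ask for your horoscope.", context

    context = {"sign": "leo"}
    for line in ["What's my horoscope?", "What about tomorrow?"]:
        reply, context = handle_turn(line, context)
        print(reply)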

Remember when I said that I don’t like talking? It turns out many people feel that way. Google, Amazon, and Apple have responded by adding more visual components to their virtual assistants. For example, Google Assistant recently added text input alongside voice on the phone, which is very helpful when I can’t speak over a babbling baby. The same design tips still apply, though, whether the user is literally speaking or conversing via text. On mobile, the virtual assistants also provide an opportunity to enhance the experience with visual elements like images, search suggestions, links, and the app’s content.

Exploration
Most of my fun design exploration was with the Google Assistant; it is by far the most robust of our new robotic-voiced friends. Actions on Google is amazing and enjoyable to use; I wish every tool felt the same way. After being introduced to it at this year’s Google I/O, I delved in to discover more and was able to create a demo in no time. It let me focus on writing interactions and possible responses rather than on format and the more technical aspects, since with every input the system trains the action to recognize variants. Writing for Alexa, on the other hand, feels like diagramming sentences.
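To show what I mean by diagramming sentences, here is a rough, illustrative rendering of an Alexa-style interaction model as a Python structure; the invocation name, intent, slot, and sample utterances are stand-ins rather than our real model. With the Assistant’s tooling you supply a handful of training phrases and the system generalizes to variants on its own, so the equivalent list stays much shorter.

    # Simplified, illustrative rendering of an Alexa-style interaction model.
    # Every phrasing you want recognized has to be enumerated as a sample
    # utterance; names and utterances here are stand-ins, not our real model.
    interaction_model = {
        "invocationName": "astrology zone",
        "intents": [
            {
                "name": "GetHoroscopeIntent",
                "slots": [{"name": "Sign", "type": "ZODIAC_SIGN"}],
                "samples": [
                    "the horoscope for {Sign}",
                    "what the horoscope for {Sign} is",
                    "for the {Sign} horoscope",
                ],
            }
        ],
    }

    # With the Assistant's tooling, a few example phrases are enough and the
    # system trains on variants for you, so the list can stay much shorter.
    training_phrases = ["what's the horoscope for leo", "leo horoscope"]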
Speak Up
Some thoughts after designing Astrology Zone for Alexa and exploring the Google Assistant:
Add personality: the Assistant can weave in small talk, which can keep the user engaged. With Astrology Zone, we added the messaging Susan Miller uses to address her followers into the sign-off.

Don’t be too wordy. Users won’t want to wait around, so be succinct if you can.
Let the user be succinct, too: make sure saying just the invocation and parameters is enough to initiate the action.
Provide natural guidance. An introduction that tells the user what the app can do, along with some suggested interactions, is an effective way to onboard them. Any conversation-based app has a different mode of discovery; a user does not have the benefit of simply tapping around a screen to find features. But maintain the conversational tone: it would be awkward if the person you were talking to suddenly started sounding automated.
Let the user mess up and guide them back. If an app only states that something went wrong, the user can become frustrated; they don’t know if it was user error, interference, or any number of other problems. To relieve this, give an error message (with the reason, if you can), then let the user know some alternatives or how to correct course. The sketch after these tips shows one way a welcoming introduction, a succinct slot-filled request, and this kind of recovery can fit together.
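Here is a minimal sketch using the Alexa Skills Kit SDK for Python (ask-sdk-core); the intent, slot, and wording are hypothetical stand-ins, not the real Astrology Zone skill. It shows a welcoming introduction with suggested interactions, a succinct one-shot request carried by a single slot, and an error handler that explains and offers a way back.

    # Hypothetical sketch of the tips above with the Alexa Skills Kit SDK
    # for Python (ask-sdk-core); names and wording are illustrative only.
    from ask_sdk_core.skill_builder import SkillBuilder
    from ask_sdk_core.dispatch_components import (
        AbstractRequestHandler, AbstractExceptionHandler)
    from ask_sdk_core.utils import is_request_type, is_intent_name

    sb = SkillBuilder()

    class LaunchHandler(AbstractRequestHandler):
        """Natural guidance: a short welcome that suggests what to say next."""
        def can_handle(self, handler_input):
            return is_request_type("LaunchRequest")(handler_input)

        def handle(self, handler_input):
            speech = ("Welcome. You can ask for a daily horoscope, "
                      "for example: what's the horoscope for Leo?")
            return (handler_input.response_builder
                    .speak(speech)
                    .ask("Which sign would you like to hear about?")
                    .response)

    class HoroscopeHandler(AbstractRequestHandler):
        """Succinct path: the invocation plus a single slot is enough to answer."""
        def can_handle(self, handler_input):
            return is_intent_name("GetHoroscopeIntent")(handler_input)

        def handle(self, handler_input):
            slots = handler_input.request_envelope.request.intent.slots
            sign = slots["Sign"].value if slots and slots.get("Sign") else None
            if not sign:
                # Guide the user back instead of a bare "something went wrong".
                return (handler_input.response_builder
                        .speak("I didn't catch a sign. Try asking for Leo or Virgo.")
                        .ask("Which sign would you like?")
                        .response)
            return (handler_input.response_builder
                    .speak(f"Here is today's reading for {sign}.")
                    .response)

    class CatchAllErrorHandler(AbstractExceptionHandler):
        """Give a reason when possible, then offer a way to recover."""
        def can_handle(self, handler_input, exception):
            return True

        def handle(self, handler_input, exception):
            speech = ("Sorry, I had trouble with that request. "
                      "You can ask for a sign's horoscope, or say help.")
            return (handler_input.response_builder
                    .speak(speech).ask(speech).response)

    sb.add_request_handler(LaunchHandler())
    sb.add_request_handler(HoroscopeHandler())
    sb.add_exception_handler(CatchAllErrorHandler())
    handler = sb.lambda_handler()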
Limits

Read the guidelines, but realize they are a work in progress. When working on Astrology Zone for Alexa, after much reading, scouring of articles, and interacting with Alexa, we quickly hit roadblocks in development. Why? I’d written our phrases following what the guidelines said we could do, but they didn’t say what we couldn’t do. This is what led us to discover that phrases had to be ordered a certain way, that Alexa would hear an invocation and stop listening, that there is a limited set of connecting words and verbs that can be used, that possessives are a pain to deal with, and that no matter what we did we could never get Alexa to understand ‘yesterday’s’ correctly.
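As one example of the ordering constraint, one-shot requests roughly have to follow the pattern of wake word, launch phrase, invocation name, connecting word, and then the utterance; the phrases below are purely illustrative, not our final model.

    # Illustrative only: the rough shape Alexa expects for one-shot requests.
    # wake word + launch phrase + invocation name + connecting word + utterance
    works = [
        "Alexa, ask astrology zone for the horoscope for Leo",
        "Alexa, ask astrology zone about Leo",
    ]
    # Reordering the pieces, or leading with the request itself, tends to fail
    # once the invocation name is heard and Alexa stops listening.
    struggles = [
        "Alexa, the horoscope for Leo from astrology zone",
    ]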
Accessibility
The emergence and refinement of conversational interaction has an important accessibility consideration as well. On mobile devices, blind and low-vision users depend on VoiceOver and TalkBack, but these are more a transcription of an app’s visuals (and that’s if the app was even set up properly to support them). Conversational design can be used to enhance these features by “scripting” the user’s experience with the app. When working on a proof of concept for indoor wayfinding for blind and low-vision users, I had to think through and write out the experience; there, the user’s dialogue with the app took the form of gestures. Imagine if this were pushed further with the help of an assistant, without having to rely on a merely ‘transcribed’ experience. And not all disabilities are permanent; temporary ones include situations like driving, where designing for voice interaction is essential.
Now, with increasing natural language support and easier methods of entry, conversational and voice design are becoming an essential part of UX. I look forward to building even more in the future. In the meantime, you can check out our first Alexa app: Astrology Zone.