Improving the Watson Assistant Onboarding Process

Bringing the power of Watson Assistant to a new type of user

I worked on improving the onboarding experience of Watson Assistant to educate and empower business users to build their own chatbots quickly and efficiently. I developed several prototype iterations, each informed by an in-depth analysis of user testing on the previous implementation.

THE PROBLEM
Onboarding to Watson Assistant is a complicated process, and our primary users are business experts – not developers.
OUR GOALS
7 out of 10 users will be able to create an assistant in under 30 minutes
40% conversion rate (user gets a successful response from their assistant)
Get our users to the “wow moment” in as close to 14 steps as possible
THE “WOW MOMENT”
A user tests their assistant in the Try it Out panel and receives a correct response.

Who?

I worked with a product manager and a developer to coordinate design efforts around business goals and implement each version.

When?

Summer 2019

Where?

IBM, Watson Assistant (Austin, TX)

How?

Wireframes, Prototypes, User Testing, Competitive Audits
Sketch, InVision, Appcues

Process
I spent the first few weeks of my internship learning from Watson experts and experimenting with building a cognitive assistant to fully immerse myself in the product. In a way, this was a perfect project for me to tackle – as an outsider, someone completely new to Watson Assistant, I could empathize with our primary user.

My work is organized into three versions that build on one another. Each version was tested with real users, and each iteration builds on the results of the last. I inherited the beginnings of V1 from a coworker and completed it, designed V2 entirely myself, and began work on V3 before handing it off to a coworker at the end of my internship.

Version 1
The first iteration was a collaboration between myself and another designer. We knew it was nowhere near our goal of reaching the “wow moment” in 14 steps, but we wanted to see how well our language and flow guided the user.

After user testing, we found that only 10% of users continued with the tour after the welcome modal, and we reached only a 12% conversion rate, falling well short of our 40% goal. We also found that starting users directly on the Assistant page led to a higher percentage of users testing their assistant.

Version 2
I designed version two based on our findings from version one's user testing. The most notable change was eliminating two concepts from the tour: they weren't needed to get the user to the “wow moment” of testing their assistant and only added complexity. This change, combined with refined copy, let me shorten the tour from 65 steps to 21.

I also wanted to allow users to enter their own content in this iteration. Previously, we required users to follow a specific example for a sample use case. I hypothesized that letting users enter custom content would make them more engaged in the process.

I implemented a few additional design elements: a compressed welcome modal, refined tooltips, and a new checklist.

This iteration was still in user testing when my internship ended, so final results were in progress. Early results were promising, showing a higher completion rate and lower abandonment rate than version one.

Version 3
While version two was in user testing, I started work on version three to test a few more of our hypotheses.

The final design of this version would depend largely on the performance of version two, so I focused on getting this iteration to a solid handoff point for my coworker before my internship concluded.

Want to learn more? Let’s chat!

Feel free to reach out to me on LinkedIn.