NOTE: Client findings are confidential, making it difficult to show our process. The following case study is of a real usability study, based on a fictionalized business scenario.
Prior usability testing suggested that Airbnb hosts are frustrated when updating the nightly prices on their properties, feedback that concerned the product manager. However, the development team pushed back: prior usability tests, they argued, never observed experienced participants using the product and instead focused too narrowly on novice users. We agreed.
Too often usability tests evaluate ease of learning, not ease of use.
Too often usability tests evaluate “ease of learning” (how readily a first-time user can grasp a product’s key features) without truly examining “ease of use” (how efficiently a user can work with those features once familiar with the product). For this reason, it’s important to run usability testing with both types of users. In the present study, we evaluated whether experienced Airbnb hosts (those who have been hosting for 2+ months) could successfully complete a series of tasks related to reservations.
Determine whether usability issues with Airbnb’s Host website can be overcome by users, or whether they need to be corrected through a redesign.
After identifying the key research question, we established specific usability tasks for participants to complete.
“You realize that you’ve been successfully booking every weekend and want to try to raise your prices to see if you can increase your revenue. You decide to raise your rates for one weekend to see how it goes. Change the price of September 22nd and 23rd to $200.”
Well-defined usability tasks provide context that motivates the participant as if they were a real user. Importantly, they give the participant exact directions (i.e., change the price of two specific days to a specific price) without telling them how to complete the task (e.g., click “Hosting”).
Define task success.
Once the tasks were defined, we created outcome measures and defined success criteria.
For task 1, participants should complete it within 1.5 minutes to be considered a "success."
For task 1, at least 75% of novice participants should be able to complete the task successfully.
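As a rough illustration of how these two criteria combine (not part of the study itself), they can be checked against per-participant results. All names and numbers below are hypothetical:

```python
# Hypothetical task-1 results; all figures are illustrative, not study data.
# Each tuple: (completed_task, seconds_to_complete or None if failed)
task1_results = [
    (True, 62), (True, 85), (False, None), (True, 74),
    (True, 90), (True, 55), (False, None), (True, 80),
]

TIME_LIMIT_S = 90        # success criterion 1: complete within 1.5 minutes
MIN_SUCCESS_RATE = 0.75  # success criterion 2: at least 75% succeed

# A participant counts as a success only if they finished within the limit.
successes = [
    seconds for done, seconds in task1_results
    if done and seconds is not None and seconds <= TIME_LIMIT_S
]
success_rate = len(successes) / len(task1_results)

print(f"success rate: {success_rate:.0%}")  # 6 of 8 participants
print("criterion met:", success_rate >= MIN_SUCCESS_RATE)
```

Note that the per-participant time limit feeds into the group-level threshold: a participant who completes the task but too slowly still counts against the 75% bar.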
The right people for the right research.
We recruited both experienced Airbnb hosts and people who had never hosted on Airbnb to participate in a moderated remote usability study.
All participants completed a series of tasks, including one task known to have usability issues among new users. Because it is reasonable to assume that experienced users would always outperform novice users (regardless of usability issues), additional control tasks were also included. It was predicted that all users would be able to successfully complete the control tasks. However, it was predicted that novices would not be able to complete the key task of interest.
Did we replicate past usability issues?
“That's super confusing. It should be under ‘pricing.’”
Hosts had trouble changing the rates of an upcoming weekend, with testers consistently looking in the wrong place. This confirms the product manager’s original concern that this task is not intuitive.
Can this be overcome?
We would expect novices to be slower than experienced users, but we would expect them to succeed nonetheless; if they fail a task outright, there’s a problem. Control tasks evaluated whether both novices and experienced hosts were able to succeed at most tasks.
Novices were successfully able to complete other tasks, confirming that their difficulty with task 1 was due to usability issues, not the fact that they were novices.
Do experienced Airbnb hosts continue to struggle with the task?
The key question of interest was whether this usability issue was cause for concern. Novices struggled with the task more than experienced hosts did. However, inexperienced users should always be somewhat less efficient than experienced ones, so how do we know whether this specific task reflects a true usability issue?
We recruited a separate sample of Airbnb hosts with at least two months of hosting experience. If this were a true usability problem, we would expect experienced hosts, on average, to have more difficulty with this task than with the control tasks. However, usability testing showed that experienced hosts were able to learn to complete the task with ease.
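The comparison logic above can be sketched in a few lines. The success rates here are hypothetical placeholders, not the study’s actual data; the point is the shape of the comparison, not the numbers:

```python
# Hypothetical success rates; all figures are illustrative, not study data.
rates = {
    "novice":      {"key_task": 0.40, "control_tasks": 0.90},
    "experienced": {"key_task": 0.85, "control_tasks": 0.95},
}

def gap(group):
    """Difference between control-task and key-task success rates.

    A large gap means the group struggled with the key task specifically;
    a small gap means the key task was about as doable as the controls.
    """
    r = rates[group]
    return r["control_tasks"] - r["key_task"]

# A true usability problem would show a large gap even for experienced hosts.
print("novice gap:", round(gap("novice"), 2))
print("experienced gap:", round(gap("experienced"), 2))
```

Comparing each group against its own control-task baseline, rather than comparing novices to experts directly, is what separates “hard to learn” from “hard to use.”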
We just saved a boatload of time and money.
What did we learn? A redesign isn’t necessary.
Novices struggled with the task, but experienced hosts were able to overcome initial usability issues. This task may not be easy to learn, but it is easy (enough) to use.
Rather than diverting resources to redesign the site, test that redesign, and deploy that code, the product manager would now be free to focus on more pressing priorities.