NOTE: Client findings are confidential, making it difficult to show our process. The following case study is of a real usability study, based on a fictionalized business scenario.

Prior usability testing suggested that Airbnb hosts are frustrated when updating the nightly prices on their properties, feedback that concerned the product manager. However, the development team pushed back: prior usability tests, they argued, never observed experienced participants using the product and focused too narrowly on novice users. We agreed.

Too often usability tests evaluate ease of learning, not ease of use.

Too often, usability tests evaluate “ease of learning” (the ability to initially grasp the key features of a product) without truly examining “ease of use” (the ability to use those features once familiar with the product). For this reason, it is important to run usability tests with both types of users. In the present study, we evaluated whether experienced Airbnb hosts (those who have been hosting for 1+ months) could successfully complete a series of tasks related to reservations.


Our goal: determine whether usability issues with Airbnb’s Host website can be overcome through experience or whether they must be corrected in the design.

Our process:

Get Specific.

After identifying the key research question, we established specific usability tasks for participants to complete.

“You realize that you’ve been successfully booking every weekend and want to try to raise your prices to see if you can increase your revenue. You decide to raise your rates for one weekend to see how it goes. Change the price of September 22nd and 23rd to $200.”

Well-defined usability tasks provide context that motivates the participant to behave like a real user. Importantly, they give the participant exact directions (i.e., change the price of two specific days to a specific price) without implying how to complete the task (e.g., click “Hosting”).

Define task success.

  • Once the tasks were defined, we created outcome measures and defined success criteria.

    • For the key task, participants should complete it within 1.5 minutes to be considered a "success."

    • For the key task, at least 75% of new participants should be able to successfully complete the task.
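The success criteria above can be scored mechanically. The sketch below shows one way to do so; the completion times are illustrative placeholders, not data from the study.

```python
# Scoring the key-task success criteria (hypothetical data for illustration).

TIME_LIMIT_S = 90         # key task must be completed within 1.5 minutes
SUCCESS_THRESHOLD = 0.75  # at least 75% of new participants must succeed

# Completion time in seconds per participant; None = never completed the task.
completion_times_s = [72, 85, None, 60, 95, 44, 88, 70]

# A participant "succeeds" only if they finished within the time limit.
successes = [t is not None and t <= TIME_LIMIT_S for t in completion_times_s]
success_rate = sum(successes) / len(successes)

meets_criteria = success_rate >= SUCCESS_THRESHOLD
print(f"Success rate: {success_rate:.0%} -> criteria met: {meets_criteria}")
```

Defining the scoring rule before data collection (as the criteria were defined a priori here) keeps the pass/fail judgment objective.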

The right people for the right research.

We recruited Airbnb Hosts to participate in a moderated remote usability study, as well as people who had not hosted on Airbnb.

All participants completed a series of tasks, including a key task known to have usability issues among new users: to change the price of a given date. Because it is reasonable to assume that experienced users would always outperform novice users (regardless of usability issues), additional control tasks were also included. It was predicted that all users would be able to successfully complete the control tasks within the allotted time (although experienced participants would be faster). However, it was predicted that novices would not be able to complete the key task of interest.

Did we replicate past usability issues?

“That's super confusing. It should be under ‘pricing.’”

Novice users had trouble changing the rates for an upcoming weekend, consistently looking in the wrong place. This confirms the product manager’s original concern that the task is not intuitive.

Can this be overcome?

We would expect novices to be slower than experienced users but to succeed nonetheless; if they fail a task outright, there’s a problem. Control tasks evaluated whether both novices and experienced hosts were able to succeed at most tasks.

Novices were successfully able to complete other tasks, confirming that their difficulty with the key task was due to usability issues, not the fact that they were novices.

Do experienced Airbnb hosts continue to struggle with the task?

The key question of interest was whether this usability issue was cause for concern. Novices struggled with the task more than experienced hosts. However, we would expect that inexperienced users should always be less efficient than experienced ones. How do we know whether this specific task is a usability issue?

We recruited a separate sample of Airbnb hosts with at least one month of hosting experience. If this were a true usability problem, we would expect experienced hosts to have more difficulty with the key task than with the control tasks. Instead, testing showed that experienced hosts completed the key task with ease.


Drawing conclusions

The current study replicated the past concern that novice users have difficulty changing the price of a specific hosting date. This was indicated by novices not meeting the success criteria for this key task, but meeting success criteria for control tasks. Importantly, experienced users were able to easily complete the key task. Although novices struggled with the task, experienced hosts were able to overcome initial usability issues. This task may not be easy to learn, but it is easy (enough) to use.

PhD Insights recommends fixing this interface issue: although hosts overcome it with experience, the fix would be simple. First, we know that participants consistently looked in the same (albeit incorrect) place: pricing settings. This tells us something about the intuitions of new users and gives us a great place to start testing an improved design; moving this functionality to pricing settings may solve the problem. Second, the change to the pricing settings page could be extremely minimal. We created a possible solution to the current usability issue and tested it against the original design.


Pricing Settings: Original Design

The original design made it confusing for novice users to adjust individual prices. When they chose "pricing settings" they arrived on this page. Key usability problems are highlighted in red.


Pricing Settings: UX Update

The updated design improves the user experience by clarifying existing functionality: "Nightly Pricing" becomes "Pricing Standards." A new section "Price and availability calendar" is really just a link to the "View calendar" page, now directly embedded in the pricing settings page.


Results: The new UX improved usability by 16%

Fifty participants were shown one of the two designs and given the prompt: "Imagine you are an Airbnb Host. You want to raise the daily rate of September 22nd to $200. Where would you click?" Areas of success and failure were defined a priori. Participants given the original UX clicked a correct button 23.8% of the time, while those given the updated UX did so 27.6% of the time, a 16% relative improvement in first-click usability. Although deeper changes to the information architecture would likely be more effective, they would also be more expensive and time-consuming. Thus, the present UX revision is appropriate given the minimal work required to improve ease of learning for novice hosts.
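The 16% figure is a relative improvement, computed from the two observed first-click success rates. A quick recomputation:

```python
# Recomputing the reported relative improvement from the two success rates.

original_rate = 0.238  # correct first clicks, original design
updated_rate = 0.276   # correct first clicks, updated design

# Relative improvement = (new - old) / old, not the 3.8-point absolute gain.
relative_improvement = (updated_rate - original_rate) / original_rate
print(f"Relative improvement: {relative_improvement:.0%}")
```

Reporting the relative figure (16%) rather than the absolute gain (3.8 percentage points) is a choice worth making explicit to stakeholders, since the two can sound very different.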


PhD Insights no longer offers usability testing.

Learn more about how user research surveys can help.