Case Study · Peerspace

Improving Checkout Conversion on Mobile Web by 7%

I spearheaded a collaborative initiative that improved checkout conversion on mobile web by 7% in just two weeks.

Company Peerspace
Team Monetization
My Role Product Designer
Outcome +7% checkout conversion
Peerspace mobile checkout — Review & Pay screen

What is Peerspace?

Peerspace is the #1 marketplace for renting event, meeting, and studio space — with $500+ million in bookings. Think Airbnb, but for booking a meeting room for a couple of hours, a rooftop for your sister's baby shower, or studio space for a photoshoot.


A team in paralysis — and a lot broken in checkout

After a reorg at Peerspace, I found myself on the newly formed Monetization team. Revenue and profitability were leadership's primary focus, so our roadmap was primarily top-down and strategy-driven — with a lot of eyeballs on our progress.

At the time, the team had a broad, shared understanding that there was a lot broken in the current checkout flow, especially on mobile web, which had been neglected for quite some time. Because there seemed to be so much to fix, it was overwhelming to get started. The team was in a state of paralysis, and these issues continued to go unfixed.

So as we started on bigger roadmap projects like adding a Split Payment option, I organized a series of workshops to generate some bottom-up initiatives starting at the user level — with a goal of finding obvious, easy wins we could add to the roadmap. This would allow us to report incremental progress towards profitability while we worked on our big features.


Running a User Journey workshop to align the team

Prepping the Workshop

Peerspace didn't have any UX research or usability testing infrastructure we could use to understand how the current experience was affecting users' ability to check out. So while I worked on spinning up those capabilities, I decided to leverage internal testing to gather insights and align the team around our new product surface area. Two birds, one stone.

I chose a User Journey workshop as a way to align the team around the current experience for an average user, and as a divergent step to start generating and documenting all the ways we could improve it. I invited all the engineers, product managers, and data analysts on our team — about 8 people in total.

To prep, I created a User Journey Map canvas in Figma that the team could work through together. On the x-axis, I provided a screenshot of each step of the current mobile web checkout flow.

User Journey Map canvas — x-axis showing checkout flow steps

On the y-axis, I had the following rows: User Actions, Needs, Pains/Frictions, Analytics or Data Questions, Opportunities, and Other Questions.

User Journey Map — y-axis categories

I also outlined clear outcomes I hoped the team would accomplish by the end of our hour-long workshop, sharing these at the start to keep us aligned and accountable.

Workshop outcomes shared with team

Running the Workshop

With 8 people and only an hour, I divided attendees into pairs and assigned each pair a row/category to take notes on as they went through the checkout flow on their own device. Each pair had 20 minutes to go through the experience and fill out their assigned row. We then used the rest of the time to share, add, piggyback, and refine our post-its as a team.

Below is the completed User Journey Canvas at the conclusion of the workshop.

Completed User Journey Canvas from workshop

Prioritization Discussion & Team Alignment

In the last 10 minutes, we shifted to the age-old question — what next? Specifically: how do we get these UX improvements into the roadmap? How do we justify that they should be prioritized?

Prioritization discussion notes

We discussed determining ways to measure the impact of UX improvements — if we could get some baseline readings, we could make the case these baselines needed improving. But in practice, this would be a bit like creating red tape for ourselves. We decided as a team to start by putting the no-brainer, "low hanging fruit" ideas in the Jira backlog for the engineering team to pick up when they had time.


A hidden checkbox blocking checkout completion

Agreeing to host rules is required to complete checkout — however, the current UI caused this section to be missed by many users. Compounding the issue, when a user clicked "Request to Book," they saw an error, but the page didn't scroll to where the error occurred, so they didn't know what they needed to fix to move forward with their booking.

Before: host rules checkbox buried in checkout, error with no scroll guidance
Before: the host rules checkbox was easy to miss, and error messaging left users stranded

Two small changes, one clear win

Sometimes the biggest growth wins come from small, iterative improvements. Our final design accomplished two things:

  • Gave the host rules acceptance checkbox more visual prominence so it would be missed less often
  • Guided the user to the section if they still missed it

To increase visual prominence, we put the checkbox in a bounded box with a light grey background, making it stand out from the other, non-interactive parts of checkout. To guide users to the error if they still missed this UI, we gave the box a red border and scrolled the user to the first error after they tapped Request to Book.
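The scroll-to-first-error behavior is a small amount of code. Below is a minimal, hypothetical sketch — the type, function names, and element IDs are illustrative, not Peerspace's actual implementation:

```typescript
// Hypothetical sketch of the scroll-to-first-error behavior; section
// names and IDs are illustrative, not Peerspace's actual code.

type Section = { id: string; valid: boolean };

// Pure helper: index of the first section that failed validation, or -1.
function firstInvalidIndex(sections: Section[]): number {
  return sections.findIndex((s) => !s.valid);
}

// On "Request to Book": if everything validates, proceed; otherwise add
// the red error border to the first failing section and scroll to it.
function handleRequestToBook(sections: Section[]): boolean {
  const i = firstInvalidIndex(sections);
  if (i === -1) return true; // all good, continue to payment

  // Guarded DOM access so the sketch also type-checks outside a browser.
  const doc = (globalThis as any).document;
  const el = doc?.getElementById(sections[i].id);
  el?.classList.add("error-border"); // the red border treatment
  el?.scrollIntoView({ behavior: "smooth", block: "center" });
  return false;
}
```

The key detail is pairing the inline error styling with the scroll, so the user both lands on the problem and can see why they landed there.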

Before and after design comparison
Before → After: adding visual weight and error-scroll behavior to the host rules checkbox
Design detail — default state
Design detail — error state with red border
Final checkout screen with updated host rules UI

+7% conversion — the first time mobile web checkout cleared 60% in over a year

+7% increase in checkout conversion on mobile web
54% → 61% payment page conversion, from year average to post-release
1 week time to see measurable impact after release

Since this was a relatively simple, front-end-only change, we added it to the Jira backlog with low priority. The ticket was picked up fairly quickly, and just a week after release we saw a seven-point lift in payment-page conversion on mobile web: from an average of 54% over the previous year to a year-high of 61%. It was the first time mobile web checkout had been over 60% in more than a year.

Our PM, Stu, called out the win over Slack.

Slack message from PM celebrating the conversion win
PM shoutout in Slack — the win didn't go unnoticed

We can see this lift more prominently when we zoom in on the conversion data.

Conversion rate graph showing lift after release
Conversion rate data zoomed in — clear lift visible after release
Caveat: Since we didn't run a proper A/B test, we're relying on a simple before-and-after comparison of the metric. These effects may have been influenced by other factors, primarily the seasonal lift in bookings around the holidays. We'd need more time to observe whether the average conversion rate increased over a longer window. Regardless, this was a meaningful win.
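Short of a real A/B test, one cheap sanity check on a before/after lift like this is a two-proportion z-test. The booking counts below are purely illustrative (real volumes aren't in this write-up); the point is the shape of the check, and even a large z score can't rule out confounders like seasonality:

```typescript
// Two-proportion z-test sketch with made-up sample sizes; the real
// Peerspace booking volumes are not public.

function twoProportionZ(x1: number, n1: number, x2: number, n2: number): number {
  const p1 = x1 / n1; // baseline conversion
  const p2 = x2 / n2; // post-release conversion
  const pooled = (x1 + x2) / (n1 + n2);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2));
  return (p2 - p1) / se;
}

// Illustrative: 540 of 1,000 checkouts converting before vs 610 of 1,000 after.
const z = twoProportionZ(540, 1000, 610, 1000);
// |z| > 1.96 would make pure noise an unlikely explanation at the 5% level,
// but it says nothing about confounders like holiday seasonality.
```

This kind of quick check is no substitute for a holdout, but it's a reasonable first filter before celebrating a lift.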

Conversion metrics are hard to move — and we saw a measurable improvement in a very short timeframe with a very low-effort, simple change. More importantly, this project unblocked and empowered the team to fix major UX and UI issues alongside higher-priority, higher-effort projects.


Continuing to carve out small UX wins

With this quick win under our belt, we continued to identify and backlog small UX wins. Next up was improving the communication of cancellation policies — specifically, surfacing the reassuring "free cancel within 24 hours" policy near the confirm button.

Next initiative: cancellation policy placement — before
Next initiative: cancellation policy placement — after

I also wanted to use the opportunity this newly formed team presented to push us to be more data-driven and connect our work back to KPIs. Rather than get bogged down defining and aligning on metrics right away, though, it made sense to save that formal definition for later and focus on landing a small win first.

I made a note of this on our workshop Figma board and continued the conversation past this project.

Figma board note on being more data-driven as a team