New User Experience

Creating a more robust internal understanding of our customers to help them find value quickly.

Role — Lead Product Designer

Timeline — 3 months

Platform — Web App

Team — Product Designer (me), Product Manager, User Researcher, Tech Lead

Overview

A redesigned onboarding survey for new users signing up to build with Mapbox.

Key Outcomes

Moved from a single-question onboarding survey to a robust two-page, six-question survey while maintaining the previous completion rate.

Six service owners, plus sales and marketing, used this data for roadmap planning and to improve personalization for new users.

Background

This project was categorized as a “Discovery/Learning” project: our main objective was to build a deeper understanding of our new users at Mapbox in order to help team leads make data-informed strategic decisions.

We shipped the following two-page in-app survey, which was presented immediately after a user completed the sign-up flow.

Key Design Decisions

  • Collect response data after the first page of the survey: these are the only mandatory questions, and they give us enough information to personalize the landing page (the account page). A rough sketch of this behavior follows this list.

  • The answers on the second page (e.g. map customization and individual preferences) can help us add another layer of customization to the account page, but these questions are not mandatory.

  • Keep the survey design simple so it doesn’t look too “branded” or “flashy” and doesn’t read as an ad.
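For illustration only, here is a minimal sketch of how this two-page submission model could work. The endpoints, types, and field names below are invented for this sketch and are not Mapbox’s actual implementation; the point is simply that page-1 answers are sent as soon as that page is completed, while page-2 answers are an optional follow-up.

```ts
// Hypothetical answer shapes for the two survey pages; field names are illustrative only.
type PageOneAnswers = {
  useCase: string;        // revised use-case options in customer language
  useCaseOther?: string;  // free-text "Other" input for cases we missed
};

type PageTwoAnswers = {
  mapCustomization?: string;  // optional map-customization preference
  preferences?: string[];     // other individual preferences
};

// Assumed helper that posts answers as JSON to a hypothetical endpoint.
async function submitAnswers(path: string, answers: unknown): Promise<void> {
  await fetch(path, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(answers),
  });
}

// Page 1 answers are mandatory and submitted as soon as the page is completed,
// so the data is captured even if the user never finishes page 2.
export async function onPageOneComplete(answers: PageOneAnswers): Promise<void> {
  await submitAnswers("/api/onboarding-survey/page-1", answers);
}

// Page 2 answers are optional; skipping the page simply sends nothing further.
export async function onPageTwoComplete(answers: PageTwoAnswers | null): Promise<void> {
  if (answers) {
    await submitAnswers("/api/onboarding-survey/page-2", answers);
  }
}
```

Splitting the submission this way is what lets the mandatory answers drive personalization even for users who drop off before the second page.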

Iterations of the survey visual design

Impact

Our main goal was to collect richer, more nuanced data about our new users in order to increase personalization and help them find value quickly.

After shipping we learned that:

  • 63% of all new users fill out the survey. This is exactly the same completion rate as our previous one-question survey.

  • Only 17% of users drop off between pages 1 and 2, indicating that users who start the survey are highly likely to complete the second page.

Process

The original survey consisted of a single question presented to users after they completed the sign-up flow. It appeared as a modal on the account page (the first page a user is brought to).

The original 1-question onboarding survey

We knew that in the original survey 40% of users selected the “Something else” option. We wanted to:

  • Revise our use-case options to better align with customer language

  • Provide an input text box to capture any “Other” options that we may have missed

User Research

To kick off the project, I led a session with the whole cross-functional project team to brainstorm what we’d like to learn. This included looking through some competitive research on “new-user experiences and sign-up flows” that we’d previously done as a team.

The board I set up to run a team brainstorm session to give context & kick off the project

I collaborated closely with the Product Manager and User Researcher on my team to create a list of possible questions.

We then presented these questions for feedback to representatives from all of Mapbox’s service offerings, including Maps, Automotive, Logistics, and Platform.

A framework I created for the team to use while discussing the questions we wanted to ask: it wasn’t just about the questions but the value we could drive through the data insights

I then worked with the researcher on my team to create a Typeform survey that we sent to our pay-as-you-go user base. This served as a “survey prototype” and helped us understand whether the answers we received were valuable. It also helped us refine the wording of our questions to better capture the data we were aiming for.

Concurrently, I planned and led 5 interviews to understand the pain points of churned users. These turned out to be very challenging to recruit for, as churned users tended to have little investment in giving feedback (which makes sense!).

I used pivot charts in Google Sheets to analyze the Typeform survey responses, then brought the charts into FigJam so we could review them as a team, call out surprising data, and spot trends.

Synthesizing Learnings

Based on this research, I created new-user profiles that proved valuable in project-team discussions as we sifted through our learnings and refined questions to better understand these distinct users’ needs.

Onboarding user profiles I created based on our interview sessions and Typeform survey responses. Ultimately, these helped us arrive at an even simpler framework for talking about our new users.

Based on each team’s key learning goals and our research insights, I cut our long list of questions down to a curated set of six across a two-page survey. I pushed for the two-page design so that we could ask the important questions first, collect those answers, and then move on to non-mandatory questions on the second page.

Design Spec

I worked with the frontend engineer on my team to select the most straightforward survey behavior, as we were on a time crunch for implementation. At this point we scrapped an earlier idea of progressively disclosing the survey questions.

The WIP design spec: here I outlined the detailed interaction design of each question type, iterated on copy, and considered edge cases.

Brainstorming the Personalized New-User Experience

Alongside the design work on the onboarding survey itself, we brainstormed how to improve the new-user experience beyond the survey. I mocked up several ideas for the account page aimed at improving activation for different key services.

An idea for increasing activation for users who want to create a map style but have low “customization” needs: give them a “Quick start” option to hit the ground running.

These ideas helped build excitement around the possibilities unlocked by the new data insights.

Because the project was being handed off before any of these ideas could be implemented, I made sure to leave thorough documentation: three guiding design principles that synthesized our research learnings and offered guidance on what to keep in mind when improving the account page.

One of three design principles I developed as a deliverable of this work. Each principle was informed by data insights and paired with low-fidelity brainstorms of how to put it into action on the account page.

In conclusion

Although the design surface was small, the impact of the project was huge. We helped 6+ orgs across the company make more user-centric decisions and start asking data-driven questions about their funnels and activation rates. We also made a bet that we could collect much more information without increasing drop-off, and an A/B experiment against the old survey proved this true. This was a big win because it meant the onboarding survey could be a place for teams to come together, iterate, and get better answers to questions about their new users.