Winner was:
Subscribe to a tutor
The winner struck a good balance of strategic and user value, fit Preply’s business model, and was (relatively) easy to test. We could reuse many existing concepts that we know work for Preply today, while pushing students onto a 'journey to subscription'.
In this test, we replaced packages (the current business model) with a monthly subscription at the tutor’s rate, letting users subscribe to their particular tutor.
Started by
Defining success metrics, based on our company’s North Star metrics.
Conversions to paid:
How many users who view the offer convert to a subscription? How large is the drop vs. packages?
Hours bought:
Do we see an increase in hours bought?
GMV
Churn Rate
Additional health metric:
Confirmed hours
Are students actually learning more? Measured via lesson utilisation and breakage.
& the hypothesis, based on the research and preparation we did.
By switching the concept to subscriptions, we will retain students better and lift GMV and hours bought, without drastically decreasing conversion to paid.
We expected conversion to dip while GMV increased (the logical outcome of making our product a higher commitment, with a larger upfront payment).
Firstly,
Analysed the production flow.
To proceed with design, I first had to review our current flow to find, discuss and analyse the key places where it would change. Defining this at this stage was important: without it, we would get stuck with legacy problems, bottlenecks and miscommunication with other teams. I also identified opportunities and collected feedback from stakeholders, with the aim of fully defining the MVP flow.
At this step I collaborated with product leads and developers to work out how to translate the subscription business goal into product experiences, turning a complex task into a simple, user-centric experience.
In parallel,
defined the decisions & trade-offs for the A/B test with developers and data scientists.
- Whom to include in and exclude from the experiment. With the team, we calculated the required sample size (estimated participants, minimum detectable effect and statistical significance).
- Development estimates.
- Payment options.
- Constraints from parallel-running experiments.
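The sample-size calculation above can be sketched with the standard two-proportion power formula. The numbers below are hypothetical illustrations, not Preply’s actual conversion rates:

```python
import math
from statistics import NormalDist

def sample_size_per_group(p_base: float, mde: float,
                          alpha: float = 0.05, power: float = 0.8) -> int:
    """Participants needed per variant to detect an absolute change of
    `mde` in a baseline conversion rate `p_base` (two-sided z-test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_power = z.inv_cdf(power)          # ~0.84 for 80% power
    p_new = p_base + mde
    variance = p_base * (1 - p_base) + p_new * (1 - p_new)
    return math.ceil((z_alpha + z_power) ** 2 * variance / mde ** 2)

# Hypothetical example: 8% baseline conversion to paid, and we want
# to detect an absolute shift of 1 percentage point.
print(sample_size_per_group(0.08, 0.01))
```

A smaller minimum detectable effect means a much larger required sample, which is why widening the experiment’s entrypoints (covered later) mattered for reaching significance faster.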
and to
clearly define the scope of design work and plan our sprints,
we worked in close collaboration with the product manager to create a high-level experiment description. Based on the discussions from the previous step, we started answering high-level questions like:
- What will change in the student experience?
- What will change in the tutor experience?
- What emails do we need to adapt or create?
- How can users manage their subscriptions?
etc.
Began
the design process with entrypoints.
The first challenge was defining the key entrypoints where the A/B test would start. Based on the earlier analysis and a review of current product screens, I came up with design suggestions. We hit a problem: we could not change the copy of the key action from "Buy Hours" to "Subscribe", because the experiment is triggered by clicking that button, so the control group would also see the word "subscribe". I solved this in close collaboration with a copywriter by finding unified wording for both subscribers and package buyers: we changed the label to "Add hours", which is flexible enough for both groups.
Collaborated with designers from other teams to identify additional entrypoints we could add to the experiment, so more users would see it and we could reach statistically significant results faster. Came up with solutions in the core product, messages, emails and pop-ups that students see in the product.
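The trigger-on-click constraint is easier to see in a sketch: bucketing happens only at the moment a user clicks the shared entrypoint, so everything shown before that click must be identical for both groups. All names here are hypothetical, not Preply’s actual experimentation API:

```python
import random

class ClickTriggeredExperiment:
    """Minimal sketch: users enter the experiment (and the sample)
    only when they click the shared entrypoint."""

    def __init__(self, name: str, variants: list[str]):
        self.name = name
        self.variants = variants
        self._assignments: dict[str, str] = {}

    def assign_on_trigger(self, user_id: str) -> str:
        # Lazy, sticky assignment at click time; users who never
        # click are excluded from the experiment entirely.
        if user_id not in self._assignments:
            self._assignments[user_id] = random.choice(self.variants)
        return self._assignments[user_id]

experiment = ClickTriggeredExperiment(
    "subscribe_to_tutor", ["packages", "subscription"])

def on_add_hours_click(user_id: str) -> str:
    # The button label ("Add hours") is neutral: before this click,
    # neither group has been assigned, so neither may see "Subscribe".
    variant = experiment.assign_on_trigger(user_id)
    return "pricing_plans" if variant == "subscription" else "buy_packages"
```

This is why the pre-click copy had to work for both variants: any "Subscribe" wording shown before assignment would leak the treatment to the control group.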
Iterated on the
next step – pricing plans.
After the entrypoints, I worked on the pricing plans screen: the core step, which needs to be as clear as possible to motivate users to go forward. For the MVP I came up with a simple solution, collaborating with copy, management and research, with the aim of iterating on this design with copy & visual changes based on the qualitative and quantitative data we would soon receive.
Closely collaborated
on payment flow.
With the aim of creating a clear experience for our subscribers and minimising development effort in the MVP, I closely collaborated with the payments team to discuss the changes we needed in the payment flow, keeping in mind legacy code and previous qualitative / quantitative payment data and research.
Conducted
user interviews for the flow (2-4).
We finished the top-of-the-funnel flow. To be sure it was easy and clear for users, we decided to run user research with the newly designed screens, evaluating what users understood and expected. I conducted 10 interviews to gather enough feedback to reveal clear patterns, participating with the research team as both interviewer and observer.
Key insights:
- 6/10 didn’t understand it was a subscription, not packages.
- 7/10 of the students were not sure their account would refill in a month.
- 8/10 easily found the entrypoints to subscribe.
- 4/10 didn’t understand that they needed to schedule weekly lessons.
Based on these insights, I iterated on the top-of-the-funnel flow to make it much clearer that this is a subscription, how it works and what users get. These changes positively affected the final result.
Participated in
design critique session.
To better understand the key problems we might still face with the flow after the user interviews, I ran a design critique session. I find it very helpful to collaborate with designers from different teams and see new perspectives on current tasks and problems. To get better, higher-quality feedback, we created a unified structure for our design critiques.
Structure:
- I like
- I wish
- What if?
- Takeaways
This type of feedback helped me see additional gaps in the flow, iterate on it, and generate several ideas for future iterations.
Worked on
scheduling the lesson.
At this step, I worked closely with user researchers, product managers and engineers from the core product team to build a simple, user-centred experience for booking regular classes. Their knowledge helped me avoid guessing and base design decisions on the data and insights we already had.
Designed the new regular lessons flow with a focus on edge cases, minimising changes to the current production version so developers could implement it quickly and without added complexity.
and
settings + secondary screens.
We faced the problem that some edge cases were not defined: how users can reschedule lessons, how cancelling a subscription will work, how we can win those users back, etc. I focused on covering these for the MVP with secondary screens, pop-ups and a settings page where users can cancel their subscription.
Finalised the flow and launched the test.
For me, finalising the flow consisted of several steps:
1. Localisation discussion.
We localise all experiments into 7 languages. With the localisation team, we discussed and fixed potential bugs so the design would be adaptive and ready for language changes.
2. Design review of development results.
Reviewed the final implementation with the development team to fix bugs and cover new states that had not been accounted for in the design.
3. Approval discussion with the management team.
I communicated the final MVP results, and what we planned to do next, to C-level stakeholders.