The Question: How might we help students who have had a bad experience with a course, at the moment they need help most?

Since September 2017, I have been working with Udemy's product design team to create a new step in the review process for rating a course. If a student rates a course poorly (i.e., gives it 0.5-3 stars), we want the flow to step in and help the student when and where they need it most. As it stands, the review system collects data, which helps us, but does little for the student. Udemy offers a 30-day money-back guarantee, so we hypothesized that surfacing it earlier in the flow might help with conversion.
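To make the branching concrete, here is a minimal sketch of the trigger logic described above. The 0.5-3 star range and the 30-day guarantee come from this case study; the function and variable names are my own illustration, not Udemy's actual implementation.

```python
# Hypothetical sketch of the low-rating intervention trigger.
# The star cutoff and refund window come from the case study text;
# everything else here is illustrative, not Udemy's real code.

REFUND_WINDOW_DAYS = 30      # Udemy's money-back guarantee window
LOW_RATING_THRESHOLD = 3.0   # 0.5-3 stars counts as a poor rating

def should_offer_help(rating: float, days_since_purchase: int) -> bool:
    """Return True when a review should branch into the help/refund flow."""
    is_low_rating = 0.5 <= rating <= LOW_RATING_THRESHOLD
    within_refund_window = days_since_purchase <= REFUND_WINDOW_DAYS
    return is_low_rating and within_refund_window
```

A flow like this would only decide *when* to intervene; what the intervention actually offers (a refund, similar courses, or something else) is exactly what the user research below set out to answer.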

The current review system.


I was both a UX designer and the domain expert in customer support, and I wrote the user research questions alongside the UX research team. I worked with product designer Tiffany Hsieh and product manager David Kim, later joined by principal product manager for discovery, David Little. On the UX research team, I worked with Claire Menke and Michelle Fiesta.

CURRENT STATUS, as of October 2017

We are running user research via interviews to understand what students expect when they rate a course poorly. Medium-fidelity wireframes are being shared with participants at the end of the interview phase.

Whiteboarding to get all of our ideas aligned after 15 minutes of individual brainstorming.


I've been at Udemy for nearly four years, so I know the product well. I was approached to help design this feature because of my expertise on the student side as well as my knowledge of the instructor side, the other half of the marketplace. A couple of years prior, I had run a test to see whether proactively reaching out to students who had rated a course 0.5-2.5 stars, with a reminder about the refund guarantee and suggestions for similar courses, would change their long-term value. Since these were all manually sent emails, the initial test wasn't statistically significant, but nearly 50% of the emails we sent got a response, either confirming the refund request or thanking us for reaching out. Some students even went on to purchase more than one additional course.

With this in mind, I shared all of my previous findings with David K. and Tiffany. The three of us sat down together and brainstormed how we could make this approach more scalable.

A quick mockup with the old styleguide.



  • Tiffany, David, and I brainstormed together to understand the user's pain points, and the results were added to the one-pager.
  • We created job stories: "When I have a bad experience with a course, I want..."
  • Thinking it would be straightforward, Tiffany and I drafted some simple user flows based on data pulled by David, who wanted us to have wireframes within the week.
  • I told the team which of the systems they wanted to implement had and hadn't been built by our engineering team, since many had originally been requested by our support team. Having worked on the community team before, I also warned them about the backlash we might see from instructors.
  • We met with the engineers to see if what we had sketched out was feasible. More questions came out of it.
  • I asked the support team the same feasibility question. More questions came out of that too, and new stakeholders were added.
  • PPM David Little joined the party.
  • David L. asked us what the real question was, and we realized it was time to go back to user research to fully understand what students want before making any more wireframes.


  • Tiffany, the Davids, and I met with the engineering team to understand our technical limitations. Once again, we went back to the drawing board and threw out our wireframes.
  • I worked with the support team's analyst, Jessie Huang, the manager, Sachi Yokose, and the product liaison, Noah Ferns, to keep the support team looped in at all times, since they would likely end up with more manual work than expected from this project.
  • We drafted a new user flow, this time with a couple of routes we need to validate through user research. We can't just offer refunds off the bat, and we can't just have students manually write in to support.
  • I drafted the user interview questions to gather more insight into what a student would expect after rating a course poorly. Would they want the refund right away, or something else? Claire and Michelle helped me edit the questions and make them more precise.
  • Interviews are currently underway.