Skip to 0 minutes and 1 second We’ve talked about using analytics to amplify and to scale an existing product-market fit. And today we have with us Eric Qi Dong, most recently with about.me. Eric ran the product management and analytics program there. And Eric is a Darden alumnus, I will mention. Eric, thanks for joining us. >> Thank you for having me. >> You have a rich background in this area. Can you tell us a little bit about what it’s like to approach problems that you think are amenable to resolution with analytics? >> Yeah, definitely. I think when it comes to solving a data analytics problem, you need to take a step-by-step approach. First, you need to understand: what is the question?
Skip to 0 minutes and 47 seconds Oftentimes, asking questions like, what is the purpose of asking for this data? In what format do you want that data? can help clarify the question. And second, look at the question and see if you have the data to solve that problem. >> What do you do if you don’t have the data to solve the problem? >> Well, then you need to think about where to get the data. >> All right. >> It could be from using a third-party integration with a certain product.
Skip to 1 minute and 24 seconds Or working with engineering to build a new solution, where you need to put your project management hat on: how do you prioritize this task or this project against everything else you’re running? Is it more important? Can I get the resources to do it? >> Yep, that’s great. And do you have any examples that you think fit some of those patterns that you could tell us about? >> Yeah, absolutely, so I can talk about my experiences when I was at about.me. >> Mm-hm, great. >> So yeah, at about.me, we helped people create personal pages for different purposes.
Skip to 2 minutes and 57 seconds The other part is to monitor the web analytics funnel and understand where we can do a better job with A/B testing in order to reach that goal. >> Uh-huh. >> Yeah, so along those two directions, what I was really focusing on was working with designers, making recommendations from existing user data. As well as designing A/B tests, monitoring the key metrics, and making recommendations on how to increase them by running those tests. >> And how did you keep all that organized? And did you have a particular cadence or set of cycles you would use to do this?
Skip to 3 minutes and 43 seconds >> Yeah, absolutely, so we worked end to end. Starting from building a hypothesis, using user surveys to validate the hypothesis. And then it comes down to wireframing. After we have a good wireframe, we build a working prototype where we can do internal testing as well as user testing. That’s where my recommendations from existing user data really come into play, because I can inform the user experience design with existing data. And then, before we decided to release 100% to production, we really wanted to know how the new feature, the new process, might perform.
Skip to 4 minutes and 37 seconds So we ran it in an A/B testing manner, meaning that we opened a small percentage of the production traffic to the new feature, which is the new onboarding process. >> Mm-hm. >> And compared the key metrics with the existing one. So before we decided to push that 100% to production, we already had confidence in what the increase in the data would be. That really helped give us the confidence to push it 100% to production and to celebrate it with a lot more customer communication and even PR releases.
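The rollout Eric describes — exposing a small slice of production traffic to the new onboarding flow and comparing its activation rate against the control before a 100% release — is typically checked with a two-proportion z-test. Here is a minimal sketch with hypothetical traffic numbers (not about.me’s actual tooling or data):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Compare activation rates: control (a) vs. new onboarding flow (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both flows convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical split: ~9% of traffic sees the new flow.
p_a, p_b, z = two_proportion_ztest(conv_a=400, n_a=10000, conv_b=95, n_b=1000)
print(f"control {p_a:.1%}, variant {p_b:.1%}, z = {z:.2f}")
# z > 1.96 indicates significance at the 5% level (two-sided)
```

Only once z clears the significance threshold would the team have the kind of confidence Eric mentions for a full production rollout.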
Skip to 5 minutes and 23 seconds >> And how did it work out? What kind of outcomes did you get? >> Yeah, so we were very excited. After we pushed to production and let it run for more than a week, we were happy to see that the goal of the project, an increase in user activation conversion, jumped by more than 130%. That was a big- >> 130%? >> 130%, more than that. So that was a big win for the team, and it was a collaborative effort.
Skip to 5 minutes and 55 seconds Not only data analytics, but also the redesign of the user experience, engineering, product marketing. All teams were involved, and we happily celebrated that moment. >> Eric, one thing we like to do is talk about three tips. What are your three tips for product managers who want to use these analytics-driven techniques to make their products better? >> Absolutely. So when I was at about.me, my job was really focused on using data to grow those key metrics, so my recommendations are going to focus on that side. I think, number one, you need to understand that growing the data, growing the metrics, is not just one person’s effort.
Skip to 6 minutes and 37 seconds Sometimes you need to involve different cross-functional teams, not only to be on the same page, but also to look at the data from different angles. Number two, you can’t underestimate small wins. When you’re running A/B testing, you may only get a 5% increase. But all those small wins will finally accumulate into a big win. I think number three is you can’t grow your product on top of a bad product-market fit or a bad product. For example, you can’t really get a 100% increase in your product just by running A/B tests. A lot of the time you have to be involved with redesigning the user experience, focusing more on the product-market fit.
Skip to 7 minutes and 26 seconds And doing a better job of product marketing those efforts. So yeah, those are my three recommendations. >> Great practical advice. Thanks, Eric, for joining us. >> Thank you.
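A brief aside on Eric’s second tip: small wins accumulate because each percentage lift multiplies the baseline left by the previous one, so the effect is compounding rather than additive. A quick illustration with hypothetical numbers:

```python
# Ten successive A/B-test wins of 5% each compound multiplicatively:
# a hypothetical illustration of "small wins accumulate into a big win".
lift = 1.0
for _ in range(10):
    lift *= 1.05  # each win raises the metric by 5% of its new baseline
print(f"cumulative lift: {lift - 1:.0%}")  # cumulative lift: 63%
```

Ten 5% wins compound to roughly 63%, well above the 50% a naive additive estimate would suggest.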
Eric Qi Dong on disciplined analysis for the product manager
© Copyright Rector and Visitors of the University of Virginia