Asia founded DemandMaven five years ago to help early-stage teams with their growth:
"We work with early-stage teams on reaching their growth milestones, troubleshooting growth, and finding growth opportunities."
Prior to DemandMaven, Asia held a range of roles. She worked in demand generation and marketing at a couple of VC-funded startups, and before its successful acquisition in June 2021, she also served on the board of Moz.
Setting growth sprints
Similar to sprints in agile methodology, growth sprints are two-week cycles focused on growth-related activities. Asia emphasizes that two weeks is just a baseline and can be extended if needed.
Determining the North Star KPI
But how do you organize your team around the growth sprint to get it done? The first step is to set the North Star KPI, a metric directly correlated with growth.
Asia says we can think of growth in terms of five levers:
- Sales: closing deals, trial conversions
- Customer retention
- Growth: hiring, tools, processes
And when determining your North Star KPI, moving one of these levers or metrics should result in a dramatic change in growth.
"For some teams it's as simple as an increase in traffic. For others, it's going to be activation rates. So when we convert more customers into paying customers, we get more revenue. Or when we get more people who visit the website to jump into a trial or book a demo, we get more customers."
Asia also emphasizes that there's no one-size-fits-all when it comes to determining the North Star KPI:
"It has to be based on your circumstance, your team, and your business."
Identifying the focus area
After setting the North Star KPI, you now have to identify the focus area or the area of business that the KPI most correlates to:
"For example, if we decide that our North Star KPI is increasing traffic, then our focus area is now acquisition. Our focus area could also technically be the number of blog posts that we publish."
Asia cautions that you have to be careful during this step because it's really easy to incentivize execution without actual results.
She also says that there's no template for this process because it would depend on your business and circumstances:
"This is a process. This isn't like a one-size-fits-all But we always start with one North Star KPI and one focus area. And then from there, we get to do the fun work of reverse engineering how we think we're gonna get there."
Adjusting sprints to your circumstance
Growth experiments also don't have to be done in big chunks:
"Let's say for example, your focus area is activation and your North Star KPI is increasing conversions from trial to paid. In two weeks, you could easily and completely change your signup flow. You could also write a couple of new onboarding emails."
But activities like overhauling the website design would require a couple of months to really show an effect.
The timeframe also depends on your circumstance:
"If you're a solo founder, two weeks is probably not enough and you probably need a little bit more time. If you have a few contractors or freelancers that you can work with, then two weeks could actually be enough time to really prioritize things. But at the same time, your mileage may vary."
Doing and learning fast
"For the teams that are able to make the two-week sprint time work, they are able to move incredibly fast and learn very fast. The important thing here is to not let the fear of execution hang you up from actually doing it."
Asia also emphasizes that the goal of a growth sprint is not just about the execution itself, but what you learn from these experiments:
"It's not just that we execute, it's that we learn from it. The learning is actually where the secret sauce is. The learning is where the magic happens."
When you don't define a hypothesis before executing your growth sprint, you won't know for sure whether it worked:
"The gray space of not knowing makes us afraid to do something else. That's the mindset that we have to shift in a growth sprint like this."
Dealing with pricing experiments
Pricing experiments are a bit tricky to execute: while they don't require much product work, they can really affect your bottom-line revenue.
At DemandMaven, they take the Price Intelligently and ProfitWell approach to pricing and modeling.
"We always start with both qualitative and quantitative data. The first thing that we do is we have to get a sense for what the value metric is, specifically from a product perspective. When we think about that North Star KPI, chances are that if it's something product-related, it could actually be indicative of your value metric, which is what you use to determine pricing."
The value metric is the activity that correlates most strongly with the value a customer gets from the product, typically something a very active user does on a daily basis.
Quantitative data: product events
"When we look at product events, what are the things that people who are very high value use and do the most? That can give us a sense of what the value metric might be."
Qualitative data: price sensitivity survey
"The analysis that you do from this survey will tell you your price elasticity. It will also tell you what the ideal price would be, based on how customers respond."
Qualitative data: product-market fit survey
Asia also recommends adding another layer to the qualitative data by running a product-market fit survey similar to what Superhuman and Basecamp did:
"The most critical question was 'how would you feel if you could no longer use this product?' Respondents could choose: very disappointed, somewhat disappointed, not disappointed at all.
If you were to combine both of those in the same survey, you could get price sensitivity, not just globally, but also based on customers who said 'very disappointed' and who also fit the ideal target profile that you're looking for."
Combining these two frameworks is powerful because even with 100 respondents, it can give you statistically meaningful data very quickly.
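The segmentation Asia describes can be done with a simple group-by over the combined survey: bucket respondents by their product-market fit answer, then compare willingness to pay per bucket. A minimal sketch (the rows and dollar figures are hypothetical):

```python
from statistics import mean

# Hypothetical combined survey rows: PMF answer plus a willingness-to-pay figure.
rows = [
    {"pmf": "very disappointed", "wtp": 40},
    {"pmf": "somewhat disappointed", "wtp": 20},
    {"pmf": "very disappointed", "wtp": 50},
    {"pmf": "not disappointed at all", "wtp": 10},
]

def segment_wtp(rows):
    """Average willingness to pay per product-market fit segment."""
    segments = {}
    for r in rows:
        segments.setdefault(r["pmf"], []).append(r["wtp"])
    return {pmf: mean(values) for pmf, values in segments.items()}

result = segment_wtp(rows)
# 'very disappointed' respondents average $45, versus $20 and $10 for the others.
print(result)
```

In practice you would also filter the "very disappointed" bucket down to respondents matching your ideal customer profile, as the quote suggests.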
Qualitative data: feature preference analysis
"What are the features that ultimately drive revenue? And you can do that with a feature preference analysis. You're basically asking customers, 'out of all of our features, what's the most valuable to you and what's the least valuable to you? And you can only pick two.'"
Because the survey could get long if you include all the questions, you can split it in two and focus on one type of information per survey.
The results of these surveys would help you gain important insights about features and pricing:
"You can get a level of analysis where not only could you say, 'these are the features that drive value, we should probably price around them.' But you also get a sense for what the best paying customers say versus not the best paying customers. So not only do you get a sense for what to charge them, but you also know around what features to charge them."
What to do when you don't have enough survey respondents
Don't send the survey out once. Instead, send it out periodically:
"Over time, you do get enough data to make a good decision. But 20 to 25 responses would be enough."
If you're still lacking responses, one option would be to pay for survey responses:
"Not my favorite, but it is an option if you're really desperate."
Another option is leveraging audience interviews:
"You wouldn't be able to ask questions like, 'would you pay this much?' But what you could do is gather data about what they're already paying for. That could also be an interesting way to get insights. However, this is risky especially if you don't necessarily have enough survey responses, like to how people are using your product."
You could also focus on qualitative interviews and research:
"For example, maybe your audience are not survey takers, but they might be talkers. And they would talk to you on a 10 or 15 minute phone call. That could be enough to get an understanding of how they think about something."
Working with a North Star KPI attached to many levers
For a metric attached to many levers, like the number of demos booked, Asia says the focus area should narrow down which inputs could really improve that KPI:
"There's so many different inputs to demos. Could be the number of outbound calls, the number of cold email sent, or the number of contacts or relationships created at conferences. It could be so specific and independent. But if you know what the North Star one is, your focus area should really tear up into that."
Which is why you have to really think about your focus area carefully:
"You have to pause and ask yourself about, 'if I'm choosing this KPI, what are the different ways that this isn't or is working? And what are the areas that seem to need the most help? And that's how you determine the focus area."
Asia shares her experience with a client who chose a North Star KPI that was product activity and was specific to the number of accounts:
"So it was the number of customers who connected their financial accounts within 24 hours. If that number is low, we do not make money. That's just speedy activation rate"
For this client, activation rate alone wasn't enough as a North Star KPI. It had to be a specific action, because without one it's difficult to determine which experiments to run to improve the North Star:
"That's where choosing your North Star KPI is going to make or break your growth function.
But for the most part for this client in particular, that activity is the most critical. Because if we were to choose something else, we would optimize in a different area and we wouldn't actually get to the heart of the problem which is: this product experience has to get customers to this moment."
Asia's team narrowed this down by pulling up their data on Amplitude:
"We have a 99% drop-off rate within 24 hours when someone uses the product for the very first time. That's ridiculous and that's the thing that's hemorrhaging. It's not the traffic.
The fact that prospects will come into the product for the very first time and do quite literally nothing within 24 hours is a huge problem. That basically means no one's using it and they will never get the value."
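The 99% drop-off figure comes from a 24-hour activation check like the one this client tracked. As a sketch of how such a rate can be computed from raw product events (the event names and data are hypothetical, and this is not Amplitude's actual API):

```python
from datetime import datetime, timedelta

# Hypothetical event log: (user_id, event_name, timestamp).
events = [
    ("u1", "signup", datetime(2024, 1, 1, 9, 0)),
    ("u1", "connected_account", datetime(2024, 1, 1, 15, 0)),  # within 24h
    ("u2", "signup", datetime(2024, 1, 1, 10, 0)),
    ("u2", "connected_account", datetime(2024, 1, 3, 10, 0)),  # too late
    ("u3", "signup", datetime(2024, 1, 2, 8, 0)),              # never connected
]

def activation_rate(events, key_event="connected_account",
                    window=timedelta(hours=24)):
    """Share of signed-up users who performed the key event within the window."""
    signups, activated = {}, set()
    for user, name, ts in events:
        if name == "signup":
            signups[user] = ts
    for user, name, ts in events:
        if name == key_event and user in signups and ts - signups[user] <= window:
            activated.add(user)
    return len(activated) / len(signups)

print(activation_rate(events))  # ~0.33: only u1 activated within 24 hours
```

A 99% drop-off corresponds to this function returning 0.01, which is what made the problem so visible in the data.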
After seeing the quantitative data, they also looked at their qualitative data:
"When we interviewed customers, I think the number one thing that people complained about was the product experience. Like the UI and the UX itself were very buggy, it didn't work all the time, and it was also very underwhelming."
Aside from how the product looked, they also determined that the onboarding process was not smooth for users.
Asia shares that even though she might have her own suggestions on how to move forward, they don't automatically happen. Talking it out with clients is key if you want something done:
"Part of the decision-making process is having those discussions, talking it out with clients, making commitments, and doing it."
Changing your North Star KPI
Some people believe that you should only change the North Star KPI once you've made enough progress toward your goal.
"This take might feel spicy for some people. But there are some that would argue that as you get more data and insight, you could decide that actually this other metric over here is what's painful."
For example, after four to five weeks of collecting insights and data, Asia's team tried to convince a client to also look at another set of metrics while still working on the North Star KPI.
"I had to convince the growth team that yes, traffic is a priority. Yes, we're going to put some effort towards that because marketing can't code. So while this is actually happening in the background, there's other things for us to be doing."
So whether you do the split focus strategy or just focus on the North Star KPI, that would be up to your team:
"I don't think that there's a perfect answer for when or why or where that happens. But I do think it's a strategic decision and one that should heavily involve the founders so that they're aware, they agree, and they're aligned.
Shifting the focus of the team can be very costly. If it happens too much, it can be costly for even a very small team. So we have to weigh the pros and cons, and figure out the tradeoffs."
Changing the focus area
There could also be cases where you keep the same North Star KPI but pick a different focus area for your next growth sprint. For example, if you still want to improve the traffic-to-trial conversion rate, you can test areas like design, copy, messaging, experience flow, specific pages of your website, and more.
But even if you wanted to test every area, you shouldn't spend years pursuing a single North Star KPI:
"Don't spend two years on just doing that, and that alone. Get to a place where you can feel like you've dramatically improved this North Star KPI, and you can move on to the next thing that needs help."
Thanks for listening! If you found the episode useful, please spread the word on Twitter mentioning @userlist, or leave us a review on iTunes.