Growth Studies

This is going to be a slight departure from our typical growth study. Our other growth studies generally cover the entire growth history of a company, but this one begins in January of this year, after we'd experienced three frustrating months of stagnating growth. Up until this point we had largely grown on the back of Twitter, reaching around 90,000 MAU (Monthly Active Users) in the first year. But increasingly the hands-on execution of our Twitter playbook was preventing us from generating and testing many new ideas.

This is ironic, because I generally advise other growth teams to "test, test and test some more." I explain that rapid testing is the most effective way to predictably grow a business. But it took the frustration of stagnating growth for us to fully embrace a High Tempo Testing program at GrowthHackers.com. Starting in January we set the goal of launching at least three new experiments per week. We use the term experiment broadly to include new initiatives, product feature releases, and yes, A/B tests. When we're running high tempo, we're pushing on all vectors to find growth in a systematic, measurable way. The graph below shows what happened:

GrowthHackers.com Growth

We grew from 90,000 MAU to 152,000 MAU in about eleven weeks without spending a dollar on advertising or increasing the size of our growth team. The only thing that changed was the velocity of our experimentation. We achieved that velocity improvement by adopting the high tempo testing framework to help us prioritize and close the loop on the growth experiments that we thought would move the needle.

What is High Tempo Testing?

Most growth professionals understand that growth hacking isn't about a specific tactic; it's about a process of discovering which tactics will be effective for growing your business. This discovery happens through testing potential growth drivers, ranging from new channels to new sharing and engagement features deep within a product. The more tests you run, the more you learn about how to grow your business. So it's only natural to want to run as many tests per period of time as possible. That's the idea behind high tempo testing.

My inspiration for high tempo testing comes from American football and the "high tempo offense." Perhaps the best example is the Oregon Ducks college football team. From the moment they hit the field they are go, go, go... The opposing teams often end up on their heels, and Oregon is able to find weaknesses in their defense. It's exciting and, frankly, kind of exhausting to watch. High tempo testing is approached with similar energy and urgency to quickly uncover growth opportunities to exploit.

Requirements for High Tempo Testing

Unbridled Ideation

High tempo testing starts with identifying experiments worth running. If one person is responsible for all ideation, they generally run out of ideas within a few weeks (at least ideas worth testing). Even within a dedicated team, ideation can become ad hoc and stagnate without a process in place that acts as a wellspring of new ideas. To jump-start fresh ideas for growing GrowthHackers.com, we opened the ideation process up to a much broader team. Naturally this includes everyone on the growth team, but we also invited our interns, engineers, salespeople, support people and more: essentially any employee or contractor in the company who wanted to participate. But we didn't stop there. We included advisors, friends and even some of the most active community members. We collected hundreds of ideas from this group.

Interestingly, different groups contributed ideas based on insights that were unique to them. Some people gravitated toward ideas to improve retention, while others focused on acquisition or referral. The ideas themselves ranged from ways to improve community engagement to product enhancements. For example, one of the engineers suggested we use oEmbed to make it easier to embed highly engaging multimedia content. Most of the rest of us didn't even know that was possible and had suggested a more manual media upload process.

Canvas Ideas

After the ideation started to slow down, we decided to add a leaderboard to celebrate the people who were generating the most ideas. The leaderboard re-energized the ideation process, so that we are now adding about 2-3X more ideas per week than we can test. Therefore, we are continuing to grow our backlog of hundreds of ideas.

Canvas Leaderboard

Prioritizing Ideas

Eventually we faced the challenge of sorting through all of the ideas. With hundreds of ideas, it was tough to know where to start. We scored every new idea based on its potential impact on growth, our confidence that it would succeed, and how easy it would be to implement. In other words, an ICE score (Impact, Confidence, Ease). Once the ideas were scored, it was easy to sort them and find the best ideas in the areas we thought we should prioritize first (like activation).

Canvas list view
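To make the scoring concrete, here is a minimal sketch of ICE prioritization in Python. The idea names and scores are hypothetical examples, and this is not the actual Canvas implementation; it just shows the score-and-sort mechanic described above.

```python
from dataclasses import dataclass


@dataclass
class Idea:
    name: str
    impact: int      # 1-10: how much could this move our growth metric?
    confidence: int  # 1-10: how sure are we that it will work?
    ease: int        # 1-10: how easy is it to implement?

    @property
    def ice(self) -> float:
        # The ICE score averages Impact, Confidence, and Ease.
        return (self.impact + self.confidence + self.ease) / 3


# Hypothetical backlog entries (not our real scores).
backlog = [
    Idea("oEmbed for multimedia posts", impact=7, confidence=5, ease=6),
    Idea("Move email collector to top of page", impact=8, confidence=6, ease=9),
    Idea("Redesign onboarding flow", impact=9, confidence=4, ease=2),
]

# Sort so the highest-ICE ideas surface first.
for idea in sorted(backlog, key=lambda i: i.ice, reverse=True):
    print(f"{idea.ice:.1f}  {idea.name}")
```

A single blended score keeps prioritization fast; some teams weight the three components differently, but a plain average is a common starting point.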

Managing Tempo

In the beginning we often got bogged down trying to launch a big idea. Each time our tempo slowed down (missing our goal of launching three tests per week), our growth slowed down. The solution was a weekly Growth Master meeting. This meeting is not for ideation. It is for processing the backlog of ideas that we already have, prioritizing and assigning the list of ideas for the next week. Think of it like a sprint kick-off in agile development. The backlog has already been groomed and the user stories are already written; it's time to build the weekly growth agenda (or sprint in agile terms).

Some growth experiments can be implemented by the marketing team, others by product managers, and others require deep engineering skills. Balancing the workload of top priority experiments across different teams makes it much easier to hit our tempo goal. While our goal is to launch at least three tests per week, we generally shoot for five tests per week. That way if we hit roadblocks on some tests, we still hit the tempo goal. Many weeks we are able to get all five tests out the door.

Our weekly Growth Master meeting is also used to review the tests from previous weeks. Were we able to implement all of the tests planned? Have any tests concluded, and if so, what were the key lessons learned? This accountability and planning meeting is an essential part of keeping the growth process on track.

Capturing Learning

All of the completed experiments are ultimately stored in a knowledge base, ensuring that we capture the learning and don't keep repeating the same tests. The knowledge base is also very helpful when adding new members to the team, so they can understand what has been tested, what worked and what didn't. Analysis has started to become a bottleneck for us, so we recently added a role responsible for analyzing completed tests and managing the knowledge base. Previously this responsibility was shared by the team.

Canvas knowledgebase
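A knowledge base like this can start very simply: a structured record per completed experiment, plus a lookup so nobody re-runs a finished test. The sketch below is hypothetical (the record fields, outcome labels, and `already_tested` helper are illustrative, not our actual system), though the email-collector example comes from a real winning test mentioned in the comments.

```python
from dataclasses import dataclass
from enum import Enum


class Outcome(Enum):
    WIN = "win"
    LOSS = "loss"
    INCONCLUSIVE = "inconclusive"


@dataclass
class Experiment:
    name: str
    hypothesis: str
    outcome: Outcome
    learning: str


knowledge_base = [
    Experiment(
        name="Email collector placement",
        hypothesis="Moving the collector to the top of the page will lift signups",
        outcome=Outcome.WIN,
        learning="Top placement collected far more emails than the control",
    ),
]


def already_tested(name: str) -> bool:
    # Check the knowledge base before scheduling a test, to avoid repeats.
    return any(e.name.lower() == name.lower() for e in knowledge_base)
```

Even a flat list like this answers the two questions the article raises: what have we already tested, and what did we learn from it.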

High Tempo Testing Drives Results

Using a High Tempo Testing approach we've been able to add as many new MAUs in the last 11 weeks as we did in the first 32 weeks following the launch of GrowthHackers.com. High tempo testing isn't easy, but it is extremely effective. It requires both a rigorous process and a system to manage that process. Much like lean manufacturing, it focuses on throughput and small batch sizes. By adopting high tempo testing, we eliminated the bottlenecks at the ideation and experimentation steps to unlock more growth. If your team is truly committed to driving growth, High Tempo Testing is the most predictable way we've found to hit aggressive targets. If you'd like to be an early beta tester of the system we used to manage our high tempo testing program, you can sign up here.

Written by Sean Ellis
  • Trevor Owens (over 2 years ago)

    This is easily my favorite article on GrowthHackers ever. What tool are you guys using in the photos? As a science nerd I'd love to know more about your process and some of the tests where results surprised you. Did you come up with a Riskiest Assumption and a Success Criteria for the tests? How many of the tests were production-tests vs user research? What do you think are the drawbacks of ICE?

    • Sean Ellis (over 2 years ago)

      Hey Trevor, I'm head down preparing for a board meeting on Wednesday. But wanted to let you know I saw your questions and will get back to them. Thanks for the enthusiastic feedback on the article.

  • Morgan Brown (over 2 years ago)

    Great insights on how we've been able to catalyze growth through the process of high tempo testing. I think the framework has really created a culture of speed, accountability and insights that have taken us to the next level.

    I think any team looking for a process to unlock growth can use this as a blueprint to re-energize their efforts.

  • Sean Ellis (over 2 years ago)

    Growth is hard (even for us), but the effort is way better than the alternative of slow or stagnating growth. It's super exciting to check stats when you are growing quickly. For example, this weekend we had more than 20% week-over-week growth from last weekend.

    Let me know if you have any questions or advice on implementing a high tempo testing process and culture. Once you commit to it, it's very powerful.

  • David Kolodny (over 2 years ago)

    This is so awesome. Amazing study with such strong and practical takeaways. I will be applying several of these processes with my team!

    + I love to see you help scale this process via the Canvas tool. GrowthHackers practicing what you preach! Amen!

    • David Kolodny (over 2 years ago)

      Have you found any trends with which tests generate the strongest impact? Either by category of test (i.e. ones carried out by the marketing team vs. engineering)? And how about by ICE score (is high scoring on 'I' a better predictor of success than 'C', for instance)?

      • Sean Ellis (over 2 years ago)

        Actually we've found that it's pretty hard to predict the impact. Big tests that take weeks to implement are often failures (despite high impact prediction) and easy tests that take an hour or less to implement are often highly impactful (despite low impact prediction). So it becomes really important to hit your velocity goals by layering in some really easy tests to implement.

  • Kennett Kwok (over 2 years ago)

    Love this post.

    At what size do you consider something a "test"? Could it be as small as testing a gif on Reddit?

    • Sean Ellis (over 2 years ago)

      Thanks Kennett. I consider anything that can potentially move the needle on growth and can actually be measured to be a test. So it could be as small as testing a gif on Reddit if you feel like you can measure the impact.

  • Luke Thomas (over 2 years ago)

    Cool result - how are MAUs defined in this context? Is this based on unique visitors in Google Analytics that come back, registered active accounts, etc?

    I'd be curious to hear what criteria you picked for this metric. Thanks!

  • Amit Sonawane (over 2 years ago)

    Great write-up, @sean and team!

    I can imagine a future where startups crowdsource this ideation process to their outside community aka customers and incentivize the ones that move the needle.

    Excited for Canvas!

  • Glenn Gillen (over 2 years ago)

    Would love to know how you balance such high test tempo with having high confidence of impact. How are you sure the tests are having the desired impact so quickly/have a low p-value, and either aren't impacting each other or aren't temporarily successful just because of the "woo, different!" effect?

    The results obviously speak for themselves. Just curious if there's more specifics on implementation and iteration that's worth digging into. Thanks.

    • Sean Ellis (over 2 years ago)

      Thanks for the comment Glenn. A lot of tests end up in the "inconclusive" column in our knowledge base and we have a pretty big backlog of tests still to be analyzed. But so far we have 10 declared winners. Some are clear winners, like when we moved our email collector from the bottom of the page to the top and saw a 700% increase in emails collected vs the control. But we definitely have a lot of room for improvement in how we analyze the results of individual tests and how we communicate those results in a way that is easily understandable for the entire team. The minimum we run any test is one week, but generally we try to run them quite a bit longer. Also remember that some tests are external (i.e. running on Twitter), so we can get big sample sizes pretty quickly.

  • Guillaume Cabane (over 2 years ago)

    Hey, fantastic momentum here!
    One question though: with all these tests running, how do you manage the measurement of success?
    Do you have a control group of users who are never shown any test, or do you apply each test only to a fraction of your users?

    • Sean Ellis (over 2 years ago)

      Good question @gyu - it's easy for the whole testing process to get pretty chaotic. Generally we're running the tests as A/B tests.

  • Madhav Bhandari (over 2 years ago)

    This is absolutely great @sean and the GH team! I've kept track of all my growth experiments in a spreadsheet until now, but that's definitely going to change.

    I know it's kind of a huge favor to ask for, but it'll be great if you could make your Growth Canvas public (only the tests that have already worked for you and you are ready to share it with the community) sometime in the future. Instead of sharing them as an article, sharing the canvas itself would be so much cooler. Or maybe have a public growth canvas for the GH community where everyone can share their tests and results, and the community member at the top of the leaderboard gets to see your Canvas!

    • Sean Ellis (over 2 years ago)

      Thanks @madhav for the idea to make public Canvases. I'm not sure I'd want to publish our whole Canvas publicly, but I definitely see a day when people trade access to their Canvases.

  • Dimitris Kalogeros (over 2 years ago)

    nice post

  • Jonathan Peacock (over 2 years ago)

    Love the idea of High Tempo Testing @sean. Can't wait to try out Canvas. I'm currently using a @bbalfour inspired spreadsheet (https://docs.google.com/a/zibbet.com/spreadsheet/ccc?key=0Ai8bEvrgpnWUdDZ5TDRqUFpLSmtxT0NKaVE3WElmelE#gid=4) and would love a more streamlined approach.

  • Adam Szabo (over 2 years ago)

    Love this. I was about to ask if you're making the system public when I saw the signup link for Canvas at the bottom :)

    I have close to a hundred little growth ideas for my projects on a spreadsheet, Canvas will be a perfect fit for them.

  • Michael Sarlitt (over 2 years ago)

    *Awesome* growth hack for getting Canvas signups that actually had some real value. I was literally thinking, "I wish they'd say what tool they use for prioritization" as I was reading.

    Might want to highlight the "sign up here" link. Nearly left before signing up because I missed it at first!

    • Sean Ellis (over 2 years ago)

      Thanks for the suggestion. Just made the update. We should invite you to submit more ideas in our Canvas for Canvas...

  • Laura Roeder (over 2 years ago)

    How large is your team? It feels like we would need a huge team of writers, designers and engineers to execute 5 tests per week! (Not to mention the people to analyze the results.)

    Also, do you have a process for following up on long-term results? We find that the LTV of customers from different channels can be very different, but of course this data can't emerge until 6 months later.

  • Jovan Sterling-Noel (over 2 years ago)

    Are all tests evaluated and ended within one week, or do you carry some over to the next week? I can't imagine all tests could be completed within one week, as tests vary.

  • Gergana Dimova (over 2 years ago)

    Thanks for sharing, Sean!

    That was very helpful and inspiring. :) I have one question, though: How would you actually implement the high tempo testing with a very small team (think 2 people)? Is there any alternative you'd recommend for small teams?

    Thanks. I'd appreciate your reply.

  • Zoran Knezevic (over 2 years ago)

    My good friend shared this article and I was hooked by the title... after reading it through I see a lot of 'reverse engineering' here.

    First, after the New Year you decided to try new things and actually WORK. Then you saw encouraging results and kept pushing until you landed where you are now, and now you try to formalize it as a 'method', give it a buzz name and all that good stuff.

    • Sean Ellis (over 2 years ago)

      Glad the title hooked you. That was certainly the goal! Agree it's not rocket science and is a lot of work. I think the biggest difference is that before January we were divided between executing a known playbook and trying to do a few big win tests. Starting in January we set a weekly test launch goal and did everything possible to hit it. It wasn't necessarily more work though. It was a different kind of work where we mixed small tests and big tests to hit a weekly testing goal.

      The overall method isn't actually that unusual. It's the same approach that most of the big growth teams use like Facebook, Uber and Pinterest. Though I'm not sure how much they focus on a weekly test launch goal. Once you start running tests at a high frequency, then suddenly you need to become a lot more systematic about the whole thing...

      • Zoran Knezevic (over 2 years ago)

        Hi guys!

        Just to be clear: very inspirational article, and it is a slap in the face to many of us... who actually ran our last experiments in 2014 :D and now work on platform, scalability etc., losing focus on very essential things like new features.

    • Morgan Brown (over 2 years ago)

      I don't think anyone on the team believes that we were all on holiday before the new year. Working on the wrong things, ok, but definitely still working hard.

  • Kyle Crawford (over 2 years ago)

    Like everyone else said, this is fantastic!

    I was particularly thrilled to see your take on ideation and making the case that it is actually quite difficult to consistently generate new ideas unless that responsibility is distributed. Too often we forget that execution requires ideation, and in my mind the high tempo testing makes that more apparent than most.

    I've curated thousands of ideas in the last month through my startup so I definitely empathize with what a painstaking process that is.

    All around this is great!! Go team!

  • Jarno Oksanen (over 2 years ago)

    Awesome point of view and a great article.

    Being from a smaller market area (a nation of 5 million with a unique language), I'm sometimes a bit sceptical about whether we would be able to implement these ideas. Have any of you found a base requirement (existing customers, target group, workers...) that should be in place before hitting the High Tempo Testing lane? What I'm really worried about is statistical significance and the length of testing when the sample size is too small.

  • brett gordan (about 1 year ago)

    super detailed study, love it :)
