
"Our SEO goal is to help billions of internet users discover Pinterest and find value in it as a visual bookmarking tool. Over time we’ve found the only way to verify if a change affects the user behavior positively is to run an A/B test. Unfortunately, we didn’t have similar tools to test search engine behavior, so we built an experiment framework and used it to turn 'magic' into deterministic science."

  • JS

    James Spittal

    over 4 years ago #

    Does this seem kind of high-risk to any of the other SEO-oriented growth hackers here? There's some key information missing from that blog post.

    If the 'User-Agent' indicates that it is 'googlebot' - do they show the variation? Or the control? Or exclude that from the A/B test?

    I'm thinking this could accidentally trigger 'website cloaking' type 'penalties'/problems.

    • RP

      Rich Pocock

      over 4 years ago #

      What you'll find is that the A/B testing is on a page basis, not a user basis. So page x will be served the control and page y will be served the test. The two pages are about two different topics, and over a large enough sample size the results should give a good indication of how well they perform.

      Using this approach, they aren't doing any cloaking or variant testing of an individual page, hence the low risk.

      • JS

        James Spittal

        over 4 years ago #

        Interesting.

        So, basically, instead of using cookies with ~10yr expiry and a virtual coin-flip to determine CONTROL or VARIATION, there's a "database" of URLs and, hypothetically speaking, they might set half of their URLs as part of the experiment and exclude the other half from it?
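If that's how it works, the bucketing could be sketched roughly like this (purely hypothetical, not from the article): hash each URL deterministically so a given page always lands in the same group, with no cookies or per-user state involved.

```python
import hashlib

def bucket_for_url(url: str, experiment: str = "seo-test-1") -> str:
    """Assign a URL to 'control' or 'experiment' by hashing it.

    md5 is used (rather than Python's salted built-in hash()) so the
    assignment is stable across processes and deploys.
    """
    digest = hashlib.md5(f"{experiment}:{url}".encode()).hexdigest()
    return "experiment" if int(digest, 16) % 2 == 0 else "control"

print(bucket_for_url("https://example.com/topic/gardening"))
```

Every crawl of the same URL then sees the same treatment, which is what would keep this clear of cloaking concerns.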

        According to the article, they look at organic traffic to determine the 'test' results.

        How do they determine statistical significance? I guess it's possible to compare organic traffic across all enabled pages before and after the change and test whether the difference is statistically significant.
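One simple way to answer my own question (just a sketch with made-up numbers, not what Pinterest describes): compare mean daily organic sessions between the experiment and control page groups with a two-sample z-test.

```python
from math import sqrt
from statistics import mean, stdev

def z_score(a: list[float], b: list[float]) -> float:
    """Approximate z-score for the difference in group means."""
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(a) - mean(b)) / se

# Invented daily organic-session counts for the two page groups.
experiment = [120, 135, 128, 140, 133, 138, 129]
control = [118, 121, 117, 122, 119, 120, 116]

# |z| > 1.96 roughly corresponds to significance at p < 0.05.
print(round(z_score(experiment, control), 2))
```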

        They don't mention looking at rankings - but presumably that would make sense too.

        • RP

          Rich Pocock

          over 4 years ago #

          Yeah, that's the way it looks from how they've set it up.

          At such a large scale, covering many possible searches, using traffic as an indicator isn't 100% conclusive, but it would definitely show changes in traffic.

          What they show in figure 2 is that they had been tracking traffic across the control and variant pages for some time before making the change. This gives a baseline to work from.

          From there, it looks like they've made best-guess estimates of the improvement in traffic.
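That baseline idea could be made concrete with a difference-in-differences estimate (my sketch, invented numbers): subtract each group's pre-period traffic from its post-period traffic, then compare the two changes so any pre-existing gap between the page groups cancels out.

```python
def did_lift(exp_pre: float, exp_post: float,
             ctrl_pre: float, ctrl_post: float) -> float:
    """Lift = (experiment change) - (control change), in sessions/day."""
    return (exp_post - exp_pre) - (ctrl_post - ctrl_pre)

# Invented averages: experiment pages went 100 -> 130 sessions/day,
# control pages drifted 98 -> 102 over the same window.
print(did_lift(exp_pre=100, exp_post=130, ctrl_pre=98, ctrl_post=102))  # → 26
```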

          • JS

            James Spittal

            over 4 years ago #

            @Rich: Very interesting Rich. Thanks for your insights/thoughts. Do you have an e-mail address to discuss further?

            Some thoughts come to mind now that I think I understand some more of the specifics of how their platform works:
            - On sites where the crawl rate is low, Google's crawler won't notice the changes for a while, so detecting the traffic improvement (if any) may be 'delayed'; it seems you'd want to adjust the test duration accordingly.
            - On lower-traffic websites, detecting the improvement via organic traffic may be difficult and could lead to false positives. Using accurate rank tracking as a secondary metric might help (though that's not a perfect metric either; you have to account for 'natural' fluctuation, etc.).
            - If other changes are happening on various pages throughout the website at the same time, there are likely to be some 'conflicts', and you won't know whether a traffic improvement/decrease is due to X or Y.
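The first two points can be quantified with a rough power calculation (the standard two-sample formula; the inputs here are invented): the smaller the expected lift relative to day-to-day noise, the longer the test has to run.

```python
from math import ceil

def days_needed(sigma: float, lift: float,
                z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Days of data per group to detect `lift` with ~80% power at p < 0.05.

    n = 2 * (z_alpha + z_beta)^2 * sigma^2 / lift^2
    """
    return ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / lift ** 2)

# A sizeable lift against moderate daily noise is detectable fairly quickly;
# a small lift on a noisy, low-traffic site may take impractically long.
print(days_needed(sigma=50, lift=20))
print(days_needed(sigma=50, lift=5))
```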

            Has anyone ever attempted to build something like this, or seen anything like it available off-the-shelf or as a proof of concept? I haven't.

            Somewhat related I did find this:
            http://moz.com/community/q/how-do-you-a-b-test-for-seo

            Historically, I always discounted A/B testing for SEO as somewhat impossible, so it's very interesting to see Pinterest taking this seriously and going to the lengths of building an in-house platform for it. Kudos to them for that.

            Anyone interested in some public and/or private experimentation? Perhaps whacking together an open-source proof-of-concept?

            I will probably be doing so myself to scratch my own itch/curiosity.
