Earlier this year, we asked the awesome /r/BigSEO redditors to tell us about the most embarrassing SEO fails they've ever witnessed. And boy, did they respond! If you want to read something fun and educational, you'll love this. These are the themes we see come up time after time:

  • Migrations gone wrong (classic)
  • Canonicalization gone wrong (often misunderstood)
  • Duplicate content issues (can be tricky)
  • Content pruning gone wrong (interesting!)
  • Implementing pagination using JS (oops)
  • Search and Replace mistakes (classic)

If you witnessed (or caused ;)) any juicy SEO fails yourself, please share them in the comments :)

  • Igor Gorbenko

    about 1 month ago

    Does staring at a GA traffic drop for like a week without checking the website count?
    Because back who knows when, my first website went from 10 visits/day to 0 visits/day and stayed that way for a week, until I realized I'd forgotten to pay the hosting bill :D

    • Steven van Vessum

      about 1 month ago

      Hahaha, that definitely counts as a fail. Not sure if it's an SEO fail, but it's a good fail that made me chuckle for sure :)

  • Tad Chef

    about 1 month ago

    Oh my! All technical changes are a minefield for sabotaging your Google rankings. Don't let web developers make them without due process. That's the lesson I learned here.

    • Steven van Vessum

      about 1 month ago

      Exactly, and have the proper tools in place. I mean, we all make mistakes, but I think we should strive to minimize them. I like how they approach this in aviation: every type of accident can only occur once, because after each accident they update the regulations and processes. We should be doing the same in SEO :)

  • Dejan Gajsek

    about 1 month ago

    Had the same problem with our in-house team. Of course we couldn't be found on Google when our robots.txt disallowed crawling.
    🤷

    • Steven van Vessum

      about 1 month ago

      Ouch, that's a bad one. Fortunately (yes, there's good news too) it doesn't lead to URLs being removed from the index right away, but over time Google will display the classic "A description for this result is not available because of this site's robots.txt – learn more." message. That's not good for your CTR, and obviously not letting search engines access new and existing pages is horrible for your overall SEO.
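
      For reference, the usual culprit is a robots.txt that blocks all crawlers from the whole site (a minimal example):

        User-agent: *
        Disallow: /

      That single Disallow: / rule is enough to keep every compliant crawler away from all of your pages; scoping it down (for example, Disallow: /admin/) or removing it fixes the problem.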

  • James Hackett

    about 1 month ago

    Great article... glad to see it posted here now.

  • Martijn Scheijbeler

    about 1 month ago

    My favourite one from back in the day: we rebuilt the whole site in another language, and after a week we figured out that the pages were rendering fine for every user agent except Googlebot. Somehow the default was to serve it the API data as XML instead of rendering a regular HTML page, and within a week we lost 30% of the traffic on these pages. We were able to fix the issue within minutes of discovery, as it basically was a setting somewhere that we didn't know needed to change.

    But it was a good wake-up call from then on to always make sure that big implementations are also tested from an SEO point of view. In the end the recovery was quick, within 2-4 weeks we recovered most of our positions and got all of our traffic back, but a mistake like that still costs you a few percent on your yearly SEO results.

    • Steven van Vessum

      about 1 month ago

      Thanks for sharing @martijnsch, that's an interesting one for sure!

      It's one of those things that you don't typically check, but it has to go wrong once before you can add it to your release checklist.
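
      For anyone who wants to put this on their checklist right away, here's a rough sketch of the kind of post-release check that would have caught it: fetch a few pages while pretending to be Googlebot and fail loudly if anything other than HTML comes back. The URLs are placeholders, and I'm assuming Python with the requests library:

        import requests

        # The real Googlebot desktop user-agent string.
        GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                        "+http://www.google.com/bot.html)")

        def check_renders_html(url):
            """Fetch url as Googlebot and verify a regular HTML page comes back."""
            response = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)
            content_type = response.headers.get("Content-Type", "")
            # A misconfigured user-agent negotiation layer can hand bots raw XML
            # or JSON instead of the rendered page -- exactly the bug above.
            assert response.ok, f"{url} returned {response.status_code} to Googlebot"
            assert "text/html" in content_type, (
                f"{url} served '{content_type}' to Googlebot instead of HTML")

        if __name__ == "__main__":
            # Placeholder URLs -- point this at a sample of your own templates.
            for page in ["https://www.example.com/", "https://www.example.com/blog/"]:
                check_renders_html(page)

      Note that this only catches user-agent-based misconfigurations; if the negotiation happens on other headers, you'd mimic those too.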

  • Nikola Roza

    11 days ago

    Great article, Steven, thanks!

    My two key takeaways:

    Technical SEO is powerful, but like most things technical it's largely shied away from. This leaves plenty of room for... mistakes. Like the scary ones that wipe whole sites away.

    They are common, but solutions to them exist and are effective, which is comforting. You just need an expert close by.

  • Elena Kyrzhilova

    1 day ago

    Great post, Steven. I saw your contest on /r/BigSEO :) Glad to see it posted! BTW, the winner is my favorite fail from the list.

  • Jason Quey

    about 1 month ago

    Curious, do you think you can get around duplicate content issues when publishing the same piece on another site (in other words, syndicating) if you target different keywords?

    • Steven van Vessum

      about 1 month ago

      Hi @jdquey! What do you mean exactly: take an existing piece of text and edit it so it's focused on different keywords? How much of it would be rewritten?

      Let me know :)

      • Jason Quey

        about 1 month ago

        1. I publish an article on Site A.
        2. I publish the same article on Site B, changing 10-20% of the text and the target keyword to something similar (for example, "lead magnet" to "opt-in bribe").
        3. Could both articles get organic traffic?

      • Steven van Vessum

        about 1 month ago

        Since it's impossible on GH to respond directly to your last comment, here goes: yes, you may very well get organic traffic to both articles. The degree to which they can both be successful differs case by case; sometimes you see the strangest things happening. I'd say, give it a try :)
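
        One thing to keep in mind: if you'd rather have one version get all the ranking credit instead, the usual tool is a cross-domain canonical. As a minimal illustration (the URL is just a placeholder), Site B would put this in its <head>, pointing at the original on Site A:

          <link rel="canonical" href="https://site-a.example.com/lead-magnet-guide/">

        Without it, search engines pick a version themselves, which is exactly the case-by-case behavior I mentioned.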
