Last month was a huge one for SEO. Not just because of what went down, but because of what has been announced. Everything points to a landscape that will be changing massively over the next few years. Miss this roundup and you might find yourself left in the dust.

First, you’ll be enhancing your insight with the biggest case studies of the last month. Data answered some big questions this month, including what changed in Google’s core update, how Google is developing its sense of searcher intent, and whether ranking #1 is really as valuable as it seems to be.

Coming up next are the best SEO guides. You’ll learn how to improve keywords with internal links and which backlink checker has taken the lead in 2019 after the biggest players raced to add new features.

Finally, we’ve got the latest news. Is Ahrefs entering the search engine game? Are big changes coming to anchor text signals? Is Google still quietly indexing sites blocked by robots.txt?

By the time you get to the end of this roundup, you’ll have all the answers. But if you really want to get ahead of the game, you won’t want to miss this year’s Chiang Mai SEO Conference.

Chiang Mai SEO Conference 2019 Early Bird Tickets on Sale

The Chiang Mai SEO Conference returns for its third year with over 750 big players including agency owners, super affiliates and e-commerce magnates. They’ll be powering two days of presentations, workshops and unmissable networking opportunities.

You’ll meet dozens of people who are regularly featured in these roundups because of their news-making moves. People like Matthew Woodward, Charles Floate, Gael Breton, and of course, yours truly.


Past events have completely sold out, so make sure you pick up your tickets today. VIP tickets sold out in 2 hours, and there are a few deluxe tickets still left. Buy now, and you can snag some early-bird savings.

Now, let’s move onto the roundup items, starting with a case study that examines what happened in the quiet March Google update.

What has changed in Google’s March Core Update

The medical update that hit Google several months back inspired a lot of wailing and gnashing of teeth.

A lot of sites were hit hard, some unfairly, because they were legitimate providers of services. At least a little relief seems to have come in this recent update. Affected sites showed ranking improvements of as much as 30% overnight.

This case study was thoughtfully prepared for us by Eric Lancheres of Traffic Research. Over 1.7 million sites competing for 3000+ keywords were analyzed to determine what exactly was targeted in this quiet update, and who saw the most changes.

The results seem to suggest that the update targeted the entire domains of the affected websites. While many sites hit by the medic update were affected again, it does not appear that there was a rollback of that update or its effects.

Instead, whatever factors were being used to target those sites were modified to produce a more measured outcome. That may mean that if you were hit and didn’t recover, Google has better reasons than it had before to believe that you deserved it.

That seems to be further confirmed by results in the case study that showed some sites that survived the medic update unscathed are now being hit with penalties.

Affiliate sites and commercial results seem to have gotten the worst of the new penalties, but sites with notably poor content and some questionable UX practices also suffered traffic losses.

The case study closes with some helpful tips to recover from the latest penalty, and those are worth checking out if you’ve taken damage from either the original medic update or this one.

Updates like these are part of Google’s ongoing mission to understand user intent. Now, thanks to Moz, we have a better understanding than ever of what signals it’s watching to judge that.

How do SERPs change as Google senses searcher intent?

Google has been working toward more dynamic SERPs for a long time. The top results can change dramatically based on where the user is searching from, and what Google assumes that they want.

That was the focus of this case study by Moz. It looks at tens of thousands of results in dozens of niches to determine how Google thinks it can sense intent, and how it behaves in response. The results were a little different than the researchers expected.

It’s worth going through the entire case study, but one of the big takeaways was that ecommerce category pages (pages that feature small profiles of several products instead of the individual products) are the most prominent results for most types of searches.

That this was true for retail keywords at all levels of intent—from information-gathering to purchasing—seems to suggest that Google is either still developing the ability to sense intent, or that its own data shows something that the rest of us are still figuring out.

Of course, the stress of wondering how to make it to the top spot for all searchers may not be worth it. As this next case study shows, being #1 isn’t necessarily all it’s cracked up to be.

Is ranking #1 as worth it as it seems? (maybe not)

For more than a decade now, the #1 spot in Google results has been the most prized jewel of the biggest companies and their marketing partners. Billions of dollars have been spent on campaign strategies, content rollouts, ad buys and social media endorsements.

But of course, that spot is only a means to an end, the end being the traffic and conversions advantage that comes from holding it. Doesn’t the top result always get the most traffic? No, this case study reveals.

In fact, the top spot guarantees the most traffic in fewer than half of all cases.

In this Ahrefs study, over 100,000 queries were examined to determine that the #1 spot gets the most traffic 49% of the time. The #2 spot claimed the most traffic in nearly a quarter of all cases, and #3 could reliably claim the most traffic in around a sixth of all cases.

This raises some big questions, like: how much more is the top performer paying for that spot than the next closest competitors? If something can be done to claim the most traffic from lower spots, would that make a better investment? The entry tries to answer those questions, too.

The case study closes with an excellent guide portion that explains how the 2nd, 3rd and even lower results are collecting more traffic than the SERPs winner just by providing a more useful and more relevant introduction to their pages.

And speaking of guides, we’re now ready to jump into the best ones from this month. First, an exhaustive look at the best backlink checkers.

The best backlink checkers ranked

Competing in modern search takes a deep understanding of your link profile and how it compares to your biggest competitors. Fortunately, that’s easier than ever with a slew of powerful tools that are designed to automate and visualize every part for you.

You can use Moz Pro, Ahrefs, SEMrush or Majestic—but which one should you use? You are not going to find another guide that goes further in depth to answer that question than this one.

Every contingency is analyzed in this guide from Backlinko.

Which one finds the most links? Which one does it the fastest? Which one includes the coolest special features? Which one finds you the most link opportunities? Which one saves you the most money? Those are all questions that are answered with data, detail, and plenty of colorful pictures.

As you can probably imagine, there is some dispute both in the conclusion and in the comments about which one truly comes out on top. My personal favorite tool wasn’t Brian’s, either.

Some features that are exclusive to one tool are indispensable in certain niches, and nothing is going to persuade those users to try a different one.

We’ve talked about backlinks quite a bit by now, but what about internal links? The next guide in the list is one of the best breakdowns you’ll find of why internal links matter and how to maximize their value.

How to improve keyword rankings with internal links

Links have always been a vital part of SEO, but it’s only with recent updates that they’ve become the stars of any campaign. As the overwhelming power of links has become clear, more and more analytical work has gone into understanding why they work and how.

This has led to deconstructions of every part of links from the balance of entire profiles to the precise terms used in the anchor text. Now, the people at SEMrush are diving even further down the rabbit hole with a close examination of just internal links and their effect.

This guide will introduce you to some analytical terms you can use to ultra-optimize your internal links, including how to use them strategically, how to recognize and use “bubble keywords”, and how to track the results of your efforts.

With the guides now out of the way, it’s time to start looking at the big SEO news drops of the month. I can tell you that most of the SEOs I know are talking about one in particular: Ahrefs is planning to enter the search engine game.

Ahrefs is building a search engine to compete with Google?

Ahrefs has been a big player in the SEO game for a number of years. They’ve built an impressive data-mining operation and a tool that helps you explore almost everything that can be known about how a site ranks, from keywords to backlinks and content.

Their tool is primarily used to improve performance on Google, but with their next step, they’re looking to use everything they’ve learned to become a direct competitor.

The announcement came right near the end of March. CEO Dmitry Gerasimenko nailed a Martin Luther-style list of grievances to the wall of his Twitter account.

In a long thread, he cited his frustration with Google’s approach to privacy and its tendency to treat major publishers as beggars instead of partners.

While there is next to no news on what an Ahrefs engine will look like (and was it born at CMSEO? :p), he promised a commitment to both privacy and profit-sharing with major content creators and the networks (like Facebook) that host them.

If this happens, it’s probably something that won’t be launching for years, but it could mean changes in the short-term. Google has faced pushback on its practices recently, not just from publishers but from its own employees. News like this could make for some important reforms.

Google is constantly making changes, after all. Our next item is a lot more low-key, but very important if you’ve been working on your anchor text game.

How is Google planning to change anchor text signals?

The strategy of anchor-text generation has lately been treated like a science, but there’s a big change coming that may shake up the entire game. This entry was inspired by a question John Mueller answered recently in a QA session.

Google recently applied for an update to an existing patent focused on something called “annotation text”. This is defined as the text that surrounds links, which crawlers currently use to develop additional context.

This text may soon form a much larger basis for the relevance of links, meaning that the unanchored text that surrounds a link may be just as important as the few words that are chosen for the anchor.
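To make the distinction concrete, here’s a hypothetical bit of markup (my own illustration, not taken from the patent). The anchor text is just the two linked words, while the annotation text is the surrounding sentence that may carry relevance signals of its own:

```html
<p>
  For a deeper look at internal linking, I recommend
  <!-- anchor text: only "this guide" -->
  <a href="https://example.com/internal-links">this guide</a>,
  <!-- annotation text: the unlinked words around the anchor, -->
  <!-- e.g. "internal linking", "crawl depth", "topical relevance" -->
  which covers crawl depth and topical relevance in detail.
</p>
```

If Google leans harder on annotation text, a link wrapped in a relevant, descriptive sentence could outperform the same anchor dropped into unrelated copy.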

This has been an assumption for years and an anchor text practice of my own.

Google isn’t being very clear on this at the moment, but thinking about it now may help you future-proof the content you’re creating.

John Mueller took the chance to communicate on another issue around the same time he was providing an answer on this one: Is Google indexing sites blocked by robots.txt?

Is Google still indexing sites blocked by robots.txt?

John fielded this one on Twitter. A troubled SEO noticed that a site fully blocked from its launch was already showing up in Google’s index. It’s always been understood that robots.txt gives directives and that Google will respect them. Learn more about how to control crawling and indexing here.


In this case, there does not appear to be any cause for alarm. John responded pointing out that the information gathered (titles and descriptions) doesn’t need to be crawled to be retrieved. He promised to look into reports that snippets were somehow being drawn.

Many commenters piped in to suggest additional ways to prevent a site from being indexed, including by using noindex directives and password protections. At this time, the crawling itself (if it happened) doesn’t appear to be intentional, though it’s always helpful to take the recommended extra steps to be sure.
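As a quick sketch of why those extra steps matter (the paths and domain below are made up): robots.txt only blocks crawling, not indexing, so a blocked URL can still be indexed from external links. A noindex directive is what keeps the page out of results, and Google can only see it if the page isn’t also blocked from crawling:

```text
# robots.txt — stops Googlebot from crawling these paths,
# but does NOT stop the URLs from being indexed via links
User-agent: *
Disallow: /private/

# To actually keep a page out of the index, allow crawling
# and serve a noindex signal instead, either in the page head:
#   <meta name="robots" content="noindex">
# or as an HTTP response header:
#   X-Robots-Tag: noindex
```

Password protection goes one step further by making the content unreachable to any crawler at all.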



Article by

Matt Diggity

Matt is the founder of Diggity Marketing, LeadSpring, The Search Initiative, The Affiliate Lab, and the Chiang Mai SEO Conference. He actually does SEO too.