The last couple of months have been crazy for everyone. Now, we can see the light at the end of the tunnel and it’s time to hit the ground running. Need to gather some momentum? That’s where we come in. This month’s roundup is filled with all the latest tricks to carry you forward.
First, we have the month’s top guides. You’ll see the latest keyword research tips from Moz, get actionable advice for building tier 2 links, and pick up a new technique to track anchor text for incoming links.
After that, we have some data-packed case studies. Learn some new ways to analyze Google Core Updates, what the data says about how Coronavirus has impacted SEO visibility, and why you should “launder” irrelevant content by turning it into duplicate content.
Finally, we’ll catch you up on the latest news. Get the latest on the recent algorithm updates. Then, learn what Google has to say about old content, find out how much GMB impressions have collapsed, and discover whether Google is still not indexing new content.
The Keyword Research Master Guide
Now is a great time to start thinking about the fundamentals of what we do. We can come out of this with better standards and practices, and pretty much everything begins with keyword research.
This timely guide from Moz has some ideas for how you can optimize your methods to be in line with the latest updates.
It covers a range of topics including:
- How to distinguish between valuable and time-wasting keywords
- How to pull “seed” keywords from search data and competitors
- How to transfer what you’ve learned into a content strategy
- How/when to make on-page tweaks
- What tools provide the most essential data (Moz may be a bit biased here, naturally)
Not all this information will be new, especially if you’re a regular reader of our SEO news roundups. However, Moz is one of the biggest names in SEO, and their guides can influence what clients expect from SEO agency reports.
Better to be ahead of the curve, right? If that’s where you like to be, RankClub’s Tier 2 link building guide offers you an efficient way to take the lead in links.
RankClub’s Tier 2 Link Building Guide
A lot of factors go into the value of any given link. This guide makes the case that you can supercharge the best links you’ve built by pointing additional links toward those placements rather than your website.
According to RankClub, these secondary (tier 2) links can make links that are already strong into long-term authority engines.
Even better, they claim that this strategy is now simpler to pull off than it was in the past. PBNs have replaced GSA spam, web 2.0 blogs, and complex 3-4 layer tier schemes as a one-stop source for effective tier 2 links.
The guide covers how to recognize proper tier 1 and tier 2 opportunities, some options for variety, and even some data from a tier 2 experiment.
While the tier 1 and tier 2 links you build are important, you also need to analyze the links that you didn’t build. The next guide in line will tell you how to track the anchor text for incoming links—using only Google Tag Manager.
Tracking the anchor text for incoming links in Google Tag Manager
The anchor text that strangers are using to link to your site can tell you a lot about what information visitors find most valuable.
This data can be key to your anchor text optimization efforts, and this guide by David Vallejo tells you how you can finally start collecting it.
This process uses a custom HTML tag to make an XMLHttpRequest to a PHP file that scrapes each visitor’s referring source and copies the anchor for your review.
If some of that sounds like gibberish, don’t worry. While this method does require some coding, all of the code is provided for you. You can simply paste it into place.
As the author himself states, the code is pretty rudimentary. You or your developer may be able to improve on what’s there.
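The guide’s own code is a GTM custom HTML tag plus a PHP endpoint. As a rough, hypothetical illustration of the server-side step (this is a sketch, not the author’s actual code), here’s minimal Python that takes the HTML of a referring page and pulls out the anchor text of every link pointing at your domain:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class AnchorCollector(HTMLParser):
    """Collects the anchor text of every <a> whose href points at target_domain."""

    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self._in_target_link = False
        self._buffer = []
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            # Only track links whose host matches the domain we care about.
            if urlparse(href).netloc.endswith(self.target_domain):
                self._in_target_link = True
                self._buffer = []

    def handle_data(self, data):
        if self._in_target_link:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_target_link:
            self.anchors.append("".join(self._buffer).strip())
            self._in_target_link = False

def anchors_pointing_at(referrer_html, target_domain):
    """Return the anchor texts in referrer_html that link to target_domain."""
    collector = AnchorCollector(target_domain)
    collector.feed(referrer_html)
    return collector.anchors
```

In the real setup, the endpoint would first fetch the referring URL (taken from `document.referrer` in the GTM tag) and then run an extraction like this over the response before logging the anchors.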
If that still sounds a bit complicated, don’t worry. The author has also provided a video for the entire process.
If this guide whets your appetite for backend optimization, you’ll also enjoy the next one. Ahrefs has found that some SEOs are breaking their own pagination. Here’s how to find out if you’re one of them, and how to fix it.
SEOs Are Breaking Pagination After Google Changed Rel=Prev/Next — Here’s How to Get It Right
Google announced last year that it no longer recognized the rel=prev/next markup.
In response, SEO teams across the web began changing their implementation. According to Ahrefs, that may have been a mistake.
First, they point out that Google isn’t the only party that ever used this markup. Other search engines still do, and it remains part of Americans with Disabilities Act (ADA) compliance and part of the standards published by the World Wide Web Consortium (W3C).
Google apparently has other ways to derive the same information, but the other parties that still rely on the markup won’t be able to replicate that anytime soon.
Furthermore, a lot of SEOs who set out to change their implementation may have made things worse in unexpected ways. The guide contains some plans to help if you:
- Canonicalized the first page
- Orphaned your own content with misapplied noindex tags
- Blocked crawling and cut off later pages
For each one, it also tells you how to find out if you’ve made any of these mistakes.
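To make the three failure modes concrete, here’s a hypothetical audit sketch. The page dicts and field names are assumptions for illustration, not Ahrefs’ tooling: each entry describes one page in a paginated series, and the function flags which of the mistakes above it exhibits.

```python
def pagination_mistakes(pages):
    """Audit one paginated series for the three common rel=prev/next mistakes.

    `pages` is an ordered list of dicts:
        {"url": ..., "canonical": ..., "noindex": bool, "crawl_blocked": bool}
    Returns a list of (url, problem) tuples.
    """
    problems = []
    first_url = pages[0]["url"]
    for page in pages[1:]:
        # Mistake 1: later pages canonicalized to page one, which tells
        # search engines the whole series is a duplicate of it.
        if page["canonical"] == first_url:
            problems.append((page["url"], "canonicalized to first page"))
        # Mistake 2: noindex on later pages orphans the content they link to.
        if page["noindex"]:
            problems.append((page["url"], "noindex orphans linked content"))
        # Mistake 3: blocked crawling cuts crawlers off from later pages.
        if page["crawl_blocked"]:
            problems.append((page["url"], "crawling blocked"))
    return problems
```

A page with a self-referencing canonical, no noindex, and no crawl block comes back clean; anything else shows up in the report.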
That covers the guides for this month, but the upcoming case studies teach their own kinds of lessons. First, let’s look at an argument for why you need to change the way you analyze core updates.
Google Core Updates: Stop Analyzing Them Like It’s 2013
If you’re an SEO, you’re probably pretty confident in your understanding of traffic, and how to tell when and how an update has affected it.
This chart-packed piece by Dan Shure may put that to the test. He shows you how to break down your averaged traffic, and how to analyze whether an update was better or worse for it than the first glance suggests.
He argues that analyzing traffic changes at the domain level is one of the least-insightful ways to judge whether a core update was good or bad for a website.
The problem with assessing domain traffic is that there is rarely a domain-level solution for the effects of updates. Instead, different pages are taking hits or climbing based on other factors.
He suggests (and lays out) a plan for segmenting your traffic by:
- Page Types
- Query Types
- Device Type
This, along with E-A-T-based analysis, can give you a lot more information about what an update really did to your site. Thanks to the algorithm update that just dropped, you’ll have a chance to put this into action. More on that in the news items.
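A minimal sketch of that segmentation, assuming simple URL- and keyword-based classifiers (the `/blog/` rule and the `acme` brand token are placeholders for illustration, not from the article): feed it per-row traffic before and after the update, and it totals the change within each segment.

```python
from collections import defaultdict

def page_type(url):
    """Crude page-type classifier; adapt the rules to your site structure."""
    if "/blog/" in url:
        return "article"
    if "/product/" in url:
        return "product"
    return "other"

def query_type(query, brand="acme"):
    """Branded vs. non-branded split on a placeholder brand token."""
    return "branded" if brand in query.lower() else "non-branded"

def segment_deltas(rows):
    """rows: iterable of (url, query, device, clicks_before, clicks_after).

    Returns a dict mapping (dimension, segment) -> total click change,
    so one row contributes to its page-type, query-type, and device buckets.
    """
    deltas = defaultdict(int)
    for url, query, device, before, after in rows:
        change = after - before
        deltas[("page", page_type(url))] += change
        deltas[("query", query_type(query))] += change
        deltas[("device", device)] += change
    return dict(deltas)
```

Even this toy version surfaces the pattern Shure describes: a flat domain-level total can hide one segment climbing while another takes the hit.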
Unfortunately for us all, core updates are likely less responsible for big traffic changes lately than the Coronavirus. The next case study examines the impact that has had on visibility.
How the Coronavirus Has Impacted SEO Visibility Across Categories
Nearly every niche is experiencing volatility right now because, as this case study points out, the intentions and motivations of searchers are in flux.
Some goods no longer fit in most family budgets, while others (like gaming consoles) have seen massive surges, and even shortages, because of their increased value during the quarantine.
The study examines the effect the Coronavirus has had on 18 niches, including all of the following:
- Addictions & Recovery
- Alternative & Natural Medicine
- Food & Drink
- News & Media
- Nutrition & Fitness
- Restaurants & Delivery
- TV, Movies & Streaming
- Video Games, Consoles & Entertainment
A series of charts break down not only who the winners and losers are, but also how much they’re winning or losing and which domains are benefitting.
Altogether, it provides some great data you can use to target your advertising or affiliate marketing to where the money is moving right now.
If you are responding to sudden changes in visibility, especially on older sites, you may be struggling to manage some old content. The next case study has some ideas for what you can do.
Launder irrelevant content by turning it into duplicate content
There aren’t many satisfying methods yet for dealing with expired content. Oliver HG Mason has an idea for a workaround: turn it into duplicate content and apply what already works there.
His method leans on the fact that duplicate content (when canonically linked) more reliably passes along ranking signals than irrelevant content.
He claims that by…
- Replacing irrelevant content with content from a preferred destination page
- Making the copied page canonical to the destination
- Waiting for Google to confirm the relationship
- And then 301’ing the copied page directly to the destination page
…you can create a relationship where ranking signals flow from the old page to the new one.
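Sketched in Python for concreteness (the helper names are illustrative and this is not Mason’s code), the two artifacts the sequence swaps between are the canonical tag on the copied page and, once Google confirms the relationship, the 301 that replaces it:

```python
def canonical_tag(destination_url: str) -> str:
    """Markup for the <head> of the copied page during the canonical phase."""
    return f'<link rel="canonical" href="{destination_url}" />'

def redirect_response(destination_url: str):
    """Once Google has confirmed the canonical, the page is swapped for a
    permanent redirect: status code plus the Location header."""
    return 301, {"Location": destination_url}
```

The sequencing is the whole trick: the canonical phase establishes the duplicate relationship first, and only then does the 301 send signals directly to the destination page.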
This isn’t the kind of trick that tends to have a good shelf life, but it’s simple and could mean some solid short-term gains if you have a site with older content.
Of course, some best-laid plans can stop working right when they’re needed. We just saw a new core update land, and we’re still working on what that means.
May 2020 Core Update Drops
The May 2020 Core Update (the second one so far this year) has started rolling out, according to Google. What does that mean for you? If you’re in SEO, now is the time to keep your eyes on your own and your clients’ stats.
As usual, Google linked their piece from last year about how to respond to core updates. It has some reasonable advice, but naturally, it’s not specific about anything that’s being targeted or relieved.
That work will fall to the community, which is always up to the task. Until then, we recommend you sit tight: within the next week or two, there should be answers, and possibly rollbacks of whatever caused any problems for your site.
The community still has a lot of analysis to do, so look for more news about this update in our next news roundup.
For now, let’s look at what Google is saying about what to do with old content.
Google Advice On Old Content On News Sites: Remove, Noindex Or Leave It
In a recent Reddit “ask-me-anything” appearance, John Mueller took a question about what a site with massive archives of ancient content should be doing with it.
As he does in many cases, he pushed back on the idea that there was something “to do” about it. He suggested that reduced interest from crawlers was natural for old content, and not necessarily a problem that could be optimized away.
He did offer a few suggestions, but stressed that they wouldn’t be appropriate in all situations for all sites:
- Focus on the new content (since that’s what’s likely driving the traffic of your site)
- Remove old, unused content
- Keep category pages indexed, but noindex the articles
The first recommendation is going to apply in almost all cases, but the other two may not even make a difference if older content is already ignored.
Not all traffic problems require hard decisions. Some, you just can’t do much about. For example, the traffic changes that happen because of global plagues.
Google My Business Impressions Down 59%
Anyone in local search has probably taken some hits, but the damage was hard to quantify without data. This release by SEJ gives us a breakdown of what happened by the numbers.
It’s not pretty.
Total GMB impressions are down by a whopping ~60%. Though some services matter more in a crisis, the initial data suggests that nearly every industry got some black eyes at first.
Clicks that follow the (already severely reduced) impressions were themselves down by 31%, and a fifth fewer people were using click-to-call to reach out.
Despite these revelations, the analysis ends on a high note. The leveling out of these trends appears to be happening already. The trend lines are stabilizing and even picking up for many niches.
If you were hit by local traffic drops, you may not need to make any changes. And even if you did make some, they might not have registered: as we learn in our final news item, Google is having some trouble indexing new content.
Google Not Indexing New Content Again
Barry Schwartz of Search Engine Land is documenting an ongoing issue where Google is failing to index new content for unusual periods of time.
Using several examples, he demonstrates that new content is taking close to an hour to be indexed once it has been published. This has been happening even on major and authoritative sites.
This is not a new problem. It recurred throughout last year, but (until recently) it appeared to have been solved.
For its part, Google has acknowledged the problem and claims to be addressing it. We’ll have to wait and see whether that means a permanent solution in the short term.
Got Questions or Comments?
Join the discussion here on Facebook.