I may have said this before, but this is one news roundup that you aren’t going to want to miss. The last 30 days were filled with important events that are only going to become more consequential as we work our way into the new year.
I’m going to start with all of the latest developments—the big changes that you can’t afford to miss if you’re involved in the search industry.
After that, I’m going to take you through a small collection of the month’s best guides that you can put into action on your sites right now. Finally, I’ll cover a few of the most interesting analysis pieces and discuss what they mean for the future.
Coming up first: the changes being made to Google’s Search Console.
What is being added and removed from Google’s Search Console?
Webmasters of all types use Google’s search console and the tools that come with it to measure site traffic, diagnose issues and find areas for improvement.
It stands to reason that Google’s own tools are going to give you the most insight into what they want from you, so big changes to those tools are always worth watching.
In this case, the changes are pretty significant. Some legacy features that are still used are going to be removed, and a slew of other tools are being added as new features or replacements.
Here’s the short version for those who rely on the console.
The old Crawl Errors Report is gone, but don’t worry, because all the old functions are going to be part of the newly added Index Coverage Reports. Sitemaps Data will also now be included in these reports as part of the rollout.
Collecting this data in a single place will eliminate quite a few steps between locating the errors and actioning them, so I think it’s a positive change overall.
Google is also launching a revamped URL inspection tool that will, among other things, allow for simpler reviews of URLs and “Fetch as Google” functionality. It will streamline the process of getting new pages into search results.
The new inspection tool will also cover the identification of blocked resources, so the standalone tool for that is going away.
A few items, however, are simply gone forever and not coming back. The Property Sets tool is going away, but I don’t know anyone who uses it, so I doubt it will cause any fuss. Google will also no longer offer HTML improvement suggestions, but that’s mostly automated at this point, anyway.
Speaking of Google killing things, a lot of noise has been made about Google “killing the URL”. If you’ve been wondering what that was all about, you don’t want to miss the next item.
What does it mean that Google is “killing the URL”?
This recent Wired article is a good introduction to something that Google has been working on for quite a long time, and some of the progress they’ve made. To understand what this means, it’s helpful to begin with why it’s happening.
It’s not just that URLs are getting increasingly long and complicated. The primary issue is phishing.
An SEO probably doesn’t need an introduction to this topic, but phishing is what happens when a third party uses email (and increasingly, websites) to falsely represent themselves as a person or service. Phishing is most often done to collect login data or payment information.
Phishing used to be thought of as something that only affected the online-inexperienced like dear old Grandma. Lately, though, it’s advanced to the point that a phishing site is nearly indistinguishable from the real thing—right down to what looks like the correct URL.
It’s not as easy anymore to escape phishing just by looking for a badly-misspelled word or the wrong top-level domain (for example, .cm instead of .com). URLs can be spoofed. Whole websites can be swiped by an automated bot the second a registration lapses.
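To make the lookalike problem concrete, here’s a toy sketch of the idea (this is not Google’s actual algorithm — the domain list, similarity threshold, and matching method are all my own illustrative choices): flag any domain that nearly, but not exactly, matches a well-known one.

```python
# Illustrative lookalike-domain check. Flags domains that closely resemble
# a known brand domain without matching it exactly — the conceptual shape
# of a phishing warning, not Google's real implementation.
from difflib import SequenceMatcher

KNOWN_DOMAINS = ["google.com", "paypal.com", "amazon.com"]  # toy list

def looks_like_phish(domain: str, threshold: float = 0.85) -> bool:
    """Return True if `domain` nearly matches a known domain but isn't it."""
    for known in KNOWN_DOMAINS:
        if domain == known:
            return False  # exact match: the genuine site
        if SequenceMatcher(None, domain, known).ratio() >= threshold:
            return True   # near match: suspicious lookalike
    return False

print(looks_like_phish("google.cm"))   # → True (the .cm trick from above)
print(looks_like_phish("google.com"))  # → False (the real domain)
```

A real detector would also handle homoglyphs (Unicode characters that render like Latin letters), which simple edit distance misses entirely.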
What Google is after with its effort to “kill the URL” is a second form of ID for a website. Specifically, a form that doesn’t rely on the user to stay on top of all the ways that a URL can be manipulated.
For the time being, the focus is on a secondary warning system that tattles on URLs sharing features common to phishing websites. Chrome will soon display warning messages on URLs that look deceptively similar to popular websites.
That may be just the first step in an effort to move URLs to the machine room of the web while a more secure ID method stands in for the identity of the site.
With the big news out of the way, it’s now time to focus on the stuff that you can put to use right now. Some wonderful guides and case studies came out in the last 30 days, including one by Authority Builders that measures the lingering power of fundamental techniques.
(Case Study) Can you 4X organic traffic using only SEO basics?
A strong command of the fundamentals is a helpful thing to have when playing any sport, but what about SEO? The conventional wisdom is probably that every SEO skill has a lifespan of only a few years. After that, it’s outdated and maybe even dangerous.
This case study argues otherwise: the proof comes in the form of a long-term project writeup that also serves pretty well as a guide.
At the beginning of the work, the site had only 7,000 visitors per month. Between the months of September 2017 and March 2018, that number rose to 68,000.
The changes, which are documented in a helpful amount of detail, consist of a lot of spring cleaning. Missing anchors were added, 404s and redirect chains were addressed, and titles and page content were updated to eliminate redundant information.
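The redirect-chain part of that cleanup is easy to picture with a toy example. This sketch walks a hypothetical redirect map (the paths are invented; a real audit would pull the map from a crawler or your server config) and surfaces multi-hop chains and redirects that dead-end in a 404:

```python
# Hypothetical redirect map: source path -> destination (None means 404).
REDIRECTS = {
    "/old-post": "/older-post",
    "/older-post": "/final-post",
    "/gone-page": None,
}

def trace(path: str, max_hops: int = 10) -> list[str]:
    """Follow a path through the redirect map and return every hop."""
    chain = [path]
    while path in REDIRECTS and len(chain) <= max_hops:
        path = REDIRECTS[path]
        if path is None:           # redirect pointing at a dead page
            chain.append("404")
            break
        chain.append(path)
    return chain

# A chain with more than two entries means multiple hops — collapse it
# so "/old-post" 301s straight to "/final-post".
print(trace("/old-post"))   # → ['/old-post', '/older-post', '/final-post']
print(trace("/gone-page"))  # → ['/gone-page', '404']
```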
The off-page side of the effort consisted of guest posts that were added at a steadily-increasing velocity. As Mario details in the post, most of the links were built to aged blogs on closely-related sites. Once again, not a particularly new or high-skill practice.
The whole guide is worth absorbing for anyone who is working with a limited budget or struggling with a site that isn’t responding to more recent techniques.
On the topic of fundamental techniques that have hidden power, the next guide on the list puts a lot of data behind something we’ve all suspected for a long time: Google likes it a lot when websites try to answer search queries.
What’s the best way to take advantage of “question keywords”?
Google has had a laser focus on user intent for a long time. Recent features like the answer snippet are just the next step in focusing search results around that intent.
There’s been a lot of debate in the SEO community around how to target featured snippets, but few guides are as comprehensive and educational as this one.
In this SEMrush post from mid-January, Joydeep Bhattacharya steadily breaks down all the different types of questions that Google recognizes, and how it chooses to respond to each one.
He demonstrates when Google prefers to provide direct answers (such as when you ask for the date), and when it hands off the snippet to a website for a short or long-form answer. Even more helpfully, he lays out the process of finding the questions for your niche and what makes you likely to rank for each type.
To Google’s credit, a lot of this information is pretty intuitive. If you’re already committed to answering searcher questions, you have some of this advice in place already. However, given the comprehensive coverage, even an expert is likely to find something actionable in here.
Next up, a guide that covers another factor given an unusual amount of weight by Google: The freshness factor.
(Case Study) How to get a boost from Google’s freshness factor
There is one Google ranking factor that can be more frustrating than all the others combined. The contradiction is this: As content gets older, Google starts to value it less. But when new content is posted, it begins its life basically invisible to searchers.
Annoying, right? The content you’ve already posted steadily loses influence, yet any new content you publish spends months being worth even less than the decaying content around it.
This kind of formula encourages an expensive cycle of constant content churn where you’re creating new content even when you’ve already got higher-quality content that covers the same topics.
This case study from Gael Breton offers a possible way around that by examining what happens when old content (and its metadata) is refreshed instead of replaced.
Refreshing quality content isn’t a new strategy, but this study goes further in depth. It looks at what changes, in particular, cause the algorithm to respond, how many changes are required and how much those changes are worth.
If you’ve ever replaced content in the past without getting results, this case study might have the missing pieces of your puzzle. The data near the end shows that even minor changes are worth it, as long as those changes are to the right parts.
I tested this myself on various pages, and while I didn’t have the same success as this post (most keywords that I tested stayed flat), 2 of the 11 keywords did shoot up. I’ll be looking into this more myself.
Of course, there is an alternative to letting content grow stale in the first place. If you understand how to promote your blogs, they can stay fresh for an impressively long time.
What works best when promoting blog content?
The last couple of roundup items have run a bit on the technical side, but I put this one here because it’s simply great for all skill levels. It’s just a hefty, helpful list of all the ways that a blog can be promoted under the latest version of the algorithm.
You may have tried some of the items on this list if you’ve been in the game for a long time, but it’s so comprehensive that there’s almost no chance you won’t learn something. For example, have you ever seen numbers on what podcast advertising can do for you? Click and find out.
With the guide covered, it’s time to start looking into some analysis of what’s coming up in search. To start with, this video by Charles Floate covers the latest in best practices, and why they’re going to matter more than in the past.
What are the latest SEO best practices for 2019?
At only 30 minutes long, this video is one of the most valuable things you can absorb over a single lunch break or walk around the park.
Charles breaks down all the changes that happened over 2018 (which might be more than you remember, since there were so many of them), including the algorithm updates and changes to PBNs, ranking signals and link juice.
The meat of the video, however, comes with the predictions for the future, which all rang pretty true to me based on my experience in SEO.
Chief among them: quality isn’t only going to matter for your own content, but also for the content that you’re linking to. Guest posts need to be informative and lengthy. PBNs need to be tested, aged and populated with good content.
Long story short, content has been promoted from “King” to “Emperor”. Whatever makes yours better—on site and off—is probably a good investment into the future.
If this video convinces you that your site needs a better class of link, then the final item on the roundup may be of special interest to you. There may be hidden benefits to disavowing scummy links before you’re forced to do it by a manual penalty.
Will voluntarily disavowing links make Google trust you?
This final item comes straight from the mouth of Google. John Mueller hinted in a recent Webmaster Hangout that disavowing bad links can have a serious effect on how much scrutiny Google applies to your latest links.
While the full exchange is covered on Search Engine Roundtable, the gist of his comments is that a slew of bad links can make Google suspicious of just about every link, even those from sterling sources.
For example, let’s say you have a bunch of spammy links from an earlier strategy, and you’re now trying to get back on the righteous path with quality links. If you’ve noticed that strategy isn’t working at all, it may be because the new links are being judged by the company they keep.
To unlock the full power of those new quality links and finally collect a return on your investment, you may have to take out the trash.
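If you do decide to take out the trash, the disavow file Google’s tool accepts is plain text: one entry per line, either a full URL for a single page or a `domain:` prefix for an entire site, with `#` marking comments. A placeholder example (these domains are invented):

```text
# Disavow file example — entries are placeholders.
# Disavow one spammy page:
http://spam.example.com/bad-link-page.html
# Disavow every link from an entire domain:
domain:low-quality-directory.example.net
```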