Welcome to another month in the wild world of SEO. I have some exciting new technical tips for this month that will help you improve your search positions, as well as plenty of the latest analysis of the Oct 16th update.
If you’re like many of my readers, you spent the last month carefully watching what crawled out of the crater of that update. Positions for a lot of keywords were thrown into disarray.
Terms targeted by money sites seemed to be hit particularly hard, so I received a lot of questions about how to mitigate the damage.
My last roundup came out just days before that update, but that’s probably for the best.
Now, we all have a chance to stand back and look at what the web’s top minds have discovered about how this update works over the last few weeks.
In the first few items of the roundup, we’re going to cover responses to, and research on, the latest update. Then, I’m going to show you some of the latest news items for insights on how to protect and improve your positions.
First, a look at some of the ways that keywords were affected.
Post-impact trends of the Oct 16th update
At this point, it does not appear that the sky is falling. The initial result was a lot of volatility, which the people at SEO Roundtable were nice enough to illustrate with a great collection of graphs.
You can find the full collection of them in the article, but this example from SEMRush should give you an idea of what they’re seeing.
The positions changed for many keywords, some dramatically.
Of course, by this point, you’re probably building your own collection of graphs to assess the damage if you’ve been targeted, or to get your house in order if you may be targeted in the future.
If so, I have some news you’ll find helpful. Some recent research has determined that Google seems to be targeting certain topics.
How the Google update is targeting certain topics, and which ones
The changes that dropped on the 16th are part of a larger rollout that began a couple months ago. Even then, a lot of top minds were speculating on what was being targeted, and the way that the most recent update hit certain sites seems to suggest that they were right.
This breakdown of the topics that were targeted was published on the SISTRIX blog. It’s definitely worth a read if you have the time, but I also have some quick notes that may not be news to anyone who is running these sites.
In short, sites that targeted sensitive terms were hit harder than the others, particularly those that were involved with medicine, finances or security.
The takeaway seems to be that the bar for content quality has been raised for all of these sites.
Sites that target these terms without using language appropriate to a subject-matter expert are not likely to recover, at least compared to more refined competitors.
Fortunately, there is a way to improve your targeting of these terms: TF*IDF tools, which tell you which networks of terms are being used by the pages that are moving to the top.
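To make the idea concrete, here is a minimal sketch of the TF*IDF math those tools are built on. The page text below is invented purely for illustration; real tools run this calculation over the actual top-ranking pages for your term.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Score each term in each document by term frequency (TF)
    multiplied by inverse document frequency (IDF)."""
    doc_terms = [doc.lower().split() for doc in docs]
    n_docs = len(doc_terms)
    # df: how many documents each term appears in
    df = Counter()
    for terms in doc_terms:
        df.update(set(terms))
    scores = []
    for terms in doc_terms:
        tf = Counter(terms)
        total = len(terms)
        scores.append({
            term: (count / total) * math.log(n_docs / df[term])
            for term, count in tf.items()
        })
    return scores

# Hypothetical snippets standing in for top-ranking pages
pages = [
    "diabetes symptoms treatment insulin",
    "diabetes diet treatment exercise",
    "mortgage rates loan refinance",
]
scores = tf_idf(pages)
```

Terms that appear on only one page (like "insulin" above) score higher than terms shared across pages, which is exactly how these tools surface the distinctive vocabulary that subject-matter experts use.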
The most recent update only seemed to confirm these earlier theories, so if you have a site that is in these families, it’s time to consider changes.
But everyone’s a little burned out on the update now, even the latest one. So, let’s move on to the latest news and tips for search engine pages.
Google is testing delivery of two snippets for comparison shopping queries
The featured snippet has been a coveted position since it was first introduced. A lot of research has already gone into what it takes to steal the snippet from the competition.
Now, it looks like Google is trying to turn up the competition while improving results for searchers. As of just a few weeks ago, Google began showing two snippets in some results instead of a single one.
Twitter user Sanket Kedari noticed the following on a search for strollers.
The new second snippet appears just below the first one. This makes it easy to scan back and forth between the different results, leading some to speculate that the point of the change is to speed up comparison shopping on the results page.
This feature is still in limited testing, but the products that are receiving these now seem to be those with long lists of specs that searchers may want to consider.
This has some amazing implications for SEOs, because it means that there are going to be twice as many snippet results as there were in the past. Even if you can’t match the budget of whoever is going for the single snippet, you may still be able to beat out #2.
In fact, you should probably check results now for any site you own that’s been focusing on seizing a snippet. You may have just snagged the second snippet, and you need to make sure it’s optimized.
Another potential advantage came up in SEO circles late last month: the emergence of some new testing on pre-rendered results.
Is Google giving some results prerender?
Some very recent testing by Twitter user Kane Jamison added screenshots and data to earlier rumors suggesting that Google gives some results with a very high click-through rate the advantage of a rel=”prerender” tag.
The testing revealed that this advantage seems to go almost exclusively to big branded websites that are generally titans for their terms. In other words, it appears when Google seems very certain of what you’re looking for.
This likely means the sites that are receiving this benefit are going to be harder to dethrone than ever before. However, testing has shown this feature to be mainly repeatable under anonymous browsing.
Though fewer internet users than ever are likely to notice much difference in loading time between the top results and lower ones, it’s likely that this measure will give a statistical boost to already-high CTRs.
There’s probably not much you can do to improve your chances of getting this if you don’t have it already. Fortunately, I have some news coming up next about how you can do some optimization this month with data you have easy access to right now: your log files.
Can log file analysis be used to see what Googlebot is doing?
Yes. At least that’s the way that Moz sees it. Just last week, they released a beginner’s guide to analyzing the different log files that you have access to. That may be a lot more than you think.
You’ll need a few tools to do it properly, but it’s possible to glean a lot of information from your log files about what exactly the Googlebot is doing when it’s landing on and maneuvering through your site.
When it comes to determining the quality of your website, some pages are weighted more heavily than others. These analyses of log files may give you some great insight into which pages are considered the most important by bots.
If you want to try this yourself as an optimization measure, you should make sure that you watch in particular for the bots running into errors and other problems.
If you find that the bots are constantly running into 404 errors and 301 chains, that is a problem that you are going to want to target as a top priority.
If you’re working with a site that has a lot of navigation problems, targeting the ones that the bots are hitting first should help you start growing more quickly while you’re still working on the rest.
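As a starting point, here is a rough sketch of that kind of log check in Python. The sample log lines are made up, and the regex assumes a standard combined-format access log; adjust it for your server’s configuration before relying on it.

```python
import re
from collections import Counter

# Matches an Apache/nginx "combined" format access log line
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_status_counts(lines):
    """Count HTTP status codes for requests whose user agent
    claims to be Googlebot."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            counts[m.group("status")] += 1
    return counts

# Hypothetical sample lines; in practice, read your real access log
sample = [
    '66.249.66.1 - - [20/Oct/2018:10:00:00 +0000] "GET /old-page HTTP/1.1" '
    '404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [20/Oct/2018:10:00:05 +0000] "GET /moved HTTP/1.1" '
    '301 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [20/Oct/2018:10:00:07 +0000] "GET / HTTP/1.1" '
    '200 2048 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
print(googlebot_status_counts(sample))
```

A spike of 404s or 301s in the Googlebot counts points you straight at the crawl problems worth fixing first. Note that sophisticated audits also verify the bot’s IP with a reverse DNS lookup, since the user agent string can be spoofed.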
Of course, search engines may soon not be your only concern when it comes to quality factors. There is evidence now that Facebook is starting to take a look at better quality control.
Facebook now also penalizing sites with stolen content?
Facebook has been making a lot of big moves lately, possibly in response to a faltering user base.
Even as it has lost the younger crowd though, it’s consolidated its features into parts of people’s non-social lives like issue organizing and employment.
It’s a powerful ad platform even after the recent video data scandal, and, possibly in response to public pressure from different angles, it is working to boost more reputable content. To accomplish this, it has begun to apply its own form of penalties to content.
The penalties are being applied most notably to content that’s fighting for a place in the news feed. The sites that are being hit are those that rehost or republish existing content.
The content, the ads on the site and the general quality of the site are all likely to be factors in whether or not Facebook applies the new penalty.
As part of the rollout, Facebook updated its publisher terms to more clearly define bad practices and to announce that they would be sending out warnings to publishers who were falling short of the new standards.
Facebook isn’t the only alternative search option that’s been handing out more information lately. In an interesting move, Bing recently revealed the way its crawling methodology works.
Bing recently revealed its crawling methodology
Bing, possibly as a way to attract more experienced marketers to its platform, has started a move toward more transparency about how its search results are determined. It released its crawling methodology in a couple of blog posts in the middle and at the end of last month.
However, they took it a step even further during their appearance at SMX East by giving a detailed presentation on their crawl factors (now helpfully hosted on SEO Roundtable), and how those factors can go on to affect your score.
They were kind enough to host the entire slideshow along with the notes that went into the presentation. You can learn a little from the analysis they’ve already put into it and a lot from the presentation itself.
If you’ve given any thought at all to expanding your presence on Bing, you’re going to find a lot of helpful information here that will help you optimize your site further.
With those alternatives covered, let’s head back to Google for the final two news items. One is for those who prefer to focus on the map results, and after that, a refreshing change of pace from Google’s typically guarded stance toward information.
Reserve it with Google is expanding
When you search for a local business, the 3 GMB results that appear in the snippet (along with a call button) are called the Local 3-Pack.
That pack has recently been the testing ground for a new feature that should be important to SEOs with clients focused on map results.
“Book” is the newest option now appearing for some results, and it’s part of the next stage of testing for the “Reserve it with Google” feature that is beginning to roll out.
This makes a position in the pack all the more important for people attempting to rank for local positions. That pack now has a direct relationship to conversions and will be a good prize for any client you can secure it for.
Google is sending “fix slow page loading” notices to sites where it is affecting rank
Google doesn’t have a great track record of letting you know why their algorithms don’t like you. That’s why this last item might be a refreshing change of pace for some.
Google is now sending out direct emails to people whose loading problems are bad enough to have resulted in a ranking penalty. Even as a helpful gesture, it feels a little passive-aggressive, right?
Whatever the motivation, if this notice directs your attention to a site you don’t pay much attention to, or to a problem with one of your lesser pages, it could really be a lifesaver.