You’ve made it to the end of the year, but this week’s roundup will give you no excuses for slowing down. We’re closing out 2021 with an excellent set of guides, case studies, and news items.
First, we have some guides that show off SEOs' latest tricks. You’ll learn how to create topic clusters using Wikipedia, how to get a lot more insight into your bounce rate, and why indexing is getting harder.
Next, we’ll look at some case studies. You’ll find a huge analysis of the winners and losers of the last big update and some hard data on building better titles.
We’ll close our roundup with the news. Google released some important statements about product reviews, mobile indexing, and crawl bugs.
Thread: Does Wikipedia Give You Enough Data To Create A Topic Cluster (For A Niche You Know Nothing About)?
Twitter user @jsvxc brings us this thread, shared by several big SEOs this month. He’s developed a fast (5-minute) method to pull topic clusters out of Wikipedia pages when you know nothing about the niche.
He details how to run wiki pages through either Ahrefs or other free tools. He provides tips on finding the keywords with intent and how to generate more keywords from free tools like MissingTopics.
Using the example of a “personal injury lawyer niche”, he shows how he could generate a list of intent-driven phrases and answerable queries using only the Wikipedia page as a reference.
He admits that this process is designed for situations where you don’t have the time or budget to perform more intense research.
If this guide helps you plan out some content, the next one on the list will help you optimize it. It teaches you how to measure and optimize your bounce rate.
How To Calculate, Audit & Improve Bounce Rate For SEO Success
Kayle Larkin brings us this nuts-and-bolts look at taking bounce rate seriously as one of your SEO KPIs.
This guide aims to help you find out what yours is, determine if it’s good or bad, and improve it. Kayle starts by providing new SEOs with definitions and links to Google resources to get them up to speed.
She goes over implementing your Google Analytics tag and setting up event tracking for behaviors that align with your objectives. Then, she dives into more technical work.
Over the rest of the guide, she teaches you how to organize bounce rates by marketing channels, set up advanced filters for your data, and troubleshoot problems.
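The underlying calculation is straightforward: bounce rate is the share of sessions that viewed only a single page. Here is a minimal sketch of computing it per marketing channel; the channel names and session counts are hypothetical stand-ins for figures you would export from Google Analytics.

```python
# Minimal sketch: bounce rate per marketing channel.
# All figures below are hypothetical examples, not real analytics data.

def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Bounce rate = single-page sessions / total sessions."""
    if total_sessions == 0:
        return 0.0
    return single_page_sessions / total_sessions

# Hypothetical per-channel session data
channels = {
    "organic":  {"bounces": 420, "sessions": 1000},
    "social":   {"bounces": 310, "sessions": 500},
    "referral": {"bounces": 90,  "sessions": 400},
}

for name, data in channels.items():
    rate = bounce_rate(data["bounces"], data["sessions"])
    print(f"{name}: {rate:.1%}")
```

Grouping the numbers by channel this way makes it obvious when one traffic source (say, social) is bouncing at double the rate of another, which is the kind of segmentation Kayle's guide walks you through inside Google Analytics itself.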
For many SEOs, bounce rate alone isn’t considered a useful indicator for sites. Kayle addresses this and gives advice even experienced SEOs can use to drive people further into a site.
Organic traffic is often a major focus for bounce rate improvements. Before you can develop organic traffic, you’ll need to get indexed – our next guide details why you may find that a lot harder than it used to be.
Why Getting Indexed By Google Is So Difficult
Tomek Rudzki, writing for Moz, brings us this breakdown of why some websites, particularly large ones, are waiting longer for indexing. For example, he reveals that many of the largest e-commerce stores online have 15% or more of their pages left out of Google’s index.
Then, he dives into a long, growing list of reasons why even pages on authoritative sites are not getting indexed properly. He defines and provides the solutions for all of the following common problems:
- “Crawled – currently not indexed”
- “Discovered – currently not indexed”
- “Duplicate content”
He includes some instructions for checking your index rate and a list of ways that you can increase the probability that Google will index your future pages.
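One simple way to approximate your index rate is to compare the URLs in your XML sitemap against the URLs Google reports as indexed (for example, from a Search Console coverage export). Below is a minimal sketch using only the standard library; the sitemap and the indexed-URL set are inline hypothetical examples, not real data.

```python
# Minimal sketch: estimating index rate from a sitemap vs. an
# indexed-URL list. The sitemap and URL set below are hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/products</loc></url>
  <url><loc>https://example.com/blog</loc></url>
  <url><loc>https://example.com/contact</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> set:
    """Collect every <loc> URL from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return {loc.text for loc in root.findall(".//sm:loc", NS)}

# Hypothetical set of URLs reported as indexed by Search Console
indexed = {
    "https://example.com/",
    "https://example.com/blog",
    "https://example.com/products",
}

submitted = sitemap_urls(SITEMAP_XML)
rate = len(submitted & indexed) / len(submitted)
print(f"Indexed {len(submitted & indexed)} of {len(submitted)} pages ({rate:.0%})")
```

Run against your real sitemap and coverage export, a result well below 100% on a large site is exactly the pattern Tomek describes.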
At the same time, he warns that Google has finite resources. In some cases, indexing problems may result from Google downgrading the priority of certain types of pages.
That’s it for the guides. The upcoming sections will cover some big case studies that dropped over the last month. First, we’ll look at a big breakdown of the Google November 2021 Core Update.
Google November 2021 Core Update: Winners, Losers & Analysis
Lily Ray brings us this look at how the latest major core update changed our world. This update was a contentious one. It dropped just a week before Black Friday, and many e-commerce sites were worried that the impact would disrupt their biggest sales.
Let’s see how things broke down. Lily and her team examined nearly 1,500 domains across dozens of niches for this case study. They found a significant number of changes.
According to her analysis, reference-style sites were the biggest winners. These included dictionaries, encyclopedias, and other types of educational sites. Several of them saw 200% or more gains in visibility.
News and publisher sites took the biggest hit. Major news sites like APNews, Forbes, and Reuters saw statistically significant drops. Lily theorizes that this may be because the algorithm is being retooled to favor fresher content.
The full analysis also includes major swings in niches such as health, law & government, and stock photography sites. Google tends to make adjustments to each core update, so stay tuned for additional discoveries.
Our next case study examines whether brand names or target terms matter more in titles.
The Result Of An SEO Split-Test: Does Adding Your Brand Or Target Terms To Titles Matter More?
Brian Moseley brings us this look at the value of including brand names vs. target terms in your titles. He measures whether your brand name or a target term appeals more to searchers’ intent.
For the test, Brian was given access to a massive recruitment site that helps employers find staff. He changed the titles on nearly 2,500 pages.
For one group, he made sure the title was always short enough to adequately display the brand name at the end. For the second group, he cut the brand name to include descriptive terms for the service, such as “employees”.
The result was that the pages with the brand name performed significantly better than the pages that used the target terms. Brian explored several reasons this could have happened.
First, he pointed out that searchers use brand names as a quality signal, especially for big brands.
He also theorized that the use of target keywords in titles may have confused searchers. The target terms added to the titles didn’t always match the topics covered on each page. Adding these terms may have led searchers to believe these pages covered different information.
Our final case study also looks at title tags. The author has analyzed nearly a million of them to extract some actionable insights.
6 Important Insights About Title Tags (953,276 Pages Studied)
Michal Pecánek brings us this look at the state of title tags after Google’s recent changes and minor rollbacks. If you’ve been following the story since September, you know that Google started generating titles for a significant number of searches.
After some well-publicized cases of searches returning bizarre titles, Google appears to have backed down a little. They claim that they use existing titles around 87% of the time.
Starting from this point, Michal began a project to document what was happening with title tags, examining 953,276 top-10 pages to produce his own data.
First, he learned that 7.4% of top-ranking pages don’t even have a title tag. That’s a surprisingly high number for sites that are all in the top 10 for their queries. This may be explained by snippets answering queries better in certain searches.
Michal was also able to document how often Google rewrites tags and how they choose to do so.
He found that Google rewrites titles 33.4% of the time. That’s quite a bit higher than their advertised rate, but remember this test group only includes top-10 pages. They may be more likely to face changes than sites outside that range.
He also found that in 50% of cases where Google changed the titles, they preferred to use the existing H1 title. The data suggested that Google has a special zeal for rewriting long titles. They are 57% more likely to change a long title.
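You can check your own pages for the patterns Michal’s study flags: a missing title tag, an overlong title (which his data suggests is far more likely to be rewritten), and a title that diverges from the H1. Here is a minimal stdlib sketch; the sample HTML and the 60-character length threshold are assumptions for illustration, not values from the study.

```python
# Minimal sketch: auditing a page for a missing, overlong, or
# H1-divergent title tag. Sample HTML and the 60-char cutoff are
# hypothetical illustration values.
from html.parser import HTMLParser

class TitleAudit(HTMLParser):
    """Capture the first <title> and first <h1> text in a page."""

    def __init__(self):
        super().__init__()
        self.title = None
        self.h1 = None
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title" and self.title is None:
            self.title = data.strip()
        elif self._current == "h1" and self.h1 is None:
            self.h1 = data.strip()

def audit(html: str, max_len: int = 60) -> dict:
    parser = TitleAudit()
    parser.feed(html)
    return {
        "missing_title": parser.title is None,
        "too_long": parser.title is not None and len(parser.title) > max_len,
        "matches_h1": parser.title == parser.h1,
    }

page = ("<html><head><title>Hiring Guide</title></head>"
        "<body><h1>Hiring Guide</h1></body></html>")
print(audit(page))
```

A page where the title already matches the H1 and stays short gives Google little reason to rewrite it, which lines up with both the 50% H1-substitution figure and the long-title finding above.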
Michal’s full writeup contains a lot of other insights that may help you keep your existing titles. That covers the case studies for the month, and we’re ready to look at the news. First, Google has some news about product reviews.
Product Reviews Update And Your Site
Google recently announced changes to the product review update that landed in April 2021. The new update, rolling out now, will build on Google’s feedback and data from the first one.
The new best practices released with this update should be noted by any SEOs working with review sites. First, quality reviews are expected to provide resources that allow users to experience the product. That may include videos, recordings, images, or other media.
Second, quality reviews are expected to offer links to multiple sellers so that the reader can choose from multiple retailers.
The Google release doesn’t say much more than that, but I’ve got a full breakdown of the product review update for you here. It covers what’s changed in this update and some ideas to stay ahead.
Next, Google is changing the timetable on mobile indexing.
A 2021 Update On Mobile-First Indexing
Early in 2020, Google announced that all sites would need to be ready for mobile-first indexing in 2021. The original deadline was March 2021, but that date came and went without much official communication from Google’s team.
Since our last roundup, Google has formally declared its plans for the future of mobile-first indexing. First, the good news for sites that are still catching up: you have more time. Google has announced that the full implementation will be delayed while some sites catch up.
Their current recommendation is that you continue any work you’re doing to get your site ready. The policy has not changed, just the timeline for those changes.
Google has not announced a new deadline for the complete move to mobile-first indexing. In the end, John Mueller states that the team is taking a deeper look at the sites having the hardest time with the transition. The transfer may not be complete by the end of 2022.
If you’re already seeing bizarre changes in your site data, it may not be a problem. Google recently admitted that a bug might be playing a role.
Google Confirmed Crawling Bug But Said There Were No Negative Effects
Dramatic reductions in website crawling concerned some big publishers around mid-November. The data seemed to suggest that Googlebot was barely interacting with their sites and that new information was not likely being reviewed or indexed.
This topic was brought up by several big SEOs who heard from their clients, and Google’s John Mueller finally responded in a Twitter thread.
According to him, a bug that slowed down crawling for caches was responsible for the changes publishers were seeing. He claims that the bug has since been resolved, and there weren’t widespread negative effects.
Some SEOs in the conversation disputed the idea that there were no negative effects. However, Google deals with millions of sites, and there’s a lot of room for exceptions even without the effects being “widespread”.
If this or another issue continues to stress SEOs, you’ll be able to read about it in an upcoming roundup.
Got Questions or Comments?
Join the discussion here on Facebook.