For the past several months, I’ve been involved as a director for a very talented agency called The Search Initiative (TSI).
The reason I teamed up with these guys was very simple.
These guys were testers. Like me, they only rely on experience, data, and test results for their ranking strategies.
It was a match made in heaven.
Since then, we’ve had the pleasure of onboarding over 50 new clients, offering them a wide range of organic SEO consulting services.
Many of these new partners have seen huge growth, others have had penalties removed, and all have clear roadmaps for how to grow in the future.
I wanted to share with you the top 10 problems we typically encounter when onboarding new clients. I hope that by sharing some of these common mistakes, you can use this knowledge to your advantage and make some serious improvements to your rankings.
The 10 Most Common SEO Issues in 2020
Many of you will be familiar with the inverted pyramid writing style, where the most newsworthy content is at the top and the least is at the bottom. I’ve tried to follow this structure; however, all the points below are not to be slept on. They’re all major issues that commonly appear amongst even the best sites.
If you want to get the most out of this article, check out every point here. As you know, there are no shortcuts when it comes to SEO.
- Index Management
- Localization Issues
- Keyword Cannibalization
- Over-Optimized Anchor Text
- Poor Linking Strategy
- Low Quality Affiliate Content
- User Performance Metrics
- Titles & Meta Descriptions
- Internal Redirects
- Low Quality Pillow Links
1. Index Management Problems
The first and most common issue that we’re seeing is accidental devaluation of the website because of indexing issues.
It stems from a common misunderstanding about how Google actually works.
(More on this in a bit…)
Most people think that if they build links and noindex junk pages they’re fine. However, it’s not that simple – and I’m about to show you a real example.
It’s quite hard to see but you may notice I have highlighted the number of HTML pages that are filtered. It’s a whopping 32,064 pages and, yes, it took us a long ass time to crawl.
None of the 32,064 pages found in this crawl included a noindex tag, which means (in theory) Google should be able to crawl and index these pages. So, let’s check this against our numbers in the Google Search Console:
When we check in Search Console, we’re seeing 14,823 pages indexed. While this is a large volume, it’s still less than 50% of the pages that were found with Screaming Frog.
This is the first sign that something is seriously wrong, but the next screenshot will show you the extent of how badly our client had been stung with Panda’s low-quality algorithm. We use the “site:domain.com” operator to pull up the number of indexed pages:
Despite the website having 32,064 crawlable, indexable pages, and despite Search Console reporting 14,823 of them as indexed – only 664 have made it into the actual index. This site search shows us that Google has heavily devalued most of the website.
It is a crawling nightmare.
So, the question is, how can you fix this?
Thankfully the answer for most people is quite simple.
Start by performing a site:domain.com search and auditing Google’s index of your site. If you go to the final page and you’re greeted with the below message, you have work to do:
Take a hard look at which pages shouldn’t be indexed and start proactively removing them from your crawl budget.
The problem with Google is that even after you add a noindex tag to your pages, they remain indexed until Google recrawls them. Some people add robots.txt rules to block these pages and save crawl budget – which is a good idea, but only after the pages have been removed from the index.
For the rest of us, we’re going to need to use the URL Removal Tool.
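If you want to script part of this audit, here’s a minimal sketch using only Python’s standard library. It assumes your crawler can hand you each page’s HTML, and simply flags whether a page carries a robots noindex meta tag – useful for cross-checking your crawl export against what you actually want indexed:

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Flags a page as noindexed if a robots meta tag contains 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name", "").lower() == "robots"
                    and "noindex" in attrs.get("content", "").lower()):
                self.noindex = True

def is_noindexed(html: str) -> bool:
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

# Junk pages should carry the tag; money pages should not.
print(is_noindexed('<meta name="robots" content="noindex, follow">'))  # True
print(is_noindexed('<meta name="robots" content="index, follow">'))    # False
```

Run this over your crawl output and compare the noindexed set against the pages you actually want out of the index – the mismatches are your work list.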
2. Localization Issues
The second most common issue we are seeing is when clients have multiple languages. While it’s great to have international coverage and provide foreign users with localized text – it’s a nightmare for Panda penalties if not set up correctly.
Many people are familiar with the URL structure that you should use for localized text, but many forget to set up hreflang on their website. My buddy Tom talked about it in this interview I did with him here.
If you are looking to set up hreflang codes, I suggest you use this website to get the right language and country code every time.
Below is an example of an eCommerce client. Whereas the previous client had issues with index management, this time it’s caused by HREFLang, and one more thing that goes unnoticed…
While the client successfully included hreflang in their source code, they did not include both the language and location codes. The one time they tried to, with en-GB, the page no longer exists and redirects to their sitemap.
What’s more, this covers only half of the languages the website operates in, which has created an enormous amount of duplication to be indexed.
However, there’s still one more thing that was missed. Each page has the Open Graph locale set to en_US:
This includes the pages that aren’t in English.
While this setting isn’t as clear-cut a signal as hreflang, it does provide Google with information about locale, and therefore creates confusion.
If your website has a similar issue, we advise you make the locale dynamic to match the current language.
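Here’s a rough illustration of that advice. The locale list and URLs below are invented for the example, but it shows the idea: emit one hreflang tag per language-country pair, and build og:locale dynamically from the page being rendered instead of hard-coding en_US:

```python
# Hypothetical language/region pairs for an example store.
LOCALES = {
    "en-gb": "https://example.com/en-gb/",
    "de-de": "https://example.com/de-de/",
    "fr-fr": "https://example.com/fr-fr/",
}

def hreflang_tags(current: str) -> str:
    """Emit one hreflang link per locale, plus an og:locale that
    matches the page being rendered (not hard-coded to en_US)."""
    lines = [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
             for code, url in LOCALES.items()]
    # x-default points search engines at the fallback version
    lines.append(f'<link rel="alternate" hreflang="x-default" href="{LOCALES["en-gb"]}" />')
    # Open Graph locale uses an underscore and an uppercase region, e.g. en_GB
    lang, region = current.split("-")
    lines.append(f'<meta property="og:locale" content="{lang}_{region.upper()}" />')
    return "\n".join(lines)

print(hreflang_tags("de-de"))
```

Rendering the German page with this sketch produces og:locale de_DE rather than en_US – exactly the dynamic behavior we recommend above.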
For more help with locality and its importance, check out this page on local seo solutions.
“The guys at TSI make SEO look easy!.. We were a completely new website planning to operate in arguably the most competitive online marketplace which made the task ahead extremely difficult as we were going up against many well-known global businesses which also resulted in many different SEO agencies reluctant to work with us. TSI wasted no time in implementing their campaign and within 4 months our website was ranking on page 1 for some of our most profitable keywords. The guys at TSI are constantly keeping me updated and send me monthly reports on my campaign performance and keyword tracking. I would recommend their services in a heartbeat!” – Jon H
3. Keyword Cannibalization
This is a surprisingly common issue for most websites that we encounter. Despite the large amount of resources online to help with cannibalization, you would be surprised how many people still suffer from it.
Never heard of it?
Quite simply, it’s when you have multiple pages on your site competing for the same keywords.
And guess what? Google doesn’t like it.
The first step is to learn to diagnose the culprit pages – after all, you can’t fix what you can’t see.
At The Search Initiative we have a few ways to find cannibalization but here’s the easiest and most effective.
Use Keyword Tracking Tools
One of the benefits a client gets from working with TSI is that we track keywords up to twice daily with Agency Analytics, one of our partners.
The tool includes an overview of the site’s overall keyword performance, such as below:
Aside from showing us an overview of how this client has performed over the past 7 days, we can use this to track each keyword’s performance independently too:
In this screenshot, you might notice a jump from the 3rd page to the 1st for the client’s target term after they implemented some of our onsite advice. More importantly, you can also see that the ranking URL had started flipping between their category page and their homepage.
This is an obvious sign of cannibalization and once noticed we jumped into action to fix the problem.
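If you track rankings yourself, spotting the flip is easy to automate. This is a hypothetical sketch – the tracker export format is invented – that flags any keyword where more than one URL has ranked during the tracked window:

```python
from collections import defaultdict

# Hypothetical tracker export: (keyword, date, ranking_url)
rankings = [
    ("running shoes", "2020-01-01", "/category/shoes/"),
    ("running shoes", "2020-01-02", "/"),
    ("running shoes", "2020-01-03", "/category/shoes/"),
    ("trail shoes",   "2020-01-01", "/trail-shoes/"),
    ("trail shoes",   "2020-01-02", "/trail-shoes/"),
]

def cannibalized_keywords(rows):
    """A keyword is a cannibalization suspect when more than one URL
    has ranked for it during the tracked window."""
    urls_by_kw = defaultdict(set)
    for keyword, _date, url in rows:
        urls_by_kw[keyword].add(url)
    return sorted(kw for kw, urls in urls_by_kw.items() if len(urls) > 1)

print(cannibalized_keywords(rankings))  # ['running shoes']
```

Here ‘running shoes’ gets flagged because Google flipped between the category page and the homepage – exactly the pattern in the screenshot above.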
To learn more about keyword cannibalization, I have a master guide here.
4. Over-Optimized Anchor Text
There was a significant update in October 2016 as Penguin 4.0 rolled out.
Penguin 4.0 was an update that changed how Google perceives and interacts with links. I even wrote an article for Ahrefs covering how it affected anchor text optimization.
As part of our auditing process for each new client, we analyze the site’s existing anchor text and break it down into the following categories:
- Branded – an anchor text that includes your brand name or slight variation, for example: ‘thesearchinitiative’, ‘visit thesearchinitiative’, or ‘TSI’.
- Generic – an anchor that uses a generic term but does not include branding, for example: ‘here’, ‘read more’, or ‘visit site’.
- Image – a link with no anchor text, generally shown as a blank in Ahrefs’ export. Another clue is a file extension in the alt attribute – an anchor like ‘jpg’ is probably an image.
- Miscellaneous – an anchor that does not qualify as generic, but is otherwise unrelated to the website. Forum and comment spam often includes anchors such as ‘Steve’, ‘Stuart’, or ‘Stan’.
- Low Quality – an anchor longer than 100 characters is generally irrelevant, unless it’s a long URL. Anchors in foreign languages or made up of symbols also count as low quality.
- Targeted – an anchor that includes the exact or partial term you are trying to rank for; effective for gaining rankings, but a higher risk of tripping a Penguin filter against your site.
- Topical – an anchor that is on topic, but does not include your targeted term. For example, an affiliate site reviewing ‘best running shoes’ might include topical anchors such as: ‘healthy workout’, ‘burn lots of calories’, or ‘high impact sport’.
- URL – this is arguably the most obvious one, but anchors that are naked URLs such as ‘example.com’ and ‘http://example.com’ would count as URL.
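To give you an idea of how these buckets work in practice, here’s a simplified sketch of a classifier. The brand, generic, and target term lists are placeholders – swap in your own – and a real audit still needs a human pass, especially for the topical bucket:

```python
import re

BRAND_TERMS = {"thesearchinitiative", "tsi"}          # placeholder: your brand variants
GENERIC_TERMS = {"here", "read more", "visit site", "click here"}
TARGET_TERMS = {"seo agency"}                          # placeholder: terms you rank for

def classify_anchor(anchor: str) -> str:
    text = anchor.strip().lower()
    if not text:
        return "image"            # blank anchors are usually image links
    if len(text) > 100 or not re.search(r"[a-z]", text):
        return "low quality"      # over-long, symbol-only, or non-Latin anchors
    if re.match(r"^(https?://)?[\w.-]+\.[a-z]{2,}(/\S*)?$", text):
        return "url"              # naked URLs, with or without the scheme
    if any(b in text for b in BRAND_TERMS):
        return "branded"
    if text in GENERIC_TERMS:
        return "generic"
    if any(t in text for t in TARGET_TERMS):
        return "targeted"
    return "miscellaneous"        # could also be topical; needs a human eye

for a in ["TSI", "read more", "", "best seo agency", "http://example.com", "Steve"]:
    print(a or "(blank)", "->", classify_anchor(a))
```

Run something like this over your Ahrefs export and you get a first-pass anchor distribution in seconds; the judgment calls come afterwards.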
Here’s an example of a client that recently joined and has an issue with anchor text. The labels match the descriptions above:
In this example, the website in question has chosen to use lots of targeted anchors, but has also picked up lots of low quality anchors along the way. The solution was to increase the amount of topical, branded, and generic anchors so the client could meet the anchor ratios determined by the niche average (read more).
By increasing the volume of those anchors, the client regained the lost organic traffic and is now set up to survive future updates.
It’s important to note that most people use low quality pillow links and press releases to redistribute their anchors.
There are indeed some issues with this that are covered in the next two points.
5. Poor Linking Strategy
Up until this point, 3 of the top 4 issues have been onsite. While Issue #5 is an offsite link building issue, it’s important to recognize the connection between the two.
When a website has fixed its technical issues, pumped out valuable content, and improved user performance metrics – link building becomes a lot easier.
Rather than needing hundreds of links to rank a site, you can achieve a lot more with less. Since link building and onsite work both cost money, you may be wondering: why not just spend the money on links?
The answer is simple…
Google has introduced many link building filters to thwart your efforts, so the more links you build, the more likely you are to get caught. By delivering better content, you will not only improve your conversion rates, but also make it easier to rank higher, permanently.
Check out one of our older clients who has been relaxing on page 1 for 2 years comfortably:
So, the question is, what makes a link strategy good?
The first thing is to avoid over-optimizing your anchor text, because this is eventually going to cause a penalty. Choose to use topical terms and branded anchors to hit your pages instead.
The second is to target pages other than your main money pages. If you have created a blog post that is valuable and internally links to one of your core pages, throw some links at that post too. Avoid becoming the Black Sheep.
Not only will this help prevent overcooking your page, it’s going to help you rank for long tail keywords that you didn’t claim before.
The reason your competition doesn’t do this is because they fail to focus on any pages other than their money pages.
Write that one down. Post it on your wall.
6. Low Quality Affiliate Content
This should go without saying, but if your content is not good then you don’t deserve to rank. Period.
However, most affiliate sites are guilty of failing to mop up all the juicy long tail keywords that are easy to rank for and provide noticeable traffic. It’s not that they don’t want to rank for those keywords – they just don’t know how.
We want to share a couple images with you that show just how powerful content can be, and if you don’t value your content – what can happen:
This client recently joined us suspecting a penalty. At first glance there’s a dip in visibility, but nothing that seems too unusual – until you zoom into the top 10 positions:
What initially looks like a slight dip is really a huge drop in rankings, and this person has suffered from decreased traffic for about 8 months.
The culprit? Content.
This is no fault of their own: somebody scraped their entire website and created duplicate content across the web. We’re currently re-writing the whole website’s content, filing DMCA requests, and fixing holes in their linking strategy. We anticipate a return in rankings within the next 3 months.
While this highlights the power of content in Google’s algorithm, this is slightly different from what I am describing with low quality affiliate content. The main culprit we see is when every page has an affiliate link and there’s no actual user value.
It’s possible to rank this way, but there are some drawbacks.
Let’s look at User Performance Metrics and how this can help guide our content strategy.
The Search Initiative has helped to grow our business where four other digital marketers have failed. The team is not only flexible, responsive and reliable. They also make decisions that are data-driven with their proprietary tools. They are not testing and guessing with their work as most others do. We will be rounding out the year with them and looking forward to a long and profitable engagement. – Vik C
7. User Performance Metrics
We have noticed that many affiliate sites and even some eCommerce companies are not focused on user performance metrics. This is bad for several reasons:
Firstly, you’re going to be limiting yourself severely.
There is a finite number of people searching each month for what you offer. By avoiding the issue of content, you force yourself to spend money on links to brute-force rankings for terms that are neither relevant nor fruitful.
Instead, we would suggest that you should focus on converting your existing traffic while simultaneously growing your potential traffic. Take this example:
If 1 in every 1,000 visitors purchases from you, then to triple your customers you have two options:
- Increase traffic from 1,000 to 3,000 visitors
- Increase conversion from 0.1% to 0.3%
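A quick sanity check on that arithmetic – integer math, with the conversion rate expressed per 1,000 visitors to keep the numbers exact:

```python
def customers(visitors: int, rate_per_thousand: int) -> int:
    # Integer math keeps this back-of-envelope check exact.
    return visitors * rate_per_thousand // 1000

baseline = customers(1000, 1)        # 1 purchase per 1,000 visitors (0.1%)
via_traffic = customers(3000, 1)     # triple the traffic
via_conversion = customers(1000, 3)  # triple the conversion rate (0.3%)

print(baseline, via_traffic, via_conversion)  # 1 3 3
```

Both routes land on the same number of customers, but the conversion route doesn’t require buying a single extra visitor.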
We applied this strategy to one of the eCommerce websites mentioned earlier in the article. Their website had been devalued and it’s going to take time to grow traffic, so we decided to pursue conversion while working on fixing the huge issues.
Here are the results:
In the past 2 weeks, we have managed to increase the sessions by a modest 4.26%, which we’re happy to take, considering the condition of the site – but lower than average for our clients.
However, the main point to notice is the 79% increase in Conversion Rate, 86% increase in Transactions and 49% increase in Revenue. These changes mean that as we fix the devaluations against the site this client is primed to make a significant revenue gain.
8. Titles & Meta Descriptions
This is like keyword cannibalization in that it’s surprising how many websites still have issues with it. This is something I would expect most people to be getting right by now since there are literally millions of pages on the topic:
I personally cover this in great detail in my Evergreen Onsite SEO guide:
9. Internal Redirects
This is one of the most common issues that websites face. A large volume of 3XX Redirects on your website seems fine to most – if it’s a 301. However, this isn’t strictly true and here’s why:
A 301 redirect is designed for when a user requests a page that is no longer available because it has been permanently moved – something that happens a lot across the internet. After a moment of latency, the server returns the new URL and the page loads as usual.
The issue with the above is that word: latency. It’s something most webmasters ignore. The physical distance between a user and your server means that even a tiny bit of header information takes time to send and receive.
If you are looking to improve your user experience, then you should make your website as fast as possible (read this) and remove all 301s that aren’t absolutely needed. This will be better for your users and help authority flow through the website unhindered.
However, what’s the difference between a 301 and a 302 redirect?
Whilst a 301 redirect indicates a permanent move from one location to another, a 302 is a temporary move. From Googlebot’s perspective, this means:
- 301 redirect – Google should index the new URL rather than the previous one
- 302 redirect – Google should keep the previous URL indexed and ignore the new one
Google has claimed to handle both the same but it doesn’t make sense that they would do this. Both status codes have different purposes and should be treated differently.
Make sure you’re using the right redirect on your website.
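To see why internal redirect chains matter, here’s a toy example. The URL map is invented, but it mimics what a crawler report gives you: following /old-post costs two hops before the visitor (and Googlebot) reaches a 200, so your internal links should point at the final URL directly:

```python
# Hypothetical crawl data: URL -> (status, redirect target). 200 means "final page".
REDIRECT_MAP = {
    "/old-post":      (301, "/new-post"),
    "/new-post":      (301, "/blog/new-post"),
    "/blog/new-post": (200, None),
    "/promo":         (302, "/sale"),
    "/sale":          (200, None),
}

def resolve(url, max_hops=10):
    """Follow internal redirects and count the hops paid in latency
    before reaching a 200."""
    hops = 0
    while hops < max_hops:
        status, target = REDIRECT_MAP[url]
        if status == 200:
            return url, hops
        url = target
        hops += 1
    raise RuntimeError("possible redirect loop")

final, hops = resolve("/old-post")
print(final, hops)  # /blog/new-post 2 – internal links should point here directly
```

Any URL resolving with more than zero hops is a candidate for updating the internal link to its final destination.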
10. Low Quality Pillow Links
What are pillow links?
Pillow links are used to diversify your anchor text ratios. In the audio industry, the comparable concept is signal-to-noise ratio: when you buy a microphone, you want high signal and low noise – but an SEO link profile wants the complete opposite.
If your link profile is all signal and no noise, it’s easy for Google’s machine learning to analyze your website’s links and pick up unnatural trends. For the audiophiles out there, you could think of pillow links as dithering: you add noise to the signal to improve the end result.
Here’s a shocking figure for you from a recent client who had built 100s of pillow links attempting to dilute a high target anchor text distribution:
The client built 315 pillow links in the past 12 months, yet only 25 of them were indexed. All that time and money was providing almost no value.
The solution is to use indexing tools that encourage Googlebot to index your pillow links… or just build all quality links and you won’t have this problem.
For indexing, we use our own proprietary tool but there are some online services that can do this for a fee.
“TSI are very professional and their SEO knowledge is top level. I’d been doing my own SEO for years before hiring them, and had some decent success with it at times, so I know good SEO when I see it. My site’s traffic and rankings are gradually rising and TSI have answered all my questions. We’re only 2 months in, but so far so good!” – Sam D
So there you have it.
These are the 10 most common issues we have found with client websites in the past few months and I hope this helps you in your own ranking endeavors.
Get a Free Website Consultation from The Search Initiative:
If you’re still having trouble then contact The Search Initiative to find out how we could help you gain results that last and improve your conversion rates in the process.