Every once in a while you run an SEO campaign that changes the way you do everything.
The lessons you learn, the challenges you face, and the results you achieve inspire you to rewrite your whole SEO gameplan.
This is the story of one of those SEO campaigns.
As you might already know, I’m a director of a very talented SEO agency called The Search Initiative (TSI). Since coming on, we’ve encountered many wins and this case study is one of them.
In a few months, we lifted the client’s algorithmic penalty and increased their traffic by 9,109%. You’re about to learn the exact steps we took to achieve this.
You’ll learn:
- A detailed onsite, offsite, and technical SEO audit process
- How to repair algorithmic penalty problems
- A safe link building strategy for 2024
- Conversion rate optimization strategies for fast growth
Fair warning: the strategies detailed ahead are intense but worth it.
Here’s the success one reader found after following this case study:
Case Study: From 1,036 to 95,411 Organic Visitors Per Month
This is the story of a campaign for a social media marketing website.
Our client monetizes their website by selling monthly subscriptions to achieve better social proof on Facebook, Instagram, and other social networks.
If you’ve ever worked in this niche, you know it’s not an easy one. In fact, it’s one of the hardest niches there is.
The Challenge
The client joined The Search Initiative with a heavy algorithmic penalty. Traffic at the time had decreased significantly to almost 1/10th of the previous volume.
If you’ve ever had an algorithmic penalty before, you know firsthand the frustration and annoyance of such a disaster.
The main challenge was to determine what type of penalty had hit the site and to take action on getting it lifted.
General Approach
We started by thoroughly analyzing the data based on the tools available to us and the details provided by the client. The initial analysis included looking into:
- Google Analytics
- Google Search Console
- Keyword tracker (Agency Analytics)
- SEMrush
- Ahrefs
- Cloudflare
- Server settings
- Previous link building reports and audits
Once we determined the most probable cause of the penalty, we put together a plan of action.
We created a comprehensive onsite, offsite and technical audit before building the overall domain authority through our own link building strategies and traditional outreach to relevant blogs and sites.
How We Did It
The Dynamic Start: Backlink Review
The link profile of the domain included a lot of spammy, low-value domains.
Since a previous automated backlink audit (most probably done using Link Research Tools) had been performed before the client joined our agency, we started by reviewing its results.
At TSI we know that when it comes to potential link penalties, especially the algorithmic ones, we have to be very thorough with the link reviews. To start the analysis, we downloaded all the link data from the following sources:
- Google Search Console – it’s a real no-brainer to include all the links that Google definitely has in their database. However, according to this Google Webmaster Help page, you have to remember that GSC presents only a sample of links, not all of them.
- Ahrefs – it is our go-to and best third-party tool when it comes to links. Their database is an absolute beast and the freshness of the data is also outstanding. To gather all link data, go to Ahrefs, type in your domain and select Backlinks. Now you’re good to export it to an Excel file:
By the way, make sure you select the Full Export option, otherwise, you’ll be exporting only the first 1000 rows with the Quick Export:
- Majestic – even though their crawler might not be as complete as Ahrefs’, you still want to have as many link sources as possible for your audit. With Majestic, you’ll have to type in your domain → select “Root Domain” → Export Data.
Now, because of link memory (AKA ghost links – links that are deleted, but that Google still “remembers”), we export the data from both the Fresh and Historic indexes. Also, make sure to set the tool to “Show deleted backlinks”.
- Moz and SEMrush – Similarly to Majestic, with these two we just want to have as many links as possible and complement the database, in case Ahrefs missed some.
How to get links data in Moz Open Site Explorer: Your site → Inbound Links → Link State: All links → Export CSV
How to get links data in SEMrush: Your Site → Backlink Analytics → Backlinks → Export. Please make sure to select the “All links” option.
We had all the data now, so it was time to clean it up a bit.
There’s no real secret to using Excel or Google Sheets, so I’ll just list what you’ll have to do with all the link data prior to analyzing it (if you’d rather script the merge, there’s a quick sketch right after this list):
- Dump all Ahrefs data into a spreadsheet. If you’re wondering why we start with Ahrefs, it’s explained in step 4.
- Add unique links from GSC into the same spreadsheet.
- Add unique links from all other sources to the same spreadsheet.
- Get Ahrefs UR/DR and Traffic metrics for all the links (Ahrefs data will already have these metrics, so you’re saving time and Ahrefs’ credits).
- Spreadsheet ready!
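For those who prefer scripting the merge, here’s a minimal Python/pandas sketch of the list above. The file names and column headers are placeholders – swap them for whatever your actual exports use:

```python
import pandas as pd

# Hypothetical file and column names - adjust to match your actual exports.
SOURCES = {
    "ahrefs.csv":   "Referring Page URL",
    "gsc.csv":      "Linking page",
    "majestic.csv": "SourceURL",
    "moz.csv":      "URL",
    "semrush.csv":  "source_url",
}

frames = []
for path, col in SOURCES.items():
    df = pd.read_csv(path)[[col]].rename(columns={col: "referring_url"})
    df["found_in"] = path
    frames.append(df)

# Ahrefs is listed first, so its rows survive the dedupe - you can then join
# its UR/DR/traffic columns back in without spending extra credits (step 4).
links = pd.concat(frames, ignore_index=True)
links["referring_url"] = links["referring_url"].astype(str).str.strip().str.rstrip("/")
links = links.drop_duplicates(subset="referring_url", keep="first")

links.to_csv("combined_link_audit.csv", index=False)
print(f"{len(links)} unique referring URLs ready for review")
```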
With the spreadsheet ready, we started the very laborious process of reviewing all the links. We classified them into 3 categories:
- Safe – these are good quality links.
- Neutral – these are links that are somewhat suspicious and Google might not like them that much – although they’re quite unlikely to be flagged as harmful. We always highlight these in case we need to re-run the link audit (for example, if the penalty does not get lifted).
- Toxic – all the spammy and harmful stuff you’d rather stay away from.
Some of the main criteria we’re always checking:
- Does it look spammy/dodgy AF?
- Does it link out to many sites?
- Does the content make sense?
- What is the link type (e.g. comment spam or some sitewide sidebar links would be marked as toxic)?
- Is the link relevant to your site?
- Is the link visible?
- Does it have any traffic/ranks for any keywords? Ahrefs’ data helps here.
- Is the page/site authoritative? Ahrefs’ DR helps here.
- What’s the anchor text? If you have an unnatural ratio, then it might be required to disavow some links with targeted anchor texts.
- Is the link follow/nofollow? No point disavowing nofollow links, right?
- Is it a legit link or one of these scraping/statistical tools?
- Is it a link from a porn site? These are only desirable in specific cases, for example, if you’re a porn site. Otherwise, it’s disavow time.
If it is likely that the whole domain is spammy, we’d disavow the entire domain using the “domain:” directive, instead of just a single URL.
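For reference, the disavow file itself is just a plain UTF-8 text file with one entry per line: either a full URL, or a whole domain prefixed with “domain:”, plus optional “#” comments. A minimal sketch with made-up domains and URLs:

```text
# Whole domains judged spammy
domain:spammy-directory-example.com
domain:link-farm-example.net

# Individual toxic URLs on otherwise acceptable domains
http://blog-example.org/buy-cheap-followers-comment-spam/
```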
Here’s a sneak peek at how the audit document looked once we finished reviewing all the links:
Then, we compared the results of our audit against the current disavow file and uploaded a shiny new one to Google Search Console.
We disavowed 123 domains and 69 URLs.
Additionally, we also used our in-house, proprietary tool to speed up the indexing of all the disavowed links. Something quite similar to Link Detox Boost, but done through our own tool.
Here’s a little screenshot from our tool:
Crucial Stage 2: The Onsite Audit
The next step taken was a full, comprehensive onsite audit.
We reviewed the site and created an in-depth 30-page document addressing many onsite issues. Below is a list of elements covered in the audit:
Technical SEO
Website Penalties
First, we confirmed what the client had told us and established what kind of penalty we were dealing with. It has to be emphasized that there were no manual actions reported in GSC, so we were dealing with a potential algorithmic penalty.
We searched Google for the brand name and did a “site:” operator search.
If you were previously able to find your brand name ranking number 1 in Google (or at least among your other profiles, e.g. social media accounts, on the first page) and it’s no longer there, you know you’re in trouble. Basically, if Google devalues or de-ranks you for your own brand, this is a very strong indicator that you’ve been hit with a penalty.
With the site: operator search it’s a bit trickier. However, as a rule of thumb, you should expect your homepage to show up as the first result returned for a simple query like “site:domain.com” in Google.
Another way of confirming the content devaluation is to copy and search for a fragment of the text on your core pages. In the example below I do a Google search of 2 sentences from one of my articles (right-click to bring up a search of the text you highlight):
As you can see below, Google finds it on my page and shows it as the first result:
If that were not the case and Google did not show my page first, or at all, it would be a very strong indication that the article page or site is under a heavy devaluation or even a penalty.
HTTP / HTTPS Conflicts
At this point, it is also a good idea to make sure Google has only indexed the preferred protocol for your site. To do that, run another simple site search:
You don’t want any http results to come up if you’re using https, or vice versa. If they do, you most probably don’t have the correct redirects in place.
To ensure a 301 redirect from http to https on WordPress, you can try using a combo of these 2 plugins:
- https://wordpress.org/plugins/http-https-remover/ – ensures your internal resources are referenced with protocol-relative links that start with // instead of https:// or http://
- https://wordpress.org/plugins/really-simple-ssl/ – does a good job and handles the redirects for you. From our experience, it works seamlessly in 9 out of 10 cases.
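If you’re not on WordPress, or you’d rather handle the redirect at the server level, here’s a rough sketch of the usual Apache .htaccess rule (assuming mod_rewrite is enabled; nginx and other servers have their own equivalents):

```apache
RewriteEngine On
# Force a single 301 hop from http to https for every request
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```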
We describe https conflicts in more detail further down this case study.
Indexed Content
In this section, we focused on everything that was indexed but shouldn’t be.
As Rowan from TSI already discussed in his article The 4 Pillars of Mastering Google Website Crawl, you don’t want Google to crawl stuff that does not represent any value for the search engine. Here are some examples of such pages:
- Cart & Checkout (example – it is actually indexed)
- Your internal search results (example – noindexed)
- WordPress image pages (it can actually really affect your rankings!)
- Product review forms
- Product comparison pages
- Session IDs (plenty of examples)
- Empty categories – best not to have them
- Login and Registration (examples)
You can find them by running “site:” operator searches, usually combined with the “inurl:” operator. Make sure to explore the supplemental search results, too, as Google will try to hide stuff from you:
You may be surprised by how much of an issue it can really be. In the above case, only 3 URLs were revealed in the normal search, but 145 were really sitting in the supplemental index:
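To dig these out yourself, combine the two operators. A few illustrative queries – example.com is a stand-in for your own domain, and the exact URL patterns depend on your CMS:

```text
site:example.com inurl:cart
site:example.com inurl:login
site:example.com inurl:search
site:example.com inurl:attachment_id
```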
Site Speed Optimization
Site speed optimization is a fundamental part of modern SEO. A fast loading time has an immediate and significant effect on rankings, with Google actively suppressing the rankings of sites that load slowly. To learn how to reduce server response time (a big chunk of load speed), check out the link above.
I go through it in more detail below.
3XX Redirects
If any internal links point to URLs that 301 redirect to another URL, they should be updated to link directly to the final URL (status code 200) to prevent unnecessary load time and loss of authority.
4XX Errors
Very similar to the 3XX redirects above. You should avoid having any internal 4XX errors.
Make sure to not only rely on Screaming Frog (or other crawlers) but, first of all, check your Google Search Console.
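If you want a quick second opinion outside of Screaming Frog and GSC, a few lines of Python can re-check the status codes of the URLs your internal links point to. A rough sketch, assuming you’ve already dumped those link targets into a text file (one URL per line; the file name is hypothetical):

```python
import requests

# Hypothetical input file: one internal link target per line.
with open("internal_link_targets.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # Don't follow redirects: we want to see the 3XX itself,
        # so the internal link can be repointed at the final 200 URL.
        r = requests.head(url, allow_redirects=False, timeout=10)
        if 300 <= r.status_code < 400:
            print(f"{r.status_code}  {url}  ->  {r.headers.get('Location')}")
        elif r.status_code >= 400:
            print(f"{r.status_code}  {url}")
    except requests.RequestException as e:
        print(f"ERROR  {url}  ({e})")
```

Note that some servers answer HEAD requests differently than GET, so treat the output as a shortlist to verify rather than gospel.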
Site Structure
The site structure plays an important part in helping Google determine the authority of each page and is broken into levels (or distance) from the homepage.
As a general rule, the homepage is most likely to rank for large broader terms, with deeper pages naturally gravitating towards fewer keywords that are more specific.
Level 0 – Homepage
Level 1 – Pages linked directly from homepage
Level 2 – Pages linked directly from Level 1
…and so on for deeper levels.
I go through how we improved the Site Structure in more detail below.
Robots.txt
This section of the audit covered a review of robots.txt and its improvements.
Unwanted sections of the site discovered during the indexed content analysis (in this case, the Cart and Author pages), which Google was not supposed to even attempt to visit, were excluded in robots.txt.
Additionally, we made sure that the XML sitemap was referenced in the robots.txt file.
Here’s an example robots.txt file similar to the one we suggested:
User-agent: *
Disallow: /cart/
Disallow: /author/
Sitemap: https://domain.com/sitemap.xml
WHOIS
Google is particularly interested in sites that don’t drop and rarely change ownership, because domain stability is a sign of trust that you are a legitimate site.
I recommend that you always consider renewing your domains for 2 – 3 years at a time since this is a good sign of trust and might be considered a very small ranking factor (note: I have not tested this).
Internal Linking
We evaluated the way the navigation was set up and advised changes to the top menu to ensure a better flow of link juice.
You should also look at your internal anchor text so that links to the homepage have a more branded focus, while internal pages are more targeted with Exact Match, Partial Match, and Topical anchor text. Read my article “A Complete Guide to Anchor Text Optimization” for more info on balancing these ratios.
Mobile Formatting
There were some formatting issues with the way the site was displaying on mobile devices. Although not much of a ranking factor, cosmetic layout issues can be annoying and scare your potential users away. Who’d want to buy from a site that looks abandoned?
Schema Markup
We reviewed the markup errors Google Search Console has revealed:
Structured Data is a great tool, however, it might be easily messed up by just a simple typo in the code. In this case, the issue came up for a number of author pages, which we wanted to remove anyway, so once we did so, all the issues were gone.
Accelerated Mobile Pages
Similarly to Schema Markup, we also reviewed AMP issues reported in Google Search Console:
AMP, just like Structured Data, is very sensitive. In this case, we had to fix an issue marked as “The attribute ‘type’ may not appear in tag ‘li’”.
Here’s exactly what was causing an issue:
To fix it, we simply had to remove the “type” attribute.
More about fixing the most common AMP validation issues can be found here.
Content SEO
Thin Content
Thin content can quickly bring down the overall ranking of a site, as Google decides whether the site is able to deliver useful information to its users.
Once we crawled the site with Screaming Frog, we pulled all pages with fewer than 1,000 words and suggested bulking them up:
What you should remember is that Screaming Frog provides a very naive word count. It includes all static on-page elements, such as: menu, footer, header, sidebar, etc. For this reason, 498 words reported in Screaming Frog might, in reality, be only 189 words of primary content.
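If you want a less naive number, you can approximate the primary-content word count by stripping the obvious boilerplate elements before counting. A rough sketch with requests and BeautifulSoup; the list of tags to strip is an assumption you’d tune for your own theme:

```python
import requests
from bs4 import BeautifulSoup

def primary_word_count(url: str) -> int:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Drop elements that usually hold boilerplate rather than primary content.
    for tag in soup(["script", "style", "nav", "header", "footer", "aside", "form"]):
        tag.decompose()

    return len(soup.get_text(separator=" ").split())

print(primary_word_count("https://example.com/some-page/"))
```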
Also, don’t get caught up in the “I have to have 1,000 words of content on every page” mindset.
Google is after juicy content more than waffle, so if you can fully cover the subject and make it 100% on-topic in only 700 words, I don’t encourage you to add random filler just to hit the desired 1,000 words.
Google won’t appreciate that. It’s better to leave it at 700 and see what happens.
Alternatively, research the subject more and add some relevant information there.
Recently we had an emergency plumbing site with city-specific pages talking about festivals in the area and showing some statistical data about the city.
You might have already guessed it – Google would not rank these pages for terms like “emergency plumber [city]”.
However, they ranked very well for keywords like “[city] population”, “festivals near [city]”, etc.
Why? Because the content was irrelevant to the promoted services and Google found absolutely different search intent for these pages.
Duplicate Content
Duplicate content can quickly bring down the overall ranking of a site, as Google decides whether the site is able to deliver useful information to the users.
We ran a Siteliner crawl to see the duplicate content ratio and where most of the duplicated content was. Here’s how the tool looks while scanning my site:
The recommended maximum amount of duplicated content is 10%. Some pages will be way over this mark, which might be quite normal. You should review each individually to ensure that the content there is, in fact, OK to be duplicated.
Some of the duplicated content in this case study came from the /author/ pages mentioned earlier and needed to be blocked in robots.txt.
Page Title Optimization
Page titles (title tags) signal to users (and Google) a summary of your page and what they will find within.
When the page title includes a core keyword it is normally highlighted to users in the SERPs and can also compel users to click-through.
However, long page titles are often truncated and don’t look professional. It is wise to create tidy titles that entice users to read more from your website, while including your targeted keywords to feed the Google algorithm the information it needs to establish relevancy.
Once you crawl your site with Screaming Frog, make sure to review each element under the Page Titles section and act on each issue accordingly:
Also, make sure you read Ahrefs’ guide on How to Craft the Perfect SEO Title Tag to up your page titles game.
In this case, we also later used page titles to fight some keyword cannibalization issues, which I talk about a bit more below.
A very easy win for you could be just to extend the shortest page titles within your site by enriching them with some of your core keywords.
However, while building the page titles, be careful not to cause cannibalization issues or over-stuff the title tags.
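One quick way to surface the short, long and duplicate titles is to run the Screaming Frog export through a few lines of pandas. The file name, column headers and the 30/60-character thresholds below are assumptions based on a typical export, so adjust them to match yours:

```python
import pandas as pd

df = pd.read_csv("page_titles_all.csv")            # hypothetical Screaming Frog export
titles = df[["Address", "Title 1"]].dropna()

too_short  = titles[titles["Title 1"].str.len() < 30]
too_long   = titles[titles["Title 1"].str.len() > 60]
duplicates = titles[titles.duplicated(subset="Title 1", keep=False)]

print(f"Short: {len(too_short)}, Long: {len(too_long)}, Duplicate: {len(duplicates)}")
duplicates.sort_values("Title 1").to_csv("duplicate_titles.csv", index=False)
```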
Heading Optimization
Headings are an important way to signal to Google more than just site structure, but also relevancy of content on a page.
We recommended that the client include a single H1 tag per page, use H2 tags for SEO-relevant headings, and H3 tags for non-SEO-relevant headings.
This is a big relevancy signal to Google that could be well optimized on your site.
Similarly to Page Titles, Screaming Frog should help you while reviewing all your headings.
Having a combined list in front of you, you can spot optimization issues (too long, too short, or non-unique headings) and rectify them all. Headings, especially H1 and H2, are an often overlooked optimization element. Just by improving their readability, adding some keywords, or making them more relevant to the page content, you can get some very quick wins.
Meta Description Optimization
Meta descriptions don’t contribute to a site’s rankings, but relevant, compelling meta descriptions can encourage more people to click through from the SERPs.
Since meta descriptions are just a snippet of text, it’s important to include vital information about whether you are going to fulfil a person’s needs, giving them a good reason to click through.
I would recommend that you write a description that includes keywords you are targeting for each page and summarizes what your page is going to provide. When the meta description is too short or non-descriptive, Google uses random text from the page that doesn’t always encourage users to click through.
This site had a lot of pages with missing or poorly auto-generated meta descriptions. Even though it’s not always efficient to manually do all your page descriptions, you should still do that for your core pages.
Image Optimization
Image alt attributes are used to make it easier for Google to understand what your images present. They also help you to show up in the Google Image Search.
Alt attributes also provide a place where you can include your core keywords, helping Google to get a better understanding of your page contextually, whilst also helping you to improve your keyword density.
Be careful not to over-optimize though.
Proper image optimization, however, should also focus on decreasing the size of images. As you can see in the above screenshot, there were many images above 1 MB on our site. This meant that we looked not only at missing, over-optimized or overly long alt text, but also at all images above an average size of 100-150 KB.
In WordPress, you can use many plugins that will automatically optimize your images.
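If your images live on your own server (e.g. a WordPress uploads folder), a quick way to find the heavy ones is to walk the directory and flag anything over the ~150 KB mark mentioned above. A minimal sketch; the path is a placeholder:

```python
import os

UPLOADS_DIR = "/var/www/example.com/wp-content/uploads"   # placeholder path
LIMIT_KB = 150

heavy = []
for root, _dirs, files in os.walk(UPLOADS_DIR):
    for name in files:
        if name.lower().endswith((".jpg", ".jpeg", ".png", ".gif", ".webp")):
            path = os.path.join(root, name)
            size_kb = os.path.getsize(path) / 1024
            if size_kb > LIMIT_KB:
                heavy.append((size_kb, path))

# Biggest offenders first
for size_kb, path in sorted(heavy, reverse=True):
    print(f"{size_kb:8.1f} KB  {path}")
```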
Another Discovery: Potential Cloudflare Issue
In the meantime, while creating the audit, the Campaign Manager spotted a serious Cloudflare configuration issue, which might have been causing Googlebot crawlability issues. The setting was amended immediately and the crawl rate in Google Search Console was increased in order to force a re-crawl of the site as soon as possible.
If you use Cloudflare, it might be worth looking at the below option:
It’s located under “Firewall → Package: Cloudflare Rule Set → Advanced → 100035 Prevent fake googlebots from crawling”.
As far as I know, it’s a part of Web Application Firewall (WAF).
By the way, my colleague, friend, and director of SEO at TSI, Rad Paluszak (also one of the speakers at Chiang Mai SEO Conference) told me of some cases of cr*ppy hosting providers who were blocking Google and other search engine bots just to save money on the traffic and internet bills.
I know, right…WTF?!?
Campaign Goals Breakdown
Our audits actually check for many more website shortcomings, but these are the main issues that we uncovered for this client’s site.
We then put together a custom strategy to not only recover from the penalty but to get high levels of traffic back into the site.
We outlined the core campaign goals of the strategy as follows:
- Get the penalty lifted, otherwise, all other changes would have no impact.
- Push core keywords into the index. We noticed a lot of low-hanging fruit; however, the keywords with the highest search volumes were not within the first 100 search results.
- Bring keywords on the first 3 pages to page 1. Specifically terms with 1,000+ monthly searches.
- Improve CTR of the site once traffic returns to normal.
- Work on the conversion side of the site, as we were predicting that with great visibility, the site might still suffer from a lack of decent conversion rate.
Success: Algorithmic Penalty Lifted
We executed the onsite audit and assisted with additional support during the very early stages of the campaign.
Even for us, it was a bit complicated to point to one definite thing we did that lifted the penalty, because we had taken a fair number of actions in the early stages. However, we would point our finger at the link profile cleanup as the most likely cause.
Let’s have a look at other key areas we worked on AFTER the penalty removal.
More Technical Discoveries
Other than what I already described above, there were many interesting technical developments once we audited the site. Below are a few of them:
Page Speed
We discovered and advised on some basic (and advanced) page speed optimization strategies, including deferring JavaScript, improving time-to-first-byte (TTFB) and using sprites.
The guys started at 41/100 Low (Mobile) and 71/100 Medium (Desktop) for the site. Now it looks like this:
Don’t know about you, but it already looks quite good to me.
However, we’re still planning to tweak it even more with Critical CSS Path optimization and improving (or actually completely getting rid of) render-blocking scripts. One nifty tool for that is https://criticalcss.com/
HTTPS Conflicts
The site was using one HTTPS certificate for the login page and a different one across the other pages of the site, which was creating a conflict in some cases, with the browser highlighting a change in the certificate.
(Screenshot: Cert #1 vs. Cert #2)
As you can see, there were 2 completely different SSL certificates installed. This might not make much of a difference to Google, but since Chrome’s becoming more and more paranoid about security, we wanted it unified.
Site structure and Internal Linking
We suggested a better approach to siloing the site’s content. We also gave recommendations regarding improved internal linking strategies.
Very basic stuff to show Google the site’s hierarchy: Homepage → Category → Subcategory → Product.
This is how it looked before:
Notice there was only 1 edge (connection) between the homepage and one of the most important core pages, which linked to all the other categories, products and services. Additionally, the connection went through another, unimportant page. This obviously messes up the site’s flow and negatively impacts the site’s crawlability.
And here’s how it looks now:
In case you’re wondering why there are fewer vertices (pages) in the “after” graph, it is because we also did some serious cleaning and trimmed the site a lot.
By the way, if you don’t know this tool, the graphs were generated using Sitebulb – new, but already one of the favorite toys in TSI’s toolbox.
LSI Keyword Research
We also suggested to the client that the best approach to keyword research was to target LSI terms. This later improved content relevancy and decreased content cannibalization issues.
A couple of the tools we used to get LSI keywords and inspiration were LSI Graph and AnswerThePublic.
Ongoing Work: Link Building
A big part of our efforts once the penalty was removed, was the link building element.
Considering that the site had almost certainly been penalized with regards to its link profile, we knew that we needed to work towards getting very good quality links. We achieved this through our natural outreach activities supported by an engineered link building element (e.g. guest posts).
During the course of the campaign, we leveraged high-traffic guest posts from my guest post service, Authority Builders. Here’s an example of the Ahrefs metrics for one of them:
With our manual, custom outreach, we started by finding a huge list of relevant websites and blogs.
Once we assessed each one of them in terms of traffic, SEO value, relevancy and quality, we started contacting them through the contact details available on the sites or via contact forms.
If you want to learn more about TSI’s vetting process, it’s quite similar to how Authority Builders approaches its link prospecting criteria.
All the blogs that responded were then provided with high-quality content relevant to our client’s site. The content usually included a link back to a selected page within the client’s website or a mention of the client’s brand.
Here’s an Ahrefs graph showing an example of healthy link growth:
On average, we build between 10 and 25 links from fresh referring domains every month.
More Adjustments: Keyword Relevance and Cannibalization
During the campaign, we went through a few Google updates requiring us to shift the goal focus slightly. Some of the things we adjusted were onsite page titles, meta descriptions and the structure of headings for the core pages.
For example, one of the core pages was heavily cannibalized against the homepage, causing Google to juggle between them. This always results in dipping rankings for both pages. Therefore, we suggested a new page title for both pages with new core keyword focus and relevance.
The solution was really simple and can be described in 2 steps:
- De-optimize the colliding page by removing the keyword from its page title.
Example Before: Our Core Keyword and Some Other Social Media Related Keywords – Brand
Example After: Synonym To Our Core Keyword and Other Social Media Related Keywords – Brand
- Move the focus keyword to the front of the homepage title and write a better-optimized page title.
Example Before: Something, something, Our Core Keyword – Brand
Example After: [BEST] Our Core Keyword from $9.99 a month – Brand
We got lucky as the keyword was quite short – only 12 characters – and we could easily improve the page title on the core page. Also, we could replace the term with a synonym on the homepage without affecting the sound or meaning of the page title.
We did the same with the meta descriptions and headings.
In effect, Google was no longer getting confused choosing between these 2 pages, and the page we were targeting with the core keyword leaped over 20 positions and started ranking on the first page.
We (and the client) had a beer for this one.
Below is how a cannibalization issue can look on a timeline and how the URLs behaved:
Please bear in mind that the positions above are the calculated average for each month. On an almost daily basis, they can swing up and down by up to 25 spots.
Also, you need to be aware that when you’re fixing cannibalization issues, you may see a dip in the overall number of keywords your site is ranking for. It does not mean you’re losing visibility!
It means that if you had 5 URLs ranking for 1 keyword, tools like SEMrush would count it 5 times. Once you resolve the cannibalization, only one URL will be counted in the tool, but you should see increased rankings and, shortly after, more traffic ($$$).
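A quick way to spot cannibalization candidates in your own rank-tracking or Search Console data is to group by keyword and flag anything where more than one URL ranks. A sketch, assuming a CSV export with keyword, url and position columns (the file and column names are placeholders):

```python
import pandas as pd

df = pd.read_csv("ranking_export.csv")   # hypothetical export: keyword, url, position

# Keywords where more than one URL is ranking
counts = df.groupby("keyword")["url"].nunique()
cannibalized = counts[counts > 1].index

for kw in cannibalized:
    rows = df[df["keyword"] == kw].sort_values("position")
    urls = ", ".join(f"{r.url} (#{int(r.position)})" for r in rows.itertuples())
    print(f"{kw}: {urls}")
```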
Content Improvements
During the course of the campaign, we suggested creating new articles regarding their topic (social media) and the use of individual platforms for the business’s benefits.
We used our specialized copywriting team to create 5 cornerstone articles which could be published on the site and be a base content for further blog posts.
Cornerstone articles are usually explainers: relatively long articles combining insights from different blog posts.
It’s important to review competitors and other industry leaders and their blogs, not just their competing pages. By doing research into your competition’s link building strategy you can often find reliable link opportunities and content topics that can work.
Take, for example, the payday loans website called “Quick Quid”. They’re using an infographic outreach strategy for their links. The majority of their blog content is around cooking, food, and healthy living – but with finance tied into it.
These are industries that have much higher volumes of bloggers and link opportunities, and through internal linking, they are building up lots of links.
While this link-baiting approach isn’t the best tactic for every site, it’s an important reminder that a functioning blog is crucial for success.
Conversion Rate Optimization: Increased Leads by 7.5x
A few months after we managed to remove the penalty and had continued building high-quality links, we installed Hotjar tracking code in order to create a comprehensive Conversion Rate Optimization (CRO) audit.
Hotjar took a couple of months to gather all the data required, while we kept feeding the site with authority and tweaking the relevancy signals.
Our CRO audit was focused on providing the best user experience while improving the percentage of visitors who became customers.
We created a 20-page document including the below contents:
- User Journey
- SERPs
- Landing Page
- Hotjar Analysis
We performed a review of each of the Hotjar click & heat maps to determine which elements were getting the most engagement and on which devices.
- UX Designing for Big Screens
Here we referenced typical screen sizes and used Screenfly to show the differences on the most popular devices. Since the site had a decent mobile version, we only made suggestions regarding the use of buttons and the popular JavaScript tabs. The tabs in our case were not aligned properly on mobile, which could have had an impact on how users interacted with the pages. This change was also meant to lift some of the content above the fold. We improved engagement by ~50% just by fixing the tab layout.
- Above the Fold Content
In this case, we found that most mobile users could only see a huge headline above the fold of the homepage, which might have made it look like there was nothing past the headline text. We suggested redesigning the header to make the headline smaller while also including product links right below it, expecting at least a 15% improvement in time on page.
- Product Pages
Here we discovered that the pricing table was cluttered with too many elements and ~40% of users were not even getting to read all of the listed features. Additionally, there was some extra information listed below the pricing table (e.g. how the order gets delivered) which fewer than 30% of users were scrolling down to read. We suggested decluttering the pricing table and focusing only on the elements differentiating each of the packages. The additional information was placed behind a visible button, so users interested in the process could quickly access the details. A pricing table clean-up could be a quick win for your site, too!
Unfortunately, I can’t share the domain URL or screenshots of the site with you and since all the heatmaps would include the screenshot, I can’t share them either.
What I can tell you, other than what I already shared above, is that we also looked at some other cool stuff, like this statistic on typical mobile device usage from Adobe.
If I were to give you a quick hint as to where you could start the CRO analysis of your site with only Google Analytics, I’d suggest having a look at your Behavior Flow report (GA → Reports → Behaviour → Behaviour Flow):
The Behavior Flow report visualizes the path users travelled from one page or event to the next.
This report can help you discover what content keeps users engaged with your site. The Behavior Flow report can also help identify potential content issues.
At TSI we usually pick the route with the most traffic (the widest one – as highlighted in the screenshot above) and dig into each step of the way to check where most of the drop-offs happen.
You can see the percentage of drop-offs when you hover over each page. When users are dropping off, it clearly suggests there’s something wrong with the page or funnel, and it’s worth analyzing what it is.
Make sure to read my articles Conversion Rate Optimization (CRO) for SEOs and Guide to Content for CRO and SEO, where I give more examples of simple CRO improvements and tests that can become game-changers.
Here are some other tools which may help with the testing:
- Google Analytics – especially notable Google content experiments;
- HotJar – heatmaps, mouse tracking, user behavior tracking;
- Yandex Metrica – Google Analytics alternative including heatmaps and mouse tracking;
- Optimizely – A/B and multivariate tests on existing pages.
A great collection of case studies, practical guides, and design recommendations can be found at Google Design.
- The Definitive Guide To Conversion Optimization – https://goo.gl/y8L1ao.
- 21 Conversion Rate Optimization (CRO) Tools – https://goo.gl/pRQ7xN.
The Results (AKA Traffic Boner)
Since implementing the strategies above, the site has seen traffic continuously grow month on month.
Below is a keyword visibility graph from Ahrefs:
Comparing against the starting month, we’ve gained the following results:
- 15,644.04% increase in search traffic from 1,308 to 205,932 sessions a month.
- 9,109.56% increase in search traffic from 1,036 to 95,411 users a month.
- 331.58% increase in average monthly CTR from 3.8% to 16.4%.
- 2,022 total positions increased across 33 keywords tracked in our tracking tool.
- Average monthly position improved from 24.4 to 8.6.
Organic Traffic Improvements
By implementing our overall strategy – auditing the website in the first month, then the onsite implementations and the links built – we saw over a 17-fold increase in users by the third month compared to the start of the campaign.
The number of users from organic traffic has gone up from 1,308 in the first month to 22,497 by the end of the third.
At the same time, the number of sessions went up from 1,308 to 30,265.
By the ten-month mark, once the link building effects were in full force and the CRO audit had been performed, the traffic was as follows:
- Users up from 1,036 to 95,411
- Sessions up from 1,308 to 205,932
Client Testimonial
“We apparently had gotten a Google Penalty, losing 6,000 Keyword positions overnight. Naturally we wanted to recover as fast as possible. After talking with many people in the SEO community we decided to work with https://thesearchinitiative.com/.
Not only did we get a full Penalty recovery but our traffic has since tripled and our income doubling. We’ve seen a consistent month over month growth in traffic and positions. Looking forward to gaining more positions and higher rankings in more terms.“
Conclusion
Hopefully, this shares some insight into removing penalties, and how building long-term strategies can pay off in a big way.
You’ve learned strategies on performing detailed onsite, offsite, and technical audits – and how to implement them.
You’ve also learned some link building and conversion techniques that will significantly grow the top-line of your website.
At this point in time, I’ll leave you with this, but if you’re still having trouble then contact The Search Initiative so we can discuss how to get explosive and lasting results for your (or your clients’) sites as well.
And one more thing…
We just had a recent core algorithm update. How’s the client’s site doing now?
I’ll leave this here:
Get a Free Website Consultation from The Search Initiative:
Damn boy! That was a really good read!
Are you sure it was not caused by this Cloudflare bug?
How long did it take you to complete the audit?
Who implemented the changes? You or the client?
Great reading as always! Thanks!
Highly doubt it was just the cloudflare bug. Some of the initial checking revealed it definitely had to do with links.
You need to read the article again.
Really, it is an outstanding outline. Thanks, brother.
Great post. Thanks for sharing.
Is this case study still usable as of Nov 2020? Have there been any significant changes?
Absolutely, this case study is still relevant.
This is a masterpiece!
Thanks bro.
Matt, outstanding outline. Thank you very much brother.
Thanks for stopping by Ralph.
Nice one Matt. Thanks again
Thanks, bro.
Hi Matt,
An amazing read as always!
I doubt that any SEO blog can provide this level of quality and quantity of information.
That’s why when something as big as the recent update happens, I wait for you to come out and say something about it.
I have one question though. It has been my observation that the recent update from Aug 1st has something to do with the E-A-T google ranking factor.
I have noticed that sites that have more information about the author(s) of the text have outranked those that do not.
What is your take on this?
Did you work on your client’s site E-A-T too?
I’m sure a lot of sites in this last update were YMYL. This site was not, and got a huge ranking increase. I’ve also had reports of people getting creamed in hosting niches and amazon bathroom product reviews. So yeah.
Hey Matt,
Do you have any tips on how to get out of a YMYL penalty? My traffic dropped to almost zero after this update. Please advise.
Thanks,
Shannon
Hi Shannon… I’m not in the camp that believes the update was solely focused on YMYL. I also don’t believe updates are targeting one small aspect at a time. The best advice I can give is to “do all the things”. This article is a good place to start.
How much did TSI charge for the algo fix, Matt?
We don’t disclose stuff like that… but prices are bespoke on the amount of work.
Wow – that’s the mother of all posts!
Re 404 errors … I did a bunch of reorganizing, including killing off a lot of content which was not ranking and was too thin (there was no way this could have been somehow made better), so I unpublished it.
What would you recommend one does with the 404s generating from those old URLs?
Leave them to throw errors, or 301 them to anything relevant? Most of this stuff does not have any incoming links, so there’s no real value in them, but there are quite a lot of 404s in Search Console because of this.
I’m a completionist, so I’d probably 301 them anyways.
So how much did you get paid for all this hard work?)
A gentleman never tells.
To me this is a great example of a thorough, all-around audit. This website that you were working with is a little outside of my normal scope of customers, but the basics are all the same. Most likely just one or two of the major changes were the fix (combined with the link building of course), but it is so much better to just fix everything and start building on top of that. I’m going to look into a couple of those image compression tools, as I have not used some of those before. Great stuff as always Matt, thanks for being so thorough.
-Mark Cherkowski
Thanks for stopping by, Mark.
Great post Matt! I have a question about your outreach method for securing links on relevant websites. Do you primarily target websites that have “write for us” pages, or do you just email tons of sites within your niche asking if they accept guest posts and see who responds?
At Authority Builders, basically, we outreach to every site on the internet with traffic.
Hey Matt,
How do you know whether they get traffic?
Do you rely on Ahrefs/SEMrush/Moz/etc., or do you use SimilarSites/Alexa?
I typically use Ahrefs.
Thank you for sharing this step by step case study.
1. May I ask if we should leave the disavow file in GSC or we have to continue updating the file and submit to GSC?
2. When should we remove the disavow file from GSC completely?
3. Would Google regard previously penalized sites (now recovered) any less favorably in rankings than sites that were never penalized?
Regards
1) If you add new lines to the disavow you need to upload it again.
2) Never. You probably don’t want to allow those links back to your site if you removed them once.
3) Not in my experience.
Awesome Matt.
I read the whole piece and to be honest was glued to the screen. Now my eyes hurt and you’re responsible for it:)
Anyway, good job on removing that penalty. The amount of action your team took is insane and it just shows how much knowledge is needed to pull this off.
I tip my hat to you.
The good news for me is that I could follow along just nicely, and I imagined myself doing the same thing.
I say that because I’m a relative newbie to SEO, but the excitement I felt reading this case study and looking at those traffic charts is proof that I’ve found my one true passion.
No, I won’t ever give up
Cheers Matt, and have a nice day.
Go get em, Nikola!
Hi Matt,
This is my first time landing on your blog, and your articles are great; I learned a lot from this one. Now I’m going to follow these tips to increase my visitors too.
I have a question: how much do backlinks affect SERP rankings?
I am waiting for your next article.
Thank you so much for this valuable article!
Backlinks? Pretty darn significant. Hard to rank without them, unless you’re in some easy niches.
Wow! This is an amazing case study that covers just about every aspect of technical seo, on page, off page, etc…
Thank you for sharing
Thanks Nathan.
Nice one, Mr Sir Diggity! 😉
Gonna try to revive a site of mine that got molested by G’s algo previously…wish me luck 😉
“We used our specialized copywriting team to create 5 cornerstone articles which could be published on the site and be a base content for further blog posts.”
Do you build outreach links to these as well? (e.g. resources, BLB etc)
Or just let them sit?
Cheers m8 😎
We build links to everything.
Got it.
Thanks man 😉
Hello, Matt Diggity
I think it’s a great approach to recovering from an algorithmic penalty. It’s very helpful for everyone right now. Thanks for sharing this.
Hey Matt,
Do the DAS technique and press release services still work in 2018?
Awesome work, very impressive.
Do you work on projects in languages other than English?
At this time, no. But we’re working on strengthening our international department.
Outstanding case study. Now I’m going to implement this on my clients’ websites. Hopefully I can build long-term websites.
Hey, thanks for sharing. I am building backlinks from high-DA sites; some are relevant to my topic and others are not. Will it be harmful to my sites? Other people are building backlinks this way too.
Very good
First time reading your blog, though I heard your interview on the Authority Hacker podcast, and I can say you really know your stuff. Thanks for sharing this.
Thanks for stopping by!
I came here from a Facebook ad, and now I realize it’s really good!
Haha.
Really awesome job Matt!
Cool to read the whole case study with so many details. I was actually surprised by the amount of increased traffic, but good job and nice results!
Keep going,
Cheers,
Such a massive piece of content! I had to bookmark it yesterday and return to finish it today. Definitely worth it though! Thanks, Matt, for sharing!
One question about the cannibalization part. What would you do when changing the title and keyword anchor text pointing from the core page to the homepage doesn’t work (because the core page’s URL contains the keyword, while the homepage is just brandname.com)?
Go to the offending page and replace the cannibalizing words with synonyms. https://diggitymarketing.com/keyword-cannibalization/
Wow Wow very Impressive! Thanks for Sharing
Hi,
The update came on the 1st of August, but my website was badly hit on the 5th of August and is totally dead now. I can’t tell whether it’s due to the new Google algorithm or something else. Is there anyone who can help me?
Thanks
Sajjad
The first great penalty removal case study I’ve ever seen.
The “traffic boner” is definitely a new word in my lexicon.
Great article, Thanks!
Don’t forget the hashtag #trafficboner.
Cool stuff Matt, thanks for sharing.
But why did you add the sitemap address to robots.txt?
Yup. Cross your T’s and dot your I’s.
Really awesome job Matt!
I carefully read the whole case study on your site and saw it has so many details. I was really surprised by the amount of increased traffic. Good job and nice results!
Thanks!
Carry On Bro
Netorola
Wow, amazing. Thanks for sharing.
Amazing.
hello Matt
My site has the fault you described above. I searched Google for sentences from my article, but they were not found.
My content is about 1,000 words long and written by myself.
I do not know what my fault is. Can you help me?
My site: Blogchamsoc.com
I am looking forward to your answer
Thank you
Please reach out to The Search Initiative and we’ll get you sorted.
Really awesome case study Matt..
I am completely new to SEO and was really looking for some actionable tips to grow my traffic, so your epic blog post really helped me a lot.
Glad to hear, Bishnu.
Hey Matt,
You guys did amazing work, thanks for sharing!
Do you have any case study or indication of improvements by doing only technical SEO?
Thanks!
By just doing technical SEO? Not really. With our clients, we go full steam on everything at once.
Thanks for sharing your case study. It helped me a lot.
Hey Matt, thanks for sharing your case study.
Is it possible that a site gets penalized for bad backlinks without receiving any notification in webmaster tools?
For sure. Penguin algorithm.
Hey Matt, thank you so much for posting this amazing case study. I see some great ideas that can really help. I’m guilty of the Cloudflare misconfiguration myself; I’ve had some problems with it in the past, and yes, it’s like, what the f**k.
Great information, I really enjoyed it. Keep up the great work.
Thanks again Matt
Wow! Really awesome guide you’ve shared, bro. I got each and every point about how the traffic dropped and the penalty occurred.
Thanks for sharing
Cheers.
Hey Matt, I’ve read half of your post and it was really interesting; I will definitely finish it soon. I’ve bookmarked your website. Actually, I’m going through a crazy situation where my site is fluctuating. Sometimes it shows up in the top positions for my targeted keywords, and sometimes it gets lost. I’m very confused; it’s been happening since last month. You have my website link now. 🙂
Can you please give me a little tip.
Thanks
Hi Syed, I recommend checking out this article.
Hey, thanks for sharing the article. I have a website, and it is not appearing in Google even after doing a site:ownclasses.com search.
Please help to figure out the problem.
Please send me more useful article.
You are deindexed. Check your robots.txt file, or check for manual actions in GSC.
Yeah, you are right, Matt.
I guess I have now fixed everything. Can you please help me figure out why my website was deindexed?
Hey Matt, amazing article! I learned a lot of things! One question: I uploaded a disavow file named “disavow-updated.txt” and Google successfully accepted it. My question is, is the name “disavow-updated” instead of “disavow” a problem?
It’s fine.
Holy f… This is probably one of the BEST seo articles I have ever read. Awesome stuff Matt. One more proof that SEO, when done right, is a pure art.
Thanks for that. Put a lot of love into this one. 🙂
Thank you for sharing this wonderful case study in such detail. Hope it helps!
Amazing case study, I learned a lot of things. Thanks for sharing this awesome case study; it will help a lot.
So when you reference the sitemap in your robots.txt with the https protocol, does Google only crawl the https pages, or will it also crawl HTTP? Actually, I want to block the bot from crawling http pages because it’s creating internal duplicate content according to Siteliner.
Or do I need to add a command to disallow http pages in the robots.txt file?
You don’t need to add the sitemap to robots if you submit it in GSC. But either way, if your site is https, keep all references https.
Hi Matt,
Great article, I’ve read many times over since you published.
Regarding links like the directories/statistical tools that sites tend to pick up over time, do you find value in disavowing even if you don’t actually see a link to your page? If they are showing up in GSC/ahrefs/etc I assume there is a reason.
Thanks,
Zach
If they link, then I’ll get rid of them.
Great job. All the SEO information in one place, in this one article.
Hey Matt, impressive work. Great case study!
An informative article for SEO, at a time when Google frequently updates its ranking algorithm.
Hi Matt,
I think I have read this article around 5 times as of now. It’s very impressive and one of the most informative articles about Google algorithmic penalty removal.
Now I have started implementing your guidelines on one of my websites, which lost traffic a month back. I started with the on-page part, and I’m glad to tell you that I have already seen improvements in the rankings.
I have few questions though:
1. Why did you start with the backlink audit? Could it be the other way around: the onsite audit first and then the backlink review?
2. While disavowing links you have mentioned the following points:
-Does it link out to many sites?
-Is the page/site authoritative? Ahrefs’ DR helps here.
a) I have backlinks coming from websites which link out to many other websites, but these links are from DA 30+ sites. Should I disavow them?
b) What’s the least DR/DA I should look for?
Thanks in advance Matt!
1) It was our best hunch on what the problem was so we started there.
2 a) These two factors definitely shouldn’t be your only criteria for deciding if you want to disavow. Also look at the traffic of the website, the types of sites it links out to, etc.
2 b) See above.
Great and very detailed post. Now we should focus more on the sites that link to our site. Thanks, Matt Diggity.
Hey, thanks for providing such a detailed case study; I learned a few important ranking factors. I had a question though. Currently, I’m more focused on creating blog posts and I’m not paying attention to my backlinks. Is that fine, or should I focus equally on building backlinks to my site?
With SEO, you need to do “all the things”.
I found your site when I was scrolling through my Facebook feed.
Then when I read the article here, you know what I was thinking?
“WOW! This is sooo good.”
Yeah, honestly, this is the best SEO tips article I have ever read.
You answered all my questions with solutions.
Please keep doing it! I’ll come back here often to read your next posts, Matt.
Hi Matt,
Can you suggest any way to remove scraped content, media files and archive pages from the index ASAP? Because if we just add a noindex tag, we have to wait until the crawler comes around again.
Do we just need to add a noindex tag to those pages, or do we also have to remove them via Search Console (the URL Removal Tool)?
I do what you mentioned.
Hey Matt, impressive work. Great case study!
Matt, you should make a video on this case study.
Reading your guidelines, I found many new factors to look into for working on my site to recover the penalty (I think so). Thank you so much for the detail. Keep helping people like this.
Hey, Matt, I always read your article. It is a Great case study!
Thanks for sharing. I will be waiting for the next case study.
Your case study is pretty impressive. I learned some new methods which I really hadn’t heard of before. Thanks for sharing such valuable information. I agree with your point about doing proper on-page work on your site. Perfect on-page optimization will no doubt improve your keyword rankings. Then the links give the keywords a push to move up to higher rankings.
Great case study, but you should make a video of this research.
When it comes to SEO. You are the best.
Oh stop…
How long these lines are! Please, can you tell me a shortcut for the SEO of my site?
thanks
sabir
Really, it is a complete guide. Maybe it would be good to offer a PDF version too.
That is a great article, really impressive work. I don’t do all of these things when a site is penalized, because in Poland, in most cases, working on the links is enough. Only sometimes do I have to work on on-page issues to fix things.
I’m also not that familiar with disavowing, but in most cases it’s the only way to clean up sh**tty links.
Very good case. In SEO, sometimes you have to take a step back in order to take two steps forward 🙂
Great article. My blog has been facing Google penalty issues for the last few weeks, and I was looking for solutions to recover my lost traffic. This is a terrific guide on the subject. Thanks for the step-by-step instructions. It’s really helpful.
Best,
Great blog post – the HotJar tool looks very promising, and it’s free for under 500 hits per day. Very detailed article; I really enjoyed reading it. Thanks.
Man, this is not a blog post. you wrote a book.
Thank you for all the information you shared.
Cheers.
Thanks a lot Matt, I’ve been studying SEO for 3 months now and have completed almost 150 tasks. And this article is just like a bible to me.
Hi, if I edit or remove duplicated titles, will the algorithmic devaluation be lifted? I’ve already finished the other optimizations.
Duplicate titles are not something that will cause an algo penalty by themselves. You’re helping out your site’s overall quality score though.
And the content? The site has original content, but also many press releases. It happened because of a layout change, but I don’t know what more to do haha.
Press release syndicated content is fine.
I discovered today that it was the HTTP to HTTPS migration causing the duplicated content issues, not us :O
Hi Matt, great article! Very useful information here.
If the Penguin hit was caused by overusing exact match keyword anchors, would you recommend changing these (to generic/brand/naked) if the webmasters can be contacted, or straight up deleting/disavowing?
Thanks
There’s (justified) tin hat thought around the concern that if you change your anchors it sounds like you control your links (which Google doesn’t like). How tin hat are you?
Hey Matt, very insightful article about backlinks and getting traffic. Quick question: instead of using anchor texts, how about “exact URL backlinks”? I mean backlinks whose anchor is just the URL. I’m guessing they should be safer, or are they considered spammy? Thanks in advance.
URL anchor types are pretty damn common.
Wow, great results. Thanks for the write-up. Was the new content you created responsible for any of the big traffic increases? Or did most flow to the original existing content that was now re-indexed?
The gains were made on the previous content which was ranking their services pages.
Nice, informative article. I am a beginner at SEO and have been looking for some actionable tips to grow my traffic, and your case study really helped me a lot. Your article is long, but well worth the read.
Thanks for sharing.
Hi,
I followed this article and made changes to the site accordingly. I also spent some money on some of the tools you advised; it wasn’t much. But the results are shining. In just under a month, rankings and traffic have boosted by up to 1,400%.
Thank you very much.
My pleasure.
Wow Great Job Matt
I just read almost the whole article. I have a website about mobile phones, and I have been doing SEO on it for the last year, but I’m not satisfied with the rankings. I’m pretty sure I’m following the technical stuff and keeping Webmaster Tools error-free, but it doesn’t work. The one weak point of my site is the hyphen (-): I have a (-) in my domain. Is it really a big deal?
That’s not a deal breaker.
Amazing post, Matt! Thank you so much.
I just found your blog and… there is so much knowledge, holy sh… I love your articles! There is just clear knowledge. I’m a total noob at SEO, so… a long guide is exactly what I like. I will stay here and wait for more content from you 🙂
What an article – you’ve really examined everything possible, congratulations.
You just did it, man. Really informative article. Using it, we are getting nice traffic for my client. Many, many thanks. 🙂
Hello Matt Diggity,
Thanks for the deep guide and for explaining each and every point. I really like it. I have one question: how much time does it take to generate sales on an ecommerce site? (The site is fresh.)
It’s going to vary, of course. Your competition level is key, and so is how quickly you can boost your own domain’s authority.
Nice case study and great suggestions and tips, thanks.
Hey buddy! How did you increase the site speed?
Blog post coming soon.
Wow Great Job Matt
I wish you had mentioned hreflang and where it should be used, so we could cover the international market and not only the regional market.
First of all, congratulations on this post. It is really awesome, but that’s what you always crank out, my friend. Great posts that we can sink our teeth into and really go to work on.
Thanks Emila!
Okay, I will join and try.
Now I can see what really happened to my blog: it went from 5k views to 100 in 24 hours. It’s been almost a year since it occurred.
So how do I really know if the algorithmic penalty has been lifted?
Dude, your articles rock. Never ever seen this type of content, you seriously rock the house. Thanks for the detailed guide!
With this case study, I become your big fan man. What a wonderful way of explaining each and every topic. Lots of love. Keen to learn from you. 😍🥰
I learned that Penguin now ignores toxic links… I have a site that lost about 60-70% of its traffic after the Dec 3 update. After reading your article, I checked my link profile and saw that about 200,000 bad links were pointing at my site, compared to my former 1,000.
What do you think could be the way forward in this situation?
Rick Lomas (https://www.indexicon.com/) is my go-to guy for removing toxic links
Love what you are teaching