How to 91x Website Traffic – A Case Study Blueprint for 2024

Table of Contents

- Case Study: From 1,036 to 95,411 Organic Visitors Per Month
- The Challenge
- General Approach
- How We Did It
- The Dynamic Start: Backlink Review
- Crucial Stage 2: The Onsite Audit
- Another Discovery: Potential Cloudflare Issue
- Campaign Goals Breakdown
- Success: Algorithmic Penalty Lifted
- More Technical Discoveries
- Ongoing Work: Link Building
- More Adjustments: Keyword Relevance and Cannibalization
- Content Improvements
- Conversion Rate Optimization: Increased Leads by 7.5x
- The Results (AKA Traffic Boner)
- Organic Traffic Improvements
- Conclusion
- And one more thing…

Every once in a while you run an SEO campaign that changes the way you do everything. The lessons you learn, the challenges you face, and the results you achieve inspire you to rewrite your whole SEO gameplan. This is the story of one of those SEO campaigns.

As you might already know, I'm a director of a very talented SEO agency called The Search Initiative (TSI). Since coming on, we've encountered many wins, and this case study is one of them. In a few months, we lifted the client's algorithmic penalty and increased traffic by 9,109%. You're about to learn the exact steps we took to achieve this.

You'll learn:

- A detailed onsite, offsite, and technical SEO audit process
- How to repair algorithmic penalty problems
- A safe link building strategy
- Conversion rate optimization strategies for fast growth

Fair warning: the strategies detailed ahead are intense, but worth it. Here's the success one reader found after following this case study:

Case Study: From 1,036 to 95,411 Organic Visitors Per Month

This is the story of a campaign for a social media marketing website. Our client monetizes their website by selling monthly subscriptions that help customers build better social proof on Facebook, Instagram, and other social networks. If you've ever worked in this niche, you'll know it's not an easy one. It's one of the hardest niches there is.

The Challenge

The client joined The Search Initiative with a heavy algorithmic penalty.
Traffic had decreased to almost 1/10th of its previous volume. If you've ever had an algorithmic penalty, you'll know the frustration and annoyance of such a disaster. The main challenge was to determine what type of penalty had hit the site and to take action on getting it lifted.

General Approach

We started by thoroughly analyzing the data, based on the tools available to us and the details provided by the client. The initial analysis included looking into:

- Google Analytics
- Google Search Console
- Keyword tracker (Agency Analytics)
- SEMrush
- Ahrefs
- Cloudflare
- Server settings
- Previous link building reports and audits

Once we determined the most probable cause of the penalty, we put together a plan of action. We created a comprehensive onsite, offsite, and technical audit before building the overall domain authority through our own link building strategies and traditional outreach to relevant blogs and sites.

How We Did It

The Dynamic Start: Backlink Review

The link profile of the domain included a lot of spammy, low-value domains. Since an automated backlink audit (most probably done using Link Research Tools) had been performed before the client joined our agency, we started by reviewing its results. At TSI we know that when it comes to potential link penalties, especially algorithmic ones, we have to be very thorough with link reviews.

To start the analysis, we downloaded all the link data from the following sources:

- Google Search Console – it's a real no-brainer to include all the links that Google definitely has in its database. However, according to this Google Webmaster Help page, you have to remember that GSC presents only a sample of links, not all of them.
- Ahrefs – our go-to and best third-party tool when it comes to links. Their database is an absolute beast, and the freshness of the data is also outstanding. To gather all link data, go to Ahrefs, type in your domain, and select Backlinks.
Now you're good to Export it to an Excel file. By the way, make sure you select the Full Export option; otherwise, you'll be exporting only the first 1,000 rows with the Quick Export.

- Majestic – even though their crawler might not be as complete as Ahrefs', you still want to have as many link sources as possible for your audit. With Majestic, type in your domain → select "Root Domain" → Export Data. Because of link memory (AKA ghost links – links that have been deleted, but that Google still "remembers"), we export the data from both the Fresh and Historic indexes. Also, be sure to set the tool to "Show deleted backlinks".
- Moz and SEMrush – similarly to Majestic, with these two we just want to gather as many links as possible and complement the database, in case Ahrefs missed some.

How to get link data in Moz Open Site Explorer: Your site → Inbound Links → Link State: All links → Export CSV

How to get link data in SEMrush: Your Site → Backlink Analytics → Backlinks → Export. Please make sure to select the "All links" option.

We now had all the data, so it was time to clean it up a bit. There's no real secret to using Excel or Google Sheets, so I'll just list what you'll have to do with all the link data prior to analyzing it:

1. Dump all Ahrefs data into a spreadsheet. If you're wondering why we start with Ahrefs, it's explained in step 4.
2. Add unique links from GSC to the same spreadsheet.
3. Add unique links from all other sources to the same spreadsheet.
4. Get Ahrefs UR/DR and Traffic metrics for all the links (the Ahrefs data will already have these metrics, so you're saving time and Ahrefs credits).
5. Spreadsheet ready!

With the spreadsheet, we started the very laborious process of reviewing all the links. We classify them into 3 categories:

- Safe – these are good quality links.
- Neutral – these are links that are somewhat suspicious and that Google might not like that much, although they're quite unlikely to be flagged as harmful.
We always highlight these in case we need to re-run the link audit (for example, if the penalty does not get lifted).
- Toxic – all the spammy and harmful stuff you'd rather stay away from.

Some of the main criteria we're always checking:

- Does it look spammy/dodgy AF?
- Does it link out to many sites?
- Does the content make sense?
- What is the link type (e.g. comment spam or sitewide sidebar links would be marked as toxic)?
- Is the link relevant to your site?
- Is the link visible?
- Does it have any traffic or rank for any keywords? Ahrefs' data helps here.
- Is the page/site authoritative? Ahrefs' DR helps here.
- What's the anchor text? If you have an unnatural ratio, you might need to disavow some links with targeted anchor texts.
- Is the link follow or nofollow? No point disavowing nofollow links, right?
- Is it a legit link or one from a scraper/statistics tool?
- Is it a link from a porn site? These are only desirable in specific cases – for example, if you're a porn site. Otherwise, it's disavow time.

If it's likely that a whole domain is spammy, we disavow the entire domain using the "domain:" directive instead of just a single URL. Here's a sneak peek of what the audit document looked like once we finished reviewing all the links:

Then we compared the results of our audit against the current disavow file and uploaded a shiny new one to Google Search Console. We disavowed 123 domains and 69 URLs. Additionally, we used our in-house, proprietary tool to speed up the indexing of all the disavowed links – something quite similar to Link Detox Boost, but done through our own tool. Here's a little screenshot from our tool:

Crucial Stage 2: The Onsite Audit

The next step was a full, comprehensive onsite audit. We reviewed the site and created an in-depth 30-page document addressing many onsite issues.
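Before moving on from the backlink review: the Safe/Neutral/Toxic classification ultimately feeds a disavow file, and that final step is easy to automate. Here is a minimal Python sketch of turning a classified link sheet into disavow entries. The tuple layout and the `build_disavow` helper are assumptions made for this example, not TSI's actual tooling; adapt them to however your own audit spreadsheet is structured.

```python
# Hypothetical example: generate disavow-file lines from a classified
# link audit. Each row is (url, classification, scope), where scope is
# "domain" when the whole site is spammy, else "url".
from urllib.parse import urlparse

def build_disavow(rows):
    lines = []
    for url, classification, scope in rows:
        if classification != "toxic":
            continue  # only Toxic links go into the disavow file
        if scope == "domain":
            # disavow the whole domain with the "domain:" directive
            lines.append("domain:" + urlparse(url).netloc)
        else:
            lines.append(url)
    # de-duplicate while preserving order
    return "\n".join(dict.fromkeys(lines))

# Made-up audit rows for illustration only
audit = [
    ("http://spam-blog.example/post-1", "toxic", "domain"),
    ("http://spam-blog.example/post-2", "toxic", "domain"),
    ("http://decent-site.example/review", "safe", "url"),
    ("http://scraper.example/stats/example.com", "toxic", "url"),
]

print(build_disavow(audit))
# domain:spam-blog.example
# http://scraper.example/stats/example.com
```

Note how the two toxic URLs on the same spammy domain collapse into a single `domain:` entry, which is exactly why domain-level disavows keep the file short and robust against new spam pages appearing on the same site.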
Below is a list of elements covered in the audit:

Technical SEO

Website Penalties

First, we confirmed what the client had told us and established what kind of penalty we were dealing with. It has to be emphasized that there were no manual actions reported in GSC, so we were dealing with a potential algorithmic penalty.

We searched Google for the brand name and did a "site:" operator search. If you were once able to find your brand name ranking number 1 in Google (or at least among your other profiles, e.g. social media accounts, on the first page) and it's no longer there, you know you're in trouble. Basically, if Google devalues or de-ranks you for your own brand, that is a very strong indicator that you've been hit with a penalty.

With the site: operator search it's a bit trickier. However, as a rule of thumb, you should expect your homepage to show as the first result returned for a simple "site:domain.com" query in Google.

Another way of confirming content devaluation is to copy and search for a fragment of the text on your core pages. In the example below, I do a Google search of 2 sentences from