If you aren’t keeping every aspect of your website’s SEO performance in tip-top shape, you’ll struggle to see much organic growth.
Every SEO campaign is different – some websites require more focus on just one or two of the core components of SEO (content, backlinks, and technical factors) whereas others may require optimization and improvements across the board.
These SEO issues can be discovered by carrying out an SEO audit of your website which can uncover a whole range of action points that need to be addressed.
That’s why it’s always essential to audit, analyze and optimize what’s already on your site.
In this case study, you’ll learn the exact steps that my team at The Search Initiative took to increase our client’s organic traffic by 90.97%.
In this article, you’ll learn how to:
- Prune your backlink profile by identifying a potential negative SEO attack.
- Optimize your existing content for low-hanging keywords.
- Identify target URLs and anchor texts when building new backlinks for maximum ROI.
- Fix internal 301 redirects on your website to improve page performance and user experience.
- Improve usability by implementing breadcrumb navigation.
Before that, let’s find out a bit more about the website’s goals and the main challenges faced during the campaign.
Table Of Contents
- The Challenge
- Pruning Your Link Profile With A Backlink Audit
- Optimizing Content For Low Hanging Keywords
- Strategic Link Building
- Fixing Internal 301 Redirects
- Improving Usability With Breadcrumb Navigation
- The Results
The Challenge
Before joining The Search Initiative, the site was struggling to break into the first page of the search results for many important keywords. Therefore, the main goal of this campaign was to grow the site’s organic traffic with a focus on optimizing the editorial content.
The client is a real estate website targeting people who want to rent and/or buy properties in Southeast Asia.
This site saw a spike in referring domains. This was an attempted negative SEO attack, which is when a competitor intentionally attempts to sabotage your SEO efforts by building many poor-quality backlinks. If you believe your site has been deliberately attacked in this way, it’s best to audit and tidy up your link profile – read on to learn how to do this.
The website had a lot of content, with many articles that were 10k+ words long. However, there were lots of keywords that these pieces of content were struggling to rank for. In cases like this, you should carry out a content optimization strategy that focuses on improving these long-form pieces for low-hanging keywords that were ranking just outside of the first page.
Finally, we identified two core technical drawbacks facing the website: hundreds of internal redirects and missing breadcrumb navigation.
For a real estate website with hundreds of listings across multiple cities and locales, missing breadcrumb navigation meant an unnecessarily poor user experience: it made it much harder for visitors to find their way around the site.
Find out how you can overcome these challenges for your website by following the steps below.
Pruning Your Link Profile With A Backlink Audit
Your backlink profile is like a tree.
Every now and then, you'll want to snip off a few faulty branches (low-quality backlinks) to keep the rest of the tree (your link profile) healthy and free of spammy links.
This is a procedure you should carry out periodically, checking only the most recent links pointing to your site. But sometimes the number of backlinks may suddenly shoot up, which can signify foul play.
Google is able to ignore spammy links in most cases, but that means some might still slip through.
If you get 1,000 spam links, how many of them weren't ignored? 50? 100?
You’ll find out how to analyze the quality of a backlink pointing to your website later, but first, let’s see how you can identify whether your site’s link profile has seen unnatural growth, as described above.
Identifying A Negative SEO Attack
What Is A Negative SEO Attack?
Your backlink profile may be the victim of a hostile SEO attack where your competitors (or another entity) purposely build hundreds, if not thousands, of unnatural, poor-quality links towards your site.
Such an attack aims to trigger a Google penalty (or manual action) so that you lose rankings.
If this happens to you, you will want to conduct a backlink audit to identify and disavow (ask Google to ignore) these malicious links.
How To Identify A Potential Negative SEO Attack
To manually identify a potential negative SEO attack on your website’s backlink profile, you can use the Ahrefs Site Explorer tool.
- Enter your domain into the tool.
- Scroll down to the Referring Domains graph.
If you see a sharp spike like the one above, your site has likely been the target of a negative SEO attack.
As mentioned above, Google’s algorithms are getting better at identifying and ignoring poor-quality backlinks – but they aren’t perfect, so you’ll still need to cover your bases to make sure that your tree doesn’t have any faulty branches.
This unnatural link velocity isn’t ideal – so it’s still worth seeing which links you can preemptively tell Google to ignore by disavowing them.
Top tip: you can set up alerts on Ahrefs to automatically monitor new (and lost) referring domains to your website. This will enable you to quickly spot any unnatural increases (or decreases) within your backlink profile and allow you to act sooner to prevent any potential SEO damage to your site’s performance.
Head over to: Alerts > Backlinks > New alert > Enter domain > New backlinks > Set email interval > Add
Once you’ve identified that your site’s seen an unnatural links spike, the next step is to identify the bad links.
It’s also worth noting that the spike may actually be a good thing. For example, one of your articles may have gone viral, so you may have naturally received a bunch of links.
Either way, here’s how to identify whether these links are good or bad for your SEO.
Investigating A Potential Negative SEO Attack
Here’s a step-by-step breakdown of how to investigate a potential negative SEO attack using Ahrefs Site Explorer:
- Click the Backlinks report to view all the backlinks pointing to your website.
- Switch the mode to “One link per domain”. This will reduce the noise as you don’t want to analyze every single backlink.
- Click the Dofollow filter. These are links where Google is instructed to “follow” the URL that is being hyperlinked. In contrast, nofollow links are not followed by the search engine’s crawlers.
- Click the New backlinks filter.
- Select the period when the spike occurred.
- Sort the results by Domain traffic (descending order).
When filtering these results, you’ll likely see some patterns cropping up.
Let’s go through some of the most common culprits regarding links-based negative SEO attacks and how to spot them.
Patterns To Look Out For: The Usual Suspects
When identifying faulty backlinks from a negative SEO attack, there are a few usual suspects that you can look out for as giveaways for potential foul play.
Why? Because these types of links are extremely cheap and easy to get in large quantities – attackers rarely bother with anything more sophisticated.
Remember, most of these websites aren’t legitimate. They’re often built especially for these tactics and won’t have any SEO value.
Blogspot domains are incredibly cheap and easy to build. This makes them perfect for webmasters to exploit and use to build spammy links to your domain.
You can identify these quickly by clicking on “More filters” on the top right of the Backlinks report.
Select Domain name and type in “blogspot”.
Click Apply and Show results.
You’ll now see all backlinks from Blogspot domains.
In this example, there are 66 poor-quality Blogspot domains pointing to this website within the specified timeframe.
In most cases, adding a listing to a web directory is free. Again, this makes it easy for spammers to build hundreds, if not thousands, of links on irrelevant and/or suspicious-looking directories.
They generally look something like this:
Find potential spammy web directories by filtering the backlinks using the same method above, but instead, search for domains that contain “directory”.
Comment spam usually results from automated software being used to place lots and lots of comments on blogs or forums towards a particular website.
These links generally use exact match anchors, i.e. the clickable text of the link is a keyword you are likely targeting.
This can be problematic because a high number of keyword-rich anchor texts within your link profile will likely look unnatural in Google's eyes, which could lead to your website receiving a manual links-based penalty.
Here’s an example of a forum link taken directly from Google’s guide on Link Schemes.
Scroll down to the Anchors report right at the bottom of the Site Explorer tool on Ahrefs – this shows you an overview of the most commonly used anchor texts to link to your website.
If you spot lots of exact match (keyword-rich) anchor texts linking to you, it’s likely that these could be comment or forum spam links.
In some cases, you may even find some extremely unnatural or even suspicious anchors used to link towards you. These are much easier to spot as they’ll likely have nothing to do with your website.
Next, click on “View full report” to dig a little deeper.
This is because the report only shows you the top 10 most common anchors (see screenshot below) – and there may be lots more!
Here, you want to sort by "/ dofollow" links, because dofollow links are the kind that specifically instruct search engine bots to "follow" the link.
Links marked nofollow, on the other hand, are heavily devalued by Google's crawlers.
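For context, a nofollow link is just a regular anchor tag with a `rel` attribute added:

```html
<!-- Followed by default: crawlers treat this as a normal link -->
<a href="https://example.com/">dofollow link</a>

<!-- The rel="nofollow" attribute hints to crawlers not to follow
     or count this link -->
<a href="https://example.com/" rel="nofollow">nofollow link</a>
```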
Click on Details, then Referring domains, to explore further.
In most cases, a website will only link to you once – so if you see that a site is linking to you a lot, it’s a strong indicator that something’s not quite right.
Expand further by clicking on Backlinks.
This Brazilian website has linked to the target domain 41 times using the same anchor.
Manually checking the backlinks/domains will confirm whether these links are generated by spam comments.
Important: Click at your own risk. Remember, these websites are often spammy, so be careful where you click through.
Lo and behold, in this example, there are over 12,000 comments on one of the articles from the Brazilian website.
Not even the best article in the world will get this many comments. It’s a clear example of comment spam.
Another typical pattern to look out for is whether your referring domains share the same IP addresses.
If they do, it’s a sign that these sites are all hosted in the same location and, by extension, are likely owned/controlled by the same person.
Use the Referring IPs report in Site Explorer on Ahrefs to check this.
Remember, it’s normal to have links from a handful of sites that share the same subnet (network of domains). What’s fishy is when you’ve got hundreds or perhaps thousands of referring domains from a single subnet – like the above.
Expand the IPs and domains.
It’s those pesky Blogspot domains again!
Disavowing Spammy Links
Once you’ve analyzed your link profile and identified which domains are problematic, the next step is to make sure that they won’t cause your site any (further) harm.
That’s where the disavow file comes into play.
Now, some people think that uploading a disavow file somehow gets you added to a watchlist – this is far from the truth.
Google understands that websites will receive poor-quality backlinks, whether through your own link building efforts or not. The disavow feature is simply a way for you to tell Google which of these links you want ignored.
What Is The Disavow File & When Should You Use It?
The disavow file is a document you can submit to Google containing a list of the linking pages (or domains) you don’t want to count towards your site’s backlink profile. In other words, you’re telling them to ignore these links.
Google states that you should use this feature only when the following two criteria are met:
- There are a considerable number of spammy, artificial, or low-quality links pointing to your site,
- These links have resulted in your website receiving a manual action, or you feel they will likely trigger a manual action on your site.
Creating Your Disavow File
Here are some things you need to know when putting together your list of spammy links:
- The file must be a text file encoded in UTF-8 or 7-bit ASCII (i.e., a .txt file).
- Each line should represent a single domain or URL.
- To disavow all links from a single domain, use the following prefix: domain:
- If you're disavowing a URL, it cannot exceed 2,048 characters.
- Your file cannot exceed 100,000 lines (this includes blank lines and comment lines).
- Your file size must be less than 2MB.
- You can add comments by starting the line with a hashtag (#). These will be ignored by Google.
Here’s an example of a disavow file:
Submitting Your Disavow File
Once you’ve created your list, you can submit it via the Disavow Links Tool.
For each property of your website (i.e. http://, https://, http://www. and https://www.), upload the disavow file you’ve created by clicking “Upload disavow list”.
Already uploaded a previous list?
You’ll need to download it, update it with the new URLs/domains, and then reupload it.
Once you’ve uploaded the file, it can take up to a few weeks for Google to incorporate your list into its index.
Remember, conducting a link audit isn’t something you do once and forget about. It’s essential to keep an eye on your backlink profile to maintain its health. Prune that tree!
Find out how to conduct a full backlink audit in more detail here.
Optimizing Content For Low Hanging Keywords
Unless you’re starting a brand new website from scratch, you don’t always need to write fresh content to rank for more keywords.
You’re likely already ranking just outside the first page for several search queries and, as a result, are sitting on a goldmine of potential organic traffic.
These kinds of keywords are referred to as "low-hanging fruit".
This is because they’re already ranking relatively well, which suggests that Google already likes the content, in which case it’s likely that with some optimizations (like the ones outlined below), you will be able to break into the top positions.
Think of it this way… Everyone else is trying to grab the juiciest apples that are harder to get as they’re at the top of the tree (i.e. they’re trying to rank for broader, highly competitive terms like “flower delivery”). So, you can go after the low-hanging apples that are easier to grab but are still pretty sweet (i.e. by targeting “flower delivery in [city]”).
Our client already had a bunch of articles (that were over 10k words long) published on their site, so instead of focusing on writing brand new articles, we identified low-hanging keywords that these pages were already ranking for and optimized them.
How To Find Low Hanging Keywords
A great way to find low-hanging keywords is via Ahrefs Site Explorer.
- Enter your domain (or selected URL) into Site Explorer and open the Organic Keywords report. This will show you the keywords you're currently ranking for. Make sure that you're viewing keywords for your target location. In this example, it's the United Kingdom.
- Open the Position drop-down, click 11-20 and then Apply.
- Click Show results.
- You now have a list of keywords that have been ranking from positions 11 to 20.
If you sort by Search volume, you can see that there are still some broad search terms here with a very high Keyword Difficulty score.
These aren’t the ideal “low hanging” keywords you’re after, as they’ll still be challenging to rank for.
- Filter the results further so you’re left with search queries with a Keyword Difficulty of 25 or less. Then click Show results to apply this new filter.
Now you have a new list of less competitive keywords that are easier to rank for, according to Ahrefs.
- Go through these keywords to see if you can spot any terms that can be grouped (or if you’ve entered your domain, then spot keywords that are ranking on the same page).
In this case, I spotted keywords related to “florists in Watford”, which suggests that this page has a bunch of potential keywords to capitalize on.
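If you've exported the keyword report to CSV, the same filtering can be scripted. This is a minimal sketch – the column names ("Keyword", "Position", "Difficulty") are assumptions, so adjust them to match your actual export.

```python
import csv

def find_low_hanging_keywords(csv_path, max_difficulty=25):
    """Filter an exported keyword report down to 'low-hanging' terms:
    ranking in positions 11-20 with a Keyword Difficulty of 25 or less.

    Column names are assumptions -- check them against your export.
    """
    results = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            position = int(row["Position"])
            difficulty = int(row["Difficulty"])
            if 11 <= position <= 20 and difficulty <= max_difficulty:
                results.append(row["Keyword"])
    return results
```

From here you'd eyeball the output for keywords that group onto the same page, exactly as described above.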
Now that you’ve identified the low-hanging fruit, it’s time to optimize the content.
How To Optimize Your Content For Low Hanging Keywords
To optimize your content for low-hanging keywords, you can use Surfer’s Content Editor tool to compare your content with the top-ranking competitors.
- Create a New Surfer draft – Enter the low-hanging keyword in the search bar. If you’re adding more than one, separate them with a comma “,”.
Select the target location (in this example, it’s “United Kingdom”).
Switch the toggle Import content from URL and paste in the URL of the page you want to optimize.
Click Create Content Editor.
You’ll then see a page that looks like this.
The tool has analyzed your content and compared it against the top-ranking competitors of your target location by looking at things like:
- The density of common phrases, i.e., how often important terms and phrases are mentioned on your page compared to the competition
- The number of headings
- The number of paragraphs
- The number of images
Knowing this information allows you to compare your content with your competition and see opportunities for improvement.
The left-hand side is where you can optimize your content, and the right is where you’ll see Surfer’s analysis and suggestions.
- Check the competition – Before optimizing your article, you should manually compare your content against the top rankers because this will help you identify potential content gaps.
Click “BRIEF” and then open the list of competitors.
- Have they included additional information that may be useful for the end user, e.g., FAQs, customer reviews, etc.?
- Have they included other forms of content to enrich the user's experience, e.g., images, videos, animations, graphics, etc.?
Once you’ve identified the key differences, you’ll want to start optimizing your page’s content to mirror that of your competition.
- On-Page Optimization – Apart from optimizing the main body of the content, you’ll also want to ensure that other on-page elements are optimized for your target keyword(s).
- Page Title – ensure that the page title is engaging, describes what the page is about, and includes the primary keyword you want to rank for.
For example, a better-optimized page title for this page could be: “Watford Flower Delivery & Florists – Arena Flowers”.
- H1 Heading – the H1 heading should grab the user’s attention, summarize the page’s contents, and ideally include the core keyword(s) you want to rank for.
This H1 heading could be optimized for “florists” related terms by tweaking it to: “WATFORD FLOWER DELIVERY & FLORISTS”.
- Meta Description – while it’s not a direct ranking factor, the meta description can help to convince users to click on your website instead of your competitors.
Therefore, you should aim to describe the page in a little more detail (think of it as an extension of your page title) and try to fit in some keyword(s) and phrases in a natural way.
In this case, the description could be optimized to include important terms and phrases like “florists”, “watford”, “flower delivery” etc. Doing so is important because Google formats words from the user’s search query in bold.
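Putting the pieces together, here's roughly how those optimized elements might look in the page's HTML. The title and H1 wording come from the examples above; the meta description text is an illustrative assumption, not the client's actual copy.

```html
<head>
  <!-- Title: primary keywords + brand, as suggested above -->
  <title>Watford Flower Delivery &amp; Florists – Arena Flowers</title>
  <!-- Meta description: not a direct ranking factor, but terms matching
       the user's query are shown in bold in the SERP snippet -->
  <meta name="description"
        content="Order flower delivery in Watford from local florists.
                 Same-day and next-day delivery available.">
</head>
<body>
  <h1>Watford Flower Delivery &amp; Florists</h1>
</body>
```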
- Don’t rely on the score – As you fulfill the content gaps and optimize the content by increasing/decreasing the frequency of specific terms, you’ll notice that your content score will increase (or decrease) too.
For example, I edited one of the headings to include "FLORISTS IN", and the score increased by 2 points.
Reaching a perfect score of 100 isn’t as important as ensuring you’ve covered the main topics in a way that will be useful for your audience. Anything above 80 is considered to be a solid piece of content.
Performing such optimizations on Surfer SEO can help you climb into the first page of the search results.
You can see that the number of keywords our client is now ranking for within the top 10 search results increased by 181% in the past year, from 2,010 to 5,665.
Strategic Link Building
Link building is one of the most powerful SEO tactics you can employ to help boost your rank in the search results.
While the domain that you’re getting the link from is important, you first need to answer:
- What page is best to link to?
- What anchor text will you use?
And that’s what you’ll find out in the next section…
Target URL Selection
Both of the above questions are underpinned by another: Which keyword are you trying to improve your rankings for?
This is easy to answer if you’ve just optimized the content on a specific page for a low-hanging keyword.
Why? Because once you’ve optimized the content, the next step is to give the target page a boost with some backlinks.
Apart from low-hanging keywords ranking just outside the first results page, you may want to target pages that have seen drops in keyword rankings and traffic.
In this case, you can use Ahrefs’ Site Explorer and Top pages report.
Select the Declined option from the Traffic and Keyword drop-down menus.
Then, click Show results and scroll down to the results.
Filter the results by looking at all of the pages that have seen declines over the Past 3 months (or whichever period you prefer).
Under the Changes drop-down, select Percentage – this will help you see which pages have seen the biggest drops.
You can then sort the results by Traffic or Keyword change. In this case, I’ve gone with Traffic to see which of these pages are currently bringing in the most traffic.
The Change column next to each represents the change in traffic (or the change in the number of keywords the page was ranking for) between the selected dates.
The results show that the Valentine’s Day page currently sees the most traffic out of these, even though the top keyword is only ranking in position 14.
Considering that Valentine’s Day is six months away, this may be the perfect time to start building backlinks so that by the time people scramble to order flowers for their loved ones a week before the big day, the page is sitting nicely atop the search results.
You may also want to look at keywords where you’ve completely dropped off the top 100 positions.
Look out for the Lost label under the Status column to identify these.
Apply the same steps to identify if any important pages (or keywords) to your business or website need a boost from link building.
Anchor Text Selection
Once you’ve identified the target page, you’ll want to consider the anchor text(s) that you’ll use for your backlinks.
The anchor text is the clickable part of the hyperlink that will take users to your target URL.
Anchor text selection can make or break a page’s SEO performance as it’s something that Google’s algorithms – especially Penguin – pay close attention to.
Finding The Average Anchor Text Distribution Of Your Competitors
- The first step in selecting your anchor(s) is to look at the anchor text distribution of the top ranking competing pages for the primary keyword of your target URL – in this example, it’s “valentine’s day flowers”.
This is because you’ll want to replicate the anchor text distribution of what Google rewards for this keyword to your website.
To grab the competing pages, do a Google search for your keyword or click on the top keyword from Ahrefs’ Top pages report.
Scroll down to the SERP overview and make a note of the top 5 ranking pages.
Also, glance at the number of backlinks these pages have – this will give you a rough idea of how many backlinks you may need in order to compete with them.
But with the right anchor selection, you can do so with fewer links.
- Put each URL into Ahrefs Site Explorer and follow these steps to download their anchor text data.
- Click on the Anchors report and click Export.
- In the spreadsheet, add an extra column, "Anchor Type", and give each anchor a label based on the following:
- Targeted – keyword-rich anchor texts, e.g. "valentine's day flowers"
- Topical – anchors related to your topic, e.g. "get your loved one a gift"
- Branded – anchors that include your brand name, e.g. "Prestige Flowers"
- URL – anchors that are listed as URLs, e.g. "prestigeflowers.com"
- Miscellaneous – e.g. "click here"
- N/A – any other anchors, e.g. "no text"
Your sheet should look something like this:
- To get the anchor text distribution, turn this data into a pie chart in Excel by highlighting the Anchor type data, clicking on Insert, then clicking on the pie chart icon.
You’ll have something like this:
- Repeat Steps 2 – 5 for each of the remaining 4 competing pages and your own page.
- Calculate the average anchor text distribution for all 5 competitors so that you have something like this:
You now have an estimation of what kinds of anchors you need to use in order to match your competition.
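The labeling and averaging steps above can also be done in a few lines of code instead of pie charts. This is a sketch that assumes you've already assigned each anchor one of the labels described earlier (the category names mirror that list):

```python
from collections import Counter

# Categories mirror the anchor labels described above
CATEGORIES = ["Targeted", "Topical", "Branded", "URL", "Miscellaneous", "N/A"]

def anchor_distribution(labeled_anchors):
    """Turn a list of (anchor_text, category) pairs -- the labels you
    assigned in the spreadsheet -- into percentage shares per category."""
    counts = Counter(category for _, category in labeled_anchors)
    total = sum(counts.values())
    return {c: round(100 * counts.get(c, 0) / total, 1) for c in CATEGORIES}

def average_distribution(distributions):
    """Average the per-competitor distributions to get a benchmark."""
    return {
        c: round(sum(d[c] for d in distributions) / len(distributions), 1)
        for c in CATEGORIES
    }
```

Run `anchor_distribution` once per competitor (and once for your own page), then feed all of them to `average_distribution` to get the benchmark you'll aim for.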
In the above example, the competition has a much higher distribution of targeted and topical anchors than “My Site”. This suggests we can be more aggressive during the anchor text selection process by using more keyword-rich anchors.
Check out this article for more tips on selecting and optimizing your anchor texts.
And if you prefer videos, check out the one below.
Fixing Internal 301 Redirects
If you’ve ever had to change the URL of a page or migrate your entire website to a new domain, you’ll likely have implemented a 301 redirect.
But one crucial thing often overlooked after implementing 301 redirects is remembering to update all of the internal links on your website that used to point to the old URLs with the new URLs.
Before diving into the strategy on how to find and fix this issue, let’s briefly explore what 301 redirects are and why they’re important for SEO.
What Are 301 Redirects & Why Are They Important For SEO?
301 redirects are a way to indicate that a URL has been permanently moved to a new location.
They’re usually used for when you want to:
- Change the URL of a page or subfolder, e.g. yourdomain.com/blogposts/ to yourdomain.com/blog/
- Move a subdomain to a subfolder, e.g. blog.yourdomain.com to yourdomain.com/blog/
- Migrate your entire website to a new domain.
- Switch your website from HTTP to HTTPS.
- Switch your URLs from www. to non-www. and vice versa.
- Merge two different websites and want to ensure that the links to any outdated or deleted URLs are redirected to the correct (or most relevant) pages.
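As an illustration, a 301 can be implemented in a couple of lines of server configuration. The sketch below assumes an Apache server with mod_alias enabled; the paths and domains are made-up examples:

```apache
# Permanently move a single page
Redirect 301 /blogposts/my-article/ https://yourdomain.com/blog/my-article/

# Permanently move a whole subfolder using a pattern
RedirectMatch 301 ^/blogposts/(.*)$ https://yourdomain.com/blog/$1
```

Other servers (e.g. Nginx) and most CMS platforms have their own equivalents.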
301 redirects are great for SEO because they pass the original URL’s PageRank (a Google formula that calculates the ranking power of a particular page based on the quantity and quality of its backlinks) to the new URL.
Google’s Gary Illyes confirmed this in 2019:
How Internal Redirects May Be Hurting Your SEO
Imagine you’ve recently redirected your blog articles so that they’re now a part of a subfolder instead of a subdomain i.e., you now have yourdomain.com/blog/ instead of blog.yourdomain.
But, you forgot that you have instances where other pages on your website have an internal link (a hyperlink from one page on your website to another) that points to the old blog URLs.
Remember, 301 redirects add an extra “step” in the process of loading the desired page because the server first tries to load the old page, sees the 301 redirect, and then tries to load the new URL.
This means it takes more time to load the content for the user and for search engines like Google to access the page.
The time delay caused by a single redirect may seem trivial, but if you have thousands of redirected pages on your website, this can quickly add up. As a result, Googlebot may not be able to crawl as many pages on your website, and users may experience delays accessing your content – especially if redirect chains are involved.
A redirect chain is where you have a series of redirects between the start and final URL. For example, Page 1 redirects to Page 2, which then redirects to Page 3.
Therefore, you want to minimize the number of times your server has to make that additional step as much as possible.
To do this, you need to find all instances where you’ve added internal links to pages that have a 301 redirect, and update them so that they bypass the redirect and link directly to the final page.
How To Find & Fix Internal Redirects
Below, you’ll find out how to uncover internal redirects using Screaming Frog – but you can also achieve the same results by using any other site crawler of your choice, such as Sitebulb, DeepCrawl, Ahrefs’ Site Audit, etc.
To run an audit on Screaming Frog:
- Copy your website’s homepage URL exactly as it appears if you loaded it on a web browser and paste it into the crawler.
- Hit Start
- Click Bulk Export > All Inlinks and save the file in your desired location.
- Open the exported CSV file and widen the header columns so you can read everything clearly.
- Apply a filter to the Status Code column, deselect all options except 301, and click OK.
- Now apply a filter to the Destination column and click on Text Filters > Contains.
- Type in your domain name (in this example, it's serenataflowers.com) and click OK.
- You now have a list of internal links that have been 301 redirected to other internal pages on your website.
- (Optional) Tidy up the spreadsheet so that you only have the following columns left:
- Source – the page where the internal redirect exists.
- Destination – the final URL in the redirect sequence.
- Anchor – the textual element of the hyperlink. This is what you’ll use to find the internal link.
You should have something that looks like this:
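If you'd rather skip the spreadsheet filtering, the same steps (keep only 301s whose destination is internal) can be scripted against the export. This is a sketch only – the column names ("Source", "Destination", "Status Code", "Anchor") are based on Screaming Frog's All Inlinks export and may differ between versions:

```python
import csv

def internal_redirects(csv_path, domain):
    """Pull internally-linked 301s out of a crawler 'All Inlinks' export.

    Column names are assumptions based on the export format --
    check them against your own file before running this.
    """
    hits = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["Status Code"] == "301" and domain in row["Destination"]:
                hits.append({
                    "Source": row["Source"],
                    "Destination": row["Destination"],
                    "Anchor": row["Anchor"],
                })
    return hits
```

Each entry in the output corresponds to one internal link that needs updating.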
- Using the anchor texts, review the links on the Source pages and edit them so that they point to the Destination.
Here’s an example:
- Source: https://www.serenataflowers.com/en/uk/flowers/next-day-delivery/yellow
- Destination: https://www.serenataflowers.com/en/uk/flowers/next-day-delivery/florists/online/england/hampshire/southampton
- Anchor: Southampton
Here’s the link on the page:
And here it is again, but on the source code:
You can see that the current link points to: https://www.serenataflowers.com/en/uk/flowers/next-day-delivery/florists/online/england/hampshire/southampton (without a trailing slash), which redirects to https://www.serenataflowers.com/en/uk/flowers/next-day-delivery/florists/online/england/hampshire/southampton/ (with a trailing slash).
The omission of the trailing slash at the end of the URL is what’s causing the unnecessary redirect!
Notably, the link to the Southampton page on this website is on the footer – which means that it’s likely that a significant number of other pages also have the same problem.
Now, consider the fact that this website also has links to 19 other city pages in the footer, and if each of those also has the same issue… that’s 20 unnecessary redirects that the server has to make per page.
Another common cause of an unnecessary redirect is between www. and non-www. URLs.
As you’ve seen from the above example, the number of internal redirects can grow pretty quickly, which is why it’s important to ensure that you keep them to a bare minimum by always linking directly to the destination URL.
Doing so has the benefit of reducing server strain (fewer requests need to be made by the server) and improving usability as it means the content loads quicker.
Improving Usability With Breadcrumb Navigation
Have you ever gone down the rabbit hole of a website and been unable to find your way back to where you started?
Breadcrumb navigation can help with that.
Breadcrumb navigation (or breadcrumbs for short) is a line of contextual links usually displayed towards the top of a page indicating the user’s location on the website.
For example, here’s what the breadcrumb navigation looks like on the Hollister website:
The idea is to add a secondary navigation that allows the user to trace their path within your website’s page hierarchy by clicking on any of the elements within the path.
Benefits Of Using Breadcrumbs For SEO
Implementing breadcrumb navigation has the following user benefits:
- Improved Experience – Whether it’s IRL (in real life) or online, users hate getting lost. Implementing breadcrumbs offers a means to enhance your user’s experience by providing a path that allows them to find their way back without clicking on the back button.
- Lower Bounce Rates – Remember that a user may enter your website on any web page through organic search. Therefore, you need to prevent users from bouncing from your website to another by guiding them to other relevant pages on your site if the page they landed on isn’t what they’re looking for.
Implementing breadcrumbs also has additional SEO benefits:
- Improved Crawlability – the internal links created as part of the breadcrumb path help search engines to understand your site hierarchy.
- Improved Indexability – without breadcrumbs, search engine crawlers rely primarily on your sitemap to index pages. But one thing the sitemap doesn’t provide is contextual information about the relationship between these pages. Breadcrumbs allow search engines to understand how a web page relates to the rest of your website.
- Improved SERP Presence – where possible, Google includes breadcrumbs in the search results to help users understand where a page sits within your site.
There are three main types of breadcrumb navigation: hierarchy-based, attribute-based, and history-based breadcrumbs.
Hierarchy-based breadcrumb navigation (aka Location-Based Breadcrumbs) allows you to visualize the depth of a website’s site architecture.
Here’s an example:
Each element within the path represents a new level within the hierarchy:
- Home – top-level category
- Computing – sub-category
- Laptops – sub-sub-category
- All laptops – current page
This is the most common and recommended breadcrumb navigation for SEO due to its intuitive structure and usability.
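Because hierarchy-based trails mirror the site’s URL structure, they can often be generated directly from the page path. Here’s a simplified Python sketch – the path and the label-from-slug naming are illustrative, as real category names would usually come from your CMS:

```python
def breadcrumb_trail(path):
    """Build a hierarchy-based trail of (label, href) pairs
    from a URL path like /computing/laptops/all-laptops/."""
    crumbs = [("Home", "/")]
    href = ""
    for segment in path.strip("/").split("/"):
        if not segment:
            continue
        href += "/" + segment
        label = segment.replace("-", " ").title()  # naive slug-to-label
        crumbs.append((label, href + "/"))
    return crumbs

for label, href in breadcrumb_trail("/computing/laptops/all-laptops/"):
    print(f"{label} -> {href}")
```

Each segment of the path becomes one level of the trail, which is exactly the Home > Computing > Laptops > All laptops structure shown above.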
Attribute-based breadcrumbs are useful when items on a particular page may fall into more than one breadcrumb path. For example, shoes can be black, and made from leather.
But should the breadcrumb look like this?
Or like this?
Due to this ambiguity, attribute-based breadcrumbs are commonly found in eCommerce websites to filter product search results like this:
The filters in the above example, i.e. “Black” and “Leather,” serve as attribute-based breadcrumbs, allowing the user to add or remove as many filters as they like without having to go to a new category page.
History-based breadcrumbs (aka Path-based breadcrumbs) display the unique steps that the user took to reach their current page.
In other words, they show the user’s browsing history within the website.
Here’s an example:
The “Back” button serves as a breadcrumb, allowing the user to return to the previous page.
This type of breadcrumb isn’t as useful if users have browsed lots of pages, as they could easily achieve the same result by clicking the Back button in their browser instead.
And because the breadcrumbs are dynamic – unique to each user session – search engines cannot process the internal links within the navigation.
How you add breadcrumb navigation will vary depending on your website’s platform.
For WooCommerce, use the WooCommerce Breadcrumbs plugin.
For Shopify websites, here’s a guide on how to implement this using Liquid.
If you have a Wix website, follow this guide, and for Squarespace websites, you’ll likely need to implement the code manually.
Here are some points to consider when adding breadcrumbs to your website:
- Keep The Path Short & Simple – Use as few words as possible when naming your breadcrumbs.
This becomes especially important for mobile users, where you want to minimize the amount of space your breadcrumbs take up.
- Location is Key – make your breadcrumbs easy to find by placing them towards the top of your page. I recommend adding them under the main navigation and above the <h1> header, as this is where users expect to see them.
- Sizing is (Also) Key – your breadcrumbs should be easy to spot, yet unobtrusive.
- Add Breadcrumb Schema Markup – If you want to increase the chances of your breadcrumbs appearing in the search results, you’ll need to add BreadcrumbList structured data to your web pages.
You can find more information on how to do this here.
- Maintain Consistent Naming Format – using an inconsistent naming convention across your website may cause unnecessary confusion among your visitors.
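To make the schema tip above concrete: BreadcrumbList markup is JSON-LD embedded in a script tag on the page. Here’s a hedged sketch that generates it in Python – the page names and URLs are placeholders, while the @type, position, and item fields follow the schema.org BreadcrumbList vocabulary:

```python
import json

def breadcrumb_jsonld(crumbs):
    """crumbs: ordered (name, url) pairs, home page first.
    Returns a schema.org BreadcrumbList structure."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    }

# Placeholder trail matching the hierarchy example earlier
crumbs = [
    ("Home", "https://www.example.com/"),
    ("Computing", "https://www.example.com/computing/"),
    ("Laptops", "https://www.example.com/computing/laptops/"),
]
markup = ('<script type="application/ld+json">\n'
          + json.dumps(breadcrumb_jsonld(crumbs), indent=2)
          + "\n</script>")
print(markup)
```

Most CMS plugins (including the breadcrumb plugins mentioned above) will output this markup for you, so hand-rolling it is usually only necessary on custom-built sites.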
Finally, remember that not every website will benefit from or need to implement breadcrumbs.
For example, you don’t need breadcrumbs if your website has a flat site structure. This is where most of the pages on your site are on the same level. Why? Because it would be meaningless to have breadcrumbs where the path only has two elements.
As a result, breadcrumbs should be used for websites with a more complex and deep site structure. The most common application of breadcrumbs is on eCommerce sites where you may have multiple nested categories, sub-categories, and even sub-sub-categories.
Here’s what we’ve achieved by executing the above strategies in just over six months.
When compared year-on-year, the organic traffic grew by 90.97%.
The graph below, which is taken from Ahrefs, shows the site’s keyword visibility within the top 3 positions of Google.
The number of keywords that the site is ranking for in the top 3 positions of Google increased from 420 keywords to 1613 keywords a year later – an increase of 284%.
A large part of SEO is about auditing, analyzing, and optimizing what’s already on your website. But it’s important to remember to check every aspect of SEO – from the pruning of your backlink profile and optimization of existing content to improving technical factors like indexability and page performance.
In this case study, you’ve learned how to:
- Identify a potential negative SEO attack on your backlink profile, look out for harmful links (e.g. from comment spam), and disavow them so that Google will ignore them.
- Identify potential low-hanging keywords and optimize your content in order to climb the SERPs.
- Continue to grow your website’s domain authority with a link-building campaign.
- Fix unnecessary internal redirects.
- Improve the usability of your website by implementing breadcrumb navigation.
Implementing the strategy above will ensure that your backlink profile is as healthy as it can be before building more authority.
Analyzing and optimizing your existing content for low-hanging keywords will help improve your visibility by climbing the SERPs.
Likewise, carrying out technical fixes like minimizing internal redirects and adding breadcrumb navigation will help improve page performance and the user’s experience of your site.
If you’re looking for help with your site’s SEO, get in touch with my team at The Search Initiative.