Plenty of SEO news to mull over this month. Mobile-first indexing is (finally) impacting rankings, and voice search for local SEO is making waves. If you're not on top of these, you're missing an opportunity.
Down to business.
These are the blogs that caught my attention this last month or so.
As always, I've tried to include a mix of more advanced SEO news with some blog posts that focus on the basics. There's always something you can pick up from reading other SEOs, even on the fundamentals.
If you have an opinion feel free to share it in the comments below.
Is there a Google sandbox in 2018?
Here is the million-dollar question. The Google sandbox… fact or fiction? Google says FICTION.
"With regards to sandbox, we don't really have this traditional sandbox that a lot of SEOs used to be talking about in the years past. We have a number of algorithms that might look similar, but these are essentially just algorithms trying to understand how the website fits in with the rest of the websites…"
So, that's sorted then, let's move on. Or is it? Because Matt Cutts once said:
"The difference between a domain that's six months old versus one year old is really not that big at all. So as long as you've been around for at least a couple months, a few months, you should be able to make sure that you're able to show up in the search results."
The line that jumps out is “so as long as you’ve been around for a couple of months…” Does that mean there is something in place that limits new websites? Well, that would be a sandbox of sorts. At least functionally, if not by name.
There's little point arguing over whether there is a sandbox or not, because essentially it's not in our control. Ahrefs argue that a better use of time is to look at a new website launch from Google's perspective.
What do they need to see to start to trust the site with search traffic?
E-A-T: Expertise, Authoritativeness and Trustworthiness.
A new site seriously lacks all three of the holy trinity of qualities Google looks for in websites, which, you might say, is almost a self-imposed sandbox. According to the Ahrefs article, some important things to consider when trying to avoid the sandbox (for want of a better name) are:
Content: Google loves content. In most cases, newer sites have less content, so Google may struggle to understand the new site and how to rank it for searchers to find it. Perhaps the relevancy simply isn't clear yet.
This one has my vote.
User signals: Google likes to evaluate dwell time, bounce rate, CTR and all those other signals that tell them how users interact with your site. If you don't have visitors, Google doesn't have data to evaluate.
Backlinks: They matter. New sites have to be careful about how they acquire backlinks early on, because unless they're completely organic, it's hard to pick up a load of backlinks without it looking suspicious. At the same time, a site with only a handful of backlinks will struggle against competitors with established link profiles.
Competition: The quality of competition will matter. A newer site is always going to struggle against an established site, but how much depends on how authoritative the sites are. You need to adjust for that.
Are your sitemaps optimized?
This is a great article because it's about something so basic, yet if you don't spend time doing it right, it's going to sting you.
We’re talking about sitemaps. Not a very sexy subject but an easy fix and worth doing right.
The article explains what a sitemap is and what rules or limitations you should be aware of. It’s a good read if you’re new to SEO and not a complete waste of time for more experienced SEOs either.
The article talks about sitemap optimization. Again, we're talking SEO basics: how to link sitemaps from footers or reference them in your robots.txt file, and how to let Google know what's going on. This plays a role in getting your pages indexed efficiently.
There's the option of manually creating your sitemap, which is a reasonable option if you have a limited number of pages and don't plan to add pages regularly, because every time you do, you'll need to update your sitemap, which increases your workload.
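If you do go the manual route, the sitemap format itself is simple XML. Here's a minimal sketch of generating one with Python's standard library; the URLs and lastmod dates are hypothetical placeholders, not anything from a real site:

```python
# Minimal sketch: build a sitemap.xml for a handful of pages.
# URLs and lastmod dates below are hypothetical placeholders.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: list of (loc, lastmod) tuples -> sitemap XML string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("https://example.com/", "2018-07-01"),
    ("https://example.com/blog/", "2018-06-15"),
])
print(sitemap)
```

Once the file is live, a single `Sitemap: https://example.com/sitemap.xml` line in robots.txt is the standard way to point crawlers at it.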
Google is telling people your site is not secure
Finally, it’s here.
HTTPS really matters if the searcher is using Chrome. From July, Chrome will mark all sites that don't use HTTPS as "not secure".
We knew this was happening and now it’s upon us – so if you haven’t changed any of your sites over yet, it really is time.
Will it really matter to your site? Possibly. If you’re selling something on your site then most definitely.
If it's monetized less directly, for example with Google AdSense, who knows how much it will affect users. But the real question is: is it worth the risk of not changing over?
Page-load speed affects ranking
Oh, and by the way, that's not the only change this month. Back in January, we gave you a heads-up that Google warned page speed would become a ranking factor for mobile search.
They gave us till July to get ourselves sorted, and so, here we are in July.
Decent page-load speed can be achieved with a little time spent optimizing: dealing with caching and image resizing, for example. Or you can upgrade to quicker, better-performing hosting.
From now on it matters to Google, but if your site takes longer than 2 seconds to load, you should be concerned anyway: your bounce rate will be huge, because searchers get bored or frustrated very quickly and don't like waiting for pages to load.
And let me leave you with this thought: in recent testing, site speed has been one of the key factors I'm seeing for conversion rate optimization too.
Creating Content in 2018
In the digital marketing world, words matter.
SEO just increases the complexity and the skill necessary. You need to be able to engage your target audience, stay on topic and create content that is SEO optimized.
It's not enough to identify some keywords and write content around percentages of those words, like we might have 3-4 years ago. I've been working with TF-IDF, a metric that assesses how often keywords are used in the content of competitor sites that Google must value (because they rank for the keywords I'm targeting).
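To make the TF-IDF idea concrete, here's a toy sketch using only the standard library. The "documents" are made-up snippets standing in for competitor page text; a real workflow would pull the full text of ranking pages:

```python
# Toy sketch of TF-IDF scoring across a few "competitor" texts.
# The documents below are hypothetical stand-ins for scraped pages.
import math
from collections import Counter

docs = [
    "diet pills review best diet pills",
    "best running shoes for marathon training",
    "diet plans and weight loss tips",
]

def tf_idf(term, doc, corpus):
    """Term frequency in `doc`, weighted down if the term is common
    across the whole corpus (inverse document frequency)."""
    words = doc.split()
    tf = Counter(words)[term] / len(words)
    df = sum(1 for d in corpus if term in d.split())
    idf = math.log(len(corpus) / df) if df else 0.0
    return tf * idf

print(round(tf_idf("diet", docs[0], docs), 3))
```

A term scores high when it's frequent on one page but not everywhere, which is roughly what makes it a useful signal of topical emphasis.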
But, apart from that, this article is an interesting read, exploring the kind of processes you need to be able to find that balance between content for your users and for Google.
Some questions worth asking yourself when designing content:
- What is it about your service or product that makes it unique?
- What value do you provide to your customers or clients?
- Who is your ideal customer/client?
- What’s the #1 goal you have when it comes to your website?
- What problem(s) do your services/products solve for your clients/customers?
- What style/tone appeals best to your target audience?
If you’ve done some copywriting, none of this will be earth-shattering, but now in 2018, more so than ever before, user experience really matters. Google is far savvier at being able to interpret user experience and that should be part of the planning when looking to rank keywords. If you’ve never really experienced writing “on brand” and for conversions, this does a good job at getting you started.
Google focuses on JSON-LD
A search engine built around crawling and indexing the structured data on a website works differently from one designed around looking at the words used in a query and returning websites that contain those same words.
We got an idea of how structured data processing works thanks to a flowchart in one of Google's patents, back in 2017.
Now Google seems to be focusing some of its gaze on semi-structured data using JSON-LD. They gave an example of this on their "Introduction to Structured Data" webmaster page:
“This document describes which fields are required, recommended, or optional for structured data with special meaning to Google Search. Most Search structured data uses schema.org vocabulary, but you should rely on the documentation on developers.google.com as definitive for Google Search behaviour, rather than the schema.org documentation. Attributes or objects not described here are not required by Google Search, even if marked as required by schema.org.”
JSON-LD displays factual information in a machine-readable way on the site in question, rather than in HTML format. The patent further explains: In general, this specification describes techniques for extracting facts from collections of documents. The patent continues by discussing the role of schemas and key/value pairs that could be searched, and details about such a search of semi-structured data on a site:
"The aspect further includes receiving a query for semi-structured data items, wherein the query specifies requirements for values for one or more keys; identifying schemas from the plurality of schemas that identify locations for values corresponding to each of the one or more keys; for each identified schema, searching the encoded data items associated with the schema to identify encoded data items that satisfy the query; and providing data identifying values from the encoded data items that satisfy the query in response to the query."
What to make of this? If you use structured data, Google likes JSON-LD formatting for schema.org markup; just be sure to follow the structured data guidelines when you add it to the site.
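For a sense of what that looks like in practice, here's a hedged sketch that emits a schema.org Article snippet as JSON-LD. The field values (headline, author, date) are hypothetical; check Google's structured data documentation for which properties Search actually requires:

```python
# Sketch: render a schema.org Article as a JSON-LD script tag.
# All values here are hypothetical example data.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Is there a Google sandbox in 2018?",
    "datePublished": "2018-07-01",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(article, indent=2)
)
print(snippet)
```

The advantage of JSON-LD over microdata is exactly this separation: the facts live in one machine-readable block rather than being woven through the page's HTML.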
Redesigning a website without affecting SEO
This might be of interest, especially if you do client SEO and a new website design might be in order.
There's a lot of potential in redesigning a site. You have the chance to enhance your SEO, making your on-site work more powerful.
You also have a chance to damage the SEO already done to the site, in its current incarnation. This article talks about 10 steps you need to consider or work through to make the perfect transition (if perfect is ever possible). Here is a quick overview of the steps that will help you reap the benefits and avoid the pitfalls:
Step 1: make a list of all pages from the old website. Do this before anything else, just to be sure you have it.
Step 2: the redesign needs to be done on a temporary URL. There are far too many pitfalls to doing it on the real domain. Just do it on a temporary domain and when you are 100% happy you can think about migrating it.
Step 3: test the current website. If you run a full audit of your website from both an SEO and technical perspective you can identify problems that can be fixed in the redesign. Two birds with one stone.
Step 4: do proper 301 redirects. If you have changed the URL structuring on the new website, perhaps something that was addressed after the audit, you need to ensure you’re redirecting the URLs.
Step 5: make the jump to the new website. So it’s site migration time. Obviously try to do it during off-peak hours, the less interruption in traffic the better. Also, make a checklist to ensure you stay on track and nothing gets forgotten.
Step 6: run Google Webmaster Tools. It makes sense to confirm you have made the change effectively and that there are no broken links, for example.
Step 7: check verification status and re-submit. Re-submit a fetch and render for your website. Especially if you have stopped Google crawling it during the migration.
Step 8: robots.txt. It could have been corrupted during the redesign, so you'll want to check it.
Step 9: sitemap submission. Simple task but sometimes forgotten during the changeover.
Step 10: monitor the changes. You’ll want to identify problems as quickly as possible. Problems occur, you will want to be on top of them.
You can read about these points in more detail in the article.
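Step 4 is where most migrations go wrong, so it's worth sanity-checking your redirect map before going live. Here's a small sketch; the paths are hypothetical examples, and the checks cover two common mistakes, self-redirects and redirect chains:

```python
# Sketch of step 4: old-URL -> new-URL map, with basic sanity checks.
# Paths below are hypothetical examples.
redirects = {
    "/old-services.html": "/services/",
    "/blog/post-1.html": "/blog/post-1/",
    "/about-us.html": "/about/",
}

def validate(redirect_map):
    """Catch self-redirects and chains (A -> B -> C), both of which
    waste crawl budget and can dilute link equity."""
    problems = []
    for old, new in redirect_map.items():
        if old == new:
            problems.append(f"self-redirect: {old}")
        elif new in redirect_map:  # target is itself redirected
            problems.append(f"chain: {old} -> {new} -> {redirect_map[new]}")
    return problems

print(validate(redirects))  # an empty list means the map looks sane
```

Each entry in a clean map then becomes one 301 rule in your server config, pointing directly at the final destination rather than through intermediate hops.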
How to filter junk traffic in Google Analytics
Google Analytics is an important tool, but these days you need to be comfortable doing some filtering. If you haven't configured your (or your clients') analytics to filter such traffic, Moz does a good job of laying it out.
There's ghost spam to be aware of. Google Analytics has got better at catching it, but it still affects the data; ghost spam mostly shows up as referral traffic.
Then there is bot traffic, which tends to show up as direct traffic and isn't always easy to spot. Once you've identified the bot traffic, normally by IP, you can use Google Analytics filters so the data you're reading reflects genuine traffic.
The most important thing to remember is that you need to get it set up now to be able to read data in the future. It doesn’t work retroactively.
The article does a good job of explaining the different kinds of Google Analytics filters and how to set them up, as well as some of the pitfalls. They go into detail about setting up different filters in the article, if it’s not something you have done before.
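The core idea behind the recommended "valid hostname" filter for ghost spam can be sketched outside GA as well. Here's a toy illustration on exported-style rows; the hostnames, spam domain and session counts are all hypothetical:

```python
# Toy illustration of a "valid hostname" filter for ghost spam:
# keep only rows whose hostname matches your real domain(s).
# Hostnames and session counts are hypothetical example data.
import re

VALID_HOSTNAME = re.compile(r"(^|\.)example\.com$")  # your real domain

rows = [
    {"hostname": "example.com", "source": "google", "sessions": 120},
    {"hostname": "blog.example.com", "source": "direct", "sessions": 40},
    {"hostname": "free-seo-traffic.xyz", "source": "spam.site", "sessions": 30},
]

clean = [r for r in rows if VALID_HOSTNAME.search(r["hostname"])]
print(sum(r["sessions"] for r in clean))  # spam row dropped
```

Ghost spam never actually visits your site, so it can't fake your hostname; that's why a hostname whitelist is the usual first filter, with IP-based exclusions layered on top for genuine bot hits.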
English isn’t the only language, you know?
English isn't the only language in which to target keywords.
We get so hung up on English (after all, it is the most used language online) that we don't look at other markets. There are so many countries out there, so many potential markets with strong GDPs that would make financial sense to target.
All you need is the right technical set up and some content in other languages.
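The core of that technical setup is hreflang markup, which tells Google which language version of a page to serve to which audience. Here's a small sketch that generates the link tags; the locale codes and URLs are hypothetical examples:

```python
# Sketch: generate hreflang link tags for localized page versions.
# Locale/URL pairs below are hypothetical examples.
def hreflang_tags(versions, default_url):
    """versions: {lang_code: url}; default_url is the x-default fallback."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in versions.items()
    ]
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{default_url}" />'
    )
    return "\n".join(tags)

versions = {
    "en": "https://example.com/diet-pills/",
    "da": "https://example.com/da/slankepiller/",
}
print(hreflang_tags(versions, "https://example.com/diet-pills/"))
```

Each language version should carry the full set of tags (including a self-reference), and the annotations need to be reciprocal between pages for Google to honor them.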
It takes a little doing because it's a mental shift. You won't feel in total control, especially if you don't speak the language, but the big upside is how much less competitive it is.
You can target some really valuable keywords that would be monstrous to compete for in English and have far less competition in other languages.
For example, Danish. There are a lot of skilled SEOs in Denmark, but compared to the number of valuable keywords that can be ranked for, there just aren't enough quality SEOs to dominate everything, so there's low-hanging fruit.
If there is enough value in the keyword then it’s worth the investment. Especially as you can do the same thing for 40+ languages, in countries with a strong GDP.
Take a phrase like “diet pills”.
The Danish equivalent:
As you can see, there's a huge difference in keyword difficulty. Of course, the search volume is lower too, but 3,000 monthly visitors is no laughing matter.
Now have a quick look at the sites that are top of the search results in each version of Google.
The site hitting top spot for Diet Pills (translated) in Denmark:
Now US Google…
The Danish site has a DR of 13 and only 650 sitewide links; the top-ranking US site had a DR of 90 and 4.5k backlinks.
Ranking in Danish for diet pills is achievable and worthwhile, but not even a consideration for most SEOs. That’s one example in one country. There are millions of others. It has to be worth exploring.