What works today probably won’t work tomorrow. And keeping up with all the changes can be a task in itself.
Starting this month, I’ll be releasing monthly SEO news roundups, curating the most pertinent SEO news, so you don’t have to sort through all the mess.
Internal Link Structure: Does your mobile link structure need to be the same as the desktop version?
Back in June 2017, Google said that if you were preparing for mobile-first indexing, you should make your mobile pages' internal link structure as similar as possible to your desktop pages'.
Thanks for clearing that up. Enough said. Let’s move on.
That is, until Google's John Mueller, offering an insight into Google's thinking in 2018, took to Twitter to answer a question about mobile-first indexing, saying:
It doesn't need to be the same but it should be crawlable.— John ☆.o(≧▽≦)o.☆ (@JohnMu) 2 January 2018
So… what now? With mobile searches becoming increasingly important, does this change your strategy? Read the article to see what Barry Schwartz makes of it.
Voice Search SEO: Are you part of the 38% looking to implement Voice Search in 2018?
SEO doesn't stand still. Keep up with its changes and there will always be opportunities to get ahead of the competition.
FACT: Almost all new smartphones come with voice-enabled personal assistants pre-installed, and they are 95% (or more) accurate.
FACT: People are inherently lazy and voice searches play to that side of human nature.
Voice search is here, and both data and common sense say it's here to stay. Yet BrightEdge recently released study data suggesting that 62% of marketers have no plans to include voice search in their SEO strategy for 2018.
If you are keen to get ahead of the pack, voice search SEO should be on your horizon.
Context is now massive. Smartphones are constantly collecting and sending data, and Google is looking more and more at the implied intent behind the query:
Example: A search for Canon Cameras
“Length: when a displayed answer is too long, users can quickly scan it visually and locate the relevant information. For voice answers, that is not possible. It is much more important to ensure that we provide a helpful amount of information, hopefully not too much or too little. Some of our previous work is currently in use for identifying the most relevant fragments of answers.”
Can low volume keywords satisfy high value clients?
Dmitry Dragilev adds to the conversation about keyword targeting with a case study where he targets low volume keywords in a competitive niche (you had me at case study).
Depending on your budget, you might take a swing at something like this:
So custom would have us drift down the list (most keyword research tools rank results with the highest search volume at the top) until we find the sweet spot between search volume and competitiveness.
But… the case study tries a different strategy. It looks to piggyback competitors and pick up some traffic from searchers further down the line in their decision making.
No different from what you would do with an Amazon affiliate site or review site.
He decided to target keywords that are competitor related, with metrics like this:
I think you've probably guessed. The keywords were easy to rank for; in fact, they didn't require any real offsite SEO. No expensive outreach, just good onsite SEO and relevant content.
Are new privacy regulations going to disrupt local search results?
Be warned… yawn alert… if you are not involved in local SEO, you might want to move on.
Bad: As of right now, some US states consider marketing data as personal data and are requiring reporting and notifications of data breaches.
Worse: Other states have gone after location data now requiring affirmative consent on geolocation data.
The Worst:… Where will it stop?
Life is likely to get a little more complicated, and most likely more costly, when it comes to local search marketing. Are you planning ahead?
Quality is King but you have to spread the love
Thin content. It's a no-no. But if you're not sure how Google now determines a site's quality score, and you're looking to Google to lead the way, then you're going to find this stuff really valuable.
John Mueller said (several times), in a recent WebMaster Hangout, that all pages are considered when the quality algorithm is determining site quality.
“From our point of view, our quality algorithms do look at the website overall, so they look at everything that’s indexed.”
You are likely to see gradual changes in your search results accordingly; most likely negative ones, as Google adjusts your rankings to account for your thin pages.
To be clear: thin pages won't be ignored as they were after some earlier updates; they will affect your site's overall result.
If you believe this, managing your indexed pages is going to be critical.
Back in the summer of 2017, Google said that you need to be boosting these thin pages, which can be a nightmare. Now John Mueller has stated that although the best solution is still to boost thin content (something he confirmed Google's search engineers agree with), the second solution is to remove the pages that are dragging down your quality score.
John confirmed that it's OK to 404 or noindex those URLs. You can also return a 410 response to make it really clear to Google that the content has been removed for good, which will most likely speed up its removal from the index too.
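As a minimal sketch of what that looks like server-side, here's a tiny Python WSGI app that serves 410 Gone for permanently removed URLs and an `X-Robots-Tag: noindex` header for thin pages you want to keep serving to users. The paths and app structure are hypothetical; a real site would wire this into its CMS or server config.

```python
# Hypothetical URL sets; in practice these would come from your content audit.
REMOVED_PATHS = {"/old-thin-page", "/discontinued-product"}
NOINDEX_PATHS = {"/thin-but-useful-page"}

def app(environ, start_response):
    """Minimal WSGI app sketching 410 / noindex handling for thin pages."""
    path = environ.get("PATH_INFO", "/")
    if path in REMOVED_PATHS:
        # 410 Gone: a stronger "removed for good" signal than 404,
        # which can speed up removal from the index.
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"Gone."]
    headers = [("Content-Type", "text/html")]
    if path in NOINDEX_PATHS:
        # Keep serving the page to users, but ask crawlers not to index it.
        headers.append(("X-Robots-Tag", "noindex"))
    start_response("200 OK", headers)
    return [b"<html><body>Page content.</body></html>"]
```

The same effect is usually achieved with a couple of rewrite rules or a response header in your web server; the point is simply that 404, 410, and noindex are all ordinary HTTP-level signals.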
Move over Keyword density, it’s TF*IDF time!
There is an argument being made that topic modelling and content optimization, based on specific concepts, are starting to have more impact than URL and information architecture do.
It looks as though we are seeing a change in the importance of topical relevance and intent.
Tests support the hypothesis of its increased importance, so how do we measure it?
We can’t, not directly.
But as with all competitor analysis, we can look to analyse the content of the websites that are ranking highest for our keyword searches.
To figure that out, we can use TF*IDF.
We are talking about:
term frequency times (*) inverse document frequency.
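In plain terms: how often a term appears in one page, weighted down by how many other pages also use it. A quick sketch in Python, using a common TF*IDF formulation and a made-up toy corpus standing in for the top-ranking pages:

```python
import math
from collections import Counter

def tf_idf(term, doc_tokens, corpus):
    """Score one term in one document against a small corpus.

    tf  = term count / total tokens in the document
    idf = log(number of docs / number of docs containing the term)
    """
    tf = Counter(doc_tokens)[term] / len(doc_tokens)
    docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / docs_with_term)
    return tf * idf

# Toy corpus: token lists standing in for three competing pages.
corpus = [
    "canon camera review best dslr camera".split(),
    "best mirrorless camera guide".split(),
    "travel tips for photographers".split(),
]
doc = corpus[0]
print(tf_idf("camera", doc, corpus))  # common across docs, so a lower score
print(tf_idf("dslr", doc, corpus))    # distinctive to this doc, so a higher score
```

Note that "camera" appears everywhere and so scores low despite being frequent; "dslr" scores higher because it distinguishes the page. That's the whole idea: TF*IDF surfaces the terms that characterize a page, not just the commonest words.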
Before your eyes glaze over: you don't need to be a mathematician to get your head around this; there are tools out there to do it for you. You just need to recognize its value.
That’s the interesting part. All you need to do is gather the data and apply it.
And here’s why you might want to do that…
Google is really big on understanding content more to determine site quality. That’s been obvious for a while. To keep up with Google’s thinking, I am spending time getting a better understanding of LSI (Latent Semantic Indexing) or NLP (Natural Language Processing).
And TF*IDF really works for SEO because it lets us express content quality as a metric, and we love the convenience of packaging sites up in neat little scores. It was PageRank, then DA, PA, TF, CF, and Ahrefs' DR. Now it's the TF*IDF score.
We love simple, comparable data and this is what we get when using TF*IDF… metric signposts.
We can assess which words, phrases and topics are used by the top 20 ranking websites, and in what percentage they are used, to give us trends across the top ranking sites.
Or we can be even more specific.
We can look at specific sites which are ranking well, and plot how our own site differs in terms of topics and phrases, and use that information to make changes to better optimize our own content.
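That gap analysis can be sketched in a few lines: compute each term's share of the content on the top-ranking pages, average it, and subtract our own page's share to see which terms and topics we are under-using. The token lists below are made up for illustration; in practice you would scrape and tokenize the real pages.

```python
from collections import Counter

def term_share(tokens):
    """Fraction of total tokens that each term accounts for."""
    counts = Counter(tokens)
    return {t: c / len(tokens) for t, c in counts.items()}

# Hypothetical token lists from top-ranking pages and from our own page.
top_pages = [
    "canon camera review dslr lens camera".split(),
    "camera buying guide dslr sensor".split(),
]
our_page = "canon camera price shipping".split()

# Average term share across the top-ranking pages.
avg = Counter()
for page in top_pages:
    for term, share in term_share(page).items():
        avg[term] += share / len(top_pages)

# Terms the top pages use more heavily than we do, biggest gap first.
ours = term_share(our_page)
gaps = {t: round(s - ours.get(t, 0.0), 3)
        for t, s in avg.items() if s > ours.get(t, 0.0)}
print(sorted(gaps.items(), key=lambda kv: -kv[1]))
```

The output is a prioritized list of terms to consider working into the content, which is essentially what the commercial TF*IDF tools hand you in a nicer interface.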
Which approach you choose probably depends on how big an SEO project you have, how competitive the industry is, and how important it is to you.