
How On-Site SEO is changing in 2015

I never cease to be amazed by how much SEO has changed in recent years. SEO used to be all about optimising pages with keywords and links, because the powers that be (Google) had decided that links mattered more than the quality of the content on a page.

Of course, “Black Hat” SEOs were quick to take advantage of this all-too-easily-exploitable fact, and by 2007 link spam was everywhere. It seemed like everyone was cheating: creating fake blogs purely to generate links, and engaging in other questionable pursuits such as malicious tagging and cloaking.

The search engines had to do something to combat the problem. Why did link spam matter so much to them? Because every undeserved ranking gained by a spammer meant less accurate, less valuable results from the search engine.

A search revolution

Thankfully, the last three years have marked a step change in the ranking system. Google’s algorithm updates were designed primarily to fight fake and manipulative online content and links.

The now-infamous updates have swept away a decade of old-school and Black Hat SEO practices. Seeing spam site after spam site crushed under the wheels of almighty Google, it’s no surprise that fear of penalisation helped keep people in line.

An even more positive element of this step change involved Google focusing on intent – the meaning behind a user’s search terms. For example, if I type in “flowers in Guildford”, it’s likely I’m looking for a florist – rather than pictures of flowers in the park.

The change here is a move towards semantics and meaning, rather than keywords alone. Delivering results in this way – as well as making them diverse and fresh – helps to achieve Google’s main aim: delivering the best and most relevant results to searchers.

Changes in Google’s thinking

During all of these changes, the team at Google had to undergo a revolution of sorts, too. Up until this point, the search algorithm still used a manually crafted formula rather than a machine-learned one, for two reasons: Google believed that humans can predict other humans’ behaviour better than a machine can, and it worried that a machine might make huge errors when unleashed on real-world data. However, it later emerged that Google does use machine learning to predict ad click-through rates, and by 2013 it became clear that the team was coming around to a more machine-based way of thinking.

For this to work, machines would need to go through an extensive learning process to ascertain what separates a good SERP (low bounce rate, users click through on results and don’t need to enter another query or dig into page two) from a bad SERP (high bounce rate, users resort to other queries to find what they’re looking for).
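To make that distinction concrete, here is a minimal sketch of how those session signals might be rolled into a single quality score. The field names, the equal weighting and the data are all hypothetical – nothing here reflects how Google actually grades a SERP.

```python
# Illustrative sketch only: a toy score for judging SERP quality from
# aggregate session signals. The field names and the simple equal-weight
# penalty are assumptions, not anything a search engine has published.

def serp_quality_score(sessions):
    """sessions: list of dicts of hypothetical per-session signals."""
    total = len(sessions)
    if total == 0:
        return 0.0
    bounced = sum(1 for s in sessions if s["bounced"])
    reformulated = sum(1 for s in sessions if s["entered_new_query"])
    went_to_page_two = sum(1 for s in sessions if s["visited_page_two"])
    # A "good" SERP keeps all three rates low.
    penalty = (bounced + reformulated + went_to_page_two) / (3 * total)
    return 1.0 - penalty

sessions = [
    {"bounced": False, "entered_new_query": False, "visited_page_two": False},
    {"bounced": True,  "entered_new_query": True,  "visited_page_two": False},
]
print(serp_quality_score(sessions))  # ~0.67 – a middling SERP
```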

What comes next might sound a bit scary for those who’ve seen I, Robot. For a machine-based search algorithm to be truly effective, a process called “deep learning” would need to happen. Loosely modelled on the human brain, deep learning involves higher layers forming higher levels of abstraction – that is, the machines learn what is being requested without humans telling them the answer. The algorithm would become intelligent, automatic and self-sufficient. What do you think: cool, or creepy?
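If “layers of abstraction” sounds fuzzy, this toy snippet shows the basic shape of the idea: each layer transforms the previous layer’s output, so later layers work with progressively more abstract features. The layer sizes and random weights are made up purely for illustration; it has nothing to do with Google’s actual systems.

```python
# Toy illustration of stacked layers, each building on the one below.
import numpy as np

rng = np.random.default_rng(0)

def layer(x, in_dim, out_dim):
    """One fully connected layer with a ReLU non-linearity (random weights)."""
    w = rng.normal(size=(in_dim, out_dim))
    return np.maximum(0, x @ w)

raw_signals = rng.normal(size=(1, 8))   # e.g. raw click/query features
hidden_1 = layer(raw_signals, 8, 16)    # low-level patterns
hidden_2 = layer(hidden_1, 16, 16)      # combinations of those patterns
score = layer(hidden_2, 16, 1)          # a single relevance-style output
print(score.shape)  # (1, 1)
```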

What does this mean for SEO?

A machine-based system means that we SEOs should focus less on optimising for ranking inputs such as keywords, anchor text and content uniqueness – the things that humans at search engines have considered good indicators of quality content – because the machine won’t care about these. It will only care about how users have acted on the search results, through signals such as social sharing and the long-to-short click ratio.
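As a rough sketch of what a “long-to-short click ratio” could look like: count visits where the searcher stayed a while (long clicks) against visits where they bounced straight back (short clicks). The 30-second cut-off below is an arbitrary assumption for illustration; no search engine publishes the thresholds, if any, that it uses.

```python
# Rough, assumed sketch of a long-to-short click ratio from dwell times.

def long_to_short_click_ratio(dwell_times_seconds, threshold=30):
    long_clicks = sum(1 for t in dwell_times_seconds if t >= threshold)
    short_clicks = sum(1 for t in dwell_times_seconds if t < threshold)
    if short_clicks == 0:
        return float("inf") if long_clicks else 0.0
    return long_clicks / short_clicks

print(long_to_short_click_ratio([5, 12, 45, 120, 300, 8]))  # 1.0
```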

Optimising for two algorithms

As we can see, there are currently two competing algorithms at play. To really succeed in 2015 and beyond, we need to keep one eye over our shoulders and the other on the horizon. In practice, that means optimising for two algorithms.

Here’s a quick breakdown of what exactly each of these two strategies involves:

Old-school onsite (ranking inputs)

  • Keyword targeting
  • Quality and uniqueness
  • Crawl and bot friendly
  • Snippet optimisation
  • UX/multi-device

New on-site (searcher outputs)

  • Relative CTR
  • Short vs. long-click
  • Content gap fulfilment
  • Amplification and loyalty
  • Task completion success

Now, let’s break down further how to succeed in each of these newer areas:

CTR

As we can see, your listing on the SERP needs to encourage the highest possible click-through rate to move you up the rankings. Successful elements here include a relevant and exciting title, freshness, and a compelling URL.
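One way to think about “relative CTR” is to compare a result’s click-through rate with a baseline for its ranking position. The baseline figures below are illustrative placeholders, not published industry averages, and the whole calculation is just a sketch of the concept.

```python
# Hypothetical example of relative CTR: actual CTR vs. a per-position baseline.

BASELINE_CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def relative_ctr(clicks, impressions, position):
    ctr = clicks / impressions
    baseline = BASELINE_CTR_BY_POSITION.get(position, 0.02)
    return ctr / baseline

# A result ranked 3rd that gets clicked 14% of the time is out-performing
# its slot (ratio > 1) – the kind of signal this section describes.
print(round(relative_ctr(clicks=140, impressions=1000, position=3), 2))  # 1.4
```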

Engagement

The secret here – and to beating your competitors on that SERP – is achieving high engagement. That means content which fulfils a searcher’s needs, a fast loading time, a good UX in every browser, and no annoying features.

Fulfilling needs

Going back to the semantics angle we mentioned earlier, Google wants to make sure that your page fulfils all of the searcher’s needs. Machine-learning algorithms might be able to learn which results are the most useful, or the most comprehensive, from the words that appear alongside one another. For example, a page about cars which doesn’t mention anything to do with wheels is unlikely to be comprehensive.
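A crude way to picture that “comprehensiveness” check is to see how many of a topic’s commonly co-occurring terms a page actually mentions. The related-term list below is hand-picked for the example, not derived from any real corpus, and real systems would be far more sophisticated.

```python
# Assumed sketch: topical coverage as the share of expected co-occurring
# terms that a page mentions.
import re

RELATED_TERMS = {"cars": {"wheels", "engine", "fuel", "brakes", "tyres"}}

def coverage(topic, page_text):
    expected = RELATED_TERMS.get(topic, set())
    words = set(re.findall(r"[a-z]+", page_text.lower()))
    return len(expected & words) / len(expected) if expected else 0.0

page = "Our guide to cars covers the engine, brakes and wheels in detail."
print(coverage("cars", page))  # 0.6 – mentions 3 of the 5 expected terms
```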

Encouraging shares and loyalty

Although Google has stated that they don’t count social shares as part of the algorithm, pages with a lot of social shares – but few links – seem to be over-performing. One theory is that while Google doesn’t automatically count raw social shares (we tend to share a lot of content that we don’t read), it does count a lot of metrics which, by nature, mimic this data – such as the rate of click growth as a story goes “viral”.
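To illustrate the “rate of click growth” idea, here is a purely speculative sketch that turns hourly click counts into hour-over-hour growth multipliers. The numbers and the simple ratio are made up for demonstration; this is the sort of velocity signal the theory above describes, not a known ranking factor.

```python
# Illustrative only: hour-over-hour click growth for a story picking up speed.

def hourly_growth_rates(clicks_per_hour):
    """Return the growth multiplier for each consecutive pair of hours."""
    return [
        later / earlier if earlier else float("inf")
        for earlier, later in zip(clicks_per_hour, clicks_per_hour[1:])
    ]

clicks = [120, 260, 610, 1400]
print(hourly_growth_rates(clicks))  # roughly [2.17, 2.35, 2.30]
```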

Most importantly, these shares need to result in loyalty and return visits. Knowing what influencers and searchers share is important; but knowing what makes them keep coming back is arguably even more so.

It’s no longer enough to simply create content which is better than your competitors’. It needs to be 10x better, and unfortunately, the old school tactics of SEO just won’t get this done anymore.

Fulfilling a task – not just a query

As part of its aim of delivering useful and relevant content, Google also wants to help people accomplish their ultimate tasks faster. Sites that help users do this will move up the rankings, even if other, more traditional ranking signals are absent.

In conclusion, then…

The two algorithms we should be optimising for are Google, and the people who interact with your content. Ultimately, it all comes back to that one cardinal rule of SEO: “create content for people, not for search engines.” Except now you need to create content for both (remember the machines we mentioned earlier?): you need to optimise for both algorithm input and human output.
