SEO Tips to Achieve Your Goals Quickly



‘It’s a marathon, not a sprint’ is a phrase commonly used to describe how a slower and steadier approach will help you to conserve energy, allowing you to complete the whole journey by taking gradual steps forward. The idea behind this is to help you absorb the finer details instead of having to frantically try to cross the finish line, leaving you feeling burnt out and lacking in motivation to take the next step on another journey.

But what if you could combine the qualities of going the distance with sudden bursts of change to amplify organic visibility?

By no means does this post replace a long-term strategic approach. Instead, by clearly defining your goals, adopting these techniques can not only edge you closer to those goals at a faster rate, but also ensure you are spending your effort on the right things.

So hopefully, by reading and implementing the points below, you will be on your way. They are aimed at anyone, but as the nature of the post is gaining better coverage within a shorter timeframe, it is especially relevant to sites that already have a strong domain authority and link profile. There are a few reasons for this:

  1. Links still influence performance:

Despite counter-statements indicating that links are either massively devalued compared to a few years ago or, in some cases, perceived as an unnecessary part of the Google machine (this has probably come from the ‘SEO is Dead’ folk), there is overwhelming evidence to suggest that they remain integral in determining the order of search results.

In this post, I share how you can test and witness their influence.

Based on this evidence, if a site already has an established domain that covers core ranking factors, the changes suggested below will have a better chance of succeeding at a faster rate.

  2. Link building takes time:

If you are already familiar with the link building process, then you will know that nowadays, it takes longer than spinning some content and pushing a button to submit them to unsavoury sites.

But if you are less familiar with the concept, acquiring high-quality links takes time! In fact, even attempting to fast-track the process can in itself have a detrimental effect on performance.

A little caveat: for sites with a stronger link profile, a few well-placed links can make all the difference, moving the average keyword group positions for a page from 12th to 5th.

And we know that by increasing ranking positions, the CTR should improve – here’s a chart to help illustrate this:

[Chart: 2014 CTR-by-position study conducted by Advanced Web Ranking]

This is simply an average across sectors and query types (there are strong fluctuations between these, and as shown below, there are a few ways you can influence CTR), but it gives a good indication of the potential CTR you can receive.
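To make the maths concrete, here is a minimal sketch of how a position change translates into estimated clicks. The CTR figures below are illustrative assumptions loosely in the spirit of published CTR curves, not exact values from the Advanced Web Ranking study:

```python
# Illustrative organic CTR by ranking position (assumed values for
# demonstration only -- not the actual Advanced Web Ranking data).
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 5: 0.055, 12: 0.012}

def estimated_monthly_clicks(monthly_searches: int, position: int) -> float:
    """Estimate clicks as search volume multiplied by the CTR for that position."""
    return monthly_searches * CTR_BY_POSITION.get(position, 0.0)

# A keyword group with 2,000 monthly searches moving from 12th to 5th:
before = estimated_monthly_clicks(2000, 12)  # 24.0 clicks
after = estimated_monthly_clicks(2000, 5)    # 110.0 clicks
print(f"{before:.0f} -> {after:.0f} clicks per month")
```

Even with conservative assumed CTRs, a jump from page two to mid page one multiplies estimated traffic several times over, which is why the 11th–20th positions are such attractive targets.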

Identify your current visibility – know what’s realistic & provides value

Pay particular attention to related page groups ranking either in the top 10 or between 11th and 20th; subtle movements within these positions can deliver a better CTR in the short term.

When deciding which pages to select, extra weight should be given to those pages that already have links pointing to them.

To do this, SEMrush & SEOmonitor are both amazing tools for pinpointing where those golden nuggets lie.

My personal choice for analysing backlink activity is Majestic SEO, while Screaming Frog is an amazing scraping tool. By merging the strengths of these tools, we are able to understand a few quick insights:

  1. Based on their existing presence, current link profile & content relevancy, determine which pages and sections are realistic targets.
  2. Assess the influence of pages and sections that provide goal completions. This is important in helping to assign value to a session or unique page view.
  3. The number of pages that facilitate a visitor’s user intent. For example, those seeking to know more about what luxury holidays are available across Europe, yet are undecided on a specific location will want to know more information such as weather, things to do, costs and options before they make further decisions.
  4. Whether the bounce rate across different devices is more pronounced. This can help identify whether additional user testing is required to establish why this is happening.

Here are the steps I follow in this process:

  • Using the export files from SEMrush, Screaming Frog (HTML pages with a 200 status only) & Majestic SEO (URLs and root domains only), add these into tabs, similar to the below:

  • The GA tab uses SEO Tools for Excel to tap into the Google Analytics API. It plots out a range of metrics, including sessions, bounce rate, goal completions, exit rate, unique pageviews, accessibility score, and bounce rate & sessions across mobile, desktop & tablet devices.
  • Then, by applying VLOOKUP & SUMIFS formulas in the first tab, ‘Key Pages’, this will populate the URL, ranking ranges, traffic, bounce rate and goals. Here is a preview with the target page URLs removed:
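If spreadsheets are not your thing, the same VLOOKUP-style join can be sketched in a few lines of Python. The file names and URL column headers below are assumptions about the tool exports, not their exact formats, so adjust them to your own files:

```python
import csv

def load_by_url(path, url_field):
    """Index a CSV tool export by its URL column."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[url_field]: row for row in csv.DictReader(f)}

def merge_by_url(pages, *sources):
    """Left-join extra data sources onto the crawled page list --
    the equivalent of one VLOOKUP per spreadsheet tab."""
    merged = []
    for url, page in pages.items():
        row = {"URL": url, **page}
        for source in sources:
            row.update(source.get(url, {}))  # missing URLs simply add nothing
        merged.append(row)
    return merged

# Assumed file names and URL headers -- swap in the real export details:
# pages = load_by_url("screaming_frog.csv", "Address")
# rankings = load_by_url("semrush.csv", "URL")
# links = load_by_url("majestic.csv", "URL")
# key_pages = merge_by_url(pages, rankings, links)
```

Keying everything on the URL keeps the join forgiving: pages with no ranking or link data still appear in the output, just with those columns absent.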


The above image presents the total available monthly searches per page. Once this is combined with metrics such as organic traffic, meta titles, goal completions and the total number of inbound links towards the page, it is possible to accurately establish what is influencing this presence.

This is particularly useful during competitor analysis because it can help to isolate why they have been able to acquire this visibility. There are over 200 ranking factors, so it will be impossible to determine exactly why they have this presence. But by including key factors such as:

  • Inbound links
  • Internal linking
  • Meta tags
  • Content relevancy/depth

A correlation can be made, meaning that if your work is closely aligned with what they are undertaking, you can be more confident that replicating this activity will bear fruit.

Finally, in the second tab, ‘insights’, here is a preview of one table:


By tagging the user journey stage against each page (unfortunately, this is a manual job), you can assess how your site content is distributed.

Importantly, this can provide additional insights – for example, whether new pages need creating, or whether more targeted work is needed to improve positions for pages that are already ranking strongly.

1. Refresh Your Content to Re-engage Bots

To help illustrate the strength of this approach, I have selected a site which is in a relatively competitive vertical.

Interestingly, this site was initially chosen to demonstrate the positive long-term effects subtle changes can have on your organic exposure.

However, in the last few months, it is clear that its organic presence has suffered. See the SEMrush graphic below:

Prior to this decrease, it is important to note that this site had experienced sustained growth since January 2014, occupying the top positions in the ‘sell your car’ & ‘buy your car’ market for at least three years.

For many of you who are involved in SEO, this would have seemed perplexing, particularly as their techniques appeared to contradict long-held beliefs about what influences organic visibility. These included:

  • 90% of their link profile was dominated by link directories.
  • They used high volumes of keyword-focussed anchor text within their internal links.
  • Thin content was held within poorly designed templates.

These anomalies tickled my curiosity, so I went away and conducted a mini-analysis. Here is a summary of those findings:

  1. In September 2014, the page was ranking for only one keyword. Then, following changes to their internal link anchor text to ‘sell my car’ in November 2014, the number of keywords the page ranked for improved to 20.

By January 2015, that had risen to 39 keywords.

  2. The content and page template did not change until some point between May and September 2015. We cannot precisely isolate when these changes were applied, but according to SEMrush data, it is clear that their presence increased further, to 77 keywords ranking in the top 20 positions.

From this, we can infer that the content was added in June or July 2015.

3. Using Majestic’s backlink history comparison tool, it is clear that one link was built in October 2014 and another in May 2015.

That links and added content influence rankings is certainly not a ground-breaking discovery. The significance of these findings is that the initial improvement in organic visibility occurred after the changes to the anchor text of the internal links.

Granted, the link built in October 2014 could have contributed towards this, but it is also interesting to note what followed changes to the navigation and internal linking in November 2016, from this:

To this:

As the table shows, between November and December there was a slight improvement in organic visibility.

Then, in February 2017, a drop in exposure followed, and it continued to fall to the present month, May 2017. Notice the significant drop in top-three ranking positions for May 2017:


To better understand this drastic shift in performance: according to Search Engine Land, there was an unconfirmed Google algorithm update around this time, which appeared to target spam linking activity.

Folks in the “black hat” SEO community seem to be noticing this and complaining that their tactics are not working as well.

If these suspicions are true, it becomes increasingly apparent that spammy linking activity has begun to negatively affect this site’s performance.

To Summarise the Key Insights from Discovering Your Own Technique:

  1. Despite helping to influence performance for a number of years, spam link building tactics are not sustainable and will inevitably be penalised by search algorithms.

It was apparent that, following a link being built in October 2014, the number of ranking keywords significantly increased. You could rightly argue that this was a long time ago and that today, links may not have the same influence. But following a brief analysis of luxury travel provider Elegant Resorts, it is clear that the number of links to a page does positively influence top 10 ranking visibility:

By establishing a relationship between when a link was built to a page and the impact it had on site visibility, answers to three difficult questions can be proposed:

  1. Did the link(s) have a direct impact on performance over a sustained period of time?
  2. How long did it take to see an improvement in rankings for that page? Tip: consider using a competitor’s site with a similar link profile to your own. By following this logic, your site has a better chance of replicating the analysed site’s success.
  3. What was the link in question, and how can I better align my activity towards similar types of links? For example:
  • Was it a link within the content? If so, where?
  • What was the anchor text, and does the page’s anchor text distribution seem balanced?
  • What is the link profile & presence of the linking site – is it similar to your site, or is it a much stronger site that could distort what you can realistically achieve?
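The first two questions above can be checked programmatically once you have a link-build date and a ranking time series for a page. This is a minimal sketch under an assumed data shape (dates mapped to average positions), not a reference to any specific tool’s export format:

```python
from datetime import date, timedelta

def ranking_change_after_link(rankings, link_date, window_days=60):
    """Compare the average position before and after a link was built.

    rankings: dict mapping date -> average ranking position for the page.
    Returns the change in average position (negative = improvement,
    since lower positions are better), or None if data is too sparse."""
    before = [pos for d, pos in rankings.items()
              if link_date - timedelta(days=window_days) <= d < link_date]
    after = [pos for d, pos in rankings.items()
             if link_date <= d <= link_date + timedelta(days=window_days)]
    if not before or not after:
        return None  # not enough data on one side of the link date
    return sum(after) / len(after) - sum(before) / len(before)

# Hypothetical weekly data: a page averaging ~12th before a link, ~6th after.
history = {date(2014, 10, 1) + timedelta(days=7 * i): pos
           for i, pos in enumerate([12, 12, 11, 12, 6, 6, 5, 7])}
change = ranking_change_after_link(history, date(2014, 10, 29))
print(change)  # negative value -> positions improved after the link
```

Run against a competitor with a similar link profile, a consistently negative change shortly after their link acquisitions is the kind of signal the questions above are probing for.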

This analysis was undertaken by:

  • Compiling SEMrush ranking data and grouping pages into ranking ranges such as the top 10, 11th to 20th and 21st to 50th (similar to the ranking range image displayed above)
  • Using Majestic SEO data which isolated the number of unique linking root domains per page.

By combining these two sources, I was able to evaluate the extent to which the number of links positively contributed towards the average visibility of the top 10 ranking positions.

This was achieved by dividing the cumulative top 10 monthly search volume for the pages in each ranking range by the total number of pages in that range.

The table shows a strong correlation favouring more links towards a page, while the zero-link group offers a compelling contrast. To put it simply, 170 pages had zero links towards them. Out of those 170 pages, only three keywords ranked in the top 10, each with 60 monthly searches, meaning the total cumulative searches were 180.

Dividing the total monthly search volume (180) by the total number of pages (170) gives – on average – roughly one available monthly search per page for a top 10 result. Minimal presence, indeed!
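The same calculation can be sketched in a few lines, using the zero-link figures quoted above; the page-level data structure is an assumption for illustration:

```python
def searches_per_page(pages):
    """Average available top 10 monthly searches per page in a group:
    cumulative top 10 search volume divided by the number of pages."""
    total_volume = sum(p["top10_monthly_searches"] for p in pages)
    return total_volume / len(pages)

# 170 pages with zero inbound links; only three of them rank in the
# top 10, each for a keyword with 60 monthly searches (180 in total).
zero_link_pages = (
    [{"top10_monthly_searches": 60}] * 3
    + [{"top10_monthly_searches": 0}] * 167
)
print(round(searches_per_page(zero_link_pages), 2))  # ~1 search per page
```

Repeating this per link-count bucket (zero links, one to three, four-plus, and so on) produces the comparison table described above.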

A caveat: quality links will always outweigh quantity, so do not feel that an excessive number of links (over 10) per page is a requirement. In fact, excess inbound links to a page can produce diminishing returns, causing wasted time & effort.

A key point is that links remain a central part of the algorithm, and in my opinion, they are likely to remain that way for the near future.

2. Including keyword-rich anchor text within internal links does appear to affect ranking changes. But, and this should go without saying, don’t needlessly stuff your content with repetitive exact-match anchor text links. Instead, give the search engines a natural reference point to help guide them to what the page is about.

This is only one element of undertaking internal linking. Neil Patel presents a comprehensive overview of internal linking here.

3. The navigational change also improved performance. However, it would have been interesting to understand the sustained impact a mega menu would have had if the February penalty had not occurred.

4. Unsurprisingly, given that search engines index content, further ranking improvements clearly occurred following the content changes in June 2015.

This means that creating rich, semantically relevant content (as demonstrated earlier in the post) not only adds value but is also capable of producing results within a short timeframe.

Of course, this isn’t an exhaustive list by any means. But I hope this has given you a better idea of which techniques to employ to make that all-important difference in growing your organic presence. Feel free to share your own ideas, and if you attempt these tactics, I would love to hear about your successes or failures.


People are impatient. This is especially true for those who have either devoted their time or paid someone else to enhance their SEO presence. In many circles, we are told that SEO gains take time, or that they may not arrive at all. This combination of thoughts can understandably lead to everyone’s dirty word in business nowadays: procrastination. To help get you out of this mode, these points can help you get to work on the right things:

  1. Links still matter. It is undeniable that sites with stronger link profiles outrank their weaker companions. Therefore, for sites that already have a strong profile, there is a greater chance that subtle changes can catapult you to greater success.
  2. Identify your existing search presence for your site, site categories and specific pages. SEMrush and SEOmonitor are two great tools for establishing this. By identifying pages that are already ranking strongly, you can prioritise your efforts and realise quicker benefits.
  3. Refreshing content: this can involve updating meta titles to harness the power of the RankBrain update; updating historic blog posts by adding fresh insights or improving their optimisation; or, lastly, deindexing your site. This last tip may sound extreme, but I have seen evidence of it working.
  4. If you have gone through site migrations or changed pages, ensure redirects are properly mapped to the correct location and, if there are multiple redirect hops, consider reducing the length of the chain.
  5. As you know, search engine algorithms are in constant flux. The effect of these changes can shift over time, so it is worthwhile carrying out your own tests to gauge the impact certain tactics are having. Using a combination of the Wayback Machine, SEMrush & Majestic, it is possible to isolate which on-page or off-page changes occurred within a month and determine whether they correlated with an impact on performance that was also sustained over a longer period of time.
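The redirect-chain check in point 4 can be audited offline if you export your redirect rules as a simple source-to-target map. This sketch (with an assumed dict format for the map) walks each chain and flags anything longer than a single hop:

```python
def redirect_chain(redirects, url, max_hops=10):
    """Follow a URL through a source -> target redirect map and
    return the full chain, stopping on a loop or the hop limit."""
    chain = [url]
    seen = {url}
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        if url in seen:
            chain.append(url)
            break  # redirect loop detected
        chain.append(url)
        seen.add(url)
    return chain

# Hypothetical redirect map from a migrated site.
redirects = {
    "/old-page": "/interim-page",
    "/interim-page": "/new-page",
}
chain = redirect_chain(redirects, "/old-page")
print(chain)           # ['/old-page', '/interim-page', '/new-page']
print(len(chain) - 1)  # 2 hops -- worth collapsing to a single redirect
```

Any chain with more than one hop is a candidate for pointing the original URL straight at the final destination, which spends less crawl budget and passes equity more directly.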