What is SEO - The Definitive Guide to On Page Optimisation

Contents


Chapter 1: What is SEO?

Chapter 2: How to Choose the Right Keywords

Chapter 3: A Practical Guide to on-page SEO optimisation

Chapter 4: What is the best Website Structure?

Chapter 5: What is Technical SEO – An in-depth look

Chapter 6: Google Tools – Search Console and Webmaster Tools

Chapter 7: How to make use of the Google Webmaster Guidelines

Chapter 1:

What is SEO?

SEO is essentially split into two parts: on-page and off-page. This guide deals solely with what you have control of on your own website: what changes you can make, the best practices, and how to make sure you are at the top of your game. If you are more interested in off-page SEO and link building, you can learn about it in our link building guide.

Search Engine Optimisation (SEO) isn't new. It has been around in one way or another for many years. You hear the term SEO used all the time in digital marketing, but how exactly does it work and what is it?

Learning SEO isn't that difficult. Some 'experts' will have you thinking that it takes years to master and that you need to be a technical boffin before you can implement a strong optimisation strategy. The truth is you can learn the basics very quickly. So if you are looking to improve your ranking in the Search Engine Results Pages (SERPs) or increase traffic to your website to boost sales, then this guide is for you.

Google and other search engines are essentially a vast index of content. When a web user enters a search query into Google, Bing or any other search engine, the engine uses highly complex algorithms and technologies to scan billions of web pages and, in a matter of seconds, present the visitor with a list of pages that fit the query.

SEO is about anticipating searcher intent and optimising your website around key search terms which are closely related to what your customers are searching for. The more relevant and authoritative your website is, the higher your website will rank in the SERPs.

Although SEO is 'free' in the sense that you cannot pay to increase your organic rankings, it is a systematic process involving lots of different elements that work together to increase the quality and quantity of traffic to your website, and it can therefore be costly depending on your niche.


Should SEO or Your Website Come First?

This is one of the most frequently debated questions among businesses that don't fully understand the process. Many people fail to consider the significance of SEO from the initial stages of website planning, and this can prove to be a huge mistake.

A number of business owners will invest considerable amounts of time and effort in selecting a great design agency to create an amazing website, but little thought is given to important elements such as calls to action, unique selling points and conversions, not to mention website structure or traffic.

SEO should always be considered at the same time as the website is being created and developed during discussions about design, content, structure, style and branding. Ideally your SEO company and developer should work closely throughout each stage of the build to ensure that the design of your site and SEO complement each other.

How can SEO help with my website's Google visibility?

SEO is a great tool for marketers. Not only does it have the power to boost the visibility of your website, it can also help to increase awareness of your brand. If you select and implement the right strategies SEO can help with the visibility of your website in a number of ways:

The first is technical SEO, which relates to how your site is structured, the way your pages have been set up and whether Google can crawl your site.

The second element is your content. Is your content well written and is it useful to the web visitor?

The third element is links. Link building is a complex, lengthy and detailed process which involves creating and promoting outstanding content to encourage links from high authority websites.

This page is solely focused on the elements you have control of on your own site. If you want to learn more about off-page SEO, go to our page about link building.

By focusing on each of these areas you will be able to use SEO to boost the visibility of your website.

What is 'Organic' Search Engine Optimisation?

Organic Search Engine Optimisation is a process used to rank a website in the search engine results pages using unpaid techniques. There are many different elements to organic SEO including keyword research, backlinks and content development, all of which can help boost the ranking of a web page. Websites that use organic strategies effectively can grow, expand and adapt over time in response to what web users in a specific niche are searching for.

Organic SEO can be achieved in a number of ways including:

Optimising web pages with relevant, high value content

Securing high quality links from reputable, authoritative websites

Ensuring that each website page is suitably optimised in terms of alt tags, meta data and user experience

Organic SEO can offer a number of benefits including:

An increased number of clicks because the content is more tailored to the search preferences of web users

The ability to build and maintain trust amongst the target market

The capacity to build authority and become recognised as an industry expert

Cost effectiveness in relation to paid search

Seeing results with organic search does not happen overnight. You cannot implement a series of SEO strategies and hope that they will start working in a couple of days. It can take up to six months before you start seeing results so you do need to be patient.

Keeping up to date with SEO

SEO is an ever changing field. Content guidelines, optimisation techniques and link building strategies are evolving on a regular basis. If you want to make the best use of the latest SEO tools, techniques and strategies, keeping up to date with industry developments is important.

What was relevant six months ago is perhaps not so relevant today. So how do you keep up to date with the latest developments and trends in the SEO industry? Here are our top methods for maintaining your SEO knowledge.

SEO Newsletters

One of the best ways to keep up to date with SEO is to sign up to newsletters. Subscribe to mailing lists on websites such as Search Engine Journal, Neil Patel and Copyblogger. Each week you will receive a collection of news and information on the best techniques and strategies relating to SEO and content development. There are plenty of lists to subscribe to, each providing unique pieces of information on content, link building or the technical aspects of SEO.


The Moz Top 10

img

Well known industry websites such as Moz and Backlinko regularly distribute new content about the SEO industry. The Moz Top 10 provides lots of information about SEO, social media and inbound marketing delivered to you in a monthly newsletter.

Google Webmaster Forum

Keep up to date with the latest changes at Google through the Webmaster Forum. Google shares its major announcements in this forum, as well as discussion about what they mean, so it's definitely one to follow.

Search Engine Journal

This website curates content from a number of industry leaders who are experts in the field, bringing you the most up to date, relevant content and information surrounding SEO.

Social Media

Follow industry influencers on Social Media websites such as Instagram, Facebook and Twitter. Build up your knowledge by reading posts from industry influencers such as Brian Dean and Neil Patel.

Keeping your knowledge up to date is very important, particularly in the SEO industry. If there is a major algorithm update and you don't know about it, you may experience significant shifts to your website traffic if you don't adequately prepare. That being said, it can be difficult to find definitive answers about updates to algorithms or other tweaks made to search engine results.

Is SEO just speculation or are there any absolutes?


There is often much speculation amongst industry experts and influencers about the SEO world. Often when there is any type of change in rankings discussion will increase online surrounding a potential algorithm change or an update of some description.

Whether it's Fred, Penguin, Hummingbird or Panda, any shift in rankings will initiate a flurry of activity and discussion in the SEO world. However, many of these discussions are mere speculation, as Google very rarely release official confirmation of which updates have taken place or, when they do, what exactly those updates mean.

Furthermore, there are hundreds of different ranking factors, many of which have not been confirmed by Google. No one knows exactly what all of these ranking factors are or how they are weighted, so SEO can often feel like a matter of chance.

SEO can be very hit and miss: there are no absolutes or definitive guidelines that state exactly how you should optimise your website. All you can do is follow best practice, comply with Google's guidelines and optimise your site using recommended techniques.

The advice that you find or guides that you read like this one are based on information gathered by numerous industry experts who have spent many years testing different techniques to find out what works and what doesn't.

Google Hangouts and Webmaster videos

When it comes to finding out about SEO or trying to obtain definitive answers, this often proves very difficult. There are plenty of videos on Google Hangouts and in the Webmaster tools, but lots of these actually skate around the issues being raised rather than providing solid answers.

This not only makes your job much harder, but it can sometimes prove confusing, particularly if you are new to SEO or still learning.

Unless major updates are released, the responses to questions surrounding minor updates or changes remain very much open to debate with no solid answers coming from Google. The best way to approach this is to follow best practice guidelines from industry influencers and keep using tried and tested techniques.

How long does it take to see results in the SERPs?

This is a question with no definitive answer, but it would be wise to remember that SEO won't work overnight. When a business starts implementing an SEO strategy, they often expect to see results in a matter of days.

The truth is, however, that SEO can take many months to actually take effect. There are many variables which affect how long your SEO strategy will take to start delivering results.

Traffic, the design of your website, the age of your domain, location, competition and your target market will all have an impact on how quickly you should expect to see results. Websites can typically start seeing results within 1 to 6 months of starting an SEO campaign.

Chapter 2:

How to Choose the Right Keywords

Keyword research is a fundamental element of SEO. Keywords need to be deployed strategically throughout your content, ensuring that the content provides value and it is written for a human web visitor rather than a Googlebot.

But how exactly do you choose the right keywords?

This section will show you how.

Although there are many ways to undertake keyword research, certain strategies will bring more effective results than others. The way in which you approach keyword research will depend on:

The website that you currently have to work with – Is the website established or completely new? How many pages do you have? Is the content of an acceptable quality? Is the website considered to be authoritative? Does it feature content that is so exceptional people want to link to it and share it?

Objectives – What do you want your marketing strategy to achieve? Exposure, leads, traffic or sales?

Finances – Your resources, budget and deadlines

Industry – How competitive your industry is

Seed keywords are the building blocks of SEO. They are initial words and short phrases that customers or web users enter into search engines to find products, services or solutions that you sell. This type of keyword will aim to identify both your industry and what people are searching for.

If you already have a business, a product or service that you wish to promote, coming up with a set of seed keywords is as easy as describing your business or what you sell. Think about how other people search.

Once you have a good amount of seed keywords you can then move on to the next stage. This involves generating a list of relevant keywords and understanding in more depth how people are searching in your niche.

To do this you will need a number of tools. There are lots of different ones that you can use, but for this guide, we will be looking at Uber Suggest, Google and Answer the Public. The majority of these tools will pull the keywords from a number of places including the Google Keyword Planner, Google Auto Suggest and Related or Similar Searches in Google Search.

Uber Suggest

From your initial list of seed keywords, enter one into the search bar on Uber Suggest, select the country and click 'Suggest'. For this example, we will use the seed keyword 'digital marketing'.


The search will complete and present a list of keywords:


You can then download these keywords and add them to your list for the next stage.

Understand your niche

From this initial research you will start to see themes or patterns in the keywords that keep appearing. You will learn more about your niche and how people are searching. As you progress through these steps in the keyword research process, you will be able to identify some great keyword ideas that are not being used. One of the best ways to achieve this is to really understand your audience.

Know your audience – What are they searching for?

Identifying the right keywords for your target audience can be a challenge. It is part magic and part science. A strong keyword requires precise knowledge of your audience and more specifically, an understanding of the web user's intent. Why are they searching for a specific keyword? Is it to find out more information? Choose a product or service to buy? Or something else?

There are a number of tools and techniques that you can use to find out what your target audience are searching for.

A useful tool for this task is Answer the Public.


For this example, we will use a broad search term such as SEO Services. Enter the search term of your choice and then click on 'Get Questions'.

The website will then bring up a series of sections: Questions, Prepositions, Comparisons, Alphabetical and Related.

In the 'Questions' section there are countless questions that give you an indication of what people are searching for in relation to your niche.


As well as Answer the Public you should aim to understand what compels people to start a search for something. Many searchers begin their search with a problem, whether it is searching for a gift or leaking windows that need repairing.

These problems then turn into questions. As the customer learns more about their problem the questions will become more specific until the issue is resolved.

This is recognised as the buyer's journey, with the resolution being the sale of a product or service. The buyer's journey progresses through three stages: Awareness of the problem, Consideration, where they seek out solutions, and Decision, where the customer chooses a provider.

As a business owner you need to understand what triggers your audience to search for your product or service to compel them to take action. The best way to do this is to survey your existing customers and carry out research.

Once customers have carried out their research they will begin to shop and compare before making a final decision. Use the various tools such as AhRefs and SEMRush to find high converting keywords and explore how people are finding competitor websites.

Your competitors can be a really useful resource in identifying what your potential customers are searching for and why. All you need to do is to identify a handful of competitors and then enter their URL into a tool such as AhRefs or SEMRush. This will provide you with a vast amount of information on what your competitors are ranking for.

How to use Google Suggest to find valuable keywords

Another great (and free) tool is Google Suggest. This is simply the auto complete function on the Google Search bar when you type in a query. As you begin to type, a series of suggestions will be displayed. It's as easy as that!


These are potential keyword options. The terms displayed by Google Suggest are the most frequently searched terms related to that specific keyword. This feature saves time and provides searchers with additional information about the topic. For the search term 'content writing' we have a range of options, from 'content writing services UK' through to 'content writing tips'.

Look at traffic from competitors' keywords – SEMRush

As briefly discussed above, competitors can provide lots of valuable information that you can use in your own keyword research. A competitor analysis can reveal specific information about searcher intent and save you a lot of time in the process.

Research specific keywords that they rank for and select the most suitable ones. If you don't have an idea of who your competitors actually are, just enter a few seed keywords into Google and see who is listed on the first page. Let's take an example. If you were searching for business card design, you would enter this seed keyword into Google and the following results would be presented:


Take one of the search listings in the organic search results above. Let's take Instant Print. Head over to SEMRush and enter the URL of this website:


When using this tool, sometimes just a single competitor can give you hundreds of possible keyword ideas, enough to sustain your SEO strategy for several months. However, it is wise to note that a single web page can rank for hundreds if not thousands of keywords. Consequently, it is much better to focus on competitors' top performing pages rather than on individual keywords.

Once you click through into the organic research section you can see the keywords that this business ranks for, what position it holds, and what monthly search volume there is for each particular phrase.


There may be quite a few keywords that your competitors are targeting but you are not. This could be one of the biggest opportunities that you are missing out on (providing of course, that it fits in with your business model and the products or services that you sell).

Run keywords through Google Keyword Planner & SEMRush

Take the list of keywords that you have and use the Google Keyword Planner and SEMRush to find out whether they are viable targets. There are certain things that you will need to look for, including:

Search Volume – Displays the overall demand for any given keyword. The majority of keyword tools pull this information from Adwords.

Number of Clicks – If you are researching a specific keyword and you find that it is averaging hundreds of thousands of searches every month this means that it can potentially bring huge amounts of traffic to your website, provided that you can rank for it.

Some keywords will serve to drive traffic to your website while others will only drag down your CTR. Using a keyword explorer tool you should be able to identify that certain keywords don't result in any clicks, because the answer the searcher needs is delivered directly on the SERP. This is an effective way of eliminating keywords with high demand but poor click-through rates.

Potential – We all want to know how much potential a keyword has. Search volume and clicks are great metrics but one keyword can have hundreds of potential synonyms and related search terms all of which can be targeted by a single web page.

Difficulty – Assessing the ranking difficulty of a specific keyword is achieved by evaluating the search results and using your SEO know how. Each keyword tool that you visit whether it is Ahrefs or SEMRush will have its own way of working out keyword difficulty.

Cost Per Click – This metric is more relevant to paid marketers than to those working with SEO, but many experts use it as an indication of a keyword's commercial intent. This is discussed in more detail in the next section.

Choose which keywords have high commercial intent

When it comes to choosing keywords, commercial intent is perhaps more important than search volume. With a little research you can actually find what keywords your buyers use.

Once you start ranking for these keywords and your website starts generating traffic from these search terms, it is much easier to turn visitors into customers.

When exploring keywords that have commercial intent, there are several categories: Buy Now, Product, Informational and Low Conversion.

Buy Now – These keywords are those which will be used by searchers who are ready to make a purchase immediately. Buy Now keywords will include phrases such as buy, discount, shipping, offer or deal. They may not generate a significant amount of traffic but they deliver an extremely high conversion rate.

Product – These are searches that are built around a brand, product category or service. This type of searcher will convert well but they are not as ready as those using buy now keywords. Typical search terms in this category include best, review, top 10, specific product names or brands, affordable, comparison, best of etc.

Informational – The majority of keywords that you will find are those that are informational. People looking for information generally don't convert very well. You can make good use of these but you have to get creative. Informational keywords include how to, best way to, ways to or I need to.

Informational keywords are great for building content around: content that is really useful, which people will share and bookmark. You can of course then advertise your own products within your informational keyword based content. Marketing is all about seeing the opportunities and getting creative.

Low Conversion – These keywords are those which are unlikely to convert immediately. A few examples include 'download' or 'free'. However, don't dismiss these out of hand: if you can get traffic to these pages, you can make use of it.

Once you have identified whether your keywords demonstrate buyer intent, you can check how valuable the traffic is which is generated from a specific keyword.

The first way you can do this is to check the AdWords suggested bid amount, or cost per click. Log in to your Google AdWords account, locate the Keyword Planner and click on the first option, 'Search for new keywords using a phrase, website or category'.

Enter your search term and click 'Get Ideas'


Review the suggested bid column which will give you an idea of commercial intent for a specific keyword.


The key search term, 'small business website design', is a product keyword with a suggested bid of £16.29. Other keywords with a lower bid are less important to current bidders in the paid auction (PPC), but this doesn't mean you should ignore them.

Looking at competition is another good way of assessing commercial intent. Competition is simply how many advertisers are bidding on a specific keyword. The more bids there are on a particular keyword the more expensive the bid amount is.
Another top tip is to simply search in Google for your keyword. If there are lots of advertisements at the top of the page, you know that you are looking at a top performing keyword and probably one that you should target.

As you start to build your list of keywords, it is a good idea at this point to look at Google Trends, a useful tool which can help you refine your keyword list.

See whether Google Trends shows growth or decline

Google Trends is a great tool when it comes to SEO. It can help you identify trends for keywords, display seasonal spikes and local differences and help you locate content ideas.

Taking the keyword from the last section, 'small business website design', enter it into Google Trends to see how interest in the term has changed over time.

Chapter 3:

A Practical Guide to on-page SEO optimisation

When it comes to on page SEO there are some practical strategies that you can implement to help generate more traffic.

Page URL – Best Practices

A Uniform Resource Locator, or URL for short, is simply a website address used to identify the location of a resource on the web. This is often the first thing that customers and Google will see. A URL is a foundation on which to build a strong site hierarchy. Your domain will act as a gateway and direct web visitors to the required page(s).

If you don't construct your URLs correctly they can cause problems. However getting them right requires consideration in terms of accessibility, usability and of course, best practice from an SEO point of view.

Although there isn't a single approach that every website can adopt, there are some best practices that you can follow to make sure you get the most from your URLs.

Keywords – Whenever you create a new page on your domain, it should have a purpose. Whether this is to generate a sale, provide information or serve as an administrative login page, its purpose should be clearly identifiable.

When you create a web page it should be discoverable by the right people and by search engine crawlers. As you construct your URL, keep this in mind and incorporate a suitable search term that accurately describes the page.

When creating URLs, be careful: an incorrect website structure can result in your site becoming a mishmash of sub-domains and multiple paths that arrive at the same products. This not only offers a poor user experience, it also confuses Google, making it difficult for both people and search engines to understand how your product offering is structured.

URLs can offer three main benefits to SEO:

#1 User Experience
A well thought out URL will provide information that can be read and understood by both humans and search engines. A URL should always accurately describe where it is taking the web visitor. Even if the title tag was hidden, the URL should provide a clear picture of what the target page is about before a web user clicks on the link.

#2 Rankings
Although URLs are only a minor factor in SEO, they do help when search engines are trying to understand what a particular page on the internet is about. They are also used to assess relevancy when a web visitor types in a search query. Keyword use in the URL may act as a ranking factor. Using keywords in a URL can increase the visibility of your website. It is much better to create a useful, user friendly URL than to create one simply because it includes a keyword.

#3 Links
If a URL is carefully planned, it can serve as its own anchor text when it is copied and pasted into other web pages on blogs, social media platforms and forums.

SEO Best Practice
When it comes to creating the URL for the pages on your site, they should be well structured and clearly identify where they are taking a web visitor. Follow these best practices:

Create clear, relevant and compelling URLs which are incredibly accurate. This is the key to ensuring that both search engines and users can understand them. Although you can include page ID numbers in the URL, best practice suggests that you replace these with actual words (a simple example follows this list). If you use WordPress, you can switch from numeric URLs to named ones under Settings > Permalinks by choosing the 'Post name' structure.

Use hyphens (-) to separate words where a separator is needed. Avoid underscores, spaces or other characters for this purpose.

Always use lower case. Sometimes upper-case letters can cause problems with duplicate pages.

Where you are working with a large site with multi-faceted menus and options, as well as large product inventory, correct use of the canonical tag is essential for SEO.
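
To illustrate the points above (example.com is used purely as a placeholder domain), a descriptive, lowercase, hyphenated URL is far easier for users and search engines to read than a parameter-driven one:

https://www.example.com/services/web-design/

rather than:

https://www.example.com/index.php?id=274&cat=7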

Title tag – How to create effective titles for SEO & clicks

Begin the title tag with your keyword. This is one of the most important on-page SEO factors. Your keyword should be as close as possible to the beginning of the title: the closer it is to the front, the more weight it tends to carry with search engines.

Also add modifiers to your title such as 'review', 'best' or 'guide', which will help you rank for long tail versions of your keywords. Ensure that you have a close variation of your title tag wrapped in the < h1 > tag.
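
As a simple sketch (the wording here is purely illustrative), a title tag that leads with the keyword and adds a modifier, paired with a closely matching H1, might look like this:

< title >Digital Marketing Guide: Best Strategies, Tips & Tools< /title >
< h1 >Digital Marketing Guide: Strategies, Tips and Tools< /h1 >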

Is a meta description an SEO factor?

Since September 2009, Google has not used the meta description as part of its ranking algorithm. However, writing effective meta descriptions can help with click-through rates, which do ultimately affect rankings. The meta descriptions for pages you build to rank in the SERPs are therefore very important: get them right and experiment with them.

A meta description is a snippet of information, no longer than 160 characters which succinctly summarises the content of a page. The main goal of your meta description is to increase click through.

A meta description should be actionable and include a call to action as well as incorporating structured content. Include phrases such as 'Learn More' or 'Buy Now' to encourage the reader to click onto your website.

Meta descriptions must be accurate. If people are clicking through to your site but then hitting the back button almost immediately, your meta description isn't as accurate as it should be which is not a good thing in terms of SEO. Structured content simply means including useful terms that will tell the reader what they want to find out about.
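
As a rough illustration (the wording is hypothetical but kept under 160 characters), an accurate, actionable meta description might look like this:

< meta name="description" content="Learn how SEO works and how to optimise every page of your site. Read our free step-by-step guide and start ranking higher today." >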

How to create great page content

Organic SEO depends on great content. If you want to build a strong online brand, boost your SEO efforts, secure more traffic and generate sales, you need exceptional content. If you create content that is unique, engaging and valuable, search engines will help you gain more visibility.

Google will always reward high quality websites that provide readers with highly valuable and original content. There are hundreds of ways that you can create outstanding content, but here are just a few:

Originality – Google loves original content, even more so if the concepts contained within the content are also original. Don't be tempted to create lots of content scraped from various places around the web. Be creative, be unique and offer your own spin on things.

Headlines – A good headline will capture a reader's interest and engage them. Take time to craft compelling headlines. They are the first piece of content that people will read, so make them count and encourage people to read your article, avoiding irrelevant clickbait.

Actionable – Your content should be actionable. The best content doesn't just inform; it teaches. Readers should be able to apply what they have learned through top tips. Think of each piece of content as a tutorial: at the end, the reader should be able to take away something useful or put some of the information you have provided into practice.

Answer Questions – A search engine is a place where people ask questions, and your content should provide the answers. Make your content easy to scan so that readers can pick up the most important points without having to read the entire guide or article.

Accuracy – As well as being informative, your content must be accurate. If you quote statistics you must ensure that the information you cite is accurate. Inaccuracies can create all kinds of problems. Don't forget, accuracy builds trust.

Engagement – Engaged readers will interact more with your post, but how do you create an engaging piece of content? Pose thought provoking questions or encourage readers to reflect on the advice that you have provided. Create a promising introduction and tell stories. If you create content that sparks interest, search engines will recognise that your articles are important and your site will be crawled and indexed more frequently.

Video and Images – Everyone absorbs information differently; some learn more effectively by seeing, others by hearing. Visual aids are a great way to facilitate learning. Whether it is through an infographic, video or diagram, they can help communicate your points. However only use media that adds value or complements your article.

How long should your content be?

As a general rule, the longer the content, the better: long-form content tends to rank better than posts of 500 words or less. Content should also increase dwell time. If someone lands on your website and immediately clicks the 'back' button, this signals to Google that the page is low quality and does not serve the user's query very well. Increase the amount of time spent on your pages by creating long, engaging content that keeps people reading.

What is the optimum keyword density?

Keyword density is an old school SEO tactic and one that you should avoid. It is no longer effective, and over-optimising your pages could actually get your website penalised. Simply make sure your content and keywords are relevant and read naturally.

Is there a duplicate content penalty?

Duplicate content is a confusing area for many and there are lots of conflicting ideas about how it is viewed by Google. In terms of a penalty, there is no such thing.

That being said, this does not mean that you can copy other people’s content or publish the same content across multiple web pages. According to Google duplicate content is classed as important blocks of text on the same or different domains that are identical or very similar.

For the most part, this content is not designed to be deceptive. Duplicate content is often mistaken for a penalty because of the way it is processed by Google: all Google does is filter the duplicates out of the search results. The simple rule is to create unique, outstanding content; don't skimp or use questionable tactics to save money, and do the best you can.

Should I have outbound links in my content?

Absolutely. Outbound links to related pages are a strong relevancy signal which helps Google identify what your page is about. Pages with outbound links are more likely to outrank pages without them. Outbound links however shouldn’t just be from anywhere. They should be carefully selected and link to authoritative, industry relevant websites.

Heading tags – how to make best use of H1-H6

Heading tags within your content are important. There are six possible headings, from H1 through to H6, and they are used by Google to understand the structure of the text on a page. H1 tags are usually used for main headings such as page titles.
H2 through to H6 tags are then used as sub-headings throughout the content on the page. Use H2 tags to divide your page into easy to digest segments, and H3 for a sub-heading of an H2. Headings H4 through to H6 are not used as often as the first three, but they are particularly useful if your content is longer than 1000 words.
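
As a minimal illustration of that hierarchy (the headings themselves are hypothetical), a page might be structured like this:

< h1 >What is SEO?< /h1 >
< h2 >On-page SEO< /h2 >
< h3 >Title tags< /h3 >
< h3 >Meta descriptions< /h3 >
< h2 >Off-page SEO< /h2 >
< h3 >Link building< /h3 >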

Alt image text – the best way to use alt text for SEO

Always ensure that your image file names are suitably optimised and, where relevant, include your target keyword. The alt text should also be optimised. Search engines cannot yet reliably read images, so you need to tell them with text what the image shows; describe the image accurately rather than simply making the alt text your desired seed keyword.
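
For example (a hypothetical product image), a descriptive file name and alt text would look something like this, rather than an alt attribute stuffed with the seed keyword:

< img src="mens-blue-trail-running-shoes.jpg" alt="Pair of men's blue trail running shoes on a white background" >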

Including rich media such as videos & images

To optimise visibility in the search results, your web pages require more than simply written content. It is well documented that including images boosts engagement both on social media and in organic search. The majority of web pages have at least one image but to boost visibility in the SERPs they need to be effectively optimised. Best practice for images suggests that you include:

Optimisation – While SEO best practice encourages you to name your images with suitable tags, your image strategy should combine original images from your brand with the usual stock photography.

Quality – Image quality is of utmost importance; images should be easy to see and free from pixelation.

Description – Describe your image clearly and concisely, both in the description and in the alt text.

Videos on your website can enhance user experience by providing additional ways for web visitors to access your content. The majority of businesses are slowly moving toward the use of video as part of their content strategy. Best practice suggests that you include:

Questions – Create videos based around questions that your target audience would like answering.

Optimisation – Ensure that your video is optimised for websites such as YouTube but also Google organic search results

Length – Keep your videos short. No more than 30 seconds.

Semantics, language, RankBrain & text analysis

Semantic search cannot be overlooked when it comes to content creation. When semantics are applied to search, it means the way in which words are used and how they are logically connected. Through semantics, search engines will aim to improve the accuracy of search results by understanding the intent of the searcher using the context in which they are searching.

By matching synonyms and word relationships using algorithms that understand natural language, semantic search delivers more relevant results. It demands an enhanced understanding of intent, the capacity to provide answers and the ability to offer more personalised user experiences.

As you develop your content there are a number of strategies that you should deploy that will benefit both SEO and semantic search:

Value – Create content that becomes a recognised resource in your industry and become an expert source that Google can reference. Build your business so that you are a go to resource that establishes connections, exchanges information and provides visitors with value.

The following questions will help you to build a winning organic search strategy:

  • What types of keywords or search terms would you like to rank for?
  • What businesses are currently dominating this space?
  • What are those businesses doing to make themselves authoritative?
  • How can you make your content 10 times better?
  • Who is liking, linking to and sharing your content at the moment?
  • How are users interacting with your content – conversions, sharing, subscriptions?
  • What stage in the buying cycle is your content targeting customers at – the beginning, when they are gathering information, or when they are close to making a decision to buy?
  • How can you improve user experience?

Content that you create should be targeted and non-branded. It should relate to the products or services that you sell, but it should first and foremost educate, advise or inform readers, providing something valuable.

What exactly is RankBrain and do I need to consider it?

RankBrain is an artificial intelligence component that helps Google process search queries. It converts large amounts of written language into mathematical vectors which can be understood by computers.

If RankBrain sees a word or phrase that it is unfamiliar with, it can make a good guess as to which words or phrases may have the same or similar meanings, and filter the results accordingly.

Once you have your content built around semantic search, RankBrain takes this a step further. Using latent semantic indexing, Google is able to understand the relationships between words, phrases and web pages and make associations between them. This allows the search engine to create a series of expectations about the terms that will appear within a given context.

Using RankBrain, Google will further decide whether these associations are important depending on the context of the web page. If, for example, you are creating an in-depth guide on digital marketing, you may want to rank for this search term, but RankBrain may have a better understanding of what the most appropriate results are for this query.

As it reviews your content, it will conclude that the best results for this query have a number of elements in common. The majority of the top ranking pages for digital marketing mention related terms such as paid search, email marketing and SEO: terms that are logically connected and should be included in the guide.

This is how Google's RankBrain is connected to semantic search and language selection. RankBrain and LSI focus on identifying whether a page is in depth enough by evaluating the language and detail of the content that you write.

As well as understanding new keywords, Google can now tweak its algorithm to measure user satisfaction. Although there is no definitive proof as to how Google uses UX signals, the process will usually look like this:

The user types in a search query

RankBrain will classify this query into concepts

A series of results will be displayed based on these concepts

If the page satisfies the user, it will be seen as more relevant.

RankBrain works by serving users with a set of pages that it thinks they will like. If lots of people like a particular result, that page will receive a boost to its ranking.

The whole concept of RankBrain is partly looking at how people interact with the search results, specifically focusing on click through rates, dwell time and bounce rates.

Does your site need to be mobile responsive?

Google have stated on multiple occasions that responsive design is essential and there has even been speculation that responsive websites receive a boost in rankings. With more and more people accessing content while on the go through mobile devices it is essential that their user experience is as seamless as possible whatever device they are using.

If a web visitor lands on your website using a mobile device and it is difficult to use, there is a high likelihood that they will leave and go to a competitor site. Furthermore, if a web visitor has a positive experience with your mobile website they are more likely to return.
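
A responsive site starts with the standard viewport meta tag in the head of each page (most modern themes and frameworks already include it):

< meta name="viewport" content="width=device-width, initial-scale=1" >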

How to make sure your page loads quickly – Site speed issues

Page load speed is another element that can affect website usability. Pages that load faster rank higher and convert better than those with slower page load times. Google has indicated that the speed of a website is one of the signals that is used by its algorithm to rank web pages.

Follow these steps to ensure that your page loads faster:

Eliminate render-blocking JavaScript and CSS in above-the-fold content

If you run the Google PageSpeed Insights tool and you find a suggestion to remove render-blocking scripts and CSS, one option on WordPress is to install and activate the Autoptimize plugin. Begin the process by ticking the JavaScript and CSS options and then clicking 'Save'. This is often all you need to address the problem; return to the PageSpeed tool and the issue should be fixed.
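
If you are not on WordPress, or you prefer to handle this by hand, a common pattern (the file names here are purely illustrative) is to defer non-critical JavaScript and load non-critical CSS asynchronously:

< script src="/js/main.js" defer >< /script >
< link rel="preload" href="/css/styles.css" as="style" onload="this.rel='stylesheet'" >
< noscript >< link rel="stylesheet" href="/css/styles.css" >< /noscript >

The noscript fallback ensures the stylesheet still loads for visitors who have JavaScript disabled.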

Minify CSS, HTML and JavaScript

Minifying your WordPress JavaScript and CSS can make pages load faster and speed up your WordPress website. The term 'minify' simply means to make smaller. If you are trying to achieve a higher score on Google PageSpeed, then minifying your CSS will certainly help. WordPress plugins are a great tool for this task. First, install and activate the Better WordPress Minify plugin. The plugin will be displayed on your menu as BWP Minify.

If you click on the menu item you will be taken to the settings page. All you have to do is to make sure the first two check boxes are ticked:

Minify JS files automatically and Minify CSS files automatically. Click Save.

Now you need to access your website as normal. In your browser, right click and select 'View Page Source'. If you look carefully you will notice that the CSS and JavaScript files are pulled from the plugin folder rather than WordPress themes. This is essentially a minified version of your CSS and JavaScript files.

This Plugin will minify HTML, CSS and JavaScript files. PHP files can be minified but it will not have any impact on page load speed.

Prioritise visible content

Although there is no easy solution to prioritising visible content, it will require you to explore each element of your page and consider how it could be better. There are three main elements that you will need to look at:

  • HTML – Ensure that the main on page content is loading first, before any other page elements. Design your page so that content is above the fold (so users can read the content without having to scroll to reach the first sentence)
  • CSS – Evaluate your CSS delivery using a dedicated CSS tool
  • JavaScript – Prevent JavaScript from starting until after the page has loaded wherever this is possible.


Slow page speed means that Google will crawl fewer web pages and this could have a negative impact on indexation. Increase your page speed through:

Compression – Using software such as Gzip, you can reduce the size of your website files that are larger than 150 bytes.

Code – Optimising your code can significantly increase the speed of your web pages. Remove code comments, unused code and formatting.

Redirects – Each time a page redirects, your web visitor has to wait longer for the page to load. Remove unnecessary redirects to speed up your website.

Browser Caching – Browsers cache lots of information so that when a visitor returns to your website the browser doesn't have to reload the entire page. Use a tool such as YSlow to check whether you already have an expiration date set for your cache.

Server Response – The response time of your server will depend on how much traffic your website receives, any other websites sharing the same server, your server's bandwidth limitations, the issues above and any other hosting related software problems.

To speed up the time it takes for your server to respond, identify performance issues and address them with your hosting company.

Optimise Images – Make sure that your images are no larger than they need to be and that they are in the right file format: PNG is generally better for graphics with fewer than 16 colours, while JPEG is usually best for photographs (a simple example follows below).
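
As a rough sketch (the file names and sizes are hypothetical), a correctly sized and described image with explicit dimensions might look like this:

< img src="/images/office-800.jpg" srcset="/images/office-400.jpg 400w, /images/office-800.jpg 800w" sizes="(max-width: 600px) 400px, 800px" width="800" height="533" alt="Our office reception area" >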

Chapter 4:

What is the best Website Structure?

Search engines love well structured content. A site with an unorganised mass of pages is not going to fare very well in the SERPs. Siloing your website allows you to group or categorise your content so that search engines can easily understand what your site is about and how the pages and posts are linked.

Home Page

Your home page should include content based on your theme, industry or niche.

If, for example, you sell SEO services, website design and social media marketing, your home page should be based around digital marketing.

Blog Page

Every website needs a blog, not only to generate traffic, but to help create an effective silo. Blogs should be in a sub-folder if they are going to contribute to a silo structure:

Within your blog there should also be clearly defined categories which could be in line with your website silo structure.

Silo Page

This is the point when your silo structure is used. Group your content into organised sections. Each group will focus on a separate category such as:
/digital-marketing/
/seo/
/pay-per-click/

Supporting Page

Your supporting pages should provide further information to search engines. The more detailed your supporting pages, the better your chances of ranking for long tail searches:
/digital-marketing/banner-ads/
/seo/organicsearch/
/pay-per-click/smallbusiness/

Developing a clear website structure not only helps users find your content more effectively, but it also makes your website easier to understand from the perspective of a search engine.

There are two main methods of structuring your website: a flat structure or a silo site architecture. But which one of these is better for SEO? Read on to find out.

Many content creators and marketers will create new content and post it on their blog. Is this really the best way to publish and organise your content, though? There is often only a vague website structure, with pages for products and services and all new content being posted on the blog.

But blogs don't really make sense from a structural point of view. Rather than carrying out in-depth keyword research and incorporating it into a suitable website structure, most companies end up with a few top level service pages, and all of the valuable content gets uploaded to the blog.

Blogs are great tools for businesses, but they can be used incorrectly. Really, the only content that should be posted on a blog is company news, announcements, media information and employee profiles.

Any informational resource that you publish should not be in your blog. A blog is often treated independently from the website, to the point where you are separating your website into two distinct sections.

When a search engine crawler lands on your site, your blog may as well exist as a standalone website. Blogs usually feature a flat structure, with every post on the same level or occasionally grouped into categories. Including links from your blog to your website pages doesn't really work or make that much sense.

Flat website structure

A flat website structure would appear as:

  • Domain.com/Page1/
  • Domain.com/Page2/
  • Domain.com/Page3/

A flat website structure is one commonly used on blogs, but a deeper architectural structure provides a much better way to group and organise your content. That's where silo structures come in.

Silo Website Structure

A silo website structure on the other hand is much more organised and you can see how the site is structured into unique categories.

A silo structure is one that is logically organised with a hierarchical grouping which is defined using topics and sub-topics. The more relevant your content is to your silo, the more relevant your website will be for Google. A silo allows you to take a broad topic such as digital marketing and silo your content down into more and more specific categories.
As an example you would start your website with a topic on digital marketing.
A silo structure would work in the following way:

  • Domain.com/Main-Topic1/
  • Domain.com/Main-Topic1/Supporting-Page 1/
  • Domain.com/Main-Topic1/Supporting-Page 2
  • Domain.com/Main-Topic2/
  • Domain.com/Main-Topic2/Supporting-Page 1
  • Etc..

The supporting pages would then link back to the top level page Main Topic 1, 2 etc.

This would be broken up into categories such as SEO, Email Marketing and Social Media Marketing.

The SEO category could then be divided further into on-page SEO and off-page SEO.

On-page SEO categories could include technical SEO, meta descriptions and content.

Off-page SEO could be broken down into link building and guest posting.

Each of these sub-categories could then be narrowed down even further. As you break the topics down into smaller and smaller categories you are effectively answering more and more user queries.

When you use a blog, more often than not internal links are not considered, or if they are, there will be a link out to your service page but not one back. If you create a strong silo structure, your internal linking becomes much easier to follow.
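
As a simple sketch of that linking pattern (the URLs follow the hypothetical digital marketing silo above), a supporting page and its top level page would link to each other:

On /digital-marketing/banner-ads/ you might include:
< a href="/digital-marketing/" >Back to our digital marketing guide< /a >

And on /digital-marketing/ you would link down to the supporting pages:
< a href="/digital-marketing/banner-ads/" >Read our guide to banner ads< /a >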

Why is Website Structure Important?

In terms of Google, silo structures are preferred. They present a clear and logical order of links, and as a result Google finds it much easier to understand how your website is structured. A silo is also more logical for UX, as users find their desired results more easily. All round, it is a better option for improving your business.

Silos for SEO

When a website closely matches a user's search query it is given extra points by Google. Relevancy is key for SEO. Many websites are a collection of pages with no common theme and as a result they can suffer in the SERPs.

Siloing a website will help to clarify the structure of your site and establish the groundwork for higher keyword rankings. Siloing is very similar to the way in which the information within a textbook is structured, with clear headings and sub-sections so the reader can navigate their way through with relative ease.

Silos for the best user experience

A silo website structure can benefit user experience in a number of ways. With a silo, products, services and useful content are neatly classified into relevant categories.

As the web pages are more organised, the visitor will have a better experience of your site and as a result their engagement will increase. Google favours websites that visitors appreciate. High engagement levels will send Google 'value signals' which will help your SEO ranking.

Silos make the links between your pages easier to see and the search engine will be able to identify how well you cover various topics. Demonstrating to Google that you provide value and depth will increase your domain authority and help your SERP position.

How do Silos help crawlability and indexing?

When you create a silo structure, it is much easier for crawlers to move across each of the pages that appear on your website. The way in which a website is structured will allow the bot to land on one page and then follow the internal linking structure, to see how all of the pages are linked before indexing them to appear in the SERPs.

Internal Navigation & Link Structure

Careful linking within your website demonstrates the relationships between pages and identifies the structure of the topics on your website. It also strengthens your main landing pages, which are the ones you would like to show up in the search results, because there are far fewer cross links for the spider to crawl than exist on a standard website.

Search engine spiders will use links to crawl from page to page on your website. Although there is no right and wrong way to complete your internal navigation and linking structure, there are some methods that will work better than others. One of the most popular methods is using a power page silo.

Google is becoming more and more advanced in terms of identifying web spam and linking schemes. To safeguard yourself from penalties associated with Penguin updates, the majority of inbound links should point to blog posts or quality content. However, the problem arises when these aren't pages that you wish to rank. This is where the power of internal linking comes in. Pages that attract links from external sources pass on link juice so you can distribute this power to your other service pages. A word of warning though, you must be careful not to over optimise the anchor text which can trigger a penalty.

Chapter 5:

What is Technical SEO - An in-depth look

Now that you have mastered on page SEO and you have a good understanding of what off page SEO involves; technical SEO is next on your to do list.
Don't worry if you don't understand what this is or where to start, the next chapters of this guide will teach you everything you need to know.

There's no doubt, technical SEO matters. It is a series of tasks and checks that you need to carry out to ensure that your website is fully compatible with search engine guidelines and your website can be found, indexed and ranked for search queries entered by your target customers.

Technical SEO can affect the performance of your site quite quickly. To have the best chance of ranking in the SERPs you need to ensure that your website is strong from a technical point of view.

There are lots of elements to technical SEO, some of which are more complex than others. To begin with, let's look at your URL or web address.

URL Structure – Absolute or Relative

Relative or Absolute? The way in which you structure your website URL can make or break your SEO. If you choose the wrong type, you will not only make it incredibly difficult for search engines to crawl your website, but it will also have an impact on your SEO.

An absolute URL requires you to use the entire address of the page that you would like to link to. An absolute URL example would be:

https://www.example.com/page.html

A relative URL, on the other hand, does not use the full address; it assumes that the page you are linking to is on the same website. A relative URL example would be:

Page.html or ../../Page.html
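
To make the difference concrete, here is a rough sketch of how each might appear inside an anchor tag (the page name and link text are hypothetical):

<a href="https://www.yourdomain.co.uk/Page.html">Read more</a>   (absolute)
<a href="Page.html">Read more</a>   (relative)

An absolute link always resolves to the same address wherever it appears, while a relative link is resolved against the current page, which is why a broken relative path is a common cause of crawl errors.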

Canonical Tag – Making the best use of it

Whether you have never heard of the canonical tag before or you have a vague idea what it is, it is important to understand the role that it plays in the optimisation of a website.

A canonical URL is used when we are talking about multiple versions of a web page. It will tell the search engine bot which pieces of content are the main ones and which are duplicates. That way, the bot will ignore the duplicate pages and only index the original content, giving credit to the primary piece of content. A canonical tag will appear like this:

<link rel="canonical" href="http://www.yourdomain.co.uk/sample-page/">

Here, rel="canonical" tells the search engine that the tag declares the canonical version of the page, and the href value contains the URL of the primary page. The combination of these two elements is known as the Canonical URL Tag.

This tag is placed in the HTML header of a web page and aims to keep duplicate content out of the search engine index while strengthening the impact of your page.

There may be occasions when you want to share the same content on two URLs. In this instance you can use the rel="canonical" tag to indicate that one page is the original and the other is a copy. This can prevent problems with duplicate content.

Social media can cause problems when it comes to duplicate content. Every time a piece of content is shared on social media and a visitor lands on your website, this can create a duplicate content issue. For example, a shared link will often carry a tracking parameter, so the same page resolves at two addresses:

http://www.yourdomain.co.uk/sample-page/
http://www.yourdomain.co.uk/sample-page/?utm_source=twitter

This is often also the case with e-commerce websites that use faceted navigation.

If you don't create a self-referencing canonical tag on the page that points to the original version of the content, you do run the risk of suffering from a duplicate content issue.
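
As a small sketch following the social media example above, both the clean page and the parameterised version it spawns would carry the same canonical tag in their <head>, pointing back to the clean URL:

<link rel="canonical" href="http://www.yourdomain.co.uk/sample-page/">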

It is also important to understand what the rel="next" and rel="prev" link attributes mean in relation to the canonical tag. In essence, they are pagination attributes which are also located in the <head> section of your web pages.

Pagination attributes are commonly used on category pages in ecommerce websites. Category pages often contain many different products and they are split across multiple pages with each page showing a section of the category.

One of the downsides to this is that the pages can look very much the same which can create issues with duplicate content. If you make it clear to search engines how pages are connected using pagination attributes you can prevent duplicate content from being a problem.

As an example, if you have a set of three pages this is what the canonical URL will look like:

Page 1 will reference the following page:

<link rel="next" href="http://www.domain.com/page2.html/">

Page 2 will have the following sequence:

<link rel="prev" href="http://www.domain.com/page1.html/">
<link rel="next" href="http://www.domain.com/page3.html/">
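
Page 3, as the last page in the set, would then only need to reference the page before it:

<link rel="prev" href="http://www.domain.com/page2.html/">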

When setting up pagination attributes, try not to break the sequence. If you do, the search engine may ignore the pagination and index and return all of the pages, which results in duplicate content.

How to use your robots.txt

A robots.txt file is a file which provides specific instructions to search engine robots on how they can crawl web pages.

A robots.txt file will consist of lines of text:

User-agent: [user-agent name]
Disallow: [URL string that you do not want the bots to crawl]

The role of robots.txt

When the search crawler arrives on a website, but before spidering it, it will look for a robots.txt file. If one is located, the crawler will read this file before continuing to the rest of the site. The robots.txt file contains specific instructions on how the search engine should crawl the site, and the directives it contains determine what the crawler does next.

Robots.txt files are important because they allow you to control which locations on your website crawlers can access. This file can help in a number of ways, as the example after this list shows:

  • Reducing problems with duplicate content appearing in the search engine results pages
  • Ensuring that certain pages on your site remain private
  • Preventing internal search pages from being displayed in the SERPs
  • Preventing the search engine from indexing certain files on your site such as images or PDF files.
  • Establishing a crawl delay to prevent servers from becoming overloaded when the crawler loads multiple pieces of content at the same time.
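
Putting those use cases together, a minimal sketch of a robots.txt file might look like this (the directory names are purely illustrative):

User-agent: *
# keep internal search result pages out of the SERPs
Disallow: /search/
# keep a private area away from crawlers
Disallow: /private/
# stop PDF files being crawled (wildcards are supported by Google and Bing)
Disallow: /*.pdf
# ask crawlers to wait between requests (ignored by Googlebot, respected by some other bots)
Crawl-delay: 10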

You can check whether you have a robots.txt file on your own website by entering this address into the address bar in a web browser, replacing yourdomain.com with your website address:

https://www.yourdomain.com/robots.txt

If you would like to test your robots.txt file you can do so using the Google Search Console. Before you can do so, however, you will need an account. Once logged in, navigate to the robots.txt Tester.

img

Enter the URL of your home page or a specific product or service page on your site that you would like to test into the box at the bottom of the screen and click 'Test'.

This button should then change to either 'accepted' or 'blocked', indicating whether the URL you have entered is blocked from search engine crawlers.

How to use schema.org markup to help with SEO

Schema.org is part of technical SEO which allows you to provide information that search engines need to understand the content on your website. Incorporating schema markup in the HTML of your website will improve the way in which your page is displayed in the SERPs. It also improves rich snippets, providing more valuable information to searchers such as reviews and dates that the piece of content was created.
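
As a rough illustration (the headline, date and author below are entirely hypothetical), article markup is typically added to the page's HTML as a small JSON-LD block like this:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What is SEO",
  "datePublished": "2018-06-01",
  "author": { "@type": "Person", "name": "Jane Smith" }
}
</script>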

To generate your own schema.org markup code you will need access to Google's Structured Data Markup Helper, which can be accessed through the Google Search Console.

img
  • From the list, tick the boxes that relate to the type of data you wish to mark up.
  • Then paste in the URL of the web page or article that you wish to mark up and click 'Start Tagging'.
  • Then highlight the elements that you would like to mark up.
  • Once you have finished, click 'Create HTML'.

Next you will need to go to your Content Management System and add the highlighted snippets to the corresponding places. Find the yellow markers on the scrolling bar to locate the schema mark-up code.

A simple alternative is to download the HTML file and then copy and paste it into your CMS or source code.

Click Finish.

However, there are other ways that you can implement the schema.org markup. WordLift is just one of them if you use WordPress.

WordLift is a paid plug-in that allows you to mark up your content with schema.org without any technical know-how. The plug-in provides the ability for you to structure your website's content around Creative Works, Things, Events, Organisations, People, Places and Local Businesses. These are then grouped into four categories: Who, Where, When and What.

img

Google have carried out research and concluded that websites that use schema markup experience higher click through rates than those that don't. Although it won't affect your rankings directly, it can improve the amount of traffic that you receive to your website.

Best way to use HTML & XML Sitemaps

A sitemap is essentially a list of pages on a website that can be accessed by all users. An XML sitemap is a technique that website owners can use to tell search engines about all of the pages that exist on their website.

It can also tell search engines which links on your site are more important and how often your website is updated. Although a sitemap does not impact your ranking in the SERPs, it does allow search engines to crawl your website more effectively.
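
As a minimal sketch, an XML sitemap containing a single (hypothetical) page entry looks like this; the optional lastmod, changefreq and priority tags are how the sitemap signals freshness and relative importance:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourdomain.co.uk/sample-page/</loc>
    <lastmod>2018-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>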

From an SEO perspective, a sitemap is a very important resource. Sitemaps are incredibly useful for new websites and blogs. This is because many new blogs don't have any backlinks to their posts, so it can be much harder for a search engine to discover these pages and posts.

There are many ways that you can create an XML sitemap, particularly if you have a WordPress website. If you have the Yoast SEO plug-in you can use this tool to automatically generate your sitemap. Once you have installed the Yoast SEO plug-in you will need to navigate to SEO and then XML Sitemaps from your WordPress dashboard:

img

Click on XML Sitemaps. Your sitemap can be accessed by clicking on the blue hyperlink XML Sitemap.

img

With this URL you can then log into the Google Search Console, click on 'Add/Test Sitemap' and paste in the link from Yoast. In the 'Web Pages' section you will see how many pages have been submitted for indexing.

I also find that adding an HTML sitemap with a link from the footer of every page helps with indexing. There are certain rules around sitemap file sizes and outbound link volumes, but we have created them up to 3,500 pages long without issue.

img

How to do a Technical SEO Audit

It doesn't matter how good your SEO content is or how nice your website looks; if there are niggling technical issues with your website, it can significantly reduce the likelihood of your website appearing in the SERPs. Issues with duplicate content, multiple H1 tags, a malfunctioning robots.txt file, too many redirects or poor page speed can all impact your website's ability to rank.

A technical SEO audit is designed to identify key issues that can stop your website from ranking. All the time and effort that you invest in on page and off page SEO can be wasted if your site has technical SEO issues. So let's get started and guide you through step by step what you need to include in your technical SEO audit:

1. Ensure that your pages can be indexed

Important pages on your website must be visible if they are to rank in the SERPs. To check that all of the pages on your site have been indexed, you will need two tools: Google Search Console and Google Search.

The first stage in the process is to check your index status. For this you will need to log in to the Google Search Console. Once logged in, select the Index Status report from the menu on the left of the screen.

The Index Status report provides information on how many pages on your website have been indexed by Google. It will show the total number of URLs that appear in search results. Look at this carefully to see whether any of your important pages are missing. When using the Index Status report, look for the following:

Steady Increase – A steady increase in the graph means that Google can regularly access your content and your site is being indexed.

Sudden Drop – If you notice a sudden and unexpected drop in the graph, check that your server is not down or overloaded and that Google is not having problems accessing your content.

High Index Volume – If your website has a particularly high index volume, this could mean that your website has issues with duplicate content, canonicalization or automatically generated pages, or that it may have been hacked. In the majority of these cases, Google will send you a notification if they believe that any of these have occurred.

img

Review the site and URL errors to see if there is anything that may be preventing Google bots from crawling your website.

Using the strategies outlined in the previous section, check that your robots.txt file is working correctly.

Log out of Google Search Console and then go to Google Search. Enter the following search term into the search bar, replacing yourdomain.co.uk with your own web address:

site:yourdomain.co.uk

This will list all of the pages on your website that Google has indexed.

2. Robots.txt Check

Make sure your robots.txt file, as explained above, is not blocking your resources or stopping pages, or your whole site, from being indexed.

3. 301 – 404 – 500 Server Errors & Redirects

Know your website structure. Don't have redirect loops going from one page to another page and then another; remove them and update the links. Make sure you don't have links pointing to pages that no longer exist, so that link juice coming from external sites is not wasted.
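
As a minimal sketch, assuming an Apache server and hypothetical page names, a single permanent redirect in the .htaccess file tidies up a link to a page that no longer exists without creating a chain:

Redirect 301 /old-page/ https://www.yourdomain.co.uk/new-page/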

4. Website Security

Search engines prefer secure sites. As a website owner you should take security seriously. The best way to secure your website is to switch to HTTPS and obtain an SSL certificate. Usually you can purchase an SSL certificate from your hosting provider.

HTTPS provides your website with a strong layer of protection, encrypting communications with the web server.

5. Meta Data

Sometimes when your website is crawled, certain web pages may display a 'noindex' tag. This will prevent Google from indexing those pages. Check the source code of your pages to make sure that the code <meta name="robots" content="noindex"> doesn't appear on any of the pages that you want to appear in the results.

6. Link Profile Audit – Review of your backlinks

Success in SEO depends largely on a strong link profile. To make sure you are securing links from the right places, your audit needs to feature a link audit. You can use Google Search Console to check how many links are pointing to your domain, or you can use a paid tool such as SEMrush or Ahrefs, or Open Site Explorer which is completely free.

Links to your website should be authoritative and come from websites with a high DA (Domain Authority). Poor linking practices such as directories and PBNs should be avoided. We recommend using a combination of sources rather than relying on just one.

7. Optimise site speed

Search engines prefer faster websites and a poor page speed is only going to put users off. Web pages should load in 3 seconds or less. How can you test this? Google of course. Google has its own PageSpeed Insights tool. Give this a go and follow our guide above to find ways to reduce your load times; the best we have achieved is 95/100 on desktop and 96/100 on mobile.

8. Duplicate Content

Duplicate content can have a negative impact on your SEO efforts because it reduces the power of the main URL and can pose a threat to your rankings. Search rankings are negatively impacted when very similar or duplicate content is published across multiple URLs, so make use of the canonical tag, as outlined in the previous sections of this guide, to point to the primary URL wherever pages contain duplicate or very similar content.

9. Remove Low Quality Pages

Websites shouldn't have hundreds of pages which are filled with low quality or thin content. Although your website may benefit from specific landing pages relating to certain campaigns, if they are not strong enough, remove them from the index and redirect them using a 301 redirect to content rich and valuable pages.

10. Website Structure

The structure of your website is important. Your SEO will benefit from a well thought out internal linking structure. Use descriptive anchor text which will pass on link juice to deeper pages in your site.
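
For example (the URL here is hypothetical), a descriptive internal link tells both users and search engines what the destination page is about, whereas a generic one does not:

<a href="/technical-seo-audit/">how to run a technical SEO audit</a>
rather than
<a href="/technical-seo-audit/">click here</a>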

11. Mobile Optimisation

Following Google's mobile first indexing approach, it is vital that your website appears in mobile search results. Make sure that your site is built using a responsive design. If you are not sure ask your developer. If your site is not mobile friendly it needs to be.

Technical SEO crawling tools

There are lots of tools that you can use to complete an SEO audit and carry out a technical crawl of your site. The main ones include DeepCrawl, SEMrush and Screaming Frog.

DeepCrawl

DeepCrawl is an advanced technical SEO audit tool which allows you to:

  • Check indexation including all of the pages and templates that exist
  • Identify which are the most important pages such as product pages, category pages and landing pages
  • Verify how many of these pages have been indexed in the Google Search Console
  • Analyse the crawl budget. This is the number of pages that Google assigns to your site each day. Look specifically at your money pages. Are they easy for the bots to crawl? Where do the bots go next once they have crawled these important pages?
  • Audit the sitemap
  • Correct crawl errors
  • Undertake an analysis of JavaScript
img

SEMRush

SEMrush provides a site audit tool which presents a detailed report enabling you to identify and fix key issues. The SEMrush audit tool will allow you to:

  • Check the health of your website
  • Prioritise key SEO issues
  • Track the SEO process
  • Find and fix mistakes with hreflang
  • Optimise the security of your website
  • Detect issues with AMP implementation

Screaming Frog

Screaming Frog allows you to undertake an in-depth technical SEO audit using its dedicated crawling tool. It is so versatile that you can learn many things about a website by running a crawl, if you know how to look at the data.

Chapter 6:

Google Tools - Search Console Webmaster tools

Google Search Console is a very useful tool not just for those developing an SEO strategy, but for marketers, managers, designers and professionals. If you aren’t too sure what Google Webmaster tools are or what the Search Console is for, this section will teach you all you need to know.

The Google Search Console is completely free and you can use it to find out some really valuable information such as how many people are visiting your website, how they are using it and where your customers are coming from.

The Search Console provides a suite of tools and resources to help you succeed with your website. If you own or are responsible for monetising, creating or promoting content online through Google Search you will need to use the console at some stage, particularly if you are using SEO.
Search Engine Optimisation requires you to make ongoing modifications to your website; when these are combined with other optimisation work, they will have a noticeable impact on the performance and user experience of your website.

Google Search Console best practices

The Google Search Console is a suite of tools and resources to help website owners, marketers, webmasters and SEOs monitor website performance in the SERPs. Typical features of the search console include appearance, traffic, technical status and crawl data.

Here are just a few recommended best practices when using the Google Search Console.

Firstly, it is important that you set your website up correctly. Adding your website is really simple: all you need to do is log into your Search Console account, where you will see a red button which says 'Add Property'.

Enter the URL of your website and click the add button.
The next step is to verify your website. There are multiple ways in which you can do this; the most suitable will depend on whether you have knowledge of HTML and whether you can upload files to your site, although the Google Analytics verification method is usually a lot simpler.
Make sure, if you use HTTPS, that you verify all four versions of your website: https://www., https://, http://www. and http://

Sitemap – The next step is to ensure that you have a sitemap uploaded to Google. As already discussed, a sitemap is an important resource which provides crucial information to web crawlers. Sitemaps will include meta-data, information about images and video content and data about how frequently your website is updated.

Robots.txt – Just because you have a website doesn’t necessarily mean that you want all website pages or directories indexed. If there are certain pages on your site that you would like to keep away from the search engines, you can achieve this using the robots.txt file.
Fetch as Google – If you have made substantial website changes, the best way to get these changes indexed quickly is to submit them manually. This will enable changes to your on page content or title tags to appear in the search results much quicker.

Site Errors – Visitors don’t want to encounter errors as they browse your website or during the purchasing process. Sometimes you may not even realise that there is an issue until someone actually tells you. Google Search Console can immediately notify you if errors occur. The crawl errors page will highlight any errors with your website so you can quickly resolve any issues so that your web visitors don’t experience any problems.
Once you have the basics mastered, the Google Search Console is a valuable tool to help you monitor traffic, collect metrics and optimise the performance of your website.

Crawl Errors – How to find issues within a large site

If you log in to Webmaster Tools and go to Crawl > Crawl Errors you will see a list of any errors, whether each was a 404 or 301 and when it first happened. Once you find one that looks relevant, click on the link to see where it is linked from and more details of the error; you can also check whether the issue is still live.

Chapter 7:

How to Make Use of Google Webmaster Guidelines

Google Webmaster Guidelines can be incredibly useful and are there to help. There are multiple best practices that you can apply to help Google better understand your website.

Whether it is an issue relating to quality or usability, the guidelines provide a clear framework for you to build a successful website that search engines can read, understand, trust and display in the SERPs.
Google guidelines are useful at all stages of the development process from the design through to the creation of content and optimisation of your site.
At each stage of the design process for your site it is recommended that you check Google Webmaster guidelines to ensure that you are implementing best practice. There are specific rules surrounding design and content.
Webmaster guidelines suggest that text is always preferable to images particularly when it relates to important pieces of content or links. Google will focus on text that can be understood on the pages of your site.

Make your website with users in mind

The phrase 'Website Usability' or User Experience (UX design) is used to describe how easy your website is to use. It considers how site visitors can complete certain tasks on your website without confusion or abandoning the process. Is the website easy to navigate? Does it offer all the features that a web visitor needs to carry out what they intended to do? Can first time users access and use your site with relative ease?

User Experience is an important term in today's world of SEO. User experience includes not only perceptions of your website but also how visitors interact with your pages. UX is about creating a user friendly website covering a number of different elements such as performance, design, accessibility and marketing.

Don't use any form of deception

Deceptive tactics that manipulate search engines will only serve to penalise your website and in the worst cases have it removed entirely from the SERPs. Black Hat techniques should be avoided at all costs. Avoid the following tactics:

  • Paid-for links
  • Links hidden in HTML
  • Keyword-stuffed pages

Make sure your website stands out

Websites need to be able to stand out from the competition, but for all the right reasons. A professional platform and common theme is strongly recommended. There are hundreds of free template websites out there but if you are trying to build a credible online presence, you need your website to look polished and professional.

This begins with a professional platform. Having a website that looks the part will not only help with usability but it will also increase trust and help you to build your online reputation.

When a visitor lands on your website, they should immediately understand what you do and who it's for. You have about three seconds to capture the interest of your target audience. If you fail, a competitor site is just a few clicks away.

A good website will provide consistency, quality and good usability. Use the same colours across your site and ensure that your fonts, graphics, tone, photography and visual content are uniform across all pages and posts.

Your brand should be carefully incorporated into the pages so that you can convey your USPs and promote the benefits of your company.

Make sure your server/site is secure, free from malware and not part of a hijacked PBN

Security is a key issue for website owners and visitors. Secure websites are viewed more favourably by Google, so you will receive extra points if your site is secure. In 2015, Google began to implement a ranking boost for websites using HTTPS. Google has strongly encouraged webmasters to create secure sites and has been gradually increasing the weight that HTTPS carries in ranking.

However, just because your site is on HTTPS doesn't mean the work is done; you still need to implement multiple best practices in order to secure and retain your position 1 status.

There are some great tools that you can use to ensure that your site is secure. SiteLock and WordFence are just two:

SiteLock – SiteLock will automatically scan your website to identify any threats from malware. Scans can be carried out on a daily, weekly or monthly basis. A SiteLock security shield will provide your website visitors with greater reassurance that your website is completely secure.

If malware is detected on your website, it allows you to take immediate action. SiteLock also has a find and fix option which automatically removes the malware and addresses the issue to reduce the likelihood of it occurring in future. We have found this priceless on big sites that require 100% uptime.

WordFence – Security should be a key consideration for website owners. If you own a WordPress website there are a number of security plug-ins but one of the most popular is WordFence.

WordFence is a security plug-in that offers first class security featuring a Firewall which prevents your website being attacked and will quickly alert you if your website is compromised.

There are four main features of WordFence including the firewall, scanning function, live traffic and a dedicated suite of tools. Some of the features are free such as the malware scanner, blocked intrusion attempts, repair files, view bots and crawlers and obtain IP info while the paid version provides a range of advanced features such as country blocking, comment spam filters and real time threat defence feeds.

Safeguarding your website is important and it shouldn't be something that you overlook. Compromised customer or business data can have a disastrous impact on your organisation. It can destroy trust and have a lasting impact on the reputation of your business. We would like to note that none of the above are guaranteed, absolute solutions; you should use a combination of measures and keep an eye on things yourself as well.

How to get rid of a manual action penalty

Sometimes your website may be hit with a manual action penalty. Often this is because you have used a questionable SEO tactic or you have violated Google guidelines. If you suddenly see a drop in your traffic, it is important to understand what caused it.

There are two main penalties that you can receive from Google. The first is a manual action from the Google web spam team and the second is an algorithmic penalty.

To confirm whether your site has been hit with a manual action penalty, log in to the Google Search Console and check under the 'Manual Actions' tab.

There are two main types of manual action: Site wide and partial match.

A site wide manual action is one which penalises an entire website.

A partial match manual action, on the other hand, focuses on certain pages or sections of a website.

Although there are two types of manual action there could be many different reasons that cause them to be issued including:

  • Unnatural links
  • Spam
  • Hacked websites
  • Hidden text
  • Keyword stuffing
  • Thin content
  • Image mismatch

In terms of fixing a manual action there is no definitive answer. The fix will depend on a number of issues. There are however steps that you can take which will be based on the notification that you receive. Google will send you specific details of the issues on your site so you have a good insight into how you can fix them.

Depending on the type of manual action you receive you can:
  • Undertake a content audit to check whether the content is relevant
  • Review the information on your site to see which brings you the most conversions
  • Identify whether any information on your site should be updated, removed or reduced
  • Determine whether any of your content appears to be spammy or irrelevant


For link related manual actions you need to determine which links are causing problems through a detailed assessment of your backlink profile. Use tools such as Google Webmaster Tools, Link Research Tools and Cognitive SEO to assess the value of your links and determine whether they should be removed.

Once you have identified and fixed the reason for your manual action penalty, you will have to re-submit your site to Google for reconsideration. They may or may not decide to remove the penalty, depending on how well you have dealt with the issues. There are no guarantees that your site will be approved.

The best way is to avoid penalties entirely. Apply white hat SEO tactics, adhere to Google webmaster guidelines and ensure that you build a valuable and authoritative website.
