SEO Onsite Audit: Optimisation Factors for a Website Architecture Review

Index Analysis

Search engines crawl and index web pages by utilising what are known as search engine robots, such as the Google Bot.

Search Engine Optimisation cannot be carried out on a website if a search engine robot cannot access it. Some websites are designed using technology that search engines cannot read, stopping content from being accessed. The use of iFrames, Java, and dynamic content can increase the chances of this problem occurring.

The manner in which the internal linking structure of a website has been created can also affect the amount of web pages a search engine will index, as can the speed at which a site loads and a whole host of other factors which we will examine within this document.

A large disparity between the amount of unique pages available for indexing by search engines and the amount of pages actually indexed by search engines is often symptomatic of an architectural issue affecting the website’s performance in search engines.


Domain, IP & Server Issues

Domain Registration

It is important to check the Whois information for a domain to ascertain if the contact details are correct and that the domain is not due to expire. Google uses the Whois information of a domain in a number of different ways when assessing web pages, including Google Place listings.

It is important that the domain name is owned by the owner of the website, and not by the development company or hosting company (as is sometimes the case), as this provides you with actual ownership. If you later decide to change development companies, or to maintain the site yourself, this allows a clean separation.

Domain usage and duplication

Duplicate content is a major issue within SEO; having duplicate content present, either hosted on a domain, or externally, can have a dramatic effect on a site’s ability to perform in search engines.

A mistake that is commonly come across is the duplication of the main website, or part of the website, on a mirror domain, resulting in an exact replica of the existing website. Without the correct methods in place this can result in the duplicated pages not appearing in Google’s search results.

External links targeting the main site then pass no authority, as the primary source of the content does not feel the full benefit. The result can be a wasted link profile, with neither the primary nor the duplicated domain benefiting from the links built towards the site, because the duplicated content confuses the search engines.

Where duplicate domains exist they should be 301 redirected into the main website domain address, which will resolve any issues.

All other URLs within the duplicate domain should also 301 redirect into the corresponding URL on the primary website.
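As a sketch, on an Apache server this can be achieved with a few lines in the .htaccess file; the mirror domain name below is illustrative, not taken from the audited site:

```apache
RewriteEngine On
# If the request arrived on the mirror domain (with or without www)...
RewriteCond %{HTTP_HOST} ^(www\.)?mirror-example\.com$ [NC]
# ...301 redirect it to the same path on the primary domain
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

This preserves the requested path, so every duplicated URL maps to its corresponding page on the primary website.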

IP Neighbourhood

The internet address that a website shares with other sites can potentially have an effect on the website’s search engine ranking positions. A website that is hosted on a ‘bad neighbourhood’ may be sharing its server with undesirable websites that are either infected, or deliberately sharing malware. A bad neighbourhood may also be a server that hosts websites of questionable or undesirable content. Potentially, such an association can damage the authority of any websites hosted on the same server.

Server Type

While the type of server that a website is hosted on does not have a direct effect on its ability to rank within search engines, certain types of servers (and their configurations) have intrinsic issues that can hinder an SEO campaign. The type of server the website utilises also affects the technical advice we would offer as solutions to specific issues.

A .NET website hosted on a Microsoft IIS server that utilises View State is an example of a website whose performance is potentially impacted by the choice of server and programming language. Such a site is usually affected in terms of performance and page size, because the view state is passed to and from the server in a hidden form field. Although this would not necessarily affect the SEO of the website directly, it would harm the usability of the site, therefore decreasing the conversion rate.


Regional Targeting & Hosting

In order to ensure that an individual website appears and performs well in nation/region-specific search engines, a number of elements need to be in place. A regionally targeted website should have an appropriate Top Level Domain (e.g. .co.uk for a UK-targeted site) and ideally be hosted on an IP within that country.

The lack of one, or both, of these elements does not mean that a website cannot rank in a specific regional search engine.

However, their inclusion in regionally targeted sites should ensure that search engines recognise their regionality and rank the site in the appropriate regional search engine.


URL Structure

The URL (web address) of a webpage can have a significant effect on the ability of that page to rank highly for a target term; a webpage’s URL is one of the most important on-page SEO factors. Badly formed URLs can also hinder search engines from crawling and indexing a webpage.

Friendly URLs for the Website

A friendly URL should contain words that adequately explain what is present on the page, helping the user and reflecting the natural structure of the site. If a page consists of snack-based products, for example, the URL should describe that content clearly and sufficiently. URLs should not be parameter based, as this adds an additional component for the search engine to decipher. The URL should also not be overly long, or contain irrelevant keywords that dilute the density, or prominence, of the target keyword within the URL.
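A hypothetical comparison for the snack-based example above (both addresses are illustrative, not taken from the audited site):

```
Friendly:   http://www.example.com/snacks/salted-crisps
Parameter:  http://www.example.com/products.php?cat=12&id=4873
```

The friendly version describes the page to both users and search engines, whereas the parameter-based version conveys nothing about its content.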

Session IDs to identify visitors

Session IDs are parameters that are added to a webpage’s URL to uniquely identify visitors to a website. It is a form of tracking, similar to a cookie, allowing site owners to track visitor actions through the site.

Session IDs are unique to every single visitor to a website, including search engines. This means that session IDs generate unique URLs for each visitor to a site.

Search engines will usually be assigned a new unique session ID each time they visit a site. This creates numerous indexing and duplicate content issues, affecting a site’s ability to rank.

Ideally, all session ID usage would be removed from a website. If needed, session IDs can remain provided a number of fixes are put in place: use of the canonical tag, and not delivering session IDs to search engine bots.
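The canonical tag mentioned above sits in the head of each page and points search engines at the session-free address. A minimal sketch, assuming an illustrative page URL:

```html
<head>
  <!-- Whatever session ID appears in the current URL, search engines
       are told to treat this clean address as the definitive one -->
  <link rel="canonical" href="http://www.example.com/page.html">
</head>
```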

Cookies for Rankings

It is vital that cookies are handled in the correct manner, so that sites comply with the EU cookie law which came into force on 26th May 2012.

To conform to the EU law, each website has to gain consent from visitors in order to store or retrieve information on their devices. The law is designed to protect the online privacy of customers by making them aware that websites are collecting data about them. A user can prevent a site from storing such data by turning off cookies in their browser.

Cookies can affect the rankings of a website depending on the way the cookie law has been implemented on the site. Requiring visitors to accept a cookie before viewing the site can harm rankings, as search engines will then be unable to access all of the site’s content to index it. Serving content that degrades gracefully depending on whether cookies are activated is the correct method, although only content with no requirement for cookies will be indexed.

On-page SEO Elements

Page Titles for Search Engines

The title tag is one of the most important on-page factors for search engines when assessing what the theme of a page is, what terms it should rank for, and how highly it should rank for those terms.

While the title tag does not visually appear within a webpage, as it is contained within the ‘head’ of an HTML document, it does appear in search engine listings as the main title for each listed result.

Each page within a website should have a unique title tag. While having keywords within the title is important, over-optimisation of terms can result in a penalty within the SERPS. Therefore, keywords should no longer be the focus, but a good explanation of what is present on the webpage should be a key factor. An optimised page should have a title tag of 70 characters or less.

Although page titles can help click-through rates, this is not necessarily the key objective of a title tag, which is to provide a clear, concise summary of what exists on the page and from which company.


Homepage – <title>The main company objectives | Company Name</title>

Category Page – <title>Category Title | Company Name</title>

Product Page – <title>Product Title</title>

Contact Us – <title>Contact Us | Company Name</title>

Header Tags with Keywords

Header tags are HTML elements used to indicate to search engines and web browsers the main headers within a webpage, and therefore the structure of that page. Header tags are semantic in nature, giving additional meaning to plain text, i.e. this is a page header rather than a paragraph.

Due to changes in the search engines’ algorithms header tags no longer have a significant effect on rankings, but aid user comprehension and may assist Google in identifying the topic of a page.

Every page within a website should have a single H1 (header one) tag containing a summary of the information present on that page. Each page should usually contain only one H1 tag, and all other headers (H2, H3, etc.) should sit below it in the hierarchy.

All headers within a page should be clear and concise offering value to users.
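The hierarchy described above can be sketched as follows, using a hypothetical car hire page (the headings are illustrative):

```html
<h1>Car Hire in the UK</h1>            <!-- single H1 summarising the page -->
<h2>Economy Cars</h2>                  <!-- main sub-topics as H2 -->
<h2>Executive Cars</h2>
<h3>Long-term Executive Hire</h3>      <!-- further detail nested as H3 -->
```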

Canonicalisation to avoid Duplication Issues

A lack of canonicalisation causes search engines to see multiple versions of the same website on different addresses. This can also mean that the incoming link equity to a website is not maximised as this equity is split across two different domains.

Canonicalisation issues can have a significant impact on the ability of a website to rank highly for its target terms due to the duplicate content issues it creates.

A choice of domain preference should be made, be it the www version or the non-www version. This choice should then be specified within Google Search Console.

If the www version is chosen, then all requests made to the root of the non-www domain should be 301 redirected to the www domain. It is important that this is achieved with a true 301 redirect, and that the correct HTTP response headers are sent.

On an Apache server this issue can be resolved by the creation of an .htaccess file, placed within the root of the website. Within this file the following instructions need to be added:

RewriteEngine On

RewriteCond %{HTTP_HOST} ^example\.com$ [NC]

RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Meta Keywords

Meta Keywords were commonly used by older search engines in deciding the relevance of a webpage and how highly they should rank that page for a particular term. This element was heavily abused in the past and changes to the major search engine algorithms have ensured that it no longer has any effect on the ranking of a webpage within the vast majority of search engines.

All of the Meta Keywords currently present within your website should be removed as soon as possible as they potentially send a negative spam signal to Google as well as revealing your keyword targets to your competitors.

Meta Description

While Meta Descriptions no longer directly affect a website’s ranking position, they can have a significant effect on click-through rates from natural Search Engine Ranking Positions (SERPs), as they are often used as the ‘snippet’ which search engines use to ‘preview’ a site in results pages.

Each webpage within a website should contain a unique description which accurately describes the page in question, whilst encouraging the users to visit the page through ‘Calls to Action’ contained within the Meta Description.

The search results for the search term “Car Hire” show the explanation for each site taken from its Meta Description. One such description included an attractive 40% offer to try to entice potential customers to click through to the site.

Microformatted data can enhance click-through rates, as reviews can be displayed within the SERPs, making that listing stand out from others. The user is more likely to see a company with trusted user reviews as well-established and trustworthy, therefore also increasing click-through rates.

Lang Tag

The lang tag is used to declare the language of a webpage. The use of this tag can assist browsers and search engines when deciding the regionality and target audience for a website.

The primary language for the website should be declared inside the initial HTML tag like this:

<html lang="en">

For XHTML the language tag is also declared within the opening html tag as follows:

<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">

Robots & Sitemaps

Robots.txt for the Files

The robots.txt file is used to give instructions about a website to search engine robots; this is called the Robots Exclusion Protocol. When a search robot (e.g. Googlebot) visits a website, it first checks whether a robots.txt file exists, in order to follow the instructions it holds.

The file should be located in the root of the domain, at /robots.txt.

Content Exclusion

A robots.txt file can tell a search engine not to visit (and index) an entire website, or just to ignore certain directories. A misconfigured robots.txt file has the potential to completely remove all opportunity of appearing in search engine results.

Note that whilst the major search engines pay attention to robots.txt, many other web robots do not; thus robots.txt is not a secure way to hide content.

SEO Friendly

A robots.txt file can also be used to inform search engines of the location of your sitemap.xml file. This instruction should be included in all robots.txt files.

All sub-folders within a site should also be checked for the existence of the file, as its instructions can be interpreted on a sub-folder level.
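As a sketch, a minimal robots.txt combining both uses described above might look like this; the disallowed directory and sitemap address are illustrative:

```
User-agent: *
# Keep crawlers out of a private directory
Disallow: /admin/

# Point search engines at the XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```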

HTML Sitemap

HTML sitemaps are useful for visitors to help them find the specific content or webpage they are looking for within a website. They also help search engines identify all of the webpages available within a website and/or the hierarchy of the pages. Sitemaps can also be structured to help drive link equity into key pages within a website.

Depending on the size of the site in question, an HTML sitemap should list all of the web pages available within a website. Multiple sitemaps, themed around specific sections of a website, can also be produced as appropriate for larger websites.

XML Sitemap

An XML sitemap is a semantic document which lists every single webpage available within a website. This is not a document that users usually view; it exists solely for search engines. The address of this sitemap should also be listed within robots.txt using the following syntax:

Sitemap: http://www.example.com/sitemap.xml


XML sitemaps can be submitted to search engines to help them discover all the pages present within a website; this may be especially important if problems were found within the Index Analysis section. Depending on the website, an XML sitemap should be updated weekly, or whenever significant new content is added to the site.

XML sitemaps can also utilise the priority element to indicate to search engines the most important pages within a website as well as re-enforcing any internal linking structure. If the website is a particularly large one, multiple sitemaps may be useful to help break down the website’s pages e.g. an e-commerce site may have separate sitemaps for category pages and product pages.
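A skeletal XML sitemap, with illustrative URLs, might look like the following; the priority element mentioned above is shown flagging the homepage as the most important page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority>  <!-- homepage given the highest priority -->
  </url>
  <url>
    <loc>http://www.example.com/category/product.html</loc>
    <priority>0.5</priority>  <!-- deeper pages given a lower priority -->
  </url>
</urlset>
```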


Content

Unique, relevant, high-quality, engaging and regularly updated content is an important ranking factor for search engines. The higher the quality of a page’s content, and the more unique and relevant to your industry it is, the greater the chance of achieving high rankings.

It is important to note that sites which have been heavily optimised for search in the past may have on-page content issues; more details of potential issues, such as keyword stuffing, can be found later in this report.

Text Based Content

Search engines crawl text-based content to assess which terms a webpage should rank for. If there is no text-based content, or it cannot be crawled, search engines will find it more difficult to interpret what the page is about, and will struggle to rank it for the appropriate terms.

It is important that a site’s content is written for the user; content allows the company an opportunity to speak to its potential customers and can impact the site’s conversion rate. Text-based content can also be used to target long-tail traffic, which generally converts at a higher rate.

Service Website

For a service website, the most important pages should be full of relevant, high quality content. We would recommend a page length of between 400 and 800 words. To produce the highest quality content, we would recommend that where possible the content is, at least initially, written by a member of the business with intimate knowledge of the industry.

Non-duplicate Content

Having duplicated content can seriously hinder a website’s ranking potential. If your website appears to have content that is duplicated, either on other pages of your own site or on other websites, this is perceived as a signal of low quality content.


Running a piece of content through Copyscape gives an additional indication of whether any duplicate version of the content exists.

Transient Content

Transient content may affect your website’s search engine performance if you regularly remove content or products without archiving them. When the content is removed, the page will be de-indexed, losing your rankings for that term.

Content Updates

Having regularly updated content on your website is an important positive signal to send to search engines, be it from visitors to your website e.g. through product reviews or blog comments, or by the owner in the form of a regularly updated blog or news section. This is particularly beneficial in light of Google adding freshness as a ranking factor.

News/Blog/Forum Analysis

Having a news section/blog/forum offers e-commerce and service websites a good chance to produce the aforementioned type of content – regularly updated relevant content.

We would recommend using the industry-leading WordPress Content Management System (CMS) for optimum SEO performance. It is also important to check that the CMS is running the latest version, to ensure no security threats are present; the correct plugins can also help secure the website.

Internal Linking

Internal linking offers great benefits, not only to the user’s experience but also in terms of SEO. It creates a ‘spider’s web’ effect, making it easier for search engines to crawl your website and, if the linking is performed correctly, to assign semantically related values to its pages. It also provides a call to action for the user, encouraging them to stay on your website, reducing your bounce rate, and allowing you to drill down within your analytics data to the pages that are leading to conversions.

Website Hierarchy

A badly formed internal linking structure can mean that important pages are not given the internal link authority that they should be given, affecting their performance in search engines for competitive terms.

An optimised internal linking structure can have a significant effect on the performance of your website in search engines. It is recommended that important pages are no more than three clicks away (level 3) from the homepage.

Anchor text

Anchor text is the clickable text of a link. The best anchor text is a relevant keyword for the page you are linking to, as this signals to search engines that the target page is relevant to that keyword.

It is important to vary anchor texts when linking internally; continually linking to a page with the same anchor text comes across unnaturally and may eventually lead to the website receiving a penalty.

First Link Rule

Industry research has shown that Google only counts the anchor text of the first link it finds to a page within an HTML document. As such, it is important to ensure that the first anchor text linking to any page is the most relevant one, passing the relevance you want to that page.

Landing Pages

When linking from one page to another it is important to make sure you are linking to the most appropriate page.

It is also important that relevant landing pages exist for all of the keywords that you wish to target. It will be difficult to target highly competitive terms without relevant landing pages.

Absolute V Relative Links

The use of absolute links, rather than relative links, within a website can negate many potential duplicate content issues, and may also prevent the possibility of infinite loops. This issue is low priority, and we would recommend addressing any other, more significant problems first.

An absolute link: <a href="http://www.example.com/page.html">Link</a>

A relative link: <a href="page.html">Link</a>

E.g. a webpage has the address http://www.example.com/page2.html. This page is on a domain that suffers from canonicalisation issues, meaning the page also renders at http://example.com/page2.html. If this page used relative links, all of the internal links within page2.html would point to the non-www version of the site.

So, if a search engine discovers the non-www version of the page, it would subsequently discover an entirely duplicated copy of the website on the non-www version of the domain by following the relative links throughout the site.

If absolute links are used then, whilst the non-www URL may still be discovered, the internal links on that page will point to the correct www version of the site, meaning an entire duplicate copy of the website will not be discovered by the search engines.

It is also possible to negate this issue by utilising the base tag within a document. The base tag tells browsers and search engines which URL every relative link within the page should be resolved against.
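As a sketch, the base tag sits in the head of the document; the domain shown is illustrative:

```html
<head>
  <!-- Every relative link on this page now resolves against the www domain,
       even if the page was served from the non-www address -->
  <base href="http://www.example.com/">
</head>
```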

The use of absolute links when including images is also recommended to negate the possibility of dead image links which is a relatively common occurrence when relative linking is used.

Dead Links of the Website

It is important to check that there are no internal ‘dead links’ within a website. A dead link is a link that is either malformed or points to a page that no longer exists.

This is an issue for users as it provides a bad experience when browsing a website. The potential of the link is also wasted as no relevance or authority is sent to the intended target page.

Limit HTML Usage

It is important that a website uses clear, semantic, modern mark-up in the creation of its webpages if it wishes to perform well in search engines.

iFrames, JavaScript & Flash should be Avoided

The use of iFrames, JavaScript and Flash can affect the ability of search engines to read the content present within these areas of a website. However, modern algorithms can read a more comprehensive array of content types, with even text within images being read by certain search engines.

Although this used to be a major concern, and could result in content not being indexed, in more recent times the problem has decreased.

If used with caution, JavaScript content can be visible to search engines; ideally, though, other methods would be used where the highest possible rankings are required. JavaScript can tend to be fairly code heavy, resulting in less visible content for search engines.

Search engines are not able to crawl text embedded in a Flash object. This is a big concern when using Flash, as it decreases visibility.

iFrame content on a site can result in the duplication of content unless the correct measures are put in place.

Although it is now possible for Flash, JavaScript and iFrame content to be indexed, it is by no means the best route to strong keyword rankings. With this in mind, key landing pages should not contain iFrames, JavaScript or Flash, to allow for the best keyword rankings possible.

Content to Mark-up Ratio

Code bloat, the use of excessive HTML, can affect the ability, and willingness, of search engines to crawl and index a website. Overuse of HTML and other elements can give webpages relatively large file sizes, limiting the number of pages a search engine will crawl and index.

Excessive and incorrect HTML usage regularly takes the form of tables used for layout, and inline CSS. An effective website should have a high content-to-mark-up ratio: a high proportion of text-based content relative to the HTML used to display it.

High levels of code throughout the site can also increase the time search engines take to read the content on each webpage. Less code can result in the content being indexed in a more timely and regular fashion.

Prominence & Order of elements

Although the placement of content can have small benefits, and best practice would advise that content and navigation appear first in the page source, this is not an urgent matter.

Alt tag optimisation for all Images

Your on-page images can also play an important part in gaining search engine rankings, both for the page on which the image resides and by gaining an additional ranking for the image itself.

Each image should have a simple, descriptive file name which reflects the term you want the image to rank for. The alt text for the image should also reflect this target term, and ideally the text near to the image within the source code should again reflect this theme. These elements may also be used to support the overall target for the page in question.

Over-optimisation can also exist within images; for example, the filename and alt text being exact-match text for the term the page is intended to rank for. As such, we do not recommend using the exact term the page wishes to rank for. Below is an example of the filename and alt text of an image on a car hire website.

Filename: “car-hire-example-company.jpg”

Alt-text: “large executive car available for hire from Example Company.”
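Combined in mark-up, these two elements might appear as follows (the filename and alt text are the illustrative values above):

```html
<img src="car-hire-example-company.jpg"
     alt="large executive car available for hire from Example Company.">
```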

Server Headers

When a user or a search engine visits a webpage, the server returns a server header code indicating the current status of the webpage that has been requested. It is important that the right code is sent to search engine bots in the right circumstances.

200 OK Standard Server Header

200 OK is the standard server header code sent when ‘The request has succeeded’; the requested webpage has been found and successfully delivered. It is important that the server sends this response to search engines when a correct page request is made.

404 Not Found Page Needed

404 Not Found is the response that should be sent by servers when the requested webpage has not been found. It is important that this code is sent when a page the user, or search engine, has requested does not exist.

Soft 404s occur when the server delivers a status code other than a 404 for a page that should return one. This typically happens with custom 404 error pages, and soft 404s are identifiable within Google Search Console. It is often the case that, when a page does not exist, the server forwards the user to a generic ‘not found’ page via a 301 or 302 redirect which contains part of the offending URL.
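On an Apache server, a custom error page can be served while still returning a true 404 status, avoiding the soft-404 problem. A sketch, assuming a 404.html page exists in the site root:

```apache
# Serve a friendly error page, but keep the 404 status code intact.
# Note: a full URL here would trigger a redirect (creating a soft 404),
# so a local path must be used.
ErrorDocument 404 /404.html
```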

301 Moved Permanently

301 Moved Permanently is the response given by servers when a requested webpage has been moved permanently from one address to another. The directive forwards the users from the initially requested address to the new location.

In light of recent changes, it is important to note that 301 redirects can pass a penalty. For this reason, we would recommend that pages which may now be classed as in breach of Google’s quality guidelines (for example, resource/directory pages previously used for reciprocal linking) be 404ed rather than 301ed.

The 301 redirect is a crucial server header code for SEO. In order to pass as much history, authority, trust and link equity that a page (or an entire domain) holds with search engines onto the new location, a 301 Moved Permanently redirect must be used. Any and all existing redirects within a website should be using this directive. These should remain in place forever, only being updated if the location of the pages changes again in the future.
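As a sketch, a single-page 301 redirect on an Apache server can be added to the .htaccess file; both paths below are illustrative:

```apache
# Permanently move an old page to its new location,
# passing its history, authority and link equity with it
Redirect 301 /old-page.html http://www.example.com/new-page.html
```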

Negative SEO

With the recent algorithm updates from search engines, negative SEO and over-optimisation have had a negative effect on website rankings and overall site visibility within the SERPs.

Keyword Stuffing is Dangerous

High keyword density within content is a dated technique, used to trick search engines into treating a page as highly relevant for the repeated keyword. This is now seen as a spam signal, and such over-optimised content is penalised.

The content has to look natural, talking around the subject and key term to create a more natural piece. Interesting content is key; therefore more effort and time should be spent writing the content for the interest of the reader.

Hidden Text

Hiding text on a webpage can be penalised by search engines as this is seen as a Black Hat technique.

Many techniques used to hide text are implemented using CSS to make the text invisible to the user. Search engines can still read this text, however, so the assumption that the site will gain an SEO benefit does not hold.

Newer search engine algorithms can read the page styling, which means they can detect whether a webmaster is trying to cheat by hiding text from the user. If a search engine detects this technique on a webpage, it can result in a penalty for the site, or even de-indexation of the offending page.

Malware impact on Website

Malware can not only have a major impact on the usability of a website, but can also cause major penalties in search engines. If malware exists on your website, a warning appears against your site in the Google listings, which will decrease the CTR or make it non-existent. If you react quickly and resolve the issue, however, Google will not penalise the site.

External Links

Although external links are still an important factor for creating a high authority website, they can also cause penalties if used in the incorrect manner.

Site-wide Links have a Negative Impact

Site-wide links, such as links located within the footer or sidebar, can have a large negative impact on rankings under the modern algorithm updates. Branded links would not usually cause such an issue, unless the site-wide branded links exist on a site within a different niche. Exact-match anchor text on a site-wide scale can result in huge penalties for that term, as there will be a huge number of links to the target website with identical anchor text, making the link profile look unnatural.

Exact Match Anchor

External links should be created in a natural manner, with branded anchor text making up the majority of the external links built towards a site. Previously, however, many links were built with the intention of targeting certain terms, which resulted in webmasters creating exact-match anchor text.

For example, if you wanted to rank for car hire, the anchor text used was “Car Hire”. With the new algorithm updates, these types of links have been penalised in Google, being seen as a method of cheating the search engines.

With this in mind, link profiles need to be examined to ensure that they comply with these new algorithm changes.

Resource Pages / Reciprocal Linking

Resource pages and reciprocal linking usually have a close association with each other, as both are dated methods of gaining links. Reciprocal linking consists of two websites swapping links, which are usually stored on a resource page.

A large amount of links pointing out to a range of websites which have no relevance to your niche can result in a highly spammy looking page, which makes search engines aware that some questionable methods may have been used.

All resource pages should be removed, with only links to close associates or suppliers kept, as these will have relevance to your site.

Comment Spam

Comment spam appears regularly on blogs as webmasters try to gain links to low-profile websites; however, this no longer has any effect on rankings. It is a dated method of gaining links and can harm the authority of your website through the outgoing links to spam sites left in the comments. It also hampers communication with genuine users, as illegitimate comments erode their trust.

Auto Linking / Unnatural Linking

Whilst auto linking used to be a popular method of linking keywords within an article’s content to internal pages, it can now be seen as a spam signal by search engines. With certain CMS systems, new pages are created for tags, and webmasters use keywords as tags to try to gain extra SEO benefit.

This tactic looks unnatural to search engines, ultimately affecting the overall SEO performance of the website and having a negative impact.

Keyword Stuffing in Classes and Divs

Over-optimisation can also be present within the coding of a website, with key terms used as class names instead of normal, descriptive ones. For example, on a car hire website a class called “.car-hire” might be defined in the CSS with very little styling, purely so that it can be added widely across the site.

This technique should not be employed on the site and requires removal if present.
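A sketch of the pattern to look for during an audit, using the car hire example above (class names are hypothetical):

```css
/* Spam signal: keyword used as a class name, with no real styling purpose,
   so it can be sprinkled across the markup site-wide */
.car-hire { color: inherit; }

/* Preferable: class names describe function or layout, not keywords */
.content-block { margin: 0 0 1em; }
```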

Bold / Strong / Italics and formatting

Although until recently it was advisable to mark important keywords within content using ‘bold’, ‘strong’ or ‘italic’ tags, this is now a strong spam signal to search engines.

Recent updates such as the Penguin update penalise over-optimisation, including the marking up of keywords with ‘strong’, ‘bold’ or ‘italics’ tags. These tags should only be used for genuine styling and emphasis.
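A hypothetical illustration of the difference between legitimate emphasis and keyword markup:

```html
<!-- Legitimate: emphasis that helps the reader -->
<p>Please note that <strong>bookings close at 5pm</strong>.</p>

<!-- Spam signal: every occurrence of a target keyword marked up -->
<p><strong>Car hire</strong> is easy with our <strong>car hire</strong> deals.</p>
```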

Website Popups

Page load speed can affect the SEO performance of a website, so adding extra code for popups can have a negative impact. Popups also restrict the usability of the site, increasing the bounce rate, which has a knock-on effect on SEO performance. All of these metrics are taken into account when search engines rank websites.

Webmaster Tools or Console Setup

Having Webmaster Tools set up for your website can offer valuable insight into the possible issues that are preventing your website from ranking. There is no better way of understanding the search engines than receiving information directly from them, and as such these are important tools for diagnosing issues.

Google Search Console

Google Search Console should be set up for deeper analysis into any issues affecting your site’s indexing within Google.

As Google is the leading search engine, it is important to ensure that your website has no issues affecting Googlebot’s crawl of the site which may be hurting its rankings.

Bing Webmaster Tools

Bing Webmaster Tools offers additional insight into your site and any issues that may be harming the site’s performance in the search engines.

Geographic Target Specified

Specifying which geographic area you want to target ensures that your website is indexed in the correct regional Google search engine, e.g. google.com or google.ca.
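Alongside the geographic targeting setting in the webmaster console, regional variants of a site can be signalled on-page with hreflang annotations. A minimal sketch, using hypothetical domains:

```html
<!-- Tell search engines which regional version to serve for each audience -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```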

Crawl Errors

Analysis of the errors from the Google Bot’s crawl of a website can help to identify some of the issues discussed previously e.g. Soft 404 errors and infinite loops.

Sitemap Submitted to Search Engines

As mentioned above, sitemaps are critical for allowing search engines to crawl the website effectively. It is particularly important to submit your sitemaps if you have more than one.
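Beyond submitting sitemaps through the webmaster consoles, each one can also be referenced from robots.txt so that any crawler can discover them. A minimal sketch with hypothetical URLs:

```
# robots.txt at https://www.example.com/robots.txt
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-products.xml
```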

Google Analytics Setup

Google Analytics should be set up where applicable. Data about the keywords that drive traffic to your website is invaluable and allows for a fully informed decision as to what keywords to target.
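Setting this up is a matter of placing the Google Analytics tag in the head of every page. The snippet below is the standard gtag.js form at the time of writing; the measurement ID is a placeholder to be replaced with your own:

```html
<!-- Google Analytics (gtag.js) — replace G-XXXXXXX with your property ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX');
</script>
```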

Site Speed

A slow site speed is not only off-putting for potential customers, but as a confirmed ranking factor it can also have a negative impact on the website’s performance in the search engines.

Further analysis into the reasons behind a slow site performance can be done using Google’s own ‘PageSpeed Insights’ developer tool.
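PageSpeed Insights also exposes its analysis through a public API (v5), which is useful for auditing many pages in bulk. A minimal sketch of building the request URL; `pagespeed_url` is a hypothetical helper name, and fetching the URL (e.g. with `urllib.request`) returns JSON performance data:

```python
from urllib.parse import urlencode

# Google's PageSpeed Insights API endpoint (v5)
API_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_url(page_url, strategy="mobile"):
    """Return the PageSpeed Insights API URL for auditing the given page."""
    params = urlencode({"url": page_url, "strategy": strategy})
    return f"{API_ENDPOINT}?{params}"

print(pagespeed_url("https://www.example.com/", strategy="desktop"))
```

Running the audit for both `mobile` and `desktop` strategies gives a fuller picture, since the two scores can differ substantially.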

Social Media Signals

Social media is an increasingly important search engine ranking factor. Along with a website’s backlinks, social media sends signals about the interest in and popularity of both the website as a whole and individual pages, helping search engines rank that content accordingly. Social media is also an important source of additional traffic to a website.

Social Media Call To Actions

To ensure that a website is maximising its potential to perform on these social platforms, it is important to include social media calls to action on appropriate pages, encouraging users to share relevant content with their followers. These CTAs usually take the form of social media buttons.
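At their simplest, these buttons can be plain share links that need no JavaScript widgets at all (the page URL below is hypothetical and should be URL-encoded):

```html
<a href="https://twitter.com/intent/tweet?url=https%3A%2F%2Fwww.example.com%2Farticle">
  Share on Twitter
</a>
<a href="https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fwww.example.com%2Farticle">
  Share on Facebook
</a>
```

Plain links like these also avoid the page-speed cost of loading each network’s embedded widget script.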

Social Media Content

It is also key to ensure that your website contains content that is likely to be shared and perform well on social platforms.

Social Media Presence

As well as encouraging users to share your content, it is also important to have your own effective Social Media presence as another method to gain visits to your site. From an SEO point of view, Social Media can also help you build relationships with key influencers within the niche, leading to effective distribution of your content and also the possibility of converting relationships into links.

Mobile SEO

The importance of mobile SEO continues to increase as more users browse the internet from handheld devices. With this in mind, sites have to function correctly on a mobile device, whether that is the original website scaled down for mobile use or a dedicated mobile site built specifically for handheld devices.

Mobile detection

Search engines’ mobile crawlers should be directed, via user-agent detection, to the same mobile version that human visitors are sent to.

Some purpose-built mobile sites refuse access to anything other than mobile phones, making it impossible for search engines to access the site. Google can change its user-agent strings at any time and without notice, so allowing access only to “Googlebot-Mobile” may have a negative effect on mobile visibility if this occurs.
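The safer approach is to detect mobile visitors by generic user-agent characteristics rather than whitelisting specific crawler names. An illustrative sketch (the token list and function name are assumptions, not a complete detection scheme):

```python
# Detect mobile visitors by user-agent substring rather than whitelisting
# "Googlebot-Mobile" specifically, so crawlers are never locked out if
# Google changes its user-agent strings.
MOBILE_TOKENS = ("Mobile", "Android", "iPhone", "iPad")

def serve_mobile_version(user_agent):
    """Return True if this visitor should receive the mobile site."""
    return any(token in user_agent for token in MOBILE_TOKENS)

# A smartphone crawler UA contains the same tokens as a real handset,
# so it passes the same check — no crawler-specific rules required.
print(serve_mobile_version(
    "Mozilla/5.0 (Linux; Android 10) Mobile Safari/537.36"))  # prints: True
```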

If a dedicated handheld site has been created, then the correct URL structure has to be used to host the mobile version, with a subdomain being ideal for this. A subdomain carries the weight of the main domain, thereby gaining trust from search engines.

Mobile Duplicated Content

The common perception is that having a mobile site and a desktop site with exactly the same content will create duplicate content issues. However, this is not the case: both sites are indexed, and only the relevant version is returned for a given query. When “Facebook Mobile” is searched for via a search engine, for example, the result returned is the mobile site; returning the desktop site for this query would not match the searcher’s intent.

Mobile sites should therefore not be blocked from search engines, as the mobile site is relevant to more than just the mobile index. Best practice is to make it clear to the different search engines that you have a mobile site so that the crawlers can return relevant content for an array of user queries.

Mobile Sitemaps

Mobile sitemaps, similar to normal sitemaps, should be created to inform search engines of the pages and products present on the mobile site. These should be updated on a regular basis whenever the mobile site changes, directing the crawlers to the pages to index.

The mobile sitemap should only contain URLs that serve mobile content; any non-mobile URLs will be ignored by search engines.
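Google’s mobile sitemap format extends the standard sitemap protocol with a mobile namespace; each entry carries an empty `<mobile:mobile/>` tag to mark it as mobile content. A minimal sketch with a hypothetical subdomain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>https://m.example.com/</loc>
    <mobile:mobile/>
  </url>
</urlset>
```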

Image source: Search Engine Land
