When a new SEO project lands in my hands, I feel an unstoppable urge to start doing things: as soon as I spot something that could be even slightly better, I want to fix it. But that is not a good idea. It is best to calm down, take a deep breath, and start with proper planning based on a solid study of the website and the competition.

Before starting any analysis or study, the first thing to do is browse the site to get as deep an understanding as possible of what we are facing.

There are many steps to follow when doing an on-site analysis worthy of the annals of history, but they can easily be grouped into four:

  1. Accessibility
  2. Indexability
  3. Have you been penalized?
  4. On-site positioning factors

Accessibility

If Google or users cannot access a page, it might as well not exist. So the first thing to check is whether our website is visible to the search engines.

Robots.txt file

The robots.txt file is used to prevent search engines from accessing and indexing certain parts of your website. It is very useful, but there may be situations where we block access to our own website without realizing it.

In this extreme case, the robots.txt file is blocking access to the entire website:

User-agent: *

Disallow: /

What we must do is manually check that the robots.txt file (normally found at www.example.com/robots.txt) is not blocking any important part of our website. We can also check this through Google Webmaster Tools.
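
For example, a quick way to run this check yourself is a small script based on Python's standard urllib.robotparser; the domain and the list of paths below are placeholders to replace with your own:

# Minimal sketch: check whether robots.txt blocks Googlebot from key URLs.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
IMPORTANT_PATHS = ["/", "/blog/", "/products/"]  # replace with your key sections

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # downloads and parses the robots.txt file

for path in IMPORTANT_PATHS:
    allowed = parser.can_fetch("Googlebot", SITE + path)
    print(f"{path}: {'allowed' if allowed else 'BLOCKED for Googlebot'}")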

Robots meta tags

Robots meta tags are used to tell search engine robots whether they can index a page and follow the links it contains. When analyzing a page, we must check whether any meta tag is mistakenly blocking the robots. This is how these tags look in the HTML code:

<meta name="robots" content="noindex, nofollow">
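
A quick way to sweep a handful of pages for this is a script that downloads each page and looks for a robots meta tag containing "noindex" or "nofollow". The URL list below is a placeholder and the regular expression is deliberately naive (it assumes name appears before content), so a proper crawler is still the safer option:

# Minimal sketch: flag pages whose robots meta tag says "noindex" or "nofollow".
import re
import urllib.request

URLS = ["https://www.example.com/", "https://www.example.com/contact/"]

META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url in URLS:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    match = META_ROBOTS.search(html)
    if match and any(d in match.group(1).lower() for d in ("noindex", "nofollow")):
        print(f"{url} -> robots meta tag: {match.group(1)}")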

HTTP status codes

If a URL returns an error (404, 502, etc.), neither users nor search engines will be able to access that page. In this step you must identify all the URLs that return errors so you can fix them later.

To identify these URLs I recommend using Screaming Frog: it quickly shows you the status of every URL on your site. The free version only lets you crawl 500 URLs, but if the website is not very big it will still be very helpful. Google Webmaster Tools also shows this information, although in my opinion it is less reliable.

We will also use this step to find out whether there are any redirects on the site and, if so, whether they are 302 temporary redirects (in which case we would turn them into 301 permanent redirects) or 301 permanent redirects that point to a page that has nothing to do with the original, in which case we will redirect them to where they should go.
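
As a rough sketch of this step, the following script prints the status code and the redirect chain of each URL, so 404/5xx errors and 302s that should be 301s stand out. It assumes the third-party requests library, and the URLs are placeholders:

# Minimal sketch: list status codes and redirect chains for a set of URLs.
import requests

URLS = ["https://www.example.com/old-page", "https://www.example.com/missing"]

for url in URLS:
    response = requests.get(url, allow_redirects=True, timeout=10)
    if response.history:  # one entry per redirect hop
        hops = " -> ".join(f"{r.status_code} {r.url}" for r in response.history)
        print(f"{url}: {hops} -> {response.status_code} {response.url}")
    else:
        print(f"{url}: {response.status_code}")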

Sitemap

The sitemap is literally a map for Google, one that makes sure Google finds all the important pages of our website. There are several important points to keep in mind with the sitemap:

  • If the sitemap does not follow the sitemap protocol, Google will not process it properly.
  • Is the sitemap uploaded to Google Webmaster Tools?
  • Are there pages on the website that do not appear in the sitemap? If so, you will have to update the sitemap.
  • If you find a page in the sitemap that Google has not indexed, it means Google could not find it; make sure that at least one link on the website points to it and that we are not blocking access by mistake.
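
To compare the sitemap against the pages we have already found while browsing, a small script can list every URL declared in it; the sitemap URL below is a placeholder:

# Minimal sketch: pull every <loc> out of an XML sitemap.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

xml_data = urllib.request.urlopen(SITEMAP_URL).read()
root = ET.fromstring(xml_data)

urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")
for url in urls[:10]:  # show a sample
    print(url)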

Web architecture

Make an outline of the whole website in which you can easily see the levels from the home page down to the deepest page, and how many “clicks” are needed to reach each one. Also check whether every page has at least one internal link pointing to it; this can be checked easily with Open Site Explorer by filtering the links to internal only. This matters because Google has a limited time to crawl a website: the fewer levels it has to jump through to reach the deepest pages, the better.
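
If you prefer a quick home-made check instead of a tool, a naive breadth-first crawl from the home page can record the click depth of each internal URL. This sketch uses the requests library and a simplistic link regex, so treat it as an approximation rather than a full crawler:

# Minimal sketch: breadth-first crawl recording the click depth of internal URLs.
import re
from collections import deque
from urllib.parse import urljoin, urlparse

import requests

START = "https://www.example.com/"  # placeholder home page
HREF = re.compile(r'href=["\']([^"\'#]+)', re.IGNORECASE)
MAX_PAGES = 200  # safety limit for the sketch

depth = {START: 0}
queue = deque([START])

while queue and len(depth) < MAX_PAGES:
    page = queue.popleft()
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for href in HREF.findall(html):
        url = urljoin(page, href)
        internal = urlparse(url).netloc == urlparse(START).netloc
        if internal and url not in depth:
            depth[url] = depth[page] + 1
            queue.append(url)

for url, d in sorted(depth.items(), key=lambda item: item[1]):
    print(d, url)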

Flash or JavaScript navigation

Although Google has become much smarter in recent years at reading these kinds of technologies, it is much better not to use them in order to avoid possible problems.

To check this you only have to browse the website twice: once with JavaScript enabled and once with it disabled. The SEO Toolbar for Firefox makes this much easier.

Loading speed of the website

It is well known that one of the factors that increases bounce rate is the loading speed of a page. But we must also bear in mind that the Google robot has limited time when crawling our site: the less time each page takes to load, the more pages the robot will manage to reach.

You can use several tools to check the loading speed of the website. Google Page Speed offers a large amount of data about what slows down the loading of the page, as well as tips on how to improve it.

Pingdom shows us, more graphically, the different elements that load on our website and how long our browser takes to load them.
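
For a very rough in-house measurement, you can time how long the raw HTML of a few pages takes to download. This only covers the HTML itself (not images, CSS or JavaScript), so Google Page Speed and Pingdom remain the reference; the URLs below are placeholders:

# Minimal sketch: rough download time of the raw HTML for a handful of URLs.
import time
import urllib.request

URLS = ["https://www.example.com/", "https://www.example.com/blog/"]

for url in URLS:
    start = time.perf_counter()
    body = urllib.request.urlopen(url).read()
    elapsed = time.perf_counter() - start
    print(f"{url}: {elapsed:.2f}s for {len(body) / 1024:.0f} KB of HTML")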

Indexability

We have already identified which pages search engines can access; now we have to find out how many of those pages they have actually indexed.

The “site:” search command

Google offers the possibility of searching with the “site:” command. This command restricts the search to a specific website, which gives us a very approximate idea of the number of pages Google has indexed.

This helps us compare the number of pages Google has indexed with the real number of pages on the website, which we already know from the sitemap and from browsing the site earlier. Three things can happen:

  • Both numbers are very similar… GOOD!
  • The number shown in the Google search is lower, which means that Google is not indexing many of the pages.
  • The number shown in the Google search is higher, which almost always means the website has a duplicate content problem.

In the second case we will review all the accessibility points in case we have missed something. In the third case we will check that our site does not have duplicate content, which I will explain later.
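
Once you have the two lists (the URLs in the sitemap and the indexed URLs exported by hand from the “site:” results or from Webmaster Tools), a few lines are enough to see which pages fall into each case; both file names below are placeholders, with one URL per line assumed:

# Minimal sketch: compare sitemap URLs against a hand-exported list of indexed URLs.
with open("sitemap_urls.txt") as f:
    sitemap_urls = {line.strip() for line in f if line.strip()}
with open("indexed_urls.txt") as f:
    indexed_urls = {line.strip() for line in f if line.strip()}

not_indexed = sitemap_urls - indexed_urls   # candidates for accessibility problems
unexpected = indexed_urls - sitemap_urls    # candidates for duplicate content

print(f"In sitemap but not indexed: {len(not_indexed)}")
print(f"Indexed but not in sitemap: {len(unexpected)}")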

I also recommend using good link tracking software to better measure the visibility and performance of your links.

Relevant pages

To know whether the best-positioned pages are really the important ones, we just have to search again with the “site:” command. Logically, the home page should appear first, followed by the most important pages of the website; if that is not the case, we must investigate why (we will see this below).

Brand search

After checking that the important pages are well positioned, we need to know whether the website ranks well for a search on its own brand name.

If the website appears in the top positions, everything is fine. If it does not appear anywhere, we have a problem: Google may have penalized the website, and it is time to find out.

And that’s it!

By following all these steps we can carry out an on-site SEO analysis of any website. It may mean hours in front of the screen, but the site will end up impeccable. I really like to insist that no matter how much link building you do, if those links point to pages that make us cry, they will not be worth anything.


About the author

Walter Ponce

Walter Ponce is the author of the internet marketing blog WalterPonce.com, where he shares proven strategies, tactics and tools to help you build a business that you love and live the life of your dreams. If you’re working on transforming an idea or a passion into an online business, and you want to start making money online, visit the section about Affiliate and Make Money Online.
