Crawl Stats and Your Site
So you’ve created your sitemap, told the various webmaster tools where to find it and followed all the advice you can find about optimising your posts for SEO. And yet you still aren’t getting the traffic you think you should. You know what the trends are because you bought into the expensive keyword tool to check, you know the difficulty to rank for them is low, and yet things still aren’t going as well as you expected. Well, it may not be a case of your post’s SEO being off; it may be that your site’s SEO is off. And not in terms of the keywords you are ranking for, but the technical side of it. We are starting off looking at the more technical side of SEO here on The Blog Surgery, and first we need to understand what we are looking at.
Bots and Crawlers
The first part is all about “crawling”.
Being a mum with two crawlers and their friends gave me a perfect insight into how the bots work. Imagine putting a bunch of crawling babies in a room and watching them crawl around. Some follow the same paths, heading towards the popular areas. Others sit and wonder what on earth is going on and where they should go, and the adventurous ones find other routes out.
The bots on your site do basically the same. They arrive, then go where you tell them or where they can find a route. The more exits you have from the page they land on, the more routes they take; others stay and explore further down the page until they too find a link and move on.
BUT… there are only a certain number of bots visiting your site, so they can only follow a certain number of links on each visit.
How often these crawlers come to your site, and how many of them, is determined by your “crawl budget”. Your server log files can show you exactly this; however, in Search Console there is a set of graphs that gives you an idea of what is happening.
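If you do want to dig into your server logs yourself, counting Googlebot visits per day is a simple starting point. Here is a minimal sketch in Python, assuming your host writes the common Apache/Nginx “combined” log format (the log lines below are made-up examples; in practice you would read your own access log file rather than a string):

```python
# Sketch: count Googlebot visits per day from a web server access log.
# Assumes the common/combined log format; the sample lines and IPs
# below are illustrative only.
from collections import Counter
import re

sample_log = """\
66.249.66.1 - - [10/Mar/2019:06:12:01 +0000] "GET /about/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/Mar/2019:06:12:05 +0000] "GET /recipes/ HTTP/1.1" 200 7200 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
192.0.2.50 - - [10/Mar/2019:06:13:44 +0000] "GET /about/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
66.249.66.1 - - [11/Mar/2019:07:01:10 +0000] "GET /contact/ HTTP/1.1" 200 3100 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

def googlebot_hits_per_day(lines):
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # skip ordinary visitors and other bots
        # Pull the date (e.g. 10/Mar/2019) out of the [timestamp] field
        match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if match:
            hits[match.group(1)] += 1
    return hits

print(googlebot_hits_per_day(sample_log.splitlines()))
# Counter({'10/Mar/2019': 2, '11/Mar/2019': 1})
```

This only matches on the user-agent string, which other crawlers can fake, so treat the numbers as a rough picture rather than gospel; the Search Console graphs remain the easier option for most bloggers.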
These are unique to your site, and your site’s crawl budget will depend on a lot of things: age, number of backlinks, number of URLs the search bots know about, frequency of posting and so on.
On the Crawl Stats page the two most important graphs for you to get your head around are the bottom one, Time spent downloading a page, and the top one, Pages crawled per day.
With the time spent downloading a page (in milliseconds), it’s worth keeping an eye on this to make sure things aren’t going crazy. 294 milliseconds is 0.294 seconds, so even as a “HIGH” value I’m not concerned by it, and although, yes, I would like my site to load faster, the bots are seeing it load fine for them.
The top graph shows how many pages are crawled per day. If you have 10,000 URLs on your site then you would expect more to be crawled. I have around 3,000, so I’m pleased that on average it is crawling about that many! The low points may be days when it crawls just the posts rather than the images as well.
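One way to make sense of the Pages crawled per day number is to divide your total URL count by it, which tells you roughly how many days it takes the bots to get round your whole site. A quick back-of-the-envelope sketch, using the illustrative numbers from this post:

```python
# Rough arithmetic: how long a full crawl cycle takes at a given rate.
# Both numbers are examples taken from the post, not universal values.
urls_on_site = 3000           # URLs the search bots know about
pages_crawled_per_day = 3000  # average from the Pages crawled per day graph

days_per_full_crawl = urls_on_site / pages_crawled_per_day
print(days_per_full_crawl)    # 1.0 -- every URL revisited roughly daily
```

If your site had 10,000 URLs but the same crawl rate, the same sum gives over three days per cycle, which is why bigger sites care so much more about crawl budget.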
The most important thing with these graphs is to keep an eye on them and check whether things are going wrong. Then you can seek help to solve them.
COMING NEXT -> Making Sense of Where the Bots go!