I got this from Google Webmaster Tools about my websites recently:
Quote:
http://mysite.biz/: Googlebot can't access your site
Over the last 24 hours, Googlebot encountered 1521 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
You can see more details about these errors in Webmaster Tools.
First of all, check that your robots.txt file can actually be accessed: go to http://yoursite.com/robots.txt in a browser.
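Beyond loading the file in a browser, you can sanity-check how crawlers would interpret your rules. Here's a small sketch using only Python's standard library; the robots.txt lines and URLs are hypothetical examples, not your actual file:

```python
import urllib.robotparser

# Feed the parser a robots.txt body directly (offline example).
# Swap in the real contents of http://yoursite.com/robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Check whether Googlebot is allowed to fetch specific pages.
print(rp.can_fetch("Googlebot", "http://mysite.biz/index.html"))   # True
print(rp.can_fetch("Googlebot", "http://mysite.biz/private/x"))    # False
```

If pages you expect to be crawlable come back `False`, the problem is in the rules themselves rather than in how the file is served.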
After that, log in to your Google Webmaster Tools account and try "Fetch as Google".
In my case, the site was loading fine from every location I tried, so I ran this "Fetch as Google" thing. It resulted in an endless "Pending" status that finally turned into "Failed".
It turned out the cause was the old Google Analytics tracking code. Somehow, in some cases, it blocked Googlebot from crawling the website. I spent hours investigating and finally found a solution, so if you ever bump into this, here goes.
How to fix
Just upgrade to the new, asynchronous tracking code, then log in to Google Webmaster Tools and run the "Fetch as Google" tool once again.