Something that freaks me out more than anything online is seeing one of my websites or pages disappear from Google’s index from time to time. There is usually a good reason when it happens, and troubleshooting the issue quickly and thoroughly can save you tons of “leaking” traffic. I had exactly that kind of problem over the weekend. Let me tell you what happened and how I fixed it.
How Did I Find Out There Was A Problem?
I check my Google rankings for the term “iPhone blog” on a daily basis. iPhone Download Blog (my iPhone blog) usually ranks #4. Last Monday, though, it was nowhere to be found! A search for “site:www.iPhonedownloadblog.com” showed that I still had 27,000 pages indexed, but again, the homepage was nowhere to be found. Hmm, that didn’t smell good…
I then went and checked my Google Webmaster Tools account, which usually helps me figure out quickly whether anything is wrong with any of my sites. Most of the time, everything is fine. From time to time, I get a little error on one or two of my sites, which in most cases can be fixed simply by generating a new sitemap and resubmitting it to Google.
Last Monday, I completely freaked out when I realized that all my sites were showing errors! All of them! I thought it was very strange, but then I remembered that my server had been down for a few minutes the day before. I figured Google must have tried to crawl my sites at that time and had been unable to do so. I simply resubmitted all my sitemaps, thinking it would do the trick.
What the Hell Happened?
As a precaution, I checked Google Webmaster Tools again a few minutes later and saw that Google, once again, was reporting errors. The issue was obviously bigger than I thought. I decided to dig deeper into the problem and realized that Google was getting a 403 error, which means:
The server is refusing the request. If you see that Googlebot received this status code when trying to crawl valid pages of your site (you can see this on the Web crawl page under Diagnostics in Google Webmaster Tools), it’s possible that your server or host is blocking Googlebot’s access.
It was clear now. DreamHost, my hosting company, had blocked Googlebot from crawling my server. I had had the same problem with DreamHost before, so it was only half a surprise…
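Incidentally, one quick way to test for this kind of block yourself is to request a page while identifying as Googlebot and look at the status code you get back. Here is a minimal sketch in Python, using my own blog’s URL as the example; the caveat is that a block applied at the host level may key on Googlebot’s IP addresses rather than on the User-Agent string, in which case it won’t reproduce from your own machine:

import urllib.error
import urllib.request

# Quick check: does the server return a 403 to a Googlebot-style request?
# Caveat: this only catches User-Agent based blocks; a block on
# Googlebot's IP range won't reproduce from your own machine.
req = urllib.request.Request(
    "http://www.iphonedownloadblog.com/",
    headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
)
try:
    print(urllib.request.urlopen(req).getcode())  # 200: served normally
except urllib.error.HTTPError as err:
    print(err.code)  # a 403 here matches what Webmaster Tools reported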
How the Hell Do You Fix This?
The first thing I did was send a support request to DreamHost. I tried to be polite in my email, but I was really furious. I think it’s very unprofessional of them to block anything from accessing my server without even notifying me. A few hours later, I still didn’t have an answer… DreamHost is usually fast at replying, but I guess they weren’t that day…
So I sent them a second support request, a little spicier than the first one. A couple of hours later I received this email:
Hello Sebastien,
I sincerely apologize for this! This was handled incorrectly, and NOT according to our own policy. It looks like a few days ago you wrote in regarding server performance. It appears part of that was actually googlebot slamming your sites. This was blocked to preserve server stability (which is normal troubleshooting), what didn’t happen, and SHOULD have, was that you were not notified and told how to deal with this. I’ve removed the block, but please see the following article:
http://wiki.dreamhost.com/Bots_spiders_and_crawlers
There is a way to slow down googlebot without killing your search results OR the server. This is the preferred approach. We obviously don’t want to hurt your sites.
Again, I apologize for this, and if it helps any, it looks like just an IP range was blocked, and whoever was troubleshooting the server wasn’t really *trying* to block google from your site…just an IP that was causing problems for everyone.
Please let me know if you have any other questions.
Thanks!
Jeff H
Alright, fair enough… What could I do anyway? So this part of the problem was handled, but I still needed to tell Google that everything was back up so it could start crawling my sites again ASAP. (More on the throttling trick the email mentions just below.)
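About that throttling: the standard mechanism for slowing down a well-behaved crawler, and presumably what the DreamHost wiki article describes, is a Crawl-delay directive in robots.txt. Here is a minimal sketch; the 10-second value is arbitrary, and note that Googlebot itself ignores Crawl-delay (its crawl rate is set inside Google Webmaster Tools instead), while most other legitimate bots do honor the directive:

# robots.txt: ask well-behaved crawlers to pause between requests.
# Googlebot ignores Crawl-delay; for it, use the crawl rate setting
# in Google Webmaster Tools. The 10-second value is just an example.
User-agent: *
Crawl-delay: 10

With that noted, back to the recovery.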
I resubmitted all my sitemaps again. I also wrote a few new posts on my iPhone blog in order to “prove” to Google that the site was still alive and doing well. Finally, I linked to my main sites from various other websites in the hope it would speed up the crawling process.
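Incidentally, you don’t have to click through the Webmaster Tools interface for every sitemap; Google also accepts a plain HTTP ping of the sitemap URL. Here is a quick sketch in Python, with a placeholder sitemap address:

import urllib.parse
import urllib.request

# Ping Google so it re-fetches a sitemap, without a Webmaster Tools login.
# The sitemap URL below is a placeholder; substitute your own.
sitemap = "http://www.example.com/sitemap.xml"
ping = ("http://www.google.com/webmasters/tools/ping?sitemap="
        + urllib.parse.quote(sitemap, safe=""))
print(urllib.request.urlopen(ping).getcode())  # 200 means the ping was accepted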
Googlebot came back and crawled the sites. The next day (Tuesday), iPhone Download Blog was back in Google’s search results, but it was ranking at about #35 for my keyword. I realized that even though it was back in the index, Google didn’t have a cached copy of it yet. I assumed Google would come back, crawl the site again, and then generate a fresh cached copy.
On Wednesday, still no cache! It was ranking slightly better, though, showing up on the second page. Still far from what I was looking for, but at least it was progressing a bit.
On Wednesday night, the homepage completely disappeared from the index again. While some might worry about this, I’m used to it by now and I know what it means: Google has crawled the site again and is generating a cached copy of it. This usually happens when you create a new page. Google will crawl it and rank it, then it will disappear from the index for a few hours, and finally it will come back for good. It’s as if Google hurries to crawl it, but then takes its time to analyze it and make sure it’s not crap.
I went to bed that night confident that my site would be back by the time I got up the next morning. Sure enough, it is ranking again this morning, as well as ever. Yeahhh!!! All my other sites are back too.
What’s the Moral of the Story?
Well, the moral of the story is that you should check your sites often, if not daily. Make it a habit to check your Google Webmaster Tools account to make sure nothing is wrong. If there is a problem, identify it and hurry to fix it. Google Webmaster Tools is a great help for this, and if you haven’t created an account yet, I suggest you do so now.
This little adventure cost me about a 40% drop in traffic on my iPhone blog. Yes, it hurts! I haven’t measured the losses on my other sites, as they are not as important as my iPhone blog. My money-making sites rely on PPC to drive traffic, so it wasn’t a big issue for them.
I’m still pissed at DreamHost, but you get what you pay for… I considered moving my sites to a new hosting company, but it would be a nightmare: I have over 20 sites, most of them database-driven, and my technical skills are just not sharp enough to migrate all of that by myself. I guess I’ll stick with DreamHost until something really major happens.
11 replies on “The Google Freak Out”
Shucks. I had no idea these things could happen. I came into the online business world all rosy and hopeful but now I live in daily fear that some tech problem could ruin my life forever. Sigh. Fortunately I have a very good host… so far
lol… I always tweak the .htaccess file of my website to avoid this kind of error…
Well, in this case, my host directly blocked Googlebot, so there was nothing I could do on my end.
You solved this problem well. I’ve learned from it, so I’ll know what to do if the same problem happens to me, and there’s a good chance it will.
Thank you for the information.
Thanks for this, love the pic of Dr. Evil.
Hmmm, interesting little saga, shows that you have to be on the ball when it comes to Google. BTW, love your site design.
There are a lot of reasons you can fall out of Google’s listings, and sometimes it’s just for a short while. Google has a lot of data centers, and you will rank differently on each one for whatever reason.
I’m always #1 for one of my keywords, and the other day I fell to the second page. I, too, freaked out. I went through all the same things you did, and then all of a sudden there I was at the top of the list again.
Well, as it turns out, the data center close to me was down for a few minutes, and who knows what geographical region my search was re-routed to, but I’m not #1 there.
I guess what I’m saying is… there’s no need to freak out. Every keyword tends to bounce around a little for one reason or another. Webmaster Central’s keyword rank is an average.
If you guys use WordPress for your websites, which you should at this point for things like this, I use the “Redirection” plugin in tandem with the “Ultimate SEO” plugin. You get a 404 monitor in there, updated daily, and then you can set up redirects in the plugin much more easily than by dropping in an .htaccess file. Super fast, and it handles the job. Interesting, though, about the Googlebots getting blocked; I’d never seen that at the hosting level before.
Good housekeeping practices and routines are essential for any website.
Scary, man. Thanks for the info.
Gaz