I read Andy Beal’s Lowering The Google Red Flag article this morning, and it contains something I expected to see soon: skilled internet publishers challenging Google’s “ownership” of the web by blocking its crawler.

Andy has been hit hard by PageRank reductions because he writes paid reviews with followed links. We aren’t talking five-minute crappy reviews here; Andy tells us he puts hours of work into each review and is very discriminating about which reviews he takes on. One would have thought he would be the ideal subject for some Google love, but apparently those little $-signs worry the Googlites, and they can’t look past them to see the quality.

Andy’s answer is to block Google from his site. Because of the odd way Google interprets robots.txt (a robots.txt block stops Googlebot crawling your pages, but not indexing them), he expects to keep his traffic rather than be removed from the index; he actually thinks it will increase. That expectation rests on his strong link profile: the internet thinks well of him. PageRank is supposed to be a measure of link strength, so by cutting his anyway, Google is effectively disputing the value of those links.
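For anyone unfamiliar with the mechanics, a minimal sketch of a blanket Googlebot block looks like this (standard robots.txt directives, nothing site-specific):

    User-agent: Googlebot
    Disallow: /

Because Googlebot can no longer fetch the pages, Google won’t see their content, but URLs it already knows about, or discovers through inbound links, can still appear in the index as bare listings. That is the loophole Andy is counting on.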

The only thing I am not clear on is whether Andy will scope his robots.txt to let other bots in, or whether he is blocking all bots. I would have thought the main target of this tactic would be Google?
I’ll have to post a comment and ask him.
(Ah, I see: he’s specifying Googlebot in his robots.txt, and only for the ‘articles in question’. He is live with this now, btw.)
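So, presumably, his file looks something like the sketch below: a Googlebot-only block scoped to particular paths rather than the whole site. The paths here are my invention for illustration, not copied from his actual file:

    # Block only Google's crawler, and only from the paid-review pages
    User-agent: Googlebot
    Disallow: /reviews/

    # Everyone else crawls freely
    User-agent: *
    Disallow:

Since a crawler obeys the most specific User-agent record that matches it, every other bot falls through to the catch-all and stays unaffected, which answers my question above.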

Overall, interesting times. I will follow his progress with interest.