It must not be easy for the Google crew to be called "parasites or tech tapeworms in the intestines of the internet" by Robert Thomson, the managing editor of the Wall Street Journal.
Well fine: if publishers don't want their content crawled, they can easily block Google's spiders from crawling and indexing their pages, says Google spokesman Gabriel Stricker in this interview with Beet.TV.
Blocking Google is easily done through the Robots Exclusion Protocol, via a simple robots.txt file. Google explains how robots.txt is used here.
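For illustration, a minimal robots.txt blocking Google's crawler from an entire site might look like this (the user-agent name "Googlebot" is Google's published crawler identifier; the file itself lives at the root of the publisher's domain):

```
# Block Google's crawler from the entire site
User-agent: Googlebot
Disallow: /
```

A publisher who wants to stay out of all search engines, not just Google, would use `User-agent: *` instead.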
Gabriel told us that the reports coming out of a recent meeting of newspaper publishers were "confusing."
He told us that Google sends 1 billion clicks per month to the world's newspapers, and says the company hopes to work with newspapers to help them make money in an expanding online universe.
-- Andy Plesser, Executive Producer
Follow Andy Plesser on Twitter: www.twitter.com/Beet_TV