
Murdoch considers Google ban

2009-11-11 10:37 BJT

BEIJING, November 10 (Xinhuanet) -- Rupert Murdoch, chairman and chief executive of News Corporation, has said he is likely to remove his newspapers' stories from Google's search index once his company has established a system to start charging for online content.

News Corp is planning to start charging for access to its newspaper websites, including the Sun and the Times, from next year. Murdoch told Sky News Australia that a number of websites, including Google, Microsoft and Ask.com, "steal our stories without payment".

His comments mark a continuation of his battle against sites such as Google, which he accuses of "kleptomania" and acting as a "parasite".

Asked during the interview why News Corp didn't take Google's advice and de-list from its search rankings, the 78-year-old mogul said, "I think we will...but that will be when we start charging. We do it all already with the Wall Street Journal. We have a wall, but it's not right to the ceiling; you can get the first paragraph of every story, but if you are not a subscriber to WSJ.com you get a paragraph and a subscription form."

Murdoch also dismissed the idea that Google's use of news snippets amounts to fair use. "There's a doctrine called fair use, which we believe (needs) to be challenged in the courts and would bar it altogether… but we'll take that slowly," he told the news station.

He was also asked how his plan would work given the number of other news outlets that effectively provide a free service. "But we are better," Murdoch insisted, referring to the BBC. "If you look at them, most of their stuff is stolen from the newspapers now, and we'll be suing them for copyright. They'll have to spend a lot more money on a lot more reporters to cover the world when they can't steal from newspapers."

Google, meanwhile, seems unconcerned by Mr. Murdoch's threat to de-list his sites. Google founders Sergey Brin and Larry Page have previously said that if news organisations don't want Google indexing their content, it takes only two lines of text added to a file called "robots.txt", which websites use to tell search engines where, and where not, to crawl. By placing the lines "User-agent: *" and "Disallow: /" in that file, a website will vanish from Google's index.
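Those two lines can be checked without waiting for a crawler to visit. As a minimal sketch, Python's standard urllib.robotparser module interprets robots.txt rules the same way compliant search-engine crawlers do; the example.com URL below is a placeholder, not a News Corp address:

```python
from urllib.robotparser import RobotFileParser

# The two lines Brin and Page describe: apply to every crawler ("*"),
# and disallow every path on the site ("/").
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# A compliant crawler such as Googlebot may not fetch any page.
allowed = parser.can_fetch("Googlebot", "https://example.com/news/story.html")
print(allowed)  # False
```

Because the "User-agent: *" rule matches every crawler, the same result holds for Bing, Ask or any other robots.txt-respecting search engine; blocking only one of them would require naming its user agent instead of using the wildcard.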

"Google delivers more than a billion consumer visits to newspaper websites each month. These visits offer the publishers a business opportunity, the chance to hook a reader with compelling content, to make money with advertisements or to offer online subscriptions," Josh Cohen, Google senior business product manager, wrote in a blog post in July. "The truth is that news publishers, like all other content owners, are in complete control when it comes not only to what content they make available on the web, but also who can access it and at what price."