If you are an online publisher, it is frustrating to discover that someone is stealing your content and republishing it elsewhere online. Even worse is seeing that the competing site outranks you on Google for searches related to your own content. If your business relies on converting search engine traffic, a loss of ranking can severely impair your income.
Google recently published guidelines outlining how duplicate content and systematic content theft affect search engine rankings. Many businesses have expressed concern and irritation at the policy, and in response Google has launched a reporting tool to address the problem.
The new tool, called the Google Scraper Report, was announced recently by Matt Cutts of Google’s web spam team.
Site owners’ expectations are high, but the Google Scraper Report form is not a quick fix: Google promises no immediate remedy, or indeed any remedy at all. The tool lets site owners submit the URL of the original content, the URL of the scraper’s copy, and the search query for which the scraper outranks the original.
Our SEO experts note that a DMCA system already exists: it allows site owners to report infringing content, but the process of filing such a report with Google is time consuming and requires considerable expertise. In the best case, Google’s spam team will act against the infringing content on the scraper site, treating it as a spam offense rather than a copyright violation.
The Google Scraper Report form is unfortunately not a complete solution to the scraping problem. There are billions of pages to index, and automatically determining which copy of a piece of content is the original appears insurmountable. A further downside is that a competitor could claim ownership of your content and submit that claim through the form before you do.
It is at least evident that Google is gathering examples in order to rank original content above copies, and for that we give the company a “B” for effort.
If you suspect that your website is being systematically scraped, you need protection in real time rather than trying to solve the problem reactively with less effective reporting tools.
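As a rough illustration of what real-time protection can mean in practice, here is a minimal sketch of one common technique: flagging IP addresses that request pages faster than a human plausibly would. The class name, thresholds, and method names below are our own hypothetical choices, not part of any Google tool, and a production setup would combine this with other signals (user agents, honeypot links, CAPTCHAs).

```python
import time
from collections import defaultdict, deque


class RateLimiter:
    """Flag clients that request pages faster than a human plausibly would.

    Hypothetical thresholds: more than `max_requests` hits from one IP
    within a sliding window of `window_seconds` is treated as likely scraping.
    """

    def __init__(self, max_requests=30, window_seconds=60):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def is_scraper(self, ip, now=None):
        """Record one request from `ip` and report whether it looks automated."""
        now = time.time() if now is None else now
        window = self.hits[ip]
        window.append(now)
        # Drop timestamps that have fallen out of the sliding window.
        while window and now - window[0] > self.window_seconds:
            window.popleft()
        return len(window) > self.max_requests
```

In use, a web server would call `is_scraper()` once per incoming request and throttle or block the client when it returns `True`; for example, with `max_requests=3` and `window_seconds=10`, four requests in four seconds from the same IP trip the flag on the fourth request.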