A scraper site is a spam website that copies content from other websites using web scraping. Why do this? Usually to collect advertising revenue, or to manipulate search engine rankings by linking to other sites. Because of this, search engines sometimes display the scraped snippets instead of the original site's content when a user searches. Scraping is one of the most frustrating experiences a publisher can have: the scraper not only copies the content but may even outrank the original on Google, causing real losses for the publisher. Publishers have raised many complaints with Google, and Google has now launched a tool called the Google Scraper Report.
Normally the original content should rank first in search results, but sometimes you can find scraper sites ranking above the content originator. In worse cases the original source vanishes from the results entirely while the scraper site continues to rank well. The new tool is an opportunity for webmasters: you can report a scraper site by providing the URL of the page the content was taken from, the URL of the scraper page where it was republished, and the search query for which the scraper site was ranking. The form also asks the webmaster to confirm that their own site follows Google's guidelines before submitting.
Some scraper sites are not ranking for money keywords, but they still clutter the search results. The only way to get scraped content removed from Google's results is to file a DMCA request; this will remove the infringing content, but the process takes time to complete. If you are a blogger, you can always find duplicate copies of your content using any of the available duplicate-content tools. Fill out the Scraper Report form and help Google understand why the copies are ranking better. Once you have sent a request, you can check its status in your Webmaster Tools account.
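As a rough illustration of how a duplicate-content check can work (this is a minimal sketch using Python's standard library, not any official Google tool; the function names and threshold are assumptions for the example), you can compare your article's text against text fetched from a suspected scraper page:

```python
from difflib import SequenceMatcher

def similarity(original: str, candidate: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two page texts."""
    return SequenceMatcher(None, original, candidate).ratio()

def looks_scraped(original: str, candidate: str, threshold: float = 0.8) -> bool:
    """Flag the candidate as a likely scrape when it is nearly identical.

    The 0.8 threshold is an arbitrary choice for this sketch; tune it
    to your own content before relying on it.
    """
    return similarity(original, candidate) >= threshold

# Hypothetical example texts:
article = "Google launched a new Scraper Report tool for webmasters."
near_copy = "Google launched a new Scraper Report tool for webmasters!"
unrelated = "A recipe for sourdough bread with a long fermentation."

print(looks_scraped(article, near_copy))   # near-verbatim copy
print(looks_scraped(article, unrelated))   # unrelated text
```

In practice you would fetch the candidate page, strip its HTML, and compare the remaining text; a dedicated plagiarism-detection service will do a far more robust job than this character-level ratio.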
The Google Scraper Report doesn’t promise an immediate fix, or any fix at all. It simply asks people to share their original content URL, the search query where the scraper outranks them, and the URL of the scraper page that took their content. This is a real challenge for Google: it needs to apply more common sense to the problem rather than more paperwork. Why is Google calling on everyone on the web to report content scrapers and spam websites? Because Google is in a constant PR battle to deliver relevant search results. Striving for quality results benefits users, rewards SEOs who create original content, and deters spammers.
So start using the Google Scraper Report to report scraper sites.