On 25 February 2011, Google rolled out a change to its search algorithm. It is designed to deliver higher-quality, more relevant results to users by pushing content farms and spam out of the top search positions. The targeted sites are those that duplicate content from authority websites or host content that has been replicated across a large number of scraper sites.
Google also launched the Personal Blocklist Chrome extension, which lets users block websites they find worthless. Google sees this as a useful tool for checking whether the algorithm change is performing correctly, and reported that the update affected 84% of the sites users blocked most often.
Google will not, however, feed Blocklist data directly into its spam identification. Doing so could open the door to yet another black-hat SEO technique, letting people game the search results.
Who is affected?
Google appears to devalue content produced with low quality in mind, such as articles mass-produced by hired writers with no knowledge of the subject and then submitted to a large number of article directories. Using automated article-submission software has long been considered a black-hat SEO technique, and it is now effectively penalised by Google.
Major article directories like EzineArticles and HubPages have been affected. Although the articles on these sites are often unique to begin with, they are later copied and republished on additional sites free of charge, or submitted to hundreds of other article directories.
Sites that copy articles from directories are usually required to provide a link back to the directory. This link-building technique will need to be revisited in light of the algorithm change.
The good news is that Matt Cutts has said that searchers are now more likely to see the sites that own the original content rather than sites that scraped or copied it.
The sites hit hardest are "scraper" sites that publish no original content of their own but instead duplicate content from other sources via RSS feeds, aggregate small snippets of content, or simply scrape and copy content from other sites using automated methods.
What about Knol?
If EzineArticles, HubPages and Squidoo dropped in the rankings, shouldn't Knol, a Google property that lets users post their own articles, drop as well? How is Knol any different? Its articles can likewise be submitted to other article-hosting sites.