Most SEOs would agree that Google’s current duplicate-content filter still needs some work. Could this new patent, filed on February 21, 2008, be the answer?
Ever since blogs and RSS syndication took off, thousands upon thousands of web pages have been copied every day. In fact, scraping has become one of the cash cows of search engine spammers: they churn out “splogs” (spam blogs) using scripts that automatically repost content pulled from RSS feeds.
This is largely why Google has been keen on cleaning up duplicate content. According to Google, it makes no sense for users to see the same content repeated in the SERPs when they search for something.
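To get a feel for how a duplicate filter might flag copied pages, here is a toy sketch using w-shingling and Jaccard similarity, a classic near-duplicate technique. This is purely illustrative and an assumption on my part; the patent itself may describe an entirely different method.

```python
# Toy near-duplicate check using w-shingling and Jaccard similarity.
# NOTE: this is an illustrative sketch of one classic technique,
# not the method described in Google's patent.

def shingles(text, w=3):
    """Return the set of w-word shingles (overlapping word runs) in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "google patents a new way to filter duplicate content in search results"
verbatim = original  # a splog repost via RSS is usually a word-for-word copy
unrelated = "the weather in manila stayed sunny for most of the afternoon today"

print(round(jaccard(shingles(original), shingles(verbatim)), 2))   # 1.0
print(round(jaccard(shingles(original), shingles(unrelated)), 2))  # 0.0
```

A verbatim splog copy scores 1.0 while unrelated text scores near 0.0, so pages above some similarity threshold could be collapsed into a single search result.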
I do hope the original page will not be mistaken for one of the “copies”. I’ve read of cases where original websites were penalized because so many duplicates of their content existed.
[via seobythesea.com]