Bots and automated ranking must have difficulty distinguishing content that carries quotes/attributions from content that doesn't - can we really assume that is a ranking parameter in 2011?
Any odd developer can fix this - there are 5 copies of the content, and Google knows which one is the original, so why not ignore the remaining 4? I see no sign of Google learning from this mistake. Google has yet to remove duplicate content from its search results. I don't know what's stopping them - why would they need duplicate copies? People overestimate the power of Google's algorithm. It's basically a sh88t algorithm from wanna-be PhD holders at Google. Any Tom, Dick, and Harry can copy content from other sites and rank number 1. And these PhD brains at Google are still taking their time improving the algorithm just to discard duplicate sites and entries? Check sites like metroadvice .com - pure BS with content scraped from Yahoo Answers and other sites.
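To back up the "any odd developer can fix this" claim, here is a minimal sketch of near-duplicate detection using word shingles and Jaccard similarity. This is purely illustrative - the function names, the shingle size `k`, and the `threshold` value are all my own assumptions, and Google's actual de-duplication pipeline is obviously far more involved:

```python
def shingles(text, k=3):
    """Split text into a set of overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def is_duplicate(doc1, doc2, threshold=0.5):
    """Flag doc2 as a near-duplicate of doc1 if shingle overlap is high.

    The threshold is arbitrary here; a real system would tune it
    and use hashing (e.g. MinHash) to scale beyond pairwise checks.
    """
    return jaccard(shingles(doc1), shingles(doc2)) >= threshold
```

Once a scraped page is flagged as a near-duplicate of an already-indexed page, the older copy can be treated as the original and the rest dropped from results - which is all the comment above is asking for.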
Google's algorithm has many flaws as of now:
1. It can't differentiate between duplicate and original copies.
2. It has no way of detecting new sites and their content.
3. It has no respect for social signals and the places where people flock to discuss or read the info.
4. Its slaps are unjustified on many pages and sites.
5. Its indexing and de-indexing are very poor.
6. It has no respect for brands (for example, take the case of a product owner's site: you see fewer results for the owner and more for reviews of his products - that's sad).