So there has been a huge buzz around duplicate content since Google added it to its core algorithm. However, it's not as bad as some people make out. You need to think about it the way Google intends and look at it from a user's perspective.
Let's say you searched the term "website duplicate content", but then imagine if the listings on Google were all exactly the same article. How annoying and useless would that be? There would be no diverse opinions, no fresh facts, just the same spun content over and over. In reality that is roughly what you do find, except the content is "different" enough that it doesn't count as duplicate.
So how does it work? Basically, the first recorded version of the duplicate content stays in the index (it could lose a few positions), while the pages that duplicate it disappear or fall very far down the rankings. Remember, this is a usability issue for Google's search results, so it is not a penalty. Your listings would not vanish or get blacklisted; however, the duplicate content pages would drop dramatically in the SERP results.
So if Google does not see this as a reason to penalise a website, then it's nothing to worry about, right? Wrong. If you want to rank well on Google and have all your pages hitting those all-important target keywords, then you need to ensure every page contains unique, readable and useful information. The more relevant, unique and informative you can make your content, the better as far as Google is concerned.
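If you want a quick sanity check of your own pages, a rough near-duplicate test can help. The sketch below is purely illustrative (it is not how Google measures duplication, and the page texts and the 0.8 threshold are made-up examples); it compares two blocks of copy using word-shingle Jaccard similarity, a common rough gauge of how much wording two pages share.

```python
def shingles(text: str, size: int = 3) -> set:
    """Break text into overlapping word n-grams (shingles)."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def similarity(page_a: str, page_b: str) -> float:
    """Jaccard similarity of the two pages' shingle sets: 0.0 = unrelated, 1.0 = identical."""
    a, b = shingles(page_a), shingles(page_b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical example: two product descriptions that share most of their wording.
page_one = "Our blue widget is durable, lightweight and ships free worldwide."
page_two = "Our blue widget is durable, lightweight and ships free to the UK."

score = similarity(page_one, page_two)
print(f"Similarity: {score:.2f}")
if score > 0.8:  # threshold chosen arbitrarily for this illustration
    print("These pages look like near-duplicates - consider rewriting one.")
```

A high score between two of your own pages is a hint that one of them should be rewritten or consolidated, which ties straight back to the point above: every page should earn its place with unique, useful content.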