I’m not sure whether to call it an issue, but duplicate content is feared a lot by SEOs and webmasters – mostly unnecessarily. There’s a fear that a site could be banned from Google for having duplicate content, and since there’s no documentation to prove this, I’d call it more speculative than real.
Having said that, duplicate content is definitely not something you can ignore, as it could cause real problems on the site if left undealt with.
But first, what is duplicate content?
1 – Order one – Inter-site duplicate content.
First-order duplicate content occurs due to the repetition of content across websites. This is mostly a result of scraping or the very obvious cut-copy-paste syndrome!
2 – Order two – Intra-site duplicate content.
Second-order duplicate content occurs due to the repetition of content at various locations within a single site. This is mostly caused by technical errors or a bad site structure.
How can you reduce duplicate content and avoid possible problems?
Solution 1 – Do not publish content on your website without checking for possible duplicates already on the web.
Websites sometimes employ third-party content writers, and this can prove risky, as the writers themselves may unknowingly copy content from the web.
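As a pre-publish sanity check, you could screen a draft against text you already have. Here’s a minimal sketch of one common approach – word n-gram “shingles” compared with Jaccard similarity. The function names and the 0.3 threshold are illustrative, not a standard:

```python
# Minimal sketch: flag a draft as a likely duplicate of existing text
# using 4-word shingles and Jaccard similarity. Threshold is illustrative.

def shingles(text, n=4):
    """Return the set of n-word shingles in text (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=4):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def looks_duplicated(draft, existing_pages, threshold=0.3):
    """True if the draft overlaps heavily with any existing page."""
    return any(similarity(draft, page) >= threshold for page in existing_pages)
```

This catches verbatim and lightly edited copies; it won’t catch heavy paraphrasing, so it complements rather than replaces a manual check.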
Solution 2 – Do not publish the same content at various places on the website.
For instance, WordPress repeats the same content in more than one place – on category, tag, and date archive pages – and if this is not dealt with properly, it can leave you exposed.
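Part of the problem is that one post can be reachable under several URLs (query strings, fragments, with and without a trailing slash). A minimal sketch of collapsing those variants to one canonical target – the example URLs and the normalization policy are hypothetical, not an official WordPress rule:

```python
# Minimal sketch: map several URL variants of one post to a single
# canonical URL, so every variant can declare the same canonical target.
# The normalization policy below (strip query/fragment, force trailing
# slash) is illustrative only.

from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Strip query string and fragment, and force a trailing slash."""
    parts = urlsplit(url)
    path = parts.path if parts.path.endswith("/") else parts.path + "/"
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))

variants = [
    "https://example.com/2013/05/duplicate-content/?replytocom=7",
    "https://example.com/2013/05/duplicate-content/#comments",
    "https://example.com/2013/05/duplicate-content",
]
targets = {canonical_url(u) for u in variants}
# all three variants collapse to one canonical target
```

In practice an SEO plugin does this for you by emitting the canonical link tag on every variant; the point is that all variants must agree on one target URL.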
Solution 3 – Opt for foolproof website structures.
Sometimes the website structure gets so dynamic and complex that webmasters don’t have a clue where the pages are and how the content is getting republished. To avoid this, rule out all the possibilities of duplicate publishing by using robots noindex meta tags and canonical tags.
Essentially, duplicate content is either accidental or deliberate, and you’re lucky if it’s accidental, because you now have more ways to fix it than ever – it’s just a matter of rearranging things and using the tools. But if it’s deliberately crafted duplicate content (like scraping), the risk is too great and the effects can be fatal. So refrain from it.