Today’s WebProNews article dropped in with a section highlighting a topic brewing in their forums: if you are syndicating your own content for the sake of serving localized searches, is that still considered duplicated content in Google’s eyes?
While it makes sense that we should work to serve our customers and “not change the business just because of Google”, the big G has also not been very forgiving about duplicated content.
From my point of view, it is still a no-no to have the same content on two or more sites – if you want both to perform well… unless you are already an authority site like most online newspapers, which are known to be in the business of running syndicated stories from all around.
Google already gives us rel=”nofollow” tags, and section targeting for AdSense publishers to demarcate which content to base their contextual ads on – so shouldn’t they create something similar to section targeting, but specifically for duplicated content? i.e. a way to demarcate a section saying “This is syndicated content and I know it. I don’t expect this page to get listed in your rankings, because I’m doing this to serve my customers – but don’t penalise me for it.”
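For reference, AdSense section targeting works with HTML comments around the content block. A hypothetical duplicate-content marker could follow the same pattern (the `syndicated_content` comments below are my invention, not an actual Google feature):

```html
<!-- Actual AdSense section targeting: tells the crawler which
     content to emphasise (or ignore) for contextual ads -->
<!-- google_ad_section_start -->
<p>Main article content that ads should be matched against.</p>
<!-- google_ad_section_end -->

<!-- Hypothetical equivalent for duplicated content: "index the page,
     but don't count this block against me as duplicate content" -->
<!-- syndicated_content_start -->
<p>Syndicated story republished here to serve local readers.</p>
<!-- syndicated_content_end -->
```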
Update 2: Sheesh… I may be answering my own question here, but perhaps a directive in the robots.txt file might do the trick, albeit not at a scale as fine-grained as section targeting.
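Something along these lines, assuming the syndicated copies live under their own directory (the `/syndicated/` path is just an example):

```text
# robots.txt – keep crawlers out of the syndicated copies entirely.
# Note this blocks crawling of the whole directory; it cannot flag
# individual sections within a page the way section targeting can.
User-agent: *
Disallow: /syndicated/
```

The coarseness is the catch: robots.txt only works per URL path, so it helps if the duplicated pages are neatly separated, and the blocked pages won’t rank at all – which may be acceptable if the goal is purely to serve customers rather than search traffic.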
What are your thoughts?