Duplicate Content Issues – How They Affect You
Duplicate content issues affect a variety of webmasters, including the affiliate marketer, the mashup developer, and the e-commerce store manager.
Here are some prime examples:
- An affiliate site with product descriptions that come from merchant datafeeds
- A shopping mashup where the product information comes from partners' web services or APIs
- An e-commerce store where product inventory information comes from a partner or supplier
What do all of these scenarios have in common? They can all suffer greatly from duplicate content. If you are using a content source that is widely used, chances are your site will be seen as having duplicate content by the search engines. The best way I can tell you to look at duplicate content is to measure your own added value. If your site adds no further value to the content source, the search engines are generally not going to like it, and rightly so. From a consumer standpoint, we don't need 100 affiliate websites appearing first in Google with the exact same content.
From what I gather, there is no 100% solution. One suggestion in the session was to have the product information, namely product titles or descriptions, changed or enhanced enough to be considered different (if you're going to go to this extreme, you should probably optimize them for relevant keywords too). This solution is good in concept, but often not practical. Fine, your store has a couple hundred products listed; dedicate a person to this for a few weeks. But what if your store has a couple million products? A single affiliate marketer doesn't have the manpower to change a million product descriptions. What can he or she do? It isn't clear cut, but mixing in unique content, preferably user-generated content, is a great start; a rough sketch of the idea follows.
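As a hedged illustration only, here is a minimal Python sketch of what "mixing in unique content" could look like in practice: blending the stock datafeed copy with user reviews so each page carries text no other affiliate of the same feed has. The field names, the review list, and the merge logic are all my own assumptions for the example, not anything prescribed in the session.

```python
# Minimal sketch, assuming a product dict from a merchant datafeed and a
# separate list of user-submitted reviews; all names here are hypothetical.

def enrich_description(product, reviews):
    """Blend the merchant's stock copy with unique, user-generated text."""
    parts = [product["description"]]
    # The reviews are the unique content the widely syndicated feed copy
    # lacks; surface a couple of them alongside the stock description.
    for review in reviews[:2]:
        parts.append('Customer review: "%s"' % review)
    return "\n\n".join(parts)

product = {"sku": "ABC-123",
           "description": "Stock copy pulled from the merchant datafeed."}
reviews = ["Great value for the price.", "Shipped fast, works as described."]
print(enrich_description(product, reviews))
```

The point isn't the code itself but the pattern: every indexable page ends up with some text that the thousand other sites consuming the same datafeed don't have.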
If I mix together existing (duplicate) content from multiple sources, does it become new unique content?
Even after the session, I'm still not clear on this topic, since it was never really addressed. It is obvious that in some instances the search engines consider it content worthy of indexing. Examples include the many aggregator sites that have popped up that simply mash related feeds together and form a new site from them. ClipFire is one example of a deal aggregator that does this. It takes feeds from deal sites around the web, including our own SecretPrices Deals feed, and puts them together in a smart manner. ClipFire has over 150,000 pages indexed by Google. That's pretty good for getting all your content from other sites' feeds. A rough sketch of the pattern follows.
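To make the aggregation pattern concrete, here is a minimal Python sketch of it: pull items from several partner feeds, drop duplicate deals, and merge the rest into one listing. To be clear, the feed URLs and RSS field names are placeholders I made up, and this is not ClipFire's actual implementation, which wasn't discussed.

```python
# Rough sketch of feed aggregation, assuming basic RSS feeds with
# <item><title>/<link> elements; URLs below are placeholders.
from urllib.request import urlopen
import xml.etree.ElementTree as ET

FEED_URLS = [
    "https://example.com/deals-a.xml",  # placeholder partner feed
    "https://example.com/deals-b.xml",  # placeholder partner feed
]

def fetch_items(url):
    """Parse a basic RSS feed into a list of title/link dicts."""
    tree = ET.parse(urlopen(url))
    return [{"title": item.findtext("title", ""),
             "link": item.findtext("link", "")}
            for item in tree.iter("item")]

def aggregate(urls):
    """Merge feeds, keeping only the first copy of each deal title."""
    seen, merged = set(), []
    for url in urls:
        for item in fetch_items(url):
            key = item["title"].strip().lower()  # crude duplicate check
            if key and key not in seen:
                seen.add(key)
                merged.append(item)
    return merged

deals = aggregate(FEED_URLS)
```

Whether the merged result counts as "new unique content" presumably depends on how much the combination, de-duplication, and organization add beyond any single source feed.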
Should I make every single page on my site indexable?
Redundant content was another topic discussed in the session. The consensus: avoid allowing pages with the exact same content to be indexed multiple times. A prime example is a product listing page that can be sorted by product title, price, or stock status. While the content's ordering changes, it ultimately remains the same in the search engines' eyes and will be seen as redundant. You can tell the engines not to index a specific page by using the META noindex tag, shown below.
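For reference, the noindex directive is a one-line meta tag placed in the head of each page variant you want kept out of the index (the ?sort=price URL in the comment is just a hypothetical example of a sorted variant):

```html
<!-- In the <head> of each sorted variant (e.g. ?sort=price) that
     duplicates the default listing -->
<meta name="robots" content="noindex">
```

Leaving the default sort order indexable and flagging the rest keeps one canonical copy of the listing in the index instead of several identical ones.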
[Entry related to the Sitemaps & URL Submission session held at Search Engine Strategies 2007 NYC.]