Old 11-03-2006   #6
mcanerin
Let's back up and start again, OK?

There is no duplication penalty - it's more of a filter.

There is lots of duplicate content all over the web - every time someone in the press writes an article it gets duplicated all over the place, and you don't see all the newspapers getting banned, do you?

Further, sites with legitimate and useful content can have duplicate content on them legitimately. For example, pharmacy sites have standard data sheets for drugs. You *don't* re-word them or SEO them - someone could die if you decide to leave off a sentence because it was run-on or something.

Also, every time someone sells a product, like a book or something, the description and information are provided by the manufacturer and are therefore identical across sites.

So what happens? Well, the search engines accept that duplication happens, but that doesn't mean they want to display the same information over and over again. So a filter is put into place, where the search engine attempts to identify the duplicate content and then chooses the single best version for display. The remainder are not penalized or considered spam - they are simply not shown.

Now, having said all that, here are some known variations from this:

1. Duplicating an entire domain, rather than just some content, can get you in trouble.

2. At least in Google's case, the duplicate page with the highest PR usually wins (this is still a use for PR, BTW). Failing that, it tries to choose the oldest.

3. Sites like amazon.com differentiate themselves from the duplicate book descriptions by adding enough original content that the page is no longer considered a duplicate - they use things like customer reviews, and so forth.
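To make the idea concrete, here is a minimal sketch of that kind of filter in Python. Everything in it is an assumption for illustration - the shingle-based similarity test, the `pagerank` and `first_crawled` fields, and the thresholds are made up, not Google's actual implementation - but it shows the logic described above: group near-duplicate pages, then show only one winner per group, picked by highest PR with oldest page as the tie-breaker.

```python
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    text: str
    pagerank: float      # assumed relevance signal (hypothetical)
    first_crawled: int   # lower = older (e.g. a Unix timestamp)

def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word chunks ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def is_duplicate(a: Page, b: Page, threshold: float = 0.9) -> bool:
    """Treat two pages as duplicates if their shingle sets mostly overlap."""
    sa, sb = shingles(a.text), shingles(b.text)
    union = sa | sb
    return bool(union) and len(sa & sb) / len(union) >= threshold

def filter_duplicates(pages: list) -> list:
    """Group near-duplicates; display one 'best' page per group:
    highest PR wins, oldest crawl date breaks ties. The rest are
    not penalized - they just aren't returned."""
    groups = []
    for page in pages:
        for group in groups:
            if is_duplicate(page, group[0]):
                group.append(page)
                break
        else:
            groups.append([page])
    return [min(g, key=lambda p: (-p.pagerank, p.first_crawled))
            for g in groups]
```

Note that nothing here "bans" the losing pages - they simply fall out of the result list, which is the filter-versus-penalty distinction made above.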

I'll tell you this: if you think that just sticking an RSS feed on your site will suddenly make you #1, you are smokin' something interesting and I'd like a puff, please.

RSS content can be a useful tool for adding useful content to your site, but unless your site has higher PR than everyone else that is subscribed to the same RSS feed, it won't help you much.

Now, if you had a bunch of related content on that page and the RSS was simply supporting information, then you may find your site useful to both humans and search engines.

There is nothing wrong with RSS feeds per se - but they are the latest "bling bling seo" tactic and you should treat them accordingly.

I remember when everyone said that having a directory would solve all your problems. Then it was press releases. Then blogs. Now it's RSS.

There is nothing wrong with any of these, but there is nothing magical about them, either. You can expect that as soon as "everyone" has an RSS feed on their site, the search engines will detect and discount them. I'm waiting for the RSS-feed based "made for AdSense" sites to become popular, myself...

Sure, you can have one - it certainly won't hurt you (and may well help) if you put it in a worthwhile and useful site.

But an RSS feed (or blog, or directory, or press release) won't fix a bad site. Anything that is easy and is touted as working really well can be counted on to stop working so well in short order - usually shortly after there are a couple of SEW sessions on it.

So, the bottom line is that you probably are not at risk for duplication, unless a large portion of your site is based on RSS feeds, but at the same time I recommend that you not get carried away and buy any hype. It's a tool, not a magic wand.

Ian
__________________
International SEO