Wednesday, July 18, 2007

Search Engines and Duplicate Content

It's important to note that other sites are using these articles, and the search engines don't like what they call "duplicate content": that is, the same information appearing on a bunch of different sites.

To prevent the search engines from seeing your article pages as duplicate content, all you have to do is put between 30% and 40% unique content on the page in addition to the article. This means your site's header and footer, plus some extra related information.

RSS feeds are great for this. In case you don't know, an RSS feed is just a list of short articles or posts or summaries of content that appears on another web site. It's a kind of "snapshot" of what's new at a website or blog. With the great proliferation of blogs these days, and with almost every blog having an RSS feed, the world is your oyster in terms of extra content for your pages!
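To make the "list of short articles" idea concrete, here is a minimal sketch of what an RSS 2.0 feed looks like under the hood and how a script might read one. The feed XML below is entirely made up for illustration; a real feed would come from a blog's feed URL.

```python
# Parse a (hypothetical) RSS 2.0 feed with Python's standard library.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Marketing Blog</title>
    <item>
      <title>First post</title>
      <link>http://example.com/first-post</link>
      <description>A short summary of the post.</description>
    </item>
    <item>
      <title>Second post</title>
      <link>http://example.com/second-post</link>
      <description>Another summary.</description>
    </item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return a list of (title, link, description) tuples from RSS 2.0 XML."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        items.append((
            item.findtext("title", ""),
            item.findtext("link", ""),
            item.findtext("description", ""),
        ))
    return items

for title, link, _summary in parse_feed(SAMPLE_FEED):
    print(title, "->", link)
```

Each `<item>` is one post summary, which is exactly the kind of fresh, related text you can mix in around a syndicated article.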

It's easy to find RSS feeds to use on your sites. Personally, I like MSN's RSS feed search. Just go to search.msn.com and search for "feed:[keywords]".

For example, if your site is about internet marketing, then search at MSN for "feed:internet marketing" (no quotes). Many of the RSS feeds allow you to republish them on your site. Find some that do and copy/paste those feeds onto the pages of your website. That way the pages are unique in the eyes of the search engines and your pages don't suffer from the "duplicate content penalty".
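As a rough sketch of the "paste the feed onto your page" step, the snippet below turns a few feed items into a small block of HTML you could drop alongside an article. The item data here is hypothetical; in practice you'd fetch and parse a real feed first, and only republish feeds whose owners allow it.

```python
# Build an HTML snippet from feed items (hypothetical sample data).
import html

items = [
    ("Article Marketing Tips", "http://example.com/tips",
     "Five ways to get more out of article directories."),
    ("Keyword Research Basics", "http://example.com/keywords",
     "How to pick keywords before you write."),
]

def render_feed_snippet(items, heading="Related Reading"):
    """Build a small HTML block from (title, link, summary) tuples."""
    parts = ["<h3>%s</h3>" % html.escape(heading), "<ul>"]
    for title, link, summary in items:
        parts.append(
            '<li><a href="%s">%s</a>: %s</li>'
            % (html.escape(link, quote=True),
               html.escape(title),
               html.escape(summary))
        )
    parts.append("</ul>")
    return "\n".join(parts)

print(render_feed_snippet(items))
```

Escaping the titles and summaries matters because feed content comes from someone else's site, and you don't want stray markup breaking (or compromising) your page.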
