Duplicate Content – Harms Your Search Engine Rankings

Posted by Avasoft Team | Posted in Content Writing, Site Traffic | Posted on 17-02-2012

Duplicate content, which at its worst is simply plagiarism, is never something you should tolerate on your own website.  Google defines “duplicate content” as “substantive blocks of content within or across domains that either completely match other content or are appreciably similar.”

It’s important to understand that duplicate content refers to pages that are exactly, or almost exactly, the same.  Pages that merely share a few similarities usually don’t set off alarm bells at Google and the other search engines.  Avoiding duplicate content can sometimes be as simple as rewording and reordering paragraphs, but the most important thing to remember is that a large amount of duplicate content on your site will harm your search engine rankings.

Of course, avoiding duplicate content is about more than making sure your website doesn’t carry articles copied and pasted from another site.  There are forms of duplicate content that most website owners don’t even think about, such as:

  • Discussion forums that generate both a regular and a mobile version of the same pages
  • Items for sale in your retail store that are reachable through multiple distinct URLs
  • Printer-only versions of regular web pages

When Google indexes pages, its filters essentially choose one version of each page to list.  Sometimes Google determines that duplicate content has been published deliberately in an attempt to capture the majority of the traffic for a particular keyword.  In that case the search engine lowers the rankings of the sites that appear to be copies of earlier ones, and sometimes removes those sites from the Google index entirely so that they never show up in search results.
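Google does not publish the details of its duplicate-content filter, but the general idea behind near-duplicate detection can be sketched in a few lines of Python.  The example below is purely illustrative and is not Google’s algorithm: it breaks each page’s text into overlapping word “shingles” and compares pages with a Jaccard similarity score, flagging pairs above an arbitrary threshold as likely duplicates.  The URLs and page text are invented for the demo.

# Illustrative sketch only -- not Google's actual duplicate filter.
# Break each document into overlapping word n-grams ("shingles") and
# compare pairs with Jaccard similarity; a high score suggests duplication.

def shingles(text, n=5):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(1, len(words) - n + 1))}

def jaccard(a, b):
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical pages on one site (URLs and text invented for the demo).
pages = {
    "/widget/print":         "Our blue widget ships worldwide with a two year warranty.",
    "/shop/widget?ref=mail": "Our blue widget ships worldwide with a two year warranty.",
    "/about":                "Avasoft builds business websites and provides web hosting.",
}

THRESHOLD = 0.9  # arbitrary cut-off for this demo
urls = list(pages)
for i, u in enumerate(urls):
    for v in urls[i + 1:]:
        score = jaccard(shingles(pages[u]), shingles(pages[v]))
        if score >= THRESHOLD:
            print(f"Likely duplicates ({score:.2f}): {u} <-> {v}")

Running this flags the two widget URLs as likely duplicates while leaving the unrelated page alone.  Real search engines use far more sophisticated, undisclosed signals, but the principle of grouping near-identical pages and picking one to show is the same.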

Assuming you’re not just out to copy the work of others, here are some things you can do to protect your site’s search engine ranking:

  • Canonicalization and site indexing – Tell Google which version of a page you want included in its listings, whether that’s the printer-only version or the main page.  This can be done via Google Webmaster Tools.
  • 301 redirects – Redirect search spiders and users to the right version of a page via .htaccess files or your hosting administrative console (the sketch after this list shows what a 301 redirect, a canonical link, and a noindex header look like at the HTTP level).
  • Consistency – Keep internal linking consistent rather than linking to more than one version of the same page.
  • Top-level domains – Where your content targets specific countries, use country-specific extensions such as .uk for the United Kingdom or .ca for Canada so Google can tell the regional versions apart.
  • Proper syndication – When syndicating your content to other sites, include a link back to the original article and ask the sites that republish it to add a noindex meta tag so their copy won’t be indexed by the search engines.
  • Cut down on boilerplate repetition – You don’t need full copyright information on every page.  Link a short copyright mention to a separate page that carries the complete text.
  • Don’t publish pages until you have content for them – Publishing placeholder pages creates a lot of duplicate, near-empty pages on your site.  If you must publish them, add noindex meta tags to keep web crawlers from indexing them.
  • Adjust your content management system – Blog postings may appear in more than one place on your site, depending on how your content management system works.  You may be able to adjust its settings to minimize this and reduce the amount of duplicate content.
  • Combine pages that have similar information – If several pages carry similar content, consider consolidating them into a single page rather than keeping separate near-duplicates.
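As a rough illustration of how the 301-redirect, canonical-link, and noindex suggestions above translate into actual HTTP responses, here is a minimal Python sketch that uses only the standard library.  The URLs and page content are made up for the example; on a real site this is normally handled in the web server or CMS configuration rather than in hand-written Python.

# Minimal sketch, standard library only (not production code).
# /widget          -> canonical page, declares rel="canonical"
# /widget/print    -> printer-only copy, kept out of the index via X-Robots-Tag
# /products/widget -> old duplicate URL, 301-redirected to the canonical page
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL = "https://www.example.com/widget"   # hypothetical canonical URL

PAGE = f"""<html><head>
<link rel="canonical" href="{CANONICAL}">
<title>Blue Widget</title></head>
<body><h1>Blue Widget</h1><p>Ships worldwide.</p></body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/products/widget":        # duplicate URL: permanent redirect
            self.send_response(301)
            self.send_header("Location", CANONICAL)
            self.end_headers()
        elif self.path == "/widget/print":         # printer-only copy: noindex header
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("X-Robots-Tag", "noindex")
            self.end_headers()
            self.wfile.write(PAGE.encode("utf-8"))
        else:                                      # the canonical page itself
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.end_headers()
            self.wfile.write(PAGE.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()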

You probably won’t be able to eliminate every source of duplicate content, so for the pages you can’t do anything about, mark them in a way that tells the crawlers which version is the original.  In the past, webmasters often simply blocked crawlers from reaching the duplicate content, but Google advises against this: if the crawlers can’t read those pages, they can’t tell that they are duplicates and end up treating them as separate pages.  Instead, use a rel=”canonical” link, the URL parameter handling tool, or 301 redirects.  You can also use Webmaster Tools to change the crawl rate setting for your website.
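The URL parameter side of this can also be handled on your own site by normalizing the URLs you generate and link to.  Below is a small, purely illustrative Python sketch using urllib.parse; the list of tracking parameters to strip is an assumption and should be adjusted to match your own site.

# Sketch: normalize URLs so internal links always point to one form of each page.
# The parameter names to drop are examples only; adapt them to your site.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"ref", "sessionid", "utm_source", "utm_medium", "utm_campaign"}

def canonicalize(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    path = parts.path.rstrip("/") or "/"      # treat /widget and /widget/ as the same page
    return urlunsplit((parts.scheme, parts.netloc.lower(), path,
                       urlencode(kept), ""))  # drop the fragment as well

print(canonicalize("https://WWW.Example.com/widget/?utm_source=mail&color=blue#top"))
# -> https://www.example.com/widget?color=blue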

If, in spite of your efforts, your site is still removed from search results, read over Google’s Webmaster Guidelines to find out exactly why.  Once you have fixed the problems, you can submit a reconsideration request to have your site included again.

Sometimes you may find that another site is copying your content.  If this happens, it’s usually nothing to worry about as far as your search engine rankings are concerned.  However, if the copied content does start causing problems, you can file a DMCA request to have it removed by asserting your ownership of the content.

Avasoft will take care of all your website needs and help you make sure that there’s no duplicate content lowering your search engine ranking.