Two websites - same content, is it a problem?

Status
Not open for further replies.

Dotwebs

New Member
Hi,

I'm not sure if this is the correct discussion area in which to post this.

I'm currently working on two websites for the same client. One is their product website and the other is the company one. Each has its own domain. The product site is just about ready to go live. All along I envisaged the company site having company information only and linking to the product site. Now the client has asked me to duplicate the content from the product site - to push the product on both.

I'm sure I read somewhere on this forum that duplicating content can have a negative effect on SE indexing because they see it as plagiarism. I can't find where I read it, but if anyone can tell me whether this is correct or incorrect, I would be most grateful.

p.s. I need to find out as soon as possible! Thank you!
 

gbonnet

New Member
Duplicate content is definitely something to be avoided at all costs.

There are simple tricks to cope with it though:

- meta robots: noindex
- robots.txt rules
- rel="canonical" link element (new stuff, not tested yet)

The meta robots tag defaults to 'all', which means 'follow,index'. If you set it to 'noindex,follow', it will prevent search engines from indexing the page, but they will still follow the links within it. Pages with 'noindex' in the robots meta still get a PageRank though (with all that goes behind that), so it's maybe not the best option in your case.
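
For example (a minimal sketch, assuming plain HTML pages - this would go in the <head> of each duplicated page you don't want indexed):

<meta name="robots" content="noindex,follow">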

You can simply block SE bots from crawling particular pages of the site by adding specific rules to the robots.txt.
See: The Web Robots Pages
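
A minimal sketch of such rules, assuming the duplicated product pages all sit under a /products/ folder (the path is just a placeholder for wherever they actually end up):

User-agent: *
Disallow: /products/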

The canonical link element (rel="canonical") is a new tool that Google came up with to prevent duplicate content issues. It's quite new though, so I've not tested it yet.
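
From what I've read, you'd add something like this to the <head> of each duplicate page, pointing at whichever URL you want treated as the original (the address below is only a placeholder):

<link rel="canonical" href="http://www.example-productsite.com/products/widget.html">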
 

Dotwebs

New Member
Thanks gbonnet & link8r - I think I'll have to talk to the client about it.

Incidentally, if you have 2 domains pointing at the same site, does this cause similar problems?
 

eteanga

New Member
In my experience, it doesn't necessarily get penalised, as Google at least seems to figure out that they're the same site. All Google does is decide which URL it serves more often in searches.

However, it does make statistics harder to verify, and it also takes away the client's control over their own brand as Google decides on your behalf which domain is more important.

And as everyone else has said, be certain to correct it with a 301 redirect.
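
If the site runs on Apache, a minimal sketch of that redirect in the .htaccess of the secondary domain could look like this (the domain names are placeholders, and it's worth testing before relying on it):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?secondary-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.main-domain.com/$1 [R=301,L]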
 

caminowebmaster

New Member
Dup content is okay if it is serving different countries AND it is on ccTLDs - so the same content to the UK on .co.uk and to Ireland on .ie is fine - as soon as they're both targeting the same country with ccTLDs you run the risk of Google not showing either. I have had this problem a bit over the last year.
 

Toiletroll

New Member
Guys - Dupe content on the same site/URL is the big NO NO.

Dupe content (as long as it's not more than a couple of times) on different URLs is not a very bad thing at all!

Trust me... I've been holding positions for some of the most competitive terms in the country for the past 6 months and noticed improvements with dupe content from 1 source! The content is unique - by unique I mean that only these two sites have it :D :D
 

natural11

New Member
Guys - Dupe content on the same site/URL is the big NO NO.

Dupe content (as long as it's not more than a couple of times) on different URLs is not a very bad thing at all!

Trust me... I've been holding positions for some of the most competitive terms in the country for the past 6 months and noticed improvements with dupe content from 1 source! The content is unique - by unique I mean that only these two sites have it :D :D

Hmm... Why do you say dupe content is not bad? The main reason dupe content is a problem is that the search engine robots get confused about which one is the real original.
 

Toiletroll

New Member
Hmm... Why do you say dupe content is not bad? The main reason dupe content is a problem is that the search engine robots get confused about which one is the real original.

What is your website address? I think I will copy and paste all your content onto 500 websites in order to get you penalised... See where I am coming from?

What they can do between different URLs is limited
 

tomed

New Member
What is your website address? I think I will copy and paste all your content onto 500 websites in order to get you penalised... See where I am coming from?

What they can do between different URLs is limited

Your theory wouldn't work; there's a lot more to the algorithm than this simplistic approach.

You need to look at the many reasons why people would use duplicate content to their benefit, and only then would you be able to "guess" what Google's algorithm is doing to prevent this sort of behaviour.

The most obvious thing they do is penalise the site with less authority. This has been shown in a lot of cases not to be such a good idea, as higher rated sites that took content from smaller bloggers got a better rating than the original contributor.

But there's a lot more to it than that.


I added 200 pages of content that was already on just 1 site previously. I noticed a big improvement in rankings when Google crawled the content :)

Would love to see the sites in question. I imagine your improved rankings on the original site are due to something completely unrelated. One of your two websites will suffer soon enough - hopefully for you it's not the important one!

Duplicate content is a serious issue - if it wasn't, Google, Yahoo and MSN wouldn't have released their canonical tag idea. Fight duplicate content with the "Canonical Tag" Tom Doyle :: TALK
 

Toiletroll

New Member
You guys are correct actually... On the original site the articles are linked to with a JS link and Google does not seem to be crawling it...

Shhhh :) Happy days
 

MrFlicks

New Member
Hi,

I'm not sure if this is the correct discussion area in which to post this.

I'm currently working on two websites for the same client. One is their product website and the other is the company one. Each has its own domain. The product site is just about ready to go live. All along I envisaged the company site having company information only and linking to the product site. Now the client has asked me to duplicate the content from the product site - to push the product on both.

I'm sure I read somewhere on this forum that duplicating content can have a negative effect on SE indexing because they see it as plagiarism. I can't find where I read it, but if anyone can tell me whether this is correct or incorrect, I would be most grateful.

p.s. I need to find out as soon as possible! Thank you!

Could you not find a very easy compromise by just ever so slightly rewording things on the second site so it reads differently?

Here is an example

Let's say one site reads

Hilton Paris is the world's best search engine optimizer and can be hired by .... blah blah

on the second site write

Hire the world's best Search Engine Optimizer, Hilton Paris, as he is the best SEO by ... Blah Blah

Get my drift/thinking?
 

Vertuboutique

New Member
Hi,

Try to avoid duplicate content as much as you can.

Duplicate content will not give you as much value as unique content.

Thanks :)
 