Original vs duplicate content

Listen up. This is simple, so I’m going to keep it short. There’s still a lot of confusion among writers about multiple copies of their work (e.g. a blog post) circulating around the internet.

Despite excellent technical articles, like this one about duplicate content on the Moz blog, people still think the Google God in the sky will penalise them for being successful if their article is featured in more than one place, because they’ve created ‘duplicate content’.

So, whenever they post the article somewhere else, like an article directory or another blog, they spend a lot of time making little changes to it so that each copy counts as ‘unique content’.

Who told you to do that? It’s a criminal waste of time.

Search engines are smarter than you

Let’s start by giving search engines a bit of the credit they’re due. Managing the world’s information is hard work, and they’ve got pretty darn good at it. They’re not the dumb robots we think they are, so think logically, like they do.

Let’s define some terms.

Original content

You’ve written a new article all by yourself, so it’s original, even if it’s not a groundbreaking original idea. You post it on your blog. Google caches it. Even if it’s copied 1,000 times, it will still be original content. Always be original.

Unique content

Some websites ask you for a unique article. That means a new article, one that you haven’t posted anywhere else before. So it must also be original, of course. The article won’t stay new/unique for very long if other people copy it for their websites.

Duplicate content

This only applies to similar pages on a single website. If you take the same article and post it 100 times with only minor changes (so that you cover multiple keywords, for example, without having to write different articles), that’s bad. It’s easy for the bots to see. You are bad. You can be penalised. Don’t do it.
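How easy is it for the bots to see? One classic trick is ‘shingling’: chop each page into overlapping word sequences and count how many the two pages share. Here’s a minimal sketch of that technique in Python; it’s the general idea, not Google’s actual algorithm, and the sample sentences are invented.

```python
def shingles(text, k=3):
    """Return the set of overlapping k-word sequences ('shingles') in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets: 1.0 = identical, 0.0 = disjoint."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "Ten tips for writing headlines that actually get clicked by real readers."
tweaked = "Ten tips for writing headlines that really get clicked by real readers."

# Swapping a single word only disturbs the few shingles that contain it.
print(f"similarity: {jaccard_similarity(original, tweaked):.2f}")
```

Even on a single sentence, swapping one word leaves more than half the shingles shared; on a full-length article, a handful of tweaks barely dents the score. Which is exactly why fiddling with an article to make each copy ‘unique’ is wasted effort.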

There, I hope that’s clear now. That wasn’t too hard, was it?

But if not, here’s a glossary of a few more related terms.

Scraped content (unoriginal content)

This is when you ‘duplicate’ someone else’s content and claim it as your own, without crediting them or linking back to their original article. I don’t know if the bots can tell, but if they can’t yet, they soon will be able to. It’s certainly bad karma anyway. Don’t do it.

Syndicated content

This is when your article is so good that it gets taken up by other blogs and newsletters. Or you spend hours submitting it to article directories or other blog sites because you want some lovely backlinks. The syndicated content might get indexed above your original article, and there’s discussion about why that happens, because no one knows how Google’s mind works.

If you’re writing just for yourself, don’t syndicate. If you’re writing for backlinks or to get a wider readership, then syndicate. It’s a good thing, and if Google is logical, you won’t get penalised for it.
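If backlinks are the point, it’s worth checking that the sites reposting you actually link back. Below is a minimal sketch of that check using only Python’s standard library; both URLs are hypothetical placeholders, and a real check would want proper error handling and politeness (robots.txt, rate limits).

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def links_back(syndicated_url, original_url):
    """Return True if the syndicated page links to the original article."""
    req = Request(syndicated_url, headers={"User-Agent": "backlink-check/0.1"})
    with urlopen(req, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    return any(original_url in href for href in collector.links)

# Both URLs are hypothetical placeholders.
if links_back("https://example.com/reposted-article",
              "https://myblog.example/original-article"):
    print("Backlink found: syndication is doing its job.")
else:
    print("No backlink found: ask them for credit.")
```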

Example 1: News sites get their articles copied left, right and centre. They always rank high in search. If journalists had to rewrite their stories every time they appeared somewhere else, there would be a lot less news in the world.

Example 2: Press releases are sent out by companies and dumped all over the internet. The original site won’t be penalised. It still benefits from backlinks. It’s just that the value of those backlinks might be lower if they come from poorly ranking spam sites.

Example 3: Article directories like ezinearticles.com are huge deposits of original (but not unique) articles, and they have super-high rankings. Search engines love them. And they don’t link to anybody. They’re so selfish.

So remember…

An article can be ‘syndicated’ around the internet, and even if you’re not credited as the author, the bots have the smarts to know that you wrote it first, because you posted it first. Just make sure your original is cached before you start reposting.

Final summing up

To mop up any last dregs of confusion, let’s use music as an analogy.

Original song: Queen’s Bohemian Rhapsody

Unique song: Queen performing Bohemian Rhapsody live

Syndicated song: Bohemian Rhapsody played on the radio

Unoriginal song: Your terrible cover of Bohemian Rhapsody

Duplicate songs: Status Quo. All their songs sound the same.
