Blue Hat SEO - Advanced SEO Tactics - http://www.BlueHatSEO.com

How To Dupe Content And Get Away With It

Posted By Eli On July 2, 2007 @ 9:52 pm In General Articles | 128 Comments

Let’s do one more post about content. First, consider [1] Google’s Webmaster Blog’s post dispelling common duplicate content myths as a prerequisite read. Do I always trust what Google’s public relations tells me? Absolutely not, but it does confirm my own long-standing protests against the people who perpetuate the paranoia about duplicate content, so it makes a good point of reference.

The most common myth that comes with the paranoia is, “anytime I use content that is published somewhere else, I am sure to fall victim to duplicate content penalties.” This of course is bunk, because for any article you can show me, I can find you nine other sites that rank for its terms and that aren’t necessarily supplemental and full of penalties. However, there is no doubt that a duplicate content penalty really exists, so we’ll discuss ways around it. One of my favorite parroted phrases is, “It’s not what content you use. It’s how you use it.” So we’ll start with formatting.

Here Is Some Spammy Text
Welcome to spam textville. This is just a bunch of spammy text. Text to fill and spam the streets. This is horrible spam text filled content that will surely get my spam site banned. Spam spam spam. It’s not food it’s text. Spammy text. I copied this spam text all over my site and others are copying it for their spammy text sites. I can’t believe I’m keyword stuffing for the words spammy text.

Alone, in paragraph form, this text is very easy to detect as spam and as auto-generated. So the classic SEO ideology that “well-written, article-style, paragraphed text does well” gets thrown out the window with this example. However, since I would love nothing more than to rank for the term “Spammy Text” and this is all the content available to me, I have to abandon the idea of keyword stuffing and find some new formats for this text that search engines will find acceptable.

How about an Ordered List?

  • Welcome to spam textville.
  • This is just a bunch of spammy text.
  • Text to fill and spam the streets.
  • Lists and bullet points work very well because the text they enclose is meant to be choppy, short, and repetitive (“My goals are…”, “The plan is…”, “Do this…”, etc.). If the common ordered list is formatted that way, then we have every right to do the same. A quick sketch of automating it follows below.
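
If you want to automate that reformatting, here is a minimal sketch in Python; the sentence-splitting rule and the plain <ul>/<li> markup are just illustrative choices, not anything prescribed above:

    import re

    def paragraph_to_list(paragraph):
        """Split a block of text on sentence boundaries and wrap each
        sentence as a list item. The split rule is deliberately crude."""
        sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', paragraph) if s.strip()]
        items = "\n".join("  <li>%s</li>" % s for s in sentences)
        return "<ul>\n%s\n</ul>" % items

    text = ("Welcome to spam textville. This is just a bunch of spammy text. "
            "Text to fill and spam the streets.")
    print(paragraph_to_list(text))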

What about presenting it as user-contributed content?

    Comments (3)

    John Doe:
    Spam spam spam.

    Jane Doe:
    I copied this spam text all over my site and others are copying it for their spammy text sites.

    John Deer:
    Spammy text.

How many of you readers have left complete crap in my comments? I’m not banned or penalized yet. :) Faking user-contributed material works great: because its outcome is unpredictable, you can do virtually anything with it and get away with it, including but not limited to internal linking. (A rough sketch of generating these comment blocks follows the example below.)

    Mary Jane:
    I saw this wicked article about this on Eli’s blog subdomainspam.spammydomain.com/spammysubpage.html check it out!
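
Generating that kind of fake comment block can be scripted the same way. A rough sketch, where the commenter names, the <div class="comment"> markup, and the shuffling are all made up for illustration:

    import random

    FAKE_NAMES = ["John Doe", "Jane Doe", "John Deer"]  # hypothetical commenters

    def sentences_to_comments(sentences):
        """Attribute each sentence to a rotating fake commenter and wrap it in a
        simple comment-style block, then shuffle so the order looks as
        unpredictable as real user contributions."""
        blocks = []
        for i, sentence in enumerate(sentences):
            name = FAKE_NAMES[i % len(FAKE_NAMES)]
            blocks.append('<div class="comment"><strong>%s:</strong> %s</div>' % (name, sentence))
        random.shuffle(blocks)
        return "\n".join(blocks)

    print(sentences_to_comments([
        "Spam spam spam.",
        "I copied this spam text all over my site.",
        "Spammy text.",
    ]))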

Break It Up Into Headings

Heading 1

Welcome to spam textville. This is just a bunch of spammy text.

Heading 2

Text to fill and spam the streets. Spam spam spam. It’s not food it’s text.

All the keywords are there; it’s just no longer spammy because it’s been broken up properly into nice little paragraphs. Once again, standardized = acceptable.
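
Splitting a block of copy into headed sections can also be automated. A minimal sketch with generic placeholder headings (in practice you would pull the heading text from the copy itself):

    def chunk_into_sections(sentences, per_section=2):
        """Group sentences into small chunks and put a generic heading above each."""
        html = []
        for i in range(0, len(sentences), per_section):
            chunk = " ".join(sentences[i:i + per_section])
            html.append("<h2>Heading %d</h2>\n<p>%s</p>" % (i // per_section + 1, chunk))
        return "\n\n".join(html)

    sentences = [
        "Welcome to spam textville.",
        "This is just a bunch of spammy text.",
        "Text to fill and spam the streets.",
        "Spam spam spam.",
    ]
    print(chunk_into_sections(sentences))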

Change The Format
What about PDFs? They may not support contextual ads very well, but they most certainly can contain affiliate links. The engines also tend to be quite lenient with them and with redundant text. For more information, read my [2] Document Links post.
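
The post doesn’t name a particular tool for producing PDFs with live links, so here is one possible sketch using the third-party reportlab package; the file name, anchor text, and affiliate URL are all hypothetical:

    from reportlab.lib.pagesizes import letter
    from reportlab.pdfgen import canvas

    def text_to_pdf(path, paragraph, anchor_text, url):
        """Write one page of text and overlay a clickable link rectangle
        on the anchor text near the bottom of the page."""
        c = canvas.Canvas(path, pagesize=letter)
        c.setFont("Helvetica", 11)
        c.drawString(72, 720, paragraph)
        c.drawString(72, 100, anchor_text)
        width = c.stringWidth(anchor_text, "Helvetica", 11)
        c.linkURL(url, (72, 95, 72 + width, 110), relative=0)
        c.save()

    text_to_pdf("example.pdf", "Welcome to spam textville.",
                "Check out this offer", "http://example.com/?aff=123")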

Let’s Move On
So now that we can format our text to avoid the penalties, what if we attempt to sidestep them altogether? I talked about how to swap titles out using a thesaurus and IMDB data in my [3] exploiting LSI post on this matter.
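
A bare-bones illustration of the thesaurus swap: replace title words from a lookup table. The tiny synonym table here stands in for a real thesaurus and is purely illustrative:

    import re

    SYNONYMS = {"guide": "handbook", "cheap": "affordable", "best": "top"}

    def swap_title_words(title):
        """Replace each word that appears in the synonym table, leaving
        everything else untouched."""
        def replace(match):
            word = match.group(0)
            return SYNONYMS.get(word.lower(), word)
        return re.sub(r"[A-Za-z]+", replace, title)

    print(swap_title_words("The best cheap guide"))  # -> "The top affordable handbook"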

How about scraping the right content?
Heavily syndicated content works well for duping, and it has the added bonus of being consistently high quality. For instance, I sometimes like to snag shit from the good ol’ AP (Associated Press). It’s not the smartest legal move, but seriously, who’s going to deeply investigate and do anything about it? In such an event it’s always an option to just remove the article upon receipt of the C&D (cease and desist) letter.
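
As a sketch of what pulling syndicated content looks like in practice, the snippet below reads the items out of an RSS feed using only the standard library; the feed URL is a placeholder and no particular wire service is implied:

    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://example.com/rss.xml"  # placeholder feed

    def fetch_feed_items(url):
        """Return the title and description of every <item> in an RSS feed."""
        with urllib.request.urlopen(url, timeout=30) as resp:
            root = ET.fromstring(resp.read())
        return [{"title": item.findtext("title", default=""),
                 "body": item.findtext("description", default="")}
                for item in root.iter("item")]

    for story in fetch_feed_items(FEED_URL):
        print(story["title"])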

All in all, there’s plenty you can do to dupe content and get away with it. It’s a pretty open game and there’s a lot out there.

    Have Fun


    128 Comments To "How To Dupe Content And Get Away With It"

    #1 Comment By Tom On July 2, 2007 @ 10:38 pm

    Just wanted to say that although this is pretty obvious to me, it has sparked some new ideas that I will be working on. Thanks Eli.

    #2 Comment By Gene On July 2, 2007 @ 11:59 pm

    Eli,

    While this seems like common sense and might be totally true, I find it hard to swallow without any empirical proof–or at least some anecdotal evidence.

    Gene

    #3 Comment By ozonew4m On July 3, 2007 @ 1:55 am

    Duplicate is in the eye of the beholder… If you are clever enough you can use as much duplicate content as you want and get away with it, without changing a letter.. Personally I use duplicate content as an excuse for keywords instead of containing them.
    Look at the most popular search page ;) [4] Webmaster search

    #4 Comment By jay On July 3, 2007 @ 5:29 am

    Im going to sound stupid now but please can you un-abbreviate AP? Thanks :-) .

    #5 Comment By Si On July 3, 2007 @ 6:46 am

    Another thought-provoking post, Eli! Makes me realise how much I need to learn some coding - I’m sure if can’t be hard to re-format content with php, can it?

    #6 Comment By Leon On July 3, 2007 @ 7:24 am

    AP means Associated Press

    #7 Comment By Seostomp On July 3, 2007 @ 9:26 am

    On the money again. It works wonders building feeders sites IMO. Have been able to run a bit of SERP domination in a couple markets with similar techniques.

    #8 Comment By The Geeky Blog On July 3, 2007 @ 10:58 am

Assuming I’m correct with this, duplicate content is not duplicate until it’s found inside of a search engine. For instance, if you come across a new post/article (a block of content), kind of like how this post is new, and you put it on your site, there is a chance you’ll never hit a duplicate filter.

This is because, hypothetically speaking, the only way the search engines know you’ve stolen the content is if they’ve previously indexed that content, so content is not unique until the search engines index it. I’ve already written this all out on Principle Of Marketing, but I’ll keep going with it here.

We know that search engines are predictable. For instance, if you write a new article once a week for a while, the search engine spiders will learn that behaviour and only come around once or so per week to index the new content (assuming you’re not doing anything else to bring the bots in). This means that if you monitor these types of sites with your own little bots, you can potentially steal their content and never get a duplicate penalty.

Here is how it works: some site makes a post once per week, and Google sends out their bots on average once per week to look for new content. You send out your own bots to this site once per day to check for new content; the second your bot finds a new post, you simply scrape that post, place it on your site, and push as many search engine bots to your new post as quickly as possible.

What I’m getting at with this comment that turned out to be a mini blog post is: if you can get new content indexed in the search engines before anyone else, that content is unique to you even if you have stolen it, simply because the search engines have to have something to compare it to in order to give you a duplicate penalty. If it hasn’t been indexed, then it’s still unique.
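
A minimal sketch of the polling idea described above, assuming a hypothetical source URL, an hourly check interval, and a placeholder republish() hook; none of these specifics come from the comment itself:

    import hashlib
    import time
    import urllib.request

    SOURCE_URL = "http://example.com/blog/"  # hypothetical site to watch
    CHECK_EVERY = 60 * 60                    # poll hourly; the source posts weekly

    def fetch(url):
        with urllib.request.urlopen(url, timeout=30) as resp:
            return resp.read()

    def watch(url, on_change):
        """Hash the page on each poll and call on_change the moment it differs."""
        last_digest = None
        while True:
            digest = hashlib.sha256(fetch(url)).hexdigest()
            if last_digest is not None and digest != last_digest:
                on_change()  # new content spotted before the weekly crawl
            last_digest = digest
            time.sleep(CHECK_EVERY)

    def republish():
        # Placeholder: scrape the new post, publish it, and ping the crawlers.
        print("source changed")

    # watch(SOURCE_URL, republish)  # uncomment to start the loop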

    #9 Comment By SeoRookie On July 3, 2007 @ 11:29 am

    Many feel the SE’s are using a shingle algorithm to detect dupe content. Is rearranging sentences or a thesaurus swap enough ?

    I agree that many sites do rank for dupe content but I also think these sites have other strong factors that negate the dupe content penalty.

    #10 Comment By dertim On July 3, 2007 @ 12:50 pm

    as always a very inspiring post. thank you eli.

    #11 Comment By Mark On July 3, 2007 @ 2:14 pm

    Here is the definition of irony:

    [5] http://credit.abcsouth.com/?p=144

    #12 Comment By jdog On July 3, 2007 @ 7:15 pm

    Now that is funny.

    #13 Comment By Bill Hartzer On July 3, 2007 @ 8:17 pm

    I haven’t actually tried to get spammy text into the index and get it to rank well, but this definitely gives me some things to think about.

    Do you really think there’s some sort of correlation between dupe content and supplemental results?

    #14 Comment By Eli On July 3, 2007 @ 9:16 pm

    Thanks guys
For the most part all of this should be fairly common sense. I just want to make sure it’s out of the way before I start talking about mass producing gray hat sites, so I don’t have to deal with a bunch of questions about duplicate content penalties. It’s a great reference post to have.

    #15 Comment By Eli On July 3, 2007 @ 9:19 pm

    great post.
    I actually had a post about all this. I think it was called Maintaining Rank Through manipulating Freshness Factors or something like that.

    #16 Comment By Bob On July 4, 2007 @ 12:41 am

    Quick question, thats kinda related and may actually have been answered somewhere else, but what software are you using to create the pdfs and keep the hyperlinked anchor text hyperlinked??

    Ta muchly

    #17 Comment By Sharif On July 4, 2007 @ 1:17 am

    Something I’ve tried is sourcing relevant content on non English sites, and translating it. It tends to need editing, otherwise the copy has a wonderful auto gen garbled spam feel to it. And of course it’s time consuming (unless it can be automated somehow?) - but for a beginner like myself it’s interesting to play with.

    #18 Comment By Jez On July 4, 2007 @ 1:30 am

    I have built sites like this and getting google to index them is easy peasy.

    Getting links / traffic to them is not easy.

    To make money with this stuff you need thousands and thousands of sites..

    You need to build faster than you get banned… as a lot of your sites WILL get banned…

    For that you need sophisticated automation on a level not discussed on this site.

    Coming up with madcap schemes is easy enough… making money from them is not.

    You are up against professional programmers / server admin’s / SEO’s with years of experience and a LOT of expertise and the right contacts.

    Any noob who thinks they can go on E-lance with $100 and start earning with a script like this is seriously mistaken.

    #19 Comment By Raven Riley Fans On July 4, 2007 @ 3:51 pm

    btw eli, you are ranking #1 with spammy text at google now :)

    great posting :)

    #20 Comment By Ross Johnson On July 5, 2007 @ 11:50 am

    lol ranked number 1 for spammy text

    #21 Comment By M On July 5, 2007 @ 2:59 pm

    hows about doing a search and replace for a term thats close to the original term… rinse and repeat…

    BTW - im not convinced at google downplaying dupe content as articles do well, so do dmoz listings which seem to pop up every where.

    and for the record we’re ranked #1 for seo cock LOL!

    #22 Comment By Abidi On July 6, 2007 @ 1:56 am

    Does anyone know a good commenting system? It would be for HTML sites. I want to make it real.

    #23 Comment By Allen On July 13, 2007 @ 11:02 am

    1000’s of sites coming off the same block must figure negatively into the equation. How is it possible to control a few hundred sites let alone a thousand that come from unique subnets?

    #24 Comment By Kolin On July 14, 2007 @ 10:15 pm

    I have a few content scrapping sites, one is this: [6] http://relevantnews.net - how could they provide a benefit? I don’t know…

    #25 Comment By brazz On July 24, 2007 @ 8:25 am

    To me, the best way to turn a copy into an original document (in G’s eyes) is to add links. Google maintain a separate index for links only; if document A has a link and document B doesn’t have it, it is reasonable to assume that G will see them as different.

    Besides, adding link has another advantage: it will make your page more informative than the original one. Take the keywords you want to target; find a relevant page related to that keyword; place the link and voila, you just created a better page than the original.

    Use your own internal links to boost the new page (I assume you have original quality content with relevant links) and you will outrank the original page.

    #26 Comment By Brandon On August 5, 2007 @ 9:44 am

    Some people forget that search engines do not buy products or sign up for services. I had a client (stressing the word “had”) who was adamant about using software to crank out pages and pages of keyword-based copy for his website.

    The end result was a slight increase in search engine ranking for certain phrases … but his website now reads as though it were written by a 4-year-old.

    #27 Comment By Gary On September 16, 2007 @ 5:47 pm

    I thought the search engines stripped out the html before reading it. If that is true then an Ordered List would look like a paragraph once the tags were removed. That would also apply to H headers.

    If not, then you could put them in random divs, spans, tables, hr etc. Anything to break up the typical paragraph after paragraph. Is that true?

    The comments idea is great!

    I believe my scraper sites have been banned in Google based on my content. If I can just format it differently, then I am good to go!

    Thanks!

    #28 Comment By Vit On September 22, 2007 @ 9:17 am

    My 3 web-sites with duped content have been banned away from google search data base, it is probably better to have a web-site with original content…

    #29 Comment By Pay Per Install On February 9, 2008 @ 2:55 pm

    Or you can have someone rewrite the original content somehow… mazbe there could be some script to replace some synonym words…

    #30 Comment By Jeff On March 26, 2008 @ 5:43 pm

    Great post,

    Obviously G penalizes dupe content, but I can’t believe that it’s as strictly as some would have you believe.

    #31 Comment By Forumistan On April 7, 2008 @ 4:15 pm

    I hate spammers…

    #32 Comment By Gadgets4nowt On April 7, 2008 @ 5:03 pm

    Says the man with 8 posts in 5 minutes - lol !

    #33 Comment By Eva White On May 10, 2008 @ 9:31 pm

    If you do write original content all the time, its quite an achivement.

    #34 Comment By police cars for sale On July 15, 2008 @ 4:16 am

    I have been using the same technique on one of my sites after reading you article. I used a mixture of replacing some words, re-ordering the paragraphs and also the format as per your article. Seems to do the trick!

    #35 Comment By search engine optimisation On August 3, 2008 @ 5:33 am

    By changing the header this makes it different in the eyes of Google

    #36 Comment By Atlanta SEO On September 2, 2008 @ 11:19 am

    According to Google Webmaster Tools, it doesn’t count duplicate content if it is the same article written in a different language. So, has anyone tried Mirroring a site in a different language.

    #37 Comment By Henry Tusco On September 2, 2008 @ 1:46 pm

    I suggest using [7] http://translate.google.com you can be sure that G will register any translations and will be able to make trace backs.

    #38 Comment By Clocks On October 20, 2008 @ 12:08 pm

    One of my sites ranks well for dupe content with no changes at all to the original and has done for months.

    #39 Comment By Article Submit On October 20, 2008 @ 3:58 pm

    This is dead on, we reprint content from multiple writers and even though it is duplicate content it still ranks. We do try to clean up the Title tag and the H1 on the page. The give the url a distinct url.

    #40 Comment By Internet Marketer On October 27, 2008 @ 12:55 am

    I see many outstanding businesses today that depend heavily on duplicate content, like lyrics sites, articles and ezines and many other, however google hates dup content and sooner or later it will filter out all the dups

    #41 Comment By Sit Stay Fetch Review On January 31, 2009 @ 2:34 pm

    Unless some actual human being reviews this articles, you will do what you intended, that is rank for the word spam!

    #42 Comment By rider On February 16, 2009 @ 8:04 am

    Great ideas thanks

    #43 Comment By Pozycjonowanie Poznan On April 15, 2009 @ 12:39 am

    Duping content is pretty risky operation.

    #44 Comment By Odzyskiwanie Danych On April 21, 2009 @ 3:46 am

    I had no idea it was this simple. Thanks.

    #45 Comment By SEO Hosting On May 19, 2009 @ 5:34 pm

    It seems to be pretty good article, those tips are really useful to those who are beginners and don’t have much knowledge about writing. Everyone knows that copying content can drive you into problems. But, what i think is that copy/paste is not bad, only if the original source is credited.

    #46 Comment By MoneyBins On May 26, 2009 @ 12:06 pm

    Great article, as most of us have trouble with writing when we first start out. Always hard to continue to produce quality content. New people will continue to get better and better at writing content though.

    #47 Comment By Conveyancing Solicitors On June 1, 2009 @ 2:26 am

    I like it thanks for the post.

    #48 Comment By Baby Shower Ideas On June 18, 2009 @ 4:04 am

    How to overcome the spam attack on our blogs and identify the spam comments. I used pdf’s too but the spammy commenter’s are so choosy to comment on all the open links in my blog.

    #49 Comment By 太阳博客 On June 18, 2009 @ 7:11 am

    Thats really useful, I will go to try what you said. Thanks

    #50 Comment By Acne On August 4, 2009 @ 5:43 am

    Nice posting!
    great article.
    i learn many things from this article.

    Thanks

    #51 Comment By Walmart Coupons On September 23, 2009 @ 8:08 am

    Google just want to make the web care of duplicate content. I am not sure their is any penalty to duplicate content.

    I wonder if Google engineers laugh at bloggers because they believe in duplicate contents.

    #52 Comment By Golden triangle tour On October 6, 2009 @ 11:47 pm

    Amazing, i learn many things in this article, it’s really nice, keep it up……thanks

    #53 Comment By Faxless payday On October 9, 2009 @ 10:48 pm

    Readers can take the benefit of your thougts ………. Thanks

    #54 Comment By Payday Uk On October 10, 2009 @ 2:42 am

    Boost links for your profitable business by the suggestion of this blog. I really keep it up.

    #55 Comment By Manic On October 16, 2009 @ 12:29 am

    Thanks for good article. I appreciate your thoughts and information presented at this time.

    #56 Comment By Our simply On November 19, 2009 @ 11:05 am

    presented at this time.

    #57 Comment By rude jokes On April 10, 2010 @ 12:51 pm

    On the money again. It works wonders building feeders sites IMO. Have been able to run a bit of SERP domination in a couple markets with similar techniques.

    #58 Comment By louis vuitton On April 26, 2010 @ 8:20 am

    It seems to be pretty good article, those tips are really useful to those who are beginners and don’t have much knowledge about writing.

    #59 Comment By Купить отопительную технику On June 29, 2010 @ 5:27 pm

    Google just want to make the web care of duplicate content. I am not sure their is any penalty to duplicate content.

    #60 Comment By new air max On August 9, 2010 @ 5:49 pm

    Very interesting, something I have been starting to do from scratch with my newer sites before I start any regular linking processes, just to test the waters. I’m compiling my own massive list of “authority” domains that allow a free page.

    What is your opinion, if you are going for a 2 word phrase? Lets take “Dog Training”. When signing up for accounts do you feel it is better to go for “dogtraining” or use the dash “dog-training”. I’m still in debate on how the engines read this, can they actually figure to separate the words/phrase I’m trying to rank for? I know the best answer would be to make pages with both options but just looking for any expert opinions.

    Great blog, I like the outside the box thinking in your posts, keep it up!

    #61 Comment By اخبار الفنانين On August 16, 2010 @ 2:14 am

    Valuable information I’ll just turn applied

    #62 Comment By Peter Dunin On September 13, 2010 @ 5:00 am

    Good article,thanks for posting.

    #63 Comment By Luis On September 20, 2010 @ 12:27 pm

    This a a great question when you see some sites mirror of other sites

    #64 Comment By baseball hats On September 24, 2010 @ 6:52 am

    A good website recommend to
    you: [8] http://www.newerahatstore.com basketball hats

    #65 Comment By Скачать фильм On October 1, 2010 @ 4:49 am

    Great post,

    Obviously G penalizes dupe content, but I can’t believe that it’s as strictly as some would have you believe.

    #66 Comment By India Tour Packeges On October 9, 2010 @ 11:20 am

    I see this post is a few years old, but the concept is intriguing.
    Any feedback on whether this still works? I know backlinks are important, but you could waste a lot of time getting backlinks that never get indexed. If you could help get those sites/pages indexed, you will help yourself and the owner.

    #67 Comment By unemployed loans On October 12, 2010 @ 6:14 am

    It is a clear post. Thanks

    #68 Comment By security exterior door On October 26, 2010 @ 2:51 am

    Given prolific idea, but instead of penalize write original and fresh content.

    #69 Comment By Wireless Networking On December 17, 2010 @ 8:30 am

    a very inspiring post as always. thanks again.

    #70 Comment By smart guy On December 17, 2010 @ 9:13 am

    wonderful, this is quite an achievement.

    #71 Comment By abercrombie New York On December 27, 2010 @ 12:36 am

    When you think of happy, unhappy or think of you when you smile in my mind wander

    #72 Comment By hollister uk On December 31, 2010 @ 8:37 am

    i feel so good

    #73 Comment By filmindir On January 2, 2011 @ 1:13 pm

    congrat then who is luck then you man

    #74 Comment By Скачать фильм On January 15, 2011 @ 5:05 am

    Great article, as most of us have trouble with writing when we first start out.

    #75 Comment By Reliable Webhosting On January 30, 2011 @ 8:22 pm

    There is so much of duplicated content as everyone repeats and re-writes the same stuff just in different words.

    #76 Comment By Dodge Neon SRT4 On January 31, 2011 @ 5:10 am

    I have just started cracking duplicate posts for my website. The Reason behind this, is google some times penalized other sites, which have unique.

    #77 Comment By صور ماسنجر On February 24, 2011 @ 5:53 am

    as always a very inspiring post. thank you eli.

    #78 Comment By sharepoint room reservation On March 22, 2011 @ 1:54 am

    really interesting about seo

    #79 Comment By sac ekim On March 23, 2011 @ 1:21 pm

    This is a really stupid question but what language does the script need to be in?

    #80 Comment By India Tour Packeges On April 23, 2011 @ 5:49 am

    Rajasthan is situated in the North Western part of India and shares geographical boundaries with Punjab, Haryana, Uttar Pradesh, Madhya Pradesh and Gujarat in India. It also has a long international boundary with Pakistan. [9] Rajasthan Tours.

    #81 Comment By Solicitors in Ealing On April 30, 2011 @ 4:32 am

    My own experience is that google is tightening up on everything and it’s simply not worth taking risks anymore. Each to their own but remember that google has some very powerful tools and algos. Thanks for the suggestion anyway.

    #82 Comment By abercrombie milano On May 16, 2011 @ 10:54 pm

Nice one, there is actually some good points on this blog some of my readers may find this useful, I must send a link, many thanks.

    #83 Comment By abercrombie london On May 17, 2011 @ 2:01 am

I believe this really is excellent information. Most of men and women will concur with you and I ought to thank you about it.

    #84 Comment By karen millen uk On June 1, 2011 @ 10:04 pm

    This was very informative. I have been reading your blog a lot over the past few days and it has earned a place in my bookmarks.

    #85 Comment By Craig Sharpe On June 13, 2011 @ 1:30 pm

    On the issue of duplicate content, do any readers use copyscape ? If so, do you fully trust it ?

    #86 Comment By ผ้าม่าน On June 20, 2011 @ 1:29 am

    seo work is long

    #87 Comment By kadın On July 29, 2011 @ 6:03 am

    I do agree with all of the ideas you have presented in your post. They’re really convincing and will definitely work. Still, the posts are too short for newbies. Could you please extend them a bit from next time? Thanks for the post.

    #88 Comment By Solicitors in East London On August 4, 2011 @ 9:27 pm

    Dupe content is the biggest issue out there now, copyscape picks up even a 10-15% similarity, so gotta figure google is even more strict.

    #89 Comment By abercrombie Milano On August 5, 2011 @ 1:46 am

You made some respectable factors there. I appeared on the web for the problem and found most individuals will associate with your website.

    #90 Comment By Home Online Work On August 12, 2011 @ 6:24 am

    This fake commenting is very interesting!

    #91 Comment By Caldaie Ferroli On September 24, 2011 @ 6:59 am

    I prefer PPC cause I know what I am getting for my money.

    #92 Comment By مدونة On September 26, 2011 @ 8:15 am

    Good stuff thanx

    #93 Comment By thing thing arena 3 On October 5, 2011 @ 2:37 pm

    Very interesting, something I have been starting to do from scratch with my newer sites before I start any regular linking processes, just to test the waters. I’m compiling my own massive list of “authority” domains that allow a free page.

    #94 Comment By Online Tax School On October 10, 2011 @ 4:17 pm

    better to write your own context and then spin it to make more content. start with a short tail artive and add longer tailes to it

    #95 Comment By Property Marbella On October 16, 2011 @ 12:11 am

    1000’s of sites coming off the same block must figure negatively into the equation.

    #96 Comment By Balenciaga Handbags Shop On October 19, 2011 @ 2:48 am

    Balenciaga Handbags Shopjfgh

    #97 Comment By Ritesh On November 19, 2011 @ 2:13 pm

    Nice topic, intresting
    [10] Do follow list PR 7 Blogs SEO

    #98 Comment By Office 2010 On November 25, 2011 @ 8:18 pm

    This article is GREAT it can be EXCELLENT JOB and what a great tool!

    #99 Comment By شات صوتي On December 19, 2011 @ 4:33 am

    okkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkkk

    #100 Comment By سعودي انحراف On December 19, 2011 @ 4:34 am

    yesssssssssssssssssssssssssssssssss

    #101 Comment By شات مصرى On December 19, 2011 @ 6:28 am

    nice chat p7bk good website bloog chat egypt girl

    #102 Comment By Moncler On December 21, 2011 @ 8:17 pm

    Although this was a time of Moncler Winter Jackets – think Wall Street, “greed is good” and so on, one of the Moncler Coats Women was, of course, the oil industry, so this part of Moncler Jackets Men was indubitably where the money was. It was a decade dedicated to conspicuous consumption, Moncler Coats Women and branding yourself with designer labels. Moncler Women went from wanting to marry the millionaire to wanting to be the millionaire, and so shows such as Moncler Boots weren’t just television fiction, they reflected the attitude and aesthetic of the time, as well as the financial power wielded

    #103 Comment By Summer Holidays On January 1, 2012 @ 8:04 pm

    This is really a good characteristic of excellent advice. Very true indeed, Eli!

    #104 Comment By PCSO Lotto Results Phil On January 2, 2012 @ 12:56 am

    An incredible post. Puts what I do into perspective. Good day to you, Eli!

    #105 Comment By Mobile Laptop On January 2, 2012 @ 6:07 pm

    Hello Eli, I have been reading your posts for some time and they are great man. Its a gold mine!

    #106 Comment By Nitish On January 5, 2012 @ 11:25 pm

    Now this is the article i have been waiting for, One of the most stressful thing about my job is to write copyscape proof content for clients. Been thinking of it as a headache till now but now things will change, Thanks you to xD

    Great Post! Would definitely try this out

    #107 Comment By Nitish On January 5, 2012 @ 11:26 pm

    Would agree with that.

    #108 Comment By Nitish On January 5, 2012 @ 11:27 pm

    Now that is out of the question :)

    #109 Comment By Nitish On January 5, 2012 @ 11:27 pm

    I’ll second that

    #110 Comment By Intel Core On January 6, 2012 @ 9:42 pm

    Great Article… Keep it up man, help us more!

    #111 Comment By Ismat Zahra On January 8, 2012 @ 3:32 am

    Gr8 post :) and congrats who got the solution for their job :)

    #112 Comment By Ismat Zahra On January 8, 2012 @ 3:33 am

    keep it up and help everyone :)

    #113 Comment By Ismat Zahra On January 8, 2012 @ 3:33 am

    :)

    #114 Comment By Ismat Zahra On January 8, 2012 @ 3:35 am

    ;)

    #115 Comment By Classificados On March 6, 2012 @ 2:40 pm

    Thanks for this article, very interesting.

    #116 Comment By HealthWrong On March 20, 2012 @ 9:43 pm

    it is very hard to get away with duplicate content now. i think it has to be 100% unique.

    #117 Comment By شات كام On April 3, 2012 @ 10:08 am

Thaaaaaaaaaanks

    #118 Comment By Justbeenpaid On April 29, 2012 @ 8:27 am

    I don’t think this will work anymore

    #119 Comment By Justbeenpaid On April 29, 2012 @ 8:27 am

    Thank you Eli for taking the time and answering all our questions

    #120 Comment By Divya @ Online Auto Advisor On July 19, 2012 @ 7:45 am

    Of the negative factors one of the most well-known is duplicate content. … detecting duplicate content it would be harder for plagiarists to get away with what they .

    #121 Comment By Jasmine @ Callme.lk On August 20, 2012 @ 11:58 pm

    I was just searching this type of post and I found it. I don’t know about guest posting but now it should be clear that read this post.

    #122 Comment By security guard resume On August 22, 2012 @ 3:47 am

    I do agree with all of the ideas you have presented in your post. They’re really convincing and will definitely work. Still, the posts are too short for newbies. Could you please extend them a bit from next time? Thanks for the post.

    #123 Comment By hut be phot On September 2, 2012 @ 3:59 am

    Thank you Eli for taking the time and answering all our questions

    #124 Comment By thong cong On September 2, 2012 @ 9:32 pm

    Would agree with that.

    #125 Comment By Property Marbella On September 3, 2012 @ 5:09 am

    It is better not to copy texts and articles, but try to write original, google find you sooner or later.

    #126 Comment By iloansnow.com On September 7, 2012 @ 2:40 pm

    It’s obvious to mostly everyone that nowadays it’s better to write unique content since copying articles just doesn’t cut it anymore. If you want long term business than you gotta focus on quality.

    #127 Comment By Jasmine @ Callme.lk On September 24, 2012 @ 5:09 am

    I’m not sure how I’ve avoided this site until now but I’m here now and subscribing immediately. I can’t stand people pissing and moaning about black hats beating them out of some ranking! Deal with it! Get better!

    #128 Comment By munna On October 11, 2012 @ 3:50 am

    Hi

    we have PowerMta Video tutorial in this we cover these topics

    The topics which we covered in this video tutorial
    • Introduction of Power MTA
    • Installing PowerMTA
    • License Activation
    • Installing and configuring DNS
    • Features of PowerMTA
    • Configuring Virtual-MTA
    • Max-message-rate
    • Max-message-per-connection
    • Define bounce after
    • Define Retry Domain on particular time period
    • Giving access on web for monitoring purpose
    • Define HTTP management Port
    • Configure SMTP username and password
    • Configure SMTP port
    • Increase PowerMTA queues
    • Reload PowerMTA configuration file
    • How to debug PowerMTA
    • Generate SPF and DKIM
    • Publish SPF and DKIM on DNS
    • Define DKIM on PowerMTA
    • How to connect Interspire applications or other application
    • Start, Pause and delete queue on PowerMTA
    • Delete particular queue on PowerMTA
    • Checking powerMTA log
    • How to check IP and Domain blacklist
    • Delist your IP And domain

    for more info mail me: [11] [email protected]


    Article printed from Blue Hat SEO-Advanced SEO Tactics: http://www.BlueHatSEO.com

    URL to article: http://www.BlueHatSEO.com/how-to-dupe-content-and-get-away-with-it/

    URLs in this post:
    [1] Google’s Webmaster Blog: http://googlewebmastercentral.blogspot.com/2006/12/deftly-dealing-with-duplicate-content.html
    [2] Document Links: http://www.bluehatseo.com/links-through-document-links/
    [3] exploiting LSI: http://www.digeratimarketing.co.uk/2007/04/27/exploiting-lsi-to-rank-higher/
    [4] Webmaster search: http://webmaster-search.ozonew4m.com/
    [5] http://credit.abcsouth.com/?p=144: http://credit.abcsouth.com/?p=144
    [6] http://relevantnews.net: http://relevantnews.net
    [7] http://translate.google.com: http://translate.google.com
    [8] http://www.newerahatstore.com: http://www.newerahatstore.com
    [9] Rajasthan Tours: http://www.maketripindia.com/rajasthan-tour.html
    [10] Do follow list PR 7 Blogs SEO: http://www.techinspiro.blogspot.com
[11] [email protected]: mailto:[email protected]
