Alright, fine. I'm going to call uncle on this one. In my last Black Hole SEO post I talked about Desert Scraping. Now understand, I usually change up my techniques and remove a spin or two before I make them public, so as not to hurt my own use of them. On this one, though, in the process I totally dumbed it down. In retrospect it definitely doesn't qualify as a Black Hole SEO technique, more like a general article, and yet no one called me on it! C'mon guys, you're starting to slip. :) Enough of this common sense shit, let's do some real black hat. So the deal is I'm going to talk about desert scraping one more time, and this time just be perfectly candid and disclose the actual spin I use on the technique.

The Real Way To Desert Scrape
1. Buy a domain name and set up catch-all subdomains on it using mod_rewrite and the Apache config.
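As a rough idea of what the catch-all looks like, here's a minimal Apache vhost sketch. The domain `mydomain.com` and the paths are placeholders, and you'd also need a wildcard DNS record (`*.mydomain.com`) pointing at the box for this to work:

```apache
# Hypothetical catch-all vhost: every subdomain of mydomain.com
# (a placeholder name) lands on one script that inspects the Host header.
<VirtualHost *:80>
    ServerName mydomain.com
    ServerAlias *.mydomain.com
    DocumentRoot /var/www/scraper

    RewriteEngine On
    # Route every request to index.php, which decides what to serve
    # based on which subdomain was requested.
    RewriteCond %{REQUEST_URI} !^/index\.php
    RewriteRule ^ /index.php [L]
</VirtualHost>
```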

2. Write a simple script that can pull content from a database and spit it out on its own subdomain. No general template required.
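A minimal sketch of that script in Python, assuming a SQLite table `pages(subdomain, path, html)` and a hub domain of `mydomain.com` (both placeholders, not anything from the original post):

```python
import sqlite3

def subdomain_from_host(host):
    """Pull the subdomain label out of the Host header.
    'example.mydomain.com' -> 'example'; a bare 'mydomain.com' -> None."""
    parts = host.lower().split(":")[0].split(".")
    return parts[0] if len(parts) > 2 else None

def page_for(host, path, db_path="content.db"):
    """Look up the stored page for this subdomain + path in the database."""
    sub = subdomain_from_host(host)
    if sub is None:
        return None
    conn = sqlite3.connect(db_path)
    row = conn.execute(
        "SELECT html FROM pages WHERE subdomain = ? AND path = ?",
        (sub, path),
    ).fetchone()
    conn.close()
    return row[0] if row else None
```

Since every subdomain is served from the same rows, there's no template to maintain; the page is whatever HTML was stored for it.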

3. Set up a main page on the domain that links to the newest subdomains along with their titles to help them get indexed.

4. Sign up for a service that monitors expiring domains, such as (just a suggested one; there are plenty of better ones out there).

5. On a cronjob every day, have it scan the newest list of domains that were deleted that day. Store the list in a temporary table in the database.
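The daily cronjob could look something like this sketch. The list URL and the `pending` table name are assumptions; the format (one domain per line) is whatever your expiring-domain service actually gives you:

```python
import sqlite3
import urllib.request

def fetch_deleted_domains(url):
    """Download today's deleted-domain list, assumed to be one domain
    per line. The URL comes from whatever monitoring service you use."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    return [line.strip() for line in text.splitlines() if line.strip()]

def store_pending(domains, conn):
    """Stash the day's list in a temporary work table so the second
    cronjob (the crawler) can chew through it during the day."""
    conn.execute("CREATE TABLE IF NOT EXISTS pending (domain TEXT PRIMARY KEY)")
    conn.executemany(
        "INSERT OR IGNORE INTO pending (domain) VALUES (?)",
        [(d,) for d in domains],
    )
    conn.commit()
```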

6. On a second cronjob run continuously throughout the day, have it look up each expired domain, do a deep crawl, and replace any links with their local equivalents (i.e. becomes /page2.html). Do the same with the images used in the template.
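The link-localizing part of that crawl can be sketched with a regex pass like the one below. `example.com` stands in for whichever expired domain is being mirrored (a placeholder, since the post doesn't name one), and a real crawler would want a proper HTML parser rather than a regex:

```python
import re

def localize_links(html, domain):
    """Rewrite absolute URLs pointing at `domain` (the expired site being
    mirrored) into local paths, so the scraped copy references itself
    instead of the dead site. Works on hrefs and image srcs alike."""
    pattern = re.compile(
        r"https?://(?:www\.)?" + re.escape(domain) + r"(/[^\s\"'<>]*)?",
        re.IGNORECASE,
    )
    # Keep the path portion if there is one, otherwise point at the root.
    return pattern.sub(lambda m: m.group(1) or "/", html)
```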

7. Create a simple algorithm to replace all the known ad formats you can find and think of with your own, such as AdSense. It also doesn't hurt to replace any outgoing links with links to other sites of yours that are in need of some link popularity.
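For the AdSense case specifically, a rough heuristic is to look for the inline script that sets the `google_ad_*` variables plus the `show_ads.js` include, and swap the whole block for your own markup. This is a sketch, not an exhaustive detector, and you'd want one pattern per ad network you care about:

```python
import re

# Heuristic match for the classic AdSense snippet: the script block that
# sets google_ad_client etc., optionally followed by the show_ads.js include.
ADSENSE_RE = re.compile(
    r"<script[^>]*>.*?google_ad_client.*?</script>\s*"
    r"(?:<script[^>]*show_ads\.js[^>]*>\s*</script>)?",
    re.IGNORECASE | re.DOTALL,
)

def swap_ads(html, my_ad_html):
    """Replace every detected ad block with your own ad markup."""
    return ADSENSE_RE.sub(my_ad_html, html)
```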

8. Put the scraped site up on a subdomain using the old domain minus the tld. So if the site was your subdomain would be
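Reading "old domain minus the tld" the simplest way, the naming rule can be sketched like this. Both `mydomain.com` and the example expired domains are placeholders of my own, not names from the post:

```python
def subdomain_for(expired_domain, base="mydomain.com"):
    """Build the hosting subdomain from an expired domain by dropping
    the TLD: 'somesite.com' hosted under 'mydomain.com' becomes
    'somesite.mydomain.com'. `base` is your own catch-all domain."""
    name = expired_domain.lower()
    if name.startswith("www."):
        name = name[4:]
    label = name.split(".")[0]  # keep only the part before the TLD
    return f"{label}.{base}"
```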

9. Have the cronjob add the new subdomain to the list of completed ones so it can be listed on the main page and indexed.

What Did This Do?
Now you've got a site that grows in unique content and niche coverage. Every day new content goes up and new niches are created on that domain. By the time each subdomain gets fully indexed, many of the old pages on the expired domains will have started falling from the index. Ideally you'll create a near perfect replacement with very few duplicate content problems. Over time your site will get huge and start drawing BIG ad revenue. So all you have to do is start creating more of these sites. Since there are easily six figures' worth of domains expiring every day, that is obviously too much content for any single domain, so building these sites in a network is almost required. So be sure to preplan the possible load balancing during your coding. The fewer scraped sites each domain has to put up a day, the better the chances of it all getting properly indexed and ranking.
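One simple way to preplan that load balancing is to assign each expired domain to a hub deterministically, so the same domain always lands on the same site no matter which cronjob picks it up. A minimal sketch, assuming a fixed network size:

```python
import hashlib

def assign_hub(expired_domain, network_size):
    """Spread expired domains evenly across `network_size` hub domains.
    Hash-based, so the assignment is stable: re-running the job never
    moves a domain to a different hub."""
    digest = hashlib.md5(expired_domain.lower().encode()).hexdigest()
    return int(digest, 16) % network_size
```

Capping how many subdomains each hub publishes per day then just means sizing `network_size` against the daily volume of the expired-domain feed.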

And THAT is how you Desert Scrape the Eli way. :)

*Wink* I may just have hinted at a unique Black Hole SEO way of finding high profit and easy to conquer niches. How about exploiting natural traffic demand generated by article branding?