Blue Hat Techniques

Blue Hat Techniques | 21 May 2006 05:49 am

This technique, although painfully obvious, works as a last resort for a site that has been banned in Google (or any other SE, for that matter). Let's say, for instance, that your site uses some spammy techniques and got banned. The first and most obvious step, of course, should be to file a reinclusion request with Google as soon as you do a little spring cleaning. However, there are two reasons why a person might not want to do this.

Reasons to not file a reinclusion request
1) Your spammy technique does so well in MSN and Yahoo that removing it for the sake of Google would not be worth it. You do not necessarily care about Google, but it would be nice to at least have another chance there.

2) Even with the spam removed, Google won't honor your reinclusion request. So you might as well put it back up and do well in MSN and Yahoo. Getting back into Google would still be nice, though.

Here's how to get back into Google without hurting your status on MSN and Yahoo.
Buy another domain extremely similar to yours, then 301 redirect from the old domain to the new one. Make the new domain your new primary domain.

If you're wondering why it seems like I'm picking on Google in particular, it is because Google seems to care more about older domains and is the most likely to ban your site for a technique it considers spammy. From my experience, 301 redirecting from one domain to a newer one has only minor, temporary effects on your MSN and Yahoo rankings.

Obvious downsides
Although you get to keep your link popularity in the other engines, you obviously lose your current link popularity in the engine you're unbanning yourself from, since it probably won't even bother following the links pointing to a banned domain. Hey, what can I say? Gain nothing, lose nothing. At least you're back in the index with a fighting shot, any links you can get switched over to the new domain will show up in the now-unbanned engine, and any new links you gain count as well.

PS. Why haven't I seen any tools designed to find all the sites that link to your site and email them requesting a change in the link? It'd be easy to make and might be useful for situations like this. It also wouldn't be bad for requesting changes to anchor text. I would also love to see one that goes out and finds all the sites that link to you using rel=nofollow and sends the site owner an email asking them to remove the nofollow tag.
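The core of the tool wished for above, finding links to your domain on a backlinking page and flagging the nofollowed ones, is simple to sketch. This is an illustrative stub, not an existing tool; the names (`BacklinkAudit`, `find_links_to_domain`) are made up for the example.

```python
# Scan a backlinking page's HTML for links to your domain and flag
# the ones carrying rel=nofollow, so you know whom to email.
from html.parser import HTMLParser

class BacklinkAudit(HTMLParser):
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.links = []  # list of (href, is_nofollow)

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if self.domain in href:
            rel = attrs.get("rel") or ""
            self.links.append((href, "nofollow" in rel))

def find_links_to_domain(html, domain):
    parser = BacklinkAudit(domain)
    parser.feed(html)
    return parser.links
```

From there it is just a matter of feeding it each linking page and generating the outreach emails.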

Blue Hat Techniques | 03 Apr 2006 01:54 am

One thing that can be learned only by running quite a few websites at once is the differences in how the bots treat sites. One of the biggest differences is how often they pull your pages and how often they update your site in the index. One day, while browsing through my different stats, I noticed that certain sites get updated in the indexes daily and some get updated monthly. Some sites with only about 1,000 links get hit by Googlebot 700 times/day, while some others with over 20,000 links get hit only about 30 times/day. This inspired me to begin an experiment.

The Experiment
Being one of the few who paid attention in junior high science class, I did this test the right way and put on a white lab coat (just kidding, but wouldn't that be cool? Where do you buy those things?). My constants were simple. Each site was a brand new domain with similar keywords, similar competition, and similar searches/day. Each site had extremely similar content and the same template. I also pointed exactly 10 links from the same sites to each site. My variables were also simple. Each site was automatically updated with new pages and new content at random times; the only difference was how many times in one day each would be updated.

Site 1 - Updated 1 time/day

Site 2 - Updated 3 times/day

Site 3 - Updated 5 times/day
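The post doesn't say how the random-time updates were driven; a minimal sketch of such a scheduler might look like the following. The function name and the idea of drawing (hour, minute) pairs are my own illustrative assumptions.

```python
# Pick N random update times for a day, so content changes happen
# at unpredictable moments rather than on a fixed schedule.
import random

def daily_update_schedule(updates_per_day, rng=None):
    """Return a sorted list of (hour, minute) times for today's updates."""
    rng = rng or random.Random()
    return sorted(
        (rng.randrange(24), rng.randrange(60)) for _ in range(updates_per_day)
    )
```

A cron job could then fire the site's content generator at each scheduled time.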

The hypothesis: crawlers behave differently depending on how often a site is updated, and the indexes refresh a site more or less frequently to match.
Time Frames
I let the sites sit for one month, closely monitoring each site and its progress each day.

Spider Hits After First Month
              Site 1      Site 2      Site 3
MSN:          214         478         1170
Google:       184         523         957
Inktomi:      226         391         514

Time Frames
Then I monitored the sites for 6 months.

Cache Update Averages After 6 Months
Site 1- MSN: 1.52 times/month Google: 1.4 times/month
Site 2- MSN: 18.24 times/month Google: 4.1 times/month
Site 3- MSN: 21.70 times/month Google: 13.4 times/month
*Yahoo excluded because it’s tougher to tell cache times and date stamps vs. cached pages/title changes.

I also tracked the percentage of actual pages that were indexed across Google, MSN, and Yahoo:
Site 1-57%
Site 2-81%
Site 3-83%
It is understood that spiders will hit your site for four primary reasons. First, validating a link from another site. Second, checking for changes to your site. Third, reindexing your site. Fourth, pulling robots.txt. With the first and fourth factors neutralized, we can assume the update and spider stats are due to the second and third reasons.
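Hit counts like the table above can be tallied straight out of a raw access log by matching each line's user-agent against known bot strings. The log format and bot markers below are assumptions (Inktomi's crawler identified itself as Slurp); adjust them to your own server's logs.

```python
# Count spider hits per search engine from access-log lines by
# looking for each crawler's user-agent substring.
from collections import Counter

BOTS = {"Googlebot": "Google", "msnbot": "MSN", "Slurp": "Inktomi"}

def count_spider_hits(log_lines):
    hits = Counter()
    for line in log_lines:
        for marker, name in BOTS.items():
            if marker in line:
                hits[name] += 1
    return hits
```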

Practical Use
I understand from this experiment that if you keep your updates consistent and at random times, it will force the bots to revisit your site more often. They will all start visiting your site at consistent intervals depending on your number of links. Once they start to build a rhythm of how often your content changes, they will adapt and start visiting more. Once they build that rhythm into their timing, they will update your site in the indexes accordingly.

Therefore a theory can be built: crawlers are designed to accommodate your site and the practices of the webmaster. Thus, you can train the crawlers to how your site operates, and this will result in differences in performance in the indexes.

Flaws In The Experiment
Upon factoring in the final results, I wished I had overdone it with a fourth site that updated 100 or 1,000 times a day, to see if it performed better or worse than Site 3. The second flaw falls into the category of seasonal changes. I did this experiment between June 2005 and January 2006, and the engines could have been acting differently during those times. I know for a fact that MSN was, because it was so new.

Blue Hat Techniques | 03 Mar 2006 02:53 am

Since Synergy Links aren't a topic that I have ever seen discussed, I will attempt to define and coin the term.
Synergy Links: channeling links so that their combined effect is greater than the sum of their individual effects; specifically, channeling reciprocal links through third party pages in order to improve the quality/SEO impact of the links.

The idea is simple. Take shitty reciprocal links and use them to build credit for a page outside of your site, which in turn provides a quality link to yours. In other words, since search engines don't like reciprocal links, use them to gain quality links instead.

Characteristics of A Quality Link

    Link written into content
    Your keywords in anchor text
    Content similar to your site’s content
    Low number of links on single page

The Process
Create subdomains on other domains (preferably on separate class-C's as well). Post a single page, or a very limited number of pages, with articles directly related to the target's keywords. Place your links inside the articles where they would be most beneficial. Then find automated link scripts or link directories and use them to build links to that site: place the reciprocal links on a subpage of that site, and build massive links to that page.
Repeat and rinse.

What will this do?
This will, in a sense, transform low quality reciprocal links, which the search engines don't like and don't give much credit to, into higher quality links, which search engines love.

How is this different than a third party doorway page?
Third party doorway pages are considered, in themselves, a low quality link. They typically contain hundreds to thousands of pages and usually don't have full, useful articles for the surfer. The gist: doorway pages are designed to pull search engine traffic and redirect it to your site, while Synergy Links are designed to pull low quality links and turn them into higher quality links.

If you still don't understand the concept, ask yourself this question: would you rather have (A) 100 reciprocal links or (B) 1 link from a site related to yours? If you chose the 100 reciprocals, then it's time you drop the business and seek ways to whore out your wife for money. The answer was B, in case anyone gets this question on Who Wants to Be a Millionaire.

Once you truly understand the difference, as well as the process, you can really excel in a HUGE way in your linking campaigns. Besides, this method is a lot better than slamming your own site with thousands of links.

Blue Hat Techniques | 07 Feb 2006 11:44 am

Someday I wish Google would be straightforward and say whether your site is in the sandbox, and why sites get put in the sandbox. SERIOUSLY, GOOGLE! What's the worst that can happen if you use a little honesty in the matter? Until then, I use a special trick to keep/get my sites out of the sandbox.

The trick rides on one principle which my own sites' history has proven to me to be true: Google will almost never put a new domain from an old site in the sandbox.

Therefore, to keep out of the sandbox, you need to trick Google into thinking your site is from an old domain. You already know I like to keep old domains and websites around even if they don't make any money. This is one of those times you're going to need them again.

Find an old site of yours that gets crawled regularly. Create a subdomain. Mirror the main page of your site on the subdomain. Wait for it to get crawled and indexed. Then put up a 301 redirect to your real domain. This will make Google think that your relatively new site is actually an old site that got moved to its own domain.

Out of the sandbox you go!

Keep this in mind for your next website project. I have found it doesn’t hurt to plan early and release both at the same time. Then you just have to wait a month or so and put up the 301 redirect.
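The redirect step itself is trivial; here is a minimal sketch as a WSGI app you could mount on the old subdomain once it has been indexed (in practice a one-line Apache `Redirect 301` would do the same). `NEW_DOMAIN` is a placeholder, not a real site.

```python
# Issue a 301 from every path on the old subdomain to the same
# path on the new primary domain.
NEW_DOMAIN = "http://www.example.com"  # hypothetical new primary domain

def redirect_app(environ, start_response):
    location = NEW_DOMAIN + environ.get("PATH_INFO", "/")
    start_response("301 Moved Permanently", [("Location", location)])
    return [b""]
```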

note: I have only tested this theory four times in the past with four different websites of mine. It did work each time. One of the times I used it to get a site out of the sandbox. I cannot guarantee you that it’s perfect or that it will work for you. Please let me know if it works or fails for you because at this point I still consider it a theory.

Blue Hat Techniques | 04 Feb 2006 03:59 am

Attention Spam Recipient!

**Limited Time Offer**

For only $29.95 you can be listed in Google within 48 hours and have top 10 placement in Google for any keyword you choose GUARANTEED!

Really?! COOL! I'll take "Real Estate", "Computers", and hell, "Internet" too.

I know none of you are amateurs. So as much as I was hoping to bust some pacemakers, I won't bother explaining why this is a scam.

The funny thing is, I get about four or five of these types of emails a day, and every time I delete them I catch myself having a quick, half-second wandering thought: perhaps they do know something I don't. NAH! Then I proceed to hit delete. Admit it, no matter how good you are, you catch yourself doing the same thing. I refuse to believe I'm the only insane person in the world.

There is, however, some truth to the wonderful offer buried among all the guarantees.

you can be listed in Google within 48 hours

Now, to get some things straight: I have no idea what these scammers are talking about, and I never bothered to ask them how they plan on getting your site into the Google index within 48 hours. There are really three possibilities:

1) They are straight up lying

2) They are using some crazy blackhat technique

3) They know what I am about to share with you

So, to stomp out any myths: yes, you can get indexed in Google within 48 hours. No, it is not a blackhat technique. It's really not even that sneaky of a practice. In fact, Google probably likes it, because it makes their system better.

Now that I've beaten around the bush for too long, here is how you get listed in Google in 48 hours.

1) Get some Google AdSense code.

2) Put the code on the page you want indexed.

3) Go to the page in your browser and hit refresh every couple of seconds until the Community Service ads go away and real advertisers start to appear. This usually takes around 40 or so refreshes, but it really depends.

What happened? Google AdSense does not get paid for the community ads, so they make no money until they can determine exactly what your site is about and display relevant ads. When Google sees people hitting a page with AdSense that isn't in their index, it rushes its little spiders over there to find out what the page is about and spider its content quickly, before Google loses more money. For efficiency's sake, it uses that information to add your site to the index. This usually happens within 48 hours. Using this technique I've seen an entire site go into the index within a couple of hours. Sometimes I've seen it take up to 3-4 days. I'm not going to :) Guarantee :) anything, but I will say this: it works!
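The manual refresh loop in step 3 can be automated. This is a sketch under stated assumptions: `fetch_page` is a stand-in you would supply (an HTTP GET of your own page), and the PSA marker string is an illustrative guess at how public-service ad markup might be recognized, not a real AdSense internal.

```python
# Poll the page until the ad markup no longer looks like a
# public-service ad, i.e. real advertisers have appeared.
import time

def refresh_until_real_ads(fetch_page, psa_marker="public service",
                           max_tries=100, delay=2.0):
    for attempt in range(1, max_tries + 1):
        if psa_marker not in fetch_page():
            return attempt  # real ads appeared on this try
        time.sleep(delay)
    return None  # still PSAs after max_tries
```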

For those of you who don't have an AdSense account but are in way too much of a hurry to create one, this is probably bad advice, but I'll say it anyway: borrow someone else's. You're going to remove it soon anyway, and they probably won't get in any trouble for it because no one is there to click on it.

Two quick side notes for you. First, someone is probably wondering whether this is just as good as Google naturally finding your site through other people's links. My research says yes. Second, this technique may be against AdSense's Terms of Service. I'm not sure if it is or isn't, so at the very least DON'T click on your own ad. That's click fraud. Be nice to Google and their advertisers, and hopefully someday they will be nice to you.

Blue Hat Techniques | 01 Feb 2006 12:31 pm

With the new trend of article sharing, concerns have been raised about duplicate content penalties. Now that everyone is in a fuss about whether or not to use widely spread articles from article submission sites, fears are rising as to how search engines will react. Well, consider this: the author probably submitted the article to 100+ article directories. Plus, we'll assume the article was actually used by a good 10 websites a month. By the end of a Google update cycle (3 or so months), that's 130 sites that have the exact same article as you do on your website.

This is bad for several reasons.

1) Your site stands a much smaller chance of ranking for small phrases and keywords that aren't used much but bring good traffic, simply because many articles = lots of potential phrases for people to search for. No matter how small the phrase that fits your article, you still have 130 sites competing for it.

2) Possible duplicate content penalties. These are a gray area. None of the research I found has been able to accurately explain if or how search engines penalize a site for having the exact same content as another site.

3) You have to link to the author's site and sometimes the article directory you received it from. Search engines then know you are not the original author, so they rank the article directories high and the author's site highest. You get lost in the middle.
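Engines don't publish how they detect duplicate content, but a common textbook approach is shingle overlap: slice each document into overlapping word n-grams and compare the sets with Jaccard similarity. This is an illustrative sketch of that idea, not Google's actual method.

```python
# Estimate how "duplicate" two texts are via word-trigram overlap.
def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

An article syndicated verbatim to 130 sites scores 1.0 against every copy, which is exactly the situation the list above warns about.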

Question of the day: How do we get expert content that is truly fresh and not going to be used on any other sites?

This is so stupidly obvious I can't believe it's not already ragingly popular: ask an expert! Can't find one? Try using chatrooms like IRC. It's easy to find a chatroom on almost any subject, and chances are that if there are people in it, not only are they bored, but they truly enjoy talking about the subject at hand. Ask them a long, pointed question that will induce a rant. I'll give an example: what dogs are most impressive in a dog show, and why? A question of this manner will induce a huge rant and probably a debate. With a little editing and cleaning up, you have yourself a unique expert article on the subject.

This tactic is advantageous for several reasons.

1) The content is completely unique. You will be the first to have it. Chat room content generally doesn’t get posted on websites or forums. No one will have content like this.

2) With a little fact checking it will be useful and reliable content for your visitors. This helps if you already know a little bit about your field.

3) It’s MUCH easier than writing your own articles.

Blue Hat Techniques | 27 Jan 2006 09:44 pm

So this sounds impossible, right? Well, it's definitely possible; it's just really sneaky and underhanded. However, to be clear, this technique involves your competitors WILLINGLY linking to you. This is a natural link, and although it's an underhanded technique, it is completely white hat.

First you will need some tools

1) A quality link exchange script (not a link exchange directory script). I suggest you use PHP Link Manager because it does a great job of reading your links on their pages without their having to be absolutely perfect. You will also need some knowledge of PHP, because you will need to modify this script.

2) A proxy hitter that allows multiple domains. I would suggest one called Stealth Advertiser. Now let me get one thing straight: this Stealth Advertiser program is a WORTHLESS program that does not deliver on its promise of "millions of visitors." Don't use it for its intended purpose! I will not give a link to the company that makes it, because they are a bogus company. If you are actually interested, read the readme file. Until then, we are going to use the program in a little more ingenious way to get what we want.

Step 1-Install the link exchange script somewhere on your site. Edit the code so towards the top it says something like this:

Random MYTOPIC Site

Link To Some Site

Then get the script to rotate through a few sites you’ve hand selected that are related to your site’s topic. No more than 10. Then below that; where the script displays the link exchanges put the heading as

Permanent MYTOPIC Links

Link 1

Link 2

etc….you get the idea

Then below that, put the form they will use to add their site to the permanent links section. This script MUST force them to already have a link to your site on theirs before they may submit their link. Otherwise the effect is lost.
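The link-back requirement enforced above could be sketched like this. `fetch` is injected so the check stays testable; the containment check is deliberately crude, and a real script (the post suggests PHP Link Manager) would parse the anchors properly.

```python
# Reject a link-exchange submission unless the submitter's page
# already contains a link back to your domain.
def has_link_back(page_html, my_domain):
    # crude containment check on common href forms
    return ('href="http://' + my_domain in page_html
            or 'href="https://' + my_domain in page_html
            or 'href="http://www.' + my_domain in page_html)

def accept_submission(their_url, my_domain, fetch):
    html = fetch(their_url)
    return has_link_back(html, my_domain)
```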

Step 2 - Install Stealth Advertiser and put in the URL of the link exchange page you've just created as the referring URL, 61 or so times. Then remove all items from the list and click Add By Search. This will let you pull a HUGE list of sites that are directly your competitors in Google, Yahoo, and MSN. Then add another site you own so you can test to make sure it's really working.

Step 3- Click Start and let it finish. This will take several hours depending on how many competitors you have.
What will happen?

This will pull your competitors' websites 61 or so times and make it look like the traffic is coming from your links page. When they view their stats, they will see a large number of visitors coming from your site to theirs. They will follow the link and see your links page. They will not see themselves in the Permanent Links section and will assume that all the traffic came from the random rotation. Even if they are a big site and normally have a no-linking policy, I guarantee you this: the first thought that will go through their head will be "I gotta get on that permanent links section." When they read the form and it requires them to have a link back, they will decide that it is worth it, and they will link to you.
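The mechanic the tool relies on is nothing more than a request whose Referer header points at your links page, so the target's stats show you as a traffic source. A minimal sketch with the standard library, standing in for the (worthless) Stealth Advertiser program; the URLs are placeholders.

```python
# Build an HTTP request whose Referer header claims the visit
# came from your links page.
import urllib.request

def build_spoofed_request(target_url, referer_url):
    return urllib.request.Request(target_url, headers={"Referer": referer_url})
```

Sending such a request 61 times per competitor (via `urllib.request.urlopen`) reproduces what the program does.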

This is a VERY effective technique. In fact, from my experience this works a hundred times better than personally emailing every one of your competitors. If you have a lot of competitors and/or a lot of big keywords you compete for, this will benefit you the most. You can gain a couple hundred to a thousand RELEVANT links within a day or two.

Blue Hat Techniques | 26 Jan 2006 04:47 am

I promote a lot of websites, quite a few of which seem like great ideas at first but end up never taking off. This is true for most site owners; you probably have at least a couple of dead sites sitting there. If you're not at this point yet, remember this one piece of advice: keep the deadbeats. They will someday become useful to you. Even if they aren't going to make you any money, you can at least get more use out of them than if you let the domain go to a domain sponsor.
I developed a technique for turning these deadbeat sites into great rankings for my newer, more promising projects. Chances are these sites have some PageRank. Use them. Even if they are small and on the same Class-C IP as your newer sites, you may still use them to boost the rankings of your newer projects.

You have several options, ranging from good to better to best.


Link to your new site on every page of your old site. This will help boost your PageRank, but it will probably do very little for your rank in the SERPs, because search engines tend to discredit the same outbound link on every page of a site, especially if it's to a site on the same server. This does, however, tend to work very well for MSN.


Use the deadbeat sites as incentive in link exchanges. Ever try going to a major PR7 site and asking them to exchange links with a PR4? Doesn't usually work, does it? Instead, offer them a link from a PR5, a PR4, and six PR3's, for example. Give them the addresses and let them know that these extra links will definitely be worth their while. I personally have managed to get many very high ranking, high PR sites to link to me this way. Trust me, this actually works very well. What's the worst that can happen? Your dead site gets cluttered with a few links? Big deal. This way you both win.


Repromote these sites so they pull a good PageRank. Then sign up with a company that offers its advertisers text link directories on other sites. I'm not going to name any particular companies because I don't want them to get flooded with requests, but there are quite a few out there. Once you have their directories uploaded onto your site and the next Google update gives the directory a decent PageRank, come to them with an offer they can't refuse: "I will give you UNLIMITED links in the directory, as often as you'd like, in exchange for you sponsoring me a PR8 link (or whatever you think your directory is worth) on a site that's on topic with my newest project." If your directory pulled a decent PageRank, they will literally jump out of their chairs at this opportunity. They normally have to pay you around 5 bucks for every link they place in your directory, and they almost always have an excess of links they need to place and are looking for a cheap way to put them up. If they can dump all their excess links in your directory for free, they can make WAY more money than they will spend paying a high PageRank site to put your link up permanently. Remember, they do get them for half price, after all.

Blue Hat Techniques | 18 Jan 2006 09:04 am

1,200+ high quality one way links with no strings attached.

Am I kidding? Nope.

For those of you who don't know: software is where it's at. If you have a piece of software and a high quality submission tool such as Promosoft (the best one) or Robosoft (tedious, but works VERY well), you can instantly gain enough links to ANY page to give you an instant PageRank of at least 5. Trust me, this is one of my most guarded secrets, and now I'm spilling it to you. You can turn almost any website into a piece of software and submit it to thousands of software sites that will not only post a full description and a link, but give you a dedicated page for your software product. That page usually pulls an average of PR3, which, combined with the hundreds of other one way links, will almost guarantee you a PageRank of 5.
This raises an instant question: how do I get a piece of software to promote on my site? Turn your entire site, or the content portions of your site, into an executable with an installer. You can do this easily by searching on your favorite search engine for "HTML to EXE converters."

This will give you several advantages:

1) In my experience it will give you at least 1,200+ one way, high quality links.

2) Depending on your market, these sites can send hundreds of their own visitors a day to your site!

3) A large portion of those visitors will actually download your software, which allows you to place a permanent bookmark to your site in their browser (don't bundle adware, just a bookmark; you don't want to piss off your own visitors). Plus, they get an always-accessible link to your site and your content on their desktop/start menu.

Note: I give you this information out of confidence. DO NOT ABUSE IT OR YOU WILL LOSE IT! If everyone starts spamming software directories, we all lose. Use this only for high quality sites, and make sure the people who download your software will benefit highly from actually installing your software. Also, for god’s sake DON’T plague their computers with crap you know they don’t want. Make them happy and they will keep your software/bookmark on their computer for many years to come; generating you residual visitors for as long as your site stands.

If you like this article, please place a link to it on your site so more people like you may benefit from these techniques.

Blue Hat Techniques | 18 Jan 2006 08:30 am

For those of you who haven't read the rivetingly boring but useful article on how PageRank works by Ian Rogers: you are missing out, or possibly have a girlfriend. Either way, I suggest that at least once in your career you fold up your girlfriend, stuff her in your front pocket, and give it a good, thorough reading. It gives some great insight into how Google calculates your PageRank. How is this useful? It's not, except that you learn that ALL links to a single page affect its PageRank, including those inside your own site. So what's the best way to organize your site so you can gain the most PageRank for your main page? The structure is simple and I'll explain it to you, but first understand that the solution isn't practical. If you organize your site this way, your visitors will click on the red X faster than their Google toolbar can inform them how "important" your site is. Let's first look at what Google's algorithm considers to be the perfect site, and then we'll figure out a way around it.

[Diagram drawn in MS Paint.] Yes, I rock at MS Paint. Quit denyin' my skills, biznatch!

For those of you who took my advice and read the article, you will quickly understand the math behind why this model works best. Unfortunately, if you have a 100-page site, it will take your visitors the rest of their lives to find the article they were looking for this way. So let's look at an alternative.

With this method, each page will theoretically push one third of one third of its 1 point of PageRank. Did that make sense? It means each page you add to this structure gives your main page approximately 0.11 PageRank points. So I suggest you try this tried and true method of creating a "mini article directory." Place as many articles that relate to your main topic as you can into this structure (this is very important; you don't want your site to be off topic). Then create a page that links to all the articles. Somewhere on your site (on a low-PR page), link to the first page of the article directory. Then create a way for visitors to submit their own articles.
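The "one third of one third" arithmetic can be made explicit: if an article page splits its rank across 3 links, and the hub page it feeds splits across 3 links too, the fraction reaching the main page is (1/3) x (1/3). Damping is ignored here, which is part of why the post calls 0.11 approximate.

```python
# Fraction of one article page's rank that reaches the main page
# through a hub, when each page splits its rank evenly.
def contribution_per_article(links_on_article=3, links_on_hub=3):
    return (1 / links_on_article) * (1 / links_on_hub)
```

With the defaults this gives 1/9, i.e. roughly the 0.11 points per article claimed above.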

This creates several advantages:

1) Your site has useful articles on your subject. Consistently gaining relevant content is always smiled upon by the search engines.

2) More entry points to your site that contain easy-to-find links to your main page.

3) You get to advertise yourself to other webmasters as an article posting site (you will get a few inbound links for this).

4) For every article submitted, your main page gains approximately 0.11 of a PageRank point. Note: I say approximately because Google uses a damping factor in its equation and there is no way of telling for sure what it is at the current moment. Either way, you can use this technique to your heart's content. Search engines will love you for it, and you can boost your PageRank in a way that is much more reliable than seeking inbound links.

Like all methods and techniques in SEO, don't rely 100% on any single method. Juxtapose this one with your current methods for a little extra boost above your competition.
