Blue Hat Techniques


Blue Hat Techniques | 10 Nov 2009 05:11 am

I promised a while back that I'd teach you ugly bitches more ways to build your sexy SEO Empire. With some spare time this week I might as well take some time to help your nasty hooker ass do just that. YES I will insult you through this entire post because judging from the recent comments you donkey fuckers are getting a lil too big for your own britches and need to be brought down a peg. I'm kidding of course. You guys are great. I just feel like filling this post full of as many reasons not to read it as possible, and since no one gave me an excuse to do it, I just made one up. :) This post will be advanced, and since this technique's ability to be bulletproof feeds off creativity and the subtleties of being self-made, I'll also only give out pseudo code instead of code samples. It is however an extremely efficient way to build large amounts of unique and self-promoting sites and is more than reusable for just about any chunk of niches, so modularize your code and save it for future scaling. Trust me, you'll wish you did.

Getting Started With Your Custom Mininet Generator
It's always easiest to start a project with an example in mind, so begin by picking a generalized niche that can involve a lot of different sites along the same theme. For this example I'll use music based fan sites. So I'll grab a few starter domains to test with such as AudioslaveFanzSite.com, MetallicaFanzSite.com, KanyeWestFanzSite.com and maybe one more for good measure, JonasBrothersFanzSite.com. <- See how I was all insulting and mean at the beginning of this post and suddenly changed it around to being all considerate and using faggy bands as examples so you fags can relate to what I'm saying here. I'm not mean all the time and can in fact be quite understanding. *grin* Anyways! Now that you've got your domains, set up an account under a single domain on your server and add the rest as addon domains. In your site's root make a Sources folder and another to hold a data dump. After that set up a MySQL database to be used by all the sites and put a row in a table for the single domain you bought (the rest will be inserted automatically later). I recommend you put the actual domain in some value in that row.
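
For reference, here's a rough sketch of what that shared table might look like. It's just illustrative PHP using the old mysql_* functions; the field names are placeholders and you'll want a column for every template variable you end up using.

<?php
// One shared database for every site in the mininet.
$db = mysql_connect('localhost', 'mininet_user', 'password');
mysql_select_db('mininet', $db);

mysql_query("CREATE TABLE IF NOT EXISTS sites (
  id INT AUTO_INCREMENT PRIMARY KEY,
  domain VARCHAR(255) NOT NULL,    -- e.g. audioslavefanzsite.com
  keyword VARCHAR(255) NOT NULL,   -- e.g. Audioslave
  bgcolor VARCHAR(7),              -- randomized per site by the generator
  sitetitle VARCHAR(255),
  heading1 VARCHAR(255)
)");

// Seed it with the one domain you bought by hand; the generator inserts the rest later.
mysql_query("INSERT INTO sites (domain, keyword) VALUES ('audioslavefanzsite.com', 'Audioslave')");
?>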

Build a Single Universal Template
This is easier than it sounds. You can always go 100% custom, but to save time I like grabbing a generic looking premade template. I then put it into a script and dissect the html to put in as many variables as I can fit. A few examples would be <title>$sitetitle - $pagetitle</title> <h1>$heading1</h1> <body bgcolor="$bgcolor"> <img src="/data/$domain/images/$mainimage"> <div>$maincontent</div> which I will later fill with the actual content and variable values. Pack the template full of as many customizations as you can so it will not only be flexible and universal among all topics in the niche but the html itself is very random and as un-cookie-cutter as you can get it. Towards the end of the process I also tend to throw in as many $randspacing type variables as possible. Then I use a randomizing function to create various spacing and line returns and randomly insert it throughout just about every group of html tags. I mention this now instead of later in the post because it's important to realize that you will want this template to be as flexible as possible, because you'll eventually be using that same template on a TON of sites that may or may not be doing some interlinking, so you don't want it to appear as a network. Changing colors, widths, and images around is a great way to accomplish this; just don't get too complicated with it starting out. Keep it very basic, and once you've got the mininet nearly done you can add as many as you'd like later. It's easy to throw yourself off focus and doom the project by getting too hung up on getting every little thing perfect. For each variable you place in the template you'll want to put the same as a field in the SQL table you created previously.
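
To make the variable idea concrete, here's a stripped-down fragment of what a chunk of that template might end up looking like once it's been variable-ized. The names are arbitrary, and $site is assumed to be the row pulled from the shared database for the current domain.

<?php
// Returns a random mix of newlines and spaces so the raw HTML differs a little from site to site.
function randspacing() {
  return str_repeat("\n", rand(0, 3)) . str_repeat(' ', rand(0, 4));
}
?>
<title><?php echo $site['sitetitle']; ?> - <?php echo $pagetitle; ?></title><?php echo randspacing(); ?>
<body bgcolor="<?php echo $site['bgcolor']; ?>"><?php echo randspacing(); ?>
<h1><?php echo $site['heading1']; ?></h1>
<img src="/data/<?php echo $site['domain']; ?>/images/<?php echo $site['mainimage']; ?>">
<div><?php echo $maincontent; ?></div>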

Putting Together Some Content Sources
For an example such as the music fan sites mininet I'd probably jot down a few sources of content, such as Audioscrobbler for the band descriptions, primary image, and discography. Then the Youtube API for a few music videos for each musician. Another great source would be Yahoo Images for some band related wallpapers, and Alexa for some related sites to link to. I might even grab the Google Blogsearch RSS for some recent blog posts related to that artist. Starting out it's usually best to keep your sources as simple as possible and not stray too far from readily available RSS feeds and APIs. Like I said, you can always get more advanced and custom later. Create a module script for each source and put it in your previously created Sources folder. Then add a table to your SQL database for each source you came up with, put in all the fields you'll need for it, and remember to save room to identify the domain on each row.
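
Here's roughly what one of those source modules could look like, using Youtube as the example. The feed URL and entry layout are from memory, so double-check them against the current API docs before leaning on this sketch.

<?php
// Sources/youtube.php - grab a handful of videos for a keyword and store the embed codes under this domain.
function source_youtube($domain, $keyword) {
  $feed = @simplexml_load_file('http://gdata.youtube.com/feeds/api/videos?q=' . urlencode($keyword) . '&max-results=10');
  if (!$feed) return;

  foreach ($feed->entry as $entry) {
    // The video id is the last chunk of the entry's <id> tag; build a standard embed around it.
    $id    = basename((string)$entry->id);
    $embed = '<object width="425" height="344"><embed src="http://www.youtube.com/v/' . $id . '" type="application/x-shockwave-flash" width="425" height="344"></embed></object>';

    mysql_query("INSERT INTO youtube (domain, embedcode) VALUES ('" . mysql_real_escape_string($domain) . "', '" . mysql_real_escape_string($embed) . "')");
  }
}
?>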

Building The Generator
Create a backend script that will simply be a place to copy and paste a list of the domains and their primary keywords into with a button to submit it. My domains and keywords for this example would most likely be pipe delimited such as:
GodsmackFanzSite.com|God Smack
U2FanzSite.com|U2
BeyonceFanzSite.com|Beyonce Knowles
Once the list is submitted the generator needs to insert a new row into the table and create all the randomized variables for the site, such as the background colors and various spacings (and/or a brand new template file stored in the data folder), putting them in that same single row. Once the basics are done it can call all the source modules and run them using the domain name and the keywords they need to grab the right content. They will then need to put that content into the database under the proper domain's row(s). You now have all the content you need for each site and each has its own template! Now it's time to just build the son of a bitch.
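
Pseudo-ish PHP for that submit handler, assuming the table layout and source modules sketched above (all the names here are made up, adapt them to your own setup):

<?php
// generator.php - paste "domain.com|Primary Keyword" pairs into a textarea named 'domains' and submit.
require_once 'Sources/youtube.php';   // ...plus the rest of your source modules

foreach (explode("\n", trim($_POST['domains'])) as $line) {
  list($domain, $keyword) = explode('|', trim($line));
  $domain  = strtolower(trim($domain));
  $keyword = trim($keyword);

  // Randomize the cosmetic variables once, so each site stays static afterwards.
  $bgcolor = sprintf('#%06X', mt_rand(0, 0xFFFFFF));

  mysql_query("INSERT INTO sites (domain, keyword, bgcolor, sitetitle, heading1) VALUES ('"
    . mysql_real_escape_string($domain) . "','"
    . mysql_real_escape_string($keyword) . "','$bgcolor','"
    . mysql_real_escape_string($keyword . ' Fan Site') . "','"
    . mysql_real_escape_string($keyword . ' Videos, Wallpapers and More') . "')");

  // Let each source module scrape and store its content for this domain.
  source_youtube($domain, $keyword);
  // source_audioscrobbler($domain, $keyword); ...and so on
}
?>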

BUT! Before I show you how I’ll give you a few examples of how I would setup my tables for each of the sources so you can get a better idea.
For my Youtube I’d probably keep it simple and just do the domain and the embed code.
Domain|EmbedCode

Audioscrobbler:
Domain|BandDescription|Albums|PrimaryImage

YahooImages
Domain|PathToImage

GoogleBlogSearch
ID|Domain|PostTitle|PostDescription|PostLink

Alexa
Domain|RelatedSite1|MySite1|RelatedSite2|MySite2|RelatedSite3|MySite3|MoneySite1

*The MySite1 field would be another random fan site from your list of domains. The MoneySite1 would be a money site of yours you can insert later to help with upward linking ;) These are foundation sites after all.

So simple even a retarded piss bucket like yourself can figure it out :)

Scripting The Sites
I know some of you are going to talk about dedicated IPs for each site and various other expensive ways to make sure the sites don't get associated with each other, but there was a good reason I said to use addon domains, although there are other more complicated and better solutions. The first thing you should do when scripting the index page is to grab the current domain using an environmental variable such as HTTP_HOST. Once you have the domain name you can use that to grab all the appropriate data for each domain, and you only have to code one site and get it to work for ALL the sites in the mininet. For instance if someone goes to JayZFanzSite.com it'll grab that into the variable and customize the entire site to be a Jay-Z fan site, even though it's all the same script controlling all the addon domains.

I always start with the main page and branch all my subpages off that. For instance for the JayZFanzSite.com I would put in a section for Jay-Z Music Videos and link to More Jay-Z Music Videos (the Jay-Z being that domain's primary keyword as specified in the DB). That Jay-Z Music Videos subpage would just be more previously scraped music videos from Youtube. The same would be done for the Jay-Z Wallpapers, Jay-Z Discography, Jay-Z Lyrics, Jay-Z Guitar Tabs... whatever sources I'm using. Each would be a small section on the main page and would expand into its own subpage, which would target popular keywords for that artist. Once all that is done and built into the template you can test each change among all the current test domains you have to make sure each shows up nicely and the randomizations and backgrounds are all static and neat for each site. Be sure to put in a place for your Alexa similar sites and, as shown above, mix in links to your other fan sites for each band/musician as well as some placements for your current and future money sites so they can all get good link volume. Once every test site looks pretty and is fully functional along with fairly unique content, all you have to do is scale up with more domains.
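
A minimal sketch of that index.php, just to show the shape of the HTTP_HOST trick (table and field names follow the earlier sketches and are assumptions):

<?php
// index.php - one script serves every addon domain; the Host header decides which site to render.
$domain = strtolower(preg_replace('/^www\./', '', $_SERVER['HTTP_HOST']));

$result = mysql_query("SELECT * FROM sites WHERE domain = '" . mysql_real_escape_string($domain) . "' LIMIT 1");
$site   = mysql_fetch_assoc($result);
if (!$site) { header('HTTP/1.0 404 Not Found'); exit; }

// Grab this domain's scraped content, then hand everything to the universal template.
$videos    = mysql_query("SELECT embedcode FROM youtube WHERE domain = '" . mysql_real_escape_string($domain) . "' LIMIT 3");
$pagetitle = $site['keyword'] . ' Fan Site';
include 'template.php';
?>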

BUT FIRST! I like to incorporate ways for each site to build links for itself. For instance, for the Google Blogsearch posts I'd put a section on the main page for Jay-Z News listing the most recent 25 blog post results or so. Then I would build a small cronjob script to update it every day with 25 new posts or so and do a pingback on each to score a few unique links from other related sites every day automatically. This way you not only have lateral links from other sites on the mininet but also links from outside sites, and the links are always growing slowly, so each site can continue to grow in rank and traffic over time.
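
The pingback half of that cron script could look something like this. Endpoint discovery here leans on the X-Pingback header and PHP's xmlrpc extension; treat it as a sketch rather than gospel.

<?php
// cron_news.php - run daily: after storing the fresh Blogsearch results, ping each post so it (hopefully) links back.
function send_pingback($source, $target) {
  $headers = @get_headers($target, 1);
  if (empty($headers['X-Pingback'])) return;   // the post doesn't advertise a pingback server

  $request = xmlrpc_encode_request('pingback.ping', array($source, $target));
  $ctx = stream_context_create(array('http' => array(
    'method'  => 'POST',
    'header'  => "Content-Type: text/xml\r\n",
    'content' => $request,
  )));
  @file_get_contents($headers['X-Pingback'], false, $ctx);
}

// $postLink comes from the Blogsearch RSS you scraped; $myNewsPage is the news page quoting and linking it.
send_pingback($myNewsPage, $postLink);
?>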

Buying More Domains and Scaling Up
As indicated I like to keep it simple and pick a prefix or suffix that has many open domains. That way I don't have to spend a ton of time picking out the domains; I can just grab a list of several thousand popular bands, mass buy the domains, then copy and paste them into the backend generator. Boom! Several hundred to, if you're bold enough, thousands of new sites. All of which will grab quite a bit of underexposed traffic from keywords, image search and links. It will also give you tons of links and awesome pagerank for future sites you build. It's a lot of work initially, but it's definitely easier than hand building all those sites yourself, and the sites can easily become just as successful as if you did, especially if you did a good job with your sources. Once you've scaled up that mininet to a level you're comfortable with and can maintain financially (it helps to build a monetization module into the sites so you can easily switch out ads among all of them and create the most money possible per site), you can switch to a new group of sites using the same code, many of the same sources, and the same method. The music fan site example is great because nearly the exact same code can be used in so many ways. For instance I can take the same damn code, get rid of the Audioscrobbler and swap it for a widely available car DB for the description, image and car specs, and build a whole mininet for every single make and model of car out there with a whole new set of domains such as JaguarXJ220specs.com, BMW540specs.com, PontiacGrandPrixspecs.com. It's as easy as swapping out the keywords used in the modules so they become Pontiac Grand Prix Videos (from the Youtube source) and Pontiac Grand Prix Wallpapers/Images. All you need is a new template and a new group of domains to build an absolutely massive and diverse mininet that is actually profitable and self growing.

PS. I know I said HUNDREDS and THOUSANDS of sites all dramatically, but as with all techniques start off small. Get your scripts and promotion right. Make sure it works and is profitable on a per site basis before scaling up to any ridiculous levels.

LATA JERKS!

Blue Hat Techniques | 29 Jul 2008 12:52 am

Summer You Never Even Really Gave Yourself enough time. :)

There was a bit of confusion with my cycle sites technique illustrated in the SEO Empire Part 1 post. I used autoblogs as an easy to understand example. Autoblogs generate links quickly to themselves and can be cycled (redirected) to a source to push those links. Therefore by the definition:

Cycle Site - A site that automatically gains links to itself and then through a redirection passes that link value to another site.

an autoblog is a perfect example of a Cycle Site. However, an Autoblog by itself is not a Cycle Site and a Cycle Site is not just an Autoblog. Any site that quickly gains links to itself and is capable of redirection can be used as a Cycle Site.

In contrast, as we all remember, a Link Laundering Site is a site that has an ability to gain links not just to itself but directly to another site. In the post I used a reciprocal link directory as an example, however really almost any platform can be used to launder links. I haven't actually heard of anyone getting confused about the differences between the two techniques, but I also haven't heard very much discussion pertaining to the extremely close relationship they share. These two techniques, more so than some of the other techniques on this site, are very closely related. Inherently a link laundering site takes precedence over a cycle site. Why? Because if a site can constantly feed link value to another site without having to cycle out and close down, even if it's only for a short while, then it's worth more as a link builder.

Therefore a Link Laundering Site should be used over a Cycle Site whenever possible. The Cycle Site simply gives you more structures to gain links with than Link Laundering Sites can provide. Since you're not worried about people liking the site and continuing its success, you are able to build links to it much quicker. There is however a happy medium between the two techniques that can give Cycle Sites link laundering stability and Link Laundering Sites Cycle Site power. This technique is called Cyclic Documents, it's exactly what it sounds like, and it is very powerful.

Cyclic Documents
A Cyclic Document is a document or link given to a user of a Cycle or Link Laundering site that waits for a given set of links, or a given amount of time, before it cycles (typically redirects) to its target.

The Premise
The idea is very simple. Instead of handing out a link to your main page, which you'd eventually have to cycle out and thus lose the link building power of, the link you give out to gain links goes to a secondary document and/or a redirect to the main page, where it's viewed as either just an obscurity, a method of tracking, or not even noticed at all. To help remove the confusion and to help differentiate the actual example from the technique itself I'm going to do the methodology portion twice. The first will be with the classic Autoblog example given in the SEO Empire post, the next with a random structure.

Methodology 1 - The Autoblog
1) If you're using Wordpress or a similar platform for your Autoblog, do a simple modification to the code that adds a conditional with a redirect, checking for a certain string in the post's title that would normally never exist in a post title.
<?php
// In the theme template, before any output: if the cron script has swapped this post's title to the magic string, cycle it out with a redirect.
if ($post_title == "elineverpostswhatabitch") {
    header('Location: http://www.mytargetsite.com');
    exit;
}
?>

2) Create a cronjob script that parses through the previous posts on the Autoblog and finds posts beyond a certain age. Use the actual MySQL database, don't just write a scraper! I'm just going to throw my recommendation out there and you can adjust and make your own necessary changes to it based upon your experience and best judgement: 8 days.

3) If it finds a post past the set number of days, have it change the title to the unique title you picked in step 1.
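
A bare-bones version of that cronjob, assuming a stock WordPress table prefix and the 8 day window (swap in whatever magic title you picked in step 1):

<?php
// cycle_posts.php - run from cron once a day. Any published post older than 8 days gets its title swapped to the magic string the template redirect looks for.
mysql_connect('localhost', 'wp_user', 'password');
mysql_select_db('wordpress');

mysql_query("UPDATE wp_posts
             SET post_title = 'elineverpostswhatabitch'
             WHERE post_status = 'publish'
               AND post_title != 'elineverpostswhatabitch'
               AND post_date < DATE_SUB(NOW(), INTERVAL 8 DAY)");
?>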

What Does This Do?
Your Autoblog will create posts based on RSS feeds (typically). It will then do pingbacks and gather at least one link to each post. I say at least one because odds are this new breed of comment scrapers will pick it up too. :) The author of the original blog post may check for the link, hopefully within your set number of days, see his link, and hopefully leave the trackback alive on his site. After the author no longer cares about the link and has forgotten about it, but before the search engines have had a chance to index the page, it will cycle that single post to your new target site, thus giving it +1 link. This is why I never mentioned robots.txt in the original technique post. I wasn't hiding something fantastic from ya after all. :)

Methodology 2 - The Image Upload Site
1) On your image upload site, when the user uploads their image and you give them back the link code, instead of linking to www.myimagehoster.com have it link to a sequential numeric subdirectory or subpage, e.g. www.myimagehoster.com/10.

2) Through mod-rewrite have every /# (i.e. /[0-9]+) URL pull a script. In this script have it read in a variable saying what number it's currently on and increment it, then make every numeric URL at or below that number redirect to your target site. This sounds more complicated than it really is. Really, all you're doing is recording the number 1 to a file or DB or something, and every so often having it change to the next number up, which in this case is 2. From that point on every /1 and /2 link automatically redirects to your target site, thus giving it its hoped-for link (assuming the person kept the image code intact). Based on the popularity of the site you can increment the number faster or slower and redirect more links at a faster rate.
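
Here's a rough sketch of how that could hang together, with the rewrite rule shown as a comment (the file names and flat-file counter are just assumptions):

<?php
// cycle.php - pointed at by a rewrite rule along the lines of:
//   RewriteRule ^([0-9]+)$ /cycle.php?n=$1 [L]
// A single counter decides which /N links have "cycled": anything at or below it redirects to the target site, anything above it still shows the hosted image page.
$n       = (int) $_GET['n'];
$counter = (int) @file_get_contents('counter.txt');   // bump this number on a cron

if ($n > 0 && $n <= $counter) {
    header('Location: http://www.mytargetsite.com');
    exit;
}

// Otherwise serve the normal image page for upload #$n as usual.
include 'show_image.php';   // your existing image page, whatever it's called
?>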

What Does This Do?
Your image upload site used to be a Cycle Site: it would work for a while and eventually gather tons of links very quickly, then cycle out and generate no links for a period before you'd bring it back. Now you keep it going forever, and instead of destroying its momentum you can use it to gather even more links, faster than you ever could before. After so long people will forget the link code and not click on it. That's the prime time to have your link change out. You can also control your rankings, i.e. if your image upload site ranks for terms that give it a ton of traffic and you know X amount of links are required to maintain those rankings, you can maintain that amount of links, keeping your momentum at its maximum and yet still producing equally high volumes of links to your target. Also, I could very easily have used the link directory and software directory site examples from the link laundering technique with this same methodology.

Now for Jebus sake don't go creating a shit ton of image upload sites or Autoblogs like what happened when SEO Empire came out. <- Even my other blogs on other subjects were getting hit by hundreds at a time. Use some creativity, as I typically encourage you to do. You won't ever get rich doing the direct examples gurus give you, and you won't with me either. Most of all, have fun and learn a lot from it. :)

Blue Hat Techniques | 19 May 2008 01:24 am

Holy cripes! It's been awhile since I've sat down and written a Blue Hat Technique. It just so happens I need this one for the next SEO Empire post. I'm like blah blah talking about Keyword Spinning, then I realized you guys have no fuckin' clue what I'm yammering about. So I figure now's a good time to fix all that, and luckily this one is really really easy, but like all Blue Hat Techniques it works like a mofo in many situations.

The Problem
Let's say you have a database driven website. A great example would be a Madlib Site or an E-commerce site. In fact this technique works so damn well with Ecom sites it should be illegal alongside public urination. So we'll use that as our primary example. You've got your site set up and each product page/joint page has its keywords, such as "17 Inch Water Pipes For Sale", and the page titles and headers match accordingly. You have several thousand pages/products put together and well SEO'd, but it's impossible to monitor and manually tweak each one, especially since most of the keyword research tools available aren't entirely accurate about keyword order. Like they may say "Myspace Pimps" gets 50 billion searches a day when really "Pimps On Myspace" is getting them. So while amongst your thousands of pages you have one page that could be ranking for a solid phrase and getting an extra 100 visitors/day from people searching for "Water Pipes For Sale 17 Inch", you're stuck with virtually no search traffic to that page and never knowing the difference. It's quite the dilemma and you probably realize that it's more than likely already happening to you. Luckily it's easily fixed with a simple tool you can create yourself to fit whatever needs and sites you have.

Methodology
1) Add an extra field to all your database entries. For any row that creates a page of some sort, add an extra field called TrafficCount or something you can remember.

2) Add a snippet of code into your template or CMS that counts each pageview coming from a Goohoomsn referrer and increments the appropriate field.

3) Wait a month….*Goes for a bike ride*

4) Call the titles in the database. It can only be assumed, even in a commercial/free CMS that the titles or keywords are held somewhere in the database. Locate them and scan through them one by one.

5) Use the Google/Yahoo/MSN APIs to see if the page ranks for its keywords.

6) If it does rank, then look at the traffic count for the month and compare it to some sort of threshold you've preset. I prefer to use a really small number like 5 for the first month or two, then start moving it up as needed. If the traffic is too low, then split up the titles/keywords and randomly reorganize them.

*Sometimes you'll end up with some really messed up titles like "Pipes Sale Water For Inch 17", so if it's too un-user-friendly you may want to make a few adjustments, such as never putting a For/The/If/At type word at the front, or never rearranging the front two words so that, say, "Water Pipes" always stays in the front and only the trailing end gets shuffled (there's a rough sketch of this shuffle after these steps). Once again it depends on how your site is already organized.

7) Reset the traffic count.

8) Wait another month and watch your search traffic slowly rise. Every month the site will get more and more efficient and pull more and more deep traffic. The pages that are already good will not change, and the poor performing pages will start becoming higher performing pages. As an added bonus it will help improve your site's freshness factors.

9) Take a scan of your average number of keywords or title sizes. Let's say your average page has very short key phrases such as "Large Beer Mugs." There are only so many combinations those keywords will produce, so if it's just a low traffic keyword there's no point in continually changing the titles every single month forever. So I like to only have the Keyword Spinning script run for a preset number of months on each site. For instance if my average keyword length is three words, then the most combinations I can have is six, so I should logically quit running after 6-8 months. At which point my site is about as perfect as it can be without human intervention. Lastly, don't forget to make improvements to your CTR.
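
If it helps, the spin itself boils down to something like this little function (deliberately dumb; it just reshuffles until the new title doesn't start with a filler word, and you could also anchor the first two words if that suits your site better):

<?php
function spin_title($title) {
  $stop  = array('for', 'the', 'if', 'at', 'in', 'on', 'of');
  $words = explode(' ', $title);

  // Re-shuffle until the front isn't a filler word; give up after a few tries so a weird title can't loop forever.
  for ($i = 0; $i < 20; $i++) {
    $candidate = $words;
    shuffle($candidate);
    if (!in_array(strtolower($candidate[0]), $stop)) {
      return implode(' ', $candidate);
    }
  }
  return $title;   // couldn't find a clean spin, leave it alone
}

// "17 Inch Water Pipes For Sale" might come back as "Water Pipes Sale 17 For Inch".
echo spin_title('17 Inch Water Pipes For Sale');
?>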

Simple huh! Keyword Spinning is a really easy way to get the most out of nearly all your sites. The more you can squeeze out of each site the fewer sites you have to build to reach your profit goals. With minimal scripting it's all very quick to implement and automate (please don't do it by hand!). That's all there is to it. :)

Usually with my Blue Hat Techniques I like to drop a little hint somewhere in them that references a really cool trick or spin to the method that'll greatly increase your profits. Since you've been all so damn patient about me being late on the SEO Empire part 2 post, and, for the moment at least, quit asking me why Blue Hat sucks now, I'll just tell it to ya. My answer to that question BTW is that I'm still working on my projects, which is eating up some time, and I'm not happy with what I've written so far. If I'm not happy, it doesn't get published. Sorry, but the boss has spoken. :)

The Secret Hint

3) Wait a month….*Goes for a bike ride*

Use this technique on your Cycle Sites that you've chosen not to cycle out. Instead of competing with the original author, who you are probably linking to might I add, you can sometimes grab even better phrases and rank for them, giving you a ton more traffic (I've seen Cycle Sites increase their SE traffic over 50x by doing this). If not, then you'll eventually get their original title again, which at least will put you where you started. It's also the strangest damn thing: you'll get a percentage fewer complaints and pissed off bloggers when you switch the titles around. Maybe they don't care as much when they don't see you ranking for their post titles.

Blue Hat Techniques | 20 May 2007 08:54 pm

Alrighty I’m moving this post up a bit to answer a few questions. In my Real Life SEO Example post I talked a bit about the technique of Log Link Matching. It’s an awesome technique that deserves a bit of attention. So here we go. :)

Description
The reality of the search engines is that they only have a certain percentage of the documents on the web indexed. This is apparent from looking at the saturation levels of your own sites. Often you're very lucky if you get 80% of a large site indexed. Unfortunately this means that tons upon tons of the natural links out there aren't counting and giving proper credit to you and their respective targets. This is a double-edged sword: it means your competitors actually have quite a bit more links than it appears, and more than likely so do you. Naturally you can guess what has to be done. :)

Objective
Saturation usually refers to how many pages you have in the index in comparison to the total number of actual pages on your site. For instance if you have a 100 page site and 44 pages are indexed, then you have 44% saturation. Since this is a topic that never really gets talked about, for the sake of making it easy on ourselves I'm going to refer to our goal as "link saturation": the number of links you have showing in the index in comparison to your total actual inbound links. So if you have 100 links in the index but you really have 200 identifiable actual links, then you have 50% link saturation. That aside, our objective is to use methods of early detection to quickly identify inbound links to our sites, get them indexed, and if possible give them a bit of link power so the link to our site will count for more. This will have the ultimate end result of huge efficiency in our link building campaigns. It will also more than likely stir up a large percentage of long dormant links on our older sites that have yet to use the Log Link Matching technique. First let's focus on links we've already missed by taking a look at our log files.

Methodology #1 - The Log Files
Our site's common log files are a great indicator of new and old inbound links that the search engines may have missed. Most log files are usually located below the root of the public html folder. If you're on a standard cPanel setup the path to the log file can be easily found by downloading and viewing your Awstats config file, which is usually located in /tmp/awstats/awstats.domain.com.conf. Around line 35 it'll tell you the path of the log file: LogFile="/usr/local/apache/domlogs/domain.com". Typically your site, as a Linux user, has access to this file and can read it through a script. If not, then contact your hosting provider and ask for read access to the log.

1) Open up the log file in a text editor and identify where all the referrers are, then parse them out so you have a nice list of all the sites that link to you. If you use Textpad you can click Tools - Sort - Delete Duplicate Lines - OK. That will clean up the huge list and organize it into a manageable size. (If you'd rather script this step, there's a rough sketch a couple paragraphs down.)

2) Once you have your list of links there are several routes you can take to get them indexed. These include, but are not limited to, creating a third party rolling site map, roll over sites, or even distributing the links through blogrolls within your network. Those of course are the more complicated ways of doing it and also the most work intensive, but they're by far the most effective, simply because they involve using direct static links. The simplest of course would be to simply ping Blog Aggregators like the ones listed on Pingomatic or Pingoat. My recommendation is, if you are only getting a couple dozen links/day or are getting a huge volume of links (200+/day), then use the static link methods because they are more efficient and can be monitored more closely. If you're somewhere in between, then there's no reason you can't just keep it simple and continuously ping Blog Aggregators and hope a high percentage will eventually get indexed. After so many pings they will all eventually get in anyways. It may just take awhile and is harder to monitor (one of the biggest hatreds in my life..hehe).

There are several Windows applications that can help you mass ping this list of referral URLs. Since I use custom scripts instead of a single Windows app myself, I have no strong recommendations for one, but feel free to browse around and find one you like. Another suggestion I have to help clean up your list a bit is to strip it of any common referrers such as Google, MSN, and Yahoo referrals. That'll at least save you a ton of wasted CPU time. Once you've gotten this taken care of you'll want to start considering an automated way of doing this for any new links as they come in. I've got a few suggestions for this as well.
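
Speaking of scripting it, here's roughly what pulling unique external referrers out of a combined-format Apache log looks like (the log path and skip list below are just examples):

<?php
// Pull every unique external referrer out of a combined-format Apache log.
$mydomain  = 'mydomain.com';
$skip      = '/google\.|yahoo\.|msn\.|live\./i';   // engines and other noise you don't want to ping
$referrers = array();

foreach (file('/usr/local/apache/domlogs/mydomain.com') as $line) {
  // Combined log format ends with: "referrer" "user-agent"
  if (!preg_match('/"([^"]*)" "[^"]*"$/', trim($line), $m)) continue;
  $ref = $m[1];

  if ($ref == '-' || $ref == '') continue;             // direct hits
  if (stripos($ref, $mydomain) !== false) continue;    // internal traffic
  if (preg_match($skip, $ref)) continue;               // search engine referrals

  $referrers[$ref] = true;
}

// One referrer per line, duplicates already gone; feed this to your pinger.
echo implode("\n", array_keys($referrers));
?>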

Methodology #2 - Direct Referrals
Of course you can continue to do the method above to monitor for new referrals as long as you keep the list clean of duplicates. However it doesn’t hurt to consider accomplishing the task upon arrival. I talked a little bit about this last year with my Blog Ping Hack post, and the same principle applies except instead of pinging the current page we’ll ping the referral if it exists.

1) First check to see if a referral exists when the user displays the page. If it does exist, then have it open up the form submit for a place such as Pingomatic to automatically ping all the services using the user's browser. Here's a few examples of how to do it in various languages.

CGI CODE
# Only ping when there IS a referrer and it isn't from our own domain.
if(($ENV{'HTTP_REFERER'} ne "") && ($ENV{'HTTP_REFERER'} !~ m/http:\/\/(www\.)?$mydomain\//)) {
print qq~<iframe src="http://pingomatic.com/ping/?title=$title&blogurl=$ENV{'HTTP_REFERER'}&rssurl=$ENV{'HTTP_REFERER'}&chk_weblogscom=on&chk_blogs=on&chk_technorati=on&chk_feedburner=on&chk_syndic8=on&chk_newsgator=on&chk_feedster=on&chk_myyahoo=on&chk_pubsubcom=on&chk_blogdigger=on&chk_blogrolling=on&chk_blogstreet=on&chk_moreover=on&chk_weblogalot=on&chk_icerocket=on&chk_audioweblogs=on&chk_rubhub=on&chk_geourl=on&chk_a2b=on&chk_blogshares=on" border="0" width="1" height="1"></iframe>~;
}

PHP CODE
// Only ping when there IS a referrer and it isn't from our own domain.
if ($_SERVER['HTTP_REFERER'] != "" && !preg_match("#^http://(www\.)?$mydomain/#i", $_SERVER['HTTP_REFERER'])) {
echo '<iframe src="http://pingomatic.com/ping/?title=' . $title . '&blogurl=' . $_SERVER['HTTP_REFERER'] . '&rssurl=' . $_SERVER['HTTP_REFERER'] . '&chk_weblogscom=on&chk_blogs=on&chk_technorati=on&chk_feedburner=on&chk_syndic8=on&chk_newsgator=on&chk_feedster=on&chk_myyahoo=on&chk_pubsubcom=on&chk_blogdigger=on&chk_blogrolling=on&chk_blogstreet=on&chk_moreover=on&chk_weblogalot=on&chk_icerocket=on&chk_audioweblogs=on&chk_rubhub=on&chk_geourl=on&chk_a2b=on&chk_blogshares=on" border="0" width="1" height="1"></iframe>';
}

JAVASCRIPT CODE
I really don’t know. :) Can someone fill this in for me? It’s entirely possible I just don’t know Javascript regex well enough.

This will check to see if the referrer exists. If it does, and it's not a referrer from within your domain, then it'll display an invisible IFRAME that automatically submits the referrer to PingOMatic. If you wanted to get a bit advanced with it you could also filter out Google, MSN, and Yahoo referrers or any other unclean referrers you may get on a regular basis.

If you have an older site and you use this technique you'll probably be shocked as hell at how many actual links you already had. Like I mentioned in the other post, at first you'll start seeing your links tripling and even quadrupling, but as also mentioned it's just an illusion. You've had those links all along; they just didn't count since they weren't indexed in the engines. After that starts to plateau, as long as you keep it up you'll notice a considerable difference in the efficiency and accuracy of your link saturation campaigns. I really believe this technique should be done on almost every site you use to target search traffic. Link saturation is just too damn fundamental to be ignored. Yet, at the same time, it's very good for those of us who are aware that it is not a common practice. Just the difference between your link saturation percentage and your competitors' could be the difference between who outranks who.

Any other ideas for methods of early detection you can use to identify new inbound links? Technorati perhaps? How about ideas for ways to not only get the inbound links indexed but boost their credibility in an automated and efficient way? I didn't mention this, but when you're pinging or rolling the pages through your indexing sites it doesn't hurt to use YOUR anchor text. It won't help much, but it never hurts to push the relevancy factor of your own site to their pages while you're at it.

Blue Hat Techniques | 25 Mar 2007 06:25 pm

I've gotten a couple requests asking me to keep the Link Laundering, Madlib Sites, and Power Indexing Tips series going. I think that's a great Fuckin' idea. Let's not only do that but throw in a related Blue Hat Technique at the same time. This is a little technique I learned back in my warez and mp3 site days. You've probably seen it used before, but if you're like most marketers you've probably just skipped right by it without ever giving it a second thought. It's called Keyword Fluffing. It's fairly simple and works pretty damn well, especially if you have a large site (e.g. a Madlib Site).

Objective
We're going to fluff all of our individual pages' keywords with additional targeted long tailed phrases. We're going to do this by creating a search box with static results and interlinking to those results from the appropriate pages. This, in a sense, will attempt to triple or quadruple your long tailed search traffic. That of course is an unrealistic performance expectation, but it will work and help quite a bit. Worthy of mention, there is an extremely blackhat version of this technique called Keyword Drafting, but for this post we'll keep it very white hat and by the books. Yes, many major sites use this technique and it's well within the rules. :)

The Process
1) Create a search feature on your site. Using Mod-Rewrite have it print the results to a separate subdirectory. For instance the results for the search “My Keyword” will result in a static page of the results located at www.myexample.com/search/mykeyword.html.

2) Pick up to five keywords related to your site's niche to fluff. These will need to be common keywords that people looking for your site may tack on to their search. As an example, many software directories and crackz sites use the terms download, crack, keygen, & serial to fluff. So when they have a page targeting "Adobe Photoshop", that page will also fluff for the terms "Adobe Photoshop Download", "Adobe Photoshop Crack" and "Adobe Photoshop Keygen", many of which are common phrases people might add on to their search terms in the engines.

3) On each individual page on your site at the bottom put a little Div that says “Related Searches” or something similar. Then put in a link to the search results for each of those long tailed phrases. For instance the Adobe Photoshop page will have a link to www.mydomain.com/search/adobephotoshopserial.html with the anchor text “Adobe Photoshop Serial.” Be sure to make these links crawlable and pass PR. You will want them to get crawled and indexed so they can start ranking for those individual terms.

4) Make your site's search box record the most recent searches. For added value, on your main page put up a link to the "Recent Searches." Be sure to filter out unwanted html tags and inappropriate words. You don't want people abusing this feature. You do however want to start gaining some extra targeted phrases you may not have thought of in the indexes.
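
A rough sketch of the moving parts, with the rewrite rule as a comment (the slug format and file names are just assumptions to match the examples above):

<?php
// The "Related Searches" block that goes at the bottom of each product page.
// Each link points at a static-looking search results URL that a rewrite rule like
//   RewriteRule ^search/(.+)\.html$ /search.php?q=$1 [L]
// turns back into a real site search, so /search/adobephotoshopserial.html is crawlable.
function related_searches($pagekeyword, $fluff) {
  $out = '<div class="related">Related Searches: ';
  foreach ($fluff as $term) {
    $phrase = $pagekeyword . ' ' . $term;
    $slug   = str_replace(' ', '', strtolower($phrase));
    $out   .= '<a href="/search/' . $slug . '.html">' . htmlspecialchars($phrase) . '</a> ';
  }
  return $out . '</div>';
}

// On the Adobe Photoshop page:
echo related_searches('Adobe Photoshop', array('Download', 'Crack', 'Keygen', 'Serial'));
?>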

The real trick to this technique is to scale it according to your site's current indexing power. I'd recommend you don't just immediately implement this right off the get go. A rule of thumb I use is to wait till my site has reached at least 60% saturation in at least 2 major engines.

For this technique I used an example that I thought people may have openly noticed. You have probably heard the recent news that Youtube announced they have quit using this technique, not because it's against the Google TOS but because it was "unfair to the integrity of their results." Meaning it worked too damn well, and other more relevant sites couldn't compete against their saturation levels. There are of course other more prominent examples I could use, but for the sake of exercise I encourage you to reread my Madlib Sites post and think about the advanced version of this technique, Keyword Fluffing With Replacement. I'll give you an example directly. In the madlib post we targeted the phrase "Dating in ___." The blank represented a targeted geolocation (i.e. New York). Consider replacing the "Dating In" with "Single Women In" or "Single Men In." That way you not only target the several thousand phrases related to the actual dating terms but you also get the substitutions for people using different versions of the searches.

There you go! You just more than tripled your saturation and keyword targeting. With any luck and time this will bring in quite a bit more organic search traffic. Hell, who needs luck. :)

Blue Hat Techniques | 27 Jan 2007 02:03 am

Alrighty so we're going to go a bit old school today with this technique. Link laundering is an old technique that the pros have been using aggressively in tight circles since around 2001. So it's nothing new, but it works damn well. A true Internet marketing professional doesn't just focus on building their money making sites; they expend a lot of energy on building their link gathering power, and that's exactly what this technique does. If used properly this technique will allow you to automatically gather unlimited links over time to any present and future projects you produce. This is where the true beauty lies. It allows you to cut your mass link gathering time for each project dramatically so you have more time for developing other projects.

Objective
What we’re going to do is create a network of third party link laundering sites that will give other developers an incentive to link to our primary money making sites instead of back to our link laundering sites. We’re going to use these sites to our advantage by automating the process of distributing their inbound link gathering abilities to push links to our primary site networks. Don’t worry it’s easier than it sounds. :)

What Is A Link Laundering Site?
Simply, a Link Laundering Site is a niche site that requires a link back from other webmasters, except that the link back it requires is to another site of yours instead of to the site itself.

Methodology
First we're going to want to grab a solid platform for our link laundering sites. There are literally unlimited ideas on how to accomplish this; all you have to do is come up with some ways to get webmasters to link back to your site. The reality is that any site with a link exchange script will work, but since our focus is on drawing large link volume we'll have to get a bit creative. This requires some brainstorming, but to get you started I'll list an example and we can get creative from there. One idea would be to create your own software download directory. Software download directories are great. They require very little promotion before they spread to all the software submitters, and they tend to get a lot of high pagerank software sites that want to submit their software to your directory for the backlink and promotion. With minimal advertising and perhaps a link in DMOZ you should start getting at least a couple dozen submissions a day to your software directory. First you'll have to build one though. You can get a free script for this from the Association of Shareware Professionals. They are the ones who regulate shareware directories and pad file standards (pad files are the files that describe your software). They offer a free one called Pad Kit. On a side note, you'll notice that they discontinued the distribution of Pad Kit, so it's nearly impossible to find a recent copy. So I'm going to share mine, since it is probably the only working copy available to download until they come out with a new version. God knows when that will be. :(

Download Pad Kit v2.00.09

Next you'll need to do a bit of modifying to submit.php (the page they go to to submit their software info). Make it require a link back in order for the submission to go through. Make sure to inform the submitter that they will need to place the link on their site before the "pad file" submission will go through, and give them an input to put in the URL where your link back is located. Be sure to give them an easy textarea with the html already there for them to copy and paste into their site. This should be simple enough. This is also where it gets interesting and we turn this simple software directory into a link laundering site. Create a mysql or flatfile (text file) database on another domain. Inside that database put the urls, anchor text, and link count (starting with a default value of 0) of your real money sites. Have submit.php pull from the database before they go to submit. Have it grab the link with the lowest link count and put that in as the link info. In other words, in order to submit their software to your site they will first need to link to www.MyMoneySite1.com. You're finally starting to see where this is going aren't you? *evil grin*
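
The interesting bit of that modified submit.php is only a few lines. A rough sketch, assuming a central moneysites table with url, anchor and linkcount columns:

<?php
// Inside submit.php: before showing the form, pull the money site with the fewest laundered links from the central database (which lives on another domain, so connect remotely).
$db = mysql_connect('central-db.example.com', 'launder', 'password');
mysql_select_db('linklaunder', $db);

$row = mysql_fetch_assoc(mysql_query("SELECT id, url, anchor FROM moneysites ORDER BY linkcount ASC LIMIT 1"));

// The HTML the submitter is told to paste onto their site before submitting.
$linkcode = '<a href="' . $row['url'] . '">' . htmlspecialchars($row['anchor']) . '</a>';

// Once you've verified the link is actually live on their page, bump the counter so the next submitter gets pointed at a different money site.
mysql_query("UPDATE moneysites SET linkcount = linkcount + 1 WHERE id = " . (int)$row['id']);
?>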

So after a couple months or so, with a tad bit of advertising, your software directory will start getting a ton of daily submissions. Some will be manual and some will be through automated or semiautomated software, for instance Robosoft. Robosoft is the semiautomated way: it's all done manually but it fills the forms for you. They usually specify in advance where they will be placing any links from the directories that require them, then they place the links as they go down the list. Other tools automatically generate the link pages for you based on the linkback criteria of the individual software sites in their database. Then they just upload the link page to their sites and boom, you're done. The automated ones kind of suck because often the page they upload just sits on their site with no internal linking and won't ever make it into the search engines. In which case, record all the link pages that get submitted to you, then submit those pages to the search engines for them. Boom, your links are in! hehe

There ya go. You got yourself a link laundering site that sits there and distributes anywhere from a couple dozen to a couple hundred links a day to your real money making sites automatically. Don’t forget to put ads on your link laundering sites so it can at least pay for its own hosting while it boosts your other sites.

It Doesn’t Have To Be This Complicated

The software directory example is a tad complicated and requires a bit of experience. It's great because it easily generates a good solid volume of links, but it's really more of an example of the type of link laundering site a pro might create. I suggest starting smaller and simpler. As stated above, any site with a link exchange section will work. You could easily generate a few niche sites with some Wiki-worthy articles and just put in a link exchange script. Then just modify the link exchange script so the backlink it requires from the user is for one of your money sites listed in the database instead of the niche site itself. A great example of how to create these types of sites is in my How To Create Your Own Autoresponder Network post. Feel free to let the creative juices flow with this one. Just think volume volume volume. The more links you can automatically generate daily to your money sites through your link laundering site network the better. It'll take some serious time but trust me, IT PAYS BIG TIME IN THE END.

If you really start dedicating yourself to building your linking power by developing link laundering sites you’ll start seeing some serious results in the promotion of your money making sites. It also makes your life a lot easier when promoting future projects. You can just create a new money site and put it into your central database. From there it’ll automatically start getting inbound links for as long as you keep it in there. Also, once you approach the 10,000 links or so a day mark you can start bartering with your buddies. “Alright I’ll give your network 50 links/day if you’ll give me 50 links/day with your link laundering network.” This has the added benefit of being able to nitpick your niches. So if your buddy has a bunch of Niche-A related link laundering sites and you have a bunch of Niche-B related laundering sites you can trade generated links for your Niche-A related projects. Like I said, this technique is nothing new, people have been creating link laundering networks for years. Might as well get started having them work for you :)

Blue Hat Techniques | 10 Dec 2006 01:46 pm

As many of you might not know, I am also a computer tech. I get to help a lot of people with their computer problems and frankly it's a lot of fun. It's also very educational in the Internet marketing aspect. As experienced Internet users we sometimes lose sight of how the average Internet user behaves. We get so caught up in our own habits of long tailed search phrases and properly quoted terms that we often forget that a worthless search like the term "computers" gets over 200,000 searches/day. I learn more and more every year from the benefits of casually watching friends and family members use the computer. Watching how they browse the Internet, why they click on ads, and how they respond to things like error messages has huge benefits in designing money making campaigns online.

This entire technique focuses on just three common user habits that I’ve noticed over the years with my own clients.

1) Double Clicking - Ever watch your parents or grandparents use the computer? You'll probably notice one thing immediately. They double click everything. They double click icons, they double click Internet links, they even double click buttons. Double clicking is the first skill taught when learning how to use the computer, and teaching when double clicking is not required is often skipped, simply because it typically brings no harm.

2) The Homepage - Homepage changes aren't a big deal to the average Internet user. As long as the home page has a search and maybe a few useful items like news, they simply don't care about the switch or even bother changing it back. It's us experienced Internet users that tend to be hardcore about our homepage. Internet providers realize firsthand the power of this trait. Installing a new ISP will change the user's homepage to their homepage 90% of the time. I can't tell you how many times a client has commented about how much they "like this new Internet" when they switch providers and see the new homepage for the first time.

3) Decisions - Dialog boxes and error messages = decisions. Our familiarity with dialog boxes often makes us forget the power they hold over the typical Internet user. Try this: next time you're on the computer with a common user watching and a dialog box comes up, click out of it quickly without reading it, like you normally would, and watch the expression they make. It's usually something like "WTF? You didn't even read that! What if it was important?" It's funny because it's true.

The habits definitely don't stop with these three. The more you look the more you'll find. I just wanted to list some examples of the types of behaviors to look for, because the next step is obviously monetization. So how exactly are we going to use the power of these three habits for our own profit without, of course, harming the user? The trick is simple: turn something useless to them into something useful. How about starting with some made for Adsense sites? Then we'll take this one step further and turn one user click into recurring traffic & Adsense income. The following steps will dive even one step further down the rabbit hole and attempt to increase your click through rates on your MFA (Made for Adsense) sites.

The Process
1) Prepare your MFA sites. MFA sites are great for this technique because they draw the common Internet user and very rarely trick the experienced user. So since your MFA network is already up and drawing a gripload of inexperienced surfers, let's begin there.

2) Create a nice startpage. Give it some nice domain like mysmartstartpage.com. Make it pretty and easy to use. Include your Google search box code (through your Adsense account setup) prominently. Then include a bit of news links and possibly a local search. Don't forget to put a few inconspicuous ads on the site. Make it look like a really nice start page that you would use yourself if you weren't so obsessed with that giant Google logo. :) Check out Charter's homepage as a great example. It never hurts to learn from one of the best.

3) Insert some homepage change javascript code on your MFA sites. A plain document.setHomePage() call won't do anything on its own; the classic way (which only works in Internet Explorer, other browsers just ignore it) is to attach the default#homepage behavior first:

<body onload="document.body.style.behavior='url(#default#homepage)'; document.body.setHomePage('http://www.mywebsite.com');">

I recommend you put it as an onload() in the body tag as shown (see comments). That way the confirmation dialog will automatically pop up when the page loads.

4) Reposition your ads. The window.position.set properties are held in the registry and are typically never changed on a Windows XP computer. Through practice, try to reposition your ads so they sit directly underneath where the javascript dialog box will show on a typical 1024×768 or 800×600 resolution computer. Double check your results on several computers for accuracy.

What Will This Do?
The user will go to one of your MFA pages and get faced with a small dialog box asking them if they want to change their homepage to the new startpage you built. The experienced user will quickly click Cancel and be on their way. No big deal. The rest will click to get rid of the dialog box. A percentage will click OK. This percentage will become your recurring income for a long time. Every time they open their browser they will get your start page. It will be nice, so they usually won't care. You will get money from the searches they make on your Google Adsense search box. You will also get income from the ads placed on the start page. Over a long enough period of time your start page will start to build up its loyal userbase. The more it builds the more recurring money it makes.

What About The Percentage That Clicks Cancel?
Remember the first trait of the common Internet user that was mentioned? They click once: it gets rid of the dialog box. They make the second click to complete the double click: you get a click on your Adsense ad. Your CTR on your MFA page just went up! *Insert evil grin*

You never really lose.

Don’t Forget To Brand Your New Start Page
Another smart tactic to use would be to brand your new start page. Make it prominent but not in the user's way. This is important because as the user gets more familiar with the Internet and starts to really learn the ins and outs, they will start to become loyal to what they are used to. Since they are in the process of getting used to your start page, when it gets changed on them in the future, once they're smarter they will remember it and will be more apt to change it back to yours. This creates a long term win for you.

Blue Hat Techniques | 18 Oct 2006 06:06 am

So since my last lame post on What Are Proxy Servers?, the inexperienced promoters are now hopefully caught up with the wonderful world of proxy servers and loving them right about now. So I think this would be a great time for me to fuck up that pretty little proxy planet you're on and teach you how to exploit spammers using proxies for your own little spammy or even nonspammy purposes.

Cruel Intentions
The intent behind this post is to teach you how to use massive amounts of trusting spammers to help do your own little spam. Every good spammer knows public proxies are real hit and miss: some work, many don't. Yours will work, it'll just accomplish your work, not the spammer's. I'm going to assume that you've already done your research and know about public proxies. I'm also going to assume that you are well aware of the many public proxy lists out there. Every good web spammer has a huge solid list of proxies, and we're just going to fuck it up for the rest of them. Sound like fun? Good, it will be. :)

First Things First
1. Create yourself a public proxy. There is tons of proxy software out there.
CC Proxy is a good Windows based one.
A few open source proxy server packages
There are also TONS of good ones for Linux. I recommend using Linux if you are able to do so. Just feel free to do a bit of research on this step before you take the plunge.

2. Make the proxy server open. This means don't make it anonymous. An anonymous proxy server means your IP shows up for every request; make it so their IP directly makes the pull on every request. This is perfectly acceptable. Most spammers don't bother filtering their lists for ones that claim to be anonymous yet aren't. They just slam the list and clean out the failures.

3. Block EVERYTHING with your proxy server. You can easily block sites and domains with your new kick ass proxy server. You can also set what default site they are redirected to every time they make a request through your server and have it result in a successful pull. So just block everything and focus on the page all their requests will return with. I'll call this page your "return blocked page" because different proxy servers have different terms for it. Wanna see an example? Try using proxy 205.221.223.1 on port 80. It'll result in every site you go to returning the CAL Community School District website.

4. Build your return blocked page. This page will be your weapon. Whatever you would normally do through a proxy can be accomplished or even redirected through this page. Want to slam an advertiser's form? Want to post to guestbooks or blogs? How about just slamming a referrer list or even having them post a fake post on your forum? Whatever and however you want! The world is yours. Their IP accomplishes the task. No worries about CPU usage or failed proxy connections. The spammer's IP and scripts do all the work; your server just sends the redirect. Since you will eventually have hundreds to thousands of spammers using your server, the page pulls will seem natural and randomly timed. Aside from the absolutely huge amounts of them, of course. :)
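
Just to show how thin the return blocked page can be, here's a sketch that simply bounces every proxied request at a rotating list of URLs you want fetched from the spammer's IP (human visitors coming through a web proxy could be served iframes instead):

<?php
// blocked.php - what every request through the proxy gets handed back.
// The spammer's own script does the follow-up work from their IP; we just bounce them at whatever URL we want fetched. Rotate the tasks so the traffic looks organic instead of one URL getting hammered.
$tasks = array(
  'http://www.mymoneysite.com/',                        // plain hits for referrer/Alexa volume
  'http://www.mymoneysite.com/some-landing-page.html',  // spread it around a bit
);

header('Location: ' . $tasks[array_rand($tasks)], true, 302);
exit;
?>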

If creativity isn't your strong suit and you can't think of a good way to utilize this step, just settle with using it to raise your Alexa rank, or even have it search for your terms on Yahoo in one IFrame and then have it click on your link in another iframe. This should help Yahoo see your site as getting a higher CTR on your search terms. That got you grinning, didn't it? How about having it bookmark your site in social bookmark sites for quick indexing? *waves to SEOStomp*

5. Submit your awesome new "anonymous" public proxy server to some repositories. This will get it some normally unwanted attention and give it a little bit of time to spread around like cancer. Just remember to advertise it as an anonymous server. Most won't ever double check that. In fact their scripts will report back a successful pull every time on your server. It won't be until they actually check the content that they will stand a chance of figuring it out and removing your server from their list.
Here’s a few hundred to start out on.

There ya go. I could spend the next 100 pages giving you ideas on what to do with your now ultimate proxy server. By ultimate proxy server I didn't mean your actual server, I meant the chumps using your server and their virgin IP addresses. Just remember, no one EVER manually goes through their proxy lists and checks for these things. They may check the returned content to see if it's what they expected to get, but if it isn't, chances are they won't sit there and manually pull it and find out what you're doing. However, if you are paranoid about them figuring out the technique you're using, then feel free to be a little sneaky about your redirect or frames.

If you are a complete white hat and just want to have a little fun, feel free to boost your friends' population on their forums and such by having every single spammer you catch inadvertently sign up. :) Hell, I won't condone it, but it would be pretty funny to have them automatically email their ISP with what they are trying to do. Remember, email spammers use proxies too.

PS. I lied, I didn't sell out :) I'll keep writing Blue Hat Techniques as long as people keep finding use for them.

Blue Hat Techniques25 Aug 2006 07:09 pm

The Google Patent talks explicitly about the freshness factor and its importance. More importantly, it talks about staleness being a factor in the rankings. This concept is understood to be true not only for Google but for Yahoo and MSN as well. It mentions that not all sites in a niche need to be updated consistently. Some niches require more freshness while others require less. This naturally raises the question: how often should my site be updated for my niche? Too much and you could potentially hurt your rankings; too little and you will slowly lose the rank you worked so hard for.

One particular site of mine raised this question for me. It's a site that is 100% static and virtually never gets updated. It would exhibit some strange qualities in the organics that none of my other sites would. It would rank in the top 10 for all of its terms, then slowly, as the weeks rolled by, it would eventually drop into the bottom 30. I naturally considered the freshness factor. So I made a slight change to the title and added one page of content, linked to on the main page. Within 48 hours the site dropped out of sight in the rankings (100+ in Google, 70+ in Yahoo, and out of the top 300 in MSN). This frustrated me, but instead of changing it back I stuck with it. A week later it rose back up to the top 10. I was like COOL. So I let it be, and about 4 months down the road it started slowly dropping again. So I once again made a slight change to the title and added one page of content. The same exact thing happened. This forced me to further examine the pattern being displayed in an attempt to mimic it.

Obviously the search engines must get their data from somewhere to determine how fresh your site should be, so I took a close look at the competition in the niche. For a few weeks I studied their cached dates in the organics to see how often they were updated in the engines. There was a slight pattern between the frequency of the cached pages and the frequency of their updates, which spurred me to investigate that in a whole new light, which I documented in my #10 Blue Hat Technique - Teaching the Crawlers To Run. Overall, though, I couldn't tell exactly how often the sites would update. So I turned to Archive.org's lovable Way Back Machine. Alexa has a wonderful feature that puts a star next to the archive dates where the site itself actually changed, as opposed to dates where it was merely re-archived. This comes in very useful for determining how often sites in your niche should be updated. For instance, certain news sites require much more freshness (take a look at CNN) than a local government site like Oregon.gov.

So if you've got a site that holds high rankings for a term you don't want to lose, but you see it slowly start to trickle down the SERPs, you may want to take a closer look at the freshness factor. Determine which of your competitors are holding the steadiest rankings. Add them all to a list. Take a look at the sites themselves; many of them may date their newly added content. See if you can determine how often they update. Work out an average for the sites in the top 5 and an average for the sites in the bottom 10. Then look at the sites sitting consistently in the bottom 30. More often than not those positions are held by sites with quality SEO that have been deemed too stale to rank in the top 10. Attempt to make a prediction on how often your site should be updated, then make a prediction on how large your update should be. If you're unsure how large it should be, play it safe and make it very small. Remember, any title change forces the SEs to reevaluate your site's topic. This is a good thing and a bad thing. Typically the engines will drop you down in the rankings while they make the new evaluations, but you are sure to come back up if they determine it's still the same topic. This will cause your staleness factor to drop to 0 and your freshness to be high again. Make sure to note your rank at this point. This will be your target on every update.
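If eyeballing the Way Back Machine gets old, here's a rough PHP sketch of scripting the same check. It assumes Archive.org's CDX snapshot API is available and treats "the content digest changed between snapshots" as a stand-in for a real site update; the domain is a placeholder, and this is me approximating, not gospel.

<?php
// freshness_check.php - rough estimate of how often a competitor's site actually changes,
// using Archive.org's CDX API. Domain below is a placeholder.
$domain = 'competitor-example.com';
$api = 'http://web.archive.org/cdx/search/cdx?url=' . urlencode($domain)
     . '&output=json&fl=timestamp&collapse=digest&limit=500';

$rows = json_decode(file_get_contents($api), true);
if (!is_array($rows) || count($rows) < 3) {
    die("Not enough archive data for $domain\n");
}
array_shift($rows); // first row is just the field name header

$gaps = array();
for ($i = 1; $i < count($rows); $i++) {
    // Timestamps look like 20060825120000; the first 8 digits are YYYYMMDD.
    $prev   = strtotime(substr($rows[$i - 1][0], 0, 8));
    $curr   = strtotime(substr($rows[$i][0], 0, 8));
    $gaps[] = ($curr - $prev) / 86400; // gap in days between content changes
}

printf("%s changes roughly every %.1f days\n", $domain, array_sum($gaps) / count($gaps));
?>

Run it over your steady top-5 list and your bottom-10 list and compare the averages, same as described above.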
I know this behavior in the SERPs causes many webmasters to bang their heads furiously, but if you just sit back and watch carefully what is happening with your site, you are much better able to make an intelligent decision. Most newbie webmasters, when they see this effect happening on their site, panic and redo the entire site. This is definitely the WRONG MOVE. Even if you think you've been sandboxed, remember: your site deserved top 10 rankings at one point, and there is no reason why it shouldn't deserve them again. Be patient, be smart, and use your knowledge of the freshness factor to maintain your rank.

Blue Hat Techniques20 Jul 2006 04:30 pm

Once upon a time I actually stole this idea from a small software directory. I don't remember what site it was or even if it's still around, so I'm not going to bother attempting to give proper credit. Just know that this idea, although very rare and hardly used, is nothing new. It is in fact quite old and proven to work.

This technique accomplishes two VERY important things:
1) It improves the CTR (click-through rate) on your site within the organics. Simply put, it makes your site stand out more when people search for your terms. So if done right you can actually rank #2 in the search results and yet pull more traffic from the results than the #1 site. You see the advantage of this already. :)

2) Since Google began tracking click-through rates, it's naturally understood that they factor CTR into which sites they should display first. Yahoo has been doing this for years. It's very simple and obvious: if the #2 site gets a much higher percentage of click-throughs than the #1 site when visitors search for ____, then #2 must be a more relevant site for what the users are looking for. I haven't done any testing with Google, but I can tell you this for sure about Yahoo, and it can be assumed for Google as well: improving your CTR in the organics WILL improve your rankings over time.

How Is This Done?
The gist: improving your CTR in the organics comes down to making eye-catching adjustments to your title tag. Don't you wish there was a way to make your title tags show up as heading tags or giant bold letters in the SERPs? Yeah, keep dreaming buddy. That's impossible, but you can add common nontypable (not on the keyboard) English characters to your title tag to make it stand out above the rest of the results. Every writer knows that bullets and arrows draw the reader's eyes to the important key points of a paper. Why not use that on your website?

Usable Characters
Here are the usable characters that I have found will work:
^ ¢ £ ¤ ¥ § ¬ ° ± º ø þ
Also:
« note: Only works properly in Google

There are other non-English characters you can try that look great and stand out, but I have never used them for fear of the engines thinking my site is non-English. I suggest you take the same precaution. Use the characters above to form a title tag that stands out yet won't hurt your rankings. The nice thing about these characters is that you can put them in your title tag and the engines will pretty much ignore them when they rank your site. Other title changes will sometimes cause a temporary drop and then a rise in the SERPs while the engines try to work out whether your site changed topics; these seem to have absolutely no effect other than being displayed in the SERPs.
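If you're templating your titles, here's a tiny PHP sketch of dropping the characters in. I'm using HTML entities (&oslash; for ø, &curren; for ¤) so they render correctly no matter what charset your page declares; the variable name and value are made up, so wire it into your own setup however you like.

<?php
// Decorate a page title with eye-catching characters via HTML entities.
$page_title = 'Blue Hat SEO'; // example value only
$decorated  = '&oslash; ' . htmlspecialchars($page_title) . ' &oslash;'; // &oslash; renders as ø
echo "<title>$decorated</title>\n";
?>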

Here’s An Example
ø Blue Hat SEO ø
A stupid blog you should never read.

See the difference? If you'd like, test this on a site you don't care too much about and take an unbiased look at it on the results page. You'll notice your eye naturally gets drawn right to it. The biggest difference you will notice is the immediate traffic jump. Once the engines update your title tag in the SERPs you'll instantly notice more traffic from each phrase you rank for. I really don't mean to be selling you on this technique, but for how simple it is, the results are too amazing not to witness yourself.

What characters do I typically use?
I am personally a big fan of ø and ¤. They just seem to look nice when placed in front of and behind a title. So now that I've finally written up this technique, you people can quit emailing me and asking what's wrong with my giant "character rich" title tag. :)

Here is how the characters look in each of the engines. I put the « in front so you can see how it won't show up in Yahoo and MSN.

Me in Google

Me in MSN

Me in Yahoo

Further thoughts for PPC Campaigns
The whole point of this article wasn't to tell you to put characters into your title tag. It's to draw your attention to a very important aspect of SEO that normally gets overlooked. There are many ways to improve your CTR in the organic results. The best point I can possibly convey is to use your creativity. You're a webmaster, so I already know you're chock full of that. I would also like to add the significance of this for PPC campaigns. Why pay an extra 10 cents per click for the #1 spot when you can get the same traffic while paying for the #2 spot? It's not all about writing a good ad that describes your site. Use some creativity to also draw their eyes to your ad. This is a basic 101 concept from back in the banner CPM days that still applies today.
