LOL, be truthful. Did you honestly see this post coming? People wanted to know how the tool works, but I think I can do you all one better. I’ll explain in detail exactly how it works and how to build one for yourself so you can have your very own, hell, one to sell if you wanted. Would you guess there’s a demand for one? Haha, sure, why not? I can’t think of a single good reason why I shouldn’t (I never considered money a good enough reason to not help people). However, I would like to ask a small favor of you first. Please go to RobsTool.com and subscribe. Throughout this month we are adding a new section called “The Lab” inside the tool where we are going to be hosting a multitude of crazy and wacky SEO tools that you’ve probably never thought could exist. Even if you don’t have a membership, please subscribe anyway so you can get some cool ideas for tools to build yourself. That out of the way, let’s begin. :)

The Premise
SQUIRT works off of a very, very simple premise. Over the months you spend promoting your websites from their infancy to well-aged veterans, you make dozens of small decisions daily. All the tool does is mimic as many of those decisions as possible and boil each one down to a simple yes-or-no, true-or-false boolean expression. This is just very basic AI (Artificial Intelligence): decision making based on some data. There really is nothing complex about it. Think about the first time you promoted a website in the search engines. You looked at what you considered to be some of the factors in ranking. You saw what your competitors had that you didn’t. What was your initial reaction? You likely thought, “How can I get what they have?” A script can’t do this, of course. This is a human reaction that can’t be duplicated by a machine. However, now think about the second, fifth, tenth website you’ve promoted. Once again you looked at what your competitors had that you didn’t. From there you may have noticed your mindset changed from “How can I get it?” to something like, “How did I get it before?” This a machine can do! Experience just made the difference between a decision that needs to be made and a predefined decision that sets an orchestrated path. I know this all seems overwhelming, but I assure you it’s all really, really simple. The trick is to build a tool that does whatever you would do, based on the current situation. The situation, of course, can be defined by what we all know and study every day of our professional SEM lives: search engine factors. So the best place to begin is there.
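To make that concrete, here’s a bare-bones sketch of what boiling a decision down to a boolean looks like. I’ll use Python for the examples in this post; the numbers are made up and this isn’t SQUIRT’s actual code, just the shape of the idea:

```python
def needs_work(my_value, competitor_avg):
    """A ranking factor 'needs work' if we're below the competitor average."""
    return my_value < competitor_avg

# Example with made-up numbers: our link volume vs. the top-10 average.
print(needs_work(my_value=120, competitor_avg=450))   # True  -> work on it
print(needs_work(my_value=700, competitor_avg=450))   # False -> leave it alone
```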

A List Of Factors
Since the tool will make its decisions based on the things you consider to be factors search engines use to rank your sites, making a list of all the known factors is a big help. Sit down and write out every search engine factor you can think of. Break them down to specifics. Don’t just write “links.” Write link volume, link quality, links on unique domains, percentage of links with my anchor text, etc. The SQUIRT utility I released looks at 60 separate factors, so at least you have a general goal to shoot for. Come up with as many factors as possible. Once you’ve got a good clean list of factors, start figuring out a proper way to measure each of them.
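One way to keep that list usable is to pair every factor with a little function that measures it. This is a hypothetical skeleton, not SQUIRT’s actual code; the measurement functions are stubs you’d fill in with your own scrapers or whatever data source you have access to:

```python
# Each factor name maps to a function that measures it for a given domain.
# These are placeholder stubs -- fill them in however you gather the data.

def measure_link_volume(domain):
    raise NotImplementedError   # e.g. count what a link: query reports

def measure_indexed_pages(domain):
    raise NotImplementedError   # e.g. count what a site: query reports

def measure_anchor_text_pct(domain):
    raise NotImplementedError   # e.g. % of inbound links with exact anchor text

FACTORS = {
    "link volume": measure_link_volume,
    "indexed pages": measure_indexed_pages,
    "anchor text saturation": measure_anchor_text_pct,
    # ...keep going until every factor on your list is covered
}
```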

Factor Measurement
How many times today did you go to a search engine and type in link:www.domain.com? That is a measure of a factor. How about site:www.domain.com? That’s another. Each of these is a factor that, when explored by either looking at your own site or going to the search engines, results in some sort of figure you can use to calculate how your site fares in comparison to the sites currently ranking. Let’s use an example. You go to Google and search for the keywords you want to rank for. You make a list of all the sites in the top 10 and do a separate link: command for each of their domains. You then take all those figures and average them out. That gives you a rough idea of how much “link volume” you will need to get into the top 10. You then do a link: command on your own site to see how close your site is to that figure. From there you can make a decision: do I need to work on increasing my link volume factor or not? You just made a boolean decision based on a single factor using data available to you. It probably took you a good 5 minutes or more to make that decision, whereas a script could have made it for you in less than a second. Now I know you’re all just as much of a computer nerd as I am, so I don’t have to preach to you about the differences in efficiency between yourself and a machine, but at least think about the time you would spend making these very simple decisions for each and every factor on your list for each site you own. There goes a good five hours of your work day just making the predictable yes-or-no decisions on what needs to be done. This sounds ridiculous of course, but I’d be willing to bet that at least 90% of the people reading this post right now spend most of their time doing just that. Ever wonder why most search marketers just trudge along and can’t seem to get anywhere? Now you know.
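Here’s roughly what that link volume decision looks like as a script. Again a sketch: top_ten_domains and link_count are hypothetical helpers standing in for however you choose to pull the data, not real API calls and not the exact code behind SQUIRT:

```python
# Hypothetical helpers -- implement these with whatever scraper or data source
# you actually have access to.

def top_ten_domains(keyword):
    """The 10 domains currently ranking for the keyword."""
    raise NotImplementedError

def link_count(domain):
    """Whatever figure a link:domain query reports."""
    raise NotImplementedError

def needs_more_link_volume(my_domain, keyword):
    """The boolean decision: is our link volume below the top-10 average?"""
    competitors = top_ten_domains(keyword)
    competitor_avg = sum(link_count(d) for d in competitors) / len(competitors)
    return link_count(my_domain) < competitor_avg
```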

Making The Decisions
Okay, so let’s take an example of a factor and have our script make a decision based on it. We’ll look at the anchor text saturation factor. We look at our inbound links and separate the ones that contain our exact anchor text from the ones that don’t and only contain some similar words somewhere else on the page (most other documents). We then make a percentage. So we’ll say that after looking at it, 30% of our inbound links contain our exact anchor text. We then look at our competition. They seem to average 40%. Therefore our script needs to follow a promotional plan that increases our percentage of links that contain our exact anchor text. Very good, we’ll move along. Next we’ll look at inbound links that don’t contain our anchor text but do contain our keywords somewhere in the document. Looking at our site, we seem to average about 70%. Our competition seems to average about 60%. So we are doing much better than our competition. Therefore our script doesn’t need to increase our links that don’t contain our exact anchor text but do have relevant text. Wait, did I just contradict myself? These two factors are complementary: the more our tool increases one factor, the further the other one drops. Wouldn’t this throw our promotion into some sort of infinite loop? Yes, I did contradict myself, and yes, it would put our promotion through an infinite loop. This is called ongoing promotion. The fact is THEY rank and YOU don’t. Therefore you have to keep improving the factors you lack until you do rank, even if they seem to almost contradict each other. By the end of the analysis your script ends up with a long list of DOs and a long list of DON’T NEED AT THIS TIMEs. So now all you have to do is use your own experience and your own site network to make all the DOs happen to the best of its ability.
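So the analysis phase is really just a loop over the factor list, sorting each one into a DO pile or a DON’T NEED pile. Something along these lines, reusing the hypothetical stubs from the earlier sketches (FACTORS and top_ten_domains):

```python
def average_of_top_ten(keyword, measure):
    """Average a factor measurement across the domains currently in the top 10."""
    domains = top_ten_domains(keyword)            # stub from the earlier sketch
    return sum(measure(d) for d in domains) / len(domains)

def build_todo_lists(my_domain, keyword, factors):
    """Sort every factor into DO or DON'T NEED AT THIS TIME."""
    do, dont_need = [], []
    for name, measure in factors.items():
        if measure(my_domain) < average_of_top_ten(keyword, measure):
            do.append(name)
        else:
            dont_need.append(name)
    return do, dont_need

# do, dont_need = build_todo_lists("www.mysite.com", "my keyword", FACTORS)
```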

Establishing A Promotional Plan
So now that we have a list of all the stuff we need to improve with our site, we can program our SQUIRT script to simply do whatever it is we would do to compensate for our site’s shortfalls. To give you a better idea of how to do this and how SQUIRT handles these exact situations, I’ll take three example factors and let you know exactly what it does when you hit that submit button. Keep in mind, however, that no matter how much information you gather on each site, every promotional situation is unique and requires a certain amount of human touch. The only thing you can do is define what you would do in the situation if you had no prior knowledge of the site or any extenuating circumstances. Also keep in mind that you have to remain completely hands off. You don’t have FTP access to the site and you can’t mess with its design, so anything you do has to be completely offsite SEO. Also, nothing you do can hurt the site in any way. Every plan needs to be 100% focused on building; any method of promotion that may possibly cause problems for the site is off limits, even if you plan on only running throwaway black hat sites through the tool. So if you want to go get links, you need to do it within your own network of sites. You can’t go out sending spam comments or anything like that.
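In code, that ends up as a simple registry: for every factor the analysis can flag, a predefined plan function that does whatever you personally would do, entirely offsite and entirely inside your own network. The plan bodies below are placeholders; the point is the shape, not the implementation:

```python
# Hypothetical plan registry -- the function bodies are placeholders.

def plan_increase_pagerank(domain):
    """Queue the domain for links from the large-site network."""
    ...

def plan_increase_anchor_text(domain):
    """Queue exact-anchor-text posts across the blog network."""
    ...

def plan_deep_index(domain):
    """Feed the domain's subpages into the rolling third-party sitemaps."""
    ...

PLANS = {
    "pagerank": plan_increase_pagerank,
    "anchor text saturation": plan_increase_anchor_text,
    "deep indexing": plan_deep_index,
}

def execute(do_list, domain):
    """Run the predefined plan for every factor the analysis flagged."""
    for factor in do_list:
        if factor in PLANS:
            PLANS[factor](domain)
```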

Page Rank
Your Site: PR 2
AVG Competitor: PR 3
Decision: Increase Page Rank
The Plan: Create a network of very large sites, since PageRank can be gathered internally just as easily as from external sources. You need to build a network of sites with lots and lots of indexed pages. Take a look at the current volume of sites you plan on running through your SQUIRT tool and decide beforehand how big you need to build your network. When we decided to make SQUIRT public, we knew that even though not all the sites would require a PageRank increase, a TON would. We launched the tool with the capability of handling 500 members, so we knew that 500 members, submitting 10 sites/day with each link needing to hold on a single page for at least a week, could result in needing 150,000 links available to us each week. If each link sat on a page with a PR 1, then each page would send a tiny PageRank increase to the target link. Likewise, if each indexed page had a PR 1 and we put five links on each page, then each page would give out even more PageRank through its links. There is a point of saturation, of course; we decided 10 links per page was good, so we could get the maximum amount of PageRank sucked out of each indexed page while maintaining the highest possible volume of links we could spare. So if we built a network of sites that contained a total of about 1,000,000 pages indexed (you heard me), each averaging a PR 1, then we could transfer enough PageRank to 150,000 links/week to constitute a possible bump in PageRank for each link. For your own personal SQUIRT you of course don’t need nearly that volume. However, make sure to plan ahead, because even if you build a 1 million page site, it doesn’t mean you will get 1 million pages in the index. You may have to build quite a few very large sites to reach your goals. This of course takes time and lots of resources.
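If you want a feel for the capacity math before you start building, a quick back-of-the-envelope calculator helps. The parameters below are placeholders and this doesn’t reproduce the exact figures above (those also account for multiple links per submitted site and pages that never get indexed); it just shows the shape of the calculation:

```python
def pages_needed(members, sites_per_day, hold_days, links_per_page):
    """Indexed pages that must be carrying links at any one time."""
    links_in_flight = members * sites_per_day * hold_days
    return links_in_flight / links_per_page

# e.g. a small personal SQUIRT: one user, 10 sites/day, week-long holds,
# 10 links per page before the saturation point:
print(pages_needed(members=1, sites_per_day=10, hold_days=7, links_per_page=10))  # 7.0
```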

Anchor Text
Your Site: 30%
AVG Competitor: 40%
Decision: We need to increase the number of links that contain our exact anchor text.
The Plan: Since the tool can’t directly affect the site, we can’t exactly go to every inbound link the site has gotten and figure out a way to get the anchor text changed. There are just too many ways to gain links, and it’s completely unreasonable to attempt. So the only way to increase the anchor text match percentage is to increase the total number of links to the site and have them all match the anchor text. This is where you need to queue up the blogs. All you have to do is create a bunch of blogs all over, and take steps to increase the chances of each individual post getting indexed. Then you can fill the blogs with the anchor text of the site. This, however, is no easy task when dealing with a large volume. Since you need plenty of authority, you will need accounts on all the major blog networks. Remember, the average blog usually has fewer than 5 posts/day, so you will need to compensate with sheer volume of actual accounts. These will also need to be maintained, and anytime one gets banned another needs to be automatically created to take its place. Those of you using the tool have probably already noticed links from these places showing up in your logs and Technorati links. Since we knew so many links/day would be required, we had to create an absolutely HUGE network of blogs on various services, as well as the automated system to create new ones whenever another gets buried. Once these links have been dispersed over time, the anchor text percentage will start to rise.
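The blog account math is the same kind of arithmetic. Assuming you cap each blog at a believable posting rate (the 5 posts/day figure from above), the number of accounts you need falls straight out:

```python
import math

def accounts_needed(links_per_day, max_posts_per_day=5):
    """Blog accounts required to place that many links/day without any single
    blog exceeding a believable posting rate."""
    return math.ceil(links_per_day / max_posts_per_day)

print(accounts_needed(links_per_day=100))   # 20 accounts at <= 5 posts/day each
```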

Deep Indexing
Your site: 3 pages in the index.
Crawl Stats: 10+ subpages identified
Decision: Need to get more bots to the subpages of the submitted site.
The Plan: The script grabs the main page of the site and immediately sees more subpages available than are currently indexed in the search engines. This lets us know that the submitted site is having a hard time getting properly deep indexed. So this is when we queue the Roll Over Sites to help get some SE bots to those subpages. There is one problem, however. When dealing with my own sites it’s fine to scrape the content and then redirect, as detailed in the strategy I talked about in the Power Indexing post. But since this tool is a public one, I can’t scrape people’s content, because there is the odd chance that my rollover site may temporarily outrank their actual site and draw some anger from people who don’t know what it is. Remember the rule: the tool can’t harm the site in any way. So we had to go another route. Instead, we pulled from a huge list of articles and used that as the content, pushed the spiders to those pages, and then, once they got indexed, redirected them to the site. These of course don’t show up as links, so it’s all backend, but it does a fantastic job of getting subpages indexed. Since rollover sites use actual content, there is no problem making them absolutely huge, so you don’t need a very large network of them to push a very large number of bots. Yet at the same time you still have to follow the rule of no interference with their site. So if their site is having a hard time getting deep indexed and you can’t exactly FTP a sitemap over, then you have to bring in a large network of Third Party Rolling Sitemaps. What you do is create a network of sites that are essentially nothing more than generic sitemaps. You drive a lot of bot traffic to them on a regular basis and have them roll through the pages that go through the tool. Once the tool has identified up to 10 subpages of the target site, it can add them to the network of third party sitemaps. New pages go in, old pages go out (the technical term for this is FIFO, first in first out). If you have a solid enough network of third party sitemaps, you can hopefully push enough search engine crawlers through them to get every page crawled before it gets pushed out. This of course is a huge problem when dealing with a large volume of sites. If we originally wanted enough power for 500 members, then we knew that 500 members submitting 10 sites/day, each containing up to 10 subpages, would mean we would need enough third party sitemaps to accommodate 50,000 pages/day. While the efficiency of a single third party sitemap site may be big, a massive network would be needed to push that kind of volume. It’s just beyond unreasonable. So instead, we incorporated a gapping system: anytime there wasn’t a new page coming in and the sitemap was about to display a set of links to a SE bot for the second time, it would grab some older entries that had already rolled through and display them as well. So if you push enough crawl traffic through, eventually every page will theoretically get crawled.
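Here’s a toy version of a rolling sitemap with that gapping behavior, so you can see the FIFO-plus-redisplay idea in one place. It’s a sketch of the concept, not SQUIRT’s actual implementation:

```python
from collections import deque

class RollingSitemap:
    """A third-party rolling sitemap: new pages go in, old pages go out (FIFO)."""

    def __init__(self, capacity=50):
        self.capacity = capacity          # URLs displayed per sitemap page
        self.current = deque()            # pages currently rolling through
        self.rolled_out = deque()         # pages that have already rolled out
        self.shown_once = False

    def add(self, url):
        """A new subpage identified by the tool enters the rotation."""
        self.current.append(url)
        self.shown_once = False
        while len(self.current) > self.capacity:
            self.rolled_out.append(self.current.popleft())

    def render(self, gap_size=10):
        """URLs to show the next search engine crawler that comes through."""
        urls = list(self.current)
        if self.shown_once and self.rolled_out:
            # Gapping: nothing new came in and this set has already been shown,
            # so tack on some already-rolled entries instead of wasting the crawl.
            urls += list(self.rolled_out)[-gap_size:]
        self.shown_once = True
        return urls

# Tiny example:
sm = RollingSitemap(capacity=3)
for page in ["site.com/a", "site.com/b", "site.com/c", "site.com/d"]:
    sm.add(page)
print(sm.render())   # ['site.com/b', 'site.com/c', 'site.com/d']
print(sm.render())   # same three plus 'site.com/a' padded back in (gapping)
```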

Rinse and Repeat
So that’s all there is to it. It’s really quite simple. As you build your SQUIRT, work your way through every factor you wrote down and think about what you would do as a hands-off Internet marketer to compensate if a site didn’t meet the requirements for that particular factor. This also applies to flaws in onsite SEO. Let’s say, for instance, the page doesn’t have the keywords in the title tag. You can’t touch that title tag, so what’s the offsite SEO equivalent of not having the keywords in the title tag? Putting them in the anchor text, of course. Just keep relentlessly plugging away and build an established plan for each factor on your list. With each failure to meet a factor there is potentially a problem. With each problem there is a potential solution your script and personal site network can take care of for you. It just may require a bit of thinking and a whole lot of building. It’s tough, trust me, I know :) but it’s worth it, because in the end you end up with a tool that comes close to automatically solving most of the daily problems that plague you. Here’s the exciting part. Once you start building and going through all the factors, you’ll find some you probably can’t solve. Just remember, every problem has a solution. So you may just learn a few tricks and secrets of your own along the way. Shhh, don’t share them. :)

Is SQUIRT Biased?
ABSOLUTELY! Think about what it uses to determine the site’s standing on its strengths and weaknesses. If you do a link: command, there is a delay between what the search engine shows and what actually exists that may range up to days and weeks. This causes major problems. The tool may think you need a bunch of links when in fact you already have plenty and they just haven’t shown up yet. It may think you have a ton of links when those links were actually only temporary, or you’ve already lost them and they just haven’t had a chance to disappear from the index yet. Essentially the tool operates off of your site’s potential SEO worth. So let’s say you have an old site that you haven’t really done anything with, and you run it through your SQUIRT. The tool will stand a much better chance of making an accurate analysis, and likewise any boosts you receive in the factors will more than likely be the right ones. Therefore it will appear as if your site skyrocketed up simply because of one little submit through the tool, when in fact the site had that sort of potential energy the whole time and just needed a little shove to get moving. The same could be said about an established site experiencing extremely slow growth. It fares well, maybe even in the 60%+ range, so it appears to not need very much, and at the same time everything the tool does do matters very little in the larger scheme of your escalated promotional campaign. Also, the tool can only focus on one site, during one phase of its promotion, at a time. So if you submit a brand new site, the tool will naturally focus more heavily on getting the site properly indexed and less on helping it gain rank in the SERPs. So if the indexing methodologies don’t completely work in that particular case, it appears as if your tool didn’t do anything at all. Remember, this is only a tool; it’s all artificial decision making. There is no substitute for the human touch. No matter how complex or efficient a tool you build is, there is no way it can compete without an actual human element helping to push it. All your competitors are humans, so there’s no logical reason to expect you can ever build a tool that is the end-all solution to beating them every time. So even though you now have a really cool tool, still remember to be hardworking, smart, and efficient… never lazy. :)

Sorry about the long as hell post, but you guys wanted to know what the tool was doing when you clicked the button. Now you do, and hey… at least I didn’t go through ALL 60 factors. :)

Get Building