Proper Indexing Analytics With Squeeshy Words

As the site: command in search engines gets less and less accurate, and the search engines have failed to ever bring their APIs past beta-like reliability, I have finally made the complete switch to “Squeeshy Words.”

Since I completely made up the term Squeeshy Words, I am almost positive none of you have ever heard of them. The term came from a dream I had before I started my company that involved a hamster and a line of ants that needed to get squished. And we shall call him Squeesh!

Basically, a Squeeshy Word is an abstract, made-up word that doesn’t exist, kind of like what they do with SEO contests. You invent a weird term that no one will ever possibly use. You hide it in all of your templates of a current project. Then, when you want to track how many pages that project has indexed, you just search for that term. It surprisingly shows you a much more accurate count of how many pages you have indexed in each engine. It also gives you the added benefit of determining which pages hold higher weight, which helps with certain interlinking quarrels, since they all technically compete against each other for that particular Squeeshy Word.
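If you want to see the idea concretely, here’s a minimal sketch in Python. The token, the footer snippet, and the result-page URL formats are all illustrative assumptions (URL formats in particular change over time), so treat it as a starting point, not gospel:

from urllib.parse import quote_plus

# Hypothetical made-up token; drop the snippet into every template of the project
SQUEESHY_WORD = "zorbleflitz"
footer_snippet = '<p class="fineprint">%s</p>' % SQUEESHY_WORD

# Assumed result-page URL formats; paste the printed URLs into a browser
# and read the reported result count for each engine
engines = {
    "Google": "http://www.google.com/search?q=",
    "Yahoo": "http://search.yahoo.com/search?p=",
    "MSN": "http://search.msn.com/results.aspx?q=",
}

for name, base in engines.items():
    print(name, base + quote_plus('"%s"' % SQUEESHY_WORD))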

For those of you who don’t use the Squeeshy Word system (you don’t have to call it that), I actually suggest you get in the habit of it. It gives you much more accurate analytics, and it saves you a ton of time staring at API-based tools.

*Also note: I’ve found that the API results often vary from even the most common datacenter’s. I really don’t enjoy trusting them.

–>

The Official Blue Hat SEO Heckler

I got an interesting comment on the What is Blue Hat SEO page that I thought I’d take a minute to comment on.

—————–

an0n Says:
July 29th, 2006 at 9:54 pm

Sorry, but the suggestions suggested on this site are plain unfair. While they might not be against the rules of the search engines, doesn’t mean you can be greedy or unethical. You are the patent system – legal but unfair.

Case in point: your spam blog ‘hack’

Here’s an example of being unfair, but not against the rules. I disagree with your site. I won’t be ‘black hat’ against it, but I’ll use your own style to get what you want. So I shall feed you some advice that will do the world a favour. The advice are ‘instructions on how to delete this site’:

a) if you have access to your WP folder remotely, delete the folder
b) delete the database or where the actual data is stored

Thanks much and good netting!

————————————————–

Despite what you think, I’m not going to start ragging on this person, or even disagree with him. In fact, I take most of it as a compliment.

Judging from a few other comments he has left, I get the impression that he is a White Hat Fanatic. I’ll explain. SEO is a lot like religion in several respects. There are religious people: people that follow a religion but still respect the rights of others that don’t believe as they do. Then there are religious fanatics, who feel it’s their duty to force everyone to believe what they believe, one way or another. SEO holds the same principles. There are whitehats and blackhats. There are whitehat fanatics and blackhat fanatics. Simply put, the whitehat fanatics hate the blackhats because a blackhat site somewhere down the road outranked them. Blackhat fanatics hate the whitehats because they are the ones that report their sites and get them banned.

Personally, I’m on the fence on this one. I simply don’t care either way. I don’t hate blackhat sites, because blackhat sites don’t attempt the competitive markets I do. I don’t hate whitehat sites, ‘cus if they get one of my blackhat sites banned, oh well. I factored in SE life span before I even invested in it.

An0n brings up a valid point. Are these Blue Hat techniques unfair? Obviously, being fair is subjective. Are these techniques unfair because they allow you to compete with large sites like about.com? Some would say having a million-dollar-a-year SEM budget is unfair. Are they unfair because the other webmasters competing against you aren’t readers of BlueHatSEO? If you’re that concerned about being fair, email your competitors the link. I bet you won’t. The only thing I see as unfair is the fact that there are 300+ million sites competing for a term and only 10 will get 90% of the traffic. For Pete’s sake, people, use this site as a tool to level the playing field. Make it fair for your sites to compete with all the rest of the bullshit in the industry.

Obviously An0n is frustrated, and Freud Eli might know why. It has been beaten into his/her head that content is king. I realize the concept of content is king is so incredibly popular that I’m probably going to get a ton of grief over this, but I really don’t care, because saying content is king is like saying having a light car wins you races. It’s true, but you’re going to want some horsepower before you run with the big dogs. That horsepower is applying advanced SEO tactics. Sorry if that conflicts with the idealist world of “natural linking.” Fucking communists. How disappointed were you when you were first learning about SEO and asked, “How do I do well in the search engines?” and some person replied, “Build good content and get some relevant links.” It’s not a bad answer; it’s just not a very useful one, and it’s definitely not satisfying.

That is why I built Blue Hat SEO. I believe it’s a better answer to that question. Now how do you plan on me continuing to answer this question with the website deleted? Perhaps I should convert the blog into every other SEO blog, where I just rant about new Google products coming out and avoid actually teaching anything. I’m sorry to say this, but there is a serious lack of resources available to webmasters after they pass the complete newbie level. The demand for advanced webmaster knowledge is definitely out there. I just built what I wish already existed. If there is anything I can do to stir up some dust so it’ll finally happen, I’m more than willing.

Thanks for donating your input, an0n. I really do appreciate yours and everyone else’s comments.

–>

Getting Rid of Web Spam The Historical Way

Back in the mid 90’s, phone companies were having major problems dealing with phone hackers (phreaks, as they called themselves). The problem seemed hopeless to solve. The phone companies would increase their technology, and that in turn would simply inspire the hackers to increase theirs. It was this hilariously vicious cycle that kept spiraling downward. The phone companies kept lobbying to increase the penalties for phone hacking, and the phone hackers kept coming back and saying, “We’re always going to be better than you. You’ll never catch us.” The situation seemed hopeless.

All of a sudden the phone companies did the most brilliant thing on the planet. They took all the most prominent phone hackers, the outspoken ones that taught all the other kids how to do it, and gave them jobs developing ways to stop those pesky kids. This kept the phone hackers happy ‘cus they got to play around with new security they’d never seen before (plus I’m sure the paychecks were nicer than they were used to), and it kept the phone companies happy because they didn’t have to worry about them anymore.

Amazingly, this worked! Within 10 years the problem was almost eliminated. There are still little traces of it around today, but nowhere near the degree it used to be. Way to go, phone companies.

Let’s step back a lil’ further

Back in the wild west days, local ranch owners had huge problems with wild horse and cattle thieves. What did the ranchers do? They hired the horse thieves to stand watch over their cattle. No other thieves would dare go near that ranch once they found out another gang of thieves was protecting it.

Back To Today

Webspam has gotten completely out of control. It’s going beyond all limits. Google, Yahoo, and MSN have tried everything in the book. The more they increase their technology, the more the spammers increase theirs. This all-too-familiar-sounding situation seems to be escalating out of control.

What do you think should be done about it?

They should hire professional webspammers to develop algorithms to stop webspammers.

Good f***in answer!

–>

MSN Develops Plan To Stop Web Spam

MSN’s research team recently updated their Strider Search Defender technology plan. The plan details Microsoft’s new approach, currently in development, to stopping webspam. Its direct targets include blog, forum, and guestbook spammers.

The concept is simple: use web spammers’ mass-production capabilities to make catching them all the easier. This is done by a less-than-simple method of locating highly spammed places and using them sort of like spam traps. In a sense, the more they spam, the more likely they are to get caught.
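To illustrate the idea (this is just my reading of the concept, not Microsoft’s actual implementation), here’s a toy sketch in Python. The trap pages and harvested URLs are all hypothetical:

import re
from collections import Counter

# Links harvested from comment spam left on known honeypot pages (made-up data)
honeypot_hits = {
    "trap-blog-1": ["http://pills.example/buy", "http://casino.example/"],
    "trap-guestbook-2": ["http://pills.example/order", "http://legit.example/"],
    "trap-forum-3": ["http://pills.example/promo"],
}

def domain(url):
    return re.sub(r"^https?://", "", url).split("/")[0]

# Count how many independent traps each domain was spammed into.
# The more places a spammer mass-posts, the more traps they trip.
trap_counts = Counter()
for links in honeypot_hits.values():
    for d in set(domain(u) for u in links):
        trap_counts[d] += 1

# Domains hitting 2+ traps get flagged; one-off appearances stay unflagged,
# which is one way to keep down the false positives MSN worries about
flagged = [d for d, n in trap_counts.items() if n >= 2]
print(flagged)  # ['pills.example']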

Neat concept if you ask me, but I have a few theories of much easier and more solid ways to identify webspam without running the risk of flagging what MSN dubs “False Positives.” Sounds sketchy to me.

Click Here To Read The Full Report

If anyone from MSN is reading this, I would like to point out one thing. I am in full support of this spam filtering technology. I really do think it’s a good idea, but in the end you may be causing a bigger problem than you’re solving. People spam forums, guestbooks, and blogs because they’re easy. Once a search engine figures out a way to thwart that, blackhatters will simply start using the harder method of link gathering (you know what I’m talking about), and you definitely don’t want that to begin. It’s been a long time coming, and it’s going to be devastating to SE’s once spammers start having to resort to it full time.

–>

Is Google Downsizing?

I can’t help but notice that lately Google has been downsizing their index. Across my network of sites, I’ve been losing indexed pages for the last three weeks. It also seems harder to get pages into the index, even though Googlebot is still crawling at a regular pace. It’s about time for a PageRank update, and I’m wondering if this has anything to do with it.

Other people seem to be noticing the same thing; perhaps this is perpetuating the rumors that Google really is running out of space. Although I am finding that people, while talking about this, aren’t seeing one possibility: that Google might be upgrading their storage to accommodate Big Daddy, possibly switching out old storage arrays with new ones. This would cause temporary drops in index size while the new, larger storage is being put in place.

–>

Rule of Three

I’ve been getting a common question emailed to me several times lately, and I thought I’d address it real fast just to clear it up and get it out of the way.

“My site is ranking well in ____ and ___ but doesn’t seem to rank at all in ______. What’s wrong why won’t _____ rank me well?”

Feel free to fill in the blanks with either Yahoo, MSN, or Google.

The common answer to this question in forums such as WebProWorld is: “Don’t worry about it, _____ is just a weird engine; there’s really nothing you can do, and no good answer for why they aren’t ranking you well. I would focus on gaining some relevant links to your site; that might help.”

That’s an acceptable answer, and it covers all the bases; however, the person asking the question usually walks away disappointed and unfulfilled. I think the good answer is a little simpler than that: make sure your inbound linking campaign follows the Rule of Three.

Obviously your site has content that is found acceptable for high rankings in the other two engines, so I would stray away from reworking all your content and onsite optimization, because it is obviously close to success in all three. However, don’t keep onsite optimization far from your mind. Keep in mind that SE’s such as Google, Yahoo, and MSN value inbound links differently. It is common belief in the White Hat SEO world that one single quality inbound link has the power to boost your rankings in all the major search engines. But what one engine may consider the “one link to rule them all,” another may not.

Let’s look at an example. www.moscowtimes.ru is a crazy good authority site that ranks very well. Once upon a time, it released a sister site dedicated to the entertainment portion of its newspaper, called www.go-magazine.ru. This site got indexed and received a PR7 by the next Google update with only 3 links at the time. It also ranks very highly for many, many Moscow-related terms. By any definition, this is a perfect example of one inbound link propelling you to the top.

My point is simple. The three engines each treated this site differently when it first came out. Google, for example, may have said, “Wow, nice inbound link. I like you,” and given that link a voting power of 1,000 points toward the go-magazine site. Yahoo may have seen the same link and said, “Wow, nice inbound link. I like you,” and given the inbound link a voting power of 700 points. It’s not that Yahoo liked the inbound link any less; it just didn’t determine its value to be as high as Google did. That’s just the reality of three different engines having three different opinions on the same subject. Putting this example into context, this is probably what is happening to Joe Schmoe’s website in the third engine that he thinks is frowning on him. Multiply this effect by several hundred or thousand links and the differences become quite large among the three engines.
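To put numbers on it, here’s a toy model in Python. The weights are completely made up; the point is only that identical link profiles can total very differently from engine to engine:

# The same three inbound links, valued differently by each engine (made-up numbers)
link_values = {
    "Google": {"moscowtimes.ru": 1000, "some-directory.com": 40, "partner-blog.net": 15},
    "Yahoo": {"moscowtimes.ru": 700, "some-directory.com": 90, "partner-blog.net": 60},
    "MSN": {"moscowtimes.ru": 850, "some-directory.com": 10, "partner-blog.net": 5},
}

# Identical links, three different totals -- and potentially three different rankings
for engine, values in link_values.items():
    print(engine, sum(values.values()))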

Stick to the rule of three.

If possible, attempt to create for yourself three websites: one that by focus does well in Google, one that’s designed to do well in Yahoo, and a third for MSN. You don’t have to build these sites up to amazing feats like your original site, but at least have them there, and point each site’s focus toward doing well in, or as I like to say gaining approval from, one single engine. Many authority sites create multiple sister sites on multiple domains. Some even prefer using subdomains, such as About.com. If each site gains enough notoriety in its focused engine to boost the value of its link, and you point it at your parent site, your parent site will naturally do well in that engine, because it got an inbound link that is valued highly by that engine.

Creating 3 sister sites isn’t practical for me?

That’s fine; the same rule still applies, you just have to figure out how to follow it. Look at the inbound links of the sites that are outranking you in the problem engine. Compare them in a detailed manner. One or many of those inbound links may be the difference between your ranking and theirs. All you have to do is attempt to get links from the same sites that link to them. You can see how this would have the same essential effect as creating sister sites. A good tool for this is PR Prowler.
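The comparison itself boils down to a set difference. A minimal sketch, assuming you’ve already exported the backlink lists from whatever tool you use (the domains here are hypothetical):

# Backlink lists exported from your tool of choice (hypothetical data)
my_links = {"siteA.com", "siteB.com", "old-directory.com"}
competitor_links = {"siteA.com", "moscowtimes.ru", "big-directory.com"}

# Links the competitor has that you don't: your candidate list to pursue
link_gap = competitor_links - my_links
print(sorted(link_gap))  # ['big-directory.com', 'moscowtimes.ru']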

In short, my answer to the question is to get inbound links that the other search engine likes. If you’re ranking well in Yahoo and Google but can’t rank well in MSN, gain some inbound links that you know MSN in particular will like.

I hope this clears it up a bit for a few of you.

–>

Fun With Google Trends

So I’ve been playing around with Google Trends quite a bit lately. I’m lovin’ it so far. It allows you to view the search trends for any phrase (that, of course, gets searched for enough to warrant a big-ass graph).

It also allows you to check out where people are searching from. What’s with those creepy people from Indianapolis, Indiana searching for my name?

Upon saying the word creepy, it inevitably led to a question: what place has the largest majority of creepy lowlifes on earth? I apologize in advance, but these are my findings.

  • Child Porn – Winner: Izmir, Turkey. (No surprise there, sickos!)
  • Rape – Winners: Delhi, Chennai, and Mumbai, India. (I thought you respected your women?)
  • Stalking – Germany on all counts!
  • Eating Babies – Atlanta, GA. (I won’t comment on this one. The guilty know who they are.)
  • Free Pot – Winners: Bucharest, Romania and Portland, OR. (Definitely no surprise there, since Eugene, Oregon shows up on stats as Portland, Oregon.)
  • Ebay Children – London down the board! (For never spending a dime on dental bills, they sure do seem to want to get rid of their children.)
  • Ugly Women – Britain taking the gold! (I actually fell out of my chair laughing at that one.)
  • STD’s – Winner: St. Louis! (I am not shocked at all, but I still laughed hysterically.)

Also, here’s an interesting one for you:

  • How to commit suicide – Winners: Brisbane, Melbourne, and Perth, Australia.

And here I was thinking Australia was a fun place filled with surfing and cute koala bears.

–>

Proper Doorway Page Redirects

So Google is getting pretty swell at finding redirecting pages. JavaScript is out. CSS is out. htaccess is out. What’s left?

Try redirecting through Flash. A Flash script (ActionScript’s getURL(), for instance) can forward the browser to a new page, and search engines can’t read Flash, so they won’t see the redirect.

I only put this post up in honor of Matt Cutts appearing on WebmasterRadio.

–>

Trust Factors – Getting To Second Base With Sally Search Engine

Just so no one gets confused, I want you all to realize that I do not always practice what I preach. There simply isn’t time for it all. Whenever I create a new site, I decide in a written plan what techniques and practices will make me the most money with the least amount of effort for the longest amount of time. Although I have many sites that are over 2 years old, I very rarely put daily promotion work into any of my sites that are over 6 months old. I put a lot of effort into the first six months of a site and then let it perpetuate its own promotion (which is coincidentally about when I start getting REALLY sick of staring at the site). This allows me not only to keep my sanity, but to prevent myself from putting too many apples in one basket (I think that’s the way the saying goes).

This practice, however, tends to put a vise over my head when it comes to search engines, and that vise is called Trust Factor. One of the greatest strengths you can have in business is knowing your own weaknesses, so I’d like to use this article to help us all speculate about what I consider my biggest weakness in SEO, and possibly yours.

Factors of Trust Rank

Here are the factors I think the search engines possibly use to determine your trust factor.

  • Age of Domain
  • Keywords in domain
  • Inbound links from sites that compete for the same keywords
  • Age of Inbound links from sites that compete for the same keywords
  • Stickiness of anchor text of inbound links
  • Ratio of inbound links from sites that don’t compete for the same keywords to sites that do.
  • Links from authority sites
  • Stickiness of links from authority sites
  • Percentage of pages on your site that are on topic with the main page.
  • Outbound links that result in a page competing for your keywords.
  • Site being available in the Google directory.
  • Also note that I believe there are boosts available in trust factors for the size of your site, as well as for meeting a goal of inbound links.

Using these twelve factors, let’s assume they are all weighted equally. We can then derive an algorithm to estimate a trust factor for our own sites, so we may see how we fare on a scale.

The algorithm

# A rough sketch of the scoring in Python; inputs are numbers you'd gather per site
def trust_score(age_of_domain_years, domain_keywords, target_keywords,
                competing_inlinks, avg_inlink_age_days):
    trust = 0.0

    # Age of domain factor
    if age_of_domain_years > 5:
        trust += 10
    elif age_of_domain_years > 1:
        trust += 5

    # Keywords in domain factor: the domain contains every target keyword,
    # and there are between 1 and 5 of them (my best reading of the condition)
    if 1 <= len(target_keywords) < 5 and set(target_keywords) <= set(domain_keywords):
        trust += 8

    # Inbound links from competing sites factor: full 10 points at 100+ links,
    # scaled linearly below that so 100 links earns the full score
    if competing_inlinks >= 100:
        trust += 10
    else:
        trust += competing_inlinks / 10.0

    # Average age of those inbound links factor: full 10 points once links
    # average 100+ days old, scaled linearly below that
    if avg_inlink_age_days > 100:
        trust += 10
    else:
        trust += avg_inlink_age_days / 10.0

    return trust

Stop!

I will stop the algorithm right here, because by now you’re catching my point. If anyone actually feels like making a tool of this, PLEASE LET ME KNOW! Once you read the algorithm, you realize the possibilities. When you are finished, there are 144 points possible. If you take your number of points and divide it by 10 and remove the remainder, you get a scale of 1-14. Assume that 11-14 points means you are classified as an authority site. This leaves you with a scale of 1-10 of the search engines’ trust factor in your site.
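That last bit of arithmetic, as a quick sketch (the 144-point maximum assumes all twelve factors are written out):

def trust_scale(points):
    # Divide by 10 and drop the remainder
    return int(points) // 10

scale = trust_scale(117)    # -> 11
is_authority = scale >= 11  # 11-14 classifies you as an authority site
print(scale, is_authority)  # 11 True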

From here comes the legwork, and yes, I’m just as guilty, if not more, than the rest of you of not putting as much work into this as needs to be done. Taking a look at the assumed factors and putting them to scale, with +10 being the best you can be (or at least well above the average) and +0 having none of the factor, we can at least use this to judge how much the search engines trust us.

I would really like to talk to some experts on the matter and have them maybe shed some light on this subject for us. Until then, I am forced to listen to who I know best………me.

–>