Blue Hat SEO - Advanced SEO Tactics and Techniques

Guest Post: How To Start Your First Media Buy

This post was written by a good friend of mine and one of the best media buyers I know, Max Teitelbaum. He owns WhatRunsWhere and had previously offered to write a guest post on the subject for you guys, but with all the buzz and relevancy of his new WhatRunsWhere tool I requested he write a post that mixes the two using his own experience. I shouldn't forget to mention that WhatRunsWhere was programmed/partnered by the great and very familiar face SlightlyShady. I'm a paid member myself, and it's not a product I think should go without some sort of beginner guide to make it complete.

My name is Max, and I'm one of the guys over at WhatRunsWhere.com, where we offer the premier online media buying intelligence tool. What I'm going to attempt to do below is teach you how to start media buying on display inventory (banners) with a small budget and little risk.

Media buying is one of the hardest channels of marketing to get into. Usually, to do media buys through large sites or networks you have to shell out quite a bit of cash (usually in the $5-10k range just to get your foot in the door), but I'm going to show you how to test with a small budget (low risk) and find out what's working, for anyone that wants to get into this highly lucrative space (mo money!). This is how I personally got started media buying and how I scaled my first media buying campaign (it blew up and made me a fair amount of cash). I'm going to talk about two traffic sources that are low risk and can get you in the door and profitable on media buys fast: adbuyer.com first and Sitescout.com second (this may be a bit long, just warning you guys) - plus a small bonus for those of you that make it all the way through.

The trick that most people don't know is that exchanges/resellers exist. My favourite site to start testing stuff on is adbuyer.com. I love adbuyer because it sells remnant inventory from the ad networks on the Right Media exchange (remnant inventory is inventory they have left over or don't sell, so you get it at a discounted rate). Right Media is Yahoo's ad exchange platform, so they have inventory from every network that uses that exchange and allow you to target specific networks. You can do this by selecting manual targeting instead of their audience score (see image below).

[Image: adbuyer.com manual targeting option - click to enlarge]

Doing this allows you to see which networks are converting for you and which show promise. You do it by dumping a test into every network and seeing which ones show promise (conversions) on your limited budget. It lets you vet out which networks have a shot at working and saves a LOT of money and time testing the various networks. You can also do this on a small starting budget of around $500 instead of, say, $5,000 per network.

You can also limit your risk here by using whatrunswhere.com: if you simply look up some of the networks and check what's already running on them within whatrunswhere.com, you can select a range of networks that already have your niche's products on them and working.

A big part of media buying is your creative and landing page. Obviously your landing page, like with any other traffic source, needs to be optimized, but your creative is key. A good creative can make or break a campaign. This is because of the bid structure of the media buy: most are run on a CPM (cost per thousand impressions) basis, so if you have a banner with a high click-through rate (CTR), your cost per click (CPC) will be lower and you'll be paying less for interested eyeballs. Testing which creatives give you the best CTR while keeping your CPA low is a main factor in having a successful media buy (later on I'll put in a product plug and tell you how you can use whatrunswhere.com to do this).
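As a rough illustration of that CPM/CTR/CPC relationship (a quick PHP sketch with made-up numbers, not figures from any real campaign): at a fixed CPM, your effective cost per click is the CPM divided by clicks per thousand impressions, so every gain in CTR directly cuts what you pay per interested visitor.

// Effective CPC on a CPM buy. Numbers are purely illustrative.
function effectiveCpc($cpm, $ctr)
{
    // $cpm = cost per 1,000 impressions, $ctr = clicks per impression (0.003 = 0.3%)
    return $cpm / (1000 * $ctr);
}

$cpm = 1.50; // paying $1.50 per 1,000 impressions
foreach (array(0.001, 0.003, 0.010) as $ctr) {
    printf("CTR %.1f%% -> effective CPC of $%.2f\n", $ctr * 100, effectiveCpc($cpm, $ctr));
}
// CTR 0.1% -> effective CPC of $1.50
// CTR 0.3% -> effective CPC of $0.50
// CTR 1.0% -> effective CPC of $0.15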

Ok, mini break from adbuyer.com; let's go through how to research and get a creative set ready to test. What I'd do is go into whatrunswhere.com and search my niche keywords in our banner ad search. Tons of banners in your niche that are currently running and making money will pop up. From there, I try to spot similarities. What do the most popular banners have in common? Also, by looking up banners on whatrunswhere.com you'll be able to see how long each banner has been running. Banners that have been seen a lot over time are probably working, because why would you keep running something as a direct response marketer if it wasn't making you money?

Using this as inspiration, I can build my own set of creatives to start testing with. I like to start broad with very different creatives to see what does best. You honestly would be surprised what far-out creative ideas pull in huge CTRs and conversions. It's about being eye-grabbing but also communicating your message properly to your audience.

I like to build my creative set of about 10 or so varying banners in different ad sizes. Personally, 300×250 ads have always done best for me, but the other IAB standard sizes like 160×600 also do well on specific sites as they engage with visitors in different ways.
I like using whatrunswhere.com here because it saves me time and money. By using concepts and ideas from what others are already doing, I can take out a lot of the risk of having to test EVERYTHING and focus on testing which major elements of what's already working actually work for my specific campaign.

Once you've identified which networks you think may have promise, go contact them directly. You know their remnant inventory is ok, or at least converts for your page/offer. Most likely you won't be profitable off the bat here (most media buying campaigns don't make money from day 1, they need to be optimized), but it's a good starting indicator of what will work (if you are profitable off the bat, jump on those networks asap and start raking in some cash). Example: if you test on adbuyer and see that network A is bringing in conversions for you at above your CPA but no other networks are bringing in conversions, network A would be a good place to start optimizing and buying directly with to get profitable.

When you go direct to the network, there are a few things you need to hammer out with them to give you the best leverage:
Out clause (how much notice you need to give them to pause the campaign if you want to stop it) - try to get a 24-hour out clause, or a 48-hour out clause at the latest
Frequency cap (how many times your ad will show to a user per day) - I personally like to frequency cap at 2/24 or 3/24, but you should play around with it yourself
Demographic targeting - if you know the demographics of your offer, the network can try to exclude irrelevant inventory (this will save you money)
CPA - make sure to place pixels and track a CPA; it's not like PPC or social, the network needs this data to be able to help you optimize placements
Network buys are like a black box: they do most of the placement optimization (where your ad shows up), but you can always make recommendations based on the referrers you see (where the traffic comes from) in your own tracking.

Ok, onto Sitescout.com. Sitescout is an RTB (real-time bidding) platform. What this means is they take a low margin and broker you direct site buys from the Rubicon Project. This differs from adbuyer.com because instead of buying from a "black box" media network, you can see exactly what sites you're buying on and exactly where on those sites you're buying (but expect to pay higher CPMs on a lot of sites for this privilege). It's like doing major media buys on large sites, but without the huge insertion orders (IOs) or prepays. You can also literally get started here with just $500.

I like this as well because this is where I can REALLY use whatrunswhere.com to give me a huge competitive edge. Every site has a visitor demographic (the type of people that visit the site). By using whatrunswhere to search what other people in my niche are doing, I can view the exact placements (websites) they're showing up on. If they're in the Sitescout repertoire, I can test my campaign on them as well to see if those placements would work, but more importantly, I get an idea of site demographics. Using Quantcast.com, compete.com and alexa.com I can build a pretty solid idea in my head about the demographics of any one target's website. If I notice that a lot of websites with similar demographics are being targeted, that tells me something.

I can take that information over to Sitescout and look through the various websites they offer. I can use this to match up my offer's key audiences with other sites within Sitescout that have the same demographics, to try to maximize my chance of finding sites that work.

Not every website on a direct buy will work, and some will burn out. The key is to find a few nice placements that make you money and milk them for all they're worth.
Once you've found what's profitable on Sitescout and you're ready to go to the next level, you can go direct and try to place buys on these websites. This isn't cheap and should only be done if you're profitable already. Sitescout takes very thin margins as it is, so going direct will only make you marginally more, and you have to worry about rotating in multiple offers to stop creative burnout (banner blindness) with that website's audience. Luckily, as I mentioned before, whatrunswhere.com is a kickass tool for finding new creatives to test that are already working, keeping your campaigns fresh and visitors noticing them.

Negotiating a direct buy is much like negotiating a network buy (you have to consider all the same factors), but usually your out clause will be a bit longer and they'll charge much higher CPMs (it's the price you pay for knowing EXACTLY what you're getting). So you need to make sure it's worth it; don't just go in there blind and swinging. I did this when I was starting out and lost quite a bit of money before I finally wised up, did what I described above, and have been able to media buy profitably ever since (when I started, Sitescout wasn't around, and boy, I wish it was).

Get out there and start testing. The truth is most people will read this post and do nothing. The few people that take action and actually test and play around will learn from this testing and start making money from media buys. This is great for them, because this fear of testing sets a high barrier to entry for others, meaning it's harder for your stuff to get ripped off by any new marketer coming into the space. The best thing to do is just get your feet wet and start testing.

BONUS: for all of you that read this far down the post, here's a way to test banners on a CPC basis instead of a CPM one: the Google content network. You can upload banners here, but you'll pay a premium since your single banner has to have a higher CTR than a whole block of text ads, or you have to make Google's eCPM (how much they make the site per thousand impressions) higher than they would normally get, in order to get shown frequently. But using their placement-specific tools or keyword-related sites, you can test your creatives across a large range of related websites and optimize CPMs and such, all on a budget. Again, whatrunswhere.com is awesome here as it shows you the EXACT placements on the content network where your competitors are buying, so you can test them out too and not waste money testing stuff you're not sure is working. It's a decent way to test out banner ads if you have an even more limited budget, to the point where you can't spend the money testing on Sitescout or adbuyer. That being said, adbuyer and Sitescout are a lot more affiliate friendly and will let a lot more types of offers and riskier creatives through the network than Google will, so it's a give and take.

As with all IM tool related posts here is your discount with no affiliate link:
Special Order: https://www.whatrunswhere.com/signup/index2.php?pkg=2
Coupon Code: bhseo
$45 off your first month

Thanks Max for the awesome guest post!

Open Questions: When To Never Do Article Submissions

Got a question in my E-Commerce SEO Checklist post from Rania, who didn't leave me a link for credit.

“4. Steal your competitors articles and product reviews and do article distribution.”

You recommend STEALING articles from competitors as an advanced SEO tactic?! Seriously?!

How about recommending that users create their own unique content in order to increase their site traffic and ranking.

Suggesting that people steal competitors work really says a lot about you- and your company.

Good luck.

I know you’re only trying to make a point but I’ll accept the question anyway.

Why would I steal my competitors' articles for article distribution instead of writing/sharing my own?

The answer involves much more than time or ethics.

1. Unnecessary Competition
A typical article distribution involves submitting to around 1-5k article directories and e-zines. Any time you submit the same piece of content to that many sites it creates unnecessary competition. This is especially true if your site is new. The article directories and e-zines are old, your site is new. They win. While they usually won't take you out on the primary keywords, especially with your site linked in the article, they can snag a lot of unknown positions on your longtail and mediumtail phrases, pushing your site down and losing you an unknown amount of long-term traffic. This can go all the way up to a worst-case scenario. Last year when Acai Berry was a hot niche a lot of people were SEOing for the term and many did article distributions. While their sites never made it into the top 10, the articles they submitted came closer. So when Google did some manual reviews and bitch slapped a bunch of rebill promoters and affiliate pages on the term, most of what was left was the articles. All of them ranking, and for a while there while everyone else readjusted, that left a bunch of article directories taking all that good traffic. The first thing the article directory owners did, of course, was edit the articles, take out the links to the authors' websites and throw in their own affiliate links. Lesson learned, but that brings us to point two.

2. Don’t Do It Unless Others Are
ESPECIALLY IF YOU ARE IN A SMALL NICHE! God damn, I see people do this all the time. They find a nice little niche with nearly zero competition, and given the minuscule effort they have to put into ranking, they realize the only way they know how to build backlinks is to article submit. They get flooded out and cry. Uhhg. The point is the same as point one: don't do it unless others are. I follow the rule outlined in SEO Empire: if you want to win, Match and Exceed. If it becomes a problem for you to maintain your ranking, then submit where they submit and a bit more. So don't submit unless you have to, but if you do, make sure you…

3. Submit Your Competitors' Articles Only
All those same articles on thousands of sites create a massive duplicate content penalty opportunity. So the worst choice you can make is to go through the articles published on your site and submit them. In the middle of the bad-choices spectrum would be to write unique articles and submit those. Unique articles are for pulling in SEO traffic and thus they belong on your site and preferably nowhere else. You throw little fits when people steal your content, so why would you willingly give it out? The wisest choice is to submit your competitors' articles, because if you're going to put anyone at risk for duplicate content you might as well make it them, not yourself. I've always said there are two ways to rank: you going up or them going down. I own about 170 article directories as part of my basement. I understand how the article game is played. People aren't submitting articles to my sites because they want my directory to be the best smelling turd around. They want the links. I want the pages of content for link laundering and they want the links; that is what it's all about and nothing more. No one owes me unique articles, nor is it doing them any favors to give them to me.

Hope that clears up the game rules for you, Rania. Sorry it's not advanced, but there are good reasons for everything we do. Please understand, I don't tell people to be evil just because I enjoy watching them be evil (even though I do). There is always some form of risk whenever you are. Yet never hesitate to be evil when it's a sink-or-swim situation; always match and exceed.

SEO Checklist for E-Commerce Sites

Answering a question on Wickedfire here.

If you own an e-commerce site and don't know where to begin on the SEO, go through this checklist. In total, it'll cost less than $500.

1. Sign up for all related forums. Put your site in the footer links and go through answering product-related questions on a weekly basis.

2. Create a free OSCommerce and Zencart template related to your parent niche (if you sell CDs, make a music template), insert your link on it and distribute it on template directories and their repositories.

3. Create an articles section on your site and put in a form allowing people to submit articles. Email hobby blogs in your niche asking to use some of their particularly good posts in exchange for a link back in the article. This will make them aware of your site and they might even link to you in future posts when talking about a particular product.

4. Steal your competitors articles and product reviews and do article distribution.

5. Create a blog on the site and give out manufacturer coupon codes regularly. This will sometimes help with getting negative results. Post those coupons on item #1.

6. Put all your products in Google Products (froogle/base). This will sometimes help with getting negative results.

7. Browse Google Products for small e-commerce sites with no reviews and similar products, and do link exchanges with them via a separate automated link exchange script on a separate page.

8. Make sure you optimize your onsite seo. I assume you know how to do this.

9. Download, convert to html, and attach all the product manuals to each individual product. Link back to the product on each manual. This will give you more pages for indexing and catch a lot more longtail keywords.

10. Spam the fuck out of Yahoo answers and similar.

11. Directory submit! It may not work well for other sites of yours but ecommerce sites are almost always welcome in directories.

12. Customize a nifty and unique toy style item with your logo on it and mail it to the most popular bloggers in your niche. Shirts and hats also work well.

13. If you have access to the products get a webcam and pretend to be a vlogger. Review the products and post them on all the major video sites.

14. Create autoblogs and link wheels.

There’s more but I think that’ll keep you busy enough for now :)

EDIT:
There was some confusion in the comments on what I meant by "Negative Results."
"Negative results" or "negative rankings" are the results Google inserts inside of the regular results.
Such as:
Video Results
Image Results
News Results
Product Results
Blog Results
They used to always appear above the regular results, so we call them negative rankings because they're less than #1. Now they tend to appear at random positions between the regular results. This term may change the older this article gets.

How To Take Down A Competitor's Website: Legally

They stole your articles didn't they?
You didn’t even know until they outranked you.

They jacked your $50 lander without a single thought to how you’d feel?
Insensitive pricks

They violated your salescopy with synonyms.
Probably didn’t even use a rubber.

They rank #8 and you rank #9 on EVERY KEYWORD!
bastards!

Listen, why don't you just relax. Have a seat over there and enjoy an Otterpop. You obviously Googled this article for a reason. No one reads this blog anymore. I haven't even updated since November, or so I hear. Literally, someone had to tell me I haven't updated since November. I didn't check, so I'm just going to assume they wouldn't bullshit me. The point is you have anger issues and would like to turn those anger issues into an anger problem. That's cool, I'll show ya how, but first lose your initial instinct to go troll a forum asking about ways to take down a competitor's website, which inevitably leads to trying to find a hacker. Don't be stupid; illegal shit will ruin you eventually. Instead join the vigilante team and be a real jerk about it.

How To Take Down A Competitor’s Site The Asshole Way
First create a free account on www.spamcop.net. Spamcop.net is a large anti-email-spam service that I hear is secretly owned by Spamhaus (another free antispam service). It allows very bored antispam vigilantes ("spamcops") to anonymously report email spam. From there they track down the host and server provider, tear out all the contact info so they can't trace it back to you, and send them a report claiming you've been sending email spam. The fortunate and unfortunate thing about spamcop.net is they are very reputable despite having no method of verifying that any of the reports are real. Hosts/server providers take their reports very seriously, and often after 3-4 reports in a short period they will disconnect the target's server IP, and after a few more will cancel their server, keeping all their files. It really is a guilty-until-proven-innocent scenario. The only way to win if someone is reporting fake spam against you is to really be a spammer and just shut down and move to a new host. If you're a legit webmaster you're pretty much fucked to say the least, because they'll just keep telling you to stop doing what you're not doing, and if they're really patient they'll ask you to prove you're no longer doing what you were never doing. LOL

DMCA Notices Don’t Have Shit On Spam Reports
Once you create your account and log in, go to the Report Spam tab…

*disclaimer*
I’M NOT ACTUALLY SAYING TO DO THIS. I’M SAYING IF YOU WERE TO DO IT THIS IS WHAT YOU WOULD DO BECAUSE THAT IS WHAT OTHERS ARE DOING. I AM AN ENLIGHTENER TO WHAT ALREADY HAPPENS EVERY DAY. I DON’T WANT TO BE SUED BY THESE ASSHOLES. LOL
*/disclaimer*

Copy and paste a few random spam emails currently in your inbox. Swap out the domains and IPs (they even give you neat tools for this!) and send away.
If you do this every day for a week or two their server will eventually go offline. For them it becomes an uphill battle to keep from getting shut down.

This definitely isn't a cool thing to do, it isn't in the spirit of competition, nor is it effective against large sites like Wikipedia or Yahoo (DANG!). Likewise I wouldn't recommend doing it, but unfortunately the tactic exists and is in play already. The problem is with reputation. Spamhaus and Spamcop.net have a reputation for never being abused, but that's only because no one ever bothers to check whether it is abuse, giving them a near-perfect record. True recent story: I can't say if my server really did send out spam or not originally, because I just had no way of knowing, but I got this spam complaint. It was nearly worthless; it just had one of my domains as the link in the spam mail and my server IP. I couldn't find the problem so I told the server provider. That wasn't acceptable to them, so I made something up, saying umm yeah it was a sendmail hack or something, and that I fixed it. I fortunately could tell, from some hidden code in the email that got skipped by the censors, who the spam complaint came from. I thought it was over with, but unfortunately I got complaints from him every day for about a week and a half. I fended them off till they eventually shut off my IPs with no warning and wouldn't even give me access to the server to fix the problem. Haha, oddly enough they said they wouldn't turn it back on until I did fix the problem… idk. Anyways I said screw it and let the server stay down for two weeks. I come back and the guy is still sending spam complaints. Each one is dated for the current date. Basically he was spinning these emails and claiming my server was spamming him for two weeks without the server even being on. By the time I could even point this out it was too late; the servers and everything on them were pretty much dead.

THIS IS THE REALITY! THIS IS WHAT HAPPENS.

So I definitely don’t feel bad if this information gets abused because it may eventually lead to a solution to a major problem that causes a lot of damage to the rest of us.

Addon Domain Spamming With Wordpress and Any Other CMS

I got this question from Primal in regards to my post on Building Mininets

Eli,

I like the post and your entire site. Thanks for sharing your knowledge. One thing confuses me about this particular tactic. Where are you getting the content from? You mentioned Audioscrobbler and Youtube API but I am not focusing on a music niche. The “widely available car db” sounds more like something I could use. Can you say where you would get something like this from? Also is there any reason why I should use customized pages instead of a CMS like Wordpress to generate these kinds of sites?

Upon a glancing read this question seems to focus too much on the exact example used in the post. Yet if you really read the multipart question thoroughly and get to its core, it's a FANTASTIC question that really needs an answer in more depth than what I would put in a comment response. The mininet building post isn't about how to use APIs and RSS feeds to get content. Nor is it about creating a custom CMS or doing multiple installs of the same site structure (I covered that in depth in my SEO Empire post and called them ANT Scripts). The real down-to-brass-tacks gem behind the technique is understanding how to do Addon Domain Spam via environment variables such as HTTP_HOST to create a lot of sites from a single install of ANYTHING. I'm absolutely a firm believer that addon domain spam is the future of webspam. Subdomains had their day and now it's time for figuring out creative ways to create a ton of unique sites from a single platform. This doesn't always have to be done through addon domains and, as mentioned in the comments, can be done in other ways such as editing httpd.conf. For now though I wanted to focus on the basics such as using addon domains (and, if you'd like to go cheap about it, subdomains) and let the SEO ingenuity naturally evolve from there.

To answer your question: yes, you can use databases to help with the content for these sites. Check out my Madlib Sites post for some great ideas on how to accomplish that and use databases. As for the second part, YES, you can use other CMSs such as Wordpress!

How To Use Wordpress To Do Addon Domain Spam
I got several emails from people asking how to create a Wordpress plugin to accomplish this technique, as well as a comment from longtime reader PhatJ. I realize at first thought this sounds like a complicated process: converting Wordpress over to reading multiple addon domains and treating them as multiple installs sounds like it would require some sort of plugin to be created. But as with most things, the simple solution is often the best.

The easiest and most effective way to convert any CMS to be used for addon domains that I’ve found is to simply edit the config files. No joke, that’s seriously usually all it ever takes. In my wordpress wp-config.php file I grabbed the line that declared the database:

define('DB_NAME', 'database1');

I replaced it with a simple IF ELSE statement to check for the domain and define the appropriate database:

// Pick the right database based on which addon domain was requested.
if ( $_SERVER["HTTP_HOST"] == 'domain1.com' ) {
    define('DB_NAME', 'database1');
} elseif ( $_SERVER["HTTP_HOST"] == 'domain2.com' ) {
    define('DB_NAME', 'database2');
} else {
    // Fall back to the first blog's database for any unrecognized host.
    define('DB_NAME', 'database1');
}
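If you end up mapping more than a couple of addon domains, a chain of elseifs gets tedious. Here's a minimal sketch of the same idea using a lookup array instead (the domain and database names are placeholders, not from my actual setup); it drops into wp-config.php in place of the if/else block above:

// Hypothetical example: map each addon domain to its own database.
$db_map = array(
    'domain1.com' => 'database1',
    'domain2.com' => 'database2',
    'domain3.com' => 'database3',
);

// Normalize the host so www.domain1.com and domain1.com hit the same database.
$host = strtolower(preg_replace('/^www\./', '', $_SERVER['HTTP_HOST']));

define('DB_NAME', isset($db_map[$host]) ? $db_map[$host] : 'database1');

Same single install, and adding another site is just one more line in the array.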

Then I just pull up each domain in the browser (or run a mass Wordpress installer script) and set up each blog as if it were a separate install.

To show it in action I put up a single Wordpress install on a subdomain on Bluehat. I then added a second database and put that code into the wp-config.php. Looking at each, you'd have no idea they were a single Wordpress install. See for yourself :)

Domain 1: http://addtest1.bluehatseo.com
Domain 2: http://addtest2.bluehatseo.com

Thanks for your question Primal!

Blue Hat Technique #21 - Advanced Mininet Building

I promised a while back that I'd teach you ugly bitches more ways to build your sexy SEO Empire. With some spare time this week I might as well take some time to help your nasty hooker ass do just that. YES I will insult you through this entire post because judging from the recent comments you donkey fuckers are getting a lil too big for your own britches and need to be brought down a peg. I'm kidding of course. You guys are great. I just feel like filling this post full of as many reasons not to read it as possible, and since no one gave me an excuse to do it, I just made one up. :) This post will be advanced, and since this technique's ability to be bulletproof feeds off creativity and the subtleties of being self-made, I'll also only give out pseudocode instead of code samples. It is however an extremely efficient way to build large amounts of unique and self-promoting sites and is more than reusable for just about any chunk of niches, so modularize your code and save it for future scaling. Trust me, you'll wish you did.

Getting Started With Your Custom Mininet Generator
It's always easiest to start a project with an example in mind, so begin by picking a generalized niche that can involve a lot of different sites along the same theme. For this example I'll use music-based fan sites. So I'll grab a few starter domains to test with such as AudioslaveFanzSite.com, MetallicaFanzSite.com, KanyeWestFanzSite.com and maybe one more for good measure, JonasBrothersFanzSite.com. <- See how I was all insulting and mean at the beginning of this post and suddenly changed it around to being all considerate and using faggy bands as examples so you fags can relate to what I'm saying here. I'm not mean all the time and can in fact be quite understanding. *grin* Anyways! Now that you've got your domains, set up an account under a single domain on your server and add the rest as addon domains. In your site's root make a Sources folder and another to hold a data dump. After that, set up a MySQL database to be used by all sites and put a row in a table for a single domain you bought (the rest will be inserted automatically later). I recommend you put the actual domain in some value in that row.

Build a Single Universal Template
This is easier than it sounds. You can always go 100% custom, but to save time I like grabbing a generic looking premade template. I then put it into a script and dissect the HTML to put in as many variables as I can fit. A few examples would be <title>$sitetitle - $pagetitle</title> <h1>$heading1</h1> <body bgcolor="$bgcolor"> <img src="/data/$domain/images/$mainimage"> <div>$maincontent</div> which I will later fill with the actual content and variable values. Pack the template full of as many customizations as you can so it will not only be flexible and universal among all topics in the niche, but the HTML itself is very random and as un-cookie-cutter as you can get it. Towards the end of the process I also tend to throw in as many $randspacing type variables as possible. Then I use a randomizing function to create various spacing and line returns and randomly insert it throughout just about every group of HTML tags. I mention this now instead of later in the post because it's important to realize that you will want this template to be as flexible as possible, because you'll eventually be using that same template on a TON of sites that may or may not be doing some interlinking, so you don't want it to appear as a network. Changing colors, widths, and images around is a great way to accomplish this; just don't get too complicated with it starting out. Keep it very basic, and once you've got the mininet nearly done you can add as many as you'd like later. It's easy to throw yourself off focus and doom the project by getting too hung up on getting the same thing perfect. For each variable you place in the template you'll want to put the same as a field in the SQL table you created previously.
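To make that a bit more concrete, here's a rough sketch of how such a template could get its placeholders filled from the database. Everything here is illustrative: the function name, file path and field names are my own hypothetical choices, not part of the technique itself.

// $site is one row from your sites table (e.g. fetched with mysql_fetch_assoc()).
// Placeholder and field names below are hypothetical examples.
function render_template($template_file, $site) {
    $html = file_get_contents($template_file);

    $replacements = array(
        '$sitetitle'   => $site['sitetitle'],
        '$pagetitle'   => $site['pagetitle'],
        '$heading1'    => $site['heading1'],
        '$bgcolor'     => $site['bgcolor'],
        '$mainimage'   => $site['mainimage'],
        '$maincontent' => $site['maincontent'],
        '$domain'      => $site['domain'],
        // Random-ish whitespace so the markup isn't byte-identical across sites.
        '$randspacing' => str_repeat("\n", rand(1, 3)) . str_repeat(' ', rand(0, 4)),
    );

    return str_replace(array_keys($replacements), array_values($replacements), $html);
}

echo render_template('/data/' . $site['domain'] . '/template.html', $site);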

Putting Together Some Content Sources
For an example such as the music fan site mininet, I'd probably jot down a few sources of content such as Audioscrobbler for the band descriptions, primary image, and discography. Then the YouTube API for a few music videos for each musician. Another great source would be Yahoo Images for some band-related wallpapers, and Alexa for some related sites to link to. I might even grab the Google Blogsearch RSS for some recent blog posts related to that artist. Starting out it's usually best to keep your sources as simplistic as possible and not stray too far from readily available RSS feeds and APIs. Like I said, you can always get more advanced and custom later. Create a module script for each source and put it in your previously created Sources folder. Then for each source you came up with, add it as a table in your SQL database and put in all the fields you'll need for each one, and remember to save room to identify the domain on each one.

Building The Generator
Create a backend script that will simply be a place to copy and paste a list of the domains and their primary keywords into with a button to submit it. My domains and keywords for this example would most likely be pipe delimited such as:
GodsmackFanzSite.com|God Smack
U2FanzSite.com|U2
BeyonceFanzSite.com|Beyonce Knowles
Once the list is submitted, the generator needs to insert a new row into the table and create all the randomized variables for the site, such as the background colors and various spacings (and/or a brand new template file stored in the data folder), putting them in the same single row. Once the basics are done it can call all the source modules and run them using the domain name and the keywords they need to grab the right content. They will then need to put that content into the database under the proper domain's row(s). You now have all the content you need for each site and each has its own template! Now it's time to just build the son of a bitch.
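Purely as an illustration of that backend (everything here is an assumption on my part: the sites table, its columns, and the convention that each Sources module is a script expecting $pdo, $domain and $keyword to already be in scope), the generator could look something like this:

// generator.php -- paste the domain|keyword list into a form that posts 'domain_list' here.
$pdo = new PDO('mysql:host=localhost;dbname=mininet', 'user', 'pass');

$lines = array_filter(array_map('trim', explode("\n", $_POST['domain_list'])));

foreach ($lines as $line) {
    list($domain, $keyword) = explode('|', $line, 2);

    // One row per site, with its randomized presentation values stored once so they stay static.
    $stmt = $pdo->prepare('INSERT INTO sites (domain, keyword, bgcolor) VALUES (?, ?, ?)');
    $stmt->execute(array($domain, $keyword, sprintf('#%06X', mt_rand(0, 0xFFFFFF))));

    // Run every source module (Audioscrobbler, YouTube, Yahoo Images, ...) for this site.
    // Each module script uses $pdo, $domain and $keyword and writes its scraped
    // content into its own table keyed by domain.
    foreach (glob('Sources/*.php') as $module) {
        include $module;
    }
}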

BUT! Before I show you how, I'll give you a few examples of how I would set up my tables for each of the sources so you can get a better idea.
For my Youtube I’d probably keep it simple and just do the domain and the embed code.
Domain|EmbedCode

Audioscrobbler:
Domain|BandDescription|Albums|PrimaryImage

YahooImages
Domain|PathToImage

GoogleBlogSearch
ID|Domain|PostTitle|PostDescription|PostLink

Alexa
Domain|RelatedSite1|MySite1|RelatedSite2|MySite2|RelatedSite3|MySite3|MoneySite1

*the MySite1 would be another random fan site in your list of domains. The MoneySite1 would be a money site of yours you can insert later to help with upward linking ;) These are foundation sites after all.

So simple even a retarded piss bucket like yourself can figure it out :)

Scripting The Sites
I know some of you are going to talk about dedicated IPs for each site and various other expensive ways to make sure the sites don't get associated with each other, but there was a good reason I said to use addon domains, although there are other more complicated and better solutions. The first thing you should do when scripting the index page is to grab the current domain using an environment variable such as HTTP_HOST. Once you have the domain name you can use that to grab all the appropriate data for each domain name, and you only have to code one site and get it to work for ALL the sites in the mininet. For instance, if someone goes to JayZFanzSite.com it'll grab that into the variable and customize the entire site to be a Jay-Z fan site, even though it's all the same script controlling all the addon domains. I always start with the main page and branch all my subpages off that. For instance, for JayZFanzSite.com I would put in a section for Jay-Z Music Videos and link to More Jay-Z Music Videos (the Jay-Z being that domain's primary keyword as specified in the DB). That Jay-Z Music Videos subpage would just be more previously scraped music videos from YouTube. The same would be done for the Jay-Z Wallpapers, Jay-Z Discography, Jay-Z Lyrics, Jay-Z Guitar Tabs… whatever sources I'm using. Each would be a small section on the main page and would expand into its own subpage targeting popular keywords for that artist. Once all that is done and built into the template, you can test each change among all the current test domains you have to make sure each shows up nicely and the randomizations and backgrounds are all static and neat for each site. Be sure to put in a place for your Alexa similar sites and, as shown above, mix in links to your other fan sites for each band/musician as well as some placements for your current and future money sites so they can all get good link volume. Once every test site looks pretty and is fully functional, along with fairly unique content, all you have to do is scale up with more domains.
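Here's a bare-bones sketch of what that single front controller could look like. Again, these are my own assumptions for illustration: the sites table, PDO, and the render_template() helper sketched earlier are stand-ins, not prescriptions.

// index.php -- one script serving every addon domain pointed at this install.
$pdo = new PDO('mysql:host=localhost;dbname=mininet', 'user', 'pass');

// Which site was requested? e.g. jayzfanzsite.com
$host = strtolower(preg_replace('/^www\./', '', $_SERVER['HTTP_HOST']));

$stmt = $pdo->prepare('SELECT * FROM sites WHERE domain = ?');
$stmt->execute(array($host));
$site = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$site) {
    header('HTTP/1.1 404 Not Found');
    exit('Unknown domain');
}

// Pull this domain's scraped content (videos, wallpapers, blog posts, ...) from the
// source tables the same way, then hand everything to the universal template.
echo render_template('/data/' . $site['domain'] . '/template.html', $site);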

BUT FIRST! I like to incorporate ways for each site to build links on its own. For instance, for the Google Blogsearch posts I'd put a section on the main page for Jay-Z News listing the most recent 25 blog post results or so. Then I would build a small cronjob script to update it every day with 25 new posts or so and do a pingback on each, to score a few unique links from other related sites every day automatically. This way you not only have lateral links from other sites on the mininet but also links from outside sites, and the links are always growing slowly, so each site can continue to grow in rank and traffic over time.
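One way that cron job could look, strictly as a sketch: it assumes the target posts advertise a pingback endpoint via the standard X-Pingback header and speak the usual pingback.ping XML-RPC method, and the feed URL and page paths are placeholders rather than real endpoints.

// cron_pingback.php -- run daily per site; $domain and $keyword come from the sites table.
$feed = simplexml_load_file('http://blogsearch.example.com/rss?q=' . urlencode($keyword));

foreach ($feed->channel->item as $item) {
    $target = (string) $item->link;           // the blog post we just listed and linked to
    $source = 'http://' . $domain . '/news/'; // our page that now links to it

    // Find the target's pingback endpoint (header case can vary between servers).
    $headers = array_change_key_case(get_headers($target, 1), CASE_LOWER);
    if (empty($headers['x-pingback'])) {
        continue;
    }
    $endpoint = is_array($headers['x-pingback']) ? end($headers['x-pingback']) : $headers['x-pingback'];

    // Standard pingback.ping XML-RPC call: params are (sourceURI, targetURI).
    $body = '<?xml version="1.0"?><methodCall><methodName>pingback.ping</methodName><params>'
          . '<param><value><string>' . htmlspecialchars($source) . '</string></value></param>'
          . '<param><value><string>' . htmlspecialchars($target) . '</string></value></param>'
          . '</params></methodCall>';

    $ch = curl_init($endpoint);
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $body,
        CURLOPT_HTTPHEADER     => array('Content-Type: text/xml'),
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 10,
    ));
    curl_exec($ch);
    curl_close($ch);
}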

Buying More Domains and Scaling Up
As indicated, I like to keep it simple and pick a prefix or suffix that has many open domains. That way I don't have to spend a ton of time picking out the domains; I can just grab a list of several thousand popular bands, mass buy the domains, then copy and paste them into the backend generator. Boom! Several hundred to, if you're bold enough, thousands of new sites. All of which will grab quite a bit of underexposed traffic from keywords, image search and links. It will also give you tons of links and awesome pagerank for future sites you build. It's a lot of work initially, but it's definitely easier than hand-building all those sites yourself, and the sites can easily become just as successful as if you did, especially if you did a good job with your sources. Once you've scaled up that mininet to a level you're comfortable with and can maintain financially (it helps to build a monetization module into the sites so you can easily switch out ads among all the sites so they create the most money possible per site), you can switch to a new group of sites using the same code, many of the same sources, and the same method. The music fan site example is great because nearly the exact same code can be used in so many ways. For instance I can take the same damn code, get rid of the Audioscrobbler and swap it for a widely available car DB for the description, image and car specs, and build a whole mininet for every single make and model of car out there with a whole new set of domains such as JaguarXJ220specs.com, BMW540specs.com, PontiacGrandPrixspecs.com. It's as easy as swapping out the keywords used in the modules so they become Pontiac Grand Prix Videos (from the YouTube source) and Pontiac Grand Prix Wallpapers/Images. All you need is a new template and a new group of domains to build an absolutely massive and diverse mininet that is actually profitable and self-growing.

PS. I know I said HUNDREDS and THOUSANDS of sites all dramatically, but as with all techniques start off small. Get your scripts and promotion right. Make sure it works and is profitable on a per site basis before scaling up to any ridiculous levels.

LATA JERKS!

Review of AutoPligg Backlink Tool

If you already own AutoPligg read this post anyways. I promise you'll learn something…

Today's tool review is of a tool called AutoPligg by the Syndk8 crew. You've probably already heard of it by now, but I've been using it for the last 6 months or so and I wanted to give you some insight on the tool and maybe some resources if you don't already have it. AutoPligg is a Windows or server-side based tool that spams the popular Pligg platform, a CMS that is basically like Digg. There are a TON of sites out there using the Pligg platform, which makes this tool more than deserving of a review.

How Is It Useful
I use the term spam tool hesitantly in the case of AutoPligg because it's unique in the fact that, yes, it is a spam tool and it is automated link building, but I think its real power lies in using it for white hat purposes as a way to mass post your stories to a whole lot of fuckin sites that want your posts. If you look at the typical Pligg site it's structured much like Digg, with categories; you put up your links in those categories, people can vote them up or down, and people leave comments. Ya know, all that bullshit you're already familiar and tired of. You can get links one of two ways on them. You can post your links (typically article links if you don't want them deleted) as stories or leave comments. Very few of these sites are very high PR or of a high link quality, but anytime you have a platform that allows you to post links on it, with well over 100k working sites out there using it, it becomes a link builder's wet dream. For that, AutoPligg becomes a very useful tool. Here's where the catch-22 happens, however.

The Quality Of The Links It Builds
It's no secret that I've never been a believer in nofollow and its ability to be "not followed", but I'm never very vocal and definitive about it because 10 minutes after I say something about it, it could change lol. Ain't that a bitch. The main story links are nofollowed by the Pligg platform. So every link you post as a story will be nofollowed. HOWEVER! The link will be the top outside link excluding the site's possible navigation. It will be in a heading tag and have your anchor text. Plus it gets its own page that will be cycled through the main page. Here's the strange shit: the comment links are dofollow. They are only placed automagically by the platform by putting the full URL, including the http://, in your comment. They are dofollow but they don't have your anchor text. They don't cycle through the main pages and they are standard links beyond the point of the page's navigation. By now, if you've been paying attention to this blog, you should already know what link type I prefer… both! But let's not lose focus on the point of this tool by worrying about the quality. It is by all uses a link volume building tool, NOT a link quality building tool. It's nice to have at least a bit of link quality with every link so they produce the link worth necessary to make them count instead of going supplemental. This brings up another SEO concept that the tool was kind enough to also address: link indexing saturation. I was really curious, with all the links it builds at once, whether it'd have at least something to get them indexed. It does have a pinger that also allows proxy use. Cool! Good enough for me. Between that and the outputted list of successful postings I have everything I need to get as high of a link indexing saturation rate as possible. Kudos on that.

How Is It Not Useful
I may pull no punches when it comes to the negatives in tool reviews on BlueHat, but to date I've never given a negative review of a tool. This is simply because I'll only post the review if I think the tool is useful to you, the reader. I tend to reserve that right when asked to do a review. AutoPligg does have a VERY strong negative side, which I was really hoping would go away in the months since its release so that this review could be all positive and not get dinged just because of a few issues that typically work themselves out in the first few months after a release. At this point, almost 8 months later, it appears this isn't going to happen, so the beef still stands. Anytime a new link building tool comes out there are always those few people (amateurs) that want to get their "money's worth." They use the tool in the most retarded way possible and do it as hard as they can. It typically makes the tool worthless for the first couple months. It does them no good and provides no benefit, and at the same time hurts everyone else and the tool itself. AutoPligg, stemming from the Syndk8 crowd, who are notorious, and even self-proclaimed, for pulling this kind of shit, wasn't expected to be an exception. ESPECIALLY since it has a Windows based version. At least with server-side versions you have to have at least a few braincells to use the program, and with web-based you can cancel their accounts. So this was entirely expected, but what shocked me was this: it was, and since the launch always has been, only about 3 users of the program that have been doing this idiotic, irresponsible stuff with the tool. Unfortunately, as I mentioned above, they've yet to stop.

What they've done is they've opened up like 10 instances of the program and put a single super long comment into the comment poster that's nothing but a ton of links to separate subdomain spam on a single site (I guess after buying the tool they couldn't afford more domains). They have a macro restarting each submission over and over and over, so every single post on every single Pligg site instantly gets hit with a ton of worthless spammy links. The main domain has been banned for months now, so instead of stopping the script they just forwarded the domain, as if it was going to do them any fucking good. Especially since they forwarded the domain and all its subdomains to a single wordpress blog with some spammy text written by the world's worst content generator and no ads, offers or particular keywords. Here, let me show you what I'm talking about and you can decide if this isn't the most retarded shit you've ever seen: http://www.5wing4.net/story.php?title=Cat_Urine_Removers-1. If you're like OMG that's my site he just outed, consider this before bitching to me: if you don't want attention, don't be an attention whore. You did it to yourself, fuck off, I don't care. The sites were worthless the moment you got put in charge of SEOing them anyways.

So please, use this tool responsibly. Even if others aren't, it's still a very useful tool and one I would definitely recommend you have in your arsenal because, like I said, it's for link volume and deep link volume, not link quality. For that reason I'll go over the best way to use the tool and how to get the most out of it, because oftentimes the trick to harnessing the optimal power of a link building tool is using it correctly.

How To Use Autopligg The Right Way
As with all Blue Hat Techniques, the short answer is: mimic the white hatters. Now that you know how not to use the tool, I'll show you how I used it and got a lot of success. I am actually planning on writing a guest post on this for SEOBook that'll cover it more in depth, but for now I'll at least introduce what I call MacroNiche Blogs, and if he doesn't end up publishing it I'll at least post it here for y'all. <- Check it out Quadzilla, another new SEO concept! Quick copy paste copy paste copy paste!

MacroNiche Blogs
People often talk about Niche Sites and MicroNiche Sites. In case you're not familiar, they're basically talking about the size and focus (scope) of a particular site. For instance a niche site might be a site on female orgasms. It'll have the main page, which sells some product(s) related to those, and it'll have a few other articles on particular topics such as Gspot orgasms, clit orgasms and various other myths. Each of those pages will also typically sell a product or an offer related to them as well. A MicroNiche Site is very similar except it's more focused, such as the main page being solely on the Gspot orgasm, then a few supporting subpages on the same topic but all pushing the same product and offer. Since I love to confuse things I tend to encompass all of these types of sites into the single "Money Site" term. This is because I technically consider every site, no matter where it is in the empire's scale, a niche site, because it does focus on a niche, thus it is a niche site. Therefore I tend to group things on the intensity and focus of the site pushing a single offer or product. A Money Site would be at the top of that scale. Therefore a MacroNiche Site, or in this case a MacroNiche Blog, is exactly what it implies. It's a single blog that encompasses a very large niche but doesn't focus on that niche. Instead it focuses on lots of much smaller niches within the single site and uses the authority being passed through the subsections to help the others, so a new site for each microniche isn't required to rank for each individual offer.

Although accurate, I may have explained that in a way that's a bit more complicated than it is. Let's use an example, an example that happens to be the very first test site I used AutoPligg on. My blog's macro niche was Health. I then looked through a bunch of offers on some CPA networks and made a list of niches based on all the offers I found that would fit into the Health macro niche. So I set up my blog with a bunch more categories such as: Weight Loss, Hair Care, Teeth Care, Bodybuilding, Skin Care, Exercise. The list goes on and on. I was very thorough in my niches. I then put in some subcategories under each that were my microniches. These were more focused on individual offers. For instance under Weight Loss I put in Dieting and Diet Pills, under Teeth Care I put in stuff like Teeth Whitening, and under Exercise I put in Exercise Equipment, Yoga, blah blah.

This MacroNiche structure was the perfect test for AutoPligg because it utilized AutoPligg's most powerful attribute: its ability to build both deep link volume and main page link volume. Had I only used a Niche Site or a MicroNiche Site, after submitting the main page and all the subarticles I would have become immediately limited to just spamming the comments for further link volume. The ability to continuously use both the article submission for the microniches and the standard niches, and the comment postings for the main page authority and link volume, gives this structure a huge benefit in its overall domain authority and its ability to quickly rank new posts. So that's exactly what I did and continue to do.

First, once I got the design and categories set up, I hired a couple of writers to go through each of the categories and subcategories and schedule several months' worth of daily posts, making sure each category and subcategory was accounted for within the first month and then at least once in each subsequent month. Once they were done I immediately did a directory submission for each primary category as well as the main page, along with some social bookmarking and various other link building. As you're all bound to ask: for my primary keyword for the main page I picked one that was fairly medium competition and would grab the attention of other health-related bloggers. That way, later when I got that ranking and wanted to do some blogroll link exchanges, it was easy to find willing targets. Then every day I would give special attention to the individual blog posts for that day. For each one I would submit it through the story submitter in AutoPligg so it would get several thousand links, plus each of those links would get pinged for some link saturation, along with a few daily rounds of comment postings for that category and the main page. I would then do a quick scan for similar articles on each post and submit those through several hundred article directories with links to both the singular post and the main page, as well as the subcategory with the keywords for each. I would then ping and social bookmark the individual post. After that all I had to do was find 5 new posts with the same keywords as that post and exchange links within the post to the other bloggers' posts *cough* Pingcrawl *cough*. This also got the attention of other bloggers in the niches and macroniche, making them notice my site and be more willing to do blogroll exchanges. That caused each individual post to quickly start ranking the same day/week it was posted and brought up the total site authority with every one. Which brought me to my final step: bringing up the total site authority as much as possible via the main page, by placing links to the main page across my SEO Empire, running some automated link building tools, and doing general link building such as commenting on other blogs/news sites. The fact that it was a MacroNiche Blog with legitimate articles on each subject made each post more credible and the site more credible overall, so if there was a human review, or in the case of directory submissions, it was more palatable and acceptable to the reviewer.

This is where it turned into a mutha fuckin Money Site! I could go through the offers at my leisure, write targeted salescopies pushing the offers and throw them up as I wrote them, as if they were their own microniche sites. Likewise, when new offers would come out on untapped niches, while everyone else had to start building new sites around them and then start the link building to get them ranked, I could just immediately jump in with a new post and instantly hold a new ranking using the site's existing authority. You actually see this happen all the time when shit like Oprah's fat ass blogs about a product. All I was doing was using AutoPligg and my eye for structure to mimic that effect. There are obviously a lot of spins and ways to build a MacroNiche Site, as with any structure, but if you start out and do it right the first time you'll quickly learn and expand very nicely. There are also a lot of macro niches out there to take over, such as Business (bizops, grants, paid surveys etc.), Consumer Reports (credit cards, credit reports etc.) and Downloads (adware, recipe/music/games programs, etc.).

Pricing
Now that I've given it a fairly positive review and got you started on how to have success with it (the two make a logical pairing), let's talk pricing, because as it stands now it's a bit more complicated than it should be for tools I review. It's normal for IM tools to fluctuate their prices to accommodate sales volume drops, or to say it's only $5 for the first 5 people and then $100 for every person after that. That's all mind fuck marketing bullshit. The only excuse to fluctuate a price on a tool is when new versions come out (new versions with new features, not new versions that simply fix old bugs for that matter), because that causes more development costs along with support and marketing costs. So in that right it's understandable. Normally, as a condition of doing a review, I get assurance that the price, at least for the readers here, will not change and that there is always some sort of long-standing discount. This time I'm going to break the rule just a bit. When AutoPligg released it cost $289. As of writing this post it's $189. I'm told a new version is coming out next month and the price will go back up to $289. This is okay with me in this case for three reasons. First, I was okay with the original $289 price, so as long as the flux doesn't go back above that then I'm cool. Second, I've been assured that the people who buy at the $189 price will get the new version free. There's nothing I hate more than having to buy the same product twice. Third, allegedly the new version will have a new feature of being able to crack the reCAPTCHA captchas used by many of the Pligg sites. This opens you up to literally tens of thousands of new targets. That in itself is a big enough feature to warrant a price change. Finally, after talking to Earl Grey (the owner) I got a coupon code that should last through the price change with the new version.

AutoPligg Purchase Website
Use the coupon code: BLUEHATSEO for a $45 discount.
I recommend you use the Windows desktop version. It’s faster and more efficient than the server-side version. I know, I know, weird eh? :)

BONUS
Since lists of good Pligg sites are tough to come by and the program requires one, I took the liberty of building my own list for you guys. Here’s the database of the 7,409 Pligg sites I use. Although most require manual registration (till the new version comes out), all were scraped by me, tested, working, and importable into the program. It should be fairly error free at the time of writing this post, but these sites do change daily, so bear with it if the list slowly becomes worthless as time passes. The best advice I can give you is, for now, ignore the lists they hand out in the private forum. Most are raw lists, a colossal waste of time to import (several hours each), and you’ll only get 200-400 good ones out of each.

List of AutoPligg Sites
Right click Save As

]]>
https://www.bluehatseo.com/review-of-autopligg-backlink-tool/feed/
Conspiracy Theories Please https://www.bluehatseo.com/conspiracy-theories-please/ https://www.bluehatseo.com/conspiracy-theories-please/#comments Fri, 05 Jun 2009 09:09:05 +0000 Eli Random Thoughts https://www.bluehatseo.com/conspiracy-theories-please/ If you find portions of my writing style on this blog hard to understand, overly complicated, different than my writing style elsewhere, or just generally outside the realm of normal blogging, it’s not because I’m trying to hide something, trick you, cover up a lack of experience, or prevent you from learning the technique. In fact I’m not blogging at all. I’m writing pseudocode.

Thanks for understanding,
-Fishy Eli

]]>
https://www.bluehatseo.com/conspiracy-theories-please/feed/
Open Questions - Subdomains and Main Domains https://www.bluehatseo.com/open-questions-subdomains-and-main-domains/ https://www.bluehatseo.com/open-questions-subdomains-and-main-domains/#comments Thu, 04 Jun 2009 07:53:59 +0000 Eli Random Thoughts https://www.bluehatseo.com/open-questions-subdomains-and-main-domains/ I got a great question on my SEO Empire post from Ryan at NetSEO. I figured it was worth addressing in a post rather than leaving a really long comment.

Eli,
Can you be so kind and explain why this is:

“Primary domains can pass a penalty to subdomains. Subdomains can’t pass a penalty to a main domain unless the main domain holds a relation to the subdomain (ie. a link).”

Happy to answer Ryan :)

Anytime I make a statement like that I am usually making a reference to an exemption to the general This-Is-My-Site -> This-Is-Google -> This-Is-The-Value stream of things. Sometimes I’m a bit presumptuous in assuming readers caught the reference. In this instance I’m talking about the exception given to protect free hosts from penalties, particularly those who give their users subdomains, such as Hypermart, Xoom, Wordpress.com, Blogger, Tripod, etc. This exemption can’t only cover the popular free hosts, otherwise no new free hosts would ever stand a chance. As soon as they got a single spammy user their whole site could get banned and poof goes their legit business. Likewise, algorithmically it can’t cover all free hosts, because then the big ones like Wordpress.com and Typepad would all be penalized. Looking ahead, this would also include profile based social sites such as Myspace and outbound linking social sites such as Delicious. Anyone remember when Geocities sites used to rank so well? :) Yet at the same time, with a lot of splog platforms out there, manual reviews would be a nightmare and unfeasible. So there is a line drawn. That line has to consist of some sort of relationship between the primary domain and the subdomain of a site that’ll evaluate whether the “subsite” belongs to the main site or is a separate entity. By way of algorithms that relationship is very tough to determine. In fact it’s damn near impossible to do with 100% accuracy. Unfortunately for them, they have the burden of relying on internal linking relationships between the two, which would include the above statement as well as other protective factors that encompass other exceptions such as nonstatic links (like Furl & Delicious would use).

This area of uncertainty gives us SEO peeps room to do things such as create splogs and do subdomain spam. As long as we know what they’re looking for (the antispam teams), we know what not to provide. Most sites that own their subdomains link from their main page down to the subpages, to the subdomains, and so on and so forth. So when in doubt, do the opposite. It’ll leave less of a chance for a relationship between the main domain and subdomain to be found, and if you’re worried about link juice, get it from other sources via deep links.

]]>
https://www.bluehatseo.com/open-questions-subdomains-and-main-domains/feed/
Advanced White Hat SEO Exists Damn It! - Dynamic SEO https://www.bluehatseo.com/advanced-white-hat-seo-exists-damn-it-dynamic-seo/ https://www.bluehatseo.com/advanced-white-hat-seo-exists-damn-it-dynamic-seo/#comments Wed, 22 Apr 2009 01:37:00 +0000 Eli Guides https://www.bluehatseo.com/advanced-white-hat-seo-exists-damn-it-dynamic-seo/ Hello again!
I’ve been restless and wanting to write this post for a very long time and I’m not going to be happy until it’s out. So get out your reading glasses, and I have it on good authority that every reader of this blog happens to be the kind of dirty old man that hangs out and harasses high school chicks at gas stations, so don’t tell me you don’t have a pair. Get ‘em out and let’s begin….

Fuck, how do I intro-rant this post without getting all industry political? Basically, this post is an answer to a question asked a long time ago at some IM conference to a bunch of gurus. They were asked: does advanced White Hat SEO exist? If I remember right, and this was a long time ago and probably buzzed up so forgive me, every guru said something along the lines of there is no such thing as advanced White Hat SEO. Now I’m sympathetic to the whole self promotion thing to a small degree. If your job is to build buzz around yourself you have to say things that are buzz worthy. You can’t say the obvious answer, YOU BET IT DOES AND YOU’RE RETARDED FOR ASKING! You gotta say something controversial that gets people thinking, but not something so controversial that anyone of your popularity level is going to contradict it in a sensible way, making your popularity appear more overrated than a cotton candy vendor at the Special Olympics. In short, yes, advanced white hat exists and there are tons of examples of it; but you already knew that and I’m going to give you such an example now. That example is called Dynamic SEO. I’ve briefly mentioned it in several posts in the past and it is by every definition simple good ol’ fashioned on-site keyword/page/traffic optimizing White Hat SEO. It also happens to be very simple to execute but not so simple to understand. So I’ll start with the basics and we’ll work into building something truly badhatass.

What Is Dynamic SEO?
Dynamic SEO is simply the automated, no-guessing, self-changing way of SEOing your site over time. It is the way to get your site as close to 100% perfectly optimized as needed without ever knowing the final result AND automatically changing those results as they’re required. It’s easier done than said.

What Problems Does Dynamic SEO Address?
If you’re good enough at it you can address EVERY SEO related problem with it. I am well aware that I defined it above as on-site SEO, but the reality is you can use it for every scenario, even off-site SEO. Hell, SQUIRT is technically dynamic off-site SEO. Log Link Matching is even an example of advanced off-site Dynamic SEO. The problems we’re facing in this post specifically include keyword optimization, which covers keyword order, keyword selection, and even keyword pluralization.

See, the problem is you. When it comes to subpages of your site you can’t possibly pick the exact best keywords for all of them and perfectly optimize each page for them. First of all, keyword research tools often get the keyword order mixed up. For instance they may say “Myspace Template” is the high traffic keyword, when really it could be “Templates For Myspace”. They just excluded the common word “for” and got the order wrong because “Template Myspace” isn’t popular enough. They also removed the plural to “broaden” the count. By that logic Myspace Templates may be the real keyword. Naturally, if you have the intuition that this is a problem you can work around it manually. The problem is not only will you never be perfect on every single page, but your intuition as a more advanced Internet user is often way off, especially when it comes to searching for things. Common users tend to search for what they want in a broad sense. Hell, the keyword Internet gets MILLIONS of searches. Who the fuck searches for a single common word such as Internet? Your audience is who. Whereas you tend to think more linearly with your queries because you have a higher understanding of how Ask Jeeves isn’t really a butler that answers questions. You just list all the keywords you think the desired results will have. For instance, “laptop battery hp7100” instead of “batteries for a hp7100 laptop.” Dynamic SEO is a plug n play way of solving that problem automatically. Here’s how you do it.

Create A Dynamic SEO Module
The next site you hand code is a great opportunity to get this built and in play. You’ll want to create a single module file, such as dynkeywords.pl or dynkeywords.php, that you can use across all your sites and easily plug into all your future pages. If you have a dedicated server you can even set up the module file to be included (or required) on a common path that all the sites on your server can access. With it you’ll want to give the script its own SQL database. That single database can hold the data for every page of all your sites. You can always continue to revise the module and add more cool features, but while starting out it’s best to start simple. Create a table that has a field structure similar to ID,URL,KEYWORD,COUNT. I put ID just because I like to always have some sort of primary key to auto increment. I’m a fan of large numbers, what can I say? :)
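If it helps to picture it, here’s a minimal sketch of that shared table in PHP/MySQL. The credentials, the dyn_keywords table name, and the column names are placeholders I made up for illustration, not anything the module has to use:

<?php
// One shared table holds the keyword counts for every page of every site on the server.
$db = new mysqli('localhost', 'dbuser', 'dbpass', 'dynseo');

$db->query("
    CREATE TABLE IF NOT EXISTS dyn_keywords (
        id      INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        url     VARCHAR(255) NOT NULL,
        keyword VARCHAR(255) NOT NULL,
        `count` INT UNSIGNED NOT NULL DEFAULT 1,
        KEY url_idx (url)
    )
");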

Page Structure & Variables To Pass To Your Module
Before we get deep into the nitty gritty functions of the module we’ll first explore what basic data it requires and how the site pages will pass and return that data. In most coded pages, at least on my sites, I usually have the title tag in some sort of variable. This is typically passed to the template for obvious reasons. The important thing is it’s there so we’ll start with that. Let’s say you have a site on home theater equipment and the subpage you’re working on is on LCD televisions. Your title tag may be something like “MyTVDomain.com: LCD Televisions - LCD TVs”.

Side Note/
BTW sorry I realize that may bother some people how in certain cases I’ll put the period outside of the quotes. I realize it’s wrong and the punctuation must always go inside the quotes when ending a sentence. I do it that way so I don’t imply that I put punctuation inside my keywords or title tags etc etc.
/Side Note

You know your keywords will be similar to LCD Televisions, but you don’t know whether LCD TVs would be a better keyword, i.e. it could either be a higher traffic keyword or a more feasible keyword for that subpage to rank for. You also don’t know if the plurals would be better or worse for that particular subpage, so you’ll have to keep that in mind while you pass the module the new title variable. So before you declare your title tag, create a quick hash (associative array) for it. In this structure you’ll want to put the estimated best keywords for the page:
[
Keyword1 -> ‘LCD Television’,
Keyword2 -> ‘LCD TV’,
]
Then put in the plurals of all your keywords. It’s important not to try to over automate this because A) you don’t want your script to just tag the end of every word with “s” because of grammatical reasons (skies, pieces, moose, geese) and B) you don’t want your module slowing down all the pages of your site by consulting a dictionary DB on every load.
[
Keyword1 -> ‘LCD Television’,
Keyword2 -> ‘LCD TV’,
Keyword3 -> ‘LCD Televisions’,
Keyword4 -> ‘LCD TVs’,
]
Now for you “what about this awesome way better than your solution” mutha fuckas that exist in the comment section of every blog, this is where you get your option. You didn’t have to use a hash above; you could have just used a regular array and passed the rest of the data in their own variables, or you could have put them at the beginning of the standard array and assigned the trailing slots to the keywords, OR you could use a multidimensional array. I really don’t give a shit how you manage the technical details. You just need to pass some more variables to the module’s starting function, and I happen to prefer tagging them onto the structure I already have.
[
Keyword1 -> ‘LCD Television’,
Keyword2 -> ‘LCD TV’,
Keyword3 -> ‘LCD Televisions’,
Keyword4 -> ‘LCD TVs’,
URL -> ‘$url’,
REFERRER -> ‘$referrer’,
Separator -> ‘-’
]
In this case the $url will be a string that holds the current URL the user is on. This may vary depending on the structure of the site. For most pages you can just pull the environment variable of the document URL, or if your site has a more dynamic structure you can grab it plus the query_string. It doesn’t matter; if you’re still reading this long fuckin’ post you’re probably at the point in your coding abilities where you can easily figure this out. Same deal with the referrer. Both of these variables are very important, and inside the module you should make a check for empty data. You need to know what page the pageview is being made on, and you’ll need to know if the visitor came from a search engine and, if so, what keywords they searched for. The Separator is simply the character you want to separate the keywords with once it’s output. In this example I put a hyphen so it’ll be “Keyword 1 - Keyword 2 - Keyword 3”. Once you’ve got this, all you have to do is include the module in your code before the template output, have the module return the $title variable, and have your template output that variable in the title tag. Easy peasy, one beautiful single line of code. :)
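To make that concrete, here’s roughly what the page side could look like in PHP. dyn_title() and its argument names are invented for this sketch, not part of any existing module:

<?php
// On the LCD subpage, before the template is echoed.
require_once '/home/shared/dynkeywords.php';   // the shared module

$url      = 'http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
$referrer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

$title = dyn_title(array(
    'keywords'  => array('LCD Television', 'LCD TV', 'LCD Televisions', 'LCD TVs'),
    'url'       => $url,
    'referrer'  => $referrer,
    'separator' => ' - ',
    'prefix'    => 'MyTVDomain.com: ',   // optional site name at the front of the tag
));

// hand $title to your template and print it inside the <title> tag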

Basic Module Functions
Inside the module you can do a wide assortment of things with the data and the SQL and we’ll get to a few ideas in a bit. For now just grab the data and check the referrer for a search engine using regex. I’ll give you a start on this but trust it less the older this post gets:
Google: ^http:\/\/www\.google\.[^/]+\/search\?.*q=.*$
[?&]q= *([^& ][^&]*[^& +])[ +]*(&.*)?$
Yahoo: ^http:\/\/(\w*\.)*search\.yahoo\.[^/]+\/.*$
[?&]p= *([^& ][^&]*[^& +])[ +]*(&.*)?$
MSN: ^http:\/\/search\.(msn\.[^/]+|live\.com)\/.*$
[?&]q= *([^& ][^&]*[^& +])[ +]*(&.*)?$
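In PHP that check might look something like this. It’s only a sketch: the patterns are simplified versions of the ones above and will rot just as fast:

<?php
// Decide whether the referrer is a SERP and, if so, pull out the search query.
function dyn_search_keyword($referrer) {
    $engines = array(
        array('#^http://www\.google\.[^/]+/search\?#i',     '#[?&]q=([^&]+)#i'),
        array('#^http://(\w+\.)*search\.yahoo\.[^/]+/#i',   '#[?&]p=([^&]+)#i'),
        array('#^http://search\.(msn\.[^/]+|live\.com)/#i', '#[?&]q=([^&]+)#i'),
    );
    foreach ($engines as $e) {
        if (preg_match($e[0], $referrer) && preg_match($e[1], $referrer, $m)) {
            return trim(urldecode($m[1]));   // queries arrive url-encoded, + for spaces
        }
    }
    return '';   // not a search engine visitor
}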

Once you’ve isolated the search engines and the keywords used to find the subpage, you can check to see if that keyword exists in the database for that page. If it doesn’t exist, insert a new row with the page, the keyword, and a count of 1. Then select from the database where the page is equal to the $url, ordered by the highest count. If the count is less than a predefined delimiter (ie 1 SE referrer) then output the $title tag with the keywords in order (you may want to put a limit on it). For instance, if they all have a count of 1 then output from the first result to the last with the Separator in between. Once you get your first visitor from a SE it’ll rearrange itself automatically. For instance, if LCD TV has a count of 3 and LCD Televisions has a count of 2 and the rest have a count of 1, you can put a limit of 3 on your results and you’ll output a title tag with something like “LCD TV - LCD Televisions - LCD Television”, LCD Television being simply the next result, not necessarily the best result. If you prefer to put your domain name in your title tag, like “MYTVSITE.COM: LCD TV - LCD Televisions - LCD Television”, you can always create an entry in your structure for that and have your module check for it, and if it’s there put it at the beginning or end or wherever you prefer (another neat customization!).
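Stitched together with the pieces above, the guts of the module could look roughly like this. It’s just a sketch under the same made-up table and function names; the LIMIT of 3 and the way the delimiter case is handled are things you’d tune to taste:

<?php
// Record the hit, then build the title tag for this page.
function dyn_title($args) {
    $db  = new mysqli('localhost', 'dbuser', 'dbpass', 'dynseo');
    $url = $db->real_escape_string($args['url']);

    // If the visitor came from a SERP, bump the count for the keyword they used.
    $found = dyn_search_keyword($args['referrer']);
    if ($found !== '') {
        $kw  = $db->real_escape_string($found);
        $hit = $db->query("SELECT id FROM dyn_keywords WHERE url='$url' AND keyword='$kw'");
        if ($hit && $hit->num_rows > 0) {
            $db->query("UPDATE dyn_keywords SET `count` = `count` + 1 WHERE url='$url' AND keyword='$kw'");
        } else {
            $db->query("INSERT INTO dyn_keywords (url, keyword, `count`) VALUES ('$url', '$kw', 1)");
        }
    }

    // Pull this page's keywords, best performers first.
    $parts = array();
    $res = $db->query("SELECT keyword FROM dyn_keywords WHERE url='$url' ORDER BY `count` DESC LIMIT 3");
    while ($row = $res->fetch_assoc()) {
        $parts[] = $row['keyword'];
    }

    // No real SE traffic yet (the delimiter case)? Pad with the hand-picked keywords in order.
    if (count($parts) < 3) {
        $parts = array_slice(array_unique(array_merge($parts, $args['keywords'])), 0, 3);
    }

    $title = implode($args['separator'], $parts);
    return isset($args['prefix']) ? $args['prefix'] . $title : $title;
}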

Becoming MR. Fancy Pants
Once you have the basics of the script down you can custom automate and SEO every aspect of your site. You can apply the same technique you used on your title tag to your heading tags. As an example you can even create priority headings *wink*. You can go as far as doing dynamic keyword insertion by putting placeholders into your text such as %keyword%, or even a long nonsense string that’ll never get used in the actual text such as 557365204c534920772f205468697320546563686e6971756520546f20446f6d696e617465. With that you can create perfect keyword density. If you haven’t read my super old post on manipulating page freshness factors you definitely should, because this module can automate perfect timings on content updates for each page. Once you have it built you can get as advanced and dialed in as you’d like.
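The %keyword% swap itself is a one-liner once you can ask the database which keyword is currently winning for the page. dyn_top_keyword() below is a hypothetical helper reusing the same made-up table as the earlier sketches:

<?php
// Swap %keyword% markers in the page copy for the page's current best keyword.
function dyn_top_keyword($db, $url) {
    $url = $db->real_escape_string($url);
    $res = $db->query("SELECT keyword FROM dyn_keywords WHERE url='$url' ORDER BY `count` DESC LIMIT 1");
    $row = $res ? $res->fetch_assoc() : null;
    return $row ? $row['keyword'] : '';
}

$body = str_replace('%keyword%', dyn_top_keyword($db, $url), $body);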

How This Works For Your Benefit
Here’s the science behind the technique. It’s all about creating better odds of each of your subpages hitting those perfect keywords with the optimal traffic that page, with its current link building, can accomplish. In all honesty, done manually, your odds are slim to none, and I’ll explain why. A great example of these odds in play is the range in competitiveness and volume by niche. For instance, you build a site around a homes-for-sale database, do a bit of keyword research, and figure out that “Homes For Sale In California” is an awesome keyword with tons of traffic and low competition. So you optimize all your pages for “Homes For Sale In $state”. Without knowing it you may have just missed out on a big opportunity, because while “Homes For Sale In California” may be a great keyword for that subpage, “New York Homes” may be a better one for another subpage, or maybe “Homes For Sale In Texas” is too competitive and “Homes In Texas” may have less search volume but your subpage is capable of ranking for it and not the former. You just missed out on all that easy traffic like a chump. Don’t feel bad; more than likely your competitors did as well. :)

Another large advantage this brings lies in the assumption that short tail terms tend to have more search volume than long tail terms. Say you have a page with the keywords “Used Car Lots” and “Used Car”. As your site gets some age and you get more links to it, that page will more likely rank for Used Car Lots sooner than Used Car. By that same token, once it’s ranked for Used Car Lots for awhile and you get more and more links and authority, since Used Car is part of Used Car Lots you’ll become more likely to start ranking for Used Car, and here’s the important part. Initially, since you have your first ranking keyword, it will get a lot of counts for that keyword. However, once you start ranking for the even higher volume keyword, even if it is a lower rank (eg you rank #2 for Used Car Lot and only #9 for Used Car), then the count will start evening out. Once the better keyword outcounts the lesser one, your page will automatically change to be more optimized for the higher traffic one while still being optimized for the lesser. So while you may drop to #5 or so for Used Car Lot, your page will be better optimized to push up to say #7 for Used Car. Which will result in that subpage getting the absolute most traffic it can possibly get at any single time frame in the site’s lifespan. This is a hell of a lot better than making a future guesstimate on how much authority that subpage will have a year down the road and its ability to achieve rankings WHILE you’re building the fucking thing; because even if you’re right and call it perfectly and that page does indeed start to rank for Used Car, in the meantime you missed out on all the potential traffic Used Car Lot could have gotten you. Also keep in mind by rankings I don’t necessarily always mean the top 10. Sometimes rankings that result in traffic can go as low as the 3rd page, and hell, if that page 3 ranking gives you more traffic than the #1 slot for another keyword, fuck that other keyword! Go for the gold at all times.

What About Prerankings?
See, this is what the delimiter is for! If your page hasn’t achieved any rankings yet then it isn’t getting any new entry traffic you care about. So the page should be optimized for ALL, or at least 3-6, of your keywords (whatever limit you set). This gives the subpage at least a chance at ranking for any one of the keywords while at the same time giving it the MOST keywords pushing its relevancy up. What I mean by that is, your LCD page hasn’t achieved rankings yet, therefore it isn’t pushing its content towards either TV or Televisions. Since it has both essentially equaled out on the page, the page is more relevant to both keywords instead of only a single dominant one. So when it links to your Plasma Television subpage it still has the specific keyword Television instead of just TV, thus upping the relevancy of your internal linking. Which brings up the final advanced tip I’ll leave you with.

Use the module to create optimal internal linking. You already have the pages and the keywords; it’s a very easy and short revision. Pass the page text or the navigation to your module. Have it parse for all links. If it finds a link that matches the domain of the current page (useful variable), then have it grab the top keyword count for that other page and replace the anchor text. Boom! You just got perfectly optimized internal linking that will only get better over time. :)
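Here’s one way that could be wired up, sticking with the same hypothetical helpers as before. A regex is the lazy way to find the links; a proper HTML parser would be sturdier:

<?php
// Rewrite the anchor text of same-domain links to each target page's best keyword.
function dyn_optimize_links($db, $html, $domain) {
    $pattern = '#<a\s+href="(http://' . preg_quote($domain, '#') . '[^"]*)"([^>]*)>(.*?)</a>#is';
    return preg_replace_callback($pattern, function ($m) use ($db) {
        $top    = dyn_top_keyword($db, $m[1]);      // best keyword for the linked page
        $anchor = ($top !== '') ? $top : $m[3];     // leave it alone if we know nothing yet
        return '<a href="' . $m[1] . '"' . $m[2] . '>' . $anchor . '</a>';
    }, $html);
}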

There ya go, naysayers. Now you can say you’ve learned an SEO technique that’s both pure white hat and, no matter how simply you explain it, very much advanced.

]]>
https://www.bluehatseo.com/advanced-white-hat-seo-exists-damn-it-dynamic-seo/feed/
We Added A New Java Chat https://www.bluehatseo.com/we-added-a-new-java-chat/ https://www.bluehatseo.com/we-added-a-new-java-chat/#comments Thu, 12 Feb 2009 17:43:39 +0000 Eli Announcements https://www.bluehatseo.com/we-added-a-new-java-chat/ Facts be faced I haven’t been posting lately. So if you want to chat it up and get some solid advice feel free to join us in the chatroom.

Chat.BlueHatSeo.com

]]>
https://www.bluehatseo.com/we-added-a-new-java-chat/feed/
How To Overthrow A Wikipedia Result https://www.bluehatseo.com/how-to-overthrow-a-wikipedia-result/ https://www.bluehatseo.com/how-to-overthrow-a-wikipedia-result/#comments Tue, 04 Nov 2008 11:52:50 +0000 Eli General Articles https://www.bluehatseo.com/how-to-overthrow-a-wikipedia-result/ A busy ranking artist runs into this problem quite often. I ran into it again the other day and figured I might as well show my Blue Hat peeps how to overcome it, since it’s a fairly common problem to have and there is a simple solution to it.

The Problem
Your site is holding a particular rank and a Wikipedia page is ranked right above it. The specific ranks don’t particularly matter, but much like Hillary Clinton in the primaries, you can’t possibly live with being beaten like that. You have to drop the Wikipage down a notch and you have to continue moving up.

The Simple Solution
The simplicity of this tactic actually depends very heavily on the Wikipedia entry. Either way they’re all very beatable, but some are easier than others. In fact as mentioned I just ran into this problem recently and I managed to knock the competitive Wikipage entirely out of the top 20 in just two days using these steps. First you need to understand why the Wikipage ranks. Most of these pages rank for 3 reasons.

1) The domain authority of Wikipedia.org.

2) Innerlinking amongst other Wikipedia entries boosting the page’s value. <- Particularly the *See Also’s

3) Inbound links, most typically from blogs and forums. <- An observant person would notice not only the high percentage of links from blogs/forums in contrast to other types of links, but also a strong lack of sitewide links from any of those sites.

You obviously can’t do anything about the domain authority of Wikipedia.org, but understand that its pages are like a tripod; if you knock out one of the legs the whole thing falls (pun). Well, now that you understand why it’s there right up above you, like a towering fugly friend of the girl you’re trying to hit on, the solution becomes obvious: knock out reasons two and three.

Steps
1) Using your favorite link analysis tool (I prefer the simplistic Yahoo Site Explorer) find all the pages that link to the particular wikipedia entry that come from the wikipedia.org domain.

2) Go to each listing and find the reference to the offending Wikipage. You’ll find most of them in the See Also section or linked throughout the article. This is where the simplicity I was talking about before comes into play. Listings such as “Flash Games” or “Election News” are easier because they’re so irrelevant. When people are searching Google for terms such as these they obviously want to find actual flash games or election news, not some Wikipedia page explaining what they are. The same concept applies to other Wikipages linking to them. Just because the author put the text Cat Food in the article or the See Also doesn’t mean it’s a relevant reference to the subject matter.

3) SLOWLY remove nearly all those bitches! Be sure to leave a good convincing reason for the removal in the edit summary. Remove as many as possible but strictly limit yourself. I understand Blue Hatters have a tendency to overdo things, but you’re just going to fuck yourself if you quickly go through each and every reference and mass delete them. If you don’t know how many you should remove, then keep it to no more than 1-2 a day. Remove the references with the highest PageRank first if you’ve got a ranking emergency, and switch IPs between each one. This will either knock out one of its legs or at least cripple the leg a bit. Which leaves you with my match and exceed philosophy.

4) Find all the blogs and forums that link to that Wikipage and go drop a link in as many of them as you can. Match and exceed. :) I’m not going to dive into the nofollow talk on this one or talk about the benefits of links via blog comments. Just realize your goal in this instance isn’t to get more links; it’s to get your link on the same pages that link to the Wikipage. As mentioned above you’ll be dealing mostly with blogs and forums; you’re obviously in the same niche as the topics they’re talking about, and you probably won’t have any sitewide links to deal with, so you won’t have to go through any link begging pains.

5) Try to drop your link into the article. This is common sense.

Side Note
Wikipedia’s domain authority isn’t something you should be entirely worried about. Their site and URL structure actually becomes favorable to help deaden some of the heightening factors.

OH FYI! There is now a Printer Friendly link on every post on Blue Hat by popular demand

]]>
https://www.bluehatseo.com/how-to-overthrow-a-wikipedia-result/feed/
Stumble and Digg Begging https://www.bluehatseo.com/stumble-and-digg-begging/ https://www.bluehatseo.com/stumble-and-digg-begging/#comments Wed, 10 Sep 2008 06:25:48 +0000 Eli Neat Tricks and Hacks https://www.bluehatseo.com/stumble-and-digg-begging/ Haven’t done a Neat Tricks and Hacks in a while. Here’s one to remind Digg and Stumbleupon users to up your shit.

PHP
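The idea is simple: sniff the referrer and nag accordingly. A bare-bones sketch of it might look like this; the div class and the wording are placeholders, dress it up however you like:

<?php
// If the visitor arrived from StumbleUpon or Digg, politely beg for a vote.
$ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if (stripos($ref, 'stumbleupon.com') !== false) {
    echo '<div class="beg">Liked this post? A quick thumbs up on StumbleUpon would make my day.</div>';
} elseif (stripos($ref, 'digg.com') !== false) {
    echo '<div class="beg">Came in from Digg? Don\'t forget to digg this on your way out.</div>';
}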


PERL



*PERL code is untested, I just translated it off the top of my head. Probably made a mistake or two… I always do. Am I the only CGIer left in this world?

Javascript
Source: Top News Trends


I just started testing this method today, so I couldn’t tell you how well it works yet. I’m going to start with Stumbleupon because I’m willing to wager I’ll have better results with them than the Digg crowd but who knows? Let me know how it works for you in the comments. :)

]]>
https://www.bluehatseo.com/stumble-and-digg-begging/feed/
New Wordpress Plugin - PingCrawl https://www.bluehatseo.com/new-wordpress-plugin-pingcrawl/ https://www.bluehatseo.com/new-wordpress-plugin-pingcrawl/#comments Wed, 06 Aug 2008 22:02:59 +0000 Eli SEO Tools https://www.bluehatseo.com/new-wordpress-plugin-pingcrawl/ I’ve been starting to use a new plugin I helped develop with the coding expertise of Josh Team from Dallas Nightlife Entertainment. It’s called PingCrawl. It’s a plugin that helps your Wordpress blogs get deep links automatically on every post.

Plugin Summary
Every time you make a post on your blog, it grabs similar posts from other blogs that allow pingbacks, using the post tags. It then links to them at the bottom of the post as similar posts, and then executes the pingback on all of those posts. You can specify how many posts to do per tag, and that many will be done for each tag you use in your posts. Typically it has about an 80% success rate with each pingback, and they are legit, so the ones that fall into moderation tend to get approved. This creates quite a few deep links for each blog post you make and over time really helps with your link building, especially for new blogs.

Theory Of Operation

* The plugin will listen to anytime a post is saved, published, updated, etc.
* The plugin on execution time will find all the tags on the post and perform the following per tag:
o Use Google API to check for ( 35 ) results with the tag name.
o With the ( 35 ) results it loops through them and performs the following
+ Does the result have a pingback meta tag?
+ Does the result have trackback somewhere in the source
+ (if yes to both) it stores the pingback xmlrpc location in memory.
+ (if no to either) we skip that record and move to the next.
+ Once there are 5 legit pingable servers, we then append their links to the post we just added.
+ We then retrieve the xmlrpc urls from memory, and execute a pingback.ping against the xmlrpc as defined in the pingback 1.0 spec. (due to the nature of pingbacks and php it is not a 100% guarantee. A lot of dependencies on state, server responses, headers, etc.)

There are built-in features such as caching Google’s recordsets per tag, so you don’t have to make requests out to Google for the same tag again, and logic to know if you’ve already “PingCrawled” a post and ignore it on edit, etc., w/ a built-in polling system.
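For the curious, the pingback call in that last step is just a small XML-RPC POST. Here’s a stripped-down sketch of what firing one looks like; endpoint discovery and most error handling are left out, and send_pingback() is just a name I made up:

<?php
// Fire a pingback.ping at a target post's XML-RPC endpoint, per the Pingback 1.0 spec.
function send_pingback($xmlrpc_url, $source_uri, $target_uri) {
    $payload = '<?xml version="1.0"?><methodCall><methodName>pingback.ping</methodName>'
             . '<params><param><value><string>' . htmlspecialchars($source_uri) . '</string></value></param>'
             . '<param><value><string>' . htmlspecialchars($target_uri) . '</string></value></param></params>'
             . '</methodCall>';

    $ch = curl_init($xmlrpc_url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    $response = curl_exec($ch);
    curl_close($ch);

    // A <fault> response (or no response at all) just means this particular ping didn't take.
    return $response !== false && strpos($response, '<fault>') === false;
}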
Installation:

1. Download Plugin
2. Put file in the wp-content/plugins directory of your wordpress installation
3. Login to your blog dashboard
4. Click on Plugins
5. Click on Activate to the right of PingCrawl in the list
6. Make a Post

*Note: because of the nature of the script, any one tag can make as many as 41 HTTP requests and store source code in memory to run regular expressions against. Because of this I would try to limit my tags to no more than 3 (123 HTTP requests). Use more at your own risk.

Warning: This plugin can really slow down the time it takes to make your posts, so I would recommend not using more than 3 tags per post. Also, we coded in a small link injection which will put a link of mine into the mix about once every 10 posts. They will all be very white hat and clean links, so no worries, and if you leave the code intact I’d consider that a substantial thank you for the plugin. :)


Download PingCrawl

Screenshot
*The size of the links is entirely customizable. I’d recommend making them very small at the bottom of the post, but in this screenshot I made them big so you can see the format better.
pingcrawlscreen.jpg

]]>
https://www.bluehatseo.com/new-wordpress-plugin-pingcrawl/feed/
Open Questions #4 - Diminishing Values On Outbound Links https://www.bluehatseo.com/open-questions-4-deminishing-values-on-outbound-links/ https://www.bluehatseo.com/open-questions-4-deminishing-values-on-outbound-links/#comments Tue, 29 Jul 2008 23:12:23 +0000 Eli General Articles https://www.bluehatseo.com/open-questions-4-deminishing-values-on-outbound-links/ I somehow missed this question from the Open Questions post and I can’t help but answer it.

From Adsenser

I loved your SEO empire post.
But I was wondering how much effect does a lot of links from a lot of indexed pages from the same domain have?
I always thought that the search engines looked mainly at the number of different domain linking to you.
Can you give some more info on this?
Or do you use these pages to link to a lot of different domains?

This is a fantastic opener for a conversation about sitewide outbound links’ effects on other sites as well as on the site itself. This has been long debated but never cleared up, not because it’s too complicated, just because there are so many myths it’s hard to sort the fact from the fiction. To be clear in my answer, I’m going to refer to the site giving the link as the “host site” and the site receiving the link as the “target site,” just so I don’t have to play around with too much terminology.

The entire explanation of why sitewide links, main page links, subpage links, and reciprocal links work is based on a simple SEO law called Diminishing Values. It basically states that for every link, whether it be a reciprocal, innerlink, or outbound link, there is some form of consequence. Also, for every inbound link, accepted innerlink, or reciprocal link there is a benefit.

SEO Law of Diminishing Values
Diminishing Values = sum(benefits) > sum(consequences)

The need for the sum of the benefits to be greater than the sum of the consequences is essential because, as mentioned in my SEO Empire post, there can’t be a negative relevancy for a site in relationship to a term. For example, let’s take the niche of cars. There’s a theoretical mass of car blogs. For the sake of the example we’ll say there are several thousand blogs on the subject of cars. Something in the industry happens that stirs all the bloggers, such as SEMA having a car show or something. So all these car blogs blog about SEMA’s new car show coming out and give it a link. If these outbound links caused a consequence greater than or equal to the valued benefit given to SEMA, then all these blogs would drop in value as per the topic, cars. Thus the mass effect would be that of a negative relevancy; therefore sites with no relevancy but containing topic links would in theory rank higher than the general consensus of on topic sites.

So the notion of an outbound link diminishing your site’s value in equal proportion is just complete bupkis and obviously not the way things actually work. Even if it were true and there was a compensation for on site SEO, when an event in a niche happens the site hosting the event wouldn’t just rise in the rankings, it would propel everyone else downwards, causing more turbulence in the SERPS than what happens in actuality with just their site rising. It’s just simple SEO Theory 101, but sadly a lot of people believe it. There’s also a lot of sites that absolutely won’t link to any sites within their topic in fear that their rankings will suddenly plummet the moment they do. They’re under the greedy impression that they’re somehow hoarding their link value and that this is in some way benefiting them. So with the assumption that an outbound link gives much more value to its target than it diminishes from its host, everything in a sense balances out and outbound links become much less scary. This of course in no way says that the consequence to the host is a diminishment of any sort. Its entire consequence could be 0, or as a lot of other people believe, +X (some people think on topic outbound links actually add to your site’s relevancy). I haven’t personally seen one of my sites go up in rank after adding an outbound link, but I’m open to the idea or to the future of the concept being reality.

I Practice What I Preach
The Law of Diminishing Values is one of the reasons why BlueHatSEO is one of the only SEO blogs that has all dofollow comments as well as a top commentators plugin on every page. Your comments will not hurt my rankings… I’ll say that one more time: Your comments will not hurt my rankings. Whewww, I feel better :)

Back To The Question
Before we get into the meat of the question we’ll take a small scale example that we should all know the answer to.
Q: If a host site writes an automated link exchange script that does thousands of link exchanges and puts those links on a single subpage, and all the target sites also have their link exchange page set up the same way on a subpage, will the host site gain in value?

A: I’ll tell you straight up from personal experience: yes, it does. It’s simple to test; if you don’t believe me, go try it yourself.

Now we’ll move up to a much larger scale with a specific on topic example using sitewide links.
Q: If you own two 100k+ page lyric sites with lots of inbound links and very good indexing ratios, will putting a sitewide link to the other site on both raise both in value or keep them both the same?

A: Also from my personal experience, yes, both will not only rise in value but will skyrocket in value by upwards of 50%, which can result in much higher rankings. Likewise this example can be done with any niche and any two large sites. Cross promote them with sitewide links between the two and see what happens. The results shouldn’t be surprising.

Now, on the large scale to the meat of the question.
Q: If these two lyric sites cross-compared all their inbound links from other sites and managed to get all the sites that link to lyric site A to also link to lyric site B, to the point at which each increased in links by 100k (the same as the number of increased links would have been with a sitewide link between the two), would both sites increase in value more so than if they did the sitewide link instead?

A: Yes, absolutely. This is a bit harder to test, but if you’ve been building an SEO Empire and each site’s inbound links are from your own sites, then it becomes quite a bit easier to test and I’m certain you’ll find the results the same as I did.

Conclusion
On a 1:1 basis, across a generalized population of relevant and non-relevant links, inbound links from separate domains/sites are still more effective than a sitewide link of the same magnitude. However! A sitewide link does benefit both sites to a very high degree, just not to the degree that links from lots of other sites can accomplish.

Sorry that question took so long to answer. I didn’t just want to give you a blank and blunt answer. I wanted to actually answer it with logic and a reasoning that hopefully leads to an understanding of the ever so important WHY.

]]>
https://www.bluehatseo.com/open-questions-4-deminishing-values-on-outbound-links/feed/