Internet Marketing with Jonathan Leger


Penguin 4.0 – Google’s long-overdue course correction

When Google first released their Penguin update in 2012, it was an unmitigated disaster. That update penalized all the wrong sites for all the wrong reasons. It was designed to target link spam, but it initially failed – badly.

Google quickly started making changes and corrections to try and improve the algorithm. It did improve, though webmasters the world over grumbled and griped about how many perfectly legitimate, “white hat” sites had been demolished by the update.

Some continue to argue that, years later, it still unfairly favors big-name brand sites over high-authority niche sites, but that’s another post for another day.

One basic principle of Penguin that Google didn’t change was the effect that spammy links had on the ranking of a site. Those low-quality links from known spammy sources often got a site demoted out of the rankings. This led to the birth of what’s called “negative SEO”, where unscrupulous SEO practitioners would build garbage links to competitors’ sites and tank their rankings.

When negative SEO was first postulated, I was in denial that Google could be so stupid. “No way does Google, with all of its brilliant engineers and forward-thinking leadership, put the power of negative SEO into the hands of the bad guys. NO WAY!”

Alas, it was true. Negative SEO was real.

Granted, it had no effect on highly authoritative sites, but for the little guy trying to build the reputation of his or her site, it worked. Though Google never directly admitted to granting the bad guys such power, they provided tacit proof of its existence by giving webmasters the ability to disavow links. Why would a site owner need to claim that they didn’t build the spammy links if spammy links had no negative effect on a site?

Now, finally, more than four years later, Google has taken the power of negative SEO away from the bad guys.

Gary Illyes from the Google Webspam Team has stated that with Penguin 4.0 (which is rolling out now) Google can finally “devalue spam instead of demoting” sites that are the target of spammy links. What that means is that Google “no longer penalizes the site or specific pages but rather ignores/devalues the spammy links and thus the rankings are adjusted” (see this SearchEngineLand.com article).

While this news should certainly make webmasters across the world happier (and help them sleep better at night, knowing their rankings won’t be tanked by negative SEO), I have to ask the question: Why did it take one of the most technologically brilliant teams in the world four years to correct such a painfully obvious, damaging flaw in their algorithm?

Given Google’s general lack of transparency when it comes to anything related to their algorithm (which, honestly, I understand their reasoning for), we may never know.

This doesn’t mean you can run out and build spam links with impunity, however. Google is still taking manual actions against sites known to build low quality links. It’s the automated demotion that’s been removed from their algorithm. A human being has to make the decision now. So if you’ve been the target of a negative SEO attack, Google still recommends disavowing those links to recover your rankings.
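If you’ve never put one together, a disavow file is just a plain text list that you upload through the Disavow Links tool in Google Search Console: one URL or domain per line, a domain: prefix to disavow every link from that domain, and lines starting with # as comments. The domains below are made-up placeholders, but a minimal file looks something like this:

```
# Spammy links discovered after a negative SEO attack
# Disavow every link from these domains
domain:spammy-link-network.example
domain:cheap-paid-directory.example
# Or disavow a single offending page
http://random-autoblog.example/spun-article-about-my-site.html
```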

As always, post your questions, thoughts and personal Penguin/Google gripes in a comment below. I’m all ears.

P.S. It’s these kinds of innocent-webmaster-ignoring actions that Google has taken since going public that have led me to look more and more into alternative forms of traffic. And boy oh boy am I glad I did. CLICK HERE to see the astounding results!


Why Keyword Competition Tools Are (Almost) Always Wrong

As the creator and designer of a number of popular Search Engine Optimization tools, I have used (and regularly test) a variety of SEO tools and services in order to compare them to my own and see where I can improve (and, to be honest, where I’m beating the pants off the competition). This includes not only direct competitors to my own products, but also the “Big Data” providers in the SEO industry.

Great data is only great if you understand what to do with it. If you’re a beginner at ranking your site in Google, then you need guidance more than a bunch of stats and numbers. This is where most SEO tools fail miserably. While the individual data points provided by these tools are often pretty good, when it comes to using that data to give you practical advice, they almost always fall flat.

Take estimating keyword competition, for example. How difficult will it be to rank in Google for a specific set of keywords? With all of that data at these tools’ disposal, you would think they would be pretty good at estimating that difficulty.

They’re not. In fact, they’re usually pretty bad at it.

Let me back this up by giving you an example of some keywords where these tools get it wrong. This example is a “long tail” search (that is, a set of keywords that doesn’t get searched very often and contains four or more words).

Keywords: online acoustic guitar lessons

Difficulty rating from popular tools (scale is 0 to 100):

Moz – 50

SEMRush – 69

SpyFu – 56

KWFinder – 49

Difficulty rating from my soon-to-be-released SEO system:

Keyword Titan – 28

Notice the difference? The four popular tools shown estimate it to be about twice as difficult to rank for “online acoustic guitar lessons” as Keyword Titan does.

The reason their estimates are so (incorrectly) high is that those tools appear to be averaging the authority of Google’s top 10 ranking domains / pages for the keywords. That’s a mistake, a serious mistake, and it’s where pretty much every keyword tool goes wrong.

You see, the true estimation of how difficult it will be to rank for a set of keywords in Google isn’t found in the strength of the top 10 sites ranking for the keywords — it’s found in the weakness of the weakest ranking site in those top 10 results.

That is, if there are 9 very strong sites ranking for a set of keywords and one weak site mixed in among them, that weak site is the true indicator that ranking in the top 10 for those keywords is not so difficult. After all, if it were difficult to rank for, then that weak result wouldn’t be there, right?

Almost no other keyword tool gets this right. They always average the strength of the top 10 Google results together to come up with their difficulty estimations.
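To make the difference concrete, here’s a minimal sketch. The 0 to 100 authority scores are invented for illustration (they aren’t pulled from any real tool), but the contrast between the two approaches is the point:

```python
# Hypothetical 0-100 authority scores for the pages ranking in the top 10.
# These numbers are made up for illustration only.
top_10_authority = [82, 79, 77, 75, 74, 71, 70, 68, 31, 66]

# How most tools appear to estimate difficulty: average the whole field.
average_based = sum(top_10_authority) / len(top_10_authority)

# The approach argued for here: the weakest result sets the bar.
weakest_based = min(top_10_authority)

print(f"Average of the top 10: {average_based:.0f}")  # ~69 -- looks "hard"
print(f"Weakest result:        {weakest_based}")      # 31 -- looks very beatable
```

Averaging buries that one weak result under nine strong ones; looking for the minimum surfaces it.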

I designed Keyword Titan to be different. When you analyze a single set of keywords in KT, you get what I call a “Snap Analysis”. Here’s the Snap Analysis for online acoustic guitar lessons:

[Snap Analysis screenshot for “online acoustic guitar lessons”]

Notice the site ranked #9, acousticguitarlessonsonline.net. The site was clearly created for the simple purpose of ranking for a number of keywords related to online acoustic guitar lessons. It has a keyword-rich domain name (I’ll go into detail in a future blog post about why that’s helping this site rank).

The TrustFlow of the domain and of the page is somewhat low (in case you’re not aware, TrustFlow is a respected measure of how much “trust” the links coming into a domain or page give it — the higher the TrustFlow, the more likely the domain or page is to rank in Google).

But where that domain really shows its weakness in comparison to the rest of the ranking sites is in the number of other sites linking to it (the refdomains (site) metric). While all of the other sites have hundreds, thousands, or more external domains linking to them, acousticguitarlessonsonline.net has only 86. Getting 86 quality links takes a little bit of time, but it hardly justifies the “hard” ratings the other tools give these keywords.

That relatively low number of external linking domains, combined with its marginal TrustFlow, leads Keyword Titan to give it a difficulty rating of only 28 (on the low side of “moderate” in Keyword Titan).
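I’m not going to publish Keyword Titan’s actual formula here, but to show roughly how a weakest-result score can blend those two signals, here’s an invented illustration. Both the weighting and the TrustFlow value are assumptions (the post only calls the TrustFlow “somewhat low”), so treat it as a sketch of the idea, not KT’s math:

```python
import math

def illustrative_difficulty(ref_domains, trust_flow):
    """Invented 0-100 difficulty score for the WEAKEST top-10 result.

    Referring domains are scaled logarithmically (86 domains is much closer
    to 1,000 than a linear scale suggests) and blended with TrustFlow,
    which is already reported on a 0-100 scale. Not Keyword Titan's formula.
    """
    link_score = min(100.0, 20 * math.log10(ref_domains + 1))
    return round(0.5 * link_score + 0.5 * trust_flow)

# The weak #9 result: 86 referring domains; TrustFlow of 20 is a placeholder.
print(illustrative_difficulty(ref_domains=86, trust_flow=20))  # prints 29
```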

This same scenario plays out again and again any time I run keywords through the popular keyword tools. Because those tools use the strength of the ranking sites to estimate difficulty rather than looking at the weakest site in the results, they are wrong much of the time. That means that SEO professionals and beginners alike are making poor decisions about which keywords to target.

It’s not that these tools don’t have access to the same data that Keyword Titan does — they do — they just interpret it incorrectly. So the next time you’re trying to figure out what keywords you should be trying to rank for in Google, keep that in mind.

I welcome your thoughts and responses in a comment below.


How Do You Know a Site Is Authoritative?

Determining the level of authority, or “power”, a site has with regard to SEO is very important. You need to know how strong your competition is in order to determine what it’s going to take to outrank them. In the past you could look at the Toolbar PageRank score that Google publicly disclosed. It provided a rough indicator of authority, but even that was often way off. You would see sites with a PageRank of 0 outranking sites with a PageRank of 5 or more on a regular basis, for example. Google no longer updates Toolbar PageRank, though.

The total number of backlinks to the ranking page was something people used to focus on in a big way as well. The only problem with that was, and is, that quality matters far more than quantity. That said, up until a few years ago the quantity of backlinks was much more important to Google in determining authority than it is today. So the overall backlink quantity to the ranking page has also been “thrown out the window” as an authority indicator.

That leads us to the present state of SEO. What is the most effective way to determine the authority of a website?

Since the beginning of search engines, the best way to determine the authority of a website has been to analyze the sites that rank for any given keyword and figure out what they have in common! When this is done you have a lot of big data to analyze, so the goal is to find the simplest metrics that are consistent with the top search engine results.

As you may know, I love SEO! I love data analysis too. So I’m always testing and comparing data to figure out what works and what doesn’t. What I have found to be the most effective way to determine the authority of a website for the average Internet marketer, without getting too complicated and technical, is to look at the Moz Domain Authority (DA) of the domain and the total number of unique domains linking to the site (not just to the ranking page).

The DA of a domain is calculated from multiple factors that Moz analyzes to estimate how likely the site is to rank in Google’s search results. If I want to get a general idea of a site’s authority, this is what I look at.

The total number of unique domains linking to a site, in itself, is too raw to give any more than a rough idea of a site’s authority. But when analyzing keyword competition, it is very powerful! Simply plug a keyword into Google and then find the total number of domains linking to each of the top 10 results. If you find a site that has 100 or fewer, great! If you find one with 50 or fewer, even better. If you find one with 15 or fewer, you’ve found GOLD (if the keyword has much search volume).
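If you want to script that check, here’s a minimal sketch. It assumes you’ve already pulled the top 10 URLs for your keyword and a unique referring-domain count for each one from whatever backlink data source you use; the domain names and counts below are placeholders:

```python
# Placeholder data: unique referring-domain counts for the top 10 results.
# In practice, pull these from your backlink data source of choice.
top_10_ref_domains = {
    "strong-competitor-1.example.com": 4200,
    "strong-competitor-2.example.com": 1850,
    # ...the rest of the top 10 results...
    "weakest-competitor.example.com": 14,
}

# The weakest competitor is what matters, not the average.
weakest_url, weakest_count = min(top_10_ref_domains.items(), key=lambda kv: kv[1])

if weakest_count <= 15:
    verdict = "GOLD (assuming the keyword has decent search volume)"
elif weakest_count <= 50:
    verdict = "even better -- well worth targeting"
elif weakest_count <= 100:
    verdict = "great -- a realistic target"
else:
    verdict = "tough -- every top-10 result is heavily linked"

print(f"Weakest competitor: {weakest_url} ({weakest_count} referring domains)")
print(f"Verdict: {verdict}")
```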

Keep in mind, the quality-over-quantity factor remains. The quality of the referring domains is a big factor in both the DA and the ability to rank with fewer referring domains. So if you find a keyword that has a top 10 competitor with 25 unique domains linking to it, 26 isn’t necessarily going to beat it. But if you get 26 solid links, you have a very good chance at it! (Reaching out to site owners in my niche and offering a unique article in exchange for a link at the end of the article to my site often results in mid- to high-level links for me.)

This is how I do my own simple, quick and accurate keyword research and it works! I eventually achieve top-10 rankings for most of the keywords I choose to target as a result of using this method.
