Keyword Rankings Are Not the Best Way to Determine Success

There was a time not long ago when sites could be built quickly by copying and pasting articles from other sites and filling in the appropriate keywords. It was a simpler time, and black hat tactics reigned supreme; quality always seemed to take second place to quantity. Back then, Google produced search results based largely on the keywords used in the query, so success meant stuffing your content full of keywords and building a pile of worthless links, and doing both more ruthlessly than your competition. Keyword rankings could be quickly and easily tracked, and their success was measured by how many visitors those keywords brought to the site. That data let website managers see what was working and where they needed to improve.

Things have changed considerably since then, as pretty much anyone working in SEO will tell you. Today, site rankings are based on numerous factors, all of them shaped by how Google’s algorithms work. You may not understand what those algorithms entail, but they affect how your site ranks whether you realize it or not.

Panda Update

Google rolled out the Panda update in early 2011. The change affected more than 10% of all search results; it didn’t reach every corner of the internet, but that is still a massive effect. The purpose of the update was to adjust site rankings based on the quality of their content and various other quality issues. Panda has been refined over the years, and each update has come with a slightly different focus. Google also published guidelines for websites to follow, spelling out what will cause a site to lose its rankings: keyword stuffing, link schemes, and other underhanded tactics that try to game the search engines instead of catering to users.

Penguin Update

After Panda, Google introduced Penguin, which took the algorithm to the next level. Penguin narrowed its focus to low-quality off-site practices, particularly link schemes, which artificially inflated a site’s authority by building shallow or worthless links pointing back to it. Once Penguin hit, sites that relied on these link-building schemes saw their rankings drop quickly.

Hummingbird Update

Then, in 2013, Google updated again. This update, known as Hummingbird, altered how Google chose search results. Instead of just matching specific words, the search engine would try to understand what the user was searching for, creating a more conversational search that could better predict users’ needs. The goal was to make results more relevant and to ensure that even when users didn’t quite know what they were looking for, the search engine could help them find it. All three updates work in conjunction, so what one put in place carries over into each subsequent update.

Further Updates

Google made another change affecting website owners in 2011: it stopped passing keyword data along to websites, saying the move was meant to protect users’ privacy. This made it impossible for site owners to tell which search terms users were typing to reach their sites, which naturally upset a lot of them. Here is what all of this means for website owners.
We can no longer match what is on our pages to the search terms that actually bring people to our sites. We also can’t count on individual keywords alone, thanks to updates that evaluate the entire content of a page, and repeating a keyword over and over no longer produces the best results. Does that mean it is time to stop focusing on keywords? They clearly don’t carry the weight they once did, so are they just a waste of time at this point? We still want to shoot for high rankings; the problem is that keywords have become a nebulous factor. Long-tail keywords are proving to offer more value when it comes to bringing visitors to a site, because they give the search engines more context. Google still offers some tools that show how well a keyword is doing in certain respects, such as its clickthrough rate and other analytics, but you end up with somewhat inconclusive data and a lot of work before you reach any decent analysis.
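To make that analysis a bit more concrete, here is a minimal sketch of pulling query-level clickthrough data out of Google’s tooling, assuming you are using the Search Console API’s searchanalytics.query endpoint through the google-api-python-client library. The site URL, date range, and credential file name are placeholders, not values from this article.

```python
# Sketch: pull query-level clicks, impressions, and CTR from Google Search Console.
# Assumes a service-account JSON key with read access to the verified property;
# the site URL, dates, and file name below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query"],   # one row per search query
        "rowLimit": 25,
    },
).execute()

# Each row carries clicks, impressions, CTR, and average position for that query.
for row in response.get("rows", []):
    (query,) = row["keys"]
    print(f"{query}: clicks={row['clicks']}, impressions={row['impressions']}, "
          f"ctr={row['ctr']:.1%}, position={row['position']:.1f}")
```

Even with data like this in hand, the point above stands: the numbers tell you which queries earn impressions and clicks, but turning that into a judgment about individual keywords still takes interpretation and a fair amount of work.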

How Do We Still Make Keywords Matter?

There are still plenty of ways in which keywords play a role in page ranking. Broadly, these include using long-tail keywords, working on direct value, noting areas of success and failure, and using the available tools to get a decent analysis. That kind of in-depth look at using keywords in today’s Internet landscape is a post for another day.