"I was thinking about you all day today and what a great
person you are."
"I wanted to be #1...After 2 months I reached the top position for my most popular keywords."
Above The Fold!
Google Buys New Algorithm
Google recently made an algorithm change which few understand. This article is a continuation of my earlier article, "Google Sells Christmas." Theories behind the recent Google change have ranged far and wide.
Commercial Bias of the Web
A good research paper contains solid information but typically cares little about what a search engine may think. The fact that the internet can provide essentially free distribution after the initial investment makes it appealing to marketers of all types.
The marketer wants to make money, frequently caring very little about what search engines or end users think. In the past, webmasters worked hard to push a second-rate result above a better one. Recently Google has aimed to change this.
Even I am shocked at my own recent popularity. Some of the heaviest spenders in the search marketing world have recently called me looking to employ my services.
Despite my interest in search engines I had no idea how broad the scope of PageRank sales was. I knew some networks existed, but as I have analyzed more and more of the web, I notice PR sales at places I thought I would never see.
One of my callers wanted to know if investing in another 1,100 newspapers would be a good investment. I started thinking about what a huge competitive advantage that purchase would have been.
In later web searching I noticed PageRank-selling links at sites like The Duke Chronicle. Educational institutions were at the heart of the internet. Though I doubt they realize it, these newspapers are undermining the Google ranking system.
If Google made one simple change at a time, people like me could easily reverse engineer their every move. This, I think, is why they made multiple changes at the same time. Built into the new features is selective activation. Recently Google has added stemming, a Bayesian spam filter, and perhaps local redistribution of search results.
As noted in the Cre8asite forums, Google recently introduced a new stemming feature: searches for terms such as "smoke" will also return results containing "smoking." Some SEO experts believe many of the changes at Google are due to a poor stemming implementation.
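To illustrate the idea, here is a toy suffix-stripper: not Google's actual stemming algorithm, and `naive_stem` is a name of my own invention; real stemmers such as the Porter stemmer apply far more careful rules.

```python
def naive_stem(word):
    # Strip a few common English suffixes so related word forms
    # collapse to a shared root. The length check keeps very short
    # words (e.g. "is") from being mangled.
    for suffix in ("ing", "ed", "es", "s", "e"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

# Both collapse to the stem "smok", so a search for one can match the other.
print(naive_stem("smoke"), naive_stem("smoking"))
```

Once "smoke" and "smoking" share a stem, a query for one can surface pages containing only the other.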
When a person views a cached page, a note at the top states how the location of that page relates to the search which found it. When you search for coffee, Starbucks is the #1 result (even though the word coffee does not even appear on the page).
With some sites the reference has been way out to lunch, showing operators such as "allinurl:blablabla" and "allinanchor:blablabla". When you search for something, it should not tell you how operators apply to other pages when you are not using those operators - this is part of the reason some say "Google is Broken."
The fact that those operators show up at all indicates to me that Google is, in some ways, using a variety of information from the linking page to determine the value of a link.
Bayesian Spam Filter
It is believed that in October Google implemented a Bayesian spam filter. A Bayesian spam filter breaks content down into its simplest components (tokens), compares their frequencies against known examples of spam and non-spam, and uses probability analysis of the whole to decide what is spam.
Paul Graham wrote an excellent article describing the use of these spam filters on email. The system he created filtered out 99.5% of spam with zero false positives. Things such as the HTML color code for red may be a big flag for spam, while non-spam elements can offset the filter just as far in the other direction. Many believe the Google filter is set too coarse and returns many false positives.
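A minimal sketch of the Graham-style approach (the names `train` and `spam_score` are my own; the 0.4 default for unseen tokens follows Graham's essay, and the rest is a simplifying assumption, not Google's implementation):

```python
from collections import Counter

def train(spam_docs, ham_docs):
    # Count how often each token appears in the spam and non-spam corpora.
    spam, ham = Counter(), Counter()
    for doc in spam_docs:
        spam.update(doc.split())
    for doc in ham_docs:
        ham.update(doc.split())
    probs = {}
    for token in set(spam) | set(ham):
        s = spam[token] / len(spam_docs)
        h = ham[token] / len(ham_docs)
        # Per-token probability of spam, clamped away from 0 and 1.
        probs[token] = min(0.99, max(0.01, s / (s + h)))
    return probs

def spam_score(doc, probs):
    # Combine per-token probabilities naive-Bayes style.
    p_spam = p_ham = 1.0
    for token in doc.split():
        p = probs.get(token, 0.4)  # unseen tokens lean slightly "not spam"
        p_spam *= p
        p_ham *= 1.0 - p
    return p_spam / (p_spam + p_ham)
```

With even a tiny training set, text built from tokens seen only in spam scores near 1.0, while ordinary text scores near 0.0 - and setting the clamps or thresholds too coarsely is exactly how false positives arise.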
Players of a game called Googlewhack search for a query which returns only one result. A "whack" is when a searcher finds exactly one search result for a given query. In October the search for whacks became somewhat easier, as the new filter made many sites disappear from the results. Players termed whacks that appeared only because of the filter "Google Nacks." Apparently, some spam results would stop other results from showing for very common phrases, as cited in Google Spam Filter Gone Bad.
Signals which would appear as dead-on spam to a search engine would be:
- high keyword density
- high keyword proximity
- only having reciprocating links
- keyword stuffed inbound links
- most links from off topic resources
Local Reorganization of Results
Teoma (owned by Ask Jeeves) is one of the five major crawlers on the web. Its ranking system displays results based on their local interconnectivity: after the user searches, the initial set of results is re-ranked based on links to and from topical hubs and authorities.
On top of contributing to research projects such as Hilltop, a few months ago Google bought Kaltix, a small startup focused on search personalization and local community networking (see the Topic Sensitive PageRank pdf). In the HighRankings forum Danny Sullivan stated, "Well, I'm sure Google could do LocalRank if they want to. Moreover, I think they are doing this."
If you run any type of advanced search you will see the old results; a general search returns the new results. It takes far more computation to filter some results than simply to return them all, and this added computation time is the most logical reason the newest "filters and algorithm changes" are not applied to complex searches.
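The local reranking idea can be sketched as follows. The scoring weights and the function name are my own assumptions for illustration, not Teoma's or Google's actual formula:

```python
def local_rerank(results, links):
    # results: initial ranking, best first.
    # links: maps url -> set of urls that page links to.
    top = set(results)
    scores = {}
    for rank, url in enumerate(results):
        base = len(results) - rank  # score carried over from the initial ranking
        # Votes from other top results that link to this page.
        local = sum(1 for other in top
                    if other != url and url in links.get(other, set()))
        scores[url] = base + 2 * local  # the weight of 2 is an arbitrary assumption
    return sorted(results, key=lambda u: -scores[u])
```

In this sketch a page that two other top results link to can jump from last place to first, even though its initial rank was lowest - which is exactly the kind of reordering within a result set that extra post-query computation buys.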
Activation of the Filters
The changes to search results are not universal; only some searches are affected by the new algorithm. The new filter is used to offset unnatural manipulation of search results. The data may be collected from:
- searches via the Google Toolbar
- monitoring the search engine results pages from various queries
- financial information from Google AdWords bids
The problem with his study is that "mortgage" has many stemmed variations which "apple" does not. In addition, real estate being one of the three largest domestic industries makes it more susceptible to human intervention and manipulation.
This new algorithm has caused changes to both commercial and non-commercial terms, though, so it may not have anything to do with AdWords. For any filter to work there would need to be an adequate resource network for the specific search; if no such network exists, it is unlikely that people are artificially trying to create one to gather profits from.
Spam aims to steal from existing markets, not create new ones. Perhaps the statement from Rosing that (Google's advertising and search businesses) "are completely separated, there is no linkage between the two" is completely true.
It is my personal opinion that what we are seeing now is a combination of stemming, filtration, and result reorganization due to local interconnectivity.
How to Win With the Change
Many of the means used to artificially gain rankings are gone forever. Certainly some manipulative techniques remain effective, but they really should not be the focus of any site - in a world of constant change, that focus doesn't make sense.
A couple of ways to tone down over-optimization:
- lower keyword density
- spread out the keywords
- encourage natural linking instead of using all optimized terms
- try to get links from industry expert sites (this is easier if you have great content)
- words that are less frequently used (especially on inner pages) tend not to set off the filter. "Any commercial searches with volumes over 200 per month (as determined by Overture’s search term suggestion tool) seemed to trip the filter. Searches under that threshold seemed to remain unfiltered." - from Florida Fever: The Google Update Uproar
- some have stated that a PR6 or above site will not trip the filter, but if it is obvious that you are over-optimizing you still may (as witnessed with my own eyes via PR7 and PR8 phone calls)
Other than toning down some of the prior optimization:
- My single piece of advice to service providers would be to study and learn your topic, and freely give away information. Once you have guru status you can continually profit from it. Eventually people will search out your opinion without any need for marketing.
- My single piece of effective advice for merchants would be to think of your unique value proposition and exploit it. If you are unique enough, someone on the web will want to talk about it. Google Search: Talking Stuffed Animals
some of the places I have been learning:
Search Engine Marketing Forums
Most of the collaborative effort to solve this puzzle has happened in SEO forums. Below I highlight the forums and why I like each one:
- HighRankings Forum - many of the largest SEOs post there
- SearchGuild - Chris Ridding is highly technical and original in his thinking; if I ever have technical questions, this is where I ask
- Webmaster World - while there are many erroneous posts from people who do not know much, the sheer breadth of participation makes it likely that many of the correct answers to our puzzle will come from here
- JimWorld - another large forum, their post on the topic started with a quote from my original post in the Search Engine Journal
- Other top forums: Cre8aSite IHelpYou SEOchat
Other Articles on the topic
- Vaughn Flow Chart - a visual look at the perceived Google filter
- Been Gazumped by Google - article from Barry Lloyd on the new Google change
- Florida Fever: The Google Update Uproar - report from one of Canada's top SEO firms
- Scroogle - the anti Google filter testing website
- and the part 1 version of this article
I have been so busy that most of my analysis of Google has been through research for customers. Each person who emails or calls me gives me another set of eyes to view the web through. I thank you all for helping me learn!
- by Aaron Wall, owner of Search Marketing Info