Each generation and branch of SEO tacticians, like any established order in any industry, has built up a set of assumptions and, worse for their ability to remain strategically nimble, sunk costs in infrastructure best suited to a fixed (past) era... in this case, a past era in search engine ranking algorithms.
What increasingly matters (whoops, has always mattered) is whether your content is (a) relevant and high quality, and (b) popular and/or authoritative. Search engines attempt to assess these raw qualities in different ways in different eras, and they move the goalposts when the nature of information consumption and sharing changes, and when spammers catch up with their measurement techniques.
There are many great ways to sum this up and to illustrate, for common-sense purposes, the difference between what search engines actually measure at any given time and what they are trying to capture for user benefit. But perhaps one of the most succinct is Hugh McLeod's notion of social objects. If you're shouting about the benefits of Maxwell House coffee (yawn), you'll eventually get through if you spend enough. But if coffee enthusiasts are really discussing coffee and really helping one another, as they do on this thing called the Internet, surely there's a relevancy algorithm waiting to happen to that process of relatively spontaneous buzz. If the right community of people is retweeting Richard Florida's tweets about a certain article on urban transportation, that helps us understand more about the value of the article itself, but also about the trust patterns and interest patterns within the community. It should also help us understand which publications and authors are themselves reputable. With rel=author and other mechanisms, search engines will have more and more cues available so that we don't have to sift through counterfeit crapola. It's a long-term battle, but one that each generation of spammers will lose after their initial successes.
There's no question that this is what Google PageRank (and other search engines, like Teoma) was already attempting to tap into. It's just that the proxy for community interest and heartfelt recommendation, the backlink structure of the whole web, is outdated and endlessly gameable today.
SEO tacticians have invested more than anything else in the infrastructure around backlinks, because "Google PageRank" originally centered on the authority conferred on a website or page by the authority and volume of links pointing to it... along with anchor text and other relevancy factors that attempt to match the aboutness of a page with the aboutness/intent of the user's query.
With every passing year, traditional link signals become less useful. Backlink-obsessed SEO tacticians (including the companies that spider the whole web and give you ponderous information about your site's linkage skills) have had a longer-than-usual heyday precisely because Google itself has so many sunk costs invested in its link-centric ranking paradigm; it has been slow to wind down this archaic methodology. One key problem was that Google didn't have enough to replace it with yet. And part of the reason for that was that "Google didn't get social," so it was reliant on deals with companies like Twitter to assess social signals. It then began acquiring small reputation management companies like PostRank. Facebook, of course, won't play ball.
The biggest change in Google's ability to understand sharing behavior just happened. They launched Google+.
Social signals and user clickstreams are two emerging types of signals search engines look at to help with ranking. Google will soon have more social and sharing signals than anyone but Facebook.
The environment for SEO is going to change radically. Google won't be afraid to shift its ranking algorithms more dramatically in the future, away from its old backlink analysis, because it is no longer impressed by its own sunk costs. Google's costs and revenues both keep rising dramatically, and in Google+ itself, Google has a huge new sunk cost. Think they won't mine that data to make search work better? Think the SEO world is only going to change incrementally, and it will be more or less business as usual? Think again.
When a big company comes to me and asks a 2007 question like "if we pay you guys $2,000 a month for a few months, how many quality backlinks do you think we can get for that?," it makes me want to cry. It would be a fair question if you could get quality backlinks (incremental to your company's already huge online footprint) by simply contacting a few people, like we used to do in 2001. Admittedly, I am a rink rat who grew up playing shinny for nine hours straight without lunch on the outdoor rinks of Ottawa, and later inhaling the Zamboni fumes in suburban Toronto. Maybe I'm still high on the fumes, which is odd given that it's July. But when it comes to SEO, I advise companies to head where the puck is going... not where it was years ago.
A talented outside agency can assist with that process, of course. But maybe you don't need "SEO".
This entry was posted on Thursday, July 14th, 2011 at 10:49 am and is filed under Search Marketing.
Article source: http://blog.traffick.com/2011/07/google-and-seo/