The latest SEO developments you need to know

Over the past 12 months, the search marketing landscape has seen many changes. Let’s take a look at the biggest developments that all digital marketers need to know.

The slow and final death of Yahoo Site Explorer
This year, the once-mighty Yahoo shrank some more, and in a big way for SEOs. After being usurped by Google years ago, the 16-year-old search engine finally killed off the feature most coveted by SEOs across the globe: Yahoo Site Explorer is no more. That’s a shame, as it was the only link graph reporting tool operated by a major search engine.

There are plenty of other players in the space, but few are free, nor do they tout as robust an index as Yahoo Search did. The death of Yahoo Site Explorer is significant, but it’s certainly not a surprise. Those in the industry who were prepared have already done their due diligence in selecting alternatives. Out with the old and in with the new, as they say. Speaking of the new…

These aren’t your father’s search engine bots
Evidence continues to mount that search engines’ crawlers can do more than simply parse inline HTML on a given page, and this news recently came from the proverbial horse’s mouth in the form of Matt Cutts’ slightly vague PubCon announcement that Google is “getting smarter.” That’s not news to most of us, but just how, and when, “smarter” is happening is an interesting topic. Though details remain speculative, it’s clear the search engines have made much progress in developing technology that emulates human browsing. Nowadays search engine bots can execute JavaScript and AJAX, index Flash, and even complete forms. Fancy.
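To see why JavaScript execution matters for indexing, consider a page whose links exist only after a script runs. The snippet below is an illustrative sketch (the page markup and the `/api/nav` endpoint are hypothetical): a crawler that only scans static HTML finds no links at all, while a bot that executes the script would see the full navigation.

```python
import re

# Hypothetical page: the markup ships an empty <div> plus a script that
# fetches navigation data and builds <a> elements at runtime. No link
# ever appears in the static source.
raw_html = """
<html><body>
  <div id="nav"></div>
  <script>
    fetch('/api/nav').then(r => r.json()).then(items => {
      for (const item of items) {
        const a = document.createElement('a');
        a.href = item.url;
        a.textContent = item.text;
        document.getElementById('nav').appendChild(a);
      }
    });
  </script>
</body></html>
"""

# An HTML-only crawler that simply scans the static markup for anchor
# tags comes up empty-handed.
static_links = re.findall(r'<a\s+href="([^"]*)"', raw_html)
print(static_links)  # -> []
```

A JavaScript-capable bot, by contrast, would render the page, fire the fetch, and discover every link the script builds, which is exactly why content hidden behind scripts is no longer invisible to the engines.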

What does this mean for SEOs? It means that the pillars of SEO — relevance, accessibility, and authority — remain foundational. But we must work closely with design, information architecture, and user experience folks to ensure the user experience we build is positive, persuasive, and aligned with our conversion and revenue KPIs. Above-the-fold page layout, content, links, and rich media placement all need to be carefully considered. If we want to stay ahead of the curve as internet marketers, we can’t work in silos.

Google giveth more transparency
Among the more recent and impactful changes to the search landscape are announcements from Google that it’s going to be more transparent moving forward. This is welcome news since many conclusions about what the engine is doing tend to be based on speculation or statistically unsound test cases.

For SEOs, a critical announcement is that more detail will be provided to webmasters submitting reconsideration requests. We will now be informed if a suspected penalty or filter is due to manual or algorithmic action. The engine’s new monthly series on algorithm changes represents another commitment to transparency by Google. That’s more welcome news to help us filter out a bit of the noise in the industry and keep our sites in line with the trends.

Google taketh away referring keyword data
Google might give us the warm fuzzies on one end of the transparency spectrum, but it’s balancing this by taking away data on the other. In an effort to better secure personal data, the engine has elected to apply default SSL encryption to search queries from logged-in users.

What does this mean? Well, according to Google’s Matt Cutts, we will see “(not provided)” as the keyword referrer from Google in analytics reporting in place of the actual keyword used. He also said that “(not provided)” would affect no more than a single-digit percentage of queries. Unfortunately, this hasn’t proven to be the case. For some sites, a significant chunk of data is lost, which can punch holes in informed optimization efforts.
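A quick way to gauge the damage on your own site is to compute what share of organic keyword referrals is hidden. The figures below are entirely made up for illustration, but the calculation is the one you would run against your own analytics export:

```python
# Hypothetical keyword report after the switch to SSL search: visits by
# referring keyword, with all hidden queries lumped under "(not provided)".
keyword_visits = {
    "(not provided)": 1840,
    "blue widgets": 620,
    "buy blue widgets online": 310,
    "widget reviews": 230,
}

total = sum(keyword_visits.values())
hidden_pct = 100.0 * keyword_visits["(not provided)"] / total
print(f"{hidden_pct:.1f}% of organic keyword data is hidden")  # 61.3%
```

When the hidden share climbs well past Cutts’ predicted single digits, as in this invented example, keyword-level optimization decisions start resting on a minority of the actual traffic.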

Curiously, paid search queries aren’t hidden, nor are queries from logged-out users, which raises a few questions about whether this change was solely to protect users’ privacy or also to make SEOs guess while maintaining the status quo for the paid search guys. Google suggests Google Webmaster Tools as an alternative way to view the lost data, though historic data there disappears after 30 days and only the top 1,000 search queries are reported.

Google +
In yet another effort to break into the social media universe, Google launched its Google+ project with more success than any of its previous attempts. It first rolled out its +1 button, which allows logged-in users to essentially tag those search results (organic or paid) that they find worth sharing with those they’re connected to via their Google account. This was the initial attempt to join the fray against tweet and “like” buttons from the entrenched social players. Later the +1 button was made available to embed on websites, and plenty took notice.

In June, Google launched the Google+ social network, complete with circles, sparks, hangouts, and more. The platform gained quick momentum with 10 million new users in two weeks, and the numbers keep growing. It’s yet to be seen how the Google+ Project will impact search, but it’s clear that the search giant will incorporate what data it can from its social network to advance search results.

Panda! Panda! Panda!
And how can a 2011 SEO year in review be complete without lots of Panda talk? It can’t.

At the end of February, a massive update from Google, officially dubbed “Panda” after an engineer at the company, began an incremental rollout. The SEO community called this update “Farmer” since it had a significant and immediate impact on sites with thin content, often referred to as “content farms.” The first update and the six subsequent ones have changed the definition of a quality site in the eyes of Google and SEOs that are paying attention.

The update essentially combines the opinions of a human quality-rating panel with user data gathered by Google’s automated search technology. Google leveraged advances in machine learning to compare these human judgments of whether a set of seed sites is trustworthy, likeable, and other subjective qualities against the myriad data points already gathered for these and similar sites. The algorithm can then “predict” whether people would trust other websites based on that seed research. It sounds complicated because it is.

The takeaway here is that SEOs must now look at their work through a holistic lens that takes into account all elements of a site. Things like design, user experience, and truly engaging content are now all considered by Google in algorithmically measurable ways. User metrics are critical as well.

If this boils down to one point, it is that testing and consideration of every element of a site’s performance and quality are now critical to strong, lasting rankings. Simple adherence to best practices for traditional SEO elements might not be enough to help your site rank better in the long run.

Ramsay Crooks is director of SEO at Geary SEO.

