The Evolution of Search

This past year has been one of immense change for Search, and more particularly Google Search. It all started in early 2011 with Google's algorithm update called Panda. It sent the Internet into a tailspin as webmasters and SEOs scrambled to make massive adjustments to the way they optimized their websites. Panda's focus was on rewarding higher quality sites that did not have a lot of duplicate content, both on-page and off-page. The article sites all took a big hit at that time because of the amount of duplicate content they contained. Then in April of 2012 Google announced the launch of Penguin. This algorithm change targeted bad linking strategies: backlinks from bad neighborhoods or link farms.

Since the introduction of these two major algorithm changes, Google has continued to refine and improve upon those updates, rewarding sites with good, unique content and devaluing sites with duplicate or thin content. Bottom line: Google wants to give its searchers exactly what they are looking for, not websites that are weak in content and relevancy. According to an article written by Peter Prestipino, Editor of Website Magazine, "Unique and interesting content and engaging digital design experiences are what will separate your brand from the competition and what will earn your website the rankings you want and need."

Recently I've seen a couple of other significant changes with Google. In September, Amit Singhal, senior VP of Google, stated that Google had quietly launched another algorithm change, which he called Hummingbird. Not a lot of detail has been given on this launch other than what he said about it: "Our algorithm had to go through some fundamental rethinking of how we are going to keep our results relevant."

According to TechCrunch: “The main focus, and something that went repeated many a time, was that the new algorithm allows Google to more quickly parse full questions (as opposed to parsing searches word-by-word), and to identify and rank answers to those questions from the content they’ve indexed.”

Also recently, Google confirmed that it was "obfuscating all keyword referral data going forward," meaning that it will no longer provide keyword detail for secured searches.

It appears that Google's algorithm is becoming more human and less technical in its process of determining top quality sites. Google is looking less and less at the number of targeted keywords on a page, and more at topical and relevant content. It is looking for websites with authority and good branding. It is also looking for websites that other "authority and good branding" websites refer to, not just with links but with citation mentions.

So what does that mean for your website, and where do you go from here? It means that you need to make sure that you are squeaky clean with Panda and Penguin. You need to eliminate all duplicate content pages, have good clean backlinks and focus on the quality of your content. You need to become the “authority” in your industry. You need to collaborate with industry-related, high authority websites and link with one another. According to Peter Prestipino, “In 2014, expect the savviest brands to focus on creative ways to convey meaning through design while creating highly engaging experiences simultaneously.”

As you focus your efforts on good interactive content for your users and collaborative relationships within your industry, in the long term you will become a high-ranking authority.
