There’s a really solid post on Michael King’s blog arguing the case that Googlebot is actually Google’s Chrome browser.  While there isn’t any conclusive evidence pointing specifically at Chrome as the user agent in question, we can conclude that Googlebot is indeed a browser.

It’s an enlightening (and very long) read but I encourage you to take a look at the full article if you can.

So, what does this mean?  And why the sudden interest in search theory and software development?

Simple.  It changes how publishers should think about their sites and puts a strong emphasis on improving the user experience as a path to better organic traffic.  A combination of patent filings by Google and recent updates to search (like Instant Previews) indicates that Google understands webpages on a much deeper level and can now mirror the user experience almost 1:1.

There are tons of parameters that define the user experience which could, in theory, be used as ranking signals in Google’s algorithm — page speed, for example, is one Google has already confirmed.  If you’re not paying attention to the user experience of your site, it may very well cost you Google organic traffic (if it hasn’t already).

Be sure to follow the conversations and live tweeting at Distilled’s #SearchLove Conference.  @dohertyjf and @iPullRank do a great job of live tweeting the most impactful takeaways from the conference.

About the Author: David Weichel is the Director of Paid Search at CPC Strategy. He specializes in conversion rate optimization, search behavior research, and attribution analysis. David graduated from the University of California, San Diego with a B.S. in Management Science. See all posts by this author here.