Is Page Load Speed Google's Next Organic Rankings Factor?





A tweet out of SMX Advanced last week got my attention:

Tweet from Meaghan Olson (@MissDeFacto) of TotalAttorneys.com: load times are increasingly a factor in SERPs.

We’ve known since March of 2008 that Google factors page load time into the AdWords Quality Score, which helps determine ad rankings. But there’s been little discussion of whether speed matters for organic rankings, beyond a general suspicion that it might. Even SEOmoz’s ranking-factors list rates server response time as merely “Moderately Important,” and then only from the perspective of being crawler-friendly.

So what to make of this hint dropped over lunch at SMX? As I see it, an algorithm change is in the works, and sites with merely passable page load times can expect to lose rankings to their speedier competitors shortly. Don’t be evil; be fast as hell.

Google Giveth

Perhaps it was coincidence, but the very next day, Google released a speed tool for developers built on Firebug. The comments on TechCrunch ran pretty much along the lines of this one from Patrick: “Yeah, this is exactly like YSlow. Google has to build all of their tools themselves though to prove how smart they are.”


Maybe. But why would Google release an app that’s almost identical to YSlow? Perhaps because they need one of their own. Why? Because in a few months, Matt Cutts will be holding a microphone and telling anxious Webmasters that the algorithm now factors in page load time, so they need to focus on optimizing page loads. Directing people to Yahoo would be a bit bush league, so Google needs a tool of its own. To me, it’s totally Google’s style: let’s not be evil by changing the algorithm without giving Webmasters tools to test page load speeds. So voilà, Google Page Speed.

How Big a Factor?

My guess is we’ll see page load speed impact long-tail SERPs where trusted sites offer relatively similar information. For example, if you search “real estate casis elementary” on Google, you get a cluster of sites like Yahoo Real Estate, Trulia, and Zillow that take feeds from Education.com or Greatschools.net (which also rank) with similar (but not duplicate) content. Which one is best to show: the page from the more-trusted site, or the page that loads faster? Increasingly, I think the answer will be “the fast one.”

What to Measure: HTML Serve Time, Page Serve Time, Page Render Time?

If this is all true, I think it’s worth stopping to consider what exactly Google would measure. The time it takes my server to spit back HTML? The time to retrieve all the JavaScript, CSS, and images for a page? Or the time it takes the browser to render everything?
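To make the first of those distinctions concrete, here’s a minimal sketch of timing “HTML serve time.” This is my own illustration (not anything Google has published), written in TypeScript and assuming Node 18+’s built-in fetch:

```typescript
// Minimal sketch of "HTML serve time": how long the server takes to return
// the raw HTML, ignoring the JavaScript, CSS, and images it references.
// (My own illustration, not Google's measurement method.)
async function htmlServeTime(url: string): Promise<number> {
  const start = Date.now();
  const res = await fetch(url); // request the page
  await res.text();             // wait for the complete HTML body
  return Date.now() - start;    // elapsed milliseconds
}

htmlServeTime("https://example.com/").then((ms) =>
  console.log(`HTML served in ${ms} ms`)
);
```

Measuring full “page serve time” would additionally fetch every asset the HTML references; measuring render time requires a real browser, as sketched below.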

Meaghan told me that the Google engineer focused mainly on page serve time, but said the client side “matters because it annoys users.” So I suspect Google is working hard to measure total client-side page render time, and, not coincidentally, that’s exactly what Page Speed measures.
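A rough way to approximate total client-side render time from inside the browser (again a hedged sketch, not how Page Speed works internally) is to note when the load event fires. performance.now() counts milliseconds from the start of navigation, so sampling it in the load handler captures the time until all scripts, styles, and images have arrived:

```typescript
// Runs in the browser: approximate total page load/render time by timing
// the load event relative to the start of navigation.
window.addEventListener("load", () => {
  const renderMs = Math.round(performance.now());
  console.log(`Load event fired ${renderMs} ms after navigation start`);
});
```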

Also, if you reel back the tape to Matt Cutts’ keynote at Pubcon in Austin, you’ll hear him say, “The team there only thinks about speed. They want to get the results back to users as quick as humanly possible.” Now, I realize Matt was speaking about how fast Google returns its own results to users, but if you pay close attention, he was talking about how Google tries to improve client render time. If improving client render times is good for Google users on google.com, then you can bet they believe the same is true elsewhere.

Upshot: Panic!

Just kidding. If you run a site with highly unique content (e.g. a blog), diverse competition, and solid current rankings, I’d expect relatively little impact on you. But if you’re responsible for a site whose content is similar to competitors’ (e.g. real estate listings sites) and the competition is clued into the standard SEO tricks, then I won’t be surprised to see faster sites outrank slower, higher-authority sites in some cases.

How exactly will Google measure?  In the short term, they’ll probably measure the time it takes your server to spit back HTML.  But in the long term, I expect it will be total browser render time.

I’d love to hear what others think about this situation. Any guesses what happens if you have long, text-heavy pages? Theoretically, they’ll have poor page load times if measured purely by HTML serve time. Will Google adjust your page load time for the amount of content you’re serving up? If not, won’t that effectively penalize long, text-rich pages? How well can Google measure the render speed of pages served by JS/AJAX-heavy frameworks like Wicket? And how can this be gamed: are there scripting tactics to hide object loads and make pages appear to have less cruft than they really do? One such tactic is sketched below.
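On that last question, here’s the flavor of trick I have in mind, purely hypothetical: defer heavy assets until after the load event, so any measurement keyed to “page loaded” sees a lighter page than users ultimately download. The data-src attribute is just a convention I made up for the example:

```typescript
// Hypothetical gaming tactic: real image URLs hide in data-src, invisible
// to a measurement that stops at the load event; only afterwards do we
// swap them into src, triggering the actual downloads.
window.addEventListener("load", () => {
  document
    .querySelectorAll<HTMLImageElement>("img[data-src]")
    .forEach((img) => {
      img.src = img.dataset.src as string; // start loading the real image now
    });
});
```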
