Optimizing Site Speed – Why Google Is Right

April 5th, 2011 by Search Influence Alumni

With its most recent algorithm update, Google made it clear that a site’s user experience is now a factor in your search ranking. Though this may be a giant leap forward in assessing site quality, the backlash against the Farmer update shows that such metrics aren’t always the best basis for SERP rank. But a user experience factor older than the Big Panda still gets some people’s goats.

Site Load Time

Site speed has been a significant ranking factor since about this time last year. Reading some of the comments on Matt Cutts’s blog post can be an exercise in patience, but, as always, over-concerned webmasters bring up two good points: JavaScript and CMS design.

The first is that things like JavaScript, Flash, and messy implementations of HTML and CSS can slow down a page’s load time. While standard wisdom says to avoid tricky solutions to simple problems, even common analytics and advertising packages, which are among the last things to load on a page, are often the cause of frustratingly long load times.
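One low-effort mitigation, shown here as a rough sketch (the tracker URL and file name are placeholders, not any particular vendor’s snippet), is to load third-party scripts asynchronously and as close to the end of the page as possible, so they don’t block your own content from rendering:

    <body>
      <p>Main content renders before any tracking code is fetched.</p>

      <!-- Hypothetical tracker loaded asynchronously just before the closing body tag -->
      <script type="text/javascript">
        (function () {
          var s = document.createElement('script');
          s.src = 'http://analytics.example.com/tracker.js'; // placeholder URL
          s.async = true;                                     // don't block rendering
          document.getElementsByTagName('head')[0].appendChild(s);
        })();
      </script>
    </body>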

The Big Panda update may have added ad density to the list of factors, but simply removing ads isn’t an option for many sites. Likewise, removing an analytics package would most likely do more harm than good if you’re tracking your internet business. And changing your entire layout to chase one ranking factor is farcical at best.

Yahoo’s Developer Network offers a whole host of solutions that don’t require tons of man-hours. Three major best practices are easy to implement and should solve a number of speed issues. The first is to keep scripts and CSS out of the page itself; serving them as external files saves time by letting the browser cache the component parts of your site. Second, keep all scripts in one file and all CSS in another, minimizing the number of files you have to call per page. Finally, if you’re sure you’re not going to be editing your scripts and CSS, you can minify your code, which strips out white space and comments. This is especially effective for large files, though it can make reading and editing the code more difficult.
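As a minimal sketch of what those three practices look like in the markup (the file names here are hypothetical):

    <head>
      <!-- All CSS combined into a single external, cacheable file -->
      <link rel="stylesheet" type="text/css" href="/css/site.min.css" />
    </head>
    <body>
      <p>Page content goes here.</p>
      <!-- All JavaScript combined into one external file, called once per page -->
      <script type="text/javascript" src="/js/site.min.js"></script>
    </body>

Minification then simply removes everything the browser doesn’t need, turning a readable rule like

    .sidebar {
        margin: 0 auto;  /* center the column */
        padding: 10px;
    }

into

    .sidebar{margin:0 auto;padding:10px}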

But really, optimizing site speed is just clean coding. You shouldn’t be surprised that forcing the browser to shrink multi-megapixel images down to 50×50-pixel thumbnails, or making a slew of HTTP requests for each page, keeps things slow. What happens, though, when it’s not your fault?
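Before getting to that, to make the image point concrete (the file names are hypothetical), compare a tag that forces the browser to download a full-size photo and scale it down with one that serves a thumbnail resized ahead of time:

    <!-- Slow: the browser downloads the full 2500x1900 photo, then scales it down -->
    <img src="/images/photo-full.jpg" width="50" height="50" alt="Product photo" />

    <!-- Faster: serve a file that is actually 50x50 pixels -->
    <img src="/images/photo-thumb.jpg" width="50" height="50" alt="Product photo" />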

You use a CMS because you don’t want to have to deal with all of that. You expect that what “real” coders created will follow best practices. This is especially grating when you bought a CMS and paid good money for it.

Here’s where there aren’t always good answers. To start, CMSes almost invariably use databases to house their content and other components, which immediately adds steps to every request. And while plugins and hard-coding parts of the CMS are always options, these can be either cost- or knowledge-prohibitive. For many CMSes, simple things like being SEO friendly are hard enough to accomplish without also fighting a losing battle against site speed.

Some suggest using a caching plugin, but even a benchmarking report from a year ago shows a forest of options just for WordPress, none of which look different on the surface yet which offer wildly different performance. That’s not to mention the added headache of working “with” these plugins as you write for your site. It almost seems as if Google wants you to have a “flat” HTML site.

Take a deep breath. First, remind yourself that this is one factor, and a small one; linking and on-page optimization are still the most important. After clearing your head, try to follow the same guidelines as much as you can: minifying code, using simple solutions, and avoiding inline CSS and scripts will take you farther than you’d think. Once you’ve done all that, run practical tests: how fast is your site from a user’s perspective? Would you be turned off by the wait?
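One quick way to answer that with a number instead of a gut feeling is a small timing snippet along these lines (purely illustrative; browsers without the Navigation Timing API will simply skip it):

    <script type="text/javascript">
      // Rough, user-perspective load time: milliseconds from the start of
      // navigation until the load event fires. In a real page you'd attach
      // this with addEventListener instead of overwriting window.onload.
      window.onload = function () {
        if (window.performance && window.performance.timing) {
          var loadTime = new Date().getTime() - window.performance.timing.navigationStart;
          if (window.console) {
            console.log('Page loaded in roughly ' + loadTime + ' ms');
          }
        }
      };
    </script>

Tools like Google’s Page Speed and Yahoo’s YSlow will give you the same answer with far more detail and concrete recommendations.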

If the answer is “Yes,” then you have bigger concerns than just one factor.