
Faster! Curious how we did it?

The redesign of our site was partly vanity and partly SEO. Now, when we say SEO, we do not mean cultivating backlinks from random sites, jockeying with keywords that have nothing to do with us, or randomly finding blogs to comment on just so we can get a link to our site in their comment area.

Our SEO undertaking was all about organization, and speed. We’re going to take each thing we did piece by piece and share it with you.

Today, we’re going to talk about compressing your site.

Let out some hot air

mod_deflate is an optional module for the Apache HTTP Server (Apache 2 only). It is based on the Deflate lossless data compression algorithm, which uses a combination of the LZ77 algorithm and Huffman coding. The module provides the DEFLATE output filter, which allows output from the Apache HTTP Server to be compressed before being sent to the client over the network. (Lifted word for word from …)

In English?

It takes your files, squishes them, hands them to your visitors squished, and your visitor's browser (if it's a modern browser) unsquishes them so the page can be viewed. This saves you bandwidth and cuts your visitors' loading time, making your site faster. The trade-off is resources: the act of squishing takes more of the server's CPU than serving the files uncompressed.
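To make that concrete, here's a small Python sketch (our own illustration, not anything running on the server) that squishes a chunk of HTML with zlib, the library behind DEFLATE, and unsquishes it the way a browser would:

```python
import zlib

# A typical chunk of HTML -- repetitive markup compresses very well.
html = ("<div class='post'><h2>Faster!</h2>"
        "<p>Curious how we did it?</p></div>\n") * 100
original = html.encode("utf-8")

# Server side: squish the response before sending it.
squished = zlib.compress(original, 6)  # 6 is zlib's default speed/size trade-off

# Browser side: unsquish and render.
unsquished = zlib.decompress(squished)

print(f"original:   {len(original)} bytes")
print(f"compressed: {len(squished)} bytes "
      f"({100 * len(squished) / len(original):.1f}% of original)")
assert unsquished == original  # lossless: the visitor sees the exact same page
```

On markup like this the compressed payload is a small fraction of the original, which is exactly the bandwidth you stop paying for.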

Any of you currently using mod_deflate?

If your site already rates well on speed, you may not need to worry about mod_deflate. You are trading CPU usage for bandwidth, and we do put a ceiling on CPU, so think carefully about how busy your site is and whether you can afford the CPU hit on your account. On the other hand, if your site is racing up the Google rankings, or you seriously want the absolute best possible experience for your visitors with regard to speed, this is something you may want to implement.

And implementing it is pretty much a snap. The module runs on all of our servers right now, but it won't compress anything until you tell it to for your individual site, and you give it its marching orders through .htaccess. Here's ours:

# compress certain file types by extension:
<FilesMatch "\.(js|css|html|htm|php|xml)$">
SetOutputFilter DEFLATE
</FilesMatch>

# exempt old browsers from compression
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

This will squish our pages, JavaScript, CSS, PHP, and XML files, but not any of your graphics. Graphics should already be web-optimized anyway, and running them through DEFLATE generally won't save enough to justify the CPU cost of doing it. The BrowserMatch lines also make sure that older browsers get a plain old uncompressed page if any happen to come knocking.
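If you're curious what those BrowserMatch lines actually decide, here's a rough Python translation of the same regexes (an illustration of the matching logic only, not how Apache implements it):

```python
import re

def gzip_policy(user_agent: str) -> str:
    """Mimic the three BrowserMatch rules: pick a compression policy for a UA."""
    no_gzip = False
    gzip_only_html = False

    # BrowserMatch ^Mozilla/4 gzip-only-text/html  (old Netscape: HTML only)
    if re.match(r"^Mozilla/4", user_agent):
        gzip_only_html = True
    # BrowserMatch ^Mozilla/4\.0[678] no-gzip  (Netscape 4.06-4.08 mishandle gzip)
    if re.match(r"^Mozilla/4\.0[678]", user_agent):
        no_gzip = True
    # BrowserMatch \bMSIE !no-gzip !gzip-only-text/html  (IE spoofs Mozilla/4 but is fine)
    if re.search(r"\bMSIE", user_agent):
        no_gzip = False
        gzip_only_html = False

    if no_gzip:
        return "no compression"
    if gzip_only_html:
        return "compress text/html only"
    return "compress everything configured"

print(gzip_policy("Mozilla/5.0 (X11; Linux x86_64)"))     # compress everything configured
print(gzip_policy("Mozilla/4.08 [en] (Win98)"))           # no compression
print(gzip_policy("Mozilla/4.0 (compatible; MSIE 6.0)"))  # compress everything configured
```

The interesting case is the last one: IE announces itself as Mozilla/4, so the first rule would wrongly restrict it, and the MSIE rule un-sets both flags again.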

Again, the warning: this will raise the CPU cost of serving your pages while lowering bandwidth and speeding up the page. How high that CPU cost goes depends on the size of your pages and your traffic.
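You can get a feel for the trade-off with zlib's compression levels (again a rough illustration, not a measurement of Apache itself): higher levels burn more CPU time in exchange for a smaller payload.

```python
import time
import zlib

# A repetitive page body, like a long HTML table.
page = ("<tr><td>row</td><td>data</td></tr>\n" * 2000).encode("utf-8")

for level in (1, 6, 9):  # 1 = fastest, 9 = smallest output
    start = time.perf_counter()
    squished = zlib.compress(page, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(squished):6d} bytes in {elapsed * 1000:.2f} ms")
    assert zlib.decompress(squished) == page  # every level is lossless
```

Whatever the level, the result decompresses to the identical page; you are only choosing how much CPU to spend shrinking it.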

If your website’s implementation causes a problem, we will definitely contact you and let you know. You can also implement it and email support to ask us to look up your site’s daily average resource usage for the days preceding your implementation and the days during its use, and we’ll be happy to tell you so you have an idea.

  • Jen Lepp

    It's likely not going to be as resource intensive as all *that*.

  • That's great… but what does page load speed have to do with SEO?
    Google's “webmaster guidelines” page says nothing about it.

  • Jen Lepp

Late last year, Google's Matt Cutts confirmed on his blog that speed will become a ranking factor on Google, and at PubCon he said Google plans to introduce a new ranking factor into the algorithm, namely “how fast a site or page loads,” in his ‘what to expect in 2010’ bullet points (you can see that presentation replayed here: )

At *this* moment, it may have nothing to do with ranking and SEO, but there's widespread belief that in the very near future (within 2010), it will with Google. If you sign in to Webmaster Tools, you can see they already graph how long your site's loading takes under Labs | Site Performance, so this is data they are already keeping track of, though in a beta way.

Google's Page Speed Firefox plugin also recommends the above for sites (among other things), so while for the moment page speed may not immediately affect rankings on Google, Google has told us repeatedly and publicly that it will, and very soon.

  • Janice Schwarz

    Yeah, I've read similar articles on the subject of Google trying to “make the web faster”. I do like this! Will plan to try it out this week. Perhaps I should give you warning before trying it on Skippy's List. 😀

  • Thanks. Do you happen to have a link to that blog post?

    (I'm really looking for written primary sources here. “Widespread belief” is rumor, a round-table video is difficult to cite to busy people, and of course the Page Speed plugin talks about improving speed in various ways, since that's its express purpose.)

    I don't mean to seem argumentative here, just trying to nail this down as confirmed and citable fact rather than the voodoo that seems so common in SEO discussions.

  • Jen Lepp

The upgrade is called “Caffeine”, and you can google “Google Caffeine” to see a lot of information about it, but the major definitive announcement was made during PubCon, where it was confirmed in the talk we linked to above. In addition to that talk, there's:

    and you can see the slides Matt used here:

  • Jen Lepp

    Like I said, it all depends on what information you want to accept at face value – if someone at Google says it's probably coming, that, to me, is time to take it seriously.

Google's algorithm is proprietary. They're giving out information on page speed now, and have gone so far as to build software that suggests ways to speed sites up. Within Webmaster Tools, they rank a site's speed in comparison to everyone else's. SEO has always been a process of extrapolating what benefits a site, and to me there seems to be overwhelming evidence, between Caffeine, minimalistic design, their own public DNS, and Chrome, that speed is a huge focus for them. Since there's a Google Page Speed tool, that seems to be a pretty good indicator, along with what's actually been stated about the push for speed rankings, that it's coming.

Personally, regardless of whether and when it will directly affect rankings, the simple fact is that slower pages lose revenue: the data is already in on the fickleness of internet browsers and shoppers with regard to load time, and on just how long (not long) they are willing to wait. So whether you optimize because you might lose rank, or because you have only seconds to attract and land the traffic the search engine sends you, there seems little argument that faster load times equal better optimization.

Google doesn't state outright many of the reasons it ranks what it does the way it does; they don't have to tell anyone anything. They could be ranking on load time right now, and that's why there's a focus on speed, a graph showing you load times, and a tool showing you precisely what they think you should do and how they would score you. Or they may not be using any of it, and maybe someone was just bored. They may never do it.

Personally, though, given the level of widespread speculation and hints (and there is an enormous amount), it would seem to me that (a) it's a good idea to do anyway, and (b) if you don't do it and you have a very slow-loading site, and they do implement it overnight, you may find yourself already digging out of a hole, scrambling to regain a position you lost because the chatter wasn't enough to convince you.

For me, the statements made at PubCon, the high level of chatter, and the amount of money Google is investing in speed, coupled with the fact that it's just better practice anyway, were enough. Obviously, your mileage may vary.

  • That link to Matt Cutts's blog goes to a 2006 post about updates, and has nothing about Caffeine, site speed, or presentation slides.

    In that YouTube interview (let me mention again how much I hate using videos as sources for concrete information), speed is treated as a completely separate issue from Caffeine, and accounting for speed in the rankings is talked about as a *possibility* that some in Google are pushing for, rather than as anything definite, much less as part of Caffeine.

    And I finally got a chance to go look at that roundtable link… and found not a video as I thought you were linking to, but a liveblog of people paraphrasing what was said. (Even worse than a video as a source for concrete information.) Again, site speed and Caffeine are treated as separate things. As far as site speed being used for rankings, there are two sentences: “Matt says Google is considering making speed a ranking signal. He says Larry Page wants the web to flip.” Sounds to me like it's still under discussion.

    I searched some more on Caffeine, and it sounds like Caffeine is just a back-end improvement of their index/search infrastructure, explicitly intended not to affect site rankings much at all. Whether it's actually live now seems to be a matter of some debate, but it also seems to be irrelevant to the site speed discussion.

    I did find another post talking about Cutts's PubCon comments:
    “Matt explained that page speed is a factor in the search ad AdWords quality score and there is currently a strong push to make it a factor in the organic ranking algorithm. Matt basically implied that in 2010, it will be one additional factor.”
    This “basically implied” stuff reminds me of people trying to extrapolate the intentions of the Federal Reserve from cryptic comments by its chair. Not exactly a very reliable process. (That same post goes on to speculate that if considered, speed would likely be an extremely minor factor anyway.)
    Another post on the same topic, treating it as more speculative:

    There's no doubt that Google wants people to improve the speed of their sites, and I agree that it's a good thing. But the connection from there to SEO ranking (outside of AdWords) seems merely speculative at this point.

    I wish there were something definite we could point to one way or the other, to justify spending time on fraction-of-a-second speedups vs the infinitely-many other tasks we all have. Instead we have innuendo from a single person in the know, and speculative extrapolation from everybody else based on that… in an echo chamber leading to “widespread belief” all rooted from the same not-necessarily-so-definitive source.

    (BTW, it's ironic that this blog page actually loads rather slowly for me due to the third-party commenting engine.)

  • I don't buy the level of chatter and speculation as an indication of anything except the high amount of public interest in getting pages ranked higher. Like I said, it's just an echo chamber. But I certainly see your point about the rest.



  • Jen Lepp

    Hey Rob – they went public with it April 9th, on their blog:

    “As part of that effort, today we're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.”

    I should have bet you a cookie. 🙂