A few days ago I gave a presentation at San Jose State University on how to speed up your website’s frontend performance, which, according to many, is responsible for 80-90% of a page’s load time.
Rather than throw up a bunch of slides and blab on about expires headers, gzipping and concatenating, I created an awesome social networking website called openSpaceBook. Go ahead and check it out, any login will work.
Now that you’ve realized you should cancel your Facebook account because this is going to be awesome, let’s look at why it’s pretty slow to load and how we can speed it up.
According to webpagetest.org, a handy-dandy webpage loading tool, it takes about 12 seconds for openSpaceBook’s home.html page to load over DSL. 12 seconds! And that’s just a static HTML page, with nothing going on in the backend. WTF is taking so long?
This is what’s taking so long:
26 HTTP requests, 773KB! Rendering doesn’t start until 12 seconds into downloading the page! (That’s the thin vertical blue-green line.)
Don’t think this page is unusually bloated compared to other websites. Facebook makes 182 requests and has 500KB in assets (down from over 1MB a few months ago). YouTube: 42 requests, 356KB. And those sites have most likely been optimized; openSpaceBook has not.
So I have this bloated website, but I can’t remove any features because I’ll lose the ‘ooh shiny!’ audience that I’m trying to attract. What should I do?
Step 1: Reduce HTTP requests
The first thing that can be done is to reduce the number of HTTP requests, for two reasons. The first is that every HTTP request carries a small amount of network overhead, so the fewer requests you make, the less overall network traffic you generate.
The second reason is that browsers limit the number of concurrent HTTP requests they will make to a webserver (or hostname, specifically). Older browsers were limited to 2. Some newer browsers have upped that limit to 6 or 8, but they still have a limit. When a browser reaches its limit, it has to wait for in-flight requests to finish before starting new ones. So the more requests a page needs, the more queuing occurs.
So let’s see how we fare after reducing our HTTP requests. I did this manually by copying and pasting all of my CSS files into one file and all of my JS files into another. I also merged all of my CSS background images into a single sprite image, so I can use CSS background positioning to show only the graphic I need.
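The sprite trick can be sketched like this (the file name, class names, and pixel offsets here are made up for illustration; yours will depend on how you stack your images):

```css
/* icons.png is one combined image with each small graphic stacked
   vertically. Each rule shifts the background so that only the
   wanted 16x16 slice shows through the element's box. */
.icon {
  background-image: url('/img/icons.png');
  background-repeat: no-repeat;
  width: 16px;
  height: 16px;
}
.icon-friend  { background-position: 0 0; }     /* first icon */
.icon-message { background-position: 0 -16px; } /* second icon, 16px down */
.icon-photo   { background-position: 0 -32px; } /* third icon, 32px down */
```

Every icon on the page now costs one HTTP request total instead of one request each.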
Results after combining files
Shazam! Down to 5.8s! That cuts our load time roughly in half for less than 30 minutes’ worth of work. And 8KB has been shaved off of the page. Sweet. On to our next step.
Step 2: Set expires headers
Another simple trick that requires only a few lines of code is setting expires headers. An Expires header tells the browser when an asset ‘expires’ from its cache. Set it years in the future and the browser will cache the asset and never ask the website for it again.
Expires headers look like this:
Expires: Thu, 15 Apr 2020 20:00:00 GMT
You can set expires headers in Apache by adding this to your httpd.conf or .htaccess file:
Header set Expires "Thu, 15 Apr 2020 20:00:00 GMT"
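A hard-coded date like that eventually passes and has to be updated by hand. Apache’s mod_expires module can instead set expiry relative to when the asset was fetched; a minimal sketch, assuming mod_expires is installed and enabled:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Cache static asset types for a year after the browser first fetches them
  ExpiresByType text/css "access plus 1 year"
  ExpiresByType application/javascript "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
</IfModule>
```

Either approach produces the same effect from the browser’s point of view: a far-future Expires header on every static asset.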
Hmm, 6 seconds. No improvement. Not surprising, since we didn’t really change anything; in fact, we added an extra header per request. The extra 0.2 seconds could be due to that, or simply to varying network latency or utilization.
What did improve was the repeat-view time: down from 1.3 seconds to 0.6 seconds. And this improvement carries over to every other page. Any previously cached asset will not be downloaded again, or even checked for a newer version (an If-Modified-Since request). So a user’s multi-page experience will be significantly faster.
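For comparison, here is roughly what that conditional check looks like on the wire when there is no far-future Expires header (the URL and dates are illustrative):

```
GET /css/combined.css HTTP/1.1
Host: www.example.com
If-Modified-Since: Thu, 15 Apr 2010 20:00:00 GMT

HTTP/1.1 304 Not Modified
```

The 304 response carries no body, but the round trip itself still costs time; a far-future Expires header eliminates the request entirely.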
So what else can be done to really improve performance?
Step 3: Gzip your content
Now we’re talking. Compressing your content is an easy way to significantly reduce the amount of data transmitted. HTML, CSS & JS are all text, so they compress very well. So let’s see what kind of performance increase we get when we compress our content.
Yes! Down to 2.7 seconds! And we’ve cut the amount of data down to 223KB, which equates to $$$ saved when you pay for bandwidth.
To gzip your content in Apache, you can add this to your httpd.conf or .htaccess file:
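A minimal sketch using mod_deflate, assuming the module is installed and enabled:

```apache
<IfModule mod_deflate.c>
  # Compress the text-based types; images are already compressed
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

Browsers that advertise `Accept-Encoding: gzip` get the compressed version; everyone else gets the content uncompressed.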
So we’re down to 2.4 seconds. Can we do any better? You’re darn tootin’!
Step 5: Minify CSS & JS
So how well does openSpaceBook do after minifying?
2.2 seconds, not bad. Down to 154KB too. So openSpaceBook has gone from 12 seconds and 773KB of data to 2.2 seconds and 154KB. That’s an 81% improvement in load time and an 80% decrease in page weight. And all those steps took me about an hour or two to do. That’s a pretty good ROI for a few hours’ worth of work.
Speed is everything. It’s a feature that’s often left out of PRDs and out of users’ conscious thoughts about websites, but it’s there: they notice on a subliminal level when your site is slow. With all the hoopla about how browsers are getting faster and faster, how fast you can deliver content to your users becomes more and more important.