
Advanced Website Optimization – Making your Site Faster

April 1st, 2010

Introduction

There are a few interesting tools around to analyze the speed of your website. Yahoo’s YSlow and Google’s Page Speed (both Firefox plugins) are a good start and offer a lot of advice and background information. In this post you’ll see graphs from Webpagetest.

This website offers the best visual analysis IMHO and shows exactly how the page is loaded: which file was received over which connection and at which time. I’ve used it to optimize my file sync webpage and will use the steps I took as an example. Here’s my starting point:


Image 1: First benchmark

Each bar represents a file. Time goes from left to right (less is better) and the different colors represent the different aspects of the file transfer.

Tip #1: File size doesn’t really matter

This may be hard to believe, especially if you were used to optimizing your site in pre-DSL times. But check the image again and look for blue bars. Blue marks the actual data transfer time; everything else is connection overhead. Right, there’s not much blue.

Connections count

When I analyzed my site for the first time, I was surprised to see that files weren’t loaded all at once, but one after another with only a few in parallel. The graph shown above assumes that your browser uses 4 parallel connections to load data. Only when one of these 4 connections has loaded a file completely is the next file requested.

And these requests are quite slow, so keeping the number of requests to a minimum is the important factor now. The reason is that the HTTP/1.1 standard recommends using only a small number of parallel connections per server. In practice, the number differs from browser to browser:

Browser      Parallel requests (default)   Configurable?
Chrome 3     4
Firefox 2    2                             network.http.max-persistent-connections-per-server
Firefox 3    6
IE 7         2                             Q282402
IE 8         6
Opera 10     16                            Tools > Preferences > Advanced > Network
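
In Firefox, for example, the limit behind the preference listed above can be changed via about:config or a user.js file in the profile folder. A minimal sketch (the value 8 is just an illustration, not a recommendation):

    // user.js in the Firefox profile folder:
    // raises the number of persistent connections per server
    user_pref("network.http.max-persistent-connections-per-server", 8);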

Tip #2: Reduce number of files/images

To reduce HTTP requests, you have to reduce the number of files and images. Stripping images that you don’t really need is one way to do this. I had some quotation mark images that I didn’t really need; normal characters worked here, too. If you have multiple JavaScript or CSS files, you can combine them into one.

CSS sprites are another way of reducing the number of images. If you have many similar ones, like menu icons or buttons with overlays, you can put them into one single file and use CSS tricks to show only the part that you need. I had already been using them; otherwise the loading time would have been a lot worse.
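
A minimal sketch of the sprite trick, assuming an image icons.png that stacks two 16x16 icons vertically (the file name, class names and offsets are made up for illustration):

    /* icons.png contains both icons on top of each other, 16px each */
    .icon      { display: inline-block; width: 16px; height: 16px;
                 background: url(icons.png) no-repeat; }
    .icon-home { background-position: 0 0; }      /* shows the upper 16px */
    .icon-mail { background-position: 0 -16px; }  /* shows the lower 16px */

In the HTML, an element then simply gets both classes, for example class="icon icon-home". Instead of two image requests the browser now makes only one, and both icons arrive together.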

Special conditions apply

The graph shows another interesting thing. After the main file is loaded, everything waits for the next file, a JavaScript file, to finish. Only after this JavaScript file is done do the images start to load. This seems to be the usual browser behavior, so that the script gets a chance to modify the page.

Tip #3: Put JavaScript at the bottom

That’s why you should put JavaScript includes at the end of the HTML code. Your CSS includes, however, should stay at the top to allow early loading.
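
Roughly, the structure I’m describing looks like this (style.css and script.js are placeholder names):

    <html>
    <head>
      <!-- CSS early: it is needed for rendering -->
      <link rel="stylesheet" type="text/css" href="style.css">
    </head>
    <body>
      <!-- page content and images here -->

      <!-- JS last: it no longer blocks the images -->
      <script type="text/javascript" src="script.js"></script>
    </body>
    </html>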

After making these changes, my website already loaded much faster. It started to render (vertical green line) after 1 second (before: 1.5s) and reached document complete (vertical blue line) after 2.8s (before: 3.4s). (The files after the blue line are the favicon and extra loads from a script.)


Image 2 – Removed 2 images, JS at the bottom

Forcing parallel downloads

The limitations for parallel requests were put in place to keep browsers from overstressing servers. But usually servers can handle a lot more. Since the limitations mentioned above are per domain, you can use extra domains (which may be hosted on the same server) to force the browser to open more connections.

I tried that by using the subdomains img1.easy2sync.com and img2.easy2sync.com. The downside is that the browser has to perform extra DNS lookups (even if it’s only a different subdomain). You can see this extra time as 2 new dark green boxes.
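
In the HTML, this is nothing more than spreading the image URLs over the subdomains (the file names here are placeholders):

    <!-- both subdomains may point to the very same server -->
    <img src="http://img1.easy2sync.com/box.png" alt="Product box">
    <img src="http://img2.easy2sync.com/screenshot.png" alt="Screenshot">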


Image 3 – Using multiple domains

You can clearly see that now more downloads are done at the same time. The start-to-render time is almost the same and the document-complete time has decreased from 2.8s to 2.2s.

Tip #4: Use extra domains. Maybe.

This is a lot of work and the effect wasn’t that big for me, so it may depend on the number of files you have. Plus, it depends on the user’s choice of browser. But it can make a difference. However, you shouldn’t move JavaScript files to other domains, since cross-domain script access might cause problems.

Cutting connection overhead

The orange parts in the graph are interesting, too. They show the time required for the initial connection. You can see this orange part in every row because my server didn’t support the “Connection: Keep-Alive” feature. This feature lets the server re-use a connection after a file has been transferred completely, instead of closing the connection and opening a new one. All current browsers support it, but maybe your server doesn’t.

Tip #5: Turn on “Connection: Keep-Alive”

My server didn’t, and it took some time until my hosting company fixed this after I inquired. You can see in the next image that most of the orange bars are gone. Since this benchmark was done much later you can’t really compare it to the previous benchmarks (some things are shown as slower now for no obvious reason), but it’s probably still safe to assume that this change improved the speed.


Image 4 – With "Connection: Keep-Alive"; taken much later than the other benchmarks
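
If you run the server yourself and it happens to be Apache, turning the feature on is just a few directives. A sketch (the values are examples, not recommendations):

    # httpd.conf: let browsers re-use one connection for several files
    KeepAlive On
    MaxKeepAliveRequests 100
    KeepAliveTimeout 5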

Tip #6: Use Expires or Cache-Control Headers

If you have already combined or minified your CSS and JavaScript resources and combined your images into sprites, you can also apply the following tricks:

  1. For static content, add an “Expires” header and set it far in the future. This means that a static file like http://www.site.com/images/logo.gif, which has a low probability of changing in the future, will “never expire”, so the browser will not repeatedly download that file each time it’s requested and will grab it from the cache instead.
  2. For dynamic content that can change in the future, like the CSS files, you can add a “Cache-Control” header with the max-age=[seconds] option. This is similar to “Expires”, but the directive is relative to the time of the request rather than absolute: [seconds] is the number of seconds after the request until the browser will reconsider refreshing the file. A configuration sketch for both cases follows below.
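
A minimal .htaccess sketch for both cases, assuming Apache with mod_expires and mod_headers enabled (the lifetimes are examples; adjust them to how often your files really change):

    # 1. Static images: far-future Expires header
    ExpiresActive On
    ExpiresByType image/gif  "access plus 1 year"
    ExpiresByType image/png  "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"

    # 2. CSS that may change: relative Cache-Control, here one week
    <FilesMatch "\.css$">
        Header set Cache-Control "max-age=604800, public"
    </FilesMatch>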

Summary

During this session I cut down the time till the document is complete from 3.5s to 2.2s. Making the site faster for the customers is only one aspect here. Page speed is also part of Google’s ominous “quality score”, so it might even influence your website’s position in the search results.

On the other hand, it’s never that simple. The speed differs with the location and connection of the user and the browser they use. Image 4 also shows that benchmark results may differ some time later for no obvious reason. But faster is still faster, and spending some time to optimize your site might be worth it. To get started, simply visit Webpagetest and enter your page URL.

Thomas Holz is the owner of ITSTH and the author of Outlook tools to synchronize, remove duplicates and use boilerplate texts. He writes in his devblog if he still has too much time left after optimizing the website.
