One of the things stopping me from hugging and embracing Zooomr is how slowly images load from their servers. Take, for example, the image on this post on Thomas Hawk’s blog. There are two things wrong with it:
- It’s 241k, but it downloads on my fast shiny broadband connection like it’s ten times bigger. Brings me back to the good old days of dial-up. Remember how that was? Oh, there’s the connection made, first bit of the image, oh oh, a small bit more, halfway there, yawn, zzzzzz. I’ve fallen asleep.
- It’s not cacheable. Every time you reload that page the whole image has to download again. Go check out what the cacheability engine thinks.
Date Thu, 14 Dec 2006 09:22:30 GMT
Content-Length 241.9K (247754)
This object will be considered stale, because it doesn’t have any freshness information assigned. It doesn’t have a validator present.
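For comparison, here’s a sketch of roughly what a cache-friendly response for that image could look like. The ETag value and max-age are made up for illustration; the point is simply that freshness information (Cache-Control or Expires) and a validator (ETag or Last-Modified) are both present:

```
Date: Thu, 14 Dec 2006 09:22:30 GMT
Last-Modified: Thu, 14 Dec 2006 09:00:00 GMT
ETag: "a1b2c3d4"
Cache-Control: public, max-age=604800
Content-Length: 247754
```

With headers like these, a browser can keep the image for a week and, even after that, revalidate it with a cheap conditional request instead of re-downloading all 241k.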
Date Thu, 14 Dec 2006 09:24:50 GMT
Last-Modified 2 min 28 sec ago (Thu, 14 Dec 2006 09:22:22 GMT) validated
Content-Length 127.2K (130220)
Server Apache/2.0.52 (Red Hat)
This object doesn’t have any explicit freshness information set, so a cache may use Last-Modified to determine how fresh it is with an adaptive TTL (at this time, it could be, depending on the adaptive percent used, considered fresh for: 29 sec (20%), 1 min 14 sec (50%), 2 min 28 sec (100%)). It can be validated with Last-Modified. The clock on this Web server appears to be set incorrectly; this can cause problems when calculating freshness.
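The adaptive TTL numbers the engine quotes come straight from the gap between Date and Last-Modified: the object is 2 min 28 sec old, and a cache may treat some percentage of that age as the freshness lifetime. A quick sketch of the arithmetic, using the timestamps from the report above:

```python
from datetime import datetime

# Timestamps taken from the cacheability report above
date = datetime(2006, 12, 14, 9, 24, 50)
last_modified = datetime(2006, 12, 14, 9, 22, 22)

# Age of the object when served: 148 seconds (2 min 28 sec)
age = int((date - last_modified).total_seconds())

# Heuristic freshness = adaptive percent of the object's age
for pct in (0.20, 0.50, 1.00):
    print(f"{int(pct * 100)}%: fresh for {int(age * pct)} seconds")
```

Running this reproduces the engine’s figures: 29 seconds at 20%, 74 seconds (1 min 14 sec) at 50%, and 148 seconds (2 min 28 sec) at 100%.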
Despite the problems reported above, that second image is cached by my browser, and even with a force reload it loads quickly.
I’m not sure how to fix the first problem except by adding a faster pipe to the servers hosting the data or upgrading the hosting hardware, but the second problem is very easy to fix using ETags and better caching headers. There are numerous tutorials and even code examples out there. Please, please, please look into it and make your images more cacheable! Your European neighbours will really appreciate it!
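To show how little is involved, here’s a minimal sketch of the ETag dance using Python’s standard library. Everything here is hypothetical (the in-memory image bytes, the port, the week-long max-age); a real image server would read from disk, but the conditional-request logic is the same:

```python
import hashlib
from http.server import BaseHTTPRequestHandler, HTTPServer

def make_etag(body: bytes) -> str:
    """Derive a strong validator from the content itself."""
    return '"%s"' % hashlib.md5(body).hexdigest()

class ImageHandler(BaseHTTPRequestHandler):
    # Hypothetical in-memory "image" standing in for a file on disk.
    BODY = b"\x89PNG...image bytes..."

    def do_GET(self):
        etag = make_etag(self.BODY)
        # If the browser already has this exact version, answer
        # 304 Not Modified and send no body at all.
        if self.headers.get("If-None-Match") == etag:
            self.send_response(304)
            self.end_headers()
            return
        # Otherwise send the image with a validator and freshness info.
        self.send_response(200)
        self.send_header("ETag", etag)
        # Uploaded images rarely change, so a long max-age is safe.
        self.send_header("Cache-Control", "public, max-age=604800")
        self.send_header("Content-Length", str(len(self.BODY)))
        self.end_headers()
        self.wfile.write(self.BODY)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ImageHandler).serve_forever()
```

The browser sends the ETag back in an If-None-Match header on reload; when it matches, the server answers with a tiny 304 instead of shipping 241k across the Atlantic again.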