
Is Your Web Site Cache Friendly?

By Brian D. Davison

With the continued growth in Net usage, Web caching is becoming increasingly important. In this article I'll discuss Web caching and its importance in site design to minimize costs and maximize scalability and compatibility. Before closing, I'll describe some ideas that may improve Web caching in the near future. First, a quick review of Web caching.

The simplest form of caching is just like using an address book that you keep close at hand. By keeping often-used information nearby, you save time: you can always find a phone number in the phone book, but that's slower and may require you to fetch the book from across the room. Web caching works similarly: a Web cache stores resources that are expected to be requested again. Web caching is useful for three important reasons: it can reduce user-perceived Web-site delays, reduce network bandwidth usage, and reduce server loads. Caching makes the Web appear faster and cost less by making better use of existing resources.

Benefits of Web Caching

When a request is satisfied by a cache (whether it's a browser cache, or a proxy cache run by your organization or ISP), the content no longer has to travel across the Internet from the origin Web server to you, saving bandwidth both for the client's side (or its ISP) and for the origin site. Instead of taking the time to establish new connections with each origin server for each resource needed, the client can retain its connection to the proxy. This is particularly beneficial to clients behind high-latency connections like modems or satellites. Furthermore, sufficiently busy proxies may be able to send requests from multiple clients over the same connection to one server, saving on connection establishments to servers as well. Recall that TCP has a fairly high overhead (in terms of time) for connection establishment, and that it sends data slowly at first. This, combined with the fact that most requests on the Web are for relatively small resources, means that reducing the number of necessary connections and holding them open so that future requests can use them (that is, making them persistent) has a significant, positive effect on client performance.

The cacheability of your Web site affects both its user-perceived performance and the scalability of your hosting solution. Instead of taking seconds or minutes to load, a cached page can appear almost instantaneously. And whether you spend $20 a month for a starter Web site that allows 1GB of transfer per month, or many thousands of dollars per month for high-end connectivity and multiple servers, a cache-friendly design will let you serve more pages before you need to upgrade to a more expensive solution. To extract the most performance for the dollar, make your site as cache-friendly as possible.

Page Views and Hit Counts

Some of you might think that caching is a terrible idea, because it means there will be page views, and users, that you will never know about. Indeed, some Web designers have explicitly used cache-busting techniques (such as setting an object's expiration date to zero) to prevent their pages, or resources on those pages, from being cached. One common form is a hit-counter image that changes on every viewing. Other designers provide no caching-related headers at all, in the belief that their absence makes the site uncacheable. As we'll see shortly, this belief is false. Moreover, proxy caches are often highly configurable, and cache operators with particular privacy, reliability, or bandwidth concerns will sometimes configure their systems to cache objects from sites not designed to be cached. In general, then, a site administrator will never know about every page view, and some caching advocates suggest using sampling techniques instead (for example, occasionally making a page or site uncacheable for a period of time to measure hits).

If you don't host your own Web pages (and even if you do), it's a good idea to find out what the caching policies are for your site. Fortunately, there are a few easy ways to get the information you need. Mark Nottingham's Cacheability Engine (see "Online Resources") is a wonderful, free Web utility that examines the page of your choice (plus embedded resources) for cacheability and recommends what needs fixing. Another method is Netscape's View: Page Info menu item, which can show some information (namely, the Last-Modified and Expires dates) about the resource currently being viewed.
However, if you have access to a telnet client that's flexible enough to let you select the port (common for UNIX/Linux telnet), you can view all the headers returned by your server and thus explicitly check the cacheability of individual resources on your site quickly and easily (see Example 1).
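Example 1 itself isn't reproduced here, but the same check can be scripted. The following Python sketch (hostname and path are placeholders) sends the minimal HEAD request you'd otherwise type into a telnet session, then parses out the headers you'd otherwise read by eye:

```python
import socket

def parse_response(raw: bytes):
    """Split a raw HTTP response into its status line and a header dict."""
    head, _, _body = raw.partition(b"\r\n\r\n")
    lines = head.split(b"\r\n")
    status = lines[0].decode("iso-8859-1")
    headers = {}
    for line in lines[1:]:
        name, _, value = line.partition(b":")
        headers[name.decode("iso-8859-1").strip()] = value.decode("iso-8859-1").strip()
    return status, headers

def fetch_headers(host: str, path: str = "/", port: int = 80):
    """The telnet session, scripted: connect to the server's HTTP port,
    send a minimal HTTP/1.0 HEAD request, and read the reply."""
    request = f"HEAD {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode("ascii")
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(request)
        raw = b""
        while chunk := sock.recv(4096):
            raw += chunk
    return parse_response(raw)
```

Call fetch_headers("www.yoursite.example") and inspect the returned dictionary for the caching-related headers discussed below.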

The headers from this example that are meaningful to caching include Date, Last-Modified, Etag, and Content-Length. An accurate Date is essential because it's the reference point against which the other dates are interpreted. A Last-Modified date is needed for conditional GET requests that include an If-Modified-Since header, because those return the content only if the object has been modified after the specified date. It's also sometimes used by caches to guess how soon the object will be modified in the future (when other cacheability headers are not present). Content-Length is necessary for persistent connections to be enabled. Finally, an Etag represents a signature for the object: if the signature of a future object at this URL matches this one, the objects are considered equivalent. Example 2 demonstrates how to view headers using an HTTP/1.1 request. The new header in this example is Expires, which specifies the date at which the content becomes invalid. For slowly or never-changing resources, setting an explicit expiration date is the perfect way to tell caches how long they can keep the object (without having to take the time to verify it).
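The If-Modified-Since exchange can be sketched in a few lines of Python (a toy decision function, not a full server; header values are assumed to be well-formed RFC 1123 dates):

```python
from email.utils import parsedate_to_datetime

def respond_to_conditional_get(last_modified: str, if_modified_since: str) -> str:
    """Decide between 304 Not Modified and 200 OK, the way an origin
    server answers an If-Modified-Since revalidation from a cache.
    Both arguments are HTTP dates, e.g. 'Mon, 01 May 2000 12:00:00 GMT'."""
    changed = parsedate_to_datetime(last_modified) > parsedate_to_datetime(if_modified_since)
    # Only retransmit the body if the object changed after the cache's copy.
    return "200 OK" if changed else "304 Not Modified"
```

A 304 answer carries no body, which is exactly where the bandwidth savings of revalidation come from.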

In addition to the common ones shown above, there's another relevant header, introduced in the latest revision of HTTP/1.1 (see "Online Resources"). Cache-Control lets you specify a variety of options, including max-age, no-cache, and must-revalidate, among others. These are quite helpful, but are not yet commonly used.
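As a rough illustration of how a cache might act on these directives, here's a hedged sketch (a deliberately simplified freshness check; the real HTTP/1.1 freshness rules consider more headers and directives than this):

```python
def parse_cache_control(value: str) -> dict:
    """Parse a Cache-Control header such as 'max-age=3600, must-revalidate'
    into a dict of directives; valueless directives map to True."""
    directives = {}
    for part in value.split(","):
        part = part.strip()
        if not part:
            continue
        name, _, arg = part.partition("=")
        directives[name.lower()] = arg or True
    return directives

def is_fresh(cache_control: str, age_seconds: int) -> bool:
    """A cached copy may be served without revalidation while its age is
    below max-age and no-cache is absent (simplified freshness check)."""
    d = parse_cache_control(cache_control)
    if "no-cache" in d:
        return False
    if "max-age" in d:
        return age_seconds < int(d["max-age"])
    return False
```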

Make Your Site Cache Friendly

Because a large number of users surf the Web behind proxy caches of some kind, it's important to make your Web site caching-compliant, whether the content of the site should be cached or not. For example, AOL servers may cache objects that have neither the Expires nor Last-Modified headers. To ensure that objects seen by AOL users are up to date, you need to use appropriate HTTP headers.
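When a response carries no explicit expiration information, many proxies fall back on a heuristic: if Last-Modified is present, they assign a freshness lifetime proportional to how long the object has already gone unchanged. A sketch, assuming the commonly used 10-percent factor (a convention in many caches, not a protocol requirement):

```python
from email.utils import parsedate_to_datetime

def heuristic_lifetime(date_header: str, last_modified: str,
                       factor: float = 0.10) -> float:
    """Freshness lifetime in seconds when no Expires or Cache-Control is
    given: a fraction of how long the resource has already gone unmodified.
    An object untouched for months is assumed unlikely to change tomorrow."""
    response_time = parsedate_to_datetime(date_header)
    modified = parsedate_to_datetime(last_modified)
    return max(0.0, (response_time - modified).total_seconds() * factor)
```

This is exactly why supplying explicit headers matters: without them, how long your content lives in a cache is left to someone else's heuristic.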

Most Web-server software provides many options for caching, and setting these appropriately is crucial. A Webmaster can control caching through options provided by the Web server, usually on a per-directory and/or per-MIME-type basis. This means that different parts of a site, and even different elements of the same page, need not share the same caching settings. In Apache, these settings can be made in the server configuration files or in local, directory-specific .htaccess files. (For an example of how to set these properties in Microsoft IIS, see the "Business Developer" article in the January 2000 issue.) If you don't manage the Web server and you're not sure how to specify cacheability settings in your environment, contact your system administrator. If he or she is unable or unwilling to let you control your Web site's cacheability, complain loudly and/or consider alternate hosting services.

Note that HTML META tags are not valid ways to specify caching properties; most caching products ignore them because they don't parse the HTML. Even the use of cookies does not preclude caching for your Web site. Although the presence of a Set-Cookie header made a response uncacheable under HTTP/1.0, this is no longer the case with HTTP/1.1. Web servers that are HTTP/1.1 compliant (and most modern servers are) can specify cacheability independently of the presence of cookies, if that's what's desired. Note that the cookies themselves are never stored by a proxy (only by the browser); only the content is recorded.
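As an illustration, a cache-friendly Apache configuration might use mod_expires directives like the following (a sketch, not a recommendation for any particular site; it assumes the mod_expires module is compiled in and enabled, and the lifetimes shown are arbitrary examples):

```apache
# Sketch of per-MIME-type expiration using Apache's mod_expires.
# Can live in the server configuration or a directory's .htaccess file.
ExpiresActive On
# Rarely-changing images can safely be cached for a month after access.
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
# HTML pages change more often, so give them a shorter lifetime.
ExpiresByType text/html "access plus 1 day"
```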

So, to maximize the cacheability of your Web site, you should give Expires headers to all static-content elements (buttons, graphics, audio and video files, and pages that rarely change) so that they can be cached for weeks or months at a time. Dynamic elements or pages may be marked Cache-Control: no-cache if they change on every request, or perhaps Cache-Control: private if it's OK for the client browser to cache the item (such as for personalized resources that don't change often). When possible, all resources should have a Last-Modified date (so that caches can check for newer versions) and all resources should include Content-Length headers so that persistent connections are possible. More caching tips for Webmasters can be found at my Web-caching Web site and at the Cache Now! campaign.
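That policy can be sketched as a small header-selection routine (an illustrative Python fragment; the 30-day lifetime and the "kind" categories are assumptions made for the example, not values from the article):

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def caching_headers(kind: str, last_modified: datetime, now: datetime) -> dict:
    """Pick caching headers per the policy above: long-lived Expires for
    static elements, Cache-Control for dynamic or personalized ones.
    Datetimes must be timezone-aware UTC values."""
    if kind == "static":
        return {
            "Last-Modified": format_datetime(last_modified, usegmt=True),
            # Example lifetime: let caches keep static elements for 30 days.
            "Expires": format_datetime(now + timedelta(days=30), usegmt=True),
        }
    if kind == "personalized":
        # Only the user's own browser may cache this.
        return {"Cache-Control": "private"}
    # Changes on every request: tell caches to revalidate every time.
    return {"Cache-Control": "no-cache"}
```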

Caching, Then and Now

Caching has always been a part of the popular Internet (DNS entries, for example, are cached by nameservers). Likewise, caching has been a part of the Web almost since the beginning. The first proxy cache was implemented as part of the CERN httpd server. Caching was implemented in other systems soon afterward, notably the Harvest Object Cache (part of the Harvest Project), which gave rise both to the NetCache commercial proxy now offered by Network Appliance and to Squid, a free, open-source proxy cache that's in widespread use today.

Today, caching is commonplace in client browsers everywhere. Workgroup and enterprise proxy caches are also widespread, especially in parts of the world where bandwidth is expensive. Proxy caches are also used by many ISPs, such as AOL, which uses Inktomi's Traffic Server. Performance and scalability of proxy caches can vary widely, so it makes sense to compare product features and to view the results of independent testing (such as the UCSD/IRCACHE Bake-offs).

Cache-busting is also commonplace today, primarily for hit-counters and banner advertising. Banner ads are typically displayed after an HTTP redirect, which makes the selection of which ad to show uncacheable, although the final image displayed can and should be cacheable. One interesting development has been the use of Web caching in the relatively recent growth of content distribution networks like Digital Island/Sandpiper and Akamai. These services place servers around the Internet to host primarily static (read: "highly cacheable") resources. Content providers that want to improve site responsiveness or spread the load of a highly popular site subscribe to these services so that resources are distributed across the Internet and thus are closer to the clients. Any dynamic portions of these Web sites are usually still hosted at the origin server, but the end result can be a significant improvement in response time and the scalability to handle sharp increases in traffic.

Caching Tomorrow

A promising caching research topic is in anticipating users' requests. The idea of prefetching content that's likely to be used has been around for years, and is quite appealing. If you know what resource the client will request next, then that resource can be fetched in advance, so that it will be in the cache and available almost instantaneously when the client actually requests it. There are a number of researchers investigating this idea, and in fact there are some products that already incorporate this technology to some extent. Perhaps you've seen popular browser add-ons such as Web3000's NetSonic or PeakSoft's PeakJet 2000. Among other features, these and similar products prefetch the links of the current page so that when the user clicks on the next link, it's available immediately. Early products in this genre were actually too energetic and tended to overload the slow Web servers of the time, but most products today are more intelligent and highly configurable.
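The core of such a prefetching agent, deciding which links on the current page to fetch ahead of time, can be sketched as follows (a toy candidate selector; real products rank links by predicted likelihood of a click rather than by document order):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather the href targets on a page: the candidate set a prefetching
    agent would pull into the cache in the background."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def prefetch_candidates(page_html: str, limit: int = 5):
    """Return the first few links found; 'limit' caps how much speculative
    bandwidth the agent is willing to spend."""
    collector = LinkCollector()
    collector.feed(page_html)
    return collector.links[:limit]
```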

Prefetching features have also been incorporated in some commercial proxy caches, such as those from CacheFlow. The primary difficulty in all prefetching mechanisms is to be able to accurately predict which resources will be needed next, to minimize mistakes that result in wasted bandwidth and increased server loads. Another concern is about the side-effects of prefetching, because clicking on some links may have undesirable consequences (like adding another product to your shopping basket). As the caching industry's focus changes from reducing bandwidth to improving Web-site responsiveness, prefetching techniques are likely to become more common.

At the IEEE INFOCOM 2000 conference, investigators from AT&T Labs Research and Tel Aviv University are expected to present techniques that do everything but prefetch. In particular, they've demonstrated the usefulness of preresolving (performing DNS lookup in advance), preconnecting (opening a TCP connection in advance), and prewarming (sending a dummy HTTP request to the origin server). In their experiments, the number of slow responses (those taking more than 1 second, even with fast client connectivity) was reduced by over 80 percent. These techniques can be implemented in both proxies and browsers, and can significantly reduce the latencies experienced by users without needing to prefetch content (and thus avoid some of the concerns associated with traditional prefetching).
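These warm-up steps are straightforward to sketch. The following Python fragment is illustrative only (it is not from the paper): it shows preresolution with a small DNS cache and opening a TCP connection in advance, the first two of the three techniques:

```python
import socket

_dns_cache = {}

def preresolve(host: str):
    """Perform the DNS lookup ahead of time so a later connect() does not
    pay for it; results are remembered in a small cache."""
    if host not in _dns_cache:
        _dns_cache[host] = socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP)
    return _dns_cache[host]

def preconnect(host: str, port: int = 80, timeout: float = 5.0) -> socket.socket:
    """Open the TCP connection in advance: the three-way handshake
    completes before the user ever clicks the link."""
    return socket.create_connection((host, port), timeout=timeout)
```

Prewarming would then send a small request over the preopened connection so the server-side path is ready as well.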

In Closing

The caching and content delivery markets are projected to exceed $2 billion in the next couple of years. As more traffic is served by caches instead of origin servers, it's prudent to consider how to maximize their usefulness for your Web site. Caching will never be a panacea, but it is a tool that can be used by Web-site developers to increase site responsiveness, minimize server and network loads, increase scalability, and manage the freshness of the delivered content.


In addition to operating his Web-caching resources site, Brian is a doctoral candidate involved in Web-caching and prefetching research at Rutgers University. He also runs a small online computer bookstore.
