Image caching prohibited at the server level?

A programmer friend just approached me with an odd problem which I couldn’t explain. Basically what he’s seeing is this:

On Server A, you can do a search and the results page loads; in the log you’ll see the requests coming in from the client for the navigation image files. Then you can hit Next to go to the next page of results, and you won’t see the image files re-requested, since the browser already has them.

On Server B, the search behaves the same way until you hit Next. Then the next page loads and requests all the navigation image files as though it were the first time it had ever heard of them. And if you hit Next to see the third page of results, again it requests all the navigation images.

He’s seeing this in the same browser, so it appears to be an issue on the server side, not the client side.

Anyone got any idea what’s going on?

One environmental note: Servers A and B are, theoretically, running exactly the same proprietary web server, which is integrated with the search software.
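
One way to see what’s actually different is to fetch the same navigation image from both servers and compare the cache-related response headers. Here’s a minimal sketch in Python; the hostnames and image path are placeholders, since I don’t know the real setup:

    import urllib.request

    # Hypothetical hostnames and image path; substitute the real ones.
    SERVERS = ["http://server-a.example.com", "http://server-b.example.com"]
    IMAGE = "/images/nav_next.gif"

    # The headers that influence whether a browser re-requests an image.
    CACHE_HEADERS = ["Date", "Expires", "Cache-Control",
                     "Last-Modified", "ETag", "Pragma"]

    for base in SERVERS:
        print(base)
        with urllib.request.urlopen(base + IMAGE) as resp:
            for name in CACHE_HEADERS:
                # getheader() returns None when the server omits the header
                print(f"  {name}: {resp.getheader(name)}")

Diffing the two outputs should show whether one server is sending headers that forbid caching, or simply omitting the ones that allow it.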


3 thoughts on “Image caching prohibited at the server level?”

  1. Web servers are capable of being configured to issue Expires: and Cache-Control: headers for subsets of content (e.g., for images). The two proprietary web servers *might* be configured differently. (Perhaps someone was playing with performance tuning on one and didn’t back out their experiment or propagate the result to the other server.)
    Or, there might be a caching proxy in the path to one server but not the other.
    Your friend can gain some enlightenment by finding a way to capture the raw HTTP requests and responses. This can be done by installing a packet sniffer, or by finding a client-side proxy that’ll capture and log headers.


  2. Thanks, Dave. I’ll pass that along.
    I think he is already looking at the raw requests and responses, but the performance tuning fallout idea is a great suggestion!


  3. One other thing to check is that the time is synchronized on both servers. If the “Expires:” header is set, the expiration check happens in the browser: it decides whether to re-request the images by comparing the page’s expiration information against its own clock. So if the browser’s clock is roughly in sync with one server, it won’t request the images again; if the other server’s clock is off, the Expires: times it stamps may already be in the past from the browser’s point of view, and the browser will decide it needs to refresh the images. I run xntp on all my machines to ensure that they all have the same time. (A quick way to measure the skew is sketched after the comments.)

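To make the clock-skew check from the last comment concrete, here’s a minimal sketch (hypothetical hostnames again) that compares each server’s Date: response header against the local clock. If the skews differ by more than a few seconds, the expiration theory is worth chasing first:

    from datetime import datetime, timezone
    from email.utils import parsedate_to_datetime
    import urllib.request

    # Hypothetical hostnames; substitute the real ones.
    for base in ["http://server-a.example.com", "http://server-b.example.com"]:
        with urllib.request.urlopen(base + "/") as resp:
            # Servers normally stamp a Date: header from their own clock;
            # compare it against this machine's UTC clock.
            server_time = parsedate_to_datetime(resp.getheader("Date"))
            skew = (server_time - datetime.now(timezone.utc)).total_seconds()
            print(f"{base}: clock skew {skew:+.0f} seconds")

Any Expires: times a server stamps will be shifted by the same amount as its Date: header, so a large negative skew on one server would explain images that always look stale to the browser.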
