Wednesday, September 13th, 2006

Ajax IE Caching Issue

Category: Ajax, IE

David Arthur, like many, has had problems with the caching issue that Internet Explorer seems to have with Ajax connections:

If you’ve been working with the Ajax framework long enough, I’m sure you’ve run into at least a few speed bumps thanks to Internet Explorer. Not a day goes by that I don’t have to rewrite a line of code, or tweak my CSS in order for IE to render what I think it should. But alas, this is the nature of software that comes from a company that views standards compliance as a recommendation.

He describes his problem – grabbing a new image with an Ajax request, a seemingly simple task – and the results of his queries: IE decided caching was the “in” thing and wasn’t going to grab anything new. He tried all sorts of hacks and fixes to get things working, but to no avail. Finally, after finding this entry on Wikipedia, he stumbled across a solution – using POST instead of GET.

While he was figuring it out, though, there were also lots of comments being made to the original post with hacks to get around the issue – so many that he wanted to create a new place for them all to be shared. In his post he includes two of the suggestions: using something like a timestamp to vary the URL, and adding an unused POST variable.
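The timestamp trick is simple to sketch. The helper name below (`bustCache`) and the `_ts` parameter are our own illustration, not from David’s post; any name the server ignores will do:

```javascript
// Append a millisecond timestamp so IE sees a unique URL on every GET.
// The "_ts" parameter name is arbitrary; the server should ignore it.
function bustCache(url) {
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  return url + sep + '_ts=' + new Date().getTime();
}

// In the browser this would feed a raw XMLHttpRequest, e.g.:
//   xhr.open('GET', bustCache('/latest-image.html'), true);
```

Since the timestamp changes on every call, IE never finds the URL in its cache and always goes back to the server.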

Posted by Chris Cornutt at 8:25 am




Thank god I’m not the only one having this problem. I wonder when this bug was first found?

Comment by mdm-adph — September 13, 2006


Yet if you read the HTTP specification, you’ll learn that any resource fetched with the GET method may be cached by any agent in the chain (browser, proxy, even web server), whereas with the POST method the query will be resubmitted and reprocessed each and every time.

A solution, if you still want to use GET, is to add a timestamp or an ever-changing parameter to your query to make the resource identifier unique for each call…

Comment by Sad developer — September 13, 2006

Actually, in this regard IE follows the HTTP RFC, which clearly states that GET requests are cacheable. So it really isn’t an IE bug.

There are several workarounds for this, the simplest being attaching a unique but completely unimportant parameter to the query string.

Comment by Marko — September 13, 2006

This is ironic to me, because about 2 months ago I was trying to scroll a long image across the top border of a website for a nice DHTML effect, and I had just the opposite problem. Every time I changed the position of the graphic, IE grabbed a new copy of the image, resulting in tremendous bandwidth inefficiency as well as an ugly flicker effect.

Comment by Daniel — September 13, 2006

What about setting the ‘Cache-Control: no-cache’ response header instead, isn’t that supposed to do the trick?

Comment by Sunday Ironfoot — September 13, 2006

@Daniel: to stop the flicker check out these links:

Comment by Mario — September 13, 2006

Use POST or use a cache-buster param in the query string. One sentence’s worth of content expanded into several blog posts and an article on Ajaxian… wow.
And btw, it has nothing to do with prototype.js; the comments on his blog entry are oddly blaming it. Caching per unique URI is expected behavior with a GET.

Comment by Ryan Gahl — September 13, 2006

The article has nothing to do with prototype – I just don’t like prototype.js

Comment by David Arthur — September 13, 2006

Right, and the article having nothing to do with prototype.js makes it even more odd that your commenters are blaming it :-)

Comment by Ryan Gahl — September 13, 2006

BTW, I didn’t think about this after my first comment… Ajaxian finally fixed the comment bug!! I’m impressed. That only took about 6 months. Must have finally been a break in all the convention planning :-). Sorry Ajaxian, obviously I can get pretty cynical, hopefully it was my squeaky wheel bitching that pushed you to finally fix it.

Comment by Ryan Gahl — September 13, 2006

Ryan: Ugh, we do need to do a better job maintaining our site’s blog platform. Thanks for calling us out on it.

Sunday Ironfoot: If you have access to the server doling out the content, obviously setting the proper headers (Cache-Control or an expiration header) is the way to go, but the hacks can serve as a useful way to deal with caching when you’re doing, say, a mash-up and you’re dealing with a server that doesn’t properly handle headers.

Comment by Ben Galbraith — September 13, 2006
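Ben’s server-side fix can be sketched as a small helper. The function name and the Node.js usage in the comment are illustrative only; the headers themselves are the standard HTTP ones for defeating caches:

```javascript
// Response headers that tell browsers and intermediaries not to reuse
// a cached copy of this resource.
function noCacheHeaders() {
  return {
    'Cache-Control': 'no-cache, no-store, must-revalidate',
    'Pragma': 'no-cache',                        // for old HTTP/1.0 proxies
    'Expires': 'Thu, 01 Jan 1970 00:00:00 GMT'   // a date already in the past
  };
}

// In a Node.js-style handler this might be used as:
//   res.writeHead(200, noCacheHeaders());
```

When you control the server, this beats client-side cache busters: every agent in the chain is told explicitly not to cache, instead of being tricked with unique URLs.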

I thought we all knew this…

Comment by Rob — September 13, 2006

Prototype haters… uh oh… let’s not start a flame war… I

Comment by Mario — September 13, 2006

I *heart* prototype.js (to each their own I guess)

Comment by Mario — September 13, 2006

Rob: I’m guessing a lot of regulars do, but there are people learning ajax all the time who get caught by this.

Comment by Rob Sanheim — September 13, 2006

I used exactly the same technique as above, just putting either the date and time or a random number in the querystring to force IE to reload the relevant document.

This gets around caches in pretty much all browsers.

Comment by Tim Leonard — September 13, 2006

Hey, I posted about that a while ago (I think last June); it’s easy, just add a dummy date to your Ajax requests!

Check my blog here for the details. Works like a charm for me :-)

Comment by Paschal — September 13, 2006

I’ve never had a problem with setting the If-Modified-Since header.
Blog post about how to do this in Prototype.

Comment by RIM — September 14, 2006
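The If-Modified-Since approach mentioned above can be sketched like this; the header value is just an arbitrary date in the past, and the Prototype call sits in a comment because `Ajax.Request` is Prototype’s API (assuming a version whose `requestHeaders` option accepts an object, as 1.6 does):

```javascript
// An epoch date forces IE to revalidate with the server instead of
// silently serving the response from its cache.
var FORCE_REFRESH = { 'If-Modified-Since': 'Thu, 01 Jan 1970 00:00:00 GMT' };

// With Prototype, in the browser:
//   new Ajax.Request('/data.html', {
//     method: 'get',
//     requestHeaders: FORCE_REFRESH,
//     onSuccess: function (transport) { /* ... */ }
//   });
```

Unlike URL cache busters, this keeps the URL stable, so server logs and bookmarks stay clean.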


The workaround with the seconds doesn’t work with a lot of proxy servers. To be more sure that nobody caches the request, use POST. As I can imagine, Ajax.NET has been doing this from the beginning. And the second issue is that GET doesn’t work for large amounts of data.

Comment by Oliver — September 15, 2006

Oliver wrote: And the second issue is that GET doesn’t work for large amounts of data.

Because there is a limit to how long a URL can be. Nothing new…


Comment by Eric Pascarello — September 15, 2006

I’ve run into this problem many times. For my current client we are using Ajax.Updater() from prototype.js and we’re pulling plain .html content. My first attempt to fix IE caching was to use POST, but as some of you already know, with Apache you can’t use a POST to request an HTML page.

Therefore our solution was to add ‘?random=’ + Math.random() to the end of all our requests. It’s nasty looking, but it solves the issue.

Comment by Aaron Pedersen — September 15, 2006

Ahhhhh, thx man, I had a similar problem and thanks to this article I found the solution. :D

Comment by Bazzz — October 22, 2006

I am using Ajax. I have added some data to the Cache using the Add method, but when I use Ajax I cannot use the Cache. How do I do it?
I am using System.Web.Caching.

Comment by Himani — November 8, 2006

Using Sajax, I have experienced the same caching problem, trying to retrieve a list of JSON objects from the server which changes from time to time.
I’ve solved the problem with the Expires HTTP header parameter, e.g. in Perl CGI:
print $cgi->header(-expires => '-1d'); # already expired!!

Comment by Morque Alain — November 15, 2006

Thanks.. the POST method saved me.. (sigh, IE is a pain in the ass) For some reason I started debugging using meta tags and adding &random, but it did not work for me.

Comment by Bravo — March 9, 2007

Thanks for the suggestion, it really works for me after going through many problems.

Comment by Rajeev Khandelwal — August 10, 2007

Hallelujah! Somebody figured it out, thanks.

Comment by Esteban — August 15, 2007

I’ve done all my ajax requests with prototype, which uses the JSON-RPC spec (original version); if you set the id parameter of the request to a random value (I stripped the preceding “0.”) you should have a unique request URI in the GET scope (I didn’t want to change my backend for this suite, which was already scoped to GET).

Comment by Jason Medland — September 12, 2007

You may want to note that in prototype.js 1.6 you need to change

setRequestHeaders: function() {
  var headers = {
    'X-Requested-With': 'XMLHttpRequest',
    'X-Prototype-Version': Prototype.Version,
    'Accept': 'text/javascript, text/html, application/xml, text/xml, */*'

to

setRequestHeaders: function() {
  var headers = {
    'If-Modified-Since': 'Thu, 1 Jan 1970 00:00:00 GMT',
    'X-Requested-With': 'XMLHttpRequest',
    'X-Prototype-Version': Prototype.Version,
    'Accept': 'text/javascript, text/html, application/xml, text/xml, */*'
Comment by ujr — September 28, 2007

This is so retarded!
Firefox does not handle it stupidly like that.
It is more than obvious that an Ajax call is not meant to be served from a cache!

I’m glad I found this blog… solved it by adding the current unixtime as a GET parameter *sigh*

Comment by Lirezh — October 11, 2007

Yes, if you use prototype then Medland’s solution would be nice.

But I solved my problem by putting these 2 lines before the send method.

Here is my sample code:
myReq.open("GET", url, true);
myReq.setRequestHeader("If-Modified-Since", "Thu, 1 Jan 1970 00:00:00 GMT");
myReq.setRequestHeader("Cache-Control", "no-cache");

It Rocks!!!!!

Comment by shimul39 — February 24, 2008

Just wrote a blog post on this exact issue, only using ExtJS.

The problem was that, as I was using a specific URL rewriting format, I couldn’t use conventional query string params (?param=value), so I had to write the cache-busting parameter as a posted variable instead… I would have thought that using POST variables is a bit safer than GET, simply because a lot of MVC frameworks use a path-based URL pattern, and so the mapping of variable name to value is lost and params are simply stacked… so when using a GET cache-buster parameter,

i.e. protocol://host/controller/action/param1/param2/no_cache122300201

no_cache122300201 can be mistaken for a $param3 parameter which could have a default value


public function action($param1, $param2, $param3 = "default value")

no chance of that happening with POSTed cache busters

Comment by larsguitars — October 6, 2010
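The POST-based buster described above can be sketched as a body builder; the helper name `withCacheBuster` and the `no_cache` parameter are our own illustration, not from the commenter’s ExtJS code:

```javascript
// Build a form-encoded POST body carrying a throwaway cache-buster,
// for URL schemes where a ?query string is not an option.
function withCacheBuster(params) {
  var pairs = [];
  for (var key in params) {
    pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
  }
  pairs.push('no_cache=' + new Date().getTime()); // the server ignores this
  return pairs.join('&');
}

// In the browser this would be sent with e.g.:
//   xhr.send(withCacheBuster({ action: 'refresh' }));
```

Because the buster travels in the POST body rather than the path, a path-based router never mistakes it for a positional action parameter.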
