How To: Prevent Caching Of Ajax Calls
Working on the principle that any content is better than no content, here's a little tip some of you (probably not the experts though) might find useful.
I was working on an Ajax-based document rating system this week and seeing some unexpected results (that is, nothing was happening). After staring blankly at the LotusScript code for a while I looked elsewhere and it turned out that the calls to the rating agent were being cached by IE.
To prevent the Ajax calls from being cached I added an extra parameter to the function which creates the GET URL, like so:
$.get( "RateArticle?OpenAgent", { id: id, ajax: true, rate: rate, _: new Date().getTime() }, function(data){ alert(data) } );
Calls to the JavaScript function which contains the above Ajax code will result in URLs like so:
/db.nsf/RateArticle?OpenAgent&id=3AA...631&ajax=true&rate=3&_=1196345282726
/db.nsf/RateArticle?OpenAgent&id=3AA...631&ajax=true&rate=3&_=1196345329269
Obviously each one is unique and never repeated, so caching is not a problem.
In this case the problem only ever arises during testing as, in practice, a document rating URL is only ever called once per article, so it's the id= parameter that makes each one unique - per user. However, it's a handy trick to use in any Ajax call, and hence I thought I'd share it with you all.
Whenever you find yourself building a list of parameters to pass in a URL you don't want cached, tack "new Date().getTime()" on to the end of it.
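As a minimal sketch of that trick (the helper name is my own, not from the article): append a millisecond timestamp so every URL is unique and the browser never serves a stale cached response.

```javascript
// Append a cache-busting timestamp parameter to a URL.
// Uses "?" if the URL has no query string yet, otherwise "&".
function addCacheBuster(url) {
  var separator = url.indexOf("?") === -1 ? "?" : "&";
  return url + separator + "_=" + new Date().getTime();
}

// e.g. addCacheBuster("/db.nsf/RateArticle?OpenAgent&id=123&rate=3")
// yields ".../RateArticle?OpenAgent&id=123&rate=3&_=<timestamp>"
```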
Alternatively, assuming your LotusScript Agent is returning all the content you could prevent caching from there. Like so:
Print "content-type: text/plain"
Print "cache-control: no-cache"
Print |{ rate: 4, message: "Thank you for rating this article."}|
This is all probably old hat to lots of you, but I'm always mindful that there are a wide range of skill-sets among my readers.
Thanks for the article. That would explain why, when I switched to Yahoo's Ajax web mail, they had that bogus parameter at the end of the URL:
/launch?.rand=fbg165prnurhu
Cheers,
Gerald
Hi Jake,
You can also prevent browser caching server-side by using a rules document (R6+).
Here is a post from Joe which describes how to properly set up the rules documents:
{Link}
Cheers,
Julien
@julien,
Unfortunately IE caches all GET ajax requests extremely aggressively. It ignores any cache headers and so doesn't bother asking the server if there is a new version.
--
Kerr
Uhm.. I think I'd rather POST that kind of request, since it results in a change on the back-end (increase/decrease in rating).
Funny thing is, I'd swear I learned that preference from you Jake.
Good point Peter. Normally I'd agree, but the scenario dictated using a GET. Kind of.
The usual reason to avoid GETs is because search spiders follow them and can trigger back-end actions. In this case it was only for authenticated *users*, so not a concern.
Having said that, you're right. POST probably would make more sense. Although this also goes back to the link v button argument, which didn't count in this case as I used a five-star visual image for the user to click on.
Jake
I seem to recall this was more of a Domino issue than an IE issue. Appending the date-time string to the request was the old trick when you had formula-customized javascript in a page. Domino would cache the js request since it appeared as a static file.
Good tip. Until recently I hadn't done enough ajax work to find caching a problem. But more recently I've been using the Ext.Ajax object and this apparently includes some cache busting tricks.
I don't claim to know how it works - it just does so far. Maybe one day things like this won't ever be an issue ...
Jake,
Nothing (beyond the sheer effort required) prevents you from using multiple image submit buttons (each with their own tiny form) and appropriate event listeners to give you a solution that would still work with JS disabled. But then... how many of us really build sites like that?
Peter: "how many of us really build sites like that?"
Exactly. Avoiding GET for anything consequential is a good practice to get in to, but it doesn't mean we can't bend the rules from time to time ;o)
Scott: "Maybe one day things like this won't ever be an issue"
Hadn't ever thought about that, but maybe soon browsers will start to change their attitude toward caching as static websites become a thing of the past.
I use the 'If-Modified-Since' header (with an expired date value) to prevent the cache problem:
xmlhttp.setRequestHeader('If-Modified-Since', 'Fri, 31 Dec 1999 00:00:00 GMT');
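For context, here is a sketch of where that header fits in a plain XMLHttpRequest GET. The request object is passed in as a parameter, and the function and parameter names are my own assumptions, not from the comment above.

```javascript
// Perform a GET that IE should not answer from its cache.
function sendUncachedGet(xhr, url, onDone) {
  xhr.open("GET", url, true);
  // An already-expired date forces the browser to revalidate with the
  // server rather than serve the response from its local cache.
  xhr.setRequestHeader("If-Modified-Since", "Fri, 31 Dec 1999 00:00:00 GMT");
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) onDone(xhr.responseText);
  };
  xhr.send(null);
}

// In the browser: sendUncachedGet(new XMLHttpRequest(), "RateArticle?OpenAgent", alert);
```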
Another way, very similar to the date method is:
url = myurl + "&rand=" + Math.floor(Math.random()*1001);
This might be slightly off topic....I have been using Prototype for quite a while and more recently EXT.js. My apps have been internal, but now have an externally facing app to develop, where SSL is going to be a requirement.
AJAX and SSL... can it be done? I am searching the web, but finding conflicting answers.
As I see it Ajax doesn't care if you're using SSL or not. Why should it? So, yes, it works and there's no need to do anything different at all.
POSTs should be the norm in an SSL environment. Not to argue that GETs should never be used, but when SSL is in use, plastering parameters in an unencrypted header can be just as informative as the data that's returned...
Charlie. Even the URL in an SSL transmission is encrypted. Is it not? So the GET parameters are safe from prying eyes.
This is why you can only have one SSL-enabled internet site per IP address of a server. When the server receives the SSL GET request all it knows is the IP address of the destination server. So Domino then looks to the internet site document to see which is tied to that IP and from that document it gets the keyfile to decrypt with. You have to use the IP address in an SSL-enabled Site document as Domino doesn't even know the domain name used in the URL at that point.
Unless the way I understand it all is wrong of course, but I'm sure that's how it works.
Aye, Nick, with ExtJS it's just a quick "Ext.isSecure == true" and you're all ready to use Ajax with SSL.
Jake,
My apologies; I meant URL where I said header.
I'm certainly no expert on TCP/IP protocols; my comment was based on an admin telling me that only the content of the packets are encrypted.
So are you saying that when an SSL handshake is initiated, the URL is encrypted? If that is the case, how do routers know where to send the request? Could it be that only the IP is sent as part of the handshake and then the rest of the URL is transmitted inside of the tunnel?
I'm no expert either Charlie. This is just based on my working knowledge of it:
It's my understanding that the URL *is* part of a TCP/IP packet and so is encrypted. The bits that aren't part of the packet are the source/destination IP addresses and ports, along (I guess) with the protocol used. In this case it's https, so the destination box knows to use the destination IP address to find out which keyfile to use to decrypt the packet, find the URL and pass it to the HTTP task to process.
Jake is right. The URL (and query string parameters) are encrypted in SSL sessions.
After an SSL handshake is completed the browser sends the HTTP request message over the secure channel and waits for the encrypted response. The URL is not used to set up the connection.
The URL and get parameters are embedded in the HTTP request message and are therefore encrypted.
I haven't looked at it in detail but I wouldn't expect any difference in HTTP caching with Ajax requests. This is because the layer in IE that is responsible for caching doesn't know whether a request is an Ajax request. The caching behaviour should be controllable using HTTP response headers such as Cache-Control just like any other HTTP request.
@Simon, My experience with IE is that ajax GET requests are cached more aggressively than regular requests in IE. Why this should be, I have no idea, but it seems to be the case. In testing, no amount of fiddling with cache settings and headers on the server would stop IE caching the ajax requests. Even setting IE to not cache anything didn't stop it!
I haven't tried the xmlhttp.setRequestHeader method that Fredrik suggested though. I'll have to give that a bash.
--
Kerr
Firstly I apologise; not really off topic, but talking of caching, you know that annoying flicker on background images (like the codestore logo)? Maybe...
just add this to the jsheader
document.execCommand("BackgroundImageCache", false, true)
taken from..
{Link}
Because IE seems to cache XMLHTTP requests even if the header tells it not to, I have been adding a parameter to the URL since IE5, when I first started to use the msxml.dll object.
Sometimes I want to allow caching for a period of time. Maybe a day or a week or even a month. In those cases I use the date to control how long things stay cached:
yyyymm for a month
yyyymmdd for a day
yyyymm<week number> for a week.
I use this methodology for JSON files and also in the head section for .js files. The reason I use it for .js files is that we are not yet running the D6 web configuration even though we are running D7 servers, so I can't create a rule for .js files.
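The bucketed keys described above could be built with a small helper like this (a sketch; the function name and the week-of-month calculation are my own assumptions, not from the comment):

```javascript
// Build a cache key that only changes when the chosen period rolls over,
// so cached copies live for at most a month, a day, or a week.
function cacheKey(period, now) {
  now = now || new Date();
  var y = now.getFullYear();
  var m = ("0" + (now.getMonth() + 1)).slice(-2); // zero-padded month
  var d = ("0" + now.getDate()).slice(-2);        // zero-padded day
  if (period === "month") return "" + y + m;           // yyyymm
  if (period === "day") return "" + y + m + d;         // yyyymmdd
  if (period === "week") {
    var week = Math.ceil(now.getDate() / 7);           // week of month, 1..5
    return "" + y + m + week;                          // yyyymm<week number>
  }
  throw new Error("unknown period: " + period);
}

// Usage: url + "&v=" + cacheKey("day")
```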
I know this post has not had any comments for quite a while, but in case anyone is interested...
When I turned SSL on for my DBs my AJAX, Ext.nd elements were failing.
As Jake and others pointed out, SSL doesn't actually make any difference to AJAX, whether you are using Ext.nd, Prototype or any of the other frameworks. To cut a long day of debugging in Firebug and testing in IE short, the problem was that my page was referencing some elements as {Link} and others as https://mydomain/element2 (this was bad design on my part). I was testing views using the Ext.nd framework, but nothing was loading in IE, even though it worked fine in Firefox. My AJAX calls using Prototype were also working fine in Firefox but failing in IE.
Firefox appears to be quite forgiving, but in IE you get the really helpful(!) error sign bottom left, none of your AJAX works, and the "This page contains both secure and nonsecure items" dialog box pops up for every page.
I now use a field (in conjunction with the CGI HTTPS field) to detect whether SSL is on or off and build URLs accordingly, and all the problems go away!
Jake...been visiting your site since the beginning, keep up the good work and have a happy new year.
If you don't want to bother checking for SSL, use relative URLs or format your URL without http or https:
//www.codestore.net