Friday, July 4th, 2008

qUIpt: caching JS in window.name

Category: Performance

Mario Heiderich has released qUIpt, a library that uses the window.name property to store away useful data, in this case JavaScript.

How does it work?

  • It checks the contents of window.name while your page is being loaded.
  • If there's nothing inside the cache, the JS files you define are fetched via XHR.
  • The same happens if the user enters your site for the first time in the current browser session, or if document.referrer is off-domain or empty.
  • After that, the contents of window.name are evaluated.
  • If the user requests the next page on your domain, the JS files are taken straight from window.name; no more requests necessary.
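The flow above can be sketched in a few lines. This is not qUIpt's actual code; `win` stands in for the browser's window object so the logic is testable anywhere, and `CACHE_PREFIX`, `loadCachedScripts`, and `fetchScripts` are hypothetical names:

```javascript
// Minimal sketch of the load-time flow (hypothetical names, not qUIpt's code).
// A prefix marks window.name as "ours" so unrelated values are ignored.
const CACHE_PREFIX = 'quipt:';

function loadCachedScripts(win, referrer, fetchScripts) {
  // Off-domain or empty referrer => treat the cache as untrusted.
  const sameDomain = !!referrer && referrer.indexOf(win.domain) !== -1;
  if (!sameDomain || win.name.indexOf(CACHE_PREFIX) !== 0) {
    // First visit this session, or untrusted cache:
    // fetch the JS via XHR and stash it in window.name.
    win.name = CACHE_PREFIX + fetchScripts();
  }
  // Hand back whatever sits in the cache for evaluation.
  return win.name.slice(CACHE_PREFIX.length);
}
```

On the next same-domain navigation the prefix check passes and `fetchScripts` is never called, which is the whole point of the technique.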

You can check out an example of it at work.

Posted by Dion Almaer at 8:57 am




I’m pretty sure this trick is unusable for security reasons – an attacking site could populate window.name with malicious JavaScript and then redirect to your site, which would execute that JavaScript in the context of your site’s domain as an XSS attack.

Comment by SimonWillison — July 4, 2008

On second look, he’s guarding against that attack by checking document.referrer before eval()ing the JavaScript in window.name. Smart workaround.

Comment by SimonWillison — July 4, 2008

Yup – I am not yet sure whether the script is “un-hackable” in all major browsers – especially IE6 – but my tests showed good results. During this month we will start a live test on a high-traffic site; I will post the results to the project page as soon as they are complete.

Comment by x00mario — July 4, 2008

This should not be needed if you provide proper caching headers.

Comment by ironfroggy — July 4, 2008

Not all browsers support Cache-Control headers for SSL connections yet, and it may take a long while until the majority do. So as a site owner you can remove one black box (you never really know how your users configure caching, or why and when resources are loaded again) and one traffic generator.

Comment by x00mario — July 4, 2008


Can I use window.name as a persistent store for other things?

Comment by Nosredna — July 4, 2008

Did you test storing large data in window.name in IE, performance-wise?
Also, did you test performance when memory is running full?

These are some big issues I found when using window.name for storage, even for small JavaScript around 150k (a normal extended library size).

Also, I don’t really see the point of “caching” JavaScript in window.name, as the browser already caches it itself.

Comment by V1 — July 4, 2008

@V1: browser caching relies on several factors that can’t completely be controlled by site owners – headers, client settings, SSL, strict providers, etc. I made several performance tests – IE7 runs slower as soon as you start putting more than 2MB into window.name; FF3 runs pretty stable even with 200+ MB.

Comment by x00mario — July 4, 2008

So it’s gone when you close the browser, right? Anything else that kills it?

Comment by Nosredna — July 4, 2008

I have tried it on Firefox 3. This is good. But when I type a new URL in the address bar, Firefox takes all my CPU for around 30 seconds and then displays the warning to continue or terminate the running script.
Hmm, this is not good for my clients and it needs more work to handle it. But thank you, and good work.

Comment by rindu — July 4, 2008

*Really* clever, and *really* scary. Persistent data storage, persistent JS storage. And, basically circumventing the size limitations imposed on cookies. Seems more compelling for persistent state than JS module caching.
I’m inclined to agree that it’s good to rely on the caching controls of the browser for external JS modules, although one of my pet gripes about browsers is the lack of a formal packaging protocol – for *anything* external. Is there nothing in HTTP that speaks to batching of requests? I’m sure I’m not alone in finding that there are cases where one bundles many JS files into a single file in order to improve page performance by reducing HTTP requests. We also do this with CSS sprites. So, clearly the HTTP protocol – in its current form – is failing critical use cases and critical request patterns. There’s a need that the spec is not adequately addressing.
So, one of the gripes about creating JS bundles for some pages is that on other pages when one wants to load in only a few of the modules that were already part of a bundle used on some other page, those modules are not already cached from the browser’s perspective. In an ideal world, an enhanced HTTP would just take care of this automatically, based upon negotiation between the client and the server, so that the sweet spot of separate requests vs batched/bundle requests would just be tuned over time by the server’s record of performance and how separate vs bundled would improve performance.
HTTP and networking is already a sophisticated world of complex algorithms of mathematics. It doesn’t seem unreasonable to push HTTP further in the direction of negotiating the pattern of requests to achieve optimal results, without forcing application designers to have to build access assumptions into their application design.

Comment by uize — July 5, 2008

you can compare the MD5 of window.name’s contents with the MD5 of your JS file to prevent hackers from inserting code into window.name

Comment by Urielka — July 5, 2008
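Urielka’s digest idea could be sketched like this. A toy rolling hash stands in for MD5, and `readTrustedCache` and `expectedDigest` are hypothetical names; in a real deployment the server would embed the digest of the current JS build into the page:

```javascript
// Sketch of a digest check on the window.name cache (toy hash, not MD5).
function digest(s) {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0;   // simple rolling hash
  }
  return h.toString(16);
}

// Cache format: "<hash>:<code>". Returns the code only if the stored hash
// and a fresh digest of the payload both match the server-supplied digest.
function readTrustedCache(name, expectedDigest) {
  const sep = name.indexOf(':');
  if (sep === -1) return null;              // not our format
  const hash = name.slice(0, sep);
  const code = name.slice(sep + 1);
  if (hash !== expectedDigest || digest(code) !== expectedDigest) return null;
  return code;
}
```

Anything an attacking site wrote into window.name fails the digest comparison and is simply discarded instead of being eval()ed.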

You can just add some sort of identifier in front of the code you store in window.name. If it doesn’t match the identifier in your normal code, just fetch the regular data and clear window.name.

Just taking a simple window.location as the identifier would probably be enough. And let’s face it: what is the chance that someone would inject evil code into a window.name and that the person with the infected window.name would actually visit your site? So a simple identifier string would be enough.

And if you store secure information in your window.name, you’re just a moron.

Comment by V1 — July 5, 2008
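V1’s identifier scheme could look something like this. The names are hypothetical, and `win` stands in for the browser window (with `win.host` playing the role of the window.location-based identifier) so the check is testable:

```javascript
// Sketch of V1's suggestion: tag the cached code with the site's own host,
// and clear the cache whenever the tag is missing or wrong.
function readOrClear(win, fetchScripts) {
  const tag = win.host + '|';
  if (win.name.indexOf(tag) !== 0) {
    // Tag absent: someone else (or another site) wrote window.name.
    // Drop it and fall back to a normal fetch.
    win.name = tag + fetchScripts();
  }
  return win.name.slice(tag.length);
}
```

This is weaker than a digest check, since any attacker who knows your host string can forge the tag, but it does stop accidental cross-site pollution of window.name.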

What if your site allows user input via HTML (like forums) and you fail to filter out the “target” attribute on links? Then someone could craft a malicious link that modifies window.name, and your website might run it. e.g.,
[a href="." target="alert('evil');"]Click Here[/a]

Comment by Jordan — July 5, 2008

The HTML in my last post was not escaped. Here it is again:
[a href="." target="alert('evil');"]Click Here[/a]

Comment by Jordan — July 5, 2008

Is this really persistent? It goes away upon browser close, right? So it’s only good for replacing session cookies, right?

Comment by Nosredna — July 5, 2008

What is it supposed to do? I get nothing. I’ve tried it in Firefox 3, IE7, and Safari.

Comment by Mikael Bergkvist — July 5, 2008

Just seems pointless, fiddly, and probably insecure. If it’s a really complex JS app, then just do the entire thing in Ajax so you never leave the page.

If you want persistent data store for a standard app, then store it server side via sessions.

Comment by stevesnz — July 6, 2008

and if you’re using it to store multiple JavaScript files, then you’re better off just copying and pasting all the files into a single .js file so that there’s only one HTTP request.

That will stay cached across multiple visits; window.name stuff won’t, so it will actually offer even worse performance.

Comment by stevesnz — July 7, 2008

@Jordan: Yep – but if you allow the user to post HTML with dangerous attributes – which target definitely is – you mostly also have an XSS or a first step in this direction.

@stevesnz: It depends – if you have the possibility to compress and concatenate all your JS files during deploy, you can and should use it. But not every site does that, and this approach tries to offer an overall solution to reduce the traffic sent from the server to the client.

Please keep in mind that the project is very young and there is of course a lot to optimize.

Comment by x00mario — July 7, 2008

Just a sidenote: I just opened a Google Group for qUIpt-related discussions.

Comment by x00mario — July 7, 2008
