Tuesday, May 27th, 2008

Announcing AJAX Libraries API: Speed up your Ajax apps with Google’s infrastructure

Category: Ajax, Google, JavaScript, Library

AJAX Libraries API

I just got to announce the Google AJAX Libraries API which exists to make Ajax applications that use popular frameworks such as Prototype, Script.aculo.us, jQuery, Dojo, and MooTools faster and easier for developers.

Whenever I wrote an application that used one of these frameworks, I would picture a user accessing my application who already had 33 copies of prototype.js cached, yet downloading yet another one from my site. It made me squirm. What a waste!

At the same time, I was reading research from Steve Souders and others in the performance space that showed just how badly we are doing at serving these libraries. As developers we should set up caching correctly so we only send a file down when absolutely necessary. We should also gzip the files for browsers that accept it. Oh, and we should probably use a minified version to get that little bit more out of the system. We should also follow the practice of versioning the files nicely. Instead, we find a lot of jquery.js files with no version, often with little tweaks added to the end of the file, and caching set up so poorly that the file keeps getting sent down for no reason.

When I joined Google I realised that we could help out here. What if we hosted these files? Everyone would see some instant benefits:

  • Caching can be done correctly, and once, by us… and developers have to do nothing
  • Gzip works
  • We can serve minified versions
  • The files are hosted by Google which has a distributed CDN at various points around the world, so the files are “close” to the user
  • The servers are fast
  • By using the same URLs, if a critical mass of applications use the Google infrastructure, when someone comes to your application the file may already be loaded!
  • A subtle performance (and security) issue revolves around the headers that you send up and down. Since you are using a special domain (NOTE: not google.com!), no cookies or other verbose headers will be sent up, saving precious bytes.

This is why we have released the AJAX Libraries API. We sat down with a few of the popular open source frameworks and they were all excited about the idea, so we got to work with them, and now you have access to their great work from our servers.

Details of what we are launching

You can access the libraries in two ways, and either way we take the pain out of hosting the libraries, correctly setting cache headers, staying up to date with the most recent bug fixes, etc.

The first way to access the scripts is simply by using a standard <script src=".."> tag that points to the correct place.

For example, to load Prototype version 1.6.0.2 you would place the following in your HTML:

  <script src="http://ajax.googleapis.com/ajax/libs/prototype/1.6.0.2/prototype.js"></script>

The second way to access the scripts is via the Google AJAX API Loader’s google.load() method.

Here is an example using that technique to load and use jQuery for a simple search mashup:

  <script src="http://www.google.com/jsapi"></script>
  <script>
    // Load jQuery
    google.load("jquery", "1");

    // on page load complete, fire off a jQuery JSON-P query
    // against Google web search
    google.setOnLoadCallback(function() {
      $.getJSON("http://ajax.googleapis.com/ajax/services/search/web?q=google&v=1.0&callback=?",
        // on search completion, process the results
        function (data) {
          if (data.responseData.results &&
              data.responseData.results.length > 0) {
            renderResults(data.responseData.results);
          }
        });
    });
  </script>

You will notice that the version used was just “1”. This is a smart versioning feature that allows your application to specify a desired version with as much precision as it needs. By dropping version fields, you end up wildcarding those fields. For instance, consider a set of versions: 1.9.1, 1.8.4, 1.8.2.

Specifying a version of “1.8.2” will select the obvious version. This is because a fully specified version was used. Specifying a version of “1.8” would select version 1.8.4 since this is the highest versioned release in the 1.8 branch. For much the same reason, a request for “1” will end up loading version 1.9.1.

Note that these versioning semantics work the same way whether you use google.load or direct script URLs.
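The wildcard selection described above can be sketched in a few lines. This is my own illustration of the matching rule, not Google's actual loader code; `resolveVersion` is a hypothetical helper:

```javascript
// Resolve a partial version request ("1", "1.8", "1.8.2") against the
// available versions, picking the highest release that matches every
// field the caller specified -- a sketch of the wildcard rule above.
function resolveVersion(requested, available) {
  var want = requested.split(".");
  var candidates = available.filter(function (v) {
    var parts = v.split(".");
    // Every specified field must match exactly.
    return want.every(function (field, i) { return parts[i] === field; });
  });
  // Sort the remaining candidates highest-first, field by field.
  candidates.sort(function (a, b) {
    var pa = a.split(".").map(Number), pb = b.split(".").map(Number);
    for (var i = 0; i < Math.max(pa.length, pb.length); i++) {
      var d = (pb[i] || 0) - (pa[i] || 0);
      if (d !== 0) return d;
    }
    return 0;
  });
  return candidates[0];
}

var versions = ["1.9.1", "1.8.4", "1.8.2"];
console.log(resolveVersion("1.8.2", versions)); // "1.8.2"
console.log(resolveVersion("1.8", versions));   // "1.8.4"
console.log(resolveVersion("1", versions));     // "1.9.1"
```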

By default, the JavaScript that gets sent back by the loader will be minified, if a minified version is available. Thus, for the example above we would return the minified version of jQuery. If you specifically want the raw JavaScript itself, you can add the “uncompressed” parameter like so:

  google.load("jquery", "1.2", {uncompressed: true});
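One way to use this (my own sketch, not from the post; the `loadOptions` helper and `debugMode` flag are hypothetical) is to key the “uncompressed” parameter off a debug switch, so development gets readable source and production gets minified code:

```javascript
// Hypothetical helper: build the google.load() options object from a
// debug flag -- readable source while developing, minified in production.
function loadOptions(debugMode) {
  return debugMode ? { uncompressed: true } : {};
}

// Usage (assumes the jsapi loader is already on the page):
//   google.load("jquery", "1.2", loadOptions(false));
```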

Today we are starting with the current versions of each library, but moving forward we will archive all versions, so you can be sure they will remain available.

For a full listing of the currently supported libraries, see the documentation.

Here I am, talking about what we are doing in two short slides:

The Future

This is just the beginning. We obviously want to add more libraries as you find them useful. Also, if you squint a little you can see how this can extend even further.

If we see good usage, we can work with browser vendors to automatically ship these libraries. Then, if they see the URLs that we use, they could auto-load the libraries, even special JIT’d ones, from their local systems. Thus, no network hit at all! The browser could also ship with the IP addresses for this service, avoiding the hit of a DNS lookup. Longer-lived special browser caches for JavaScript libraries could also use these URLs.

The bottom line, and what I am really excited about, is what this could all mean for Web developers if this happens. We could be relieved of the constant burden of having to re-download our standard libraries all the time. What other platform makes you do this?! Imagine if you had to download the JRE every time you ran a Java app! If we can remove this burden, we can spend more time fleshing out functionality that we need, and less time worrying about the actual download bits. I am all for lean, but there is more to life.


I want to acknowledge the other work that has been done here. Some libraries, such as jQuery and Dean Edwards’ Base, were already doing something like this by hotlinking to their Google Code project hosting repositories. We thought this was great, but we wanted to make it more official, and open it up to libraries that don’t use our project hosting facilities.

Also, AOL does a great job of hosting Dojo already. We recommend using them for your Dojo needs, but are proud to also offer the library. Choice is good. Finally, Yahoo! placed the YUI files on their own CDN for all to use.

Posted by Dion Almaer at 9:42 am




Wow Dion, what a great idea. Thanks for pushing out another excellent API.

Comment by dshaw — May 27, 2008

What about plugins for JQuery for example? I love it for the base libraries, but JQuery’s power is really fed by the great community plugins.

Comment by ilazarte — May 27, 2008

Thank you so much! I’ve wanted this for the longest time.

Comment by Kingsley2 — May 27, 2008

@ilazarte I totally agree. We wanted to start somewhere and see how people use it, and then work on how we can add more libraries to the system in a smart way. I would love to see all jQuery plugins get into the system myself.

Comment by Dion Almaer — May 27, 2008

Hey guys, fellow Googler here… I actually spent some time this weekend looking over all the Google APIs (man, there’s a lot). I’ve been working on a mapping project and just started to get into Ajax and developing on the front end. What I couldn’t figure out looking at the library is how these “frameworks” fit into the Google Web Toolkit. Are they extra widgets? Are they an alternative to the toolkit?



Comment by bradleybossard — May 27, 2008

Would be cool to serve Prototype in a minified/packed/yui-compressed form, since gzip doesn’t do all the job of reducing file size. When only gzipped: 30K; when compressed with yui-compressor + gzip: 22K.

Comment by MaxiWheat — May 27, 2008

@dshaw, ditto for Dojo. The Dojo core libraries are fantastic, but when you use anything outside of that, you’ll usually want to take advantage of the opportunity to do a custom build that allows you to roll in the specific pieces you want. It would be really cool if there were a way to upload one’s own custom Dojo build to this service. Fantastic to have this available, though. It will be great for simpler sites that only require core files.

Comment by mcbenton — May 27, 2008

@MaxiWheat: Using eval-based compression (ie, packed) is actually a performance hit compared to using just minified javascript. This is due to the time it takes a browser to eval an entire library compared to the extra time it takes to download the un-packed version.

That said, this is exciting news. If this becomes widely-used, it would be very rare that browsers would have to download the library twice.

Comment by tj111 — May 27, 2008

I’m having a difficult time understanding why you would want to give up control of your JavaScript libraries to Google. If I have caching configured correctly and I am happy with my server performance, why would I go out of my way to make it an external Google resource? I’ve seen enough sites just plain halt on load waiting for Google Ads to load…

Good idea but I am not sure I would use it. I like the idea of having my own source copy of the library hosted locally – way less complicated.

Comment by Jigs — May 27, 2008

@MaxiWheat – It would be nice, but Prototype’s core team seems dead set against minification of any kind (despite a 17kb+ saving of Prototype+Scriptaculous minified and gzipped).

YUI Compressor works with Prototype (use of the $super variable) and ProtoSafe has a modified Dean Edwards Packer 3, that works with it as well. Using the modified Packer 3 you get Prototype = 20.7kb gzip + minified.

I know when I develop web applications I have a debug mode with non-compressed code, but when it’s stable and ready for prime time I flip a switch and run minified + gzipped code.

The core team doesn’t like minified code because they think it is hard to debug and find issues. If they ran a debug/performance mode they would remedy that issue.

I think regardless of the core team’s personal views on minification, giving the developer a choice is a good thing. That is why I have maintained a collection of compressed/gzipped/minified/packed versions of Prototype and Scriptaculous.

Comment by jdalton — May 27, 2008

Another issue Google will need to work out is that MooTools, Scriptaculous, and Dojo are modular (meaning you don’t have to load the kitchen sink and can just load the parts you want). This can affect the file size footprint as well. This may be beyond the scope of a CDN though.

Comment by jdalton — May 27, 2008

Unless you open this up a little bit it may kill the development of new libraries.

Comment by deanedwards — May 27, 2008

@deanedwards – The article and the video both mention adding more libraries in the future.

Comment by jdalton — May 27, 2008

… also our secured Javascript scopes will be opened for Google.

Comment by steida — May 27, 2008

I tweak my own libraries, and I’m damn proud of it.
The last way of loading libraries is a complete waste of an HTTP request… Why on earth would you load a Google script that will load a library when you can just point to it directly?

Comment by V1 — May 27, 2008

“I’m having a difficult time understanding why you would want to give up control of your JavaScript libraries to Google. If I have caching configured correctly and I am happy with my server performance, why would I go out of my way to make it an external Google resource?”

You wouldn’t. Other people would. People who don’t satisfy your “if” clause, that is.

Comment by Andrew Dupont — May 27, 2008

I really dig this. Thank you Dion and Googlers. Long been an admirer of what Dojo set up with AOL to deliver cross-domain resource loading, and finally it’s here for Prototype. Hat’s off! I’ll be using this immediately.

…this has also got me to thinking… to create a Rails plugin that optionally loaded Prototype from the Google via http://ajax.googleapis.com/ajax/libs/prototype/ instead of my_app/public/javascripts/prototype.js

Comment by holts — May 27, 2008

http://www.jsloader.com – Glad someone else caught on :) (someone who can afford to host the libs without worrying about bandwidth on a VPS hosting account!)

Will Extjs join this list of supported libs?

Comment by Dov Katz — May 27, 2008

Is this world domination, or charity?

Comment by LFReD — May 27, 2008

“I’m having a difficult time understanding why you would want to give up control of your JavaScript libraries to Google.”
You save some bandwidth AND offer faster page loads for people who already have the library cached from another place.

How many people use Prototype but shy away from adding more libraries like Scriptaculous because of weight issues?
The more people use it, the more beneficial it will be.
Soon enough external library weights won’t even be a consideration when estimating page download times.

Comment by JeromeLapointe — May 27, 2008

Also… I hope to see lowpro on there at some point… even if it’s relatively light.

Comment by JeromeLapointe — May 27, 2008

Please add Appcelerator to the list.

Comment by beckie — May 27, 2008

Ok, this isn’t really my style, but since part of the attention John Resig got for jQuery was due to a semi-trolly comment here a long time ago, I’ll give it a try as well.

Initially, this seems like a good idea, but thinking about it, web developers really have to learn to set up gzip, expires headers, minifying etc for their local code, and not just their favorite library usage. This solves just one detail in their set-up, leaving them in a sort of false security, thinking: “Google takes care of EVERYTHING for me”.

And it really does give web developers the impression that now it’s ok to include 800 KB of JavaScript libraries in their web pages, since PERHAPS the end user has previously visited another web site where he/she was forced to download a lot of superfluous code (yes, it’s still superfluous, even if it comes from Google). Maybe JavaScript library makers should start by scrutinizing the file size of their libraries instead…

I’d definitely like to echo the concern that it doesn’t really seem like a good idea to add an extra DNS look-up and Google-specific code just to have a JavaScript library in a web page. We’ve all been in web pages where loading has stalled due to an external dependency, and yes, a fair number have been due to Google Adsense, Google Analytics etc.

To begin with, I’d find it mandatory for JavaScript library authors to offer a minified or compressed version. It’s not about what they personally prefer, but about the market’s demands and needs. Who knows better than the authors how to offer such a version in the most efficient manner?

I’d also like to say that I agree with Dean Edwards’ notion above: since there is no option to submit libraries that aren’t already included, get them reviewed and hopefully added, it’s all up to Google’s good will, which might lead to an unbalanced competition between JavaScript library makers.

I guess I just have to continue being part of developing the most light-weight JavaScript library, with the fastest and most accurate CSS selectors, and just hope, beg, for Google one day deeming us worthy.

And, oh, DOMAssistant rocks.

Comment by robnyman — May 27, 2008

One major thing missing: an SSL certificate for “ajax.googleapis.com” so the scripts can be served up over HTTPS if necessary.

If your site is being browsed over SSL, in IE you get the “This page contains both secure and nonsecure items” warning when the scripts are served up over plain HTTP.

Comment by DuncanSmart — May 27, 2008

I was going to say the same thing of DuncanSmart ;D

I’m missing a SSL Certificate too.

Comment by nagaozen — May 27, 2008

I suppose it would be simple, for big G, to add an option in its Google Code service like: make this code “Ajax API available”.
Until then, or an alternative solution, I totally agree with Dean.
4 frameworks are not the Web, and a lot of developers do not necessarily use 3rd-party client libraries (while many others create stand-alone libraries).

On the other hand, I am thinking about https sites, any idea for them?

Comment by Andrea Giammarchi — May 27, 2008

Either this is the greatest charity-gift a SW company ever has given the world, or the greatest piece of lock-in a SW company has ever given the world…
I guess only history will show…

Comment by polterguy — May 27, 2008

What the ???
I approached you more than a year ago about this at the last Ajax Experience with this EXACT same plan.
However, you forgot some of the most important aspects of my idea:
1) Create version-specific URIs, so people can target old releases
2) Use subversion with ‘outside’ edit access
3) Use a global Wiki that is targeted to the current session for edit of the JavaScript. That way I can change the library (for myself only) in the web browser
4) Use the Google infrastructure, but have it hosted as a 3rd party organization that has ALREADY been created: OpenAjax. That way we can have trust that it stays open source
I even offered the use of http://OpenAjax.Com for this (just as I gave the domain OpenAjax.Org for FREE)
I realize there are many other smart people out there that may have thought of similar ideas, but that does not mean I have nothing to contribute.

Comment by OpenAjax — May 27, 2008


The developers that do not have their servers configured to handle caching correctly or are going over-bandwidth have larger issues that should be addressed. I see that this AJAX API is being presented as a convenience to those that do not want to configure caching themselves, but it is not a replacement. It wouldn’t save you any bandwidth if your server was caching the library, which is much more straightforward than opening your JavaScript up to Google and making an http request for their hosted content.

Comment by Jigs — May 27, 2008

@tj111 – YUI-Compressor is not an “eval” based minifier, it actually strips spaces, replaces tokens, etc., so performance should not drop because of the compression. I personally always use the YUI-Compressed version for production and sometimes for development. Bugs occur mostly in my code instead of Prototype’s code ;-)

Comment by MaxiWheat — May 27, 2008

Wow! This brings back memories of the days I worked at Headspace/Beatnik developing the MusicObject API — all the fuss around the size of JS libraries and people having to set them up on their Web servers and all that.
The idea of hosting the libraries in a common repository seemed very seductive, but version management always seemed like the Achilles’ heel of such a plan. Also, Web developers typically wanted to have their cake and eat it. They didn’t want the fuss of the setup, but they also didn’t want their Web site at the mercy of some other third party server out there. Also, OpenSource – by its very nature – invites users of the code to make modifications per their needs and feed those modifications back into the source per their convenience.
I think this kind of library hosting service could benefit some developers, but it seems like the more serious Web developers will want to control the version of the frameworks and extensions they use, and will want the power to be able to dive in themselves to fix mission critical bugs where they can’t wait on the community to get a new version up there on the central repository.
All that said, I wouldn’t complain if Google decided to throw in support for the UIZE Framework (http://www.uize.com).

Comment by uize — May 27, 2008

HTTPS works for the https://www.google.com/jsapi way of loading libraries, just would rather load it via a regular script tag. I’m sure they can afford the certs…

Comment by boodie — May 27, 2008

What about other cacheable items such as CSS (thinking of Ext) or images? The Silk icon set is incredibly popular in Ajax applications; will that be added?

Comment by antimatter15 — May 27, 2008

oh, and isn’t this project similar to CacheFile.net? It seems to have the same goals.

Comment by antimatter15 — May 27, 2008

@antimatter16 – I think cachefile is the “CDN” concept, but not the programmatic loading concept. When it came out I spoke to him about integrating jsloader.com on top of it, but i never had time to fully complete it.

One thing to think about is the need to make this completely “localizable” so that it can be hosted offline (e.g. in a corporate intranet)… It seems that the google approach (and any other CDN’s approach) is unsuitable for internal corporate use, since it would expose internal URLs to an external host.

Comment by Dov Katz — May 27, 2008

FYI – http://ajaxian.com/archives/jsloader-on-demand-javascript-libraries

I’m glad Google is going ahead with this concept. Next I’d want to see google analytics around who’s using what JS library, what version of them, and allow people to contribute libraries to be hosted on the google ajaxlib “cdn”… (And as I mentioned, a HTML+JS friendly wiki for people to playground and prototype on)

Comment by Dov Katz — May 27, 2008

@Jigs: “The developers that do not have their servers configured to handle caching correctly or are going over-bandwidth have larger issues that should be addressed.”

That’s a straw man. There are plenty of people who can’t gzip, or choose not to.

“It wouldn’t save you any bandwidth if your server was caching the library, which is much more straightforward than opening your JavaScript up to Google and making an http request for their hosted content.”

In the case of Prototype, it’d save you 30K (or 120K without gzip) for each unique visitor to your site, since everyone would have to download it at least once. And I’m not sure what “opening your JavaScript up to Google” means. Are you worried about uptime? Response time? From Google?

Comment by Andrew Dupont — May 27, 2008

@antimatter16: Cachefile.net isn’t a CDN, just a dedicated host, although I was going to look at building up a faster “open source CDN” than the ridiculously slow Coral CDN. But the goals of CacheFile have always been exactly the same as Google’s here. Regarding the loading library, Cachefile.net (which is mine) has using.js (which is also mine). I just forgot to stretch references to using.js all over the cachefile.net home page. I’ll fix that this week, but truthfully I’m actually very excited that Google paid attention to my very expensive plea to the Big Players, in the form of a web site called cachefile.net, for a trustworthy, reliable, and well-maintained repository of popular 3rd-party libraries, so that web sites only need to reference one URL.


Comment by stimpy77 — May 27, 2008

sry, that was to dov, not to antimatter16

Comment by stimpy77 — May 27, 2008

awesome, thanks Dion

Comment by doublerebel — May 27, 2008

@stimpy77 (Jon) – I know about using.js but it seems to require hooks into method calls, etc. I guess at the end of the day, we’re all doing the same thing. You your way, mine my way, etc… I was looking for nothing more than a replacement of script and css/link tags with “library+version” includes. This allows my team to patch releases, release new versions, etc with maximum reach and minimum need for developers to alter their code.

I can’t really share my intranet’s instance of what jsloader is a small case of, but ultimately, what I developed and widely deployed is a combination of ubiquitous high-availability hosting and network filesystem mounts that give you as much centralized hosting, or localized serving of the same filesystem, as you need. No modifications to any libraries, no callbacks needed at the end of loads, and any time it doesn’t work through a hosted model, simply map the shared/replicated filesystem to your webroot and continue using the same approach. Every library approved for intranet use is made available when developers need it, and all can benefit from its use.

In our intranet environment, this enables us to provide internally developed AJAX libraries, proprietary to our services, as well as externally developed ones, all under a single namespace.

Comment by dovie — May 27, 2008

This is great news; very obvious benefits here (especially on the larger/20KB+ libraries).

My biggest concern is the longevity of the project. If I start using this URL in client projects, the client’s website becomes forever dependent on this service staying static ad infinitum. The site says “Once we host a release of a given library, we are committed to hosting that release indefinitely”, but I don’t know if the word “committed” really seems like it captures the relative liability concerns that can be raised if this service dies, moves, etc.

The other thing to know is, even though it doesn’t use Google.com, what cookies/superfluous header info gets recorded with requests to this Googleapis.com domain? If it’s adding another HTTP request I’d like to know it’s going to be tiny (and stay that way).

If these two concerns were fully taken care of, I’d jump at this opportunity to improve the effective services I provide to my clients.

Comment by Kit — May 27, 2008


Dojo already builds a couple of layers for you by default (dojo.js, dijit.js, dijit-all.js, etc.), but the point is well taken. I don’t know that it’s feasible to really expect google to host zillions of custom Dojo builds, but perhaps we can expand the number of default layers that we put out on the CDN builds from both Google and AOL.

File a ticket on it and we can get the discussion started. There’s still time to get something done for 1.2.


Comment by slightlyoff — May 27, 2008

Suppose I use this and then I sell my site to one of Google’s competitors like Microsoft or Yahoo or AOL. Do you want my site to then stop hitting Google’s servers, or is that a question for lawyers?

Comment by Nosredna — May 27, 2008

Is the 1.2b version of MooTools included too? At the moment, 1.11 is getting outdated…

Comment by deef — May 28, 2008

dov, you missed my point. I love jsloader, and obviously using.js is not anywhere near what jsloader does. I’m not comparing using.js with jsloader.

I was comparing cachefile.net+using.js with Google. You indicated to a commenter here that cachefile.net wasn’t in pursuit of the same objective as Google’s; I differ with that opinion: cachefile.net + using.js (and the as yet unposted prefab using.js registrations script, with versioning et al, all targeting cachefile.net’s repository) was indeed a solution with an objective similar to Google’s.

On the other hand, that is certainly not made clear on the cachefile.net home page, and perhaps the objectives have evolved there over time.

Comment by stimpy77 — May 28, 2008

MooTools 1.2b is not stable yet :(

Comment by Snowcoredotnet — May 28, 2008

Forgot to add,

When we are using google anal…. we sometimes already have to wait 2 seconds for it to even be loaded… If the same happens to the libraries, most sites will be fucked. It’s not only happening on my server, but also on others.

Comment by V1 — May 28, 2008

A great idea! Will ExtJs also be hosted?

Comment by robversluis — May 28, 2008

I believe it is just another way for Google to spy on our requests and data. Who knows, Google may use this data to generate both site-targeted and visitor-targeted ads!! Leaving us without our cut of the ad revenue :(

Comment by Tabeeb — May 28, 2008

It would be great to also add Dean Edwards’ ie7.js and ie8.js libraries, as well as any minified prototype/protocolous version.

Comment by icoloma — May 28, 2008

I’m not a programmer but I have simple suggestion:

Couldn’t be possible to include in any library or script a unique ID so with simple script you could check if this library is present in a user’s cache?

Comment by letheos — May 28, 2008

Hmm. ‘Staying up to date with the most recent bug fixes’ — this sounds like blindly upgrading the hosted libraries to new versions, which is a very bad idea.

As OpenAjax noted, it’s a much better idea to use version-specific URIs, allowing users to choose the versions they wish to use — otherwise version mismatches will occur between user apps and the Google-hosted libs, creating bugs and the classic “dependency hell” that would be familiar to anyone who remembers the days of “DLL hell”.

Comment by jmason — May 28, 2008

IMHO the real benefit is the “The Future” part. That’s where sites could really benefit from caching due to the shared hosting.

Here’s my proposal: Google could pre-warm the browser caches around the world by loading one of the JS libraries on the Google results page, chosen randomly, probably based on usage. To prevent delaying the results page this could be done asynchronously, after a delay of, say, 2 seconds.

This would at the same time reduce the privacy issues as most sites would not issue requests to the googleapis site any more.

Comment by cschneid — May 28, 2008

thanks Google, it’s a simple idea but would deal major bonuses for faster page loading; plus keeping everything tied down in terms of versioning.

it’s like my birthday present came early ;-)

Comment by indiehead — May 28, 2008

robnyman sounds like a giant child. Haven’t we learned already from the MooTools guys that always tooting your own horn and dissing on other libraries (especially jQuery) isn’t going to get you anywhere? What’s the deal? Additionally, DOMAssistant looks like a direct rip off of jQuery except for the fact that it’s not chainable.

As for Google’s idea (which obviously from the comments isn’t such a new concept), let the people who want to use it, use it, and as for the rest of us who know what we’re doing and want complete control (or until this really does become ubiquitous), we’ll keep on doing what we’re doing.

Comment by krunkosaurus — May 28, 2008

This is a great idea!

I was one of the many people suggesting this back last year, I’m glad Google signed on.

A few additions I would like to see:

1. Ext JS. It’s a big library that would benefit tremendously from strong caching.

2. Mark James’s beautiful CC / PD icons (http://www.famfamfam.com/lab/icons/)

3. A formal method for suggesting files to include, and a list somewhere of what has been suggested and the status (working it out with the developer, not a good fit, evaluating the licensing, online and active, etc.).

Comment by richardtallent — May 28, 2008


Thank you for disregarding my arguments, and not reading carefully…

First, I wasn’t dissing jQuery; I mentioned how it got some of its attention in the first place. It was mentioned as an ironic wink to what happened back then.

Second, if you were to actually read up on DOMAssistant, you would see that it indeed supports chaining.

It is a very valid concern, in my world at least, that most people will completely avoid learning how to serve code the best way and just rely on Google, and that Google gets to decide what code will be offered to such a massive market of web developers.

Cachefile.net seems (seemed?) to have a much more humble and open approach, where they were all ears for the market as a whole and suggestions about code that should be shared.

Conclusively, if the opposite to being a child is your comment, just ignoring facts and very real issues, I’d rather be a child all the way.

Comment by robnyman — May 28, 2008

I know this goes against the grain of caching the library off a CDN, but I wonder if the benefits of this are negligible because it requires multiple domain lookups, and at this stage of internet evolution the file size is fairly small compared to the lookup time. I hesitate to believe that libraries are so pervasive now that the average user will come into an application or site with a primed cache. Even with 100% adoption this seems like a lofty goal. In that light the convenience here is more about having a reliable, centralized external host for the library files. Which in and of itself isn’t such a bad thing. Personally what I’d find far more useful would be something to the effect of:

http://… google address …/jquery.LATEST.js
http://… google address …/jquery.LATEST.STABLE.js

Which would be a symlink or something similar to the latest published version of the library. Most library cores are stable and consistent enough in their behaviour now that I’d be comfortable automating the update. This would give my applications and sites the optimization advantages of the latest builds of the libraries without my having to constantly download and update my inclusions in the numerous sites and applications I manage.

Obviously, variants of this theme could be applied, such as the latest version of a major build (i.e. jquery.1.2.latest.js), where any version up to 1.3 would be updated automatically. This would further cement the stability of the auto-updating.
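No such “latest” alias exists in the API, but the lookup being proposed is easy to sketch. A hypothetical JavaScript version (the function names and the version-pattern convention are invented purely for illustration):

```javascript
// Hypothetical sketch (not part of the actual API): resolve a version
// pattern like "1.2.latest" against a list of published versions.
function compareVersions(a, b) {
  var pa = a.split(".").map(Number), pb = b.split(".").map(Number);
  for (var i = 0; i < Math.max(pa.length, pb.length); i++) {
    var da = pa[i] || 0, db = pb[i] || 0;
    if (da !== db) return da - db;
  }
  return 0;
}

function resolveLatest(pattern, published) {
  // "1.2.latest" matches the "1.2." prefix; a bare "latest" matches everything.
  var prefix = pattern.replace(/latest$/, "");
  var best = null;
  for (var i = 0; i < published.length; i++) {
    var v = published[i];
    if (v.indexOf(prefix) === 0 && (best === null || compareVersions(v, best) > 0)) {
      best = v;
    }
  }
  return best;
}
```

So `resolveLatest("1.2.latest", ["1.2.1", "1.2.6", "1.3.0"])` picks the highest 1.2.x build without ever crossing into 1.3.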

*preparing to be flamed… ducks head* :)

Comment by OwenL — May 28, 2008

@OwenL – Having done this experimentally in my enterprise solution to the hosted-ajax-library problem (symlinking “latest” and “stable” to numbered releases in a large distributed filesystem, hosted in an enterprise setting), I can say it’s a risky idea, since it makes your webpage code very vulnerable to the “infrastructure” breaking it. I think it makes more sense to do 1.x -> 1.x.y linking/aliasing: in that case library owners should aim for proper minor-version compatibility, and things should still work. What we do is deploy 1.x.y, let people run against it for a fixed amount of time by load()ing it explicitly, and if regression testing and other observed behaviour permit, we then swing the link from one “y” to another (e.g. some library’s 1.6 -> 1.6.4 now becomes 1.6 -> 1.6.5).

For development it makes sense to have an “x.nightly.js”, but for production you should know which API (major.minor) version you’ve written your app for, and the library owner should ensure API compatibility.

Anyway, this is based on my enterprise AJAX experience, where we’ve had hosted ajax in place for about 1.5 years, so it might not be applicable to everyone…
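The 1.x -> 1.x.y aliasing described above can be sketched as a simple indirection table (hypothetical names; the actual deployment machinery is filesystem-level and not shown here):

```javascript
// Sketch of major.minor -> patch aliasing: pages request "1.6", and
// operations decide which patch release actually sits behind it.
var aliases = { "1.6": "1.6.4" };

function resolveVersion(requested) {
  // Exact patch versions pass through untouched; aliases are indirected.
  return aliases[requested] || requested;
}

// After regression testing, "swing the link" in exactly one place:
aliases["1.6"] = "1.6.5";
```

The point of the indirection is that every consumer of “1.6” moves to the new patch release in one atomic change, and can be moved back just as cheaply.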

Comment by Dov Katz — May 28, 2008

Well, regarding the DNS lookup, caching works both ways: if DNS is working properly, then the DNS lookup will also likely be cached. Also, browsers limit the number of simultaneous downloads from a single domain, so spreading the content across multiple domains may help speed up page load. Though, now that I think about it, that may not help with JS, since the browser will wait to do anything until the JS is loaded and executed.

One benefit has not been mentioned yet: getting gzipped js files to work across browsers is not always that easy[1]. If your site is optimized for static content, you may not have the ability to serve compressed js files only to capable browsers. That means you’re forced to use packed js files, which are hard to debug.
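The content negotiation referred to here boils down to inspecting the request’s Accept-Encoding header before choosing which build of the file to send. A minimal sketch, assuming a hypothetical server-side helper (a full implementation would also parse q-values and set a Vary: Accept-Encoding response header):

```javascript
// Hypothetical helper: pick the response encoding from the browser's
// Accept-Encoding request header. Only checks for the token's presence.
function chooseEncoding(acceptEncoding) {
  var header = (acceptEncoding || "").toLowerCase();
  return header.indexOf("gzip") !== -1 ? "gzip" : "identity";
}
```

A server would then stream the pre-gzipped file when this returns "gzip" and the plain file otherwise.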

Comment by newz2000 — May 28, 2008

I agree this is a good start, but I feel the URL version references are a little too flexible, Dion, and will invariably negate some of the intended caching benefits.

Personally I’d rather see an HTTP RFC addition of file checksum/hashes to be ref’d against cached files based simply on version-less names, etc. But that’s just me.

And I’m not entirely sure I agree with Dean’s concern over stifling development (one of the few, apparently).

Comment by bclaydon — May 28, 2008

“It is a very valid concern, in my world at least, that most people will completely avoid learning how to serve code the best way and just rely on Google, and that Google gets to decide what code will be offered to such a massive market of web developers.”
It’s a bit like saying let’s not create helpers, frameworks, WYSIWYG editors… people will rely on them and become less skilled…

If someone could not cache things properly before, what makes you think they were ever about to start? If they were not compressing their files… are you saying that now that Google will compress the one or two files it hosts for them, they will never compress the rest of their files?

Odds are that being in contact with “the correct way of doing things” will inspire more people (or at least inform them that it’s even possible) than otherwise.

But for those who don’t care, who will do things the crappy way and would have continued doing so no matter what… you are right: now that it’s all done for them, they will continue to do things the crappy way.
I don’t see what the point is, though.

Yes, now you make an extra DNS lookup, and possibly additional HTTP requests if you had a bundling solution… but only if the file is not cached.
For libraries with version-sensitive URLs, the lifespan of a file in the cache can be very long… in theory it should never change.
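That long cache lifespan is just a matter of response headers. A sketch of such a setup, assuming an Apache server with mod_expires and mod_headers enabled (the file pattern is illustrative):

```apache
# Assumed Apache config: version-stamped library files never change,
# so they can be cached far into the future.
<FilesMatch "\-[0-9.]+\.min\.js$">
    ExpiresActive On
    ExpiresDefault "access plus 1 year"
    Header set Cache-Control "public, max-age=31536000"
</FilesMatch>
```

With headers like these, a returning browser never re-requests the file; shipping a new library version means publishing a new URL, never mutating an old one.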

With browsers using these URIs as references to factory loaded libraries… there’s no reason you could not have a system that recognizes links to:
as much as

Comment by JeromeLapointe — May 29, 2008


I basically agree with what you are saying. What I meant was that, yes, many people will refrain from learning or looking into how they themselves can cache, gzip and compress files on their own (and other types of files than JavaScript), since it’s just offered to them now.

If they instead, as you suggest, look at how Google does it and learn from it, great! I’m just skeptical, though, after having worked as a consultant for about a decade and having seen how numerous customers work. I think it’s as important to educate web developers and spread the word, not just offer hosting for certain JavaScript libraries, added and judged only by the Google gods (they should add many more JavaScript libraries to begin with, such as base2 etc., and also add some sort of “suggest this code” feature).

While frameworks, WYSIWYG tools etc. have their place, they are, more often than not, about creating something decent in a short amount of time rather than something really optimal (since no code can foresee every possible context it will be used in).

And sure, files will be cached, but for those that aren’t, there’s still the DNS risk, and looking at how some web sites perform with Google Analytics and Google AdSense scripts, I’m just not sure that it will always be completely frictionless.

So, at the end of the day, this might be a great solution for some, but for now my doubts linger over factors such as people not learning proper set-ups, DNS dependencies, and too small a subset of JavaScript libraries on offer.

Comment by robnyman — May 29, 2008

Cheers, that was very helpful.

Comment by Aphrodisiac — June 5, 2008

Regarding the critical mass… is anyone using this yet?

Are any big players using it?

Comment by blueclock — June 6, 2008


I can see your point about making assumptions about it being frictionless.
But can you (or anyone else) tell me something… it seems to me that the real value of this idea is not in relying on Google to do the proper gzipping and hosting of the files, but rather that the files have a much higher chance of staying cached (especially if a critical mass is reached) because they all point to a SINGLE location, no?

Or, I can rephrase the question: if I host “mootools1.11.js” on my site, and an identical copy already sits in the client’s browser cache from another site, that copy won’t be used, because the cache entry is specific to the other domain’s URL, correct? This is the problem that Google’s solution offers an answer for.
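A toy model of why this is so: browser caches are keyed by the absolute URL, not by the file’s contents, so identical copies under different domains are cached separately (the URLs below are purely illustrative):

```javascript
// Toy model of a browser cache: entries are keyed by absolute URL,
// so identical files hosted on different domains are separate entries.
var cache = {};
var fetchCount = 0;

function load(url) {
  if (!(url in cache)) {
    fetchCount++; // simulate a network fetch on a cache miss
    cache[url] = "contents of " + url;
  }
  return cache[url];
}

// Two sites hosting identical copies under their own domains: two fetches.
load("http://x.com/mootools1.11.js");
load("http://y.com/mootools1.11.js");
// Two pages referencing the same shared URL: one fetch, then a cache hit.
load("http://ajax.googleapis.com/ajax/libs/mootools/1.11/mootools.js");
load("http://ajax.googleapis.com/ajax/libs/mootools/1.11/mootools.js");
```

Four loads, but only three fetches: the second request for the shared URL is served from cache, which is exactly the benefit a common host provides.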

Comment by Bowser — June 11, 2008

Just posted my thoughts on “Web Service for Content Distribution Network” – http://tinyurl.com/6y3gqx . Would love to hear your thoughts.


Comment by Mukul — June 12, 2008

This boasts a better ability to cache across web servers, even if you have a very well equipped caching system. The idea is that if the user came from another site that loaded the same framework from Google’s URL, and you make a request to that same URL, the user simply loads it from cache instead of being told, “No, use my copy from x.com instead of y.com, even though the file is identical.”

Matt Foster
Ajax Engineer
Nth Penguin, LLC

Comment by MattFoster — June 17, 2008

Won’t the DNS be on auto for this to happen?

Comment by Tribulus — September 17, 2008

This article is very good for developers.

Thanks for this article

Comment by GlobalFreelanceProjects — October 12, 2009

It looks like a very promising project, good luck with your work and keep us all informed on its progress and how its use has benefited small community groups.

Comment by pariuri — December 3, 2009

Good review… thanks for posting. I was searching for this kind of project. Good work, man.

Comment by aarthi123 — December 16, 2009

Google is starting to be a very important part of our lives. As a matter of fact, some of us could not live without the big G. By launching services like this one, Google assures us that its continuous spread is far from finished.

Comment by Bet365 — January 8, 2010

Actually, using Google’s AJAX Library API (or any JavaScript library CDN from Yahoo or Microsoft) is slower right now than just serving the files locally, except in extreme cases. It’s the network effect: this is only valuable if lots of other people are doing it too; otherwise nothing gets cached. I did some research and found that the market penetration for these JavaScript library CDNs isn’t enough to make it worth the cost of contacting a third party.

You can read more here : http://zoompf.com/blog/2010/01/shoud-you-use-javascript-library-cdns/

Comment by BillyHoffman — January 15, 2010

What you say is very interesting, but it requires somewhat more advanced knowledge. I am not so good at this.

Comment by masini — February 8, 2010

Thanks for sharing your knowledge.

Comment by josephanderson — July 8, 2010

Nice, this article is very helpful for us.

Comment by royalinternationalpackersmover — July 8, 2010

The files are hosted by Google, which has a distributed CDN at various points around the world, so the files are “close” to the user. That is the point.

Comment by thestreamssstone — September 19, 2010

Leave a comment

You must be logged in to post a comment.