Thursday, April 26th, 2007

jscsscomp: JavaScript and CSS files compressor

Category: JavaScript

We love to play with the plumbing, don’t we? jscsscomp is the latest compressor; it uses Nicolas Martin’s PHP version of the Dean Edwards JavaScript compressor.

With a swish of mod_rewrite:

RewriteEngine on

RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^(.*\.)(js|css)$ jscsscomp/jscsscomp.php?q=$1$2 [L,NC]

you can get your JavaScript like this:

<script src="/jscsscomp/yahoo.js, dom.js, event.js, effects/dragdrop.js, slider.js"></script>
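
To give a feel for what happens behind that rewrite rule, here is a rough PHP sketch of the general idea. This is not jscsscomp’s actual source; minify_js() below is a hypothetical stand-in for whichever minifier you plug in (JSMin, the Packer port, etc.), only the JS case is shown, and a writable cache/ directory is assumed:

<?php
// Hypothetical sketch of a combining compressor front script:
// parse the comma-separated file list, rebuild a cached bundle
// when a source file changes, and serve the result gzipped.

$q = isset($_GET['q']) ? $_GET['q'] : '';
if ($q === '') { header('HTTP/1.0 404 Not Found'); exit; }

$files = array_map('trim', explode(',', $q));
$root  = realpath(dirname(__FILE__) . '/..');            // site document root
$cache = dirname(__FILE__) . '/cache/' . md5($q) . '.js'; // one bundle per file list

// Decide whether the cached bundle is stale, and refuse paths outside $root.
$stale = !is_file($cache);
foreach ($files as $f) {
    $path = realpath($root . '/' . $f);
    if ($path === false || !is_file($path) || strpos($path, $root) !== 0) {
        header('HTTP/1.0 404 Not Found');
        exit;
    }
    if (!$stale && filemtime($path) > filemtime($cache)) {
        $stale = true;
    }
}

// Rebuild the bundle only when needed; afterwards every request is a cache hit.
if ($stale) {
    $out = '';
    foreach ($files as $f) {
        $out .= file_get_contents($root . '/' . $f) . "\n";
    }
    file_put_contents($cache, minify_js($out));           // hypothetical helper
}

header('Content-Type: application/x-javascript');
ob_start('ob_gzhandler');                                  // gzip if the client accepts it
readfile($cache);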

Posted by Dion Almaer at 7:58 am

17 Comments »


Great tool for PHP.
If you are looking for the same functionality in Java/JSP, check out pack:tag.
http://sourceforge.net/projects/packtag

Comment by Daniel — April 26, 2007

Depending on the size of your JS file, this algorithm can be somewhat slow. It’s great for apps that use build scripts, but I wouldn’t want to be compressing this on demand.

Comment by Richard Marr — April 26, 2007

Though the implementation might be weak, this is a great idea: adding in-line compression (à la gzipping HTML) to the server.

Comment by Karl — April 26, 2007

That looks interesting!
However, I already have some rewrite rules my website needs in order to function. Can anyone help me combine the two sets so they aren’t mutually exclusive?

My rewrite rules:
——————–
RewriteEngine on
RewriteRule !\.(swf|js|ico|gif|jpg|png|css|html)$ index.php

Required rules:
——————–
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^(.*\.)(js|css)$ jscsscomp/jscsscomp.php?q=$1$2 [L,NC]

How do I roll them into one?

Thanks,
Temuri

Comment by Temuri — April 26, 2007
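
(For what it’s worth, one untested way to merge the two rule sets, assuming both live in the same .htaccess at the document root, is to let the compressor rule run first and keep its rewritten URL away from the catch-all:)

RewriteEngine on

# Existing .js/.css files go through the compressor; [L] ends this pass.
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^(.*\.)(js|css)$ jscsscomp/jscsscomp.php?q=$1$2 [L,NC]

# Everything else that isn't a static asset, and isn't the compressor URL
# produced by the rule above, falls through to the front controller.
RewriteCond %{REQUEST_URI} !^/jscsscomp/
RewriteRule !\.(swf|js|ico|gif|jpg|png|css|html)$ index.php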

But the code shown in the plain text box still doesn’t affect HTTP download time. Take a look in Firebug. The JS files would have to be combined into one file to affect performance.

Comment by Mike Henke — April 26, 2007

Great work. I thought I was the only one doing it this way, although I only gzip.

Comment by cdude — April 26, 2007

The Dean Edwards JS decompression algorithm can take up a lot of CPU if the script is large. It takes a few seconds to unpack TinyMCE (150k) on a P4 machine in Firefox, so that algorithm isn’t usable on larger scripts, since the unpacking has to be done every time the JS is loaded, even from the local browser cache.

I’ve also seen that if you just remove all comments and whitespace and then gzip, the result can be even smaller than the Dean Edwards algorithm plus gzip; I guess the Huffman coding compresses the script better when it’s plain text.

Comment by Spocke — April 26, 2007

See this component from JSOS also: http://www.servletsuite.com/servlets/jstrimflt.htm

Comment by Dmitry — April 26, 2007

Great tool. When is such a compressor going to come out for ASP.NET?

Comment by PohEe.com — April 26, 2007

“Houston, we have a problem…”

Testing on a real application (it uses prototype.js) showed a failure with the .js squeezing.

So, at this time this script can’t be used…

NEED HELP! (;

Comment by flashkot — April 27, 2007

Very nice. I will use it.

Comment by Valentino — April 27, 2007

I’m a tad confused. If you have mod_deflate/mod_gzip in Apache, why is this necessary? We’ve used these on Apache 1.3 and 2.0 for years and years to compress all outgoing text files on the fly. And since this is done at a compiled C level (it being an Apache module), there is no real CPU concern. Visit http://www.silverstripe.com and experience the excitement of compressed JS/CSS/HTML et al.

Comment by Siggy — April 27, 2007
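
(For anyone going Siggy’s route on Apache 2.x, mod_deflate is roughly a one-line enable; note that it compresses each response on the fly but, unlike this script, it doesn’t combine files or strip comments and whitespace:)

AddOutputFilterByType DEFLATE text/html text/css application/x-javascript application/javascript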

This is a good idea to use once: then save the JavaScript and CSS into two files and be done with it. Compressing on demand is a waste of resources, and you never know what bugs will be introduced if you make tiny changes to the files.

Comment by Nice — April 27, 2007

I think it’s important to note that the performance hit of this script only occurs when it detects that either a compressed version does not exist OR the source has been modified and needs to be recompressed. That’s why the library wants a “cache” directory. Once the files have been compressed there’s no performance hit at all; if anything there’s quite a substantial performance boost.

mod_deflate/mod_gzip can’t consolidate multiple file requests (reducing the number of HTTP requests and the associated overhead and browser connection limits).

As for the corruption of the prototype.js library, I haven’t had a chance to use the code myself, but it’s probably that JSMin was set too aggressively, something that should be easy enough to fix in a few days.

This library has quite a bit of potential. I look forward to seeing it evolve.

Comment by Patrick Hunlock — April 27, 2007

I use the PHP framework Prado.
Prado has a similar mechanism discussed here: http://www.pradosoft.com/demos/quickstart/?page=Advanced.Scripts3

Prado’s client-side code is built off of Prototype, Scriptaculous, Base and many other libraries that get combined into one file, minified, versioned, gzipped and cached on the server.

It doesn’t have an issue using Prototype or any of the other libraries.
This leads me to suspect the cause of the JavaScript corruption in jscsscomp is JSMin as well.

If you are looking for compressed and gzipped versions of Prototype go to the Google Group “Prototype: Core” Files page.
Prototype 1.5.1_rc3 is 15kb.
Prototype 1.5.0 is 11kb.

http://groups.google.com/group/prototype-core/browse_thread/thread/3dc7344c40fbc40e

Comment by jdalton — April 28, 2007

First of all, why pack content if the webserver does compression as well? I did some tests on file size two years ago. Although you can reduce the file size with this method, those files cannot be compressed any further by the webserver.
Say the original file is 100kb; after packing, the file is reduced to 30kb, but the webserver cannot compress that 30kb any further. If the webserver gzips the original 100kb, probably 20kb will remain.

Secondly, unpacking the content client-side is slow.

For my own projects I use my own ‘JS compiler’. It removes all whitespace, comments, etc., and then renames all variables and private methods (marked by a leading underscore). For example:

function _foo() {
    var d = document;
    var newElement = d.createElement('foo');
    var node = d.getElementById('bar').firstChild;
    while (node) {
        // do something
        node = node.nextSibling;
    }
}
// becomes:
function a(){var b=document,c=b.createElement('foo'),d=b.getElementById('bar').firstChild;while(d){d=d.nextSibling}}

This can reduce the file size by 60%, so 100kb becomes 40kb, and that 40kb will then be compressed by the webserver. Sounds to me like a better approach.

Comment by Jorgen Horstink — April 30, 2007

Not to steal jscsscomp’s thunder, but Minify is a cleaner, faster, more secure, and more reliable PHP library that does the same thing: http://code.google.com/p/minify/

Of course, I wrote Minify, so I’m biased.

Comment by wonko — May 3, 2007
