Monday, February 12th, 2007

BISON: Binary JSON

Category: JavaScript, JSON, Library

Binary Interchange Standard and Object Notation (BISON) is a new binary format created by Kai Jäger.

After seeing the direction AJAX was going in with light web service protocols like JSON, I wondered whether I could come up with an even lighter binary protocol that would work with JavaScript. Not so much to replace XML or JSON, but more as a proof of concept and to see if it was even possible. I ran into a couple of problems (some of which I talk about in the article), but I actually managed to write a working implementation of the protocol. Essentially, you can now serialize any JavaScript variable into a string that holds a binary representation of whatever you had in that variable. This string may then be sent over XMLHttpRequest to a script running on a web server. JavaScript and PHP implementations are available from my blog, and there are also a few demos.
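A minimal sketch of what that round trip might look like, assuming the library exposes a Bison.serialize() call (a commenter below uses it) and a matching Bison.deserialize(); the endpoint URL and the deserialize call are illustrative assumptions, not the documented API:

// Illustrative only: serialize a value to a binary string and POST it.
var payload = Bison.serialize({ id: 42, tags: ["a", "b"], ok: true });

var xhr = new XMLHttpRequest();
xhr.open("POST", "/endpoint.php", true);
xhr.onreadystatechange = function () {
    if (xhr.readyState == 4 && xhr.status == 200) {
        // Assumed companion call; the PHP side would decode the same format.
        var result = Bison.deserialize(xhr.responseText);
    }
};
xhr.send(payload);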

I can’t wait for someone to write an XML wrapper of the binary protocol ;)

Posted by Dion Almaer at 8:49 am

17 Comments »

BISON is the GNU version of Yacc. Hopefully he’ll change the name of Binary JSON to “BSON”.

Comment by Jamerquai — February 12, 2007

The XMLHttpRequest object’s send method automatically applies UTF-8 encoding to anything it sends. This may be fine for text messages, but what I was sending wasn’t exactly text. UTF-8 encodes characters with a numeric representation above 127 using anything between two and four bytes, meaning that when I thought I had sent a single byte, I might actually have sent up to four. This problem I could not yet circumvent, and while sending and receiving work just fine, the messages are a lot bigger than they have to be. Typically they’re about the same size as a JSON message, but sometimes they’re also bigger.
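A quick back-of-the-envelope helper (purely illustrative, not part of BISON) shows why a byte value above 127 grows once the string is UTF-8 encoded:

// Number of bytes UTF-8 needs for one code point (rough sketch).
function utf8Bytes(codePoint) {
    if (codePoint < 0x80) return 1;    // 0..127: plain ASCII, one byte
    if (codePoint < 0x800) return 2;   // 128..2047: two bytes
    if (codePoint < 0x10000) return 3; // up to 0xFFFF: three bytes
    return 4;                          // beyond the BMP: four bytes
}

utf8Bytes(65);     // 1 -- "A"
utf8Bytes(200);    // 2 -- a "binary" byte above 127 doubles in size
utf8Bytes(0x20AC); // 3 -- the euro sign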

So… binary about the same size as plaintext. Cool hack but doesn’t really come off as a net win for me.

There’s an effort to binary-encode XML in the W3C’s Efficient XML working group.

Comment by Karl G — February 12, 2007

It seems like a good experiment, but if the sent string size is about the same as a generic JSON notation, I suppose the serialize/deserialize computation time is not as “interesting” as the experiment itself.

P.S. I don’t quite understand the Array limitations (ordered … why? … obj = []; obj[1] = true; obj[0] = true; … fails your check) … however, this is a simple workaround:
function BSONSafeArray(arr){
for(var i = 0, j = arr.length, tmp = new Array(j); i < j; i++)
tmp[i] = arr[i];
return tmp;
};

Comment by Andrea Giammarchi — February 12, 2007

oops … I meant … “this simple workaround” (… maybe)

function BisonSafeArray(arr){
for(var i = 0, j = arr.length, tmp = new Array(j); i < j; i++)
tmp[i] = arr[i];
return tmp;
};

Comment by Andrea Giammarchi — February 12, 2007

When I encoded the JSON text without whitespace as

{info:"Hello, I'm an object",randomNumbers:[9,53,46,12,6,33,59,28],PI:3.14159265,nestedObject:{anotherNestedObject:{yetAnotherNestedObject:{color:"Blue",width:32,height:90}},someArray:[1,4,true,"Hello World",[2,3,4,5],{member:"value"}]}}

it is 237 bytes in size.

When I fed this JSON to Bison.serialize() it produced a 227 byte string.

Why would anyone use this BISON protocol to save only about 4% in data transferred?
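For reference, here is roughly how that comparison can be reproduced; Bison.serialize is the library call, while the variable names are just for illustration:

// Rough version of the measurement above.
var data = {
    info: "Hello, I'm an object",
    randomNumbers: [9, 53, 46, 12, 6, 33, 59, 28],
    PI: 3.14159265,
    nestedObject: {
        anotherNestedObject: {
            yetAnotherNestedObject: { color: "Blue", width: 32, height: 90 }
        },
        someArray: [1, 4, true, "Hello World", [2, 3, 4, 5], { member: "value" }]
    }
};

var bisonString = Bison.serialize(data);
// bisonString.length came out to 227 here, versus 237 bytes for the JSON text,
// i.e. a saving of roughly 4%.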

Comment by Jamerquai — February 12, 2007

I was going to ask how BISON compares to gzipped JSON, but after reading the other comments I understand that BISON isn’t actually producing smaller content anyway.

It was probably a fun exercise, but gzipped JSON wins hands down. If he had implemented gzip in JavaScript instead, we could send compressed JSON from the client too, instead of only from the server.

Comment by Theo — February 12, 2007

Theo: That is an interesting idea about implementing gzip in JavaScript. I can’t think of any time I would have needed that, but it would be an interesting experiment for sure. The savings would need to outweigh the cost of downloading the library itself.

Comment by Peter Michaux — February 12, 2007

Peter: File size is a consideration, and presumably processing time (to gzip client-side) as well. Of course, paradoxically, the client-side gzip code could itself be gzipped for fast download to the client. ;)

Comment by Scott Schiller — February 12, 2007

I don’t think there is a need to implement gzip in JavaScript. That will be taken care of by the browser before the data reaches the JavaScript, provided the browser supports gzip. And since all modern browsers do support gzip, there is no problem in sending JSON gzipped.

Comment by Kristoffer — February 12, 2007

One thing the author came close to is the idea of obfuscating JSON so it becomes a bit more difficult to reverse engineer.

Comment by Robert — February 12, 2007

Kristoffer: I meant sending gzipped (or otherwise compressed) data from the client, i.e. from the browser (submitting large amounts of text, for example, comes to mind). Assuming there would be some bandwidth savings, this might be an interesting area to explore.

Comment by Scott Schiller — February 12, 2007

I actually *have* implemented a gzip control on a client (not in pure JS, though I do have the start of that lying around somewhere; it’s been a while). It was an IE-only intranet application based on passing XML documents from the client to the server and back (not my idea), and because of not-so-careful planning, the size of said documents was pushing the 4MB POST limit (obviously not using multipart).

The experience was… um… not the most satisfying. While the size of the packets being sent back and forth was greatly reduced (a 4MB XML file shrank to around 220k after being gzipped and base64-encoded), the fact of the matter is that the browser is really not the best platform for sending and decoding text of that size. The app in question was *slow* because of this (ironically enough, the gzip was actually quite fast; it was the rest of the app that was slow).

Comment by Tom Trenka — February 13, 2007

“(ironically enough, the gzip was actually quite fast; it was the rest of the app that was slow)”

maybe coz u use this stupid and terrible slow DOJOTOOLKIT? kick this shit out of your app and it will be fast like hell!

Comment by handcoder — February 14, 2007

Wow! What an ass you are! Don’t you think that if I’d implemented gzip with Dojo it would be a part of the toolkit by now? The app I was referring to was an internal IE-only application that I worked on in 2003, long before Dojo existed. Don’t look down now, but your ignorance is showing.

Comment by Tom Trenka — February 14, 2007

anyway…DOJO IS slow and browser-unsafe….just my 2 cent

Comment by ajaxianer — February 14, 2007

If you’re sending JSON from server to client, the HTTP gzip encoding mechanism should give you good additional compression without having to resort to compressing the data in script-land. HTTP gzip encoding has been around for ages and all browsers support it, through proper handshaking (sketched below):

- if a user-agent supports gzip, it will send an Accept-Encoding: gzip HTTP header in every HTTP request to a server.

- if a server chooses to gzip the payload it sends back to the client, it MUST also send a Content-Encoding: gzip header.
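A minimal sketch of what this looks like from script-land (the URL is a placeholder; the point is that the browser handles the negotiation and hands the script already-decompressed text):

// Purely illustrative: the browser sends Accept-Encoding: gzip on its own,
// and if the server answers with Content-Encoding: gzip, the response is
// decompressed before script ever sees it, so responseText is plain JSON.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/data.json", true);
xhr.onreadystatechange = function () {
    if (xhr.readyState == 4 && xhr.status == 200) {
        // Some browsers hide or rewrite this header after decoding,
        // so treat it as informational only.
        var encoding = xhr.getResponseHeader("Content-Encoding");
        var data = eval("(" + xhr.responseText + ")"); // era-typical JSON parsing
    }
};
xhr.send(null);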

Comment by chris holland — May 25, 2007

