Thursday, April 6th, 2006

Speeding Up AJAX with JSON

Category: Ajax, Programming

Speed is everything when it comes to online applications. Users hate to sit and wait, especially the experienced ones; after a few seconds they start to think something's broken. Knowing this, optimizing everything you can in your application can mean the difference between someone leaving and someone sticking around to explore the site. One method for speeding up your app is described here on Builder.com: using JSON to speed up the data exchange between your Ajax script and the server.

XML is the standard way to interchange data, but it's often not the best way. Although XML can add structure and metadata to data, it does so in an overly verbose way. XML also has a fairly complex syntax, requiring a non-trivial parser to handle it. In JavaScript, XML must be parsed into a DOM tree to be used. And, once you've constructed the DOM tree, you still have to navigate through it to create corresponding JavaScript objects or otherwise use the XML data in your client-side Web application.

Fortunately, there’s a better way.

The article introduces JSON to the reader, offering a comparison with a matching XML structure. Its point of view is that XML is great for marking up data, but JSON is meant for the speed of data exchange. There's an example of the same request done in both XML and JSON, with JSON appearing to be the simpler of the two. The article finishes off with a look at how reliable JSON is for your application, and some mention of life on the other side: the server-side functionality.
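
To make that comparison concrete, here is a rough sketch of the kind of side-by-side the article draws (the record and field names are invented for illustration, not taken from the article):

// The same record, first held as an XML string, then as JSON text.
var studentXml =
    '<student>' +
    '<name>tommy</name>' +
    '<phone>123-345-3393</phone>' +
    '</student>';

var studentJson = '{ "name": "tommy", "phone": "123-345-3393" }';

// The XML string still has to be parsed into a DOM tree and walked before its
// values are usable; the JSON text becomes a ready-to-use JavaScript object
// with a single eval() or JSON.parse() call.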

Posted by Chris Cornutt at 7:05 am

26 Comments »

Hmmm .. an article on ‘speeding up’, but with absolutely no attempt to benchmark any speed gains JSON might deliver.

I suspect the differences only become apparent when dealing with large datasets. It would have been nice if he had made some attempt to at least quantify this.

Comment by Simon G — April 6, 2006

Please check out http://blogs.ebusiness-apps.com/dave/?p=43

It compares JSON vs. XML with benchmarks and shows that XML dominates in speed with large data sets. There is also an article on XML-XSLT vs. JSON vs. DOM with XML-XSLT clearly winning with large datasets.

In smaller cases, the speed difference was only about 5ms, which is pretty negligible in most cases. Speed also differs between browsers (e.g. FF vs. IE).

He concludes that JSON is definitely nifty, but it all boils down to your own particular need. Each tech has its place so adjust as needed.

Comment by Derek — April 6, 2006

Speed without benchmarking: I completely agree with the above. That's like claiming science with no data. JSON seems like a convenience format, but last I checked there are tons of tools, both free and purchasable, for the XML format, and none for JSON.

I find it hard to justify building anything for JSON when I know at some point I'll have to deal with XML anyway. I'm not a fan of coding things twice. Maybe someone should create an XML-to-JSON API; that's probably the only way I would consider using it.

Comment by Ivan — April 6, 2006

I agree: if you already have consumers using an XML format, there is no reason you should use JSON. On the other hand, if you don't have a web-service need to produce XML, sending JSON back to your AJAX client is a lot more convenient, on the client side, than sending XML.

Comment by John Christopher — April 6, 2006

I don't understand why we need JSON.
The day XForms is supported by browsers, the data we send to the server will be XML and not JS; by doing it that way now, we can migrate our applications quickly in the future.

XML was developed for defining DATA; it's easier to understand code that parses the DOM than JSON (in my opinion), and we can use the XML in other cases (for accessibility, for example: building a site that doesn't require JS, so screen readers can read the data).

Comment by Nir Tayeb — April 6, 2006

"it's easier to understand code that parses the DOM than JSON (in my opinion)"

You've obviously never worked with JSON, since there is no code needed to parse JSON besides eval(), or perhaps the JSON.parse method if you're using the json.org file.

JSON will continue to gain popularity not just for closed apps but also for web services where SOAP was the norm. All major languages can support JSON through parsers, and being able to do something like this in mybic:
var student = new Object();
student.name = "tommy";
student.phone = "123-345-3393";
ajaxObj.call("action=saveStudent&student=" + JSON.stringify(student), callBack);

makes sending complex data to the server, and letting something like PHP turn it into a native object, a breeze.

Now in PHP I just do
$student = $JSON->decode($_POST['student']);
and now I have
$student->name;

It's much more of a hack to turn that student object into an XML string and then have to load that XML up in PHP again. Those who are against JSON mostly seem to be people who don't really know what JSON is.

Comment by Jim Plush — April 6, 2006

The assertion that JSON will speed up your web applications is potentially naive. JSON will parse faster on the client, but unless you're hitting performance problems on the client, that carries almost no weight in whether your Web application (which is essentially a server application) will scale or perform.

*IF* your application is performing OK in the client, the question is what will be performant on the *server* in preparing the data… That's not to say that preparing JSON on the server to be sent to the client won't be more performant, merely that this is what needs to be considered.

I don't know about Ruby or PHP, but Java, .NET and Python all have performant XML implementations, and serialising data to XML isn't a performance concern. It also means your response is potentially reusable by other clients, etc.

As for the notion of posting JSON to the server to be parsed for the request… no is the simple answer… it won't be more performant than simply posting a regular query. Not even close. In Java or .NET, reconstructing an object on the server in an ad hoc fashion is going to completely slag performance.

For myself, for discrete value returns, plain text is just fine for the response. For large returns I'll simply have the application prep XML, transform it server-side with XSL into a partial HTML view, and return the whole portion… If it's more than a simple value it's an exercise in transclusion for me, not RPC… I still feel we're reinventing a wheel that should have been addressed with XLink… If I need to repopulate a list then I'll do just that: I'll pass back the list content to be squirted in. Screw having the browser parse anything with JavaScript. Take this content, and put it there.

My clients are as dumb as I can keep them, bar the plumbing and regular mechanics.

Comment by Guy Murphy — April 6, 2006

JSON lends itself to cross-domain data interchange via the dynamic script tag hack. Using XMLHttpRequest and XML, you are limited by the same-domain security policy in most browsers. When implementing remote media library browsing in AjaxAMP I was forced to use JSON for this reason since the client must fetch media library information from a user-defined remote IP address. The nice surprise was that it also turned out to be amazingly simple to parse on the client side.
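
(For anyone who hasn't seen the trick, here is a minimal sketch of the dynamic script tag approach. The host, path, callback parameter and response shape are invented for illustration; this is not AjaxAMP's actual API.)

function loadRemoteLibrary(host, callbackName) {
    // A <script> tag is not subject to the XMLHttpRequest same-domain policy,
    // so it can pull JSON from a user-defined remote address.
    var script = document.createElement('script');
    script.src = 'http://' + host + '/library?callback=' + callbackName;
    document.getElementsByTagName('head')[0].appendChild(script);
}

// The remote server is expected to wrap its JSON in a call to the named
// function, e.g. showLibrary({ "tracks": [ ... ] }).
function showLibrary(library) {
    alert('Got ' + library.tracks.length + ' tracks');
}

loadRemoteLibrary('192.168.0.10:5151', 'showLibrary');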

Comment by Gabriel Levy — April 6, 2006

“Screw having the browser parse anything with JavaScript. ”

I wish I had the luxury of a nice fat pipe to serve every client; however, I don't, so in my particular case I need to minimize network traffic while also allowing 3rd parties to tap into our core system. Rather than have them writing XML strings for complex API calls, they simply need to pass in a JavaScript object and I can do the rest.

KISS. You'll never win a battle of XML vs JSON vs TEXT. It's all about what you need to do to get the job done. If XML works for you then enjoy it; if JSON works for you then enjoy it. Good programmers will keep both in their toolbox and use whichever one has the advantage.

Comment by Jim Plush — April 6, 2006

JSON may be elegant, and I think it is. However, the title of this post is more than a little misleading.

If you are writing an app which does a lot of object creation, etc., I can see where JSON would be attractive, at least for the cleanliness of the solution.

But if you are pushing a lot of data back and forth, I think XML is clearly the way to go, and it has more going for it than just performance.

Comment by Hubris Sonic — April 6, 2006

The few posters claiming XML is faster than JSON and the ridiculous website “proving” XML is faster are all misinformed. Parsing XML may be faster than evaluating JSON, but what good is a parsed XML tree unless you actually use the data – i.e., retrieve it from the DOM tree to convert it to a javascript variable? Once you add the parsing time to the actual use time you will see JSON is a clear speed winner. More to the point, JSON can express maps and arrays much more clearly than XML. In XML all maps and arrays are expressed by convention rather than as part of the data format itself. With JSON there is no chance of misinterpretation.

Unrelated to this, the article mentioned that “a malicious server could have your browsers executing dangerous actions. In that case, you’re better off using a JSON parser written in JavaScript.” They are talking about the javascript eval() command. When you visit ANY webpage with javascript enabled you are allowing that site to execute arbitrary javascript in your browser – whether you use eval() or not. So the browser client may as well allow eval() use since it is no worse than the non-eval() javascript case. The only place to be concerned about potentially dangerous JSON use is on the server, and using a JSON parser there would make perfect sense.

Comment by Jay Sam — April 6, 2006

Did anyone notice that the post refers to builder.com, but the article is actually at developer.com?

Nothing new and a bit late, don’t you think?

Comment by David Davis — April 7, 2006

I like using JSON. I primarily work with PHP, and although there are plenty of XML packages, there's still some level of string concatenation, etc., to actually build the XML sent via the AJAX request. Then, in JavaScript, I have to go through the process of traversing the XML and building my JavaScript object so that I can use it. Even with relatively large datasets, I find that json->encode(largearray) and then json.parse(large ajax response) is far easier to implement and has a negligible effect on server performance.

I fail to see the point of the naysayers here. JSON is XML without the hassle of XML…I think that’s the point of the article. Here’s a quote:

"JSON produces slightly smaller documents, and JSON is certainly easier to use in JavaScript." I think that wins it for me.

Comment by Jon — April 7, 2006

Gabriel, you can do cross domain XML – you just have to do some creative cross domain ajax ;)

Jay, which ridiculous website is misinformed?

IMHO, the most common task when dealing with AJAX is to render data into HTML – so the question is not one of XML vs JSON but XSLT (+XML) vs JavaScript (+JSON). From the AJAX performance of these combinations one can determine which to use for a given situation. In general I like to keep state information (active record, highlighted object, selected items – that sort of thing) as JavaScript objects but data from the server in XML.

I also wonder why everyone always uses the most complicated and verbose XML + XML DOM code when talking about JSON vs XML? What about XPath goodness?

Comment by Dave Johnson — April 7, 2006

@Jay Sam wrote:

When you visit ANY webpage with javascript enabled you are allowing that site to execute arbitrary javascript in your browser – whether you use eval() or not.

Let's say you are writing a web app and you fetch some JSON data from another web service, "Joe's Car Parts". You fetch the price of a car part, but Joe inserts some additional JavaScript into the response. You eval() the response, Joe's code runs, and it does some nasty stuff (in your domain, potentially using your session data).

So if you are fetching JSON data from a source that you don’t trust, you should probably parse it.
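
(For illustration, a sketch of the kind of defensive check a client-side JSON parser applies before calling eval(). It is in the spirit of the json.org parser's sanity check, simplified here rather than copied, and the function name is made up.)

function parseUntrustedJson(text) {
    // Remove string literals first, then make sure nothing is left except
    // JSON punctuation, numbers, whitespace and the letters of true/false/null.
    var stripped = text.replace(/"(\\.|[^"\\])*"/g, '');
    if (/[^,:{}\[\]0-9.\-+Eaeflnr-u \n\r\t]/.test(stripped)) {
        throw new Error('Response does not look like plain JSON data');
    }
    return eval('(' + text + ')');
}

// e.g. var part = parseUntrustedJson(xhr.responseText); then use part.price, etc.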

Comment by Patrick Fitzgerald — April 7, 2006

Dude, you can only make an XMLHttpRequest to the domain that served you the web page in the first place – it is a security restriction of the browser. So eval() all you want – it makes no difference in security on the client side compared to just visiting normal JavaScript-enabled websites.

Comment by JoJo Json — April 7, 2006

JoJo, Patrick is talking about using a script tag to load the JSON, not XMLHttpRequest. So it does work cross-domain, and you do need to take into account whether you trust the party you’re loading the JSON from.

Comment by Michael Geary — April 7, 2006

JSON can express maps and arrays much more clearly than XML. In XML all maps and arrays are expressed by convention rather than as part of the data format itself. With JSON there is no chance of misinterpretation.

Give me a break. YOU think JSON can express maps and arrays more clearly. That's what YOU think; don't state it as empirical fact. Secondly, 'no chance' of misinterpretation? Right… how long have you been programming?

Comment by Hubris Sonic — April 9, 2006

OK, Hubris, time to put your money where your mouth is. :-)

Given these JSON objects:

'obj': { 'prop': 'value', 'notarray': 'one' }

'obj1': { 'prop1': 'value', 'array1': [ 'one' ] }

'obj2': { 'prop2': 'value', 'array2': [ 'one', 'two' ] }

Show me the equivalent XML that clearly indicates that array1 is an array with one element, array2 is an array with two elements, and notarray is not an array at all.

Comment by Michael Geary — April 9, 2006

One more such tool at: http://info-sense.net/color.html

Comment by Niks — April 10, 2006

<obj prop="value"/>
<obj1 prop="value">
    <array>one</array>
</obj1>
<obj2 prop="value">
    <array>one</array>
    <array>two</array>
</obj2>

Comment by Hubris Sonic — April 10, 2006

Hubris and Michael,

To be fair, in my opinion, your XML doesn't (and can't (1)) take advantage of the purpose of XML, which is to have a DTD or XSD defining what the data should look like. When interchanging data between two systems, this is the /real/ advantage of XML (again, IMO). I can publish a service with an XSD and you can write a program to parse any document I generate which validates against that XSD. We can do this completely independently of one another.

If all I want to do is move information from one part of MY system to another part of MY system, I know and control the code on either end of the exchange. That's where XML loses its advantage. JSON shines in that space because I can quickly move a data structure across the wire into one programming language or another with minimal parsing effort.

When publishing a service, or accepting data from the general public, XML would be my first choice. DTD/XSDs allow me to publish what I dole out or accept in a nice, widely understood format that you can code to. When I control both ends of the communication, and one of those ends is in Javascript, I use JSON.

So, remember: the right tool for the job.

(1) A better XML document:


<object name="obj">
  <property name="prop" value="value" />
  <property name="notarray" value="one" />
</object>
<object name="obj1">
  <property name="prop1" value="value" />
  <array name="array1">
    <element value="one" />
  </array>
</object>
<object name="obj2">
  <property name="prop2" value="value" />
  <array name="array2">
    <element value="one" />
    <element value="two" />
  </array>
</object>

Comment by Doug Van Horn — April 10, 2006

The right tool for the job.

My point exactly. However, my XML was better ;) You're a schema guy, I can tell.

Comment by Hubris Sonic — April 10, 2006

No… your XML sucked compared to his, Hubris. Yours didn't even meet the requested requirements. His clearly shows them: one can see that the first object is not an array yet holds the value "one", while the other two are arrays. Standards are there for a purpose ;) Nice post, Doug Van Horn.

Comment by Chris — April 14, 2006

I would add a further requirement that strings should be distinguishable from numbers, so that 1 != "1". This is notoriously cumbersome in XML.
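
(A small illustration of the point, with invented field names: JSON carries the string/number distinction in the data itself, whereas an XML text node always arrives as a string until something coerces it or a schema annotates it.)

var record = eval('(' + '{ "count": 1, "label": "1" }' + ')');
alert(typeof record.count); // "number"
alert(typeof record.label); // "string"

// The XML equivalent, <count>1</count>, yields the string "1" from the DOM,
// and the receiving code has to know to convert it.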

Comment by AC — May 3, 2006

We are using ColdFusion to create a new Internet portal to replace SharePoint. Since I have been working on this project I have been getting more negative by the day on ColdFusion. It is soooooo slow on IE, but works great on FF and Safari. I have no choice but to use IE.

I am using every new Ajax feature possible in order to give the users a rich client application experience. It seems that I will need to abandon all of my work if I can't get it to speed up. It takes 20 seconds for a page to load even when everything is cached, versus only a few seconds with other browsers.

What do I need to do to make it FAST?

Comment by mskinner — September 10, 2008
