Thursday, January 10th, 2008

Cross-Site XMLHttpRequest in Firefox 3

Category: Security, XmlHttpRequest

John Resig has written up documentation of Cross-Site XMLHttpRequest that discusses the W3C Access Control working draft, which Firefox 3 implements.

He gives us a nice example:

In a nutshell, there are two techniques that you can use to achieve your desired cross-site-request result: Specifying a special Access-Control header for your content or including an access-control processing instruction in your XML.

In HTML:

  <?php header('Access-Control: allow <*>'); ?>
  <b>John Resig</b>

In XML:

  <?xml version="1.0" encoding="UTF-8"?>
  <?access-control allow="*"?>
  <simple><name>John Resig</name></simple>

And the XHR code itself isn’t different from any other XHR code:

  var xhr = new XMLHttpRequest();
  xhr.open("GET", "http://dev.jquery.com/~john/xdomain/test.php", true);
  xhr.onreadystatechange = function(){
    // readyState 4 means the request has completed
    if ( xhr.readyState == 4 ) {
      if ( xhr.status == 200 ) {
        document.body.innerHTML = "My Name is: " + xhr.responseText;
      } else {
        document.body.innerHTML = "ERROR";
      }
    }
  };
  xhr.send(null);

Some are excited to see the cross-domain work, and some are concerned. For example:

I agree with Thomas. I never understood the NEED to modify the client security model to allow for this. If this is something the software needs to do, then the developer can implement a proxy on the server side. At least in this way the developer has sole discretion on the connections. Just more to go wrong if you ask me.
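
A minimal sketch of the proxy approach that commenter describes: the page only ever talks to its own origin, and a server-side script forwards the request to the remote host. (The proxy.php endpoint and the target URL here are made up for illustration.)

  // Same-origin XHR; the actual cross-domain fetch happens on the server.
  // "proxy.php" is a hypothetical script that forwards the requested URL
  // to the remote host and echoes the response back.
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "proxy.php?url=" + encodeURIComponent("http://example.com/data.xml"), true);
  xhr.onreadystatechange = function(){
    if ( xhr.readyState == 4 && xhr.status == 200 ) {
      document.body.innerHTML = xhr.responseText;
    }
  };
  xhr.send(null);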

-

I’m still under the impression – and correct me if I’m wrong – that all these means are tailored to protect the server and its documents. But I thought the issue was to protect the client!

-

What exactly is the reason we need this? Has anybody here really understood why XMLHttp is currently limited to one host and cannot communicate cross-domain? I really do not understand that. If XMLHttp cannot do this by default, why is it still possible to load scripts and images from other servers? Why can I do exactly the same type of cross-domain communication using Flash, and maybe Silverlight in the future? What is the original reason for this limitation? Is this documented anywhere?

If, as mentioned in the spec, HTTP DELETE is problematic because it may delete data, why can't we filter such actions when detecting cross-domain communication? GET and POST are already possible in the same way when submitting a simple form. It is even possible to generate these form elements dynamically, and this also works cross-domain. At least these two HTTP methods should be enabled by default to allow cross-domain communication. The open web, as often mentioned by Alex Russell, really needs features comparable to closed-source software such as Flash or Silverlight.
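
The commenter's point about forms is easy to demonstrate: a dynamically built form can already POST across domains today, which browsers have always allowed (the page just can't read the response). A rough sketch, with a placeholder target URL:

  // Build a form on the fly and POST it to another domain.
  var form = document.createElement("form");
  form.method = "POST";
  form.action = "http://other-domain.example.com/endpoint"; // placeholder URL
  var field = document.createElement("input");
  field.type = "hidden";
  field.name = "q";
  field.value = "some data";
  form.appendChild(field);
  document.body.appendChild(form);
  form.submit();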

-

I agree with those saying that this spec is misguided. But bothering users too much is also not good. How are they to know in every case what things mean?

What do you think?

Posted by Dion Almaer at 12:29 pm

11 Comments »


Is this a good place to use JSON instead?

Comment by bnye — January 10, 2008

It’s protecting the client from having its cookie-authenticated data exposed to malicious sites.

And yeah, bnye, unprotected JSON (or JavaScript, even more so) is basically allow="*". So if you want to allow other sites to use data associated with your users, totally. If not, protect your JSON.

Comment by bander — January 10, 2008
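
To illustrate bander's point: an unprotected JSON or JavaScript endpoint can already be pulled in cross-domain with a plain script tag, no XHR required. A rough JSONP-style sketch (the endpoint URL and callback name are invented):

  // The remote endpoint is expected to reply with a call to our
  // callback, e.g.: handleData({"name": "John Resig"})
  function handleData(data) {
    document.body.innerHTML = "My Name is: " + data.name;
  }
  var script = document.createElement("script");
  script.src = "http://other-domain.example.com/data.js?callback=handleData"; // made-up URL
  document.getElementsByTagName("head")[0].appendChild(script);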

Protecting the user is exactly the reason why we need cross-site XMLHttpRequests… as stated before, we could use JSON, but getting data out of JSON without using eval (which should never be done with data that you get from a third-party server) is a pain. A cross-site XMLHttpRequest is much safer… I mean, what harm can it do as long as protected data is not marked allow="*", which hopefully nobody in their right mind would do…

Comment by Hans Schmucker — January 10, 2008
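
On the eval pain point Hans mentions: the usual workaround is a real JSON parser such as Douglas Crockford's json2.js, which defines JSON.parse where the browser lacks a native one. A minimal sketch (the data.json URL is a placeholder):

  // Assumes json2.js (or a native implementation) supplies JSON.parse,
  // which validates the text instead of blindly executing it like eval.
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "data.json", true); // same-origin placeholder URL
  xhr.onreadystatechange = function(){
    if ( xhr.readyState == 4 && xhr.status == 200 ) {
      var data = JSON.parse(xhr.responseText); // throws on malformed JSON
      document.body.innerHTML = "My Name is: " + data.name;
    }
  };
  xhr.send(null);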

Weren't there plans to implement the cross-site JSONRequest function proposed by Crockford almost 2 years ago?

Comment by ded — January 10, 2008

Cross-site XMLHttpRequests are helpful.

Comment by linizou — January 10, 2008

The risks are actually pretty huge. The biggest being infiltration behind firewalls. A malicious server could serve a cross-domain script that essentially robots through an unsuspecting surfer’s intranet.

Comment by pwb — January 11, 2008

I think there is a huge need for this. Bottlenecks on proxies, easy integration of Ajax components that connect to a central webservice, etc.

Comment by alexeiwhite — January 11, 2008

Besides whether this is a ‘good’ or ‘bad’ thing, I think it would still take half a decade before this method can be used unobtrusively in web applications (since that is probably how long it will take Microsoft to adopt this and all its users to upgrade).

Comment by pool — January 14, 2008

I think this could enable some kinds of attacks, like using JS to steal a password from the page and send it back to the attacker's server via XMLHttpRequest.
This is normally not possible, but in FF3 it is if the attacker has configured his own server correctly!

Comment by stauren — January 29, 2008

Maybe this will shed some light:
http://websecurity.ro/blog/2008/04/10/cross-domain-requests-will-be-back/

Comment by luca — April 10, 2008

So this is basically like crossdomain.xml in Flash??

Comment by tlrobinson — April 22, 2008
