Wednesday, August 12th, 2009

W3C publishes first working draft of File API

Category: Standards

The W3C has published a working draft of the File API, which gives us a much improved <input type="file"> and programmatic access to file uploads and the like.

There are actually a few pieces to this work, which also does a good job of interfacing with other standards:

This specification provides an API for representing file objects in web applications, as well as programmatically selecting them and accessing their data. This includes:

  • A FileList interface, which represents an array of individually selected files from the underlying system. The user interface for selection can be invoked via <input type="file">, i.e. when the input element [HTML5] is in the File Upload state, or through the FileDialog interface.

  • A FileData interface, which provides asynchronous data accessors for file data via callback methods.
  • A File interface, which includes readonly informational attributes about a file such as its name and its mediatype.
  • A FileError interface, which defines the error codes used by this specification.

The API for getting access to selected files is trivial (document.getElementById("myFileInput").files.length, etc.), and then you can get the file data itself in various forms (data: URL, text, binary, Base64, or the new filedata:// URL).
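Of those forms, the data: URL is easy to illustrate outside the API itself; here is a quick round trip in JavaScript (Node's Buffer stands in here only for the file bytes the API would hand you):

```javascript
// A data: URL is just a media type header plus the Base64-encoded bytes,
// which is the shape getAsDataURL-style accessors return.
var text = "hello, File API";
var dataUrl = "data:text/plain;base64," +
    Buffer.from(text, "utf8").toString("base64");

// Decoding is the reverse: strip everything up to the comma,
// then Base64-decode the rest.
var base64 = dataUrl.slice(dataUrl.indexOf(",") + 1);
var decoded = Buffer.from(base64, "base64").toString("utf8");
```

The round trip gets you back exactly the bytes you started with, which is why a data: URL can be dropped straight into an img src.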

An example usage of the filedata URL:


  // Sample code in JavaScript
  // Obtain fileList from <input type="file"/> using DOM
  var file = fileList.files.item(0);
  if (file) {
    // ... make asynchronous call
    file.getAsURL(handleURL);
  }

  function handleURL(url, error) {
    if (url) {
      var img = new Image();
      img.src = url;
      // other stuff...
    } else {
      // error conditions
    }
  }

Fun to see this all come together. The editor is fellow Mozillian Arun Ranganathan … an all-round good chap :)

Some have talked about alternative solutions, such as using XHR to do the work, or DOM events to allow built-in progress events. The working group is listening; what would you like to see?

Posted by Dion Almaer at 6:25 am




I vote for DOM events for every percent uploaded.

Comment by Jaaap — August 12, 2009

I second the above, DOM events for tracking upload progress would be awesome.

Comment by ossreleasefeed — August 12, 2009

Ooo, being able to get hold of the binary in JavaScript will make so many new things possible…

Comment by meandmycode — August 12, 2009

With JavaScript, the following methods (kinda similar to video/audio):

start(), stop() and cancel(), so it is not necessary to submit a form; you can access it by script instead. DOM events, as stated above, to receive information about the current upload.

Just imagine you are only interested in the head of a file; you could do something like:

var input = document.getElementById("…");
var file = input.files.item(0);
input|file.addEventListener("track?", function(e) {
  if (e.currentSize >= [size in kb that's needed]) {
    // continue with e.filedata
  }
});
I didn’t have a look into the spec, so the code is just to describe my words above… kinda :)

Comment by gossi — August 12, 2009

I really like the idea above. It would be very useful to have such a granular control over file uploads.
Now, the question is, when can I start using it in FiCOS? (Firefox, Chrome, Opera, Safari ;))

Comment by iliad — August 12, 2009

It’s really nice to see that there is a good specification for the File API. It’s a bummer, though, that the Mozilla team hasn’t implemented it fully and doesn’t have support for the slice method; this makes it very impractical when you handle large files, for example a video stream, since it then loads the whole contents into memory before sending it using, for example, an XHR call.

Comment by Spocke — August 12, 2009

@iliad: One can only hope that both the WebKit and Gecko teams implement this feature. The current state is that WebKit doesn’t have support for any of the file methods; you can only extract the name and the size of the selected files. Gecko has most of the methods, but not one important one for slicing the files into smaller chunks. I hope both vendors implement the W3C spec soon and think about memory management; this is an awesome concept and would remove the need for Flash or Gears for uploading many large files.

Comment by Spocke — August 12, 2009
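The chunked upload Spocke describes can be sketched by precomputing the byte ranges a slice-style method would read; chunkRanges and the file.slice(start, length) signature mentioned in the comment are assumptions here, not from the published draft:

```javascript
// Compute [start, end) byte ranges so a hypothetical file.slice(start, length)
// could read and upload one piece at a time instead of loading the whole file.
function chunkRanges(fileSize, chunkSize) {
  var ranges = [];
  for (var start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}

// A 10 MB file in 4 MB chunks: two full chunks plus one short 2 MB tail.
var ranges = chunkRanges(10 * 1024 * 1024, 4 * 1024 * 1024);
```

Keeping only one chunk in memory at a time is exactly the memory-management point made above.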

Spocke, iliad, meandmycode: Firefox already supports many FileList features (and has for a while).

>>var file = document.getElementById("some-file-input");
>>[property for (property in file.files.item(0))].join(", ")
getAsDataURL, fileName, fileSize, getAsText, getAsBinary

Comment by EliGrey — August 12, 2009

I’d love to be able to upload a file via XHR, then be able to query that process for progress information.
Note that I’d prefer this to callbacks, as it would give me more control over when the UI is updated.

Access to binary data would also be a big win.

Comment by jdhuntington — August 12, 2009

Will work on the file 'type' put pressure on browser developers to resolve the 2GB limit bug? For now, trying to upload a file larger than 2GB results in a silent crash or bad headers (Content-Length below zero)…

Comment by nixon — August 12, 2009
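The negative Content-Length nixon describes is the classic signed 32-bit overflow; a small sketch (asInt32 is just an illustrative helper, not an API):

```javascript
// When a byte count is stored in a signed 32-bit integer, anything past
// 2^31 - 1 wraps around to a negative value. JavaScript's bitwise operators
// coerce to int32, which makes the wraparound easy to demonstrate.
function asInt32(n) {
  return n | 0; // truncate to signed 32-bit, as a C int would
}

var limit = 2147483647;               // 2^31 - 1, largest signed 32-bit value
var threeGB = 3 * 1024 * 1024 * 1024; // a file size past the limit
var wrapped = asInt32(threeGB);       // comes out negative
```

A browser that sizes the upload with a 32-bit int would therefore emit a Content-Length below zero for files past 2GB.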

Yay, finally another useful thing in HTML5 that Flash actually doesn’t do better.

Comment by Darkimmortal — August 12, 2009

I think the most important thing is to integrate this into XHR to ease all of the iframe hacks for uploading files asynchronously.

I would also vote that, if there is a progress API, you can specify the call rate and the callback, something like this:

var rate = 2000; // milliseconds between each report
loader.registerProgressListener(rate, function(uploadedBytes, totalBytes, bytesPerSecond) {
  // update the UI here
});

It is important to be able to control the rate of reporting.

Comment by BH23 — August 12, 2009
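BH23's rate-limited reporting boils down to a throttle; makeThrottledReporter and the injected clock below are hypothetical helpers, not part of any spec:

```javascript
// Only forward a progress report if at least `rate` ms have passed since
// the last one. The clock function is injected so the logic can be tested
// deterministically; in a browser you would pass Date.now.
function makeThrottledReporter(rate, callback, now) {
  var last = -Infinity;
  return function (uploadedBytes, totalBytes) {
    var t = now();
    if (t - last >= rate) {
      last = t;
      callback(uploadedBytes, totalBytes);
    }
  };
}

// Simulated clock ticking every 500 ms; a 2000 ms rate lets through
// only every fourth report.
var t = 0;
var reports = [];
var report = makeThrottledReporter(2000, function (up, total) {
  reports.push(up);
}, function () { return t; });
for (var i = 0; i <= 8; i++) {
  report(i * 1000, 8000);
  t += 500;
}
```

The browser still fires (or is polled) as often as it likes; the page only repaints at the rate it asked for.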

@EliGrey, great, I didn’t know that; hopefully a specification can push things along more, given that the only way specifications become final is after a certain number of implementations?

Comment by meandmycode — August 12, 2009

Progress events/status fall outside the scope of this article, since they are not part of the File API but of the XMLHttpRequest API.

That said, I don’t agree with progress events; simply allowing the script to periodically poll the request would be simpler, and effectively the same thing.

Upload progress is something that should already be exposed in the browser. Primarily as a progress widget similar to the loading progress bars (which are largely fake, since total page size includes all the images, scripts etc.) The browser should know how big the request is, and how much data has been sent. It just needs to expose that information, visually or via JS.

This shouldn’t be limited to the request either. Responses should be accessible too. Access to data loaded and the HTTP size header would be nice as well, and not just for Ajax requests. Frame, img, video, audio, etc. objects would be nice too, potentially allowing for pretty pre-loaders for these elements.

However as I said, whole other kettle of fish.

Comment by Deadmeat — August 12, 2009
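Deadmeat's polling model can be sketched with a mock request object; the bytesSent/bytesTotal properties are hypothetical, not part of any spec:

```javascript
// Compute a percentage from whatever counters a request object exposes.
// In a browser this would run on a setInterval timer; a mock object is
// enough to show the arithmetic.
function percentDone(request) {
  if (!request.bytesTotal) return 0; // avoid dividing by zero
  return Math.round(100 * request.bytesSent / request.bytesTotal);
}

var mockRequest = { bytesSent: 1536, bytesTotal: 4096 };
var pct = percentDone(mockRequest);
```

Because the script pulls the numbers on its own schedule, it decides when the UI updates, which is the control jdhuntington asked for above.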

I’d be interested in being able to access files as MIME-style packages (similar to emails and MHTML).

This would provide a low-rent way to have multipart structured files. Of course this would make the API more complex, since you’d want to be able to read each part individually. This would be useful for apps that want to load and save multipart data in a single file, such that you didn’t need to prompt the user to select multiple files or embed and decode multiple files in a single file.

This is easily fakeable if the file is already an MHTML document, by reading as text and parsing. It’d be nice if this were baked in, though.

Would make for a really simple shorthand for generating email attachments too.

Comment by Deadmeat — August 12, 2009
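The "fake it by reading as text and parsing" approach above can be sketched as a naive boundary split; real MIME parsing has many more edge cases (part headers, transfer encodings), so this only shows the idea:

```javascript
// Split a MIME-multipart-style body on its "--boundary" marker lines,
// discarding the empty preamble and the trailing "--" terminator.
function splitParts(body, boundary) {
  var marker = "--" + boundary;
  return body
    .split(marker)
    .map(function (part) { return part.replace(/^\r?\n|\r?\n$/g, ""); })
    .filter(function (part) { return part !== "" && part !== "--"; });
}

var doc =
  "--frontier\n" +
  "part one\n" +
  "--frontier\n" +
  "part two\n" +
  "--frontier--";
var parts = splitParts(doc, "frontier");
```

Applied to a file read with getAsText, each element would be one embedded part of the package.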

I love the idea of being able to open local files, make changes, and save them back to the users local file system. This would be huge for pushing the web forward as a platform.

Comment by aheckmann — August 12, 2009

You can prompt users to save “new” files by using data URIs in links, so href="data:text/saveme;base64,……".

Which isn’t quite the same thing. But then writing to local files is a whole other kettle of fish, opening up way too many avenues for malicious sites to write files to disk.

Quite rightly, runtimes like Adobe AIR et al. are much more suited for that sort of application.

Comment by Deadmeat — August 12, 2009

FileList must be a plain-old array or a singly-linked list (this isn’t obvious in the spec), so that JS/C++/Java/Python/etc. developers can use their native tools for the job. And you know they will.

Usage of error codes in 2009 seems kind of… obsolete. Structured exceptions came from ML, where they are represented simply as datatypes. Why isn’t it the case in the IDL?

Comment by chiaroscuro — August 12, 2009

Uploading files by drag and drop from the native OS to a programmatically defined destination area inside the browser window would be great. Maybe we could have a confirmation dialog to prevent accidental uploads and to make this feature’s security equivalent to that of the standard file selection box.

XHR file upload and progress callbacks will be very useful. Progress callbacks should be available for any XHR request, not only for files.

A way to query progress even for non-XHR file uploads would also be useful but it could be skipped if the browser reports about the progress of the uploads of every single file.

@Deadmeat: access to local files is a matter of trust. The OS itself or any famous application running on your machine (Office, Photoshop, any web browser) could be sending a copy of all your files to a remote server right now. All you can do is trust your vendor or the users’ community, read the source code if available, and monitor your Internet connection. The companies running web applications are usually far more obscure than the vendors of OSes or mainstream applications, so trust is lower.

Flash and also Java (Silverlight as well?) can access files, webcam and microphone, but only after prompting the user for permission, and I’m sure you grant permission only to applications you trust. There is no reason why JavaScript shouldn’t be able to do the same.

The only problem I see is that it will lower the barrier to entry for attackers: there are probably fewer people who can write Flash, Java and Silverlight than there are who can write JavaScript. As a countermeasure, permissions should be granted on a single-file basis, and/or variables and DOM elements getting content from local files should be marked as tainted so that the browser refuses to send their values to a remote server.

Comment by pmontrasio — August 13, 2009
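pmontrasio's tainting countermeasure, as a toy sketch; no browser implements anything like this, and all the names below are hypothetical:

```javascript
// Values read from local files carry a taint flag, and the "send" step
// refuses flagged values. Real taint tracking would have to survive string
// concatenation, DOM round-trips, and so on; this only shows the contract.
function taint(value) {
  return { value: value, tainted: true };
}

function send(payload) {
  if (payload && payload.tainted) {
    throw new Error("refusing to send tainted local-file data");
  }
  return "sent";
}

var fileContents = taint("secret local data");
var blocked = false;
try {
  send(fileContents);
} catch (e) {
  blocked = true;
}
```

Untainted values still go through unchanged, so ordinary uploads keep working.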

Mozilla has implemented drag and drop support for the File API, see

You can try out this demo in a Firefox nightly build:

You can even drop multiple files at once.

Comment by stonedyak — August 13, 2009
