Tuesday, September 5th, 2006

Apocalypse 2.0 – A New Era of Fragmentation

Category: Browsers, JavaScript

Ryan Stewart cautions about JavaScript fragmentation among web browsers in this ZDNet article. He speculates on the upcoming ECMAScript update (the last edition was released in 1999) with the following hypothetical scenario:

ECMA released the specifications for the fourth edition of ECMA Script, and JavaScript 2.0 was born. It added enhanced JavaScript functionality for developers, but FireFox was the only browser to fully support it. Worse than that, all of the Ajax applications written to fuel the Web 2.0 boom required major reworking. Microsoft decided that they were going to add some enhancements to JavaScript 2.0, but these enhancements only worked with IE7 and broke Mozilla browsers. A new browser, which had steadily gained market share because of its ability to keep the browsing history secret, decided to implement a more “secure” version of JavaScript. Developers of applications which were once the toast of the Web 2.0 world now got hundreds of emails a day from users saying “why doesn’t your page work with my internet”. Chaos ensued.

Realistically, most developers won’t be able to rely on the new features for a long time; not many developers have the luxury of assuming a Firefox-only user base. If you’re willing to commit to recent versions of Flash, you gain ECMAScript portability:

ActionScript 3 is based on the newest ECMA standards, and provides a look at what JavaScript 2.0 may look like. JavaScript developers who take a look at ActionScript will feel right at home and not be required to hack around browsers or worry whether or not their code is going to run for most people.
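To make the gap concrete: here is the kind of prototype-based class emulation ES3-era JavaScript requires, with comments noting the `class`/`extends` syntax that ActionScript 3 (and the proposed ECMAScript 4) provide natively. The names are illustrative, not from either specification.

```javascript
// An ES3-style "class": a constructor function plus prototype methods.
// In ActionScript 3 / proposed ES4 this would be a `class Point { ... }`
// with typed members, rather than hand-wired prototypes.
function Point(x, y) {
  this.x = x;
  this.y = y;
}
Point.prototype.distanceTo = function (other) {
  var dx = this.x - other.x;
  var dy = this.y - other.y;
  return Math.sqrt(dx * dx + dy * dy);
};

// Inheritance must also be wired up by hand in ES3
// (in AS3/ES4: `class Point3D extends Point { ... }`):
function Point3D(x, y, z) {
  Point.call(this, x, y); // chain the parent constructor
  this.z = z;
}
Point3D.prototype = new Point(0, 0); // borrow the parent prototype
Point3D.prototype.constructor = Point3D;

var p = new Point3D(3, 4, 0);
// p inherits distanceTo from Point:
console.log(p.distanceTo(new Point(0, 0))); // 5
```

Whatever one thinks of class-based syntax, this boilerplate is exactly the sort of thing ES4-style classes aim to remove, and it is already gone in ActionScript 3.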

We’ve mentioned the upcoming threat to browser standards previously. Things have actually been pretty good for a while, compared to the “Works Best in IE/Works Best in Netscape” days of the mid-90s and the “Only works in IE” period that followed. But now we’re at a crossroads. How much longer can we expect browsers to remain reasonably consistent, when there’s so much increased interest in web apps and the browsers that power them, when MS’s core applications are again under threat from the web, and when there’s every chance of Safari’s share continuing to grow alongside that of Apple/OS X?

Posted by Michael Mahemoff at 7:30 pm


11 Comments »


On the one hand: hell. On the other: job security.

Comment by Gabe da Silveira — September 5, 2006

Eh. I think the IE team has started listening to developers outside of their own office (and orifice), luckily for all. IE7 may still have an enormous way to go, but they can more clearly see where they need to go. The other major browsers have started trying to beat each other at standards compliance. I don’t think we need to worry about everyone branching off so much as about what everyone will eventually migrate to from JavaScript.

Personally, I can’t freakin’ wait for it.

Comment by The Hater — September 5, 2006

IMO we don’t need new JS features as much as complete, non-leaky JS implementations of yesterday’s standards. I should be able to leave a couple of pages open all day without my browser constantly soaking up more memory. When Opera widgets hit mobile phones, the problem is really going to become evident (though it is certainly not limited to Opera).

Comment by Stephen Clay — September 5, 2006
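The leak class Stephen describes had a well-known shape in the IE of that era: a circular reference between a DOM node and a JavaScript closure, which IE’s COM-based reference counting could not reclaim. A DOM-free sketch of that shape, with a plain object standing in for the element (all names here are illustrative):

```javascript
// Sketch of the classic IE5/6 leak shape: element -> handler -> element.
// `element` is a plain object standing in for a real DOM node.
function attachLeakyHandler(element) {
  element.onclick = function () {
    // The closure captures `element`, and `element.onclick` holds the
    // closure: a circular reference. IE's COM-based DOM collector of
    // that era could not reclaim such cycles, so the node leaked.
    element.hidden = true;
  };
}

// The conventional fix: break the cycle explicitly on teardown.
function detach(element) {
  element.onclick = null;
}

var el = { hidden: false };
attachLeakyHandler(el);
el.onclick(); // the handler still works as intended
detach(el);   // ...and the cycle is broken when the page unloads
```

Libraries of the day (and `onunload` cleanup routines) existed largely to automate that `detach` step.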

I read that article and thought it was pure FUD!
The author focuses on JavaScript, but his arguments could apply equally to HTML, CSS, RSS, or any other format or language. Take the WHATWG’s HTML5 spec, for example: some aspects are already appearing in Gecko, and they could be really useful. But not if they aren’t widely available.

The problem of browser compatibility is obviously very important, which is why we must continue to lobby browser makers to follow standards, and lobby the standards bodies to create intelligent, useful standards. There is no point in saying we should just hide in proprietary, closed formats like Flash. Flash is great, but it should be only one aspect of the web, not the whole thing.

Comment by scottbp — September 5, 2006

Browser developers will always have a tendency to offer more than what is expected. This was exactly IE’s problem: it offered a lot of functionality thanks to ActiveX and other neat built-in features (though it failed to deliver the standard core at a sufficient level). But how many web developers used these extra features? Almost none. Most of them stick to standards, or to functionality that can be delivered cross-browser easily (well, easily…).

Of course, we will always have more standards than browser developers will be able to support to their full extent. (Look at the time it took XSL to become cross-browser, or at XForms, which is still getting there.) Some of these will be forgotten before they gain global acceptance and usage. But as always, it will be the customers (web developers) who decide how to develop their applications and which standards or components to use to reach the desired audience, not the browser developers (though they may try, they will most probably fail, just as MS did a few years ago).

Comment by nxt — September 6, 2006

It is interesting that you compare this “problem” to the days of the browser wars but I would argue that the browser wars generated innovation. The best features are determined by actual use and not some W3C committee.

One browser vendor will cook up a feature that they feel will be useful. Either through market share (IE) or pure usefulness (tabbed browsing) it may be adopted by other vendors. If that feature is for developers (JavaScript, DOM, etc.) then different vendors may arrive at different incompatible implementations. This is just a natural result of one vendor trying to improve on an idea.

If the feature is really useful it will be adopted by more vendors, usually with one design emerging as the “standard”. If one design does not emerge, this is where a committee like the W3C steps in to consider all proposals and then pick one. Then, as a check and balance, it is up to the vendors to decide whether they want to support this standard. It is important to note that a de facto standard, or a standard approved by a committee such as the W3C, is not necessarily the best design of a feature. It is just the design that vendors and standards committees agreed to support.

This entire process can take years before most major vendors are supporting a common standard that developers can rely on. I admit this is frustrating because web developers want to use the features as soon as they are available and not have to wait. But this process is important because it forces a feature to prove its usefulness before it can be relied on.

Early adopters can use new features ahead of time, but they risk higher development costs (dealing with implementation bugs and workarounds for incompatibilities). Early adopters also risk losing customers who use a browser that doesn’t yet support the feature being used. These are choices a developer must make to balance the use of new innovation against universal access. We need these early adopters because they are the ones who will prove a feature’s usefulness and work out the kinks. But obviously not everyone has the ability to be an early adopter.

All of this is just a long-winded way of saying I welcome the new features coming out in JavaScript. Many of them will fall away if usage shows them not to be useful. But a few will stick around, be adopted by other vendors, and perhaps even be standardized so they can be used without risk. This is the path the XHR object is taking: it started as an IE innovation years ago, was almost forgotten, but has now been adopted by almost every major vendor and is currently being standardized to work out the remaining incompatibilities.

JavaScript has stagnated for too long. Once the browser wars ended, innovation stopped, and all we got as web developers were new pie-in-the-sky, impractical standards designed in committees by the W3C. Committees are good for reaching compromise but poor at innovating. I see the new browser wars as an opportunity for real, useful innovation to happen again.

I do hope that this war is less about egos and more about trying to deliver the best technology. I think the vendors learned from the last war that working together won’t prevent success. Only time will tell if that lesson stays learned. But I also hope that the vendors don’t stop innovating just because the idea has not already been approved by some committee.

Comment by Eric Anderson — September 6, 2006
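The XHR history Eric sketches is still visible in the creation code developers have to write today: use the native `XMLHttpRequest` constructor where it exists, and fall back to the ActiveX ProgIDs from IE’s original implementation. A minimal sketch of the conventional pattern:

```javascript
// Create an XHR object across browsers circa 2006.
// Mozilla, Safari, and Opera expose a native XMLHttpRequest constructor;
// IE5/6 expose the same object only through ActiveX.
function createXHR() {
  if (typeof XMLHttpRequest !== "undefined") {
    return new XMLHttpRequest();
  }
  if (typeof ActiveXObject !== "undefined") {
    // Try the newer MSXML ProgID first, then the original one.
    var ids = ["Msxml2.XMLHTTP", "Microsoft.XMLHTTP"];
    for (var i = 0; i < ids.length; i++) {
      try {
        return new ActiveXObject(ids[i]);
      } catch (e) { /* try the next ProgID */ }
    }
  }
  return null; // no XHR support at all
}
```

The fact that a four-branch factory function is needed at all is the incompatibility the standardization effort is now working out.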

Eric Anderson:

“JavaScript has stagnated for too long. Once the browser wars ended, innovation stopped, and all we got as web developers were new pie-in-the-sky, impractical standards designed in committees by the W3C. Committees are good for reaching compromise but poor at innovating. I see the new browser wars as an opportunity for real, useful innovation to happen again.”

That’s the same as saying that a programming language that isn’t in a constant state of flux must therefore be rejected for the stability of its platform. Change for the sake of change is a fool’s choice.

The ECMA team working on scripting has more or less lost touch with the average web developer. The focus is less on integrating newer effects, using existing technologies, into the existing web, and more and more on achieving some form of perfection within the JavaScript language, as a kind of inner nirvana.

The effort is chaotic and secretive: the discussions are scattered across mailing lists and forum entries, hinted at in conferences, and covered glancingly here and there, but never in such a way that there’s complete transparency about what’s going on. When was the last time Eich updated his roadmap weblog? Oh yes, a small group of the obsessed Know All. Bully.

I’m not going to beat up on IE because from what I can see, they at least are talking about their direction.

“How much longer can we expect browsers to remain reasonably consistent, when there’s so much increased interest in web apps and the browsers that power them, when MS’s core applications are again under threat from the web, and when there’s every chance of Safari’s share continuing to grow alongside that of Apple/OS X?”

Actually, I think the consistency among most browsers will continue, but I think the new IE in the future is going to be Firefox.

Comment by Shelley — September 6, 2006

[…] I commented over at Ajaxian. Here is my comment: […]

Pingback by The Bb Gun » Blog Archive » Me Feed Six — September 6, 2006

Shelley wrote:

That’s the same as saying that a programming language that isn’t in a constant state of flux must therefore be rejected for the stability of its platform. Change for the sake of change is a fool’s choice.

I think you greatly misunderstand me. I in no way stated or implied that change is good for the sake of change. But change is good when there is a real need. There are hundreds of areas in which browsers can improve to give the user a better experience. Things like the XHR object are just the tip of the iceberg.

Because of a renewed browser war (a bit friendlier this time), we are starting to get innovation again, which is leading toward much more capable web applications than those of the 199X era in which JavaScript had stagnated for so long.

Comment by Eric Anderson — September 6, 2006

That is a very refreshing approach to the subject.

Comment by Mike Luxury — October 23, 2006

Good article the same as all! ;)

Comment by meble biurowe wrocław — June 17, 2007
