Thursday, August 28th, 2008

In Praise of Evolvable Systems

Category: Articles, Editorial, Standards

I met with a colleague recently who wants to take his project and create a standard on the web that actually gets adopted. We talked for a long time, and when we finished up I pointed him at a paper that had a huge impact on me, called “In Praise of Evolvable Systems” by Clay Shirky.

By the web’s timeline this paper was written a long time ago, back in the ancient year of 1996, but everything in it still holds about why the web won and how we might move forward from where we are today. The tagline under the title sums it up: “Why something as poorly designed as the Web became The Next Big Thing, and what that means for the future.”

Clay starts by pointing out how bad the Web is:

If it were April Fool’s Day, the Net’s only official holiday, and you wanted to design a ‘Novelty Protocol’ to slip by the Internet Engineering Task Force as a joke, it might look something like the Web:

  • The server would use neither a persistent connection nor a store-and-forward model, thus giving it all the worst features of both telnet and e-mail.
  • The server’s primary method of extensibility would require spawning external processes, thus ensuring both security risks and unpredictable load.
  • The server would have no built-in mechanism for gracefully apportioning resources, refusing or delaying heavy traffic, or load-balancing. It would, however, be relatively easy to crash.
  • Multiple files traveling together from one server to one client would each incur the entire overhead of a new session call.
  • The hypertext model would ignore all serious theoretical work on hypertext to date. In particular, all hypertext links would be one-directional, thus making it impossible to move or delete a piece of data without ensuring that some unknown number of pointers around the world would silently fail.
  • The tag set would be absurdly polluted and user-extensible with no central coordination and no consistency in implementation. As a bonus, many elements would perform conflicting functions as logical and visual layout elements.

HTTP and HTML are the Whoopee Cushion and Joy Buzzer of Internet protocols, only comprehensible as elaborate practical jokes. For anyone who has tried to accomplish anything serious on the Web, it’s pretty obvious that of the various implementations of a worldwide hypertext protocol, we have the worst one possible.

Except, of course, for all the others.
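
That last set of complaints is easy to see in practice. As a rough sketch of the “new session per file” point (this is not from Clay’s paper; the host and paths below are just placeholders), here is what fetching a page’s resources looks like when every file opens its own connection, the way HTTP/1.0-era browsers behaved, versus reusing one persistent keep-alive connection:

    # Rough sketch of "each file incurs the overhead of a new session call".
    # example.com and the paths are placeholders, not real page resources.
    import http.client
    import time

    HOST = "example.com"
    PATHS = ["/", "/style.css", "/logo.png"]

    def fetch_with_new_connections():
        """One TCP connection per resource, as early browsers did."""
        start = time.time()
        for path in PATHS:
            conn = http.client.HTTPConnection(HOST, timeout=10)
            conn.request("GET", path)
            conn.getresponse().read()
            conn.close()  # throw the connection away after every file
        return time.time() - start

    def fetch_with_persistent_connection():
        """One keep-alive connection reused for every resource."""
        start = time.time()
        conn = http.client.HTTPConnection(HOST, timeout=10)
        for path in PATHS:
            conn.request("GET", path)
            conn.getresponse().read()  # drain the body so the socket can be reused
        conn.close()
        return time.time() - start

    if __name__ == "__main__":
        print("new connection per file:   %.3fs" % fetch_with_new_connections())
        print("one persistent connection: %.3fs" % fetch_with_persistent_connection())

The exact timings aren’t the point; the point is that the overhead Clay lists was very real, and the web won anyway.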

The web, however, was better than the contenders of its time. Clay argues that the other formats, such as Gopher, Interactive TV, and so on, were too well designed and had too much internal consistency:

These various [non-web] protocols and services [Gopher, Interactive TV, AOL, etc.] shared two important characteristics: Each was pursuing a design that was internally cohesive, and each operated in a kind of hermetically sealed environment where it interacted not at all with its neighbors. These characteristics are really flip sides of the same coin — the strong internal cohesion of their design contributed directly to their lack of interoperability. CompuServe and AOL, two of the top online services, couldn’t even share resources with one another, much less somehow interoperate with interactive TV or CD-ROMs…In other words, every contender for becoming an “industry standard” for handling information was too strong and too well-designed to succeed outside its own narrow confines. So how did the Web manage to damage and, in some cases, destroy those contenders for the title of The Next Big Thing? Weakness, coupled with an ability to improve exponentially.

Clay then goes on to argue that successful systems must be evolvable and gives three rules:

Evolvable systems — those that proceed not under the sole direction of one centralized design authority but by being adapted and extended in a thousand small ways in a thousand places at once — have three main characteristics that are germane to their eventual victories over strong, centrally designed protocols.

  • Only solutions that produce partial results when partially implemented can succeed. The network is littered with ideas that would have worked had everybody adopted them. Evolvable systems begin partially working right away and then grow, rather than needing to be perfected and frozen. Think VMS vs. Unix, cc:Mail vs. RFC-822, Token Ring vs. Ethernet.
  • What is, is wrong. Because evolvable systems have always been adapted to earlier conditions and are always being further adapted to present conditions, they are always behind the times. No evolving protocol is ever perfectly in sync with the challenges it faces.
  • Finally, Orgel’s Rule, named for the evolutionary biologist Leslie Orgel — “Evolution is cleverer than you are”. As with the list of the Web’s obvious deficiencies above, it is easy to point out what is wrong with any evolvable system at any point in its life. No one seeing Lotus Notes and the NCSA server side-by-side in 1994 could doubt that Lotus had the superior technology; ditto ActiveX vs. Java or Marimba vs. HTTP. However, the ability to understand what is missing at any given moment does not mean that one person or a small central group can design a better system in the long haul.

I know we sometimes get frustrated with the state of the web today, but it’s useful to know why we are here and what characteristics any new idea, feature, or standard probably needs to have in order to succeed. Clay’s paper helps guide me in navigating these issues.

Posted by Brad Neuberg at 5:45 am

1 Comment »


Is anyone else tired of complaints about HTTP from people who never use it as a resource transport protocol like it was intended?

Comment by trav1m — August 29, 2008
