Sunday, April 18th, 2010

The Best of Steve: Performance at JSConf

Category: Performance, Presentation


(Live blogging notes.)

At JSConf, Steve Souders walks us through several performance-optimising things on his mind lately.

Site Speed in PageRank

A week ago, Google announced that site speed is going to be taken into account for PageRank. For Steve, this is a dream come true: companies are now going to start investing in performance, which means fewer of the slow-loading sites that frustrated him into starting down this performance path in the first place. One criticism is that this will favour big companies, but Steve points out that smaller companies are often more nimble and better able to adapt to changes like this.

As part of Google’s webmaster tools, site performance is shown to the respective site owners, along with some guidance. Another good resource is http://www.webpagetest.org/. Beyond its main measurement service, a great feature of WebPageTest is side-by-side comparisons: show your manager a side-by-side against a competitor for guaranteed satisfaction.

Performance of Third Party Widgets

There’s been something of a reversal in performance hotspots. Five years ago, it was the core application code that mostly slowed things down, much as teams would have liked to blame performance problems on third-party ads. Nowadays, those apps have been more finely tuned, and at the same time people are using more third-party stuff – not just ads, but embedded widgets too. All of which explains why Steve’s been looking at third-party widgets lately. You can see what he’s been up to at P3PC, a benchmark tool for third-party widgets.

A key question is how the widgets are embedded. People are no longer just making blocking document.write calls. Instead, it’s much more common to dynamically create a script tag and append it to the page. But where and how do you append it? jQuery’s library code, for example, does it in a simple, elegant manner:

javascript

var head = document.getElementsByTagName("head")[0] || document.documentElement,
    script = document.createElement("script");
// ...
head.insertBefore(script, head.firstChild);

Others, not so much. See Steve’s recent blog posts for more analysis on these techniques (e.g. on Google Analytics).
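Wrapped as a reusable helper, that pattern looks roughly like the following. This is a minimal sketch, not Steve's or jQuery's actual code; the function name and widget URL are placeholders:

```javascript
// Sketch of the non-blocking dynamic script-tag pattern.
// Inserting before head.firstChild, as jQuery does, sidesteps
// appendChild edge cases in partially parsed documents.
function loadScriptAsync(src) {
  var head = document.getElementsByTagName("head")[0] || document.documentElement;
  var script = document.createElement("script");
  script.src = src;
  script.async = true; // hint to the browser not to block parsing (where supported)
  head.insertBefore(script, head.firstChild);
  return script;
}

// e.g. loadScriptAsync("http://widgets.example.com/widget.js");
```

Unlike a document.write embed, the browser can keep parsing and rendering the page while the widget script downloads.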

Frag Tag

An early proposal from Steve and Alex Russell …

<FRAG>
<script src="snippet.js"></script>
</FRAG>

The idea is that the frag loads independently; even document.write doesn’t block. And it could go further, into a sandboxing mechanism. “If we had this frag tag, it would be one of the biggest things website owners could do to improve the performance of their pages.”

Browser Disk Cache

The main message here is that the browser disk cache is too small, and Steve’s been talking to browser vendors about increasing its capacity. There’s also a survey you can take part in.

What makes sites feel slow?

So lately Steve’s been going back to basics and looking at user perception, thinking not just about how long the JavaScript takes to load, but how long until the user sees anything. So he’s promoting the standard progressive enhancement pattern, similar to Facebook’s earlier talk:

* Deliver HTML
* Defer JS
* Avoid DOM
* Decorate later.
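As a rough sketch of the "defer JS, decorate later" steps, the following defers behaviour until after the load event fires. The enhance callback is a hypothetical placeholder for whatever wiring-up the page needs:

```javascript
// Sketch: deliver the HTML first, defer the JavaScript, and only
// decorate the page (attach handlers, build DOM extras) afterwards.
// The enhance callback is a hypothetical placeholder.
function decorateLater(enhance) {
  if (document.readyState === "complete") {
    // Page already loaded: decorate immediately.
    enhance();
  } else {
    // Otherwise wait for onload instead of blocking the initial render.
    window.addEventListener("load", enhance, false);
  }
}
```

The user sees the delivered HTML straight away; the JavaScript-driven decoration arrives once the page has rendered.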

He’s done a couple of studies to this end:

* Charting page load time – as the user actually sees it – against market share…in an attempt to show that faster sites mean bigger market share.
* Looking at initial payload versus execution. Many of the sampled (highly popular) sites serve a lot of functionality in the initial payload that could be deferred until later – by putting scripts at the bottom of the page and loading scripts asynchronously.
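One way to trim that initial payload, sketched below, is to request a feature's script only the first time it is needed rather than shipping it up front. The names and the example URL are hypothetical:

```javascript
// Sketch: lazy-load a script on first use instead of including it
// in the initial payload. Repeat requests for the same URL are ignored.
var requestedScripts = {};
function loadOnDemand(src, onload) {
  if (requestedScripts[src]) return; // already requested
  requestedScripts[src] = true;
  var script = document.createElement("script");
  script.src = src;
  script.onload = onload;
  (document.getElementsByTagName("head")[0] || document.documentElement)
    .appendChild(script);
}

// e.g. fetch a feature's code on first click:
// button.onclick = function () { loadOnDemand("/js/feature.js", initFeature); };
```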

Other Stuff

Check out:

* Browserscope, “a community-driven project for profiling web browsers”.

* HTTP Archive Format (HAR). An industry-standard format for capturing HTTP traffic. Used in an increasing number of tools, e.g. the NetExport plugin for Firebug.

* Velocity Conference, which Steve founded with O’Reilly. June 22-24, Santa Clara.

Posted by Michael Mahemoff at 2:54 pm
2 Comments




“A week ago, Google announced site speed is going to be taken into account for PageRank.”

This was followed by suggestions that included, among other things, loading jQuery by appending a child script to the HEAD.

Does googlebot block on that script request? I believe that has never been shown. Moreover, unless it has a full-blown browser engine, why would it? Googlebot probably doesn’t block on script requests.

Most sites have unnecessary amounts of markup, add large scripts that are also unnecessary, such as jQuery or Dojo, and have an inordinate amount of badly managed CSS.

Limiting the amount of junk HTML and validating the remaining through validator.w3.org can improve SEO.

Taking the recommended counteractive measures to improve performance is senseless. No thanks for the snake oil.

Oh this is good:
Spam Question: What is the name of Mozilla’s browser?
Fennec? Iceweasel? Seamonkey?

Comment by dhtmlkitchen — April 19, 2010

@dhtmlkitchen: To clarify, the time measurements come from real users. Owners of a web site can see that information as described in the blog post Your site’s performance in Webmaster Tools. In addition to potential SEO improvements, making your web site faster improves the user experience, increases revenue, and reduces operating costs. I just came back from JSConf and heard positive case studies from Happy Cog, Zappos, and Facebook about how their sites were made significantly faster using these performance best practices. All site changes should be addressed in the appropriate priority. Site speed should be in that priority list somewhere.

Comment by souders — April 20, 2010

Leave a comment

You must be logged in to post a comment.