Web developers use two basic types of metrics to measure page speed: literal load time and “best practice” scorecards like Google Page Speed and Yahoo’s YSlow. Both kinds of tools mislead developers about where they should spend their time and effort.
First off, basic load time scoring does not measure perceptual load time. If the user can interact with their desired content, they will not perceive the lack of a footer. But Web Performance Today already has a great post on perceptual load time, so I won’t spend time on it here.
Another gripe I have is how difficult it is to accurately measure the impact of a given change, since multiple variables throw off each measurement. Even in high school chemistry, we averaged three weight measurements because a breeze from the ventilation duct was enough to throw off the sensitive scales. Web server processing speed (especially on shared hosting), neighborhood and local network congestion, and local rendering time all vary minute to minute. Web developers should average their measurements and track their data points over time, but that rarely happens.
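The chemistry-class approach translates directly into code. A minimal sketch (the URL and sample count are illustrative, not a recommendation of any particular tool): time the same fetch several times and report the mean alongside the spread, so run-to-run noise is visible instead of hidden.

```python
# Average repeated load-time samples instead of trusting one reading.
# The URL passed to measure_once is illustrative.
import time
import statistics
import urllib.request

def measure_once(url):
    """One wall-clock timing of a full page fetch, in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

def summarize(samples):
    """Mean and sample stdev, so measurement noise is visible."""
    mean = statistics.mean(samples)
    spread = statistics.stdev(samples) if len(samples) > 1 else 0.0
    return mean, spread
```

Feeding five `measure_once` timings into `summarize` gives a number worth tracking over time; a lone measurement tells you almost nothing about whether a change helped.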
You may think point number two is silly, as any major change should produce a load time difference of half a second or more. But, after extensive troubleshooting, I pinned a client’s load time issues on the WP über cache plugin W3 Total Cache: the plugin adds enough overhead that the site basically traded long download times for long response times. If an agnostic measurement tool on an independent server that controlled for local variables were available (kinda like the nice scale in chemistry class that wasn’t under the air conditioning vent), I would have caught this much more quickly. A service that helps developers average load times before and after changes, and also gives comparisons over various time periods, would go a long way towards correcting the ad-hoc measurements that take place now. Linking those numbers to traffic load (à la Google Analytics) would be cool; linking to a WP plugin or an SSH session + ps would be awesome.
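A hypothetical before/after check such a service could run, with invented timings: only call a change an improvement when the difference in means clears the run-to-run noise.

```python
# Compare before/after timing samples; the sample values are invented.
import statistics

def change_beats_noise(before, after):
    """True if the mean improvement exceeds the larger run-to-run stdev."""
    delta = statistics.mean(before) - statistics.mean(after)
    noise = max(statistics.stdev(before), statistics.stdev(after))
    return delta > noise

# e.g. five timings before and five after enabling a caching plugin
before = [3.1, 3.3, 3.0, 3.4, 3.2]
after = [2.1, 2.0, 2.2, 1.9, 2.0]
```

With these numbers the ~1.2 s improvement dwarfs the ~0.16 s spread, so the change is real; a 50 ms “win” inside the same spread would rightly be reported as noise.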
However, my most serious complaint about Page Speed and YSlow is the way they present their data. From this pretty waterfall loading graph, one can see that the time spent downloading JS/CSS never goes over 1ms. The total time *downloading* all JS/CSS is 0.033 seconds, while the time spent *waiting* for the JS/CSS is 4.2 seconds: the majority of the entire page load time.
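To make the waiting-versus-downloading split concrete, here is a rough sketch (not how the waterfall graph was produced; the host and path are illustrative): time-to-first-byte counts as waiting, body transfer counts as downloading.

```python
# Rough split of "waiting" (time to first response byte) versus
# "downloading" (body transfer) for a single resource.
# The host and path arguments are illustrative.
import time
import http.client

def waiting_vs_downloading(host, path="/"):
    conn = http.client.HTTPSConnection(host, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()       # returns once the headers arrive
    first_byte = time.perf_counter()
    resp.read()                     # pull down the body
    done = time.perf_counter()
    conn.close()
    return first_byte - start, done - first_byte

def waiting_share(waiting, downloading):
    """Fraction of total time spent waiting rather than downloading."""
    return waiting / (waiting + downloading)
```

Plugging in the numbers above, `waiting_share(4.2, 0.033)` comes out above 99%: almost all of the JS/CSS time is latency, not bandwidth.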
To put that in perspective, all of the work one puts into compressing text and using a CDN saved about 0.4 seconds, while combining the JS/CSS would save about 4.0 seconds. The general recommendation checklists YSlow and Page Speed offer are useful, but they should report the expected speed gain from any given change and prioritize according to a time/benefit ratio. Both tools weight their sub-scores, but they do little to tell a developer whether it is worth their time and processing power to bring their ETags in line.
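The kind of ranking I have in mind could be as simple as this sketch, using the savings above; the effort estimates are invented for illustration.

```python
# Hypothetical time/benefit ranking: seconds saved per hour of effort.
# Savings come from the waterfall numbers above; effort figures are made up.
recommendations = [
    # (name, expected seconds saved, estimated hours of effort)
    ("combine JS/CSS files", 4.0, 2.0),
    ("gzip + CDN for text assets", 0.4, 3.0),
    ("configure ETags", 0.02, 1.0),
]

ranked = sorted(recommendations, key=lambda r: r[1] / r[2], reverse=True)
for name, saved, hours in ranked:
    print(f"{name}: ~{saved:.2f}s saved, {saved / hours:.2f}s per hour of work")
```

Even a crude ratio like this would tell a developer that combining JS/CSS is worth an afternoon, while ETag tuning belongs at the bottom of the list.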