CDNs are not magic, and there's a lot more than bandwidth to consider. Before bandwidth even enters the picture you have the overhead of DNS lookups, additional server connections, and response latency, plus the client-side cost of decompressing, parsing, and processing the response. CSS blocks rendering until it transfers, and JavaScript blocks not only rendering but further loads until it finishes executing. Latency is almost always far more important to the user experience than raw bandwidth and, particularly if you have many users on wireless networks, with blocking resources the concern isn't the average or best-case latency but the worst-case.
Libraries are still usually a reasonable trade for the reasons you mentioned, but they put a hard limit on your site's performance. If most of your users won't execute most of that code, it makes sense to reconsider – for example, if you could replace most of your jQuery usage with native APIs like querySelector/querySelectorAll and your site only supports IE8+, it's quite reasonable to question whether you're getting enough benefit to be worth that download, or whether there's a way to defer anything which isn't used all of the time. For example, much as I love LeafletJS, I'll toss in a rel=prefetch header for it but won't load the script until I need to display a map.
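That defer-until-needed pattern can be sketched in a few lines. This is an illustrative helper, not Leaflet's API: the promise is cached so repeated calls share one download, and the actual DOM injection is passed in as a parameter so the caching logic works anywhere.

```javascript
// Sketch: defer a heavy library (e.g. a mapping script) until it's needed.
// The in-flight promise is cached so repeated calls share one download.
const scriptPromises = new Map();

function loadScriptOnce(url, inject) {
  // `inject` performs the actual load; in a browser it would append a
  // <script> tag, but it's a parameter here so the logic is testable.
  if (!scriptPromises.has(url)) {
    scriptPromises.set(url, inject(url));
  }
  return scriptPromises.get(url);
}

// Browser-side injector (illustrative):
// const injectTag = (url) => new Promise((resolve, reject) => {
//   const s = document.createElement('script');
//   s.src = url;
//   s.onload = resolve;
//   s.onerror = reject;
//   document.head.appendChild(s);
// });
// mapButton.addEventListener('click', () =>
//   loadScriptOnce('/js/leaflet.js', injectTag).then(showMap));
```

Combined with a prefetch hint, the bytes are likely already in the HTTP cache by the time the user asks for the map, so the deferred load is cheap.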
I find this specious: what modern web app isn't minifying/concatenating its JavaScript and CSS source? All of your concerns are easily remedied via a proper build procedure. And we're talking about, in relative terms, minuscule amounts of data for the web. 33 KB? C'mon now.
> I find this specious: what modern web app isn't minifying/concatenating its JavaScript and CSS source?
In that case, you should learn more about web performance: minification is a useful tool but it's not magic. If you add code to a bundle, you're almost certainly transferring too much data for the initial request — which increases the time before anything can run — and making caching less efficient, because any change to your JavaScript invalidates the entire cached object; if you serve it separately, caching works better but you'll incur the latency cost multiple times. Every site has to balance these factors against user capabilities and the site's performance targets.
To illustrate why you're being entirely too cavalier, consider Ilya Grigorik's excellent Velocity 2013 presentation detailing exactly what it takes to reliably deliver a rendered webpage in 1 second on mobile.
Note that his performance target is roughly 14KB of transfer, to deal with the way 3G performs in the real world. If you were thinking about including jQuery, you just blew that budget twice over: a minified, gzipped copy of jQuery 2.0.2 is 29KB. If you concatenated everything, you not only blew your transfer budget but you ensured that none of your JavaScript, even the parts which are entirely self-contained, executes within that performance target.
The point isn't that CDNs aren't useful (they're great), that asset packaging isn't good (it's a key tool), or even that jQuery is bloated, but rather that good engineers make decisions based on their performance goals and actual measured user performance, rather than flippantly saying “We use a CDN so the site will be fast!” (contra: healthcare.gov) or “everything is minified, so it'll be fast”.
The magic of CDNs is caching. So, if you are using, for instance, the URL "//code.jquery.com/jquery-2.0.3.min.js" on your site, and the user has already visited a bunch of sites which use that URL for their jQuery (highly likely), there is no re-download because the browser already has the DNS and the JS cached. Perhaps one extra hit to revalidate and get a 304 back (and sometimes not even that), so the thing about CDNs is the idea that when people hit your site, they already have the libraries you are using locally.
In this regard, you should exclude widely used resources that are on a CDN from your packaging/build process.
> So, if you are using, for instance, the URL "//code.jquery.com/jquery-2.0.3.min.js" on your site, and the user has already visited a bunch of sites which are using that URL for their jQuery (highly likely), it means that there is no re-download because the browser already has the DNS and JS cached.
Have you actually measured this? I've found that the results are a lot murkier, because not every site uses the same CDN or the same version of jQuery, so a user might have a bunch of different copies cached and still need to load the version you use, and many browsers – particularly on mobile – still have very small cache size limits, so the fact that they downloaded jQuery yesterday doesn't mean it's still cached when they visit your site again today. This becomes increasingly dubious for any library less popular than jQuery.
This also doesn't help as much as you think for latency: by the time the browser is requesting that copy of jQuery, it already has your site's DNS resolved and almost certainly has at least one connection already established to your webserver. Restarting those processes for another server adds a non-trivial amount of latency – more so for HTTPS – which is acceptable if you're going to be making many requests but might take longer than simply transferring something like jQuery over an existing connection, particularly for wireless users or anyone using SPDY. This is fine for something which doesn't block rendering or which can stream, but for something like CSS/JS in the critical path you really need real-world monitoring to see how well your CDN is actually performing.
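Back-of-envelope arithmetic makes the point. The round-trip counts below are typical for an uncached DNS lookup plus TCP plus a full TLS 1.2 handshake, and the RTT values are illustrative assumptions, not measurements:

```javascript
// Sketch: rough cost of opening a fresh HTTPS connection to a CDN,
// counted in round trips, versus reusing an already-open connection.
function newHttpsConnectionMs(rttMs) {
  const dns = 1; // DNS lookup (best case: one round trip)
  const tcp = 1; // TCP three-way handshake
  const tls = 2; // full TLS 1.2 handshake
  return (dns + tcp + tls) * rttMs;
}

// On a 3G-ish 200ms RTT that's ~800ms of setup before the first byte
// of jQuery moves – time an existing connection doesn't have to pay.
```

At mobile RTTs, connection setup alone can exceed the time to transfer a jQuery-sized payload over a connection you already have.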
OK, but trade that off against the latency of sending cookies and whatnot on every request to your own server (the CDN requests are cookie-free), and I think the difference is likely trivial.
If we look at the headers for jQuery 2.0.3, we get Expires: Thu, 31 Dec 2037 23:55:55 GMT, which means it effectively never expires – not until 2037, anyway.
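That header is just an HTTP date, so you can see how far out it reaches; a quick sketch (the 2013 reference point is an assumption about when these responses were being served):

```javascript
// Sketch: how far into the future does that Expires header reach?
const expires = new Date('Thu, 31 Dec 2037 23:55:55 GMT');

const msPerYear = 365.25 * 24 * 3600 * 1000;
// Measured from late 2013, when these CDN responses were current.
const yearsAhead = (expires.getTime() - Date.UTC(2013, 9, 1)) / msPerYear;
// roughly two and a half decades of cache lifetime
```

In practice that's "cache forever" – the limiting factor is the browser evicting the entry, not the header expiring.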
You make a good point that mobile won't have so much room to cache; however, I'd imagine this will improve as devices get more space and whatnot – on Android 4.4 with the latest Chrome Beta, I tried a few sites I'm using a CDN on and got 304s for pretty much all the CDN'd stuff.
> So what I'm saying really is I think that whilst you have good arguments, I think that things are continuing to get better and better :)
Yeah, I don't want to come across as a complete downer on this – the situation is easily better than it's ever been, with CDNs becoming not just available but cheap, distributed DNS a turnkey service, etc. Working on a global website for a while has definitely reminded me that performance/reliability work hits diminishing returns unless you have a near-Google budget.
> That said, once everyone's on LTE, perhaps we don't even have to care ;)