JavaScript plays a major role in today’s web, with many sites relying heavily on it. It’s usually assumed that JavaScript will be available in the browser. And while that may be true most of the time, there are still exceptions. Stuart Langridge illustrates this in his brilliant post, “Everyone has JavaScript, right?”. The absence of JavaScript doesn’t happen all the time, or even most of the time, but it does happen, and that’s worth considering when building for the web.

Progressive enhancement

Assuming that JavaScript may fail (or be absent) fits within the broader strategy known as progressive enhancement. Progressive enhancement focuses on first making sure the page is accessible and functional for all users, regardless of the browser. Even if JavaScript and CSS are absent, a progressively enhanced page should still render. The links and forms should still work. Content should still be accessible. The experience may not be beautiful, but it should at least be functional. Then, once the base content and functionality are in place, we can layer on extra styling (CSS) and functionality (JS) to enhance the user experience.

The base level (when CSS and JS are absent) will typically be quite performant. But there are situations where the higher layers can actually help with performance. Lazy loading content, for instance, can have a huge impact on page weights and load times. But if progressive enhancement is our aim, we should also consider what will happen if lazy loading isn’t part of the equation. Will the images be visible? Will there be ways to get to the extra content?
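To make that concrete, here’s a minimal sketch of JavaScript-based lazy loading using IntersectionObserver. The data-src attribute, the lazy class, and the file names are assumptions for this example, not any particular library’s API:

```html
<!-- The real URL lives in data-src; only a tiny placeholder loads up front -->
<img class="lazy" src="placeholder.gif" data-src="photo-large.jpg" alt="A large photo">

<script>
  // Swap in the real image once it scrolls near the viewport
  const observer = new IntersectionObserver((entries) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        const img = entry.target;
        img.src = img.dataset.src;
        observer.unobserve(img);
      }
    });
  });

  document.querySelectorAll('img.lazy').forEach((img) => observer.observe(img));
</script>
```

Notice that if the script never runs, the real image never loads at all. That’s exactly where the fallback question comes in.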

When <noscript> alternatives hurt

As an example, let’s consider lazy loaded images. One common approach is to wrap fallback images in <noscript> tags. That way, if JavaScript works as intended, the images are lazy loaded. But if JavaScript is absent, the user will still be able to see the images in the <noscript> tags. I’ve done this many times, and it’s a solid option. Unfortunately, though, it’s not foolproof. And if not implemented thoughtfully, it may have some unintended consequences.

[Note: this approach may end up being more “graceful degradation” than “progressive enhancement,” depending on how it’s done.]
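In markup, the pattern looks something like this (again, the data-src convention and file names are illustrative assumptions):

```html
<!-- Lazy loaded when JavaScript runs... -->
<img class="lazy" src="placeholder.gif" data-src="hero.jpg" alt="Hero image">

<!-- ...and shown as a plain image when it doesn't -->
<noscript>
  <img src="hero.jpg" alt="Hero image">
</noscript>
```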

Recently, Tim Kadlec did some research on the potential effect of Google’s NOSCRIPT intervention (Chrome was beginning to turn off JavaScript for Android users under certain conditions). In his post, he lists the impact that this intervention would have on some popular sites. In general, it greatly improved (i.e. decreased) the weight of the pages. But there were a couple of glaring exceptions. The biggest one was the impact on The Verge. Their home page went from around 3MB in page weight with JavaScript to around 68MB without it. That’s over a 2,000% (more than 20x) increase! And it raises the question: why?

They were using JavaScript to lazy load images, and were using <noscript> tags to designate fallback images. But this in itself wasn’t the problem. The problem was that there were a lot of these fallback images, and many of them were quite large. So when JavaScript was disabled, all of those large images loaded right away.
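Multiplied across a long page, the fallback markup effectively becomes a second, eager-loading copy of every image. Something like this (hypothetical file names), repeated dozens of times:

```html
<!-- With JavaScript off, every one of these downloads immediately -->
<noscript><img src="story-01-full.jpg" alt="Story 1 lead image"></noscript>
<noscript><img src="story-02-full.jpg" alt="Story 2 lead image"></noscript>
<noscript><img src="story-03-full.jpg" alt="Story 3 lead image"></noscript>
```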

Preparing for failure

This case serves to highlight the importance of thinking carefully about non-JavaScript situations. How will our page perform if JavaScript isn’t part of the equation? Perhaps we’ve implemented lazy loading but haven’t considered what would happen if that lazy loading doesn’t take place. Or we do think about what would happen, so we wrap the fallback images in <noscript> tags, “just in case.” But in doing so, we fail to take into account how many images are on the page, and what they would add up to if they all loaded at once.

In these kinds of cases, it’s worth seeing if we need to adjust anything for non-JavaScript scenarios. Do the dimensions of any fallback images need to change? Should there be an alternative layout for the page? Should there be some kind of pagination system by default? That would allow users to view additional content without weighing down the initial load. And if JavaScript did end up being functional (which is usually the case), it could remove the pagination and lazy load the extra content instead.
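One way to sketch that last idea: serve a real “next page” link by default, and let JavaScript upgrade it into an in-place loader when it runs. The URL, the next-page id, and the assumption that articles live in a <main> element are all illustrative, not prescriptive:

```html
<!-- A plain link to the next page works with or without JavaScript -->
<a id="next-page" href="/articles?page=2">More articles</a>

<script>
  // If JavaScript runs, fetch the next page in place instead of navigating
  const link = document.getElementById('next-page');
  link.addEventListener('click', async (event) => {
    event.preventDefault();
    const response = await fetch(link.href);
    const doc = new DOMParser().parseFromString(await response.text(), 'text/html');

    // Append the next page's articles to the current list
    document.querySelector('main').append(...doc.querySelectorAll('main > article'));

    // Point the link at the page after that, or remove it if we're done
    const next = doc.getElementById('next-page');
    if (next) {
      link.href = next.href;
    } else {
      link.remove();
    }
  });
</script>
```

This way, the no-JavaScript baseline is a small paginated page, and the enhanced version does the heavy lifting only when the script actually runs.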

When used judiciously, JavaScript can be very helpful in speeding things up for our users. Lazy loading is a prime example. But if we care about performance, it’s also worth considering the performance of the page when JavaScript fails. Because failure is bound to happen. And when it does, hopefully we’ve prepared appropriately.

Resources