Recently I was reading Daniel Stenberg’s detailed write-up on HTTP/2, and what caught my eye was the potential implications this new protocol may have for the way we optimize websites. The good news is that HTTP/2 won’t require any changes to existing sites or applications in order for them to work with the new protocol. However, some adjustments may need to be made to take full advantage of the new features it has to offer.
Current Optimization Techniques
Due to the inherent limitations of HTTP/1.x, there are a number of techniques we’ve become adept at using to optimize performance. These include:
- Spriting images
- Using inline images
- Concatenating assets
- Using different domains or sub-domains to load assets (also known as “sharding”)
These are done primarily to either reduce the overall number of requests (first three), or to increase the number of simultaneous requests (last one).
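To make the request-reduction side concrete, here’s a minimal sketch of an asset-concatenation build step in Python; the file names are hypothetical, and real build tools add minification, source maps, and cache-busting on top of this:

```python
from pathlib import Path

def concatenate_assets(sources, bundle_path):
    """Join several CSS (or JS) files into one bundle so the page
    makes a single request instead of one request per file."""
    bundle = "\n".join(Path(src).read_text() for src in sources)
    Path(bundle_path).write_text(bundle)
    return bundle

# Hypothetical usage: three stylesheets become one bundle.css
# concatenate_assets(["reset.css", "layout.css", "theme.css"], "bundle.css")
```

Under HTTP/1.x, every one of those small files would cost a round trip (or tie up one of the browser’s limited parallel connections), which is exactly why bundling pays off.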
HTTP/2, though, changes the way information is sent between the client and server. For instance, it’s designed to:
- Keep connections open for re-use
- Allow the server to push content before the client requests it
- Multiplex requests and responses
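Which protocol a given connection actually uses is negotiated up front, over TLS, via ALPN: the client advertises what it speaks and the server picks. As a rough sketch of the client side using Python’s standard ssl module (this only builds the context; the handshake itself is left out):

```python
import ssl

def make_h2_capable_context():
    """Build a TLS context that advertises HTTP/2 via ALPN,
    falling back to HTTP/1.1 if the server doesn't speak h2."""
    context = ssl.create_default_context()
    # Offer protocols in order of preference; the server picks one.
    context.set_alpn_protocols(["h2", "http/1.1"])
    return context

# After wrapping a socket with this context and completing the
# handshake, selected_alpn_protocol() on the wrapped socket reports
# what was agreed on: "h2", "http/1.1", or None.
```

This negotiation is also what makes a graceful fallback possible: a server that has never heard of HTTP/2 simply ends up on HTTP/1.1.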
And here’s the important thing: because of these changes, some of the techniques currently used to optimize sites over HTTP/1.x may no longer be as useful under HTTP/2, and may even have a negative impact on performance. For instance, instead of opening multiple connections to the same server, it may end up being more efficient to pass all of the data over a single connection. It will take testing, of course, to determine the trade-offs and the potential benefits of making these kinds of adjustments, but it’s likely that some of our standard approaches to front-end optimization will need to change.
We still have some time before HTTP/2 sees widespread adoption, but it is starting to gain traction. As of now, Firefox and Chrome already support it, and IE 11 supports it on Windows 10 (beta). As for servers, IIS supports it in Windows 10 (beta), nginx is planning to support it later this year, and there is the mod_h2 module to add support to Apache. (There is an up-to-date list of implementations on the HTTP/2 GitHub wiki.)
In the meantime, the challenge will be how to deliver an optimal experience to both sets of visitors–those with HTTP/2-enabled browsers, and those that can only support HTTP/1.x–especially if the ways we optimize for each may end up being different.
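One way that split could play out, sketched as a pure function in Python. The asset names are hypothetical, and keying the decision off the negotiated protocol is an assumption for illustration, not a settled recommendation:

```python
def asset_urls(negotiated_protocol):
    """Hypothetical sketch: serve one concatenated bundle to
    HTTP/1.x clients, but individual files to HTTP/2 clients,
    where multiplexing makes many small requests cheap."""
    if negotiated_protocol == "h2":
        # Separate files cache independently and can all travel
        # over the single multiplexed connection.
        return ["reset.css", "layout.css", "theme.css"]
    # HTTP/1.x: fewer requests still wins, so ship the bundle.
    return ["bundle.css"]
```

Whether the per-file approach actually beats the bundle for HTTP/2 visitors is precisely the kind of thing that will need measuring rather than assuming.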
If we start serving our sites over HTTP/2, and decide to adjust our optimization techniques to take better advantage of the protocol, what kind of impact will this have on those visitors whose browsers can’t support it? Will the changes that optimize for one end up negatively affecting the other, and vice versa? And what will it feel like if we do have to scrap some of the techniques we’ve relied on for so long, because we find out they’re doing more harm than good?