It’s not uncommon to use 3rd party scripts from time to time, perhaps for analytics, social sharing, or other services. One of the benefits is that once we’ve linked to the script, the 3rd party can make all the updates they want and we’ll immediately be using the updated code – we don’t have to download or update anything ourselves. But relying on a 3rd party also means we give up control over how that external file is served (how long it’s cached, how it’s compressed, and so on). This may not be a big deal, but if you want granular control over as many aspects of your page’s performance as possible, there are times when it would be nice not to rely on external scripts.

In this post, we’ll cover a simple way to not only keep these kinds of scripts local, but also make sure they stay up-to-date. I heard about this approach from my colleague Erik, who had implemented something similar on his site, so I tried out the concept this week and am sharing the steps I took to implement it.

Using a Local Version

1. Make Local Copy

The first step is to make an actual copy of the file you need to link to and then store it locally. For instance, with Google Analytics, you could make a local copy from the terminal using the curl command:

$ curl https://www.google-analytics.com/analytics.js > analytics.js

Although there are other ways to save a file to your server, doing so from the command line will make it that much easier down the road when it comes time to keep the file up-to-date.

2. Change URL References

Next, we need to update references to this script to use the new location. This could be changing the src attribute of a <script> tag:

<script src="/path/to/my/file.js"></script>

Or, it may mean changing some inline JS to reference the new file. For instance, Google Analytics code could be updated like this:

})(window,document,'script','/path/to/my/analytics.js','ga');

3. Update Local File Regularly

After testing to make sure that this works, the final thing that needs to be done is to ensure that this file gets updated regularly. If it doesn’t, we’d be giving up one of the benefits of referencing an external file – that the 3rd party would take care of updating the files as appropriate, and we wouldn’t have to worry about whether we have the latest version or not.

The easiest way to do this is to set up a cron job. On the command line, type

$ crontab -e

to edit the cron jobs on your system. At the bottom of the file, you can add a simple task to run the same command you did to get the original file at a specific interval.

0 2 * * * curl https://www.google-analytics.com/analytics.js > /path/to/my/analytics.js

The above entry runs the curl command every morning at 2 a.m. If you have several files to refresh at the same time, you could put all of the commands into a single shell script and have one cron job fire that script at the appropriate time.
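As a sketch of that shell-script approach (the URLs and file paths here are placeholders to swap for your own), a small updater might look like this; it downloads to a temporary file first so a failed request never clobbers the working copy:

```shell
#!/bin/sh
# update-scripts.sh -- refresh local copies of third-party scripts.

update() {
  url="$1"
  dest="$2"
  tmp="${dest}.tmp"
  # --fail makes curl exit non-zero on HTTP errors instead of saving the error page.
  if curl --fail --silent --show-error "$url" -o "$tmp"; then
    mv "$tmp" "$dest"   # replace the old copy only on success
  else
    rm -f "$tmp"        # keep the existing copy if the download failed
  fi
}

update "https://www.google-analytics.com/analytics.js" "/path/to/my/analytics.js"
# update "https://example.com/some-widget.js" "/path/to/my/some-widget.js"
```

With this in place, the crontab needs just one entry, e.g. `0 2 * * * /path/to/update-scripts.sh`.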

Other Benefits

One benefit of having a local copy of an external script is that we now have complete control over the details of how it is served. Another is that it can reduce or eliminate a potential single point of failure (SPOF): if the 3rd party’s host is slow or unreachable, a blocking script tag can hold up the rest of the page, whereas a local copy is only unavailable when our own site is.

Loading a local script instead of an external one also reduces the number of domains we connect to, which lets us take advantage of HTTP/2’s ability to reuse an existing connection to our own server rather than opening a new one.

Test and See

If performance is the primary reason for using this technique, the amount it will help will vary from site to site and situation to situation. So, the best thing to do, if you go down this route, is to test it out and see if there’s a performance boost.
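One rough way to compare, before reaching for a full testing tool, is curl’s built-in timing variables (a sketch only, not a substitute for testing in a real browser; the URLs are placeholders):

```shell
#!/bin/sh
# Print the total fetch time for a URL using curl's -w (write-out) variables.
fetch_time() {
  # Prints "n/a" if the request fails entirely.
  t=$(curl -o /dev/null --silent -w '%{time_total}' "$1") || t="n/a"
  printf '%s' "$t"
}

# Run each a few times and compare; single requests are noisy.
echo "third-party: $(fetch_time https://www.google-analytics.com/analytics.js)s"
# echo "local copy:  $(fetch_time https://example.com/path/to/my/analytics.js)s"
```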

Even if there’s not, there are still other reasons to consider it. Ultimately, having local copies of external scripts gives us more control, which can be helpful to have. We may not always need it, or want it, but it’s nice to have a simple technique for gaining more of it when we do.