
How Service Workers Sped Up Our Website by 97.5%

Here’s how we made our website load 97.5% faster by using service workers, how we ensure users get the newest version every time, and how you can do it too.

Our website is more than a year old. It started from a few static pages and has grown into fully-fledged documentation with live examples of how to use our product. Our code uses code splitting heavily, and our product has a lot of static resources like icons and CSS files.

To top it all off, our documentation uses iframes to present our components, each loading its dependencies separately (yes… like a micro-frontend architecture…).

At some point, things started to break.

It started with our own dev environment. After a few reloads, it would just get stuck. We could live with that.

We couldn’t live, however, with a ticket from our consumers showing us how bad the situation really was…

We had to do something about it. This is how we solved it.

The Performance Boost

Let’s start from the end. Our main goal was to reduce the number of calls to the server. A secondary goal (or bonus, if you will) was speeding up the load time of pages in our app. Here’s an example of how well we did:

Before

In the image above, we can see the profile of the network requests before the change. At some points, a page load took 860ms – a huge amount of time! The average load time of a page was around 400ms. In the rightmost section, you can see the gray bar of the site trying to load the `all.css` file, which keeps going forever. This is the part where the site stops loading for the user, and refreshing doesn’t help (actually, it makes things worse…).

After

After the change, we see that each page loads really fast. We found pages that took less than 10ms to load. The average was around 10ms. That is a whopping 97.5% decrease in load time.

So… how did we do it?

The Chrome Connection Limit

We got lucky. Our consumers are developers, and one of the reporters tested the website in various browsers and hinted it happens only in Chrome.

I don’t know how common knowledge it is that Chrome limits the number of concurrent connections to a server to six. While that’s not surprising, what surprised us was the long timeout and the fact that the requests remain “live” even when refreshing or browsing to a different page.

Because we had almost a hundred requests going out on every page (we code-split to an extreme, it seems), it didn’t take long for Chrome to effectively shut down the connection to our documentation website for every Chrome user after just a few minutes of browsing.

How to Reduce the Number of Files?

Attempt #1: Remove Prefetch

If the problem is too many files, then we need to do something about it.

All in all, our documentation pages requested a long list of script files, one per component, and these were just a small part of our files manifest. Note that we also prefetched those files, trying to speed up the load time of the next pages.

Our first step was to stop the prefetch. While this reduced the number of requests, it did not really help, because the prefetch happened only when the network was idle, so there wasn’t much effect. We needed to reduce the number of scripts requested.

Attempt #2: Bundle All of the Files into One Big Bundle

In our project we use `rollup` to bundle our files. Our configuration is meant to code split everything and let the consumers use their own bundlers to code split, bundle, and tree shake.

In this step, we just went over all of the components, created a barrel file, and bundled all of them into one big `vivid-components.js` file we used instead of all the other script tags.

This is the commit if it is of interest: https://github.com/Vonage/vivid-3/pull/1208/commits/83d9b8fe1a2ce9452bcabb40fdc6123efef4c777
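For illustration, a minimal rollup config for this approach might look like the sketch below. The file names are made up; it is not the actual configuration from the commit above.

// rollup.config.js - minimal sketch; file names are illustrative.
export default {
  // Barrel file that re-exports every component.
  input: 'src/components/index.js',
  output: {
    // One big bundle instead of one file per component.
    file: 'dist/assets/scripts/vivid-components.js',
    format: 'es',
  },
};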

It helped on most pages, but remember the iframes? They still brought a lot of stuff with them: duplicates of the files already loaded by the top page…

How to Cache with a Service Worker?

A service worker is a layer between our app and the network. It can listen to all requests coming in and out of the app and handle them.

In our case, we wanted to intercept the requests, cache the responses, and return the cached response on subsequent requests.

How to Register a Service Worker?

The first step is to register the service worker in the client:

(async function () {
  const registration = await navigator.serviceWorker.register(
    '/sw.js',
    {
      scope: '/',
    }
  );
})();

A service worker can only intercept requests within its scope, which by default is its containing folder. This is why I put it at the root of my project.

In our project, the actual file does not live in the root; it is copied there in our build process, which gives us a nice development experience while still allowing us to intercept requests from the root. You could use the `Service-Worker-Allowed` HTTP header to serve it from a different folder, but with the “build to root” trick, I had no need for it.
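For completeness, here is a sketch of that alternative. The path is hypothetical (it is not where we ship the file), and it only works if the server responds to the worker’s URL with the `Service-Worker-Allowed` header:

// Hypothetical alternative (not our setup): the worker stays in a sub-folder,
// and the server must answer /assets/scripts/sw.js with the header
// `Service-Worker-Allowed: /` for the wider scope to be accepted.
(async function () {
  await navigator.serviceWorker.register('/assets/scripts/sw.js', {
    scope: '/',
  });
})();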

The Service Worker

Our service worker looks like this:

const addResourcesToCache = async (resources) => {
  const cache = await caches.open('vivid-cache');
  await cache.addAll(resources);
};

const putInCache = async (request, response) => {
  const cache = await caches.open('vivid-cache');
  await cache.put(request, response);
};

const cacheFirst = async ({ request, preloadResponsePromise, fallbackUrl }) => {
  const responseFromCache = await caches.match(request);
  if (responseFromCache) {
    return responseFromCache;
  }

  const preloadResponse = await preloadResponsePromise;
  if (preloadResponse) {
    console.info('using preload response', preloadResponse);
    await putInCache(request, preloadResponse.clone());
    return preloadResponse;
  }

  try {
    const responseFromNetwork = await fetch(request);
    await putInCache(request, responseFromNetwork.clone());
    return responseFromNetwork;
  } catch (error) {
    const fallbackResponse = await caches.match(fallbackUrl);
    if (fallbackResponse) {
      return fallbackResponse;
    }
    return new Response('Network error happened', {
      status: 408,
      headers: { 'Content-Type': 'text/plain' },
    });
  }
};

const enableNavigationPreload = async () => {
  if (self.registration.navigationPreload) {
    await self.registration.navigationPreload.enable();
  }
};

self.addEventListener('activate', (event) => {
  event.waitUntil(enableNavigationPreload());
});

self.addEventListener('install', (event) => {
  event.waitUntil(
    addResourcesToCache([
      './',
      './index.html',
      '/assets/styles/core/all.css',
      '/assets/scripts/vivid-components.js',
      '/assets/scripts/live-sample.js',
    ])
  );
});

self.addEventListener('fetch', (event) => {
  event.respondWith(
    cacheFirst({
      request: event.request,
      preloadResponsePromise: event.preloadResponse,
      fallbackUrl: './assets/images/vivid-logo.jpeg',
    })
  );
});

We have two utility functions: `addResourcesToCache`, which adds a list of resources to the cache, and `putInCache`, which puts a request and its response into the cache.

They both use the global `caches` object, which gives us access to the CacheStorage API.

`cacheFirst` is where the magic happens:

  1. It tries to get the response from the cache. If it finds one, it returns the cached response.
  2. If not, it tries the navigation preload response. If that works, we’re good: we cache it and return the preloaded response.
  3. If there is no preload response, we move on to requesting from the network (i.e. the server), cache the response, and return it.
  4. If all of that fails, we return a cached fallback image, or a plain-text error response if even the fallback isn’t cached (but why should we fail?).

Now, a service worker has a lifecycle:

  1. Registration (we’ve been through that)
  2. Installation
  3. Activation

In our service worker, we listen to the installation phase and add our main resources to the cache.

self.addEventListener('install', (event) => {
  event.waitUntil(
    addResourcesToCache([
      './',
      './index.html',
      '/assets/styles/core/all.css',
      '/assets/scripts/vivid-components.js',
      '/assets/scripts/live-sample.js',
    ])
  );
});

Notice the `waitUntil` utility we get on the event object. It helps us avoid race conditions: it extends the event’s lifetime until the async operations finish, so the browser doesn’t terminate the worker mid-way.

Then, in the `activate` phase, we enable navigation preload (again, with `waitUntil`).

const enableNavigationPreload = async () => {
  if (self.registration.navigationPreload) {
    await self.registration.navigationPreload.enable();
  }
};

self.addEventListener('activate', (event) => {
  event.waitUntil(enableNavigationPreload());
});

The final step is to add a listener to `fetch`. This listener intercepts the requests and allows us to handle them using our `cacheFirst` function:

self.addEventListener('fetch', (event) => {
  event.respondWith(
    cacheFirst({
      request: event.request,
      preloadResponsePromise: event.preloadResponse,
      fallbackUrl: './assets/images/vivid-logo.jpeg',
    })
  );
});

Notice the `respondWith` utility. It actually does what it says – given the request, we can return any response. In this case, we return the result of `cacheFirst`.

How to Handle Versions in a Service Worker?

There might come a time when you’d like to update a service worker’s version. In our case, that is whenever we release a new version of our library.

For this, we must state the version in our Service Worker’s file, create a versioned cache and delete the old cache.

How to state the version in a Service Worker?

Adding a version is easy: `const VERSION = '3.17.0';`

This can be changed manually on every release.

In our project, for example, we use rollup to bundle so we did the following “trick”:

  1. We set the version this way: `const VERSION = 'SW_VERSION';`
  2. During the build, we extract the version from our `package.json`
  3. We use rollup’s replace plugin to set the version on every build (see the sketch below).

You can see our setup here.
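As a rough sketch of that trick (assuming `@rollup/plugin-replace`; the input and output paths are illustrative, and our real setup lives in the linked repository):

// rollup.config.js - sketch of the version-replacement trick.
import replace from '@rollup/plugin-replace';
import { readFileSync } from 'fs';

const pkg = JSON.parse(readFileSync('./package.json', 'utf-8'));

export default {
  input: 'src/sw.js',
  output: { file: 'dist/sw.js', format: 'es' },
  plugins: [
    replace({
      preventAssignment: true,
      // Turns `const VERSION = 'SW_VERSION';` into `const VERSION = '3.17.0';`
      SW_VERSION: pkg.version,
    }),
  ],
};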

How to create a versioned cache in a Service Worker?

If you noticed, the `caches.open` method accepts a string:

`const cache = await caches.open(VERSION);`

It expects a cache name or id we can later reference.

In many cases, you’ll need to update the service worker or your website’s cache. For instance, when you bump a version of a library, add a new section to the page etc.

Using the version as the id allows us to address each version’s cache, serve content from the current one, and delete old ones.
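For example, the cache helpers from earlier would open the versioned cache instead of a hard-coded name. A minimal sketch, assuming the build-time replacement described above:

// Versioned variants of the earlier helpers: the cache name is now the
// library version, so every release gets its own cache.
const VERSION = 'SW_VERSION'; // replaced with e.g. '3.17.0' at build time

const addResourcesToCache = async (resources) => {
  const cache = await caches.open(VERSION);
  await cache.addAll(resources);
};

const putInCache = async (request, response) => {
  const cache = await caches.open(VERSION);
  await cache.put(request, response);
};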

How to delete obsolete cache in a Service Worker?

Now that we have a versioned cache, we can go ahead and delete the obsolete ones.

Let’s create a function `removeOldCache`:

async function removeOldCache() {
  const keys = await caches.keys();
  await Promise.all(
    keys
      .filter((key) => key !== VERSION)
      .map((key) => caches.delete(key))
  );
  await self.clients.claim();
}

The function goes over all the cache keys and deletes every cache whose key is not the newly activated version. Once this is done, we use the `self.clients.claim` method to tell the browser that our new Service Worker now controls all of the open tabs.

We call this function during activation:

self.addEventListener('activate', (event) => {
  event.waitUntil(removeOldCache());
  event.waitUntil(enableNavigationPreload());
});

Notice that this cache storage is shared by all the service workers on our origin. In our case, we can safely remove everything, but you might have more than one cache, and might consider adding a suffix to the version so you remove only the caches you want.

An example of that is a case in which you’d want separate caches for HTML and for server requests. The HTML cache would be cleared when you change client-side related areas, while the server requests cache would be cleared on some other schedule.
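A sketch of that idea, with made-up cache names: each concern gets its own prefix, and the cleanup only touches the prefix it owns.

// Hypothetical example: separate caches for pages and for API responses.
// VERSION is the library version defined earlier in the service worker.
const PAGE_CACHE = `pages-${VERSION}`; // cleared when client-side code changes
const API_CACHE = 'api-v1';            // cleared on its own schedule

async function removeOldPageCaches() {
  const keys = await caches.keys();
  await Promise.all(
    keys
      .filter((key) => key.startsWith('pages-') && key !== PAGE_CACHE)
      .map((key) => caches.delete(key))
  );
}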

Why Didn’t My Service Worker Update?

New Service Workers have a waiting period: a new worker stays in the waiting state until the pages controlled by the old one are closed, so it might take a while to replace it. We can skip this waiting period by calling `self.skipWaiting()` in our install phase.

Now our Service Worker will update immediately when we change it (e.g., upload a new version of the library).
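Wired into the install listener from earlier, that looks like this (a minimal sketch):

self.addEventListener('install', (event) => {
  // Activate this worker as soon as installation finishes,
  // instead of waiting for tabs controlled by the old worker to close.
  self.skipWaiting();

  event.waitUntil(
    addResourcesToCache([
      './',
      './index.html',
      '/assets/styles/core/all.css',
      '/assets/scripts/vivid-components.js',
      '/assets/scripts/live-sample.js',
    ])
  );
});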

Summary

Service Workers are a very powerful tool for web developers. They allow us to control the communication between the server and the client. Here we saw the classic example of caching content, but there are many more use cases.

For instance, if we cache the server’s responses and return them, the user can keep using our application while offline.

This caching example shows our solution to our problem – and it indeed solves it. But Service Workers answer many other needs in development. I’d be happy to hear yours 🙂

Thanks a lot to Oria Biton and Miki Ezra Stanger for the kind and thorough review of this article

