In this post I take a look at service workers in JavaScript. I will show how to use a service worker to support offline loading of an Angular application.

In my opinion, the main benefit of service workers is the ability to serve http requests while offline. By using a service worker as an http proxy you can build up a cache of http responses. Future requests for the same resources can then be served from that cache instead of going over the network again.

How is this different from regular browser caching?

It’s true that the browser cache gives us some of the same performance benefits, but it doesn’t give you the same granular control over what gets cached.

When using service workers you can even cache the initial html document request. This is huge since it allows the entire application to work offline once it’s fully cached.


Let’s take a look at how I used a service worker to cache my Angular application. The application used for this demo is the Closure compiled application from my previous article.

Since service workers are not yet supported in all browsers, the first step is feature detection to ensure support before registering one.

if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/sw').then(function(registration) {
      console.log('ServiceWorker registration successful with scope: ', registration.scope);
    }, function(err) {
      console.log('ServiceWorker registration failed: ', err);
    });
  });
}

Once support has been detected, the browser loads the registered script (served from /sw above), which controls the behavior of the service worker.

var CACHE_NAME = 'site-cache-v1';

self.addEventListener('fetch', function(event) {
  event.respondWith(
    caches.match(event.request)
      .then(function(response) {
        // Cache hit - return the cached response
        if (response) {
          return response;
        }

        var fetchRequest = event.request.clone();

        return fetch(fetchRequest).then(function(response) {
          // Check if we received a valid response
          if (!response || response.status !== 200 || response.type !== 'basic') {
            return response;
          }

          // A response is a stream and can only be consumed once, so clone it:
          // one copy goes to the cache, the other back to the browser.
          var responseToCache = response.clone();

          caches.open(CACHE_NAME)
            .then(function(cache) {
              cache.put(event.request, responseToCache);
            });

          return response;
        });
      })
  );
});

The loaded script primarily defines the caching behavior of the service worker. It intercepts every http request and caches successful same-origin responses; subsequent requests for the same resource are then served from the cache.

You can find more detailed information about how this works in this great Google Developer article.
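One detail worth noting is the versioned cache name ('site-cache-v1'). When you deploy new assets, you can bump the version and delete stale caches in an activate handler. This is a common companion pattern rather than part of the demo's code; here is a minimal sketch, with the filtering pulled into a plain helper function:

```javascript
var CACHE_NAME = 'site-cache-v1';

// Return the cache names that are no longer current.
function staleCaches(cacheNames) {
  return cacheNames.filter(function(name) {
    return name !== CACHE_NAME;
  });
}

// Register the cleanup only when running inside a service worker context.
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
  self.addEventListener('activate', function(event) {
    event.waitUntil(
      caches.keys().then(function(cacheNames) {
        return Promise.all(
          staleCaches(cacheNames).map(function(name) {
            return caches.delete(name);
          })
        );
      })
    );
  });
}
```

With this in place, deploying a new version is just a matter of changing CACHE_NAME (e.g. to 'site-cache-v2'); old caches are swept on the next activation.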

Now, let’s try to load the application here.

Notice that I am loading the application over https. This is important. In fact service workers will only work over https (with an exception for localhost during development). This is a necessary security measure to ensure that no one has tampered with the service worker while in transit over the network.

Once the application is loaded for the first time, try throttling the network, or even go offline. If your browser supports service workers, the application will continue to work.

Take a look at your browser's network tab. You should see requests served from the service worker cache instead of actual http requests. This leads to much faster response times than fetching the same resources over the network.
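You can also inspect the cache contents directly from the browser console using the Cache Storage API. Here is a small sketch (the cache name matches the one in the worker script, and the url-mapping helper is just for readability):

```javascript
var CACHE_NAME = 'site-cache-v1';

// Map cached Request objects to their URLs.
function cachedUrls(requests) {
  return requests.map(function(request) {
    return request.url;
  });
}

// `caches` is only available in secure browser contexts.
if (typeof caches !== 'undefined') {
  caches.open(CACHE_NAME)
    .then(function(cache) { return cache.keys(); })
    .then(function(requests) {
      cachedUrls(requests).forEach(function(url) {
        console.log('cached:', url);
      });
    });
}
```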

Future Improvements

The demo application makes a few ajax requests on demand. With the current implementation those features have to be manually loaded by visiting the corresponding nav points (Lazy loaded Treeview and Redux). As a potential improvement I could pre-load those data requests to make the application work completely offline once it's loaded for the first time.
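One way to pre-load those requests is to fetch and cache them during the service worker's install phase with cache.addAll. A minimal sketch follows; the URLs in PRECACHE_URLS are hypothetical placeholders, since the real endpoints depend on the application:

```javascript
var CACHE_NAME = 'site-cache-v1';

// Hypothetical data endpoints for the Treeview and Redux features.
var PRECACHE_URLS = [
  '/api/treeview-data',
  '/api/redux-data'
];

// Register the install handler only inside a service worker context.
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
  self.addEventListener('install', function(event) {
    event.waitUntil(
      caches.open(CACHE_NAME).then(function(cache) {
        // addAll fetches each URL and stores the responses up front,
        // so the data is available offline before the user visits those views.
        return cache.addAll(PRECACHE_URLS);
      })
    );
  });
}
```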

That said, once those features have been loaded once, their ajax requests are cached and will work offline from then on.

Fixing this is a topic for another day :-)