Using JavaScript at its best for a high-performing site

It is not always easy to combine the magic triad of Google, JavaScript and SEO: the language is much appreciated by developers for all the opportunities it offers, but it often proves tricky for the search engine's crawler, which is nonetheless making big strides in this area, mainly with its evergreen Googlebot. The web.dev blog, though, comes to the rescue with a practical guide to JavaScript optimization on sites, very useful for limiting errors that could negatively affect the performance of the project.

6 interventions to improve JavaScript performance

A work of many hands, the document describes six areas of intervention that help limit errors and possible obstacles to the site's performance (both in general and on the search engine). More specifically, the list of suggestions focuses on:

  • Apply instant loading with the PRPL pattern.
  • Preload critical resources to improve loading speed.
  • Reduce JavaScript payloads with code splitting.
  • Remove any unused code.
  • Minify and compress network payloads.
  • Serve modern code to modern browsers for faster page loading.

PRPL pattern for instant loading

It was an article by Houssein Djirdeh, the same Googler who already guided us through the optimization of site images, that opened the way for further tips and ideas: this time we talk about the PRPL pattern and instant loading, two elements that can really make a difference in terms of site speed.

The PRPL acronym describes a pattern for loading web pages and making them interactive, built on four techniques that can be applied together or independently to achieve performant results:

  1. Push (or preload) – Pushes (or preloads) the most important resources.
  2. Render – Renders the initial route as quickly as possible.
  3. Pre-cache – Pre-caches the remaining routes and assets.
  4. Lazy load – Lazily loads other routes and non-critical assets on demand.

Using Google Lighthouse to check the page

The first step in identifying opportunities to improve a page with the PRPL techniques is to run Google Lighthouse: if a specific resource is requested and retrieved with significant delay, the tool flags a failed audit we can act on.

Preload is a declarative fetch request that tells the browser to retrieve a resource as soon as possible. It is possible to flag critical resources by adding a <link> tag with rel="preload" in the head of the HTML document, so that the browser can set the proper priority level and try to download the resource sooner, avoiding delays to the window.onload event.
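In practice, a minimal sketch of the markup looks like this (the file names are placeholders):

```html
<head>
  <!-- Fetch critical assets early; "as" tells the browser the resource
       type so it can assign the right download priority. -->
  <link rel="preload" as="script" href="critical.js">
  <!-- Preloaded fonts always require the crossorigin attribute. -->
  <link rel="preload" as="font" type="font/woff2" href="font.woff2" crossorigin>
</head>
```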

How to avoid delaying First Paint

Lighthouse raises a warning whenever it finds resources delaying First Paint, the moment in which the site actually displays pixels on the screen: there is no single correct solution for reducing First Paint, and inlining styles and rendering on the server side may be worth it if the benefits outweigh the trade-offs.

In any case, a recommended approach is to inline critical JavaScript and defer the rest with async, to inline the critical CSS used above the fold, and to server-side render the initial HTML of the page: by doing this, the user immediately sees content while the scripts are still being retrieved, parsed and executed. It can, however, significantly increase the HTML payload and hurt Time to Interactive, that is, the time the app takes to become interactive and respond to the user's input.
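A minimal sketch of that approach, with placeholder file names:

```html
<head>
  <!-- Small critical script inlined so it cannot block First Paint -->
  <script>
    /* critical bootstrap code goes here */
  </script>
  <!-- Everything else is fetched without blocking HTML parsing -->
  <script src="non-critical.js" async></script>
</head>
```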

Tips to speed up the site

By working as a proxy, service workers can retrieve resources directly from the cache rather than from the server on repeat visits: this not only allows users to use the application while offline, but also makes page loading much faster on those return visits. The whole process can be simplified by using a third-party library, unless you have more complex caching requirements than a library can cover.
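Registering the worker is the simple part; here is a minimal sketch, where "/sw.js" is a placeholder path and the caching strategy lives inside the worker itself (for example via a library such as Workbox):

```js
// Register a service worker so repeat visits can be served from cache.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker
      .register('/sw.js')
      .catch((error) => console.error('Service worker registration failed:', error));
  });
}
```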

The last step of this first "lesson" is lazy loading: large JavaScript payloads are particularly costly because of the time the browser spends parsing and compiling them, but splitting the bundle and loading the resulting chunks lazily, on demand, lets you send a smaller JavaScript payload that contains only the code needed when the user initially loads the app.

Preloading critical assets

Preloading critical assets guarantees that the most important resources are retrieved and downloaded first by the browser. When a web page opens, the browser requests the HTML document from a server, parses its content and sends separate requests for each referenced resource; developers already know every resource the page needs and which ones matter most, so they can request critical resources in advance and speed up the whole loading process.

Preloading works for resources that the browser would normally discover late, and to which we now assign a priority instead; to mark a resource for preload we add a <link> tag with rel="preload" in the head of the HTML document, so that the browser caches the resource and makes it immediately available when needed.

Differences between preload, preconnect and prefetch

Unlike preconnect and prefetch, for instance, which are hints the browser acts on as it sees fit, preload is mandatory for the browser. Modern browsers are already quite good at choosing the right priority for resources, which is why preload should be used sparingly: apply it only to the most important resources and always verify the actual impact with real tests.
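Side by side, the three hints look like this (URLs and file names are illustrative):

```html
<!-- preload: a mandatory instruction, for critical resources of this page -->
<link rel="preload" as="style" href="critical.css">

<!-- preconnect: a hint to open an early connection to another origin -->
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

<!-- prefetch: a low-priority hint for a resource the next page may need -->
<link rel="prefetch" href="next-page.js">
```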

Resources that can be preloaded include fonts and background images defined in CSS, CSS files and JavaScript files: in the latter case, the technique is useful to separate fetching from execution and can improve metrics such as Time to Interactive.

Reducing JavaScript payloads by splitting the code

We know that speed is crucial on the Web: over 50% of users abandon a website if loading takes more than 3 seconds. As Houssein Djirdeh reminds us, sending large JavaScript payloads significantly impacts site speed, but it is possible to split the bundle into multiple chunks and send only what is truly necessary at the start, improving this aspect.

Rather than sending the whole JavaScript bundle to the user as soon as the first page of the app is loaded, we can split the bundle so that only the code needed for the initial route is sent, reducing to a minimum the amount of script to parse and compile and, therefore, making the page load faster. This can be done with several module bundlers, such as webpack, Parcel and Rollup, which support dynamic imports.
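As a sketch of how a dynamic import defers a chunk until it is actually needed (the module path and function name are placeholders):

```js
// The validation module is only fetched when the user submits the form;
// bundlers like webpack split it into its own chunk automatically.
const form = document.querySelector('form');
form.addEventListener('submit', async (event) => {
  event.preventDefault();
  const { validateForm } = await import('./validate.js');
  validateForm(new FormData(form));
});
```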

Removing unused code

Registries such as npm let everybody download and easily use over half a million JavaScript packages, but projects often include libraries they do not fully use: improving site performance includes this kind of step too, auditing unused code and unnecessary libraries that only nibble away precious bytes and time.
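One habit that helps, sketched below: import only what you actually use, so the bundler can tree-shake the rest of the library (lodash-es here is just an illustration):

```js
// Good: import a single function, letting the bundler drop the rest.
import debounce from 'lodash-es/debounce';

// Wasteful: this would pull the entire library into the bundle.
// import _ from 'lodash';

const onResize = debounce(() => console.log('resized'), 250);
window.addEventListener('resize', onResize);
```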

Minifying and compressing network payloads

There are two specific techniques to improve your web page's performance, minification and data compression, which reduce payload sizes and, consequently, improve the page's loading times.

Lighthouse is once again the tool that flags any CSS or JavaScript resources on the page that could be minified: if needed, we proceed with minification, the process of removing whitespace and any other code that is not needed, so as to create a smaller but still perfectly valid code file.
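For instance, a hand-made sketch of what a minifier does to a trivial function:

```js
// Before minification: readable, but heavier on the wire
function computeTotal(unitPrice, quantity) {
  const total = unitPrice * quantity;
  return total;
}

// After minification: same behavior, far fewer bytes
function computeTotal(n,t){return n*t}
```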

Data compression works through an algorithm that shrinks the resources, and files can be compressed dynamically or statically: both approaches have their pros and cons, so whichever best suits your case is the right choice.

With dynamic compression, resources are compressed on the fly whenever the browser requests them: this can be simpler than manual or build-time compression, but can cause delays if high compression levels are used. Static compression shrinks and saves resources ahead of time: the build process may take longer, but there is no delay when the browser finally retrieves the compressed resource.
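As a sketch of the dynamic approach, assuming a Node.js server with Express and its "compression" middleware installed:

```js
// Dynamic compression: every response is gzipped on the fly.
// (Assumes: npm install express compression)
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression());             // compress each response as it is sent
app.use(express.static('public'));  // serve the site's static files
app.listen(3000);
```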

Modern code for modern browsers

Building websites that work properly in all major browsers is a core principle of an open web ecosystem, writes Houssein Djirdeh: this means extra work to guarantee that all the code you write is actually supported by every browser you choose to target. Basically, anyone who wants to use new JavaScript language features must transpile those features into backwards-compatible formats for the browsers that cannot support them yet.
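A common way to serve each audience its own bundle is the module/nomodule pattern (file names are placeholders):

```html
<!-- Modern browsers understand type="module" and load the modern bundle;
     older browsers ignore it and fall back to the nomodule script. -->
<script type="module" src="app.modern.js"></script>
<script nomodule src="app.legacy.js" defer></script>
```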

A tool that can compile code written with newer syntax into code that older browsers and environments understand is Babel; its counterpart is Lebab (simply Babel spelled backwards), a separate library that converts older code into newer syntax.
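Roughly, and only as an illustrative sketch, this is the kind of transformation Babel performs:

```js
// Modern syntax, as you would write it today:
const greet = (name = 'world') => `Hello, ${name}!`;

// Roughly what a preset like @babel/preset-env emits for older browsers:
var greetCompiled = function (name) {
  if (name === undefined) { name = 'world'; }
  return 'Hello, ' + name + '!';
};
```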
