2 performance snippets
Preloading images with JavaScript
If your web app dynamically displays certain images and you want to make sure they are downloaded before they are first displayed, you can pre-fetch the images with some simple JavaScript.
For single-page apps, this should be sufficient:
function preload_images(urls) {
    urls.forEach(function(url) {
        // Creating an Image and assigning its src starts the download immediately.
        (new Image()).src = url;
    });
}
preload_images( [ 'image1.jpg', 'image2.png', 'image3.tiff' ] );
If you want to add a slight delay (so other web assets can load first) use something like:
setTimeout(function() { preload_images([ 'image1.jpg', 'image2.png', 'image3.tiff' ]); }, 500);
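One pitfall worth noting: Array.prototype.forEach passes the element as the first argument to its callback and the index second (the reverse of jQuery's $.each), so the URL must be the callback's first parameter. A quick sketch:

```javascript
// forEach invokes its callback as (element, index, array),
// so here url is each array element and i is its position.
var seen = [];
["a.jpg", "b.png"].forEach(function(url, i) {
    seen.push(i + ":" + url);
});
console.log(seen.join(" "));  // → "0:a.jpg 1:b.png"
```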
The single-page-app method above loads each image in the array into memory. However, browsers generally won't cache these images, so if the user navigates to another page before the images are displayed, the downloads are wasted.
To make the images cacheable, it helps to attach the created images to the actual DOM tree for the page. Here's one way:
function preload_images(urls) {
    var newdiv = document.createElement("div");
    // Hide the container; the fallback branches cover older browsers
    // (e.g. old IE needed style.setAttribute("cssText", ...)).
    if (newdiv.setAttribute) {
        newdiv.setAttribute("style", "display:none;");
    } else if (newdiv.style && newdiv.style.setAttribute) {
        newdiv.style.setAttribute("cssText", "display:none;");
    } else if (newdiv.style) {
        newdiv.style.cssText = "display:none;";
    } else {
        newdiv.style = "display:none;";
    }
    urls.forEach(function(url) {
        var newimg = new Image();
        newimg.src = url;
        newdiv.appendChild(newimg);
    });
    document.body.appendChild(newdiv);
}
Pre-generate pages or load a web cache using wget
Many web frameworks and template engines defer generating the HTML version of a document until the first time it is accessed. This can make the first hit on a given page significantly slower than subsequent hits.
You can use wget to pre-cache web pages using a command such as:
wget -r -l 3 -nd --delete-after <URL>
Where:
- -r (or --recursive) will cause wget to recursively download files
- -l N (or --level=N) will limit recursion to at most N levels below the root document (defaults to 5; use inf for infinite recursion)
- -nd (or --no-directories) will prevent wget from creating local directories to match the server-side paths
- --delete-after will cause wget to delete each file as soon as it is downloaded, so the command leaves no traces behind
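To keep pages warm automatically, you could run the command from cron. This crontab line is just a sketch; the schedule and URL are placeholders, and -q (--quiet) silences wget's output so cron doesn't mail it to you:

```shell
# m  h  dom mon dow  command
# Re-generate cached pages nightly at 03:15 (hypothetical site).
15 3 * * *  wget -q -r -l 3 -nd --delete-after https://www.example.com/
```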