WordPress is one of the most widely used web platforms; it actually powers more than 26% of the world's websites, and that's a lot. With such usage comes a big community of developers and a very large directory of plugins and themes created for this platform … and with those comes an even bigger responsibility.
The problem is that not all of those websites are fast; not all of them have a decent page load time. Knowing that Google has added page loading time as a ranking factor in its SERP algorithm, having a fast website means a lot.
So what can you do to make your website faster?
Working in the web programming field for quite some time now has taught me that even here the Pareto rule holds. With just 20% of the work/tasks on your checklist, you can get 80% of the optimisations needed. Having said that, I will focus on teaching you what you should improve first, how to run and read page speed tests … and how to improve based on the results of those tests.
So before doing anything just head to one of these sites (or all of them):
and test your website. Don't close the results page; keep it open so we can rerun the tests easily and compare results after we improve the site.
Here are the scores for my own website. My only weak point is that I need to serve scaled images … but let's suppose I had more issues to solve.
Tools like these report problems (and possible problems) on your site and even give you information on how to resolve them. From a professional's point of view this might not look like much, but from a beginner's point of view it is a great help. Even those of you who don't know how to minify HTML, for example, will read the how-to and soon be able to implement it on your own website.
In my view though, after a lot of experience with WordPress specifically and websites in general, I came to the conclusion that not all of the issues need to be resolved. Some issues matter more than others, and that's how most of us should read the issues list. Probably the only ones who should really, really consider solving every issue found by these tools … are the ones optimising very heavy websites, websites in a very important niche, or … perfectionists.
From my experience it's not cost-effective to try to solve every issue these tools report. A final score of 90/100 or more is generally more than acceptable. The issues I do solve are:
In other words, minimise the calls the website makes when a page loads; more specifically, minimise the number of included CSS and JS files. There are different ways to achieve this. I would start with the most efficient one of all though … don't include the file at all. Rethink your functionality: if there are things on your website that can be done through CSS, for example, there is no need to add an extra jQuery plugin for them. If that can't be done and the number of CSS and JS files loaded on every page is still high, you should really think about minifying those files and concatenating them into single files (so all CSS files in one, all JS files in one). There are different kinds of plugins to achieve this in WordPress, so you can try which one fits your needs best. Also, if your website uses a lot of small images (icons, small image borders … even though you shouldn't use those anymore, etc.), you should consider implementing CSS image sprites.
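To see what concatenation boils down to (a WordPress plugin or build step normally does this for you), here is a minimal sketch in the shell; the stylesheet names and contents are made up for illustration:

```shell
# Two hypothetical stylesheets that would otherwise be two HTTP requests
printf 'body{margin:0;}\n' > reset.css
printf 'h1{color:#333;}\n' > typography.css

# Concatenate them into a single file so the page makes one request
cat reset.css typography.css > combined.css

# The combined file now holds both rules
cat combined.css
```

Your theme would then enqueue only `combined.css` instead of the two separate files.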
This is one of those simple, two-minute jobs that has a huge impact on the performance of your website, especially if your website has a lot of content and that content has a lot of repetitive words. By enabling gzip compression you shrink the size of the generated HTML documents (so instead of the server serving 100 KB of HTML content, it now serves around 20 KB of gzipped HTML). In other words, the HTML content will "fly" faster from the server to your browser, and the browser will render it in almost the same time as it would have rendered an uncompressed HTML file …
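On Apache hosts, gzip compression is usually enabled through mod_deflate in the .htaccess file. A common sketch looks like this (whether the module is available depends on your host, so check with them first):

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```

Images and other already-compressed files are deliberately left out; gzipping them again gains almost nothing.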
Images are usually the kind of content that eats most of a website's bandwidth. A web-optimised image is usually 50–70 KB (talking about a medium-sized image optimised for the web). An unoptimised image, though, could be 500 KB, 2 MB or even 10 MB … depending on its dimensions and whether it has been optimised for the web.
Having said that … I think you can guess by yourself the kind of impact optimising images can have on a website. Think of a page that weighs 4 MB, 3 MB of which are unoptimised images; think how much we gain by optimising those images. The more we gain, the faster the website will be.
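A back-of-the-envelope way to see why serving scaled images matters: compare the pixel count of a straight-off-the-camera photo with one scaled to fit a typical content column (the dimensions below are just illustrative):

```shell
# Raw pixel counts, before any JPEG/PNG compression is even applied
camera=$((4000 * 3000))   # a typical camera photo
scaled=$((800 * 600))     # the same photo scaled for a content column

# How many times more pixel data the unscaled image carries
echo $((camera / scaled))
```

The unscaled photo carries 25 times the pixel data, and compression cannot fully paper over that difference, which is why the testing tools flag unscaled images.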
This is one of those fixes that gives a huge boost in performance (especially when you have a lot of CSS and JS files). I don't like to use it too much on my own websites, but I do recommend implementing it. I don't use it because in my projects I try to follow an even better performance rule: don't create clutter, don't add a lot of stuff when you can avoid it. The less you add … the less you will have to clean up.
Cleaning up your links and the files you are requesting helps a lot. During the load of a page, the browser sends a request for every file your site references. If your site links to a JS file on a dead site … it might take some time for the browser to handle this, time which is added to your site's page loading time. That's why minimising bad links and bad requests is a big deal. Also … if your site is requesting JS, CSS, font files, etc. from external sites, try to minimise those requests as well. If just one of the servers your site requests a file from is not good (in simpler words, a slow host), it will hurt your page's loading time. That's why it is always suggested to download external resource files to your own host; at least if something bad happens, it's on your server, under your control. Another thing to keep in mind here is not to use a lot of analytics tools on the same site. The more analytics tools you add, the more requests your site makes and the longer it takes to load, and in an even worse scenario … sometimes some analytics tools block each other's work, which means an even slower loading time for your website.
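One quick way to audit how many external resources a page pulls in is to grep its HTML for absolute URLs; the page, hosts and file names below are made up for illustration:

```shell
# A hypothetical page with one local and two external resources
cat > page.html <<'EOF'
<script src="https://cdn.example.com/jquery.js"></script>
<link rel="stylesheet" href="/wp-content/themes/mytheme/style.css">
<script src="https://stats.example.net/track.js"></script>
EOF

# List the external (absolute) URLs the browser would have to fetch
grep -o 'https\?://[^"]*' page.html
```

Each line printed is a request that depends on someone else's server; those are the candidates to bring onto your own host or drop entirely.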
This again is one of those (usually) easy fixes with a big gain in performance. Just a couple of lines of code added to the .htaccess file will make your website faster. What this does is specifically help repeat visitors, so they don't re-download files over and over again every time they visit your site; instead, they use the files the browser already has in its cache (on the visitor's computer). This saves bandwidth and unneeded requests, and fewer requests means faster loading time. Of course this only works for visitors who have visited your website at least once, but it is still an easy fix with a lot of gain in performance, so there is no reason not to do it.
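Those "couple of lines" usually mean mod_expires headers in .htaccess. A typical sketch, on Apache, looks like this (the lifetimes below are just sensible defaults; tune them to how often your files actually change):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Static assets rarely change, so let browsers cache them for a while
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```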
This is not one of those things that gets flagged as an issue by the tools I recommended … but again, I strongly recommend following and implementing it. No resolved issue is going to help you if you don't know how to program, if you write very bad code, non-semantic code, non-HTML5-valid code. What I mean is … think of the ways we can achieve a given thing in programming … for sure there are at least 2–3 ways. Usually only a small portion of programmers choose the best of those 2–3 ways to solve/achieve that thing.
For example … how do you write a drop-down in Foundation?
There are a lot of ways, and even Foundation's docs present 2–3 ways to achieve it … yet I have seen programmers write 10 lines of code for something that could have been achieved in just 3 lines, so my question is … why the extra lines?
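For reference, the short version in Foundation's docs is only a few lines of markup. This is a rough sketch based on Foundation 6; the class and attribute names vary between versions, so check the docs for yours:

```html
<!-- Sketch based on Foundation 6 docs; markup may differ in your version -->
<button class="button" type="button" data-toggle="my-dropdown">Toggle</button>
<div class="dropdown-pane" id="my-dropdown" data-dropdown>
  Just a few lines of markup, no custom JS needed.
</div>
```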
You see now … no matter how many of the issues reported by those tools you solve, if you write bad code (bad code usually means non-semantic code, sometimes even invalid code) … no improvement is going to make your website faster. Your website could be 100% optimised according to those tools, but if you write bad code … no tool can help you, no magic trick can help you.
Measuring loading time in seconds is not very precise and not always fair … it will not always tell you that there is a problem with your website. For example, if your website is hosted in Germany and you have a visitor from Japan (and you don't use a CDN) … it's normal for that user to have to wait a bit (not good, but normal). It's normal because Japan is on the other side of the world … so if a page speed testing tool tells you your website's loading time is 20 seconds, for example, don't go off crying; first check which server that tool tests from. It would be a real problem if, say, your website were hosted in Germany, the visitor were in France and the page loading time were 10 seconds (or even 7 seconds) … that is a problem.
This problem can usually be resolved using a CDN. What a CDN does, in practice, is check which country a user is visiting your website from and serve the content from the nearest data centre … this way that Japanese user visiting your website, instead of connecting to the server in Germany … might connect to a CDN data centre in, let's say, Taiwan … in other words, the page loading time would no longer be 20 seconds.
I have worked on a lot of projects where I had to do performance testing and improve the page load time of the website. Just by solving these issues the websites loaded much faster, at least twice as fast as before. But anyway, as I said, I don't always see the time in seconds as the only factor for judging how well a website is optimised.