Page speed has recently become particularly important. You can see this reflected in the many new applications Google has released in the past year. Its Page Speed extension for Chrome, and the larger part page speed now plays in Webmaster Tools, show that Google is putting greater weight on the speed element of web browsing.
A further, and by far more interesting, application is the latest addition to their analytics package: there is now a specific ‘page speed’ section in the software. It tracks the average time it takes to load a sample of your pages over time. You can then cross-reference this against dimensions such as browser or geographic location to see if there are any marked differences.
Then, as you lighten your page loads, this should be reflected in your analytics as a dip in the average.
There are days when the data may throw you, ever so slightly. The grab below shows one day where one site spiked in average page load time: the average was skewed by a single “visit” with a page load time of 706 seconds, or almost 12 minutes! This wasn’t reflected in our server logs, so we are at the mercy of Google on that one.
While Google provides excellent analysis, it doesn’t give you the why. What exactly is slowing down your page, and what can you do about it?
There is always some latency in your analytics data, so if you want an immediate view of the effect of any changes you make to your pages, you can use some of the online resources.
There are plenty of tools available to analyse pages and give you some answers.
Pingdom Tools is one such site or, if you want a bit more analysis, go to webpagetest.org. The latter gives very detailed analysis of all the elements on a page and their load times. The nice advantage of Pingdom Tools is that it keeps a record of each analysis of your site, so you can look back and track your progress. To combine both, and get some good tips for solutions, gtmetrix.com is not bad at all.
Don’t get hung up on the home page – it’s the least of your worries. Many of these tools give you the opportunity to revisit a page once its main parts have been loaded into your browser. This shows you the elements that will reload on every page – were there problems there?
Because there are so many elements on a page, and so much to be loaded, you need to know which ones are performing, which are not, and assess what can be done.
Items that can give you a fairly immediate reduction in clutter would be:
Look at your hosting – don’t simply opt for the free version. ‘Free’ can be very expensive.
Look carefully at your images and make sure they are an appropriate size. Don’t use HTML to scale your pictures down; resize them before uploading.
Minify your js files. This essentially strips the whitespace out of the files; they function the same but can be much smaller. You can do this online at the minifyjs.com site.
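To give a feel for what minification actually does, here is a deliberately naive sketch in Python. It only strips comments and collapses whitespace, and it would mangle scripts that contain comment-like sequences inside string or regex literals – real minifiers, like the online tools mentioned above, handle far more than this:

```python
import re

def naive_minify_js(source: str) -> str:
    """Naive JS minifier: strip comments and collapse whitespace.

    Illustration only -- it does not understand string or regex
    literals, so use a proper minifier on real code.
    """
    # Remove /* ... */ block comments (non-greedy, across lines).
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.DOTALL)
    # Remove // line comments.
    source = re.sub(r"//[^\n]*", "", source)
    # Collapse runs of whitespace (including newlines) to one space.
    source = re.sub(r"\s+", " ", source)
    return source.strip()

js = """
// add two numbers
function add(a, b) {
    /* the sum */
    return a + b;
}
"""
print(len(js), "->", len(naive_minify_js(js)))
```

Even this crude pass shrinks the sample noticeably; on large, heavily commented files the savings are far greater.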
On a point of caution: back up these files first. It only takes a few seconds and will save tears if something goes wrong.
Do the same with your css files: minify them too. Even template sites can do with a little tinkering on the css front.
Also, with the css files, you should (if you are on an Apache server) try gzip. You can test to see if your site is compressed using a gzip test, and hopefully you can get results like this:
Original Size: 31.79 KB
Gzipped Size: 6.72 KB
Data Savings: 78.86%
So, your efforts are rewarded in this case.
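On Apache, gzip is usually enabled through mod_deflate. A minimal .htaccess sketch might look like the following – the MIME types listed here are just common choices, so check which ones your pages actually serve, and note that images are generally already compressed and not worth gzipping:

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses on the fly.
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>
```

If your host doesn’t allow .htaccess overrides, look for a gzip or compression toggle in your hosting control panel instead.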
This site works off a template that allows gzip to be turned on or off. Without gzip, the load times for the front page are in the region of twelve seconds (the front page is a bit ‘phat’, and it’s unashamedly design over function on that page). Turning on gzip will knock nearly three seconds off the page.
Original Size: 25.25 KB
Gzipped Size: 6.05 KB
Data Savings: 76.04%
So you can see immediately, from the figures above, the benefits of gzip.
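You can reproduce this kind of savings figure yourself with Python’s standard library. The sample text below is just a repetitive stand-in for a real css file, which is exactly the kind of content gzip handles well:

```python
import gzip

def gzip_savings(data: bytes) -> float:
    """Return the percentage size reduction from gzip-compressing data."""
    compressed = gzip.compress(data)
    return 100.0 * (1 - len(compressed) / len(data))

# Repetitive, text-based content -- like CSS -- compresses very well.
css_like = b".header { color: #333; margin: 0; }\n" * 500
print(f"Original Size: {len(css_like) / 1024:.2f} KB")
print(f"Gzipped Size: {len(gzip.compress(css_like)) / 1024:.2f} KB")
print(f"Data Savings: {gzip_savings(css_like):.2f}%")
```

Run this against your own files (read them in binary mode) to estimate what enabling gzip on the server would save before you touch any configuration.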