I’ve been wanting to update my portfolio site for a while now, but have been putting it off for one reason. My Google PageSpeed Insights score was crap and I wanted to see how high I could get it before doing anything else.
The image above shows my desktop score at 97/100, and my mobile score turned out to be 91/100 – not bad eh?
Honestly, I’m a little ashamed to admit it, but I haven’t paid too much attention to these metrics in the past. I don’t completely ignore page speed, but I’ve never pushed to get a very high rating.
Since my portfolio site is relatively small, I figured there was no excuse for poor page load times. So I decided to see how well I could do if I really tried.
For those who followed this link and are lost: Google has an analytic tool called PageSpeed Insights.
It doesn’t necessarily measure how fast your page loads, but it checks for issues that affect page load times negatively. The higher the score, the faster your page probably loads, and thus you get a bit of an advantage when Google crawls your site for rankings.
The tool also gives you feedback and suggestions for fixing any issues. The main criteria the tool checks against are these:
- Reduce server response time
- Leverage browser caching
- Minify CSS
- Minify HTML
- Minify JavaScript
- Optimize Images
- Prioritize visible content
I won’t go into great detail about each of these; for that, you can consult Google. But I’ll gloss over what each point means and explain what I did to satisfy it.
Reduce server response time
According to Google, this rule will be triggered if your server takes longer than 200ms to respond to a request.
This is probably the most complicated of them all. However, if you’re simply hosting a static site with a shared hosting provider, I doubt you’ll get this warning anyway.

If you have a larger site with a lot of application logic or database queries, and you’re on a dedicated server, there’s a lot that could potentially go wrong. Figuring out what that is, is out of scope for this article – even Google doesn’t offer much in the way of recommendations.
Leverage browser caching
If you don’t know what browser caching is… it’s just when your web browser stores files from a server response in a local cache, so repeat visits don’t have to download them again.
My portfolio site is running on Apache, so enabling browser caching on my shared host is as simple as dropping an `.htaccess` file into my root folder with this bit of code:

```apache
<filesMatch "\.(css|jpg|jpeg|png|gif|js|ico)$">
  Header set Cache-Control "max-age=2592000, public"
</filesMatch>
```
The first line matches any file whose name ends with `.css`, `.jpg`, `.js`, and so on. Now that I’m looking at it – I should add `json` to that regex so that my `manifest.json` is cached too. Not that anyone would ever save my portfolio site onto their phone…
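Making that change is just a matter of adding `json` to the alternation in the same rule:

```apache
<filesMatch "\.(css|jpg|jpeg|png|gif|js|json|ico)$">
  Header set Cache-Control "max-age=2592000, public"
</filesMatch>
```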
The next line sets the HTTP header to send with the response. The header is called Cache-Control.
Basically, that line in your `.htaccess` adds a single `Cache-Control` header with two directives, which looks like this in the response:

```
Cache-Control: max-age=2592000, public
```

The `public` directive just means that anyone – including proxy servers – can cache the files in this response. This is appropriate for a static site like mine; I have no reason to prevent proxies from caching my files.

`max-age=2592000` means the file can stay cached for 2,592,000 seconds – 30 days, or about one month – before it’s considered stale.
Minify CSS, HTML, and JavaScript

Of these three criteria, I already met two. I fell short on minifying HTML, and I’ll explain why in a moment – but first – what does Google mean by “minify”?
Just in case some readers don’t already know, minify usually means stripping all unnecessary white space and comments out of a file.
Basically minifying is when we take code that looks like this:
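For example, a small source file might look like this (a made-up example):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>My Portfolio</title>
  </head>
  <body>
    <!-- Main heading -->
    <h1>Hello, world!</h1>
  </body>
</html>
```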
And turn it into this:
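After minification, that same sort of markup collapses onto one long line, with the comments and indentation stripped:

```html
<!DOCTYPE html><html><head><title>My Portfolio</title></head><body><h1>Hello, world!</h1></body></html>
```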
So the first snippet is the code you actually write and edit. For those who are newer to all of this, you’ve probably noticed `src` folders on GitHub before… that’s where your un-minified code goes.
The second snippet is what you actually upload to your production server. For a small file like the one above it doesn’t make much difference, but for much larger files it can. Smaller files load faster, so minifying your code is generally good practice.
For absolute beginners, one way to achieve this is to find a minifier online (or one you can download) and manually minify each of your files by feeding them into it.
You’ll do it one time, and then probably never want to develop websites ever again!
Usually, developers use build tools like Gulp or Grunt, run from the command line, to automate this process. You can set those tools up to minify your code automatically and store the output in a `build` folder. Sometimes you’ll find these folders in Git repositories on GitHub, but they are usually left out, and it’s up to each developer to run the build locally.
This has actually been part of my normal workflow for quite a long time. So for this website, all of my JS and CSS had already been minified and, as a bonus, concatenated into one JS file and one CSS file too – which reduces the number of requests to the server.
Unfortunately, I hadn’t been minifying HTML…
The reason was that I couldn’t find a decent Gulp plugin that would minify PHP files. I only use PHP on this site for templates, and to automate switching between production and development environments. So honestly, you could argue that it’s unjustified anyway – but when I started this website I was playing around with an idea that I won’t get into here. I’m just using PHP because I am, so there!
Anyways… the site only has a few PHP files that need minifying for the time being, so I decided to run them through a minifier manually. I used a site called HTML Compressor.
Then I just manually dragged and dropped those files into my `dist` folder before uploading it to my server. Definitely not a long-term solution, but it worked for today.
I think I’m going to be changing this site over to Jekyll anyway. Once I do, all of my files will be HTML again, and I already have a nice build system for working with HTML, so I’m not too worried about this temporary inconvenience.
Optimize Images

This one was by far the easiest to fix once I noticed the handy download link Google provides! Under this point you’ll see a link that says “download compressed assets”, or something along those lines…
Done! Add those assets to your dist folder and upload them to your server.
But seriously, you should still be using best practices when it comes to web images along the way. All of my images were already 72dpi, they weren’t any larger than they needed to be, and unless I needed transparency my images were all JPEGs too. Pretty standard stuff.
I actually run all of my images through a Gulp plugin called `image-min` that strips metadata during development. So my images were already pretty optimized before the final compression made by Google.
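A minimal Gulp task along those lines might look like this – a sketch assuming the `gulp-imagemin` plugin and these folder names, not my exact setup:

```javascript
const gulp = require('gulp');
const imagemin = require('gulp-imagemin');

// Strip metadata and losslessly compress every image on its way
// from the source folder to the dist folder that gets uploaded.
gulp.task('images', function () {
  return gulp.src('src/images/*')
    .pipe(imagemin())
    .pipe(gulp.dest('dist/images'));
});
```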
Prioritize visible content

This one is still in the yellow as I write this article, which I’m happy with. It was a red point the first time I ran Google PageSpeed Insights on my site, because of the way I used to include web fonts back when I first started working on it.
I used to include web fonts and CSS libraries with link tags in my HTML. Now I use a pre-processor called Sass, which compiles all of my CSS into one file. So I simply moved the link tags for web fonts, plus a link to normalize.css (the only vendor CSS I’m using on this site), into my stylesheet.scss as imports instead.
So now, the browser is only waiting on one resource before it renders the page instead of 5 or 6 or whatever it was.
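In practice that just means the top of the stylesheet looks something like this (the partial names here are illustrative; Sass inlines each local partial into the one compiled CSS file):

```scss
// stylesheet.scss – everything below compiles into a single CSS file
@import "normalize";   // vendor CSS, saved locally as _normalize.scss
@import "fonts";       // @font-face rules for the web fonts
@import "main";        // the site's own styles
```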
Above-the-fold content refers to the content a visitor sees before they scroll. Google wants developers to embed any CSS necessary for above-the-fold (ATF) content right in the HTML. The idea is to render something on the page as quickly as possible, and then load everything else afterward.
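One common pattern for this – a sketch, not something I’ve implemented on this site, and the file name and rules are made up – is to inline the critical rules in the head and fetch the full stylesheet without blocking the first render:

```html
<head>
  <!-- Critical above-the-fold rules inlined, so the first paint
       doesn't have to wait on a stylesheet request -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .hero { min-height: 100vh; background: #222; color: #fff; }
  </style>

  <!-- Full stylesheet loaded without blocking the initial render -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null; this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```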
I found a post on Stack Overflow with some pretty good answers and links to other resources on this topic: http://stackoverflow.com/questions/18340402/what-is-above-the-fold-content-in-google-pagespeed
I don’t really consider myself a front-end developer, so meeting the requirements of this point is a little too involved for my liking at the moment. Maybe a challenge for another day, and another article too 😉
So, yeah. There you go. I’m pretty happy with how my portfolio performs, and I hope that me sharing a bit about my strategy for getting there will help you with optimizing your own static content for the internet overlords we call Google.
If you have any tips of your own you’d like to share, leave ’em in the comments below. Don’t forget to share this if you enjoyed the read too!
Oh and if you want to see the current state of my portfolio here it is. Feel the speed! danjfletcher.ca