Hello World: Creating this blog Part 3

In this blog post, we’re going to do some small clean-up items so the site is ready:

  1. Enabling HTTPS
  2. Setting up Google Analytics
  3. Setting up security headers
  4. Setting up security.txt
  5. Setting up robots.txt

1. Enabling HTTPS

It is extremely easy to add a Let's Encrypt certificate to your website so that it's accessible over HTTPS. Back in Netlify, go to your site, click Domain Management in the left-hand menu, and then scroll down to HTTPS. If you haven't already, you may need to click Verify DNS configuration. Once your DNS configuration is verified, click Let's Encrypt Certificate. Within a few moments, a certificate will be provisioned for your site and you'll be able to access it using https://.
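If you also want any remaining plain-HTTP traffic redirected to HTTPS, Netlify supports redirect rules in a netlify.toml file at the repository root. A minimal sketch (the domain shown here is this site's; substitute your own):

```toml
# netlify.toml — redirect HTTP requests to HTTPS
[[redirects]]
  from = "http://michael-kehoe.io/*"
  to = "https://michael-kehoe.io/:splat"
  status = 301
  force = true
```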

2. Setting up Google Analytics

It’s reasonably common to use Google Analytics to see how many people are visiting your site and how they engage with it.

To do that, we can modify themes/hugo-refresh/layouts/partials/counter.html and add in the Google Analytics snippet. It will look something like this:

<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXXXX-X"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());

  gtag('config', 'UA-XXXXXXXXX-X');
</script>
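As an alternative to pasting the snippet by hand, Hugo ships with an internal Google Analytics template. Assuming your theme doesn't already include it, you can set the tracking ID in config.yaml:

```yaml
# config.yaml — Hugo's built-in Google Analytics support
googleAnalytics: UA-XXXXXXXXX-X
```

and reference the internal template from a partial with `{{ template "_internal/google_analytics.html" . }}`.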

3. Setting up security headers

In our config.yaml file, place the following after the first stanza of configuration:

server:
    headers:
    - for: /
      values:
        X-Frame-Options: DENY
        X-XSS-Protection: 1; mode=block
        X-Content-Type-Options: nosniff
        Referrer-Policy: strict-origin-when-cross-origin
        Feature-Policy: accelerometer 'none'; camera 'none'; geolocation 'none'; gyroscope 'none'; magnetometer 'none'; microphone 'none'; payment 'none'; usb 'none'
        Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
        Content-Security-Policy: default-src 'self'; script-src 'self' 'unsafe-inline' www.google-analytics.com; img-src 'self' www.google-analytics.com; style-src 'self' 'unsafe-inline' fonts.googleapis.com; font-src 'self' fonts.gstatic.com fonts.googleapis.com; form-action 'self';
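One caveat: Hugo's server configuration applies to the local hugo server used during development. To serve the same headers from Netlify in production, they can also be declared in a _headers file placed in static/ so it is copied to the site root. A sketch with a subset of the headers above:

```
/*
  X-Frame-Options: DENY
  X-XSS-Protection: 1; mode=block
  X-Content-Type-Options: nosniff
  Referrer-Policy: strict-origin-when-cross-origin
```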

You can find more information about security headers here

4. Setting up security.txt

security.txt is a draft RFC that encourages website owners to publish security contact information in a standard location. The security.txt site will help you generate a security.txt file. Using their form, I created one for this site:

Contact: https://www.linkedin.com/in/michaelkkehoe/
Preferred-Languages: en
Canonical: https://michael-kehoe.io/.well-known/security.txt

We then place this file at static/.well-known/security.txt

Don’t forget to ensure that this file is added when you commit your work.

Once you commit and deploy this change, you can test your headers against https://securityheaders.io/

5. Setting up robots.txt

Now we want to create a web-crawler policy for search engines like Google, Bing, etc. This is done in a file called robots.txt.

Since my site has nothing on it that I don’t want discovered, I’m going to allow all search engines to crawl all content. An empty Disallow rule means nothing is off-limits:

User-agent: *
Disallow:

This file is placed in static/robots.txt
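Hugo also generates a sitemap at /sitemap.xml by default, so (assuming the default sitemap template is enabled) you can point crawlers at it from robots.txt:

```
# static/robots.txt — allow everything and advertise the sitemap
User-agent: *
Disallow:

Sitemap: https://michael-kehoe.io/sitemap.xml
```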

Once you commit and deploy this change, you can test your robots.txt against Google's robots.txt testing tool here

Last modified: 31 August 2020