Essentially, this site is my .com, so it'll have everything that pertains to me and my career that I think you'll find important or fascinating, whether you're a recruiter or just someone getting to know me better. Since this website can also act as a platform for my opinions, I may post some reviews or opinion articles here too.
On the technical side, this website leverages Hugo to generate a static site and its HTML pages from source Markdown files. All of this is built automatically by a custom GitHub Actions workflow and deployed to S3; the site then sits behind Cloudflare's robust CDN and DNS.
You'll need a few things set up in the cloud:

- AWS S3
- Cloudflare
Refer to the links in the Acknowledgements section to set them up with the correct security controls and settings.
With the infrastructure set up, a manual deployment looks a little something like this (sketched as shell commands after the list):

- First, `cd` into the directory for the site you wish to build, e.g. `music`.
- Run the `hugo` command on your development machine to create the website files in the `public/` folder.
- Use the AWS S3 web console to upload all files in the `public/` folder to the root domain S3 bucket.
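For reference, here's the same manual flow as a quick shell sketch; `music` is just the example directory from the steps above, and the final upload happens in the S3 web console rather than in the terminal:

```sh
cd music   # enter the Hugo project for the site being deployed
hugo       # render the site's HTML into the public/ folder
# ...then upload the contents of public/ to the root domain
# S3 bucket via the AWS S3 web console.
```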
For me, this quickly becomes a tedious, repetitive process, especially when making multiple changes in quick succession. Unsurprisingly, that also makes it a perfect candidate for automation.
This section will detail how to set up the GitHub Actions CI/CD pipeline using the aforementioned services.
All you need to do here is update the `URL` under `deployment.targets` in the `config.yaml` file for your site with the correct S3 bucket URL.
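For example, a minimal sketch of that block, assuming a bucket named `example.com` in `us-east-1` (both placeholders):

```yaml
# config.yaml -- tells `hugo deploy` where to push the built site
deployment:
  targets:
    - name: production
      # Placeholder bucket and region; substitute your own S3 bucket URL
      URL: "s3://example.com?region=us-east-1"
```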
The naive way to upload files to S3 from a CI tool is the `aws s3 cp` command. The issue is that it copies the entire `public/` folder into S3 even when files haven't changed; the upload isn't incremental. This becomes a cost issue, since S3 charges based on the number of PUT requests you make. Ergo, your cost goes up proportionally with both how often you deploy and how big your project is; this is clearly a DevOps antipattern!
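For illustration, the naive approach is a single recursive copy, something like the following (the bucket name is a placeholder); every file in `public/` is re-uploaded on every run, so each deploy pays for one PUT per file:

```sh
# Re-uploads the whole folder whether or not anything changed
aws s3 cp public/ s3://example.com/ --recursive
```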
The clear and easy solution is to use the built-in `hugo deploy` command. Under the hood, it uses the author's own `s3deploy` project, which "uses ETag hashes to check if a file has changed, which makes it optimal in combination with static site generators like Hugo".
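With the deployment target from earlier configured, a deploy then collapses to two commands, and only files whose content actually changed get uploaded:

```sh
hugo          # build the site into public/
hugo deploy   # compare ETags against the bucket, upload only what changed
```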
Essentially, all you need to do is configure the `aws-actions/configure-aws-credentials@v2` action with an IAM role that has access to your S3 bucket.
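A minimal sketch of what the relevant workflow steps might look like; the role ARN, region, and branch are placeholders, and the `peaceiris/actions-hugo` step is just one common way to install Hugo on a runner, not necessarily what this repository uses:

```yaml
# .github/workflows/deploy.yml -- hypothetical example
name: deploy
on:
  push:
    branches: [main]
permissions:
  id-token: write   # lets the job assume the IAM role via OIDC
  contents: read
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: peaceiris/actions-hugo@v2   # one common way to install Hugo
        with:
          hugo-version: "latest"
      - uses: aws-actions/configure-aws-credentials@v2
        with:
          role-to-assume: arn:aws:iam::123456789012:role/site-deploy  # placeholder
          aws-region: us-east-1
      - run: hugo && hugo deploy
```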
| Source | Description |
| --- | --- |
| Official AWS Guide to hosting a static website with S3 | Walks you through everything you need to know to set up AWS S3 buckets to host your website |
| Hosting a static website: Amazon S3 + Cloudflare | Great guide to get you up and running |
| Official Cloudflare Guide | Detailed Cloudflare guide for Cloudflare and AWS hosting |
| jeff-h's StackOverflow answer | Explains the Cloudflare Auto Minify setting to prevent broken CSS |