From commit to production!
My first real pipeline
My setup is built around a handful of simple components:
- GitHub for code repository
The code is developed on my local desktop. When I am satisfied, I commit to the master branch on GitHub,
where GitHub Actions picks up the code and runs a complete build.
- GitHub Actions for CI/CD
When code is committed, GitHub Actions kicks in and runs the test suite with pytest; if everything
passes, the code is built into a container image.
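A workflow roughly along these lines drives the test stage (the file path, job name, and Python version are my own placeholders, not necessarily what I run):

```yaml
# .github/workflows/build.yml (path is an assumption)
name: build-and-deploy
on:
  push:
    branches: [master]        # only commits to master trigger the pipeline
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt pytest
      - run: pytest           # any failing test stops the pipeline here
```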
- crazy-max's build actions
crazy-max has packaged cross-platform Docker builds as ready-made GitHub Actions: through QEMU
emulation, the x64 runners GitHub provides by default can build images for ARM - the instruction set
required for Raspberry Pi. This step builds the ARM image and pushes it to Docker Hub.
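A sketch of what the build job can look like with the QEMU/buildx actions crazy-max maintains (the action version tags, image name, and secret names below are assumptions):

```yaml
  build:
    needs: test                               # only build if the pytest job succeeded
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-qemu-action@v3     # ARM emulation on the x64 runner
      - uses: docker/setup-buildx-action@v3
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          platforms: linux/arm64,linux/arm/v7  # Raspberry Pi targets
          push: true
          tags: myuser/myapp:latest            # placeholder image name
```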
- Docker Hub for container registry
I use Docker Hub as a free container registry, and make sure there are no hardcoded secrets in the
publicly accessible images (secrets are injected via the Kubernetes deployment YAML instead).
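To illustrate what I mean by applying secrets via the deployment YAML: the values live in a Kubernetes Secret, and the pod spec pulls them in as environment variables, so the public image itself contains nothing sensitive (all names and values here are placeholders):

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: myapp-secrets
stringData:
  API_KEY: "value-kept-out-of-the-image"
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 2
  selector:
    matchLabels: { app: myapp }
  template:
    metadata:
      labels: { app: myapp }
    spec:
      containers:
        - name: myapp
          image: myuser/myapp:latest   # public image, no secrets baked in
          envFrom:
            - secretRef:
                name: myapp-secrets    # injected at deploy time
```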
- Raspberry Pi Kubernetes cluster as infrastructure backend
I have three Raspberry Pi 4s in a Kubernetes cluster: one master and two workers. They share the
workload between them, are surprisingly fast, and normally run on less than 15 watts.
- Keel.sh for update detection and deployment in Kubernetes
To avoid manually bumping versions and re-applying YAML files every time I want to deploy an update,
I have installed Keel.sh (by rusenask), which continuously checks Docker Hub for new versions. When a
new version of one of my containers is ready, Keel pulls it and applies it to the cluster.
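Keel is configured with annotations on the deployment itself; something like the following tells it to poll Docker Hub and force a redeploy when the tag changes (the policy and poll interval here are just one reasonable choice, not necessarily mine):

```yaml
# Added to the Deployment's metadata:
metadata:
  name: myapp
  annotations:
    keel.sh/policy: force          # redeploy even when the tag (e.g. latest) name is unchanged
    keel.sh/trigger: poll          # actively poll the registry instead of waiting for webhooks
    keel.sh/pollSchedule: "@every 2m"
```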
All in all, this setup requires no subscription fees, and the Kubernetes cluster cost less than
3K DKK in total, with negligible power costs. From a commit, the new version is typically in
production 2 to 5 minutes later, via a rolling update.
I love this setup - I will write it up properly later, either here or on GitHub. Until then, cheers!