How To Unit Test Your Helm Charts
With the help of helm-unittest and the AAA pattern
Introduction
In my current project, I have to write some Helm charts, and over time these charts are getting more and more packed with logic.
The complexity keeps increasing as I use more of Helm's template logic. While this comes in handy, it makes my charts more open to bugs or accidental changes in the logic (regression!). To avoid this, we can leverage unit tests, just like we do when we write software. And solving operations problems with software is, for me, a central aspect of the DevOps world!
Let's roll
The Template File To Test
First, we create a simple scenario that we would like to test. I create a templated resource of type Deployment, and I want to test whether the template logic works correctly.
{{- if .Values.deployment.test.create }}
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    app: test
  name: test
spec:
  replicas: 1
  selector:
    matchLabels:
      app: test
  template:
    metadata:
      labels:
        app: test
    spec:
      {{- if .Values.deployment.test.initContainers }}
      initContainers:
        - name: busybox
          image: busybox
          command:
            - sleep
            - "3600"
      {{- end }}
      containers:
        - image: nginx
          name: nginx
          resources: {}
{{- end }}
Let's go through the logic in this resource:
- The resource will only be rendered if .Values.deployment.test.create evaluates to a true value.
- The initContainers section will only be rendered if .Values.deployment.test.initContainers is set to true.
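You can try this switch directly with helm template and its --set flag, which overrides a single value on the command line. For example, this renders the chart without the init container:
helm template test . --set deployment.test.initContainers=false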
The values.yaml looks like this:
deployment:
  test:
    create: true
    initContainers: true
If we now run our template command, we should get the following output:
helm template test .
# Source: node-red/templates/deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    app: test
  name: test
spec:
  replicas: 1
  selector:
    matchLabels:
      app: test
  template:
    metadata:
      labels:
        app: test
    spec:
      initContainers:
        - name: busybox
          image: busybox
          command:
            - sleep
            - "3600"
      containers:
        - image: nginx
          name: nginx
          resources: {}
This looks as expected: the Deployment gets created, and the init container is present.
But how can we guarantee that everything will stay as intended in the future? We cannot be sure that everyone will always run the helm template command and check the results. And think about what happens when the file gets more complicated.
Time for our hero, UNIT TESTING, and its trusty partner, CONTINUOUS TESTING!
Write Some Tests
Before we can start, we need to install the helm-unittest plugin:
helm plugin install https://github.com/helm-unittest/helm-unittest
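To verify that the installation worked, list your installed plugins; unittest should now show up:
helm plugin list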
Now we can start to work on the actual test.
First, you need to create a folder called tests inside your Helm chart's root folder. Our test suite files are placed in this tests/ directory and carry the suffix _test.yaml.
In our example, we create a deployment_test.yaml with the following content:
suite: test nginx deployment
templates:
  - deployment.yaml
Now we can start to write the actual test jobs. We will follow the **AAA pattern** from *Unit Testing: Principles, Practices, and Patterns* by Vladimir Khorikov.
The **AAA pattern** is simple and provides a uniform structure for all tests in the suite. This uniform structure is one of its biggest advantages: once you get used to the pattern, you can read and understand the tests more easily. That, in turn, reduces the maintenance cost for your entire test suite.
- The arrange section is where you set up the objects to be tested: you bring the system under test to a desired state.
- The act section is where you act upon the system under test.
- The assert section is where you make claims about the outcome.
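To make this concrete, here is how the three sections map onto a helm-unittest test job. Take it as an illustrative sketch: the set property (an inline alternative to a separate values file) is part of helm-unittest, and the values are the ones from our chart:
tests:
  - it: deployment should render   # the scenario under test
    set:                           # Arrange: bring the chart into the desired state
      deployment.test.create: true
    # Act: helm-unittest renders deployment.yaml with these values
    asserts:                       # Assert: make claims about the rendered manifest
      - hasDocuments:
          count: 1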
Let us look at our test jobs in more detail. Here is the first one:
tests:
  - it: deployment should render
    asserts:
      - isKind:
          of: Deployment
      - hasDocuments:
          count: 1
Let us see what the test is doing here:
- It checks that the rendered resource is of kind Deployment.
- It checks that exactly one document is rendered.
If we execute the test with the helm unittest command, we get the following output:
❯ helm unittest charts/node-red
### Chart [ node-red ] charts/node-red
PASS test nginx deployment charts/node-red/tests/deployment_test.yaml
Charts: 1 passed, 1 total
Test Suites: 1 passed, 1 total
Tests: 1 passed, 1 total
Snapshot: 0 passed, 0 total
Time: 12.40997ms
Great! Everything is fine!
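Before we move on: isKind and hasDocuments are not the only assertion types helm-unittest offers. As a small sketch with two more of its documented assertions, equal and matchRegex, we could also pin down the resource name and the container image (the paths match our rendered Deployment):
  - it: metadata and image should match our conventions
    asserts:
      - equal:
          path: metadata.name
          value: test
      - matchRegex:
          path: spec.template.spec.containers[0].image
          pattern: ^nginx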
Let us add the test for the init container too:
  - it: init container should be present
    values:
      - ./values/deployment_values.yaml
    asserts:
      - isKind:
          of: Deployment
      - equal:
          path: spec.template.spec.initContainers[0].name
          value: busybox
Same as above, but with two differences:
- We add a test values file to the test with the values property (see the sketch below for its assumed content).
- We check that the name of the first init container equals busybox.
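For completeness, the referenced values file could look like this; its content is my assumption based on our template logic, and the path is resolved relative to the test suite file:
# tests/values/deployment_values.yaml
deployment:
  test:
    create: true
    initContainers: true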
Let's run the test suite again:
PASS test nginx deployment charts/node-red/tests/deployment_test.yaml
Charts: 1 passed, 1 total
Test Suites: 1 passed, 1 total
Tests: 2 passed, 2 total
Snapshot: 0 passed, 0 total
Time: 14.037046ms
Sweet, everything passes again!
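For reference, here is the complete deployment_test.yaml, assembled from the snippets above:
suite: test nginx deployment
templates:
  - deployment.yaml
tests:
  - it: deployment should render
    asserts:
      - isKind:
          of: Deployment
      - hasDocuments:
          count: 1
  - it: init container should be present
    values:
      - ./values/deployment_values.yaml
    asserts:
      - isKind:
          of: Deployment
      - equal:
          path: spec.template.spec.initContainers[0].name
          value: busybox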
Now let us assume someone changed the default value of create to false:
❯ helm unittest charts/node-red
### Chart [ node-red ] charts/node-red
FAIL test nginx deployment charts/node-red/tests/deployment_test.yaml
- deployment should render
- asserts[0] `isKind` fail
Template: node-red/templates/deployment.yaml
- asserts[1] `hasDocuments` fail
Template: node-red/templates/deployment.yaml
Expected documents count to be:
1
Actual:
0
Charts: 1 failed, 0 passed, 1 total
Test Suites: 1 failed, 0 passed, 1 total
Tests: 1 failed, 1 passed, 2 total
Snapshot: 0 passed, 0 total
Time: 18.812819ms
Error: plugin "unittest" exited with error
We get direct feedback and can see exactly where the bug or regression was introduced.
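To catch such a change deliberately, we could also cover the disabled case with its own test job. A small sketch, using helm-unittest's inline set property to override the default value:
  - it: deployment should not render when create is false
    set:
      deployment.test.create: false
    asserts:
      - hasDocuments:
          count: 0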
GitHub Action for CONTINUOUS TESTING
To enable continuous testing in your GitHub Actions workflow, you just need to add this step to your pipeline, and you are good to go:
...
- name: Run helm unittest
  run: |
    helm plugin install https://github.com/helm-unittest/helm-unittest
    helm unittest charts/node-red
...
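If you are starting from scratch, a minimal complete workflow could look like this. This is a sketch: the file name and trigger are my assumptions, and it relies on Helm being preinstalled on GitHub's ubuntu-latest runners:
# .github/workflows/helm-unittest.yaml
name: helm-unittest
on: [push, pull_request]
jobs:
  unittest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run helm unittest
        run: |
          helm plugin install https://github.com/helm-unittest/helm-unittest
          helm unittest charts/node-red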
Final Thoughts
As we keep adding features to our Helm chart, unit tests assure us that we are not breaking existing functionality or introducing one of those pesky bugs.