Automating code quality in Go

Code quality is an inseparable aspect of modern software development and has gained significant complexity in recent years. It now covers topics ranging from documentation and code formatting to unit tests, linting and static code analysis. While Go provides many tools to help you maintain the quality of your code, they are of little use unless applied consistently. To help with this process, we will explain the most common tools, their purpose and advantages, and how to automate them.

Tidying the Go module

While it does not touch the code you wrote yourself, tidying Go module dependencies is an important part of development automation. go mod tidy scans your Go code, adds missing dependencies to the go.mod file, removes dependencies that are no longer required and ensures you are working with up-to-date requirements.

To run it for a project, simply execute it in the module's root directory:

go mod tidy
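
As a hypothetical example, assume your code imports github.com/google/uuid but go.mod does not yet list it. Running go mod tidy adds the missing requirement (and would likewise drop any requirement nothing imports anymore), leaving a go.mod like this - the version shown is purely illustrative:

module example.com/demo

go 1.22

require github.com/google/uuid v1.6.0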

Running unit tests and checking for race conditions

Running unit tests ensures your code behaves the way you intended, even after numerous changes and improvements, possibly by other developers or departments. A frequently overlooked detail when running tests is to enable the -race flag, which makes the go command also check your code for possible race conditions (multiple goroutines accessing the same data concurrently without synchronization). Race conditions can cause unintuitive program behaviour if not caught early, so spotting them during development saves you time and headaches later on:

go test -race ./...

Note the ./... argument we passed to the command. This is not a path, but a pattern specific to Go. It tells the command to run for every package it finds in the current directory and its subdirectories, not just the package in the current directory.
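
To make this more concrete, here is a minimal, hypothetical test in which two goroutines increment a shared counter without any synchronization. It passes most of the time, but go test -race reliably reports the conflicting writes:

package counter

import (
	"sync"
	"testing"
)

// TestCounter contains a deliberate data race: both goroutines write to n
// without any synchronization, which the -race flag will report.
func TestCounter(t *testing.T) {
	n := 0
	var wg sync.WaitGroup
	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			n++ // unsynchronized write shared between goroutines
		}()
	}
	wg.Wait()
	t.Logf("final count: %d", n)
}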

Code formatting

The format (aka writing style) of code has been heavily debated in the past, sparking discussions about tabs vs spaces, whether to use spaces around operators in comparisons, newlines before opening curly brackets, ... - the list goes on. In reality, people are different, and so is their code. This becomes problematic when multiple developers are working on a single project, resulting in a mix of different code styles within the same code base. Even worse, changes between versions (for example git commits) will occasionally include lines of code whose logic did not change at all, simply because someone reformatted them. Forcing all developers to learn and adhere to a common code style is tedious and slows down productivity - and still leads to wrongly formatted code sections, because changing the way a senior developer has written code for decades simply doesn't happen overnight.

Code formatting tools have gained significant popularity because they counteract these problems. The way they do this is simple: they define one global style and automatically convert code written in any (syntactically valid) style into that standardized style. The Go language not only defines a default style at the language level, it even ships with the built-in fmt command to automate the process.

go fmt ./...

With a simple tool like this, the codebase style remains consistent while developers are free to write their portion of the code in any style they prefer, without worrying about the issues above.
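
As a small illustration, take a hypothetical file written with inconsistent spacing and indentation:

package main

import "fmt"

func main(){
x :=map[string]int{"a":1,"b": 2}
	fmt.Println( x )
}

Running go fmt rewrites it to the canonical style, normalizing indentation and the spacing around operators and inside the map literal:

package main

import "fmt"

func main() {
	x := map[string]int{"a": 1, "b": 2}
	fmt.Println(x)
}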

Fixing old code

Go is a constantly evolving programming language. Over time, this leads to some parts becoming outdated and obsolete, eventually making code that uses them incompatible with newer Go versions. Refactoring legacy code is the nightmare of many developers and causes delays in productivity for companies.

Luckily, the go command has you covered on this front as well, with the fix command:

go fix ./...

Running the fix command on your code will find outdated import paths like golang.org/x/net/context, replace them with the up-to-date version (in this example, context), and change code using deprecated or outdated package APIs to use their modern replacements.
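
For the import path mentioned above, the rewrite looks like this - a hypothetical handler.go, shown before and after:

package server

import "golang.org/x/net/context"

func handle(ctx context.Context) error {
	return ctx.Err()
}

After go fix, only the import path has changed; the code keeps compiling because the old package aliases the standard library's context types:

package server

import "context"

func handle(ctx context.Context) error {
	return ctx.Err()
}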

Simple static analysis

The go command ships with a built-in static analysis tool called go vet. Running it will check your source code for common issues like wrong format specifiers in Printf-family calls (such as using %d, intended for integers, on a string), incorrect arguments passed to errors.As(), potential misuse of locks from the sync package, unreachable code, unused results of certain function calls and more.

go vet ./...

While this will already catch a lot of easily overlooked errors, there are much more complex static analysis tools available (see below).
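
The format-specifier mistake mentioned above is easy to overlook in a code review, but trivial for go vet to catch. A hypothetical example it would flag:

package main

import "fmt"

func main() {
	name := "gopher"
	// go vet reports this call: %d expects an integer, but name is a string
	fmt.Printf("hello, %d!\n", name)
}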

Advanced static analysis with staticcheck

Although not part of Go's built-in toolchain, the staticcheck command has gained traction among both Go developers and large companies, because it catches a vast number of issues, ranging from logic errors and wrong assumptions about code behaviour (like thinking &*x would copy x) to simple improper usage, like accidentally deferring Lock() instead of Unlock() on a sync.Mutex.

To run it, you first need to install it:

go install honnef.co/go/tools/cmd/staticcheck@latest

Then it becomes a simple one-line command like all the others in the go toolchain:

staticcheck ./...

The list of problems staticcheck can detect is way too long to cover here. You can check it out on their website instead.
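
The deferred Lock() mistake mentioned above is a good example of how subtle these problems are: the code compiles and looks plausible at a glance. A minimal, hypothetical version that staticcheck reports:

package cache

import "sync"

type Cache struct {
	mu    sync.Mutex
	items map[string]string
}

// Get locks the mutex, then defers Lock() instead of Unlock(): the deferred
// Lock() blocks forever when Get returns, because the mutex is still held.
// staticcheck flags this pattern.
func (c *Cache) Get(key string) string {
	c.mu.Lock()
	defer c.mu.Lock() // bug: should be defer c.mu.Unlock()
	return c.items[key]
}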

Creating a script to run all the tools

Now that we understand what each tool does and why we need it, we have a long list of commands to run every time we make a change to our code. This isn't very economical, so let's combine them all in a simple bash script:

#!/bin/bash

# fail on errors
set -e

# install staticcheck if not present
if ! command -v staticcheck &> /dev/null; then
   go install honnef.co/go/tools/cmd/staticcheck@latest
fi

# execute tool chain
go mod tidy
go test -race ./...
go vet ./...
go fix ./...
staticcheck ./...
go fmt ./...

This simple script will now execute our entire code quality toolchain for us.

Automating the script in git

Even though the commands can now be executed with a single script, this still involves some manual effort every time we change our code. Forgetting to execute it even once before pushing our changes to a remote code versioning service would immediately undo all the benefits of using the commands in the first place. To ensure we never forget to execute it, we can turn this script into a pre-commit hook in our git repositories. To do that, simply make it executable and move it to .git/hooks/pre-commit:

chmod +x script.sh && mv script.sh .git/hooks/pre-commit

Now it will run every time we commit our changes to the local git repository. Specifically, it will run before changes are committed and even cancel the commit process if any errors occur (such as static analysis finding a potential issue). This ensures that you will never accidentally push unformatted or unchecked code again.

Issues with the pre-commit hook script

A small issue arises with the go fix and go fmt commands: they change file contents, but the pre-commit hook runs after git add has already staged files for committing, so changes made by fix/fmt won't be included in the commit. To fix this, we need to call git add on the files changed by these commands to stage their adjusted contents before committing.

For go fmt, the fix is quite simple: it prints the files it changed, one per line, so all we have to do is loop over them and stage them again:

modified_files=$(go fmt ./... | xargs);
for file in $modified_files; do
   git add "$file"
done

The case is a little different for go fix, because it outputs both the name of the changed file and a description of what was changed:

main.go: fixed fmt context

We need a little help from grep and tr to extract just the filenames out of those messages:

modified_files=$(go fix ./... 2>&1 | grep -o '[^:]*:' | tr -d ':' | xargs);
for file in $modified_files; do
   git add "$file"
done

With those fixes applied, our final script will look like this:

#!/bin/bash

# fail on errors
set -e

# install staticcheck if not present
if ! command -v staticcheck &> /dev/null; then
   go install honnef.co/go/tools/cmd/staticcheck@latest
fi

# execute tool chain
go mod tidy
go test -race ./...
go vet ./...
modified_files=$(go fix ./... 2>&1 | grep -o '[^:]*:' | tr -d ':' | xargs);
for file in $modified_files; do
   git add "$file"
done
staticcheck ./...
modified_files=$(go fmt ./... | xargs);
for file in $modified_files; do
   git add "$file"
done

Automatically installing the pre-commit hook in git repositories

The last remaining source of human error is forgetting to install the pre-commit hook in new projects (or to add it to older ones). Fortunately, git has us covered on that front as well, with git templates.

Start by checking if you have a directory set up for that already:

git config --get init.templateDir

If this returns a directory path, skip the next step. If not, let's create one; ~/.git-templates is a good default here:

mkdir -p ~/.git-templates/hooks && git config --global init.templateDir ~/.git-templates

By placing files inside ~/.git-templates (or any other custom directory you may have configured), we can provide files that will automatically be copied to the .git directory when we run git init. Save the script as ~/.git-templates/hooks/pre-commit, make sure it stays executable (chmod +x), and confirm the init template works in a new git repo:

cd some/empty/dir
git init
cat .git/hooks/pre-commit

Your new git repository now contains the .git/hooks/pre-commit script, ready to maintain your code quality.

You can also run git init in an existing project to re-initialize the git repository and have the template script added that way.
