last updated: May 05, 2023

A Barebones Approach to Continuous Integration

Of the many domains of software engineering, devops is one area I'm less familiar with. Since shipping my blog and a few hobby sites, however, I've needed to pick up the basics in managing a deploy process. In fact, I'd say my general lack of expertise makes me a great person to write on the topic of minimalist CI setups - I know just enough to do what I need with nothing too fancy to hold me back. I've done a good deal of trial and error, and I think this is a great approach for small/medium sized projects.

This setup is a simplified version of the deploy process for my blog - a Next.js app - but the fundamentals should apply to most projects.

What is CI?

According to AWS:

Continuous integration refers to the build and unit testing stages of the software release process. Every revision that is committed triggers an automated build and test. With continuous delivery, code changes are automatically built, tested, and prepared for a release to production.

To achieve a simplified version of this, we'll create a deploy.sh script to run our test suites locally, and on success, push our code to the server. In a post-receive hook on the server, we'll execute a post-receive.sh script committed to the repo to rebuild the site for production. Finally, we'll reload a pm2 daemon to run the Next.js server and make the updates accessible on the web.

Note: Other approaches to CI frequently use a process that runs on a remote machine, like GitHub Actions or a Jenkins setup. These are great options for larger projects and companies, but in my opinion, a simple setup that runs your build and tests locally before pushing to production works just fine.

Pushing Changes to the Server with Git

Let's start with the second half of our continuous integration: pushing updates to the server. For this section, I'll defer to the following article. It's great - I follow it step-by-step for every new project.

After reading through the article, you should be able to push new code to your server with something like git push server, which will in turn trigger a post-receive hook - more on that below.
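In broad strokes, the setup the article walks through is a bare repo on the server plus a local remote that points at it. A rough sketch - the username, server, and repo paths below are placeholders for your own:

# on the server: create a bare repo to receive pushes
ssh [username]@[your server] 'git init --bare ~/repos/[your project].git'
# locally: register the server as a remote named "server"
git remote add server [username]@[your server]:repos/[your project].git
# from now on, git push server triggers the repo's hooks/post-receive on the server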

Post-receive.sh

Like the article suggests, in my post-receive hook I check out the latest tree into my project folder, but I also have a line to execute a separate post-receive.sh script checked into the repo. See below:

#!/bin/bash
# check out the latest commit into the live project directory
GIT_WORK_TREE="/var/www/[your project]" git checkout -f
# run the deploy script that's checked into the repo
/bin/bash "/var/www/[your project]/post-receive.sh"
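One gotcha: git will only run the hook if the hook file itself is executable, so you may need a one-time chmod on the server (the path is a placeholder for wherever your bare repo lives):

chmod +x /path/to/[your project].git/hooks/post-receive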

I'm a big fan of this approach, since it allows me to do the bulk of my work in a script that I can manage locally. When I want to make changes, I just commit the script and push it to the server like any other part of my app.

In the post-receive.sh itself, I have:

#!/bin/bash
LOG="/home/[username]/[your project].log"
DIR="/var/www/[your project]"
NPM=/path/to/npm
PM2=/path/to/pm2
# overwrite file
echo "" > "$LOG"
cd "$DIR" || exit
echo "running npm install..." >> "$LOG" # append to file
$NPM install >> "$LOG" 2>&1 # redirect stderr to stdout
echo "ran npm install" >> "$LOG"
echo "rebuilding..." >> "$LOG"
$NPM run build >> "$LOG" 2>&1
echo "rebuilt" >> "$LOG"
echo "restarting pm2 daemon..." >> "$LOG"
$PM2 reload "npm run prod" --name blog >> "$LOG" 2>&1
echo "restarted pm2 daemon" >> "$LOG"

First, I define a few variables because, as explained in this Stack Overflow post, the PATH when a git hook runs isn't necessarily the same as when you ssh into a machine. I run npm install since I don't check node_modules into the repo (and you shouldn't either!), and npm run build to build Next.js for production. Finally, I reload the process running the Next.js server so it picks up the updated code. To manage my processes, I use pm2. As a tl;dr, with the command:

$PM2 reload "npm run prod" --name blog

I'm telling pm2 to reload the process named blog by running the command npm run prod - and if the process doesn't exist yet, pm2 will create it for us.
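If you also want the blog process to come back after a server reboot, pm2 can snapshot the process list and register itself with your init system. A quick sketch, run once on the server while the process is up:

# save the currently running processes so pm2 can resurrect them later
pm2 save
# prints a command to run that registers pm2 as a startup service
pm2 startup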

I write to the log file before and after every step, and redirect each step's stderr to stdout so any errors are recorded as well. To follow the log in real time, use tail -f on the log file: tail -f /home/[username]/[your project].log.
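As for the NPM and PM2 variables at the top of post-receive.sh: the simplest way to fill them in is to ssh into the server once and resolve the absolute paths yourself - something like:

# prints the absolute path of each binary; copy these into post-receive.sh
which npm
which pm2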

Deploy.sh

That works great for the server, but to start everything off you'll need a local deploy.sh script to run when you're ready to push new code. This script builds Next.js for production locally to make sure nothing breaks when you run the same command on the server, then runs the end-to-end (e2e) tests and unit tests.

#!/bin/bash
echo "building locally..."
if npm run build; then
  echo "built locally"
else
  echo "build failed, aborting"
  exit 1
fi
# kill anything running on port 3000 so we can use it for e2es
kill -9 $(lsof -ti:3000)
# start up a local pm2 daemon to run our e2es against
pm2 start "npm run start" --name e2e
echo "running e2es locally..."
if npm run test:e2e; then
  echo "ran e2es locally"
else
  echo "e2e tests failed, aborting"
  pm2 delete e2e
  exit 1
fi
pm2 delete e2e
if npm run test:unit; then
  echo "ran unit tests locally"
else
  echo "unit tests failed, aborting"
  exit 1
fi
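Make the script executable once, and from then on a deploy starts with a single command:

chmod +x deploy.sh
./deploy.sh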

If everything passes, you're ready to push to the server! You can either add the following code to the deploy.sh script:

echo "input commit message >"
read -r COMMIT
git add -A
git commit -m "$COMMIT"
git push origin master
git push server

or move it to a separate script entirely.
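If you go the separate-script route, something like the sketch below works - the file name push.sh is just a suggestion, and set -e makes bash bail out if any command fails:

#!/bin/bash
# push.sh (name it whatever you like)
set -e # abort on the first failing command
echo "input commit message >"
read -r COMMIT
git add -A
git commit -m "$COMMIT"
git push origin master
git push server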

That's about it! A few daemons, a handful of bash scripts, and you have a simple yet effective CI setup for your next side project.

Bonus: Learn Bash in 15 Minutes

Learning Bash is an underrated skill, and just investing 15 minutes can improve your scripting productivity dramatically.
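As a taste, here's a quick sketch of the handful of constructs the scripts in this post lean on:

#!/bin/bash
LOG=out.log                  # assign a variable (no spaces around the =)
echo "starting" > "$LOG"     # > overwrites the file, >> appends to it
date >> "$LOG" 2>&1          # 2>&1 sends stderr wherever stdout is going
PORT=$(lsof -ti:3000)        # $(...) captures a command's output
if [ -n "$PORT" ]; then      # if/else branches on a command's exit code
  echo "something is listening on port 3000" >> "$LOG"
else
  echo "port 3000 is free" >> "$LOG"
fi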
