
As a developer I’ve built up a set of tools that I depend on to get shit done, and I use them across multiple machines. I use a variety of tools both personally and professionally, spanning multiple languages. I then complicate things by using these tools across four different devices: a work laptop (Lenovo P52), a Surface Pro 4, a Raspberry Pi 4 (used as a desktop machine) and an Asus cloudbook (running Ubuntu). It’s handy to be able to use the same version of these tools and only have to update them once, without having to remember to update each machine. This gives me consistent features no matter which device I’m currently working on.

Step forward Docker. When people think of Docker they always think of running their software, but never about running their local tools. On my RPi 4 and Asus cloudbook, I don’t have Node, npm or .NET installed on either of them. Yet I can continue to develop using them thanks to Docker. I have a variety of small images for my Node workflow, accessed via aliases. I also cheat, and don’t have dotnet installed on my machine!

alias npm='docker run -it --rm -u "$(id -u):$(id -g)" -v "$(pwd)":/directory -w /directory -p 4000:4000 node:alpine npm'
alias node='docker run -it --rm -u "$(id -u):$(id -g)" -v "$(pwd)":/directory -w /directory -p 4000:4000 node:alpine'
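The same pattern covers my dotnet cheat. As a sketch only (the post doesn’t show this alias, and the image tag you pin is up to you), it might look like this, using the official .NET SDK image:

```shell
# Hypothetical dotnet alias in the same style as the node ones.
# mcr.microsoft.com/dotnet/sdk is the official SDK image; pin a tag in practice.
alias dotnet='docker run -it --rm -u "$(id -u):$(id -g)" -v "$(pwd)":/directory -w /directory mcr.microsoft.com/dotnet/sdk dotnet'
```

Any CLI tool that only needs the current directory works the same way.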

Something to note is the -u flag: it runs the container as my current user, so files created in the mounted directory keep my ownership instead of being owned by root. The -v $(pwd) mount lets me use these aliases as if the tool were installed globally on my machine. Bye-bye managing globally installed npm tools. Finally, the --rm flag ensures that the container isn’t persistent and is deleted when I’m done with it.
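If an alias gets too rigid — say you want a different published port per project — the same flags work as a shell function. This is a sketch, not from the original post, and the function name and NODE_PORT variable are made up for illustration:

```shell
# Hypothetical function variant of the node alias.
# NODE_PORT overrides the published port; defaults to 4000 like the aliases above.
ndocker() {
  local port="${NODE_PORT:-4000}"
  docker run -it --rm \
    -u "$(id -u):$(id -g)" \
    -v "$(pwd)":/directory \
    -w /directory \
    -p "${port}:${port}" \
    node:alpine "$@"
}
```

Usage would be e.g. `NODE_PORT=3000 ndocker npm start` to serve on port 3000 instead.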
