I’m planning a Docker dev environment and am unsure whether running npm install as a cached layer is a good idea. I understand that there are ways to optimize Dockerfiles so that node_modules is only rebuilt when package.json changes, but I don’t want to rebuild node_modules from scratch every time package.json changes either. A fresh npm install takes over 5 minutes for us, and changes to package.json happen reasonably frequently. Someone who reviews pull requests and switches branches often could end up suffering through an infuriating number of 5-minute npm installs each day.

Wouldn’t it be better, in a case like mine, to install node_modules into a volume so that it persists across builds, and small changes to package.json don’t force a rebuild of the entire dependency tree?
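Something along these lines is what I have in mind (just a sketch; the my/dev-image tag and the .docker-node-modules cache directory are made-up names):

# Bind-mount the source tree, then shadow its node_modules with a host
# directory that survives image rebuilds and branch switches.
docker run -it --rm \
    -v "$PWD:/usr/src/app" \
    -v "$PWD/.docker-node-modules:/usr/src/app/node_modules" \
    my/dev-image /bin/bash -c "cd /usr/src/app && npm install && npm start"

That way npm install only has to reconcile the diff against an already-populated node_modules instead of starting from scratch.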
Yes. Don’t rebuild node_modules over and over again. Just stick them in a data container and mount it read-only. You can have a central process rebuild node_modules now and then.
As an added benefit, you get a much more predictable build because you can enforce that everyone uses the same node modules. This is critical if you want to be sure that you actually test the same thing that you’re planning to put in production.
Something like this (untested!):
# Build a minimal image whose only job is to own the node_modules volume.
docker build -t my/module-container - <<END_DOCKERFILE
FROM busybox
RUN mkdir -p /usr/local/node
VOLUME /usr/local/node
END_DOCKERFILE

# Create the data container from it.
docker run --name=module-container my/module-container

# Populate the volume. The original snippet omitted an image here; the
# official node image (which ships npm and bash) is assumed. Note that
# host paths passed to -v must be absolute.
docker run --rm --volumes-from=module-container \
    -v "$PWD/package.json:/usr/local/node/package.json" \
    node /bin/bash -c "cd /usr/local/node && npm install"
By now, the data container module-container will hold the modules specified by package.json under /usr/local/node/node_modules. It should now be possible to mount it in the production containers using --volumes-from=module-container.
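For example (equally untested; my/app and server.js are placeholders), using Node’s NODE_PATH variable so the app resolves modules from the shared location:

# Mount the pre-built modules read-only and point Node's module
# resolution at them.
docker run -d --volumes-from=module-container:ro \
    -e NODE_PATH=/usr/local/node/node_modules \
    my/app node server.js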