Why would webpacker default to overwriting my custom configs? It really just shouldn't do that. It overwrote my custom fucking configs. A week's worth of time spent fixing an issue that should never have existed. F to pay respects.
It all started at one commit
The signup-errors merge into master was really what did me in. The commit was failing pipelines for who knew what reason. I was really just trying to pull it into the master branch, and figured the failures were because the features weren't in master yet. Boy, was I wrong.
So I merged and it broke. It broke big time. None of the pipelines were building, for anything.
I had no idea why. I thought the JS, using edge packages from Alpine, was breaking the configuration. It wasn't. How did I know? I have two other projects building Rails apps with the same CI/CD pipeline, and those had no problem. I figured it wasn't the pipeline, it was the developer, a la me. Enter debug mode.
`git add .`, `git commit -m "hotfix1"`
Well… the debug process for pipelines is a little weird, because I can't view the changes unless I push them to the pipeline. A little bit of a nuisance, sure, but I needed to know why these pipelines weren't passing. I switched branches (learned my lesson last time) and started to work.
Truthfully, I thought I had fat-fingered something in my `webpacker.yml` file. It was weird, though, because I was able to build the assets on my local machine. Classic "it works on my machine" scenario if you ask me.
Nonetheless, I continued to mess with the `webpacker.yml` config and even deleted `node_modules` and some of the other unneeded folders.
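That clean-and-rebuild loop looked roughly like this (hypothetical commands; assumes yarn and a standard Rails + webpacker layout, not the exact steps from that night):

```shell
# Hypothetical cleanup-and-rebuild loop: clear stale JS deps and
# caches, reinstall, and recompile the packs.
rm -rf node_modules public/packs tmp/cache
yarn install
RAILS_ENV=production bundle exec rails webpacker:compile
```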
After everyone has gone to bed
I couldn't find the root of my problem, so I went directly to the root of it: the Dockerfile itself.
I spun up a base Alpine image for myself and walked through every step of my Dockerfile, ensuring that none of the current OS packages or baseline dependencies were breaking my configuration. I passed in the project root as a volume with `-v $(pwd):/app/`, and everything was partially working. Through this process I ended up migrating from `ruby2.7.0-alpine11`, thinking the Alpine image was partially to blame.
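The interactive walkthrough was along these lines (a sketch of the debugging session; the image tag and working directory here are assumptions, not the exact ones from the project):

```shell
# Mount the project root into a throwaway container and get a shell,
# then replay each RUN line from the Dockerfile by hand.
docker run --rm -it \
  -v "$(pwd)":/app/ \
  -w /app \
  alpine:3.11 sh
```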
After standing up the new container and working from an interactive terminal inside it, I was able to build the assets. I was shocked, asking myself why this wasn't working in the pipeline. I was a little upset with GitLab, and started to put some of the blame on them and their shitty pipeline runners with minimal RAM. (Little did I know it was my problem. Also, huge shoutout to GitLab, love their services!) I then asked myself where in the process this was going wrong. I had all the info I thought I needed printing to the output, so what more was going on? I looked into the webpacker documentation some more, then went back to my pipeline, specifically the webpacker installation portion, to find this:
With the word `force` I knew immediately it had to be webpacker overwriting my custom configurations. I decided to run `cat /app/config/webpacker.yml` before the `webpacker:install` step and again after it, and sure enough, `webpacker:install` was overwriting my customizations. I solved this pretty quickly by overwriting the overwrites: running Docker `COPY` commands after the install completed. The file looks something like this (partial below):
```dockerfile
# Compile Assets
RUN RAILS_ENV=production bundle exec rails webpacker:install
RUN RAILS_ENV=production bundle exec rails webpacker:info
RUN yarn install

# Overwrite the overwritten defaults
COPY ./config/webpacker.yml /app/config/webpacker.yml
COPY ./config/webpack/environment.js /app/config/webpack/environment.js
COPY ./config/webpack/production.js /app/config/webpack/production.js
COPY ./babel.config.js /app/babel.config.js

RUN RAILS_ENV=production bundle exec rails webpacker:compile
```
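The pattern itself has nothing to do with Docker: an installer force-writes a default over your file, and you restore your version from a copy kept out of its reach. A minimal simulation with plain files (all paths and contents here are made up for illustration):

```shell
# Simulate the failure mode and the fix with plain files.
mkdir -p demo/config demo/backup
echo "custom: true" > demo/config/webpacker.yml          # my customized config
cp demo/config/webpacker.yml demo/backup/webpacker.yml   # copy kept out of the installer's reach
echo "default: true" > demo/config/webpacker.yml         # stand-in for the installer's force-overwrite
cp demo/backup/webpacker.yml demo/config/webpacker.yml   # overwrite the overwrites
cat demo/config/webpacker.yml                            # prints: custom: true
```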
All about the data you get
The output I was receiving from GitLab was cryptic. It read something like:
```
Compiling...
Compilation failed:
Invalid configuration object. Webpack has been initialised using a configuration object that does not match the API schema.
 - configuration.entry should be an non-empty object.
   -> Multiple entry bundles are created. The key is the chunk name. The value can be a string or an array.
The command '/bin/sh -c RAILS_ENV=production bundle exec rails assets:precompile' returned a non-zero code: 1
ERROR: Job failed: exit code 1
```
How was I supposed to decrypt that? It wasn't dropping my config file back to me, and nor should it. It was letting me know there was an error with my config file. My takeaways:
- Request output for possible file changes when things don’t work as expected. Something may be overwriting your custom files.
- Sometimes applications aren’t very smart, no matter how smart you think they are.
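One cheap way to act on the first takeaway (filenames here are illustrative): checksum the file before and after the suspect step, and let the build output tell you when something rewrote it.

```shell
# Detect "something rewrote my file": hash before and after the
# suspect step, then compare the hashes.
mkdir -p config && echo "custom: true" > config/webpacker.yml
sha256sum config/webpacker.yml > /tmp/before.sum
echo "default: true" > config/webpacker.yml   # stand-in for the suspect step
sha256sum config/webpacker.yml > /tmp/after.sum
if diff -q /tmp/before.sum /tmp/after.sum > /dev/null; then
  echo "config unchanged"
else
  echo "config was modified"    # this branch fires here
fi
```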