Should I containerize my personal automation scripts with Docker?
#1
I’m trying to decide whether it’s worth containerizing my personal automation scripts, which are just a mix of Python and shell scripts that scrape APIs and move files around. It feels like overkill for something that only runs on my own machine, but I keep running into dependency conflicts: updating one script’s libraries breaks another script.
#2
I tried Docker for a couple of scripts and it actually kept the dependencies from fighting each other. Each script could pin its Python version and required libs, so updating one package didn’t break another. The downside was the extra boilerplate and a longer start-up time for small tasks.
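The per-script pinning described above can be sketched as one small image per script. This is a minimal sketch, not a prescribed setup; `scrape.py` and `requirements.txt` are hypothetical file names, and the base image tag is just an example of pinning the Python version.

```dockerfile
# Hypothetical Dockerfile for one script: the Python version and the
# libraries are pinned inside this image, so updating this script's
# dependencies cannot affect any other script on the machine.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY scrape.py .
ENTRYPOINT ["python", "scrape.py"]
```

Build and run it with `docker build -t scrape .` and `docker run --rm scrape`; the `--rm` flag discards the container after each run, which suits one-shot automation tasks.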
#3
Before that I lived with virtualenvs on top of a shared system site-packages, which broke the moment a library updated. I kept a separate virtualenv per project, but the system environment still bled in whenever scripts assumed different tool versions. I eventually gave up and kept a patchy set of workarounds.
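For reference, the per-project virtualenv approach looks roughly like this. A minimal sketch, assuming one venv per script; the venv directory, requirements file, and script names are all hypothetical.

```shell
# Hypothetical per-script venv: each script gets its own interpreter and
# site-packages, so upgrading one script's libraries can't break another's.
VENV="${VENV:-.venv-scraper}"
python3 -m venv "$VENV"
"$VENV/bin/python" --version
# Then, per script (these files are hypothetical):
#   "$VENV/bin/pip" install -r requirements-scraper.txt
#   "$VENV/bin/python" scraper.py
```

The catch, as noted above, is that this only isolates Python packages; anything the scripts call from the system (shell tools, compilers, C libraries) is still shared.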
#4
Is the real problem that the external APIs keep changing, or that the environment isn’t stable? Containers only fix the second. I keep wondering if containerization is overkill for something I run on my own hardware, but the panic when an API changes feels real.
#5
I started using a tiny orchestrator script and a Makefile to bound run times and log output. It helped when I needed to test a new flow, but I still hit path issues and fragile shell steps. Not a clean fix, just a band-aid.
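The orchestrator idea above can be sketched in a few lines of Python: run each step with a timeout and capture its output, so one hung or failing script can't silently stall the rest. This is a minimal sketch under those assumptions; the step names and commands are illustrative.

```python
#!/usr/bin/env python3
"""Minimal orchestrator sketch: run each step with a bounded runtime
and log its output. Step names and commands below are hypothetical."""
import subprocess
import sys


def run_step(name, cmd, timeout=300):
    """Run one step, bounding its runtime and reporting stdout/stderr.

    Returns True on success, False on failure or timeout.
    """
    try:
        result = subprocess.run(
            cmd, capture_output=True, text=True, timeout=timeout
        )
    except subprocess.TimeoutExpired:
        print(f"[{name}] timed out after {timeout}s", file=sys.stderr)
        return False
    if result.returncode != 0:
        print(f"[{name}] failed: {result.stderr.strip()}", file=sys.stderr)
        return False
    print(f"[{name}] ok: {result.stdout.strip()}")
    return True


if __name__ == "__main__":
    # Hypothetical pipeline: each entry is (name, command).
    steps = [
        ("hello", [sys.executable, "-c", "print('hello')"]),
    ]
    for name, cmd in steps:
        run_step(name, cmd)
```

Driving each step through `subprocess.run` with `timeout=` is what bounds the runs; a Makefile on top of this mostly just decides which steps need to run at all.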