How should I containerize my automation scripts without breaking old machines?
#1
I’m trying to decide if it’s worth containerizing my personal automation scripts, mostly Python and shell, that I run across a couple of old machines. It feels like overkill for something that isn’t a service, but I’m tired of dealing with mismatched library versions and paths every time I tweak something.
#2
I tried Docker for a couple of scripts to lock down Python and shell deps, and it did reduce the drift. But on old hardware the image felt bulky, and pulling it over SSH could stall me longer than just fixing a virtualenv. It gave predictability, but the maintenance overhead was real.
#3
I went the virtualenv/pyenv route instead. A small wrapper that activates the right env and runs the script kept things portable across machines I own. Still, you end up recreating envs on each box and you can trip over system libraries when the OS changes.
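A minimal sketch of that kind of wrapper, assuming envs live under ~/.venvs (the path and env name here are hypothetical, not my exact setup):

```shell
#!/bin/sh
# wrapper sketch: prefer the per-machine env, fall back to system python3 if
# it's missing. ~/.venvs/automation is an assumed location for illustration.
set -eu

ENV="${HOME}/.venvs/automation"
if [ -x "$ENV/bin/python" ]; then
    PY="$ENV/bin/python"
else
    echo "note: $ENV missing, falling back to system python3" >&2
    PY="$(command -v python3)"
fi

# In real use this would be: exec "$PY" "$@"
# Here it just reports which interpreter it resolved to.
"$PY" -c 'import sys; print(sys.executable)'
```

In real use the last line would be `exec "$PY" "$@"` so the wrapper hands off to the script with the right interpreter.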
#4
Do you actually need to share results across boxes, or are you just iterating on one and porting later? Maybe the real bottleneck is PATH and which Python is first in line, not container overhead.
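A quick way to check that on any box (just standard diagnostics, nothing machine-specific):

```shell
# Which python3 wins on PATH, and which binary actually runs:
command -v python3                                # first python3 on PATH
python3 -c 'import sys; print(sys.executable)'    # the interpreter actually used
python3 -c 'import sys; print(sys.version.split()[0])'  # its version
```

If the first two disagree with what you expect, the problem is PATH ordering, not packaging.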
#5
I ended up with a hybrid: a tiny launcher that creates a venv if missing, uses a pinned requirements.txt, and then runs the script. Not a full container, but it keeps things consistent without dragging in a whole runtime. I still keep the repo updated and deploy the changes by copying the folder, which feels slower on the old machines.
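Roughly, the launcher looks like this (a sketch; the .venv path and requirements.txt name are assumptions, adjust to taste):

```shell
#!/bin/sh
# launcher sketch: create the venv on first run, install pinned deps,
# then run whatever script is passed in.
set -eu

VENV=".venv"
REQS="requirements.txt"

if [ ! -x "$VENV/bin/python" ]; then
    python3 -m venv "$VENV"                       # one-time cost per machine
    if [ -f "$REQS" ]; then
        "$VENV/bin/pip" install -q -r "$REQS"     # pinned versions keep boxes in sync
    fi
fi

# In real use: exec "$VENV/bin/python" "$@"
# Sanity check that we're inside the env:
"$VENV/bin/python" -c 'import sys; print(sys.prefix)'
```

The key point is that the env is rebuilt from the pinned requirements.txt on each box, so copying the folder over is enough to bootstrap a fresh machine.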