How to transport a snake?

So we all know planes are not a good idea. Before you know it you have Samuel L. Jackson shouting in your ear.

So maybe we need a few new ideas.

Point of view

As most quasi-intellectuals who want to sound philosophical like to say: it all depends on your point of view. In the case I want to make for how to ship Python applications, I want you to embody the different individuals who take part in shipping it, or deploying it if you want to be technical. One is the developer who writes the source code; the other is the DevOps engineer tasked with deploying it wherever it is needed.

Ecosystem

The ecosystem for Python applications is great if you are the developer who just has to write the source code and make it work. You activate a virtual environment using whatever tool you prefer. You install the third-party libraries you need beyond the standard library. You write the code until it all works. Job done.
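For reference, that whole flow fits in a few lines of the standard library. This is only a minimal sketch; the environment path and the package name are placeholders for whatever your project actually uses.

```python
import venv
import subprocess

# Create a virtual environment with pip available (".venv" is a placeholder path).
venv.create(".venv", with_pip=True)

# Install the third-party libraries the application needs (package name is an example).
subprocess.run([".venv/bin/pip", "install", "requests"], check=True)
```

On Windows the pip executable lives under .venv\Scripts\ instead, but the idea is the same.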

The ecosystem for deploying it sucks. You can reuse the same flow as the developer, but that is not the context and environment of the DevOps engineer. You really want to use the packages from the package manager of the operating system you are deploying to. You cannot, however, because the versions do not line up. So you have to choose.

Either Or

Either you use the developer tools and flows throughout your deployment and Continuous Integration process, or you use the system packages and pre-built Docker images in which your code runs. It is either or. There is a big discrepancy between the developer context and the deployment context, or if you will, the end user.

Let us assume you made something and you want other people to run it too. How do you get your Python application to the end user? You might get away with building a self-running archive. One thing still needs to be there, though, and that is the C libraries on the host machine. So you cannot get away from prepping the host machine's environment.
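As a rough sketch of what such a self-running archive could look like, here is the standard library's zipapp module; the directory and file names are assumptions, and any compiled extension modules your code imports still have to be present on the host.

```python
import zipapp

# Bundle an application directory (containing __main__.py and its pure-Python
# dependencies) into a single executable .pyz file. Paths here are hypothetical.
zipapp.create_archive(
    "myapp",                             # source directory with a __main__.py
    target="myapp.pyz",                  # the self-running archive
    interpreter="/usr/bin/env python3",  # shebang, so ./myapp.pyz just runs
)
```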

The one solution is to statically compile everything instead of dynamically linking it. That means a lot of management done by the developers or DevOps engineers themselves. It would solve the problem, though: you could ship your archive and anyone else could just run it. It is not a feasible thing to accomplish, as it introduces a lot of unneeded complexity.

Two layers

There is also the possibility of building the self-running archive and then putting it inside a Docker image. These two layers can have two independently running pipelines: one that produces the archive and one that produces the image. Now you can update the runtime without changing the source code and without running the risk of introducing anomalies. You take the archive out, update the Python runtime, put the archive back in, and run it.
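A minimal sketch of that image-producing pipeline could look like the Dockerfile below, assuming the other pipeline already produced a myapp.pyz archive; the base image tag and file names are placeholders.

```dockerfile
# Runtime layer: owned by its own pipeline, independent of the source code.
FROM python:3.12-slim

# Application layer: the archive from the other pipeline is dropped in as-is.
COPY myapp.pyz /opt/myapp.pyz

# Updating the Python runtime is just a change to the FROM line and a rebuild;
# the archive itself never has to be rebuilt.
ENTRYPOINT ["python", "/opt/myapp.pyz"]
```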

It gives you more fine-grained control if that is needed.

#devops #python