Whenever you work on a team coding project, everyone needs the same dependencies installed for the code to run. Typically, you share these in the form of a `requirements.txt` file (`pip freeze > requirements.txt`) in the project's Git repository. Installing the packages from that file on your local machine (`pip install -r requirements.txt`) should, in theory, let you work on the project without issues. In reality, there are often compatibility issues: you may already have a different version of a package installed, or a package version on your system gets overwritten by the one from the project's `requirements.txt`, breaking dependencies in other projects, and so on.

One common way of solving this is to use a virtual environment, the best-known options being Conda and the standard Python module venv.
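As a quick refresher, here is a minimal sketch of that workflow using venv (the environment folder name `.venv` is just a common convention, not something the post prescribes):

```bash
# Create and activate an isolated environment for the project
python3 -m venv .venv
source .venv/bin/activate        # on Windows: .venv\Scripts\activate

# Install the project's pinned dependencies into the environment
pip install -r requirements.txt

# After adding new packages, refresh the pinned list for your team
pip freeze > requirements.txt
```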

Problem solved?

Not entirely. In some cases, a specified package version can have different releases depending on, for example, the operating system. So if your co-worker uses Windows while you use Linux, they might not have access to exactly the same release as you.

The next approach to solving this is the subject of this post: dev containers, and more specifically the Visual Studio Code Dev Containers extension.

What this does is create a full-featured development environment, with a VS Code Server, in the form of a Docker container.

This makes your set-up nearly identical to your co-workers': all packages are installed straight into an identical OS (the container). Even the VS Code extensions used in the project (e.g. a specific code formatter like Black) can be installed automatically. All of this eliminates much of the time spent setting up an environment, something that should be straightforward but often causes headaches and lost productive time.

It will also allow you to seamlessly switch your entire development environment just by connecting to a different container.

The extension handles all of the setup based on a few configuration files contained in a folder called `.devcontainer`. The main one is the `devcontainer.json` file, which looks similar to this:

```json
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/python
{
    "name": "myproject",
    // Or use a Dockerfile or Docker Compose file. More info: https://containers.dev/guide/dockerfile
    "image": "mcr.microsoft.com/devcontainers/python:0-3.11",
    // Features to add to the dev container. More info: https://containers.dev/features.
    // Use 'forwardPorts' to make a list of ports inside the container available locally.
    // "forwardPorts": [],
    // Use 'postCreateCommand' to run commands after the container is created.
    "postCreateCommand": "pip3 install --user -r requirements.txt",
    // Configure tool-specific properties.
    "customizations": {
        // Configure properties specific to VS Code.
        "vscode": {
            "extensions": [
                "ms-vscode.live-server",
                "ms-python.black-formatter"
            ]
        }
    }
    // Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.
    // "remoteUser": "root"
}
```
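As the comment in the template hints, you can also build the container from your own Dockerfile instead of pointing at a pre-built image. A minimal sketch of what that could look like (the Dockerfile path next to `devcontainer.json` is an assumption on my part):

```json
{
    "name": "myproject",
    // Build the container image from a Dockerfile in the .devcontainer folder
    "build": {
        "dockerfile": "Dockerfile"
    },
    // Install the project's dependencies once the container exists
    "postCreateCommand": "pip3 install --user -r requirements.txt"
}
```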

The VS Code extension has tools to help you build this, but you can also just Google the parts you need. For example, each VS Code extension's identifier can be found on its VS Code Marketplace page, in the lower right under "More Info > Unique Identifier".

One issue you might encounter is that generated files and plots are more difficult to access. There are multiple possible solutions for this:

  1. You mount a folder on your local filesystem into the Docker container's filesystem and use it to export data (see the sketch after this list).
  2. A better solution, in my opinion, is to use an extension such as the Jupyter extension.
  3. Yet another option would be to treat it as a headless server and use some sort of file-server set-up, or a similar more permanent solution, to share generated files.
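For the first option, a bind mount can be declared directly in `devcontainer.json`. Here is a minimal sketch; the local `output` folder and the target path inside the container are placeholders, not part of the original template:

```json
{
    "name": "myproject",
    "image": "mcr.microsoft.com/devcontainers/python:0-3.11",
    // Bind-mount a local folder into the container so generated files
    // (plots, exports, ...) show up directly on the host filesystem.
    "mounts": [
        "source=${localWorkspaceFolder}/output,target=/workspaces/output,type=bind"
    ]
}
```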

Finally, to make this tool even more useful, we can take this approach one step further by hosting the dev container in the cloud, with GitHub Codespaces being an obvious choice.

The added benefits are:

- Any potential hardware-related discrepancies are eliminated, as the whole team works on exactly the same machine.
- GitHub Codespaces offers a web interface for VS Code, in addition to the possibility of connecting from your locally installed VS Code application.
- You're not limited to the computing power of the physical device you're working on.
- It's well integrated into GitHub, and a codespace can be created from an existing `devcontainer.json` file in your GitHub repo.

This means that in theory you could be coding and running hardware-intensive computations on a device as light as an iPad (as long as you have an active internet connection). It also means you don't need to buy a beefy workstation for every team member, as the heavy lifting is done in the cloud.
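If the default machine isn't powerful enough, the resources a codespace should get can be hinted at from the same configuration file. A hedged example using the `hostRequirements` property of `devcontainer.json` (the specific values below are placeholders, not a recommendation):

```json
{
    "name": "myproject",
    "image": "mcr.microsoft.com/devcontainers/python:0-3.11",
    // Ask for a machine with at least these resources; Codespaces picks
    // the smallest available machine type that satisfies them.
    "hostRequirements": {
        "cpus": 4,
        "memory": "8gb",
        "storage": "32gb"
    }
}
```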

The downside is that the service costs money: GitHub Codespaces is billed based on computing time and storage days. However, there is a free tier where you get x hours of CPU time and y storage days for free; more info here.

I hope this post was able to inspire some people to explore different possible solutions to a common problem.