
Coding environment and IDE

Every software project has dependencies of some sort. The first to come to mind may be the run-time or compile-time dependencies, without which one's application cannot run or be compiled. However, "build tools" (like compilers or interpreters), "development tools" (testing, packaging, documentation), "validation tools" (linters, formatters) and any other tool used during the software life-cycle are also dependencies from the project's point of view.

To enable collaboration, speed up onboarding and reduce maintenance, all dependencies should be easily findable and installable, while interfering as little as possible with the host system. In this lecture, we will start from the pkoffee initial archive, add dependency management using pixi and include our first tools. We will demonstrate how to connect the VS Codium IDE to the project's environment to leverage its features.

Software dependencies

Let's start by having a look at the pkoffee project and figure out which dependencies we need. First, let's clone the repository, checking out the initial state at the tag Day0:

git clone --branch Day0 git@github.com:s3-school/pkoffee.git
and look at what is available:
cd pkoffee && ls -l
total 3960
-rw-r--r-- 1 pollet cta 4044243 janv. 12 16:42 coffee_productivity.csv
-rw-r--r-- 1 pollet cta    2293 janv. 12 16:42 main.py
-rw-r--r-- 1 pollet cta     382 janv. 12 16:42 README.md
We have 3 files:

  • README.md is a documentation file containing a little information about the project
  • coffee_productivity.csv is a Comma Separated Values file with coffee-productivity data points stored as 2 columns (we can have a quick look with head coffee_productivity.csv).
  • main.py is a python script implementing the analysis of the coffee-productivity data.

There isn't much information about the project's dependencies: no instructions to build or run the software, nor how to run tests... What are the compatible python versions? Are there any run-time dependencies? We can only guess.

Since this script has been provided to us by a very trustworthy source, let's just try to run it; maybe it will just work!

Never run untrusted code!

Even if you trust the source, you should know what a piece of software will do to your system: will it install something? Will it write data somewhere? Will it connect to an external service?

python3 main.py # (1)!
Traceback (most recent call last):
  File "/home/pollet/Documents/s3school/pkoffee/main.py", line 2, in <module>
    import matplotlib.pyplot as plt
ModuleNotFoundError: No module named 'matplotlib'
  1. Run main.py using the system's python

The ModuleNotFoundError we are getting shows that the script couldn't import the matplotlib dependency, which it appears to require. Looking more closely at the main.py script, we can see that the beginning of the file imports a few python libraries:

head -n 7 main.py
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
from scipy.optimize import curve_fit
from pathlib import Path
import seaborn as sns
So it seems our project has a few requirements. How should we install them?

Virtual environments

The python3 command we used to try to run the script is our system's python. Indeed, some version of the python interpreter typically comes pre-installed on most Linux distributions. We can have a look at our system's python installation:

which python3 # (1)!
/usr/bin/python3
ls -l /usr/
total 160
drwxr-xr-x   2 root root 69632 janv. 12 10:44 bin
drwxr-xr-x   2 root root  4096 déc.  19  2024 games
drwxr-xr-x  42 root root 16384 sept. 24 10:23 include
drwxr-xr-x 114 root root  4096 déc.  12 10:07 lib
drwxr-xr-x   2 root root  4096 déc.  19  2024 lib32
drwxr-xr-x   2 root root  4096 sept. 24 10:24 lib64
drwxr-xr-x  26 root root 12288 sept. 17 08:30 libexec
drwxr-xr-x   2 root root  4096 déc.  19  2024 libx32
drwxr-xr-x  11 root root  4096 févr. 14  2025 local
drwxr-xr-x   2 root root 20480 janv.  9 10:05 sbin
drwxr-xr-x 298 root root 12288 janv. 12 10:10 share
drwxr-xr-x   7 root root  4096 déc.  12 10:07 src

  1. We can find out where our python executable is using the which command

The python we were using is located in /usr/, which contains a set of directories making up an "installation" or "environment". Installed binaries go in the bin directory, libraries in lib or lib64, include files in include, etc.

It is a very bad idea to use our system's python to manage our project's dependencies:

  1. This python is used by other things in our system. Modifying it or installing new dependencies could affect them.
  2. What if we get another project that is "pkoffee-incompatible"? Using a single python installation forces us to make all our developments use compatible dependencies, which is almost never possible in practice.

Virtual environments were invented to tackle this problem: they allow creating separate, isolated installations called "environments" on the same machine. When an environment is "activated", it takes precedence over the system's installation, allowing one to work with the dependencies and tools installed in the environment. When deactivated, an environment doesn't affect the system's installation or other environments.
Concretely, a virtual environment consists of a separate set of directories making up an installation, and an "activation" script that essentially modifies the shell environment to make the environment's content take priority (for instance: updating the $PATH and $LD_LIBRARY_PATH variables to use the environment's paths first).
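
Conceptually, activation is little more than path manipulation. A simplified sketch of what an activation script does (the paths are illustrative):

# Prepend the environment's directories so its binaries and libraries are found first
export PATH="/path/to/env/bin:$PATH"
export LD_LIBRARY_PATH="/path/to/env/lib:$LD_LIBRARY_PATH"

Deactivation simply restores the previous values.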

Python Virtual Environments

Since version 3.3, the python standard library includes a venv module that supports creating virtual environments. On Debian/Ubuntu systems, however, it doesn't work out of the box and requires the python3-venv package. Once a virtual environment is created, one can use pip, the package installer for python, to install python packages from the python package index. However, this approach is not recommended as the mentioned tools, pip and venv, do not provide on their own a way to maintain reproducible environments.
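
For reference, a minimal sketch of this standard-library workflow (assuming the python3-venv package is installed; the installed package is just an example):

# Create a virtual environment in the .venv directory
python3 -m venv .venv
# Activate it: .venv/bin now takes precedence in $PATH
source .venv/bin/activate
# Install packages from the python package index
pip install matplotlib
# Leave the environment
deactivate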

Due to the limitations of the standard tools, a number of additional tools and full-fledged package and environment managers have been implemented in the python ecosystem: pip-tools, pipx, poetry, pyenv, virtualenv... At the date of writing (01/2026), the recommendation of the author would be to use uv, a "fast Python package and project manager" that also gives reproducibility guarantees.
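
For illustration, a minimal uv workflow sketch (commands from uv's CLI; the exact files produced may evolve):

# Create a project manifest (pyproject.toml)
uv init
# Add a dependency; uv also maintains a uv.lock lock file for reproducibility
uv add matplotlib
# Run a command in the project's environment
uv run python main.py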

Virtual Environment for all languages: conda

Virtual environments are not exclusive to python installations: pretty much anything that can be installed in the standard set of directories can also be installed in a virtual environment. In the 2010s, the complex multi-language environments required by data scientists motivated the development of conda: an open-source, cross-platform and language-agnostic package and environment manager, requiring only a user's permissions. The conda-forge distribution is a community-managed distribution of conda packages that provides conda packages for most of the python packages on pypi, but also R, Ruby, Lua, Scala, Java, Javascript, C, C++, FORTRAN, etc., making it possible to craft complex multi-language environments with a single format and specification.

For a project having only pypi dependencies, there isn't necessarily much to gain by using conda: the dependencies are either already multi-platform (pure python) or platform-specific packages exist (eg for packages that are python wrappers of a C library), and the environment would be easy to install as a user with uv for instance. Nevertheless, providing a conda package makes the software directly available to conda users, and the overhead of providing a conda package can be very small: conda packages can be built for free using conda-forge infrastructure, an initial conda recipe can usually be generated from a python project using grayskull, and updates for newer versions can be handled automatically as well. The decision for a particular project should be made based on the interest for the conda community weighed against the maintenance costs.
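
For instance, generating an initial recipe from a package published on pypi is a one-liner (a sketch; pkoffee is not on pypi, so the package name here is hypothetical):

# Generate a conda recipe from a pypi package's metadata
grayskull pypi pkoffee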

For the purpose of this school, the choice was made to work with the conda ecosystem (even though pkoffee is implemented purely in python) as it shows a more generic (language-agnostic) solution, and adapting what is shown to another language should be easy. We will be using pixi, a modern conda package and environment manager that provides reproducibility guarantees and multi-environment support in a single workspace. The usage of conda or mamba is not recommended as they do not offer the same guarantees with respect to reproducibility.

Pkoffee Environment Set-up

If you haven't already, follow instructions to install pixi on your machine.

Let's make a working virtual environment for our pkoffee project. We can use the pixi command line interface to initialize a workspace and add dependencies:

# Initialize the workspace by adding a mostly empty pixi manifest + pixi git handling configuration (.gitignore, .gitattributes)
pixi init
# Add the school's supported platforms to our workspace: adding a dependency will now solve a working environment for all platforms
pixi workspace platform add linux-64 linux-aarch64 osx-64 osx-arm64 win-64
# Add the dependencies that are imported in main.py
pixi add python numpy matplotlib pandas scipy seaborn

The above commands have added the following files to our directory:

  • pixi.toml is the pixi manifest describing the workspace. It has the following content
    [workspace]
    authors = ["Vincent Pollet <vincent.pollet@lapp.in2p3.fr>"]
    channels = ["conda-forge"]
    name = "pkoffee"
    platforms = ["linux-64", "osx-64", "osx-arm64", "linux-aarch64", "win-64"]
    version = "0.1.0"
    
    [tasks]
    
    [dependencies]
    numpy = ">=2.4.1,<3"
    matplotlib = ">=3.10.8,<4"
    pandas = ">=2.3.3,<3"
    python = ">=3.13.11,<3.15"
    scipy = ">=1.17.0,<2"
    seaborn = ">=0.13.2,<0.14"
    
    The dependencies were added with bounds based on semantic versioning: the latest found version is used as lower bound and the next breaking change version is used as upper bound.
  • pixi.lock is the pixi lock file. It contains detailed information about every environment in this workspace, and every package in those environments, including package hashes and download links. It is the lock file that allows an environment to be reproduced exactly, even several years later, provided that the packages are still available at their hosting places. Conda-forge, the default channel used by pixi, has the policy that packages are immutable, so conda-forge based environments will be reproducible. Note that this is not necessarily the case for pypi packages. (A quick way to inspect what was actually solved and installed is shown right after this list.)
  • .pixi is the workspace environment directory. Unlike conda, which uses "globally" available environments (activatable from anywhere), pixi workspaces can only be used from a particular location, the workspace, on your machine. The goal of this design is to push users to create many small and specific workspaces for each project, instead of gigantic environments combining several use cases. As a result, environments are usually stored directly in the workspace directory. We can see the virtual environment directories with for instance:
    ls -l .pixi/envs/default/
    total 120
    drwxr-xr-x  5 pollet cta  4096 janv. 12 23:02 bin
    drwxr-xr-x  2 pollet cta 12288 janv. 12 23:02 conda-meta
    drwxr-xr-x  6 pollet cta  4096 janv. 12 23:02 etc
    drwxr-xr-x  2 pollet cta  4096 janv. 12 23:02 fonts
    drwxr-xr-x 44 pollet cta 12288 janv. 12 23:02 include
    drwxr-xr-x 26 pollet cta 40960 janv. 12 23:02 lib
    drwxr-xr-x  3 pollet cta  4096 janv. 12 23:02 libexec
    drwxr-xr-x  3 pollet cta  4096 janv. 12 23:02 man
    drwxr-xr-x  3 pollet cta  4096 janv. 12 23:02 PySide6
    drwxr-xr-x  2 pollet cta  4096 janv. 12 23:02 sbin
    drwxr-xr-x 37 pollet cta  4096 janv. 12 23:02 share
    drwxr-xr-x  3 pollet cta  4096 janv. 12 23:02 shiboken6
    drwxr-xr-x  2 pollet cta  4096 janv. 12 23:02 shiboken6_generator
    drwxr-xr-x  4 pollet cta  4096 janv. 12 23:02 ssl
    drwxr-xr-x  3 pollet cta  4096 janv. 12 23:02 var
    drwxr-xr-x  3 pollet cta  4096 janv. 12 23:02 x86_64-conda-linux-gnu
    
  • .gitattributes contains rules for git to treat the lock file as binary data. The lock file is intended to be a machine-only file: it should never be merged by users. Indeed, if a lock file could be created, it means an environment could be solved that satisfies the pixi manifest, guaranteeing that it can be reproduced. Manually editing a lock file (for instance during a merge) would break this guarantee. The lock file should be checked in to source control so your environment specifications are properly tracked.
    # SCM syntax highlighting & preventing 3-way merges
    pixi.lock merge=binary linguist-language=YAML linguist-generated=true -diff
    
  • .gitignore contains rules for git to ignore the environment files that should never be tracked by source control.
    # pixi environments
    .pixi/*
    !.pixi/config.toml
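
To check what was actually solved and installed in an environment, pixi provides an inspection command (the output depends on the solve date):

# List the packages installed in the default environment
pixi list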
    

Note

Semantic versioning is a versioning convention and best practice whose purpose is to communicate software changes using version increments.

Semantic versioning rules

Given a version number MAJOR.MINOR.PATCH, increment the:

  1. MAJOR version when you make incompatible API changes
  2. MINOR version when you add functionality in a backward compatible manner
  3. PATCH version when you make backward compatible bug fixes

The advantage of this convention is that it allows your users to define dependency constraints that should automatically remain compatible with new dependency versions until the next breaking change. It is strongly advised to follow semantic versioning for your project's versions, as well as to define semantic-versioning-based bounds for your dependencies, so that both you and your users can benefit from it. Many projects follow this convention, however not all: python itself doesn't, nor does, for instance, pytorch.
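
For example, a dependency bound written against semantic versioning accepts every backward compatible release and stops at the next major version (a sketch with a hypothetical package name):

[dependencies]
# accept 1.2.3 and any backward compatible release, refuse the next breaking one
somepackage = ">=1.2.3,<2"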

Run the pkoffee analysis in its conda environment

pixi provides 2 main commands to interact with environments:

  1. pixi shell activates an environment, similarly to conda activate or venv's source venv/bin/activate. Once the shell is activated, the dependencies are available and take precedence over the system installation, so for instance a command like which python would return /path/to/workspace/.pixi/envs/default/bin/python: the path to the workspace default environment's python installation.
  2. pixi run will execute a task (see later lectures) or a shell command in an environment, as sketched below. pixi run which python also returns /path/to/workspace/.pixi/envs/default/bin/python.
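
A quick illustration of both commands (the path shown is for this workspace; yours will differ):

# Activate the default environment, inspect it, then leave
pixi shell
which python   # -> /path/to/workspace/.pixi/envs/default/bin/python
exit
# Or run a single command in the environment without activating it
pixi run which python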

Let's try to run pkoffee using our pixi environment!

pixi run python main.py

It runs! We managed to install pkoffee dependencies in a virtual environment!

We are incredibly lucky! The pkoffee example is sufficiently simple and recent that it still works with the latest versions of its dependencies. Beware: this is far from being the norm. A project left unattended, without a strong reproducibility enabler like pixi's lock file, quickly becomes unusable after a few years, sometimes months!

Add additional tools

So far we have only added to our workspace the dependencies required to run pkoffee. Let's now add a tool to help with its development: jupyterlab and jupyter notebooks.

Since these dependencies are not required to run pkoffee, it is best to separate them from the required dependencies, so users that don't need them don't have to download them. Indeed, visualization tools such as plotting libraries or graphical interfaces are usually pretty heavy, as they require a full display stack to be available (eg X server, qt library, etc.). Python packages typically use optional dependencies to handle this separation; conda packages, however, don't have the concept of an "optional" dependency. Thankfully, we can leverage pixi's ability to manage several environments to achieve the same goal!
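
For comparison, in python packaging this separation would be declared as an optional dependency group in pyproject.toml (a sketch, not a file that exists in pkoffee at this point):

[project.optional-dependencies]
# installed only on request, eg with: pip install "pkoffee[dev]"
dev = ["jupyterlab"]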

A pixi feature is the definition of a subset of an environment. Features are combined together to form the usable environments. Until now, since we didn't specify a feature, the default feature and the default environment were used. Let's now create a feature for our jupyterlab dependency, called dev:

pixi add --feature dev jupyterlab
This adds the following table to the pixi.toml:
[feature.dev.dependencies] # (1)!
jupyterlab = "*"

  1. This is the toml table for the dev feature. The default feature uses the dependencies table directly: dependencies is a shortcut for feature.default.dependencies.

This new feature is not yet used: the default environment only uses the default feature. In order to use our new feature, let's overwrite the default environment to use both the dev and default features:

pixi workspace environment add --force --feature default --feature dev default # (1)!
  1. The --force argument is required to overwrite an existing environment (default already exists)

This will add the following table to the pixi.toml:

[environments]
default = { features = ["default", "dev"] } # (1)!

  1. The default feature doesn't have to be explicitly included: it is always added to an environment unless no-default-feature = true is specified.

We now have a default environment that has both our previous dependencies from the default feature, and the jupyterlab dependency from the dev feature. This split allows us to create another environment, let's name it prod, that has only the default dependencies:

pixi workspace environment add --feature default prod

This adds the prod environment to our environments table:

[environments]
default = { features = ["default", "dev"] }
prod = { features = ["default"] }

There is one issue with the above environment definitions: prod and default have a set of dependencies in common, yet nothing enforces that the dependency versions solved for prod are the same as for default. This can lead to surprising results, for instance: pkoffee could work properly in the prod environment, but not in the default environment. In order to avoid this surprising behavior, pixi implements "solve groups": environments belonging to the same solve group are solved jointly, ensuring that the versions in all environments are consistent. We can update our manifest to request that the prod and the default environments be solved jointly:

pixi workspace environment add --force --feature default --feature dev --solve-group prod-group default
pixi workspace environment add --feature default --solve-group prod-group prod
which gives in the pixi.toml:
[environments]
default = { features = ["default", "dev"], solve-group = "prod-group" }
prod = { features = ["default"], solve-group = "prod-group" }

Test with several versions of python

Another common use case for multiple environments is testing a project with several python versions. At the time of writing, pixi run python --version returns Python 3.14.2: our default environment is using python 3.14. Let's add python 3.12 and 3.13 too! We cannot, of course, add them to the environment we have been using so far: they would conflict with the python 3.14 version, so let's add them in separate features:

pixi add --feature py312 "python>=3.12,<3.13"
pixi add --feature py313 "python>=3.13,<3.14"
pixi add --feature py314 "python>=3.14,<3.15"

and remove python from the default feature:

[dependencies]
numpy = ">=2.4.1,<3"
matplotlib = ">=3.10.8,<4"
pandas = ">=2.3.3,<3"
- python = ">=3.13.11,<3.15"
scipy = ">=1.17.0,<2"
seaborn = ">=0.13.2,<0.14"

We can now compose environments using the default feature and the python version feature:

# Let's consider python 3.13 as our "production" version
pixi workspace environment add --force --feature default --feature py313 --solve-group prod-313 prod
# Make the default environment the python 3.13 + development tools environment
pixi workspace environment add --force --feature default --feature py313 --feature dev --solve-group prod-313 default
# Create environments for python 3.12 and 3.14 too
pixi workspace environment add --force --feature default --feature py312 --solve-group prod-312 prod312
pixi workspace environment add --force --feature default --feature py314 --solve-group prod-314 prod314

Our environment definitions now read:

[dependencies]
numpy = ">=2.4.1,<3"
matplotlib = ">=3.10.8,<4"
pandas = ">=2.3.3,<3"
scipy = ">=1.17.0,<2"
seaborn = ">=0.13.2,<0.14"

[feature.dev.dependencies]
jupyterlab = "*"

[feature.py312.dependencies]
python = ">=3.12,<3.13"

[feature.py313.dependencies]
python = ">=3.13,<3.14"

[feature.py314.dependencies]
python = ">=3.14,<3.15"

[environments]
default = { features = ["default", "py313", "dev"], solve-group = "prod-313" }
prod = { features = ["default", "py313"], solve-group = "prod-313" }
prod312 = { features = ["default", "py312"], solve-group = "prod-312" }
prod314 = { features = ["default", "py314"], solve-group = "prod-314" }
We now have 5 features: the default feature with pkoffee's dependencies but no python version set, the dev feature with jupyterlab, and 3 python features, each with a specific python version. We combine these features to form 4 environments: the default environment with python 3.13, pkoffee's dependencies + jupyterlab, and 3 "prod" environments with pkoffee's dependencies, no additional development tool, and a specific python version. This may seem overkill at the moment, but we will see in further lectures that those environments will be useful to automatically test our package with different python versions.
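
As a quick sanity check, a command can be run in any named environment with the --environment (-e) flag (the exact patch versions you get will differ):

pixi run --environment prod312 python --version  # -> Python 3.12.x
pixi run --environment prod314 python --version  # -> Python 3.14.x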

IDE set-up

Now that we have a working environment for pkoffee, we can start working on the code. This is best done using an Integrated Development Environment (IDE). IDEs are programs that intend to boost programmers' productivity by providing useful information during code editing and automating common tasks. There is a wide variety of IDEs: some excel at a specific language but not so much at others, some bundle lots of tools and have great default settings, some are extremely customizable but require configuration and external tools, some are proprietary and commercialized, some are free and/or open-source, some integrate working with AI agents...

It is not the point of this lecture to recommend any particular IDE: one should choose according to their needs and tastes! As an example, each teacher of this school has worked on the school using their preferred IDE(s): we can cite at least VS Codium, VS Code, GNU Emacs, neovim, vi and Vim, zed, cursor... If you don't already have a favorite, we would recommend trying out a few of those to form your opinion.

For the rest of this lecture, we had to choose one IDE to demonstrate the expected features, and we will be using VS Codium, a "community-driven, freely-licensed binary distribution of Microsoft's editor VS Code". It offers a good compromise between easy set-up and customizability, multi-language support, and debugging and test support, while being free and freely licensed, and benefiting from the big user base of the popular VS Code. Keep in mind that none of the demonstrated features are exclusive to this IDE; feel free to adapt what follows to your favorite IDE.

Expected features

The following code editing features are usually expected from an IDE:

  • syntax highlighting: help programmers quickly identify keywords and structure
  • syntax checking: detect invalid syntax
  • code completion: suggest next characters or lines
  • code navigation: jump to symbol definition, show source of used dependencies
  • inline documentation: display documentation about used symbols (eg class or function documentation)
  • refactoring: renaming symbols, moving definitions between files

The following features are available in most IDEs. They integrate in the IDE information typically provided by a specific tool.

  • build automation: automate the usage of a language build back-end and understand its output (eg cmake support)
  • debugging: run debugging sessions from inside the IDE, display debugging information next to code
  • version control information: show git repository graph, commits, authors
  • test integration: display test results or test coverage within the IDE
  • code formatting: automatically format code files

Features such as code completion, syntax highlighting or inline documentation must be aware of your project's dependencies (available in the virtual environment) in order to work properly.

VS Codium Configuration

VS Codium is a generalist IDE that enables language support via extensions. Since pkoffee is a python project we should start by adding a few extensions helpful for python development:

  • Python, Python Debugger and Python Environments extensions enable python language support.
  • ty: a python language server and type checker. The pylance language server, used by default with VS Code, cannot be used with VS Codium, so the very new ty can be used as a replacement. If using ty, it is recommended to change the python language server setting of VS Codium to None (as ty will provide the language server).
  • Even Better Toml: an extension for .toml language support.
  • autoDocstring - Python Docstring Generator: an extension to quickly generate docstrings. For scientific projects, we recommend using the numpy docstring style.
  • ruff: a popular python linter and code formatter that can be used through an extension. See Quality Tools.

Configuring the virtual environment

For python projects, it is usually sufficient to configure the path to the python interpreter (in the virtual environment) for most tools to work properly, as they rely on it to discover installed dependencies. The python extension automatically tries to find a python interpreter among possible locations. If the interpreter from the pixi default environment wasn't automatically selected, follow these instructions to configure it.
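
Alternatively, the interpreter can be pinned in the workspace settings. A minimal .vscode/settings.json sketch (the setting name comes from the Python extension; the path below assumes a Linux/macOS layout):

{
  // Point the Python extension at the pixi default environment's interpreter
  "python.defaultInterpreterPath": ".pixi/envs/default/bin/python"
}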

VS Codium can auto-detect most environment settings if started in an activated environment

For other languages like C++, configuring VS Codium can require a bit more work: specifying paths to the C and CXX compilers, setting include and library directories, the cmake module directory, etc. However, some of those variables are made available as shell environment variables when the pixi environment is activated, and can be automatically detected by VS Codium. You can for instance try pixi run codium . from inside your workspace.

Once the python interpreter is configured, you can try running the main.py script from inside the IDE, or running main.py with the debugger.
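
If you prefer an explicit debug configuration, a minimal .vscode/launch.json sketch could look as follows (the debugpy type is provided by the Python Debugger extension; treat the exact keys as assumptions to check against the extension's documentation):

{
  "version": "0.2.0",
  "configurations": [
    {
      // Launch main.py under the debugger, using the configured interpreter
      "name": "Run main.py",
      "type": "debugpy",
      "request": "launch",
      "program": "${workspaceFolder}/main.py"
    }
  ]
}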

Summary

  • It is of paramount importance to properly specify and document a project's dependencies to enable efficient collaboration and reproducibility.
  • Virtual environments can be used to manage a project's dependencies in an isolated, independent installation
  • uv is the recommended tool for managing pypi-only projects and environments.
  • pixi is the recommended tool for managing projects with broader dependencies than only pypi, using conda-forge as the default channel.
    • The pixi.toml configures a workspace, which can include several environment specifications.
      • environments can be defined as combinations of "features", and can be requested to be consistent using a solve-group. This can be used to test multiple python versions.
      • Development tools, which can bring a lot of dependencies (eg jupyter notebooks), can be added to dedicated features to keep the production environment slim.
    • The pixi.lock is a machine-only file detailing the workspace environments content. It is the key to reproducible environments and should be checked in source control.
  • semantic versioning should be used to increment a project's version, to allow users to define proper dependency constraints.
  • IDEs enhance productivity by providing useful information and accelerating programming tasks, once configured to use the virtual environment installation.