uv Changed How I Think About Python Projects

I’ve been writing Python for a long time, and for most of that time, packaging has been the worst part. Not “mildly annoying” worst. “I’m going to close my laptop and go for a walk” worst. The tooling was fragmented, slow, and confusing in ways that made experienced developers feel like beginners and made beginners feel like they’d chosen the wrong language.

The more experienced programmers I learned from made fun of my beloved Python for how terrible its package management experience was. Brutal.

Then Astral, the company behind the ruff linter, shipped uv, and for the first time in years I stopped dreading the packaging parts of my workflow.

The Before Times

To appreciate uv, you have to remember what came before it. Or maybe you’re still living it. Here’s the rough timeline of Python packaging as I experienced it:

2010s: pip + virtualenv. The standard approach. Create a virtual environment, activate it, pip install your dependencies, hope nothing breaks. It worked, mostly, until you had conflicting dependencies, or needed to reproduce an environment on another machine, or forgot to activate the venv, or needed a different Python version. pip freeze > requirements.txt was the state of the art for reproducibility and it was terrible.

Mid 2010s: conda. Solved the “different Python version” problem and handled non-Python dependencies (looking at you, GDAL). But conda was slow, conda environments were enormous, and the split between the conda-forge and defaults channels was confusing. It also didn’t play well with pip, which you inevitably needed for packages that weren’t on conda.

Late 2010s: pipenv. Tried to be the “official” solution. Had a lock file! Had automatic virtualenv management! Was also painfully slow and had a tendency to get into broken states that required deleting everything and starting over. I used it for about six months before giving up.

Early 2020s: poetry. A real improvement. Good dependency resolution, a proper lock file, project metadata in pyproject.toml. I used poetry for years and recommended it. But it was still slow for large dependency trees, it had opinions about project structure that didn’t always match mine, and it didn’t handle Python version management.

2023: rye. Built by Armin Ronacher (Flask creator), rye was the first tool that felt like it understood the full scope of the problem. Python version management, virtual environments, dependency locking, project management, all in one tool. It was fast because it was written in Rust. But it was explicitly a “proof of concept” and Armin was clear it might not last.

And then Astral took over stewardship of rye, uv absorbed its vision, and here we are.

What uv Gets Right

Speed

I’m putting this first because it’s the most immediately noticeable thing and it’s not close. uv is fast in a way that changes how you interact with the tool. Package resolution that took poetry 30 seconds takes uv less than a second. Creating a virtual environment is essentially instant. Installing dependencies is so fast that you stop thinking of it as a “step” and start thinking of it as something that just happens.

Docker builds speed up, CI runs faster, the whole workflow is snappier and nicer.

This matters more than you’d think. When pip install takes 45 seconds, you avoid running it. You don’t add the dependency you probably should. You don’t recreate your environment to test a clean install. You let your requirements.txt drift. Speed changes behavior.

Python Version Management

uv python install 3.12 and you have Python 3.12. That’s it. No pyenv, no homebrew python, no system python conflicts, no “which python am I even using right now.” uv downloads and manages Python versions for you.

This solves one of the most common “help I’m stuck” problems in Python development. New developers especially struggle with having the wrong Python version, or multiple Python versions fighting each other. uv just makes it a non-issue.

uv run

This is the feature that changed my workflow the most. uv run executes a command within the project’s virtual environment without requiring you to activate anything. You don’t need to remember to source .venv/bin/activate. You don’t need to worry about being in the right environment. Just uv run pytest and it works.

For scripts and one-off tools, uv run is even better. You can run a Python script with inline dependency declarations:

# /// script
# dependencies = ["requests", "rich"]
# ///
import requests
from rich import print

response = requests.get("https://api.github.com")
print(response.json())

Run it with uv run script.py and uv handles the dependencies automatically. No virtualenv, no requirements file, no setup. Just a script that declares what it needs and runs.

Lock Files That Work

uv generates a uv.lock file that pins every dependency, including transitive ones, to exact versions. Cross-platform. Reproducible. Fast to resolve. This is what pip freeze wanted to be and never was.
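For a rough idea of what lives in that file, here’s an illustrative fragment. The schema is simplified from memory and varies between uv releases, the version numbers are made up, and the hash/wheel fields a real lock file carries are elided:

```toml
version = 1
requires-python = ">=3.12"

[[package]]
name = "certifi"
version = "2024.2.2"
source = { registry = "https://pypi.org/simple" }

[[package]]
name = "requests"
version = "2.31.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "certifi" },
]
```

The point is that every package, including transitive ones like certifi, is pinned to an exact version, which is what makes uv sync reproducible across machines.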

It Doesn’t Try to Do Everything

uv manages Python versions, virtual environments, and dependencies. It doesn’t try to be a task runner, a linter, or an entire development platform. It does its job and gets out of the way. After years of tools that tried to be everything and did nothing well, this restraint is refreshing.

How I Use It

Here’s my current workflow for a new Python project:

# Create the project
uv init my-project
cd my-project

# Add dependencies
uv add requests pandas

# Add dev dependencies
uv add --dev pytest ruff

# Run things
uv run pytest
uv run python my_script.py
uv run ruff check .

For my existing projects (including this blog’s Python tooling), migration was straightforward. Replace pip install -r requirements.txt with uv sync in the Makefile and everything else stays the same.
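In Makefile terms, the change looks something like this (the target names are hypothetical, and note that Make recipe lines must be tab-indented):

```make
# Before (hypothetical targets):
#   install:
#   	pip install -r requirements.txt

install:
	uv sync

test: install
	uv run pytest
```

Because uv run resolves the environment itself, the recipes no longer need to activate a venv first.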

For my open source libraries, I’ve been gradually migrating to uv for development workflows while keeping standard pyproject.toml for distribution. The library’s users don’t need to use uv. They can still pip install it. uv is for my development process, not theirs.

What It’s Missing

There are still gaps. A few things I’d like to see:

Task running. I still use a Makefile for project commands because uv doesn’t have a built-in task runner. Some people use uv run with scripts for this, but I find Make cleaner for multi-step workflows. This is also arguably not uv’s job, so I’m fine with it.

Workspace support for monorepos. If you have multiple packages in one repository, uv’s workspace support is still maturing. Poetry handled this better, at least for now.

Ecosystem momentum. Not everyone uses uv yet, so you still encounter projects that assume you’re using pip, conda, or poetry. This is changing fast, but it means you sometimes need to translate between workflows.

The Bigger Point

The reason I’m writing about a packaging tool is that uv represents something I don’t see often in the Python ecosystem: a tool that just works. It doesn’t require you to understand the history of Python packaging to use it. It doesn’t have surprising edge cases that eat an afternoon. It doesn’t make you feel stupid.

Good tools are invisible. You use them, they do their job, and you think about the actual problem you’re trying to solve instead of fighting your toolchain. uv is the first Python packaging tool I’ve used that consistently gets out of my way, and that’s the highest compliment I can give it.