To add an alternative to what's already been suggested: you can keep a requirements.in with your explicit dependencies and use pip-tools' pip-compile to generate a requirements.txt with the full dependency tree, version-locked. You can also generate it from a pyproject.toml. Then pip-tools' pip-sync will install and uninstall packages so your actual environment matches the .txt lockfile.
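Roughly, the workflow looks like this (a sketch; "requests" is just a stand-in for your own dependencies):

    pip install pip-tools
    echo "requests" >> requirements.in
    pip-compile requirements.in        # writes requirements.txt with the full pinned tree
    pip-sync requirements.txt          # installs/uninstalls until the env matches the lockfile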
I know I can use pip to uninstall packages; the problem is I don't know which packages are needed and which aren't. I wish pip were like pacman or apt, which can list orphaned packages.
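The closest pip itself gets is listing packages that nothing else depends on, which isn't a true orphan list but narrows the search (pipdeptree is a separate install):

    pip list --not-required      # packages no other installed package requires
    pip install pipdeptree
    pipdeptree --reverse         # show, for each package, what depends on it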
As already mentioned, poetry (python-poetry) is the best thing I know of for project package management. It's quite easy to set up and use, and I use it together with pyenv.
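A typical flow with the two together looks something like this (a sketch; the version number and the "requests" dependency are just examples):

    pyenv install 3.12.4
    pyenv local 3.12.4         # pin the interpreter for this project
    poetry init                # create pyproject.toml interactively
    poetry add requests        # add a dependency and update poetry.lock
    poetry install             # build the virtualenv from the lockfile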
If your problem lies with your locally installed packages, then I think you're basically on your own, manually searching for packages you don't use. There's not really a way for pip/Python to know which packages are still relevant (at least not that I know of). If your problem is limited to a single environment, just delete the environment and start anew.
As an alternative to having to clean your Python environments, I’d like to suggest putting that effort into mastering Docker instead. If you can master using Docker containers as your Python environment, you can cut through a lot of the pain of dealing with virtual environments, multiple Python installations, and the quasi-confusing PYTHONPATH environment variable.
I don’t even install Python on my machines anymore. I’m 100% “dockerized” and I’ll never go back.
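As a concrete illustration, you can run a script straight from an official Python image without installing anything locally (the image tag and main.py are just examples):

    # mount the current directory and run it inside a throwaway container
    docker run --rm -it -v "$PWD":/app -w /app python:3.12-slim python main.py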
This works great, assuming you're using Linux. On Windows this kind of setup is a nightmare to maintain: if something goes wrong, you have to troubleshoot through several layers of virtualization and weird OS integrations to understand what's happening. venv is a much better solution there.
Yeah, my experience with Docker on Windows has been pretty bad: it uses a lot of CPU and RAM at the best of times, and at worst it completely hangs my computer at 100% CPU usage, with a restart as the only fix.
I really don't understand why people are overcomplicating this. You can install multiple Python versions at once on Windows and it just works fine (you can use the py command to select the one you want).
Virtual environments are designed exactly for this use case. They've got integrations for pretty much everything, they're easy to delete/recreate, they're really simple to use, they're fast, and they just work.
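For example (a sketch; the version number is arbitrary and requirements.txt stands in for your own dependencies):

    py -3.12 -m venv .venv             # pick the interpreter via the py launcher
    .venv\Scripts\activate             # Windows; use source .venv/bin/activate elsewhere
    pip install -r requirements.txt
    rmdir /s /q .venv                  # blow the whole environment away when it's stale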
If virtual environments alone aren't quite enough, you can use something like poetry or pipenv, or one of the many other package management options, but in many cases even that is overkill.
Now, of course, Linux knowledge is necessary to be productive with Docker in most cases. So, if Windows is all you know, Docker will have a learning curve for you regardless of where you run it.
In my experience, this kind of setup is ideal for Windows. Working on a Python project where individual developers are on a mix of macOS, Linux flavors, and Windows, without using Docker, would quickly illuminate Docker’s benefits here.
Can you speak more to how you had more than one level of virtualization when using Docker on Windows? 🤔
The only issue I ever ran into with Docker on Windows was the Docker VM’s clock getting out of sync with the host system’s clock (which was solved by restarting the VM 🤷‍♂️).