Start a Node project that uses at least five direct dependencies.
Leave it alone for three months.
Come back and try to install it.
Something in the dependency tree will yell at you that it is deprecated or discontinued. That thing will not be one of your direct dependencies.
NPM will tell you that you have at least one security vulnerability. At least one of those vulnerabilities will be impossible to trigger in your particular application. At least one will not be fixable by updating your dependencies.
(I am sure I exaggerate, but not by much!)
Why is it like this? How many hours per week does this running-to-stay-in-place cost the average Node project? How many hours per week of developer time is the minimum viable Node project actually supposed to have available?
My take: It's because the "trust everything from everybody" model is fundamentally broken.
Note that trust is not only about avoiding malicious or vulnerable code, but also about dependability. Even if you ignore the "supply chain" security problems inherent in this model, it practically guarantees that the breakage you describe will happen eventually.
This is part of why I prefer languages with robust standard libraries, and why I am very picky about dependencies.
I personally don't trust NodeJS libraries that much - I always run projects inside Docker as a regular user with access to the working directory, just in case the supply chain is poisoned.
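Roughly like this (a sketch, not a hardened setup; the node:20 tag and the /app path are just placeholders for whatever your project uses):

```sh
# Run npm inside a throwaway container, as the host user rather than root,
# with only the current project directory mounted into the container.
# "node:20" is an example tag; pin whichever Node version you actually use.
docker run --rm -it \
  --user "$(id -u):$(id -g)" \
  -v "$PWD":/app \
  -w /app \
  node:20 \
  npm install
```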
In the case of Python, particularly when I was testing out the LLaMA model, I just stood up a new VM for it. Back then safetensors wasn't a thing, and the model weight files (a Python pickle-based format) could execute arbitrary code when loaded.
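For the curious, here's a minimal illustration of why pickle-based formats are scary. This is a generic pickle demo, not the actual LLaMA checkpoint layout, but the mechanism is the same:

```python
import os
import pickle

class Payload:
    # pickle calls __reduce__ to decide how to reconstruct an object;
    # it may return any callable plus arguments, e.g. os.system.
    def __reduce__(self):
        return (os.system, ("echo 'this ran during unpickling'",))

blob = pickle.dumps(Payload())

# Merely loading the blob runs the command; nothing on the object needs
# to be called afterwards. Loading an untrusted pickle-based model file
# is therefore equivalent to running untrusted code.
pickle.loads(blob)
```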
The fact that NPM can't use multiple registries (yes, I know about scoped registries) is astounding. For every other language my org separates artifacts into half a dozen or so virtual repos. The artifact team is quite annoyed that everything Node/JavaScript has to go into one uber-repo.
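To be concrete about the scoped-registry limitation: in .npmrc you can route a scope to its own registry, but every unscoped package still has to come from the single default registry. The hostnames below are made up for illustration:

```ini
; .npmrc sketch; hostnames are hypothetical
; A scope can be routed to its own registry:
@mycorp:registry=https://artifacts.example.com/npm-internal/

; ...but all unscoped packages (express, lodash, ...) come from the one
; default registry; there is no per-package routing beyond scopes.
registry=https://artifacts.example.com/npm-proxy/
```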