Happy 2024, folks! Just when we thought we’d seen it all, an npm user named PatrickJS, aka gdi2290, threw us a curveball. He (along with a group of contributors) kicked off the year with a bang, launching a troll campaign that uploaded an npm package aptly named “everything”. True to its name, the package depends on every other public npm package, creating millions of transitive dependencies.

The “everything” package and its 3,000+ sub-packages have caused a denial of service (DoS) for anyone who installs it. We’re talking about storage space running out and system resource exhaustion.

But that’s not all. The creator took the prank to the next level by setting up http://everything.npm.lol, showcasing the chaos they unleashed. They even included a meme from Skyrim, adding some humor (or mockery, depending on your perspective) to the situation.
Feross Aboukhadijeh
I know this is a bad thing, you shouldn’t do this, it harms a lot of people, etc., etc., but let’s be honest here – this is a hilarious prank that exposed a weakness in a rather playful way. Sure, there were real consequences, but it doesn’t seem like any of them caused permanent damage, data loss, or compromised systems. What’s worse, it seems this isn’t even the first time something like this has happened, so I find it baffling that it’s still possible. What are they doing over there?
Ideally, one should not need the “latest daily build” of every package, but rather a stable one that they can rely on.
In the “old times”, we did not pick individual libraries; we targeted a single version of libc or the .NET Framework (or whatever platform we were using). Now, with the advent of modern dependency systems, we have a “graph” of versions and resolution algorithms like Conda/Mamba/npm/etc.
And… this opens up a business opportunity:
“Selling verified npm repository access”
Or at least a way to have your local mirror with known, secure versions.
So, when you say “I need to build a ReactJS project with TypeScript that uses Google Cloud APIs”, I don’t care if the libraries are 2 days old, 2 months, or even a year (as long as security patches are up to date). What I need to care about is that they (1) work, and (2) are secure.
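One way to get that kind of stability today is to pin exact versions instead of floating ranges. A minimal package.json sketch (the package names and versions here are illustrative):

```json
{
  "name": "my-app",
  "dependencies": {
    "react": "18.2.0",
    "typescript": "5.3.3"
  }
}
```

Without a `^` or `~` prefix, npm installs exactly these versions; combined with a committed lockfile and `npm ci`, builds stay reproducible until you deliberately bump a version.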
sukru,
The thing about this is that the further away you get from the upstream source, the less familiar everyone is with what’s going on in the code. Whether it’s Red Hat, Debian, Ubuntu, whoever… they all do the best they can, but realistically they aren’t privy to the nuances of the code in the thousands of packages they manage. This centralization is fundamentally difficult to scale, and it has been driving the push toward projects shipping their own self-contained packages.
While a service like you describe could ostensibly have value, it faces the exact same caveats as the centralized repos we were trying to get away from. If you advocate for this type of centralized oversight, then you may as well advocate for the old-fashioned repos, since they still have the advantage of less overhead by making much better use of shared libraries.
Alfman,
That is a valid concern.
Many distributions have tried, and failed, to keep up with even the popular Python packages.
Take the popular “torchaudio” package (audio models for PyTorch). Upstream is already on v2.2.0+: https://pytorch.org/audio/versions.html. Meanwhile, the latest Debian sources are at least four months old: https://salsa.debian.org/deeplearning-team/pytorch-audio (v2.0.2 as of now).
However, if you don’t need the absolute latest machine learning models, it usually is not necessary to be that up to date.
The problem? Debian repositories are not only stale, they are also limited in the number of packages available.

Worse? If you want to go “pinned” with a dependency manager, they will not host the packages for long. A pinned install of an older release will fail, for example, since they have no version of torchaudio older than v2.0.
I think this is where “DevOps” comes in (or whatever name it goes by today): making sure the company internally has a stable and consistent version of those libraries.
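In practice this often means pointing clients at an internal registry that only serves vetted versions. A sketch of the npm side (the registry URL is hypothetical; tools like Verdaccio or Artifactory commonly fill this role):

```ini
# .npmrc — route all installs through the company mirror
registry=https://registry.internal.example.com/
```

With this in place, every `npm install` resolves against the mirror, so a package or version the company has not approved simply is not available.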
sukru,
I’m usually OK running the latest/supported version of software. And with x86 I’m quite confident that I’ll be able to keep using hardware long into the future. In the ARM SBC space, I fret about this a lot, because support is much more fragile with ARM. Will the Odroid/Banana Pi/etc. that I have still be able to apt-get install packages if I didn’t install them beforehand? My stomach sinks when “apt-get install/update” no longer works. I typically have my own software, but often I need other dependencies from the repos, and those dependencies can disappear as well. Even if they exist in a source repository somewhere, the workload of maintaining dependencies by hand outside of the repos can balloon into a chore that makes me want to throw the hardware away.
Oh how I wish all these devices would be supported by generic mainline kernels & operating systems without having to worry about manufacturer support & compatibility.
Npm and similar package managers are so useful for devs, but they also terrify me. The ease with which a new dependency can be added to a package I depend on – suddenly pulled in and made part of my production systems – is a real concern. I really want to see package managers start at least warning about new dependency additions, but I fear that too would just become general noise for devs and get ignored.
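As a sketch of what such a warning could look like: the helper below (hypothetical, not an npm feature) compares two flat name-to-version maps, such as the dependency sections of two lockfiles, and reports anything newly added:

```javascript
// Flag dependencies present in the new lockfile but not the old one.
// Real npm lockfiles nest entries (e.g. under "packages"); flat
// name -> version maps are assumed here for illustration.
function newDependencies(oldDeps, newDeps) {
  const added = [];
  for (const [name, version] of Object.entries(newDeps)) {
    if (!(name in oldDeps)) {
      added.push(`${name}@${version}`);
    }
  }
  return added;
}

// Example: one brand-new transitive dependency shows up.
const before = { "left-pad": "1.3.0" };
const after = { "left-pad": "1.3.0", "is-odd": "3.0.1" };
console.log(newDependencies(before, after)); // [ 'is-odd@3.0.1' ]
```

A CI step could run a check like this on every pull request and fail loudly, which keeps the warning out of routine dev workflows and away from the noise problem.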
This is a test comment. Please ignore me
Test to see if we have an edit button.
Still got autocorrected.
Walking by pretending to look at something on the other side of the street.
Nope, we don’t.
We need that edit back.
And we definitely need a <code> block.
sukru,
There are improvements I’d like to see. The very old (pre-WordPress) OSNews had a WYSIWYG editor. I miss the discussion threading options too. But honestly, in the grand scheme of things, we’re lucky this website still exists at all.