
As administrators we have a responsibility to vet. Even if we "delegate" the vetting, we have to vet the delegate.

npm is a hot mess, and most people get that now. Ansible Galaxy / Puppet Forge / Helm stuff? Take a number. It sprouts faster than you can get on top of it sometimes. Pays the mortgage :)

David

On Sat, Jan 25, 2020 at 4:05 PM D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
| From: Dhaval Giani via talk <talk@gtalug.org>
| On Thu, Jan 23, 2020 at 11:08 AM D. Hugh Redelmeier via talk <
| talk@gtalug.org> wrote:
|
| > <
| > https://www.zdnet.com/article/microsoft-spots-malicious-npm-package-stealing...
| > >
| >
| > This article lists six cases of malware contributed to npm (the repo for
| > sharing node.js and JavaScript source).
| >
| > How many undetected cases exist?
| >
| > I've always pretended that Linux distros vet their code.
|
| They do, but npm is different. npm is independent of the distro itself. And
| people want to use npm because it gives them the latest and the greatest.
I'm sorry that I wasn't clearer.
I was changing the subject a bit. Just as npm has problems because vetting contributions is a hard problem, so too do Linux distros.
I know npm (JavaScript), CPAN (Perl), CTAN (TeX), PyPI (Python), GitHub, probably crates.io (Rust), etc. each bypass the distro.
Any security flaw identified by a distro gets fixed by its updates, but anything users source from these other repos will not be. That is a serious logistical problem for users, even if they are unaware of it.
I try to avoid these repos for just that reason, but it is kind of hard.
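For packages that do come from these repos, one option is to ask a public vulnerability database about each one yourself, since the distro's update stream won't do it for you. A rough sketch in Python (untested; it assumes the OSV query service at api.osv.dev and the requests library, and lodash@4.17.15 is only an example package):

# Sketch: ask osv.dev whether a given package version has known vulnerabilities.
# Assumes the OSV v1 API (https://api.osv.dev/v1/query) and the 'requests' library.
import requests

def known_vulns(name, version, ecosystem="npm"):
    """Return the list of OSV advisories recorded for name@version."""
    resp = requests.post(
        "https://api.osv.dev/v1/query",
        json={"version": version, "package": {"name": name, "ecosystem": ecosystem}},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("vulns", [])

if __name__ == "__main__":
    # Example only: a package pulled from npm rather than from the distro.
    for v in known_vulns("lodash", "4.17.15"):
        print(v["id"], "-", v.get("summary", "(no summary)"))

Of course this only reports flaws somebody has already filed, which is exactly the vetting gap being discussed, but it is better than assuming distro updates cover these packages.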
| > I'm not sure how
| > true that is. Probably the greatest protection is the time delay between
| > contribution and distribution.
| >
|
| I would be wary of this approach. There are a bunch of security fixes
| where you probably don't want too long a delay. Part of the responsibility also
| lies on the user to validate the update. With it being open source, and a
| "volunteer" model, some of that has to be accepted by the user.
Sorry, I meant: the time delay between creating a piece of software and it being adopted by Linux distros. Not a delay because the user avoids distro updates.
I imagine the barrier to contributing to npm is zero. But I don't actually know. See below.
The barrier to contributing to any distro I know of is a bit higher: it takes time, effort, and creativity. But not enough to stop a determined and skilled contributor of malware. The easiest way is probably to infiltrate a group that produces a piece of software already accepted by many distros. Then it depends on the vetting done by that project.
More on npm:
<https://en.wikipedia.org/wiki/Npm_(software)>
npm is the "Node Package Manager". It accompanies node.js. It hooks up to a "registry", by default nodejs.com (a commercial entity). Here's a bit from Wikipedia (all caps added by me):
    Over 477,000 packages are available on the main npm registry.[16] The
    registry has NO VETTING process for submission, which means that packages
    found there can be low quality, insecure, or malicious.[15] Instead, npm
    relies on user reports to take down packages if they violate policies by
    being low quality, insecure or malicious.[17] npm exposes statistics
    including number of downloads and number of depending packages to assist
    developers in judging the quality of packages.[18]
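Those statistics, along with maintainer and publish-date metadata, are available over plain HTTP, so a first-pass look at a package can be scripted before installing it. A rough sketch in Python (untested; it assumes the public registry.npmjs.org and api.npmjs.org endpoints and the requests library, and left-pad is only an example name):

# Sketch: pull the signals the registry exposes (latest version, maintainers,
# last publish, weekly downloads) before deciding whether to trust a package.
# Assumes the public registry.npmjs.org and api.npmjs.org endpoints.
import requests

def npm_summary(name):
    meta = requests.get(f"https://registry.npmjs.org/{name}", timeout=30)
    meta.raise_for_status()
    meta = meta.json()
    downloads = requests.get(
        f"https://api.npmjs.org/downloads/point/last-week/{name}", timeout=30
    )
    downloads.raise_for_status()
    return {
        "latest": meta["dist-tags"]["latest"],
        "maintainers": [m["name"] for m in meta.get("maintainers", [])],
        "last_modified": meta["time"]["modified"],
        "weekly_downloads": downloads.json()["downloads"],
    }

if __name__ == "__main__":
    print(npm_summary("left-pad"))  # example name only

It is not vetting, but it flags the obvious cases: a package with one maintainer, no recent releases, and a trickle of downloads deserves a closer look.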
--
David Thornton
https://wiki.quadratic.net
https://github.com/drthornt/
https://twitter.com/northdot9/