security threats of Open Source

<https://www.zdnet.com/article/microsoft-spots-malicious-npm-package-stealing-data-from-unix-systems/>

This article lists six cases of malware contributed to npm (the repo for sharing node.js and JavaScript source). How many undetected cases exist?

I've always pretended that Linux distros vet their code. I'm not sure how true that is. Probably the greatest protection is the time delay between contribution and distribution.

I wonder what can be done about this problem. I've said so at our meetings a few times too.

Of course the problem is worse with closed source: it is impossible to audit the source. But closed source might have fewer contributors and more supervision. Of course much closed source is built on top of open source and thus inherits all its weaknesses.

On Thu, Jan 23, 2020 at 1:08 PM D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
This article lists six cases of malware contributed to npm (the repo for sharing node.js and JavaScript source).
How many undetected cases exist?
I've always pretended that Linux distros vet their code. I'm not sure how true that is. Probably the greatest protection is the time delay between contribution and distribution.
I wonder what can be done about this problem. I've said so at our meetings a few times too.
Of course the problem is worse with closed source: it is impossible to audit the source. But closed source might have fewer contributors and more supervision. Of course much closed source is built on top of open source and thus inherits all its weaknesses.
In this vein - - - - a contact who in computer terms calls himself a dinosaur refuses to allow JavaScript on his computers, doing all his browsing on text-based browsers. In his opinion JavaScript is a serious accident already in free fall. What you're sharing only emphasizes that. Maybe it's time to join his anti-JavaScript position? Regards

| From: o1bigtenor via talk <talk@gtalug.org>
| In this vein - - - - a contact who in computer terms calls himself a dinosaur
| refuses to allow javascript on his computers doing all his browsing on text
| based browsers. In his opinion javascript is a serious accident already in
| free fall. What you're sharing only emphasizes that. Maybe it's time to join
| his anti-JavaScript position?

The issues are a little more intricate.

Note npm is a repo (mostly?) for JavaScript to run under node.js. node.js is a server-side thing. It runs JavaScript on the server. Not in the client (browser).

JavaScript itself isn't terrible.

What is unfortunate, I think, is the unfettered creativity JavaScript in the browser allows web designers. They misuse it, just like they did Adobe Flash previously. To some extent this is caused by the good sides of JavaScript: how easy it is to learn, how easy it is to whip up complexity, how easy it is for the page creator to take control of the browser experience.

What I was talking about was how easy it is to inject malicious code into the ecosystem. That isn't actually the fault of the language. (It is imaginable that one could design a language that prevented some abuse.) In fact, the language+browser have been designed to limit the damage that could be inflicted on the client side. The npm problem is mostly server-side, I think (I'm not sure).

Making something easier (cheaper, faster, more understandable, ...) allows it to be used more, often to excess. Unexpected side effects can ensue.

- increasing efficiency of cars makes driving cheaper so people drive more and end up using more total energy (gasoline).

- computers became a lot cheaper. So a lot more money is spent on computers.

- programming has become easier. So a lot more pointless programs have been created.

- when I worked on optimizing compilers, I thought that I was trying to make existing programs run faster. Then it struck me that it allowed programmers to write programs in a simpler and clearer way and have the compiler eliminate the performance cost.

Here's a random example of npm use: <https://www.electronjs.org/>

On Thu, Jan 23, 2020 at 3:37 PM D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
| From: o1bigtenor via talk <talk@gtalug.org>
| In this vein - - - - a contact who in computer terms calls himself a dinosaur | refuses to allow javascript on his computers doing all his browsing on text | based browsers. In his opinion javascript is a serious accident already in free | fall. What you're sharing only emphasizes that. Maybe its time to join his | anti Javascript position?
Thank you for your response!!
The issues are a little more intricate.
They usually are - - - grin.
Note npm is a repo (mostly?) for JavaScript to run under node.js. node.js is a server-side thing. It runs JavaScript on the server. Not in the client (browser).
JavaScript itself isn't terrible.
What is unfortunate, I think, is the unfettered creativity JavaScript in the browser allows web designers. They misuse it, just like they did Adobe Flash previously. To some extent this is caused by the good sides of JavaScript: how easy it is to learn, how easy it is to whip up complexity, how easy it is for the page creator to take control of the browser experience.
From what little I know, what I'm thinking is that the browser user needs to have some tools to control what the browser does - - - - that seems to be unobtainium at this point.
What I was talking about was how easy it is to inject malicious code into the ecosystem. That isn't actually the fault of the language. (It is imaginable that one could design a language that prevented some abuse.)
In fact, the language+browser have been designed to limit the damage that could be inflicted on the client side. The npm problem is mostly server-side, I think (I'm not sure).
Making something easier (cheaper, faster, more understandable, ...) allows it to be used more, often to excess. Unexpected side effects can ensue.
- increasing efficiency of cars makes driving cheaper so people drive more and end up using more total energy (gasoline).
Our obsession with individual transportation has become a major cost factor in one's personal economy.
- computers became a lot cheaper. So a lot more money is spent on computers.
- programming has become easier. So a lot more pointless programs have been created.
- when I worked on optimizing compilers, I thought that I was trying to make existing programs run faster. Then it struck me that it allowed programmers to write programs in a simpler and clearer way and have the compiler eliminate the performance cost.
Interesting.
Here's a random example of npm use:
Thanks for sharing! I'm wondering if there even is a way of reining in the wild possibilities of JavaScript in a browser. If there is, it would be quite nice if this would happen quite soon. I'm finding that the web has become quite a frustrating and very far from useful place to look for things. Regards
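There is, in fact, one partial mechanism already deployed for reining in browser JavaScript: the Content-Security-Policy response header, which lets a site tell the browser which script origins may execute. A minimal sketch follows; the directive values are illustrative, not a recommendation for any particular site:

```
# A restrictive Content-Security-Policy, sent as an HTTP response header.
# 'self' limits scripts to the page's own origin, so no inline scripts and
# no third-party scripts will run in a compliant browser.
Content-Security-Policy: default-src 'self'; script-src 'self'; object-src 'none'
```

The catch, of course, is that the site operator must opt in; it protects a site's visitors from injected third-party code, but does nothing for users of sites that never send it. Client-side tools have to fill that gap.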

I regularly browse with JavaScript turned off. I use NoScript. While it is a hassle, I whitelist trusted sites but refuse script from third-party sites. There is a bit of setup to do to whitelist sites.

Scripts have long been abused. Browsing without JS restores a bit of honesty in web pages, as a lot of the razzle-dazzle crap code is not executed. I seek information more than eye candy. Cross-site scripting risk is nearly eliminated, making web browsing safer. You can also see which sites have added a whole lot of crap onto their script code and which third-party sites they employ. This will colour your selection of credible web sites.

As well, I intermix browsers as well as use Tor.

I encourage you to try it. Tilt the advantage to the user with the NoScript plugin.
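The point above about seeing which third-party sites a page employs can also be checked mechanically. Here is a sketch using only the Python standard library; the hostnames in the sample page are made-up examples, and a real page would be fetched first (NoScript shows the same information interactively):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptSrcCollector(HTMLParser):
    """Collect the src attribute of every <script> tag in a page."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            for name, value in attrs:
                if name == "src" and value:
                    self.sources.append(value)

def third_party_script_hosts(html, page_host):
    """Return the set of hosts, other than page_host, that serve scripts.

    Relative script URLs (same-origin) have an empty netloc and are dropped.
    """
    parser = ScriptSrcCollector()
    parser.feed(html)
    hosts = {urlparse(src).netloc for src in parser.sources}
    return {h for h in hosts if h and h != page_host}

page = '''<html><head>
<script src="/app.js"></script>
<script src="https://cdn.example.net/lib.js"></script>
<script src="https://tracker.example.org/t.js"></script>
</head><body></body></html>'''

print(sorted(third_party_script_hosts(page, "www.example.com")))
# -> ['cdn.example.net', 'tracker.example.org']
```

Run against real pages, the length of that set is itself a rough credibility signal, in the spirit of the advice above.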

I can second the "noscript" thing. "Default deny" is good practice. No one has to explain it for firewalls (any more, I hope), so why do we have to explain it in other places?

On Thu, Jan 23, 2020 at 7:00 PM Don Tai via talk <talk@gtalug.org> wrote:

I regularly browse with javascript turned off. I use NoScript. While it is a hassle, I whitelist trusted sites, but refuse script from third-party sites. There is a bit of setup to do to whitelist sites. Scripts have long been abused. Browsing without js restores a bit of honesty in web pages, as a lot of the razzle-dazzle crap code is not executed. I seek information more than eye candy. Cross-site scripting risk is nearly eliminated, making web browsing safer. You can also see which sites have added a whole lot of crap onto their script code and which third-party sites they employ. This will colour your selection of credible web sites.

As well I intermix browsers as well as use Tor.

I encourage you to try it. Tilt the advantage to the user with the NoScript plugin.
-- David Thornton https://wiki.quadratic.net https://github.com/drthornt/ https://twitter.com/northdot9/

On Fri, Nov 20, 2020 at 03:21:23PM -0500, David Thornton via talk wrote:
I can second the "noscript" thing. "Default deny" is good practice. No one has to explain it for firewalls (any more, I hope), so why do we have to explain it in other places?
Have you seen what Apple did in macOS 11 for the firewall interface? Apparently Apple's own apps get to be exempt from firewalls. It didn't take long for people to show how that could be abused to let anything you want through. -- Len Sorensen
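For comparison, the firewall version of default deny that no longer needs explaining looks roughly like this, as a minimal nftables sketch (the ssh exception is purely illustrative):

```
# Minimal default-deny inbound policy in nftables syntax.
# Everything is dropped unless explicitly accepted below.
table inet filter {
  chain input {
    type filter hook input priority 0; policy drop;
    ct state established,related accept   # replies to our own traffic
    iif lo accept                         # loopback
    tcp dport 22 accept                   # the one deliberate exception: ssh
  }
}
```

NoScript applies the same shape of policy to scripts: deny by default, then whitelist the few origins you actually trust.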

Hugh,

On Thu, Jan 23, 2020 at 11:08 AM D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
< https://www.zdnet.com/article/microsoft-spots-malicious-npm-package-stealing...
This article lists six cases of malware contributed to npm (the repo for sharing node.js and JavaScript source).
How many undetected cases exist?
I've always pretended that Linux distros vet their code.
They do, but npm is different. npm is independent of the distro itself. And people want to use npm because it gives them the latest and the greatest.
I'm not sure how true that is. Probably the greatest protection is the time delay between contribution and distribution.
I would be wary of this approach. There are a bunch of security fixes where you probably don't want too long a delay. Part of the responsibility also lies on the user to validate the update. With it being open source, and a "volunteer" model, some of that has to be accepted by the user.

Dhaval

| From: Dhaval Giani via talk <talk@gtalug.org>
| On Thu, Jan 23, 2020 at 11:08 AM D. Hugh Redelmeier via talk
| <talk@gtalug.org> wrote:
|
| > This article lists six cases of malware contributed to npm (the repo for
| > sharing node.js and JavaScript source).
| >
| > How many undetected cases exist?
| >
| > I've always pretended that Linux distros vet their code.
|
| They do, but npm is different. npm is independent of the distro itself. And
| people want to use npm because it gives them the latest and the greatest.

I'm sorry that I wasn't clearer. I was changing the subject a bit. Just like npm has problems because vetting contributions is a hard problem, so too Linux distros have problems because vetting contributions is a hard problem.

I know npm (JavaScript), CPAN (Perl), CTAN (TeX), PyPI (Python), GitHub, probably crates.io (Rust), etc. each bypass the distro. Any security flaws identified by a distro get fixed by updates, but anything users sourced from these other repos will not be fixed by distro updates. A serious logistic problem for users, even if they are unaware of it. I try to avoid these repos for just that reason, but it is kind of hard.

| > I'm not sure how true that is. Probably the greatest protection is the
| > time delay between contribution and distribution.
|
| I would be wary of this approach. There are a bunch of security fixes
| where you probably don't want too long a delay. Part of the responsibility
| also lies on the user to validate the update. With it being open source,
| and a "volunteer" model, some of that has to be accepted by the user.

Sorry, I meant: the time delay between creating a piece of software and it being adopted by Linux distros. Not a delay because the user avoids distro updates.

I imagine the barrier to contributing to npm is zero. But I don't actually know. See below.

The barrier to contribution to any distro I know of is a bit higher. That involves time, effort, and creativity. But not enough to prevent a determined and skilled contributor of malware. The easiest way is probably to infiltrate a group that produces a piece of software already accepted by many distros. Then it depends on the vetting by that project.

More on npm: <https://en.wikipedia.org/wiki/Npm_(software)>

npm is the "Node Package Manager". It accompanies node.js. It hooks up to a "registry", by default registry.npmjs.org (run by a commercial entity). Here's a bit from Wikipedia (all caps added by me):

    Over 477,000 packages are available on the main npm registry.[16] The registry has NO VETTING process for submission, which means that packages found there can be low quality, insecure, or malicious.[15] Instead, npm relies on user reports to take down packages if they violate policies by being low quality, insecure or malicious.[17] npm exposes statistics including number of downloads and number of depending packages to assist developers in judging the quality of packages.[18]
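Given a registry with no vetting, one modest defence on the consuming side is to pin exact dependency versions, so that a hijacked new release does not arrive silently on the next install. A sketch of such a check; the package names are made-up examples, and this only inspects the top-level package.json, not the transitive dependency tree:

```python
import json

def unpinned_dependencies(package_json_text):
    """Return dependency names whose version specs are not exact pins.

    Ranges like ^1.2.0 or ~1.2.0 let `npm install` silently pull a newer
    release, which is exactly how a hijacked package update propagates.
    """
    data = json.loads(package_json_text)
    flagged = {}
    for section in ("dependencies", "devDependencies"):
        for name, spec in data.get(section, {}).items():
            # Treat only plain dotted version numbers, e.g. "4.17.1",
            # as exact pins; anything else is a range or tag.
            if not spec.replace(".", "").isdigit():
                flagged[name] = spec
    return flagged

example = '''{
  "dependencies": {"left-pad": "^1.3.0", "express": "4.17.1"},
  "devDependencies": {"mocha": "~8.2.0"}
}'''

print(unpinned_dependencies(example))
# -> {'left-pad': '^1.3.0', 'mocha': '~8.2.0'}
```

Pinning trades freshness for predictability, which is the same trade-off as the distro time delay discussed above; it does not vet anything by itself.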

As administrators we have a responsibility to vet. Even if it's to "delegate" the vetting, we have to vet the delegate.

npm is a hot mess, and most people get that now.

Galaxy / puppetforge / helm stuff? Take a number. It sprouts faster than you can get on it sometimes. Pays the mortgage :)

David

| From: David Thornton via talk <talk@gtalug.org>
| Date: Fri, 20 Nov 2020 15:25:42 -0500

Thanks for reviving this thread 10 months later. What prompted you to do that? Note: this is not a complaint. I continue to think that this is an important and unresolved topic.

| As administrators we have a responsibility to vet. Even if it's to
| "delegate" the vetting, we have to vet the delegate.

"Have to" means "responsibility to". Unfortunately, responsibility without capability is a recipe for disaster.

Clearly you've thought about this in a setting with customers. How do you discharge this responsibility?

The GPL says: you get what we offer but we accept no responsibility.

Many commercial software contracts and EULAs disclaim responsibility and forbid using the software in safety-critical settings. They then often fall back on saying at most you can get back the purchase cost.

So a responsible decision-maker cannot delegate the responsibility yet has no practical or even theoretical tools to discharge the responsibility. Except bankruptcy law.

- you can ask your customer / client / employer: "here are the risks that I can imagine; are you willing to accept them?"

- you can make sure that there are no assets available that can be lost when and if problems arise

- you can work to reduce risks. This quickly hits the law of diminishing returns, long before the risks are eliminated. But I'm sure we can do better than the industry norms, as long as customers understand that they must and should pay for the up-front cost.

Customers / clients often think that they are safer with large corporations. In that role, I've found the help from large companies (e.g. Microsoft, Sun Microsystems (back in the day), ...) to be inferior to help from small companies. Both are eclipsed by support from FLOSS communities. But support only deals with problems in the future, not damage that has happened.

In the area of security, the worst breaches are the ones you never learn about.

| Npm is a hot mess, and most people get that now.
|
| Galaxy / puppetforge / helm stuff ? Take a number.
|
| It sprouts faster than you can get on it sometimes.
|
| Pays the mortgage :)

You can't live with them and you can't live without them?

I've seen better coverage but less depth from commercial entities. I just referred a Kobo bug to the in-house counsel, as the assigned support creature could understand neither the problem /nor/ the process. I used to work with their lawyer at Lexis Nexis: that's *not* a common kind of situation (;-)). In the open source world, one arguably only needs to convince a peer that something is wrong, not a legal representative of the company that they're at risk.

--dave
--
David Collier-Brown,                 | Always do right. This will gratify
System Programmer and Author         | some people and astonish the rest
dave.collier-brown@indexexchange.com |                      -- Mark Twain

Fair points.

All of the service contracts I've worked behind say, effectively: if we can't keep it from happening, then we can't be held responsible for it happening. You paid for a managed Linux server; Linux has a bug and you crash; we are not responsible. We'll patch when it comes out, we'll add a firewall rule to mitigate, but we could not have kept it from happening.

It's pretty weak, I know, but one thing I have learned is that there is a lot of conscious and unconscious, communicated and uncommunicated acceptance of risk in many industries. I advocate for professional, responsible management and communication of risk in my day-to-day activities. I feel like I've done my best work when I can talk to clients directly and honestly about risk and how we can manage it. I can do what I can, but I can't worry or fret about stuff I can't do anything about. (Which is, I think, basically what you are saying above.)

I can do a lot of reasonable things to protect against uncontrolled aspects of operation. We had only one hard drive and it failed, so we went to a pair of mirrored disks. We had only one web server and it failed, so we went to a cluster of 2 to a bazillion web servers. We used open source software and it was a hot mess, so we ..... um, hullo? anyone else? .... Canonical, Microsoft, Red Hat, Oracle, Amazon, Google, what have you. They can do mitigation and management in ways I can't. I lived and breathed Red Hat for a long time, and we sold Linux under "Redhat is good, redhat can make it go". They added safety and consistency. I mean, it wasn't / isn't perfect, but it worked. It got a lot of stuff done in a short amount of time for us.

Risk management never gets old; it is as old as the first profession (prostitution: "Will my primary mate catch me?"), which of course led to the second-oldest profession: lawyers.

P.S. I decided to give email another go, for old times' sake; that's why I revived the thread, I guess: I read my mail :)

David
participants (7)

- D. Hugh Redelmeier
- Dave Collier-Brown
- David Thornton
- Dhaval Giani
- Don Tai
- lsorense@csclub.uwaterloo.ca
- o1bigtenor