Dan Kaminsky Bugs aren't random ...

Nice talk on the physics of power management in the most recent shared cache exploits. Defcon 26 was held in China this year.

https://www.youtube.com/watch?v=f3cyCg7itOI

Dan says: It can take looking at a few thousand bugs, but eventually hacking feels like getting really good at telling the same joke, over and over again. It's OK, the computer still laughs, but why isn't software engineering delivering the reliability and predictability of other engineering disciplines? That's a question with an answer. It's not an easy answer, like "devs are lazy" or "tools are bad". Who are hackers to complain about either? But it's an answer I intend to explore, in true hacker fashion, by seeing traditional boundaries as mostly false, but useful for identifying what to fuzz. Why should we separate the humans that write bugs from the tools they use? Humans write tools. Why these tools in particular? Why would we separate forward and reverse engineering, dev from test? Wait, are those the same thing? Does any other field isolate the creator from the consequences of their creation? Is this going to be just some fluffy exploratory keynote? No, this is way too long a flight for that. We're going to talk about where I think software and hardware architecture is going to go, with actual code you're welcome to try to break. I'll tell you exactly where to look.
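The keynote's framing of boundaries as "useful for identifying what to fuzz" reduces to a loop you can write in a few lines. A minimal sketch, assuming a hypothetical toy parser (`parse_length_prefixed` is made up for illustration) whose only documented failure mode is `ValueError` — anything else it raises is the kind of bug a fuzzer is hunting:

```python
import random

def parse_length_prefixed(data: bytes) -> bytes:
    """Hypothetical toy parser: first byte is a length, the rest is payload."""
    if not data:
        raise ValueError("empty input")
    n = data[0]
    payload = data[1:1 + n]
    if len(payload) != n:
        raise ValueError("truncated payload")
    return payload

def fuzz(parser, trials=1000, seed=26):
    """Throw random byte strings at a parser; collect unexpected crashes."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(trials):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(8)))
        try:
            parser(data)
        except ValueError:
            pass  # documented failure mode, not a bug
        except Exception as exc:
            crashes.append((data, exc))  # the same joke, told again
    return crashes

# An empty crash list means the fuzzer found nothing this run,
# not that the parser is correct.
print(fuzz(parse_length_prefixed))
```

Real fuzzers add coverage feedback and input mutation on top of this loop, but the contract is the same: a target, a generator, and a definition of "unexpected".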

On Sat, Aug 11, 2018 at 10:48:17AM -0400, Russell Reiter via talk wrote:
Nice talk on the physics of power management in the most recent shared cache exploits. Defcon 26 was held in China this year.
Defcon 26 was in Vegas the last few days. China held Defcon Beta earlier this year.

--
Len Sorensen

Conjecture: Developers are the worker bees; they don't decide what gets done. The business decides what gets done. They are the expressers of morals and values. They can choose to kowtow to the market or stand up for something at odds with the crowd.

Classically, businesses have operated with a short-sighted, reductionist, "externalization of cost" mindset. The cycle is roughly one employment term in length: you just gotta push the cost off to the next cohort and make off like a bandit yourself (either to the next hustle or retirement).

Show me a piece of software you use for greater than 10 years and I'll show you a piece of software that's starting to mature (Windows, Chrome, Linux?). Sometimes maturity only comes through the death of the parent (clang, v8).

David

David Thornton
@northdot9
https://wiki.quadratic.net

On Sat, Aug 11, 2018, 10:48 AM Russell Reiter via talk, <talk@gtalug.org> wrote:
Nice talk on the physics of power management in the most recent shared cache exploits. Defcon 26 was held in China this year.
https://www.youtube.com/watch?v=f3cyCg7itOI
Dan says: It can take looking at a few thousand bugs, but eventually hacking feels like getting really good at telling the same joke, over and over again. [...]

| From: Russell Reiter via talk <talk@gtalug.org>

Thanks. I started watching but I found it quite irritating. But it did send me on a web journey.

He showed an "Up" computer on a slide, so I searched that out. The Up Squared looks interesting: <https://up-shop.org/28-up-squared>

- single-board computer (like Raspberry Pi)
- Intel Celeron processor, enough memory etc., 2 gigabit ethernet ports (good for a router)
- however, I've gotten similar functionality cheaper (certain Zotac Zboxes)

Up also has an SBC meant for Computational Neural Nets / Vision, the Up AI Edge: <https://up-shop.org/25-up-ai-edge>

That's based on the Intel Movidius Myriad 2 chip. I'd not paid attention to that. <https://www.anandtech.com/show/11771/intel-announces-movidius-myriad-x-vpu>

Interestingly, those chips have a pair of onboard LEON4 processor cores. I'd not heard of that before. <https://en.wikipedia.org/wiki/LEON#LEON4_processor_core>

Which is a 32-bit SPARCv8 processor! SPARC lives and is being produced by Intel!

On 2018-08-16 02:49 PM, D. Hugh Redelmeier via talk wrote:
<https://en.wikipedia.org/wiki/LEON#LEON4_processor_core>
Which is a 32-bit SPARCv8 processor! SPARC lives and is being produced by Intel!
LEON's used in lots of ESA space projects. Pretty sure this board doesn't use the rad-hard variants, as it's only got half the number of digits in the price of those space-certified SoCs …

cheers,
Stewart
participants (5)

- D. Hugh Redelmeier
- David Thornton
- lsorense@csclub.uwaterloo.ca
- Russell Reiter
- Stewart C. Russell