One of the big problems with how organisations often work, especially private businesses, is the extremely casual attitude to product testing and risk assessment. It’s only after they spend shitloads on lawyers and public relations that they are suddenly able to prioritise creating jobs dedicated explicitly to preventing the damage they cause with that attitude.
But because law-making is slow, weighed down by flawed human power structures combined with legitimately necessary procedures, the only thing businesses need to do is to outpace the speed of law change to avoid being punished. Outpacing the law has been easy enough to do at the best of times, but with half-assed exploitative software development in a rapidly progressing robotics and ‘AI Boom’ environment, it will only get easier and hurt more people.
And then the executives who allowed their shoddy products to hurt people will just change employers, likely for a pay raise, or sell the business outright. The only personal consequences of their reckless management are a few late nights in a bad mood. All because limited liability meant they might as well have been an innocent bystander.
Meanwhile the victims, if they survive, are left in lifelong pain and misery, because courts ruled that the law doesn’t cover their novel situation. Not to mention the damage to their families and communities.
Globally, we need to start holding individual organisation decision-makers personally accountable for the damage their decisions cause. Both financial penalties and prison time, for both environmental and human damage. I mean like “Board of Directors and all Chief Officers of Cruise on trial for negligent homicide” levels of responsibility. It’s the only way to prevent this kind of unnecessary suffering.
tl;dr
1. Risk of personal loss is the only way people in power will prioritise building safer products.
2. We need the law to catch up faster to a world where humans can offload more life-changing decisions to computers.
3. Law-makers should start assuming we live on the Star Trek holodeck in a Q episode instead of the Unix epoch, if they are going to catch up on their huge backlog.
4. People need to start assuming their code is imprecise and dangerous and build in graceful failures. Yes, it will be expensive in a time-sensitive environment, important things often are.
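Point 4 isn’t abstract: in code it mostly means treating every input as suspect and defaulting to the least harmful action when anything is off. A minimal sketch of that idea (all names here are hypothetical illustrations, not any real vehicle’s API):

```python
import math
from enum import Enum

class Action(Enum):
    PROCEED = "proceed"
    SLOW = "slow"
    STOP = "stop"   # the safe default

def decide(sensor_reading):
    """Choose an action while assuming the input may be missing or garbage.

    Anything we cannot positively verify falls through to STOP, because
    stopping unnecessarily is far cheaper than proceeding on bad data.
    """
    try:
        distance_m = float(sensor_reading)
    except (TypeError, ValueError):
        return Action.STOP           # unreadable data: fail safe
    if math.isnan(distance_m) or distance_m < 0:
        return Action.STOP           # NaN or physically impossible value
    if distance_m > 50:
        return Action.PROCEED
    if distance_m > 10:
        return Action.SLOW
    return Action.STOP
```

The point of the pattern is that `decide(None)` or `decide("garbage")` comes back as a safe STOP instead of crashing or guessing, which is exactly the “graceful failure” that costs extra effort up front.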
I’ve read about the Cruise team, and “extremely casual attitude to product testing” does not accurately describe what they are doing. The Cruise vehicles have a much lower and less severe accident rate than human drivers, and have logged millions of road miles without seriously injuring anyone (until now).
Unlike a certain narcissistic auto manufacturer’s owner…
Oh, did they actually release data and have an independent research group analyse it? Or is this a statement from their PR department? It’s easy to be better than the average human driver if you only drive in good weather and on well-built roads.
Tesla always makes big claims about how safe its system is, but to the best of my knowledge it has never actually released any usable data to back them up. It would be awesome if Cruise did that.
‘Less shit than the average human at it’ is a really low bar to set for modern computers, even if Tesla fails at that poor standard and Cruise is currently top of the game. We still need much higher bars when we’re talking about entirely automated systems which are controlling speedy large chunks of metal, or even other smaller-scale-impact-and-damage systems. Systems which can’t just hop out, ask if the victims are OK, render appropriate first aid, accurately inform emergency services, etc.
The more automation, the higher the standards should be, which means we need to set legal requirements that at least try to scale with the development of technology.
I disagree. Human drivers kill over 40,000 Americans a year. If there’s an alternative that kills less than 40,000 a year we should take it. Ideally mass transit but America seems to like cars.
I wasn’t suggesting stopping the development of automated vehicles because it’s impossible to have 0 damage. I was advocating having high standards for software/hardware development and real consequences for decision-makers trying to find shortcuts.
Progress and standards are not mutually exclusive.