If you prefer to listen rather than read, this blog is available as a podcast here. Or if you want to listen to just this post:
In the previous newsletter we told of how we discovered the Temple of Technology, with wall after wall of knobs that give us control over society. At least that’s what we, in our hubris, assume the knobs of technology will do.
Mostly that assumption is correct. Though on occasion an over-eager grad student will sneak out under cover of darkness and turn one knob all the way to the right. And, as there are so many knobs, it can be a long time before we realize what has happened.
But we are not all over-eager graduate students. Mostly we are careful, wise professors, and we soberly consider which knobs should be turned. We have translated many of the symbols, but not all. Still, out of those we have translated one seems very clear. It’s the symbol for “Safety”.
Unlike some of the knobs, everyone agrees that we should turn this knob all the way to the right. Someone interjects that we should turn it up to 11. The younger members of the group laugh. The old, wise professors don’t get the joke, but that’s okay because even if the joke isn’t clear, the consensus is. Everyone agrees that it would be dangerous and irresponsible to choose any setting other than maximum safety.
The knob is duly “turned up to 11” and things seem to be going well. Society is moving in the right direction. The makers of unsafe products are held accountable for deaths and injuries. Standards are implemented to prevent unsafe things from happening again. Deaths from accidents go down. Industrial deaths plummet. Everyone is pleased with themselves.
Though as things progress there is some weirdness. The knob doesn’t work quite the way people expect. The effects can be inconsistent.
- Children are safer than ever, but no one believes it. Parents are increasingly filled with dread. Unaccompanied children become almost extinct.
- Car accidents remain persistently high. Numerous additional safety features are implemented, but people engage in risk compensation, meaning that the effect of these features is never as great as expected.
- Nuclear power is buried under an avalanche of safety regulations. This is despite its many advantages. Not only is it carbon free, but ironically it’s also the safest of all methods for generating power.
- Antibiotics are overprescribed, and rather than making us safer from disease they create antibiotic resistant strains which are far more deadly.
Still, despite these unexpected outcomes, no one suggests adjusting the safety knob.
Then one day, in the midst of vaccinating the world against a terrible pandemic, it’s discovered that some of the vaccines cause blood clots. That out of every million people who receive the vaccine, one will die from these clots. Immediately restrictions are placed on the vaccines. In some places they’re paused, in other places they’re discontinued entirely. The wise old professors protest that this will actually cause more people to die from the pandemic than would ever die from the clots, but by this point no one is listening to them.
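For the quantitatively minded, the professors’ protest is just an expected-value comparison. Here’s a back-of-envelope sketch in Python; the one-in-a-million clot fatality rate comes from the scenario above, but the infection and fatality rates for the pause are hypothetical placeholders, chosen only to illustrate the shape of the argument:

```python
# Compare expected deaths from vaccine-induced clots against expected
# deaths from pausing vaccination. Only the clot fatality rate comes
# from the scenario above; the other figures are hypothetical.

population = 1_000_000               # people who would have been vaccinated
clot_fatality_rate = 1 / 1_000_000   # one clot death per million doses

# Hypothetical assumptions about the pause:
infection_rate_during_pause = 0.01   # 1% get infected while unprotected
infection_fatality_rate = 0.005      # 0.5% of those infections are fatal

deaths_from_clots = population * clot_fatality_rate
deaths_from_pause = (population
                     * infection_rate_during_pause
                     * infection_fatality_rate)

print(deaths_from_clots)  # 1.0
print(deaths_from_pause)  # 50.0
```

Even with deliberately modest assumptions, the pause costs far more lives than the clots; the point isn’t these particular numbers but that the comparison is never actually made.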
In our hubris we thought that turning the knob “up to 11” would result in safe technology. But no technology is completely safe, such a thing is impossible. No, this wasn’t the knob for safety, it was for increasing the importance of our perception of safety.
- When the government announces that a vaccine can cause blood clots we perceive it as being unsafe. Even though vaccines prevent a far greater danger.
- We may understand antibiotic resistance, but wouldn’t it be safer for us if we got antibiotics just in case?
- Nuclear power is perceived as obviously unsafe because it’s the same process that goes into making nuclear weapons.
- We all think we’re an above average driver, and in any case we do it every day, how unsafe could it be?
- And is any level of safety too great for our children?
Safety is obviously good, but that doesn’t mean it’s straightforward. While we were protecting our children from the vanishingly small chance that they would be abducted by a stranger, the danger of social media crept in virtually undetected. While we agonize over a handful of deaths from the vaccine, thousands die because they lack the vaccine. The perception of safety is not safety. Turning the knobs of technology has unpredictable and potentially dangerous consequences. Even the knob labelled safety.
I’ve been toying with adding images particularly to the newsletter. If you would like more images, let me know. If you would really like more images consider donating.
Nuclear power does not seem buried under an avalanche of safety regulations. Its decline (and possible rebirth at a smaller scale) is simple economics. Why have you not put a soda fountain in your house? In terms of cost per unit of soda drunk, it is by far cheaper than getting soda from fast food outlets or even buying cans or bottles from the supermarket. Yet you haven’t, because a $75,000 investment with a 30-year payback is a tough sell. Big nuclear suffers from that problem. Tiny modular nuclear does not, but its cost per unit is higher except in niche areas (supplying power to a mining colony near the Arctic) or for supplying heat directly on an industrial scale.
Car safety is interesting because I think it contrasts with football safety. Football players know they have a crap ton of safety gear on, hence they can pound other players harder and take a harder pounding. Safer cars are a bit trickier because the driver has to know they are in a less safe vehicle. I once had to drive a van with bald tires and questionable brakes from Long Island to Jersey one Saturday night as a blizzard was happening, with a huge machine in back filled with melted wax. I knew I was in a death trap and I don’t think I made it past 10 mph for more than half that trip. Beyond that, though, how do you know your car is less safe until you are less than 3 seconds from an accident and realize your anti-lock brakes are more like lock brakes?
The model that we can follow is the airline model. We basically accept zero flaws in air travel. Any crash is scrutinized until its cause is found and countered. Entire fleets of planes will be grounded for flaws that could in theory be lived with if the pilots were made aware of the issue. Here air travel is following a six sigma model, and it seems to have worked. It seems to me, then, that you can do the same with cars. Elon Musk seems to have quietly admitted that full self-driving is not going to happen until we have a major breakthrough in AI, but it does seem like better crash avoidance is already here and can be built on.