Eschatologist #23 - Avoiding Risk
I’m going to talk about FTX. I know, you’re sick of hearing about it, sick of Sam Bankman-Fried’s face, and sick of seeing his name abbreviated to just SBF. But out of the thousands of “hot takes” this story has generated, this is the one you needed but didn’t know you needed. Though, as with all examples of greatness, I’ll be standing on the shoulders of giants. Let’s start with Tyler Cowen, the noted economist, who observed that:
Hardly anyone associated with Future Fund saw the existential risk to…Future Fund, even though they were as close to it as one could possibly be.
Future Fund was also called FTX Future Fund, and was wholly funded by “profits” from FTX. Their primary focus was on preventing future risk, so you can see how Cowen might find the situation ironic. I also think it’s super ironic, though I’m inclined to cut them a little bit of slack. Risk detection and mitigation are hard, and technology has only made them harder.
Of course thievery predates humans by tens of millions of years, and even Ponzi schemes have been around since at least 1920, when Charles Ponzi started his. (You can see why I’m only cutting them a little bit of slack.) But the crypto-specific version of the scam was brand new. Being able to privately mint something that is half currency/half asset, and then sell a small portion of it to create a scandalously inflated mark-to-market value for that currency/asset, is an innovation. An innovation in evil, but an innovation nonetheless.
So yes, as has been pointed out, this lack of foresight is perhaps not quite the abject failure Cowen makes it out to be, but it’s still a good illustration of how difficult it is to avoid risk. You can have an organization whose entire purpose is avoiding risk, and it can still be blindsided because it was only looking for specific kinds of risk.
The Future Fund was focused on exotic risks, a fascination many people have recently developed. But in that focus they missed a very common risk. They could imagine a malevolent, all-powerful AI. (It’s the first item on their Areas of Interest list.) But they couldn’t imagine that SBF was a common criminal (or they could, but didn’t do anything about it).
The simple point would be: don’t let shiny new exotic risks distract you from common everyday risks. But the larger point is that we have to have a comprehensive approach to risk. The Future Fund and others are correct that technology has created a host of new dangers. But reality is not some game where, once you reach the next level, you never again see the monsters from the previous levels. We always have to deal with all the monsters: the old ones, the new ones we’ve created, and a whole host of others lurking just out of sight.
The hard and uncaring universe doesn’t grade on a curve. It doesn’t imagine the answer you thought you were giving and say “close enough.” It doesn’t care what your intentions were, or that technology is supposed to be a good thing. When it creates risk it does so randomly and capriciously. To look at just one more recent example: when you close down schools, the universe doesn’t automatically turn that into a good decision just because you did it in the name of safety. Risk doesn’t only emerge from actions that are obviously bad.
This is particularly important when considering technology. Nearly all of it was developed for the benefit of humanity, but that doesn’t mean it hasn’t enabled a host of new risks. There are the obvious ones: engineered pandemics, nuclear weapons, and being hit by a comet. But it has also brought subtler risks: stagnation, discord, and narcissism. And, as we discovered with FTX, it has created new ways to package old risks.
So while it’s understandable that Future Fund missed the rampant fraud, it’s not forgivable. Because there is no forgiveness, there are only consequences. And if your fund, or your nation, or your world ends, it doesn’t matter how it happened. I personally believe our souls will be graded by a kind and understanding judge, and that our intentions will matter. But as long as we’re still in this life, we need to be aware of all the risks: the old and the new, the big and the small, the flashy and the subtle, and most of all the thousands of new risks we’ve created for ourselves. We need to step up our game.
After all of this you may be wondering: is anything risk-free? Or will we inevitably discover that it all has negative second-order effects? Well, there is one thing completely free of risk: donating to this blog. And yes, I know that sounds self-interested, but as SBF once said, trust me here. I know what I’m doing.