
Eschatologist #12: Predictions



Many people use the occasion of the New Year to make predictions about the coming year. And frankly, while these sorts of predictions are amusing, and maybe even interesting, they’re less useful than you might think.

Some people try to get around this problem by tracking the accuracy of their predictions from year to year and assigning confidence levels (e.g., being 80% sure X will happen vs. 90% sure Y will happen). This sort of thing is often referred to as superforecasting. These tactics would appear to make predicting more useful, but I am not a fan.

At this point you might be confused: how could tracking people's predictions not ultimately improve those predictions? For the long and involved answer you can listen to the 8,000 words I recorded on the subject back in April and May of 2020. The short answer is that it focuses all of the attention on making correct predictions rather than making useful predictions. A useful prediction would have been: there will eventually be a pandemic and we need to prepare for it. But if you want to be correct you avoid predictions like that, because most years there won't be a pandemic and you'll be wrong.

This focus on correctness leaves out things that are hard to predict. Things that have a very low chance of happening. Things like black swans. You may remember me saying in the last newsletter that:

Because of their impact, the future is almost entirely the product of black swans.

If this is the case, what sorts of predictions are useful? How about a list of catastrophes that probably will happen, along with a list of miracles that probably won't: things we should worry about, and also things we can't count on. I first compiled this list back in 2017, with updates in 2018, 2019, and 2020, so if you're really curious about the specifics of each prediction you can look there. But these are my black swan predictions for the next 100 years:

Artificial Intelligence

  1. General artificial intelligence, something duplicating all of the abilities of an average human (or better), will never be developed.
  2. A complete functional reconstruction of the brain will turn out to be impossible (for example, by slicing and scanning a brain, or by constructing an artificial brain).
  3. Artificial consciousness will never be created. (Difficult to define, but let’s say: We will never have an AI who makes a credible argument for its own free will.)

Transhumanism

  1. Immortality will never be achieved. 
  2. We will never be able to upload our consciousness into a computer. 
  3. No one will ever successfully be returned from the dead using cryonics. 

Outer Space

  1. We will never establish a viable human colony outside the solar system. 
  2. We will never have an extraterrestrial colony of greater than 35,000 people. 
  3. Either we have already made contact with intelligent extraterrestrials or we never will.

War (I hope I’m wrong about all of these)

  1. Two or more nukes will be exploded in anger within 30 days of one another. 
  2. There will be a war with more deaths than World War II (in absolute numbers, not as a percentage of population). 
  3. The number of nations with nuclear weapons will never be fewer than it is right now.

Miscellaneous

  1. There will be a natural disaster somewhere in the world that kills at least a million people.
  2. The US government’s debt will eventually be the source of a gigantic global meltdown.
  3. Five or more of the current OECD countries will cease to exist in their current form.

This list is certainly not exhaustive. I definitely should have put a pandemic on it back in 2017. Certainly I was aware, even then, that it was only a matter of time. (I guess if you squint it could be considered a natural disaster…)

To return to the theme of my blog and this newsletter:

The harvest is past, the summer is ended, and we are not saved.

I don’t think we’re going to be saved by black swans, but we could be destroyed by them. If the summer is over, then as they say, “Winter is coming.” Perhaps when we look back, the pandemic will be considered the first snowstorm…


I think I’ve got COVID. I’m leaving immediately after posting this to go get tested. If this news inspires any mercy or pity, consider translating that into a donation.


Tetlock, the Taliban, and Taleb



I.

There have been many essays written in the aftermath of our withdrawal from Afghanistan. One of the more interesting was penned by Richard Hanania, and titled “Tetlock and the Taliban”. Everyone reading this has heard of the Taliban, but there might be a few of you who are unfamiliar with Tetlock. And even if that name rings a bell you might not be clear on what his relation is to the Taliban. Hanania himself apologizes to Tetlock for the association, but “couldn’t resist the alliteration”, which is understandable. Neither could I. 

Tetlock is known for a lot of things, but he got his start by pointing out that “experts” often weren’t. To borrow from Hanania:

Phil Tetlock’s work on experts is one of those things that gets a lot of attention, but still manages to be underrated. In his 2005 Expert Political Judgment: How Good Is It? How Can We Know?, he found that the forecasting abilities of subject-matter experts were no better than educated laymen when it came to predicting geopolitical events and economic outcomes.

From this summary the connection to the Taliban is probably obvious. This is an arena where the subject matter experts got things very wrong. Hanania’s opening analogy is too good not to quote:

Imagine that the US was competing in a space race with some third world country, say Zambia, for whatever reason. Americans of course would have orders of magnitude more money to throw at the problem, and the most respected aerospace engineers in the world, with degrees from the best universities and publications in the top journals. Zambia would have none of this. What should our reaction be if, after a decade, Zambia had made more progress?

Obviously, it would call into question the entire field of aerospace engineering. What good were all those Google Scholar pages filled with thousands of citations, all the knowledge gained from our labs and universities, if Western science gets outcompeted by the third world?

For all that has been said about Afghanistan, no one has noticed that this is precisely what just happened to political science.

Of course Hanania's point is more devastating than Tetlock's. The experts weren't just "no better" than the Taliban's "educated laymen". The "experts" were decisively outcompeted despite having vastly more money and, in theory, all the expertise. Certainly they had all the credentialed expertise…

In some ways Hanania's point is just a restatement of Antonio García Martínez's point, which I used to end my last post on Afghanistan: the idea that we are an unserious people. That we enjoy "an imperium so broad and blinding" we've never been "made to suffer the limits of [our] understanding or re-assess [our] assumptions about [the] world".

So the Taliban needed no introduction, and we’ve introduced Tetlock, but what about Taleb? Longtime readers of this blog should be very familiar with Nassim Nicholas Taleb, but if not I have a whole post introducing his ideas. For this post we’re interested in two things, his relationship to Tetlock and his work describing black swans: rare, consequential and unpredictable events. 

Taleb and Tetlock are on the same page when it comes to experts, and in fact for a time they were collaborators, co-authoring papers on the fallibility of expert predictions and the general difficulty of making predictions, particularly when it came to fat-tail risks. But then, according to Taleb, Tetlock was seduced by government money and went from pointing out the weaknesses of experts to trying to supplant them, by creating the Good Judgment Project and the whole enterprise of superforecasting.

The key problem with expert prediction, from Tetlock's point of view, is that experts are unaccountable. No one tracks whether they were eventually right or wrong. Beyond that, their "predictions" are made in such a way that even determining their accuracy is impossible. Additionally, experts are no better at prediction than educated laypeople. Tetlock's solution is to offer anyone the chance to make predictions, but in the process ensure that the predictions can be tracked and assessed for accuracy. From there you can promote the people with the best track records. A sample prediction might be "I am 90% confident that Joe Biden will win the 2020 presidential election."
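
To make the mechanics concrete, here is a minimal sketch (my own illustration, not anything taken from Tetlock's materials) of how a tracked, confidence-weighted record of predictions is commonly scored, using the Brier score; the individual forecasts below are hypothetical:

```python
# Minimal sketch (illustrative only): scoring a tracked record of
# confidence-weighted predictions with the Brier score. Lower is better;
# always answering "50% confident" scores 0.25, so that's the coin-flip line.

def brier_score(forecasts):
    """forecasts: list of (stated_probability, actually_happened) pairs."""
    return sum((p - (1.0 if happened else 0.0)) ** 2
               for p, happened in forecasts) / len(forecasts)

# A hypothetical forecaster's record: predictions made with confidence levels,
# then graded once the outcomes are known.
track_record = [
    (0.90, True),   # "90% confident Biden wins in 2020" -- happened
    (0.80, False),  # "80% confident X happens"          -- didn't happen
    (0.60, True),   # "60% confident Y happens"          -- happened
]

print(round(brier_score(track_record), 3))  # 0.27
# Slightly worse than coin-flipping: the one confident miss outweighs the hits.
```

The point is simply that once predictions are written down with probabilities attached, a track record can be computed, compared, and used to promote the best forecasters.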

Taleb agreed with the problem, but not with the solution. And this is where black swans come in. Black swans can't be predicted, only hedged against and prepared for. Superforecasting, by giving the illusion of prediction, encourages people to be less prepared for black swans, leaving them worse off in the end than they would have been without the prediction.

In the time since writing The Black Swan, Taleb has come to hate the term, because people have twisted it into an excuse for precisely the kind of unpreparedness he was trying to prevent.

“No one could have done anything about the 2007 financial crisis. It was a black swan!”

“We couldn’t have done anything about the pandemic in advance. It was a black swan!” 

“Who could have predicted that the Taliban would take over the country in nine days! It was a black swan!”

Accordingly, other terms have been suggested. In my last post I reviewed a book which introduced the term “gray rhino”, something people can see coming, but which they nevertheless ignore. 

Regardless of the label we decide to apply to what happened in Afghanistan, it feels like we were caught flat-footed. We needed to be better prepared. Taleb says we can be better prepared if we expect black swans. Tetlock says we can be better prepared by predicting what to prepare for. Afghanistan seems like precisely the sort of thing superforecasting was designed for. Despite this, I can find no evidence that Tetlock's stable of superforecasters predicted how fast Afghanistan would fall, or any evidence that they even tried.

One final point before we move on: this last bit is one of the biggest problems with superforecasting, the idea that you should only be judged on the predictions you chose to make, and that if you were never asked to make a prediction about something, the endeavor still "worked". But reality doesn't care about what you chose to make predictions on versus what you didn't. Reality does whatever it feels like. The fact that you didn't choose to make any predictions about the fall of Afghanistan doesn't mean that thousands of interpreters didn't end up being left behind. And the fact that you didn't choose to make any predictions about pandemics doesn't mean that millions of people didn't die. This is the chief difference between Tetlock and Taleb.

II.

I first thought about this issue when I came across a poll on a forum I frequent, in which users were asked how long they thought the Afghan government would last.

(In the interest of full disclosure, I answered one to two years.)

While it is true that a plurality of respondents said less than six months, six months was still much longer than the nine days it actually took (from the capture of the first provincial capital to the fall of Kabul). And from the discussion that followed the poll, it seemed most of the 16 people in that plurality were expecting the government to fall closer to six months, or even three, than to one week. In fact the best thing, prediction-wise, to come out of the discussion was when someone pointed out that ten years previously The Onion had posted an article with the headline U.S. Quietly Slips Out Of Afghanistan In Dead Of Night, which is exactly what happened at Bagram.

As it turns out this is not the first time The Onion has eerily predicted the future. There’s a whole subgenre of noticing all the times it’s happened. How do they do it? Well of course part of the answer is selection bias.  No one is expecting them to predict the future; nobody comments on all the articles that didn’t come true.  But when one does, it’s noteworthy. But I think there’s something else going on as well: I think they come up with the worst or most ridiculous thing that could happen, and because of the way the world works, some of the time that’s exactly what does happen. 

Between the poll answers being skewed from reality and the link to the Onion article, the thread led me to wonder: where were the superforecasters in all of this?

I don’t want to go through all of the problems I’ve brought up with superforecasting (I’ve easily written more than 10,000 words on the subject) but this event is another example of nearly all of my complaints. 

  • There is no methodology to account for the differing impact of being incorrect on some predictions vs. others. (Being wrong about whether the Tokyo Olympics will be held is a lot less consequential than being wrong about Brexit.)
  • Their attention is naturally drawn to obvious questions where tracking predictions is easy. 
  • Their rate of success is skewed both by only picking obvious questions, and by lumping together both the consequential and the inconsequential.
  • People use superforecasting as a way of more efficiently allocating resources, but efficiency is essentially equal to fragility, which leaves us less prepared when things go really bad. (It was pretty efficient to just leave Bagram all at once.)

Of course some of these don't apply, because as far as I can tell the Good Judgment Project and its stable of superforecasters never tackled the question. But they easily could have. They could have had a series of questions about whether the Taliban would be in control of Kabul by a certain date. This seems specific enough to meet their criteria. But as I said, I could find no evidence that they had. Which means either they did make such predictions and were embarrassingly wrong, so it's been buried, or, despite its geopolitical importance, it never occurred to them to make any predictions about when Afghanistan would fall. (But it did occur to a random poster on a fringe internet message board?) Both options are bad.

When people like me criticize superforecasting and Tetlock's Good Judgment Project in this manner, the common response is to point out all the things they did get right, and further that superforecasting is not about getting everything right; it's about improving the odds, and getting more things right than the old method of relying on the experts. This is a laudable goal. But as I have pointed out, it suffers from several blind spots. The blind spot of impact is particularly egregious and deserves more discussion. To quote from one of my previous posts, where I reflected on their failure to predict the pandemic:

To put it another way, I’m sure that the Good Judgement project and other people following the Tetlockian methodology have made thousands of forecasts about the world. Let’s be incredibly charitable and assume that out of all these thousands of predictions, 99% were correct. That out of everything they made predictions about 99% of it came to pass. That sounds fantastic, but depending on what’s in the 1% of the things they didn’t predict, the world could still be a vastly different place than what they expected. And that assumes that their predictions encompass every possibility. In reality there are lots of very impactful things which they might never have considered assigning a probability to. That in fact they could actually be 100% correct about the stuff they predicted but still be caught entirely flat footed by the future because something happened they never even considered. 

As far as I can tell there were no advance predictions of the probability of a pandemic by anyone following the Tetlockian methodology, say in 2019 or earlier. Nor any list where "pandemic" was #1 on the "list of things superforecasters think we're unprepared for", or really any indication at all that people who listened to superforecasters were more prepared for this than the average individual. But the Good Judgment Project did try their hand at both Brexit and Trump and got both wrong. This is what I mean by the impact of the stuff they were wrong about being greater than the impact of the stuff they were correct about. When future historians consider the last five years, or even the last ten, I'm not sure which events they will rate as being the most important, but surely those three would have to be in the top ten. They correctly predicted a lot of stuff which didn't amount to anything and missed predicting the few things that really mattered.

Once again we find ourselves in a similar position. When we imagine historians looking back on 2021, no one would find it surprising if they ranked the withdrawal of the US and subsequent capture of Afghanistan by the Taliban as the most impactful event of the year. And yet superforecasters did nothing to help us prepare for this event.

III.

The natural next question: how should we have prepared for what happened, particularly since we can't rely on the predictions of superforecasters to warn us? What methodology do I suggest instead of superforecasting? Here we return to the remarkable prescience of The Onion. They ended up accurately predicting what would happen in Afghanistan ten years in advance, just by imagining the worst thing that could happen. And in the weeks since Kabul fell, my own criticism of Biden has settled around this theme. He deserves credit for realizing that the US mission in Afghanistan had failed, that we needed to leave, and that in fact we had needed to leave for a while. Bad things had happened, and bad things would continue to happen. But in accepting the failure and its consequences he didn't go far enough.

One can imagine Biden asserting that Afghanistan and Iraq were far worse than Bush and his “cronies” had predicted. But then somehow he overlooked the general wisdom that anything can end up being a lot worse than predicted, particularly in the arena of war (or disease). If Bush can be wrong about the cost and casualties associated with invading Afghanistan, is it possible that Biden might be wrong about the cost and casualties associated with leaving Afghanistan? To state things more generally, the potential for things to go wrong in an operation like this far exceeds the potential for things to go right. Biden, while accepting past failure, didn’t do enough to accept the possibility of future failure. 

As I mentioned, my answer to the poll question of how long the Afghan government was going to last was one to two years. And I clearly got it wrong (whatever my excuses). But I can tell you what questions I would have aced (and I think my previous 200+ blog posts back me up on this point):

  • Is there a significant chance that the withdrawal will go really badly?
  • Is it likely to go worse than the government expects?

And to be clear, I'm not looking to make predictions for the sake of predictions. I'm not trying to be more accurate; I'm looking for a methodology that gives us a better overall outcome. So is the answer to how we could have been better prepared merely "more pessimism"? Well, that's certainly a good place to start, and beyond that there are things I've been talking about since this blog was started. But a good next step is to look at the impact of being wrong. Tetlock was correct when he pointed out that experts are wrong most of the time. But what he didn't account for is that it's possible to be wrong most of the time and still end up ahead. To illustrate this point I'd like to end by recycling an example I used the last time I talked about superforecasting:

The movie Molly's Game is about a series of illegal poker games run by Molly Bloom. The first set of games she runs is dominated by Player X, who encourages Molly to bring in fish: bad players with lots of money. Accordingly, Molly is confused when Player X brings in Harlan Eustice, who turns out to be a very skillful player. That is, until one night when Eustice loses a hand to the worst player at the table. This sets him off, changing him from a calm and skillful player into a compulsive and horrible one, and by the end of the night he's down $1.2 million.

Let's put some numbers on things and say that 99% of the time Eustice is conservative and successful, and on average those nights he ends up $10k ahead. But 1% of the time Eustice is compulsive and horrible, and on those nights he loses $1.2 million. So our question is: should he play poker at all? (And should Player X want him at the same table?) The math is straightforward: his expected return over 100 nights is -$210k. It would seem clear that the answer is "No, he shouldn't play poker."
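
For what it's worth, here is a minimal sketch of the expected-value arithmetic behind that figure, using the numbers from the example above:

```python
# Expected value of letting Eustice play, using the numbers from the example:
# 99 nights in 100 he ends up +$10k, 1 night in 100 he loses $1.2 million.
p_good, good_night = 0.99, 10_000
p_tilt, tilt_night = 0.01, -1_200_000

ev_per_night = p_good * good_night + p_tilt * tilt_night
print(ev_per_night)        # -2100.0  (he loses $2,100 per night on average)
print(ev_per_night * 100)  # -210000.0 over 100 nights, the -$210k in the text
```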

But superforecasting doesn't deal with the question of whether someone should "play poker". It works by considering a single question, answering that question, and assigning a confidence level to the answer. So in this case a superforecaster would be asked, "Will Harlan Eustice win money at poker tonight?" To which they would say, "Yes, he will, and my confidence level in that prediction is 99%."

This is what I mean by impact. When things depart from the status quo, when Eustice loses money, it’s so dramatic that it overwhelms all of the times when things went according to expectations.  

Biden was correct when he claimed we needed to withdraw from Afghanistan. He had no choice, he had to play poker. But once he decided to play poker he should have done it as skillfully as possible, because the stakes were huge. And as I have so frequently pointed out, when the stakes are big, as they almost always are when we’re talking about nations, wars, and pandemics, the skill of pessimism always ends up being more important than the skill of superforecasting.


I had a few people read a draft of this post. One of them complained that I was using a $100 word when a $1 word would have sufficed. (Any guesses on which word it was?) But don’t $100 words make my donors feel like they’re getting their money’s worth? If you too want to be able to bask in the comforting embrace of expensive vocabulary consider joining them.


Nukes



The key theme of this blog is that progress has not saved us. It has not made us any less sinful; it has not improved our lives in any of the ways that really matter. Rather it has introduced opportunities to sin that would beggar the imagination of someone living 200 years ago.

Of course it's easy, and maybe even forgivable, to think this is not the case. We live longer, there's less hunger and poverty, and along with this comes more freedom and less violence. For now we're going to focus on that last assertion, that things are less violent. And since we already broached the subject of nukes in our last post, we're specifically going to continue to expand on that idea.

One of the best known arguments for a decrease in violence comes from someone I actually admire quite a bit, Steven Pinker, who made the argument in his book The Better Angels of Our Nature. Taleb, as you might imagine, disagrees with Pinker's thesis and, in what is becoming a common theme, asserts that Pinker is confusing an absence of volatility with an absence of fragility. If you want to read Taleb's argument you can find it here. Needless to say, as much as I admire Pinker, on this issue I agree with Taleb.

As I have already said, this post is an extension of my last one. In that post I urged people to eschew the immediate political fight in favor of a longer-term historical outlook. In other words, that post was about being wise, and this post is about what will happen if we aren't wise, in particular what things look like as far as nukes are concerned.

As you can imagine, if our survival hinges on our wisdom then I'm not optimistic, and I personally predict that nukes are in our future. In this, as with so many things, I think I am contradicting conventional wisdom, or at least what most people believe about nuclear weapons, if they in fact believe anything at all. If they do, they might be thinking something along these lines: It's been over 70 years since the last nuke was exploded in anger. (In fact I am writing these words on the 71st anniversary of Nagasaki, though they won't be published until a few days later.) And they may further think: Yes, we have nukes, but we're not going to use them. Sure, some crazy terrorist may explode one, but the kind of all-out exchange we were worried about during the Cold War is not going to happen. First, don't underestimate the impact of a lone terrorist nuke, and second, don't write off an all-out exchange either. Particularly if we're going to poke the bear in the manner I described in my last post.

The first question to consider is why we are still worried about nukes 70 years after their invention. Generally the development of a technology is quickly followed by the development of countermeasures. To take just one example, being able to drop bombs from the air was terrifying when that first became a possibility, but it didn't take long to develop fighter aircraft, anti-aircraft guns and surface-to-air missiles. Why then, 71 years after Nagasaki and 50+ years after the development of the ICBM, can we still not defend ourselves? Can't we shoot missiles down? Well, first off, even if we could, a lot of people think building a missile defense system is the ultimate way of poking the bear. For what it's worth I don't fall into that camp, despite my reluctance, in general, to poke the bear. But even if we decide that's okay, right now it just isn't technologically feasible to make a missile defense system that works against someone like Russia or China.

At this point I'd like to offer up data on the effectiveness of various anti-missile systems. Unfortunately there's not a lot of it, and what there is isn't good. If North Korea or Iran happened to launch a single missile at the United States we might be able to stop it, but when asked what he would do in that case one knowledgeable US official is reported to have said:

If a North Korean ICBM were launched in the direction of Seattle, …[I] would fire a bunch of GMD interceptors and cross [my] fingers.

Some clarification: GMD stands for Ground-based Midcourse Defense, our current anti-ballistic-missile platform. Also, North Korea doesn't currently have a missile capable of reaching Seattle, but it's interesting to note what they do have, given how impoverished the country is in all other respects.

As I said, I'd like to offer up some data, but there isn't much of it. Recent tests of our anti-missile systems have been marginally promising, but they have mostly been conducted in reasonably controlled environments, not against missiles fired by surprise from a random location, at a time chosen by the aggressor for optimal effectiveness.

Tacked on at the end of the Wikipedia article on the US’s efforts at missile defense is a great summary of the difficulties of defending against a Russian or Chinese ICBM. In short:

  • Boost-phase defenses are the only layer that can reliably stop a MIRVed ICBM (one carrying multiple independently targetable warheads), because once the warheads separate there are too many targets.
  • Even so, boost-phase interception is really difficult, particularly against the solid-fueled ICBMs Russia and China use, whose boost phase is short.
  • And even then, the only current technology capable of doing it has to be within 40 km (~25 miles) of where the missile is launched. For those in Utah, that means an anti-missile defense system located at Hill Air Force Base could shoot down missiles launched from no farther away than downtown Salt Lake City.

The Wikipedia article concludes by saying, “There is no theoretical perspective for economically viable boost-phase defense against the latest solid-fueled ICBMs, no matter if it would be ground-based missiles, space-based missiles, or airborne laser (ABL).” (A reference from the following paper.)

In the end it’s not hard to see why nuclear missiles are so hard to defend against. Your defense can’t be porous at all. Letting even a single warhead get through can cause massive destruction. Add to that their speed and small size and you have the ultimate offensive weapon.

Thus far we've talked about the difficulties of defending against a Russian or Chinese ICBM. But of course we haven't addressed why they might decide to nuke us. I did cover that at some length in my last post, but before we dive back into it, let's look at the people who we know want to nuke us: terrorists.

Obviously there is no shortage of terrorist groups who would love to nuke us if they could get their hands on a weapon. Thus far we've been lucky, and as far as we know there are no loose nukes. And I'm sure that preventing it is one of the top priorities of every intelligence agency out there, so perhaps it won't happen. Still, this is another situation where we're in a race between singularity and catastrophe. On a long enough time horizon the chances of some act of nuclear terrorism approach 100%. To argue otherwise would be to assert that eventually terrorism and nukes will go away. I will address the latter point in a minute, but as to the former, I don't think anyone believes that terrorism will disappear. If anything, most sources of grievance have increased in the last few years. If you think I'm wrong on this point I'd be glad to hear your argument.

Of course, if we never have an incident of nuclear terrorism then, as I frequently point out, that's great. If I'm wrong, nothing happens. But if I'm right…

Perhaps you might argue that a single nuke going off in New York or Paris or London is not that bad. Certainly it would be one of the biggest news stories since the explosion of the first nuclear weapons, and frankly it's hard to see how it doesn't end up radically reshaping the whole world, at least politically. Obviously a lot depends on who ultimately ends up being responsible for the act, but we invaded Iraq after 9/11 and they had nothing to do with it. (Incidentally this is more complicated than most people want to admit, but yeah, basically they didn't have anything to do with it and we invaded them anyway.) Imagine who we might invade if an actual nuke went off.

And then of course there’s the damage to the American psyche. Look at how much things changed just following 9/11. I can only imagine what kind of police state we would end up with after a terrorist nuke exploded in a major city. In other words, I would argue that a terrorist nuke is inevitable and that when it does happen it’s going to have major repercussions.

But we still need to return to a discussion of a potential World War III, a major nuclear exchange between two large nation-states. What are the odds of that? Since the end of the Cold War the conventional wisdom has been that they are quite low, but I can think of at least half a dozen factors which might increase them.

The first factor is the one I covered in my last post: we seem determined to encircle and antagonize the two major countries that have large quantities of nuclear weapons. I previously spoke mostly about Russia, but if you follow what's happening in the South China Sea (that article was three hours old when I wrote this), or if you've heard about the recent ruling by The Hague, we're not exactly treating China with kid gloves either. I've already said a lot about this factor, so we'll move on to the others.

The next factor which I think increases the odds of World War III is the proliferation of nuclear weapons. I know that most recently Iran looks like a success story: here's a country that wanted nuclear weapons, and we stopped them. Well, of course, that remains to be seen, but it does seem intuitive that the longer we go the more countries will have nukes. Perhaps it might be instructive to determine the rate at which this is happening. In 1945 there was one country with nuclear weapons. Today, in 2016, everyone pretty much agrees that there are nine. Dividing 71 years by those eight additions gives us a new nuclear nation roughly every nine years, which means that in 99 years we'll have another 11 nations with nuclear weapons, assuming the rate of acquisition doesn't increase. But actually most technological innovation doesn't follow a linear curve. Consequently we may see an explosion (no pun intended) in the number of nations with nuclear weapons, or it may be gradual, or it may not happen at all (again, this would be great, but unexpected).

But let's assume the rate at which new countries join the nuclear club stays constant: it takes nine years on average to add a nation, and in 100 years we've added only 11 more countries. On the face of it that may seem fairly minor, but if we assume that any two belligerents could start World War III, then those 11 new nuclear states alone give us 55 potential starting points for World War III, rather than the one starting point we had in the bipolar situation which existed during the Cold War.
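
To make that arithmetic explicit, here is a minimal sketch (mine, not the post's): the 55 corresponds to the number of possible pairings among the 11 new nuclear states, and counting pairings across all twenty nuclear states, old and new, gives an even larger figure.

```python
from math import comb  # n choose k

new_states = 11       # nations added over ~100 years at roughly one per nine years
existing_states = 9   # nuclear states as of 2016, per the text

print(comb(new_states, 2))                    # 55 pairings among the newcomers alone
print(comb(new_states + existing_states, 2))  # 190 pairings across all 20 nuclear states
```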

In saying this I realize, of course, that there were more than two nations with nukes during the Cold War, but everyone had basically lined up on one side or the other. In 100 years, who knows what kind of alliances there will be? Even France and the United States have had rocky patches in their relationship over the last several decades. (More about France later.)

The third factor which might increase the odds is the wildcard that is China. As I mentioned, for a long time we had a bipolar world: the Soviet Union only had to worry about the United States and vice versa. Now we have an increasingly aggressive China whose intentions are unclear, but who is certainly very ambitious. And, from the standpoint of nuclear weapons, they're keeping their cards very close to their chest.

Most people have a tendency to dismiss China because they are still quite far behind the US and Russia. But they're catching up fast, and since they weren't really part of the Cold War, a lot of the restrictions that apply to Russia and the US don't apply to China's weapons, allowing them (from the article I just linked to)

…considerably more freedom to explore the technical frontiers of ballistic and cruise missiles than either the US or Russia.

The fourth factor involves a concept we're going to borrow from Dan Carlin, of the podcast Hardcore History: the Historical Arsonist. These are people like Hitler, Napoleon, and Genghis Khan, who burn down the world, generally not caring how many people die or what else happens in their quest to remake things in their image. You can see people like this going back as far as we have records, and as recently as World War II. While it's certainly possible that we no longer have to worry about this archetype, they seem to be a fairly consistent feature of humanity. If they haven't disappeared, then when the next one comes along he's going to have access to nuclear weapons. What does that look like? During Hitler's rise he was able to gain a significant amount of territory just by asking. How much more effective would he have been if he could have threatened nuclear annihilation when he didn't get his way?

This brings up another point: are we even sure we know all the ways someone could use nuclear weapons? In the past, one of the defining features of these historical arsonists was that they took military technology and used it in a way no one expected. Napoleon was the master of artillery and was able to mobilize and field a much bigger army than had previously been possible. Hitler combined the newly developed tank and aircraft into an unstoppable blitzkrieg. Alexander the Great had the phalanx. Nuclear weapons, as I've mentioned, are hard enough to defend against in any case, but imagine the most deviously clever thing someone could do with them, and then imagine that it was even more devious than that. With something of that level, you might have historical arson on a scale never before imagined.

The fifth factor which makes the odds of World War III greater than commonly imagined is the potential for change in the underlying geopolitics. By this I mean that nations can break up, change governments, mutate in their national attitudes, and so on. We've already seen the Soviet Union break up, and while that went fairly smoothly (at least so far; it actually hasn't been that long when you think about it), there's no reason to assume it will go that smoothly the next time. Particularly when you look at the lesson of the former Soviet republics that did give up their weapons. Given what's happening in Ukraine, it seems probable that they now regret giving up their nukes.

Of course the US isn't going to last forever. I have no firm prediction of what the end of the country looks like, and once again it's possible that we'll reach some sort of singularity long before then, but it may happen sooner than we imagine, particularly if the increased rancor of the current election represents any kind of trend. Thus if, but more likely when, something like that happens, what does it look like in terms of nukes? If Texas breaks off, that's one thing, but if you end up with seven nations, who ends up with the nukes?

And then of course there's the possibility of a radical change in government. Some people think that Trump would be catastrophic in this respect. On the other side of the aisle, many conservatives think that a country like France might get taken over by Muslims if demographic trends continue and immigration isn't stopped; certainly a book on the subject has proven very popular. Does a Muslim-run France with nukes act exactly the same as the current nation? Maybe, maybe not.

The final factor to consider, at least for those who believe in revelation and scripture, is the set of references to the last days which fit very well with what might be expected from nuclear warfare. We believe that war will be poured out upon all nations, that the elements will melt with fervent heat, and finally that the earth will be baptized by fire. Obviously claiming to know what these prophecies mean is a dangerous and prideful game, and that is not what I'm doing. What I am saying is that this is one more factor to be added to, and weighed alongside, the other factors which have already been mentioned.

The point of all this is not to convince you to drop everything and start building a bomb shelter (though I think if you already have one you shouldn't demolish it). Along with everything I've said, I still believe that no man knoweth the hour. I'm also not saying I know that some form of nuclear armageddon will accompany the Second Coming. My point, as always, is that we are not saved and cannot be saved through our own efforts. Only the Son of Man and Prince of Peace has the ability to bring true and lasting peace. Further, and perhaps even more importantly, thinking we have achieved, or even can achieve, peace on our own, that we just need to keep pushing the spread of science, or liberal democracy, or our "enlightened" Western values, is more dangerous and more likely to hasten what we fear than reminding ourselves of the fallen nature of man and restricting ourselves to the preaching of the gospel, while eschewing the preaching of progress.

In the end, attempting to eliminate World War III may paradoxically hasten its arrival…


Sports, the Sack of Baghdad and the Upcoming Election



When I created this podcast I decided that I wouldn't shy away from controversial topics. And when people talk about topics to avoid, the first ones they mention are politics and religion. Having already covered the latter, I decided that maybe it's time to tackle the former. I'm a big political junkie, though perhaps it's more accurate to say I'm a big history junkie, and insofar as politics is a subcategory of history, I love politics. But conventions and debates, other than a few phrases here and there, are not history; they're political theater. So, with some rare exceptions, I don't bother watching them, and don't ask me what I thought of Trump's speech or Obama's (or Scott Baio's, for that matter). In my defense, I don't think either conventions or debates have much power to influence the actual election results. I know that some people will argue that the Nixon-Kennedy debate swung things to Kennedy. Perhaps it did, but I was 11 years away from being born, so I couldn't have watched it even if I had wanted to.

People might also mention the 2000 election, arguing, probably correctly, that even a slight push in one direction would have given the election to Gore, and of course a slight push in the other direction would have kept it from being decided by the Supreme Court. And this is where we start to see the difference between history and politics. I’m glad it was close, because the drama and uncertainty that came with that turned it from just another election into history. Election night in 2000 was one of the most exciting nights of my life, and it only got more exciting as it became clear how tight things actually were.

I bring all this up because I think differentiating politics from history is important. For one thing, politics is very short term. Perhaps a metaphor would help illustrate my point: an election is like watching a football game. If you're political, you really want your team to win and you really want the other team to lose. Passions are high; it doesn't matter what your team does, you still want them to crush the other guys, and it doesn't matter what the other side does, you still want them to be crushed. As an example, the BYU-Utah rivalry is big in my area, and one of my neighbors is a huge Utah fan. At one point I was talking to him about a recent game and I said I wanted it to be close and exciting. He vehemently disagreed; he wanted Utah to win in a blowout. That's the difference between politics and history. If you're strictly political, it's all about your team winning, regardless of how uninteresting the game is. If your interests are more historical then, to extend the metaphor, you're more interested in watching a last-minute, come-from-behind touchdown, regardless of which team pulls it off. In other words, something like the 2000 recount.

Another example, also involving football, concerns a BYU fan. This was back in the early-to-mid 2000s, when the memory of the LaVell Edwards years was still fresh. As I was talking to this fan, he mentioned, in all seriousness, that BYU fans sometimes called BYU "The Lord's Team". I made the joke that it was dangerous to bring religion into things, because if the Good Lord did care about college football (and, I added, I was pretty sure he didn't) it was clear that he was Catholic, not Mormon, since historically Notre Dame was a better team than BYU. I was surprised by the vehemence of his reaction, though in retrospect maybe I shouldn't have been. He claimed that BYU was the better program. I said you can't just look at the last few decades when LaVell Edwards was the coach; you have to look at the whole history of the program, unless you want to argue that the Good Lord didn't start paying attention to things until 1972. Despite pointing this out, he refused to budge. I sent him a link to a site that declared Notre Dame the all-time best football program (in the intervening years Alabama has passed them; currently BYU is 66th, behind Utah, who is 37th), and he wasn't swayed. This was politics. BYU was the best program/team/university ever, and nothing was going to change his opinion.

This is where I think we are today. We've been on top for a while. People are really invested in the Democratic-Republican rivalry. They have their team, and all they care about is winning. They're way more fixated on whether someone plagiarized a speech, or said the wrong thing in emails, or seems to be too friendly with Russia (or whether someone threw a punch or dumped beer on the quarterback's family) than on parallels between now and the last time there was a strong populist candidate, or on what kind of agreements we made with Russia when the Soviet Union collapsed, or on how the situation in the South China Sea may resemble the situation before World War I (or on whether it took 20 years for BYU to win their first game against Utah). Perhaps this is good; perhaps it's a waste of time to worry about things that happened decades ago. Perhaps you consider examining previous black swans a waste of time when Trump just barely said something ridiculous (again). But whether you worry about black swans and catastrophes or not, they're going to happen. To paraphrase the old quote attributed to Trotsky, "You may not be interested in catastrophes, but catastrophes are interested in you." And when they are, understanding things beyond just the "LaVell Edwards era" is going to come in handy.

As an example of this, I have a theory of history which I call "Whatever you do, don't let Baghdad get sacked." You may think this is a reference to one of the recent Gulf Wars, but actually I'm referring to the sack of Baghdad by the Mongols in 1258. (Genghis had been dead for about 30 years at that point, but the Mongols were still really scary.) This incident may have been one of the worst preventable disasters in history. Somewhere between 200,000 and 2 million people died. Anyone who loves books shudders when you bring up the loss of the Library of Alexandria, but in the sack of Baghdad we have an equally great library being destroyed. Contemporary accounts said that "the waters of the Tigris ran black with ink from the enormous quantities of books flung into the river and red from the blood of the scientists and philosophers killed." Even though it happened centuries ago, people will say that Baghdad still hasn't recovered. I don't know what dominated the thinking of the Abbasid Caliphate in the years before Baghdad was sacked. Perhaps, like us, they argued about taxes, or fought amongst themselves, or worried about foreigners. Perhaps there was even someone who said that they should do whatever it took to appease the Mongols. If there was, I see no evidence of it.

The sack of Baghdad was a black swan, a big one. And the whole course of history is different because it happened. Of all the things that the Abbasid Caliphate did (or perhaps, in this case, didn't do), this is what's remembered more than 750 years later. Perhaps judging them by that standard is harsh, but what other standard should we judge them by? If the point of government is not to prevent your capital from being sacked, your rulers from being killed, your treasure from being carried away, and your women from being raped, then what is its point?

As I said, whatever the Abbasid Caliphate did, it was the wrong thing. Now obviously I’m operating with perfect hindsight, but this takes us back to antifragility. It’s true that you can’t predict the future, but there are things that you can do to limit your exposure to these gigantic catastrophes, these major black swans. And that’s what governments are for.

To put this into terms we can understand: if we end up in a nuclear war with Russia or China, then whatever else we were focused on (student loans, poverty, Black Lives Matter, etc.) was the WRONG THING. Forget 1,000 years from now; if the next president gets us into a nuclear war, that is all people will remember in four years. As I said, nothing else will matter.

It's not just nuclear war; there are lots of other things which could end up being preventable black swans, things that in retrospect would make the petty arguments we're having about immigration and email seem laughable, if they're remembered at all. But for the moment let's focus our attention on nuclear war, because I think some useful ideas might come out of that discussion.

At first glance you might think that there's not much difference between the two candidates on this issue. In fact you might even give the edge to the Democrats, particularly since Obama, at least at the beginning of his term, spent a lot of time working to eliminate nuclear weapons, for which (along with his ability to not be George Bush) he was given the Nobel Peace Prize. But of course the point is that no one wants nuclear war. No one is going to campaign on a platform of nuking Russia. Consequently, if we want to examine the candidates on this issue we have to take a few steps back. Where should we look if we're worried about nukes? There is of course the possibility of a terrorist nuke, or perhaps, in its death throes, North Korea might set off a nuke or two. Both of these would be pretty bad, but, one, there's not a lot we can do about them, and two, while they would definitely be giant black swans, I think they would only be really impactful in the short term. Which is not to say that we shouldn't be paying attention to this area, but there's a limited amount we can do. No, if we're really trying to prevent the sack of Baghdad, we should be looking at China or Russia.

How, then, do the two major candidates (I'll get to third-party candidates later) compare on this issue? Well, it's not something that comes up a lot. At this point in the election there's been a lot more focus on whether Trump is really as good a businessman as he claims to be, or whether Clinton was being stupid or corrupt when she ran all of her email through a private server, than on any discussion of the dangers of a nuclear exchange with the Russians. Of course the Russians do come up. 20,000 DNC emails were released, and various people have accused the Russians of being behind it; as part of that, they have accused Trump of being too cozy with Putin. This is generally viewed as a negative, but from the perspective of avoiding the big war, it might actually be a good thing.

However, if you dig you can find some illuminating things. No real smoking guns, but it does appear that Clinton definitely leans one way and Trump obviously leans another. Let’s start with Clinton. Clinton appears to be an interventionist. She pushed for intervention in Libya. She appears to have wanted to intervene in Syria as well. On the bigger and scarier issues she is reportedly very hawkish with Russia. She apparently has compared Putin to Hitler. And by the way, on that point, she’s completely and totally wrong. Not because Putin is nicer or better than Hitler but because unlike Hitler, Putin. Has. Nukes. When it comes to China Clinton doesn’t appear to do any better.

Turning to Trump: if anything, people feel that he's too close to Putin, as I already mentioned, but then there are his comments about NATO. And here there is an interesting discussion to be had. A few months ago Trump gave an interview to The New York Times, and as part of the interview he said that he would be less willing to defend our NATO and East Asian allies at the current level without greater financial contributions from them. The interview rambles a bit, but these appear to be the key quotes:

If we cannot be properly reimbursed for the tremendous cost of our military protecting other countries, and in many cases the countries I’m talking about are extremely rich…

With massive wealth. Massive wealth. We’re talking about countries that are doing very well. Then yes, I would be absolutely prepared to tell those countries, “Congratulations, you will be defending yourself.”

In taking that position, would Trump increase or decrease the chances of a nuclear war? In the immediate and unequivocal judgment of many, this position dramatically increased the chances of war. The article in Vox was typical of the reactions:

Wednesday night, Donald Trump said something that made a nuclear war between the United States and Russia more likely. With a few thoughtless words, he made World War III — the deaths of hundreds of millions of people in nuclear holocaust — plausible.

I disagree with this assessment. Of course it’s hard to know what will set off a war, and I think World War III was already plausible. But let’s dissect the core idea of whether Trump increased the odds of war with that statement.

The first thing Trump is claiming is that the countries we're protecting are wealthy countries who can probably pay more for their own protection if such protection is required. This is true. He's also talking in broader terms about the US being overextended. Whether the US is currently overextended is up for debate, but what is not up for debate is that being overextended has been a significant contributing factor in the fall of previous great empires.

The second thing to consider is that when he tells NATO nations they can defend themselves, he's talking about ignoring the collective defense clause (Article 5) of the original treaty. Now, in general I'm in favor of following treaties and doing what we say we're going to do, but NATO has extended well beyond its original purpose, and well beyond its original members, and maybe re-examining it isn't such a bad idea. Of course the writer at Vox and many others think that questioning it is just the first step toward nuclear war. But is that actually the case? Does Trump's position make war more likely?

At the moment there are 28 members of NATO. If any of them goes to war with Russia, then the US goes to war with Russia. If we kicked some of the member nations out, as Trump seems to be suggesting, doesn't this literally make a war between the US and Russia less likely? Now, I'm not saying that it makes a war between, say, Russia and Estonia less likely (though it wouldn't be much of a war…). I'm just saying it makes the war we're trying to prevent, the war the Vox article specifically mentions, less likely. Honestly, and I'm sure the author feels like he's fighting the good fight, it sounds like he's just looking for any excuse to demonize Trump.

Speaking of Estonia, I'm a big fan of Estonia. I actually applied for e-residency there. But I'm almost positive that if Russia wants it, it's not worth using nukes to keep them from getting it. Also, thinking about Estonia leads naturally to a thought experiment. Imagine that in the next few years Texas manages to secede. Now imagine that a few years after it seceded it joined the Russian version of NATO, a military alliance designed exclusively around containing the US. Further imagine that this alliance included nearly all of South and Central America. How would we feel? Well, that's probably a close comparison to how the Russians feel.

Instead of asking whether it would be a good idea to back off from guaranteeing Estonia's independence with the threat of nuclear weapons, Clinton is of the opinion that NATO should continue to expand. Whether this expansion would include countries like Ukraine and Georgia is unclear, but between her general bias toward expansion and her husband's own expansion of NATO into Poland, the Czech Republic, and Hungary (all former Warsaw Pact countries), it's unlikely that the Russians would believe any assurances she made on the subject; they would rather expect the worst were she to become President. And let us pause here for a moment to explain the Russian mindset. It's not just a matter of feeling encircled, or being unable to deal with the loss of their empire. Whatever you believe about Russia and however you feel about Putin, the last great war they experienced, World War II, was literally (if you look at deaths) 50 times worse for them than for us. When you consider something like the Siege of Leningrad, it's understandable if they're a little paranoid.

Of course there are at least two arguments which are going to be raised at this point. One is that we are unlikely to use nukes if Russia just invaded Estonia, or a similar NATO member. This is certainly true, but once you're in a war, escalation becomes natural (just look at World War I, which also involved a large alliance). Also, given how few troops we have, using tactical nukes might seem like a natural option. In other words, while we're not likely to use nukes in a situation where Russia invaded Estonia, we're certainly more likely to do it than if we had no treaty commitment to Estonia.

The second argument is that if Estonia (or a similar country) is not a NATO member, then it is far more likely to get invaded by Russia. This is also certainly true, and yes, in that case we have made war more likely, but not the kind of war we're really worried about. It is not the Sack of Baghdad. And here we once again get into the difference between volatility and fragility. By taking the vast majority of countries in Europe and putting them under the umbrella of NATO and the US nuclear deterrent we've made things a lot less volatile; Europe has enjoyed an unprecedented era of peace. But we have also made things a lot more fragile. One of the points Taleb makes is that when you have high volatility the graph moves a lot, but never very far. When you have low volatility the graph is largely flat, until suddenly you hit a cliff. In this case the cliff would be a war between the US and Russia, and it might very well involve nukes.
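To make that volatility/fragility picture concrete, here's a minimal toy simulation, entirely my own sketch and not anything from Taleb, in which tension accumulates each year in exponentially distributed amounts and is released whenever it crosses a threshold. In the volatile regime the release threshold is low, so the graph moves constantly but never drops far; in the suppressed regime the threshold is high, so decades look calm and then everything releases at once:

```python
import random

def worst_drop(years, release_threshold):
    """Toy model: geopolitical tension accumulates every year in random
    (exponentially distributed) amounts. Whenever accumulated tension
    exceeds the release threshold, it is all released at once as a
    single 'drop'. Returns the largest drop seen over the run."""
    tension, worst = 0.0, 0.0
    for _ in range(years):
        tension += random.expovariate(1.0)   # this year's added tension
        if tension >= release_threshold:
            worst = max(worst, tension)       # everything pent up releases at once
            tension = 0.0
    return worst

random.seed(0)
# High volatility: tension releases almost every year, so no single drop is large.
print("volatile regime, worst single drop: %.1f" % worst_drop(70, release_threshold=1.0))
# Suppressed volatility: decades of calm, then one enormous drop (the cliff).
print("suppressed regime, worst single drop: %.1f" % worst_drop(70, release_threshold=25.0))
```

Both regimes accumulate roughly the same total tension over 70 years; the only difference is whether it comes out as many small drops or as one catastrophic one.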

I don’t think people have really absorbed how different nuclear weapons have made things. Previously it didn't matter how desperate one of the belligerents became; if the other side outfought and outproduced them, there was nothing they could do. It didn't matter how desperate Germany and Japan got, at some point they were going to lose and we were going to win. But imagine if they had possessed the same number of ICBMs that Russia currently does.

I am by no means suggesting that Russia is as desperate as Imperial Japan or Nazi Germany, but that doesn't mean they aren't feeling angry or backed into a corner. We've gone 70 years without another nuke being exploded in anger, and after surviving the Cold War I think we're getting complacent and arrogant. These days people don't take Russia seriously, and they should. Recall that during the Cold War we let the Soviet Union get away with a lot: they installed puppet governments across all of Eastern Europe, and when the people of one of those countries, Hungary, had an uprising, they crushed it. We let them invade Afghanistan (though that was something of an own goal, and a mistake we ended up duplicating), and while we provided assistance to the rebels it wasn't much. It was only when they tried to put missiles in Cuba that we really pushed back, and that nearly resulted in catastrophe.

Having said all this you may be wondering what I'm actually advocating, and you may even get the impression that the whole point of this episode is to declare my support for Trump. That's not the case. In fact, while I was writing the initial blog post, a story came out that Trump had repeatedly asked an advisor why he couldn't use nukes. Which, if true, is scary. I haven't had time to really look into it, and as we saw above, it's not unprecedented for people to latch onto things just because they make Trump look bad.

To go back to the very beginning of the post, what I'm mostly advocating is taking a historical view of elections rather than a political one. And honestly, that mostly means getting away from the two major parties, because they're nothing but politics. I know it's a little late in the game to be tossing in a discussion of third parties, but I have long been an advocate for greater third party participation in American politics. I think we need a whole marketplace of ideas, with vigorous and informed discussion. In 1257 the citizens of Baghdad didn't need to hear a discussion of tax rates, or the latest fashion, or whether the laws were too harsh or too lax; they needed to hear from the lone general who advocated doing everything possible to placate the Mongols. Six months before the sack I'm sure there were all sorts of things which seemed very important, and which didn't matter in the slightest six months and one day later.

Steering a nation is complicated, and I'm not saying I know who would do the better job. Even if I did, the results are well beyond my ability to influence. But when you're thinking about these things, spare at least some thought for preventing big negative black swans. Spare a thought for what you can do to prevent the Sack of Baghdad.


We Are Not Saved


The harvest is past, the summer is ended, and we are not saved.

Jeremiah 8:20

When I was a boy, I couldn't imagine anything beyond the year 2000. I'm not sure how much of that had to do with the supposed importance of the beginning of a new millennium, how much was just due to the difficulty of extrapolation in general, and how much was due to my religious upbringing. (Let's get that out of the way right up front. Yes, I am LDS/Mormon.)

It's 2016, and we're obviously well past the year 2000, 16 years into the future I couldn't imagine. For me, at least, it definitely is The Future, and any talk of living in the future is almost always followed by the observation that we were promised flying cars and spaceships and colonies on the moon. This observation is then followed by the obligatory lament that none of these promises have materialized. Of course moon colonies and flying cars are promises that were made when I was a boy. Now we have a new set of promises: artificial intelligence, fusion reactors, and an end to aging, to name just a few. One might ask why the new promises are any more likely to be realized than the old ones. And here we see the first hint of the theme of this blog. But before we dive into that, I need to lay a little more groundwork.

I have already mentioned my religious beliefs, and these will be a major part of this blog (though in a different way than you might expect.) In addition to that I will also be drawing heavily from the writings of Nassim Nicholas Taleb. Taleb’s best known book is The Black Swan. For Taleb a black swan is something which is hard to predict and has a massive impact. Black swans can come in two forms: positive and negative. A positive black swan might be investing in a startup that later ends up being worth a billion dollars. A negative black swan, on the other hand, might be something like a war. Of course there are thousands of potential black swans of both types, and as Taleb says, “A Black Swan for the turkey is not a Black Swan for the butcher.”

The things I mentioned above, AI, fusion and immortality, are all expected to be positive black swans, though, of course, it’s impossible to be certain. Some very distinguished people have warned that artificial intelligence could mean the end of humanity. But for the moment we’re going to assume that they all represent positive black swans.

In addition to being positive black swans, these advancements could also be viewed as technological singularities. Here I use the term a bit more broadly than is common. Generally when people talk about the singularity they are using the term with respect to artificial intelligence. But as originally used (back in 1958) the singularity referred to technology progressing to a point where human affairs would become unrecognizable. In other words, these developments would have such a big impact that we can't imagine what life looks like afterwards. AI, fusion, and immortality all fall into this category, but they are by no means the only technologies that could create a singularity. I would argue that the internet is an excellent example of a singularity. Certainly people saw it coming, and some of them even correctly predicted some aspects of it (just as, if we ever achieve AI, there will no doubt be some predictions which prove true). But no one predicted anything like Facebook or the other social media sites, and those sites have ended up overshadowing the rest of the internet. My favorite observation about the internet illustrates the point:

If someone from the 1950s suddenly appeared today, what would be the most difficult thing to explain to them about life today?

I possess a device, in my pocket, that is capable of accessing the entirety of information known to man.

I use it to look at pictures of cats and get in arguments with strangers.

Everything I have said so far deserves, and will eventually get, a deeper examination. What I'm aiming for now is just the basic idea that one possibility for the future is a technological singularity: something which would change the world in ways we can't imagine, and, if its proponents are to be believed, change it for the better.

If, on the one hand, we have the possibility of positive black swans, technological singularities, and utopias, is there also, on the other hand, the possibility of negative black swans, technological disasters, and dystopias? Of course there is. We could be struck by a comet, annihilate each other in a nuclear war, or end up decimated by disease.

Which will it be? Will we be saved by a technological singularity or wiped out by a nuclear war? (Perhaps you will argue that there’s no reason why it couldn’t be both. Or maybe instead you prefer to argue that it will be neither. I don’t think both or neither are realistic possibilities, though my reasoning for that conclusion will have to wait for a future post.)

It's The Future, and two paths lie ahead of us, the singularity or the apocalypse, and this blog will argue for apocalypse. Many people have already stopped reading, or are prepared to dismiss everything I've said, because I have already mentioned that I'm Mormon. Obviously this informs my philosophy and worldview, but I will not use "Because it says so in the Book of Mormon" as a step in any of my arguments. Which is not to say that you will agree with my conclusions; in fact I expect this blog to be fairly controversial. The original Jeremiah had a pretty rough time, but it wasn't his job to be popular, it was his job to warn of the impending Babylonian captivity.

I am not a prophet like Jeremiah, and I am not warning against any specific calamity. While I consider myself to be a disciple of Jesus Christ, as I have already mentioned, this blog will be at least as much informed by my being a disciple of Taleb. And as such I am not willing to make any specific predictions except to say that negative black swans are on the horizon. That much I know. And if I’m wrong? One of the themes of this blog will be that if you choose to prepare for the calamities and they do not happen, then you haven’t lost much, but if you are not prepared and calamities occur, then you might very well lose everything. As Taleb says in one of my favorite quotes:

If you have extra cash in the bank (in addition to stockpiles of tradable goods such as cans of Spam and hummus and gold bars in the basement), you don’t need to know with precision which event will cause potential difficulties. It could be a war, a revolution, an earthquake, a recession, an epidemic, a terrorist attack, the secession of the state of New Jersey, anything—you do not need to predict much, unlike those who are in the opposite situation, namely, in debt. Those, because of their fragility, need to predict with more, a lot more, accuracy.

I have already mentioned Taleb as a major influence. To that I will add John Michael Greer, the archdruid. He joins me (or rather I join him) in predicting the apocalypse, but he does not expect things to suddenly transition from where we are to a Mad Max style wasteland (which, interestingly enough, is the title of the next movie). Rather he puts forward the idea of catabolic collapse. Catabolism broadly refers to the metabolic process by which the body breaks down its own tissue to stay alive. Applied to a civilization, the idea is that as a civilization matures it reaches the point where it spends more than it "makes," and eventually the only way to support that spending is to start selling off or cannibalizing its own assets. In other words, along with Greer, I do not think civilization will be wiped out in one fell swoop by an unconstrained exchange of nukes (and if it is, then nothing will matter). I think it will be a slow decline, broken up by a series of mini collapses.
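To put that mechanism in concrete terms, here is a deliberately crude toy model, my own sketch and not Greer's formal one, of a civilization that has built up more capital than it can maintain. The specific numbers and functional forms are arbitrary; the only point is the feedback loop: the shortfall is covered by consuming capital, which shrinks production, which widens the shortfall.

```python
def catabolic_collapse(years=60):
    """Crude sketch: capital produces output with diminishing returns,
    while upkeep obligations scale with the peak amount of capital ever
    built. Once upkeep outruns production, the shortfall is covered by
    consuming capital, which lowers production further."""
    capital, peak = 100.0, 100.0   # start above the level the society can actually maintain
    trajectory = []
    for _ in range(years):
        production = 5.0 * capital ** 0.5   # diminishing returns on capital
        upkeep = 0.55 * peak                # obligations built at the height don't shrink
        surplus = production - upkeep
        # reinvest half of any surplus; cover any shortfall by cannibalizing capital
        capital += 0.5 * surplus if surplus > 0 else surplus
        capital = max(capital, 0.0)
        peak = max(peak, capital)
        trajectory.append(round(capital, 1))
    return trajectory

t = catabolic_collapse()
print(t[:10])   # the decline accelerates as capital is consumed
print(t[-1])    # long before the end of the run the capital stock is gone
```

The toy only captures the cannibalization loop, not the stairstep pattern of mini collapses, but it shows why the decline, once started, feeds on itself.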

All of this will be discussed in due time; suffice it to say that despite the religious overtones, when I talk about the apocalypse you should not be visualizing The Walking Dead, The Road, or even Left Behind. But the things I discuss may nevertheless seem pretty apocalyptic. Earlier this week I stayed up late watching the Brexit vote come in. In the aftermath people are using words like terrifying, bombshell, and flipping out, and furthermore talking about a global recession, all in response to the vote to Leave. If people are that scared about Britain leaving the EU, I think we're in for a lot of apocalypses.

You may be wondering how this is different from any other doom and gloom blog, and here, at last, we return to the scripture I started with, which gives us the title and theme of the blog. Alongside all of the other religions of the world, including my own, there is a religion of progress, and indeed progress over the last several centuries has been remarkable.

These many years of progress represent the summer of civilization, and out of that summer we have assembled a truly staggering harvest. We have conquered diseases, split the atom, invented the integrated circuit, and been to the moon. But if you look closely you will realize that our harvest is basically at an end, and despite the fantastic wealth we have accumulated, we are not saved. Yet in contemplating this harvest it is easier than ever before to see why we need to be saved. We understand the vastness of the universe, the potential of technology, and the promise of the eternities. The fact that we are not wise enough to grasp any of it makes our pain all the more acute.

And this is the difference between this blog and other doom and gloom blogs. Another blog may talk about the inevitable collapse of the United States because of the national debt, or runaway global warming, or cultural tension. Someone with faith in continued scientific progress may ignore all of that, assuming that once we're able to upload our brains into a computer none of it will matter. Thus anyone who talks about potential scenarios of doom, without also talking about potential advances and singularities, is only addressing half of the issue. In other words, you cannot talk about civilizational collapse without talking about why technology and progress cannot prevent it. They are opposite sides of the same coin.

That’s the core focus, but this blog will range over all manner of subjects including but not limited to:

  • Fermi’s Paradox
  • Roman History
  • Antifragility
  • Environmental Collapse
  • Philosophy
  • Current Politics
  • Book Reviews
  • War and conflict
  • Science Fiction
  • Religion
  • Artificial Intelligence
  • Mormon apologetics

As in the time of Jeremiah, disaster, cataclysms and destruction lurk on the horizon, and it becometh every man who hath been warned to warn his neighbor.

The harvest is past, the summer is ended, and we are not saved.