[00:00:00] The Why Curve, with Phil Dobbie and Roger Hearing.
[00:00:03] New Year, new dangers.
[00:00:05] As we start 2025, is it going to be a meteor strike or a new pandemic that ends mankind?
[00:00:12] Artificial intelligence taking us over and killing us off.
[00:00:15] Putin or Iran launching a nuclear war.
[00:00:17] Or a climate catastrophe.
[00:00:19] What could kill us this year?
[00:00:21] What do we need to worry about in 2025?
[00:00:24] The Why Curve.
[00:00:26] Okay, out of all of those, I think the climate.
[00:00:28] I mean, I don't think this year, but out of all of them, the thing we're most...
[00:00:32] Well, yeah.
[00:00:33] It feels like we've got to worry about, but we seem to be ignoring.
[00:00:36] And Donald Trump, obviously, will make sure we ignore even more this year.
[00:00:38] Well, there is that, yes.
[00:00:39] But that's a longer-term issue in terms of what might actually...
[00:00:41] I suppose big climate events could be massive typhoons, massive whatever that could happen.
[00:00:48] But there's also so much else.
[00:00:50] I mean, the AI thing, everyone, there's more doom and gloom,
[00:00:52] particularly from the people who founded AI, who actually started it all.
[00:00:56] All saying, oh, hang on a second.
[00:00:57] This isn't quite what we wanted.
[00:00:58] So, if you see Arnold Schwarzenegger suddenly appearing in your street...
[00:01:03] God, you're showing your age.
[00:01:04] I know.
[00:01:04] But, yeah.
[00:01:05] But it's funny, isn't it?
[00:01:06] Because the whole...
[00:01:07] Because it seems such a ridiculous story.
[00:01:09] Well, yes, but...
[00:01:10] That machines would get rid of us.
[00:01:13] But then if machines are...
[00:01:14] They become sentient beings and they can...
[00:01:17] You know, they're looking after their own survival.
[00:01:18] Well, or they don't have to become sentient.
[00:01:20] They just have to have the wrong programming, essentially.
[00:01:22] Or program themselves.
[00:01:23] Yeah.
[00:01:23] I mean, there's all kinds of possibilities.
[00:01:24] So, could we all die because of a bad bit of COVID?
[00:01:25] Is that what you're saying?
[00:01:26] We could.
[00:01:26] Well, yes.
[00:01:27] Of course, that's the other thing.
[00:01:28] I mean, that's the thing that has come closest, I suppose, in recent years to doing us all in.
[00:01:33] Yeah.
[00:01:34] And who's to say what may be around the corner in that department?
[00:01:36] Well, just actually, just a systematic collapse of the things we rely on.
[00:01:41] If power goes off, for example, and we lose the internet...
[00:01:44] What do we do then?
[00:01:45] Yeah.
[00:01:45] Yeah.
[00:01:45] And we all sort of like, we're forced to, you know, catch our own food.
[00:01:47] You'll have to have a couple of cows here and make your own...
[00:01:50] I've got a bit of room for a couple of cows.
[00:01:52] Yeah, you could do that.
[00:01:52] Yeah.
[00:01:52] I could probably get a couple of goats into mine.
[00:01:54] Yeah.
[00:01:55] I mean, that sort of thing.
[00:01:55] But also, I mean, you know, in the midst of all this, a meteor, we haven't had one
[00:02:00] for God knows how many centuries, millennia, whatever it is.
[00:02:04] Yeah.
[00:02:04] But it's one of those things where the general consensus is we couldn't do much about it
[00:02:07] if it were coming.
[00:02:08] So, you know, that's another thing to think about.
[00:02:09] Well, there was that movie, wasn't there, Don't Look Up?
[00:02:11] Yes.
[00:02:11] Where our response to it was, just don't look in the sky and pretend it's not coming until
[00:02:15] it comes and then we all die.
[00:02:16] Yes.
[00:02:16] Well, that was an analogy, I suppose, for the climate catastrophe.
[00:02:19] But there are loads of things like this that could do for us all.
[00:02:22] Plus, you know, there's the most basic stuff.
[00:02:24] I mean, certain kinds of earthquake that could set off massive tsunami could do, you know,
[00:02:30] all kinds of damage.
[00:02:31] And again, they've been talking about that as ways in which almost planet-ending chaos
[00:02:36] could begin.
[00:02:36] Well, let's have a look and see what could kill us this year, then.
[00:02:39] Let's talk to Hayden Belfield.
[00:02:40] He's a research associate and academic project manager at the Cambridge University Centre
[00:02:45] for the Study of Existential Risk.
[00:02:47] And he joins us now.
[00:02:48] So, Hayden, first of all, the chances of us dying from some major event.
[00:02:52] And we talked about some of them, you know, artificial intelligence wiping us out, you know,
[00:02:58] the Terminator coming back and killing us off.
[00:03:01] Pandemic.
[00:03:02] An alien invasion or pandemic, which, you know, is obviously the one we've been closest to.
[00:03:05] Meteorite. I mean, all kinds of things.
[00:03:06] Well, the chance of any of those happening this year obviously is infinitesimally small,
[00:03:10] isn't it?
[00:03:11] Well, is it?
[00:03:11] Is it?
[00:03:12] Yeah.
[00:03:13] I'm not sure I would say infinitesimally.
[00:03:15] I wish it were infinitesimally.
[00:03:16] If it was, we could probably wrap it up and go home.
[00:03:19] I think these risks are small, but they would have such massive consequences that it's worth
[00:03:26] taking them very seriously.
[00:03:27] I mean, the risk of like getting in a car crash is pretty small, but you wear a seatbelt.
[00:03:32] The risk of your house burning down is pretty unlikely, but you still get fire insurance.
[00:03:36] So in the same way, the risk of, you know, a nuclear war or a really big pandemic or,
[00:03:41] you know, things like that, they are maybe small year on year, but, you know, over a hundred
[00:03:47] years, are they that small?
[00:03:50] And are they small enough to be able to just kind of like put to one side and not pay attention
[00:03:54] to?
[00:03:54] I would argue no, unfortunately not.
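(Editor's aside, not part of the conversation: a quick way to see the point Hayden is making, that a risk which looks small in any single year stops looking small over a century, is to compound it. The annual probabilities below are made-up placeholders chosen purely for illustration, not estimates from this episode.)

```python
# Hypothetical annual probabilities, purely for illustration.
annual_risks = {
    "nuclear war": 0.005,      # assume a 0.5% chance in any given year
    "severe pandemic": 0.01,   # assume a 1% chance in any given year
}

for name, p in annual_risks.items():
    # Chance of at least one occurrence over 100 years,
    # treating each year as independent: 1 - (1 - p)^100
    over_century = 1 - (1 - p) ** 100
    print(f"{name}: {p:.1%} per year -> roughly {over_century:.0%} over a century")
```

Even a half-percent annual risk compounds to roughly a two-in-five chance over a hundred years, which is the sense in which "small year on year" can still be worth insuring against.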
[00:03:56] But isn't there a danger, though, that because, you know, the consequences would be
[00:04:00] so extreme, they just get ignored?
[00:04:04] Because I know in the financial markets, for example, there's been, you know, I've been
[00:04:07] talking a lot to people about what happens if Donald Trump does this and that, or if
[00:04:11] he does something really extreme and people just go, well, yeah, you can't factor in those
[00:04:17] quite, you know, it's a black swan.
[00:04:19] Yeah.
[00:04:19] So how do you, how do you even calculate that?
[00:04:22] So similarly, you know, if, if climate change is going to be very catastrophic, how do you
[00:04:27] price that in?
[00:04:28] Or if, you know, there's a danger of something so big, well, I guess the obvious one is if
[00:04:33] there is a pandemic that wipes out, you know, a large part of the population, how do
[00:04:37] you factor that into, into any equation?
[00:04:39] Or do you just say, well, it's going to be awful if it happens, but we're not going to
[00:04:41] insure against it, obviously, because the cost would just be disproportionate for something
[00:04:45] which is such a small likelihood.
[00:04:47] Yeah.
[00:04:47] So a lot of these risks that we're talking about are probably too big for
[00:04:50] individual insurance companies to price accurately, but maybe not too big
[00:04:56] for reinsurance companies.
[00:04:58] So this, these are the like insurance companies of insurance companies.
[00:05:02] Um, so like Swiss Re or Munich Re.
[00:05:05] So they are a bit more interested in these risks because they really do need to take them into
[00:05:11] consideration.
[00:05:12] Actually, something that we've been suggesting to a few investors is
[00:05:17] that if you are big enough and you've got long enough time horizons, really one of these massive,
[00:05:23] massive risks is the only thing that could make you bankrupt.
[00:05:26] So for example, we've been advising Cambridge University.
[00:05:29] It's got the biggest endowment in Europe, and, you know, basically the only thing that could make
[00:05:34] Cambridge bankrupt and cause it to stop functioning
[00:05:38] is one of these massive, massive risks.
[00:05:40] And the same thing for kind of sovereign wealth funds or, you know, very large long-term pension
[00:05:45] funds.
[00:05:46] Uh, they should have these kind of very long time horizons.
[00:05:50] And then also because they're so big, they have to own a representative slice of the whole
[00:05:54] economy.
[00:05:55] So they can't just say that's someone else's problem.
[00:06:01] Like if anything happens, it will affect their portfolio.
[00:06:01] It will affect their chances of, of survival.
[00:06:04] Well, if it's a catastrophe, an absolute catastrophe, though,
[00:06:07] who cares about your share portfolio?
[00:06:10] Whoever's left, I suppose, if there is anyone left.
[00:06:13] But, Hayden, let's break this down into the various things that could
[00:06:17] happen.
[00:06:17] And I know one of the areas that you, I think have looked at in quite a lot of detail is,
[00:06:20] is this whole thing about AI.
[00:06:21] Because that's, I suppose, a recent arrival on the scene in terms of our way of looking
[00:06:26] at it as a potential threat.
[00:06:28] The idea, and it's certainly been voiced by some of the people who were in at the beginning
[00:06:32] of AI, that this is something we don't really know how to control.
[00:06:37] And potentially it could be a threat.
[00:06:39] I mean, is it something that you, I mean, just tell us what you think about it.
[00:06:42] Yeah.
[00:06:42] So you're right that it has been a concern kind of really from the early days.
[00:06:46] You know, people have been imagining robots and artificial intelligence for hundreds of
[00:06:52] years.
[00:06:52] And there's always been a kind of slight fear of that.
[00:06:55] But even from when the study of it began properly in the 1940s and 1950s
[00:07:02] with Turing and with others, even then the kind of, you know, godfathers of the field raised
[00:07:08] the concern about this way back, around 1950.
[00:07:11] But you're right that for a long, long time, there was kind of a bit of concern in kind
[00:07:16] of movies and things like that, but it wasn't taken seriously much at all.
[00:07:21] One, there's this great survey I like to reflect on from YouGov where they, they asked
[00:07:27] the British public, what do you think is the most likely thing to kill people?
[00:07:30] And the top ones are ones you might be familiar with, and we might get on to talking about later,
[00:07:34] like nuclear weapons or climate change.
[00:07:35] But AI comes way down at the bottom between an alien invasion and a religious apocalypse.
[00:07:43] So back in 2016, people didn't think it was real at all.
[00:07:47] But when you've polled people more recently, it's kind of shot way up to the top of people's
[00:07:52] concerns.
[00:07:53] And I think that reflects the kind of dramatic changes that people have seen in AI over the
[00:08:00] last, well, 10 years, but really over the last kind of two, since the big ChatGPT moment,
[00:08:06] the kind of equivalent of the Sputnik moment from 1957, where people have kind of woken up
[00:08:12] to how incredibly powerful these things have been getting in the background and kind of
[00:08:17] what the pace of change is.
[00:08:19] So in that last 10 years, we've gone from, you know, it being maybe able to identify kind
[00:08:25] of some handwriting to now, you know, when I'm grading
[00:08:32] essays from master's students, I'm like, well, this could have been produced by an AI.
[00:08:38] So yeah, so I think it's like the supercomputer out of Hitchhiker's Guide to the Galaxy, which
[00:08:44] came out with the answer to life, the universe and everything: 42.
[00:08:47] But the question was, that's some of their essays.
[00:08:48] Phil is really showing his age, but yes.
[00:08:50] Yeah.
[00:08:51] But Google has Willow.
[00:08:53] So this new quantum chip supposedly solved a problem so complex it would take today's
[00:09:00] best supercomputers an estimated 10 septillion years to solve.
[00:09:06] That's quite a while.
[00:09:06] That's actually longer than the history of the entire universe, many times over.
[00:09:11] And it can solve this in like five minutes, they're saying.
[00:09:13] I don't know whether they built it yet or not, but that demonstrates the speed of development,
[00:09:17] doesn't it?
[00:09:17] And the power.
[00:09:17] Yeah.
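(Editor's aside, not from the conversation: the "many times over" is easy to put a number on. The 10 septillion years is the figure quoted above for the Willow benchmark; the roughly 13.8-billion-year age of the universe is the standard estimate.)

```python
# Quick arithmetic check of the comparison made above.
benchmark_years = 10 * 10**24      # 10 septillion years, the quoted figure
age_of_universe_years = 13.8e9     # roughly 13.8 billion years

ratio = benchmark_years / age_of_universe_years
print(f"About {ratio:.1e} times the age of the universe")  # ~7e14, hundreds of trillions of times
```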
[00:09:18] I mean, quantum does seem to be a bit to me like nuclear fusion, kind of always 30 years
[00:09:23] away.
[00:09:25] We'll see whether-
[00:09:26] Well, 30 years and five minutes.
[00:09:27] Yeah, well, that actually is it.
[00:09:29] So, but in terms of AI, I think that the main concerns that people have can be
[00:09:35] broken down into kind of three general buckets.
[00:09:39] So the first one is that it's to do with the system itself, the model itself, that it goes
[00:09:44] haywire, that we lose control of it, that it causes a load of damage.
[00:09:46] And that isn't kind of speculative.
[00:09:48] We've seen that with lots of kind of cyber weapons or cyber worms.
[00:09:53] So Stuxnet was something that was used to take out Iranian nuclear enrichment facilities,
[00:09:58] but then ended up kind of going haywire and causing a lot of damage across the economy.
[00:10:04] So you can see, imagine things happening like that with AI systems.
[00:10:06] The next one is that it will get misused.
[00:10:10] So for example, like if the North Koreans had really advanced AI, they would use it to
[00:10:18] surveil their population, try and predict who was going to be a dissident.
[00:10:21] And they would rely on these kind of drones that would never, ever refuse to shoot a civilian.
[00:10:27] So that's kind of the other kind of second concern.
[00:10:29] And then the third one is just like, it's a classic arms race.
[00:10:32] So in the same way that there were arms races for nuclear weapons, biological weapons, and
[00:10:36] you know, going way back for dreadnoughts, for aircraft, for everything.
[00:10:42] The worry is that there could be a real, real arms race for advanced AI.
[00:10:47] And this could lead to all kinds of shortcuts or tensions or even conflict.
[00:10:52] So yeah, those are the points.
[00:10:53] In all of those scenarios, you haven't covered the one that science fiction, you know, seems to love,
[00:10:58] which is the scenario where these computers become so smart, they decide that humans are worthless
[00:11:02] and we're just a waste of space and they're going to wipe us out.
[00:11:04] Well, is that a possibility?
[00:11:05] I mean, you know, that's been mentioned.
[00:11:07] And I suppose if you're talking about existential risk, that is the one that would do it.
[00:11:11] But Hayden, is that something that is more than just science fiction?
[00:11:15] Is it a possibility?
[00:11:16] So that's why I think of when I think of like loss of control, right?
[00:11:19] At the moment, these things, we can kind of control them.
[00:11:22] But we've had quite a lot of examples of things that we thought we understood.
[00:11:26] We thought we knew how they worked, but then they kind of got out of our hands and they
[00:11:31] ended up causing a lot of damage.
[00:11:33] And you can imagine that, you know, Stuxnet was a pretty stupid, just like malicious worm
[00:11:40] and it caused maybe tens of billions of dollars of damage.
[00:11:43] So if something's much more intelligent and much more capable of kind of replicating itself
[00:11:49] and much more capable of spreading, you know, could cause all kinds of humongous damage.
[00:11:55] And then, you know, there's arguments around if we are trying to kind of shut it down,
[00:12:02] then, you know, it would just respond the way anything would respond.
[00:12:07] If it is at risk of not being able to accomplish its goals, it will try and gain resources and
[00:12:14] it will try and prevent us from doing so.
[00:12:16] That argues that it's sentient in some way, doesn't it?
[00:12:19] And that it would have to have the consciousness to do that.
[00:12:21] It's just following logic, though, isn't it?
[00:12:22] To try and achieve its end goal.
[00:12:24] Yeah, it doesn't have to be sentient.
[00:12:25] Yeah.
[00:12:26] I mean, like a, you know, are corporations sentient?
[00:12:31] Are, like, states sentient?
[00:12:33] No, but they are very, very powerful and in some ways kind of superintelligent,
[00:12:37] like organisms out there that can wreck our economy and societies if they're not properly controlled.
[00:12:47] So, Hayden, you've absolutely, I mean, you know, it's the beginning of 2025.
[00:12:50] You already ruined it for me because I came on thinking, I'm thinking, well, this will
[00:12:53] just be a bit of fun.
[00:12:54] We're going to just have a bit of a laugh about things that could possibly go wrong.
[00:12:57] But what you're describing, the idea that you have that sort of power that could run amok,
[00:13:02] that just gets in the hands of the wrong people.
[00:13:04] Like, you know, there's enough dictators on this planet who want to take control,
[00:13:08] who are given that sort of power.
[00:13:11] And as you say, they could be killing their own people.
[00:13:13] They could be killing other people.
[00:13:14] You know, the machines aren't going to rebel against that.
[00:13:18] They're going to be much more successful in that endeavor.
[00:13:22] That all of a sudden seems like, well, of course that's going to happen.
[00:13:26] I mean, that seems like...
[00:13:27] Well, how likely is that?
[00:13:28] I mean, let's just say in 2025, because we're looking ahead to the year.
[00:13:33] Hayden, I mean, realistically, is it possible to put a likelihood on that particular scenario?
[00:13:37] Oh, yeah.
[00:13:37] So that is kind of, I don't think AI is going to take over the world in 2025.
[00:13:42] So that's the positive.
[00:13:44] Okay, well, that's very good to know.
[00:13:46] Okay.
[00:13:47] You know, these things are going very, very rapidly, right?
[00:13:50] Like, the biggest, richest companies in the whole world, these kind of tech giants,
[00:13:55] are pouring in literally hundreds of billions of dollars,
[00:13:59] and probably cumulatively trillions of dollars,
[00:14:01] to try and reach this kind of, like, holy grail of artificial general intelligence.
[00:14:07] This really, really advanced AI,
[00:14:10] they really believe that there's a possibility of reaching that within,
[00:14:15] let's say, five or ten years.
[00:14:17] So, you know, I'm not sure what the actual kind of likelihood is or the chances,
[00:14:24] but, like, these very rich and very sophisticated players are making a very, very big bet on that.
[00:14:30] So I think we should take that seriously.
[00:14:32] Okay.
[00:14:32] So if everyone has access to it, then it's just a question of principle, isn't it?
[00:14:38] You know, if there's an AI engine where you can do, you know, you can do anything at all,
[00:14:42] and you just have to type in, kill everyone, and it goes off and does that.
[00:14:47] What are the checks and balances?
[00:14:48] And are we doing that?
[00:14:50] You know, I mean...
[00:14:51] Yeah, so I think it's unlikely that everyone would...
[00:14:53] I think it's, like, these things cost a lot. You know, like, the massive, massive data center
[00:14:58] that you would probably need to train these very advanced AI systems.
[00:15:02] Like, we've gone from it being, you know, 10 years ago,
[00:15:06] one postdoctoral researcher with kind of, like, two chips could probably make something useful.
[00:15:11] But now you need thousands of people working kind of for a whole year
[00:15:17] and spending hundreds of millions of dollars on these big data centers.
[00:15:20] So that trend is probably going to continue.
[00:15:22] But you could have a religious war, for example, where you put all that resource in
[00:15:25] and say, well, okay, this whatever this religion is, you know, pick your side,
[00:15:30] but it's all based in the Middle East, kill the other side.
[00:15:33] I mean, that's, you know, that would be a scary consequence.
[00:15:35] So the US has actually, like, blocked most other countries from being able to import these advanced chips.
[00:15:43] But I think the kind of worry that I would have is these, you know, very, very big, rich players,
[00:15:49] if they are the ones to get to this very advanced level of capability first,
[00:15:54] why will they not just concentrate power in their hands, you know?
[00:15:58] And it's concentrated in the hands of a few tech companies in one country in the world,
[00:16:02] and then everyone else is kind of excluded from that.
[00:16:06] I think there's very big worries about that kind of concentration of power and influence.
[00:16:12] And it will be opposed by a lot of people.
[00:16:14] But let's move on then, Hayden, from that to a different kind of technology,
[00:16:20] something that, in a way, I mean, it's not modern, it's not ancient,
[00:16:22] but we're a lot more used to it, which is the nuclear threat.
[00:16:26] Because we are now moving in, in 2025, to one nation, Russia,
[00:16:30] that's actually revised its nuclear protocols in terms of when it can use weapons.
[00:16:35] We also have one country, Iran, that seems to be moving pretty rapidly to developing one
[00:16:40] at the point where it's feeling very hurt because of what's gone on recently in the Middle East.
[00:16:45] What are the chances?
[00:16:46] And, of course, you've got North Korea, which seems to be at a moment of increasing anger.
[00:16:52] It's been very angry for a long time.
[00:16:53] What are the chances this year of a nuclear war?
[00:16:57] Yeah, way, way too high.
[00:16:59] I think we're probably in the scariest international situation that we've been in,
[00:17:04] at least since the early 1980s, the kind of the second Cold War,
[00:17:08] when Reagan was talking about like the evil empire of the Soviet Union,
[00:17:12] or maybe even the kind of closest we've been since the 1960s and the Cuban missile crisis.
[00:17:19] And of that, JFK said the chances were between one in three and even that we would get to nuclear war.
[00:17:27] Would you say we're at that kind of level?
[00:17:28] Oh, not necessarily.
[00:17:30] I mean, it's very hard to say what exactly the risks are.
[00:17:33] I'm just trying to, you know, say how, what a major concern it could be.
[00:17:37] So the risk is, as you say, we've got all these different players.
[00:17:44] I think, so, people are kind of familiar with the idea that, you know,
[00:17:49] there might be kind of an exchange of nuclear weapons.
[00:17:52] There might be kind of fallout.
[00:17:53] There might be things like that.
[00:17:54] But I think what people haven't really internalized is this theory of nuclear winter
[00:18:00] that was put out in the 1980s of like the burning cities,
[00:18:04] all the smoke rises up and blocks out the sun for maybe a decade, maybe two.
[00:18:09] There's actually been a lot of analysis with new climate models over the last five or 10 years on that.
[00:18:15] And it's kind of confirmed, yeah, that does seem to be what will happen.
[00:18:19] And if it did happen, it would lead to between two and five billion people starving to death.
[00:18:28] So not to bring down your listeners.
[00:18:31] I mean, you're possibly too young to remember, you know, where we were in the 80s.
[00:18:37] But there was this perpetual fear that that's how we were all going to die.
[00:18:40] Well, the British government issued pamphlets and things on how to, you know,
[00:18:44] build yourself a nuclear shelter with a wooden door.
[00:18:47] Exactly.
[00:18:47] And then Raymond Briggs,
[00:18:49] that brilliant novel of his, When the Wind Blows, yeah.
[00:18:52] But are we at a point as dangerous?
[00:18:56] I mean, you know, we reflected on the missile crisis in the 60s.
[00:19:00] But of course, there was this period in the 80s.
[00:19:01] Would you say now that, you know, if someone came to you and said,
[00:19:04] Cambridge Centre for Existential Risk, are we at a point as dangerous as those?
[00:19:09] And we should take it as seriously as that.
[00:19:11] I think certainly similarly dangerous and similarly serious to the 1980s.
[00:19:18] You know, in the 1980s, there was some sort of conflict indirectly between the US and the Soviet Union in Afghanistan.
[00:19:26] And there was kind of, you know, this long simmering tension in Eastern Europe.
[00:19:31] But there wasn't this much more direct form of confrontation that we see now in Ukraine.
[00:19:39] And, you know, Putin does seem a lot more kind of unhinged and kind of isolated even than the kind of like late Soviet leaders.
[00:19:54] So, yeah, I mean, I'm certainly very, very worried about it.
[00:19:57] I mean, when you mentioned Briggs, Threads was recently rebroadcast on the BBC.
[00:20:03] And what a terrifying film that is.
[00:20:05] I don't know if either of you two have seen that.
[00:20:07] Yes, it's where they imagine Sheffield after a nuclear attack.
[00:20:12] And what's so good about it, I think, is that it doesn't stop just a kind of couple of days afterwards.
[00:20:16] It goes years, months, you know, decades afterwards and just shows how unrelentingly kind of grim living in this kind of irradiated,
[00:20:27] but also kind of permanently shrouded, freezing cold, you know, no food to be had landscape would be.
[00:20:37] So, yeah, the nuclear situation is...
[00:20:39] And the morning after was another one as well, wasn't it, around that time as well?
[00:20:42] The day of...
[00:20:43] The day of...
[00:20:43] Or whatever.
[00:20:44] That was a climate thing, but yes.
[00:20:45] No, there was one about nuclear war as well.
[00:20:47] It was The Day After, yeah.
[00:20:49] And apparently, actually, Reagan saw this film and became kind of convinced about how scary nuclear war would be
[00:20:57] and moved towards this agreement with Gorbachev.
[00:21:01] So films really can have a...
[00:21:03] But it doesn't seem like...
[00:21:03] Let's open up the cinema in the White House come January then.
[00:21:06] Exactly, yeah, quite.
[00:21:06] But it doesn't feel like we're in that place now, that, you know, having lived through then and living now,
[00:21:12] the, you know, public discourse doesn't seem to be quite so much around this fear that was very prevalent back then.
[00:21:20] I mean, you know, people were...
[00:21:22] I met people...
[00:21:23] I was travelling at the time.
[00:21:24] I met people who'd left Europe because they thought it wasn't safe, for example.
[00:21:27] Yeah, it's bizarre, really.
[00:21:28] I really kind of hoped that Oppenheimer would spark a bit more interest.
[00:21:32] And there was this kind of uptick of interest in kind of nuclear topics.
[00:21:37] But, yeah, I mean, like...
[00:21:38] It was quite long, though, wasn't it?
[00:21:39] You know, all the nuclear weapons are still there.
[00:21:41] They're still on hair-trigger alert.
[00:21:42] Yeah.
[00:21:43] Like, we now know with better science that it would be just as devastating as they were worried about.
[00:21:47] And yet, yeah, for some reason, it just never seems to make... to break into the headlines.
[00:21:51] So you're saying this is the biggest fear?
[00:21:53] This is the biggest challenge that we face?
[00:21:56] So certainly it feels the most urgent one to me because it is, you know, literally every 15 minutes the world could end.
[00:22:05] You know, if...
[00:22:07] And I'm saying 15 minutes because that is the amount of time that the president has if there's a Russian sub off the coast and it sends a nuclear weapon, a nuclear missile towards the White House.
[00:22:19] The president only has 15 minutes to decide how to respond.
[00:22:22] And it's a kind of use-it-or-lose-it situation.
[00:22:25] And if he doesn't use his land-based...
[00:22:28] Well, that's terrifying.
[00:22:30] But on that prospect, let me challenge you, Hayden, by saying, certainly to me, if I were looking, and I'm no expert, obviously, in existential risk, unlike yourself,
[00:22:38] I would talk about something that has already happened in a form.
[00:22:42] And I'm talking about the pandemic because we actually have recent lived experience of that.
[00:22:47] It didn't destroy the world, but it could have done, potentially, or something like it could.
[00:22:52] So what about the pandemic threat?
[00:22:54] Because that's something we know about.
[00:22:56] Yeah.
[00:22:56] I mean, I think everyone will have that experience.
[00:22:59] I think it is also interesting that the pandemic seems to have had very little cultural resonance, right?
[00:23:06] Like, where are the kind of films or TV shows or, you know, popular books set in the pandemic examining that?
[00:23:15] I think it was always a surprise to historians why the Spanish flu 100 years ago never showed up in kind of popular media in the 1920s.
[00:23:29] Because, you know, millions of people died from it from 1918.
[00:23:29] But you know why that is?
[00:23:30] Because we lived it.
[00:23:30] And we don't want to live it again.
[00:23:32] Nobody wants to remember it.
[00:23:33] Yeah.
[00:23:33] I think that's honestly, yeah.
[00:23:34] It seems to have been, that does seem to be the response of most policymakers in society.
[00:23:39] So no TV shows.
[00:23:40] The TV shows that chose to reflect the pandemic, regular TV shows that ran, you know, through the pandemic and covered the pandemic, didn't do as well as those shows that chose to ignore it.
[00:23:51] Exactly.
[00:23:52] And so, yeah, we have failed, I would say, to learn the proper lessons from the pandemic.
[00:23:59] So we are unprepared for the next pandemic, which, you know, it's literally only a matter of time until that will happen.
[00:24:07] Could that happen in 2025?
[00:24:09] Could we actually see a pandemic on a bigger, worse scale, do you think?
[00:24:12] There's some people who are concerned about bird flu jumping to being transmissible between mammals and then between humans.
[00:24:21] And that's kind of on people's radar.
[00:24:24] But then, yeah, I mean, a zoonotic disease is one that jumps over from animals.
[00:24:31] Yeah, I mean, we are continually putting pressure on ecosystems.
[00:24:37] There could be some bat out there that at the moment in its lungs is kind of brewing up the next version.
[00:24:45] But the thing that people who are, you know, professional doomers are even more concerned about is not just a kind of natural zoonotic pandemic, but one that is engineered.
[00:24:59] As some people still think potentially COVID-19 was.
[00:25:05] I mean, the jury is out on that, I think.
[00:25:07] And however it happens, the worry is that having been through the experience of the pandemic and dividing society so much about how, for example, worthwhile it was having inoculations and injections,
[00:25:20] there'll be a whole lot more people this time saying, oh, it's not worth it and less government money being spent on it.
[00:25:26] And, you know, I fear that the next time the whole approach would be, you know, almost like the...
[00:25:32] Let it rip.
[00:25:33] Yeah, let it rip.
[00:25:34] Let the bodies pile up.
[00:25:35] That's what I was looking for.
[00:25:36] Yeah, exactly.
[00:25:37] Allegedly.
[00:25:39] So globally, it was probably about 20 million people that died due to the pandemic, and it obviously completely disrupted economies and societies for about two years.
[00:25:51] And I think one of the big worries is that if you are a rogue state or if you're a kind of big substantial terrorist group, you would look at that and say, wow, I can completely disrupt these massive economies and kill tens of millions of people if I'm able to create a biological weapon.
[00:26:12] So I think that's kind of major, major concern that people have.
[00:26:16] So apparently ISIS tried to look into making a biological weapon.
[00:26:21] There were terrorist groups like Aum Shinrikyo from 20 years ago that explored that.
[00:26:26] But my major, major concern is that a state gets back into it.
[00:26:31] So most countries stopped theirs.
[00:26:36] They had biological weapons in World War II and for some years afterwards.
[00:26:41] And the Soviet Union had this massive, massive secret, completely illegal one in the 70s and 80s.
[00:26:47] But it's kind of faded a bit.
[00:26:49] There isn't...
[00:26:50] At their peak, they had literally 65,000 people working on it.
[00:26:55] But that's kind of faded.
[00:26:56] But my worry is that the experience of the pandemic and kind of genetic engineering getting much, much better and much, much cheaper will kind of lead some of these bad actors to say, oh, well, let's just explore it.
[00:27:11] Let's just see what we can get.
[00:27:12] Let's see what kind of weapons we can create.
[00:27:14] And all this involves bad actors, as we call them, and their potential threat.
[00:27:18] Let's talk about one that hovers in people's minds.
[00:27:21] It's actually something beyond anyone's control.
[00:27:23] I'm talking about the meteorite strike.
[00:27:25] Now, we know that has happened many times during the Earth's history.
[00:27:30] Obviously not in our current recorded time, as far as anyone knows.
[00:27:34] Is it possible to know whether that is a threat?
[00:27:38] I mean, it's one of the classic ones everyone talks about.
[00:27:40] In fact, Hollywood films are made about it.
[00:27:41] Well, we did have, didn't we?
[00:27:42] Sort of like 2004 or something.
[00:27:43] What was it?
[00:27:44] Apophysis or whatever it was called.
[00:27:46] And I think there was an EU-sponsored attempt to see if it's possible to knock a meteorite off its course.
[00:27:52] Maybe it was NASA.
[00:27:53] Yeah, that's right.
[00:27:54] What are the risks?
[00:27:55] So the other ones are much harder to predict because they involve human decision makers.
[00:28:02] And therefore, it's hard to predict that.
[00:28:04] And they also involve kind of technologies that in some ways are improving.
[00:28:09] So AI or genetic engineering, hard to predict what speed that is going at.
[00:28:16] Natural risks.
[00:28:18] So like meteorites or volcanoes, say, are much easier to predict because we can just go out and count them and we can count how many have hit the Earth.
[00:28:26] And meteorites actually is one of the few kind of positive good news stories.
[00:28:32] So, I mean, to keep on harping on about this point about the importance of movies.
[00:28:39] There were two in the late 90s, Armageddon and Deep Impact, that really kind of did scare a lot of people and kind of put it on the radar of politicians.
[00:28:52] And there was this massive kind of comet that slammed into Jupiter and left this kind of massive, you know, moon-sized, not a hole exactly, but this moon-sized kind of scar in Jupiter.
[00:29:06] In the 90s that kind of really got meteorites up the agenda.
[00:29:10] NASA put in a decent amount of money and a decent amount of kind of talented staff, like, you know, good people.
[00:29:18] And the program is called Spaceguard.
[00:29:22] And it went out and counted all of the really, really big asteroids.
[00:29:25] So the kind of planet killers and then the continent killers and so on.
[00:29:29] And the good news is that none are on track to hit us.
[00:29:33] So that was something that we didn't know about.
[00:29:36] Fantastic.
[00:29:37] And they went out and counted.
[00:29:39] And, yeah, none are on track to hit us.
[00:29:42] Well, let's not talk about that one then.
[00:29:44] So what about the climate then?
[00:29:47] That's sort of a slow burn, which is maybe a bad phrase, but I wonder how slow it is.
[00:29:52] Increasingly big hurricanes, typhoons.
[00:29:54] Well, and temperatures as well.
[00:29:55] So we had that over 40 degrees in 2022.
[00:29:59] Before that, it was 38.7 degrees in 2019.
[00:30:02] Before 2022, and this is just in the UK, there was nothing ever over 38 degrees.
[00:30:06] And we keep on beating it.
[00:30:07] As you said, Roger knows, the audience knows,
[00:30:10] I lived in Australia for a long time.
[00:30:11] They've now got used to temperatures over 40, very regularly, including just over Christmas time.
[00:30:16] So Western Australia got up to 50.7 degrees in 2022 in Onslow, and that's not even out in the outback.
[00:30:27] But anyway, and obviously America, everywhere in the world, we're hitting these new temperatures.
[00:30:31] What's the danger that, you know, we see it as a slow burn, but what's the danger that we actually just hit a tipping point and things accelerate that much faster?
[00:30:39] And we get caught surprised.
[00:30:42] Yeah, so it's a bit tricky, the climate change one.
[00:30:44] So it's different from the other ones.
[00:30:46] So the other ones are kind of like short, sharp shocks, like nuclear war or a meteorite or a volcanic explosion or something.
[00:30:53] They're all kind of, boom, one big event that is very, very bad.
[00:30:58] Climate change is a bit more kind of like, you know, it turns the pressure up with everything.
[00:31:04] It makes everything kind of worse. All the scary things,
[00:31:07] it makes them all a bit more likely.
[00:31:09] I mean, there's some people-
[00:31:10] So more likely that if we have less food, we'll have war, we'll have bigger population movements, etc.
[00:31:17] Yeah, so those are the kind of major effects.
[00:31:19] I mean, there's some people who have looked into kind of the direct impacts of climate change.
[00:31:23] So can you render enough of the earth uninhabitable through climate change?
[00:31:29] As in, so there's this thing called the wet bulb temperature where humans, you know, we regulate our body temperature by sweating.
[00:31:37] If it's humid and hot enough out, you can't sweat off the heat and you can die.
[00:31:42] So, you know, some parts of the world might be just uninhabitable because they're like as hot as the Sahara or they have this wet bulb temperature.
[00:31:51] So there's uninhabitable just for humans.
[00:31:54] There's like, you can't grow as much food as you would before.
[00:32:08] But for both of those, it seems unlikely that, you know, the whole of the world would be uninhabitable or that, you know, you couldn't grow any kind of food whatsoever.
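(Editor's aside, not from the conversation: for readers curious about the wet bulb temperature Hayden mentions, here is a rough Python sketch using Stull's 2011 empirical approximation, which combines air temperature and relative humidity. The example conditions are illustrative assumptions, not figures from the episode.)

```python
import math

def wet_bulb_stull(temp_c: float, rh_percent: float) -> float:
    """Approximate wet bulb temperature in degrees C from air temperature (C)
    and relative humidity (%), using Stull's (2011) empirical fit.
    Reasonable for roughly 5-99% humidity at ordinary surface pressure."""
    t, rh = temp_c, rh_percent
    return (
        t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
        + math.atan(t + rh)
        - math.atan(rh - 1.676331)
        + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
        - 4.686035
    )

# Sustained wet bulb readings around 35 C are often cited as the theoretical
# survivability limit; harm starts well below that. Example conditions are hypothetical.
for temp, rh in [(35, 50), (35, 75), (40, 60), (45, 40)]:
    print(f"{temp} C at {rh}% RH -> wet bulb ~{wet_bulb_stull(temp, rh):.1f} C")
```

The point of the sketch is that humidity, not just the headline air temperature, is what pushes conditions toward the limit of what sweating can cope with.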
[00:32:08] So I think the concern is, as you identified, that it's kind of more acute that in particular areas, it becomes very, very hard.
[00:32:17] And you get the extreme weather events, you get extreme heat events, and that causes kind of mass migration, that causes international tension, that causes food shortages, which then, you know, countries compete for that.
[00:32:32] So it's kind of less the kind of direct hazard like the other ones, but it kind of makes everything a bit riskier.
[00:32:40] And potentially could lead to, as you say, I mean, they interact with each other.
[00:32:44] So it makes perhaps nuclear war more likely, perhaps even makes pandemics more likely as people try to cultivate areas they never did before.
[00:32:51] Or just civil unrest.
[00:32:52] So, you know, that whole argument, if you have red ants and black ants, I think it was wrongly attributed to David Attenborough.
[00:32:57] But he said, if you put red ants and black ants together in a jar and you just leave the jar, they're fine.
[00:33:02] If you shake the jar, then the red ants will think that the black ants are out to get them.
[00:33:06] And the black ants will think the same of the red ants, and they will kill each other.
[00:33:10] You know, so there's that danger. In other words, you can cohabit.
[00:33:14] But if you put pressure on a situation, things can really just steamroll and get out of hand.
[00:33:20] So, you know, mass civil unrest like that, which might be brought about by climate change, but could that be what finishes us off?
[00:33:26] And is that going to happen?
[00:33:27] It's not going to happen this year.
[00:33:28] These are definitely some of the – yeah, the overall indirect effects are some of the big ones.
[00:33:33] And then the impact that, you know, let's say you have some sort of weather event that causes food shortages, that affects the kind of food network.
[00:33:43] Well, maybe that leads to kind of civil political unrest in a particular country.
[00:33:47] And maybe that further damages ecosystems, which then further damages kind of food networks, which then further leads to unrest and so on.
[00:33:57] So I think that kind of spiraling effect and that kind of cascading effect, but within kind of particular countries or particular regions, is something that people are concerned about.
[00:34:08] Yeah.
[00:34:09] And watching my son, what's the danger that we'll have a whole generation that gets so obsessed with computer games they forget to eat and they just starve to death?
[00:34:19] There we are.
[00:34:19] That's the existential issue.
[00:34:20] Yeah, yeah, yeah.
[00:34:20] That's not on our radar yet, I guess.
[00:34:22] Hayden, as we draw this to a close, I mean, it's fascinating doing a sort of tour d'horizon.
[00:34:26] We mentioned alien invasions, for example.
[00:34:28] But, I mean, if you were to say, if I'm going to say to you, what of the things we've talked about, what is the existential risk, if there is one, in 2025?
[00:34:37] What comes top of the list of the ones we've talked about?
[00:34:43] We're not saying it's actually going to happen.
[00:34:44] We're not saying it's even likely, but perhaps the likeliest of the ones we've spoken about to kill us all off in 2025.
[00:34:50] Right.
[00:34:51] Killing us all off, I mean, here's another kind of positive, you know, silver lining to a dark cloud kind of story.
[00:34:58] Killing every single person off is quite hard.
[00:35:00] You know, there are 8 billion people and they're spread out all over the world and they're, you know, they're in submarines at the bottom of the ocean or they're up in the Arctic or they're in kind of uncontacted tribes in the middle of jungles or on kind of farms in the middle of nowhere.
[00:35:15] So it is quite hard to kill every single one of 8 billion people.
[00:35:18] But the risk that I'm kind of, that really feels like it could happen at any given moment, I think is still the nuclear risk.
[00:35:27] It's the one that's been hanging over us for the last 80 years.
[00:35:31] Well, there's no answer to that either, is there?
[00:35:33] Because we've designed nuclear weapons.
[00:35:36] We have the capability.
[00:35:37] It's only our willingness not to use it.
[00:35:38] It's not a nuclear killer.
[00:35:40] I mean, I'm not so sure.
[00:35:41] I mean, like, you know, we were talking about the 80s. Towards the end of the 80s, we drastically reduced the risk of nuclear war.
[00:35:49] I mean, there used to be kind of 70,000 warheads in the world, right?
[00:35:54] So the US had 23,000, the Soviet Union had 40,000.
[00:35:58] We're now down to where both of those countries are only allowed to deploy 1,500 of those warheads.
[00:36:05] So it's down.
[00:36:06] That's enough, though, isn't it?
[00:36:07] Well, unfortunately, it's still enough.
[00:36:09] It is still enough.
[00:36:09] But, you know, you've got to look at the slope of the graph.
[00:36:13] Like, if we can continue that progress, like, really farsighted statesmen in the past have been able to reach agreements that are in the interests of both countries to kind of keep on reducing these risks,
[00:36:26] to keep on reducing the number of warheads, to keep on kind of creating institutions of stability that get us to a kind of safer world for everyone.
[00:36:36] And so that...
[00:36:36] But you use the word both.
[00:36:38] You use the word both there.
[00:36:40] And that's the problem, isn't it?
[00:36:41] Because back then, there were perhaps two major powers with nuclear weapons and three or four more.
[00:36:47] We've now got, by my calculation, about nine who probably have them.
[00:36:52] And they don't necessarily play by that game.
[00:36:54] And when you had two up against each other, they could probably work out that neither would benefit from it.
[00:36:59] Now you're depressing me.
[00:37:00] Sorry.
[00:37:01] It's just an observation.
[00:37:02] Yeah.
[00:37:02] So the nine...
[00:37:03] I mean, so it's notable that, like, most of the other countries have about 100 or 200.
[00:37:09] And I think that's interesting because that shows that those countries think that that is enough to deter an adversary, right?
[00:37:15] You can deter an adversary with just 100 to 200 warheads.
[00:37:19] Like, 1,500.
[00:37:21] No other country.
[00:37:22] You know, the other seven countries do not think that those numbers are necessary.
[00:37:26] So could we imagine that Russia and the US getting down to kind of that level?
[00:37:33] That seems plausible to me.
[00:37:35] Going beyond that, yeah, I agree that you probably need agreement between not just the US and Russia,
[00:37:41] but including China and probably the UK and France.
[00:37:45] But, you know, whether we could make progress on that, I think there is a...
[00:37:49] But then you've got Israel, North Korea, Iran, potentially India, Pakistan.
[00:37:54] All going to be sorted out this month by Donald Trump.
[00:37:57] He's just going to sort them all out in his first week in office.
[00:38:00] Yeah, yeah.
[00:38:00] Yeah, we'll see.
[00:38:01] Hayden, thank you so much.
[00:38:02] By the way, you used a term, I don't know if you're referring to yourself as it,
[00:38:05] what did you say, a professional doomsayer?
[00:38:08] Have you got that on your business card?
[00:38:10] Right, yeah, yeah.
[00:38:11] Do you get many party invites?
[00:38:14] Well, you've certainly shared a bit of doom with us, but also reassured us in some areas as well.
[00:38:18] Oh, I'm glad.
[00:38:19] Hayden, have a great 2025 and thank you for being with us.
[00:38:21] Well, thanks so much.
[00:38:22] Great, thanks for having me.
[00:38:23] If we make it through to 2026, we'll talk to you then.
[00:38:25] Yeah, all right.
[00:38:26] Good, fingers crossed.
[00:38:27] So next week, another disaster.
[00:38:29] Well...
[00:38:30] Some people would say that.
[00:38:31] Well, of a different kind.
[00:38:33] Keir Starmer.
[00:38:34] No, he's not a disaster.
[00:38:35] No, but he's been having a hard time.
[00:38:37] He has.
[00:38:37] You have to say, in the first six months of the Labour administration, you wouldn't have
[00:38:42] a very long queue of people cheering, would you?
[00:38:45] No.
[00:38:46] But, you know, I guess, you know, what are the alternatives?
[00:38:49] And also, has he been given long enough?
[00:38:51] That's the big question.
[00:38:52] But also, why has it not worked?
[00:38:54] I mean, OK, you know, new administrations, usually they get some sort of a honeymoon.
[00:38:56] He doesn't seem to have had much of one.
[00:38:58] Yeah.
[00:38:58] We perhaps expect too much, I don't know.
[00:39:00] Yeah, maybe.
[00:39:01] Yeah, I think people, perhaps he was making promises and people thought those promises
[00:39:04] would be enacted immediately.
[00:39:07] Whereas, obviously, a lot of it is, you know, slow burn stuff.
[00:39:09] And clearing up the mess from the last lot, as they constantly tell us.
[00:39:11] Tell us.
[00:39:12] Yeah, exactly.
[00:39:13] We'll examine that in some detail.
[00:39:14] Yeah, that's next week on The Why Curve.
[00:39:16] Join us for that.
[00:39:17] Thank you.
[00:39:17] See you then.
[00:39:19] The Why Curve.

