Starr Forum: Warnings

MICHELLE ENGLISH: Greetings, and thank you for coming to today's MIT Starr Forum, featuring counterterrorism expert Richard Clarke. He'll be sharing from his latest book, Warnings: Finding Cassandras to Stop Catastrophes. A reminder that the books are for sale, right here, and we will be having a book signing immediately after the event, on the stage, at this table.

Our next Starr Forum will be on Thursday, November 16, featuring Peter Krause on his book Rebel Power: Why National Movements Compete, Fight, and Win. Professor Krause teaches at BC and received his PhD from MIT in 2011. Joining that talk as a discussant will be Roger Petersen, who is the Arthur and Ruth Sloan Professor of Political Science at MIT.

You can learn more about the work being done at the center, and sign up to learn more about our Starr Forums, at our information table, outside. For today's talk, we will first hear from our guest speaker, followed by a conversation between the speaker and the discussant, and then, a Q&A session with the audience. For the Q&A, I want to remind everyone to please line up behind the mics and to please remember to ask just one question.

Serving as a discussant for today's talk is Joel Brenner, who, among many positions, was the former head of counterintelligence under the Director of National Intelligence for the United States. Currently, he serves as a Senior Research Fellow at the center and at CSAIL. Mr. Brenner will introduce our guest speaker at this time. Please join me in welcoming Joel Brenner.

[APPLAUSE]

JOEL BRENNER: Thanks very much, Michelle. It's a great pleasure to have the honor to introduce my old friend, Richard Clarke, and to be back together with you, Dick, after a number of years. The list of Dick Clarke's accomplishments is very long, and I'm going to be quite brief this morning. But I'm going to begin with his principal accomplishment, which is that he's a graduate of this institution, holds a master's degree from MIT, and was born in Cambridge, Massachusetts. Welcome home.

[APPLAUSE]

Now, he served for 30 years in the United States government, in the White House, an unprecedented string of serving three consecutive presidents as a counterterrorism expert and in other capacities. He was the special assistant to the president for global affairs, special advisor to the president for cyberspace, and national coordinator for security and counterterrorism.

Since leaving in 2003, he's been busy with his company Good Harbor, a risk management company in the Washington area. He's written eight books. He has a flair and a facility with the language and with writing, and his books are really worth your attention. I'll mention just three of them, and then, let you hear from him, directly.

First of all, the one that really, I think, put you on the map in the public's eye was Against All Enemies: Inside America's War on Terror in 2004. It was the New York Times number one bestseller, and it's still a great read some years later. His book Cyber War is a terrific read. I'll confess, Dick, the slight annoyance I had that you beat me to market by some months on the same-- annoyance with admiration-- on the same subject. And finally, what we're here to talk about today is his newest book published this year called Warnings, which is about Cassandras, and why people don't pay attention to them.

Dick, we're looking forward to hearing from you. Please welcome Richard Clarke.

[APPLAUSE]

[SIDE CONVERSATION]

RICHARD CLARKE: I'll yell. How's that?

AUDIENCE: Sounds good. I like being yelled at.

RICHARD CLARKE: Well, thank you for coming. I'm sure you had lots of better things to do today.

AUDIENCE: Not really.

[LAUGHTER]

RICHARD CLARKE: Well, thanks, anyway. Just as a test, how many of you lost your personal information in the Equifax hack two weeks ago? No. That's wrong. You all did. You all did. 143 million Americans and 700,000 British citizens. Now, chances are that you're in that number.

Will it make you feel any better if I tell you that last week, we learned that a cyber security researcher, independent of Equifax, independent researcher, had discovered, a year before the hack, that you could hack into Equifax very easily? And he called Equifax, repeatedly, and told them that. Emailed them, wrote them letters, called them. And they ignored his warnings.

And then, using the exact same technique, somebody-- we can all guess-- hacked their way in and took 143 million files. Now, the guy who should have been worrying about that was the CEO. He got fired, by the way. As is increasingly the case with hacks, the CEO got fired. But somebody, we don't know yet who, somebody in the company was warned and did nothing about it.

Turns out, that fits a pattern. This book began over a bottle of scotch-- as all good books do-- when my co-author and I were chatting about this phenomenon that we thought was a phenomenon. And we talked about the Challenger space shuttle, the classic case, where an engineer said that, given the way the O-rings fit on the booster rockets, you can't launch this thing below 32 degrees Fahrenheit. It'll blow.

And he warned about it, documented it. He was the expert from Thiokol. Documented it, sent it up, nothing happened. On the day of the launch, he repeatedly picked up the phone, worked his way up the chain, got to the flight director, told everybody in the chain, you can't do this. It's below 32 degrees out. The thing will blow up. And they ignored him. And it blew up, killing all the astronauts and retarding the space program.

That's the classic example. But over that bottle of scotch, we asked each other, is this really a phenomenon? Does it happen all the time, or is it just our sensation that it happens all the time? So we made a list of catastrophes that had occurred in the past decade or so. And we went back, and in each case, looked for, was there that gal, was there that guy, that warned in advance?

Now, it can't just be any old person off the street saying the sky is falling. To fit our paradigm, the person giving the warning had to be an expert. Prior to giving the warning, they had to be a recognized, acknowledged expert in the field that they were giving the warning in. And they can't just have woken up in the middle of the night with indigestion and said, something bad is going to happen. They had to have a study. They had to be data driven.

And what we found, over and over and over again, was that in major catastrophes, that phenomenon did, in fact, occur. I'll talk about a few of them. In the book, we look at seven that have already taken place. Seven case studies. And they are case studies not so much of the disaster, but of the person, and why that person saw something coming when others did not, and why that person wasn't listened to.

One example is the flooding of New Orleans. There was a professor of civil engineering at Louisiana State, who said over and over, repeatedly, for years, that we will, someday, get a category 4, or heaven forbid, a category 5 hurricane, coming right at New Orleans. And the dams and dikes that the Army Corps of Engineers built are not engineered to survive that. They will be overtopped, the city will flood.

He wrote it up, he documented it. He brought it to the Army Corps of Engineers, he brought it to people in New Orleans government, Louisiana government, said to other experts, here's my data. What's wrong with it? Show me where I'm wrong. I want to be wrong. No one could ever show him where he was wrong, so instead, they fired him.

[LAUGHTER]

From the faculty, which is hard to do. As those of you on the faculty know, it's hard to fire faculty members. So that was Katrina.

Then, we looked at Fukushima. Fukushima, if you don't remember the image, four nuclear power plants on the coast of Japan, one after the other, on live television, melted down and blew up. You could see these enormous containment domes blowing right off the building, clouds of radioactive gas going up into the air. One after another. It was a horror movie. Hundreds of billions of dollars in damage. Tens of thousands of people moved permanently. Towns closed.

Turns out, if you go back and look at the hearings that were held-- the environmental hearings that were held-- on the siting of those plants, there was a civil engineer who said-- and documented it-- that this area is susceptible to earthquake. I know there hasn't been one in a long time, but it's susceptible to earthquake. And the earthquake, because of the geography of the place, will create a tsunami. And the tsunami will hit this coastal plain, where you're proposing to put four nuclear power plants, and it will cause them to lose electrical control. There won't be electricity running into the plants.

And if anybody here is from the nuclear department, you know nuclear power plants need external electrical input, not just their own. They need a generator or an external line input to keep the cooling system going when the reactor shuts down.

He said, don't put it here. And in the hearings, Tokyo Electric said, that's ridiculous. We've looked at a hundred years of records. There's never been an earthquake, there's never been a tsunami. And he said, yeah, but you only looked at a hundred years. I walked up the hill behind the coastal plain, and there, on the hill, was a brass plaque from 400 years ago that said never build anything below this line.

[LAUGHTER]

They doubted his data. That wasn't good data for them. And anyway, they built the four plants, and you know the rest. There was an earthquake, there was a tsunami, and the four plants blew up. So we thought, OK, if this happens in civil engineering, does it happen in other fields?

So we looked at my field. We looked at international security. And there are a couple of chapters on international security. I'll tell you about one of them. The rise of ISIS. Before there was an ISIS, before that name existed, we sent a career Foreign Service Officer, Robert Ford, to be our man in Damascus-- a US ambassador in Damascus.

Robert Ford was what you want the Foreign Service to be. Bob had spent 25 years in the Arab world. He could speak every dialect. He could put on Arab clothes and go to the souk in any country, and no one would think he was an American. And he would talk to people on the street, the Arab street, the proverbial Arab street. And he would figure out what was going on, far better than any CIA agent, far better than any NSA intercepting. He was the quintessential human intelligence.

And sent to Damascus, he did that. He walked around in the souks in the various cities. Then, he went to his old stomping grounds in Iraq and did the same. And he came back to the Embassy, and he wrote a message to the president that said, there's going to be a major revolt here, in Syria. Major. Unlike anything we've seen before against Assad. And we need to back it because if we don't back it, somebody else will.

And that somebody else will be something that doesn't exist yet. It will be a terrorist group, not just rising up out of Syria, but also coming from across the border in Iraq. A Sunni terrorist group from Iraq. And what will happen is, suddenly, we will have a terrorist nation state spanning the border of the two countries and controlling cities. Big cities. Cities with 2 million population, 1 million population.

Now, that had never happened before. We had never, in all the years of fighting al-Qaeda and the other terrorist groups, we had never seen a terrorist group controlling large cities. Small towns, maybe. Never spanning a territory as large as that. The Obama administration considered his proposal-- that we should support the opposition in a serious way-- at some length and effectively rejected it.

And two years later, the Arab Spring hit. The revolution started in Syria, and suddenly, a group jumped up in Iraq, came over, took it over. ISIS. They took over Mosul, they took over Raqqa, they took over Sirte, in Libya, a city of 700,000 people. If you look at those cities now, on satellite photography or on drone photography, they look like Berlin in 1945. There's hardly a building left habitable in any of those cities. Millions of refugees, hundreds of thousands dead. Could we have stopped it? There's no way of knowing. That's alternative history. But what we do know is that we had an expert who was data driven, who nailed it, who predicted it would happen.

Well, what about a different field? What about, say, economics or finance? Well, one thing that came to mind was the Bernie Madoff Ponzi scheme. Bernie Madoff, you may remember, was a pillar of Wall Street. He'd been chairman of one of the exchanges, he was a great philanthropist, he was a leader of the Jewish community in Manhattan. Very respectable fellow.

And he was delivering enormous returns in his private investment account, to his friends and other people he was able to convince to invest with him. People who were investing their entire life savings. This wasn't just institutional investors. It was individuals, retired people, synagogues. They gave all their money to Bernie. And they were getting returns that no one had ever seen before. Year after year after year.

And across the river here, a guy named Harry Markopolos, a certified financial analyst and accountant, was asked by one of his customers, should I invest in the Bernie Madoff fund? And so Harry looked at it and he did some regression analysis, and he came to the conclusion that there was one chance in a million that this was a legitimate fund. The stock market would go up, the stock market would go down. One sector would go up, another sector would go down. Whatever happened, whatever happened in the stock market, Bernie returned a good return. Not possible.
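
(As an illustrative aside: the kind of check Markopolos describes can be sketched in a few lines. This is a toy example in Python with invented numbers, not his actual analysis; real due diligence regressed Madoff's reported returns against the strategy he claimed to be running.)

    # Toy sketch of a Markopolos-style consistency check.
    # The return series below is invented for illustration.
    import statistics

    market = [0.031, -0.042, 0.018, -0.055, 0.027, 0.044, -0.012, -0.038, 0.021, 0.015]
    fund   = [0.009,  0.010, 0.011,  0.009, 0.010, 0.012,  0.009,  0.010, 0.011, 0.010]

    def correlation(xs, ys):
        mx, my = statistics.mean(xs), statistics.mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    win_rate = sum(r > 0 for r in fund) / len(fund)  # fraction of positive months
    co_move = correlation(market, fund)              # co-movement with the market
    smoothness = statistics.stdev(fund)              # volatility of the fund

    # A 100% win rate, near-zero volatility, and essentially no correlation
    # with the market is the statistical impossibility Markopolos kept flagging.
    print(f"win rate={win_rate:.0%}, corr={co_move:.2f}, stdev={smoothness:.4f}")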

So Markopolos went down to Washington, went to the Securities and Exchange Commission, and laid out his regression analyses, laid out all of his data to a bunch of lawyers-- no offense--

[LAUGHTER]

They didn't know what he was saying. And besides, Bernie Madoff was what? A pillar of the community. So they sent him on his merry way. He did this three times, getting more and more data every time. And every time he was ignored. And you know what happened. It was a Ponzi scheme. Madoff was not investing a penny of it. He was just churning it. And all those people lost their money.

We also talk about the crash, the market crash, the Great Recession. We talk about Saddam Hussein's invasion of Kuwait in 1990. In every case, we found someone who had predicted it, who nailed it, who was an expert, who was data driven, and who was ignored.

So now, we wanted to know why are these people ignored? And we came up with a lot of reasons. We have a nice little matrix of 22 factors about the person who's giving the warning, about the decision maker they are trying to warn, about the nature of the issue that they're discussing, and about the critics of the person giving the warning.

We call these people who are giving the warnings Cassandras because in Greek mythology, there was a woman in Troy named Cassandra, who was cursed by the gods. And her curse was she could accurately see the future. Doesn't sound like a curse, especially if you play the stock market. The curse part was she would see disasters accurately, but no one would ever believe her. Cassandras.

So we looked at these Cassandras and why they succeeded and why they failed. Why they succeeded in seeing things before other people. Before other people. We came to the conclusion that there actually is a phenomenon which an Israeli psychologist has called sentinel intelligence. Sentinel intelligence. He calls this a high performing, high anxiety condition, where you walk into a room and the first thing you do is you look for the fire escape. Whatever you go into, the first question you ask is, what could go wrong?

I go hiking in the Shenandoah mountains sometimes, and I have a friend who frequently hikes with me. And I look at the beautiful view from on top of a mountain, and she says, where are the bears? Some people have sentinel intelligence, and they see, in a blink, in a flash, the thing buried in the data that other people don't see. They see it first. Eventually, other people come around, but they see it very early.

And so when they go yelling, to a decision maker, there's a problem, the decision maker says, yeah? Who else believes you? What other experts in the field agree with you? And these Cassandras repeatedly say, well, I gave my data to all the other experts in the field and I said, prove me wrong, and none of them could. They had other reasons, other problems, but they could never prove my data wrong.

We heard that exact same phrase from all 14 of the people we interviewed for the book. I have data, I show it to other experts, I seek peer review. No one can tell me what's wrong with the data. They argue about other things. So we asked, OK, that's why they can see it, because they have this sentinel intelligence. Why does nothing happen? Of all the reasons, the most compelling reason was what we call first occurrence syndrome, which is a fancy way of saying, it never happened before.

In every case, what our Cassandras were saying was going to happen had never happened before, in the memory of those people involved. Maybe it happened 400 years ago, but not in the memory of the people involved. And as silly as it is for a decision maker to say, it's never happened before, therefore, it cannot happen, no decision maker will actually say that. But they think that.

If you go into a decision maker who has an agenda and who has allocated resources and you say, ignore your agenda. Reallocate the resources so that you can mitigate or perhaps prevent a disaster, they won't. They'll say it's never happened before. They all have agenda inertia. They want to do what they want to do. They get elected president, governor, they get appointed provost, whatever it is, with an agenda in mind.

And you say, well, but the circumstances have changed. And they say, no, actually, there's very little evidence that the circumstances have changed. It's a form of cognitive bias. A whole series of cognitive biases that kick in, that prevent people from seeing what we call the invisible obvious. If you look after the fact at these cases, it was obvious. In fact, it was obvious at the time, but people didn't want to work on the issue. Maybe it was because to mitigate the disaster, or prevent it, you had to do something that was ideologically abhorrent to you. You might have to increase taxes, you might have to have carbon capture, or some form of regulation, and you don't want to.

So having looked at these factors and looked at the seven case studies from the past, we looked at seven people today who are saying, I think this is going to be a problem, and who really are not getting sufficient attention paid to them. And we asked, in each case, applying this template, are they going to be the Cassandras that we look back on 10 years from now?

What we looked at was artificial intelligence, CRISPR/Cas9. We looked at asteroid impact. We looked at the Internet of Things. We looked at drug-resistant diseases, the new diseases that our drugs are not capable of dealing with. We didn't make a determination in each of those cases if any of those things were going to happen. What we did look for, and let the reader decide, was why are these things being insufficiently addressed, or in some cases, ignored?

My favorite of all of the future Cassandras is a guy named Jim Hansen, who has been, already, a Cassandra. Jim Hansen was a NASA scientist, climate scientist, who, in the late 1980s, said we're going to have climate change. Documented it. Used NASA collection capabilities to document it. And he was suppressed by the Bush administration. Then, when the Clinton administration came into office, he found a champion in Al Gore. And then, later, when the next Bush administration came along, he was suppressed again. But he is generally believed to be one of the first climate scientists who accurately predicted the rate of climate change.

Now, he's predicting something again. Something different, different from all the other experts, at least at the time, two years ago, when we interviewed him. He was looking at the UN standard model for sea level rise. And he said the UN standard model predicts there could be a three meter rise by 2100, globally. It's a lot, three meters. Probably flood this place.

But Jim Hansen says that's wrong. Jim Hansen says what he knows about Arctic melt suggests that it is not linear, and suggests, therefore, that there will be a six to nine meter sea level rise, and it will happen more on the order of 2050 to 2075.
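
(A toy illustration of why the linearity assumption matters so much: if the melt rate doubles on a fixed timescale, the rise compounds instead of accumulating steadily. The ~4 mm/yr starting rate and 10-year doubling time below are round assumed numbers for illustration, not Hansen's published figures.)

    # Linear vs. doubling sea level rise (illustrative numbers only:
    # ~4 mm/yr starting rate and a 10-year doubling time are assumptions).
    import math

    def linear_rise_m(years, r0_mm=4.0):
        return years * r0_mm / 1000.0

    def doubling_rise_m(years, r0_mm=4.0, t_double=10.0):
        # Integral of a rate that doubles every t_double years:
        # total = r0 * (2**(t/T) - 1) * T / ln(2)
        return (r0_mm / 1000.0) * (2 ** (years / t_double) - 1) * t_double / math.log(2)

    for horizon in (2050, 2075, 2100):
        t = horizon - 2025
        print(horizon, f"linear: {linear_rise_m(t):.2f} m,",
              f"doubling: {doubling_rise_m(t):.2f} m")

(The point is the shape of the curve: under a doubling regime, most of the rise arrives late and fast, which is why a linear extrapolation understates the tail risk.)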

Put aside whether or not he's right. Think about the economic, political, social implications of that. Some countries disappear, mass migrations of people. If you do an econometric model of this, I think it will show you get economic collapse. I don't know because no one has done an econometric model of it. He really can't get people to take him seriously, even though he's been consistently right.

Since we interviewed him and wrote the book, the data that's come in suggests he's right. Other experts are constantly recalculating and changing their estimates of sea level rise and coming to the conclusion it will be much higher and much faster than they had thought just two or three years ago. But no one wants to believe that because there's a magnitude problem here.

When an issue gets to be of that order of magnitude, people stick their head in the sand. They don't want to deal with it. They don't know how to deal with it. And there's diffusion of accountability, diffusion of responsibility. For a lot of these issues, it's just not clear who's supposed to do something, and it's not clear who's going to do it.

So we think, looking at the past seven cases and the future seven cases, there is a phenomenon here. There's a lot of social science out there about prediction, about predicting disaster and catastrophe. What we're suggesting is a different method. It's a simple method. It is to look for a Cassandra, to look for somebody who is an expert in their field, and an outlier at the same time. Somebody who's predicting something that the others are not yet predicting. And pay attention to them. Look at their data. Ask other experts what's wrong with their data.

And then, if you think they might be right, develop a mitigation strategy. Develop a hedging strategy. You don't have to run off and spend the whole gross national product on their issue, but have a process where you can collect data. And as that data comes in, if, as in the case of Jim Hansen, it indicates the Cassandra is right, then start adjusting policy. Start adjusting resources. And start making those contingency plans more credible, because in some cases, you can prevent these disasters. And in other cases, you can at least mitigate them.

So what we call for in the book is a small office, in government or outside of government, of experts who would scan the horizon for these Cassandras and give them their day in court. Now, in the absence of that office, in the absence of that small team looking for Cassandras, we have a website, findingcassandras.com, and we've been inviting people to nominate Cassandras. We have gotten a lot of nominations. They are all kooks.

[LAUGHTER]

But maybe here, maybe at MIT, there'll be someone who knows of a Cassandra who is not a kook. If you think you have someone in mind, email me, and we will try to give them the publicity and credit that they deserve. Thanks.

[APPLAUSE]

JOEL BRENNER: Dick, I have to say, to you and everybody, you've done it again. These are wonderful and important stories, well-told, and the pages in this book turn very easily. Let's go back to Cassandra, herself, for a little bit. The Greeks were wise, and that's why we still read their literature. And maybe it's just the human condition that people like this are cursed not to be believed, and that's just tough luck, and there's nothing to do about it.

RICHARD CLARKE: Well, there's a feedback loop. Cassandra, herself, went crazy in the myth. She went crazy because no one would pay attention to her. And she was predicting the fall of Troy, that all of her friends would be killed. And so she got increasingly agitated when the King ignored her.

And what we found in the book is a lot of these people get increasingly agitated when they're ignored. Jim Hansen, when he was ignored about climate change, started chaining himself to the White House fence-- something that's not recommended for civil servants in the federal government. Harry Markopolos, in Boston, thought the SEC must be in cahoots with Bernie Madoff. And he thought, oh, that means they are going to tell him about me, and he'll have a contract put out on me. He'll have me rubbed out. So Harry began walking around downtown Boston with a sidearm, which, again, is not something that's generally encouraged for accountants.

There's a problem when you get emotionally involved in the issue. Sometimes that leads to your behavior being such that you get discredited.

JOEL BRENNER: I think both of us know some-- I know a number of the people in the book that you're talking about. And I don't need to mention particular names, but a lot of these people have a special talent for burning bridges, for being obnoxious, frankly. And even to-- I'm thinking of your chapter on IoT, and we've both written a lot on cybersecurity. There are people telling stories out there that are really-- have their hair on fire. And they're talking to people who are inclined to believe them and still do not want to have their doors darkened by these people.

RICHARD CLARKE: Yep. Well, that's true. Part of the high anxiety-- highly functioning high anxiety-- in many cases is that these people don't know how to present. And so the guy in the Internet of Things chapter, Joe Weiss, I think has been right for 20 years about the vulnerability of OT. And he talks about the difference between OT and IT and the interface being a problem and the lack of security on ICS and SCADA and in the chipsets. And he's right.

JOEL BRENNER: I think he's right, too. I featured him in one chapter in my book, and at the same time, nobody wants to talk to him.

RICHARD CLARKE: Nobody wants to talk to him because he doesn't know how to present. And even spending hours interviewing him was hell.

[LAUGHTER]

A lot of these people-- you go back to Harry Markopolos. You know, he's an accountant. And he's sitting down before federal regulators, and he didn't know how to talk their language. He didn't know how to get to them.

Part of our last chapter is, if you find yourself being a Cassandra, what do you do? And part of what we talk about is, figure out how to do the presentation in the language of the people you're talking to. Get into their mindset. Tell them why it's in their interest, how they can be the hero for figuring this out. They can take the credit.

JOEL BRENNER: Yeah, well, a lot of those people are not able to take that kind of advice, unfortunately. But let's look at it from the other end. Any president, any CEO of a large company is always hearing from Chicken Littles. And they're sometimes right. Everybody in a position like that knows that some part of the sky is always falling. That is the nature of that kind of responsibility. Something is going seriously wrong somewhere.

Let's suppose the President of the United States-- we won't mention anybody in particular, but the president, in some abstract sense-- believed in every one of these things. AI, pandemic disease, sea level rise, et cetera. Which one should he prioritize? I mean, the president gets on television, and what's he going to do? Scare the hell out of everybody? That's also not wise.

RICHARD CLARKE: Well, I think what you do-- if you're Bill Clinton-- what he would do is say, all right. Let's get experts in the room on each of these topics, and let me hear them out. I've been in those meetings with Bill Clinton, all the experts in the room from universities and what not. And he would go on for hours. You'd have to drag him out of the room. He loved it. And then, OK-- to the staff, he would say, give me three strategies at three different levels of expenditure to mitigate, or perhaps, prevent this occurrence.

I'll give you a specific example. Bill Clinton became worried about-- after, in Tokyo, in 1995-- a terrorist group called Aum Shinrikyo developed biological weapons and chemical weapons and sprayed them in the Tokyo subway. And Clinton looked at that and said, what if somebody did that in our country? Timothy McVeigh had just blown up the federal building in Oklahoma City.

And so Clinton was thinking about big terrorism events in our country. And he said, what if you combine the Timothy McVeigh with somebody like the Aum Shinrikyo? Go find out what would happen if chemical weapons or biological weapons were sprayed in major cities in the US. What would happen? And what I found out was no one, no one, was prepared to deal with that. It would have been mass casualties.

And so he got all the experts around the table in the cabinet room, people who were experts in biological warfare, people who were experts in first response, and we spent about three hours. And at the end of that, he asked me to come up with a plan at three different spending levels. And we spent a shitload of money. We trained and equipped 157 metropolitan areas to detect and react to biological agents or advanced chemical agents.

JOEL BRENNER: One of the stories that you tell early in the book, which is recapitulated later in the Bob Ford story you tell, concerns a man you and I both know well named Charlie Allen. Charlie Allen, for those who haven't heard of him, is a long time CIA analyst, you would say, who used to be in charge of worldwide collection for the agency. And he is, I think we both agree, a national treasure who is extraordinarily astute and lives, eats, and breathes this.

And you tell the story of his trying to persuade the White House of what was going to happen when Saddam was going to go into Kuwait. And to no avail. And then, you talk about how General Clapper-- as the DNI-- abolished the position that Charlie had had at that time. He was the national intelligence officer for warning, a special job for warning.

And you were criticizing Clapper for abolishing that position and taking the position that every specialist was in charge of warning in his area. Do you really think you can institutionalize a guy like Charlie? Would it matter? Charlie's not going to live forever. Somebody else comes out of the bureaucracy, sits in that place. Isn't that person going to be as likely as not to suffer from the same cognitive biases as anyone else?

RICHARD CLARKE: So when Charlie left that office-- national intelligence officer for warning-- he was replaced by a woman named Mary McCarthy, who I think did a good job. It's a tough job because if you're the national intelligence officer for warning, you want to warn occasionally. But you can't do that more than once or twice. You certainly can't do it more than once or twice and get it wrong. So they're very careful not to issue warnings.

But what they do is, they go around and they talk to the experts in the various fields, in the various geographic areas, functional areas, and they ask them probing questions. Have you thought about this? Have you thought about that? What's the basis for your judgment here? Show me the intelligence that backs up that conclusion.

And having that little nudge, having the professional nudge go around, it's a bit like an Inspector General going around and saying, are you sure you're doing a good job? I found that a very useful phenomenon. And I think saying, well, we're not going to have one office that does it, everybody's supposed to do it-- I don't think that works.

JOEL BRENNER: Yeah. I wish most IGs were like that, Dick. You know, I think, to a large extent, they're--

RICHARD CLARKE: They're looking for scalps.

JOEL BRENNER: --small-time prosecutors. I mean, when I was the IG at NSA, I tried to do it in a way that supported the question, how can we make this place work better? But it was astounding to people to do it that way, and it didn't last. It didn't last.

Let's go on to talk about the different reasons that you've articulated for people not wanting to hear this because I think, in large part, what cognitive biases means here is people have reasons not to want to hear this.

RICHARD CLARKE: That's right.

JOEL BRENNER: And in the case of Katrina, for example, there were powerful real estate interests who wanted to develop the marshland between the city and the Gulf, which, if left undeveloped, could have absorbed a lot of that water.

RICHARD CLARKE: And what you're talking about there, in a way, too, is what we call regulatory capture. And did you see the 60 Minutes piece two Sundays ago about the DEA and the opioid crisis? If you didn't, go back, watch it online.

So there's a Cassandra-- there's a DEA agent-- whose job it is to look at data about the manufacture and distribution of controlled substances, of narcotics. And he sees these incredibly odd spikes in production of opioids and distribution of opioids, and he sees more opioids going to small towns in West Virginia than the entire country could consume. And he thinks, there's something going on here.

And so he launches an investigation, which is what he's supposed to do. He's not only a law enforcement guy, he's a regulator. And he finds out that, in his view, the pharmaceutical industry is selling opioids knowing that they're being overprescribed, knowing that they're being abused, knowing that people are dying from them. And he goes to the Justice Department and says, we need to get injunctions and stop this.

In comes the big pharma lobbyists-- the lobbyists for the pharmaceutical industry-- and suddenly, the Justice Department thinks, oh, your data is wrong. Go back and do it again. Take some time. Here are 45 questions that you have to answer. Don't worry about it. Calm down. Because the regulator-- in this case, DEA-- was captured-- control of the regulator was captured by the very industry they were supposed to be regulating. Happens all the time. It's called regulatory capture. And that's one of the reasons that this happens.

JOEL BRENNER: Well, let's go to Madoff because I don't think that's what happened in Madoff's case at all. And I want to enrich that story a little because the third person in your book that I happen to have met was Bernie Madoff, as well as his brother, Peter. And Peter ran a perfectly legitimate, very profitable electronic trading platform.

And I remember being taken up to meet the Madoffs by a former head of enforcement of the SEC-- a dear friend-- who wanted to show me this fantastic electronic trading operation. It was interesting because Peter was really effusive and wanted to show it, and Bernie came and then, he just walked away. He didn't want to talk to anybody.

But one of the reasons why people didn't want to believe about Bernie-- although, I think the writing was absolutely on the wall-- in addition to what you said, was they were running a completely legitimate trading operation in addition to what they were doing.

RICHARD CLARKE: It provided cover for the Ponzi scheme.

JOEL BRENNER: It did provide cover. The guy was an evil genius. And I think it's quite astounding, the position that the Commission took on him.

RICHARD CLARKE: Part of the reason--

JOEL BRENNER: But even the lawyers could understand that.

RICHARD CLARKE: I think eventually they could. Sometimes--

JOEL BRENNER: Oh, quickly.

RICHARD CLARKE: Sometimes the reason that people won't take this stuff seriously is it sounds like a science fiction movie. And we discovered this with the artificial intelligence chapter, with the CRISPR/Cas9 chapter, and mainly with the asteroid impact chapter. David Morrison, great astrophysicist. David Morrison is our Cassandra on asteroid impact.

And for 30 years, he's been banging the drum saying that there are city killer sized asteroids that we haven't found and that they can appear with almost no notice, and we need a system to track them better. And, moreover, he says, when we see that there's one coming, we need to have a capability to do something about it.

What I normally do-- I wouldn't do this with an MIT audience-- but what I normally do with an audience is say, how many people think we have a capability of dealing with incoming asteroids? And all the hands go up. Now, you people know better. You people know better. But the rest of the country, everybody thinks that Bruce Willis is going to get in his space shuttle and go up and stop the asteroid.

And two movies came out in the same summer, Deep Impact and Armageddon. Both had this as a plot, asteroid coming, people going up and saving the earth. The result of that is, when you say to any audience, one of my issues is asteroid impact, you get what I had here, which was a little titter. When I talked about asteroid impact briefly, everybody went, heh-heh. Right. People can't take it seriously because it sounds too much like science fiction.

JOEL BRENNER: And you can't wrap your head around it.

RICHARD CLARKE: Well, it's not that difficult to understand. Morrison, at NASA, has a proposal-- doesn't cost a huge amount of money-- to increase the deep space surveillance and to create a capability of going up rapidly and nudging-- pushing-- the asteroid slightly off course. We can't do that today.

And so I asked people at FEMA what would happen if you got the notice that 72 hours from now, a city killer sized asteroid was going to hit in the Los Angeles basin? And they said, funny you should ask because we've got a plan for that. We've actually exercised that. We had a tabletop exercise with the state and local first responders in California. I said, great. What do you do? They said, we evacuate Los Angeles.

[LAUGHTER]

Right. That's our plan.

JOEL BRENNER: Right, I mean--

RICHARD CLARKE: Today, that's our plan.

JOEL BRENNER: Yeah, right. Or you go to the liquor store to make sure you're well supplied because there's nothing else to do.

RICHARD CLARKE: Exactly.

JOEL BRENNER: One of the things that we might talk about, do you think that a command economy like China or even, to some extent, Russia-- or maybe Russia in the old days-- was better at this sort of thing than a market economy?

RICHARD CLARKE: No, worse. We didn't look at it, but my instinct is a command economy would be worse.

JOEL BRENNER: I mention it because one of our favorite subjects, yours and mine, is the cybersecurity problem. And what one sees is lots and lots of individual actors in the private sector making profit maximization decisions with no thought or no responsibility for thinking about the external risks that they're imposing on society.

RICHARD CLARKE: Very little, yup.

JOEL BRENNER: Very little.

RICHARD CLARKE: In the George W. Bush administration, I wrote the national strategy for cyberspace. And because the Bush administration was against any regulation of anything, that strategy says that we eschew regulation. It's a great word. We eschew regulation to create cybersecurity, except in the case of market failure. I inserted those words and no one noticed, so they're there. I would argue, with cybersecurity, we have market failure.

JOEL BRENNER: Yeah, I agree with you. I agree with you for these reasons. And I think that we don't even really understand the cross-sector consequences of cascading failure because we don't have the data to do sophisticated simulation.

RICHARD CLARKE: That's right. And because it's in no one's individual interest to solve the problem.

JOEL BRENNER: And I think, well, actually, I'm even thinking about insurers and others in whose interest it would be--

RICHARD CLARKE: They don't have the data.

JOEL BRENNER: Well, who will not share the data because they've got better data than anybody across sector [INAUDIBLE].

RICHARD CLARKE: Well, so cyber insurance is a thing. I've talked to a lot of insurance companies that are writing policies. They're very expensive policies. They have a lot of loopholes and outs. If you think you're covered, let me tell you, you're not.

JOEL BRENNER: High retentions, low limits.

RICHARD CLARKE: Right. Low limits and very few things that you really worry about are covered. Intellectual property loss is not covered in cyber insurance. Reputational damage is not covered in cyber insurance.

JOEL BRENNER: No. Well, let me ask you this. As we've noted, you've produced eight books, eight good books. What are you working on, what are you thinking about, now?

RICHARD CLARKE: Well, so I've got a book proposal for book number nine. One of the books that you mentioned was called Cyber War. And it came out seven years ago, I think I wrote it eight years ago.

JOEL BRENNER: Yeah, like 2010 or '11, early '11.

RICHARD CLARKE: And a lot of what I said-- and my co-author Rob Knake said-- in the book, at the time, we were kind of ridiculed. And then, most of it's happened. Most of it's happened.

So what we're thinking of is, not so much a sequel, but a follow-on called Cyber Peace. We're about to have, I think, more cyber war than we've had already. But if you actually go through and just list all the cyber war that we've had so far, it's pretty amazing. And some things that we didn't anticipate happening, like the last election, are aspects of cyber war.

So what we're going to try to do is write a book that documents what has happened in the area of cyber war, however you want to define that, and then, come up with a series of proposals to move us more as an economy, as a society, and as an international system, in the direction of cyber peace.

JOEL BRENNER: Let me pursue that a little bit with you. It's been US doctrine, consistently, that we seek domination in information space. Do we see an analog here with the seeking of domination in the nuclear area, where finally, we came to the realization that threatening the other side with a first strike was destabilizing and that superiority was not what we wanted? We wanted parity.

RICHARD CLARKE: Stability.

JOEL BRENNER: Yes. Do you think that cyber peace is consistent in the long run with the continued pursuit by the United States of domination in cyberspace?

RICHARD CLARKE: No. I love the US military. I've worked with them a lot. But they can't see a new thing without wanting to dominate it. And--

JOEL BRENNER: I'm thinking of Keith Alexander in particular.

RICHARD CLARKE: No, no. They suddenly realized there was, what they call, a new domain. Cyber was a new domain, like space had been a new domain. And they talk this way. They say, historically, when we started building airplanes, we realized that the air was a new domain and we needed to dominate that with the best Air Force in the world. And then, later, we realized space was a new domain and we needed to create space command and dominate that. And now, we realized that cyber is a new domain and we have to dominate it.

I think it's a ridiculous notion every time a new phenomenon comes along to think in terms of domination. Think in terms of control, of stability, of maximization of function.

JOEL BRENNER: Even the notion was information domination, which I always thought was like, you know--

[STOMPS ON FLOOR]

RICHARD CLARKE: Right.

JOEL BRENNER: --that sort of thing. But that was the rhetoric that was coming out. All right. Well, look, I think we've got a very attentive audience here. Let's open it up to questions.

RICHARD CLARKE: Good.

JOEL BRENNER: And I'll let you take the questions, and--

RICHARD CLARKE: Talk about anything you want, except why the Red Sox are not in tonight's game.

JOEL BRENNER: Who's first?

AUDIENCE: Am I turned on?

JOEL BRENNER: Yeah.

RICHARD CLARKE: Yeah.

AUDIENCE: When you were in the White House under the Bush administration, you had brought to the attention of Condoleezza Rice the imminent threat to us, and it was ignored. What kind of sources did you use to analyze that?

RICHARD CLARKE: Well, we had a lot of data about al-Qaeda's capabilities and intentions. We had mainly-- the best sources-- were from NSA. They were signals intelligence, intercepts.

We had defectors from al-Qaeda. We didn't have a lot of spies in al-Qaeda, but we had defectors. We had a lot of satellite photography. We had al-Qaeda's own statements. I thought we had a very persuasive case that al-Qaeda was the number one national security threat.

But all of those cognitive biases that we talked about were at work. First occurrence syndrome-- there had never been a terrorist group that had attacked the United States in the United States in any significant way. It had never happened before, so it was hard for them to believe that it would happen. It had never been the case that a terrorist group was our number one security threat before. They didn't believe it because it had never happened.

There was a magnitude overload problem. What I was saying was that there was a huge problem, and they didn't know how to deal with something that huge.

There was the agenda inertia problem. They came into office with an agenda. It looked a lot like the agenda-- I was in the first Bush White House as well. I know what the agenda for the second Bush term was going to be. And when his son became president and his team came in, they had that agenda.

This was the second Bush term, and they didn't want to change their agenda. They didn't want to work on the crazy thing that I thought was going to happen. They went to work on their agenda. So all of those things that we write about in the book were at work in the first nine months of the Bush administration, with regard to al-Qaeda.

AUDIENCE: How it happened, did you have a premonition about that? Did you have intelligence about that?

RICHARD CLARKE: We had intelligence. We had very good intelligence that al-Qaeda, first of all, was a major threat-- very large. We had a good idea about how big the organization was, how much money it had.

We knew that they were planning a spectacular terrorist attack against the United States. What we didn't know was when, precisely. And we certainly didn't know where or how. But we had, I thought, enough evidence in January to make it the number one issue.

AUDIENCE: Did much of that intelligence come from John O'Neill?

RICHARD CLARKE: Come from--

AUDIENCE: Did much of that intelligence come from John O'Neill, a former FBI agent?

RICHARD CLARKE: I worked very closely with John O'Neill. He was the chief FBI agent going after al-Qaeda. He didn't produce intelligence. He produced evidence of, for example, the attack in Yemen on the US destroyer, the attack in East Africa on the US embassies. It wasn't intelligence, but it was evidence. O'Neill was one of the two or three people in Washington who were screaming at the top of their lungs that something was going to happen.

JOEL BRENNER: That's good.

AUDIENCE: Thank you.

JOEL BRENNER: There's, on this side--

AUDIENCE: Thank you very much for your wonderful presentation. What I want to do is poke at this from a slightly different angle. So you gave us examples of past calamities and catastrophes, and noted that in every case you could find a Cassandra-- data-driven, bad personality, ineffective.

I want to play a slightly different game, which is that we can find examples of situations that did not evolve into calamities with associated Cassandras-- the Hadron Collider, if we want to do technology, and black holes. Or you could take early GMOs and predictions of bad effects. And in foreign policy there are many, including the second Gulf War, WMD, or calls for preemptive attacks on the Soviet Union and China, by Curtis LeMay, Richard Pipes, and many others. So what do you do about the false positive cases?

RICHARD CLARKE: The false positive, we talk about that in the book. What we suggest is, if you take our template and apply it to those cases, you would get a low score-- I think, in any event. We believe that if you apply those series of 22 factors, and just score them high, low, or medium on each of the 22 factors-- if you look at most of those past cases where we went wrong, they would have had low scores, not high scores, for Cassandra. Best example of that is the second Gulf War.
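
(To make the template concrete: scoring along those lines could be mechanized roughly as below. This is a minimal Python sketch; the factor names and the threshold are hypothetical placeholders grouped by the book's four categories, not the book's actual 22 factors.)

    # Sketch of a Cassandra scorecard: rate each factor low/medium/high,
    # sum the scores, flag high scorers for a formal hearing.
    # Factor names and threshold are hypothetical, not the book's list.
    SCORES = {"low": 0, "medium": 1, "high": 2}

    def cassandra_score(ratings):
        return sum(SCORES[level] for level in ratings.values())

    candidate = {
        # about the warner
        "recognized_expertise": "high",
        "data_driven": "high",
        "sought_peer_review": "high",
        # about the decision maker
        "agenda_inertia": "high",
        "first_occurrence_syndrome": "high",
        # about the issue
        "magnitude_overload": "medium",
        # about the critics
        "critics_rebut_the_data": "low",  # critics argue around, not against, the data
    }

    score = cassandra_score(candidate)
    print(score, "-> worth a formal hearing" if score >= 10 else "-> likely noise")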

JOEL BRENNER: Yeah.

RICHARD CLARKE: If you looked at the evidence-- even contemporaneously, if you looked at the evidence-- you would have given them low scores, as a low probability that Saddam Hussein had nuclear weapons. He had them. We destroyed them. He never rebuilt them. And the evidence that he had rebuilt them was specious.

And there were experts who were somewhat suppressed who said that. I think if you had done a non-biased analysis-- to the extent that you could do that-- and presented it to the president, it would have shown, yeah, there's evidence here, but it doesn't pass the threshold of credibility. And in fact, when they did present the president the National Intelligence Estimate on weapons of mass destruction in Iraq, he is reported to have said, "Is that all you got?"

JOEL BRENNER: Those were my words when I saw it.

RICHARD CLARKE: Yeah.

JOEL BRENNER: Yeah.

RICHARD CLARKE: So you know the other phenomenon you deal with, with false positives, though, is something like-- take the Y2K example we talk about in the book. Y2K, we saw several years in advance that there was a problem that was going to hit when we went from 1999 to 2000, that lots of computer software couldn't handle the year changing to '00. It was a simple software fix, seemingly. It cost hundreds of billions of dollars to go through all of the software that we had to change.
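
(The bug itself is simple to show. A schematic reconstruction in Python of the two-digit-year failure mode; no specific system's code is being quoted.)

    # Schematic of the classic Y2K bug: years stored as two digits.
    def age_two_digit(birth_yy, current_yy):
        # Pre-remediation logic: assumes the century never changes.
        return current_yy - birth_yy

    print(age_two_digit(65, 99))  # 34 -- fine in 1999
    print(age_two_digit(65, 0))   # -65 -- nonsense once '00' arrives

    # The fix was to widen the field to four digits (or "window" the
    # two-digit years), which meant touching every date field everywhere.
    def age_four_digit(birth_year, current_year):
        return current_year - birth_year

    print(age_four_digit(1965, 2000))  # 35

(The fix was trivial per field; the cost was in finding every field, which is why remediation ran to hundreds of billions of dollars.)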

And so three years in advance we started a program to convince industry to change the software, to convince other countries through the UN that this was a problem that they have to address, too, and to get the Congress to appropriate the money so we could fix the systems in the federal government.

A guy named John Koskinen was placed in charge of this in the White House, and had the unlikely title of Y2K Czar. And Koskinen convinced the Congress. Amazing-- he convinced the Congress, and they appropriated a huge amount of money, and we did an IT refresh of the federal government. We haven't done one since, by the way.

We don't know to this day whether Koskinen was a Cassandra and got believed, or whether nothing significant was ever going to really happen. What's your view?

AUDIENCE: My view is that if you look at parts of the world that did not take the precautions that we were advocating, there were some problems, but they weren't that great.

RICHARD CLARKE: Right.

AUDIENCE: Therefore, there would be some benefits that followed from the actions that we took, but the problem was also exaggerated, with lots and lots of equipment being sold, which was one reason why--

[INTERPOSING VOICES]

RICHARD CLARKE: It certainly helped the IT industry.

JOEL BRENNER: Yeah, yeah.

RICHARD CLARKE: Well, I agree with that, but I would add one other consideration, that in lots of parts of the world that didn't fix things and didn't experience much of a problem, they weren't as IT dependent as we were in 1999. In 1999, the United States was far more wired and IT dependent than the rest of the world. The rest of the world has caught up, but I think we would have suffered a far greater problem.

JOEL BRENNER: Do you think-- would you do it the same way again, Y2K, with the private sector in mind, in particular?

RICHARD CLARKE: Yeah. I think the approach to the private sector was a model.

JOEL BRENNER: My view is that a lot of companies wasted a lot of money, and that they would have been better off finding out what was going to fail and fixing it, and that a lot of money was spent unnecessarily on Y2K.

RICHARD CLARKE: But I think the government's approach to the private sector was the right approach.

AUDIENCE: Before he went to Syria, Robert Ford had been in Iraq, where he worked with John Negroponte at the time that the paramilitary death squads were being set up there. When he went to Syria, it was known that he was in contact with groups that had terrorist connections, like [? Harrimo ?] [? Schram, ?] Colonel [? Abeatty, ?] and many other factions. When he went to Obama to ask for support for helping the rebels-- which were a very amorphous group at that time-- Obama very sensibly asked him for pledges that the people that we would be supporting would only fight terrorists, and that they would not attempt a regime overthrow-- which, of course, were guarantees he was unable to give. And such people probably didn't even exist. If they existed, they were hard to find.

So as you readily admit, we have no idea what an American intervention in Syria would have done. It could have turned out much, much worse. There's no data that can be calculated to compare the two, side by side.

So we can see that in that instance, Obama was actually in the right-- given the catastrophes that he had gone through in Libya, where there apparently were no Cassandras, and what had just materialized in Iraq-- to resist this Cassandra. You can easily make the case that far more harm would have been done by listening to Ford-- given his record, and also given the record that the US had in the Middle East-- than was done by turning him down, couldn't you?

RICHARD CLARKE: That's certainly what President Obama thought. And my position here is simply to say, we can't know, because you can't rerun history and try to do it the other way and find out what would happen. We know what happened because we didn't support the insurgents, and it was quite destructive of Syria, and destabilizing of the region. Would US support of the insurgents have had a similar outcome? I don't know.

AUDIENCE: That's right. That's right. No, go ahead.

RICHARD CLARKE: Go ahead.

AUDIENCE: I just got to, like-- yeah. Thank you very much for a very interesting talk. This question is maybe a little bit beyond the spectrum of your subject matter, but since you've worked in government, both of you, you probably have a better idea than anyone else in this room about the likelihood of what I consider one of the most dangerous threats we have today, which is that we have an egotistical maniac president who has sole discretion on the use of nuclear warheads.

And so I'm just wondering, in your experience, is there really a check on the current president using, say, a first strike against probably Korea-- but who knows, it could be Iran. Who knows. Is there a check on that, beyond--

RICHARD CLARKE: No. No. If the president of the United States wakes up in the middle of the night and takes out his decoder ring, which is called the biscuit, and authenticates himself as the president, and picks up the phone in the briefcase and calls the National Military Command Center, and authenticates himself, and issues an order for a particular preconceived plan, it's supposed to just happen. There is no procedure for anybody else in the decision loop to question that order.

JOEL BRENNER: Isn't it remarkable that ordinarily, in every other circumstance, the chain of command runs through Sec Def?

RICHARD CLARKE: Right. You've got to remember, the system was built-- no, I know. It's a bat shit system. I accept that.

It was built at a time when we thought the Soviet Union could attack us at any time with missiles that we would detect initially 15 minutes after launch. Now, of course, we could detect them immediately, but then it would be 15 minutes after launch. And that would have left us about half an hour to launch our missiles and to get our bombers off. And we had a lot of bombers in the air.

You can't conceive how crazy the Cold War was. We had, at any given time, scores of nuclear bombers flying around in the air. And the more tensions ratcheted up, the more of them were put in the air, and the more of them were put on runway alert.

But one of the things I did here at MIT, in Course 17, was calculations about whether or not we could withstand a nuclear assault, and then figure out whether or not to retaliate or whether we had to retaliate before the missiles hit. I know it's hard to conceive of, but it was a system that was designed in an entirely different era. And it's a system that's still there.

Now, does the one-star general who's running the National Military Command Center at 3 o'clock in the morning say, ooh, I think I'm going to call Secretary of Defense Mattis about this? Maybe, but he's not supposed to.

AUDIENCE: Thank you.

JOEL BRENNER: So you're right to be scared.

AUDIENCE: Yeah. Well, we know that there was a Russian colonel who did just that. It saved us all.

The last time Jim Hansen was here that I know of, he spoke about another warning, which was super winds-- winds up to 500 miles per hour.

RICHARD CLARKE: Yeah.

AUDIENCE: So that's something to take into consideration. I'd like to open this up a little bit more, because I think what you're talking about is a failure of systems thinking, right? And there's another side of that.

There are positive solutions available that are not recognized the same way that warnings are not recognized. And for me, right now, in terms of climate change, there's the idea of geotherapy-- using existing ecological systems to take carbon out of the air rapidly.

RICHARD CLARKE: Mitigation steps or prevention steps.

AUDIENCE: Right. And some people say reversal steps. There are soil scientists who say it could be done in 40 or 50 years. We know how to do it already.

But it's not really recognized. It's not really talked about here at MIT or Harvard or BU or other places. So in terms of systems thinking, what do you think, beyond what you already talked about, are the problems? And what about those positive ideas which don't get taken up because of the cognitive biases?

RICHARD CLARKE: Well, you're right. It is a problem of systems thinking. And what we suggest in the book is a process that says, what if this is right? What if the Cassandra is right?

And then you develop a rational process of going through and saying, what can I do about it? And what can I do to mitigate? What can I do to prevent? What are the options? And what expenditure do I make at what period in time, as the evidence comes in?

If you believe, as I do, that Jim Hansen is right-- and if we had a president who thought this way-- you would be asking right now, what are the mitigation steps? And it wouldn't just be building a giant wall across Boston Harbor, because that's probably not going to work. In general, I think you need to do other things in addition to just building walls higher and having a lot of pumps, which is New York City's solution at the moment.

And you would open the aperture, and people would come in, as you're suggesting, proposing that we can do geoengineering. I don't know whether that would work or not, but you would have a process that would evaluate alternatives for mitigating and preventing. We don't have that process.

AUDIENCE: Richard Clarke, I'm very glad to hear you today. I just wonder, how optimistic are you about the future? Seriously. I mean, should I spend my 401(k) sooner, rather than later?

RICHARD CLARKE: I actually-- go ahead.

AUDIENCE: I'm sneaking in a multi-part question. In your book, is there a hierarchy of issues or threats that you see? What are you most worried about?

RICHARD CLARKE: If you take our seven chapters of future threats, I think the one that would have the worst outcome, if it turned out to be true-- and the most immediate outcome-- is sea level rise. Personally-- and we don't say this in the book because we leave it for the reader-- but personally, I'm not afraid of artificial intelligence.

I know Elon Musk is. I know Bill Gates is. Stephen Hawking is.

OK, fine. I'm not. I'm more of the Ray Kurzweil school of AI.

CRISPR-Cas9, I also don't worry about a whole lot. Now, we interviewed Jennifer Doudna, the Berkeley professor who was one of the originators of CRISPR-Cas9. And she really opened up to us.

And she said, you know, in the movie Frankenstein, it's the scientist who creates the monster who's named Frankenstein-- it's Dr. Frankenstein and his monster. And she said, I wake up in the middle of the night, and I wonder if I'm Dr. Frankenstein. I wonder if I and the others who really brought CRISPR-Cas9 into existence aren't going to look back someday and think that we shouldn't have done it, because it'll be misused.

You know, I think that sure, there's a chance it'll be misused, and we should have systems in place to prevent that. But it's going to be a wonderful thing for people who have birth defects and medical conditions. It's already started.

So there are some things in the book that I frankly don't worry about, and other things that I do. But that's for the reader to decide.

You know, people say, oh, what a depressing book. I think it's an optimistic book, because it holds out the hope that if you had systematic thinking-- sort of apolitical, rational analysis, systems thinking, if you want to call it that-- we could see problems coming and stop them from becoming really big problems.

I believe in government. Both of us worked in government for decades. I believe in good government's ability to be rational, and save the country, and save the world from some of these disasters. If you ever go to Siena, in Italy, there's a marvelous old painting-- nearly 700 years old.

JOEL BRENNER: Good government, bad government.

RICHARD CLARKE: Good government and bad government. And under good government, people are prosperous. They're going to university. They're farming-- all this lovely activity-- and the sun is out. And in the next frame, bad government: people are being victimized and whipped, and there's tyranny, and there's poverty, and it's dark-- good government, bad government.

I think the lesson of this book is that we've got a lot of potential risks coming at us. We always do. But if we can't get good government to address them in a rational, scientific way, we will have the outcome of bad government in that painting in Siena.

AUDIENCE: Thank you.

JOEL BRENNER: Dick, you know-- here's another question. Go ahead.

AUDIENCE: Well, go ahead.

JOEL BRENNER: No, no. Please.

AUDIENCE: Well, I'm kind of interested in the response to what happened yesterday in New York City, if you haven't already talked about it. I'm sorry. I got here a little late.

I think in some ways-- there's a two-sided aspect to my question. On the one hand, I think we're our own worst enemy, in that we amplify-- somebody does something. It could have been just somebody who was 89 years old and lost control of their vehicle, but it's called terrorism, and it's amplified. It's on the front page of The New York Times. So is there a way to negotiate how we and the media respond to and report on incidents like this, so that it's responsible, but not amplifying? That's the first thing.

And then the second thing is, when I listen to the clips of the government officials talking about it, they were really eager to call it terror. And I don't know how helpful that is, how useful it is to label it in that way. And I wonder if there's an alternative way of responding to it, talking about it, rather than the almost cliches of "innocent people were killed," and, "they were just going about their business," and, "this was terror," as if that makes it worse. Thank you.

RICHARD CLARKE: For about 12 years now, I've been an ABC News commentator. And when something like what happened yesterday happens, I get called and thrown on TV to talk about it. And so with very little notice, I got thrown on live TV yesterday to talk about this.

What I tried to do was put it in perspective. Yes, they were saying it's the worst terrorist attack in New York City since 9/11. But 9/11 was nearly 3,000 people dead, and this is eight, and there is a difference.

This sort of thing has been happening a lot in Europe. It hasn't happened here very much. We don't need to panic.

We live in a fragile society. We live in a society where any person, for any reason, can decide to kill other people. The guy in Las Vegas, we still don't know his motive. We may never know his motive. But it had nothing to do with ISIS or Islam, or any religion, as far as I know.

We live in a society where that sort of thing can happen. And we just have to remember that saying this is only done by Muslims is not only wrong, it's counterproductive. And the number of Muslims who believe in this sort of thing is infinitesimally small as a percentage of the billion-plus Muslims in the world.

Timothy McVeigh, who killed children in Oklahoma City-- the worst terrorist attack in the United States prior to 9/11-- was not a Muslim. He was a Christian. The last time we had someone use a vehicle to ram a crowd and kill people was in Charlottesville, and that was a Christian-- or at least he called himself one.

So I think that when these things happen, there is a role for media, and it is-- I think we do this at ABC, at least when I'm on. I think we try to put things in perspective, and not hype them.

We cannot fail to cover these events. But I think what responsible journalism can do is explain them, and say, you know, if you're in Cambridge tonight or Dubuque, don't worry about it. The chances of this happening to you are much less than the chances of you being killed by furniture.

Statistically, over the last 10 years, more people in the United States have died from furniture falling over onto them than have died as a result of terrorism. And there are plenty of other mundane causes of death that kill far more people than terrorism has in the United States over that period. People just need to understand what the real threat here is.

Yeah, there's a problem. Absolutely, there's a problem. And are we doing enough about it? Yeah, I think we are-- probably doing too much, frankly. But to say to journalism, don't call it terrorism, don't cover it, that ain't going to happen.

JOEL BRENNER: Dick, I think we've got to the end of the hour. This has been an extremely stimulating, articulate, and enjoyable presentation, and I would like our audience to help me thank you for that.

RICHARD CLARKE: Thank you. Thank you.