Critical Point Episode 12: Could cyber risk be the next “Big Short”?
Transcript
Announcer: This podcast is intended solely for educational purposes and presents information of a general nature. It is not intended to guide or determine any specific individual situation and persons should consult qualified professionals before taking specific action. The views expressed in this podcast are those of the speakers and not those of Milliman.
Rebecca Driskill: Hello and welcome to Critical Point, brought to you by Milliman. I’m Rebecca Driskill and I’ll be your host today. In this episode of Critical Point we’re going to be talking about cyber risk. The entry point for most of the general public when it comes to cyber is attacks like those on Equifax and Target, breaches that were highly publicized and compromised a lot of personal information but appeared isolated to one target company. But cyber risk has the potential for much more serious, far-reaching consequences on the scale of 2008’s global financial crisis, as we’re about to discuss. Joining us today are Chris Harner and Chris Beck, consultants from Milliman’s Cyber Risk Solutions practice. Chris and Chris, thank you for joining us.
Chris Beck: Thank you, Rebecca.
Chris Harner: Great to be here.
Rebecca Driskill: So let's just jump right in. What does Milliman’s Cyber Risk Solutions practice do? What makes you guys different?
Chris Harner: The Milliman Cyber Risk Solutions practice focuses, first and foremost, not on the assessment per se but on the quantification of cyber risk. And that’s really important, because a lot of the other consulting houses or products out there are really conducting or selling assessments, control assessments, red-teaming, that type of thing. What we’re focusing on is taking inputs like that—those assessments are certainly relevant—but we provide quantification by taking a comprehensive view of industry data, in-house data, and those other types of assessments in order to quantify the risk and really give the client what they need. And that’s a continuous loss distribution, so they can start the discussion on what the expected loss of a breach or an event is as well as the tail—what does that look like in an extreme event?
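To make that output concrete, the sketch below simulates a simple annual loss distribution and reads off the expected loss and a tail measure. This is not Milliman's model; the Poisson frequency and lognormal severity parameters are invented for illustration.

```python
# Minimal sketch of a simulated annual cyber loss distribution.
# Illustrative only: the frequency and severity parameters are
# invented assumptions, not Milliman figures.
import numpy as np

rng = np.random.default_rng(seed=42)
n_years = 100_000                 # simulated years
freq_mean = 2.0                   # assumed average number of cyber events per year
sev_mu, sev_sigma = 13.0, 1.5     # assumed lognormal severity parameters (dollars)

annual_losses = np.zeros(n_years)
for i in range(n_years):
    n_events = rng.poisson(freq_mean)
    if n_events:
        annual_losses[i] = rng.lognormal(sev_mu, sev_sigma, n_events).sum()

expected_loss = annual_losses.mean()                      # expected annual loss
var_99 = np.percentile(annual_losses, 99)                 # 1-in-100 tail loss (VaR)
tvar_99 = annual_losses[annual_losses >= var_99].mean()   # average loss beyond the tail

print(f"Expected annual loss:    ${expected_loss:,.0f}")
print(f"99th percentile (VaR):   ${var_99:,.0f}")
print(f"Tail average (TVaR 99%): ${tvar_99:,.0f}")
```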
Chris Harner: There's also kind of a language communication issue here. And what we mean by that, when we talk to chief risk officers or to chief information security officers, is that one of the challenges they have is too much information. How do you take all that information, synthesize and analyze it, and tell a story that's intelligible to the C-suite or to the board? So by quantifying cyber risk, which is a new risk that people are struggling with, they can begin to use the risk language that the board is comfortable with, that they recognize maybe from credit risk or from other domains, and then they can start to demystify the risk and people can start talking about strategy, risk appetite, cost-benefits of certain investments, and so on. So it's definitely the quantification of this risk, but a higher-order issue, I think, is that we're helping the risk people communicate with the information security people and tell that story to the board.
Rebecca Driskill: So what is the biggest misconception about cyber risk?
Chris Harner: I would say the knee-jerk reaction is people feel that this is strictly technology, that it's specialists with that background who own this. And there's a logical reason for that. But we think it's actually an old risk resurrected in a new form, meaning it's bad people using technology to do bad things. So, as I like to say, instead of someone lurking around your home right now with a crowbar to break in, they're using some type of technology or program or spear phishing to break into your system, steal data, and so on. So I think that's one of the biggest misconceptions, and that's what drives a bit of this gap, say, between the risk side of the house and the technology side of the house.
Chris Beck: I think another misconception is that it's really about data security and information security. The problem is much bigger than that. In the world of cyber there are bad actors who are trying, for various reasons, to grind companies to a halt, to grind sectors to a halt. And because their motives are potentially very, very different, the impact can be very tough to measure, very tough to analyze. So what do I mean by that? If we look at the different large cyber incidents, some were motivated by financial gain, people trying to steal information so that they can sell it. Some organizations were trying to disrupt business, actually stop a firm from being able to do business. Some are just trying to embarrass a business. We have also seen a large number of cyber events that have been tied to state actors, where the motives behind cyber are the same motives that any nation-state would have to do harm to another sector or another country. So when you start looking at all of those different motives, it's like a detective story: you can't really figure it out until you figure out the motive. The myriad of motives for a cyberattack makes it an incredibly complex thing for companies to look at, and it makes it different from the normal challenges of other risks.
Rebecca Driskill: Talk about that a little bit. Maybe Chris Harner, if you want to start off. I think most of the general public thinks about cyber as these isolated attacks for personal financial information: Target, Equifax, Marriott. But there is something that seems much more sinister, silent cyber, that doesn't get as much press. Can you talk a little bit about what silent cyber is? Introduce the concept and how it can have more global ramifications.
Chris Harner: Sure. So silent cyber is a phrase coming out of the insurance sector, and it is the realization, when you underwrite cyber insurance, that a claim doesn't just impact that specific policy, because there are knock-on effects on other policies that have been underwritten. So, for example, everybody's read about a cyber breach, and probably the second article the next day is that shareholders or some group are now suing the board of directors of that company. In the insurance world that means the company will potentially tap not only the cyber policy with a claim but also the directors and officers (D&O) policy. There might be other policies that that same underwriter wrote for the company: errors and omissions (E&O). There could be contingent business interruption (CBI). So think of an industrial company whose supply chain was disrupted. It couldn’t deliver on certain contracts, couldn’t produce certain things. And now what the underwriter originally thought—they had a limit around that cyber policy with this one company and thought it was contained at, let's just say, $10 million—now all of these other claims roll in over time, which means exposures may be, let's just say, $50 million. So there’s this nonlinear multiplier effect of how cyber events play out. And that is a big concern for the insurance sector.
Chris Beck: As we live in a continually more and more connected world, the second-order and third-order impacts of a cyber event become much, much larger. Our power grid has increasingly moved to web-connected technology that is vulnerable to cyber and is being used to deliver your electricity more efficiently. But that creates a vulnerability, which means a cyberattack might lead not only to you being unable to turn your lights on at home but to issues with hospitals and with public services such as fire and police. The ability to look at those impacts is complicated and difficult. But when we look at silent cyber, a lot of the things that are potentially impacted are also insured. And that means a supply chain issue could cause lots and lots of problems and lots and lots of insurance claims. The inability to deliver other goods and services does the same.
Chris Harner: I think I would just add to that, if you take a step back, cyber really highlights and aggravates the law of unintended consequences. So let's take a simple example on healthcare. The Affordable Care Act (ACA) requires the digitization of medical records. There’s a certain logic to that so that physicians can get the lab results very quickly. It's on file. But to Chris's point, now that's a new vulnerability. And depending how that plays out, if someone’s records are compromised or deleted and the physician doesn't have that information and something goes wrong, then is there potentially a malpractice lawsuit? Is there a claim on that policy? So this is part of the complexity of cyber as we become more and more dependent upon the Internet of Things or other software and that type of interconnectivity.
Rebecca Driskill: We've seen this interconnectedness of risk actually play out in a number of cyberattacks fairly recently, I mean within the past decade probably. Chris Harner, why don’t you tell us in what ways we've seen cyber risk and the interconnectedness of that risk play out globally?
Chris Harner: Yes. So I think one event that comes to mind in 2017 was the NotPetya attack. Long story short, it's believed that Russian intelligence created a virus to punish Ukraine, hitting their infrastructure, because of what's going on in the eastern half of the country. They inserted some malware into an accounting software product that started to infect the entire country. And, again, back to the law of unintended consequences, they only wanted to really punish or hurt Ukraine. But it actually spread globally. As that virus propagated throughout the country, eventually it hit the port in Odessa. The largest shipper in the world, Maersk, the Danish company, has operations in that port and their software and their systems became infected. From there it propagated, it jumped to Mumbai, to Rotterdam, to other ports around the globe and eventually knocked Maersk off of 75 out of 76 ports in the world, resulting in the inability to load or unload ships. Twenty percent of global shipping goes through Maersk. And each of those large ships contains roughly 18,000 containers. So this had that second-order and third-order effect of shutting down ports around the world because now you cannot load or unload, which means the trucks can't move. In terms of silent cyber, that was a game changer in which the insurance industry realized there are other claims involved. We never thought of something like that happening, but if you had perishables on board, you could have breaches of contract because, again, in the supply chain you weren't delivering. This is how it spreads. Again, this is kind of nonlinear geometric impact. Last but not least, that case really highlights, as well, that cyber, unlike many other risks, does not have a geographic boundary. So that makes it different in the insurance industry, say versus non-life catastrophic event (CAT) risk for hurricanes, earthquakes, and so on.
Rebecca Driskill: The title of this podcast is "Could Cyber Risk Be the Next 'Big Short'?" What are things that insurers need to watch out for given the interconnectedness of this risk?
Chris Beck: The reason we draw the analogy to The Big Short for all firms, but especially the insurance industry, is that it is nonobvious how the different potential bad day events could impact each other and could lead to a fairly catastrophic event for the insurance industry. So The Big Short analogy, I think, works really well because there were lots of things that the financial industry was looking very hard at in mortgages—measuring performance of different financial tools, measuring risk through classic risk metrics—but they weren’t looking at potential chains of events where, if just a couple of their assumptions failed, the second-order and third-order and fourth-order effects would be so cumulative in nature that it could wreak real havoc on the economy.
So what does that look like for cyber? The NotPetya attack is an example, but we would suggest that it is something inside the expected possibilities for a cyberattack because, even though it was by dollar value one of the largest attacks in history, lots of things kept going forward. Through a little bit of planning and really some luck, lots of systems were able to be put back online fairly quickly. But if you start thinking about what happens if—we’ll keep with the Maersk example—what if it wasn't 75 out of 76 ports? What if it was 76 out of 76 ports? They got lucky that one port was off-line due to a blackout and they were able to recreate their systems from the data in that port. Well, that's not 20% of global shipping off-line for a couple of days. That’s 20% to 25% of global shipping off-line for we don't know how long. And when you start multiplying that by containers and goods to market you start seeing the real potential for the catastrophe.
One of the analogies that comes to mind there from The Big Short is that you get to a place where banks that were taking risks and betting on mortgage-backed securities start to not be able to serve their clients that had nothing to do with those businesses. So you had large profitable companies saying, “I don't have cash on hand to do payroll. That's the reason I've got a rolling line of credit with you. My cash is invested in 500 other things that give me a higher rate of return than taking a loan from you. You’re telling me that even though I had nothing to do with mortgage-backed securities and I wasn’t exposed or taking any of these bets that I might not be able to make payroll in 15 days, in 20 days, and 30 days?” The same analogy works, and frighteningly well, for a cyber event that can take the systems and processes of business off-line for an amount of time that you start to see effects that are fairly catastrophic for companies and industries that didn't think they were in the cyber insurance business, that legitimately thought they had their cyber risk profiles understood and well-controlled for.
Chris Harner: Yeah. I definitely agree with that. And I would add that Chris makes a good point that we haven't had the systemic event. The big problem with where technology is taking us is that interconnectedness. It means that risk is increasing throughout the economy, so regardless of what business you're in, you're less and less resilient because you’re more and more exposed, directly or indirectly, to the failures of counterparties, suppliers, and different types of vendors. As engineers would say, everything is becoming tightly coupled. So think if you ride New Jersey Transit…
Rebecca Driskill: I do. Unfortunately.
Chris Harner: Yes. And if you're trying to get home, well, Amtrak owns those railroad tracks. So if an Amtrak train breaks down and they have priority you’re going to sit there. That’s a very tightly coupled…
Rebecca Driskill: That happened this morning.
Chris Harner: A-ha. So that is a very tightly coupled process. And cyber is aggravating this type of interdependency. It means that you can no longer be resilient and robust enough to survive a systemic attack. You know, maybe you could survive a one-off attack just on your business, in your area. And, again, going back to the insurance sector, this is a bit similar to blackout modeling, where you have a cascading failure: one node in the power grid goes down, so the other areas get overloaded, and then they go down. We actually experienced that in August 2003, when a failure in Ohio cascaded and took down Ontario, New York, and then everybody else with them. And that's really where we’re headed potentially with cyber. And it's not clear if everyone's really thought this through and is prepared for it.
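The cascading failure Harner describes can be sketched with a toy model in which a failed node's load spills onto its surviving neighbors, and any neighbor pushed past its capacity fails in the next round. The network, loads, and capacities below are invented for illustration.

```python
# Toy cascading-failure sketch, in the spirit of blackout modeling.
# Illustrative only: the topology, loads, and 30% headroom are invented.
from collections import defaultdict

edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A"), ("B", "D")]
load = {"A": 6.0, "B": 8.0, "C": 5.0, "D": 7.0}
capacity = {node: x * 1.3 for node, x in load.items()}   # assume 30% headroom per node

neighbors = defaultdict(set)
for u, v in edges:
    neighbors[u].add(v)
    neighbors[v].add(u)

def cascade(initial_failure):
    """Return the set of nodes knocked out after one node fails."""
    failed = {initial_failure}
    frontier = {initial_failure}
    while frontier:
        next_frontier = set()
        for node in frontier:
            survivors = [n for n in neighbors[node] if n not in failed]
            if not survivors:
                continue
            share = load[node] / len(survivors)   # spill the failed node's load
            for n in survivors:
                load[n] += share
                if load[n] > capacity[n]:         # overloaded neighbor fails next round
                    next_frontier.add(n)
        failed |= next_frontier
        frontier = next_frontier
    return failed

print("Nodes taken down by a single failure at A:", sorted(cascade("A")))
```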
Rebecca Driskill: We can't say that we can ever prevent a cyberattack. But there are better ways to manage this risk. Can you guys talk a little bit about what are some of the dangers or some of the good things that insurers and businesses can be doing to manage this interconnectedness of risk?
Chris Beck: Absolutely. So a couple of things. One is to use the available tools, which do a very good job of measuring both your exposure to cyber risk and the steps you are taking to prevent it. Almost every firm is doing that today, spending a lot of time and a lot of effort, and a lot of very smart people are coming up with great solutions. The next piece is to also look at the cyber risk of the other companies you're connected to, whether it's through an insurance contract or two companies that are doing business and have data feeds between them. Understand how you’re connected to the larger ecosystem of cyber risk and understand the threat vectors and the potential controls that are in place. The supply chain analogy is great here. You are exposed to cyber risk through overland trucking, as we saw in NotPetya: when the truck showed up at the port during the cyber event, the port didn't know which container to take off the ship in which order, because that information was digitized and wasn't available because of the attack. So they don't know which container to put on the truck. Your supply chain is now down and you’re a company taking losses through your overland trucking, which you probably didn't put into your cyber equation. If you’re three stops down the supply chain you probably hadn't accounted for that. But it's looking at the landscape with a bigger and bigger lens and putting together the different connection points so that you can aggregate that risk and then try to find where you are most vulnerable and least in control. Where's the best bang for your buck to try to mitigate the risk? We absolutely understand that people are doing great jobs of solving for the different threat vectors, trying to stay in front of the next state actor or nonstate actor that is trying to bring the next massive cyberattack. A piece that's not obvious in that equation is how you see the bigger picture and understand that a threat vector or a cyberattack that is not targeted at you might very well impact you significantly, and understand the greater landscape and ecosystem.
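One rough way to act on Beck's "best bang for your buck" question is to score each connection point by expected loss avoided per mitigation dollar, as in the sketch below. The vendors, probabilities, impacts, costs, and the assumption that mitigation halves the propagation probability are all invented for illustration.

```python
# Rough sketch of ranking third-party connection points by risk reduction
# per mitigation dollar. Illustrative only: every name and figure is invented.
third_parties = [
    # (name, probability an event there reaches us, impact if it does ($), mitigation cost ($))
    ("cloud provider",      0.05,  8_000_000, 400_000),
    ("payments processor",  0.02, 12_000_000, 250_000),
    ("logistics vendor",    0.10,  3_000_000, 150_000),
    ("marketing data feed", 0.15,    500_000,  50_000),
]

def rank_by_benefit_cost(parties, mitigation_effect=0.5):
    """Sort connections by expected loss avoided per mitigation dollar,
    assuming (for illustration) mitigation halves the propagation probability."""
    scored = []
    for name, prob, impact, cost in parties:
        expected_loss = prob * impact
        benefit = expected_loss * mitigation_effect
        scored.append((benefit / cost, name, expected_loss))
    return sorted(scored, reverse=True)

for ratio, name, exp_loss in rank_by_benefit_cost(third_parties):
    print(f"{name:20s} expected loss ${exp_loss:>10,.0f}   benefit/cost {ratio:.2f}")
```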
Chris Harner: Yeah, I would just add to that, looking at the big picture, some people have said that cyber risk is sort of like the Cold War: for every new defense you develop, the enemy will develop a new offense. And, I think, if you're a decision-maker, a senior person in the organization, it raises the question: how do I get started? What do I need to do today? I think the first question you ask yourself is: what does a bad day look like? How much is this going to cost? What type of damage are we going to have? And that includes a lot of the things we just talked about that drive it. And that's where we believe strongly that if you can't quantify your risk then you really can't have a meaningful discussion about it. You're not going to get to a risk appetite. You’re not going to be able to determine how to deploy that capital—you know, budgets are limited, resources have limits. And that's why it's just so important to begin that process.
A lot of this information, a lot of these things, are lying around in different places. The risk managers probably have certain types of assessments. Certainly, the information security people probably have too much information and are struggling with what to do with it. Then you have the compliance people, or people on the legal side of the house, who will say, “Hey, a new law was passed in this jurisdiction. We have to comply with that,” and so on. So I think it's taking all those pieces and asking yourself: are we comfortable with where we’re at? If we get breached, not only do we have a plan, but do we know, ballpark, where we are going to land in terms of the damages, the remediation, and so on? And if the answer is no, then, just like with all the traditional classic risks, it’d probably be good to sharpen pencils and start getting a game plan together on how the organization gets its arms around that.
Rebecca Driskill: Well, you both have been arguing for a new risk management paradigm. Can you give a high-level overview of this way of trying to manage this risk?
Chris Harner: So imagine there’s a car accident and you’re standing on one side of the street and someone else is standing on the other side of the street. You both witness the accident. You both describe it to the police but you come up with two different versions. Not that one person is wrong or the other person is not telling the truth, but there’s a different point of view of what happened. All of that is valuable information if we’re trying to determine what happened, who’s to blame and so on. So when we think about this paradigm shift we've been getting into complexity theory, looking at risk from a different angle, not just the classic structure, which is a very linear version of what again we like to call list management. If we’re going to list out all of these risks and then we’re going to assess some probabilities around their happening, what's the impact? That doesn't capture the relationship, the nonlinear relationships of those risks. What we've been doing is borrowing from the social sciences such as cognitive mapping, so going back to the analogy of the accident, everyone has a different point of view, a different perception, and it’s valid, so that's why we need to bring everyone to the table. This goes back to the language issue. It's not just technology. There are risk people, compliance, legal, finance. Everyone needs to be working together on how to solve this problem. Cognitive mapping allows us to do that, to create a picture and the connection points. If we think of that in risk terms, that allows us to start to understand a story of what triggers have to be tripped, how many, and how that works out to get us to a tipping point, a real event. And you can kind of visualize that. There's a pathway. There are different ways bad things can happen. There are different ways that an accident could occur. It could be because of bad weather. It could be because someone was texting and wasn't paying attention. It could be that someone on a bicycle just jumped right in front of them and they swerved. Likewise with cyber, because it's such a high-velocity risk, it’s moving so quickly that we don't think traditional methods are going to produce the needed insights. You don't have the data, you don't have a rich 50-year data set in order to model it, so we have to think about this differently in order to get comfortable with how that risk actually manifests itself.
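A cognitive map of the kind Harner describes can be represented as a directed graph of triggers, which makes the pathways to a tipping point explicit and countable. The sketch below is illustrative only: the nodes and edges are invented, and a real map would be elicited from the risk, information security, compliance, legal, and finance stakeholders he mentions.

```python
# Minimal sketch of a cognitive map as a directed graph of triggers.
# Illustrative only: the nodes and edges are invented for this example.
risk_map = {
    "phishing email opened":     ["credentials stolen"],
    "unpatched vendor software": ["malware foothold"],
    "credentials stolen":        ["malware foothold", "data exfiltration"],
    "malware foothold":          ["core systems encrypted"],
    "data exfiltration":         ["regulatory fine", "D&O litigation"],
    "core systems encrypted":    ["business interruption", "supply chain stalls"],
    "business interruption":     ["tipping point: systemic loss"],
    "supply chain stalls":       ["tipping point: systemic loss"],
    "regulatory fine":           [],
    "D&O litigation":            [],
}

def paths_to(target, start, graph):
    """Enumerate every distinct chain of triggers from start to target."""
    if start == target:
        return [[start]]
    paths = []
    for nxt in graph.get(start, []):
        for tail in paths_to(target, nxt, graph):
            paths.append([start] + tail)
    return paths

for path in paths_to("tipping point: systemic loss", "phishing email opened", risk_map):
    print(" -> ".join(path))
```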
Rebecca Driskill: So besides the additional exposures we've been talking about, when it comes to companies’ vendors, clients, and other third-party relationships, Chris, I've also seen you make the argument that this is something that needs to be taken into account for mergers and acquisitions (M&A), if there's one company thinking of acquiring a target company.
Chris Harner: Yeah, that's right. I think this is an overlooked area, but it's gaining more traction. I think it's dawning on people that, when you conduct due diligence for a very large transaction, there are all of the traditional areas that you're going to look at. If it's two banks, they’re probably going to look at the loan books and what the aggregate exposure looks like. Well, that concept applies to cyber as well. You don't want to get into a situation where you’re acquiring another company and you think it’s going to enhance a certain part of your business without realizing that you're actually taking on additional cyber risk. Maybe your attack surface changes in a radical way. Maybe that introduces new threat vectors. And if you haven't assessed that, then you have to ask yourself: do I really understand the risk of this transaction? Conducting that assessment for M&A transactions, by looking at the cyber landscape of the target, should actually move the needle on price. Probably more importantly, if you see something you don't like, either you walk away or you're going to expect the target to change certain things. That's definitely an area that people need to think about.
Rebecca Driskill: What are three questions that business executives should be asking themselves and their boards when it comes to measuring their cyber risk?
Chris Beck: I think the first one is: do we really know what a bad day would look like for cyber? The second one is: do we understand the different things that could occur or trigger that bad day? And I think the third question is: are the actions that we’re taking, and planning to take, the most effective ones to mitigate those risks? There is near universal agreement that there are going to be large-scale cyberattacks on major corporations, whether they’re global or domestic. The piece that we’re really urging people to think about is the odds that such a cyberattack is going to impact their firm. Given the degree to which things are becoming more and more interconnected, and the way we've seen cyberattacks evolve and manifest themselves, the odds are that a large cyberattack on a corporation that's not yours could very well impact you in many of the different ways that we've described.
Chris Harner: I think there’s a consensus that, where we’re headed with cyber, the issue is not if you're going to get breached but when. And, however that attack manifests itself, again, it can be indirectly through a vendor, a third party. So the exposure might be broader than people realize. We would encourage people to really think about that and how that threat can evolve.
Rebecca Driskill: Well, thank you Chris and Chris for joining us. You've been listening to Critical Point presented by Milliman. To listen to other episodes of our podcast visit us at Milliman.com or you can find us on iTunes, Google Play, Spotify, and Stitcher. We'll see you next time.