More than a generation has passed since catastrophe modeling first emerged in the 1980s, giving insurers a new way to estimate and manage the risk of loss from hurricanes, earthquakes and other catastrophic events. Today, catastrophe models are part of the fabric of any well-run insurance organization and used to assess exposure, price risk, manage capital and inform stakeholders such as management, corporate boards, regulators and rating agencies.
The risk landscape is in the midst of significant secular and increasingly dynamic change. Population and economic growth have led to major new exposure concentrations in areas prone to natural catastrophes. Thirteen of the world’s 20 largest cities, for example, are in coastal areas. Europe’s most populous city, Istanbul, sits on top of a major seismic zone. California is struggling to deal with the implications of wildfire. Climate change is a reality, and its effects can be seen in the increasing frequency and severity of natural catastrophes.
Over the past 30 years, $4.5 trillion in economic losses have been attributed to catastrophic events, approximately $150 billion annually, of which $1.3 trillion was insured. This highlights the insurance industry’s pivotal, and often under-appreciated, role in helping to mitigate the impact of disasters on society and the global economy. But it is also clear that the gap between economic and insured losses is a growing problem, and the insurance industry has an important role to play in closing it.
While the challenge has grown, the principles of how insurers assess catastrophe risk and share information have not broadly changed in decades. The advances in our scientific understanding are evident, and the benefits of the organized thinking that catastrophe management brings are clear. But, fundamentally, the industry continues to use a “rear-view mirror” approach driven by historical scenarios. Improvements to the tools used to assess exposure and trade risk amongst counter-parties have only been incremental.
We know what information is important for catastrophe management -- examples include roof shape for residential wind exposures, building foundation for earthquakes, or first-floor height for flood. We have billions of dollars of paid claims to help understand vulnerability. Yet insights remain expensive and hard to extract, and the industry has been slow to adopt proper tools to process and analyze our massive amounts of risk data. The way I see it, our challenges are not technical -- they are institutional. The catastrophe management ecosystem of risk takers and tech providers that developed in the early 1990s was, ironically, one of the earliest waves of “insurtech,” but change has since come slowly.
Fortunately, the insurance industry has the opportunity to rethink its approach to catastrophe management. Today’s digital technology and thinking are reshaping commerce across the globe through ecosystems that thrive on platforms, data sharing and APIs to drive product innovation. As an industry, we need to embrace this approach and address our basic operational fundamentals.
Other industries such as banking and telecommunications have developed foundational solutions that improve workflow and handle transactional information through standards in messaging and/or data. As an example, SWIFT, banking’s unified messaging system, handles 12 million messages daily, comprising customer payments, transfers and derivative trades. In today’s world of trading catastrophe risk -- a simpler one than the financial system -- any given risk can easily be touched a dozen times along the chain of counter-parties, with different, inefficient handling of information along the way. There are no industry standards to organize basic data on exposures or risk assessments.
The adoption of some combination of standards, APIs and/or interoperability tools would allow the industry to operate more efficiently, reduce costs and improve service to our customers. However, while valuable, this is not the real prize for the industry.
By solving these basic operating hurdles, the industry will gain the time and flexibility to create new products and services, fueled by an ecosystem of risk insights richer and deeper than today’s static use of catastrophe models. Exposure and hazard layers will be available at ever-increasing granularity. New data and machine learning will enable lateral thinking, with the cloud providing the processing speed. Models that harness forward-looking views of risk will emerge into the mainstream -- for example, convective storms generated from global circulation models, or perils such as wildfire and flood. Cycle times on lessons learned from actual events or new scientific research will be measured in months, not years -- enabled by “plug and play” software. Individual companies will harness these capabilities to innovate their value proposition to mitigate risk, improving the resiliency of their customers.
The time has come for the insurance industry to change its operating paradigm for catastrophe management. By creating greater efficiency, transparency, choice and innovation in the assessment and transfer of risk, the industry will be better positioned to perform its role, in a riskier world, as a vital shock absorber for everyday life.
The insurance industry already has the basic building blocks -- the digital toolkit, use cases from other industries and collaboration frameworks. Will our industry be so bold?
Sean Ringsted is Chief Digital Officer at Chubb. This article originally appeared on LinkedIn.