WILDFIRE MODELLING

Modelling the enigmatic peril: the challenge of pricing wildfire risk

Wildfires have increased markedly in recent years, particularly in the state of California. With little historical data available to model the new risk, how are carriers grappling with providing cover in a market which is suddenly one of the riskiest out there?


In California, a risk previously deemed niche and remote has reared its head in recent years as one of the most severe threats facing homeowners, businesses and, by extension, the re/insurance industry.

The peril in question is of course wildfires, which have wrought destruction across the Golden State on an unprecedented scale and with ever more alarming frequency, catching residents and coverage providers alike by surprise.

The scale of the problem for the re/insurance market is severe: thousands of homes and businesses in one of the wealthiest areas of the world lie in the path of these terrifying events, posing a technically challenging and financially troubling issue for the market at large.

But with historical data seemingly rendered obsolete by changes in the environment, how can the market tackle the problem?

In a panel discussion at the Re/Insurance Lounge, Intelligent Insurer’s online, on-demand platform for interviews and panel discussions with industry leaders, executives from brokers, carriers and modelling firms got together to discuss the issue.

“The loss history and the historical fire footprints are not necessarily applicable.”
Tom Larsen, CoreLogic

Wildfire is different

Climate change and growth in properties in the impacted areas mean that accurately modelling these risks has become markedly more challenging in recent years, says Tom Larsen, senior director at risk modelling firm CoreLogic.

Despite advances in technology which enable more accurate modelling of aspects such as how much fuel is available to a developing incident in forests and surrounding materials, the risk still suffers from a chronic lack of data which makes it “enigmatic” to the re/insurance industry, he said.

“Reinsurers in general have become very good at developing acceptability practices and validating that the model they intend to use is appropriate. They can adjust it, there’s a whole sort of universe around that,” he said.

“Wildfire is a bit different. Because it’s particularly what I would call an enigmatic peril: the loss history and the historical fire footprints are not necessarily applicable.”

Part of that is due to changes in the landscape—for example, housing developments or business centres standing where previously there were meadows—but it is also due to the concentration of damage compared with a risk such as hurricanes, which cause more widespread destruction but also provide more scalable data.

“We need to look forward with this issue, and think about how we can decrease that intensity.”
Nidia Martinez, Willis Towers Watson

“When you are trying to validate these models it takes a lot more process, and we are working with our clients to help them understand and make sure that they have the right tools to make sure that they can validate, look at the loss history and know the model is appropriate,” Larsen explained.

“It’s not just a matter of the existence of the model—there have been models out for a long time. It’s how you can get one that you can embed in your process and have the confidence to steer your business with.”

Nidia Martinez, director of climate risk analytics at Climate Resiliency Hub, Willis Towers Watson, echoed Larsen’s assessment that attempting to use historical data to model the risk would not yield any usable results.

Given the changes to the environment over the past few decades—prior to 2017, the last two major wildfire loss events in California were in 1997 and 2001—Martinez said the market cannot rely on its previous assumptions to price the risk in future.

“Looking at the past and the way it’s been managed, I don’t think that will be our reality ever again. We need to look forward with this issue, and think about how we can decrease that intensity,” she said.

“The next wind-driven fire won’t happen in the exact same location and therefore will look different.”
Kevin Stein, Delus

A dearth of data

Kevin Stein, co-founder and CEO of Delus, a managing general agency specialising in home insurance and wildfire risk, agrees that the market suffers from a severe shortage of data with which it can begin to model and price the risks more accurately.

“The difficulty when it comes from a modelling sense for all of us is that it is a data-starved problem,” he said.

“We don’t have many events to train our models, so we have to take some insights and make some educated understandings to be able to predict where the next fires can happen, because the next wind-driven fire won’t happen in the exact same location and therefore will look different.”

However, Stein added, complications from climate change and shifts in the underlying market are also making it a particularly complex area to predict accurately.

“There’s been an extension of the fire season. And there has been an increase in intensity in the wet part of the season and the hot part of the season, causing lots of vegetation to have grown during the rainy season,” he said.

“People are noticing there’s a tendency to have fires towards the end of the hot season here in California.

“That’s because we’re starting to have lots of vegetation that is dried out and 100°F temperatures, often through six to eight or nine months in. That’s one of the biggest difficulties when it comes to controlling these fires that are happening here.”


To view the full Re/Insurance Lounge session, click here


Main image: Shutterstock / Toa55