DATA

People or processors? Unlocking the secrets of risk and data

Is data really the key to better exposure management for re/insurers? That was the subject of intense debate in a panel discussion featuring senior leaders from several industry players, chaired by Tom Anderson of AdvantageGo.


The rise in losses of $1 billion and more in recent years is a concern for insurers. The trend continued unabated in 2020, when the US alone saw 22 weather and climate-related events each causing losses exceeding $1 billion. How can insurers, reinsurers and investors manage that risk, and what role will technology play?

That was the subject of a discussion in February hosted by Intelligent Insurer and led by Tom Anderson, director of sales for the US at commercial re/insurance software specialist AdvantageGo. He was joined by a group of panellists from across the industry, each with decades of experience:

  • Conor McMenamin, senior vice president and head of risk and underwriting integration at Bermuda-headquartered re/insurance provider RenaissanceRe;
  • John Huff, former president of US regulator the National Association of Insurance Commissioners and CEO of industry lobby group the Association of Bermuda Insurers and Reinsurers;
  • Matt Belk, partner, managing director and chief actuary of ILS Capital Management, an SEC-registered investment firm specialising in insurance-linked securities;
  • Nick Eromin, managing director and senior technology officer of Hudson Structured Capital Management, the alternative investments asset manager focused on the re/insurance and transportation sectors; and
  • Janice Englesbe, chief risk officer of Bermuda-based public limited liability company Arch Capital Group, which writes insurance, reinsurance and mortgage insurance on a worldwide basis.

“You still have to be practical and realistic in terms of your evaluations of risk.”
Janice Englesbe, Arch

A revolution in computing

There was no dispute on the panel that progress in technology has been profound, both in the ability to capture data and in the capacity to process it.

The ability to model risk today is unrecognisable compared to when many of the panellists started in the industry. Englesbe said: “When I first started you’d run 2,000 scenarios over 24 hours and then see what you got. Now you can do that in about 12 seconds.”

Moreover, while capabilities have increased, access to such processing capacity is much wider. The barriers to entry have almost entirely crumbled with the advent of the cloud, she added.

“As recently as five years ago, if you wanted some serious computing power, you were talking about a seven-figure investment in servers, server rooms and staff to support all that infrastructure,” said Eromin.

“Today you just log into a portal on your cloud service and dial up the resources you need. You can perform all the analytics you need quickly, without ever having to make that big, upfront capital outlay.”
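
To make the scale of that shift concrete, the kind of scenario run Englesbe describes can be sketched in a few lines of modern vectorised code. This is a minimal illustration only: the frequency and severity assumptions below are invented, not figures from any panellist's models.

```python
import numpy as np

# Purely illustrative catastrophe scenario run; the Poisson frequency
# and lognormal severity parameters are invented for the example.
rng = np.random.default_rng(seed=42)

n_scenarios = 2_000                                    # the run size Englesbe recalls
event_counts = rng.poisson(lam=3.0, size=n_scenarios)  # events per simulated year
annual_losses = np.array([
    rng.lognormal(mean=17.0, sigma=1.5, size=n).sum()  # total loss per scenario ($)
    for n in event_counts
])

print(f"Mean annual loss: ${annual_losses.mean():,.0f}")
print(f"1-in-100 year loss (99th percentile): ${np.percentile(annual_losses, 99):,.0f}")
```

On commodity cloud hardware, a run of this size finishes in well under a second, which is exactly the shift from overnight batches to near-interactive analysis the panellists describe.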

The technology in markets such as Bermuda has not just improved the industry’s analytical capabilities: it was instrumental in providing the flexibility the industry needed to respond to the COVID-19 pandemic and keep operating through the crisis, according to Huff.

“The investment in technology paid off for the entire Bermuda international market. Literally overnight all the companies went home but more importantly, the regulator was able to do the same,” he explained.

“The reality is that nothing is seamless, and there’s always a lot of human points along the way.”
Nick Eromin, Hudson Structured Capital Management

Needles and haystacks

There are, however, limitations. As Anderson pointed out, despite the massive amounts of data that can now be captured, processed and analysed, many still feel they do not have the insights they need. A survey commissioned by AdvantageGo last year found that 67 percent of respondents felt they didn’t have all the necessary data from an underwriting perspective.

There’s no shortage of data, but sometimes a shortage of insight, said Anderson.

“There’s so much data in the world today; I’m looking for that needle in the haystack. I’m looking not just to capture the data, but to make it valuable and insightful.”

The volume of data can lead to overconfidence, added Englesbe.

“With this greater ability to measure things better, you can’t forget about parameter risk. That can magnify negative outcomes greatly, so you still have to be practical and realistic in terms of your evaluations of risk.”
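
Englesbe’s point about parameter risk can be made concrete with a simple, hypothetical comparison: the same loss model run once with its severity parameter assumed to be known exactly, and once with that parameter itself treated as uncertain. All numbers are invented for illustration.

```python
import numpy as np

# Toy example of parameter risk: uncertainty in the severity parameter
# widens the tail of the loss distribution. All figures are illustrative.
rng = np.random.default_rng(seed=7)
n = 100_000

# Case 1: the severity parameter (sigma) is assumed known exactly.
fixed = rng.lognormal(mean=15.0, sigma=1.0, size=n)

# Case 2: sigma is only an estimate, so draw it from a plausible range
# before drawing each loss.
sigmas = rng.uniform(0.8, 1.4, size=n)
uncertain = rng.lognormal(mean=15.0, sigma=sigmas, size=n)

for label, losses in [("Known parameters", fixed), ("Parameter risk", uncertain)]:
    print(f"{label}: 99th percentile loss = ${np.percentile(losses, 99):,.0f}")
```

The tail estimate in the second case comes out materially higher: that is the magnification of negative outcomes Englesbe warns about.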

McMenamin agreed. “Quality, not quantity, is the key,” he said.

“Sometimes having more data isn’t necessarily that advantageous. It’s about the quality and reliability of the data you have and having tools that allow you to deal with it in a quick and effective manner.”

There are several barriers to achieving that, one of which is integration with existing systems. McMenamin said this is a key consideration when investing in solutions.

“The best solutions, perfectly tailored to their task, can result in information being siloed within the organisation so that its value isn’t realised,” he explained.

At the very least, this can significantly erode the technology’s ability to simplify the process, according to Eromin.

“Everything is supposed to be ‘seamless’ in the tech world. The reality is that nothing is seamless, and there’s always a lot of human points along the way that involve data entry, data checking, and data scrubbing,” he said.
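
As a rough illustration of the checking and scrubbing Eromin describes, consider the toy submission table below. The fields and rules are invented; any real schema, and the systems feeding it, would differ.

```python
import pandas as pd

# Hypothetical submission data with the kinds of defects that still
# need a human (or at least human-written rules) to catch.
submissions = pd.DataFrame({
    "policy_id": ["A1", "A2", "A2", "A3"],
    "tiv_usd":   [5_000_000, None, 12_000_000, -100],  # total insured value
    "region":    ["US-SE", "US-SE", "US-SE", "EU-W"],
})

# Basic hygiene before any analytics: drop duplicates, drop impossible
# values, and flag gaps for manual review rather than silently imputing.
clean = submissions.drop_duplicates(subset="policy_id", keep="first")
clean = clean[clean["tiv_usd"].isna() | (clean["tiv_usd"] > 0)]
needs_review = clean[clean["tiv_usd"].isna()]

print(clean)
print(f"{len(needs_review)} record(s) flagged for manual review")
```

Even a pipeline this simple embeds judgment calls, such as which duplicate to keep and what counts as an impossible value, which is why the human points along the way never quite disappear.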

“There’s a real opportunity for us as an industry to take our collaboration to a new level.”
Conor McMenamin, RenaissanceRe

People power

In part, some of the problems are likely to ease with time. Regulatory demand, if nothing else, will lead to better data standards that promote compatibility. That will aid businesses in both analytics and compliance, particularly if regulators can be persuaded to standardise among themselves, the panel agreed.

“Obviously, the fewer differences you can have in reporting requirements and what you are required to do in different jurisdictions, the easier it is for firms,” said Belk.

Even with the best standards, however, it’s impossible to escape the fact that people remain at the core of the underwriting process. Data analytics, said Belk, is a “comparative” rather than a “determining” tool: the underwriting process still starts, and ends, with the underwriters.

“The underwriters with all their experience are the first people to go through the information,” he said. “They know the people—they know the underwriters at the insurers. The quantitative data supplements rather than replaces their qualitative view.”

The panel also discussed the importance of remembering that insurance is still a people business after a year in which much human contact has been curtailed. But that’s not to diminish the technology—just as the return to meeting in person does not negate what the industry has achieved with remote working. Rather, it highlights the potential for greater things.

McMenamin concluded: “Given that we’ve been able to do this remotely, what are we going to be able to do when we all get together again?

“There’s a real opportunity for us as an industry to take our collaboration to a new level, and to find new ways to move the industry forward.”



Alex Field is client executive at AdvantageGo. He can be contacted at: alex.field@advantagego.com



