Predictive analytics have opened a world of possibilities in how marketing, underwriting, and claims management are executed and managed today. Driven by sophisticated artificial intelligence (AI) technologies such as machine learning, neural networks, and deep learning that mimic human thought, the field has acquired a nearly magical quality. This guide is intended to demystify the topic and bring it within operational reach of claims professionals.
Predictive analytics attempt to establish relationships among variables or characteristics in order to predict future outcomes. The process has long been hampered by limited access to data locked in legacy systems and by the massive effort involved in finding connections among a wide number of variables. These obstacles have largely been overcome: today's predictive analytics rely on sophisticated algorithms that identify patterns in large data sets to establish those relationships.
Used in the context of claims management, predictive analytics can segment or triage claims, prioritizing potentially high-cost claims early in the process for cost containment or fast-tracking low-cost claims for settlement. What had been a sometimes hit-or-miss and often labor-intensive claims identification process can now be much more data-driven and efficient with the use of predictive analytics.
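The triage described above can be sketched as a simple routing rule on a model's predicted cost. The thresholds and queue names below are illustrative assumptions, not part of any specific carrier's workflow:

```python
# Hypothetical triage rule: route claims by predicted cost.
# Dollar cutoffs are illustrative assumptions only.
HIGH_COST_THRESHOLD = 50_000  # assumed cutoff for early intervention
LOW_COST_THRESHOLD = 5_000    # assumed cutoff for fast-track settlement

def triage(predicted_cost: float) -> str:
    """Assign a new claim to a handling queue based on its predicted cost."""
    if predicted_cost >= HIGH_COST_THRESHOLD:
        return "early-intervention"   # senior adjuster, cost containment
    if predicted_cost <= LOW_COST_THRESHOLD:
        return "fast-track"           # streamlined settlement
    return "standard"                 # normal handling

print(triage(72_000))  # early-intervention
print(triage(3_200))   # fast-track
```

In practice the cutoffs would be set jointly by the modeling team and claims leadership, and revisited as the book of business changes.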
AI uses computers to mimic the human thought process to solve problems. Machine learning, a type of AI, allows computers to understand patterns in data and perform tasks. Often, the chosen task is to predict a future outcome, which is called a predictive model. A predictive model learns the relationships between input and output using historical data. In the case of claims management, a predictive model can learn the relationship between different features like body parts, attorney involvement, and/or location to predict an outcome such as the cost or severity of the claim. The models can also predict a variety of other outcomes like risk level, surgery, or attorney representation. These types of predictive insights enable adjusters to intervene throughout the claims life cycle.
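A minimal sketch of such a model, using scikit-learn and a made-up toy dataset, shows the idea of learning the relationship between claim features and severity. The features, values, and outcome here are illustrative assumptions, not real claims data:

```python
# Sketch: train a classifier on toy historical claims, then score a new one.
# All data and feature names below are invented for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Toy historical claims: inputs the article mentions (body part, attorney
# involvement, location) plus the known outcome (1 = high severity).
history = pd.DataFrame({
    "body_part":     ["back", "hand", "knee", "back", "hand", "knee"],
    "attorney":      [1, 0, 0, 1, 0, 1],
    "urban":         [1, 1, 0, 0, 1, 0],
    "high_severity": [1, 0, 0, 1, 0, 1],
})

X = pd.get_dummies(history[["body_part", "attorney", "urban"]])
y = history["high_severity"]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Score a new claim, aligned to the same feature layout as training.
new_claim = pd.get_dummies(
    pd.DataFrame({"body_part": ["back"], "attorney": [1], "urban": [1]})
).reindex(columns=X.columns, fill_value=0)
print(model.predict_proba(new_claim)[0, 1])  # probability of high severity
```

A production model would be trained on many thousands of closed claims and validated on held-out data; the mechanics, however, are the same.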
Because so much data in claims systems is unstructured text data, natural language processing (NLP) and deep learning are used to interpret this data. By using NLP, the computer can intelligently extract relevant information from long text data such as adjuster notes to feed downstream predictive models. Deep learning is a branch of machine learning, which attempts to mimic the behavior of the human brain. Deep learning excels on tasks that depend on very large data sets such as NLP, computer vision, and speech recognition, among others.
Claims departments have avoided exploring the application of AI over concerns about missing or incomplete information in their data. But with the use of advanced NLP techniques, information can be accurately extracted from the unstructured text to create high-quality structured data sets. For example, hard-to-access information like comorbidities, which have the potential to cause a claim’s costs to rapidly escalate, can be extracted using these algorithms.
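A highly simplified sketch of the comorbidity extraction described above can be built with keyword patterns. Real systems use trained NLP models rather than regexes, and the vocabulary below is an illustrative assumption, not a clinical list; the point is only to show unstructured note text becoming structured flags:

```python
# Simplified sketch: turn free-text adjuster notes into structured flags.
# The pattern vocabulary is illustrative, not clinically complete.
import re

COMORBIDITY_PATTERNS = {
    "diabetes": r"\bdiabet(?:es|ic)\b",
    "obesity": r"\bobes(?:e|ity)\b",
    "hypertension": r"\bhypertens(?:ion|ive)\b|\bhigh blood pressure\b",
}

def extract_comorbidities(note: str) -> dict:
    """Return a structured True/False flag per comorbidity found in the note."""
    text = note.lower()
    return {name: bool(re.search(pattern, text))
            for name, pattern in COMORBIDITY_PATTERNS.items()}

note = ("Claimant reports ongoing knee pain; history of Type 2 diabetes "
        "and high blood pressure noted in intake call.")
print(extract_comorbidities(note))
# {'diabetes': True, 'obesity': False, 'hypertension': True}
```

The resulting flags can then be fed into a downstream predictive model alongside the structured fields already in the claims system.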
A comprehensive knowledge of predictive modeling techniques and the claims life cycle is critical to the development of accurate models. Equally imperative is the support of claims professionals, actuaries, data engineers, and data scientists, who can work together to create and evaluate a predictive model and apply their combined expertise to improve claims performance.
AI effectively bridges the gap between noisy, messy data and predictive analytics in insurance.
Predictive modeling output is often an estimated probability, dollar amount, or score; the exact output varies with the objective and the stakeholder. Some models also surface the features that drive a given prediction, giving the user context. If a model is perceived as a black box, its output seems less credible to users and its predictive insights go underused.
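One way to avoid the black-box perception is to report, alongside the score, how much each feature contributed to it. The sketch below uses a simple linear scoring model with hypothetical weights (not fitted to real data) to show the idea:

```python
# Sketch: an interpretable score that reports per-feature contributions.
# Weights and the baseline are hypothetical, for illustration only.
ASSUMED_WEIGHTS = {
    "attorney_involved": 2.0,
    "back_injury": 1.5,
    "prior_claims": 0.8,
}
BASELINE = 1.0  # assumed base score for an average claim

def score_with_drivers(claim: dict) -> tuple:
    """Return a risk score plus each feature's contribution to it."""
    contributions = {f: w * claim.get(f, 0.0)
                     for f, w in ASSUMED_WEIGHTS.items()}
    return BASELINE + sum(contributions.values()), contributions

score, drivers = score_with_drivers(
    {"attorney_involved": 1, "back_injury": 1, "prior_claims": 2}
)
print(round(score, 1))                 # 6.1
print(max(drivers, key=drivers.get))   # attorney_involved -- top driver
```

Showing the adjuster that, say, attorney involvement is the top driver of a high score gives the prediction the context the paragraph above calls for.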
There are a wide variety of uses for predictive modeling in claims departments. The list below is therefore not exhaustive but provides several key applications.
Predictive analytics deliver tangible cost savings along with broader operational benefits.
Predictive models look for patterns by comparing characteristics of outstanding claims with those of closed claims. An underlying requirement in the model’s development is the need for a robust database that spans multiple economic and insurance cycles and is appropriately balanced with respect to product lines and the age of the claims used in the model. While AI can overcome many shortcomings in a carrier’s claims data, the data used by the modeler to develop the predictive analytic platform needs to have integrity.
Incorporating demographic, meteorological, or other external data can improve a predictive model's performance in some product lines. For a line like workers' compensation, however, the improvement is minimal: the characteristics of a claim, such as what type of treatment the claimant received or is scheduled to receive, are more germane to predicting its outcome than, for example, the location of the claimant's residence.
A predictive model’s accuracy is important, but it is only one measure of a model’s value. A model that has been overfit by tightly fitting it to historical data can produce highly accurate results at first, but as new and different claims data becomes part of the database, its accuracy typically erodes over time. A more useful way of assessing performance is to track the model after implementation against predetermined key performance indicators (KPIs). A good model will typically improve over time as the carrier’s claims data collection process improves and information from "false-negative" outlying high-cost claims is fed back into the model.
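The post-implementation tracking described above can be sketched as a simple check of the model's recent hit rate against a predetermined KPI threshold. The threshold and the flag data are illustrative assumptions:

```python
# Sketch: monitor a deployed model against a KPI threshold.
# The 0.75 minimum hit rate is an assumed KPI, for illustration only.
KPI_MIN_HIT_RATE = 0.75  # assumed minimum share of correct severity calls

def kpi_alert(predictions: list, actuals: list) -> bool:
    """True if the model's hit rate on recent closed claims falls below the KPI."""
    hits = sum(p == a for p, a in zip(predictions, actuals))
    return hits / len(actuals) < KPI_MIN_HIT_RATE

# Recent closed claims: predicted vs. actual high-severity flags (toy data).
preds   = [True, False, True, True, False, False, True, False]
actuals = [True, False, False, True, False, True, True, False]
print(kpi_alert(preds, actuals))  # False -- hit rate 6/8 = 0.75 meets the KPI
```

In practice the same loop would cover several KPIs (settlement time, leakage, referral rates) on a rolling window, with an alert triggering model review and retraining.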
It is also important to consider the level of support that is provided during and after implementation. Access to claims professionals who were part of the development team can provide advice with a model’s deployment and the best ways to use and track the model’s output.
Developing in-house the caliber of AI expertise needed to build a predictive model can tax the resources of even the largest carriers. New algorithms and hardware emerge continually, and keeping pace with these advances while maintaining a competitive edge in a changing risk market can be a tricky balance for even the most agile carrier.
Predictive analytics have steadily become an integral part of many carriers’ operating toolbox. Once suitable for only the largest carriers, they have made their way into claims operations where one person may wear many hats. While predictive analytics will never replace the expertise that claims adjusters bring to the settlement process, they have become a driving force in carriers’ efforts to reach new levels of efficiency and competitiveness.
Milliman’s Nodal is a predictive model for early claims intervention and cost reduction.
Nodal uses advanced AI technologies to identify high-cost and low-cost claims soon after reporting, allowing for efficient triage of claims and allocation of resources that maximize the use of staffing and cost containment strategies. Milliman’s team of actuaries, claims professionals, and data engineers has developed an end-to-end solution that is fully supported through implementation, deployment, and assessment.