Machine learning, AI, cognitive computing and predictive analytics—terms that are often used interchangeably these days—are washing over the insurance industry with promises of increased profits and savings from every corner of the enterprise. They are, however, not the same.
The concept of machine learning has progressed past the bleeding edge of innovation into the mainstream, and is now seen as a differentiator for competitors in the insurance space. Because of that, navigating the accompanying marketing noise, the promises from niche vendors pushing canned use cases, and the pressure to invest in and explore these concepts can lead to fear of, or fatigue with, advanced data analytics. And that's only good for your competition.
Over the last year at Atlas General Insurance Services, we capitalized on our initial investments in data tools and resources. From the outset, our data strategy focused on strengthening our foundation and raising awareness of the data available to us. Centralizing, cleansing and democratizing data across the enterprise is allowing us to lay the groundwork for the predictive analytics goals we set for ourselves at the start of this endeavor.
Navigating the machine learning landscape in search of a solution, we looked at vendors of every stripe to see who best fit our needs. What we found were solutions that were either too limited in scope or authenticity, or that required us to give up IP or data in exchange for the predictive models being offered. While some are more affordable than others, we found that in general you get what you pay for. Beware the shortcuts. This is a space where the machine learning strategy you deliver will be shaped as much by the questions you want answered as by the people you have working on it.
For example, investing in one-off models (costing $15,000-$50,000) may be appropriate for a company looking simply to check a box and say it has predictive models. Or it may make sense for another company to invest in a custom model built from a data repository shared with peers ($150,000-$300,000). Both approaches allow companies that lack statisticians or data scientists entry into the realm of machine learning, each with different costs, risks and results.
There is no one-size-fits-all solution, and your needs will change over time.
At Atlas, we decided that ownership and management of our data, machine learning and predictive analytics was something we wanted to retain control over. We invested in resources, and started down the path of answering simple questions first. Among them: Are we using our underwriters' time wisely? Do we know which types of accounts we tend to bind at the price we want? Can we improve the operational handling of our claims? Do we know which submissions we don't want to waste our time on?
But even with our team of budding data scientists, we had to juggle all the other work that comes with maintaining data across an insurance company, including transitioning to Domo as our new BI/analytics platform. Because of that, our model development cycle of preparing, building, validating, testing, tuning and generating the answers to these questions was taking upwards of one to three months per model. That's a decent time frame in the eyes of any IT department, but as a services company we wanted to be able to match the speed of our business—or move faster. I never had expectations of moving faster than the data science norm, at least not without a level of investment that would have killed our ROI. Then things changed, as they do from time to time.
After attending an actuarial conference earlier in the year, we couldn't stop talking about a data science tool that was demoed there. The vendor was DataRobot, and while they have competitors (whom we later vetted), DataRobot's concept of applying machine learning to the model selection process (and to data science in general) ended up being the right fit for where Atlas is today.
Atlas General Insurance Services is not affiliated with DataRobot or Domo; the MGA is a customer of both.
DataRobot allows us to score and compare dozens of algorithms on our data sets, and ultimately delivers suggested models that we can compare, validate, test, tune and integrate with our systems. Instead of this process taking a statistical resource several months to come up with a solid working model, it now takes 30-60 minutes to get a list of two dozen proposed models for the team to work with and tune. We are now capable of pushing out a working model in a week (not counting the data pre-work). That dramatic increase in efficiency for our data science team positions us to expand both our backlog of work and our focus on how and where we can effect change. It was a game changer for us.
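To make the idea of automated model comparison concrete, here is a minimal sketch of the general pattern: scoring several candidate algorithms on the same data set and ranking them. It uses generic open-source tooling rather than DataRobot's actual API, and the file name, feature columns and "bound" target are hypothetical placeholders for a quote-to-bind style question.

```python
# Illustrative sketch only: a generic comparison of candidate algorithms via
# cross-validation. This is NOT DataRobot's API or Atlas's actual workflow;
# "submissions.csv" and the "bound" target column are hypothetical.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Load a cleaned, centralized extract of historical submissions (assumes
# numeric feature columns plus a 0/1 "bound" outcome).
data = pd.read_csv("submissions.csv")
X = data.drop(columns=["bound"])   # predictor features
y = data["bound"]                  # 1 if the submission bound, 0 otherwise

# Candidate algorithms to score side by side on the same data.
candidates = {
    "logistic_regression": make_pipeline(StandardScaler(),
                                         LogisticRegression(max_iter=1000)),
    "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Rank candidates by mean cross-validated AUC.
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {scores.mean():.3f} (+/- {scores.std():.3f})")
```

In practice, the top-ranked candidates would still go through the same validation, tuning and testing steps described above before anything reaches production.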
Internally, we continue to prioritize predictions on questions fundamental to our business: pricing, underwriting, claims handling, fraud detection, customer engagement, operations, etc. We continue to train our models in close partnership with our internal subject matter experts, and leverage a culture of striving to be the best at what we offer. And because we now have the ability to move from idea to answer in days instead of months, it opens a new paradigm for Atlas to bring additional value to our customers as an MGA. Not only are our internal stakeholders more involved (because we're able to move in lockstep with them at their speed), but Atlas is now positioned to expand on the level of customer service and support we already offer using advanced analytics.
MGAs are strategically placed to provide insight and efficiencies to their carrier and reinsurance partners, as well as their agent and broker partners, because they are providers of services. Data is at the center of every decision we make and every service we provide. Failing to offer additional value from this critical resource will not only become a risk for service companies but also a litmus test for anyone looking for a trusted partner. Adding services around data analytics, and improving or augmenting existing services with data analytics, becomes a natural fit for MGAs that have mature data management and analytics solutions.
These efficiencies and improvements are always underscored by changing market conditions or other disruptions. And just as others rightly predict that these technologies will become mandatory for weathering those changes and competing in the marketplace, companies must have a data strategy that aligns with their strategic vision and roadmap. Only then can they understand what level of investment is appropriate, and when. This isn't a sprint to the finish line, but the race has started and the stakes are significant in the insurance space.
Whether you find good partners, or invest in internal resources (or both), the most important piece is having buy-in and understanding from the top, followed by clear goals. You can start small (and should), but make sure your organization understands where the road leads, and why you’re heading down it.
As featured on Carrier Management
Jesse Mauser, Atlas General Services
Jesse Mauser is Vice President of Data Analytics at Atlas General Insurance Services. He has nearly 20 years of experience in business intelligence and analytics across a host of industries, including insurance. An established thought leader in emerging data management paradigms, he has helped organizations manage their information systems and develop and align data strategies, and he is a tireless promoter of a data-first culture that brings accountability for data back into the hands of the business user.