Consumers have a love/hate relationship with sharing their location ― more precisely, sharing their geolocation.
With services such as Facebook and Twitter, location sharing is “almost” a default option, whereas sites such as PleaseRobMe.com warn about the dangers of over-sharing our location information.
Setting aside where consumer interest falls on the sliding scale of “sharing too much,” the relevance and importance of consumer geolocation data to insurers is undeniable.
The proliferation of smartphones and GPS-enabled devices has driven down the cost of reliable geolocation data. Multiple industries have used geolocation data for a while now for purposes such as online advertising, content rights management, fraud detection, security analytics, content localization and mobile targeting.
However, insurers are coming to understand the value of geolocation data, and if you look hard enough, you can find an array of nascent, but potentially very powerful, uses:
- Usage-based insurance and telematics. This lets insurers build pricing models based on how their customers actually drive.
- Distance-based vehicle insurance. This allows insurers to price policies according to how much and where their customers drive.
- Natural disasters. Countless studies of claims and disaster response show how geographic information system (GIS) data reflects past and real-time risks such as wind, hurricanes, wildfires, earthquakes and hail.
To simplify geolocation data into an example: carriers looking to offer car insurance in Williamsburg, Brooklyn, N.Y., may rely on ZIP code information for guidance as to what their customers might find on their daily commute. However, ZIP code 11211 covers a big area, and using the entire ZIP code might actually limit what insurers can know about a neighborhood.
Certain industrial areas of Williamsburg face a massive challenge with cleaning up oil spills, which can have an obvious effect on safe driving. Knowing where these hotspots are allows a carrier to price more accurately for the risks it assumes.
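As a rough sketch of the idea ― the hotspot coordinates, radius and surcharge below are purely illustrative, not actual rating factors ― a hotspot check can be as simple as a great-circle distance test against the insured address:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical hotspot centroid and radius inside ZIP 11211
HOTSPOT = (40.7140, -73.9565)
HOTSPOT_RADIUS_MILES = 0.5

def risk_multiplier(lat, lon):
    """Apply a surcharge if the insured address falls inside the hotspot."""
    distance = haversine_miles(lat, lon, *HOTSPOT)
    return 1.25 if distance <= HOTSPOT_RADIUS_MILES else 1.0
```

This is the kind of block-level granularity a whole-ZIP-code view hides.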
According to William Raichle of Insurance Services Office Inc., hundreds of millions of dollars are at stake.
Geographical information systems can generate a lot of data. The challenge is being able to process it so that the costs don't outweigh the benefits. (Photo: iStock)
Using GIS data
These issues manifest in two ways.
First, when an underwriter is evaluating a case, they need good, dependable GIS information ― but that’s not always easy to come by. Insurance is one of the few industries that relies heavily on regional or geographical boundaries.
Underwriters rely on being able to run risk calculations for specific geographic areas ― their pricing, risk and revenue depend on it. Underwriting a property that is a block away from the beach vs. five blocks away? Of course the premium is going to be different. Drive more than three miles to work each day? Of course the insurance risk profile is going to be different.
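To make the point concrete ― the distance bands and factors below are invented for illustration, not actuarial figures ― a distance-based rating rule might look like:

```python
def premium_factor(distance_to_coast_miles, commute_miles):
    """Toy rating rule: multiply band factors for coastal proximity
    and commute length. All numbers are illustrative only."""
    if distance_to_coast_miles < 0.25:      # roughly a block from the beach
        coast = 1.40
    elif distance_to_coast_miles < 1.0:     # a few blocks inland
        coast = 1.15
    else:
        coast = 1.0
    commute = 1.10 if commute_miles > 3 else 1.0  # the three-mile commute test
    return coast * commute
```

The actual factors come from the carrier's risk/loss models; the GIS layer's job is to supply the distances reliably.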
An underwriter is most certainly going to verify Public Protection Classification code territorial assignments, tax district, coverage zone and so on. They can either point and click on maps, or use automated risk/loss-modeling software to obtain this data on demand, in real time.
To be most effective, underwriters should use multiple data sources in addition to a comprehensive GIS database ― these include risk/hazard databases, natural disaster databases, crime databases, and others.
Second, once a policy has been underwritten and the property insured, it is business as usual. However, this is the point at which geolocation data actually has to be monitored. That means the insurer needs scalable platforms in place to ingest geolocation information, store that volume of data in real time, run calculations and algorithms against the live stream, and run predictive analytics on historical geolocation data that has been enriched by risk/loss modelers.
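A production platform would use a distributed stream processor, but the core idea ― keep a sliding window of updates per device and compute aggregates on it ― can be sketched in a few lines (the class name and window size here are hypothetical):

```python
import time
from collections import defaultdict, deque

class LocationStream:
    """Minimal in-memory sketch: retain a sliding time window of
    location updates per device and answer simple real-time queries."""

    def __init__(self, window_seconds=300):
        self.window = window_seconds
        # device_id -> deque of (timestamp, lat, lon), oldest first
        self.updates = defaultdict(deque)

    def ingest(self, device_id, lat, lon, ts=None):
        """Record an update and evict anything older than the window."""
        ts = time.time() if ts is None else ts
        q = self.updates[device_id]
        q.append((ts, lat, lon))
        cutoff = ts - self.window
        while q and q[0][0] < cutoff:
            q.popleft()

    def rate(self, device_id):
        """Number of updates from this device inside the current window."""
        return len(self.updates[device_id])
```

The hard part at scale is exactly what this sketch glosses over: millions of devices, durable storage of every update, and analytics that keep up with the stream.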
Again, getting geolocation data may be relatively easy, but the challenge lies in actually processing the volume of data GIS can generate. Executed poorly, the costs associated with storing and processing that data can outweigh the benefits.
Picking software that meets these needs can be a challenge. Insurers should identify an approach that addresses the following questions:
- Do you need to store a large amount of GIS data?
- Do you need compare/contrast capabilities?
- Do you need the most frequently queried data held in memory, with the rest sitting on disk waiting to be retrieved?
- Do you need spatial functions? Do you need to support all WKT Geometries?
- Do you need true geodetic support? Spatial indices?
- Do you need to stream location updates from millions of devices 24/7?
- Can you afford to lose location updates or do you have to store each and every location update from every single device?
- Do you need to run real-time analytics on your location updates?
- Do you need to run geotargeting on your location updates?
- Do you need to attach actionable intelligence to your location updates ― whether those updates are targeted at mobile devices, third-party platforms/systems or both?
- Do you have or need context awareness with this data? Do you need to calculate context awareness based on the location updates?
- Will there be even more requirements that your particular use case demands?
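Several of these questions ― spatial functions, WKT geometries, geotargeting ― ultimately come down to containment tests. As a minimal sketch (the coordinates below are made up, and a real system would use a spatial library with true geodetic support and spatial indices rather than this planar approximation), a point-in-polygon test looks like:

```python
def point_in_polygon(lon, lat, ring):
    """Ray-casting test: is (lon, lat) inside the polygon ring?
    ring is a list of (lon, lat) vertices. Planar approximation,
    which is adequate at city scale but not geodetically exact."""
    inside = False
    n = len(ring)
    j = n - 1
    for i in range(n):
        xi, yi = ring[i]
        xj, yj = ring[j]
        # Does edge (j, i) cross the horizontal ray at this latitude?
        if (yi > lat) != (yj > lat):
            x_cross = xi + (lat - yi) * (xj - xi) / (yj - yi)
            if lon < x_cross:
                inside = not inside
        j = i
    return inside
```

Evaluating a test like this against millions of streaming location updates is where the platform questions above start to bite.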
Carriers must complete due diligence on each of these topics before taking the plunge. Those that address these questions thoroughly will ultimately end up with successful, and profitable, GIS initiatives.
Ronnie Guha is the founder and CEO of New York City-based Nisos Technologies.