Digital Transformation? Examine Your Data Early in the Process.

 In Price Digests Blog

In commercial insurance, finance, and other industries where commercial vehicle values are essential to business decisions, the road to machine learning and artificial intelligence is being driven by consumer demand for faster responses and more online self-service. Often, the carrier or agent with the fastest response wins the business. That means speeding up replies by having machines calculate values in milliseconds, pushing basic insurtech and fintech projects into the realm of AI, where decisions on policies and loans are made using data, business rules, and machine learning. This is not always easy when values fluctuate, markets move, and demand ebbs and flows, so making sure your data source is accurate, complete, and timely is essential to success.

The right data, cleaned and processed to meet digital transformation challenges, is often cited as one of the most important building blocks for any insurtech, fintech, or AI decision-making system. Here’s a quick take on three things to examine closely when you are qualifying data sets.


Data Quality and Reliability

Proprietary data should be properly vetted to ensure it has appropriate levels of quality control in place, and external data should be sourced from reliable vendors. It is important for data coming from third parties to be unbiased, since any bias in the dataset will skew future decision-making. It is also important that the data is, and will remain, consistent. Disruptions, delays, or discontinuations of datasets happen, and important sources should be bulletproof.


Data Breadth

When considering what data to gather, either internally or from third-party sources, it is important not to limit the types of data too narrowly based on past models or gut feelings. AI looks for patterns in data to predict an outcome, so limiting the scope of data may mean you are missing out on a previously unrealized predictor.


Data Frequency and Timeliness

Data can be generated and delivered on a variety of frequencies. Generally speaking, high-frequency data refers to data that is available more often than monthly, perhaps in real time or for the previous day. Many data sources are updated less often than that, though, and may be available weekly, monthly, or even less frequently. As an example, the U.S. Census Bureau conducts the national census once per decade. Often, the timeliness of the data must be balanced against its accuracy and completeness.

A new white paper written by Price Digests analysts delves into the digital transformation for insurance carriers. To read the full white paper, including examples for determining accurate commercial truck values, download “Data and Insurtech: Fueling the Journey to Artificial Intelligence.”

About Price Digests

Since 1911, Price Digests has served the vehicle data needs of the insurance, finance, government, and dealer markets through its portfolio of VIN decoding, specifications, and market value data solutions for the commercial truck, passenger vehicle, marine, powersport, and recreational vehicle asset classes. Our data and intelligence solutions pave roads to faster and better decisions with perfect-fit data delivery, whether through seamlessly integrated APIs, online subscriptions, or custom data delivery.
