It should be no surprise that data has become one of the most valuable business assets. It comes in many formats, is critical to how businesses run, and is also one of the greatest security liabilities. Yet the quality of most data is poor. Forrester, a market research company, has documented that less than 0.5% of all collected data is ever analyzed and used (source), despite its enormous potential. There are several reasons for this shortcoming: lack of visibility, lack of context, and delayed access are all factors, and poor quality also plays a role. What is the cost of poor quality data? More importantly, how does that cost impact the Title industry?
How Much Data is Being Generated Today?
According to research published by Statista, the amount of global data generated in 2020 was about 64 zettabytes (source). To put this in context, 1 terabyte is equal to 1,000 gigabytes, and 1 zettabyte is equal to 1 billion terabytes. A zettabyte is equivalent to 1,000,000,000,000,000,000,000 (10²¹) bytes. That is a lot of data!
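As a quick sanity check, the unit conversions above can be verified with a few lines of arithmetic (using decimal SI units):

```python
# Decimal SI storage units.
GIGABYTE = 10**9               # bytes
TERABYTE = 1_000 * GIGABYTE    # 1 TB = 1,000 GB
ZETTABYTE = 10**21             # bytes

# 1 zettabyte should equal 1 billion terabytes.
print(ZETTABYTE // TERABYTE)        # prints 1000000000

# 2020's estimated 64 ZB of global data, expressed in terabytes.
print(64 * ZETTABYTE // TERABYTE)   # prints 64000000000
```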
By 2025, the amount of global data is expected to exceed 180 zettabytes, nearly triple the amount that existed in 2020 and a sizable jump from prior estimates. Statista attributes the accelerated growth to the COVID-19 pandemic: more people worked and learned from home, which created more demand for home entertainment, and the shift to remote working moved a greater volume of communications and document sharing online.
Of all the data being generated, only about two percent is saved (source). Even at that low percentage, the storage requirement is substantial. In 2020, the installed base of storage capacity reached 6.7 zettabytes, and future storage requirements are forecast to increase at a compound annual growth rate of 19 percent from 2020 to 2025 (source).
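To see what a 19 percent compound annual growth rate implies, here is a small illustrative projection from the 2020 installed base (the yearly figures are derived from the forecast, not published data points):

```python
# Project installed storage capacity from a 2020 base of 6.7 ZB,
# growing at a 19% compound annual growth rate (CAGR).
base_2020 = 6.7   # zettabytes
cagr = 0.19

for year in range(2020, 2026):
    capacity = base_2020 * (1 + cagr) ** (year - 2020)
    print(f"{year}: {capacity:.1f} ZB")
# Final line prints: 2025: 16.0 ZB
```

In other words, a 19 percent CAGR more than doubles storage capacity over five years.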
There is a cost implication to the ever-expanding volume of data being generated. To start, there is a cost to store, use, and secure this data. Companies operating in regulated industries face additional risk if data is not protected, yet another cost associated with the explosion of digital data.
In an environment where storage requirements are increasing by nearly 20 percent a year, poor quality data adds unnecessary cost. This cost is amplified by the industry’s move to cloud-based applications: the more data stored in the cloud, the more those applications cost to use.
What is the Impact of Poor Quality Data on Business Decisions?
A few years ago, IBM studied the cost of poor quality data and found that it drains $3.1 trillion from the U.S. economy annually (source). The calculation was based on several factors; a major one was that 1 in 3 business leaders do not trust their own data. Companies spend heavily to harvest data and then fail to act on it, wasting time, resources, and money.
Anyone in a sales or marketing role who has managed an email database knows how quickly information gets out of date. Account-Based Marketing strategies are equally challenging given how often individuals change roles or companies. Title companies often invest in data collection strategies but don’t follow up with cleansing processes to ensure accuracy.
Forrester conducted its own research, which also confirmed that one of the biggest costs is a lack of trust in data when making business decisions. Forrester estimates that if a typical Fortune 1000 business were able to increase data accessibility or quality by just 10%, it would generate more than $65 million in additional net income every year (source).
This logic also applies to the Title industry. Organizations investing heavily in digitalizing workflows and automating business processes are wasting money if the data can’t be trusted.
What is the ROI of High Quality Data?
Every company has a sales and marketing department tasked with understanding the value proposition, identifying the best go-to-market strategy, and evaluating sales opportunities. High quality data can deliver a strong return on investment (ROI) for initiatives focused on improving sales and marketing performance. Here are three examples (source):
- More Effective Budget Allocation – by investing marketing budgets in campaigns validated to outperform others, a greater ROI is achieved, generating more leads and more closed business.
- Finding the Best Customers – by understanding the lifetime value of a customer, this data can be used in retention campaigns that deliver both short- and long-term value.
- Understanding Your Value Proposition – with accurate data on how much it costs to perform every business process, it is possible to identify inefficiencies and even the potential for future price increases to capture higher, unrealized value.
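The arithmetic behind these examples can be sketched in a few lines. The figures and formulas below are illustrative assumptions, not data from the studies cited above, and real lifetime value models are considerably more nuanced:

```python
def lifetime_value(annual_revenue, margin, retention_years):
    """Simplified customer lifetime value (LTV):
    profit per year times expected years retained."""
    return annual_revenue * margin * retention_years

def campaign_roi(revenue_generated, campaign_cost):
    """ROI expressed as net return divided by cost."""
    return (revenue_generated - campaign_cost) / campaign_cost

# Hypothetical customer: $5,000/year in revenue, 30% margin, retained 4 years.
ltv = lifetime_value(annual_revenue=5_000, margin=0.30, retention_years=4)
print(f"Estimated LTV: ${ltv:,.0f}")    # prints Estimated LTV: $6,000

# Hypothetical campaign: $50,000 spent, $150,000 in attributed revenue.
roi = campaign_roi(revenue_generated=150_000, campaign_cost=50_000)
print(f"Campaign ROI: {roi:.0%}")       # prints Campaign ROI: 200%
```

Even a rough model like this makes it obvious why accuracy matters: if the underlying revenue or retention data is wrong, every downstream budget decision is wrong too.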
What Can Title Industry Companies Do to Avoid the Cost of Poor Quality Data?
Here are five steps Title industry and insurance companies can take to improve the quality of their data. These initiatives build trust that the data is accurate, and with that trust, organizations can place greater reliance on data-driven decision support.
- Avoid manual data entry processes – take any opportunity to automate data capture, integrate with existing data repositories, or invest in high quality data extraction processes to reduce or avoid manual data entries.
- Remove paper-based processes – invest in online data capture, loan application, or digital title policy review processes.
- Automate more processes – automation works well when based on accurate information. The cost of poor quality data will quickly become apparent if automated processes spread bad data across the organization.
- Improve how business systems are integrated across the enterprise – ensure every enterprise application is integrated to share data seamlessly, without additional quality review steps that slow responsiveness and reduce operational agility.
- Invest in new workflows – these embedded processes can improve how data is captured, classified, and extracted into enterprise applications.
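To make the idea of a data-quality safeguard concrete, here is a minimal, hypothetical sketch of the kind of automated validation the steps above describe: checking records before they flow into downstream systems. The field names and rules are illustrative assumptions, not from any specific Title industry system:

```python
import re

# A deliberately simple email pattern for illustration only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_record(record):
    """Return a list of data-quality problems found in one record."""
    problems = []
    if not record.get("name", "").strip():
        problems.append("missing name")
    if not EMAIL_RE.match(record.get("email", "")):
        problems.append("invalid email")
    if not record.get("policy_id", "").strip():
        problems.append("missing policy_id")
    return problems

# A clean record passes with no problems reported.
record = {"name": "Jane Doe", "email": "jane@example.com", "policy_id": "TP-1001"}
print(validate_record(record))   # prints []
```

Checks like this are cheap to run at the point of capture and stop bad data before automation spreads it across the organization.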
Low quality data has a big impact on operational performance. When data isn’t trusted, it won’t be acted on to complete a process. This situation occurs across every industry, including the Title industry. Ensure your digital transformation programs capture all potential upside by including data quality safeguards as part of your overall program strategy.