Solvency II and Data Quality

OCDQ Radio is a vendor-neutral podcast about data quality and its related disciplines, produced and hosted by Jim Harris.

During this episode, Ken O’Connor and I discuss the Solvency II standards for data quality, and how this European insurance regulation’s requirement for “complete, appropriate, and accurate” data represents a common-sense standard for all businesses.
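
To make those three criteria concrete, here is a minimal sketch in Python of what such checks might look like. The field names, domain values, and reference data are illustrative assumptions on my part, not anything prescribed by the Solvency II text:

    # A minimal sketch of "complete, appropriate, and accurate" checks against a
    # hypothetical insurance policy record. The field names, domain rules, and
    # reference data are illustrative assumptions, not prescribed by Solvency II.

    REQUIRED_FIELDS = {"policy_id", "premium", "start_date", "country"}
    EXPECTED_COUNTRIES = {"IE", "DE", "FR", "ES", "IT"}  # illustrative subset

    def check_complete(record: dict) -> list:
        """Complete: every required field is present and non-empty."""
        return [f for f in REQUIRED_FIELDS if not record.get(f)]

    def check_appropriate(record: dict) -> list:
        """Appropriate: values fall within expected business domains."""
        issues = []
        if record.get("country") not in EXPECTED_COUNTRIES:
            issues.append("country outside expected domain")
        premium = record.get("premium")
        if not isinstance(premium, (int, float)) or premium <= 0:
            issues.append("premium must be a positive number")
        return issues

    def check_accurate(record: dict, source_of_truth: dict) -> list:
        """Accurate: values agree with an independent reference source."""
        return [f for f, v in source_of_truth.items() if record.get(f) != v]

A record failing the first two checks can be quarantined immediately, while the accuracy check only works when an independent source of truth exists to compare against.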

Ken O’Connor is an independent data consultant with over 30 years of hands-on experience in the field, specializing in helping organizations meet the data quality management challenges presented by data-intensive programs such as data conversions, data migrations, data population, and regulatory compliance such as Solvency II, Basel II/III, Anti-Money Laundering, the Foreign Account Tax Compliance Act (FATCA), and the Dodd–Frank Wall Street Reform and Consumer Protection Act.

Ken O’Connor also provides practical data quality and data governance advice on his popular blog at: kenoconnordata.com

Popular OCDQ Radio Episodes

Clicking on a link will take you to that episode’s blog post:

  • Demystifying Data Science — Guest Melinda Thielbar, a Ph.D. Statistician, discusses what a data scientist does and provides a straightforward explanation of key concepts such as signal-to-noise ratio, uncertainty, and correlation.
  • Data Quality and Big Data — Guest Tom Redman (aka the “Data Doc”) discusses Data Quality and Big Data, including whether data quality matters less in larger data sets, and whether statistical outliers represent business insights or data quality issues.
  • Demystifying Master Data Management — Guest John Owens explains the three types of data (Transaction, Domain, Master), the four master data entities (Party, Product, Location, Asset), and the Party-Role Relationship, which is where we find many of the terms commonly used to describe the Party master data entity (e.g., Customer, Supplier, Employee).
  • Data Governance Star Wars — Special Guests Rob Karel and Gwen Thomas joined this extended, Star Wars-themed discussion about how to balance bureaucracy and business agility during the execution of data governance programs.
  • The Johari Window of Data Quality — Guest Martin Doyle discusses helping people better understand their data and assess its business impacts, not just the negative impacts of bad data quality, but also the positive impacts of good data quality.
  • Studying Data Quality — Guest Gordon Hamilton discusses the key concepts from recommended data quality books, including those which he has implemented in his career as a data quality practitioner.

Red Flag or Red Herring?

A few weeks ago, David Loshin, whose new book The Practitioner’s Guide to Data Quality Improvement will soon be released, wrote the excellent blog post First Cuts at Compliance, which examines a challenging aspect of regulatory compliance.

David uses a theoretical, but nonetheless very realistic, example of a new government regulation that requires companies to submit a report in order to be compliant.  An associated government agency can fine companies that do not accurately report. 

Therefore, it’s in the company’s best interest to submit a report because not doing so would raise a red flag, since it would make the company implicitly non-compliant.  For the same reason, it’s in the government agency’s best interest to focus its attention on those companies that have not yet reported—since no checks for accuracy need to be performed on non-submitted reports.
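
Here is a minimal sketch of that triage in Python, with hypothetical company names and report contents:

    # A minimal sketch of the triage described above. The company names and
    # report contents are hypothetical; the point is only to separate implicit
    # non-compliance (no report at all) from reports merely presumed accurate.

    regulated_companies = {"Acme Corp", "Globex", "Initech"}
    submitted_reports = {
        "Acme Corp": {"reported_total": 1_200_000},
        "Globex": {"reported_total": 850_000},
    }

    # Red flags: companies that never submitted are implicitly non-compliant.
    red_flags = regulated_companies - submitted_reports.keys()

    # Everything submitted still awaits verification of its accuracy.
    pending_verification = sorted(submitted_reports)

    print(sorted(red_flags))       # ['Initech']
    print(pending_verification)    # ['Acme Corp', 'Globex']

Note that the second bucket is only presumed compliant; nothing in the triage itself vouches for the accuracy of what was submitted.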

David then raises an excellent question about the quality of that reported, but unverified, data, and shares a link to a real-world example where the verification was actually performed by an investigative reporter—who discovered significant discrepancies.

This blog post made me view the submitted report as a red herring, which is a literary device, quite common in mystery fiction, where the reader is intentionally misled by the author in order to build suspense or divert attention from important information.

Therefore, when faced with regulatory compliance, companies might conveniently choose a red herring over a red flag.

After all, it is definitely easier to submit an inaccurate report on time, which feigns compliance, than it is to submit an accurate report that might actually prove non-compliance.  Even if the inaccuracies are detected—which is a big IF—the company could claim that it was simply poor data quality—not actual non-compliance—and promise to resubmit an accurate report.

(Or, as is apparently the case in the real-world example linked to in David’s blog post, the company could provide the report data in a format not necessarily amenable to a straightforward verification of accuracy.)

The primary focus of data governance is the strategic alignment of people throughout the organization through the definition and enforcement of policies in relation to data access, data sharing, data quality, and effective data usage, all for the purpose of supporting critical business decisions and enabling optimal business performance.

Simply establishing these internal data governance policies is often no easy task, just as passing a law that creates new government regulations can be extremely challenging.

However, without enforcement and compliance, policies and regulations are powerless to effect the real changes necessary.

This is where I have personally witnessed many data governance programs and regulatory compliance initiatives fail.


Red Flag or Red Herring?

Are you implementing data governance policies that raise red flags, not only for implicit, but also for explicit non-compliance? 

Or are you instead establishing a system that will simply encourage the submission of unverified—or unverifiable—red herrings?


Related Posts

Days Without A Data Quality Issue

Data, data everywhere, but where is data quality?

The Circle of Quality

DQ-Tip: “Don't pass bad data on to the next person...”

The Only Thing Necessary for Poor Data Quality

Data Governance and Data Quality