Key takeaways

  • Solvency II requires insurance companies to maintain high data quality to ensure the reliability of capital calculations and regulatory reporting;
  • The requirements focus on three main aspects: data accuracy, completeness, and appropriateness;
  • These requirements aim to reduce risks while strengthening the trust of stakeholders and regulators.

Solvency II requires insurance companies to maintain a high level of data quality (DQ) to ensure the reliability of capital calculations and regulatory reporting. The requirements focus on three main aspects: accuracy, completeness, and appropriateness of data.

The key challenges for ensuring good data quality are as follows:

  • Implementation of robust processes for data collection, management, and validation (see the illustrative sketch after this list);
  • Documentation and traceability of data to demonstrate compliance to regulators;
  • Data governance, with clear roles and responsibilities, to ensure quality throughout the data lifecycle;
  • Continuous monitoring and improvement of data quality, including through performance indicators.
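
To illustrate the validation and monitoring points above, here is a minimal, hypothetical sketch of rule-based checks covering the three Solvency II criteria (completeness, accuracy, appropriateness). The field names, business rules, and portfolio scope are illustrative assumptions, not requirements taken from the regulatory texts.

```python
from datetime import date

# Hypothetical policy records, as they might arrive from a source system (illustrative data).
policies = [
    {"policy_id": "P001", "premium": 1200.0, "birth_date": date(1980, 5, 1), "line_of_business": "motor"},
    {"policy_id": "P002", "premium": None, "birth_date": date(1975, 3, 9), "line_of_business": "motor"},
    {"policy_id": "P003", "premium": -50.0, "birth_date": None, "line_of_business": "aviation"},
]

REQUIRED_FIELDS = ("policy_id", "premium", "birth_date", "line_of_business")
ALLOWED_LINES = {"motor", "health", "property"}  # assumed portfolio scope, for illustration only


def check_completeness(record):
    """Completeness: every required field must be populated."""
    return [f"missing {field}" for field in REQUIRED_FIELDS if record.get(field) is None]


def check_accuracy(record):
    """Accuracy: values must respect basic business rules (illustrative rules only)."""
    issues = []
    if record.get("premium") is not None and record["premium"] < 0:
        issues.append("negative premium")
    if record.get("birth_date") is not None and record["birth_date"] > date.today():
        issues.append("birth date in the future")
    return issues


def check_appropriateness(record):
    """Appropriateness: data must be relevant to the portfolio being modelled."""
    lob = record.get("line_of_business")
    return [] if lob in ALLOWED_LINES else [f"line of business out of scope: {lob}"]


# Run all checks and keep a simple, traceable log of findings per record.
for record in policies:
    findings = check_completeness(record) + check_accuracy(record) + check_appropriateness(record)
    print(f"{record['policy_id']}: {'; '.join(findings) or 'OK'}")
```

In practice, checks of this kind would typically run inside the data pipeline, with their findings documented and fed into the monitoring indicators discussed later in this article.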

These requirements aim to reduce the risk of misestimating technical provisions and solvency capital, while strengthening stakeholder and regulator confidence.


A structured but demanding regulatory framework

Data quality (DQ) is a regulatory requirement, as crucial as capital levels or risk modeling. It must be demonstrated, controlled, documented, and governed.

These requirements are set out mainly in two key reference texts:

  • Directive 2009/138/EC (Solvency II);
  • Delegated Regulation (EU) 2015/35.


Solvency II: Clear Data Quality requirements

Since the implementation of Solvency II, insurance companies must prove that their data is accurate, complete, and appropriate.

These three criteria are at the heart of Article 82 of Directive 2009/138/EC and are critical for calculating technical provisions, the Solvency Capital Requirement, and the Minimum Capital Requirement.


Delegated regulations and guidelines: continuous strengthening of obligations

Delegated Regulation 2015/35 (Articles 19 to 21 and 262 to 264) clarifies expectations regarding governance, internal control, and modeling.

EIOPA, through its guidelines, recommends that companies:

  • Implement a structured framework for data quality;
  • Ensure regular monitoring through indicators (KPIs), as sketched below;
  • Conduct periodic reviews and corrective actions;
  • Document traceability and associated management rules.
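
To make the KPI point concrete, here is a minimal sketch of how such indicators could be tracked against thresholds and flagged for corrective action. The indicator names, values, and tolerance levels are assumptions for illustration; they are not an EIOPA-prescribed format.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class DataQualityKpi:
    name: str        # indicator name (assumed, for illustration)
    value: float     # measured value for the reporting period
    threshold: float # minimum acceptable value agreed by data governance

    def is_breached(self) -> bool:
        return self.value < self.threshold


# Hypothetical monthly indicators for a claims data set.
kpis = [
    DataQualityKpi("completeness_rate", value=0.987, threshold=0.98),
    DataQualityKpi("accuracy_rate", value=0.962, threshold=0.97),
    DataQualityKpi("timeliness_rate", value=0.991, threshold=0.95),
]

# Log each measurement so the periodic review and any corrective action remain traceable.
report_date = date.today().isoformat()
for kpi in kpis:
    status = "ALERT - corrective action required" if kpi.is_breached() else "within tolerance"
    print(f"[{report_date}] {kpi.name}: {kpi.value:.1%} (threshold {kpi.threshold:.0%}) -> {status}")
```

Keeping the thresholds under the ownership of data governance, and archiving each run of the report, is one simple way to support the documentation and traceability expectations listed above.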


Real sanctions for tangible shortcomings

The Solvency II regulation, in effect since January 1, 2016, introduced a harmonized, risk-based supervisory framework for insurers operating in Europe. Insurance companies must now comply with these stringent standards, which cover risk management, capital requirements, and transparency—including data quality.

The French Prudential Supervision and Resolution Authority (ACPR) has recently sanctioned several companies for shortcomings related to data quality.

These sanctions underscore the importance of maintaining high-quality data to comply with regulatory requirements and avoid financial penalties.

 

Data quality: a strategic lever facing many obstacles

Many insurance companies still struggle to adopt a genuine culture of data quality. The issue is often viewed as purely technical, lacks strong strategic sponsorship, and suffers from siloed, poorly interoperable information systems. Data governance is generally weak or non-existent, with unclear responsibilities.

Efforts tend to focus on immediate regulatory deadlines, without a long-term vision. Data quality is rarely prioritized in budget decisions due to the lack of immediate ROI. The shortage of hybrid skills and leadership exacerbates the situation. Finally, the absence of visible incidents fosters a false sense of security, encouraging short-term responses at the expense of sustainable solutions.

Data Quality is above all an opportunity

Strong and effective data quality offers numerous benefits beyond mere regulatory compliance:

  • Better decision-making: Reliable data enables leaders to make informed decisions based on precise and relevant analyses;
  • Operational efficiency: Quality data reduces errors, redundancies, and inefficiencies, leading to process optimization and cost savings;
  • Improved customer relationships: Accurate, well-managed data allows a better understanding of customer needs, enabling more personalized services and increasing satisfaction and loyalty;
  • Facilitated innovation: Good data quality enhances the use of advanced analytics, artificial intelligence, and other technologies to develop new products and services;
  • Simplified compliance: High data quality streamlines and reduces the cost of regulatory reporting while minimizing the risks of non-compliance.


In short, investing in data quality strengthens overall company performance while meeting regulatory requirements.
 

Data Quality, GDPR, and Information Security

Data quality is closely linked to compliance with the General Data Protection Regulation (GDPR). It ensures the accuracy and updating of personal data, facilitates traceability, and supports rigorous governance. It also helps enforce the principle of minimization by avoiding unnecessary or redundant data. Finally, the security measures implemented within a DQ approach enhance the protection of personal data.

Moreover, data quality significantly impacts information security and cybersecurity. Reliable, well-governed data reduces system vulnerabilities, improves threat detection, and ensures precise management of access and identities. DQ thus becomes a pillar of cybersecurity, supporting compliance with regulations (GDPR, DORA) and reinforcing resilience against digital risks.

 

Upcoming dynamics favorable to Data Quality

The years 2025–2026 open an unprecedented window of opportunity for insurance players (mutuals, companies, provident institutions) to turn data quality (DQ) into a true strategic lever. Several dynamics converge to drive the sector to a decisive turning point.

Strengthened regulatory requirements are transforming DQ into an essential prerequisite.

Solvency II demands increasing rigor on the data used for prudential calculations, ORSA, and narrative reporting. At the same time, DORA introduces reinforced obligations regarding traceability, operational resilience, and control of critical data. CSRD and the green taxonomy require reliable, structured, and auditable ESG data. Data quality is no longer limited to reporting: it has become a continuous compliance condition.

The rapid rise of artificial intelligence in insurance highlights a direct dependency: no reliable data, no reliable AI. Use cases are multiplying—fraud detection, churn prediction, dynamic pricing—but all rely on clean, complete, up-to-date, and traceable data. General management teams are realizing that DQ is no longer just an IT issue but a strategic foundation for innovation.

The need for agile, cross-functional management is also growing in an unstable economic context. Inflation, climate disruption, geopolitical tensions: leaders need reliable, real-time indicators. It is no longer about receiving reports thirty days after close (D+30) but about deciding the next day (D+1) with consolidated, robust, and immediately usable data. DQ thus shifts from a discreet support role to that of a central strategic management tool.

Furthermore, technologies have matured significantly. Data quality solutions, MDM, data lineage tools, and data catalogs are now more integrated, often available in SaaS mode, and easier to deploy thanks to low-code approaches, embedded controls, and automated workflows. Cost and technical complexity are no longer barriers: making data reliable has become not only possible but accessible.

Pressure on key functions continues to grow. The ACPR is tightening its expectations, particularly regarding the quality of data used by the Solvency II key functions (actuarial, risk management, internal audit), which must now be able to demonstrate traceability, alerting, and solid evidence. For these actors, DQ becomes a means of protecting themselves against regulatory demands.

Another strong trend: the increasing outsourcing of certain functions requires contractualizing data quality. Trust alone is no longer sufficient; quality must be framed, formalized, and supervised in relationships with service providers. DQ thus becomes part of day-to-day operational management.

Finally, early returns on investment are now evident. Companies that have structured their DQ initiatives report tangible benefits: fewer reprocessing needs, reduced regulatory stress, improved analytical capacity, lower operational risks, better customer knowledge, and easier innovation. Positive examples are multiplying, proving that the topic is neither theoretical nor marginal—it has become a performance factor in its own right. 

