The 116 DORA Validation Rules: Complete Checklist for Compliance Officers

Published: January 2026 · Reading time: 10 minutes

When the European Supervisory Authorities tested financial entities' DORA readiness in 2024, they applied 116 data quality validation rules to every submission. Only 6.5% of firms passed all of them on the first attempt.

If you're preparing your Register of Information for submission, understanding these validation rules is essential. This guide breaks down all 116 checks by category, explains what each validates, and highlights the common failure points that tripped up firms during the dry run.


How DORA Validation Works: The Three-Layer Framework

Before diving into specific rules, it's important to understand how the ESAs structure their validation process. Your register is validated in three distinct layers:

Layer 1: Technical Integration Checks

Before your data is even examined, these checks verify that the submission package itself meets the ESAs' technical requirements.

Failure at this layer means outright rejection. Your submission won't proceed to data quality checks until these technical requirements are met.

Layer 2: Data Point Model Validation

This layer checks that your data structure matches the ESAs' Data Point Model (DPM), the formal specification of the templates, fields, and data types your register must follow.

Layer 3: Business Logic and Data Quality Checks

This is where the 116 validation rules come into play. These rules check the actual content of your register for completeness, consistency, and accuracy.


The 116 Validation Rules by Category

The ESAs organize their validation rules into distinct categories. Below is a comprehensive breakdown of each category and the specific checks performed.

Category 1: File Structure and Format Rules (Rules 1-12)

These rules validate the technical structure of your submission.

| Rule Type | What It Checks | Common Failure |
| --- | --- | --- |
| ZIP integrity | Package contains all required files | Missing template files |
| File naming | Files follow ESA naming convention | Incorrect date stamps or entity codes |
| Encoding | UTF-8 with BOM headers | Wrong character encoding |
| CSV structure | Proper delimiter and quote handling | Excel export with wrong delimiter |
| Column headers | Match DPM specifications exactly | Translated or abbreviated headers |
| Row structure | Key values properly formatted | Duplicate or missing row identifiers |

Key Statistic: 41% of dry run rejections were caused by formatting errors in this category.

How to Avoid: Use the ESAs' official templates without modification. If exporting from other systems, verify the output matches expected structure exactly. Purpose-built tools handle this automatically.
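
Where exports come from spreadsheets or other systems, even a small script can catch structural problems before the ESAs do. Below is a minimal pre-flight sketch in Python, assuming a comma-delimited UTF-8 CSV export; the file and column names are placeholders, not the official template headers.

```python
import csv

def check_csv(path: str, expected_headers: list[str], delimiter: str = ",") -> list[str]:
    """Return a list of structural problems found in one CSV file."""
    problems = []
    # utf-8-sig transparently strips a BOM if present and fails loudly on other encodings
    with open(path, encoding="utf-8-sig", newline="") as f:
        reader = csv.reader(f, delimiter=delimiter)
        headers = next(reader, [])
        if headers != expected_headers:
            problems.append(f"{path}: headers differ from template: {headers}")
        for line_no, row in enumerate(reader, start=2):
            if len(row) != len(expected_headers):
                problems.append(f"{path}, line {line_no}: expected "
                                f"{len(expected_headers)} columns, got {len(row)}")
    return problems

# Example (placeholder file and column names):
# print(check_csv("B_01_entities.csv", ["Entity LEI", "Entity name", "Entity type"]))
```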


Category 2: Entity Identifier Validation (Rules 13-28)

These 16 rules ensure all financial entities and ICT providers are properly identified.

LEI Validation (Rules 13-20)

The Legal Entity Identifier is mandatory for financial entities and critical for third-party providers.

| Rule | Validation | Error Rate in Dry Run |
| --- | --- | --- |
| LEI-001 | LEI format: 20 alphanumeric characters | 8% |
| LEI-002 | LEI structure: 18 characters + 2 check digits | 6% |
| LEI-003 | LEI active status (GLEIF check) | 14% |
| LEI-004 | LEI matches reporting entity | 4% |
| LEI-005 | Parent/subsidiary LEI relationships valid | 11% |
| LEI-006 | Third-party provider LEI present (where required) | 18% |
| LEI-007 | LEI renewal status current | 9% |
| LEI-008 | LEI uniqueness within register | 3% |

Key Statistic: 32% of dry run submissions contained invalid LEIs.

Common Failure Points: lapsed LEIs whose renewal had expired, missing LEIs for third-party providers, and parent/subsidiary relationships that don't match GLEIF records.

How to Avoid: Verify every LEI against the GLEIF database (gleif.org) before submission. Check renewal dates—expired LEIs fail validation even if the format is correct.
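
The format and check-digit rules (LEI-001 and LEI-002) can be tested offline before any GLEIF lookup, because the LEI uses the ISO 7064 MOD 97-10 check defined in ISO 17442. A sketch of that offline check follows; the active-status and renewal checks (LEI-003, LEI-007) still require a query against GLEIF, for example via its public API.

```python
import re

def lei_format_ok(lei: str) -> bool:
    """LEI-001/LEI-002: 20 alphanumeric characters, ending in two check digits."""
    return re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", (lei or "").strip()) is not None

def lei_checksum_ok(lei: str) -> bool:
    """ISO 7064 MOD 97-10 check: map letters to 10..35, then the number formed
    by the full 20-character string must leave remainder 1 when divided by 97."""
    digits = "".join(str(int(ch, 36)) for ch in lei.strip())  # '0'-'9' -> 0-9, 'A'-'Z' -> 10-35
    return int(digits) % 97 == 1

def lei_ok(lei: str) -> bool:
    # Passing these offline checks does not prove the LEI is active;
    # status and renewal date still need a GLEIF lookup (LEI-003, LEI-007).
    return lei_format_ok(lei) and lei_checksum_ok(lei)
```

Run these checks over every LEI column in the register first, then batch-verify the survivors against GLEIF for active status and renewal dates.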

EUID Validation (Rules 21-28)

The European Unique Identifier is used for EU-based entities without LEIs.

| Rule | Validation |
| --- | --- |
| EUID-001 | EUID format matches BRIS specifications |
| EUID-002 | EUID registered in Business Registers Interconnection System |
| EUID-003 | EUID entity type matches register data |
| EUID-004 | No duplicate EUIDs for different entities |

How to Avoid: Cross-reference EUIDs against the BRIS database. Ensure entity names match registered business names exactly.


Category 3: Mandatory Field Completeness (Rules 29-58)

These 30 rules verify that all required data fields are populated. This was the single largest source of errors in the dry run.

Key Statistic: 86% of all validation errors related to missing mandatory information.

Template B_01: Financial Entity Information (Rules 29-35)

| Required Field | Validation | Common Gap |
| --- | --- | --- |
| Entity LEI | Must be valid and active | Lapsed LEIs |
| Entity name | Cannot be empty | N/A (usually present) |
| Entity type | Must match licensed type | Wrong categorization |
| Country of registration | ISO country code | Incorrect codes |
| Competent authority | Valid NCA identifier | Missing for branches |
| Group structure | Parent/subsidiary flags | Incomplete hierarchy |
| Reporting date | ISO 8601 format | Wrong date format |

Template B_02: Contractual Arrangements (Rules 36-44)

| Required Field | Validation | Common Gap |
| --- | --- | --- |
| Contract reference | Unique identifier per contract | Duplicate references |
| Contract start date | ISO 8601 format | Missing (22% of submissions) |
| Contract end date | Valid date or perpetual flag | Missing or illogical |
| Service description | Text description required | Too vague or empty |
| Contract value | Numeric with currency | Missing or inconsistent |
| Renewal terms | Must be specified | Omitted for auto-renewals |
| Termination provisions | Required for all contracts | Not captured |
| Governing law | ISO country code | Missing |
| Contract type | From permitted values list | Wrong classification |

Key Statistic: 22% of submissions lacked contract start and end dates.
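
Because missing contract dates were the most common single gap, the completeness pass over B_02 is worth automating. The sketch below assumes the template has been exported to CSV; the column names are placeholders for the actual B_02 field names.

```python
import csv
from datetime import date

# Placeholder column names -- substitute the exact field names from template B_02.
REQUIRED_FIELDS = ["contract_reference", "start_date", "service_description",
                   "governing_law", "contract_type"]

def b02_gaps(path: str) -> list[str]:
    errors = []
    with open(path, encoding="utf-8-sig", newline="") as f:
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            for field in REQUIRED_FIELDS:
                if not (row.get(field) or "").strip():
                    errors.append(f"line {line_no}: '{field}' is empty")
            # End date may be blank only when the arrangement is flagged as perpetual
            end_date = (row.get("end_date") or "").strip()
            perpetual = (row.get("perpetual_flag") or "").strip().lower() == "true"
            if not end_date and not perpetual:
                errors.append(f"line {line_no}: no end date and no perpetual flag")
            # Dates must be ISO 8601 (YYYY-MM-DD) to pass the later date-logic rules
            for field in ("start_date", "end_date"):
                value = (row.get(field) or "").strip()
                if value:
                    try:
                        date.fromisoformat(value)
                    except ValueError:
                        errors.append(f"line {line_no}: '{field}' is not ISO 8601: {value!r}")
    return errors
```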

Template B_03: ICT Third-Party Providers (Rules 45-52)

| Required Field | Validation | Common Gap |
| --- | --- | --- |
| Provider identifier | LEI or EUID where applicable | Missing for small vendors |
| Provider name | Legal registered name | Trading names used instead |
| Provider country | ISO country code | Inconsistent codes |
| Provider type | From permitted values | Misclassification |
| Criticality assessment | Required for supporting critical functions | Not completed |
| Ultimate parent | Must be identified if applicable | Subcontractor chains incomplete |

Template B_04: ICT Services (Rules 53-58)

| Required Field | Validation |
| --- | --- |
| Service identifier | Unique per service |
| Service type | From ESA taxonomy |
| Function supported | Link to B_05 template |
| Data processing location | ISO country codes |
| Data storage location | ISO country codes |
| Substitutability assessment | Required field |

Key Statistic: 27% of submissions had incomplete SLA documentation.


Category 4: Cross-Template Consistency (Rules 59-78)

These 20 rules verify that references between templates are valid and consistent.

Reference Integrity Rules (59-68)

| Rule | What It Validates | Error Pattern |
| --- | --- | --- |
| XREF-001 | Every contract references a valid provider | Orphaned contracts |
| XREF-002 | Every provider is referenced by at least one contract | Unused provider entries |
| XREF-003 | Every service references a valid contract | Service-contract mismatch |
| XREF-004 | Every function references valid services | Incomplete function mapping |
| XREF-005 | Entity references in B_01 match across templates | Inconsistent entity data |
| XREF-006 | Date references are consistent across templates | Contract vs. service dates conflict |
| XREF-007 | Provider identifiers match across templates | Same provider, different IDs |
| XREF-008 | No duplicate key values within templates | Duplicate row identifiers |
| XREF-009 | Parent-child relationships form valid hierarchy | Circular references |
| XREF-010 | Subcontractor chains properly linked | Broken subcontractor references |

Data Consistency Rules (69-78)

| Rule | What It Validates |
| --- | --- |
| CONS-001 | Provider country codes match across templates |
| CONS-002 | Contract values sum correctly at aggregate level |
| CONS-003 | Service criticality aligns with function criticality |
| CONS-004 | Date ranges don't exceed logical limits |
| CONS-005 | Currency codes are valid ISO 4217 |
| CONS-006 | Percentage values are between 0-100 |
| CONS-007 | Count fields contain positive integers |
| CONS-008 | Boolean fields contain only permitted values |
| CONS-009 | Enumerated fields contain only permitted values |
| CONS-010 | Free text fields don't exceed character limits |

How to Avoid: Cross-template errors occur when data is maintained in separate spreadsheets or systems. Centralized data management, or a purpose-built tool, significantly reduces these errors by enforcing referential integrity.
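
As a sketch of what that enforcement looks like, the first three reference rules reduce to a few set operations once the templates are loaded; the file and column names below are placeholders, not the official field codes.

```python
import csv

def load(path: str) -> list[dict]:
    with open(path, encoding="utf-8-sig", newline="") as f:
        return list(csv.DictReader(f))

contracts = load("B_02_contracts.csv")
providers = load("B_03_providers.csv")
services = load("B_04_services.csv")

provider_ids = {p["provider_id"] for p in providers}
contract_ids = {c["contract_reference"] for c in contracts}

errors = []
# XREF-001: every contract must reference a known provider
errors += [f"orphaned contract {c['contract_reference']}" for c in contracts
           if c["provider_id"] not in provider_ids]
# XREF-002: every provider must appear on at least one contract
referenced = {c["provider_id"] for c in contracts}
errors += [f"unreferenced provider {pid}" for pid in provider_ids - referenced]
# XREF-003: every service must reference a known contract
errors += [f"service {s['service_id']} points at unknown contract {s['contract_reference']}"
           for s in services if s["contract_reference"] not in contract_ids]
print("\n".join(errors) or "cross-template references OK")
```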


Category 5: Logical Validation (Rules 79-98)

These 20 rules check that your data makes logical sense.

Date Logic Rules (79-86)

| Rule | What It Checks | Failure Example |
| --- | --- | --- |
| DATE-001 | End date after start date | Contract ending before it started |
| DATE-002 | Reporting date is current period | Wrong reporting year |
| DATE-003 | Contract dates within business lifetime | Pre-incorporation contracts |
| DATE-004 | Notice periods feasible | 5-year notice for 1-year contract |
| DATE-005 | Renewal dates align with terms | Renewal before end date |
| DATE-006 | Historical dates in past | Future dates for past events |
| DATE-007 | Perpetual contracts flagged correctly | End date with perpetual flag |
| DATE-008 | Date format ISO 8601 compliance | DD/MM/YYYY instead of YYYY-MM-DD |

Key Statistic: 14% of submissions had invalid date formatting or logic errors.
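
Several of these rules reduce to simple comparisons once the dates are parsed. The sketch below covers DATE-001, DATE-007, and DATE-008 for a single register row; the field names are placeholders.

```python
from datetime import date

def date_logic_errors(row: dict, line_no: int) -> list[str]:
    """Check DATE-001, DATE-007 and DATE-008 for one row (placeholder field names)."""
    errors = []

    def parse(field: str):
        value = (row.get(field) or "").strip()
        if not value:
            return None
        try:
            return date.fromisoformat(value)      # DATE-008: ISO 8601 (YYYY-MM-DD) only
        except ValueError:
            errors.append(f"line {line_no}: {field} is not ISO 8601: {value!r}")
            return None

    start, end = parse("start_date"), parse("end_date")
    perpetual = (row.get("perpetual_flag") or "").strip().lower() == "true"

    if start and end and end <= start:            # DATE-001: end date must follow start date
        errors.append(f"line {line_no}: contract ends on or before its start date")
    if perpetual and end:                          # DATE-007: perpetual contracts carry no end date
        errors.append(f"line {line_no}: end date supplied for a contract flagged as perpetual")
    return errors
```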

Value Logic Rules (87-92)

| Rule | What It Checks |
| --- | --- |
| VAL-001 | Contract values positive |
| VAL-002 | Percentages within valid range |
| VAL-003 | Counts are non-negative integers |
| VAL-004 | Currency values have currency codes |
| VAL-005 | Annual costs consistent with contract terms |
| VAL-006 | Aggregate values sum correctly |

Relationship Logic Rules (93-98)

| Rule | What It Checks |
| --- | --- |
| REL-001 | Critical functions have critical providers |
| REL-002 | Subcontractors linked to prime contractors |
| REL-003 | Group entities have group relationships |
| REL-004 | Branch relationships reflect legal structure |
| REL-005 | Data location within provider operating countries |
| REL-006 | Exit strategy exists for critical services |

Category 6: ICT Service Classification (Rules 99-108)

These 10 rules verify that ICT services are correctly categorized according to the DORA taxonomy.

| Rule | Validation Criteria |
| --- | --- |
| ICT-001 | Service type from ESA taxonomy |
| ICT-002 | Cloud service subtypes properly classified |
| ICT-003 | Critical/important function flags consistent |
| ICT-004 | Data processing type matches service type |
| ICT-005 | Infrastructure vs. software classification correct |
| ICT-006 | Outsourced vs. in-house status accurate |
| ICT-007 | Service dependencies properly mapped |
| ICT-008 | Substitutability assessment completed |
| ICT-009 | Concentration risk flags where applicable |
| ICT-010 | Impact assessment for critical services present |

Key Statistic: 18% of submissions had ICT service misclassification errors.

Common Failure Points: cloud services assigned the wrong subtype, infrastructure and software services conflated, and substitutability or impact assessments left incomplete for critical services.
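
Enumerated fields such as the service type can be policed automatically once the permitted values are loaded. The sketch below illustrates the pattern; the codes shown are placeholders and must be replaced with the published ESA taxonomy values.

```python
import csv

# Placeholder codes only -- load the real permitted values from the ESA taxonomy.
PERMITTED_SERVICE_TYPES = {"S01", "S02", "S03"}

def invalid_service_types(path: str, column: str = "service_type") -> list[str]:
    """Flag rows whose service type is not in the permitted-values set (ICT-001, CONS-009)."""
    with open(path, encoding="utf-8-sig", newline="") as f:
        return [f"line {line_no}: {row.get(column, '')!r} is not a permitted service type"
                for line_no, row in enumerate(csv.DictReader(f), start=2)
                if (row.get(column) or "").strip() not in PERMITTED_SERVICE_TYPES]
```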


Category 7: Geographic and Jurisdictional Rules (Rules 109-116)

The final 8 rules validate location and jurisdictional data.

| Rule | What It Validates |
| --- | --- |
| GEO-001 | Country codes are valid ISO 3166-1 alpha-2 |
| GEO-002 | Data processing locations specified for all services |
| GEO-003 | Data storage locations specified for all services |
| GEO-004 | Primary vs. backup locations distinguished |
| GEO-005 | Third country transfers flagged appropriately |
| GEO-006 | Provider operating countries documented |
| GEO-007 | Governing law jurisdiction valid |
| GEO-008 | Dispute resolution jurisdiction specified |

Key Statistic: 13% of submissions had incorrect location codes.

Common Failure Points: data processing or storage locations left blank, country codes that aren't valid ISO 3166-1 alpha-2 values, and third-country transfers that aren't flagged.
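
A short script can flag invalid country codes before submission. The sketch below assumes the third-party pycountry package, which ships the ISO 3166-1 list; the column names are placeholders.

```python
import csv
import pycountry  # third-party package bundling the ISO 3166-1 country list

def invalid_country_codes(path: str,
                          columns=("data_processing_location", "data_storage_location")) -> list[str]:
    """GEO-001/002/003: location fields must be populated with valid alpha-2 codes."""
    errors = []
    with open(path, encoding="utf-8-sig", newline="") as f:
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            for col in columns:
                code = (row.get(col) or "").strip()
                if not code:
                    errors.append(f"line {line_no}: {col} is empty")
                # pycountry returns None for unknown alpha-2 codes in current releases
                elif pycountry.countries.get(alpha_2=code) is None:
                    errors.append(f"line {line_no}: {col} '{code}' is not ISO 3166-1 alpha-2")
    return errors
```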


The Validation Rules That Failed Most Firms

Based on the ESA dry run data, here are the top 10 validation rules that caused the most failures:

| Rank | Rule Category | Error Type | % Affected |
| --- | --- | --- | --- |
| 1 | Mandatory Fields | Missing contract dates | 27% |
| 2 | Entity Identifiers | Invalid/missing LEIs | 32% |
| 3 | Mandatory Fields | Missing SLA information | 27% |
| 4 | File Structure | Formatting errors | 41% (of rejections) |
| 5 | Logical Validation | Invalid date logic | 14% |
| 6 | ICT Classification | Service misclassification | 18% |
| 7 | Cross-Template | Provider ID inconsistencies | 15% |
| 8 | Geographic | Incorrect location codes | 13% |
| 9 | Cross-Template | Orphaned references | 12% |
| 10 | Entity Identifiers | Group structure errors | 11% |

How to Use This Checklist: A Practical Approach

Step 1: Pre-Submission Validation

Before submitting your register, work through each category:

Technical Check (5 minutes): confirm file naming, UTF-8 encoding, delimiters, and unmodified template headers (Category 1).

Entity Identifiers (30 minutes): verify every LEI against GLEIF and every EUID against BRIS, including renewal status (Category 2).

Mandatory Fields (1-2 hours): work through templates B_01 to B_04 and confirm every required field is populated (Category 3).

Cross-Template Consistency (30 minutes): confirm every contract, provider, service, and function reference resolves, and that identifiers match across templates (Category 4).

Logical Validation (20 minutes): check date ordering, value ranges, and relationship logic (Category 5).

Classification Review (20 minutes): confirm service types, criticality flags, and location codes follow the ESA taxonomy and ISO standards (Categories 6 and 7).

Step 2: Systematic Error Correction

When validation fails, prioritize fixes by category:

  1. Fix technical errors first — These block everything else
  2. Resolve identifier issues — LEI/EUID problems affect cross-references
  3. Complete mandatory fields — The largest error source
  4. Address cross-template inconsistencies — These require coordinated fixes
  5. Correct logical and classification errors — These are usually quick fixes

Step 3: Continuous Validation

Don't wait until submission to validate. Build checks into routine register maintenance so errors are caught as contracts and providers are added or updated, not in the final days before the deadline.


Beyond Manual Checking: Automated Validation

The dry run showed that manual validation is error-prone, and firms with structured validation processes performed significantly better.

Manual checklist reviews help, but automated validation against all 116 rules catches errors as data changes, applies every check consistently, and removes the formatting guesswork from each submission cycle.


Conclusion: 116 Rules, One Goal

The 116 DORA validation rules exist for a reason: to ensure the ESAs receive complete, accurate, and usable data about ICT risks across the EU financial system. Understanding what each rule checks helps you submit a compliant register—ideally on the first attempt.

The key takeaways:

  1. Mandatory field completeness is the biggest failure area (86% of errors)
  2. LEI validation trips up a third of firms—verify against GLEIF
  3. Cross-template consistency requires centralized data management
  4. Technical formatting causes outright rejections—use proper tools
  5. Systematic validation before submission dramatically improves success rates

Don't become part of the 93.5%. Validate thoroughly, fix systematically, and submit with confidence.


Pass All 116 Rules Automatically

DoraPass validates your Register of Information against all 116 ESA data quality rules in real-time. No manual checklists. No formatting guesswork. No resubmission cycles.

Try DoraPass Free for 14 Days

Pass your DORA RoI. First try.

Related: Why 93.5% of Firms Failed the DORA Dry Run (And How to Pass)
