    How confident are you in your risk data?: By Ben O’Brien

    By FintechFetch · October 3, 2025 · 6 Mins Read


    Timely, accurate and high-quality data is the foundation of effective modelling, analysis and risk management. Regulators, through initiatives such as BCBS 239, have sought to strengthen control and ensure data is fit for purpose. Yet supervisors often expect
    straightforward answers, and struggle with ranges, error margins and the uncertainty that accompanies them.

    In practice, firms are frequently working with estimates of the “true” position. Whether acknowledged or not, thresholds are always set around what quality is acceptable. Where those thresholds are ill-defined, uncertainty feeds directly into decision-making.

    Everyone wants to focus on what the outputs say. But if the quality of the underlying data, the transformations applied, and the controls in place are not clearly understood, risk leaders are making decisions on unstable ground. To complicate matters further,
    the level of accuracy required often depends on the type of decision being made.

    In fact, Gartner has estimated that financial institutions lose on average $15 million annually due to poor-quality data. Our experience shows that structural flaws in data are behind many of the most persistent model performance issues.

    A structured audit of data brings these weaknesses into view early. It reduces rework, improves defensibility and increases confidence in the models that ultimately drive decisions.

    Why data is often underestimated as a risk

    During the model lifecycle, the bulk of attention goes to the model layer itself — segmentation logic, performance statistics, overrides. By comparison, the data layer often receives lighter treatment: checks for completeness or missing values, but little
    more.

    This reflects an attitude of “the data is the data”, rather than one of defining the quality required and addressing deficiencies upfront. The consequences are clear. A Mosaic Smart Data survey reported that 66% of banks face ongoing challenges with data quality and integrity, while 83% lack real-time access to transaction data due to fragmented systems. In such environments, model outputs cannot be fully trusted for low-latency decision-making.

    When data weaknesses appear in validation reports, it is usually too late to resolve them without significant cost. Redevelopment or re-engineering is expensive and slow, so deficiencies are often tolerated. Regulators increasingly treat this as model risk
    in its own right. As firms expand use of behavioural, granular and third-party datasets, these risks are becoming harder to ignore.

    Common examples include:

    • Variables defined differently across environments

    • Transformations applied outside the build and undocumented

    • Lineage breaks, with no clear trace from source to model-ready data

    • Version drift from dataset refreshes without revalidation

    • Legacy or redundant inputs left in active use

    Such flaws are difficult to detect through performance metrics. Once embedded, they are harder to isolate and even more difficult to remediate.
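
    To make this concrete, here is a minimal sketch of how two of the flaws above might be surfaced automatically: comparing variable schemas between development and production extracts, and fingerprinting a dataset so that a silent refresh shows up as a changed hash at revalidation. The function names and the pandas-based approach are illustrative assumptions, not a prescribed tooling choice.

        import hashlib
        import pandas as pd

        def schema_diff(dev: pd.DataFrame, prod: pd.DataFrame) -> dict:
            """Compare column names and dtypes between development and production extracts."""
            dev_schema = {col: str(dtype) for col, dtype in dev.dtypes.items()}
            prod_schema = {col: str(dtype) for col, dtype in prod.dtypes.items()}
            return {
                "missing_in_prod": sorted(set(dev_schema) - set(prod_schema)),
                "missing_in_dev": sorted(set(prod_schema) - set(dev_schema)),
                "dtype_mismatch": sorted(
                    col for col in set(dev_schema) & set(prod_schema)
                    if dev_schema[col] != prod_schema[col]
                ),
            }

        def dataset_fingerprint(df: pd.DataFrame) -> str:
            """Hash the full contents so an unannounced refresh changes the recorded fingerprint."""
            row_hashes = pd.util.hash_pandas_object(df, index=True)
            return hashlib.sha256(row_hashes.values.tobytes()).hexdigest()

    Recording the fingerprint alongside the model documentation and recomputing it at each validation gives a cheap test for version drift: any mismatch means the dataset changed without revalidation.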

    Industry studies suggest data scientists spend around 60% of their time preparing data, with a further 19% spent locating the right information. When fundamental data issues surface late in the lifecycle, these inefficiencies escalate, leading to costly
    rework, delays in approval, and governance debt.

    What a data audit should test

    The fundamental question for any audit is: does this dataset credibly support the model’s assumptions, outcomes and regulatory requirements?

    Answering it requires more than checks for completeness or accuracy. A thorough audit should assess:

    1. Lineage and traceability – Every variable must be traceable back to source, with transformations and adjustments evidenced.

    2. Variable construction and logic – Definitions should be valid and applied consistently across development, validation and production.

    3. Governance and ownership – Clear accountability for datasets, with controls around access, updates and versioning.

    4. Completeness, consistency and reconciliation – Identification of conflicting sources, hidden biases, outliers and reconciliation gaps between environments.

    5. Usage alignment – Evidence that every input is relevant, contributes to performance, and is understood by stakeholders.

    An audit structured in this way demonstrates that the data environment is robust, explainable and fit for purpose, which is increasingly the expectation of supervisors.
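
    By way of illustration, a minimal sketch of how parts of points 4 and 5 might be automated for a tabular model dataset. The thresholds, dataset names and feature list are hypothetical placeholders rather than recommended settings, and the process-oriented points (lineage, construction logic, governance) still rest on documentary evidence rather than code.

        import pandas as pd

        def audit_dataset(dev: pd.DataFrame, prod: pd.DataFrame,
                          model_features: list, max_missing: float = 0.05,
                          max_drift: float = 0.25) -> dict:
            """Flag completeness gaps, reconciliation gaps and unused inputs."""
            findings = {}

            # Completeness: variables whose missing-value rate exceeds the agreed threshold.
            missing_rate = dev[model_features].isna().mean()
            findings["incomplete"] = missing_rate[missing_rate > max_missing].to_dict()

            # Reconciliation: compare means of shared numeric variables between
            # development and production to surface conflicting sources.
            shared = [c for c in model_features if c in prod.columns]
            numeric = dev[shared].select_dtypes("number").columns
            scale = dev[numeric].std().replace(0, 1)
            drift = (dev[numeric].mean() - prod[numeric].mean()).abs() / scale
            findings["reconciliation_gaps"] = drift[drift > max_drift].to_dict()

            # Usage alignment: inputs still carried in the dataset but not used by
            # the model, i.e. legacy or redundant variables left in active use.
            findings["unused_inputs"] = sorted(set(dev.columns) - set(model_features))

            return findings

    Run against each dataset refresh, checks of this kind give the audit a repeatable evidence trail rather than a one-off review.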

    Regulatory expectations are changing

    Supervisors now look beyond statistical measures of performance. Evidence that model inputs are governed, explainable and reliable is becoming a prerequisite, particularly for IFRS 9, ICAAP and stress testing models.

    Across supervisory reviews, the same findings appear repeatedly:

    • Insufficient traceability of input variables

    • Inadequate documentation of variable construction

    • Limited governance of third-party data

    • Weak alignment between development and production data pipelines

    These observations are not minor. Even well-performing models may face challenge or delay if their underlying data is undocumented or opaque.

    The PRA continues to flag weaknesses in governance and data quality through IRB and IFRS 9 reviews. The ECB and EBA have issued similar findings. The ECB’s TRIM review, for example, produced more observations on validation than on any other topic, with the
    highest number of severe issues.

    As AI-driven models and increasingly granular datasets become standard, supervisory scrutiny will only intensify. The message is clear: if firms cannot explain the data, they cannot defend the model.

    Why firms should begin with data

    Starting the model lifecycle with a structured data audit provides early visibility of risks that might otherwise derail the process or go unnoticed until post-implementation.

    Benefits include:

    • Early detection of systemic flaws – preventing the same weaknesses being repeated across portfolios

    • Reduced remediation burden – avoiding late-stage redevelopment or governance overlays

    • Regulatory assurance – delivering a clear audit trail of inputs, transformations and ownership

    • Model defensibility – ensuring outputs can be justified on the basis of stable, reliable inputs

    • Closer alignment with business context – keeping models in step with current behaviours, policies and regulatory requirements

    • Lower governance costs – building trust in reusable data components and reducing the ongoing cost of management

    For senior risk leaders, data quality and governance are now central to both internal assurance and regulatory approval. Supervisors are sharpening their focus on inputs, not just outputs.

    Placing data at the starting point of the model lifecycle is the most effective way to provide confidence in the numbers, in the model, and in the decisions that follow.


