The Hidden Cost of In-Flight Data Errors: Lessons from Aviation and Healthcare

Author:

Joe Bastante, Founder, Advisor, Exponent7

In 2018 and 2019, two devastating crashes involving Boeing 737 MAX aircraft—Lion Air Flight 610 and Ethiopian Airlines Flight 302—claimed 346 lives. Investigations revealed that both incidents were triggered by erroneous data from a single angle-of-attack sensor, which led to repeated nose-down movements that the pilots were unable to counteract, resulting in catastrophic outcomes. While most data quality issues don’t result in such tragic consequences, they can still have significant impacts. In healthcare, for instance, data inaccuracies can lead to misdiagnoses, incorrect treatment plans, and medication errors, posing severe risks to patient health.

Beyond Data Entry: In-Flight Data Errors

Data quality issues aren’t confined to the point of data entry. Increasingly, errors are introduced as data moves through complex networks of internal systems and external partners. For example, a patient eligible for a health service might be incorrectly deemed ineligible because critical data was lost or altered during transmission between systems.

Common Causes of In-Flight Data Errors

Understanding the root causes of these errors is essential for prevention:

  • Failed API Calls: When systems communicate via Application Programming Interfaces (APIs), failures can occur if the calling program doesn’t properly handle errors, leading to data loss (see the sketch after this list).
  • Batch Job Failures: Scheduled data processing tasks may fail entirely or partially, especially if error handling is inadequate, resulting in incomplete or corrupted data.
  • Human Error: Manual interventions, such as rerunning failed processes, can introduce errors if not executed correctly—like processing the wrong data set or duplicating entries.
  • Guaranteed Delivery Failures: While many data streaming tools promise reliable delivery, factors like system outages or misconfigurations can still lead to data loss.
  • Code Errors: Bugs in data processing scripts or applications can inadvertently alter or discard data during transformation or loading stages.
  • Value Set Incompatibilities: Outdated, non-conformant, or incorrect code sets can cause data to be rejected or misinterpreted downstream.
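
To make the first cause above concrete, here is a minimal sketch of a defensive API call. The eligibility endpoint, URL structure, and retry policy are illustrative assumptions, not any specific payer or Velox API; the point is that an unhandled failure becomes silent data loss, while an explicit failure can be logged and reconciled.

```python
import logging
import time

import requests  # third-party HTTP client, assumed available in the pipeline environment

log = logging.getLogger("eligibility_sync")


def fetch_eligibility(member_id: str, base_url: str, retries: int = 3):
    """Call a hypothetical eligibility API, retrying transient failures.

    Returns the parsed record, or None so the caller can record the gap
    explicitly instead of silently dropping the member.
    """
    url = f"{base_url}/eligibility/{member_id}"
    for attempt in range(1, retries + 1):
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()       # surface HTTP 4xx/5xx instead of ignoring them
            return resp.json()
        except requests.RequestException as exc:
            log.warning("attempt %d/%d failed for member %s: %s",
                        attempt, retries, member_id, exc)
            time.sleep(2 ** attempt)      # simple exponential backoff between retries
    log.error("giving up on member %s; flagging for reconciliation", member_id)
    return None                           # an explicit gap, not silent data loss
```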

Strategies to Ensure Data Integrity

While it’s impossible to eliminate all errors, implementing robust detection and correction mechanisms can mitigate their impact:

  • Metadata Management: Tracking metadata—such as data lineage, processing timestamps, and record counts—at each stage of data processing can help identify anomalies and ensure completeness.
  • Automated Quality Rules: Automated quality rules complement the metadata approach by inspecting and validating data contents. For example, if a critical data element must remain intact from the source to targets, quality rules can verify that the data matches exactly across systems. 
  • Data Inspection Rules: Inspection rules can also confirm that data remains intact at an aggregate level. Returning to the earlier example, rules can tally the count of patients or members eligible for a health service by employer, then verify that those counts match between source and target systems (see the sketch after this list).
  • Eliminating Intermediaries and Transformations: Data captured in its native format directly from the source is less susceptible to the errors and omissions that occur during the handoffs (hops) and transformations (mapping) introduced by intermediaries.
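
The first three strategies lend themselves to simple, automatable checks. The sketch below assumes the source and target extracts are available as lists of dictionaries; the field names (member_id, employer, eligible) are illustrative rather than a real schema.

```python
def record_count_check(source_rows, target_rows) -> bool:
    """Metadata-style check: record counts must match after every hop."""
    return len(source_rows) == len(target_rows)


def field_integrity_check(source_rows, target_rows, key="member_id", field="eligible") -> bool:
    """Quality-rule-style check: a critical field must survive transit unchanged."""
    src = {row[key]: row[field] for row in source_rows}
    tgt = {row[key]: row[field] for row in target_rows}
    return src == tgt


def eligible_by_employer(rows) -> dict:
    """Inspection-rule-style aggregate: tally eligible members per employer."""
    counts = {}
    for row in rows:
        if row["eligible"]:
            counts[row["employer"]] = counts.get(row["employer"], 0) + 1
    return counts


def run_checks(source_rows, target_rows) -> list:
    """Return the names of failed checks; an empty list means the hop looks intact."""
    failures = []
    if not record_count_check(source_rows, target_rows):
        failures.append("record counts differ between source and target")
    if not field_integrity_check(source_rows, target_rows):
        failures.append("eligibility flag changed in flight")
    if eligible_by_employer(source_rows) != eligible_by_employer(target_rows):
        failures.append("eligible-member counts by employer do not match")
    return failures
```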

By selecting and implementing such rules effectively, and automatically generating an incident whenever a problem is detected, organizations can ensure timely intervention to remedy data issues.
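
One hedged way to connect detection to remediation is sketched below; the create_incident helper and its ticketing backend are hypothetical placeholders, since no specific tool is named here.

```python
import json
import logging
from datetime import datetime, timezone

log = logging.getLogger("data_quality")


def create_incident(summary: str, details: dict) -> None:
    """Hypothetical hook: in practice this would call a ticketing or paging system."""
    log.error("DATA QUALITY INCIDENT: %s | %s", summary, json.dumps(details))


def enforce_checks(pipeline_step: str, failures: list) -> None:
    """Open an incident whenever a hop's checks fail, so remediation is not left to chance."""
    if failures:
        create_incident(
            summary=f"In-flight data errors detected at step '{pipeline_step}'",
            details={
                "step": pipeline_step,
                "failed_checks": failures,
                "detected_at": datetime.now(timezone.utc).isoformat(),
            },
        )
```

For example, enforce_checks("claims_load", run_checks(source_rows, target_rows)) could run as the final step of each hop, using the checks from the earlier sketch.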

Integrating these measures into the data pipeline allows for real-time monitoring and prompt issue resolution, ensuring that data remains trustworthy throughout its lifecycle.

Data quality shouldn’t be an afterthought. Organizations often prioritize new feature delivery over establishing data quality controls, leading to technical debt that’s harder to address later. By embedding quality assurance practices into the development lifecycle, organizations can proactively prevent data issues rather than reactively fixing them.

Velox provides a strong starting point and a common way of capturing and communicating in-flight data errors, their causes, and how to address them.

