Digital health is about applying advanced information technologies to enable the free flow of patient information across the circle of care. For patients, that means every health-care provider they see, at any location, should be able to access relevant health record information quickly and efficiently.
Digital health technology, such as electronic health records, is believed to enhance patient-centred care, improve integrated care and ensure financial sustainability of our health-care system. However, Ontarians are facing the tough reality that their health data are still fragmented, despite billions of dollars spent over the last two decades to enable fast and secure exchange of health information. The COVID-19 pandemic has brought to light even more data quality issues.
As noted in a recent National Post article, much of the public data on COVID-19 is a mess. Not only are data on infected cases and deaths delayed, they are also incomplete. Ontario reportedly offered inconsistent counts between provincial medical officials and local public health units. No wonder the Ministry of Health admits that “consistent standards are lacking across sectors — making it extremely difficult to integrate patient records or to integrate local systems with provincial ones.”
Neither sustainable nor effective
The Ontario government is taking two approaches to improving data quality, such as the accuracy and timeliness of data reported across different service providers. The first approach centres on improving health data exchange across heterogeneous systems (systems developed by different vendors and requiring different hardware and software configurations to operate) by using common communication standards.
However, this approach is neither scalable nor sustainable: communications across these systems become increasingly complex, time-consuming and error-prone as more systems are added to the mix. The inconsistent counts of COVID-19 cases and deaths provided by different levels of government are a case in point. Moreover, these standards evolve rapidly, and even previous versions of the same standard cannot be easily mapped and migrated to current ones.
The second approach relies on the minimum common data set proposed in the Digital Health Playbook, a resource intended to guide health-care organizations to build their digital systems. The minimum data set contains data classes (such as individual patients) and their corresponding elements (such as date of birth) for clinical notes, laboratory information, medications, vital signs, patient demographics and procedures, to name a few uses.
These data sets, while appropriate for the requirements of family physicians whose main responsibility is disease control and prevention, are not sufficient for treating complex patients with multiple health issues, whose care demands a vast amount of health data from various health-care providers.
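The minimum common data set idea can be pictured as a handful of shared record types, each with agreed-on elements. The sketch below is purely illustrative: the class and field names are assumptions for this example, not the Digital Health Playbook's actual schema.

```python
from dataclasses import dataclass, asdict
from datetime import date

# Hypothetical minimum-data-set classes; field names are illustrative,
# not the Digital Health Playbook's actual schema.

@dataclass
class PatientDemographics:
    patient_id: str
    date_of_birth: date
    sex: str

@dataclass
class VitalSigns:
    patient_id: str
    recorded_on: date
    systolic_bp: int
    heart_rate: int

demographics = PatientDemographics("ON-0001", date(1956, 3, 14), "F")
vitals = VitalSigns("ON-0001", date(2021, 4, 2), 128, 71)

# Any two systems that agree on this minimum set can exchange these
# records without negotiating vendor-specific formats. But nothing in
# the set captures, say, a specialist's treatment history, which is
# why the set falls short for complex patients.
print(asdict(demographics))
```

The strength of such a set is interoperability; its weakness, as noted above, is that it fixes in advance which data matter.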
These two approaches adopted by the Ontario government to address data quality issues are neither sustainable nor effective, so can hardly serve as a strategy guiding health digitalization.
As researchers focusing on IT in health governance, we propose that a data strategy encompass four pillars:
1. Data quality standards
First, data quality is an umbrella term encompassing multiple dimensions, such as accuracy, accessibility and timeliness, and there are trade-offs among these dimensions. For example, prioritizing timely data reports may compromise comprehensiveness, because covering all the required data takes time.
While “fit for use” (meaning the quality of data fits the requirements of their intended users) is an appropriate and pragmatic benchmark, the quality standards to be enforced need to be clearly spelled out. Given limited resources and mounting pressure to curb health-care costs, deciding which data quality standards should be the focus is increasingly urgent.
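The trade-off between dimensions can be made concrete by scoring the same report on two of them. This is a minimal sketch under assumed field names and a simplified case-report format; it is not any actual provincial reporting standard.

```python
from datetime import date

# Illustrative only: the required fields and the two scoring functions
# are assumptions for this sketch, not an Ontario reporting standard.

REQUIRED_FIELDS = ["case_id", "onset_date", "region", "outcome"]

def completeness(report: dict) -> float:
    """Fraction of required fields that are filled in."""
    filled = sum(1 for f in REQUIRED_FIELDS if report.get(f) is not None)
    return filled / len(REQUIRED_FIELDS)

def timeliness(report: dict, reported_on: date) -> int:
    """Days between symptom onset and the report reaching the province."""
    return (reported_on - report["onset_date"]).days

# Filing quickly sacrifices completeness; waiting for the outcome
# improves completeness but worsens timeliness.
early = {"case_id": "C1", "onset_date": date(2020, 11, 1),
         "region": "Toronto", "outcome": None}
late = {**early, "outcome": "recovered"}

print(completeness(early), timeliness(early, date(2020, 11, 2)))  # 0.75, 1
print(completeness(late), timeliness(late, date(2020, 11, 9)))    # 1.0, 8
```

A data strategy has to decide, per use case, which of these scores to optimize, because no report maximizes both.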
2. Sustainable, scalable, patient-centric platform
Second, the health-care sector is not alone in dealing with decades-old systems and the low-quality data — such as inaccurate COVID-19 case counts — generated by these systems. Drawing on experiences from banks and other organizations, the health-care sector could create an open data platform that enables data sharing across health-care providers and allows patients to share data from their social media and mobile and wearable devices. Countries such as the United Kingdom and Germany have started implementing the open data platform idea.
3. Measurable indicators of improvement
Third, measurable outcomes pertaining to data quality improvement efforts need to be defined. Improvement efforts could include training programs on best practices related to data entry, and introducing system features that enable data quality checking (for example, completeness or consistency). Measurable outcomes would ensure accountability and the achievement of the intended objectives, and inform future funding decisions.
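A system feature for consistency checking might look like the sketch below. The rules shown (a date sanity check, and a check that a provincial total matches the sum of public health unit counts) are hypothetical examples of the kind of check described above, not actual provincial validation rules.

```python
from datetime import date

# Hypothetical consistency checks a system could run at data entry.
# The rules and field names are illustrative assumptions.

def consistency_errors(record: dict) -> list[str]:
    """Return a list of human-readable consistency problems."""
    errors = []
    if record["date_of_death"] and record["date_of_death"] < record["date_of_birth"]:
        errors.append("date of death precedes date of birth")
    if record["total_cases"] != sum(record["cases_by_unit"].values()):
        errors.append("provincial total disagrees with public health unit counts")
    return errors

record = {
    "date_of_birth": date(1950, 1, 1),
    "date_of_death": None,
    "total_cases": 100,
    "cases_by_unit": {"Toronto": 60, "Ottawa": 30},  # sums to 90, not 100
}
print(consistency_errors(record))
```

Counting how many records pass such checks, before and after a training program or a new system feature, is exactly the kind of measurable outcome this pillar calls for.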
4. Improvement process adopted by providers
Lastly, a data strategy needs to clearly define a data quality improvement and monitoring process in which the quality of the data is continuously monitored and assessed to ensure that data support patient care and research. Data quality is a shared responsibility, so the quality assurance process needs to take place both across providers and within each provider.
To define and implement the data strategy, meaningful engagement with all stakeholders is key. For example, patients and providers need to be involved to identify the data required to treat the diseases that consume the largest share of our health-care budget, define the quality dimensions of those data, and specify the roles and responsibilities for maintaining data quality.
In contrast to the Band-Aid approach adopted by the Ontario government, the four-pillar data strategy is long-term, focused and holistic. It would place data quality front and centre in Ontario’s health digitalization effort. Following the strategy, our health-care system would develop a sustainable mechanism and a scalable capability to continuously improve data quality.