Why Do We Have Data Interoperability In Healthcare?
Think of the last time you had an emergency and needed to get rushed to the hospital ASAP.
Maybe it was a heart attack, or perhaps your appendix was giving you trouble. Whatever the issue, if you were not close to home or to your in-network hospital, you may have later felt the pain of making sure everyone was talking to the right people.
The reason? Their systems, specifically their databases, did not use the same data formats. Your insurance provider, your doctor, and the hospital may have struggled to get their data to line up with that of the out-of-network providers.
Health data interoperability standards, such as HL7 and, more recently, the guidelines laid out in the Trusted Exchange Framework and Common Agreement (known in the industry as TEFCA), were intended to serve as that common language and make data exchange in healthcare easier.
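To make that concrete, here is a minimal sketch of the kind of shared structure these standards aim for, shaped loosely like a Patient resource from HL7's FHIR standard. The field names follow FHIR conventions, but the values and the identifier system URL are hypothetical, and this is an illustration rather than a complete or validated resource.

```python
import json

# A simplified, illustrative patient record shaped like an HL7 FHIR "Patient"
# resource. Field names follow FHIR conventions; the values and the
# identifier system URL are hypothetical.
patient = {
    "resourceType": "Patient",
    "identifier": [
        {"system": "https://example-hospital.org/mrn", "value": "12345"}
    ],
    "name": [{"family": "Smith", "given": ["Jane"]}],
    "birthDate": "1984-07-02",  # FHIR dates use the YYYY-MM-DD form
}

# When sender and receiver agree on the same structure, the receiving system
# can read the record without a one-off translation step.
print(json.dumps(patient, indent=2))
```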
The problem is that, as an industry, we have never fully agreed on exactly what that common language should be.
Instead, the US healthcare system has a hodgepodge of data formats. For instance, your electronic health record system may output a date as MM-DD-YY, whereas another may spit out YYYY-MM-DD. As a human being, you could look at either version and tell they refer to the same date.
Computers and databases, however, don't easily translate format nuances like this.
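As a hedged sketch of what that translation step looks like in practice (the sample values are hypothetical, and a real pipeline would handle many more formats), here is a small routine that normalizes both forms into a single ISO 8601 representation:

```python
from datetime import datetime

def normalize_date(raw: str) -> str:
    """Convert a date in either MM-DD-YY or YYYY-MM-DD form into YYYY-MM-DD (ISO 8601)."""
    for fmt in ("%m-%d-%y", "%Y-%m-%d"):  # the two formats mentioned above
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue  # not this format; try the next one
    raise ValueError(f"Unrecognized date format: {raw!r}")

# The same birth date, as exported by two different (hypothetical) EHR systems:
print(normalize_date("07-02-84"))    # -> 1984-07-02
print(normalize_date("1984-07-02"))  # -> 1984-07-02
```

A human sees instantly that these match; without a step like this, a database comparing the two raw strings would treat them as different values.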
The movement of information between health systems, say your office and another across state lines, has become a problem of capability. Can your network translate information quickly and accurately from one office to another (or even between providers within the same office who might annotate notes differently)?
The reason we have interoperability? To move patient data efficiently and accurately to those who need it.
How can we ever get interoperability right?
At its most basic level, electronic data is just ones and zeroes, which is why, when electronic health records were introduced in the late 90s, many saw exchanging them as a manageable task. At that time, we already knew that electronic information could be created, stored, and exchanged over the internet, and that having a computer manage records in place of a human being reduced errors (especially those caused by handwriting and copying mistakes).
A computer seemed more powerful: chock full of algorithms and insights, and a quick, comprehensive way to make sure information got to the places it was intended to go.
In theory, everything about exchanging data among healthcare organizations seemed right. But as soon as we turned around, new technologies, new innovations, and proprietary systems built on locked data formats were creating new silos and making data exchange a much rougher ride.
I'm sure you now pay for services to translate data outputs from your EHR platform so they can eventually reach the organizations that track your quality metrics, state-mandated reporting, and insurance reimbursements.
You still have two major reasons for exchanging data: diagnosis and billing. But at this point in our relationship with electronic health data, we are continually met with roadblocks.
EHR platforms that were initially created to make life easier for providers haven't really met expectations; entering records and moving them across networks is often tedious.
You, as a healthcare provider, have petabytes (that's millions of gigabytes) of data at your fingertips. But because those records were not designed and maintained with the goal of letting you use every last drop of them, organizations still cannot learn much from their information, and often cannot even speak to others through their data.
As your organization focuses more on the holistic wellness of your patients, perhaps by evaluating population health indicators, you expect more out of interoperability to ensure you have a good picture of how best to treat those who come to you for help.
What are leaders in the industry doing about problems with interoperability?
It's certainly easy to blame the EHR developers for not having a clear enough crystal ball when they released platforms meant to help payers, providers, and federal and state agencies, among other organizations trying to understand healthcare information.
Information systems experts have started to recognize that as an industry—and as individual organizations—we have to make some improvements to how data is handled.
Here are four best practices for handling electronic health information:
Understand where standards come into play: most organizations implement EHR systems and institute data policies and procedures without understanding or devising universal standards across their organization or the organizations within their network. By starting a discussion about standards (if you are in health information management, I'm sure you are well aware of these issues), you can get decision makers to come to terms with implementing standards throughout your information exchanges. The first step is defining who you regularly talk to (exchange data with) and figuring out what standards would work well across your networks of data exchange; a simple sketch of what that might look like follows this list.
Set up a playbook: before diving into a new project, consider who should be involved, what the scope of the project is, and how you will reach that final goal. Whether you are improving an existing data exchange or setting up a new one, treat it as a project within your organization, with all the nuts and bolts that come with that. Make sure you have an implementation or change playbook that addresses objections and concerns before taking a deep dive into making data flow more smoothly across your exchange.
Communicate your expectations across your team: don't assume that everyone will be on board with changes (they never are). Invite your entire team, anyone who has a stake in the end product, to get involved in making improvements or changes to how data is exchanged. Top-down decisions on organization-wide changes that alter how everyone works rarely land as well as getting key people across your organization involved in making the change.
Be aware that noise will always be around: data communicates only so much of the message. You will still need clear communication, whether notes or phone calls, to give context to whatever information you are transmitting. Without proper context, no matter how exchangeable your data, you open yourself up to major risks of misinterpretation.
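As a hedged illustration of the first practice, here is a minimal sketch of what a shared field-level agreement might look like once you and your exchange partners settle on standards. The field names, formats, and partner names are hypothetical; in practice, the agreed formats would come from established standards such as HL7 FHIR rather than an ad hoc list.

```python
# A hypothetical, minimal "standards map": for each field your organization
# exchanges, record the format everyone has agreed to send and receive.
# Fields, formats, and partner names below are illustrative only.
STANDARDS_MAP = {
    "patient_birth_date": "YYYY-MM-DD (ISO 8601)",
    "patient_name":       "family and given names as separate fields",
    "diagnosis_code":     "ICD-10-CM code",
}

# Who you regularly exchange data with, and which agreement governs each link.
EXCHANGE_PARTNERS = {
    "regional_hospital": STANDARDS_MAP,
    "state_registry":    STANDARDS_MAP,
}

def describe_agreements() -> None:
    """Print the agreed formats so every team works from the same expectations."""
    for partner, fields in EXCHANGE_PARTNERS.items():
        print(f"Exchange with {partner}:")
        for field, agreed_format in fields.items():
            print(f"  {field}: {agreed_format}")

if __name__ == "__main__":
    describe_agreements()
```

Even a lightweight document like this gives decision makers something concrete to agree on before any integration work begins.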