The new acting Medicare chief is on to something: the idea that meaningful use of Internet communications between patients and doctors should actually mean something. But instead of a narrow interpretation of meaningful use tied to electronic health records, it’s time to go a step further. What we really need is meaningful and safe use of every type of online communication involving patient health records.
Speaking at a San Francisco conference, Medicare’s Andy Slavitt referenced pending changes to the awkward “meaningful use” rules that mandate patient communications through electronic health record (EHR) systems.
However, a truly meaningful analysis of online communications would take a broader view, raising questions such as these:
- How do people actually use the Internet to find information and communicate with healthcare providers?
- Can they be reasonably sure they are protected against embarrassing or hurtful disclosures?
If the answer is no, then the first priority should be high standards for all types of databases where private health information is stored.
Meaningful priorities and processes
Today, some of these vulnerabilities go unaddressed. The question becomes one of meaningful priorities and processes. Consider these two scenarios:
Compliance audits are set to begin this year to police what healthcare providers are doing to secure patient records. Yet while local doctors and community providers are being held accountable, big data companies are escaping the scrutiny they ought to get. Indeed, the EHRs these companies produced with federal subsidies may not even be capable of securing patient records.
Amid multiple reports pointing to trouble within federal EHRs, the Journal of AHIMA released new survey results indicating widespread problems in accurately matching individuals with their healthcare records. Duplicate records commonly exist, creating greater likelihood of errors in treating people.
In an interview in Healthcare IT News, Raymond D. Aller, MD, director of informatics at the University of Southern California, pointed to the risk of death and improper treatments resulting from errors.
In a second example of vulnerability, consider that patient data resides in many places not covered by privacy laws, such as health apps, wearable devices and websites.
In one case, a Florida woman purchased a paternity testing kit at a local drug store. When she visited the website of the lab associated with the kit, she quickly discovered that by tweaking the site’s URL she could open a directory containing test results for some 6,000 other people.
In that case, privacy rules didn’t extend far enough to apply to the lab.
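The flaw behind that breach is what security engineers call an insecure direct object reference: records were reachable through guessable identifiers in the URL, with no check that the requester was entitled to see them. A minimal sketch of the pattern and one common mitigation, signed access tokens, is below; the function names, data, and server secret are hypothetical and for illustration only, not any real lab’s system.

```python
import hmac
import hashlib
import secrets

# Vulnerable pattern: results keyed by sequential IDs, so anyone who
# tweaks /results/1001 to /results/1002 can read a stranger's record.
RESULTS = {1001: "sample A: positive", 1002: "sample B: negative"}

def fetch_result_insecure(record_id: int) -> str:
    # No check that the requester actually owns this record.
    return RESULTS.get(record_id, "not found")

# Mitigation: give each customer an unguessable, server-signed token
# and verify the signature before returning anything.
SERVER_SECRET = secrets.token_bytes(32)  # hypothetical server-side key

def issue_token(record_id: int) -> str:
    # Sign the record ID so the token cannot be forged or enumerated.
    mac = hmac.new(SERVER_SECRET, str(record_id).encode(), hashlib.sha256)
    return f"{record_id}.{mac.hexdigest()}"

def fetch_result_secure(token: str) -> str:
    record_id, _, tag = token.partition(".")
    expected = hmac.new(SERVER_SECRET, record_id.encode(),
                        hashlib.sha256).hexdigest()
    # Constant-time comparison prevents timing attacks on the tag.
    if not hmac.compare_digest(tag, expected):
        return "access denied"
    return RESULTS.get(int(record_id), "not found")
```

With this scheme, editing the URL to a neighboring ID no longer works, because the attacker cannot produce a valid signature for a record that was never issued to them.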
The point is that we need a meaningful analysis of digital communications within healthcare. Otherwise, as a nation, we risk going easy on big data companies and other large players while local practitioners suffer through dense rules that divert them from patient care. Do we need new standards for digital communications? Yes, and they need to be meaningful.