4 ways instant-messaging apps put patient privacy and security at risk

October 13, 2017 David Bennett

Mobile devices are everywhere, and most healthcare organizations are following the trend and adopting mobile devices at the point of care.

In particular, clinicians at hospitals and healthcare organizations regularly communicate via text messaging and instant-messaging apps because of the convenience. However, even with a proper BYOD policy in place, many are using text messaging in ways that put patient privacy and security at risk and clearly violate HIPAA, exposing their organizations to data breaches and steep financial penalties.

A recent study of 2,107 doctors across five hospitals found that 98.9 percent of clinicians own a smartphone, with just over a third of clinicians using web-based messaging apps to send clinical information. However, a survey published in the Journal of Hospital Medicine reported that a mere 27 percent of respondents said their organization had implemented a secure messaging application. Worse yet, only seven percent said most clinicians were using a hospital-issued messaging app, meaning most were improperly using consumer applications readily available in app stores.

These surveys reveal a clear demand from clinicians for the integration of mobile technology into workflows. However, without the infrastructure in place, the survey also reveals the demand is so strong that clinicians will take matters into their own hands. Consumer-grade instant-messaging apps, like WhatsApp or KakaoTalk, are convenient and accessible, but they also come with a number of risks, including:

  • Lack of security. Consumer messaging applications are built for communication between friends, and they should never be used for sharing confidential information – especially sensitive patient information. It’s true that many consumer-facing apps, like WhatsApp or Signal, have encryption baked in. However, these apps usually are not password protected, so if a phone is lost or compromised, an unauthorized user has access to each and every message. A robust mobile health IT app needs extra layers of security, requiring clinicians to log into both the phone and the app with strong passwords and multi-factor authentication. Most consumer apps do not offer this level of security by default.
  • Photo syncing. Photo sharing goes hand-in-hand with text-based messaging, and taking a photo on a smartphone is one of the most convenient ways for a clinician to document and share pertinent patient information. However, doing so brings the added risk of accidental backups to cloud-based storage: many smartphones automatically sync photos to cloud services. This auto-backup function poses yet another security threat, especially if the cloud account is shared with family members – or worse, is public. While clinicians do tend to be careful when photographing patients, there is always a chance of capturing an image that could identify a patient and violate their privacy.
  • Consumer apps lack audit trails. Conversations about an individual’s medical information need to be stored somewhere – preferably within the EHR. While most consumer-facing apps keep message histories, none are tied back to the patient’s medical record. Linking these conversations to the record means the communication is stored, traceable, and available for monitoring and review if required. It also lets clinicians involved elsewhere in the patient’s care, who weren’t included in those conversations, see the latest developments. Further, messages in consumer apps can simply be deleted, making any record of what was sent and received difficult to trace.
  • Data mining. Have you ever wondered how social media and so many messaging apps are free? Some operate on grants or donations, but in many cases the user is the product. Every message sent, every photo backed up, and every person messaged says something about who the user is – and, to advertisers, that information is money. What does this mean? Many texting apps – especially standard SMS and instant messaging – lack security by design, because the data users provide is monetized in a variety of ways. For patients, this means private information can fall into the wrong hands through no nefarious or negligent act on the part of clinicians, who may not even know the app they are using lacks security by design.
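To make the "multi-factor authentication" point above concrete: a second factor commonly takes the form of a time-based one-time password (TOTP), the rotating six-digit code generated by authenticator apps. The sketch below is a generic RFC 6238 implementation in Python using only the standard library – an illustration of the mechanism, not any particular vendor's product:

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, at=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password.

    secret_b32: shared secret, base32-encoded (as in QR-code enrollment).
    at: Unix timestamp to compute the code for (defaults to now).
    """
    key = base64.b32decode(secret_b32)
    # Number of 30-second intervals since the Unix epoch.
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)  # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset
    # taken from the low nibble of the last digest byte.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

A hospital-grade app would verify such a code (with a small tolerance window for clock drift) after the password check, so a lost phone alone is not enough to read messages.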

This list only scratches the surface of the risks consumer texting apps pose to providers and hospitals. Data breaches caused by human error, negligence, or misuse of technology cost healthcare organizations millions in penalties and lawsuits. While mobile communication apps offer numerous benefits within a healthcare organization, there must be an emphasis on hospital-issued messaging apps, the protection of mobile data, and adherence to strict BYOD policies. Clinicians can have the convenience of texting without putting private patient information at risk, and healthcare organizations can support them in doing so, ensuring they won’t turn to the app store and take matters into their own hands.

The original article can be found at the Becker's Health IT & CIO Review site.

