Do you trust your virtual assistant program with sensitive patient data or personal information? Here’s why you shouldn’t.

Smart speakers are perhaps the fastest-growing trend in the tech world. These single-unit wireless speakers and soundbars, built around artificial intelligence (AI) assistants, come from Amazon, Apple, Google, Microsoft and Samsung, with more brands set to hit the market soon.

Though originally intended for home use, smart speakers have gradually edged their way into medical offices. It’s a natural progression: physicians and other healthcare professionals are enticed by the convenience of hands-free note taking, web research, or even access to medical records.

That could be a colossal mistake.

Virtual assistant programs like Alexa, Siri, Google Assistant, Cortana and Bixby are not, as of this writing, compliant with the Health Insurance Portability and Accountability Act (HIPAA). Hopefully this will change in the near future, but for now it’s critical to know that using these devices in a medical organization carries serious data security risks.

Even within the short span of time that smart speakers have been commercially available, there are already many examples of the technology being implemented in hospital settings.

For example, some hospitals are experimenting with Alexa to help surgeons work through a safety checklist before a procedure, or offering Alexa apps that give patients instructions to follow at home. In fact, voice-activated patient tools have been rolled out in large health systems such as the Mayo Clinic in Rochester, MN, Northwell Health in New York and Carolinas HealthCare System in Charlotte, NC. Uses range from answering common first-aid questions to finding the nearest urgent care center and its current wait time.
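To make this concrete, apps like these run as voice “skills”: cloud-hosted functions that receive a JSON request from the voice service and return a spoken response. Below is a minimal sketch of such a backend as an AWS Lambda handler in Python. The “Topic” slot and the canned answers are hypothetical, invented purely for illustration, though the request and response envelopes follow Amazon’s documented Alexa format.

```python
# Minimal sketch of an Alexa skill backend as an AWS Lambda handler.
# The slot named "Topic" and the canned answers are hypothetical,
# for illustration only.

FIRST_AID_TIPS = {
    "burn": "Cool the burn under cool running water for about twenty minutes.",
    "nosebleed": "Sit upright, lean forward, and pinch the soft part of the nose.",
}

def lambda_handler(event, context):
    """Turn an incoming Alexa request into a plain-text speech response."""
    request = event.get("request", {})

    if request.get("type") == "IntentRequest":
        slots = request.get("intent", {}).get("slots", {})
        topic = slots.get("Topic", {}).get("value", "").lower()
        speech = FIRST_AID_TIPS.get(
            topic, "Sorry, I don't have guidance on that topic."
        )
    else:  # e.g. a LaunchRequest, when the user just opens the skill
        speech = "Welcome. Ask me for a first aid tip."

    # Alexa's documented response envelope: a version string plus a
    # response object carrying the outputSpeech to be read aloud.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

Note that nothing in this flow is encrypted or access-controlled by default: whatever the user says, and whatever the skill answers, passes through the vendor’s cloud.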

These kinds of voice-activated tools may eventually become one of the essential ways patients interact with doctors and hospitals: scheduling appointments, accessing and updating personal medical records, or refilling prescriptions. From the hospital and doctor’s perspective, they could make it easier to monitor patients at home, for example through voice-activated medication reminders.

Taking it a step further, smart speakers will most likely end up in patient rooms, where voice commands could operate televisions and other appliances and forward patient requests as notifications to the mobile devices used by doctors and nurses. Smart speakers may also become integrated with building management platforms, so that voice control can adjust lighting levels and window blinds. This could free nurses and other staff from tedious, non-medical errands, leaving them more time for issues that require actual medical expertise.
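As a rough sketch of how such an in-room routing layer might work, the hypothetical Python function below sorts spoken requests into two paths: non-clinical commands go to the building management system, while everything else becomes a notification for staff devices. Every name and endpoint here is an assumption for illustration.

```python
# Hypothetical in-room request router. Non-clinical commands are routed
# to the building management system; anything else becomes a patient
# request notification for staff. All names are invented for illustration.

BUILDING_COMMANDS = {"lights", "blinds", "television"}

def route_request(room: str, command: str, detail: str = "") -> dict:
    """Return a routing decision for a voice request from a patient room."""
    if command in BUILDING_COMMANDS:
        # e.g. ("lights", "dim to 30 percent") -> building management system
        return {
            "target": "building_management",
            "room": room,
            "command": command,
            "detail": detail,
        }
    # Everything else (water, pain, assistance) should page the
    # assigned nurse's mobile device.
    return {
        "target": "staff_notification",
        "room": room,
        "message": f"Room {room} request: {command} {detail}".strip(),
    }

# Example usage:
print(route_request("304B", "lights", "dim to 30 percent"))
print(route_request("304B", "water"))
```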

However, while the technology has great potential for positive impact, it still needs further advances and protections to ensure that sensitive patient data is kept safe. A staff member’s failure to secure medical record data could cost an organization hundreds of thousands of dollars, and hand cybercriminals an opportunity to commit identity theft.

Perhaps even more concerning are the increasingly creative ways cybercriminals are hacking and stealing data, particularly in the healthcare industry. Contrary to popular belief, digital devices like smart speakers are not immune to hacking. In fact, as Wired recently reported, a group of Chinese hackers developed a technique for hijacking Amazon’s voice assistant gadget. Although Amazon has pushed out security fixes, the episode highlights the fact that in the age of the Internet of Things, nothing is ever 100% safe from hacking.

As was reported by NBC News, Candid Wueest, Symantec’s principal threat researcher, explained: “Someone could hack into these devices remotely and turn them into a listening device. Some of them even come with cameras, so they could see what you’re doing.”

Healthcare presents specific challenges for HIPAA compliance around the security of patient data. The architecture of most smart speakers doesn’t align with HIPAA’s restrictions, particularly in terms of access to protected health information (PHI). A key challenge for Alexa, for example, is that the device may not only transmit PHI to a user but also collect it through speech-to-text. The question then is how to prevent unauthorized access to that data, and whether HIPAA requirements for these devices can be met and audited for compliance. In fact, this is a core task of the new Alexa health and wellness team, according to a recent CNBC article.
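To see what closing that gap would involve, the sketch below shows two safeguards a HIPAA-aligned voice pipeline would need around stored transcripts: encryption at rest and an audit trail of every access attempt. This is an illustrative sketch under assumed names, not Amazon’s implementation; it uses the Fernet recipe from the widely used `cryptography` Python package.

```python
# Illustrative sketch only: the encrypt-and-audit handling a HIPAA-aligned
# pipeline would need before persisting voice transcripts. This does not
# reflect how Alexa or any vendor actually stores data.
import datetime

from cryptography.fernet import Fernet  # symmetric encryption recipe

KEY = Fernet.generate_key()  # in practice, fetched from a managed key store
cipher = Fernet(KEY)

def store_transcript(transcript: str, user_id: str, audit_log: list) -> bytes:
    """Encrypt a transcript at rest and record an audit entry."""
    audit_log.append({
        "event": "transcript_stored",
        "user": user_id,
        "timestamp": datetime.datetime.utcnow().isoformat(),
    })
    return cipher.encrypt(transcript.encode("utf-8"))

def read_transcript(ciphertext: bytes, requester_id: str,
                    authorized: set, audit_log: list) -> str:
    """Decrypt only for authorized requesters; audit every access attempt."""
    allowed = requester_id in authorized
    audit_log.append({
        "event": "transcript_access",
        "user": requester_id,
        "granted": allowed,
        "timestamp": datetime.datetime.utcnow().isoformat(),
    })
    if not allowed:
        raise PermissionError("requester is not authorized to view this PHI")
    return cipher.decrypt(ciphertext).decode("utf-8")
```

Today’s consumer smart speakers expose nothing like this level of control to a medical practice, which is precisely why the compliance question remains open.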

So while it might be tempting to bring an Amazon Echo, Google Home or HomePod into the office, doing so before these devices become fully compliant would risk HIPAA violations. Until smart speaker technology advances to meet these stringent compliance regulations, it’s better to use your smart speaker for the purposes for which it was originally designed.