Over the last several years, Amazon has been expanding beyond the e-commerce realm into other areas, including groceries, web services, and healthcare. Most recently, the company announced that its Alexa artificial-intelligence assistant is HIPAA-compliant.
What does this announcement mean for healthcare consumers? Are we ready for Alexa in healthcare? How will Amazon ensure patient privacy? Let’s examine this issue in this week’s “Hot Topic” blog.
The nuts and bolts of Alexa in healthcare
This Wall Street Journal article outlines some uses for Alexa in healthcare. According to the article, “Alexa can now transfer sensitive, personal health information using software that meets health-privacy requirements under federal law.”
Five companies, including major hospitals such as Boston Children’s Hospital, have developed Alexa-based applications that let users:
- Schedule urgent care appointments
- Track when drugs are shipped
- Check health-insurance benefits
- Read blood sugar results
This 2018 article in Harvard Business Review by researchers at Boston Children’s outlines some of the ways the hospital has explored using voice-enabled technology. The three pilots the hospital undertook were:
- ICU: Voice allowed nurses to ask for key pieces of administrative information, such as which nurses were on duty in each unit and where there were open beds.
- Organ transplant: The hospital used voice for the pre-op checklist process.
- Home health: Parents could ask Alexa about symptoms of illnesses and suggestions on when to escalate and visit a doctor.
Physicians seem to be open to voice technology, too. The Boston Children’s researchers report that in a survey, 48% of pediatricians said they would be willing to deploy Alexa in a clinical setting, while only 16% said they would not be open to the idea.
The question, though, is what about consumers? While they have embraced Alexa in the home, are they ready for Alexa in healthcare?
The perception of Alexa might need to change
The WSJ article notes that Northwell Health in metro New York launched an Alexa service a couple of years ago, but consumers haven’t really used it. Much of the hesitation centers on privacy, which HIPAA compliance could arguably help address.
- First, there was this situation, in which Alexa reportedly sent a recording of a couple’s private conversation to someone in the husband’s contact list.
- Then, there’s this article in Bloomberg News, which talks about how Amazon employees regularly review what customers say to Alexa to improve the assistant’s capabilities.
Given these concerns, can Alexa be trusted with health information?
When it comes to healthcare, Amazon says Alexa will require verification of the speaker, either with a spoken code or password. The company also says data transmitted to developers to improve features will be stripped of any identifying information.
Alexa’s entry into healthcare is the latest move by big tech companies
This announcement is just Amazon’s latest foray into healthcare. In 2018, Amazon, Berkshire Hathaway, and J.P. Morgan announced a partnership to tackle rising healthcare costs. Their joint venture has since been named Haven and is led by physician Atul Gawande.
Other big tech moves into healthcare:
- Google has been exploring how to use artificial intelligence in disease detection.
- Apple has been active in the wearables market, most recently integrating EKG technology into the Apple Watch.
- Facebook backtracked after asking several hospitals for anonymized patient data, which included information on illnesses and prescriptions.
Would you trust Alexa in healthcare?
I’d like to hear what you all think about our hot topic. Would you trust Alexa with your private health information? Would you be comfortable with its use in a hospital setting? Are there limits to how we should use Alexa in healthcare?
Please share your thoughts in our comments section below. I’ll start by sharing mine.