A group of healthcare workers filed a lawsuit against Amazon this week, alleging that their Alexa devices have been recording their conversations.
Healthcare IT News reported that the four plaintiffs believe their Amazon Alexa devices may have captured HIPAA-protected information, such as private conversations with patients.
"Amazon's conduct in surreptitiously recording consumers has violated federal and state wiretapping, privacy, and consumer protection laws," the lawsuit read.
It added: "Despite Alexa's built-in listening and recording functionalities, Amazon failed to disclose that it makes, stores, analyzes and uses recordings of these interactions at the time plaintiffs and putative class members purchased their Alexa devices."
The plaintiffs, one of whom is a substance abuse counselor and another a healthcare customer service representative, said that recording their conversations with patients directly violates the Health Insurance Portability and Accountability Act (HIPAA), which they work under and which protects their patients' privacy.
They cited a study by Northeastern University, which demonstrated how an Alexa device can be woken by "wake words," phrases that trigger it to start recording and transmitting what it hears.
Asking the device something like "Alexa, is it going to rain?" wakes it so it can answer the question.
The study examined how often an Alexa might "wake up" accidentally. The researchers found that statements like "I care about," "I messed up," and "I got something" would trigger the device, and that even phrases like "head coach," "pickle," and "I'm sorry" woke it up.
An article from Northeastern University quoted David Choffnes, an associate professor of computer sciences at the university, saying, "A lot of us, when we think about being in our home, we think that's a private space where we can have conversations that are intended not to be shared. And now we have all these devices with microphones that could be taking those conversations and sharing them."
Multiple reports have also shown that Amazon uses human reviewers to listen to recorded conversations, which is the second aspect of the lawsuit raised by the plaintiffs.
The lawsuit noted that in 2019, Amazon announced an effort to ensure that transcripts of conversations, whether transcribed by AI or by humans, would be deleted from Alexa's servers, but argued that "By then, Amazon's analysts may have already listened to the recordings before that ability was enabled."
A spokesperson from Amazon shared a statement with Newsweek that read, "Alexa and Echo devices are designed to only detect your chosen wake word (Alexa, Amazon, Computer, or Echo). No audio is stored or sent to the cloud unless the device detects the wake word (or Alexa is activated by pressing a button)."
Amazon confirmed that whenever an Alexa device is recording, a blue light indicator appears, so the device always signals when it is capturing audio.
"Our annotation process does not associate voice recordings with any customer identifiable information," Amazon stated. "Customers can opt-out of having their voice recordings included in the fraction of one percent of voice recordings that get reviewed."
The company added, "Customers have several options to manage their recordings, including the option to not have their recordings saved at all and the ability to automatically delete recordings on an ongoing three- or 18-month basis."
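Taken together, Amazon's statement describes a simple gating architecture: audio is processed on the device, and nothing is stored or uploaded until the wake-word detector fires or a button is pressed, at which point the blue indicator lights and streaming begins. The Python sketch below illustrates that gate under those stated assumptions; every name in it is a hypothetical illustration rather than Amazon's actual code, and the crude keyword match stands in for the acoustic detector whose accidental triggers the Northeastern study measured.

```python
import string

# Hypothetical illustration of the on-device gate Amazon describes:
# audio frames stay local and are discarded unless a wake word is
# detected or a button is pressed. None of these names are Amazon's.
WAKE_WORDS = {"alexa", "amazon", "computer", "echo"}

def detects_wake_word(frame_text: str) -> bool:
    """Crude stand-in for the on-device keyword-spotting model."""
    words = (w.strip(string.punctuation) for w in frame_text.lower().split())
    return any(w in WAKE_WORDS for w in words)

def process_frame(frame_text: str, button_pressed: bool = False) -> str:
    """Decide the fate of one captured audio frame."""
    if button_pressed or detects_wake_word(frame_text):
        # Only now does the indicator light and audio leave the device.
        return f"blue light on -> streaming to cloud: {frame_text!r}"
    # No wake word, no button press: the frame is dropped locally.
    return "discarded on-device"

if __name__ == "__main__":
    print(process_frame("is it going to rain"))          # discarded on-device
    print(process_frame("Alexa, is it going to rain?"))  # streamed to cloud
    print(process_frame("I'm sorry"))                    # discarded here, though the study
                                                         # found real detectors misfire on it
```

In these terms, the accidental wake-ups the study documented are false positives of the detector: the gate only protects privacy to the extent that the wake-word model in front of it is accurate.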
"device" - Google News
July 03, 2021 at 04:01AM
https://ift.tt/3we4Fl4
Healthcare Workers Sue Amazon Over Potential HIPAA Violations With Alexa Device - Newsweek
"device" - Google News
https://ift.tt/2KSbrrl
https://ift.tt/2YsSbsy
Bagikan Berita Ini
0 Response to "Healthcare Workers Sue Amazon Over Potential HIPAA Violations With Alexa Device - Newsweek"
Post a Comment