I feel like everyone gave Google and Amazon way too hard a time for reviewing assistant queries, which doesn't seem that different from reviewing typed searches to Google. This, on the other hand, seems a little different, since most of these recordings weren't even user queries. The HIPAA violation alone seems pretty serious. I guess it really depends on the false trigger rate of each platform, though. https://arstechnica.com/gadgets/2019/07/siri-records-fights-doctors-appointments-and-sex-and-contractors-hear-it/
What’s the problem if these recordings cannot be tied to real people?
According to the article they have full name and approximate location. I know that would be identifiable for me.
I did not see that in the article. Also, even if it’s there, it’s not true. Apple jumps through a lot of hoops to protect its customers’ privacy.
There is NO HIPAA violation here. This is NOT how HIPAA works.
Can you explain? On the HIPAA website it says "The HIPAA Privacy Rule protects the privacy of individually identifiable health information...". It sounds like they were transcribing conversations with a physician without the knowing consent of a patient, and pairing that with names and locations. It does talk about covered entities; is that what you're referring to? That Apple is not a covered entity, so it doesn't have to comply? Even if that is the case, I would think the healthcare provider would be liable for letting such a device be present.