I noticed this on the visit note summary generated after a recent Medstar visit:

“Documentation on this patient encounter was supported using ambient scribe technologies/voice AI technology. The patient consented to recording for the purpose of documenting the encounter. Provider reviewed content of the generated note prior to signature.”

The problem is that the provider did not ask my permission to record the visit for purposes of AI scribing. Even worse, the provider did ask me for permission several visits ago, and I explicitly declined, in part because the provider could not tell me what AI company was being used, whether the underlying transcript from which the visit note was generated would be retained as part of my medical record, what the AI company’s retention policy would be for my record in its own systems and, if it was retained, whether my record would be de-identified, used for AI training, sold to other companies, or otherwise discoverable by third parties. I was also concerned that there was no explicit written request to make my records available to the AI technology, as there would be for a HIPAA privacy release. OFC, the doctor couldn’t answer any of those questions. At the time he said that Medstar was mandating the use of this AI unless patients declined, and that he didn’t like visit recording either.

Ironically, the AI seems to have missed the discussion of the main reason I had the visit in the first place. Instead, it captured something that was essentially a last request/afterthought as the visit was ending and recast the entire visit as if that last request were the main purpose.

So I’m curious… Have others been asked for AI recording/scribe consent? Have you consented or declined? Did you find you’d been recorded without your consent? Are other providers like Johns Hopkins doing this AI recording/scribe thing? What do people think about this in terms of privacy? TBH, I’m really uncomfortable with it, especially in this post-Dobbs, current Trump Admin world.
RFK has expressed interest in having the USG buy medical records to create an autism registry. DOGE is breaking down the walls between what had previously been pretty separate information troves (IRS, Social Security, student loans). Rumors are that ICE has bought auto insurance records to use for deportation purposes. And the administration is going after LGBTQ people. The last thing I want is for my private medical visits to be recorded. Thoughts? |
I have had two different providers use AI: one JH family medicine provider at the North Bethesda location, one not local. I know the non-local provider asked my permission. At JH, my memory is it was more of a heads-up that it was being used.
I get your concerns about privacy and the fact that all of this is generating data that could be used against us. At the same time, I understand my docs’ desire to be more efficient. It felt like a “value add” for one visit where the doc actually focused on me. The JH visit still involved my provider staring at her computer and typing away the entire time, so I’m not sure what the AI offered. |
Prepare for even more errors in your records. Just great! |
I’m at a different provider and my doctor has been asking for permission to use this AI scribe whenever I see her, and I always say no. |
I wouldn't be too concerned about privacy, tbh. If you look at my google search history, you will find out about all of my health issues (I have several rare diseases).
I would be concerned about accuracy, or the lack thereof, which you already noticed. Send a message over the portal noting the error so there is a record somewhere. And complain: if they receive enough complaints, maybe they'll rethink the idea. |
We use otter at work and it’s always riddled with errors. |
For me it might be better. I’ve had doctors write completely wrong stuff in my records, and the lack of effort was clear. |
At this stage in AI development, I would send a short, succinct note to someone very high-level, like a VP of Privacy or Customer Experience, letting them know that these systems are annoying and make errors. AI is being pushed on everybody. It saves time, but it can miss the point. Then again, harried doctors also miss the point. |
No way I would consent to that for so many reasons. |
JH has signs advising you that it’s being used. I have never been asked for permission that I recall. |
Report for a HIPAA violation and find a new provider. |
If it keeps doctors from staring at a computer screen for the entire appointment, I’m all for it. The systems have to be HIPAA compliant. |
The company I work for tests the HIPAA compliance of software used in medical settings. We're a neutral third party that many big hospital groups and medical practices use. You'd be shocked at how many errors and how much noncompliance we find. It's also why we recommend that software be tested twice a year, if not more often when it's frequently updated by the vendor, because bugs and cracks develop between releases. We also do other pen testing, and again, it is rare for us NOT to find a security gap. |
This seems great to me; it makes no sense that a doctor has to spend time telling me stuff and then type the same thing in later. |
I just imagine this person yelling about HIPAA into a blue postal box. |