As a medical doctor I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure the process is soon intended to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the model created is not stored locally and I have no control over it whatsoever.
I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably enough other recordings of me on the Internet to recreate my voice, but that's beside the point. Also, my question is about educating them; it is not a legal one.
How do I present my case? I'm not willing to use a non-local AI to transcribe my voice. I don't want to be perceived as a paranoid nutcase. Preferably I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately they are totally ignorant of the field of technology, and the explanations/examples need to translate to the lay person.
Do your patients know that their information is being transcribed in the cloud, which means it could potentially be hacked, leaked, tracked, and sold? How does this foster a sense of distrust, and harm the patients' progress?
Could you raise this information, along with the possibility of being sued if information is leaked, with the bureaucrats?
I don't know where you live, but almost all big-tech US cloud is problematic (read: illegal to use) for storing or processing personal information under the GDPR if you're based in the EU. I don't know about HIPAA and other non-EU legislation, but almost all cloud services use US big tech as a subprocessor under the hood, which means that the use of AI and the cloud is most likely not GDPR-compliant. You could mention that to the right people and hope they listen.
Edit: It's illegal to use for the processing of patients' PII, because of transfers to insecure third countries and because big tech uses the data for its own purposes without any legal basis.
I agree, and I suspect this planned system might get scuttled before release due to legal problems. That's why I framed it in a non-legal way. I want my bosses to understand the privacy issue, both in this particular case and in future cases.
You don't have to use a cloud service to do AI transcription. You don't even need to use AI. Speech to text has been a thing for like 30+ years.
Also, AWS has a FedRAMP-authorized GovCloud that's almost certainly HIPAA (and its non-US counterparts) compliant.
Also also, there are plenty of cloud based services that are HIPAA compliant.
Stop using the digital voice recorder and type everything yourself. This is the best way to protect your voice print in this situation. It doesn't work well as a protest or to educate your colleagues, but I suppose that's one thing you can use your voice for. Since AI transcription is a cost saving measure, there will be nothing you can do to stop its use. No decision maker will choose the more expensive option with a higher error rate on morals alone.
Unfortunately the interface of the medical records system will be changed when this is implemented. The keyboard input method will be entirely removed.
I would have my workplace sign a legal waiver stating that, from the moment I use the technology, none of the recordings or transcriptions of me can be used to incriminate me in a case of alleged malpractice.
In fact, since both can be generated in a way that sounds very assertive while also introducing wild mistakes, in a potentially life-and-death situation, I would have them legally recognize that this could nullify my work, and have them take the entire legal responsibility for it.
As you can see in the recent example involving Air Canada, a policy was invented out of thin air, and that policy ended up costing the company. In the case of a doctor, if the wrong sedative or medication were administered, or the wrong diagnosis communicated to the patient, all of that could have serious consequences.
All of it sounding like you (using your phrasings and so on), and extremely assertive.
A human doing that job knows not to deviate from the recording. An AI? "Antihistaminic" and "antiasthmatic" aren't too far off, and that's just one example off the top of my head.
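To make that concrete for the non-technical folks: even in plain spelling, confusable medical terms overlap heavily, which is a rough proxy for how easily a transcription model can swap one for another. A small sketch using Python's standard library `difflib` (the drug pair "hydroxyzine"/"hydralazine" is a well-known look-alike/sound-alike example; the exact score is illustrative, not a claim about any particular transcription system):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a rough 0..1 similarity score between two strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Confusable term pairs: a misrecognition swaps one
# plausible-looking medical word for another.
pairs = [
    ("antihistaminic", "antiasthmatic"),
    ("hydroxyzine", "hydralazine"),
]

for a, b in pairs:
    print(f"{a} vs {b}: {similarity(a, b):.2f}")
```

Note this measures spelling overlap, not phonetics; the terms sound even closer than they look, which is exactly why a confident-sounding transcript needs a human check.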
Unfortunately a guy I know works for a gov hospital and they've used such technology for over a decade at this point. It seems unavoidable.
Simple jobs are going to continue to go away in favor of more efficient spending.
You’re not going to get around the removal of simple jobs from the market in favor of newer concepts and more complex operations.
All those people who said going to college to further your education was stupid and a waste of money are going to be the first to complain. The rest of us, who spent the time and money to better ourselves, would like to apply that same logic to the world, so you don't have to worry about things like underpaid fast-food workers spitting in your food or delivery drivers stealing it.
Some people who can only do “simple” tasks are the ones who stand the most to be hurt by the world moving forward and becoming more advanced and complex, but I’m not sure what we can do to help them outside of seriously considering UBI. The wealth we are generating and saving through automation deserves to be equally spread amongst the people it replaced. That’s fair.
So what's your concern? I'm a bit confused.
- Using cloud to process patient data? Or,
- Collecting your voice to train a model?
Personally, I'd be more worried about leaking patient information to an uncontrolled system than about having a voice model made.
That's another issue, and it doesn't lessen the importance of this one. Both are important but separate: one is about patient data, the other about my voice model. Also, in this case I have no control over the medical records, and they are already stored outside the hospital.
This is really weird. Is it common in other countries for doctors to not input the data in the system themselves?