As a medical doctor, I make extensive use of digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure, the process is soon to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the model created is not stored locally, and I have no control over it whatsoever.
I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course, I understand that there are probably other recordings of me on the Internet, enough to recreate my voice, but that's beside the point. Also, this question is about educating my colleagues, not a legal one.
How do I present my case? I'm not willing to let a non-local AI transcribe my voice, but I don't want to be perceived as a paranoid nutcase. Preferably, I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately, they are totally ignorant of the field of technology, and the explanation/examples need to translate for the layperson.
Simple jobs are going to continue to go away in favor of more efficient spending.
You’re not going to get around the removal of simple jobs from the market in favor of newer concepts and more complex operations.
All these people who said going to college to further your education was stupid and a waste of money are going to be the first to bitch and moan. Meanwhile, those of us who spent the time and money to better ourselves would like to apply that same logic to the world, so you don’t have to worry about things like underpaid fast food workers spitting in your food, delivery drivers stealing your food, etc.
Some people who can only do “simple” tasks stand to be hurt the most by the world moving forward and becoming more advanced and complex, but I’m not sure what we can do to help them outside of seriously considering UBI. The wealth we are generating and saving through automation deserves to be spread equally amongst the people it replaced. That’s fair.