:what-the-hell:

In the latest study, published in the journal Nature Neuroscience on Monday, scientists found that an AI system called a semantic decoder could translate a person’s brain activity into text as they listened to a story or imagined telling one.

The new tool relies partly on models similar to the ones that power the now-famous AI chatbots – OpenAI’s ChatGPT and Google’s Bard – to convey “the gist” of people’s thoughts by analysing their brain activity.

But unlike many previous attempts to read people’s minds, scientists said, the system does not require subjects to have surgical implants, making the process noninvasive.
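(As a rough sketch of how decoders like this are usually described: a language model proposes candidate wordings, and an encoding model fitted to the individual subject scores each candidate by how well its predicted fMRI response matches the measured one. Everything below, the stub functions, toy vocabulary, and scoring included, is an illustrative assumption, not the study’s actual code.)

    import numpy as np

    rng = np.random.default_rng(0)
    VOCAB = ["house", "door", "ran", "voice", "night", "road"]

    def lm_propose(prefix, k=4):
        # Stand-in for a language model suggesting likely continuations.
        return [prefix + [w] for w in rng.choice(VOCAB, size=k, replace=False)]

    def encoding_model(words):
        # Stand-in for a per-subject model mapping text features to a
        # predicted fMRI response; real versions are fit on hours of data.
        feats = np.array([sum(map(ord, w)) for w in words], dtype=float)
        return np.convolve(feats, np.ones(3) / 3.0, mode="same")

    def score(candidate, measured):
        # Correlate predicted and measured responses; higher is better.
        pred = encoding_model(candidate)
        n = min(len(pred), len(measured))
        if n < 2 or pred[:n].std() == 0:
            return 0.0
        return np.corrcoef(pred[:n], measured[:n])[0, 1]

    # Beam search: keep the candidate sequences whose *predicted* brain
    # response best matches the *measured* one. This is why the output
    # captures "the gist" rather than exact words.
    measured = rng.normal(size=20)   # stand-in for a recorded fMRI series
    beam = [[]]
    for _ in range(6):
        candidates = [c for seq in beam for c in lm_propose(seq)]
        beam = sorted(candidates, key=lambda c: score(c, measured), reverse=True)[:3]
    print("decoded gist:", " ".join(beam[0]))

(Because the encoding model is fitted per subject, a decoder trained on one person has no reason to transfer to another, which matches the “unintelligible” result mentioned below.)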

...

Addressing questions about the potential misuse of the technology, such as by authoritarian governments to spy on citizens, scientists noted that the AI worked only with cooperative participants who had willingly taken part in the decoder’s extensive training.

For individuals on whom the decoder had not been trained, they said the results were “unintelligible”.

  • 4_AOC_DMT [any] · 2 years ago

    For sure. Something scarier would be if you didn't need fMRI but could use some other modality. A researcher at the institution I worked at almost a decade ago had a grant for a project on reconstructing fMRI from ultrasound, but that requires direct contact too. I'm keeping my fingers crossed that the skull is thick enough, and 70 mV spikes small enough, that there's just inherently no good information recordable at a distance sufficient to reconstruct with high fidelity the neural signatures of language or other kinds of conscious thought.
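
    A crude back-of-envelope of the commenter's hope, in Python (every number here is an order-of-magnitude assumption, not a measurement):

        # Order-of-magnitude check: how much neural signal plausibly
        # survives to a remote sensor? All figures are assumptions.

        # The comment's 70 mV figure is a transmembrane potential; by the
        # time even massed synchronous activity reaches the scalp (EEG),
        # it is down to tens of microvolts.
        spike_mV = 70.0
        scalp_uV = 50.0
        print(f"membrane-to-scalp scale: ~{spike_mV * 1000 / scalp_uV:.0f}x")

        # Past the scalp, treat the source as roughly dipolar, so the
        # potential falls off about as 1/r^2 (a simplifying assumption).
        scalp_cm = 2.0        # rough cortex-to-scalp distance
        standoff_cm = 100.0   # hypothetical sensor 1 m away
        remote_uV = scalp_uV * (scalp_cm / standoff_cm) ** 2
        print(f"~{remote_uV:.3f} uV at 1 m")   # ~0.020 uV

    At that scale the signal sits orders of magnitude below everyday electromagnetic noise, which is roughly why passive electrical recording at a standoff looks so hard.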