This is why I fucking hate singularity cultist techbros. They convince the entire rest of society that AI is fake or that true AI is impossible or whatever by basically starting a religious cult around it.
This is harmful because AI is incredibly dangerous, and we need people to acknowledge that and start taking action to ensure it's developed safely — so that capabilities don't suddenly spike by 300% in one month and leave us with 30% unemployment, or a super-plague doesn't get released because ChatGPT-5 in 2026 told some idiot how to make flu viruses 10x more transmissible and 10x as deadly.
My worry isn't sapient AI; I genuinely do not care whether it's sapient. My worry is that in the short term it will enable people to commit bioterrorism and mass-produce high-quality propaganda, and in the longer term that its capabilities might increase to the point of being difficult to control.
This is exactly the shit I'm talking about: you seem to dismiss out of hand the entire idea that AI might outstrip human intelligence (and that this would likely be very bad). I think this is a mistake born from not being familiar enough with the field.