Bloobish [comrade/them] to news • 2 years ago
Oh no, who could have expected this outcome!?
85 points · 18 comments
GreenTeaRedFlag [any] · 27 points · 2 years ago: it has zero concept, period. These things don't think, they're just nested if/then statements with a grammar checklist.
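For illustration, a minimal sketch of the kind of "nested if/then statements with a grammar checklist" bot being described here; the function name and keyword rules are made up for the example, not taken from the bot in the story:

```python
# Hypothetical rule-based chatbot: nested if/then branches over keywords,
# with a canned fallback reply. No model, no learning, no understanding.

def scripted_reply(message: str) -> str:
    text = message.lower()
    if "refund" in text:
        if "order" in text:
            return "I can help with that. What is your order number?"
        else:
            return "Refunds are handled per order. Which order is this about?"
    elif "hello" in text or "hi" in text:
        return "Hello! How can I help you today?"
    else:
        # Fallback when no rule matches.
        return "Sorry, I didn't quite get that. Could you rephrase?"

if __name__ == "__main__":
    print(scripted_reply("Hi, I want a refund on my order"))
```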
GrouchyGrouse [he/him] · 14 points · 2 years ago: That's true. I'd rather talk to my insomnia hallucinations.
GreenTeaRedFlag [any] · 12 points · 2 years ago: That, on some level, comes from a human mind, capable of imagination and feeling. The chatbot comes from a software engineer.
Owl [he/him] · 9 points · 2 years ago: LLMs (latest AI fad :kelly:) are remarkably low on nested if/then statements. They're mostly matrix multiplications instead.
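For contrast, a toy sketch of what "mostly matrix multiplications" means: one transformer-style block in NumPy, with made-up shapes and random weights rather than anything from a real model. There is no per-input branching in the computation itself:

```python
# Toy transformer-style block: the work is a handful of matrix multiplies.
import numpy as np

d_model, seq_len = 64, 8
rng = np.random.default_rng(0)

x = rng.standard_normal((seq_len, d_model))          # token embeddings (illustrative)
W_q, W_k, W_v, W_o = (rng.standard_normal((d_model, d_model)) for _ in range(4))
W_up = rng.standard_normal((d_model, 4 * d_model))   # feed-forward weights
W_down = rng.standard_normal((4 * d_model, d_model))

def softmax(a, axis=-1):
    a = a - a.max(axis=axis, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=axis, keepdims=True)

# Self-attention: four matmuls plus a softmax, no if/then over the input.
q, k, v = x @ W_q, x @ W_k, x @ W_v
attn = softmax(q @ k.T / np.sqrt(d_model)) @ v
x = x + attn @ W_o

# Feed-forward network: two more matmuls with a nonlinearity in between.
x = x + np.maximum(x @ W_up, 0.0) @ W_down

print(x.shape)  # (8, 64)
```

A real LLM just stacks many layers of this shape, with learned weights instead of random ones.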
ProfessorAdonisCnut [he/him] · 3 points · 2 years ago: The one in the story isn't an LLM bot though, it's a manually scripted one.
deleted by creator
Whoah that's rare these days, I just assumed