• Technus@lemmy.zip
    ·
    2 months ago

    I have a ~~theory~~ ~~hypothesis~~ notion that hallucination in artificial neural networks is not a failure mode unique to ANNs, but an inherent property of any neural network, artificial or biological.

    Essentially, I posit that a neural network by itself is incapable of maintaining coherence without a rigid external framework, such as consistent feedback in training an ANN, or the laws of physics for biologicals.

    This would explain why people start tripping balls in sensory deprivation chambers. And it provides a counterargument to any thought experiment or philosophy that involves a disembodied brain vividly hallucinating reality.
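    A toy sketch of the grounding argument (my construction, not from the comment): take a linear predictor of a sine wave whose "learned" coefficient is off by 1%. Re-anchored to real observations at every step, its error stays bounded by the one-step model error; run closed-loop on its own outputs, with no external signal to correct it, the same small error compounds without bound. The 1% miscalibration and the sine-wave task are assumptions chosen purely for illustration.

    ```python
    import math

    H = 0.1                    # time step of the true signal x(t) = sin(H * t)
    A_TRUE = 2 * math.cos(H)   # exact recurrence: sin((t+1)H) = A*sin(tH) - sin((t-1)H)
    A_MODEL = A_TRUE * 1.01    # "learned" coefficient with a hypothetical 1% error

    true = [math.sin(H * t) for t in range(202)]

    # Closed loop: the model feeds on its own outputs -- no external grounding.
    y = [true[0], true[1]]
    for t in range(1, 201):
        y.append(A_MODEL * y[t] - y[t - 1])
    closed_err = max(abs(y[t] - true[t]) for t in range(202))

    # Grounded: each prediction restarts from the true (observed) past values,
    # so the 1% coefficient error never gets a chance to compound.
    grounded_err = max(abs((A_MODEL * true[t] - true[t - 1]) - true[t + 1])
                       for t in range(1, 201))

    print(f"closed-loop max error: {closed_err:.3g}")
    print(f"grounded max error:    {grounded_err:.3g}")
    ```

    The closed-loop error blows up by many orders of magnitude while the grounded error stays around the size of the per-step miscalibration, which is the rough shape of the "rigid external framework" claim above.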

  • 10_0@lemmy.ml
    ·
    2 months ago

    Not enough connections due to degenerative disease, so he forgot his keys