• AppelTrad [she/her]
    1 year ago

    Describing the perception of longer wavelengths as "seeing" might be a bit of a stretch. When I was taught about microscopes, it was explained that resolving finer details requires shorter wavelengths of light, so radio waves would give unhelpfully blurry vision.
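    A quick back-of-envelope sketch of that resolution argument, using the Abbe diffraction limit (d ≈ λ / 2NA; the numerical aperture value here is just an assumption for illustration):

    ```python
    # Abbe diffraction limit: smallest resolvable feature is roughly
    # d = wavelength / (2 * NA). Numbers below are illustrative only.
    def abbe_limit_m(wavelength_m: float, numerical_aperture: float = 1.0) -> float:
        """Approximate smallest resolvable feature size, in metres."""
        return wavelength_m / (2 * numerical_aperture)

    # Green light (~550 nm): sub-micron detail is resolvable.
    print(abbe_limit_m(550e-9))  # ~2.75e-07 m
    # A 3 m radio wave: nothing smaller than ~1.5 m can be resolved,
    # so everyday objects would smear together.
    print(abbe_limit_m(3.0))     # 1.5 m
    ```

    Whatever the exact optics, the scaling is the point: the blur floor grows linearly with wavelength.
    
    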

    Also worth remembering that we're already sensitive to some infrared, which we experience as warmth rather than colour. If new sensing mechanisms are used to respond to other wavelengths of the EM spectrum, and if the brain is able to develop* so that it can integrate and process those inputs, then I imagine they'd be associated with experiences outside the familiar rainbow, just as warmth is.

    But if you're just going to somehow redefine the sensitivity† of our retinal pigments, then our brains won't know that, and will process the signals they receive just the same, even if the photons initiating those signals are very different. This is how things like cochlear implants can be useful, after all. What I'm curious about now is: what happens when the chromatic appearance of familiar objects doesn't match memories? Over time, will the brain try to synthesize an experience that links the old and new perceptions?

    * Not just in infancy, but evolutionarily, since the visual cortex is going to need to work a whole lot harder. In the absence of those developments, I'd expect new stimuli, at best, to be arbitrarily mapped onto the processing regions for other stimuli, producing a kind of phantom colour experience.

    † We sort of do this when we look at false-colour images, and they don't contain new colours; the colours we already know are just shifted and spread across different parts of the EM spectrum.