The-podcast guy recently linked this essay. It's old, but I don't think it's significantly wrong (despite GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin.

  • Tomorrow_Farewell [any, they/them]
    ·
    6 months ago

    Again, though, this simply works to reinforce the computer analogy, considering stuff like file formats. You also have to concede that a conventional computer that stores the poem as a .bmp file isn't going to tell you what the 300th word in it is (again, not without tools like text recognition), just like a human generally isn't going to be able to - at least not in the sort of timespan you have in mind. It's perfectly possible, and quite probable, that a person who has memorised the poem could tell you what the 300th word is; it would just take a bit of time.
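
    (A toy sketch of mine to make that concrete - the file names are hypothetical, and only the text format makes 'the 300th word' directly addressable:)

    ```python
    # The same poem in two storage formats (poem.txt / poem.bmp are made-up names).
    with open("poem.txt", encoding="utf-8") as f:
        words = f.read().split()
    print(words[299])            # the 300th word: a trivial index into the text

    with open("poem.bmp", "rb") as f:
        pixels = f.read()
    print(len(pixels), "bytes")  # just pixel data; there is no words[299] in here -
                                 # you would need a text-recognition (OCR) pass first
    ```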

    Again, we can also remember different things about different objects, just like conventional computers can store files of different formats.
    A software engineer might see something like 'O(x)' and immediately think 'oh, this is likely a fast algorithm', remembering the connection between the time complexity of algorithms and big-O notation. Meanwhile, what immediately comes to mind for me is 'what filter base are we talking about?', because I remember that the classes of finally relatively bounded functions differ between filter bases. Or, a better example: take two people who have both played, say, Starcraft. One of them might tell you that some building costs this amount of resources, while the other one won't be able to tell you that, but will be able to tell you that they can usually afford it by such-and-such point in the game.
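
    (To spell out the O(x) aside - my notation, taking two of the standard filter bases as an example:)

    ```latex
    % The same symbol O(x) names different classes of functions,
    % depending on the filter base it is taken over.
    f \in O(x) \text{ as } x \to \infty \iff \exists\, C, M > 0 : |f(x)| \le C|x| \text{ for all } x > M
    % e.g. \ln x and the constant 5 both qualify here;
    f \in O(x) \text{ as } x \to 0 \iff \exists\, C, \delta > 0 : |f(x)| \le C|x| \text{ for all } 0 < |x| < \delta
    % e.g. x^2 and \sin x qualify here, but the constant 5 does not.
    ```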

    Also, if you are going to point out that a computer can't tell whether a particular image is a 'wide shot of nature' or a 'reflection of the light in the eyes of one's loved one', you will have to contend with the fact that image recognition software exists and can, in fact, be trained to tell such things in a lot of cases, while many people will struggle to tell you the relevant information. In particular, a person with severe face blindness might not be able to tell you which person a particular image is supposed to depict.

    • plinky [he/him]
      hexagon
      ·
      6 months ago

      I'm talking about visual memory - what you see when you recall it - not about image recognition. Computers could recognize faces 30 years ago.

      I'm suggesting that it's not linked lists, or images, or sounds, or bytes in some way, but rather something closer to persistent hallucinations of self-referential neural networks upon specified input (whether cognitive or otherwise) - hallucinations which also mutate in place, both by themselves and through recall, yet not completely wildly. That is a picture rather far removed from memory as an engraving on a stone tablet/leather/magnetic tape/optical storage/logic gates in RAM. 'Memory is like a growing tree or an old house' is not exactly the most helpful metaphor, but it's probably closer to what memory does than a linked list.
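
      (Roughly the kind of thing I mean, as a toy sketch of my own: a tiny Hopfield-style network that 'recalls' by settling into an attractor when poked with a cue, and whose stored weights drift a little every time that happens.)

      ```python
      # Toy illustration, not a model of the brain: memories live in the coupling
      # weights of a self-referential network, recall is the network settling into
      # an attractor given a cue, and every recall nudges the weights in place.
      import numpy as np

      rng = np.random.default_rng(0)
      N = 64                                        # neurons
      patterns = rng.choice([-1, 1], size=(3, N))   # three stored "memories"

      # Hebbian storage: no addressed cells, just couplings between neurons
      W = sum(np.outer(p, p) for p in patterns) / N
      np.fill_diagonal(W, 0)

      def recall(cue, steps=20, eta=0.01):
          """Settle from a noisy cue into an attractor, then let the act of
          recall itself slightly rewrite the weights (memory mutates by use)."""
          global W
          s = cue.astype(float)
          for _ in range(steps):
              s = np.sign(W @ s)
              s[s == 0] = 1.0
          W += eta * np.outer(s, s) / N             # recall reshapes the landscape
          np.fill_diagonal(W, 0)
          return s

      # Cue the net with a corrupted copy of pattern 0; it "hallucinates" the rest
      cue = patterns[0] * np.where(rng.random(N) < 0.2, -1, 1)
      print("overlap with the original:", recall(cue) @ patterns[0] / N)
      ```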

      • Tomorrow_Farewell [any, they/them]
        ·
        6 months ago

        I'm talking about visual memory - what you see when you recall it - not about image recognition

        What is 'visual memory', then?
        Also, on what grounds are you going to claim that a computer can't have 'visual memory'?
        And why is image recognition suddenly irrelevant here?

        So far, this seems rather arbitrary.
        Also, as far as I can tell, people do not usually keep a memory of an image of a poem when they memorise it, so this pivot to 'visual memory' seems irrelevant to what you were saying previously.

        I'm suggesting that it's not linked lists, or images, or sounds, or bytes in some way, but rather something closer to persistent hallucinations of self-referential neural networks upon specified input

        So, what's the difference?

        hallucinations which also mutate in place, both by themselves and through recall, yet not completely wildly

        And? I can just as well point out that hard drives and SSDs do suffer from data corruption over time, and that a computer can be designed so that its memory gets changed every time it is accessed. Now what?
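
        (A quick toy illustration of that last point - storage that gets rewritten by the very act of reading it:)

        ```python
        # Toy sketch: each read overwrites the cell with a slightly noisy copy,
        # so repeatedly "recalling" a value gradually changes what is stored.
        import random

        class MutatingMemory:
            def __init__(self, values, noise=0.01):
                self._cells = list(values)
                self._noise = noise

            def __getitem__(self, i):
                value = self._cells[i]
                self._cells[i] = value + random.gauss(0.0, self._noise)
                return value

        m = MutatingMemory([1.0, 2.0, 3.0])
        for _ in range(5):
            m[0]
        print(m._cells[0])   # no longer exactly 1.0 after being "recalled" a few times
        ```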

        'Memory is like a growing tree or an old house' is not exactly the most helpful metaphor, but it's probably closer to what memory does than a linked list

        Things that are literally called 'biological computers' are a thing. While not all of them feature the ability to 'grow' memory, it should be pretty clear that computers have this capability.

        • plinky [he/him]
          hexagon
          ·
          6 months ago

          What is visual memory in the informational analogy, then? Do tell me. Does it have a consistent or persistent size, a shape, or anything resembling a .bmp file?

          The difference is that neural networks are bolted onto structures, not information.

          • Tomorrow_Farewell [any, they/them]
            ·
            6 months ago

            What is visual memory in the informational analogy, then? Do tell me.

            It's not considered to be some special type of memory in this context. Unless you can make a case to the contrary, this stuff is irrelevant.

            Does it have a consistent or persistent size, a shape, or anything resembling a .bmp file?

            That depends on the particular analogy.
            In any case, this question seems irrelevant and rather silly. Is the force of gravitational pull in Newtonian models constant? Does it have a shape? Is it a real number, a vector in R^2, a vector in R^3, a vector in R^4, or some other sort of tensor? Obviously, that depends on the relevant context regarding those models.
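
            (Spelling that out, for instance: the 'same' Newtonian gravitational pull shows up as different mathematical objects depending on what the model needs.)

            ```latex
            % as a scalar magnitude, when only the strength of the pull matters:
            F = \frac{G m_1 m_2}{r^2}
            % as a vector in \mathbb{R}^3, when the direction matters too:
            \vec{F}_{12} = -\frac{G m_1 m_2}{\lVert \vec{r}_{12} \rVert^{3}}\, \vec{r}_{12}
            ```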

            Also, in what sense would a memory have a 'shape' in any relevant analogy?

            The difference is that neural networks are bolted onto structures, not information

            Obviously, this sentence makes no sense if it is considered literally. So, you have to explain what you mean by that.