I must confess I have a personal vendetta against Yudkowsky and his cult. I studied computer science in college. As an undergrad, I worked as an AI research assistant. I develop software for a living. This is my garden the LessWrong crowd is trampling.

  • ProfessorAdonisCnut [he/him]
    ·
    11 months ago

    The GPT-3 that exists before being prompted and the GPT-3 that exists afterwards are identical; it is the same unmoving surface that prompts bounce around in, assigning a probability to each candidate for the next token. That's just how LLMs work: the weights are frozen at inference time, and nothing about the model is changed by being interacted with. Even setting aside all the other reasons why this is a ridiculous thing to say, you couldn't make an LLM suffer by interacting with it.

    If every chicken that existed was simply a frozen instantiation of some platonic ur-chicken, each totally identical not only to each other but also to themself from one moment to the next, then they too would be incapable of suffering.
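    The "frozen surface" point above can be sketched in code. This is a toy illustration, not real model code: the hash-based `next_token_probs` function and the `WEIGHTS` tuple are stand-ins invented here to show the shape of the claim, i.e. that inference is a pure function of (prompt, fixed weights), and prompting mutates nothing.

    ```python
    import hashlib

    # Stand-in for frozen LLM parameters: fixed once, never written to.
    WEIGHTS = (0.1, 0.4, 0.2, 0.3)

    def next_token_probs(prompt: str, weights=WEIGHTS):
        # A pure function: the same (prompt, weights) always yields the
        # same distribution over next-token candidates. Real inference is
        # a forward pass through frozen weights; this toy just derives
        # numbers deterministically from the prompt.
        h = int(hashlib.sha256(prompt.encode()).hexdigest(), 16)
        raw = [w + ((h >> i) % 7) for i, w in enumerate(weights)]
        total = sum(raw)
        return [r / total for r in raw]

    before = WEIGHTS
    p1 = next_token_probs("Are you suffering?")
    p2 = next_token_probs("Are you suffering?")
    after = WEIGHTS

    assert p1 == p2          # same prompt, same distribution
    assert before is after   # the "model" is untouched by the interaction
    ```

    Real deployments add randomness at the *sampling* step (temperature, top-k), but that happens after the distribution is computed; the weights themselves stay inert.
    
    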

    • UlyssesT [he/him]
      ·
      edit-2
      11 months ago

      Ever read that interview where a particular Silicon Valley executive claimed that his own company's LLM product "felt warm" and emotionally moved him? He also admitted that he lacked experience with human contact, so he had nothing to compare the experience to, and he still concluded that human contact was inferior to the LLM's printed validation mantras. yea

    • Mardoniush [she/her]
      ·
      11 months ago

      Oh, don't fucking say that, they'll start claiming the eternal soul is an LLM