• Feinsteins_Ghost [he/him]
    ·
    3 months ago

    for criminal prosecuting agencies

    fuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuck youuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuu

    • hexaflexagonbear [he/him]
      ·
      3 months ago

      Honestly, this is probably going to be a very expensive way to predict what the prosecutor's office's annual budget will be, and it won't work.

  • Vent@lemm.ee
    ·
    3 months ago

    "ChatGPT says you're guilty and Copilot wrote a warrant. You're coming with us to find out how long Bard says you'll rot in jail for."

  • Deadend [he/him]
    ·
    3 months ago

    AI does have a use in legal work, which is mostly churning through 100,000,000 pages of text from discovery so lawyers can find the 99,000 docs that may be relevant.

    And stenography aids.

    But generating text that's actually going to be looked at? It's garbage.

    • TechnoUnionTypeBeat [he/him, they/them]
      ·
      3 months ago

      Except that when a legal firm actually did this, it generated a completely fake precedent in which it accused (and "convicted") a real person of a crime they didn't commit, fabricating the details in its summary of the case.

      LLMs are designed to always give you what you ask for. If they can't find any information, they'll fabricate it, because they can't respond with a negative.

      • Deadend [he/him]
        ·
        3 months ago

        That’s what I said - generating text to be looked at is what LLMs are bad at.