• sovietknuckles [they/them]
    ·
    1 year ago

    The deal includes [...] and protections for the use of artificial intelligence in the writing process. Per the guild’s agreement:

    • AI can’t write or rewrite literary material, and AI-generated material will not be considered source material under the MBA, meaning that AI-generated material can’t be used to undermine a writer’s credit or separated rights.

    • A writer can choose to use AI when performing writing services, if the company consents and provided that the writer follows applicable company policies, but the company can’t require the writer to use AI software (e.g., ChatGPT) when performing writing services.

    • The Company must disclose to the writer if any materials given to the writer have been generated by AI or incorporate AI-generated material.

    • The WGA reserves the right to assert that exploitation of writers’ material to train AI is prohibited by MBA or other law.

    This comes after countless funny and unsuccessful attempts to scab using ChatGPT.

    • drhead [he/him]
      ·
      1 year ago

      So writers can use AI tools if they want to and the company is okay with it, but they can't be forced to work with AI, and if they do use it, the company can't get out of paying writers by claiming part of the material was written by an LLM, because anything AI-written can't be counted as source material. Am I understanding this correctly?

      I've heard from writers who say they like using AI language models as idea generators, and from others who have been forced to use them and found it takes more effort to correct the output into something usable than to just rewrite from scratch. So if I'm understanding this right, this sounds like it should keep either kind of writer happy and protected.

      • kristina [she/her]
        ·
        1 year ago

        Yeah, the AI tools are legitimately useful. I use them for exploring research ideas and rephrasing search queries in foreign languages.

        They are often wrong on numbers and facts, but occasionally they spit out something interesting that can be followed up on.