Permanently Deleted

  • Wildgrapes [she/her]
    ·
    4 years ago

It's endlessly hilarious, the slippery slope argument. Oh no, my small vidja game company doesn't want to host explicit child sex stories on their servers. Welcome to 1984.

On the other hand, I've seen it said that the system basically flags anything with a number under 18 and "fuck" near each other. Which is hilarious.

    "10 years of war with the orcs. Fuck when will it end" opps that's pedophilia baby. Hilarious if true. Because tech companies want to do everything with algorithms that never work. Lol.

    • drhead [he/him]
      ·
      4 years ago

      I've tried it since the changes and I haven't run into any problems (it apparently has a unique message for triggering it, "Uh oh, this took a weird turn... Help us figure it out?" which can be quite funny when it triggers on something innocuous -- someone apparently had it show up for trying to buy 4 watermelons). But,

      spoiler

the game still has no trouble deciding that characters of unspecified age are 16, against my will. Which sadly,

      spoiler

      is a marginal improvement, because it USED to make characters far younger against my will. :agony-turbo:

      So overall, I don't think it is necessarily tuned right in either direction. But I have solidified my suspicion that it is largely hitting what it's supposed to target.

      • Wildgrapes [she/her]
        ·
        4 years ago

So basically the algorithm still doesn't work perfectly, though it's an improvement... but the gamers are mad because now they can only text-fuck 16-year-olds.

        • drhead [he/him]
          ·
          4 years ago

          From reading more about it, it appears it might be an A/B test thing, so some people might just be in the control group. I've also seen it claimed that the algorithm is basically:

          spoiler

          if text contains ("[n] year old" where n < 18, "little/young boy/girl", "child", "the dog", "the horse") and (any of a number of sex-related terms, which I won't list) then flag

          which obviously can cause problems if you're using the many other definitions of "fuck", or trying to ride a horse, for example. Contemplating this makes you really appreciate how versatile of a word "fuck" is, and how hard it is for an AI to comprehend such a linguistic enigma.
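If the claim above is anywhere near accurate, the rule is just a crude keyword conjunction. Here's a toy sketch of what such a filter might look like — every term list, threshold, and function name here is a guess for illustration, not the actual implementation:

```python
import re

# Hypothetical reconstruction of the *claimed* rule: flag when a
# "minor-ish" trigger and a sex-related term co-occur in the text.
# The term lists are illustrative stand-ins, not the real ones.
NUMBER = re.compile(r"\b(\d+)\b")
MINOR_TERMS = ("little boy", "little girl", "young boy", "young girl",
               "child", "the dog", "the horse")
SEX_TERMS = ("fuck",)  # stand-in for the unlisted sex-related terms

def flags(text: str) -> bool:
    low = text.lower()
    # Any bare number under 18 counts, which is where the false
    # positives come from ("10 years of war...").
    minor_hit = (any(t in low for t in MINOR_TERMS)
                 or any(int(m.group(1)) < 18 for m in NUMBER.finditer(low)))
    sex_hit = any(t in low for t in SEX_TERMS)
    return minor_hit and sex_hit

flags("10 years of war with the orcs. Fuck when will it end")  # flagged
flags("We rode north at dawn.")                                # clean
```

A conjunction of substring matches has no notion of context, so "fuck" as an expletive, "the horse" as a mount, or "10" as a duration all trip it just as readily as the content it's meant to catch.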

          So, the filter might still be incredibly crudely designed in ways that they should have known would cause problems... and also apparently goes a bit beyond the stated scope, which is kind of important to disclose if you want people to properly identify bugs.