Permanently Deleted

  • PorkrollPosadist [he/him, they/them] · 1 year ago

    We should assume that there are thousands (or more) of operations underway to manipulate public opinion and perception on the Internet at any given time. These range in sophistication from one crank with a handful of alts engaging with themselves, to elaborate, well-staffed operations. They also range in access: from outsiders brute-forcing CAPTCHAs to create bot accounts that try to evade heuristic detection, to insiders operating with the assistance of the platforms, with middle-of-the-road cases where a platform isn't assisting them but notices what they're doing and decides not to intervene.

    The term 'botting' itself is far too vague IMO. Many different tactics fall under this umbrella: shilling (which doesn't even require bots), vote/engagement manipulation (100% automated), and psychological operations (e.g. staging a debate between two apparently unrelated users in a controversial thread to construct a strawman and incinerate it), which may or may not be backed by engagement manipulation to boost visibility. The tactics also vary a great deal from platform to platform. On Twitter you can get something trending by having a bunch of fake accounts tweet it, but that leaves evidence. On Reddit, by contrast, you can manipulate the feed with complete anonymity (to the public, anyway) if you have enough fake accounts to throw upvotes and downvotes around. Finally, the platforms themselves manipulate what people see, and employ tons of automation in their moderation strategies.
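
    For a sense of why vote manipulation works so well: Reddit's ranking code used to be open source, and the old 'hot' formula from that public codebase shows what a handful of early fake votes buys you. A rough sketch in Python (the vote counts here are invented for illustration):

        # Reddit's old open-sourced "hot" ranking. One unit of log-score
        # is worth 45000 seconds (12.5 hours) of post age.
        from datetime import datetime, timezone
        from math import log10

        REDDIT_EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

        def hot(ups: int, downs: int, posted: datetime) -> float:
            score = ups - downs
            order = log10(max(abs(score), 1))
            sign = 1 if score > 0 else -1 if score < 0 else 0
            seconds = (posted - REDDIT_EPOCH).total_seconds()
            return round(sign * order + seconds / 45000, 7)

        now = datetime.now(timezone.utc)
        organic = hot(5, 1, now)    # a normal post an hour in
        botted = hot(25, 1, now)    # the same post plus 20 fake upvotes
        # log10(24) - log10(4) ~= 0.78 units, i.e. the botted post ranks as
        # if it were submitted ~10 hours ahead of the organic one.
        print(botted - organic)

    Twenty fake votes buying a ten-hour head start is often the difference between dying in /new and snowballing.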

    • Einstein · 1 year ago

      deleted by creator

  • logflume [they/them] · 1 year ago

    Bot farms are extremely common but, at the same time, very unsophisticated. LLM technology (like ChatGPT) is still nascent enough that bots aren't really using it yet. Most bot farms are dedicated to those shitty YouTube comments about making $500 a day working from home. The technology isn't there yet to do anything beyond pumping up metrics, which is to say bots are only used to signal-boost, not to change opinions outright.
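
    To give a flavor of how unsophisticated this is: templated spam like that falls to the simplest heuristics. A toy near-duplicate detector (the spam text and threshold are invented for illustration):

        # Flag comments that share most of their word 3-grams with known
        # spam, via Jaccard similarity over shingles.
        import re

        def shingles(text: str, n: int = 3) -> set[tuple[str, ...]]:
            words = re.findall(r"[a-z0-9$]+", text.lower())
            return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

        def jaccard(a: set, b: set) -> float:
            union = a | b
            return len(a & b) / len(union) if union else 0.0

        KNOWN_SPAM = shingles("i make $500 a day working from home ask me how")

        def looks_templated(comment: str, threshold: float = 0.5) -> bool:
            return jaccard(shingles(comment), KNOWN_SPAM) >= threshold

        print(looks_templated("I make $500 a day working from home, DM me how!"))  # True
        print(looks_templated("this song got me through finals week"))             # False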

    Astroturfing, as you've noted, is also extremely common, though it's harder to measure since astroturfers at least attempt to be slightly covert. It's usually pretty obvious when you look at the content and then compare it to the engagement. This stuff is still fairly manual, and even as a software engineer I have no idea how it works behind the scenes. Bot farms I do have a bit of expertise in.
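
    That content-vs-engagement smell test can even be crudely automated, by the way. A toy version, with every number made up:

        # Flag posts whose upvotes are wildly out of line with the
        # discussion they generate, relative to a community baseline.
        def engagement_ratio(upvotes: int, comments: int) -> float:
            return upvotes / max(comments, 1)

        BASELINE = 8.0  # pretend this community averages 8 upvotes per comment

        def looks_boosted(upvotes: int, comments: int, tolerance: float = 5.0) -> bool:
            return engagement_ratio(upvotes, comments) > BASELINE * tolerance

        print(looks_boosted(4200, 12))  # True: 350 upvotes per comment
        print(looks_boosted(400, 55))   # False: ~7 per comment, near baseline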

    BUT these two tools are obviously used in tandem: a few "marketers"/"propagandists"/"influencers" make astroturfed posts, then someone hits the switch on the bot farms to signal-boost them.

    • Einstein · 1 year ago

      deleted by creator

  • Wheaties [she/her] · 1 year ago

    I'm sure it happens, but it should never be your first assumption. It's a big world, and there are a lot of people.

    Lots of people are really into k-pop bands. Which seems more likely: that people post about the music they like, or that someone is putting in hours of work to make a band seem marginally more popular than it is? Running an automated poster doesn't have much in the way of returns. It really only makes sense if you're running a scam and trying to reach as many eyes as possible.
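
    You can put rough numbers on that intuition with Bayes' rule. Every prior here is invented, but the shape of the answer holds:

        # If fake accounts are rare, almost every post about the band is
        # from a real fan, even if bots post about it constantly.
        p_bot = 0.01              # assume 1% of accounts in the space are fake
        p_post_given_bot = 0.9    # bots post about the band all the time
        p_post_given_fan = 0.3    # plenty of real fans post about it too

        p_post = p_bot * p_post_given_bot + (1 - p_bot) * p_post_given_fan
        p_bot_given_post = p_bot * p_post_given_bot / p_post
        print(f"{p_bot_given_post:.1%}")  # ~2.9%: "bot" is a bad first guess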

    As for Twitter, there was (and is) a lot of stupid money sloshing around Silicon Valley. They got a lot of money from venture capitalists hoping for a big return.

    • Einstein · 1 year ago

      deleted by creator

  • Yurt_Owl · 1 year ago

    I think the bots in the farms should unionise

    • Einstein · 1 year ago

      deleted by creator

  • Comp4 [comrade/them] · 1 year ago

    I have no idea how, but I've gained over 100 followers on Twitter (and the trend is rising). I have a feeling that most of them are bots. I don't post much, and my content isn't particularly unique. I mainly follow accounts related to both adult content and politics, and I suspect that these areas attract a significant number of bots.
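
    The usual rules of thumb for spotting them are crude but telling. A hypothetical scorer over public profile signals (the weights and thresholds are invented, and real detection is much harder):

        import re
        from dataclasses import dataclass

        @dataclass
        class Account:
            handle: str
            followers: int
            following: int
            tweets: int
            age_days: int

        def bot_score(a: Account) -> int:
            score = 0
            if re.search(r"\d{6,}$", a.handle):         # auto-generated handle suffix
                score += 1
            if a.age_days < 30:                         # brand-new account
                score += 1
            if a.following > 20 * max(a.followers, 1):  # follows far more than followed
                score += 1
            if a.tweets == 0:                           # never posts, only follows
                score += 1
            return score  # 0-4; treat, say, 3+ as suspicious

        print(bot_score(Account("user84629173", 2, 4100, 0, 5)))  # 4 -> suspicious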

    I do believe that bots are a real issue, but measuring their impact and scale is quite challenging. I also believe that troll farms can influence public opinion. To be frank, if I were in charge of a country, I would establish a dedicated department for memetic warfare.

    • Einstein · 1 year ago

      deleted by creator