People talk about social media algorithms as if they're something disconnected from the decisions of the companies that make and control them. "The Algorithm" is not making YT push shitty content onto your home page; YouTube is making that happen. It's a combination of ignoring certain trends and actively promoting others.

For starters, these companies made the algorithms, and they tweak them constantly. When Elsagate happened, YT made changes that reduced the amount of that very specific type of garbage being shown. When advertisers pull their ads, YouTube suddenly finds great influence over the recommendations. To me that proves they have the ability to control, in pretty fine detail, what their sites recommend.

It's been revealed that TikTok has a manual "heater" function that allows them to force certain videos to appear in recommendations. They use this to set the tone of the site, lure influencers, and make brand deals. That exposure causes heated channels to gain subscribers, further amplifying the effects.
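To make that concrete, here's a toy sketch of how a manual boost like that could sit on top of organic ranking. Everything in it (`Video`, `HEAT_MULTIPLIER`, `rank_feed`, the numbers) is made up for illustration; TikTok's actual internals aren't public.

```python
# Toy sketch of how a manual "heating" flag could sit on top of organic
# ranking. Every name and number here (Video, HEAT_MULTIPLIER, rank_feed)
# is invented for illustration; TikTok's real internals aren't public.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    organic_score: float   # score from engagement / watch-time models
    heated: bool = False   # manually flagged by staff

HEAT_MULTIPLIER = 5.0      # arbitrary boost applied to heated videos

def rank_feed(videos: list[Video]) -> list[Video]:
    """Order the feed by organic score, with heated videos boosted."""
    def score(v: Video) -> float:
        return v.organic_score * (HEAT_MULTIPLIER if v.heated else 1.0)
    return sorted(videos, key=score, reverse=True)

feed = rank_feed([
    Video("organic hit", organic_score=0.9),
    Video("brand-deal clip", organic_score=0.3, heated=True),
])
print([v.title for v in feed])  # the heated clip outranks the organic hit
```

The point isn't the specific mechanism, it's that a single flag flipped by staff can outweigh whatever the "organic" model says.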

YT Trending is manually chosen as well: 10 main videos, 10 gaming videos, and 10 shorts, updated every 15 minutes. When videos end up on the Trending page, they get more views, which gets them recommended even more. That gives YouTube a constant source of influence over the recommendations.
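Here's a similarly toy simulation of that feedback loop; the numbers and the linear growth model are invented purely to show how a curated placement compounds, not how YouTube actually computes anything.

```python
# Toy simulation of the Trending feedback loop: a curated slot adds views,
# and views feed back into how often the video gets recommended. The numbers
# and the linear growth model are made up purely to show the compounding.
def simulate_views(views: float, on_trending: bool, rounds: int = 5) -> float:
    for _ in range(rounds):
        views += 0.1 * views          # recommendation traffic scales with views
        if on_trending:
            views += 50_000           # extra exposure from the curated slot
    return views

print(f"{simulate_views(10_000, on_trending=False):,.0f}")  # ~16,105
print(f"{simulate_views(10_000, on_trending=True):,.0f}")   # ~321,360
```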

One mistake I see people make is assuming that recommendation algorithms are simply a reflection of the audience: "the algorithm is bad because we are bad." My counterpoint is that when the recommendations hurt the bottom line of the business, these companies change them. At the very least it's social media companies choosing not to fix bad recommendations, and at worst it's intentional manipulation. Sure, people choose to watch a lot of gross stuff, but let's not act like YouTube couldn't get rid of, for example, misogyny-for-children content (Andrew Tate etc.) quickly if they wanted to.

The other mistake is to treat it as a sentient creation that nobody has control over: "we're just chasing what the algorithm wants." It's one of the things tech bros dream of with regard to AI. They want to be able to put an algorithm in charge of the orphan-crushing machine and say, "Sorry, I don't know why the algorithm keeps choosing to crush the orphans."

TL;DR: The purpose of a system is what it does.

  • MayoPete [he/him, comrade/them] · 4 months ago

    What I don't get is why the CIA, or anyone for that matter, would want to push far-right content. I assume a government would rather have moderate, safe, neoliberal videos get the most attention.

    Far-right content radicalizes people into fascists, and if you get enough fascists in one place they tend to form paramilitaries and start trying to coup the government.

    • RyanGosling [none/use name] · 4 months ago

      Domestic fascism can be iffy for the CIA, but historically it's not something they actively aim for. You don't really see a pattern of them trying to get Juan Hitler Delgado into the White House like you see elsewhere. They don't care if foreign countries are far-right as long as they're loyal to the US, kill communists, and protect American profits.

      The FBI, however, has a more colorful history of killing communists and progressives and manipulating politics at home. Obviously the CIA does operate on US soil, but resources are allocated around each agency's stated role, and it's more efficient for the agencies to focus on different agendas.

    • egg1918 [she/her] · 4 months ago

      Because far-right politics are better for business. And by business, I mean violently overthrowing socialist governments, assassinations, all the good stuff those ghouls love.