People talk about social media algorithms as if they're something disconnected from the decisions of the companies that make and control them. "The Algorithm" is not pushing shitty content onto your YouTube home page; YouTube is making that happen. It's a combination of ignoring certain trends and actively promoting others.

For starters, these companies made the algorithms and they tweak them constantly. When Elsagate happened, YT made changes that reduced the amount of that very specific type of garbage being shown. When advertisers stop advertising, YouTube suddenly finds plenty of influence over the recommendations. That, to me, proves they have the ability to control in pretty fine detail what their sites recommend.

It's been revealed that TikTok has a manual "heating" function that lets staff force certain videos into recommendations. They use this to set the tone of the site, lure influencers, and make brand deals. That exposure causes heated accounts to gain followers, further amplifying the effect.
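The reporting doesn't describe how heating is implemented, but conceptually it only takes a tiny override on top of whatever organic ranking already exists. Here's a minimal sketch; every name in it (HEATED_VIDEOS, organic_score, the example fields) is invented for illustration, not anything from TikTok:

```python
# Hypothetical sketch of a manual "heating" override layered on top of
# organic ranking. None of these names come from TikTok; the point is
# only how little code such a lever requires.

HEATED_VIDEOS = {"brand_deal_clip_123": 5.0}  # video_id -> manual boost multiplier

def organic_score(video, user):
    # stand-in for whatever engagement-prediction model actually runs
    return video["predicted_watch_time"] * user["affinity"].get(video["topic"], 0.1)

def final_score(video, user):
    # heated videos get their score multiplied; everything else is unchanged
    boost = HEATED_VIDEOS.get(video["id"], 1.0)
    return organic_score(video, user) * boost

def recommend(candidates, user, k=10):
    # with a big enough boost, a heated video outranks organically better ones
    return sorted(candidates, key=lambda v: final_score(v, user), reverse=True)[:k]
```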

YT trending is manually curated as well: 10 main videos, 10 gaming videos, and 10 shorts, updated every 15 minutes. When videos land on the trending page they get more views, which in turn gets them recommended even more. This gives YouTube a constant source of influence over the recommendations.
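To make that feedback loop concrete, here's a toy simulation (all numbers invented): two identical videos, one of which gets a brief manual trending placement. Because views feed back into recommendation traffic, the head start keeps compounding after the placement ends:

```python
# Toy simulation of the trending -> views -> recommendations loop.
# All numbers are made up; the only point is that a short editorial
# placement keeps paying off after it ends.

def simulate(rounds=10, trending_rounds=2, trending_bonus=50_000):
    views_a, views_b = 10_000, 10_000  # two otherwise identical videos
    for r in range(rounds):
        # recommendation traffic split in proportion to each video's share of views so far
        total = views_a + views_b
        views_a += int(100_000 * views_a / total)
        views_b += int(100_000 * views_b / total)
        # video A also gets a manual trending slot for the first couple of rounds
        if r < trending_rounds:
            views_a += trending_bonus
    return views_a, views_b

print(simulate())  # video A ends up far ahead despite identical "quality"
```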

One mistake I see people make is to assume that recommendation algorithms are simply a reflection of the audience: "The algorithm is bad because we are bad". My counterpoint is that when the recommendations hurt the bottom line of the business, these companies change them. At the very least it's social media companies choosing not to fix bad recommendations, and at worst it's intentional manipulation. Sure, people choose to watch a lot of gross stuff, but let's not act like YouTube couldn't get rid of, for example, misogyny-for-children content (Andrew Tate etc.) quickly if they wanted to.

The other mistake is to treat it as a sentient creation that nobody has control over: "We're just chasing what the algorithm wants". It's one of the things tech bros dream of with regard to AI. They want to be able to put an algorithm in charge of the orphan crushing machine and say, "Sorry, I don't know why the algorithm keeps choosing to crush the orphans".

Tldr: The purpose of a system is what it does.

  • peppersky [he/him, any] · 4 months ago

    I feel like algorithms are still bad in themselves, at least for the vast majority of use cases. The Spotify algorithm, for example, isn't bad at recommending music (at least in my experience; my Discover Weekly is full of obscure music from all over the world that I'd have very little chance of finding on my own), but it can't help devaluing the music at the same time. It's endless, without a human touch, without context. It will never challenge your taste, and it makes the listener lazy. And it makes the music itself worse too, since suddenly all music has to compete against all other music ever made, with the algorithm treating it all as worth the same amount of space and care.

    • quarrk [he/him] · 4 months ago

      Sorting by new is itself an algorithm. Algorithms aren’t inherently bad. It’s just a question of how democratic they are.
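      The difference between "sort by new" and an engagement feed isn't algorithm vs. no algorithm, it's whose key gets used: a transparent timestamp anyone can inspect, or a model score only the platform controls and can tune. A trivial sketch (field names invented):

      ```python
      from datetime import datetime

      # invented example posts; the only real point is which key gets sorted on
      posts = [
          {"title": "a", "created": datetime(2024, 5, 1), "predicted_engagement": 0.9},
          {"title": "b", "created": datetime(2024, 5, 3), "predicted_engagement": 0.2},
      ]

      # "new" feed: the ranking key is public and identical for every user
      by_new = sorted(posts, key=lambda p: p["created"], reverse=True)

      # engagement feed: the key is a model output the platform alone controls and tunes
      by_engagement = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
      ```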