Not sure why nobody in the comments is distinguishing between blocking a community on an instance (removing /c/piracy) and defederating instances (saying your users can't subscribe to otherinstance.com/c/piracy). They are very different things. We should be very skeptical of defederation.
Removing a community because it violates the rules of your instance is A-OK and every instance should do this. Anybody can run an instance, and anybody can set their own rules, that's the whole idea of federation.
De-federating other instances because you find their content objectionable is less ok. Lemmy is like e-mail. Everybody registers at gmail or office365 or myfavoriteemail.com. Every email host runs their own servers, but they all talk to each other through an open protocol. You would be pissed to find out that gmail just suddenly decided to stop accepting mail from someothermailprovider.com because a bunch of their users are pirates or tankies. Or blocked your favourite email newsletter from reaching your inbox because it had inflammatory political content.
Allowing your users to receive e-mail, or content from subcommunities on other lemmy instances is not a legal risk like hosting the content yourself is (IANAL etc). Same way Gmail is not liable if somebody on some other e-mail server does something illegal by emailing a gmail user. That's why you can register at torrentwebsite.com and get a user confirmation email successfully delivered to your inbox. Gmail is federated with all other e-mail services without needing to endorse them or accept legal liability for them.
Lemmy's strength, value, and future come from being the largest federated space for link-sharing and other forms of communication. De-federation is bad.
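To make the difference in scope concrete, here's a rough sketch of the two moderation actions. This is not Lemmy's actual API or data model -- the class and method names are made up purely for illustration:

```python
# Hypothetical sketch (NOT Lemmy's real API) of the difference between
# removing a local community and defederating a whole instance.
from dataclasses import dataclass, field


@dataclass
class Instance:
    domain: str
    local_communities: set[str] = field(default_factory=set)
    blocked_instances: set[str] = field(default_factory=set)  # defederation list

    def remove_local_community(self, name: str) -> None:
        """Narrow scope: only affects content this instance hosts (e.g. its own /c/piracy)."""
        self.local_communities.discard(name)

    def defederate(self, remote_domain: str) -> None:
        """Broad scope: local users can no longer subscribe to anything on remote_domain."""
        self.blocked_instances.add(remote_domain)

    def can_subscribe(self, remote_domain: str, community: str) -> bool:
        return remote_domain not in self.blocked_instances


home = Instance("myinstance.example", {"piracy", "linux"})
home.remove_local_community("piracy")     # one locally hosted community is gone
home.defederate("otherinstance.example")  # everything on that instance is cut off
print(home.can_subscribe("otherinstance.example", "piracy"))  # False
```

Removing a community touches one thing you host; defederating cuts your users off from everything on the other instance.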
defederation is good for nazi and CSAM instances. no one should touch either with a 10ft pole. there's absolutely no reason to give them a larger platform.
"CSAM instances" <-- Pretty sure any publicly facing instances with this problem would be tackled by law enforcement pretty quickly.
as far as I've heard, they're still up and major instances are still federated with them.
"Nazi instances"< -- These ones will likely de-federate themselves from the wider federated web, they can't handle a broad range of perspectives well.
this is a deep misunderstanding of how far-right groups operate. they actively seek connection with the wider community because it presents them with a chance to recruit, and their numbers get decimated when they're deplatformed. offering them a base of users to proselytize to only benefits them.
Social media has enabled these groups to both silo themselves and get promoted to users site-wide
yes precisely
This method of content promotion is responsible for the explosion of online hate content in the last decade
this has a deeper material reason underlying it. it's got more to do with economic decay and the lack of prospects people face than with the algorithms. we saw the same thing early last century. far-right ideology explodes in popularity when the left fails to make the case for a more equitable distribution of resources, and because our oligarchs fund them to an obscene degree -- minor fascists with a hundred followers on social media will receive hundreds of thousands of dollars in funding (cf. Ali Alexander). fascist ideology spreads because it offers scapegoats for the problems in society.
Nazis had plenty of websites in the '90s and early 2000s, but they didn't get much traction with them because Facebook wasn't forcing them into your home feed.
yes, precisely. if normal instances federate with the nazi ones, this won't be true any longer because their content WILL flood the feeds of many people. this will have disastrous consequences for lemmy as a platform.
I really don't have a problem with these sites existing; people should be free to have their own disgusting racist thoughts and share them with their own little chat rooms and forums and the like.
I do, as me and mine belong to groups they target. if they're allowed to rise and accumulate any power, it will spell death for us. there have already been multiple attempts in the US to organize pogroms against trans people, as an example.
And they should be ruthlessly mocked and kicked out of every other space they could possibly go to.
however, I'd like to point out that 4chan originally started making memes to mock the fascists -- their use of irony turned over time into unironic fascism and they became a hotbed for neo-fascists.
Again, using the e-mail example, I can get an email from whitepowerwebsite as a gmail user. That's not google giving them a platform, it's just a neutral protocol for online communication (e-mail) working in a federated state as it's meant to
email is a bad example because it only provides point-to-point communication, unless you join a mailing list. social media is different -- views get broadcast to the wider public on a given platform. federating with nazis allows them to broadcast their views and create a sense that their vision of the world is actually what everyone else believes. exploding-heads is federated with lemmy.world and the consequence is that many users have left lemmy.world specifically to get away from the fascists dumping their disgusting worldview onto the platform.
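here's a rough sketch of that structural difference -- conceptual only, not real ActivityPub or Lemmy code, and the function names and domains are made up:

```python
# Conceptual sketch (not real ActivityPub/Lemmy code): email is point-to-point,
# while a federated post fans out to the shared feed of every instance that
# still federates with the origin.

def deliver_email(message: str, inbox: list[str]) -> None:
    """One explicitly addressed recipient gets the message -- nobody else sees it."""
    inbox.append(message)


def federate_post(post: str, instance_feeds: dict[str, list[str]],
                  defederated: set[str]) -> None:
    """Fan the post out to every instance's shared feed, except instances
    that have defederated the origin instance."""
    for domain, feed in instance_feeds.items():
        if domain not in defederated:
            feed.append(post)


inbox: list[str] = []
deliver_email("confirmation email from torrentwebsite.com", inbox)  # one inbox, one reader

feeds: dict[str, list[str]] = {"lemmy.world": [], "some.other.instance": []}
federate_post("post from a nazi instance", feeds, defederated={"some.other.instance"})
print(feeds)  # only the still-federated instance's users see it in their feed
```

an email reaches the one inbox it was addressed to; a federated post lands in the feeds of everyone whose instance hasn't cut the origin off.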
Gmail isn't expected to police the entirety of e-mail, the legal liabilities lie with the sender and receiver.
they actually do have liability under laws like the DMCA, SESTA/FOSTA, and the new slate of laws recently passed to go after sex traffickers (and in reality a wide host of "undesirable" content more generally). but that aside, I'm not talking about legal liability. I'm talking about the responsibility the people running these instances have to not help build fascism. it's an ethical/political responsibility, not a legal one.
You are right about worsening economic conditions leading to the rise of far right movements. I was more speaking to their digital footprint. If you remember early Facebook, it was nothing like what people use today.
If Lemmy A is federated with Lemmy B (the nazi one), it means:
Users on Lemmy A can subscribe to communities and users on Lemmy B and vice versa
Users on Lemmy A can comment on communities on Lemmy B and vice versa
It does not mean:
Posts from Lemmy B show up on Lemmy A (except in the "global" view on the main page, which is non-default, and they likely won't show up there either due to massive downvoting). I would imagine the global tab eventually gets removed entirely, since a single Lemmy instance can massively inflate its vote counts to make its posts the top-voted posts across the whole network (a rough sketch of this problem is below). You can't force instances to follow the rules on this, and you can't audit their compliance. There are certainly some solutions to this involving blockchain, but that's an aside and those are at least a few years away afaik. 90% of users never touch the non-default option in whatever app they're in.
So this flooding the feeds scenario, I just don't see it. In user-moderated platforms, vocal minorities don't show up anywhere, they get moderated out basically automatically except in their own little enclaves. There is no scenario in which Lemmy as a federation provides a good platform for them (outside of their own nazi-friendly instance), because Lemmy doesn't work like other social media works.
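For what it's worth, here's a rough sketch of that vote-inflation problem with a network-wide feed, plus one naive mitigation (capping each instance's contribution). This is hypothetical illustration code, not Lemmy's actual ranking, and the cap is just an example of a non-blockchain mitigation, not something Lemmy does:

```python
# Hypothetical sketch of the vote-inflation problem with a network-wide
# "global/All" feed: federated vote counts are self-reported per instance,
# so one instance can inflate scores. A per-instance cap is one naive
# mitigation -- this is NOT Lemmy's actual ranking code.

def global_feed(posts: list[dict], per_instance_cap: int | None = None) -> list[dict]:
    """Rank posts by summed federated votes, optionally capping each
    instance's contribution so a single instance can't dominate."""
    def score(post: dict) -> int:
        votes_by_instance = post["votes"]  # {instance_domain: reported_votes}
        if per_instance_cap is None:
            return sum(votes_by_instance.values())
        return sum(min(v, per_instance_cap) for v in votes_by_instance.values())
    return sorted(posts, key=score, reverse=True)


posts = [
    {"title": "normal post",   "votes": {"lemmy.world": 300, "lemmy.ml": 150}},
    {"title": "inflated post", "votes": {"bad.instance": 50_000}},  # self-reported, unauditable
]

print([p["title"] for p in global_feed(posts)])                        # inflated post wins
print([p["title"] for p in global_feed(posts, per_instance_cap=200)])  # normal post wins
```

Without a cap, or some way to audit reported votes, whichever instance reports the biggest numbers owns the global tab.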
In user-moderated platforms, vocal minorities don't show up anywhere, they get moderated out basically automatically except in their own little enclaves.
I will take this to mean communists make up a soft majority on lemmy, given how many complaints about commie posting keep popping up on the major comms
Very good point