Hi All,

Due to some overbearing/draconian laws coming into force in the UK, I need to take steps to protect the site long term. You may have already seen this post in /c/technology, which shows that these laws have a real negative effect on small independent websites, especially those hosted or run by people in the UK.

While I assume this will play out the same way GDPR did, and most things will in fact be fine, one thing that continues to be an issue is how Lemmy handles NSFW content and account creation.

Currently Lemmy.zip offers accounts to anyone 13+ (or whatever the minimum age is in your country), and asks that you only activate the NSFW flag if you are 18+.

However, we have no way to enforce that, no way to turn it off if someone says they're under 18, and no real way of monitoring the process. Lemmy does not currently give admins the ability to add extra age confirmations, a customisable pop-up warning when that flag is clicked, extra text on the button, or anything similar during the signup process, without building custom UI (which I am not able to do). The result is that anyone can simply click the "Enable NSFW" flag during signup or in their profile settings and view NSFW content, without any explicit check.

It would be great if Lemmy's functionality were changed so that children can't access NSFW content, but that could take a long time to implement, and I want this site to be safe before this law comes into effect in March.

In order to make things as simple as possible going forwards, I am therefore proposing:

From the 1st of February 2025, Lemmy.zip will only offer a service to people who are over 18.

Before I go any further with this, I am asking all Lemmy.zip users to share their thoughts on what this change might mean. I know the Fediverse tends to skew older, but I am also aware that this might affect some current users too.

I am not asking anyone under 18 to dox themselves either, if you're worried this might apply to you, you are more than welcome to reach out via PMs or preferably Matrix (my link is in my profile).

This restriction makes it clear that you must be 18 to have an account on the site, and therefore removes some of the age-verification burden. It isn't a perfect solution, but it moves us slightly further away from any grey areas in the law.

There is an alternative, though: I could turn off NSFW completely, so that no one would be able to see any content marked NSFW, even if it isn't actually NSFW. This obviously isn't something I want to do either.

Happy to hear thoughts/concerns, and I'd appreciate any feedback either way on this. I'll leave this up for a couple of days to give people a chance to read it.

Thanks

Demigodrick


Footnotes:

  • Some of this actually already exists under the Digital Economy Act 2017 (age verification), and the upcoming Online Safety Bill puts further duties on websites to shield children from NSFW content, which is where the software lacks features.

  • The Ofcom Online Safety Bill guidelines are over 1,000 pages long. There is no provision for small independent websites; they are lumped in with the likes of Meta. Ofcom seem to think paying tens of thousands of pounds to update websites with their suggestions is "reasonable". It is clearly not.

  • ProfessorOwl_PhD [any] · 8 hours ago

    I'm not saying the guidance is good or sensible, I'm saying that you currently have the same options as LFGSS: spend an inordinate amount of money verifying users, or shut down. It explicitly calls out your solution as unacceptable, regardless of your feelings. Yes, the methods are unreasonable; yes, the government wants complete control over the internet; no, it's not going to make a difference when you cop your £18 mil fine. Welcome to the UK, it fucking sucks, get used to it.

    • Demigodrick@lemmy.zip (OP) · edited · 7 hours ago

      8.54 Following our November 2023 Consultation, Ofcom included proposals on highly effective age assurance (‘HEAA’) in the December 2023 Consultation on our guidance for service providers publishing pornographic content on their online services (‘Part 5 guidance’). Age assurance proposals were also included for U2U services in our May 2024 Consultation. However, our expectations around HEAA will not be finalised at the time we publish these Illegal Content Codes of Practice for U2U services.

      8.55 Therefore, rather than delay the introduction of the safety defaults measure, we proposed in our November 2023 Consultation that we should initially introduce the measure with a stipulation that services should only be in scope if they have an existing means of identifying child users, whether that is a form of age assurance or another method.

      The HEAA guidance isn't yet published, and neither is the risk assessment documentation, so we don't actually know what category we'll fall into yet, or whether we'd even face the decision LFGSS made. Admittedly, they are an order of magnitude larger than lemmy.zip and probably sit higher up the risk assessment than us.

      It also doesn't apply if there isn't already an age-verification process in place (until the above HEAA guidance is published).

      • ProfessorOwl_PhD [any] · 6 hours ago

        I mean, you might get an exemption by disabling DMs, but even "low risk" carries a lot of requirements that aren't going to be cheap or easy to maintain.

        "It also doesn't apply"

        It doesn't *initially* apply. You are buying yourself a few months at most.