11.1 million (out of about 14 million), to be more exact. Almost 80% of all their videos.
Isn’t my main porn site personally, but still pretty lame that all that amateur stuff is gone. Looks like it’s heading the way of Tumblr.
Just a reminder that Pornhub is worth nearly 3 billion dollars and could easily have afforded as big a moderation team as it wanted lol.
Edit: Alright, this got dumber than I thought it would, so I'm leaving off with this:
I can't believe I have to say this, but it does not make someone a rapist or a pedophile to suggest that a multibillion-dollar company, one running a site the public can upload to, can afford to have a moderation team.
Literally every site that allows public uploads, porn or otherwise, runs the risk of someone uploading something fucked up to it. That's why moderation teams exist in the first place: to find and remove such content.
Yes, it sucks ass that those mods would have to even glance at such things, but Pornhub is not some crazy unique special case here. Mods everywhere have to see fucked up shit before it can be removed to protect others.
Rule-breaking content has to be verified as actually breaking the rules. That's how moderation works.
I apologize for nothing.
Peace.
Eh, if anything, more pro-worker and anti-sex-abuse measures could be taken. I personally don't consider the enjoyment and utility of a porn website to its users to weigh anywhere near as much as the safety and material benefit of the people who work for it.
Or they could've just actually put in the effort of, ya know, moderating their content.
But this is cheaper and easier, sooo...
Removing videos of children being raped and revenge porn is obviously worth the cost of you having fewer options to jack it to. How is this even a question? Holding up some imaginary perfect moderation system is just a distraction.
No it isn't.
Pornhub has far more than enough money to do so; it's a valid criticism.
You're acting like literally half their videos were child/revenge/etc. porn, when in reality an average moderation team likely would've taken care of the problem and it wouldn't have gotten to this point in the first place.
deleted by creator
Yeah, there have been multiple stories over the years about the content reviewers for Google.
It's fucking awful. You see the worst shit imaginable constantly. People burn out and end up with PTSD.
What should ChapoChat do to prevent CP from being uploaded? Maybe we should follow Pornhub's lead and ban non-verified users from uploading pictures/videos.
deleted by creator
deleted by creator
deleted by creator
deleted by creator
I'll ask you the same question:
Technically child porn can be uploaded to YouTube too.
Should all videos from unverified users be taken down from there as well?
deleted by creator
Bet you a million dollars their algorithms don't catch all of them.
So my question stands: Should all those millions of videos of people doing absolutely nothing wrong still get nuked?
deleted by creator
deleted by creator
deleted by creator
deleted by creator
deleted by creator
deleted by creator
Explain to me how long it's acceptable for a video of a teenager being raped to be on Pornhub, and how a moderation team would have prevented it from being up that long.
Okay, you're clearly acting in bad faith with this loaded question bullshit lol.
It would've been taken down once it was noticed or pointed out, as opposed to staying up way longer 'til their company exploded. How about that?
Exactly. Whereas with what they've done, it's up for zero time instead. Do you agree that's worth the sacrifice of you having fewer options to spank it to?
Technically child porn can be uploaded to YouTube too.
Should all videos from unverified users be taken down from there as well?
You accuse others of bad-faith arguments, but say horseshit like this. You can have algorithms that detect nudity and pull those videos automatically until a human verifies them. You cannot make an algorithm that automatically removes underage nudity, because there is no way for an AI to tell a 16-year-old from an 18-year-old. Even humans can't always tell.
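For what it's worth, here's a toy sketch of the "auto-remove until a human verifies" flow being described. Everything in it is hypothetical — the nudity_score field stands in for whatever ML classifier output a real site would use, and the threshold and queue are invented for illustration:

```python
# Toy sketch: auto-quarantine flagged uploads, leave the final call to a human.
# Nothing here is any real site's system; it's just the flow described above.

from dataclasses import dataclass
from collections import deque

QUARANTINE_THRESHOLD = 0.5  # assumed cutoff: flagged uploads get hidden

@dataclass
class Upload:
    video_id: str
    nudity_score: float   # stand-in for a classifier's output (0.0 to 1.0)
    visible: bool = False  # hidden by default until someone clears it

review_queue = deque()

def on_upload(upload: Upload) -> None:
    """Auto-hide anything the classifier flags; queue everything for a human."""
    upload.visible = upload.nudity_score < QUARANTINE_THRESHOLD
    # The step no algorithm can do: a person verifying age/consent,
    # because a model can't tell 16 from 18.
    review_queue.append(upload)

def human_review(upload: Upload, passes_review: bool) -> None:
    """The final judgment call rests with a human moderator."""
    upload.visible = passes_review

# Usage: a flagged upload stays hidden until a moderator clears it.
clip = Upload("abc123", nudity_score=0.9)
on_upload(clip)
assert clip.visible is False
human_review(clip, passes_review=True)
assert clip.visible is True
```

The point being: the algorithm can only hide things pending review; the actual legal judgment still lands on a human.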
Yeah let's make a bunch of underpaid workers spend hours a day watching weird pornography to try and decide what laws it breaks. A very moral position you've found there.
Says the guy who kills dogs.