Permanently Deleted

  • Asa_the_Red [he/him] · 2 years ago

    Insisting that men on the left treat women as human beings and recognize that they have no right to women's bodies is actually good for men! Obviously it's important for women first and foremost (their actual safety matters more than men's feelings), but being forced to confront sexist or misogynist views they've been taught to see as normal also makes men better and healthier people.