• zed_proclaimer [he/him] · 11 months ago

If it has been "westernized," then you are basically admitting right there that it is not the West.

Yes, much of the world has been colonized and injected with white supremacist ideology, i.e., "westernized." That doesn't make much of the world "the West."