In my opinion, “the West” is (nearly) synonymous with “the imperial core”. I have met plenty of non-white people who live in “the West” and embody much of what I hate about “white people” (the selfishness, for example). My point is that, apart from those who actively fight against it (most leftists, for example), people living in the West generally adhere to, follow, and promote Capitalist Ideology. This is less the case for those living in the Global South (in my experience).
The West just means European culture, wherever it is
As sailorfish pointed out, not all of European culture. Eastern Europe isn't always included in "the West", which is instead taken to mean (mainly) Western Europe, the U.S. & Canada, and Australia.