Any good sci-fi out there that isn't apocalyptic, post-apocalyptic, or some form of capitalism in the future? You know, some kind of portrayal that a better world is possible? Star Trek is an obvious one, but I'm scratching my head trying to think of other examples.
I like to imagine The Culture series as part of The Culture's onboarding program, and that Banks had to return to his home planet.
Banks was stoked to get those drug glands implanted! I forget which book, but I know there's a reference to Earth getting a first contact package of some kind long after its designation as a sort of hands-off control group in the 1970s. Maybe the 2030s or even later? Earth gets the "prime directive" treatment until then, so we'd better bring about global communism because I want my genderfluid FALSC.