So ultimately, at the bottom level, I believe very strictly speaking that there is no objective morality. My sense of "objective" might be stricter than most conceptions, since I'm talking about the whole literal universe, if not all possible universes. There is no good/evil in an asteroid crashing into a planet full of giant cool dinos and their buddies and killing them off. Likewise, an alien species or AI agent that goes grey goo and tries to turn everything into itself can't be objectively moralized about. From that same distant, far-away view, I can't objectively judge humans dropping nukes on each other or starting a mass extinction just for fun (hey, Tall Trees did it too).
HOWEVER, I think that for the collection of organisms we call Humanity, there is a (however rudimentary) morality once you stop being objective and instead focus on the 7 billion animals out there that share 99% of the same source code with each other. So within the realm of humanity, there are certain subjective moral truths (I actually really hesitated to use that word, not sure I like it) that are for all intents and purposes objective, so long as we are discussing said morality solely within the set of other humans.
IDK if that's just humanism or what. It's kind of a cop-out, since it lets you assert the obvious fact that there is no objective morality out there in the universe, but also lets you claim there most definitely is a good and a bad way for people to act, without contradiction. There's gotta be a name for this sort of way of thinking, right?
I consider it rational self-interest, but with an expanded concept of self.