You often see westaboos romanticising western civilizations by saying shit like "Before the Roman Empire unified everyone, all those places were warring with each other! Rome ushered in centuries of peace!"
Or similar shit about America or Africa: "Those [slur] tribes were all fighting each other anyway! It became more peaceful under colonial rule!"
Even ignoring how silly it is to believe imperial propaganda about "barbarians", life during these so-called times of peace was fucking horrible. Slavery, apartheid, exploitation, the fucking Colosseum?!? A huge portion of the population was literally kept like cattle, whipped, brutalized, and forced to do the hard labour.
Peaceful for who, exactly?
It's equally silly to imagine that the people Rome conquered were somehow better just because they were the victims of empire. Rome didn't fall out of a coconut tree, and the societies surrounding it were no less reliant on violence and oppression.