What right do we have to impose our views on other societies? For instance, should we force other societies to prohibit slavery, grant civil rights to women, or require that they be democratic, as indicated in the Universal Declaration of Human Rights? Can such intervention be justified?
We don't have that right. I live in the USA, where we suppress people's rights ourselves, so I'm not surprised that we try to impose our views on other societies as well.