White culture has always been to assimilate and eliminate: take whatever another culture has that is beneficial, subjugate its people, and kill those who refuse to obey the rule of divine right. The Greeks, the Romans, the Germans, the French, the English, even the U.S.: at every point in history we've used imperialism to completely strip another culture of its identity, wealth, and power to bolster our own.