Is it ever all right for the United States to pursue an imperialistic policy?
The whole concept of manifest destiny was imperialistic, yet it remains a source of great pride in US history. Is it ever all right? Some people say we should be morally above imperialist practices. But if you think about it, we practice imperialism every day. The new form of imperialism in the 21st century is financial, not military. The huge global corporations built on the US stock market are the perfect manifestation of the capitalist credo. Is that all right? I suppose most Americans would say yes. But I personally believe that corporate imperialism will further alienate us in the eyes of other peoples, and it will eventually lead to our downfall.