Was American Imperialism Justified?
The United States has been accused of practicing imperialism since its early days. American expansionism westward brought the country into conflict with Native American tribes, and later on, the US acquired a number of territories through war or purchase. Some argue that American imperialism was justified, as it helped to spread democracy and modernize backward …