Why was the United States justified in its imperialistic policies of the late 1800s and early 1900s? Why did the United States shift from expanding internally toward foreign expansion in the late 1800s and early 1900s?
Say what? What foreign expansion are we talking about? The United States has never had overseas colonies. Are we talking about the annexation of Hawaii in 1898, or the cession of the Philippines to the US by Spain at the conclusion of the Spanish-American War that same year? If this counts as "imperialism," what are we to make of the (much larger) annexation of Texas in 1845, the extraction of half of Mexico at the conclusion of the Mexican-American War, or the acquisition of the Louisiana Purchase from France in 1803? Either the United States has never been imperialist (merely interested, up until the mid-20th century, in acquiring territory), or it has always been.

"Imperialism" is the domination of one country by another without outright incorporation. The Brits were imperialists in India because they controlled India, but India did not become part of Great Britain. The French were imperialists in North Africa because they controlled Algeria, but Algeria was not made part of France. And so on. The only place the United States could be said to have been imperialist is the Philippines between 1898 and 1935, and this was arguably somewhat accidental, the consequence of a war with Spain that occurred for other reasons.