OpenStudy (anonymous):

World War II was one of the most significant events of the 1900s and one of the most important events in US history. Think about how much the United States changed between the Great Depression and the postwar era, when the country had become an economic powerhouse. How did World War II influence and change the identity of the United States throughout the 1900s and into the present? What are some positive and negative changes that occurred in the United States in the years after World War II?

OpenStudy (anonymous):

Someone please help! I'll give a medal to the best answer and fan.

OpenStudy (anonymous):

In World War II, America was attacked on its own territory by a foreign power for the first time since the War of 1812, a stark reminder that war could reach its doorstep. The war drove a massive expansion of the American arms industry, which had both positive and negative consequences, and it pushed the United States onto the world stage as one of the most powerful nations in the world.
