USA HISTORY

AMERICAN IMPERIALISM (1890-1919)

THE UNITED STATES IN WORLD WAR I

How did the roles of women change during WWII?

(A) They stayed home and cleaned the house.

(B) They took over jobs traditionally reserved for men. (correct answer)

(C) They went out dancing, drinking and cursing.

(D) They waited for the men to come home because they felt hopeless.

EXPLANATIONS BELOW

Concept note 1: In particular, World War II led many women to take jobs in defense plants and factories around the country. These jobs provided unprecedented opportunities to move into occupations previously thought of as exclusive to men, especially the aircraft industry, where a majority of workers were women by 1943.