Legacies
When their country needed them most, the women of America came
together and helped the war effort on and off the battlefield.
America won World War II, and women were part of that success!
Women gained more rights, paving the way toward the freedoms they hold today.
Women will now and forever be included in the workforce of America.
¹Picture from Vintage Everyday