World War II changed America profoundly. As in World War I, wartime production lifted the economy and created millions of jobs. Women took on factory and other jobs that had previously been closed to them, and unlike veterans of the later Vietnam War, returning soldiers received real support when they came home, most notably through the GI Bill. After the war, the United States emerged as a global power and helped rebuild other countries. Americans felt a strong sense of pride and shared purpose, and the war also shaped popular culture, inspiring new movies and songs. Overall, World War II left America stronger and changed how people lived.