
Wednesday, October 9, 2019

How World War II Transformed the United States' Society and Economy: Essay

The nation's survival through one of the most dreadful events in world history made America a powerful nation whose influence over the rest of the world endures to this day. WWII was indeed a defining moment for a nation suffering from the "agonies of the Great Depression" (Kennedy xi). After the Second World War, the U.S. became the standard of power and economic prosperity. Even now, many nations seek to establish diplomatic relations with the country in order to maintain a connection with the world's most influential nation. How the U.S. transformed itself to gain this coveted position can be traced to its significant participation in the Second World War.

In the beginning, the U.S. did not want to compromise anything just to prove something to the world. The American government was seriously dealing with the dilemmas of the Great Depression (Kennedy xi) and its own domestic problems. There was no intention to engage in an all-out battle against Germany, Japan, or Italy until the Japanese military attacked the U.S. naval base at Pearl Harbor, Hawaii. The country was thus provoked to defend itself and had a valid reason to resort to such a courageous action. However, instead of plotting revenge against Japan, the U.S. chose to begin by defeating the root of the problem, the then Nazi-dominated Europe ("Wartime Strategy"). Uncle Sam was more worried about the possibility of German scientists inventing weapons of mass destruction than about further attacks by the Japanese military ("Wartime Strategy"). Hence, it was a decision the U.S. had to make for the sake of its future and its people.

World War II ended after six years, leaving the world traumatized. Yet this ending marked the start of a renewed America, and perhaps the momentum for monumental change in a once isolationist nation. Americans came to realize the value of life, probably because they had seen the war's vast impact on their country. Women began to realize how they could help society by joining the country's workforce. This past event is one reason why the U.S. is open-minded about sharing responsibilities between the male and female members of the household. Freedom is apparent because Americans themselves recognize its importance, allowing members of society to take part in making their communities better places to live. WWII therefore changed American society's perception of the equal roles of men and women, and this change pushed the nation to become more liberal in making the crucial decisions necessary for its survival.

Slavery was also a crucial part of United States history. African Americans had been forced into slavery by White Americans and were denied equal rights by the government. The end of slavery was something African Americans had yearned for over many years, alongside the end of extreme discrimination. African Americans longed for full freedom and equality, and the American government began to extend new opportunities to them at the onset of WWII. Many Blacks were given the chance to serve in the army, which in some ways lifted their status in society during the Second World War. In addition, WWII in some ways diminished racial inequality in a place inhabited by varying ethnicities, like America.
Even though they suffered from discrimination after their return from the war, opportunities for African Americans increased as legislation such as the Civil Rights Act of 1964 and the Fair Housing Act of 1968 was enacted (Frazier and Tettey-Fio 85). Eventually, the enforcement of this legislation enhanced the economic status of Blacks, which led to
