America After World War I

Introduction

The main purpose of this paper is to take a comprehensive look at the impact of World War I, which brought unprecedented destruction and change to the world. In this paper, I will describe the impact of World War I on America. Although the war was not fought on American soil, it had a profound effect on the country. The reasons for this will be explored in the paper.

America after World War I

Despite the fact that the war was fought in Europe and U.S. casualties and property losses were far smaller than those of the Allies, the war had a significant economic, political, and social impact on the United States. The war brought great economic prosperity to the country through the production of wartime goods, but the postwar era saw widespread unemployment, increased labor strife, racial hatred, and poverty. Propaganda campaigns designed to build support for the war effort produced strong anti-foreign and anti-Communist feelings, which led to violence and the violation of many Americans' civil rights. Politically, the postwar period brought a return to the political philosophy of the late nineteenth century.

Politically, too, many leaders came to see world war as a destructive undertaking, and many countries fell into economic and political difficulties from which it took years to recover. Governments took on many new powers in order to fight the total war. Wartime governments suppressed opposition by expanding police power. Authoritarian regimes like tsarist Russia had always depended on the threat of force, but now even parliamentary governments felt it necessary to expand police powers and control public opinion. The economic impact of the war was very unevenly distributed. At one end there were ...