Once the United States got involved in World War I, what role did it play in winning the war and framing the peace that followed? Should the United States have stayed out of the war?

Your answer should be about six paragraphs long and include details and examples that support each of your points.