What event brought the United States into World War 2?
The Japanese bombing of Pearl Harbor on December 7, 1941, was the event that brought the United States into World War 2.
Question
Asked 10/31/2021 5:11:49 AM
Updated 10/31/2021 5:33:50 AM
This answer has been confirmed as correct and helpful. Confirmed by MrG [10/31/2021 5:33:44 AM]