WWII is the founding myth of modern America. Grave and obvious evil threatens the world, and the US, after a difficult struggle, defeats it and establishes itself as the global champion of freedom and democracy.
We've been living out the consequences of this myth ever since.