The idea of "national goals" in the context of the United States is increasingly untenable. I mean, honestly, what sort of goals really unite Blue and Red State America anymore? Post-Cold War, post-War on Terror, what does "America" stand for?
The Left, over the past decade or so, has largely been occupied with a project of historical revisionism that questions the entire rationale for the USA, e.g., the 1619 Project.
The Right, in reaction, has constructed a cereal-box fantasy of "American Greatness" that existed "sometime" in the past, and put it on a red baseball cap.