When did American nationalism become a dirty word? Under Obama, of course! Obama's leftist war against America is a war against American nationalism. America emerged from WWII as the most powerful economic nation on earth and militarily invincible. Enemies of the United States, both domestic and international, would need a different strategy to defeat her.