The USA and the Decline of Western Civilization
- The decline of the West is a direct consequence of the emergence of
the USA. Without the USA, there would have been no pressure on the
European powers to grant independence to their colonies.
- There was no other reason to do so: the European powers had the
military means to maintain colonial rule over all their colonies for
many more decades, perhaps centuries.
- They granted independence because they were confronted by a new
power that had itself been a European colony and that did not approve
of colonialism.
- Four or five decades later, those former colonies have become
fast-growing economies that are driving the decline of the West.