It seems that for the first century after the United States came into being, England was constantly undermining its existence. What event or series of events changed this mindset? Is there another pair of countries/states that followed a similar course? Thanks!
hi! there's always room for more input on this, but FYI, there's a relevant section in the FAQ* - check it out for previous responses
*see the link on the sidebar or the wiki tab