Citizens of the United States today are proud to claim the title of “Americans.” Few realize, however, that the term was originally intended as an insult.
The term “Americans” was first used by English writers during the colonial period. They employed it to cast the colonial residents as a marginal, peripheral segment of British society, unworthy of equal status with proper Englishmen. The word was intended, and understood, as a degrading epithet: an “American” was not merely a resident of the thirteen colonies, but an inferior, provincial creature.
The colonists eventually declared their independence not to assert their rights as “Americans,” but to reject that designation and claim the full rights due to all British subjects.