Definition of American Empire in English:

American Empire

noun

  • 1. An empire in America.

  • 2. The United States of America, especially viewed as an imperialistic state, either in possessing territories overseas or in exerting power and influence beyond its borders; American imperialism.

Origin

Early 18th century; earliest use found in John Oldmixon (d. 1742), historian and political pamphleteer.