Empire. It is a word that most Americans loathe. After all, the United States was born through rebellion against the great (British) empire of its day. American politicians, policymakers, and the public alike have long preferred to imagine the U.S. instead as a beacon of freedom in the world, bringing light to those in the darkness of despotism. Europeans, not Americans, it is thought, had empires. Some version of this myth has pervaded the republic since its earliest colonial origins, and nothing could be further from the truth.