Imperialism: the policy, practice, or advocacy of extending the power and dominion of a nation, especially by direct territorial acquisitions or by gaining indirect control over the political or economic life of other areas; broadly, the extension or imposition of power, authority, or influence.

If you look at the definition of imperialism, you can see that America has indeed tried to extend its power and dominion, but not in the traditional way of, let's say, the Roman Empire, which overtly conquered other nations and states. America has been far more elusive about it. The idea is that other nations serve United States interests, its economy and its trade, when they adopt democracy and a capitalist-style economy, so the U.S. works to push countries in that direction rather than annexing them outright.

You might want to research and cite "Global Power and Reach: Joint Vision 2020." It is about the military being able to respond to any situation or uprising anywhere in the world.