Was the United States better served by a policy of isolationism or imperialism?
The US has never had a policy of imperialism. We did not conquer territories and set up colonies the way the French, Belgians, British, or Spanish did. The unwritten policy of isolationism ended with WWI, and the leadership and power of the US filled the vacuum left after the war. There has been no turning back, because to retreat now would create another leadership void.