How Did Healthcare Come About in the United States?
As the nation went to war in the early 1940s, many able-bodied men went overseas and competition for workers intensified. To lure new employees, American companies turned to new tactics, one of which was offering health insurance as part of a more attractive benefits package. That became the basis for today's system, in which costs are spread across a large group of people and the healthy help pay for the sick. During the same decade, the growing strength of the unions gave workers more bargaining power, and tax-free, employer-sponsored health coverage became a common concession.

In the 1950s, healthcare grew even more sophisticated as a host of new medicines became available, including antibiotics to fight infection. Unfortunately, the cost of treatment continued to rise, and those without employer-based coverage had to pay cash or go without seeing a doctor at all. By the dawn of the 1960s, companies across the nation were offering health insurance in response to the rising cost of care.