Why do Americans promote religion and faith?
Please don’t generalize. “Americans” as a whole don’t promote religion and faith. Although polls claim that some 80–90% of Americans hold some kind of religious belief (mostly Christian of one sort or another), the vast majority simply don’t care much about religion. The only ones “promoting” it are a small minority of fundamentalist believers, along with the greedy, money-grubbing con men who’ve drawn them in; probably around 20–25% of the population at most. Peace.