Religion in the United States

If the U.S. was once a white Christian nation, it no longer is.