The United States is a significantly less Christian country than it was seven years ago. That’s the top finding — one that will ricochet through American faith, culture and politics — in the Pew Research Center’s newest report, “America’s Changing Religious Landscape,” released May 12. This trend “is big, it’s broad and it’s everywhere,” said Alan Cooperman, Pew’s director of…