Most Americans believe God can bring about supernatural physical healing, according to a new study released by Barna, with evangelicals leading the way in that belief. However, where Baptists stand in those attitudes depends on whether they have adopted the high-profile healing practices of other Christian groups, according to Art Allen, who recently retired from Howard Payne University in Brownwood,…