I have been wondering lately what has happened to the spiritual discernment of Christians. When Harry Potter came on the scene, did we leave every lesson the Bible taught us at the back door? I am confused: at what point did God tell us that witchcraft was okay? I mean, when did God change? As far as…