When we look at the balance of news coming out about Christianity, one fact seems inescapable: the religion is steadily losing its privileged position in American culture. And yet many Christians still act totally astonished and blindsided whenever they're confronted with that fact. Today I want to examine this tendency--because it reveals something important about what's going on in right-wing Christian culture.