I'm looking for non-fiction books that explore the shift from a religiously literal world to the West's now-secular one, where belief is optional. I'm interested in what people's lives and worldviews were like then vs. now, and in how and why they changed. I'm not interested in "science good, religion bad" type books. I'm agnostic, but I find that kind of take extremely reductive.
The closest thing I've read is Dominion by Tom Holland. I've been eyeing books by Charles Taylor, Peter Berger and Alasdair MacIntyre.
Thanks!
by BookooBreadCo