Hi,
For quite some time I have been engaged with liberalism and the loss of its appeal around the world. I'm looking for books that tackle the complex relationship between the liberal West and the non-Western world: whether and where liberalism holds appeal in other countries, and how societies have been damaged by, say, imperialist Western tendencies. Ideally, the books would be nuanced and get into the complexities, not postmodern or postcolonial ideological works, but ones grounded in the facts and whatever can be deduced from them.
I hope it's clear what I'm looking for 🙂
by Sensitive-Grade2632