Hello to anyone reading this. I'll start by asking that this not turn into any kind of political debate; I find online debates tiresome, I've heard most of the arguments, and I still can't grasp a few things. That's why I'm looking for a book on rape culture, ideally one offering in-depth analysis and serious thought.
To give everyone a general idea of where I'm coming from: this follows a discussion with my girlfriend about my view that there are precautions a woman should take when going out and/or going to a party. She told me that some of the comments I made stemmed from, or directly contributed to, rape culture. For my part, I can't see why or how telling a woman to take precautions could be problematic.
As such, I'm looking for a book that would offer me a different, in-depth view of what I apparently can't see. Please recommend the best, most thorough book you can think of. It doesn't need to be beginner-friendly; in fact, I'd prefer one that isn't. However, if you believe some basics are required to fully appreciate your best recommendation, please suggest something covering those basics as well.
Thanks in advance!
by Random_0nlooker
1 Comment
Beartown by Fredrik Backman