Someone I care about objectifies women. He's aware it's problematic and is certainly ashamed of it, and he's in therapy for an adjacent issue, but I wanna find a book that might help him understand what it feels like on the receiving end. His therapist is also a man, so a book written by a woman is probably ideal, though I wouldn't be opposed to suggestions from men or NBs 'cause I think all perspectives can be helpful.
by nicokthen
3 Comments
based rec
“The Will to Change” by bell hooks
*The Handmaid’s Tale*