Hello, everyone!
I’m looking for a book—fiction or non-fiction—that had a profound impact on you (as a woman) and, above all, changed your perspective on the status of women. One of those books that really moves you and stays with you long after you’ve finished reading it. The kind of book that made you stare at the ceiling for an hour after you finished it. A book that every woman should read in her life.
If you have a recommendation that touched you in this way, I’d be really interested in hearing about it.
Thank you in advance for your suggestions.
P.S.: I'm a woman in my twenties.
by Complex_Mud_2382
3 Comments
The Vegetarian by Han Kang has stuck with me since I read it, incredible book.
Half The Sky
The Bobcat by Katherine Forbes Riley. It’s about moving on from sexual violence. It’s not graphic and it doesn’t linger much on the assault itself; it’s more about healing.
Into the Planet by Jill Heinerth. She broke women’s records for cave diving and takes insane risks. As an engineering student, I felt seen when she described being in male-dominated spaces.