Honestly, the genre doesn’t matter, and the length doesn’t matter. I’ve always been a Romance/Fantasy fan (it’s also what I write myself), but I’m not opposed to self-help nonfiction books.
What book evoked these feelings for you:
- Everything is going to be okay
- You are not your trauma/You existed before trauma happened to you
- You will find a new norm
- Finally feeling the sunshine on your face after a long rain
Some books I’ve read that I enjoyed:
- I know Sarah J. Maas is juvenile to some, but I did really enjoy her Throne of Glass series. I also read ACOTAR but found that a little less impactful.
- I was on a Colleen Hoover binge and I enjoyed It Ends with Us to a certain degree, but I do feel that most of her work lacks the depth I'm looking for. I'm looking for something a little more profound.
- I read quite a few classics in college, and Death of a Salesman was one that stood out to me a lot despite being a play. It's not something I would typically choose to read on my own, but I'd like to explore more thought-provoking texts like this.
I really want to FEEL something from what I read, and honestly whatever comes to mind after reading this post, please just throw it at me. I do enjoy a plot twist, and I don’t mind unconventional storytelling.
The content doesn't matter: I'm okay with explicit content or clean content. I'm really just looking to be hit in the chest with a message.
Thanks so much in advance 🙂
by livelaughloveev
1 Comment
I would try Kristin Hannah’s books, namely The Women, The Great Alone, or The Nightingale!