Hi! I am looking for a book that deals with trauma and a small whisper of hope. I typically enjoy fiction, but am not really interested in fantasy or romance. I am feeling quite lost in my healing journey, and part of that feels like it comes from my lack of connection with other people and their stories – I feel like I need something that tells me everything will be okay, no matter how crazy life has become. I am hoping to avoid things that reek of 'toxic positivity' or anything overly optimistic – I want something more on the realistic side. Like a terrible fever that breaks and you can finally eat, or a long harsh winter that finally shows days of warmth coming through. Idk if this is even making any sense but thanks in advance to you all <3
by SkotchMiist