Someone told me about this book, and the title seemed interesting, but upon looking into it, it seems to be heavily based on pushing the idea that "trauma isn't real, don't pay attention to it, it's not a big deal" — live laugh love sort of stuff.
Are there any better books you think fit the title's premise — living courageously in a world where you are disliked — that aren't written by people who think completely denying trauma is the way to go?
by starlight_chaser