I read tons of books that focus on mental health, gender equality, women’s rights, LGBTQ+ struggles, or the “life is beautiful and everyone has their reasons” type of message. I am not saying those books are not important; they have really helped me understand a lot since I was 8 years old. But right now, I just want a book that is purely gripping and unforgettable and doesn’t try to teach me anything about how to be a better person or how society should be. Any genre or theme is fine. Just no “life is beautiful” stuff, please. Sorry if this comes off as rude. Thanks, everyone.
by Every_Smell_1204
1 Comment
This feels like a bit of a shot in the dark, but: And Then There Were None by Agatha Christie. Ten people are invited to a mysterious island and start getting murdered in the pattern of a nursery rhyme. So if murder mysteries appeal to you, I can attest that it doesn’t preach about life being beautiful.