So I’ve gone down a bit of a rabbit hole of finding out about the weird societal conceptions around periods. I’ve been thinking a lot about what the world might look like if people had actually given a shit about the fact that women have periods. Now I’m asking the wisdom of reddit: are there books you can recommend that give you that “The Body Keeps The Score”-type effect of “this book… changed my life. This book… made me change my view on *periods*”? Essentially a discussion of periods and/in society. Ty <3
by LavishnessTop3088
1 Comment
Oh, the book “Cunt” (can’t remember the author) was just amazing on the female body and periods.