Hey guys, I’ve read so many books about feminism and women’s roles in society, and now I’m really interested in masculinity: how it’s currently perceived and portrayed in society, as well as its history.
Whenever I look for books on this, they just feel like the typical “men’s self-help” trope, and I don’t feel inclined to explore them. Idk if I’m prejudging them based on the cover and blurb, but if someone could recommend some insightful books on the topic I’d really appreciate it :))
by Status-Chemistry3583