Hi all, 20-year-old male here. I've always been curious about feminism and gender roles in society, but I've never really been exposed to these topics (only when I was younger, through those "feminist owned" pipeline videos), partly because I grew up in a Christian household.
While I've changed my way of thinking a lot as I've gotten older, especially in how I acted on and viewed masculinity and my old homophobic tendencies (some books on this would also be appreciated), I've never seen any of it from women's perspective, nor do I really know the history of it.
I've always been educating myself and have been passionate about helping men grow, change, and express themselves, but I realise it's just as important to understand women's experiences.
I'd also like some recommendations on Black women in the civil rights movement.
TL;DR: books about the culture, history, and current views we hold of women today, how those views affect them, and how they can change.
Thank you
by Dark-rythem