Hi all! I’m interested in finding books that can educate me on many different social issues! I’m looking for strictly non-fiction books that talk about issues such as:
- Institutional/Systemic Racism
- Anti-Fascism/Revolution
- Propaganda (The power of it, how to spot it in the media you consume, etc.)
- The way the government maintains our oppression (debt/poverty, policies, healthcare system, etc.)
- Feminism & Gender
- Women’s Health & Hormones
- Alt-Right Pipeline (how to understand social trends that seem innocent but are actually manipulating the way you think/view society)
- Indigeneity
- Individualism
- Capitalism
- Health and Wellness – How we’ve become estranged from it, and how it is repackaged and sold back to us
- Environment/Nature
- Herbal medicine/teachings (e.g., naturally occurring gut-healthy foods that we’ve been taught to demonize for corporate profit)
I know this is a long list! But I’m seeking radical change and knowledge is power! I am in Canada so I’d enjoy local recommendations, but I am also okay with US/Global authors and information.
Thanks so much in advance!!
by Freckles-234
1 Comment
(3rd bullet point) How to Stand Up to a Dictator by Maria Ressa is a fascinating dive into propaganda via Facebook and the election of Marcos in the Philippines.
(4th bullet point) Empire of Pain by Patrick Radden Keefe – this isn’t exactly your 4th bullet point, but it’s fascinating. It covers the rise of the Sackler family and their role in the opioid crisis. I chose it for this one because it shows the corrupt nature of profit-driven healthcare businesses.
(11th bullet point) Braiding Sweetgrass by Robin Wall Kimmerer – this focuses on the wisdom of Indigenous cultivators as an alternative to Western farming practices.