I’m looking for books that explore how the idea of “woke” shapes the way people talk about movies, TV, and music.
For context, I’m a visual artist and cinephile. Over the past few years, some of my more conservative friends have started dismissing things I recommend as “woke”—like The Last of Us HBO series or, more recently, One Battle After Another (OBAA).
It frustrates me when “woke” gets used as a catch-all to reject media without engaging with it critically. In the case of OBAA, I could talk about its pros and cons for hours, and I’d love to talk with my conservative friends about it, too. From what I can tell, “woke” has become a kind of cultural shorthand for bypassing criticism, and I’d like to understand why that’s happening.
Can anyone recommend books that unpack the origins, evolution, and cultural branding of “woke” as it relates to art and entertainment?
by cwcoates
1 Comment
The Origins of Totalitarianism by Hannah Arendt