Hey y'all.
My mother is visiting Japan soon, so she bought a book called "The Japanese Mind: Understanding Contemporary Japanese Culture," edited by Roger J. Davies and Osamu Ikeno. I took Japanese language classes for years, so the book seems interesting to me.
But it occurred to me that while I've seen many books like this about Japan, as an American I don't know of any books that do the same for America. The ones I do know of are specific to American subcultures (racial groups, religion, etc.), not America as a whole.
I'd love to read a book like that to help me understand my own culture. Does anyone have any good recommendations for a book that'd be the equivalent of "The American Mind: Understanding Contemporary American Culture"? Thanks!
by cyanmaar