Particularly books that focus on the Western colonization of the Hawaiian Islands from the perspective of Native Hawaiians. I'd also happily take suggestions that cover Hawaiian history before colonization.
Bonus points if you can suggest nonfiction that almost reads like fiction, since I've always had a hard time getting pulled into nonfiction.
Also, if there’s a better subreddit to ask this question, please let me know!
by Mad_kling
One comment suggested:
Reclaiming Kalākaua by Tiffany Lani Ing.
Taking Hawai’i by Stephen Dando-Collins.
Hawai’i’s Story by Hawai’i’s Queen.