Hi! I have been interested in the Western genre for most of my life, but I've never been able to get into it because lots of books written in the genre are extremely racist and/or extremely depressing. Are there any books that focus on cowboys/outlaws in the West that are more upbeat and don't portray Indigenous people so poorly? I apologize if this is a bit of a niche ask. Thanks!
by Juno_Da_Vampire