hi everyone!
i’m looking for fiction about healthcare practitioners/medicine. i’m not a healthcare practitioner myself, but i work adjacent to the field and i am just fascinated by medicine.
i just finished reading The House of God by Samuel Shem, which i guess is one of the classic books about medicine and doctoring. it was pretty interesting, but kind of outdated/borderline offensive in a lot of ways, so i’m looking for something a bit more recent (it was published in the 1970s). or at least, for lack of a better term, more “woke.”
i’m a big fan of Abraham Verghese (The Covenant of Water, Cutting for Stone). i do love doctors’ memoirs (like Verghese’s and all the books by Atul Gawande) and books about the history of medicine, but i’m really looking for fiction at this point.
any suggestions?
by thefaceinthefloor