Guys, can anyone here please recommend me good books about fathers and fatherhood? Stories where men discover the essence of fatherhood, where their bond with their children is explored, where positive change is brought about in them through fatherhood as they navigate the deep waters of life, all just to give their children a better world. Simply put, I wish to read how good fathers can actually be. That their children are their strength instead of their weakness.
I've read The Road, but it was sooo bleak it hurt me. I would love to read something positive, cheerful, and good. Would love it if the mother is also present in the story and not just dead or abusive.
Prefer narration from the father's POV. And no foster situations either, please.
Open to any genre; hell, I would even love to read scientific nonfiction about animal fathers in the wild lol, just nothing too dark or bleak please.
Thank you!!
by Cherei_plum
1 Comment
I'm not a parent, let alone a father, but a great start would be Things My Son Needs to Know About the World by Fredrik Backman.