Hello, beautiful people! Can you kindly recommend a book or set of books on American history, covering everything from the period when only Native Americans lived there up to the present day? How all the development happened, the story of independence, the Civil War, etc. Most importantly, it should not be biased. Please let me know in the comments if you know of any such books. Suggestions for biographies or autobiographies are also welcome if they add to the topic. TIA!
by Citizen_0f_The_World
2 Comments
You could start with *A People's History of the United States* by Howard Zinn. *That's Not in My American History Book* by Thomas Ayres is another.
There’s no such thing as an unbiased history book, but here are some really good ones that are upfront about their biases and include loads of references you can check for yourself if you doubt any of their claims.
*Not a Nation of Immigrants* by Roxanne Dunbar-Ortiz
*A History of America in Ten Strikes* by Erik Loomis
*Black Against Empire* by Bloom and Martin