Having lived in both the US and Canada, I’ve noticed that the two countries tell very different stories about the War of 1812. Canada claims the US was trying to conquer it, but I remember being taught that the British provoked the US into war through their overreach of power. (Many Canadians I know also claim they “won” that war, but again, that’s not what I was taught in grade school…) I’m not much of a history buff, but this completely fascinates me. Are there any books that present a more neutral take on this specific war?
by Witty_Swing4243