So I recently picked up What Hath God Wrought by Daniel Walker Howe. Despite being American, I feel like I'm pretty woefully uninformed about my own country's history, so I've been wanting to read more.
The Jacksonian Era has always seemed interesting and this came off as a fairly comprehensive book on it, so I read some reviews, noted that there's some anti-Jackson/pro-Clay and Whig bias, but hey, whatever, I can deal with that. The problem is that I'm not even out of the first chapter, and both it and the prologue have been a constant assault on how evil whites were for engaging in conquest of the noble and amazing Native Americans, who were oh so complex and awesome. There's a tone of "we should still be flogging ourselves for the acts committed against the natives and blacks, the most evil acts man has ever done." It reads way more like a 700-page tract on how much white people back then sucked.
And really, looking around, it seems like a lot of books that cover the 1800s, and especially westward expansion, have the same general tone. That's not to say I want the opposite: giant conservative screeds about how awesome America is and the glory of Manifest Destiny and fuck the redskins. I know every writer will have their biases, but I think it's possible to temper them: I'm reading Embracing Defeat at the same time, and so far the author does a fairly good job of not letting himself make constant moral judgements about the American occupation forces and so on.
So I guess: are there any good books on American history, specifically the 1800s and maybe westward expansion, that are a little more straightforward and don't lay the anachronistic moral judgements on so thick? And American history in general too, talk about American history and whatnot ITT.