it really depends on what state you live in, and what decade you grew up in. Southern states were particularly prone to whitewashing US history, especially with respect to colonialism and slavery. I did learn about slavery and indigenous genocide in school, but as an adult I still find the public education I received lacking, incomplete, and somewhat whitewashed, even if it was loads better than the McCarthyist, Daughters-of-the-Confederacy-sponsored shit I would have gotten jammed into my brain in the 1950s.
For example, here are some issues I had with my liberal education in the 1990s:
it was pretended that the civil rights movement was only successful because of peaceful protesters like MLK and was almost ruined by totally unwholesome radicals like the Black Panthers
it was pretended that only the south had an economic interest in slavery; the fact that the north also relied on slavery, indirectly, was entirely ignored
the civil war was depicted as an ideological crusade by the north to end slavery. this is an inversion of the confederate myth that it was about “states’ rights.” The main objective of the south was to preserve slavery. The main objective of the north was to preserve the union. Neither side was abolitionist; it’s just that abolition became practical in 1863 as the war dragged on. Lincoln issued the emancipation proclamation so he could enlist black soldiers and further demoralize the south. he had never been ideologically an abolitionist, though some in his party to his left (like Thaddeus Stevens) were.
it was pretended that all the problems of capitalism were entirely isolated to the gilded age, and that once we got a semblance of social democratic reforms (the 8-hour day, overtime pay, etc.) capitalism was now “fair.”
labor militancy was altogether ignored. it was pretended that social democratic reforms were won entirely because silver-tongued reformists demolished capitalists with logic and reason, not because shit like the battle of blair mountain happened.
it was depicted that indigenous genocide was mostly a matter of “both sides” being “equally mean.” i.e. that manifest destiny was mostly colonizers just protecting themselves from raids or something
zero mention of CIA coups or any of the stuff later revealed by the Church Committee
it also always ended with “but now we’re in modern times where racism is over, and we are friends with the native americans now =)”
Might be different now that history has restarted, but when I was going through school in the obama years, yeah, history was taught to me like a long-running TV show that had just had its series finale and all was well.
In my history classes, it was like black folks were a footnote until you get to the lead-up to the Civil War. Then after the Civil War they disappear from the stage again until the civil rights movement.
I did have a lib teacher who thought it was super important to teach us about Native American society and culture, even if he didn’t cover the genocide part as much as he could have.
Texas effectively dictates what goes into most states’ textbooks. Every American child grows up learning a lot of bullshit.
That’s exactly what I was taught too.
Live in a red shithole, rural public education. Still learned the horrors of slavery, the trail of tears, the Black Panthers, etc.
who must go?