Looking back on what I was taught in elementary, middle, and high school, particularly in my history classes, I have come to realize that American schools, at least in my personal experience, sugarcoat large portions of history. The education system teaches a version of history in which the dark, dirty parts are conveniently glossed over or omitted from much of the curriculum that students are taught.
Even as early as elementary school, I cannot recall a single instance in which a teacher spent an adequate amount of time discussing the parts of American history that are painful to hear, whether the wars the U.S. waged abroad or the atrocities and crimes against humanity it committed at home.
From a young age, we are taught to celebrate Christopher Columbus and the other settlers who made their way to the Americas, naively praising them for settling the land that would eventually become this “great nation.”
Yet when something like Thanksgiving is discussed in school, we fail to confront the brutal reality on which this country was founded. I can sincerely say that I did not learn the real, violent history of what Native Americans endured under colonization (and continue to endure as a direct result of it) during my primary school years.
In my educational experience prior to college, I also cannot recall any of my teachers or textbooks going into detail about slavery. Yes, slavery was touched upon, but the dirty, horrendous reality that enslaved people were subjected to, from before this nation’s inception, was never actively discussed. The same glossing over marked every other kind of history I learned in school before college.
Whether it was the Japanese internment camps that violated the constitutional rights of Japanese and Japanese-American people in this country, movements such as those for women’s rights, LGBTQIA+ rights, or the United Farm Workers, or the CIA-backed regime changes, coups, and brutal dictators the U.S. supported abroad, these subjects were seldom mentioned, if at all, in any of my history classes.
On the contrary, none of my classes or teachers ever taught history from a critical point of view, one that critiqued the U.S. or called it out for its brutality.
Even when current events are taught, the U.S. is never discussed or analyzed from a critical perspective, and this kind of blind patriotism becomes ingrained in students. There are, no doubt, things the U.S. should be praised for, and that people should be proud of.
However, when one blindly loves one’s country and fails to recognize its shortcomings, corruption, or brutality, it is dangerous: failing to criticize one’s own country breeds blind nationalism, the conviction that one’s country alone is the best and that other countries are inferior.
This kind of nationalism prevents people from seeing their country for what it is, or from recognizing when it is engaging in wrongful, violent behavior.