
From the 1600s to today, Westerners have always very consciously known that liberalism and "freedom" are for white people. The US was terrified by the Haitian revolution, First Peoples were purposely targeted for genocide, and SCOTUS explicitly ruled in 2019 that non-westerners have no human rights under US law.
While the founding fathers weren't great people, the uniting idea that formed the USA and continued to push it forward was freedom: that the people are the will of the country and will come together in others' time of need. That when change needs to happen, our people are unified by the idea of working towards a better future for all of us and won't tolerate a government that doesn't follow the will of the people.
Actually, that's not true. That's propaganda the empire teaches to justify its existence. I recommend you read Domenico Losurdo's Liberalism: A Counter-History, or Achille Mbembe's Necropolitics, or J. Sakai's Settlers. Western liberalism has always been based on the idea that "rights" and "freedom" are exclusively for westerners.
yea idk why people act like the US wasn't built in blood. We should not glorify that or the ongoing genocide our government is responsible for. Chicago folks are doing great things with their support for undocumented people though! That's the society we want and the dream we're sold, and it's worth fighting for, but it's not what our government has ever endorsed.
The uniting ideas of our country, and the ideas that led people to seek out our country, are not the same as the abuses people in our government (current and past) have endorsed or participated in. Wanting African slaves wasn't why the USA was made, wanting women to be subservient to men wasn't why the USA was made, etc. Yes, it was definitely a part of the people who made it and how it was built, but those aren't the uniting ideas (unlike with the Confederates, whose uniting idea was slavery).
I'm talking about the USA as the actual people deciding to unify into the USA and into the Union, not the government that took control of the USA during those times. I absolutely hate our government and the things it has done, but I don't equate a country and all of its people with its government, or the government's values with the people's values. So when I'm talking about the uniting principles, I'm talking about the ideas that brought people together to the USA and to form the USA.
Like in my own family history, my ancestors came to the USA in hope of a better future, and when they got here they settled in a pacifist community, but ultimately uprooted everyone because they wanted to fight specifically against slavery, so they joined the Union (this is all over multiple generations). So I see THAT as the uniting principles, and not the government endorsing slavery or genocide, etc., because it's actually what brought people /together/.
I hear you, and I wish your family's story was the norm as far as Europeans who moved here go, because we would all be in a better place if that were the case. There have always been people who celebrate diversity, and those stories shouldn't be erased. But I think there is an inclination to overstate how common the good parts of the past were, which is just as incorrect as saying they didn't exist, so I don't want to sound too doomer about it.
I'm not talking about the US gov. I'm saying the US was explicitly founded as a white supremacist entity where "freedom" is for white Europeans, and that has always been the understanding. This isn't my opinion; there is a wealth of academic research and primary sources backing this up.