America has always had racism, but America has never been a racist country
I’m really quite stunned by this claim. She seems to be parsing words to dodge reality. The US wrote slavery into its Constitution, carried out a series of racist genocides against indigenous peoples, and excluded immigrants of various ethnicities throughout its history. We’ve certainly improved, but to say the country has never been governed by racism is either ignorant, revisionist, or pandering to some of the worst people.
For that matter, why did it take a Constitutional amendment to recognize Black Americans as citizens?