Many people in what we’ll call “liberal” culture in the United States believe that this country is fundamentally racist, since slavery was central to our society when the Constitution was drafted and was itself written into that document (through the...