I've heard some people mention on here in the past that they don't see abundant racism in America anymore. I've also heard that racism doesn't affect people that much in America because we're such a melting pot.
If this story doesn't change their minds, then I don't know what will, short of another series of lynchings.
https://www.msnbc.msn.com/id/12011019/site/newsweek/
The story is about a black, female doctor. She comes into the room, introduces herself as the doctor, examines the patient thoroughly, and then explains what is needed. Afterwards, the patient asks, "When is the doctor going to see me?"
She has had a tough time because she is a female doctor; however, she says she knows several other female doctors who do not get "the look," as she calls it, nearly as often as she does. Of course, the other female doctors she knows are white.
She even tells the story of how one little black girl (six years old) refused to let her treat her. The little girl wanted only a white doctor. The girl knew nothing of racism at such a young age - she just knew that, according to those around her, doctors are "supposed" to be white.
The doctor says that whenever someone stares at her ID card, she feels like saying, "It's true. I'm a real doctor. Perhaps you've seen a black one on TV?"
This is a very discouraging article that shows that racism, though perhaps of a different sort, still runs rampant in our society. As long as blacks aren't expected to do the same jobs as whites, there will be racism. The same goes for every other minority group: Latino, Asian, Jewish, etc.
Give the article a read and let me know what you think.