For several centuries, black people have suffered all kinds of racism and abuse, which seems strange to me, since science has shown that humans originated in Africa and that the earliest humans were dark-skinned.
So my questions are: what moment in history marked a before and after for black people? And why wasn't it the other way around? Given all of the above, why didn't black people enslave white people?
You might be interested in some previous answers on the history of racism: