The beginnings of rock music clearly lie within black communities. It was mostly African Americans who pioneered rock 'n' roll.
However, at some point this seemed to change. I'd guess that around the late 70s or early 80s, most rock artists and audiences had become overwhelmingly white.
Today, rock is generally seen as a "white" thing, for both artists and fans. If we include genres that came out of rock, such as metal, we rarely ever see any black bands or audiences.
How did that happen? And when exactly?
Thank you very much in advance!
You might be interested in this previous question answered by u/hillsonghoods. Some key points I take from u/hillsonghoods' answer, which I'd agree with pretty strongly, are that: