Is this a misconception? Was it all really just about slavery? What were the rights that these states were fighting for? More state control/less powerful government?
Hey there! More can always be said, but you might enjoy reading these older threads on the subject, or the similar "Was it really about slavery?" question.
Why can't the Civil War be called the "War of Northern Aggression," or a war about states' rights? by /u/freedmenspatrol is particularly great, and shows how the whole "states' rights" framing only emerged after the war. Before and during the war, the South was more than happy to trample over every other state's rights.
There's also Was the American Civil War about more than just slavery? by /u/Georgy_K_Zhukov, which shows that the cause of the war ultimately leads back to slavery. The argument runs through a number of different threads, but every one of them ends in slavery.