Where did the concept of women being expected by society to shave their legs, and even more intimate areas, come from? Is it a modern phenomenon, or does it trace back further than the 20th century? When did it become the norm in British, or Western, society?
Thanks in advance.
There is always more to be said, especially on this topic, but you may be interested in my rather old response to a similar question here. The fact is that there has not been much research on the subject.