I know it was considered important for women to marry as virgins. But there are so many shows/books where women have sex before marriage. Was it an unspoken truth that most women did?
There's always more to be said, but you might be interested in my previous answers:
ETA: Whoops, /u/AlternativeQueen, I also meant to include "Teenage love in the Victorian era", which speaks exactly to your question!