There's this idea, especially in media set in the 19th century, that society looked down on women who had even a passing interest in any topic outside the domestic sphere. But based on what I've read, during and after the Enlightenment especially, there was a growing line of thought that women should be educated in a variety of subjects, if only because they were their children's first teachers.
I understand that women more often than not didn't have the same educational opportunities as men, but were educated women actually disapproved of in the past? And did it depend on their social class?
Before someone tackles the large question of female education, you may be interested in my previous answer about the adjacent issue of female reading and how it was frowned upon (or not).