Is tanning purely a 20th-century invention, or did earlier cultures with fair skin put emphasis on intentionally "darkening" their skin?

by Euralos

Pretty much what the question says, I am curious to know if anybody pre-1900 was tanning and, if not, why did it become such a boom in the 20th century?

reqdream

I'm not sure about other cultures, but pale skin was the standard of beauty in pre-modern feudal Europe. Tanned skin was associated with lower-class laborers, since they spent much of their time working in the sun. So I think it's safe to say no one was intentionally tanning during this period.

Here is a short but thorough article about the history of tanning (based on 100,000 Years of Beauty) from the Skin Cancer Foundation. The tl;dr of it is that until very recently, most societies favored fair skin.

It also addresses the popularization of tanning. In the mid-19th century, light therapy was introduced and marketed as a treatment for ailments ranging from tuberculosis to depression, leading to the concept of a "healthy glow". Then, in the 1920s, Coco Chanel popularized tanning for fashion and beauty.

Schootingstarr

As with many standards of beauty, tanned skin is associated with wealth and is therefore viewed as desirable.

As reqdream already stated, tanned skin was usually a sign of a less prestigious occupation that required people to work outside.

In our modern societies, the roles have reversed: people usually work indoors, so their skin stays fair. Tanned people, by contrast, have enough free time to spend outside, or even abroad on vacation, to get a darker complexion.

The same goes for women's weight. You may have noticed that the women in paintings from a few hundred years ago are all a bit chubby by today's beauty standards; an even more obvious example would be the Venus figurines.

Back then, those proportions were an obvious sign of wealth and good health. Today it's the opposite: it's fairly easy and cheap to become overweight, but really hard to stay slim.