Tanning remains popular, particularly among women, despite its well-known risks. When did tan skin first become fashionable?
Research by Dr. Deborah Cummins and her colleagues at the Johns Hopkins School of Medicine indicates that, based on a review of popular magazines of the period, the shift toward favoring tan skin in American popular culture likely occurred in 1928.
Prior to the 20th century, fair skin was often associated with beauty and refinement. In the 1920s, new theories led people to believe that sunlight was good for health, and tanning subsequently became popular. A wealth of data now documents the many risks associated with tanning, most notably skin cancers including melanoma, and many public health efforts center on reversing the tanning trend.
Reference: Cummins DL, et al. "Changes in Skin Tanning Attitudes: Fashion Articles and Advertisements in the Early 20th Century."