Definify.com
Webster 1913 Edition
Naturism
Na′tur-ism
Noun.
(Med.)
The belief or doctrine that attributes everything to nature as a sanative agent.
Definition 2024
naturism
English
Noun
naturism (countable and uncountable, plural naturisms)
- The belief in or practice of going nude in social settings, often in mixed-gender groups, specifically either in cultures where this is not the norm or for health reasons.
- The belief or doctrine that attributes everything to nature as a sanative agent.