
Webster 1913 Edition


Naturism

Na′tur-ism, Noun. (Med.)

The belief or doctrine that attributes everything to nature as a sanative agent.

Definition 2024


naturism

English

Noun

naturism (countable and uncountable, plural naturisms)

  1. The belief in or practice of going nude in social settings, often in mixed-gender groups, specifically either in cultures where this is not the norm or for health reasons.
  2. The belief or doctrine that attributes everything to nature as a sanative agent.
