Naturism

ENGLISH MEANING
noun
1. The belief or doctrine that attributes everything to nature as a sanative agent.