dentalism
/dɛnˈtælɪzəm/

Definitions
1. noun
The practice or theory of dental care that emphasizes natural chewing function and prevention, typically favoring non-invasive treatments.
“The dentist specialized in dentalism, recommending regular check-ups to prevent future problems.”