dentistry

/ˈdɛntɪstri/

Definitions

1. noun

the branch of medicine concerned with the study, diagnosis, prevention, and treatment of diseases and conditions of the teeth, gums, and mouth.

“She decided to pursue a career in dentistry after volunteering at a local clinic.”

2. noun

the work or profession of a dentist.

“He opened his own clinic in the downtown area after a decade of practicing dentistry.”

Synonyms

  • dental medicine
  • oral medicine
  • odontology
