doctordom
/ˈdɒktɔrdɒm/

Definitions
1. noun
The state or position of being a doctor, especially one that carries authority or influence.
“His doctordom in the medical community was unmatched by his peers.”