plantar
/ˈplæntɑr/

Definitions
1. adjective
Of or relating to the sole of the foot.
“She landed awkwardly, straining the plantar surface of her foot.”
2. verb (Spanish and Portuguese)
To plant; to set something firmly in the ground.
“El jardinero va a plantar el árbol junto al muro.” (“The gardener will plant the tree next to the wall.”)