biologism
/ˌbaɪəˈlɒdʒɪzəm/

Definitions
1. noun
The belief that the natural world, particularly living organisms, should serve as the basis for social and political decisions.
“The environmental activist argued that the town’s new development plan was driven by biologism, prioritizing the needs of local wildlife over human interests.”
2. noun
A theory or ideology that emphasizes biological factors in explaining human behavior and society.
“The social scientist’s work was criticized for promoting a form of biologism, which some argued oversimplified the complexities of human culture and history.”