westernism
/ˈwɛstərnɪzəm/
Definitions
1. noun
A cultural or philosophical ideology that emphasizes Western values, customs, or attitudes, often regarding them as dominant over, or superior to, those of other cultures.
“The professor’s critique of Westernism in literature highlighted the need for greater cultural diversity in educational curricula.”