Of or pertaining to the practices and institutions that legitimize and privilege heterosexuality, heterosexual relationships, and traditional gender roles as fundamental and "natural" within society.
(adjective)
Wiktionary.org: Text is available under the Creative Commons Attribution-ShareAlike License.