Before European Christians Forced Gender Roles, Native Americans Acknowledged These 5 Genders

A gender role is a set of societal norms dictating the behaviors considered acceptable, appropriate, or desirable for people based on their actual or perceived sex or sexuality. But it wasn’t until Europeans colonized North America that many Native peoples were pressured to adopt these rigid roles. For many Native American tribes, there was no such thing as a fixed set of rules that men and women had to follow in order to be considered a “normal” member of the community. People who had both male and female characteristics were viewed as gifted by nature because they…