Gender Essentialism: Basically, this ideology holds that gender and sex go hand in hand, meaning that gender roles are the result of biology. Now granted, there is more to the idea than that, but that's the bit I'm focusing on for this week's question.
I was always taught that women are nurturers: women are maternal, loving, and filled with an overwhelming desire to have children. The home is a woman's sphere. Men, on the other hand, are meant to provide and protect; men are supposed to take care of women.
The argument has kicked up its heels again recently (what with all the furor over women in politics that has surrounded this election cycle), and I'm wondering how you feel about the issue.
Personally, I lean away from gender essentialism; however, nothing is really that simple, is it?
I'm curious: do you believe that there are defined gender roles for men and women? Do you believe biology affects gendered attributes? What about small things? There are dozens of studies that discuss the differences between men and women, such as this claim: "Men only worry about things that directly affect the physical safety of those they love, while women worry about the small things, which is why men seem lazy and women are nags" (Louann Brizendine).
Anyway, answer the question. What do you think?