Feminism – What Is It?

Feminism is the belief in the social, economic, and political equality of the genders.

We have yet to achieve equal opportunity and experience in the workplace or in society at large. Men, predominantly straight white cis men, are disproportionately represented in roles of influence, i.e. men control power and money at work.

Conscious and unconscious gender bias is present in all of us. Let’s work to make the world a better place irrespective of people’s gender.

“Feminism will make it possible for the first time for men to be free.”

Floyd Dell, The New York Times.

Learn More About Feminism

Feminism Defined


Further Resources To Deepen Your Understanding Of Feminism