The 10 Worst Lies That Society Tells Women

The world often lies to young women while promising happiness and fulfillment. We’re sold pyrite and told it’s 24 karat gold. It’s time that we weed through and expose those lies so we can become the strong, good, and beautiful women we were created to be.

By Molly Farinholt · 4 min read
Shutterstock/Jacob Lund

1. Marriage and Children Are Barriers to Your Dreams

Society loves to tell young women that the housewife role is antiquated and wrong. We’re told that our worth is in career success, not raising a family. Those who choose to “just” stay at home are said to be capitulating to old-fashioned ideas of the female role. If unshackled from the “burdens” of husbands and children, they could be chasing their dreams. We should be told that 1) you can achieve other dreams while also being a wife and mother, and 2) being a housewife can be your sole dream. 

2. Monogamy and Virginity Are Virtues of Yesterday

Since the Sexual Revolution of the ‘60s, women have been told that sleeping around is an expression of their freedom. Monogamy and virginity are no longer badges of honor, but rather traits of the uncool, the inexperienced, and the oppressed. 

Society tells us we can’t be happy unless we’re intimate with multiple people. Committed, long-term relationships with the end goal of marriage are said to be boring and confining. Women are told that they can’t “find themselves” in such relationships. The reality is that sleeping around has devastating consequences while monogamy leads to true and lasting happiness. 

3. You Need To Have the Body of a Supermodel 

Beauty standards have shifted greatly over the course of history. Once, a woman’s natural body — unaltered by diets, excessive exercise, and surgery — was considered beautiful. Healthy, fertile bodies were the subjects of great masterpieces. They were seen all over Hollywood. They were posted on billboards and in magazines.

Now, we’re made to believe we should have the slim bodies of supermodels or the toned arms and abs of elite athletes. We’re given tips on how to cut calories, work out like men, and alter ourselves to fit society’s standard. Our health is tossed aside in this vain pursuit. We don’t need to have balanced hormones, good mental health, and fertile cycles; it’s more important that we look like Jillian Michaels.

4. There's No Difference Between Allure and Blatant Sexuality

Society also sells us the idea that true beauty and attractiveness come increasingly from sharing our bodies and sex lives with the world. For this reason, women’s clothing lines are more focused on getting Instagram likes than on helping women build a stylish, sophisticated wardrobe. Modesty is said to be a quality forced upon women by an oppressive patriarchal society. We should be free, instead, to exhibit our bodies, attracting attention from all. Paradoxically, society then condemns men for looking. We’re supposed to be seen as “sexy,” just not by the men we’d rather not have view us that way. Got it?

5. Your Body, Your Choice 

The same culture that tells us we need to mold our bodies into their image of perfection also tells us that, when it comes to childbearing, it’s “our body” and “our choice.” This narrative hurts women because it reinforces that we are victims instead of active participants in our lives. We have a lot of choices, not just the one to terminate a pregnancy.

We’re not told that abortion leads to depression and other psychological issues. We’re not told about the many physical risks faced by women who choose abortion. We’re not told that we need to take responsibility for our sexual choices. Instead of encouraging women to act the victim in their own life, why doesn’t society lift up women in their ability to grow and nurture life? 

6. Men Are the Enemy of Female Empowerment 

According to our culture of victimhood, women are the perpetual victims in a male-dominated society. Men are the enemy in education, sports, the workplace, and politics. Why do we not hear the truths that women outnumber men in American universities, that there are more female than male sports teams in the NCAA, that the wage gap is a myth, that many women hold powerful positions in the business world, and that there are 26 women in the Senate and over 100 in the House? Women have just as much opportunity as men do in our country, but society would rather have us continue to play the victim card.  

7. You Can Do Whatever a Man Can Do

Today’s version of feminism is less about promoting femininity in all of its unique splendor and more about encouraging women to act like men. The general goal of equality is well-intentioned, but the type of equality and the means of attaining it are misguided and unsound. Women don’t need to — and oftentimes can’t — do whatever men do because we aren’t physically, mentally, or emotionally the same as men.

Instead of attempting to undertake the doings of men, women should be pursuing the many wondrous acts and habits that are distinctive of our sex. In doing so, we’ll bring back femininity in all its glory and reach new heights as women. 

8. Your Life Is Less Relevant If You’re Not Present on Social Media

Statistically, women use social media sites such as Instagram and Facebook more than men. Predisposed to comparison, women often see the “highlight reels” of others’ lives and suffer feelings of jealousy, inadequacy, and pressure to become like the glossy versions of others. We’re made to feel that if we don’t post perfectly curated photos showcasing “perfect” lives, then we’re irrelevant and insignificant. We’re missing out. We’re doing something wrong. 

This influence to overshare is actually what’s making us miss out. We miss out on being present in our own lives, on seeing and appreciating ourselves for who we truly are, and understanding that what matters is not the highlights but life in its entirety — the ups, the downs, and everything in between. 

9. There Are No Good Men Anymore 

Where have all the good men gone? The answer? Modern-day feminism has either destroyed them or sent them into hiding so they are, in fact, harder to find. We desire chivalrous men, but we’re told that chivalry is bad. We’re told that it’s demeaning to women when, in reality, chivalry honors women as beings worthy of deep respect. Because of this lie, many men today are afraid to act properly towards women. Those who still do are often demonized. They do exist, though. We’re just looking in the wrong places (e.g., bars, nightclubs, college parties) and for the wrong things (e.g., one-night stands and casual relationships).

10. You Have To Be Involved In Politics

"The personal is political." There has been a push to politicize everything about our daily lives. We are supposed to view everything from our work to our romantic relationships through the lens of feminism and the patriarchy.

This type of thinking is incredibly destructive. It undermines your trust in your coworkers, your partners, and your society. If everything is part of a system being used to oppress you, how can you ever achieve happiness? Women are encouraged to abandon such “tawdry” pursuits as marriage and children in exchange for “proving themselves” in the corporate world while fighting the patriarchy in the street. It’s okay, and in fact healthier, to keep parts of your life separate from your political views.

Closing Thoughts 

Society may not abandon its false narratives, but that doesn’t mean that we have to listen to them. We can be independent thinkers, seeking out and living by the truth. We can rewrite the narrative and reclaim true femininity.