Health

10 Things That Happen To Your Body When You Get Pregnant

It turns out that pregnancy is the ultimate bio-hack.

By Alicia Bittle · 6 min read
Pexels/Alexander Krivitskiy

When it comes to health, wellness, and longevity, diet and exercise get all the credit. But why? Why is what may be one of the most important factors in women’s health always left out?

Pregnancy, when it’s discussed at all, is consistently cast in a grim light. According to the media, you should either be actively trying to avoid it or recovering from it.

It’s treated like the plague. To top it off, women in America face an absurd societal paradox: we’re expected to act as if nothing happened at all by “bouncing back” to our pre-pregnancy bodies as quickly as possible, while also behaving as if our bodies have been ravaged of every last nutrient and gained nothing in return.

This has been the story for decades, and it’s clearly designed to serve the anti-natalist agenda that has been festering in the American psyche since the sexual revolution of the 1960s. So, what’s the unbiased truth?

We’ve heard enough about the possible downsides of pregnancy, and frankly, that sorry script has become cliché and boring. So let’s flip it. What are the benefits of pregnancy? And how do they stack up against diet and exercise?

As a mother of four, I’m tired. Tired of the lies, the gaslighting, and the intentional fear-mongering that surround motherhood, and I’m determined to put a stop to it.