There are thousands of reasons I could list for why you might need to heal your relationship with your body. Dominant American culture is deeply rooted in white supremacy, misogyny, cisheteronormativity, ableism, anti-fatness, puritanical shame, and capitalism. While all of these will impact our individual lives differently, they all serve to teach us that in some way (or usually many ways) our bodies are bad, wrong, and not enough. Often these larger societal messages overlap with specific, direct messages we’ve heard from families and loved ones. So many of us have internalized deep hatred of our bodies and intense shame about their needs and desires. If we leave this unaddressed, we easily pass it on to one another, intentionally or not.
I work from a place of centering pleasure. Your body is yours to delight in. Your body is your constant companion throughout your lifetime. I want you to experience joy and pleasure in your being. This might mean healing your relationship with food, movement, masturbation, sex, your gender, your sexuality, or simply how you look at yourself in the mirror. In a culture that teaches us that our bodies are things to be dominated, shamed, and controlled, learning to accept your body, and to feed it in the ways it needs, is nothing short of a radical political act of rebellion.