Eye Candy
Our body is how we interact with the physical world, and it also holds our most basic instincts. It's where we feel pleasure and love, judge and destroy, bleed and smile. Since David and Venus, we've placed the human body on the highest pedestal, setting and upholding an unreachable standard of beauty. Through this warped view, our natural shape has been sexualized, demonized, and at times shamed into taboo. Why is the naked body so offensive? What does it say about politics? Are we not more than what we seem to be?