To be androgynous, by definition, means to be neither specifically feminine nor masculine. But there's more to it than that — it's about fearlessly owning who you are, even when who you are doesn't fit into society's convenient little box. It's about confidence, self-expression, and, most importantly, self-love. And the beauty industry is finally becoming a place where that is more celebrated than ever.