3 Tips for Parenting to Promote Boys’ Positive Body Image

What is body image? It is the perception of one's body, along with the thoughts and beliefs associated with that perception.

Below, I discuss this topic, particularly how it relates to boys, with Charlotte Markey, a Psychology Today blogger, author, professor of psychology, and founding director of the Health Sciences Center at Rutgers University.