Why Men Don't Like Doctors
For as long as anyone can remember, the overall care of family members (especially children and the elderly) has been in the hands of women. Men, not so much. Social role labels may seem a little cliché or square, but the truth is they exist for a reason, whether we agree with them or not. Because of the role women were assigned, men were somewhat alienated from the "caregiver" group. Men see their doctor as children, and it is mom who takes them. From adolescence onward, men are cut off from health-related interactions until they have to re-establish that relationship at forty or fifty to have their prostate checked or something.