Earlier this week, during a conversation with my friend about natural hair, we debated whether natural hair is now just a trend. In other words, are women going natural because they actually care about what they're putting in their hair? Or are they just jumping on the bandwagon because they see everybody else rocking a natural style? It's very apparent that natural hair has become popular in the last three years. For instance, look at Black magazines like Essence: they now feature more articles on natural hair, and there's even a page on their site dedicated to it. Also, let's not forget that most hair product companies are now focusing on adding more natural ingredients to their products.
So the question is: did you go natural because everyone else was doing it? Or were you concerned about the health of your hair? Does it even matter?
I know that personally, between stress and perms, my hair was falling out and breaking off, and I wanted a healthier solution. For me, that solution was going natural.
What do you think? Take the poll or leave a comment.