"Diets" - in reference to an eating regimen for the purpose of losing weight - have been around for a long time.
And I hate them.
More often than not, diets are restrictive in nature, and I simply can't understand why ANYONE would choose to deny themselves food. Sure, some foods are "bad," and we shouldn't eat them all the time, but purposely cutting out essential vitamins and minerals (or even the occasional chocolate bar) just seems extreme (and cruel).
I would much prefer the term "lifestyle adjustment" to the term "diet." I will, without hesitation, agree to eating healthier, exercising more, and stressing out less. (I think most nutritionists would agree that the main reason people gain weight is that they don't eat well, don't exercise, and end up stressed, which in turn causes them to eat poorly and stop exercising.) In fact, when I'm "not allowed" to do something, I get annoyed/angry/stressed/upset, and then I end up retaliating by giving myself what I wanted in the first place. Where's the sense in that?!
So, my friends, I urge you to stop "dieting." I'm not a health specialist, but I know my body better than anyone else does. It doesn't like restrictions; it likes options. Go hang out with your favorite health professional and learn how to ease your body into the healthy state it needs to be in, without putting it through unwanted mental and physical pressure that probably won't yield lasting results.
What do you think?