LogicWarrior Demand Reason


The Faults of the Calorie Counting Myth

Summary: The calorie counting myth states that small daily errors in counting calories accumulate over time, that over decades they can amount to a large swing in weight, and that, since no one can count calories with sufficient accuracy, calorie counting should not be done.  This notion is unreasonable because it assumes the dieter uses no feedback mechanism and makes no adjustments, such as periodically checking the scale or noticing body changes.  Calorie counting can be a useful technique, but it has limitations.  Asking people to discard the tool on this premise is arguing in bad faith.

Background: I first encountered the Calorie Counting Myth, the idea that weight change cannot be accurately described as calories consumed minus calories expended throughout the day (the energy balance equation), on Windows Weekly, hosted by Paul Thurrott.  He was reading "Why We Get Fat" and commenting on how calorie counting didn't work.  His argument, which I believe is repeated from the text, cites that to gain a pound a year, one only needs to eat one additional cracker a day.  Over time, this adds up, and even an astute calorie counter can become overweight over decades.  Later, I read a Reader's Digest article, “Is This Any Way to Lose Weight?”, from the February 2011 edition, which stated the following (I got the text from PrimalToad, as the article is not available online directly).  It is a more detailed version of the "small errors build up" argument above and can be skipped:

Public health authorities want us to practice ‘energy balance,’ which is a new way to say that you shouldn’t take in more calories than you expend. So what does energy balance entail?

If you consume about 2,700 calories a day, which is typical if you average men and women together, that’s a million calories a year, or ten million calories in a decade. Over the course of a decade, you’re eating roughly ten tons of food. How accurately do you have to match calories-in to calories-out so that you don’t gain more than 20 pounds over the course of a decade? Because if you gain 20 pounds every decade, you’ll go from being lean in your 20s to obese in your 40s, which many of us do. And the answer is: 20 calories a day. If you take in an extra 20 calories a day and put it into your fat tissue, you will gain 20 pounds every decade.

The point is, nobody can match calories-in to calories-out with that kind of precision. Twenty calories is like a single bite of a McDonald’s hamburger. It’s a couple of sips of Coca-Cola or a few bites of an apple. No matter how good you are at counting calories, you can’t do it.
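The arithmetic in the quoted passage can be checked directly.  A minimal sketch, assuming the common (and itself debated) rule of thumb that a pound of body fat stores roughly 3,500 kcal, a figure the quote does not state:

```python
# Check the quoted arithmetic, assuming the ~3500 kcal-per-pound
# rule of thumb for body fat (an assumption, not from the quote).
DAYS_PER_DECADE = 365 * 10
KCAL_PER_POUND = 3500

daily_intake = 2700  # the quote's averaged daily intake
decade_intake = daily_intake * DAYS_PER_DECADE
print(f"calories per decade: {decade_intake:,}")  # 9,855,000: roughly ten million

surplus_per_day = 20  # the quote's "extra 20 calories a day"
pounds_per_decade = surplus_per_day * DAYS_PER_DECADE / KCAL_PER_POUND
print(f"pounds gained per decade: {pounds_per_decade:.1f}")  # 20.9
```

So both of the quote's figures check out under that rule of thumb; the dispute is over what to conclude from them.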

The Argument: I will restate the above argument in more rigorous terms:  "Were one to count calories through the common tools available to the dieter, and were that dieter to use calorie counting strictly as the estimator of one's weight, over time that estimate would become inaccurate as small discrepancies between calories consumed and calories expended mounted, and thus this estimator should not be used."  Once we phrase it that way, the argument becomes a bit dubious.  No one uses calorie counting as their sole gauge of weight; there are other tools, like a damn bathroom scale.  Not weighing oneself would be the equivalent of deciding when to refuel by only reading the gas pump display and the odometer and never looking at the fuel gauge.  The bathroom scale is a feedback system that allows one to re-adjust and calibrate one's calorie counting, just as the fuel gauge is a feedback system for updating one's estimate of how far one can go before the tank is empty.  I found that, over time, there was a 300 calorie error in my estimate of daily burn; once I compensated for it, I again started achieving the weight loss results I wanted.  This correction factor will change over time, but it has made calorie counting a helpful weight loss tool.  The cause of the gap isn't terribly important, as once I accounted for it, the gap between model and practice closed.  Such correction factors aren't rare: "oh, the tank goes quickly once you hit half," or "this place says a pizza will feed three people, but they're small, so over-order."
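That feedback loop can be sketched as a small simulation.  The 300 kcal/day bias in estimated burn matches my own error above; the starting weight, target deficit, weekly weigh-in cadence, and the 3,500 kcal-per-pound conversion are illustrative assumptions:

```python
# Sketch of the scale-as-feedback loop.  The 300 kcal/day bias matches the
# error described in the post; the other numbers are illustrative assumptions.
KCAL_PER_POUND = 3500.0

true_burn = 2200.0       # actual daily expenditure (unknown to the dieter)
estimated_burn = 2500.0  # dieter's estimate: 300 kcal/day too high
target_deficit = 500.0   # planned daily deficit
weight = 200.0           # starting weight in pounds (hypothetical)

for week in range(8):
    intake = estimated_burn - target_deficit  # eat to the planned deficit
    predicted = weight - 7 * target_deficit / KCAL_PER_POUND
    weight += 7 * (intake - true_burn) / KCAL_PER_POUND  # what the scale shows
    # Weekly weigh-in: the gap between the scale and the prediction reveals
    # the bias, so fold it back into the burn estimate.
    error_lbs = weight - predicted
    estimated_burn -= error_lbs * KCAL_PER_POUND / 7

print(round(estimated_burn))  # 2200: the estimate converges on the true burn
```

The point of the sketch is only that the bias is observable: without the weigh-in, the 300 kcal error compounds forever; with it, the error shows up on the scale and gets corrected.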

Conclusion: Saying that calorie counting doesn't work is a misrepresentation of how calorie counting is actually used and is an example of the perils of oversimplification.  Calorie counting is a valid tool, though it is indeed useless in a vacuum, a situation that doesn't reasonably occur.

Assumptions I Make in the Above Post:

  • That Paul Thurrott accurately portrayed the sentiments of the author.
  • That people who are calorie counters also regularly weigh themselves.
  • That the proponents of the calorie counting myth don't also propose some alternative hypothesis under which calorie counting suffers not the small, persistent inaccuracy I show is manageable, but some very large one, for instance by arguing that nutrition labels don't accurately model how the body burns calories.

Another Example: I was looking for other cases where one uses a derived estimator coupled with a periodic observation of a parameter and found a neat one.  A lot of military robots use GPS to figure out where they are but often go into areas where GPS either will not work or where using it, apparently, gives away the robot's position.  In these cases, the robot determines its location by using accelerometers to derive how far it has traveled since its last known point.  These estimates become wildly inaccurate over time, as determining position from acceleration requires double integration, so the robot must periodically relocate itself using GPS, if only for an instant.  Projecting a location from known position data plus the characteristics of movement is called dead reckoning and is an example of propagation of uncertainty, a term I now love.
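A minimal sketch of that behavior, using a constant accelerometer bias as a stand-in for real sensor error (all numbers hypothetical): one estimate dead-reckons forever, while the other snaps back to the true position once a minute, the way a momentary GPS fix would.

```python
DT = 1.0     # timestep in seconds
BIAS = 0.01  # constant accelerometer bias in m/s^2 (hypothetical)

true_pos = true_vel = 0.0
dr_pos = dr_vel = 0.0  # dead reckoning, never corrected
fx_pos = fx_vel = 0.0  # dead reckoning with a GPS fix every 60 s
worst_dr = worst_fx = 0.0

for t in range(600):
    accel = 0.1 if t < 300 else -0.1       # true motion: speed up, slow down
    true_vel += accel * DT
    true_pos += true_vel * DT
    measured = accel + BIAS                # the sensor reads slightly high
    dr_vel += measured * DT                # first integration: velocity
    dr_pos += dr_vel * DT                  # second integration: position
    fx_vel += measured * DT
    fx_pos += fx_vel * DT
    worst_dr = max(worst_dr, abs(dr_pos - true_pos))
    worst_fx = max(worst_fx, abs(fx_pos - true_pos))
    if t % 60 == 59:                       # periodic GPS fix
        fx_pos, fx_vel = true_pos, true_vel  # relocate, then resume reckoning

print(f"worst drift, never corrected: {worst_dr:.1f} m")  # 1803.0 m
print(f"worst drift, fix every 60 s:  {worst_fx:.1f} m")  # 18.3 m
```

Because the bias gets integrated twice, the uncorrected error grows roughly with the square of elapsed time, which is why even a brief periodic fix shrinks the drift by two orders of magnitude here.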

Bias: I've used calorie counting, among other tools, to help me lose 85 lbs as of this initial post.  I couldn't reasonably think of how my belief that it's a reasonable tool would undercut my argument that it is misrepresented, but if the reader believes he or she has found such a way, please comment.