The biological assumptions underlying today's calorie calculations are far too simplistic and may not accurately reflect the calorie content of food items, Rob Dunn writes in a feature article that is part of this month's Scientific American special issue on food.
Almost every packaged food today features calorie counts on its label. These calculations originated in the 19th century, when a system was devised for estimating the average number of calories in one gram of fat, protein and carbohydrate. Recent research has revealed, however, that the calorie content of food depends on many factors that render current calculations inaccurate. For example, the calories a food yields depend on how completely it is digested: if a food is not digested fully, it relinquishes fewer calories than its total calorie count would suggest. Individual differences—such as the size of people's guts and variation in their gut enzymes—also play a part. Lastly, the extent of processing or cooking can influence calorie counts, and calculations that take this into account could, in theory, be more accurate. Dunn notes, however, that no one has yet launched any effort to modify labels to reflect processing.
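The 19th-century system Dunn refers to is the Atwater system, which assigns average energy values of roughly 9 kilocalories per gram of fat and 4 per gram of protein or carbohydrate. A minimal sketch of that label arithmetic (the example food is hypothetical):

```python
# Atwater factors: average kilocalories per gram of each macronutrient.
ATWATER_KCAL_PER_G = {"fat": 9.0, "protein": 4.0, "carbohydrate": 4.0}

def label_calories(grams_by_nutrient):
    """Sum each macronutrient's grams times its Atwater factor."""
    return sum(ATWATER_KCAL_PER_G[nutrient] * grams
               for nutrient, grams in grams_by_nutrient.items())

# A hypothetical snack: 10 g fat, 5 g protein, 20 g carbohydrate.
print(label_calories({"fat": 10, "protein": 5, "carbohydrate": 20}))  # 190.0
```

The research Dunn describes implies that this fixed-factor sum is only an upper bound on what a given person actually extracts, since it ignores digestibility, individual gut differences and processing.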
These factors lead Dunn to suggest that merely counting calories from food labels is an overly simplistic approach to eating a healthy diet—one that does not necessarily improve our health, even if it helps us lose weight. He concludes that while nutrition scientists are beginning to learn enough to possibly improve calorie labels, digestion turns out to be so complex that we may "never derive a formula for an infallible calorie count."