We can't even measure calories accurately, never mind predicting how much your specific body will actually absorb. Maybe we could be more accurate with vitamins and stuff, but I dunno.
The only way to get an accurate reading on calorie count is to burn it (bomb calorimetry). 1 kilocalorie (one nutritional Calorie) raises the temperature of 1 kg of water by 1 °C.
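That definition is just a unit conversion, so it can be sketched in a few lines. A minimal sketch, assuming water's specific heat of ~4184 J/(kg·°C) and ignoring the heat losses and vessel corrections a real calorimeter accounts for:

```python
# Specific heat of water (assumed constant here; it varies slightly with temperature).
SPECIFIC_HEAT_WATER = 4184  # J / (kg * degC)

def temp_rise_c(kcal, water_kg):
    """Temperature rise of a water bath from releasing `kcal` kilocalories into it."""
    joules = kcal * 4184  # 1 kcal = 4184 J
    return joules / (SPECIFIC_HEAT_WATER * water_kg)

print(temp_rise_c(1, 1))   # 1 kcal into 1 kg of water -> 1.0 degC
print(temp_rise_c(10, 5))  # 10 kcal into 5 kg of water -> 2.0 degC
```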
But burning isn't how your body utilizes the calories. Some things burn just fine yet are entirely useless as a (human) food source, like wood. This complicates things.
For instance, there's still debate over how efficiently our bodies can actually use ethanol (drinking alcohol) as a fuel source. Is that vodka shot fully adding to your daily calorie intake?
All the more reason there's plenty of science left to discover. Until then, the rough estimate we have still works well enough in practice (calories consumed minus calories burned).
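The "consumed minus burned" estimate above is just arithmetic. A minimal sketch, using the commonly cited rule of thumb of ~7700 kcal per kg of body fat (a rough average, not a precise physiological constant):

```python
# Rule-of-thumb conversion; actual weight change varies by person and over time.
KCAL_PER_KG_FAT = 7700

def estimated_weight_change_kg(consumed_kcal, burned_kcal):
    """Positive result = estimated weight gained; negative = lost."""
    return (consumed_kcal - burned_kcal) / KCAL_PER_KG_FAT

# A sustained 500 kcal/day deficit over one week:
print(estimated_weight_change_kg(7 * 2000, 7 * 2500))  # about -0.45 kg
```

This is exactly the kind of "accurate on average, fuzzy for any individual" estimate the thread is arguing about.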
I mean, there's no way they're going to be able to do metrics for every person, since every person is built differently, so there has to be a common standard. Or are you saying that certain types of calories are burned the same way by all people?
What? Calorie is a perfectly accurate method of measurement. Just because your body might absorb more or less than the next person doesn't change the amount of calories in a food.
Not to mention, even if you can accurately measure calories in a specific serving, companies produce thousands and thousands of servings per day. They can't accurately measure all of them. And ironically, the more 'natural' the food is, the less accurately they can measure the nutritional value: protein paste is going to be a lot more predictable than pasture-raised chickens.
I think he's saying that you can measure how much energy the food contains but not how much energy each individual will successfully absorb and metabolize.
Nah, that's the funniest attempt at dissing someone for saying something you don't understand that I've ever seen.
Calorimeters do a specific job. That job is not the same as digestion and metabolism. Not all foods "give up" calories in the same way, and no foods do so in the same way as inside a calorimeter.
Calories measured via calorimeter are indeed accurate for exactly what they measure, i.e. the exact food that is placed into them.
What a calorimeter can't do is guarantee that everything put into it is the same.
The more complex the substance, the more variation there will be between measurements of different batches of it. Something like refined sugar will give the same results reliably because there's just not much variation; same with refined fats and proteins. Once a substance is simple enough, the results vary by so little as to be meaningless.
Put two bananas in the same machine, and the variance will be greater than that of simpler materials. Is that variance enough to matter on a practical level? Not usually, but it can be.
But that variance is still there, and the range of possibilities is enough to matter when deciding what to put on the nutrition label of a given food.
Hence, the results aren't accurate in the sense that they can be reproduced in a precise way. There's just too much natural variance in foods, even carefully prepared foods.
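The point about refined versus whole foods can be illustrated with basic descriptive statistics. A toy sketch (the per-batch numbers below are made up for illustration, not real measurements):

```python
from statistics import mean, stdev

# Hypothetical repeated calorimeter readings, kcal per 100 g:
sugar_batches = [387, 387, 388, 387]   # refined sugar: nearly identical batches
banana_batches = [82, 95, 89, 105]     # whole bananas: ripeness, size, water content vary

for name, readings in [("sugar", sugar_batches), ("banana", banana_batches)]:
    print(f"{name}: mean={mean(readings):.1f} kcal, stdev={stdev(readings):.1f}")
```

The wider spread for the "natural" food is the whole argument: a single number on a label is an average over that spread, not a property of the specific item you're eating.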
While what you said isn't wrong, it's not really the main issue. The energy a human body gets from food can be vastly different from what is produced by burning it, and on top of that there are further variations per person.
The calorie count on food is, to my knowledge, based on actual measurements with humans... from one guy (Wilbur Atwater) doing experiments in the late 1800s. And while it's probably reasonably accurate on average, it's not really possible to know how much energy a specific person will get from a food based on a generalized calorie label. So even if the food itself had no variance, it would be impossible to accurately label the energy intake you will get from it.
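For reference, those experiments are where the Atwater factors come from: labels are typically derived from grams of macronutrients at roughly 4/9/4 kcal per gram of protein/fat/carbohydrate. A minimal sketch (the general factors, which are population averages, not per-person absorption figures):

```python
# Atwater general factors, kcal per gram (averages from 19th-century feeding studies).
ATWATER = {"protein": 4, "fat": 9, "carb": 4}

def label_kcal(protein_g, fat_g, carb_g):
    """Label-style calorie estimate from macronutrient grams."""
    return (protein_g * ATWATER["protein"]
            + fat_g * ATWATER["fat"]
            + carb_g * ATWATER["carb"])

print(label_kcal(protein_g=3, fat_g=10, carb_g=25))  # 3*4 + 10*9 + 25*4 = 202
```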
But for relatively unprocessed foods, this seems completely reasonable to me at first glance. The relative sugar content of, say, an apple depends on all sorts of parameters (sun, water, soil...). The gluten content of wheat, the iron content of vegetables, all of these things are variable. The more "natural" a food is, the higher the variability (as opposed to, say, artificial candy, which should be pretty uniform).
Actual reason? Not sure because I wasn't around for the comment period.
Likely reason? People are terrible at making decisions based on ranges or anything more complex than a single number. They aren't even that good at a single number.
Since mixed foods like trail mix can vary in their ratios from bag to bag, going with an average plus some allowed variance gives manufacturers flexibility. Then there are vegetables and other plants that can vary wildly too.
But what about something like gummy bears where the whole thing is very consistent? Can't have different rules for different foods, because companies will tie the whole thing up in court.
So the end result is a rule that allows flexibility for the things that actually need it that is also applied to everything else for simplicity.