Are the Recommended Dietary Allowances or Dietary Reference Intakes defined before or after a bioavailability/stoichiometry calculation?


#1

This perhaps isn’t common knowledge, but the RDA was derived by taking the healthiest segment of the population and figuring out what their median nutrient intake was. From there our “typical healthy people” nutrient values were born.

We are feeding ourselves by aggregate data, and it seems to more or less work - no one has died by following the RDA conservatively. However, I have seen different strategies for making a Soylent recipe that would be more or less justified depending on how the RDA is interpreted. Some people, like myself and I think Rob at one point, have thought about treating the RDA as an elemental, or “real”, target - inferred by the fact that you can figure out the elemental content of a salt, e.g. potassium chloride, and then count each gram of it as being roughly 52.4% potassium and 47.6% chloride by mass.

Most recipes I have seen do not treat their ingredients with this much resolution. As a matter of fact, many nutritional labels do not do this either: they equate RDA fulfillment with the weight of the salt instead of the elemental weight. That can be substantially wrong, as the figures above show.
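To make the salt-weight versus elemental-weight arithmetic concrete, here is a minimal sketch. The atomic masses are standard values; the 3,500 mg potassium target is just a placeholder for illustration, not a claim about the actual DRI number:

```python
# Mass-fraction stoichiometry for potassium chloride (KCl).
# Atomic masses in g/mol (standard values).
M_K = 39.0983
M_CL = 35.453
M_KCL = M_K + M_CL

k_fraction = M_K / M_KCL    # ~0.524, i.e. ~52.4% potassium by mass
cl_fraction = M_CL / M_KCL  # ~0.476, i.e. ~47.6% chloride by mass

# Placeholder elemental potassium target in mg (illustrative only).
target_k_mg = 3500

# Grams of KCl needed if the target is read as *elemental* potassium.
kcl_needed_g = (target_k_mg / 1000) / k_fraction

print(f"K mass fraction in KCl:  {k_fraction:.4f}")
print(f"Cl mass fraction in KCl: {cl_fraction:.4f}")
print(f"KCl needed for {target_k_mg} mg elemental K: {kcl_needed_g:.2f} g")
```

If a label counted 3,500 mg of the salt as 3,500 mg of potassium, it would overstate the elemental potassium actually delivered by nearly a factor of two.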

Based on its derivation, does the RDA encapsulate bioavailability/stoichiometry information and risk? Or do we have to know more about the survey method? I think what’s missing in our analysis is how they figured out the nutritional content of the diet in the first place. Measuring it from food includes bioavailability in the target, meaning that we assume typical bioavailabilities and salts when creating our recipe; measuring it by some other means, e.g. in the blood or urine, loses the bioavailability information. This becomes more important when we find that bioavailability varies widely within a given nutrient complex.
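To illustrate why the distinction matters, here is a rough sketch with an entirely made-up bioavailability fraction; real values vary by nutrient, salt form, and the rest of the diet:

```python
# Hypothetical example: how the interpretation of the RDA changes a recipe target.
# The 0.3 bioavailability fraction is invented for illustration only.

rda_mg = 400           # placeholder elemental target for some mineral, in mg
bioavailability = 0.3  # hypothetical fraction actually absorbed from our salt

# Interpretation A: the RDA is an *intake* target measured from typical diets,
# so typical bioavailability is already baked into the number.
intake_target_a = rda_mg

# Interpretation B: the RDA describes what must be *absorbed*, so the recipe
# has to compensate for its own salt's bioavailability.
intake_target_b = rda_mg / bioavailability

print(f"Interpretation A (intake target):   {intake_target_a:.0f} mg")
print(f"Interpretation B (absorbed target): {intake_target_b:.0f} mg")
```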

By trying to increase precision we also increase the complexity of the approach, which adds risk from human error and from inferential bias. Given the success of Soylent, what we are doing is adequate, so nothing really has to change if we don’t want it to. Nevertheless, by exploring this question we either get better at meeting our nutritional needs, or we find that bioavailability doesn’t matter, or, worse, that accounting for it without understanding it well can lead to dangerous results.


#2

This perhaps isn’t common knowledge, but the RDA was derived by taking the healthiest segment of the population and figuring out what their median nutrient intake was. From there our “typical healthy people” nutrient values were born.

The DRI reports say that the DRIs were derived by looking at a lot of different types of scientific evidence – both correlational and experimental. They assume that for each person there is a defined level of intake which will be sufficient for that person’s health, but that this level may be different for each person. They use a variety of scientific evidence to derive an “estimated average requirement” (EAR), which is estimated to be the average of all of these different intake levels for people in a given population segment (by sex, age, etc.). In addition to deriving this average they derive a standard deviation. Then the RDA is calculated by going two standard deviations up from the EAR, meaning that it is a level of intake which (it is estimated) will meet the requirements of 98% of people.
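As a rough sketch of that arithmetic under a normal model (the EAR and standard deviation below are placeholders, not values from any DRI report); note that two standard deviations above the mean covers about 97.7% of a normal distribution, which is usually rounded to the 97-98% figure:

```python
from statistics import NormalDist

# Placeholder requirement distribution for some nutrient (illustrative only).
ear_mg = 100.0  # estimated average requirement
sd_mg = 15.0    # estimated standard deviation of individual requirements

# RDA = EAR + 2 standard deviations, per the derivation described above.
rda_mg = ear_mg + 2 * sd_mg

# Fraction of the population whose requirement falls at or below the RDA,
# assuming requirements are normally distributed.
coverage = NormalDist(mu=ear_mg, sigma=sd_mg).cdf(rda_mg)

print(f"RDA: {rda_mg:.0f} mg")
print(f"Estimated coverage: {coverage:.1%}")  # ~97.7%
```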

Maybe you know something I don’t about this; would you care to expand?

Some people, like myself and I think Rob at one point, have thought about treating the RDA as an elemental, or “real”, target - inferred by the fact that you can figure out the elemental content of a salt, e.g. potassium chloride, and then count each gram of it as being roughly 52.4% potassium and 47.6% chloride by mass.

I believe this is correct, and the nutritional labels which do it the other way are doing it wrong.

Based on its derivation, does the RDA encapsulate bioavailability information and risk? Or do we have to know more about the survey method? I think what’s missing in our analysis is how they figured out the nutritional content of the diet in the first place. Measuring it from food includes bioavailability in the target, meaning that we assume typical bioavailabilities when creating our recipe; measuring it by some other means, e.g. in the blood or urine, loses the bioavailability information. This becomes more important when we find that bioavailability varies widely within a given nutrient complex.

The questions you ask can be answered on a nutrient-by-nutrient basis by looking at the reports and seeing, for a specific nutrient, what studies they looked at to establish the EAR. My impression is that the EARs are usually based on how much nutrient is consumed, not on how much is absorbed; so they take into account (in a statistical manner) the fact that not all consumed nutrient is absorbed.

There is a fair bit of criticism of the DRIs, and in many places they may not be based on the most sound science. Most of us shrug our shoulders and say they’re the best we’ve got right now. But I for one would be interested in hearing about any research on better intake recommendations.


#3

Thanks for the thorough reply @nwthomas.

I only have vague memories of where I picked up my claims, so if I stumble upon the sources again I will post them here. Until then I’ll defer to you, since you seem more familiar with the matter.

So if I’m understanding you, you’re saying that the RDA is meant to be interpreted as elemental targets and that I should be doing the stoichiometry?

What’s preventing them from going further? On a normal distribution I guess the next S.D. produces some interesting extremes.


#4

Yes. At least for minerals. For vitamins the situation is complicated and I can’t claim to understand it.

What’s preventing them from going further?

You could certainly go further, but it might not be a net win. There might for example be outliers who need 10x as much of a given nutrient as the average person does, and it might be unhealthy for a normal person to consume that quantity of that nutrient. Depends how large the standard deviation is for a given nutrient; I haven’t looked at any of them.

It is possible to determine whether one is one of these outliers with respect to a given nutrient using blood tests. I know that for example Andrew Weil did this, and figured out that he was an “outlier” with respect to selenium. So now he eats a ton of brazil nuts (IIRC) to get his very high daily requirement of selenium.

Interestingly, if one were really determined not to be deficient in anything, this would probably be recommended. There are 29 or so essential nutrients, and for each nutrient one has (insofar as the DRIs are correct) a 98% chance of one’s requirement for that nutrient being less than or equal to the RDA. Assuming that one’s requirements for different nutrients are statistically independent (they probably aren’t), one would have a 0.98^29 ≈ 56% chance that all one’s requirements fell below the RDAs.

In other words, roughly half of us (according to my shitty back-of-the-envelope statistics) need more than the RDA of something. That said, I’m not sure how time-consuming/costly it is to figure out which nutrients one is an “outlier” for, and so learning this might not be within the reach of people who aren’t wealthy health nuts.
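For what it’s worth, here is that back-of-the-envelope calculation spelled out; the independence assumption is the same dubious one stated above:

```python
# Chance that all of one's requirements fall at or below the RDAs, assuming
# ~29 essential nutrients, 98% coverage per nutrient, and (dubiously) that
# requirements for different nutrients are statistically independent.
n_nutrients = 29
per_nutrient_coverage = 0.98

p_all_covered = per_nutrient_coverage ** n_nutrients
p_at_least_one_exceeds = 1 - p_all_covered

print(f"P(all requirements <= RDA):         {p_all_covered:.1%}")          # ~55.7%
print(f"P(need more than RDA of something): {p_at_least_one_exceeds:.1%}")  # ~44.3%
```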