4 Methods to Determine Dietary Intake


Nadia Slimani, Heinz Freisling, Anne-Kathrin Illner and Inge Huybrechts


International Agency for Research on Cancer, France


4.1 Challenges to assessing and monitoring dietary intake


Among the different environmental and lifestyle risk factors, diet is one of the most complex exposures to investigate in relation to disease. Diet is a universal exposure, consumed in almost infinite combinations of foods and recipes, with large variations within and between individuals and over the whole life span. In addition, the several thousand chemicals (including contaminants) present in the diet may have complex synergistic or antagonistic bioactive effects. This makes it difficult to disentangle the effects of individual chemicals and nutrients, and to remove confounding completely, when investigating diet–disease relationships and their underlying biological mechanisms. Diet also has strong social, religious and psychological dimensions that influence study and questionnaire design, logistics and, ultimately, individuals’ dietary intakes.


The ‘nutrition transition’, characterised by a shift away from traditional diets towards more Western diets (rich in energy, fat, salt and sugar), is being observed worldwide at an accelerating pace. This is another major challenge in measuring, monitoring and investigating diet and its associations with diseases, particularly cancer and cardiovascular disease. Cancer is a multiphasic and multifactorial disease, often occurring late in life, yet the lifelong cumulative risk may be affected by different (early) ‘exposure windows’, which are difficult to evaluate through the single (or few repeated) dietary measurements collected in nutritional epidemiology. Furthermore, the food frequency questionnaire (FFQ), the assessment method predominantly used in large study settings, has been repeatedly challenged with respect to its validity and reliability for measuring individual dietary intake. As a consequence, nutritional research has increasingly favoured approaches that integrate traditional and more innovative measurements of dietary exposure (including biological and metabolite – intermediate or surrogate – markers) to improve estimates of individual and population mean intakes and their distributions.


This chapter reflects the major methodological and technological transitions in measuring individual dietary intake that have occurred over the last two to three decades by reporting both traditional and more innovative dietary assessment methodologies (or technologies), including combined approaches (Figure 4.1). A better understanding of their respective strengths and limitations as well as their comprehensive integration should pave the way for a more holistic and reliable estimation of individual (or population) dietary exposure, as an essential prerequisite for cost-effective and front-line nutritional research and monitoring.


Figure 4.1 Evolution of design, methodology and technology in dietary assessment.


4.2 Traditional dietary assessment methods


Dietary assessment methodologies can be classified according to different criteria, including the duration of the recording period (short-term versus long-term methods) and the time frame of the data collected (retrospective versus current/prospective intake assessment). Although the methods described in this chapter are not presented under any such categorisation, these notions are important to bear in mind when evaluating and selecting the most appropriate dietary assessment method for the study-specific aims and design, as well as the logistical conditions and constraints. In this section, the dietary assessment methods and their respective strengths and weaknesses are described in turn; their main characteristics are summarised in Table 4.1 to facilitate comparison.


Table 4.1 Traditional dietary assessment methods (comparison of important characteristics, errors and potential for standardisation).

| Characteristic | Food records | 24-hour dietary recall | FFQ | Diet history | Screener |
|---|---|---|---|---|---|
| Type of information available | | | | | |
| Detailed information about foods/recipes | x | x | | x | |
| Less detailed information about food groups | | | x | | x |
| Scope of information sought | | | | | |
| Total diet | x | x | x | x | |
| Specific components | | | | | x |
| Time frame of single administration | | | | | |
| Short term (e.g. yesterday, today) | x | x | | x | x |
| Long term (e.g. last month, last year) | | | x | x | x |
| Adaptable to diet in distant past | | | | | |
| Yes | | | x | x | x |
| No | x | x | | | |
| Cognitive requirements | | | | | |
| Measurement or estimated recording of foods and drinks as they are consumed | x | | | | |
| Memories of recent consumption | | x | | x | x |
| Ability to make judgements of long-term diet | | | x | x | x |
| Potential for reactivity | | | | | |
| Low | | x | x | x | x |
| High | x | | | | |
| Time required to complete | | | | | |
| Low | | | x | | x |
| High | x | x | (x)* | x | |
| Respondent burden | | | | | |
| Low | | x | x | | x |
| High | x | | | x | |
| Investigator cost | | | | | |
| Low | | | x | | x |
| High | x | x | | x | |
| Affecting food choices | | | | | |
| Yes | x | | | | |
| No | | x | x | x | x |
| Possibility for automated data entry | | | | | |
| Yes | x | x | x | x | x |
| No | | | | | |
| Literacy required# | | | | | |
| Yes | x | | x | x | x |
| No | | x | | | |
| Usable for retrospective data collection | | | | | |
| Yes | | | x | x | x |
| No | x | x | | | |
| Potential for standardisation | | | | | |
| High potential | | x | x | | |
| Low potential | x | | | x | x |
| Error | | | | | |
| Systematic under-reporting of intake | x | x | x | ? | x |
| Systematic over-reporting of intake | | | x (detailed FFQ) | | |
| Person-specific biases associated with gender, obesity etc. | x | x | x | | x |

* High amount of time required to complete very detailed FFQs.

# Depending on administration method (e.g. interview versus self-administration).


Description of methodologies


Observation methods


When using the observation method to assess participants’ dietary intake, fieldworkers visit homes or school canteens to observe meal times and record dietary intake. Observation is an objective method to assess dietary intake, although in practice it can only be done in settings such as canteens or school dinner halls and for discrete time periods. However, new and existing technologies like cameras also allow the observation of subjects’ dietary intake in different settings (see Section 4.3).


An important strength of the observation method is the fact that it provides an objective assessment of dietary intake. However, this method is highly intensive for researchers and is therefore expensive. When not performed covertly, the observation may alter individuals’ usual eating patterns. Furthermore, this method is not feasible for obtaining habitual dietary data at either a group or an individual level. Observation of dietary intake is most commonly undertaken as a reference method for validating other dietary assessment methods.


Food diary or food record methods


The food record or food diary (Figure 4.2) is an open-ended method that requires the subject (or an observer) to report all foods and beverages consumed, at the time of consumption, to minimise reliance on memory. Records can be kept over one or more days, and portion sizes may be determined by weighing or by estimating volumes (e.g. using visual aids such as pictures, food models or food packets). In some situations, only the foods of particular study interest are recorded. For example, to estimate the intake of a certain food component (e.g. cholesterol, which is found in animal products only), food records might be limited to meat, poultry, fish, eggs and dairy products. However, if total energy intake or total diet estimates are required, the food record must include all foods and beverages consumed. Food records are generally completed by the subjects themselves using paper-based or more innovative (web/IT) technological supports (see Section 4.3), though in some situations a proxy might be employed (e.g. for children, the elderly or when literacy is too limited). To complete a food record, each respondent must be trained in the level of detail required to describe adequately the foods and portion sizes consumed, including the name of the food (brand name if possible), preparation methods, recipes for food mixtures and portion sizes. Reviewing the food records with the participants immediately after data collection is desirable in order to capture adequate detail.
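The arithmetic behind converting a completed food record into nutrient intakes is simple: each reported item is matched to a food composition table and its portion weight is scaled against the per-100 g nutrient values. The following Python sketch illustrates this; the foods, composition figures and function names are invented for illustration and are not taken from any real food composition database.

```python
# Sketch: deriving energy and nutrient intakes from a one-day food record.
# Composition values below are illustrative placeholders only.

FOOD_COMPOSITION = {            # nutrient content per 100 g of food
    "white bread": {"energy_kcal": 265, "protein_g": 9.0},
    "whole milk":  {"energy_kcal": 61,  "protein_g": 3.2},
}

def record_totals(record):
    """Sum nutrients over all (food, grams) entries in a food record."""
    totals = {"energy_kcal": 0.0, "protein_g": 0.0}
    for food, grams in record:
        per_100g = FOOD_COMPOSITION[food]
        for nutrient, value in per_100g.items():
            totals[nutrient] += value * grams / 100.0
    return totals

day = [("white bread", 80), ("whole milk", 250)]
print(record_totals(day))  # energy: 265*0.8 + 61*2.5 = 364.5 kcal
```

In practice this linkage step is the main source of investigator cost: mixed dishes must first be broken down into ingredients before each can be matched to a composition entry.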


Figure 4.2 Food diaries. (a) Example of a food diary from the UK EPIC study. (b) Example of a Belgian food diary.


The most important strength of the food record is its level of detail, given its open-ended nature and the fact that it captures the current diet (i.e. dietary intake estimated at the time of consumption). In addition, reporting foods as they are actually consumed increases the accuracy of portion-size estimates. Because this method does not require recall of foods eaten, memory is in principle not a problem; in practice, however, participants sometimes delay recording their intakes for several hours or days, in which case they do rely on memory. The most important disadvantages of the food record are its high investigator cost and respondent burden, and the fact that it might affect respondents’ eating behaviour (subjects may change what they eat because they are recording it). Extensive respondent training and motivation are required, and several repeated days are needed to capture individuals’ usual intake. Intakes tend to be under-reported, and the number of food items recorded typically declines over time. Drop-out increases with the number of daily records requested, and the requirement for literacy and high respondent motivation and compliance may lead to a non-representative sample and subsequent non-response bias.


The food record is often used in dietary programmes, as writing down all food and drinks consumed could enhance self-monitoring for weight control or other behaviour change (see Section 4.3). Furthermore, multiple food records (usually between three and seven days) are often used as a reference method in relative validation studies (e.g. for validating FFQs).


24-hour dietary recall methods


The 24-hour dietary recall method (Figure 4.3) is an open-ended method asking the respondent to remember and report all foods and beverages consumed in the preceding 24 hours or over the previous day. The recall is often structured (e.g. per meal occasion), using specific probes and cognitive techniques to help respondents recall their diet. Probing is especially useful for collecting necessary details, such as how foods were prepared. The recall is typically conducted by interview (in person or by telephone), either using a paper-and-pencil form or through a computer-assisted interview, although self-administered electronic forms have also recently become available (see Section 4.3). When the recall is interviewer administered, well-trained interviewers are crucial; however, non-nutritionists with sufficient training in the foods and recipes available in the study region and in interview techniques can be a cost-effective alternative.


Figure 4.3 Food descriptions in the standardised EPIC-Soft 24-hour dietary recall method. (EPIC-Soft has since been renamed GloboDiet.)


Important strengths of the 24-hour dietary recall method are its relatively low respondent burden and the fact that it does not affect respondents’ eating behaviour. The method is appropriate for most population groups, which reduces the potential for non-response bias and facilitates comparisons between populations. Another advantage is that portion sizes are recalled for all foods and beverages (using different quantification aids), allowing estimation of individual intake. Disadvantages are its high investigator cost (when interviewer administered) and the need for repeated measurements to capture individuals’ usual intake (see also the section on food records earlier in this chapter). Furthermore, its reliance on subjects’ short-term memory is a relative disadvantage compared with food records (though not with FFQs), and socially desirable answers can introduce recall bias during the interview. As with food records, the 24-hour dietary recall also tends to under-report individual intakes.


Two repeated 24-hour dietary recall interviews are often used in large-scale dietary monitoring surveys, because of the low respondent burden and high level of standardisation. Furthermore, this method has also been applied as a reference calibration method in large-scale surveys to estimate population mean intake and correct for the measurement error of less accurate methods (e.g. FFQs).
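The calibration idea mentioned above can be sketched in a deliberately simplified form: the reference measurement (here a 24-hour recall) is regressed on the FFQ value, and the fitted line is then used to rescale FFQ-reported intakes. The data below are invented for illustration, and real calibration models would also adjust for covariates such as age, sex and body mass index.

```python
# Minimal sketch of regression calibration with illustrative data.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    a = my - b * mx
    return a, b

ffq    = [150, 200, 250, 300]   # FFQ-reported intakes (g/day)
recall = [180, 210, 240, 270]   # reference 24-hour recall values (g/day)

a, b = fit_line(ffq, recall)            # fitted calibration line
calibrated = [a + b * v for v in ffq]   # rescaled FFQ intakes
print(b, calibrated)
```

The slope b (here 0.6) quantifies how much the FFQ over-disperses relative to the reference; applying the fitted line pulls extreme FFQ values back towards the population mean.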


Diet history methods


In 1947, Burke developed a diet history interview that attempted to assess an individual’s usual diet. This original diet history interview included a 24-hour dietary recall, a menu recorded for three days and a checklist of foods consumed over the preceding month. The checklist consisted of a detailed listing of the types of foods and beverages commonly consumed at each eating occasion over a defined time period, most often a ‘typical’ week. A trained interviewer probed for the respondent’s customary pattern of food intake on each day of the typical week. The reference time frame could also be the past month or the past several months, or might reflect seasonal differences if the time frame was the past year. This checklist was the forerunner of the more structured dietary questionnaires in use today (e.g. FFQs, described below). A highly skilled and trained professional is needed for both the interview and the processing of the information.


An important strength of the diet history is that it assesses the individual subject’s usual intake while not affecting eating behaviour. This method is very detailed, which means that information on the total diet can be obtained.


An important disadvantage of this detailed method is its high respondent and investigator burden. It is a difficult cognitive task for respondents to recall their usual individual intake and the estimation of usual portion sizes remains a challenge.


Due to its significant respondent and investigator burden and high costs, the dietary history is seldom applied in current or recent dietary surveys.


Food frequency questionnaire (FFQ) methods


The basic food frequency questionnaire (FFQ) consists of two components: a closed food list and a frequency response section for subjects to report how often each food (e.g. banana) or food group (e.g. fruit) was eaten. For each item on the food list, the respondent is asked to estimate the frequency of consumption based on open or specified frequency categories, which indicate the number of times the food is usually consumed per day, week, month or year. The number and/or types of food items and frequency categories may vary according to the study objectives and designs. Brief FFQs may focus on one or several specific nutrients. FFQs generally include between 50 and 150 (mostly generic) food items.


Different types of FFQ are usually considered: non-quantitative (alternatively called qualitative), semi-quantitative or completely quantitative FFQs. Non-quantitative questionnaires do not specify any portion sizes (standard portions derived from other study populations or data sets might be added afterwards), whereas semi-quantified instruments provide a combination of individual or typical/standard portion sizes to estimate food quantities (standard portions are part of the food item line). A quantitative FFQ allows the respondent to indicate any amount of food typically consumed. FFQs are commonly used to rank individuals by intake of selected foods or nutrients. Although FFQs are not designed for estimating absolute nutrient intakes, the method is often used for estimating average intake of those nutrients that have large day-to-day variability and for which there are relatively few significant food sources (e.g. alcohol, vitamin A and vitamin C).
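In a semi-quantitative FFQ, the average daily amount of a food is typically derived as the reported frequency of consumption (converted to times per day) multiplied by the standard portion size. The sketch below illustrates this computation; the frequency categories, conversion factors and portion sizes are illustrative assumptions, not values from any validated FFQ.

```python
# Sketch: average daily intake from a semi-quantitative FFQ item as
# frequency-per-day x standard portion. All figures are illustrative.

FREQ_PER_DAY = {
    "never": 0.0,
    "1-3 per month": 2.0 / 30,   # midpoint of the category
    "1 per week": 1.0 / 7,
    "2-4 per week": 3.0 / 7,
    "1 per day": 1.0,
    "2+ per day": 2.5,           # assumed open-ended category value
}

def ffq_daily_grams(items):
    """items: list of (frequency category, standard portion in g)."""
    return sum(FREQ_PER_DAY[freq] * portion for freq, portion in items)

# A respondent reporting bananas 2-4/week (120 g) and milk once a day (200 g):
print(ffq_daily_grams([("2-4 per week", 120), ("1 per day", 200)]))
```

Nutrient intakes then follow by linking each item's gram estimate to a food composition table, which is why the choice of midpoint for each frequency category directly shapes the resulting intake distribution.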


Some FFQs also include questions regarding usual food preparation methods, trimming of meats and identification of the brand of certain types of foods, such as margarines or ready-to-eat cereals.


FFQs are generally self-administered (see Figure 4.4), but may also be interviewer administered. Proxies can be used to complete the FFQ in particular situations (e.g. for children, elderly, hospitalised patients and so on).


Figure 4.4 Self-reported FFQs. (a) A self-reported FFQ from the Italy EPIC study. (b) An example of a Belgian FFQ.


The most important strengths of the FFQ are its low investigator burden and cost and the fact that it does not affect the respondent’s eating behaviour. A further advantage is that usual individual intake is requested over a long time frame, which avoids the need for repeated measurements. However, completing an FFQ remains a difficult cognitive task for respondents, and this should be considered an important limitation of the method. Usual portion sizes are difficult to estimate precisely, and intake estimates may be misreported.


Because of its low respondent burden and rather reduced cost (compared to more detailed methods like food records or 24-hour recalls), the FFQ is often the method of choice for large-scale dietary studies investigating subjects’ usual/habitual dietary intake, for instance large-scale cohort or intervention studies. However, its limited accuracy for assessing usual individual intakes increasingly means that complementary or alternative approaches are required (see Section 4.3).


Screeners or brief dietary assessment methods


In a variety of settings, comprehensive dietary assessments are not necessary or practical, for instance in studies where diet is not the main focus or is only considered as a covariate, as in health interview surveys. This has led to the development of diverse brief dietary assessment instruments, often called ‘screeners’, aiming to measure a limited number of foods and/or nutrients. Short questionnaires are often used to assess the intake of particular food items like fruit and vegetables in surveillance and intervention research. As mentioned in the previous section, complete FFQs typically contain between 50 and 150 food items to capture the range of foods contributing to the many nutrients in the diet. If an investigator is interested only in estimating the intake of a single nutrient or food group, however, then fewer foods need to be assessed. Often, only 15 to 30 foods might be required to account for most of the intake of a particular nutrient.
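The principle of shortening a full food list to the 15 to 30 items that capture most of a nutrient's intake can be sketched as a simple ranking exercise: foods are sorted by their population-level contribution to the nutrient and retained until a target share of total intake is covered. The contribution figures below are invented for illustration.

```python
# Sketch: selecting screener items by cumulative nutrient contribution.
# Contribution values are illustrative, not real survey estimates.

def pick_screener_foods(contributions, target=0.90):
    """contributions: {food: population-level intake of the nutrient}.
    Returns the shortest list of top foods covering `target` of the total."""
    total = sum(contributions.values())
    chosen, covered = [], 0.0
    for food, amount in sorted(contributions.items(), key=lambda kv: -kv[1]):
        chosen.append(food)
        covered += amount
        if covered / total >= target:
            break
    return chosen

vitamin_c_sources = {"orange juice": 40, "potatoes": 25, "broccoli": 20,
                     "strawberries": 10, "peppers": 5}
print(pick_screener_foods(vitamin_c_sources))  # foods covering >= 90% of intake
```

In real screener development, the contribution estimates would come from a population survey (e.g. repeated 24-hour recalls), so the selected list is population-specific.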


The most important strengths of brief instruments or screeners are their low respondent burden and low investigator cost. Screeners generally assess usual individual (specific food group) intakes, though often only for a limited number of food items (e.g. fruit and vegetables). Like other retrospective dietary assessment instruments (e.g. FFQs), they do not affect the subject’s eating behaviour. The disadvantages of these brief instruments are very similar to those reported for FFQs, namely a difficult cognitive task for the respondent and a challenge to quantify usual portion sizes. Furthermore, screeners often only assess a limited number of nutrients/foods.


These brief instruments may have utility in clinical settings or in situations where health promotion and health education are the goals. They can also be used to examine relationships between some specific aspects of diet and other exposures, as in the National Health Interview Survey. Finally, some groups use short screeners to evaluate the effectiveness of policy initiatives.


Specific tools for dietary supplement intake assessments


Dietary supplements contribute to the total intakes of some nutrients, such as calcium, magnesium, iron and vitamins C, D and E. Failure to include these nutrient sources would lead to a serious underestimation of intakes. Therefore, dietary supplement information is increasingly collected via the traditional dietary intake assessment methods described above. However, precise information on product names and brand names as well as related quantities consumed (e.g. number and frequency of consumption of pills, drops, tablets) is required to assess accurately the nutrient intakes derived from dietary supplements. Furthermore, many formulations are now available over the internet and validation of the nutrient content can be difficult. Another method applicable to supplements but not to foods is the use of pill inventories, which are widely employed in obtaining information about other medications. For some supplements, inferences about use can be made from blood or urine biomarkers, if available, although they provide only qualitative rather than quantitative information.
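Combining food- and supplement-derived intakes follows the same arithmetic as food sources: each product's labelled dose is multiplied by the reported frequency of use and added to the dietary estimate. A minimal sketch, with invented doses and frequencies:

```python
# Sketch: adding supplement-derived nutrients to dietary intake estimates.
# Doses and use frequencies are illustrative; real products require
# brand-level label data.

def supplement_nutrients(supplements):
    """supplements: list of (nutrient, dose per pill, pills per day)."""
    totals = {}
    for nutrient, dose, pills_per_day in supplements:
        totals[nutrient] = totals.get(nutrient, 0.0) + dose * pills_per_day
    return totals

diet_vit_c_mg = 60.0                                       # from food sources
supp = supplement_nutrients([("vitamin_c_mg", 500, 0.5)])  # 1 pill every 2 days
total_vit_c = diet_vit_c_mg + supp.get("vitamin_c_mg", 0.0)
print(total_vit_c)  # 60 + 250 = 310 mg/day
```

The example also shows why supplement users can dominate the upper tail of a nutrient's intake distribution: a single product here contributes several times the dietary intake.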


Because most of the methods for assessing the intake of dietary supplements are similar to (or part of) those used for assessing dietary intake, they have the same strengths and limitations as the other methods mentioned in this chapter.


Main applications of traditional dietary assessment methods


The choice of the most appropriate dietary assessment method depends on many factors and requires careful consideration. The following questions should be answered in selecting the method that will best meet the study objectives:



  • Is information needed about foods, nutrients, other food components (e.g. bioactive components) and/or specific dietary behaviours and which items are of primary interest according to the research question?
  • Is the focus of the research question on the group or individual data level and are absolute or relative intake estimates required?
  • What are the population characteristics (age, sex, education, literacy, cultural diversity, motivation) and the time frame of interest?
  • What level of accuracy and precision is needed?
  • What are the available resources, including money, logistical conditions and constraints, interview time, staff and food composition data (if nutrients are to be calculated)?

Based on the answers to these questions, one can decide on the most appropriate dietary intake assessment method to be used for the particular study design and conditions.


Although these traditional methods are also used in clinical settings, the methods to be employed depend on the clinical conditions, which go beyond the scope of this chapter.


In epidemiological settings, at least three important study designs can be considered: cross-sectional/monitoring surveys, case-control studies and cohort studies. Any of the dietary instruments discussed in this chapter can be used in cross-sectional studies. Some instruments, such as the 24-hour dietary recall, are appropriate when the study purpose requires detailed and reliable quantitative estimates of intake, frequently as a substitute for weighed or recorded food methods. In addition, the 24-hour dietary recall has the advantage of not requiring literacy, which in large-scale surveys increases the number of respondents, including those of lower socio-economic status. Other instruments, such as FFQs or behavioural indicators, are appropriate when qualitative estimates are sufficient for ranking individuals according to their (low, medium or high) level of consumption, for example the frequency of consuming soda/fizzy drinks.


For case-control studies, the period of interest for dietary exposure could be either the recent past (e.g. the year before diagnosis) or the distant past (e.g. 10 years ago or in childhood). Because information about diet before the onset of disease is needed, dietary assessment methods that focus on current behaviour, such as food diaries or 24-hour dietary recalls, are not useful in retrospective studies. The food frequency (and diet history) methods are well suited for assessing past diet and are therefore the only viable choices for case-control (retrospective) studies (unless more accurate information from the past is available, as for instance in nested cohort case-control studies). However, the accuracy of such distant past dietary intake estimations is lower than for recent dietary intake assessment methods (e.g. food diaries or 24-hour dietary recalls) due to the significance of recall bias.


In cohort studies or prospective dietary studies, dietary intake and/or status are measured at baseline, when study subjects are free of disease, and are then related to the later incidence of disease. A broad assessment of diet is usually desirable in prospective studies because many dietary exposures and many (intermediate) disease endpoints will ultimately be investigated, and areas of interest may not even be recognised at the beginning of a cohort study. In order to relate diet at baseline to the eventual occurrence of disease, a measure of study subjects’ usual intake of foods is needed. Multiple 24-hour dietary recalls or food records, diet histories and food frequency methods have all been used effectively in prospective studies. Cost and logistical issues tend to favour food frequency methods because many prospective studies require thousands of respondents. However, because of concern about significant measurement error and attenuation attributed to the FFQ, other approaches are being considered, and incorporating emerging technological advances in administering dietary records, such as using mobile phones, increases the feasibility of such approaches in prospective studies (see Section 4.3).
