Approach to the Patient With Renal Disease

Edward J. Fly MD, FACP, FIDSA , in Cecil Essentials of Medicine , 2022

Assessment of Dietary Sodium Intake

At steady state, when body weight is neither increasing nor decreasing, dietary sodium intake can be judged from a 24-hour urine collection. To establish the adequacy of the urine collection, measurement of urine creatinine in the 24-hour urine sample is important. The creatinine excretion rate in an adequately collected specimen should approach 1 g/day for women and 1.5 g/day for men. Dietary potassium and protein intake can be monitored similarly. Measurement of urine urea nitrogen in the 24-hour urine sample can reveal the adequacy of dietary protein intake. Dietary sodium restriction can improve blood pressure, can enhance the biologic actions of inhibitors of the renin-angiotensin system, and may protect the heart, blood vessels, and kidneys independent of improvement in blood pressure.

Systematic Review of Passive Sensors for Detection of Food Intake

Tonmoy Ghosh , Edward Sazonov , in Reference Module in Biomedical Sciences, 2021

Introduction

Dietary intake assessment is an active research area, specifically regarding the impact of diet on the human body (Schoeller and Westerterp, 2017). Body weight dynamics related to underweight, overweight, and obesity can be tracked by monitoring food intake (Hall et al., 2012). In its report on "healthy diet," the World Health Organization (WHO) emphasizes the importance of a healthy diet and physical exercise (World Health Organization, 2019). Unhealthy eating increases health risks, for example, type-2 diabetes, asthma, and cardiovascular diseases, and causes a large number of deaths globally. It is essential to identify the dietary practices that should be followed for a healthy life, and to achieve that, technology-driven dietary assessment methods are gaining popularity.

Traditional procedures for monitoring food intake are food records, 24 h recalls, and food frequency questionnaires (Bingham et al., 1994). These methods may be inaccurate and suffer from underreporting (Lopes et al., 2016), as well as placing an extra burden on the user. Alternatively, numerous technology-driven dietary assessment methods have been proposed in the research literature. Precise and objective assessment of dietary intake requires automatic detection of food intake, recognition of food items, estimation of portion size, estimation of energy content, and interpretation of the meal microstructure and the dynamic process of eating (Doulah et al., 2017; Fontana and Sazonov, 2014). Technology-driven methods may be active or passive from the user's perspective. Active methods require user participation or intervention, whereas passive methods do not require any user input.

Active methods rely on the user to report on smartphone-based apps or to capture food images with a hand-held camera. Smartphone application-based nutritional care processes were discussed in Chen et al. (2018) and Moguel et al. (2019). The analysis of the captured images is usually performed by manual annotation or automatic image recognition using computer vision. Manual annotation can be done by a trained analyst and/or the participant (Boushey et al., 2017) to identify food items and their portion sizes. Image recognition methods use computer vision algorithms to recognize foods, segment food objects from the background, estimate the volume of the food, and compute energy content (Kong and Tan, 2011; Fang et al., 2015). The major advantage of active methods is that they provide detailed information about mealtime, location, and duration of eating. However, their major limitation is that they require active participation from the user, resulting in user burden.

Passive methods use wearable devices to record signals (image, physiological, audio-visual, motion, etc.) continuously (both eating and non-eating events) without the active participation of the user (Doulah et al., 2020; Sun et al., 2014). Their major advantage is that they minimize the burden on the user. In passive methods, the user only needs to wear the device, which automatically detects eating events, meal start and end times, meal duration, number of chews, food items, caloric intake, etc. However, passive methods are difficult to develop because they require continuous capture of data (signal/image) and a well-trained machine learning model to distinguish eating activity from non-eating. Their limitations include privacy concerns, because some methods capture continuous audio or images.

The main objective of this paper is to provide a systematic evaluation of passive sensors for food intake detection. Few review papers are available that perform a comprehensive review of dietary assessment. For example, two systematic reviews were published (Boushey et al., 2017; Paramastri et al., 2020) in which the authors reviewed mobile applications for nutrition behavior, focusing mainly on active methods. Other articles (Kalantarian et al., 2017; He et al., 2020; Prioleau et al., 2017; Vu et al., 2017) focused on reviewing sensor modalities used for food intake detection, but some important modalities were not covered, such as glucose monitoring, intraoral, and gut sound sensors. Thus, there is a research gap in investigating available sensor modalities and eating proxies (activities related to eating and physiological responses to food consumption) that can detect food intake without user intervention.

In this paper, we systematically reviewed published methods that automatically detect food intake using passive sensors by searching four major databases following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) method. The contributions of this review are twofold: (1) a taxonomy of food intake proxies and sensor modalities was identified following a comprehensive review of state-of-the-art passive sensors for food intake detection, and (2) the accuracy of food intake detection outside controlled environments was assessed.

The paper is organized as follows: first, the methodology of the systematic review is presented in the "Review methodology" section, followed by the specific research questions (RQs). The review findings, along with the taxonomy, are discussed in the "Review findings" section, where a detailed description of the articles is presented. The "Discussion", "Challenges and future directions", and "Conclusion" sections present the discussion, challenges and future directions, and conclusion, respectively.


URL:

https://www.sciencedirect.com/science/article/pii/B9780128225486000868

Management of Food Allergy

Donald Y.M. Leung MD, PhD, FAAAAI , in Pediatric Allergy: Principles and Practice , 2021

Dietary Intake Assessment

Dietary intakes can be obtained by 24-hour recall, a multiple-day food diary, or a food frequency questionnaire. A 24-hour recall is generally useful when assessing intake in an infant who is predominantly breastfed or bottle-fed but may provide limited information for older children, because a mixed diet may not be accurately reflected by recall. For older infants and children, a food diary will provide a more accurate estimate of intake. A food diary of at least 3 days (including one weekend day and two weekdays) should include the amount and types of foods ingested and the timing of meals and snacks. Questions about typical dietary patterns or food frequency questionnaires also may be used and are especially useful in assessing specific nutrient intakes. For example, assessment of calcium and vitamin D intake may be determined by asking about the frequency and amounts of dairy or enriched dairy substitutes consumed.

A registered dietitian will be able to compare dietary patterns to recommendations from the Dietary Reference Intakes (DRIs; https://www.nal.usda.gov/fnic/dietary-reference-intakes) or food group guides specified by the U.S. Department of Agriculture (http://www.choosemyplate.gov) or provided by governmental agencies in other countries. The DRIs (https://fnic.nal.usda.gov/fnic/dri-calculator/) and other guidelines may be used as tools to assess nutrient intake, plan interventions, and/or monitor the patient's ongoing nutrient intake. 37 Even clinicians who are not trained to assess nutrient intake may glean valuable information from a food diary or food frequency questionnaires. For example, unusual meal or snack patterns, such as feeding on demand beyond infancy, or unusual food or beverage intake, such as excessive fruit juice consumption, may become apparent and give clues to potential causes of poor nutritional status in a child. A sample elimination diet menu is provided in Box 35.1.

Dietary Fiber and Carbohydrates

MARÍA ELENA MARTÍNEZ , ELIZABETH T. JACOBS , in Nutritional Oncology (Second Edition), 2006

Analysis of Fiber in Foods

No method of dietary intake assessment is without flaws. Earlier animal research related to fiber focused on crude fiber and other methods that did not allow complete analysis of fiber in foods for human studies (Goering and van Soest, 1970; van Soest and van Soest, 1973). However, crude fiber consists mainly of lignin and cellulose, whereas dietary fiber contains these plus other components (hemicellulose, gums, pectin, etc.). Later, analysis methods described by Southgate (1969, 1976) and Englyst (1980) evolved and became more applicable to studies in humans. The problems of isolating dietary polysaccharides (which are equivalent to dietary fiber) have been the objective of more recent methods for measuring fiber in food (Asp et al., 1992; Li, 1995). However, controversy regarding the most appropriate method of chemical analysis continues.

Dietary fiber, chemically speaking, is a polysaccharide and, in terms of nutrient composition, is best viewed as such. Currently, there is no one accepted definition or method of analysis for dietary fiber. Englyst and Cummings (1988) define it as a component of NSP. In this method of analysis, starch is removed enzymatically after solubilization, and NSP is measured as the sum of constituent sugars released by acid hydrolysis. The Southgate method provides data on total dietary fiber and individual fiber components (cellulose, lignin, etc.); thus, the Englyst method is considered a modified Southgate method. The Association of Official Analytical Chemists (AOAC) describes fiber as endogenous plant food material from the diet that is resistant to human digestive secretions but that is substantially fermented in the colon (Prosky et al., 1985). This method relies on an enzymatic-gravimetric system of analysis and has been suggested to be the most practical and simplest approach to measuring the major components of dietary fiber as a single unit (Dreher, 1987). The AOAC method is the most accepted in the United States, whereas that of Englyst is preferred in European countries. When data on dietary fiber and disease are reported using different analytical techniques, comparison across studies may be difficult. Thus, standardized analytical chemical techniques for fiber content in foods are imperative.


URL:

https://www.sciencedirect.com/science/article/pii/B9780120883936500853

Nutritional Principles and Assessment of the Gastroenterology Patient

Mark Feldman MD , in Sleisenger and Fordtran's Gastrointestinal and Liver Disease , 2021

Patients Undergoing Radiation Therapy

The usefulness of aggressive nutrition support in patients undergoing radiation therapy has been studied most extensively in those who have head and neck and esophageal cancers. There is now reasonable evidence in these patients that placement of a PEG tube and administration of supplemental tube feedings during and after the course of radiation therapy prevents further deterioration of nutritional status. 133,134 In patients with head and neck cancers, supplemental PEG feedings have also been shown to improve quality of life. Although improvements in survival or decreased morbidity have not yet been demonstrated, improved quality of life alone may warrant their use in this setting.

Full references for this chapter can be found on www.expertconsult.com .

Diet and Metabolism in Kidney Disease

Alp Ikizler M.D. , Lara B. Pupim M.D., M.S.C.I. , in Chronic Kidney Disease, Dialysis, and Transplantation (Third Edition), 2010

Composite Indices of Nutritional Status

These diagnostic measures take into account not only some of the body composition and dietary intake assessment tools but also incorporate a degree of subjective assessment of overall nutritional status. The most commonly used composite indices include the subjective global assessment (SGA) and modified or expanded indices that incorporate the SGA or its components, such as the composite nutritional index and the malnutrition-inflammation score (MIS). They are clinically useful tools for evaluating nutritional status from a broader perspective, including medical history, symptoms, and physical parameters.

SGA was originally used to predict outcomes in surgical patients with gastrointestinal disease, and it has been validated as a screening tool for this population. 29 On the basis of an evaluation of the history of weight changes, nutritional intake and gastrointestinal symptoms, diet-related functional impairment, and a physical examination to assess subcutaneous fat, muscle stores, and the presence or absence of edema, patients are divided into three categories: A, well nourished; B, mildly to moderately malnourished; or C, severely malnourished. Although SGA was found to correlate with other measures of nutritional status in maintenance dialysis patients, it does not reliably detect sarcopenia (based on total body nitrogen). 30 Furthermore, SGA scores tend to discriminate well between best- versus worst-nourished patients but fail to separate out mild or moderate PEW in MHD patients. The National Kidney Foundation Kidney Disease Outcomes Quality Initiative (K/DOQI) guidelines recommend that the modified SGA be performed every 6 months in MHD patients. 31 The subjective nature, lack of inclusion of measures of visceral protein stores, and relative insensitivity to small changes in nutritional status of the SGA led others to create a more comprehensive nutritional index that includes the SGA and parameters based on body weight and weight-for-height, skinfold measures, and serum albumin concentration.

The MIS incorporates components of the SGA and includes other components related to nutritional status (body mass index), a combination of nutritional status and inflammation (serum albumin concentration and total iron-binding capacity), and components not directly indicative of nutritional status, such as comorbidities and functional status. 32 The MIS is not as closely associated with measures of body composition as is the SGA. As with any method of nutritional status assessment, the SGA, composite nutritional index, or MIS should be used in conjunction with other methods.

Monitoring nutritional status on a regular basis is the key to early detection of nutritional disturbances in CKD patients, to evaluating the response to nutritional interventions, and to motivating and improving the patient's compliance with dietary therapy. Although there is no definitive protocol for routine follow-up, body weight and normalized PNA (nPNA) should be monitored monthly. Serum albumin, prealbumin, and cholesterol should be determined every 3 months in clinically stable patients. Other anthropometric measurements, dietary interviews, and SGA should be obtained every 6 months, or more often in those patients at risk of developing PEW and those with established PEW. Figure 12-2 depicts a proposed algorithm for the assessment and management of nutritional status in maintenance dialysis patients.


URL:

https://www.sciencedirect.com/science/article/pii/B9781437709872000121

Furan in Coffee Products

Dirk W. Lachenmeier , in Coffee in Health and Disease Prevention, 2015

98.3 Exposure Estimation

Similar to a previous study on caffeine intake, 17 the data source for dietary intake assessment was published dietary intake data from the second national diet survey (Nationale Verzehrsstudie II, NVS II) in Germany, which provides representative food intake data for 14- to 80-year-olds. A total of 20,000 participants in 500 randomly chosen towns in Germany were selected. The survey method was based on a combination of instruments, such as computer-supported diet history questionnaires in the study center, telephone interviews using computerized 24-h recall methods, and 2 × 4-day weighing protocols in a random selection of participants. The data on beverage consumption are based on 15,371 records (54% females, 46% males) from the diet history assessment conducted between November 2005 and November 2006. Several studies provide details about the NVS II method and results. 20–23

The calculation method was similar to a method developed for assessment of caffeine intake. 17 The calculation of coffee-related daily furan intake per kg body weight (bw) requires the following data: amount of furan in coffee, daily intake of coffee, and bw of the consumer. An initial look at the data revealed that all three factors may not necessarily be normally distributed and may also vary widely. For this reason, it was decided to use a probabilistic approach in contrast to the usual point estimate calculation (which is based on averages and percentiles 3 ). A best-fit distribution was selected to represent the furan content in coffee (see above and Figure 98.1). For the NVS II survey, only average intake data segregated according to age group were reported for the beverage group "coffee and tea" (i.e., only the sum of both types was available). According to per capita consumption data (2010) for Germany, the distribution between coffee and tea is 86% (coffee) and 14% (tea), so a factor of 86% was used to estimate the coffee consumption. 24 The furan intake for the total sample and each age group was calculated using Monte Carlo simulation with 10,000 iterations.

The calculation formula was as follows:

Furan_intake (μg/kg bw/day) = (Coffee&Tea_consumption × 0.86 × Furan_content) / Bodyweight; with Furan_content = RiskLognorm(43.468; 73.78).

The distribution function RiskLognorm(mean; standard deviation) specifies a log-normal distribution with the entered mean and standard deviation.
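RiskLognorm takes the arithmetic mean and standard deviation directly; tools that instead parameterize the log-normal by the mean and standard deviation of the underlying normal require a conversion. A short derivation from standard log-normal identities (not from the chapter):

```latex
% For a log-normal X with arithmetic mean m and standard deviation s,
% the underlying normal has parameters
\sigma^2 = \ln\!\left(1 + \frac{s^2}{m^2}\right), \qquad
\mu = \ln m - \frac{\sigma^2}{2}.
% For RiskLognorm(43.468;\ 73.78): \sigma \approx 1.16,\ \mu \approx 3.09.
```

These values reproduce the entered mean and standard deviation exactly in expectation.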

Monte Carlo simulations were performed using Latin Hypercube sampling and the Mersenne Twister random number generator. The furan intake calculated with the formula was set as the risk output, and 10,000 iterations were calculated. The calculation was repeated separately for each age group/sex and for the average over all age groups. Calculations were performed using the software package @Risk Version 5.0 (Palisade Corporation, Ithaca, NY, USA) for Excel 2010 Version 14.0 (Microsoft Corporation, Redmond, WA, USA).

To additionally evaluate the individual risk for coffee consumers, the furan intake was calculated for consumption of one to eight cups (0.2 L) of coffee per day with the following formula:

Furan_intake (μg/kg bw/day) = (Number_of_cups_per_day × 0.2 × Furan_content) / Bodyweight.

In this case, the body weight (kg) was described with the distribution function RiskNormal(73.9; 12), which assumes a normal distribution with an average of 73.9 kg and a standard deviation of 12 kg for males and females, based on data reported by EFSA. 25
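A minimal sketch of the per-cup estimate in Python, assuming plain Monte Carlo sampling with the standard library rather than the chapter's @Risk Latin Hypercube implementation (the cup count, seed, and variable names are illustrative):

```python
import math
import random

random.seed(1)
N = 10_000  # iterations, as in the text

# RiskLognorm(43.468; 73.78) is given as arithmetic mean and SD (ug/L);
# convert to the underlying normal's mu/sigma for random.lognormvariate.
m, s = 43.468, 73.78
sigma = math.sqrt(math.log(1.0 + (s / m) ** 2))
mu = math.log(m) - sigma ** 2 / 2.0

CUPS_PER_DAY = 2  # illustrative choice; 0.2 L per cup as in the text
intakes = []
for _ in range(N):
    furan_content = random.lognormvariate(mu, sigma)  # ug/L
    bodyweight = random.gauss(73.9, 12.0)             # kg, RiskNormal(73.9; 12)
    intakes.append(CUPS_PER_DAY * 0.2 * furan_content / bodyweight)

intakes.sort()
mean_intake = sum(intakes) / N         # ug/kg bw/day
p90_intake = intakes[int(0.9 * N)]     # empirical 90th percentile
```

Plain random sampling converges more slowly than Latin Hypercube, but at 10,000 iterations the mean and upper percentiles should be of comparable magnitude to the chapter's results.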

The estimated daily furan intakes for men are shown in Figure 98.2. No significant differences were found between the sexes. In both sexes, the intake increases almost linearly until the age group of 35–50 years, which has the highest coffee consumption. At higher ages, the consumption then stays more or less constant (Figure 98.3).

FIGURE 98.2. Histograms showing the probability density of estimated furan intakes from probabilistic simulations with 10,000 iterations for all age groups in men.

The y-axis shows the relative frequency with which a value within the range of a bin occurs. No significant differences were detected for women.

FIGURE 98.3. Estimated average furan intake as a function of age, derived using probabilistic simulations with 10,000 iterations.

The error bar shows the standard deviation.

To facilitate judgment about the uncertainty in the calculation and to consider average and worst-case scenarios, Table 98.1 shows descriptive statistics on the results of the probabilistic calculation for both sexes combined. The mean is 0.26 μg/kg bw/day and the P90 is 0.59 μg/kg bw/day.

For a daily consumption of between one and eight cups of coffee per day, the resulting exposure was probabilistically estimated, and the descriptive statistics of the resulting distributions are presented in Table 98.1. The average furan intake is 0.12 μg/kg bw/day per cup of coffee, whereas the P90 intake per cup of coffee is 0.26 μg/kg bw/day.

The German Federal Institute for Risk Assessment calculated furan exposures of between 0.1 μg/kg bw/day (mean coffee consumption) and 0.4 μg/kg bw/day (high coffee consumption). 26 A survey in the United States by Morehouse et al. 27 showed exposure estimates of 0.15 μg/kg bw/day for brewed coffee, whereas a Canadian survey by Becalski et al. 28 reported an average exposure estimate of 0.43 μg/kg bw/day. A report from Belgium indicated that the average population-based exposure from coffee consumption was approximately 0.21 μg/kg bw/day. 29 The EFSA showed that coffee contributed 88% of overall adult furan exposure, but no separate estimate for coffee was provided. 12

Our average population-based exposure of 0.26 μg/kg bw/day, with a worst-case (P95) exposure of 0.90 μg/kg bw/day, is comparable with previous estimates in the literature and also with the EFSA data when extrapolated with 88% to coffee alone (approximately 0.2 μg/kg bw/day for mean exposure and 1.1 μg/kg bw/day for worst cases).


URL:

https://www.sciencedirect.com/science/article/pii/B978012409517500098X

Nutritional Therapy in Maintenance Hemodialysis

Kamyar Kalantar-Zadeh MD, PhD, MPH , in Handbook of Dialysis Therapy (Fourth Edition), 2008

Assessment of Nutritional Status

Methods and tools to assess protein-energy malnutrition in dialysis patients are classically divided into four major categories: assessment of appetite and dietary intake, biochemical and laboratory assessment, body composition measures, and nutritional scoring systems (Table 51.3). A normal appetite is essential to maintain adequate food intake and to avoid undernutrition. Even though poor appetite and anorexia are early signs of uremia and may be a cause of malnutrition in dialysis patients, there is currently no uniformly accepted quantitative assessment for appetite because appetite is inherently subjective. It has been argued that chronic inflammation is a cause of poor appetite in hemodialysis patients. If this hypothesis is true, inflammation may be causally linked to malnutrition by engendering anorexia in dialysis patients.

Dietary assessment is a traditional nutritional evaluation, because both the quality and quantity of the ingested nutrients can be assessed with a high degree of reproducibility. Nevertheless, dietary assessment methods (including the 24-hour recall, 3-day diary with interview, and food frequency questionnaires) are currently rarely employed in dialysis patients. A more routinely used and readily available method is the calculation of the weight-normalized protein equivalent of total nitrogen appearance (nPNA), also known as the normalized protein catabolic rate (nPCR), which is derived from the rate of urea generation between two subsequent dialysis treatment sessions. This urea-kinetic estimate of protein intake is associated with survival in hemodialysis patients (Figure 51.2). Among the limitations of nPNA are its mathematical correlation with Kt/V and its required assumption of a closed and stable system (i.e., no residual renal function and no negative or positive nitrogen balance).

Anthropometry and body composition measures are used as conventional indicators of nutritional status in dialysis patients. Weight-for-height and body mass index (BMI = weight/height2) can be conveniently calculated and are also known to predict outcomes in dialysis patients. However, the reliability of these measures in representing true body composition is questionable, especially because a high BMI can occur with both high total body fat and very high muscle mass. Furthermore, there is a paradoxical association between higher BMI and better survival in hemodialysis patients (Figure 51.3). This consistently observed counterintuitive association, also known as the obesity paradox, underscores the important role of nutrition in the survival of hemodialysis patients.
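The BMI formula above is a one-line computation; a small sketch in Python (the example values are illustrative, not from the chapter):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# Example: an 80-kg patient who is 1.75 m tall
value = bmi(80.0, 1.75)  # about 26.1 kg/m^2
```

Note that, per the text, the same BMI value can reflect very different body compositions, so it should be interpreted alongside other measures.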

Caliper anthropometry, including mid-arm muscle mass and skin-fold thickness, has poor reproducibility. More reliable methods (such as underwater weighing and total nitrogen or potassium measurements) are costly and rarely used in dialysis patients, although they are considered gold standards. Energy-beam methods may provide more pragmatic alternatives. Portable devices such as those based on bioelectrical impedance analysis (BIA) or near-infrared interactance (NIR) technology are evaluator and patient friendly, whereas dual-energy x-ray absorptiometry (DEXA) is a more elaborate and costly method that requires both resources and expertise.

Serum concentrations of albumin, prealbumin (transthyretin), transferrin (total iron-binding capacity, TIBC), cholesterol, urea nitrogen, and creatinine can be evaluated as markers of nutritional status and outcome predictors in dialysis patients. However, these laboratory values may be significantly confounded by such non-nutrition factors as inflammation, oxidative stress, iron stores, liver disease, and residual renal function. Serum albumin is one of the most sensitive mortality predictors in hemodialysis patients (Figure 51.4). A fall in serum albumin concentration of as little as 0.6 g/dL from baseline over a 6-month interval is associated with a doubling of the death risk in these patients.

Several scoring systems have recently been developed to assess the overall nutritional aspects of dialysis patients. The Subjective Global Assessment (SGA) is probably the most well-known scoring tool, and it has also been recommended by the NKF-K/DOQI Nutrition guidelines for the periodic assessment of dialysis patients. Among the limitations of the SGA are the inherently "subjective" characteristics of its assessment components and its semiquantitative scoring. Fully quantitative versions of the SGA have been developed for dialysis patients, including the Dialysis Malnutrition Score (DMS) and the Malnutrition-Inflammation Score (MIS). The reproducibility and objectivity of the DMS, and especially of the MIS, may be superior to the conventional SGA for hemodialysis patients.


URL:

https://www.sciencedirect.com/science/article/pii/B9781416041979500569

DIET IN RESPIRATORY DISEASE

I. Romieu , in Encyclopedia of Respiratory Medicine, 2006

Methodological Issues

Study design

Both observational and experimental studies have been performed to determine the impact of diet on obstructive lung diseases. The type of study most often carried out uses a cross-sectional design in which dietary intake, airway responsiveness, changes in lung function, or respiratory symptoms are simultaneously assessed. Because of their cross-sectional nature, these studies cannot provide information on the temporal relationship between dietary intake and lung diseases and are therefore more likely to be biased. For example, subjects with prevalent symptoms may change their diet, thus altering the nature of the association. In addition, uncertainty exists concerning the biologically relevant exposure window, and it is plausible that dietary agents may act at different stages of the disease process. Cohort studies enroll individuals free of disease and evaluate the incidence of lung disease, thus providing a correct temporal relationship and stronger evidence for a causal relation between diet and lung disease. However, long-term dietary intake assessment is difficult because of changes in dietary habits during follow-up, and it is also subject to misclassification, thus potentially underestimating associations.

In experimental studies, patients are randomly assigned to receive a dietary supplement or a placebo, and therefore their diet is controlled. These studies are more likely to provide data indicating a true causal relation; however, the results can only be extrapolated to populations that are similar to the one under study.

Assessment of dietary intake

In observational studies, different methodologies for dietary assessment have been used, and it is important to understand the limitations of such methods when interpreting the results of these studies. Biochemical indicators of dietary intake generally reflect recent intake and are unable to provide adequate information concerning long-term intake, which represents the most relevant risk factor for chronic disease. In addition, certain factors, such as exposure to tobacco smoke, will decrease the level of vitamin C in serum, and if these effects are not accounted for, indications of dietary intake will be erroneous. Food frequency questionnaires provide a reasonable estimate of long-term dietary intake but are subject to random error and may thus lead to an underestimate of the relationship between dietary factors and lung disease. For some nutrients, such as vitamin E and selenium, relying on dietary assessment methods to classify individuals is problematic since accurate data on food content are limited. The selenium content will vary according to the origin of the food. Also, the quantity of fats or oils, rich in vitamin E, that are ingested is difficult to discern. Finally, because few studies consider more than one nutrient in their analysis, a protective or adverse effect may be attributed to one nutrient or micronutrient, such as vitamin C, when in fact it reflects the effect of another correlated dietary constituent or of an interaction between dietary constituents. When analyzing the effect of foods, it is particularly difficult to exclude the possibility that a certain component of fruit or vegetables may have a truly protective effect.


URL:

https://www.sciencedirect.com/science/article/pii/B0123708796002672

Diet and Nutrition

B.J. Rolls , A. Drewnowski , in Encyclopedia of Gerontology (Second Edition), 2007

The Population at Risk

The elderly are the fastest growing segment of the US population. Over 25 million Americans are over the age of 65. By the year 2030, 57 million people are expected to be 65 years or older. Yet the elderly are not a homogeneous population with uniform nutritional needs. Although some subgroups may be at greater risk for anorexia and associated nutritional deficiencies, others, especially women, are more likely to be overweight and obese. In general, body weight increases until late middle age, then plateaus and decreases for older persons.

Nutritional surveys of the US population have shown poor correlations between reported dietary intake of nutrients and clinical and biochemical measures of nutritional status. For example, data showing inadequate intake of energy and protein among the elderly are difficult to reconcile with the increased prevalence of obesity and increased percentage of body fat. Nevertheless, many measures of dietary and nutritional assessment may not be applicable to older people. Most of the nutritional standards that have been used were generated from data on young individuals and were then extrapolated to the elderly. Awareness of the issues this can cause is leading to collection of more data on intake patterns of the elderly. However, the assessment of dietary intake in large-scale epidemiological studies is often based on self-administered food frequency questionnaires or diet records. Such procedures place demands on memory and cognition, and their accuracy may be compromised by forgetfulness, fatigue, or dementia. Dietary intake assessments that are shorter and interview based may be more appropriate for use with elderly populations. The accuracy of body composition measures also changes as a function of age. Although body fat in young people is stored largely in subcutaneous depots and can be easily measured by skinfold thickness, much of the body fat in older people is stored in the trunk, which is harder to measure using readily accessible techniques such as skinfold thickness. These various assessment problems make it difficult to draw firm conclusions about the nutritional status of the elderly, particularly because they are a heterogeneous group. One thing is clear: there is a need for two sets of age-specific nutritional status standards for older people, one set for the free-living population and one for those who are chronically or acutely ill.

With these issues in mind, we will briefly examine what existing nutritional surveys indicate about changes in body weight (BW) and nutritional status with age. The National Health and Nutrition Examination Survey II (NHANES II, 1976–80) shows that the prevalence of overweight increased with age. As shown in Figure 1, older people weigh more, and many are fatter, than young people. Body mass index (BMI), the principal measure of overweight for adults, correlates reasonably well with body fatness.
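BMI is a simple formula: weight in kilograms divided by the square of height in meters. A minimal sketch in Python, using the conventional WHO adult cut-offs (the cut-offs are a general convention, not figures taken from this chapter):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    """Conventional WHO adult cut-offs; not age-adjusted for the elderly."""
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal"
    if value < 30:
        return "overweight"
    return "obese"

# Illustrative numbers only, not survey data.
value = bmi(70.0, 1.65)
print(round(value, 1), bmi_category(value))  # 25.7 overweight
```

Note that these fixed cut-offs are exactly the kind of standard, derived largely from younger adults, that the passage above cautions may not transfer directly to older populations.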

Figure 1. Body mass indices (BMI) for US men and women by age from the National Health and Nutrition Examination Survey II, 1976–1980 (NHANES II).

In contrast to BMI, energy intakes decline as a function of age. Based on a single 24-h diet recall, a more recent NHANES III study (NHANES III phase 1; 1988–91) (see Figure 2) shows that median energy intakes for adults aged 70–79 years are 1797 kcal for men and 1382 kcal for women. The reported macronutrient composition of energy in their diet is approximately 50% carbohydrate, 16% protein, and 34% fat. These data are similar to the results reported for the population as a whole. However, the finding that intake of some vitamins and minerals is low has given rise to concerns that some elderly individuals may consume inadequate amounts of nutrient-dense foods. For example, low consumption of calcium and vitamin D has been attributed to low intake of dairy products by the elderly.
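The energy shares above can be converted into grams per day using the standard Atwater factors (approximately 4 kcal/g for carbohydrate and protein, 9 kcal/g for fat). A short sketch, applying these conversion factors (which are general nutrition conventions, not stated in the text) to the median male intake quoted above:

```python
# Atwater general factors: approximate metabolizable energy per gram.
KCAL_PER_G = {"carbohydrate": 4, "protein": 4, "fat": 9}

def grams_from_energy_share(total_kcal: float, shares: dict) -> dict:
    """Convert each macronutrient's share of total energy into grams per day."""
    return {m: total_kcal * frac / KCAL_PER_G[m] for m, frac in shares.items()}

# Median intake for men aged 70-79 (1797 kcal) with the reported energy
# shares: ~50% carbohydrate, 16% protein, 34% fat.
shares = {"carbohydrate": 0.50, "protein": 0.16, "fat": 0.34}
grams = grams_from_energy_share(1797, shares)
print({m: round(g) for m, g in grams.items()})
# {'carbohydrate': 225, 'protein': 72, 'fat': 68}
```

The same function applied to the 1382 kcal median for women yields proportionally smaller gram amounts, since the reported energy shares are the same.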

Figure 2. Energy intakes (kcal) for men and women by age from the National Health and Nutrition Examination Survey III, 1988–1991 (NHANES III).

Anorexia, or low food intake, is a problem in the elderly because it increases the risk of nutrition-related illness. Though nutritional surveys have shown a low-to-moderate prevalence of nutrient deficiencies in free-living populations, elderly persons living in institutional settings may be at greater risk for malnutrition. Although estimates vary, between 30% and 60% of older individuals in long-term institutional care have some degree of malnutrition. Studies show that poor nutrition and low BW are not simply a result of disease states, but often precede and predispose elderly individuals to disease and death.

Loss of appetite can be caused by age-related psychological and physical factors. Eating patterns can be influenced by mental factors, such as depression or dementia, and physical factors, including immobility, inability to feed oneself, and poor dentition or ill-fitting dentures. Many elderly individuals have problems with their oral cavity. For example, 42% of the geriatric population in the United States have no natural teeth. Of those with their teeth, 60% have tooth decay and 90% have gum disease, which impair their ability to chew. Tooth loss also affects chewing ability, and this is not completely restored by dentures. Impaired chewing can cause changes in food selection, such as decreasing the variety in the diet, which could contribute to nutritional problems.

Decreased physical activity and a slower metabolic rate are part of the explanation for lowered food intake in the elderly. However, decreased energy demands alone cannot explain why nutritional problems related to low food intake develop.
