404 results for intake level
at Queensland University of Technology - ePrints Archive
Abstract:
Oral intake of ascorbic acid is essential for optimum health in human beings. Continuous ambulatory peritoneal dialysis (CAPD) patients have an increased need for ascorbic acid because of increased loss through dialysate, reduced intake owing to nausea and loss of appetite, and increased oxidative stress. However, optimum intake is still controversial. We studied 50 clinically stable patients to determine the relationship between oral ascorbic acid intake and serum ascorbic acid (SAA) level. Total oral intake ranged from 28 mg to 412 mg daily. Only one patient had an oral intake of ascorbic acid below 60 mg per day. The SAA levels ranged from 1 mg/L to 36.17 mg/L. Although a strong correlation existed between intake and SAA (p < 0.001, R² = 0.47), the variation in SAA at any given intake level was wide. Of the studied patients, 62% had an SAA < 8.7 mg/L, 40% had an SAA < 5.1 mg/L (below the level in a healthy population), and 12% had a level below 2 mg/L (scorbutic). None of the patients demonstrated clinical manifestations of scurvy. Our results show that, in CAPD patients, ascorbic acid deficiency can be reliably detected only with SAA measurements, and oral intake may influence SAA level. To maintain ascorbic acid in the normal range for healthy adults, daily oral intake needs to be increased above the U.S. recommended dietary allowance, to 80-140 mg.
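As a rough illustration of the regression statistic reported above, the coefficient of determination (R²) for a simple linear fit of SAA on intake can be computed as follows. The intake/SAA pairs below are invented for illustration only, not the study data:

```python
import statistics

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical intake (mg/day) and SAA (mg/L) pairs, for illustration only
intake = [28, 60, 95, 140, 180, 220, 260, 300, 350, 412]
saa = [1.0, 3.2, 4.0, 8.7, 6.5, 12.0, 10.5, 18.0, 22.0, 30.0]
r2 = r_squared(intake, saa)
```

An R² of 0.47, as reported, means intake explains roughly half the variance in SAA, which is consistent with the wide scatter the authors observed at any given intake level.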
Abstract:
Vitamin D deficiency and insufficiency are now seen as a contemporary health problem in Australia, with possible widespread health effects not limited to bone health [1]. Despite this, the Vitamin D status (measured as serum 25-hydroxyvitamin D (25(OH)D)) of ambulatory adults has been overlooked in this country. Serum 25(OH)D status is especially important among this group, as studies have shown a link between Vitamin D and fall risk in older adults [2]. Limited data also exist on the contributions of sun exposure via ultraviolet radiation and dietary intake to serum 25(OH)D status in this population. The aims of this project were: to assess the serum 25(OH)D status of a group of older ambulatory adults in South East Queensland; to assess the association between their serum 25(OH)D status and functional measures as possible indicators of fall risk; to obtain data on the sources of Vitamin D in this population and assess whether intake was related to serum 25(OH)D status; and to describe sun protection and exposure behaviours in this group and investigate whether a relationship existed between these and serum 25(OH)D status. Collecting these data addresses key gaps identified in the literature with regard to this population group and their Vitamin D status in Australia. A convenience sample of participants (N=47) over 55 years of age was recruited for this cross-sectional, exploratory study, which was undertaken in December 2007 in south-east Queensland (Brisbane and Sunshine Coast). Participants were required to complete a sun exposure questionnaire in addition to a Calcium and Vitamin D food frequency questionnaire. Timed Up and Go and handgrip dynamometry tests were used to examine functional capacity. Serum 25(OH)D status and blood measures of Calcium, Phosphorus and Albumin were determined through blood tests.
The mean and median serum 25-hydroxyvitamin D (25(OH)D) for all participants in this study were 85.8 nmol/L (standard deviation 29.7 nmol/L) and 81.0 nmol/L (range 22-158 nmol/L), respectively. Analysis at the bivariate level revealed a statistically significant relationship between serum 25(OH)D status and location, with participants living on the Sunshine Coast having a mean serum 25(OH)D status 21.3 nmol/L higher than participants living in Brisbane (p=0.014). While at the descriptive level there was an apparent trend towards higher outdoor exposure being associated with increasing levels of serum 25(OH)D, no statistically significant associations were observed between serum 25(OH)D status and the sun measures of outdoor exposure, sun protection behaviours and phenotypic characteristics. Intake of both Calcium and Vitamin D was low in this sample: 68% of participants did not meet the Estimated Average Requirement (EAR) for Calcium (median=771.0 mg; range=218.0-2616.0 mg), while 87% did not meet the Adequate Intake (AI) for Vitamin D (median=4.46 µg; range=0.13-30.0 µg). This raises the question of how realistic meeting the new Adequate Intakes for Vitamin D is when there is such a low level of Vitamin D fortification in this country. However, participants meeting the AI for Vitamin D were observed to have a significantly higher serum 25(OH)D status than those not meeting it (p=0.036), showing that meeting the AI for Vitamin D may play a significant role in determining Vitamin D status in this population. By stratifying the data by categories of outdoor exposure time, a trend was observed towards increased importance of dietary Vitamin D intake as a possible determinant of serum 25(OH)D status in participants with lower outdoor exposure. While a trend towards higher Timed Up and Go scores in participants with higher 25(OH)D status was seen, this was only significant for females (p=0.014).
Handgrip strength showed no statistically significant association with serum 25(OH)D status. The high serum 25(OH)D status in our sample almost certainly explains the limited relationship between functional measures and serum 25(OH)D. However, the observation of an association between slower Timed Up and Go speeds and lower serum 25(OH)D levels, even with a small sample size, is noteworthy, as slower Timed Up and Go speeds have been associated with increased fall risk in older adults [3]. Multivariable regression analysis revealed location as the only significant determinant of serum 25(OH)D status (p=0.014), with trends (p>0.1) for higher serum 25(OH)D being shown for participants who met the AI for Vitamin D and who rated themselves as having a higher health status. The results of this exploratory study show that 93.6% of participants had adequate 25(OH)D status, possibly due to measurement being taken in the summer season and the convenience nature of the sample. However, many participants did not meet their dietary Calcium and Vitamin D requirements, which may indicate inadequate intake of these nutrients in older Australians and a higher risk of osteoporosis. The relationship between serum 25(OH)D and functional measures in this population also requires further study, especially in older adults displaying Vitamin D insufficiency or deficiency.
Abstract:
OBJECTIVE Malnutrition is common among peritoneal dialysis (PD) patients. Reduced nutrient intake contributes to this. It has long been assumed that this reflects disturbed appetite. We set out to define the appetite profiles of a group of PD patients using a novel technique. DESIGN Prospective, cross-sectional comparison of PD patients versus controls. SETTING Teaching hospital dialysis unit. PATIENTS 39 PD patients and 42 healthy controls. INTERVENTION Visual analog ratings were recorded at hourly intervals to generate daily profiles for hunger and fullness. Summary statistics were generated to compare the groups. Food intake was measured using 3-day dietary records. MAIN OUTCOME MEASURES Hunger and fullness profiles. Derived hunger and fullness scores. RESULTS Controls demonstrated peaks of hunger before mealtimes, with fullness scores peaking after meals. The PD profiles had much reduced premeal hunger peaks. A postmeal reduction in hunger was evident, but the rest of the trace was flat. The PD fullness profile was also flatter than in the controls. Mean scores were similar despite the marked discrepancy in the profiles. The PD group had lower peak hunger and less diurnal variability in their hunger scores. They also demonstrated much less change in fullness rating around mealtimes, while the mean and peak fullness scores were little different. The reported nutrient intake was significantly lower for PD. CONCLUSION The data suggest that PD patients normalize their mean appetite perception at a lower level of nutrient intake than controls, suggesting that patient-reported appetite may be misleading in clinical practice. There is a loss of the usual daily variation for the PD group, which may contribute to their reduced food intake. The technique described here could be used to assess the impact of interventions upon the abnormal PD appetite profile.
Abstract:
Objective: To assess the effect of graded increases in exercise-induced energy expenditure (EE) on appetite, energy intake (EI), total daily EE and body weight in men living in their normal environment and consuming their usual diets. Design: Within-subject, repeated measures design. Six men (mean (s.d.) age 31.0 (5.0) y; weight 75.1 (15.96) kg; height 1.79 (0.10) m; body mass index (BMI) 23.3 (2.4) kg/m2) were each studied three times during a 9 day protocol, corresponding to prescriptions of no exercise (control) (Nex; 0 MJ/day), a medium exercise level (Mex; ~1.6 MJ/day) and a high exercise level (Hex; ~3.2 MJ/day). On days 1-2 subjects were given a medium fat (MF) maintenance diet (1.6 × resting metabolic rate (RMR)). Measurements: On days 3-9 subjects self-recorded dietary intake using a food diary and self-weighed intake. EE was assessed by continual heart rate monitoring, using the modified FLEX method. Subjects' heart rate (HR) was individually calibrated against submaximal VO2 during incremental exercise tests at the beginning and end of each 9 day study period. Respiratory exchange was measured by indirect calorimetry. Subjects completed hourly hunger ratings during waking hours to record subjective sensations of hunger and appetite. Body weight was measured daily. Results: EE amounted to 11.7, 12.9 and 16.8 MJ/day (F(2,10)=48.26; P<0.001 (s.e.d.=0.55)) on the Nex, Mex and Hex treatments, respectively. The corresponding values for EI were 11.6, 11.8 and 11.8 MJ/day (F(2,10)=0.10; P=0.910 (s.e.d.=0.10)), respectively. There were no significant treatment effects on hunger, appetite or body weight, although there was evidence of weight loss on the Hex treatment. Conclusion: Increasing EE did not lead to compensation of EI over 7 days. However, total daily EE tended to decrease over time on the two exercise treatments. Lean men appear able to tolerate a considerable negative energy balance, induced by exercise, over 7 days without invoking compensatory increases in EI.
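The FLEX heart-rate method mentioned above estimates energy expenditure from continuous heart-rate monitoring: below an individually determined FLEX heart rate, EE is taken as resting; at or above it, EE is predicted from the subject's individual calibration line (derived from the HR-VO2 incremental exercise test). A minimal, simplified sketch of that logic, with illustrative numbers that are not from the study:

```python
def flex_energy_expenditure(hr_samples, flex_hr, rmr_kj_min, slope, intercept):
    """Total EE (kJ) from minute-by-minute heart rate, simplified FLEX method.

    Below flex_hr, EE is taken as resting metabolic rate; at or above it,
    EE is predicted from an individual HR-EE calibration line whose slope
    and intercept come from each subject's incremental exercise test.
    """
    total = 0.0
    for hr in hr_samples:
        if hr < flex_hr:
            total += rmr_kj_min          # resting minute
        else:
            total += intercept + slope * hr  # active minute, calibrated line
    return total

# Illustrative values only (not from the study): one HR sample per minute
hr_day = [62, 65, 70, 95, 110, 120, 80, 64]
ee = flex_energy_expenditure(hr_day, flex_hr=90, rmr_kj_min=5.0,
                             slope=0.35, intercept=-20.0)
```

The calibration at the start and end of each 9-day period guards against drift in the individual HR-EE relationship over the protocol.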
Abstract:
Objective: The evidence was reviewed on how physical activity could influence the regulation of food intake by either adjusting the sensitivity of appetite control mechanisms or by generating an energy deficit that could adjust the drive to eat. Design: Interventionist and correlational studies that had a significant influence on the relationship between physical activity and food intake were reviewed. Interventionist studies involve a deliberate imposition of physical activity with subsequent monitoring of the eating response. Correlational studies make use of naturally occurring differences in the levels of physical activity (between and within subjects) with simultaneous assessment of energy expenditure and intake. Subjects: Studies using lean, overweight, and obese men and women were included. Results: Only 19% of interventionist studies report an increase in energy intake after exercise; 65% show no change and 16% show a decrease in appetite. Of the correlational studies, approximately half show no relationship between energy expenditure and intake. These data indicate a rather loose coupling between energy expenditure and intake. A common sense view is that exercise is futile as a form of weight control because the energy deficit drives a compensatory increase in food intake. However, evidence shows that this is not generally true. One positive aspect of this is that raising energy expenditure through physical activity (or maintaining an active life style) can cause weight loss or prevent weight gain. A negative feature is that when people become sedentary after a period of high activity, food intake is not “down-regulated” to balance a reduced energy expenditure. Conclusion: Evidence suggests that a high level of physical activity can aid weight control either by improving the matching of food intake to energy expenditure (regulation) or by raising expenditure so that it is difficult for people to eat themselves into a positive energy balance.
Abstract:
Purpose: The purpose of this paper is to explore the process, and analyse the implementation, of constructability improvement and the resulting innovation during the planning and design of a sea water intake structure for a fertilizer plant project. Design/methodology/approach: The research approach is a case study at the project level. The constructability improvement process was investigated using constructability implementation checklists, direct observation, analysis of documented lessons learned, and key personnel interviews. Findings: The case study shows that implementing constructability during the planning and design stage of this sea water intake structure increased project performance, improving the schedule by 5 months (14.21%) and reducing the project cost by 15.35%. Research limitations/implications: This case study was limited to three previous sea water intake projects as references and one new-method sea water intake structure at a fertilizer plant project. Practical implications: A constructability improvement checklist drawing on theory and lessons learned for this specific construction project was documented. Originality/value: The findings support the relevant constructability literature and provide specific lessons learned for three previous projects and one new innovative construction method, documented by the company.
Abstract:
Nutrition interventions in the form of both self-management education and individualised diet therapy are considered essential for the long-term management of type 2 diabetes mellitus (T2DM). The measurement of diet is essential to inform, support and evaluate nutrition interventions in the management of T2DM. Barriers inherent within health care settings and systems limit ongoing access to personnel and resources, while traditional prospective methods of assessing diet are burdensome for the individual and often result in changes in typical intake to facilitate recording. This thesis investigated the inclusion of information and communication technologies (ICT) to overcome limitations of current approaches in the nutritional management of T2DM, in particular the development, trial and evaluation of the Nutricam dietary assessment method (NuDAM), consisting of a mobile phone photo/voice application to assess nutrient intake in a free-living environment with older adults with T2DM. Study 1: Effectiveness of an automated telephone system in promoting change in dietary intake among adults with T2DM. The effectiveness of an automated telephone system, Telephone-Linked Care (TLC) Diabetes, designed to deliver self-management education, was evaluated in terms of promoting dietary change in adults with T2DM and sub-optimal glycaemic control. In this secondary data analysis, independent of the larger randomised controlled trial, complete data were available for 95 adults (59 male; mean age(±SD)=56.8±8.1 years; mean(±SD) BMI=34.2±7.0 kg/m2). The treatment effect showed a reduction in total fat of 1.4% and saturated fat of 0.9% of energy intake, body weight of 0.7 kg and waist circumference of 2.0 cm. In addition, a significant increase in the nutrition self-efficacy score of 1.3 (p<0.05) was observed in the TLC group compared to the control group.
The modest trends observed in this study indicate that the TLC Diabetes system does support the adoption of positive nutrition behaviours as a result of diabetes self-management education; however, caution must be applied in the interpretation of results due to the inherent limitations of the dietary assessment method used. The decision to use a closed-list FFQ with known bias may have influenced the accuracy of reporting dietary intake in this instance. This study provided an example of the methodological challenges experienced with measuring changes in absolute diet using an FFQ, and reaffirmed the need for novel prospective assessment methods capable of capturing natural variance in usual intakes. Study 2: The development and trial of the NuDAM recording protocol. The feasibility of the Nutricam mobile phone photo/voice dietary record was evaluated in 10 adults with T2DM (6 male; age=64.7±3.8 years; BMI=33.9±7.0 kg/m2). Intake was recorded over a 3-day period using both Nutricam and a written estimated food record (EFR). Compared to the EFR, the Nutricam device was found to be acceptable among subjects; however, energy intake was under-recorded using Nutricam (-0.6±0.8 MJ/day; p<0.05). Beverages and snacks were the items most frequently not recorded using Nutricam; however, forgotten meals contributed the greatest difference in energy intake between records. In addition, the quality of dietary data recorded using Nutricam was unacceptable for just under one-third of entries. It was concluded that an additional mechanism was necessary to complement dietary information collected via Nutricam. Modifications to the method were made to allow for clarification of Nutricam entries and probing for forgotten foods during a brief phone call to the subject the following morning. The revised recording protocol was evaluated in Study 4.
Study 3: The development and trial of the NuDAM analysis protocol. Part A explored the effect of the type of portion size estimation aid (PSEA) on the error associated with quantifying four portions of 15 single food items contained in photographs. Seventeen dietetic students (1 male; age=24.7±9.1 years; BMI=21.1±1.9 kg/m2) estimated all food portions on two occasions: without aids and with aids (food models or reference food photographs). Overall, the use of a PSEA significantly reduced mean (±SD) group error between estimates compared to no aid (-2.5±11.5% vs. 19.0±28.8%; p<0.05). The type of PSEA (i.e. food models vs. reference food photographs) did not have a notable effect on the group estimation error (-6.7±14.9% vs. 1.4±5.9%, respectively; p=0.321). This exploratory study provided evidence that the use of aids in general, rather than their type, was more effective in reducing estimation error. Findings guided the development of the Dietary Estimation and Assessment Tool (DEAT) for use in the analysis of the Nutricam dietary record. Part B evaluated the effect of the DEAT on the error associated with the quantification of two 3-day Nutricam dietary records in a sample of 29 dietetic students (2 males; age=23.3±5.1 years; BMI=20.6±1.9 kg/m2). Subjects were randomised into two groups: Group A and Group B. For Record 1, the use of the DEAT (Group A) resulted in a smaller error compared to estimations made without the tool (Group B) (17.7±15.8%/day vs. 34.0±22.6%/day, respectively; p=0.331). In comparison, all subjects used the DEAT to estimate Record 2, with the resultant error similar between Groups A and B (21.2±19.2%/day vs. 25.8±13.6%/day, respectively; p=0.377). In general, the moderate estimation error associated with quantifying food items did not translate into clinically significant differences in the nutrient profile of the Nutricam dietary records; only amorphous foods were notably over-estimated in energy content without the use of the DEAT (57kJ/day vs.
274kJ/day; p<0.001). A large proportion (89.6%) of the group found the DEAT helpful when quantifying food items contained in the Nutricam dietary records. The use of the DEAT reduced quantification error, minimising any potential effect on the estimation of energy and macronutrient intake. Study 4: Evaluation of the NuDAM. The accuracy and inter-rater reliability of the NuDAM in assessing energy and macronutrient intake were evaluated in a sample of 10 adults (6 males; age=61.2±6.9 years; BMI=31.0±4.5 kg/m2). Intake recorded using both the NuDAM and a weighed food record (WFR) was coded by three dietitians and compared with an objective measure of total energy expenditure (TEE) obtained using the doubly labelled water technique. At the group level, energy intake (EI) was under-reported to a similar extent using both methods, with a ratio of EI:TEE of 0.76±0.20 for the NuDAM and 0.76±0.17 for the WFR. At the individual level, four subjects reported implausible levels of energy intake using the WFR method, compared to three using the NuDAM. Overall, moderate to high correlation coefficients (r=0.57-0.85) were found between the two dietary measures across energy and macronutrients, except fat (r=0.24). High agreement was observed between dietitians for energy and macronutrient estimates derived from both the NuDAM (ICC=0.77-0.99; p<0.001) and the WFR (ICC=0.82-0.99; p<0.001). All subjects preferred using the NuDAM over the WFR to record intake and were willing to use the novel method again over longer recording periods. This research program explored two novel approaches which utilised distinct technologies to aid in the nutritional management of adults with T2DM. In particular, this thesis makes a significant contribution to the evidence base surrounding the use of PhRs through the development, trial and evaluation of a novel mobile phone photo/voice dietary record.
The NuDAM is an extremely promising advancement in the nutritional management of individuals with diabetes and other chronic conditions. Future applications lie in integrating the NuDAM with other technologies to facilitate practice across the remaining stages of the nutrition care process.
Abstract:
Changing sodium intake from 70 to 200 mmol/day elevates blood pressure in normotensive volunteers by 6/4 mmHg. Older people, people with reduced renal function on a low sodium diet, and people with a family history of hypertension are more likely to show this effect. The rise in blood pressure was associated with a fall in plasma volume, suggesting that plasma volume changes do not initiate hypertension. In normotensive individuals, the most common abnormality in membrane sodium transport induced by an extra sodium load was an increased permeability of the red cell to sodium. Some normotensive individuals also had an increase in the level of a plasma inhibitor of Na-K ATPase. These individuals also appeared to have a rise in blood pressure. Sodium intake and blood pressure are related. The relationship differs between people and is probably controlled by the genetically inherited capacity of the systems involved in membrane sodium transport.
Abstract:
Aim: Up to 60% of older medical patients are malnourished, with further decline during hospital stay. There is limited evidence for effective nutrition intervention. Staff focus groups were conducted to improve understanding of potential contextual and cultural barriers to feeding older adults in hospital. Methods: Three focus groups involved 22 staff working on the acute medical wards of a large tertiary teaching hospital. Staff disciplines were nursing, dietetics, speech pathology, occupational therapy, physiotherapy and pharmacy. A semi-structured topic guide was used by the same facilitator to prompt discussions on hospital nutrition care, including barriers. Focus groups were tape-recorded, transcribed and analysed thematically. Results: All staff recognised malnutrition to be an important problem in older patients during hospital stay and identified patient-level barriers to nutrition care, such as non-compliance with feeding plans, and hospital-level barriers, including nursing staff shortages. Differences between disciplines revealed a lack of a coordinated approach, including poor knowledge of nutrition care processes, poor interdisciplinary communication, and a lack of a sense of shared responsibility for nutrition care. All staff talked about competing activities at meal times and felt disempowered to prioritise nutrition in the acute medical setting. Staff agreed education and ‘extra hands’ would address most barriers but did not consider organisational change. Conclusions: Redesigning the model of care to reprioritise meal-time activities and redefine multidisciplinary roles and responsibilities would support coordinated nutrition care. However, effectiveness may also depend on hospital-wide leadership and support to empower staff and increase accountability within a team-led approach.
Abstract:
Objective: To explore whether area-level socioeconomic position or the form of retail stream (conventional versus farmers’ market) is associated with differences in the price, availability, variety and quality of a range of fresh fruit and vegetables. Design: A multi-site cross-sectional pilot study of farmers’ markets, supermarkets and independent fruit and vegetable retailers. Each was surveyed to assess the price, availability, variety and quality of 15 fruit and 18 vegetable items. Setting: Retail outlets were located in South-East Queensland. Subjects: Fifteen retail outlets were surveyed (five of each retail stream). Results: Average basket prices were not significantly different across the socioeconomic spectrum; however, prices in low socioeconomic areas were cheapest. Availability, variety and quality did not differ across levels of socioeconomic position; however, the areas with the most socioeconomic disadvantage scored poorest for quality and variety. Supermarkets had significantly better fruit and vegetable availability than farmers’ markets; however, price, variety and quality scores did not differ across retail streams. Results demonstrate a trend towards fruit and vegetable prices being more expensive at farmers’ markets, with the price of the fruit basket being significantly greater at the organic farmers’ market compared with the non-organic farmers’ markets. Conclusions: Neither area-level socioeconomic position nor the form of retail stream was significantly associated with differences in the availability, price, variety and quality of fruit and vegetables, except for availability, which was higher in supermarkets than farmers’ markets. Further research is needed to determine what role farmers’ markets can play in affecting fruit and vegetable intake.
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake, recorded as 0, 25, 50, 75 or 100% of the food offered. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65±18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than for well-nourished (WN) patients (15 vs. 10 days, p<0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≥50% (13 vs. 11 days, p<0.0001). MN patients had higher readmission rates (36% vs. 30%, p=0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (CI: 1.03-3.22, p=0.04) and 2.7 times greater for those consuming ≤25% of the offered food (CI: 1.54-4.68, p=0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None declared.
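The mortality analysis above used logistic regression to adjust for confounders. As a rough illustration of where such odds ratios come from, a crude (unadjusted) odds ratio with a Wald 95% confidence interval can be computed from a 2×2 table. The counts below are hypothetical, not the ANCDS data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    CI uses the standard error of the log odds ratio (Wald method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts for illustration only:
# 30 deaths / 970 survivors among malnourished,
# 17 deaths / 1983 survivors among well-nourished
or_, lower, upper = odds_ratio_ci(30, 970, 17, 1983)
```

In the survey itself the reported odds ratios are adjusted estimates from the regression model, so they would differ from a crude table calculation like this one.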
Abstract:
Traditionally, infectious diseases and under-nutrition have been considered the major health problems in Sri Lanka, with little attention paid to obesity and associated non-communicable diseases (NCDs). However, the recent Sri Lanka Diabetes and Cardiovascular Study (SLDCS) reported epidemic levels of obesity, diabetes and metabolic syndrome. Moreover, obesity-associated NCDs are the leading cause of death in Sri Lanka, and there is an exponential increase in hospitalisation due to NCDs, adversely affecting the development of the country. Despite Sri Lanka having a very high prevalence of NCDs and associated mortality, little is known about the causative factors for this burden. It is widely believed that the global NCD epidemic is associated with recent lifestyle changes, especially dietary factors. In the absence of sufficient data on dietary habits in Sri Lanka, successful interventions to manage these serious health issues would not be possible. In view of the current situation, this dietary survey was undertaken to assess the intakes of energy, macro-nutrients and selected other nutrients with respect to socio-demographic characteristics and the nutritional status of Sri Lankan adults, especially focusing on obesity. Another aim of this study was to develop and validate a culturally specific food frequency questionnaire (FFQ) to assess dietary risk factors for NCDs in Sri Lankan adults. Data were collected from a subset of the national SLDCS using a multi-stage, stratified, random sampling procedure (n=500). However, data collection in the SLDCS was affected by the prevailing civil war, which resulted in no data being collected from the Northern and Eastern provinces. To obtain a nationally representative sample, additional subjects (n=100) were later recruited from the two provinces using similar selection criteria.
Ethical approval for this study was obtained from the Ethical Review Committee, Faculty of Medicine, University of Colombo, Sri Lanka, and informed consent was obtained from the subjects before data were collected. Dietary data were obtained using the 24-h Dietary Recall (24HDR) method. Subjects were asked to recall all foods and beverages consumed over the previous 24-hour period. Respondents were probed for the types of foods and food preparation methods. For the FFQ validation study, a 7-day weighed diet record (7-d WDR) was used as the reference method. All foods recorded in the 24HDR were converted into grams, and intakes of energy and nutrients were then analysed using NutriSurvey 2007 (EBISpro, Germany), which was modified for Sri Lankan food recipes. Socio-demographic details and body weight perception were collected using an interviewer-administered questionnaire. BMI was calculated, and overweight (BMI ≥23 kg/m2), obesity (BMI ≥25 kg/m2) and abdominal obesity (men: WC ≥90 cm; women: WC ≥80 cm) were categorised according to Asia-Pacific anthropometric cut-offs. SPSS v16 for Windows and Minitab v10 were used for statistical analysis. From a total of 600 eligible subjects, 491 (81.8%) participated, of whom 34.5% (n=169) were males. Subjects were well distributed among different socio-economic parameters. A total of 312 different food items were recorded, and nutritionists grouped similar food items, which resulted in a total of 178 items. After performing step-wise multiple regression, 93 foods explained 90% of the variance for total energy intake, carbohydrates, protein, total fat and dietary fibre. Finally, 90 food items and 12 photographs were selected. Seventy-seven subjects completed the FFQ and 7-day WDR (response rate = 65%).
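The anthropometric categorisation described above (Asia-Pacific cut-offs: overweight BMI ≥23 kg/m2, obesity BMI ≥25 kg/m2, abdominal obesity WC ≥90 cm for men and ≥80 cm for women) can be sketched as a simple classifier; the function name and return shape are illustrative choices, not from the study:

```python
def classify(bmi, waist_cm, male):
    """Categorise BMI and waist circumference using the Asia-Pacific
    cut-offs stated above: overweight BMI >= 23, obese BMI >= 25;
    abdominal obesity WC >= 90 cm (men) or >= 80 cm (women)."""
    status = "normal"
    if bmi >= 25:
        status = "obese"
    elif bmi >= 23:
        status = "overweight"
    abdominal = waist_cm >= (90 if male else 80)
    return status, abdominal

print(classify(24.0, 85.0, male=False))  # ('overweight', True)
```

Note these cut-offs are lower than the WHO thresholds for European populations (25 and 30 kg/m2), which is why they are applied to South Asian samples such as this one.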
Estimated mean (SD) energy intake from the FFQ (1794±398 kcal) and the 7-d WDR (1698±333 kcal; P<0.001) differed significantly, due to a significant overestimation of carbohydrate (~10 g/d, P<0.001) and, to some extent, fat (~5 g/d, NS). Significant positive correlations were found between the FFQ and the 7-d WDR for energy (r=0.39), carbohydrate (r=0.47), protein (r=0.26), fat (r=0.17) and dietary fibre (r=0.32). Bland-Altman graphs indicated fairly good agreement between the methods, with no relationship between bias and average intake for each nutrient examined. The findings from the nutrition survey showed that, on average, Sri Lankan adults consumed over 14 portions of starch per day; moreover, males consumed 5 more portions of cereal than females. Sri Lankan adults consumed on average 3.56 portions of added sugars per day. Mean daily intakes of fruit (0.43 portions) and vegetables (1.73 portions) were well below minimum dietary recommendations (fruits 2 portions/d; vegetables 3 portions/d); the total fruit and vegetable intake was 2.16 portions/d. Daily consumption of meat or alternatives was 1.75 portions, and the sum of meat and pulses was 2.78 portions/d. Starchy foods were consumed by all participants, and over 88% met the minimum daily recommendations. Importantly, nearly 70% of adults exceeded the maximum daily recommendation for starch (11 portions/d), and a considerable proportion, particularly men, consumed larger numbers of starch servings daily. More than 12% of men consumed over 25 starch servings/d. In contrast to their starch consumption, participants reported very low intakes of other food groups. Only 11.6%, 2.1% and 3.5% of adults consumed the minimum daily recommended servings of vegetables, fruits, and fruits and vegetables combined, respectively. Six out of ten adult Sri Lankans sampled did not consume any fruits.
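A Bland-Altman comparison like the one reported above reduces to computing the mean difference (bias) between paired measurements from the two methods and its 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch with hypothetical paired energy intakes, not the study data:

```python
import statistics

def bland_altman(method_a, method_b):
    """Mean bias and 95% limits of agreement between two paired methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired energy intakes (kcal/day) for six subjects
ffq = [1850, 1700, 1920, 1600, 2100, 1750]
wdr = [1760, 1650, 1800, 1580, 1950, 1700]
bias, lower, upper = bland_altman(ffq, wdr)
```

Plotting each pair's difference against its average (as the study's Bland-Altman graphs do) additionally shows whether the bias grows with intake level, which the authors report it did not.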
Milk and dairy consumption was extremely low: over a third of the population did not consume any dairy products, and less than 1% of adults consumed 2 portions of dairy/d. A quarter of Sri Lankans reported no consumption of meat or pulses. Only 36.2% attained the minimum Sri Lankan recommendation for protein, and significantly more men than women achieved the recommendation of ≥3 servings of meat or alternatives daily (men 42.6%, women 32.8%; P<0.05). Over 70% of energy was derived from carbohydrates (males 72.8±6.4%, females 73.9±6.7%), followed by fat (males 19.9±6.1%, females 18.5±5.7%) and protein (males 10.6±2.1%, females 10.9±5.6%). The average intake of dietary fibre was 21.3 g/d for males and 16.3 g/d for females. Nutrient intake differed significantly by ethnicity, area of residence, education level, and BMI category. Similarly, dietary diversity was significantly associated with several socio-economic parameters among Sri Lankan adults; adults with BMI ≥25 kg/m² and abdominally obese adults had the highest diet diversity values. Age-adjusted prevalences (95% confidence intervals) of overweight, obesity, and abdominal obesity among Sri Lankan adults were 17.1% (13.8-20.7), 28.8% (24.8-33.1), and 30.8% (26.8-35.2), respectively. Men, compared with women, were less often overweight (14.2% (9.4-20.5) versus 18.5% (14.4-23.3), P = 0.03), less often obese (21.0% (14.9-27.7) versus 32.7% (27.6-38.2), P < 0.05), and less often abdominally obese (11.9% (7.4-17.8) versus 40.6% (35.1-46.2), P < 0.05). Although the prevalence of obesity has reached epidemic levels, body weight misperception was common among Sri Lankan adults. Two-thirds of overweight males and 44.7% of overweight females considered themselves to be "about the right weight". Over one third of obese subjects, both male and female, perceived themselves as "about the right weight" or "underweight".
Nearly 32% of centrally obese men and women perceived their waist circumference to be about right. Of those who perceived themselves as overweight or very overweight (n = 154), only 63.6% (n = 98) had tried to lose weight, and only about a quarter (n = 39) had sought advice from professionals. A number of important conclusions can be drawn from this research project. Firstly, the newly developed FFQ is an acceptable tool for assessing the nutrient intake of Sri Lankans and will assist proper categorization of individuals by dietary exposure. Secondly, a substantial proportion of the Sri Lankan population does not consume a varied and balanced diet, suggesting a close association between the country's nutrition-related NCDs and unhealthy eating habits. Moreover, dietary diversity is positively associated with several socio-demographic characteristics and with obesity among Sri Lankan adults. Lastly, although obesity is a major health issue among Sri Lankan adults, body weight misperception was common among underweight, healthy-weight, overweight, and obese adults alike: over two-thirds of overweight and one-third of obese Sri Lankan adults believed they were in the "right weight" or "underweight" categories.
Resumo:
Quantifying the competing rates of intake and elimination of persistent organic pollutants (POPs) in the human body is necessary to understand the levels and trends of POPs at a population level. In this paper we reconstruct the historical intake and elimination of ten polychlorinated biphenyls (PCBs) and five organochlorine pesticides (OCPs) from Australian biomonitoring data by fitting a population-level pharmacokinetic (PK) model. Our analysis exploits two sets of cross-sectional biomonitoring data for PCBs and OCPs in pooled blood serum samples from the Australian population that were collected in 2003 and 2009. The modeled adult reference intakes in 1975 for PCB congeners ranged from 0.89 to 24.5 ng/kg bw/day, lower than the daily intakes of OCPs ranging from 73 to 970 ng/kg bw/day. Modeled intake rates are declining with half-times from 1.1 to 1.3 years for PCB congeners and 0.83 to 0.97 years for OCPs. The shortest modeled intrinsic human elimination half-life among the compounds studied here is 6.4 years for hexachlorobenzene, and the longest is 30 years for PCB-74. Our results indicate that it is feasible to reconstruct intakes and to estimate intrinsic human elimination half-lives using the population-level PK model and biomonitoring data only. Our modeled intrinsic human elimination half-lives are in good agreement with values from a similar study carried out for the population of the United Kingdom, and are generally longer than reported values from other industrialized countries in the Northern Hemisphere.
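The core of such a population-level PK model is a one-compartment mass balance: the body burden grows with an exponentially declining intake and is depleted by first-order intrinsic elimination. A minimal sketch of that idea follows; the parameter values are illustrative, not the paper's fitted estimates, and units are kept generic:

```python
import math

def body_burden(i0, intake_halftime_yr, elim_halflife_yr, years, dt=0.01):
    """Euler-integrate dB/dt = I(t) - k_el * B, with I(t) = i0 * exp(-k_in * t).

    i0: initial intake rate (arbitrary units per year)
    intake_halftime_yr: half-time of the declining intake trend
    elim_halflife_yr: intrinsic human elimination half-life
    """
    k_in = math.log(2) / intake_halftime_yr   # decline rate of intake (1/yr)
    k_el = math.log(2) / elim_halflife_yr     # first-order elimination rate (1/yr)
    burden, t = 0.0, 0.0
    while t < years:
        intake = i0 * math.exp(-k_in * t)     # exponentially declining intake
        burden += (intake - k_el * burden) * dt
        t += dt
    return burden
```

With an intake half-time near one year and an elimination half-life of several years (within the ranges reported above), the modeled burden peaks a few years after intake begins to decline and then decays on the slower, elimination-dominated time scale; this difference in time scales is what lets cross-sectional biomonitoring at two time points (here 2003 and 2009) constrain both the intake trend and the intrinsic elimination half-life.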