144 results for Body weight, dry


Relevance: 80.00%

Abstract:

Objective: To investigate the validity of the Trendelenburg test (TT) using an ultrasound-guided nerve block (UNB) of the superior gluteal nerve and determine whether the reduction in hip abductor muscle (HABD) strength would result in the theorized mechanical compensatory strategies measured during the TT. Design: Quasi-experimental. Setting: Hospital. Participants: Convenience sample of 9 healthy men. Only participants with no current or previous injury to the lumbar spine, pelvis, or lower extremities, and no previous surgeries were included. Interventions: Ultrasound-guided nerve block. Main Outcome Measures: Hip abductor muscle strength (percent body weight [%BW]), contralateral pelvic drop (cPD), change in contralateral pelvic drop (Delta cPD), ipsilateral hip adduction, and ipsilateral trunk sway (TRUNK) measured in degrees. Results: The median age and weight of the participants were 31 years (interquartile range [IQR], 22-32 years) and 73 kg (IQR, 67-81 kg), respectively. An average 52% reduction of HABD strength (z = 2.36, P = 0.02) resulted after the UNB. No differences were found in cPD or Delta cPD (z = 0.01, P = 0.99; z = -0.67, P = 0.49, respectively). Individual changes in biomechanics showed no consistency between participants and nonsystematic changes across the group. One participant demonstrated the mechanical compensations described by Trendelenburg. Conclusions: The TT should not be used as a screening measure for HABD strength in populations demonstrating strength greater than 30% BW but should be reserved for use with populations with marked HABD weakness. Clinical Relevance: This study presents data regarding a critical level of HABD strength required to support the pelvis during the TT.

Relevance: 80.00%

Abstract:

Diet Induced Thermogenesis (DIT) is the energy expended consequent to meal consumption, and reflects the energy required for the processing and digestion of food consumed throughout each day. Although DIT is the total energy expended across a day in digestive processes in response to a number of meals, most studies measure thermogenesis in response to a single meal (Meal Induced Thermogenesis: MIT) as a representation of an individual's thermogenic response to acute food ingestion. As a component of energy expenditure, DIT may have a contributing role in weight gain and weight loss. While the evidence is inconsistent, research has tended to reveal a suppressed MIT response in obese compared to lean individuals, suggesting that such individuals store food energy efficiently and hence have a greater tendency for weight gain. Appetite is another factor regulating body weight through its influence on energy intake. Preliminary research has shown a potential link between MIT and postprandial appetite, as both are responses to food ingestion and both depend on the macronutrient content of the food consumed. There is growing interest in understanding how both MIT and appetite are modified with changes in diet, activity levels and body size. However, the findings from MIT research have been highly inconsistent, potentially due to the vastly divergent protocols used for its measurement. Therefore, the main theme of this thesis was, firstly, to address some of the methodological issues associated with measuring MIT. Additionally, this thesis aimed to measure postprandial appetite simultaneously with MIT to test for any relationships between these meal-induced variables, and to assess changes that occur in MIT and postprandial appetite during periods of energy restriction (ER) and following weight loss. Two separate studies were conducted to achieve these aims.
Based on the increasing prevalence of obesity, it is important to develop accurate methodologies for measuring the components potentially contributing to its development and to understand the variability within these variables. Therefore, the aim of Study One was to establish a protocol for measuring the thermogenic response to a single test meal (MIT), as a representation of DIT across a day. This was done by determining the reproducibility of MIT with a continuous measurement protocol and determining the effect of measurement duration. The benefit of a fixed resting metabolic rate (RMR; a single measure of RMR used to calculate each subsequent measure of MIT), compared with separate baseline RMRs (measured immediately prior to each MIT test meal), was also assessed to determine which method gave greater reproducibility. Subsidiary aims were to measure postprandial appetite simultaneously with MIT, to determine its reproducibility between days and to assess potential relationships between these two variables. Ten healthy individuals (5 males, 5 females, age = 30.2 ± 7.6 years, BMI = 22.3 ± 1.9 kg/m2, %Fat Mass = 27.6 ± 5.9%) undertook three testing sessions within a 1-4 week period. During the first visit, participants had their body composition measured using DXA for descriptive purposes, then had an initial 30-minute measure of RMR to familiarise them with the testing and to be used as a fixed baseline for calculating MIT. During the second and third testing sessions, MIT was measured. Measures of RMR and MIT were undertaken using a metabolic cart with a ventilated hood to measure energy expenditure via indirect calorimetry with participants in a semi-reclined position.
The procedure on each MIT test day was: 1) a baseline RMR measured for 30 minutes, 2) a 15-minute break in the measure to consume a standard 576 kcal breakfast (54.3% CHO, 14.3% PRO, 31.4% FAT), comprising muesli, milk, toast, butter, jam and juice, and 3) six hours of measuring MIT with two ten-minute breaks at 3 and 4.5 hours for participants to visit the bathroom. On the MIT test days, pre and post breakfast and then at 45-minute intervals, participants rated their subjective appetite, alertness and comfort on visual analogue scales (VAS). Prior to each test, participants were required to have fasted for 12 hours and to have undertaken no high-intensity physical activity for the previous 48 hours. Despite no significant group changes in the MIT response between days, individual variability was high, with an average between-day CV of 33%, which the use of a fixed RMR did not significantly improve (31%). The 95% limits of agreement, which ranged from 9.9% of energy intake (%EI) to -10.7%EI with the baseline RMRs and from 9.6%EI to -12.4%EI with the fixed RMR, indicated very large changes relative to the size of the average MIT response (MIT 1: 8.4%EI, 13.3%EI; MIT 2: 8.8%EI, 14.7%EI; baseline and fixed RMRs respectively). After just three hours, the between-day CV with the baseline RMR was 26%, which may indicate enhanced MIT reproducibility with shorter measurement durations. On average, 76, 89, and 96% of the six-hour MIT response was completed within three, four and five hours, respectively. Strong correlations were found between MIT at each of these time points and the total six-hour MIT (r = 0.990 to 0.998; P < 0.01). The proportion of the six-hour MIT completed at 3, 4 and 5 hours was reproducible between days (CVs ≤ 8.5%), indicating that shorter measurement durations can be used on repeated occasions with a similar percentage of the total response captured each time.
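The reproducibility statistics used above (between-day CV and Bland-Altman 95% limits of agreement) can be sketched in a few lines. The MIT values below are hypothetical, purely to illustrate the calculations, and the within-subject SD for a pair of measures is taken as |a − b|/√2, a common convention the abstract does not spell out:

```python
import math

def between_day_cv(day1, day2):
    """Mean within-subject between-day CV (%): SD of each pair over its mean."""
    cvs = []
    for a, b in zip(day1, day2):
        pair_mean = (a + b) / 2
        pair_sd = abs(a - b) / math.sqrt(2)  # sample SD of two values
        cvs.append(pair_sd / pair_mean * 100)
    return sum(cvs) / len(cvs)

def limits_of_agreement(day1, day2):
    """Bland-Altman 95% limits of agreement for paired measures."""
    diffs = [a - b for a, b in zip(day1, day2)]
    n = len(diffs)
    mean_diff = sum(diffs) / n
    sd_diff = math.sqrt(sum((d - mean_diff) ** 2 for d in diffs) / (n - 1))
    return mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

# hypothetical MIT values (%EI) for two test days
mit_day1 = [8.1, 9.5, 7.2, 10.3, 6.8]
mit_day2 = [9.0, 7.8, 8.9, 8.1, 9.4]
cv = between_day_cv(mit_day1, mit_day2)
loa_low, loa_high = limits_of_agreement(mit_day1, mit_day2)
```

A high CV with wide limits of agreement, relative to a mean response of only ~8-15%EI, is exactly the pattern the study reports.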
There was a lack of strong evidence of any relationship between the magnitude of the MIT response and subjective postprandial appetite. Given that a six-hour protocol places a considerable burden on participants, these results suggest that a post-meal measurement period of only three hours is sufficient to produce valid information on the metabolic response to a meal. However, while there was no mean change in MIT between test days, individual variability was large. Further research is required to better understand which factors best explain the between-day variability in this physiological measure. With such a high prevalence of obesity, dieting to reduce body weight has become commonplace. However, during periods of ER, metabolic and appetite adaptations can occur which may impede weight loss. Understanding how metabolic and appetite factors change during ER and weight loss is important for designing optimal weight loss protocols. The purpose of Study Two was to measure the changes in the MIT response and subjective postprandial appetite during either continuous (CONT) or intermittent (INT) ER and following post-diet energy balance (post-diet EB). Thirty-six obese male participants were randomly assigned to either the CONT (age = 38.6 ± 7.0 years, weight = 109.8 ± 9.2 kg, % fat mass = 38.2 ± 5.2%) or INT diet group (age = 39.1 ± 9.1 years, weight = 107.1 ± 12.5 kg, % fat mass = 39.6 ± 6.8%). The study was divided into three phases: a four-week baseline (BL) phase, where participants were provided with a diet to maintain body weight; an ER phase lasting either 16 (CONT) or 30 (INT) weeks, where participants were provided with a diet supplying 67% of their energy balance requirements to induce weight loss; and an eight-week post-diet EB phase, providing a diet to maintain body weight after weight loss. The INT ER phase was delivered as eight two-week blocks of ER interspersed with two-week blocks designed to achieve weight maintenance.
Energy requirements for each phase were predicted from measured RMR and adjusted throughout the study to account for changes in RMR. All participants completed MIT and appetite tests during the BL and ER phases. Nine CONT and 15 INT participants completed the post-diet EB MIT tests, and 15 CONT and 14 INT participants completed the post-diet EB appetite tests. The MIT test day protocol was as follows: 1) a baseline RMR measured for 30 minutes, 2) a 15-minute break in the measure to consume a standard breakfast meal (874 kcal, 53.3% CHO, 14.5% PRO, 32.2% FAT), and 3) three hours of measuring MIT. MIT was calculated as the energy expenditure above the pre-meal RMR. Appetite tests were undertaken on a separate day using the same 576 kcal breakfast used in Study One. VAS were used to assess appetite pre and post breakfast, at one hour post breakfast and then a further three times at 45-minute intervals. Appetite ratings were calculated for hunger and fullness as both the intra-meal change in appetite and the AUC. The three-hour MIT responses at BL, ER and post-diet EB were 5.4 ± 1.4%EI, 5.1 ± 1.3%EI and 5.0 ± 0.8%EI, respectively, for the CONT group and 4.4 ± 1.0%EI, 4.7 ± 1.0%EI and 4.8 ± 0.8%EI for the INT group. Compared to BL, neither group had significant changes in their MIT response during ER or post-diet EB. There were no significant time by group interactions (p = 0.17), indicating a similar response to ER and post-diet EB in both groups. Contrary to what was hypothesised, there was a significant increase in postprandial AUC fullness in response to ER in both groups (p < 0.05). However, there were no significant changes in any of the other postprandial hunger or fullness variables. Despite no changes in MIT in either the CONT or INT group in response to ER or post-diet EB, and only a minor increase in postprandial AUC fullness, the individual changes in MIT and postprandial appetite in response to ER were large.
However, those with the greatest MIT changes did not have the greatest changes in postprandial appetite. This study shows that postprandial appetite and MIT are unlikely to be altered during ER and are therefore unlikely to hinder weight loss. Additionally, there were no changes in MIT in response to weight loss, indicating that body weight did not influence the magnitude of the MIT response. There were large individual changes in both variables; however, further research is required to determine whether these changes were real compensatory responses to ER or simply between-day variation. Overall, the results of this thesis add to the current literature by showing the large variability of continuous MIT measurements, which makes it difficult to compare MIT between groups and in response to diet interventions. This thesis provides evidence that shorter measures may give equally valid information about the total MIT response and can therefore be used in future research to reduce the burden of long measurement durations. It also indicates that MIT and postprandial subjective appetite are most likely independent of each other, and that, on average, energy restriction was not associated with compensatory changes in MIT and postprandial appetite that would have impeded weight loss. However, the large inter-individual variability supports the need to examine individual responses in more detail.
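MIT is defined in both studies as the energy expended above the pre-meal RMR, expressed as a percentage of the meal's energy intake (%EI). A minimal sketch of that calculation, using hypothetical per-minute calorimetry readings:

```python
def mit_percent_ei(ee_kcal_per_min, rmr_kcal_per_min, energy_intake_kcal):
    """MIT = energy expended above the pre-meal RMR, as a percent of meal energy (%EI).

    ee_kcal_per_min is a list of one-minute postprandial energy-expenditure readings.
    """
    excess_kcal = sum(ee - rmr_kcal_per_min for ee in ee_kcal_per_min)
    return excess_kcal / energy_intake_kcal * 100.0

# hypothetical three-hour measurement: 180 one-minute readings of 1.45 kcal/min
# against an assumed pre-meal RMR of 1.2 kcal/min, after the 874 kcal breakfast
readings = [1.45] * 180
mit = mit_percent_ei(readings, 1.2, 874.0)  # ~5%EI, the order reported in Study Two
```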

Relevance: 80.00%

Abstract:

Objective To evaluate the time course of the recovery of transverse strain in the Achilles and patellar tendon following a bout of resistance exercise. Methods Seventeen healthy adults underwent sonographic examination of the right patellar (n=9) and Achilles (n=8) tendons immediately prior to and following 90 repetitions of weight-bearing quadriceps and gastrocnemius resistance exercise performed against an effective resistance of 175% and 250% body weight, respectively. Sagittal tendon thickness was determined 20 mm from the enthesis, and transverse strain, as defined by the stretch ratio, was repeatedly monitored over a 24 h recovery period. Results Resistance exercise produced an immediate decrease in Achilles (t7=10.6, p<0.01) and patellar (t8=8.9, p<0.01) tendon thickness, yielding average transverse stretch ratios of 0.86±0.04 and 0.82±0.05, which were not significantly different between tendons. The magnitude of the immediate transverse strain response, however, was reduced with advancing age (r=0.63, p<0.01). Recovery in transverse strain was prolonged compared with the duration of loading and exponential in nature. The average primary recovery time was not significantly different between the Achilles (6.5±3.2 h) and patellar (7.1±3.2 h) tendons. Body weight accounted for 62% and 64% of the variation in recovery time, respectively. Conclusions Despite structural and biochemical differences between the Achilles and patellar tendon, the mechanisms underlying transverse creep recovery in vivo appear similar and are highly time dependent. These novel findings have important implications concerning the time required for the mechanical recovery of high-stress tendons following an acute bout of exercise.
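The stretch ratio and the exponential recovery described above can be illustrated as follows. The thickness values and time constant are hypothetical, and the single-exponential form is an assumption consistent with the "exponential in nature" recovery the abstract reports, not the authors' fitted model:

```python
import math

def stretch_ratio(pre_thickness_mm, post_thickness_mm):
    """Transverse stretch ratio: post-exercise over pre-exercise tendon thickness."""
    return post_thickness_mm / pre_thickness_mm

def recovery_curve(t_hours, initial_ratio, tau_hours):
    """Assumed exponential return of the stretch ratio toward 1 (full recovery)."""
    return 1.0 - (1.0 - initial_ratio) * math.exp(-t_hours / tau_hours)

# hypothetical Achilles thicknesses: 5.2 mm pre, 4.5 mm immediately post
lam0 = stretch_ratio(5.2, 4.5)            # ~0.87, near the reported 0.86 average
lam_6h = recovery_curve(6.5, lam0, 3.0)   # ratio after the ~6.5 h primary recovery time
```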

Relevance: 80.00%

Abstract:

Introduction: Understanding the mechanical properties of tendon is an important step towards improving athletic performance, predicting injury and treating tendinopathies. The speed of sound in a medium is governed by the bulk modulus and density for fluids and isotropic materials. However, for tendon, which is a structural composite of fluid and collagen, there is some anisotropy, requiring an adjustment for Poisson's ratio. In this paper, these relationships are explored and modelled using data collected, in vivo, on human Achilles tendon. Estimates for elastic modulus and hysteresis based on speed of sound data are then compared against published values from in vitro mechanical tests. Methods: Clinical ultrasound imaging, inverse dynamics and acoustic transmission techniques were used to determine dimensions, loading conditions and longitudinal speed of sound for the Achilles tendon during a series of isometric plantar flexion exercises against body weight. Upper and lower bounds for speed of sound versus tensile stress in the tendon were then modelled, and estimates derived for elastic modulus and hysteresis. Results: Axial speed of sound varied between 1850 and 2090 m.s−1, with a non-linear, asymptotic dependency on the level of tensile stress in the tendon (5–35 MPa). Estimates derived for the elastic modulus ranged between 1 and 2 GPa. Hysteresis, derived from models of the stress-strain relationship, ranged from 3 to 11%. These values agree closely with those previously reported from direct measurements obtained via in vitro mechanical tensile tests on major weight-bearing tendons. Discussion: There is sufficiently good agreement between these indirect (speed of sound derived) and direct (mechanical tensile test derived) measures of tendon mechanical properties to validate the use of this non-invasive acoustic transmission technique.
This non-invasive method is suitable for monitoring changes in tendon properties as predictors of athletic performance, injury or therapeutic progression.
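For an isotropic thin rod, longitudinal speed of sound and elastic modulus are related by E = ρc². Applying that naive relation to the reported speed range, with an assumed tendon density of ~1100 kg/m³ (not given in the abstract), overestimates the modulus relative to the reported 1–2 GPa, which is precisely why the Poisson's-ratio adjustment discussed above matters. The sketch below only illustrates the raw isotropic relation:

```python
def thin_rod_modulus(density_kg_m3, speed_m_s):
    """Isotropic thin-rod relation E = rho * c^2, in Pa.

    For tendon this is only a rough bound: anisotropy and Poisson's ratio
    (as discussed in the abstract) reduce the derived modulus."""
    return density_kg_m3 * speed_m_s ** 2

RHO_TENDON = 1100.0  # assumed density, kg/m^3

# reported axial speed-of-sound range, m/s
bounds_gpa = [thin_rod_modulus(RHO_TENDON, c) / 1e9 for c in (1850.0, 2090.0)]
```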

Relevance: 80.00%

Abstract:

The giant freshwater prawn (Macrobrachium rosenbergii), or GFP, is one of the most important freshwater crustacean species in the inland aquaculture sector of many tropical and subtropical countries. Since the 1990s, there has been rapid global expansion of freshwater prawn farming, especially in Asian countries, with an average annual rate of increase of 48% between 1999 and 2001 (New, 2005). In Vietnam, GFP is cultured in a variety of culture systems, typically in integrated or rotational rice-prawn culture (Phuong et al., 2006), and has become one of the most common farmed aquatic species in the country, due to its ability to grow rapidly and to attract high market price and high demand. Despite the potential for expanded production, the sustainability of freshwater prawn farming in the region is currently threatened by low production efficiency and the vulnerability of farmed stocks to disease. Commercial large-scale and small-scale GFP farms in Vietnam have experienced relatively low stock productivity, large size and weight variation, a low proportion of edible meat (large head to body ratio), and a scarcity of good-quality seed stock. The current situation highlights the need for a systematic stock improvement program for GFP in Vietnam aimed at improving economically important traits in this species. This study reports on a breeding program for fast growth employing combined (between and within) family selection in giant freshwater prawn in Vietnam. The base population was synthesized using a complete diallel cross comprising 9 crosses from two local stocks (DN and MK strains) and a third, exotic stock (Malaysian strain - MY). In the next three selection generations, matings were conducted between genetically unrelated brood stock to produce full-sib and (paternal) half-sib families. All families were produced and reared separately until juveniles in each family were tagged as a batch using visible implant elastomer (VIE) at a body size of approximately 2 g.
After tags were verified, 60 to 120 juveniles chosen randomly from each family were released into two common 3,500 m2 earthen ponds for a grow-out period of 16 to 18 weeks. Selection applied at harvest on body weight followed a combined (between and within) family selection approach. In total, 81, 89, 96 and 114 families were produced for the Selection line in the F0, F1, F2 and F3 generations, respectively. In addition to the Selection line, 17 to 42 families were produced for the Control group in each generation. Results reported here are based on a data set consisting of 18,387 body and 1,730 carcass records, as well as full pedigree information collected over four generations. Variance and covariance components were estimated by restricted maximum likelihood, fitting a multi-trait animal model. Experiments assessing the performance of VIE tags in juvenile GFP of different size classes, and in individuals tagged with different numbers of tags, showed that juvenile GFP at 2 g were of a suitable size for VIE tagging, with no negative effects evident on growth or survival. Tag retention rates were above 97.8% and tag readability rates were 100%, with a correct assignment rate of 95% through to a mature animal size of up to 170 g. Across generations, estimates of heritability for body traits (body weight, body length, cephalothorax length, abdominal length, cephalothorax width and abdominal width) and carcass weight traits (abdominal weight, skeleton-off weight and telson-off weight) were moderate, ranging from 0.14 to 0.19 and 0.17 to 0.21, respectively. Body trait heritabilities estimated for females were significantly higher than for males, whereas carcass weight trait heritabilities for females and males were not significantly different (P > 0.05). Maternal and common environmental effects for body traits accounted for 4 to 5% of the total variance and were greater in females (7 to 10%) than in males (4 to 5%).
Genetic correlations among body traits were generally high in both sexes. Genetic correlations between body and carcass weight traits were also high in the mixed sexes. Average selection response (% per generation) for body weight (transformed to square root), estimated as the difference between the Selection and Control groups, was 7.4% calculated from least squares means (LSMs), 7.0% from estimated breeding values (EBVs) and 4.4% calculated from EBVs between two consecutive generations. Favourable correlated selection responses (estimated from LSMs) were detected for the other body traits (12.1%, 14.5%, 10.4%, 15.5% and 13.3% for body length, cephalothorax length, abdominal length, cephalothorax width and abdominal width, respectively) over three selection generations. Data in the second selection generation showed positive correlated responses for carcass weight traits (8.8%, 8.6% and 8.8% for abdominal weight, skeleton-off weight and telson-off weight, respectively). Data in the third selection generation showed that heritabilities for body traits were moderate, ranging from 0.06 to 0.11 and 0.11 to 0.22 at weeks 10 and 18, respectively. Body trait heritabilities estimated at week 10 were not significantly lower than at week 18. Genetic correlations between body traits within age, and for body traits between ages, were generally high. Overall, our results suggest that growth rate responds well to the application of family selection and that carcass weight traits can be improved in parallel using this approach. Moreover, selection for high growth rate in GFP can be undertaken successfully before full market size has been reached. The outcome of this study was the production of an improved culture strain of GFP for the Vietnamese culture industry, which will be trialed in real farm production environments to confirm the genetic gains identified in the experimental stock improvement program.
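The realized selection response quoted above is the Selection-minus-Control difference expressed as a percentage per generation. A minimal sketch, where the trait means are hypothetical and chosen only to reproduce a figure of the same order as the reported LSM estimate:

```python
def selection_response_pct(selection_mean, control_mean):
    """Realized selection response per generation, as a percent of the control mean."""
    return (selection_mean - control_mean) / control_mean * 100.0

# hypothetical square-root-transformed body-weight means for one generation
resp = selection_response_pct(7.52, 7.00)  # on the order of the reported 7.4%
```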

Relevance: 80.00%

Abstract:

Three native freshwater crayfish Cherax species are farmed in Australia, namely: Redclaw (Cherax quadricarinatus), Marron (C. tenuimanus), and Yabby (C. destructor). A lack of appropriate data on the specific nutrient requirements of each of these species, however, has constrained the development of species-specific formulated diets, and the current use of over-formulated feeds or expensive marine shrimp feeds limits profitability. A number of studies have investigated nutritional requirements in redclaw, focusing on replacing expensive fish meal in formulated feeds with non-protein, less expensive substitutes, including plant-based ingredients. Confirmation that freshwater crayfish possess endogenous cellulase genes suggests a potential ability to utilize complex carbohydrates like cellulose as nutrient sources in their diet. To date, studies have been limited to C. quadricarinatus and C. destructor, and none have compared the relative ability of each species to utilize soluble cellulose in their diets. Individual feeding trials of late-juveniles of each species were conducted separately in an automated recirculating culture system over 12-week cycles. Animals were fed either a test diet (TD) that contained 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch. Water temperature, conductivity and pH were maintained at constant and optimum levels for each species. Animals were fed at 3% of their body weight twice daily, and wet body weight was recorded bi-weekly. At the end of the experiment, all animals were harvested and measured, and midgut gland extracts were assayed for alpha-amylase, total protease and cellulase activity levels. After the trial period, redclaw fed the RD showed a significantly higher (p<0.05) specific growth rate (SGR) compared with animals fed the TD, while the SGRs of marron and yabby fed the two diets were not significantly different (p>0.05).
Cellulase expression levels in redclaw were not significantly different between diets. Marron and yabby showed significantly higher cellulase activity when fed the RD. Amylase and protease activities in all three species were significantly higher in animals fed the RD (Table 1). These results indicate that test animals of all species utilize starch better than dietary soluble cellulose, and that inclusion of 20% soluble cellulose in the diet does not appear to have any significant negative effect on growth rate; survival, however, was reduced in C. quadricarinatus but not in C. tenuimanus or C. destructor.

Relevance: 80.00%

Abstract:

Background & aims: Excess adiposity (overweight) is one of numerous risk factors for cardiometabolic disease. Most risk reduction strategies for overweight rely on weight loss through dietary energy restriction. However, since the evidence base for long-term successful weight loss interventions is scant, it is important to identify strategies for risk reduction independent of weight loss. The aim of this study was to compare the effects of isoenergetic substitution of dietary saturated fat (SFA) with monounsaturated fat (MUFA) via macadamia nuts on coronary risk compared to usual diet in overweight adults. Methods: A randomised controlled trial design, maintaining usual energy intake but manipulating dietary lipid profile, in a group of 64 (54 female, 10 male) overweight (BMI > 25), otherwise healthy, subjects. For the intervention group, energy intakes of usual (baseline) diets were calculated from multiple 3-day diet diaries, and SFA was replaced with MUFA (target: 50%E from fat as MUFA) by altering dietary SFA sources and adding macadamia nuts to the diet. Both control and intervention groups received advice on national guidelines for physical activity and adhered to the same protocol for diet diary record keeping and trial consultations. Anthropometric and clinical measures were taken at baseline and at 10 weeks. Results: A significant increase in brachial artery flow-mediated dilation (p < 0.05) was seen in the monounsaturated diet group at week 10 compared to baseline. This corresponded to significant decreases in waist circumference and total cholesterol (p < 0.05), and in plasma leptin and ICAM-1 (p < 0.01). Conclusions: In patient subgroups where adherence to dietary energy restriction is poor, isoenergetic interventions may improve endothelial function and other coronary risk factors without changes in body weight. This trial was registered with the Australia New Zealand Clinical Trial Registry (ACTRN12607000106437).
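The dietary target above is expressed in percent of energy (%E). Using the standard Atwater factor of 9 kcal/g for fat, converting diary grams of fat to %E is a one-liner; the intake values below are illustrative only, not study data:

```python
ATWATER_FAT_KCAL_PER_G = 9.0  # standard Atwater energy factor for fat

def pct_energy_from_fat(fat_g, total_kcal):
    """Percent of total energy contributed by a fat fraction (e.g. MUFA grams)."""
    return fat_g * ATWATER_FAT_KCAL_PER_G / total_kcal * 100.0

# illustrative day: 100 g total fat in a 2400 kcal diet -> 37.5 %E from fat
fat_pct_e = pct_energy_from_fat(100.0, 2400.0)
```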

Relevance: 80.00%

Abstract:

Obese (BMI ≥ 26 kg/m2; n = 51) and lean (BMI < 26 kg/m2; n = 61) Caucasian patients with severe, familial essential hypertension were compared with respect to genotype and allele frequencies of a HincII RFLP of the low density lipoprotein receptor gene (LDLR). A similar analysis was performed in obese (n = 28) and lean (n = 68) normotensives. A significant association of the C allele of the T→C variant responsible for this RFLP was seen with obesity (χ2 = 4.6, P = 0.029) in the hypertensive, but not in the normotensive, group (odds ratio = 3.0 for the CC genotype and 2.7 for CT). Furthermore, BMI tracked with genotypes of this allele in the hypertensives (P = 0.046). No significant genotypic relationship was apparent for plasma lipids. Significant linkage disequilibrium was, moreover, noted between the HincII RFLP and an ApaLI RFLP (χ2 = 33, P < 0.0005) that has previously shown even stronger association with obesity (odds ratio 19.6 for cases homozygous for the susceptibility allele and 15.2 for heterozygotes). The present study therefore adds to our previous evidence implicating LDLR as a locus for obesity in patients with essential hypertension.
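The odds ratios and χ² statistics reported above come from 2×2 genotype-by-phenotype contingency tables. A minimal sketch of both calculations, with hypothetical counts:

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio for a 2x2 table of allele carriage by obesity status."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

def pearson_chi2(table):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
                [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    observed = [[a, b], [c, d]]
    return sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(2) for j in range(2))

# hypothetical counts: C-allele carriers vs non-carriers, obese vs lean
or_estimate = odds_ratio(30, 20, 10, 20)
chi2 = pearson_chi2([[30, 20], [10, 20]])
```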

Relevance: 80.00%

Abstract:

It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished, and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.

Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery

The pre-operative rate of malnutrition in a convenience sample of people with Parkinson's disease (PWP) scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated, and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, body mass index (BMI)) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B), while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m2, p<.05).
There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.

Phase I: Nutritional status in community-dwelling adults with Parkinson's disease

The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, body mass index (BMI)) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m2; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score was also significantly different between the groups, being higher in the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated between well-nourished and malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools has been evaluated in PWP. In the sample described above, the use of the World Health Organisation (WHO) BMI cut-off (≤18.5 kg/m²), age-specific BMI cut-offs (≤18.5 kg/m² for under 65 years, ≤23.5 kg/m² for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate, and it takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions. Phase I: Predictors of poor nutritional status in people with Parkinson's disease A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B).
Differences between the groups included disease severity (more severe disease in 4% of SGA-A vs 21% of SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support. Phase II: Nutrition intervention Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. INT gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake (18.3 kJ/kg vs 3.8 kJ/kg and 0.3 g/kg vs 0.15 g/kg) increased in both groups. The increase in protein intake was only significant in the SC group. When compared between the groups, the changes in intake were not different.
There were no significant changes in any motor or non-motor symptoms or in "off" times or dyskinesias in either group. Aspects of quality of life, especially emotional well-being, improved over the 12 weeks. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high protein, high energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
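The screening-tool figures in the abstract above (e.g. MNA-SF Sn 94.7%, Sp 78.3%) follow from standard 2×2 confusion-matrix arithmetic. A minimal sketch, with counts reconstructed for illustration from the Phase I sample sizes (19 malnourished, 106 well-nourished) rather than taken from the study's published tables:

```python
# Sensitivity/specificity of a screening tool against the SGA gold
# standard. Counts are illustrative reconstructions, not study data.

def screening_performance(tp, fp, fn, tn):
    """Return (sensitivity, specificity) from 2x2 confusion counts."""
    sensitivity = tp / (tp + fn)   # malnourished correctly flagged
    specificity = tn / (tn + fp)   # well-nourished correctly cleared
    return sensitivity, specificity

# Approximately reproduces the MNA-SF figures quoted above:
sn, sp = screening_performance(tp=18, fp=23, fn=1, tn=83)
print(f"Sn = {sn:.1%}, Sp = {sp:.1%}")   # Sn = 94.7%, Sp = 78.3%
```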

Relevância:

80.00% 80.00%

Publicador:

Resumo:

Introduction Malnutrition is common among hospitalised patients, with poor follow-up of nutrition support post-discharge. Published studies on the efficacy of ambulatory nutrition support (ANS) for malnourished patients post-discharge are scarce. The aims of this study were to evaluate the rate of dietetics follow-up of malnourished patients post-discharge, before (2008) and after (2010) implementation of a new ANS service, and to evaluate nutritional outcomes post-implementation. Materials and Methods Consecutive samples of 261 (2008) and 163 (2010) adult inpatients referred to dietetics and assessed as malnourished using Subjective Global Assessment (SGA) were enrolled. All subjects received inpatient nutrition intervention and dietetic outpatient clinic follow-up appointments. For the 2010 cohort, ANS was initiated to provide telephone follow-up and home visits for patients who failed to attend the outpatient clinic. Subjective Global Assessment, body weight, quality of life (EQ-5D VAS) and handgrip strength were measured at baseline and five months post-discharge. Paired t-test was used to compare pre- and post-intervention results. Results In 2008, only 15% of patients returned for follow-up with a dietitian within four months post-discharge. After implementation of ANS in 2010, the follow-up rate was 100%. Mean weight improved from 44.0 ± 8.5 kg to 46.3 ± 9.6 kg, EQ-5D VAS from 61.2 ± 19.8 to 71.6 ± 17.4 and handgrip strength from 15.1 ± 7.1 kg force to 17.5 ± 8.5 kg force; p<0.001 for all. Seventy-four percent of patients improved in SGA score. Conclusion Ambulatory nutrition support resulted in significant improvements in follow-up rate, nutritional status and quality of life of malnourished patients post-discharge.
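The pre/post comparisons above rely on the paired t-test. A minimal pure-Python sketch of that statistic, using illustrative weight values rather than the study's raw data:

```python
# Paired t-test sketch (pre vs post measurements on the same subjects).
# The sample values below are made up for illustration only.
import math
import statistics

def paired_t(pre, post):
    """Return (t statistic, degrees of freedom) for paired samples."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)        # sample SD of the differences
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1

pre = [44.0, 41.2, 47.5, 39.8, 45.1]      # hypothetical baseline weights (kg)
post = [46.3, 42.0, 49.1, 41.5, 46.0]     # hypothetical 5-month weights (kg)
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")
```

The resulting t value would then be compared against the t distribution with n-1 degrees of freedom to obtain the reported p-values.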

Relevância:

80.00% 80.00%

Publicador:

Resumo:

OBJECTIVE: To optimize the animal model of liver injury that can properly represent the pathological characteristics of the dampness-heat jaundice syndrome of traditional Chinese medicine. METHODS: Liver injury in the rat model was induced by alpha-naphthylisothiocyanate (ANIT) and carbon tetrachloride (CCl4), respectively, and the effects of Yinchenhao Decoction (YCHD), a Chinese medical formula proven effective in the clinic for treating the dampness-heat jaundice syndrome, on the two liver injury models were evaluated by analyzing the serum levels of alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP), malondialdehyde (MDA), total bilirubin (T-BIL), superoxide dismutase (SOD) and glutathione peroxidase (GSH-PX), as well as the ratio of liver weight to body weight. The experimental data were analyzed by the principal component analytical method of pattern recognition. RESULTS: The ratio of liver weight to body weight was significantly elevated in the ANIT and CCl4 groups when compared with that in the normal control (P<0.01). The contents of ALT and T-BIL were significantly higher in the ANIT group than in the normal control (P<0.05, P<0.01), and the levels of AST, ALT and ALP were significantly elevated in the CCl4 group relative to those in the normal control (P<0.01). In the YCHD group, the increase in AST, ALT and ALP levels was significantly reduced (P<0.05, P<0.01), but with no significant increase in serum T-BIL. In the CCl4-intoxicated group, the MDA content was significantly increased and SOD and GSH-PX activities decreased significantly compared with those in the normal control group (P<0.01). The increase in MDA induced by CCl4 was significantly reduced by YCHD (P<0.05). CONCLUSION: YCHD showed significant effects in preventing the progression of liver injury induced by CCl4, and the closest or most suitable animal model for the dampness-heat jaundice syndrome may be the one induced by CCl4.


Relevância:

80.00% 80.00%

Publicador:

Resumo:

Traditionally, infectious diseases and under-nutrition have been considered major health problems in Sri Lanka, with little attention paid to obesity and associated non-communicable diseases (NCDs). However, the recent Sri Lanka Diabetes and Cardiovascular Study (SLDCS) reported epidemic levels of obesity, diabetes and metabolic syndrome. Moreover, obesity-associated NCDs are the leading cause of death in Sri Lanka, and hospitalizations due to NCDs have increased exponentially, adversely affecting the development of the country. Despite Sri Lanka having a very high prevalence of NCDs and associated mortality, little is known about the causative factors for this burden. It is widely believed that the global NCD epidemic is associated with recent lifestyle changes, especially dietary factors. In the absence of sufficient data on dietary habits in Sri Lanka, successful interventions to manage these serious health issues would not be possible. In view of the current situation, a dietary survey was undertaken to assess the intakes of energy, macro-nutrients and selected other nutrients with respect to socio-demographic characteristics and the nutritional status of Sri Lankan adults, focusing especially on obesity. Another aim of this study was to develop and validate a culturally specific food frequency questionnaire (FFQ) to assess dietary risk factors of NCDs in Sri Lankan adults. Data were collected from a subset of the national SLDCS using a multi-stage, stratified, random sampling procedure (n=500). However, data collection in the SLDCS was affected by the prevailing civil war, which resulted in no data being collected from the Northern and Eastern provinces. To obtain a nationally representative sample, additional subjects (n=100) were later recruited from the two provinces using similar selection criteria.
Ethical approval for this study was obtained from the Ethical Review Committee, Faculty of Medicine, University of Colombo, Sri Lanka, and informed consent was obtained from the subjects before data were collected. Dietary data were obtained using the 24-h Dietary Recall (24HDR) method. Subjects were asked to recall all foods and beverages consumed over the previous 24-hour period. Respondents were probed for the types of foods and food preparation methods. For the FFQ validation study, a 7-day weighed diet record (7-d WDR) was used as the reference method. All foods recorded in the 24HDR were converted into grams, and the intakes of energy and nutrients were analysed using NutriSurvey 2007 (EBISpro, Germany), which was modified for Sri Lankan food recipes. Socio-demographic details and body weight perception were collected using an interviewer-administered questionnaire. BMI was calculated, and overweight (BMI ≥23 kg/m²), obesity (BMI ≥25 kg/m²) and abdominal obesity (men: WC ≥90 cm; women: WC ≥80 cm) were categorized according to Asia-Pacific anthropometric cut-offs. SPSS v16 for Windows and Minitab v10 were used for statistical analyses. From a total of 600 eligible subjects, 491 (81.8%) participated, of whom 34.5% (n=169) were males. Subjects were well distributed among different socio-economic parameters. A total of 312 different food items were recorded, and nutritionists grouped similar food items, which resulted in a total of 178 items. After performing step-wise multiple regression, 93 foods explained 90% of the variance for total energy intake, carbohydrates, protein, total fat and dietary fibre. Finally, 90 food items and 12 photographs were selected. Seventy-seven subjects (response rate 65%) completed the FFQ and 7-d WDR.
Estimated mean energy intake (SD) from the FFQ (1794±398 kcal) and the 7-d WDR (1698±333 kcal) was significantly different (P<0.001), due to a significant overestimation of carbohydrate (~10 g/d, P<0.001) and, to some extent, fat (~5 g/d, NS). Significant positive correlations were found between the FFQ and 7-d WDR for energy (r = 0.39), carbohydrate (r = 0.47), protein (r = 0.26), fat (r = 0.17) and dietary fibre (r = 0.32). Bland-Altman graphs indicated fairly good agreement between the methods, with no relationship between bias and average intake of each nutrient examined. The findings from the nutrition survey showed that, on average, Sri Lankan adults consumed over 14 portions of starch/d; moreover, males consumed 5 more portions of cereal than females. Sri Lankan adults consumed on average 3.56 portions of added sugars/d. Moreover, mean daily intake of fruit (0.43) and vegetable (1.73) portions was well below minimum dietary recommendations (fruits 2 portions/d; vegetables 3 portions/d). The total fruit and vegetable intake was 2.16 portions/d. Daily consumption of meat or alternatives was 1.75 portions, and the sum of meat and pulses was 2.78 portions/d. Starchy foods were consumed by all participants, and over 88% met the minimum daily recommendations. Importantly, nearly 70% of adults exceeded the maximum daily recommendation for starch (11 portions/d), and a considerable proportion consumed larger numbers of starch servings daily, particularly men. More than 12% of men consumed over 25 starch servings/d. In contrast to their starch consumption, participants reported very low intakes of other food groups. Only 11.6%, 2.1% and 3.5% of adults consumed the minimum daily recommended servings of vegetables, fruits, and fruits and vegetables combined, respectively. Six out of ten adult Sri Lankans sampled did not consume any fruit.
Milk and dairy consumption was extremely low; over a third of the population did not consume any dairy products, and less than 1% of adults consumed 2 portions of dairy/d. A quarter of Sri Lankans did not report consumption of meat and pulses. Regarding protein consumption, 36.2% attained the minimum Sri Lankan recommendation for protein, and significantly more men than women achieved the recommendation of ≥3 servings of meat or alternatives daily (men 42.6%, women 32.8%; P<0.05). Over 70% of energy was derived from carbohydrates (male: 72.8±6.4%, female: 73.9±6.7%), followed by fat (male: 19.9±6.1%, female: 18.5±5.7%) and protein (male: 10.6±2.1%, female: 10.9±5.6%). The average intake of dietary fibre was 21.3 g/day and 16.3 g/day for males and females, respectively. There were significant differences in nutritional intake related to ethnicity, area of residence, education level and BMI category. Similarly, dietary diversity was significantly associated with several socio-economic parameters among Sri Lankan adults. Adults with BMI ≥25 kg/m² and abdominally obese Sri Lankan adults had the highest diet diversity values. Age-adjusted prevalence (95% confidence interval) of overweight, obesity and abdominal obesity among Sri Lankan adults was 17.1% (13.8-20.7), 28.8% (24.8-33.1) and 30.8% (26.8-35.2), respectively. Men, compared with women, were less overweight, 14.2% (9.4-20.5) versus 18.5% (14.4-23.3), P=0.03; less obese, 21.0% (14.9-27.7) versus 32.7% (27.6-38.2), P<0.05; and less abdominally obese, 11.9% (7.4-17.8) versus 40.6% (35.1-46.2), P<0.05. Although the prevalence of obesity has reached epidemic levels, body weight misperception was common among Sri Lankan adults. Two-thirds of overweight males and 44.7% of overweight females considered themselves to be "about right weight". Over one third of both male and female obese subjects perceived themselves as "about right weight" or "underweight".
Nearly 32% of centrally obese men and women perceived that their waist circumference was about right. Of those who perceived themselves as overweight or very overweight (n = 154), only 63.6% (n = 98) had tried to lose weight, and a quarter (n = 39) had sought advice from professionals. A number of important conclusions can be drawn from this research project. Firstly, the newly developed FFQ is an acceptable tool for assessing the nutrient intake of Sri Lankans and will assist proper categorization of individuals by dietary exposure. Secondly, a substantial proportion of the Sri Lankan population does not consume a varied and balanced diet, which is suggestive of a close association between the nutrition-related NCDs in the country and unhealthy eating habits. Moreover, dietary diversity is positively associated with several socio-demographic characteristics and obesity among Sri Lankan adults. Lastly, although obesity is a major health issue among Sri Lankan adults, body weight misperception was common among underweight, healthy weight, overweight and obese adults in Sri Lanka. Over two-thirds of overweight and one-third of obese Sri Lankan adults believed that they were in the "right weight" or "underweight" categories.
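The anthropometric categories used throughout the abstract follow the Asia-Pacific cut-offs stated in the text (overweight BMI ≥23 kg/m², obesity BMI ≥25 kg/m², abdominal obesity WC ≥90 cm for men and ≥80 cm for women). A small sketch of that classification; the function name and return shape are illustrative choices, not the study's code:

```python
# Asia-Pacific anthropometric cut-offs as stated in the abstract.
# Function name and return shape are mine, for illustration only.

def classify(bmi, waist_cm, male):
    """Return (overweight, obese, abdominally_obese) flags."""
    overweight = bmi >= 23.0                    # BMI >= 23 kg/m2
    obese = bmi >= 25.0                         # BMI >= 25 kg/m2
    abdominally_obese = waist_cm >= (90.0 if male else 80.0)
    return overweight, obese, abdominally_obese

print(classify(bmi=24.1, waist_cm=92.0, male=True))    # (True, False, True)
print(classify(bmi=22.0, waist_cm=81.0, male=False))   # (False, False, True)
```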

Relevância:

80.00% 80.00%

Publicador:

Resumo:

Background. In isotropic materials, the speed of acoustic wave propagation is governed by the bulk modulus and density. Tendon, however, is a structural composite of fluid and collagen and exhibits some anisotropy, requiring an adjustment for Poisson's ratio. This paper explores these relationships using data collected, in vivo, on the human Achilles tendon, and then compares estimates of elastic modulus and hysteresis against published values from in vitro mechanical tests. Methods. Conventional B-mode ultrasound imaging, inverse dynamics and acoustic transmission techniques were used to determine dimensions, loading conditions and longitudinal speed of sound in the Achilles tendon during a series of isometric plantar flexion exercises against body weight. Upper and lower bounds for speed of sound versus tensile stress in the tendon were then modelled, and estimates of the elastic modulus and hysteresis of the Achilles tendon derived. Results. Axial speed of sound varied between 1850 and 2090 m/s, with a non-linear, asymptotic dependency on the level of tensile stress (5-35 MPa) in the tendon. Estimates derived for the elastic modulus of the Achilles tendon ranged from 1 to 2 GPa. Hysteresis derived from models of the stress-strain relationship ranged from 3 to 11%. Discussion. Estimates of elastic modulus agree closely with those previously reported from direct measurements obtained via mechanical tensile tests on major weight-bearing tendons in vitro [1,2]. Hysteresis derived from models of the stress-strain relationship is consistent with direct measures from various mammalian tendons (7-10%) but is lower than previous estimates in human tendon (17-26%) [3]. This non-invasive method would appear suitable for monitoring changes in tendon properties during dynamic sporting activities.
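The acoustic relation underlying the method is c = √(M/ρ), so an unadjusted axial modulus follows from M = ρc². A hedged numerical sketch, assuming a typical literature value for tendon density; note this first-order figure exceeds the 1-2 GPa estimate above, which incorporates the Poisson's-ratio adjustment the authors describe:

```python
# First-order sketch of c = sqrt(M/rho), i.e. M = rho * c**2.
# Density is an assumed literature value for tendon, not from this study;
# the Poisson's-ratio adjustment described in the abstract would bring the
# estimate down toward the reported 1-2 GPa range.
RHO_TENDON = 1115.0          # kg/m^3, assumed typical tendon density

def modulus_from_speed(c_ms):
    """Unadjusted axial modulus (Pa) from longitudinal sound speed (m/s)."""
    return RHO_TENDON * c_ms ** 2

for c in (1850.0, 2090.0):   # measured speed range reported above
    print(f"c = {c:.0f} m/s -> M = {modulus_from_speed(c) / 1e9:.1f} GPa")
```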

Relevância:

80.00% 80.00%

Publicador:

Resumo:

Background. This study evaluated the time course of recovery of transverse strain in the Achilles and patellar tendons following a bout of resistance exercise. Methods. Seventeen healthy adults underwent sonographic examination of the right patellar (n = 9) or Achilles (n = 8) tendons immediately prior to and following 90 repetitions of weight-bearing exercise. Quadriceps and gastrocnemius exercise were performed against an effective resistance of 175% and 250% body weight, respectively. Sagittal tendon thickness was determined 20 mm from the tendon enthesis and transverse strain was repeatedly monitored over a 24 hour recovery period. Results. Resistance exercise resulted in an immediate decrease in Achilles (t7 = 10.6, P<.01) and patellar (t8 = 8.9, P<.01) tendon thickness, resulting in an average transverse strain of 0.14 ± 0.04 and 0.18 ± 0.05. While the average strain was not significantly different between tendons, older age was associated with a reduced transverse strain response (r=0.63, P<.01). Recovery of transverse strain, in contrast, was prolonged compared with the duration of loading and exponential in nature. The mean primary recovery time was not significantly different between Achilles (6.5 ± 3.2 hours) and patellar (7.1 ± 3.2 hours) tendons and body weight accounted for 62% and 64% of the variation in recovery time, respectively. Discussion. Despite structural and biochemical differences between the Achilles and patellar tendons [1], the mechanisms underlying transverse creep-recovery in vivo appear similar and are highly time dependent. Primary recovery required about 7 hours in healthy tendons, with full recovery requiring up to 24 hours. These in vivo recovery times are similar to those reported for axial creep recovery of the vertebral disc in vitro [2], and may be used clinically to guide physical activity to rest ratios in healthy adults.
Optimal ratios for high-stress tendons in clinical populations, however, remain unknown and require further attention in light of the knowledge gained in this study.
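The transverse strain reported above is the relative decrease in sagittal thickness, and its recovery is described as exponential in time. A sketch under stated assumptions: the thickness values and the time constant below are illustrative (the time constant is chosen so primary recovery lands near the ~7 hours reported, not fitted to the study's data):

```python
# Transverse strain and mono-exponential recovery, as described above.
# Thickness values and the recovery time constant are illustrative only.
import math

def transverse_strain(thickness_pre_mm, thickness_post_mm):
    """Compressive transverse strain from pre/post sagittal thickness."""
    return (thickness_pre_mm - thickness_post_mm) / thickness_pre_mm

def recovered_fraction(t_hours, tau_hours=3.5):
    """Fraction of transverse strain recovered at time t (mono-exponential)."""
    return 1.0 - math.exp(-t_hours / tau_hours)

strain = transverse_strain(5.0, 4.3)   # hypothetical thicknesses giving 0.14
print(f"strain = {strain:.2f}, recovered at 7 h = {recovered_fraction(7.0):.0%}")
```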