86 results for Mine and body


Relevance: 90.00%

Abstract:

The research field was intercultural theatre, specifically adapting indigenous performance forms for applied theatre purposes. The context was the rich performative traditions of Papua New Guinean cultures, which have remained largely untapped over several decades of "theatre for development" and "entertainment education". The Papua New Guinean company Raun Raun Theatre developed Folk Opera from a similar concept in African theatre in the 1970s. The form incorporates elements of song, dance, ritual, chant, metaphor, music, and body adornment from traditional cultures. The form's spectacular scope suited international touring in large theatrical venues, as well as the themes of emerging nationalism with which Raun Raun was concerned. The research team made three key innovations in the use of Folk Opera: adapting the form from theatres to community contexts, using the form to address issues of individual choice for health promotion, and emphasising experiential education over entertainment. Field-testing on Karkar Island showed community members gained clearer understandings of relevant health issues through participating in the folk opera form than through other educational approaches. The significance of the research was recognised by the members of the cross-cultural workshop team and the community of Karkar Island, including the local Member of Parliament. The success of the Folk Opera form as an approach to sexual health promotion was recognised through the provision of AUD$74,000 in funding by the National AIDS Council Secretariat of Papua New Guinea for a train-the-trainer program incorporating this innovative form of applied theatre. The research has been presented at a number of national and international conferences, including the 6th International Research in Drama Education conference in 2009.

Relevance: 90.00%

Abstract:

Background and significance: Older adults with chronic diseases are at increasing risk of hospital admission and readmission. Approximately 75% of adults have at least one chronic condition, and the odds of developing a chronic condition increase with age. Chronic diseases consume about 70% of the total Australian health expenditure, and about 59% of hospital events for chronic conditions are potentially preventable. These figures have brought to light the importance of the management of chronic disease among the growing older population. Many studies have endeavoured to develop effective chronic disease management programs by applying social cognitive theory. However, few studies have focused on chronic disease self-management in older adults at high risk of hospital readmission. Moreover, although the majority of studies have covered wide and valuable outcome measures, there is scant evidence examining fundamental health outcomes such as nutritional status, functional status and health-related quality of life. Aim: The aim of this research was to test social cognitive theory in relation to self-efficacy in managing chronic disease and three health outcomes, namely nutritional status, functional status, and health-related quality of life, in older adults at high risk of hospital readmission. Methods: A cross-sectional study design was employed for this research. Three studies were undertaken. Study One examined nutritional status and the validation of a nutritional screening tool; Study Two explored the relationships between participants' characteristics, self-efficacy beliefs, and health outcomes based on the study's hypothesized model; Study Three tested a theoretical model based on social cognitive theory, examining the potential mediating effects of social support and self-efficacy beliefs.
One hundred and fifty-seven patients aged 65 years and older with a medical admission and at least one risk factor for readmission were recruited. Data were collected from medical records on demographics and medical history, and from self-report questionnaires. The nutrition data were collected by two registered nurses. For Study One, a contingency table and the kappa statistic were used to determine the validity of the Malnutrition Screening Tool. In Study Two, standard multiple regression, hierarchical multiple regression and logistic regression were undertaken to determine the significant predictors for the three health outcome measures. For Study Three, a structural equation modelling approach was taken to test the hypothesized self-efficacy model. Results: The findings of Study One suggested that malnutrition continues to be a concern in older adults, with a prevalence of 20.6% according to the Subjective Global Assessment. Additionally, the findings confirmed that the Malnutrition Screening Tool is a valid nutritional screening tool for hospitalized older adults at risk of readmission when compared to the Subjective Global Assessment, with high sensitivity (94%) and specificity (89%) and substantial agreement between the two methods (k = .74, p < .001; 95% CI .62-.86). Data analysis for Study Two found that depressive symptoms and perceived social support were the two strongest predictors of self-efficacy in managing chronic disease in a hierarchical multiple regression. Results of multivariable regression models suggested advancing age, depressive symptoms and less tangible support were three important predictors of malnutrition. In terms of functional status, a standard regression model found that social support was the strongest predictor of the Instrumental Activities of Daily Living, followed by self-efficacy in managing chronic disease.
The results of standard multiple regression revealed that the number of hospital readmission risk factors adversely affected the physical component score, while depressive symptoms and self-efficacy beliefs were two significant predictors of the mental component score. In Study Three, structural equation modelling found that self-efficacy partially mediated the effect of health characteristics and depression on health-related quality of life. Health characteristics had strong direct effects on functional status and body mass index. The results also indicated that social support partially mediated the relationship between health characteristics and functional status. With regard to the joint effects of social support and self-efficacy, social support fully mediated the effect of health characteristics on self-efficacy, and self-efficacy partially mediated the effect of social support on functional status and health-related quality of life. The results also demonstrated that the models fitted the data well, with relatively high variance explained by the models, implying the hypothesized constructs under discussion were highly relevant; the application of social cognitive theory in this context was therefore supported. Conclusion: This thesis highlights the applicability of social cognitive theory to chronic disease self-management in older adults at risk of hospital readmission. Further studies are recommended to validate and extend the application of social cognitive theory to chronic disease self-management in older adults, to improve their nutritional status, functional status, and health-related quality of life.
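The screening-tool validation statistics reported in Study One (sensitivity, specificity and Cohen's kappa against the Subjective Global Assessment) can be sketched from a 2×2 contingency table. The cell counts below are hypothetical, chosen only to roughly reproduce the reported figures:

```python
def screening_validity(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa for a screening
    tool versus a reference standard (2x2 contingency table)."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)   # true positives / all reference-positive
    specificity = tn / (tn + fp)   # true negatives / all reference-negative
    p_observed = (tp + tn) / n     # observed agreement
    # expected agreement by chance, from the marginal totals
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, specificity, kappa

# Hypothetical counts (n = 157) approximating the reported values:
sens, spec, kappa = screening_validity(tp=30, fp=14, fn=2, tn=111)
```

With these assumed counts the sketch yields a sensitivity near 94%, specificity near 89% and kappa near 0.72, in the region of the figures quoted above.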

Relevance: 90.00%

Abstract:

The increasing prevalence of childhood obesity is a global health issue. Past studies in Japan have reported an increase in both body mass index (BMI) and risk of obesity among children and adolescents. However, changes in body size and proportion in this population over time have also influenced BMI. To date, no study of secular changes in childhood obesity has considered the impact of changes in morphological factors. The current study explored secular changes in BMI and childhood obesity risk among Japanese children from 1950 to 2000, with consideration of changes in body size and proportions, using the Statistical Report of the School Health Survey (SHS). The age of peak velocity (PV) occurred approximately two years earlier in both genders across this period. While the increments in height, sitting height and sub-ischial leg length relative to height levelled off by 1980, weight gain continued in boys. Between 1980 and 2000, the rates of upper body weight gain in boys and girls were 0.7–1.3 kg/decade and 0.2–1.0 kg/decade, respectively. After accounting for body proportions, increments in body weight were small. This suggests that the increments in weight and BMI across the 50-year period may be due to a combination of changes, including the tempo of growth and body size, driven by lifestyle factors.

Relevance: 90.00%

Abstract:

Objective: We investigated to what extent changes in metabolic rate and the composition of weight loss explained the less-than-expected weight loss in obese men and women during a diet-plus-exercise intervention. Design: Sixteen obese men and women (41 ± 9 years; BMI 39 ± 6 kg/m2) were investigated in energy balance before, after and twice during a 12-week VLED (565–650 kcal/day) plus exercise (aerobic plus resistance training) intervention. The relative energy deficit (EDef) from baseline requirements was severe (74–87%). Body composition was measured by deuterium dilution and DXA, and resting metabolic rate (RMR) by indirect calorimetry. Fat mass (FM) and fat-free mass (FFM) were converted into energy equivalents using the constants 9.45 kcal/gFM and 1.13 kcal/gFFM. Predicted weight loss was calculated from the energy deficit using the '7700 kcal/kg rule'. Results: Changes in weight (-18.6 ± 5.0 kg), FM (-15.5 ± 4.3 kg), and FFM (-3.1 ± 1.9 kg) did not differ between genders. Measured weight loss was on average 67% of the predicted value, but ranged from 39% to 94%. Relative EDef was correlated with the decrease in RMR (R=0.70, P<0.01), and the decrease in RMR correlated with the difference between actual and expected weight loss (R=0.51, P<0.01). Changes in metabolic rate explained on average 67% of the less-than-expected weight loss, and variability in the proportion of weight lost as FM accounted for a further 5%. On average, after adjustment for changes in metabolic rate and the body composition of weight lost, actual weight loss reached 90% of predicted values. Conclusion: Although weight loss was 33% lower than predicted at baseline from standard energy equivalents, the majority of this differential was explained by physiological variables. While lower-than-expected weight loss is often attributed to incomplete adherence to prescribed interventions, the influence of baseline calculation errors and metabolic down-regulation should not be discounted.
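The prediction arithmetic described above can be sketched by comparing the '7700 kcal/kg rule' with the composition-based energy equivalents (9.45 kcal/g FM, 1.13 kcal/g FFM). The total deficit below is hypothetical; only the constants and the mean FM/FFM losses come from the abstract:

```python
def predicted_loss_simple(total_deficit_kcal):
    """Predicted weight loss (kg) under the '7700 kcal/kg rule'."""
    return total_deficit_kcal / 7700.0

def energy_equivalent_of_loss(fm_loss_kg, ffm_loss_kg):
    """Energy (kcal) represented by measured FM and FFM losses,
    using the constants stated in the abstract."""
    return fm_loss_kg * 1000 * 9.45 + ffm_loss_kg * 1000 * 1.13

# Hypothetical cumulative 12-week deficit:
predicted_kg = predicted_loss_simple(180_000)
# Mean measured losses from the abstract (15.5 kg FM, 3.1 kg FFM):
stored_kcal = energy_equivalent_of_loss(15.5, 3.1)
```

Because fat-free mass carries far less energy per gram than fat mass, the same measured weight loss can represent a very different energy total than the flat 7700 kcal/kg rule predicts, which is one source of the gap discussed above.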

Relevance: 90.00%

Abstract:

Total hip arthroplasty (THA) has a proven clinical record for providing pain relief and return of function to patients with disabling arthritis. There are many successful options for femoral implant design and fixation. Cemented, polished, tapered femoral implants have been shown to have excellent results in national joint registries and long-term clinical series. These implants are usually 150mm long at their lateral aspect. Because of their length, these implants cannot always be offered to patients with variations in femoral anatomy. Polished, tapered implants as short as 95mm exist; however, their small proximal geometry (neck offset and body size) limits their use to smaller stature patients. There is a group of patients for whom a shorter implant with a maintained proximal body size would be advantageous. There are also potential benefits of a shorter implant in standard patient populations, such as reduced bone removal due to reduced reaming, favourable loading of the proximal femur, and the ability to revise into good proximal bone stock if required. These factors potentially make a shorter implant an option for all patient populations. The role of implant length in determining the stability of a cemented, polished, tapered femoral implant is not well defined by the literature. Before changes in implant design can be made, a better understanding of the role of each region in determining performance is required. The aim of this thesis was to describe how implant length affects the stability of a cemented, polished, tapered femoral implant. This has been determined through an extensive body of laboratory testing. The major findings are that, for a given proximal body size, a reduction in implant length has no effect on the torsional stability of a polished, tapered design, while a small reduction in axial stability should be expected.
These findings are important because the literature suggests that torsional stability is the major determinant of long-term clinical performance of a THA system. Furthermore, a polished, tapered design is known to be forgiving of cement-implant interface micromotion due to its favourable wear characteristics. Together these findings suggest that a shorter polished, tapered implant may be well tolerated. The effect of a change in implant length on the geometric characteristics of a polished, tapered design was also determined and applied to the mechanical testing. Importantly, interface area does play a role in the stability of the system; however, it is the distribution of the interface, not the magnitude of the area, that defines stability. Taper angle (at least in the range of angles seen in this work) was shown not to be a determinant of axial or torsional stability. A range of implants was tested, comparing variations in length, neck offset and indication (primary versus cement-in-cement revision). At their manufactured length, the 125mm implants were similar to their longer 150mm counterparts, suggesting that they may be similarly well tolerated in the clinical environment. However, the slimmer cement-in-cement revision implant showed poorer mechanical performance, suggesting its use in higher-demand patients may be hazardous. An implant length of 125mm has been shown to be quite stable, and the results suggest that a further reduction to 100mm may be tolerated; however, further work is required. A shorter implant with maintained proximal body size would be useful for the group of patients who are unable to access the current standard length implants because of variations in femoral anatomy. Extending the findings further, the similar function and potential benefits of a shorter implant make its application to all patients appealing.

Relevance: 90.00%

Abstract:

Introduction Critical care patients frequently receive blood transfusions. Some reports show an association between aged or stored blood and increased morbidity and mortality, including the development of transfusion-related acute lung injury (TRALI). However, the existence of conflicting data endorses the need for research to either reject this association, or to confirm it and elucidate the underlying mechanisms. Methods Twenty-eight sheep were randomised into two groups, receiving saline or lipopolysaccharide (LPS). Sheep were further randomised to also receive transfusion of pooled and heat-inactivated supernatant from fresh (Day 1) or stored (Day 42) non-leucoreduced human packed red blood cells (PRBC), or an infusion of saline. TRALI was defined by hypoxaemia during or within two hours of transfusion and histological evidence of pulmonary oedema. Regression modelling compared physiology between groups and with a previous study that used stored platelet concentrates (PLT). Samples of the transfused blood products also underwent cytokine array and biochemical analyses, and their neutrophil priming ability was measured in vitro. Results TRALI did not develop in sheep that first received the saline infusion. In contrast, 80% of sheep that first received the LPS infusion developed TRALI following transfusion with "stored PRBC." The decreased mean arterial pressure and cardiac output, as well as increased central venous pressure and body temperature, were more severe for TRALI induced by "stored PRBC" than by "stored PLT." Storage-related accumulation of several factors was demonstrated in both "stored PRBC" and "stored PLT", and was associated with increased in vitro neutrophil priming. Concentrations of several factors were higher in the "stored PRBC" than in the "stored PLT"; however, there was no difference in neutrophil priming in vitro. Conclusions In this in vivo ovine model, both recipient and blood product factors contributed to the development of TRALI.
Sick (LPS-infused) sheep, rather than healthy (saline-infused) sheep, predominantly developed TRALI when transfused with supernatant from stored but not fresh PRBC. "Stored PRBC" induced a more severe injury than "stored PLT" and had a different storage lesion profile, suggesting that these outcomes may be associated with storage lesion factors unique to each blood product type. Therefore, the transfusion of fresh rather than stored PRBC may minimise the risk of TRALI.

Relevance: 90.00%

Abstract:

The aim of this paper was to investigate the association between appetite and kidney-disease-specific quality of life in maintenance hemodialysis patients. Quality of life (QoL) was measured using the Kidney Disease Quality Of Life survey. Appetite was measured using self-reported categories and a visual analog scale. Other nutritional parameters included the Patient-Generated Subjective Global Assessment (PGSGA), dietary intake, body mass index and the biochemical markers C-Reactive Protein and albumin. Even in this well-nourished sample (n=62) of hemodialysis patients, PGSGA score (r=-0.629), subjective hunger sensations (r=0.420) and body mass index (r=-0.409) were all significantly associated with the Physical Health domain of QoL. As self-reported appetite declined, QoL was significantly lower in nine domains, mostly within the SF36 component, covering social functioning and physical domains. Appetite and other nutritional parameters were not as strongly associated with the Mental Health domain and the Kidney Disease Component Summary domains. Nutritional parameters, especially PGSGA score and appetite, appear to be important components of the physical health domain of QoL. As even small reductions in nutritional status were associated with significantly lower QoL scores, monitoring appetite and nutritional status is an important component of care for hemodialysis patients.
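The associations above are reported as correlation coefficients (r). A minimal sketch of the Pearson product-moment correlation behind such values, assuming entirely hypothetical PGSGA and QoL data:

```python
import statistics

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical data: higher PGSGA score (worse nutrition) paired
# with lower Physical Health domain scores gives a negative r,
# consistent in direction with the r = -0.629 reported above.
pgsga = [2, 4, 5, 7, 9]
physical_health = [80, 70, 65, 55, 40]
r = pearson_r(pgsga, physical_health)
```
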

Relevance: 90.00%

Abstract:

Although there is a paucity of scientific support for the benefits of warm-up, athletes commonly warm up prior to activity with the intention of improving performance and reducing the incidence of injuries. The purpose of this study was to examine the role of warm-up intensity on both range of motion (ROM) and anaerobic performance. Nine males (age = 21.7 ± 1.6 years, height = 1.77 ± 0.04 m, weight = 80.2 ± 6.8 kg, and VO2max = 60.4 ± 5.4 ml/kg/min) completed four trials. Each trial consisted of hip, knee, and ankle ROM evaluation using an electronic inclinometer and an anaerobic capacity test on the treadmill (time to fatigue at 13 km/hr and 20% grade). Subjects underwent no warm-up, or a 15-minute warm-up of running at 60, 70 or 80% VO2max followed by a series of lower limb stretches. Intensity of warm-up had little effect on ROM: ankle dorsiflexion and hip extension significantly increased in all warm-up conditions, hip flexion significantly increased only after the 80% VO2max warm-up, and knee flexion did not change after any warm-up. Heart rate and body temperature were significantly increased (p < 0.05) prior to anaerobic performance for each of the warm-up conditions, but anaerobic performance improved significantly only after warm-up at 60% VO2max (10%) and 70% VO2max (13%). A 15-minute warm-up at an intensity of 60-70% VO2max is therefore recommended to improve ROM and enhance subsequent anaerobic performance.

Relevance: 90.00%

Abstract:

Purpose: Type 2 diabetes is a leading cause of morbidity and mortality in midlife and older Australian women, with known modifiable risk factors including smoking, nutrition, physical activity and obesity. In Australia, little research has been done to investigate the perceived barriers to healthy lifestyle activities in midlife and older women with type 2 diabetes. The primary aim of this study was to explore the level and type of perceived barriers to health promotion activities. The secondary aim was to explore the relationship of perceived barriers to smoking behaviour, fruit and vegetable intake, physical activity, and body mass index. Methods: The study was a cross-sectional survey of women aged over 45 with type 2 diabetes attending metropolitan community health clinics (N = 41). Data were collected from a self-report questionnaire and analysed using descriptive and inferential statistics. Results: Women in the study had average total barriers scores similar to those reported in the literature for women with a range of physical disabilities and illnesses. The leading barriers for this group of women were: lack of interest, concern about safety, being too tired, lack of money, and feeling that what they do does not help. There was no association between total barriers scores and body mass index, physical activity, fruit and vegetable intake or socio-demographic variables. Conclusion: This study contributes to understanding the perceptions of midlife and older women with type 2 diabetes about the level and type of barriers to healthy lifestyle activities that they experience. Evidence from this study can be applied to inform health promotion for lifestyle risk factor reduction in women with type 2 diabetes.

Relevance: 90.00%

Abstract:

In Bazley v Wesley Monash IVF Pty Ltd [2010] QSC 118 an order was made under r 250 of the Uniform Civil Procedure Rules 1999 (Qld) (“UCPR”) requiring the respondent to continue to hold and maintain straws of semen belonging to the applicant’s deceased husband. The decision includes a useful analysis of the development of the common law regarding property rights in human bodies and body parts.

Relevance: 90.00%

Abstract:

Diet-Induced Thermogenesis (DIT) is the energy expended consequent to meal consumption, reflecting the energy required for the processing and digestion of food consumed throughout each day. Although DIT is the total energy expended across a day in digestive processes in response to a number of meals, most studies measure thermogenesis in response to a single meal (Meal-Induced Thermogenesis: MIT) as a representation of an individual's thermogenic response to acute food ingestion. As a component of energy expenditure, DIT may have a contributing role in weight gain and weight loss. While the evidence is inconsistent, research has tended to reveal a suppressed MIT response in obese compared to lean individuals, identifying individuals with efficient storage of food energy and hence a greater tendency for weight gain. Appetite is another factor regulating body weight through its influence on energy intake. Preliminary research has shown a potential link between MIT and postprandial appetite, as both are responses to food ingestion and depend similarly on the macronutrient content of food. There is growing interest in understanding how both MIT and appetite are modified with changes in diet, activity levels and body size. However, the findings from MIT research have been highly inconsistent, potentially due to the vastly divergent protocols used for its measurement. Therefore, the main theme of this thesis was, firstly, to address some of the methodological issues associated with measuring MIT. Additionally, this thesis aimed to measure postprandial appetite simultaneously with MIT to test for any relationships between these meal-induced variables, and to assess changes that occur in MIT and postprandial appetite during periods of energy restriction (ER) and following weight loss. Two separate studies were conducted to achieve these aims.
Based on the increasing prevalence of obesity, it is important to develop accurate methodologies for measuring the components potentially contributing to its development and to understand the variability within these variables. Therefore, the aim of Study One was to establish a protocol for measuring the thermogenic response to a single test meal (MIT), as a representation of DIT across a day. This was done by determining the reproducibility of MIT with a continuous measurement protocol and by determining the effect of measurement duration. The benefit of a fixed resting metabolic rate (RMR), a single measure of RMR used to calculate each subsequent measure of MIT, compared with separate baseline RMRs, each measured immediately prior to an MIT test meal, was also assessed to determine which method gave greater reproducibility. Subsidiary aims were to measure postprandial appetite simultaneously with MIT, to determine its reproducibility between days, and to assess potential relationships between these two variables. Ten healthy individuals (5 males, 5 females, age = 30.2 ± 7.6 years, BMI = 22.3 ± 1.9 kg/m2, %Fat Mass = 27.6 ± 5.9%) undertook three testing sessions within a 1-4 week period. During the first visit, participants had their body composition measured using DXA for descriptive purposes, then had an initial 30-minute measure of RMR, both to familiarise them with the testing and to be used as a fixed baseline for calculating MIT. During the second and third testing sessions, MIT was measured. Measures of RMR and MIT were undertaken using a metabolic cart with a ventilated hood to measure energy expenditure via indirect calorimetry, with participants in a semi-reclined position.
The procedure on each MIT test day was: 1) a baseline RMR measured for 30 minutes; 2) a 15-minute break in the measure to consume a standard 576 kcal breakfast (54.3% CHO, 14.3% PRO, 31.4% FAT) comprising muesli, milk, toast, butter, jam and juice; and 3) six hours of measuring MIT, with two ten-minute breaks at 3 and 4.5 hours for participants to visit the bathroom. On the MIT test days, pre- and post-breakfast and then at 45-minute intervals, participants rated their subjective appetite, alertness and comfort on visual analogue scales (VAS). Prior to each test, participants were required to have fasted for 12 hours and to have undertaken no high intensity physical activity for the previous 48 hours. Despite no significant group changes in the MIT response between days, individual variability was high, with an average between-day CV of 33%, which was not significantly improved (to 31%) by the use of a fixed RMR. The 95% limits of agreement, which ranged from 9.9% of energy intake (%EI) to -10.7%EI with the baseline RMRs and from 9.6%EI to -12.4%EI with the fixed RMR, indicated very large changes relative to the size of the average MIT response (MIT 1: 8.4%EI, 13.3%EI; MIT 2: 8.8%EI, 14.7%EI; baseline and fixed RMRs respectively). After just three hours, the between-day CV with the baseline RMR was 26%, suggesting enhanced MIT reproducibility with shorter measurement durations. On average, 76, 89, and 96% of the six-hour MIT response was completed within three, four and five hours, respectively. Strong correlations were found between MIT at each of these time points and the total six-hour MIT (r = 0.990 to 0.998; P < 0.01). The proportion of the six-hour MIT completed at 3, 4 and 5 hours was reproducible (between-day CVs ≤ 8.5%). This indicated that shorter durations could be used on repeated occasions, with a similar percentage of the total response expected to be completed.
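The reproducibility statistics used above (the between-day coefficient of variation and Bland-Altman 95% limits of agreement for two repeated measurements per subject) can be sketched as follows; the paired MIT values are hypothetical:

```python
import statistics

def between_day_cv(day1, day2):
    """Mean within-subject CV (%) across subjects for two repeated
    measurements of the same quantity (e.g. MIT in %EI)."""
    cvs = []
    for a, b in zip(day1, day2):
        pair_mean = (a + b) / 2
        pair_sd = statistics.stdev([a, b])   # SD of the two values
        cvs.append(100 * pair_sd / pair_mean)
    return statistics.mean(cvs)

def limits_of_agreement(day1, day2):
    """Bland-Altman 95% limits of agreement for day2 - day1."""
    diffs = [b - a for a, b in zip(day1, day2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical MIT values (%EI) for four participants on two days:
day1 = [8.0, 9.0, 7.5, 10.0]
day2 = [9.0, 7.0, 8.5, 9.5]
cv = between_day_cv(day1, day2)
lo, hi = limits_of_agreement(day1, day2)
```

Identical values on both days give a CV of zero and limits of agreement at zero; the wider the limits relative to the mean response, the worse the between-day reproducibility, which is the pattern the study reports.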
There was a lack of strong evidence of any relationship between the magnitude of the MIT response and subjective postprandial appetite. Given that a six-hour protocol places a considerable burden on participants, these results suggest that a post-meal measurement period of only three hours is sufficient to produce valid information on the metabolic response to a meal. However, while there was no mean change in MIT between test days, individual variability was large. Further research is required to better understand which factors best explain the between-day variability in this physiological measure. With such a high prevalence of obesity, dieting has become a necessity to reduce body weight. However, during periods of ER, metabolic and appetite adaptations can occur which may impede weight loss. Understanding how metabolic and appetite factors change during ER and weight loss is important for designing optimal weight loss protocols. The purpose of Study Two was to measure the changes in the MIT response and subjective postprandial appetite during either continuous (CONT) or intermittent (INT) ER and following post-diet energy balance (post-diet EB). Thirty-six obese male participants were randomly assigned to either the CONT (age = 38.6 ± 7.0 years, weight = 109.8 ± 9.2 kg, % fat mass = 38.2 ± 5.2%) or INT diet group (age = 39.1 ± 9.1 years, weight = 107.1 ± 12.5 kg, % fat mass = 39.6 ± 6.8%). The study was divided into three phases: a four-week baseline (BL) phase, where participants were provided with a diet to maintain body weight; an ER phase lasting either 16 (CONT) or 30 (INT) weeks, where participants were provided with a diet supplying 67% of their energy balance requirements to induce weight loss; and an eight-week post-diet EB phase, providing a diet to maintain body weight after weight loss. The INT ER phase was delivered as eight two-week blocks of ER interspersed with two-week blocks designed to achieve weight maintenance.
Energy requirements for each phase were predicted based on measured RMR, and adjusted throughout the study to account for changes in RMR. All participants completed MIT and appetite tests during the BL and ER phases. Nine CONT and 15 INT participants completed the post-diet EB MIT tests, and 15 CONT and 14 INT participants completed the post-diet EB appetite tests. The MIT test day protocol was as follows: 1) a baseline RMR measured for 30 minutes; 2) a 15-minute break in the measure to consume a standard breakfast meal (874 kcal, 53.3% CHO, 14.5% PRO, 32.2% FAT); and 3) three hours of measuring MIT. MIT was calculated as the energy expenditure above the pre-meal RMR. Appetite test days were undertaken on a separate day using the same 576 kcal breakfast used in Study One. VAS were used to assess appetite pre- and post-breakfast, at one hour post-breakfast, and then a further three times at 45-minute intervals. Appetite ratings were calculated for hunger and fullness as both the intra-meal change in appetite and the area under the curve (AUC). The three-hour MIT responses at BL, ER and post-diet EB respectively were 5.4 ± 1.4%EI, 5.1 ± 1.3%EI and 5.0 ± 0.8%EI for the CONT group and 4.4 ± 1.0%EI, 4.7 ± 1.0%EI and 4.8 ± 0.8%EI for the INT group. Compared to BL, neither group had significant changes in their MIT response during ER or post-diet EB. There were no significant time-by-group interactions (p = 0.17), indicating a similar response to ER and post-diet EB in both groups. Contrary to what was hypothesised, there was a significant increase in postprandial AUC fullness in response to ER in both groups (p < 0.05). However, there were no significant changes in any of the other postprandial hunger or fullness variables. Despite no changes in MIT in either the CONT or INT group in response to ER or post-diet EB, and only a minor increase in postprandial AUC fullness, the individual changes in MIT and postprandial appetite in response to ER were large.
However, those with the greatest MIT changes did not have the greatest changes in postprandial appetite. This study shows that postprandial appetite and MIT are unlikely to be altered during ER and are unlikely to hinder weight loss. Additionally, there were no changes in MIT in response to weight loss, indicating that body weight did not influence the magnitude of the MIT response. There were large individual changes in both variables; however, further research is required to determine whether these changes were real compensatory changes to ER or simply between-day variation. Overall, the results of this thesis add to the current literature by showing the large variability of continuous MIT measurements, which makes it difficult to compare MIT between groups and in response to diet interventions. This thesis provides evidence that shorter measures may provide equally valid information about the total MIT response and can therefore be utilised in future research to reduce the burden of long measurement durations. This thesis indicates that MIT and postprandial subjective appetite are most likely independent of each other. It also shows that, on average, energy restriction was not associated with compensatory changes in MIT and postprandial appetite that would have impeded weight loss. However, the large inter-individual variability supports the need to examine individual responses in more detail.

Background The largest proportion of cancer patients are aged 65 years and over. Increasing age is also associated with nutritional risk and multi-morbidities, factors which complicate the cancer treatment decision-making process in older patients. Objectives To determine whether malnutrition risk and Body Mass Index (BMI) are associated with key oncogeriatric variables as potential predictors of chemotherapy outcomes in geriatric oncology patients with solid tumours. Methods In this longitudinal study, geriatric oncology patients (aged ≥65 years) received a Comprehensive Geriatric Assessment (CGA) for baseline data collection prior to the commencement of chemotherapy treatment. Malnutrition risk was assessed using the Malnutrition Screening Tool (MST), and BMI was calculated from anthropometric data. Nutritional risk was compared with other variables collected as part of the standard CGA. Associations were determined by chi-square tests and correlations. Results Over half (53.1%) of the 175 geriatric oncology patients were at risk of malnutrition according to the MST. BMI ranged from 15.5 to 50.9 kg/m2, with 35.4% of the cohort overweight when compared to geriatric cutoffs. Malnutrition risk was more prevalent in those who were underweight (70%), although many overweight participants also presented as at risk (34%). Malnutrition risk was associated with a diagnosis of colorectal or lung cancer (p=0.001), dependence in activities of daily living (p=0.015) and impaired cognition (p=0.049). Malnutrition risk was positively associated with vulnerability to intensive cancer therapy (rho=0.16, p=0.038). Larger BMI was associated with a greater number of multi-morbidities (rho=0.27, p=0.001). Conclusions Malnutrition risk is prevalent among geriatric patients undergoing chemotherapy, is more common in colorectal and lung cancer diagnoses, is associated with impaired functionality and cognition, and negatively influences the ability to complete planned intensive chemotherapy.
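The associations above were tested with chi-square tests. A minimal sketch of the Pearson chi-square statistic for a 2x2 table (risk status vs a binary CGA variable) is below; the cell counts are hypothetical, not the study's data.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction:
    chi2 = n(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical table: rows = at risk of malnutrition (yes/no),
# columns = colorectal/lung cancer vs other diagnosis.
stat = chi_square_2x2(40, 53, 20, 62)
print(round(stat, 2))
# With 1 degree of freedom, a statistic above about 3.84 corresponds
# to p < 0.05.
```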

Background: Malnutrition before and during chemotherapy is associated with poor treatment outcomes. The risk of cancer-related malnutrition is exacerbated by common nutrition impact symptoms during chemotherapy, such as nausea, diarrhoea and mucositis. Aim of presentation: To describe the prevalence of malnutrition/malnutrition risk in two samples of patients treated in a quaternary-level chemotherapy unit. Research design: Cross-sectional survey. Sample 1: Patients ≥65 years prior to chemotherapy treatment (n=175). Instrument: Nurse-administered Malnutrition Screening Tool to screen for malnutrition risk, plus body mass index (BMI). Sample 2: Patients ≥18 years receiving chemotherapy (n=121). Instrument: Dietitian-administered Patient-Generated Subjective Global Assessment to assess malnutrition, malnutrition risk and BMI. Findings, Sample 1: 93/175 (53%) of older patients were at risk of malnutrition prior to chemotherapy; 27 (15%) were underweight (BMI <21.9) and 84 (48%) were overweight (BMI >27). Findings, Sample 2: 31/121 patients (26%) were malnourished; 12 (10%) had intake-limiting nausea or vomiting; 22 (20%) reported significant weight loss; and 20 (18%) required improved nutritional symptom management during treatment. Thirteen participants with malnutrition/nutrition impact symptoms (35%) had no dietitian contact; the majority of these participants were overweight. Implications for nursing: Patients with, or at risk of, malnutrition before and during chemotherapy can be overlooked, particularly if they are overweight. Older patients seem particularly at risk. Nurses can easily and quickly identify risk with the regular use of the Malnutrition Screening Tool, and refer patients to expert dietetic support, to ensure optimal treatment outcomes.
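The screening logic described above can be sketched as follows: BMI from anthropometric data, classified against the geriatric cutoffs quoted in the findings (<21.9 underweight, >27 overweight), plus an MST risk flag. Treating an MST score of 2 or more as "at risk" is the tool's usual convention, but it is an assumption here since the abstract does not state the threshold; the function names and example values are my own.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def geriatric_bmi_category(bmi_value):
    """Classify BMI against the geriatric cutoffs quoted in the findings."""
    if bmi_value < 21.9:
        return "underweight"
    if bmi_value > 27:
        return "overweight"
    return "healthy range"

def mst_at_risk(mst_score):
    """MST score >= 2 flagged as at risk of malnutrition
    (conventional threshold; assumed, not stated in the abstract)."""
    return mst_score >= 2

# Hypothetical patient: 50 kg, 1.60 m -> BMI about 19.5 kg/m2.
print(geriatric_bmi_category(bmi(50.0, 1.60)))
print(mst_at_risk(3))
```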

Background Type 2 diabetes is a leading cause of morbidity and mortality in midlife and older Australian women, with known modifiable risk factors for type 2 diabetes including smoking, nutrition, physical activity and obesity. In Australia, little research has been done to investigate the perceived barriers to healthy lifestyle activities in midlife and older women with type 2 diabetes. Aims The primary aim of this study was to explore the level and type of perceived barriers to health promotion activities. The secondary aim was to explore the relationship of perceived barriers to smoking behaviour, fruit and vegetable intake, physical activity, and body mass index. Methods The study was a cross-sectional survey of women aged over 45 with type 2 diabetes, recruited from four metropolitan community health clinics (n = 41). Data were collected from self-report questionnaires and analysed using quantitative methods. Results Women in the study had average total barriers scores similar to those reported in the literature for women with a range of physical disabilities and illnesses. The leading barriers for this group of women were: lack of interest, concern about safety, being too tired, lack of money, and feeling that what they do does not help. There was no association between total barriers scores and body mass index, physical activity, fruit and vegetable intake or socio-demographic variables. Conclusion This study contributes to understanding the perceptions of midlife and older women with type 2 diabetes about the level and type of barriers to healthy lifestyle activities that they experience. The participants reported a high level of perceived barriers, with a range of personal, social and environmental issues identified and described. This study suggests that health promotion education and interventions for risk factor reduction in women with type 2 diabetes may be enhanced by explicitly addressing perceived barriers to healthy lifestyle activities.

It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished and evaluate the effectiveness of an individualised nutrition intervention on nutritional status. Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery The pre-operative rate of malnutrition in a convenience sample of people with Parkinson's disease (PWP) scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, body mass index (BMI)) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B) while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m2, p<.05). 
There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented. Phase I: Nutritional status in community-dwelling adults with Parkinson's disease The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, body mass index (BMI)) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m2; MAC 29.1 vs 25.5cm; waist circumference 95.5 vs 82.5cm; calf circumference 36.5 vs 32.5cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated between well-nourished and malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk. 
Phase I: Nutrition assessment tools in people with Parkinson's disease There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools had been evaluated in PWP. In the sample described above, the use of the World Health Organisation (WHO) cut-off (≤18.5kg/m2), age-specific BMI cut-offs (≤18.5kg/m2 for under 65 years, ≤23.5kg/m2 for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA is recommended primarily as a nutrition screening tool, the MNA-SF might be more appropriate and take less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions. Phase I: Predictors of poor nutritional status in people with Parkinson's disease A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B). 
Differences between the groups included disease severity (more severe disease in 4% of SGA-A vs 21% of SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support. Phase II: Nutrition intervention Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. The INT group gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake (18.3kJ/kg vs 3.8kJ/kg and 0.3g/kg vs 0.15g/kg) increased in both groups. The increase in protein intake was only significant in the SC group. The changes in intake did not differ between the groups. 
There were no significant changes in any motor or non-motor symptoms or in "off" times or dyskinesias in either group. Aspects of quality of life improved over the 12 weeks as well, especially emotional well-being. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high protein, high energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
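The screening-tool evaluation in Phase I rests on sensitivity and specificity against the SGA classification as the gold standard. A minimal sketch of that calculation is below; the labels, scores and the helper name are hypothetical, with the PG-SGA cut-off of 4 taken from the text.

```python
def sensitivity_specificity(predicted_at_risk, gold_malnourished):
    """Sensitivity and specificity of a screening result against a
    gold-standard classification (here, SGA malnourished yes/no)."""
    tp = fp = tn = fn = 0
    for pred, gold in zip(predicted_at_risk, gold_malnourished):
        if gold and pred:
            tp += 1          # malnourished, correctly flagged
        elif gold and not pred:
            fn += 1          # malnourished, missed by the tool
        elif not gold and pred:
            fp += 1          # well-nourished, wrongly flagged
        else:
            tn += 1          # well-nourished, correctly not flagged
    sn = tp / (tp + fn)      # proportion of truly malnourished flagged
    sp = tn / (tn + fp)      # proportion of well-nourished not flagged
    return sn, sp

# Hypothetical PG-SGA scores with the cut-off of 4 flagging risk:
scores = [2, 5, 7, 1, 4, 4, 9, 2]
gold = [False, True, True, False, True, False, True, False]
pred = [s >= 4 for s in scores]
sn, sp = sensitivity_specificity(pred, gold)
print(sn, sp)
```

Lowering a cut-off raises sensitivity at the cost of specificity, which is why the PG-SGA cut-off of 4 reached Sn 100% with Sp 69.8% in the sample above.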