171 results for Missions to leprosy patients.


Relevance:

100.00%

Publisher:

Abstract:

Background: Total hip arthroplasty (THA) is a commonly performed procedure, and numbers are increasing with ageing populations. Among the most serious complications of THA are surgical site infections (SSIs), caused by pathogens entering the wound during the procedure. SSIs are associated with a substantial burden for health services, increased mortality and reduced functional outcomes in patients. Numerous approaches to preventing these infections exist, but there is no gold standard in practice and the cost-effectiveness of alternative strategies is largely unknown.

Objectives: The aim of this project was to evaluate the cost-effectiveness of strategies claiming to reduce deep surgical site infections following total hip arthroplasty in Australia. The objectives were:
1. Identification of competing strategies, or combinations of strategies, that are clinically relevant to the control of SSI related to hip arthroplasty
2. Evidence synthesis and pooling of results to assess the volume and quality of evidence claiming to reduce the risk of SSI following total hip arthroplasty
3. Construction of an economic decision model incorporating cost and health outcomes for each of the identified strategies
4. Quantification of the effect of uncertainty in the model
5. Assessment of the value of perfect information among model parameters to inform future data collection

Methods: The literature relating to SSI in THA was reviewed, in particular to establish definitions of these concepts; to understand mechanisms of aetiology and microbiology, risk factors, diagnosis and consequences; and to give an overview of existing infection prevention measures. Published economic evaluations on this topic were also reviewed and their limitations for Australian decision-makers identified. A Markov state-transition model was developed for the Australian context and subsequently validated by clinicians. The model was designed to capture key events related to deep SSI occurring within the first 12 months following primary THA. Relevant infection prevention measures were selected by reviewing clinical guideline recommendations combined with expert elicitation. The strategies selected for evaluation were the routine use of pre-operative antibiotic prophylaxis (AP) versus no antibiotic prophylaxis (No AP), or AP in combination with antibiotic-impregnated cement (AP & ABC) or laminar air operating rooms (AP & LOR). The best available evidence for clinical effect size and utility parameters was harvested from the medical literature using reproducible methods. Queensland hospital data were extracted to inform patient transitions between model health states and the related costs captured in assigned treatment codes. Costs related to infection prevention were derived from reliable hospital records and expert opinion. Uncertainty in model input parameters was explored in probabilistic sensitivity analyses and scenario analyses, and the value of perfect information was estimated.

Results: The cost-effectiveness analysis was performed from a health services perspective using a hypothetical cohort of 30,000 THA patients aged 65 years. The baseline rate of deep SSI was 0.96% within one year of a primary THA. The routine use of antibiotic prophylaxis (AP) was highly cost-effective and resulted in cost savings of over $1.6m whilst generating an extra 163 QALYs (without consideration of uncertainty). Deterministic and probabilistic analysis (considering uncertainty) identified antibiotic prophylaxis combined with antibiotic-impregnated cement (AP & ABC) as the most cost-effective strategy. Using AP & ABC generated the highest net monetary benefit (NMB) and an incremental $3.1m NMB compared to antibiotic prophylaxis alone. There was a very low error probability (<5%) that this strategy might not have the largest NMB. Not using antibiotic prophylaxis (No AP), or using antibiotic prophylaxis combined with laminar air operating rooms (AP & LOR), resulted in worse health outcomes and higher costs. Sensitivity analyses showed that the model was sensitive to the initial cohort starting age and the additional costs of ABC, but the best strategy did not change, even for extreme values. The cost-effectiveness improved for a higher proportion of cemented primary THAs and higher baseline rates of deep SSI. The value of perfect information indicated that no additional research is required to support the model conclusions.

Conclusions: Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement was shown to improve health outcomes among hospitalised patients, save lives and enhance resource allocation. By implementing a more beneficial infection control strategy, scarce health care resources can be used more efficiently to the benefit of all members of society. The results of this project provide Australian policy makers with key information about how to efficiently manage risks of infection in THA.
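The decision rule reported in the Results, choosing the strategy with the highest net monetary benefit (NMB = QALYs x willingness-to-pay - cost), is straightforward to reproduce. Below is a minimal sketch in Python; the cost and QALY figures and the $50,000-per-QALY threshold are illustrative assumptions, not the thesis data.

```python
# Net monetary benefit (NMB) comparison across infection-control strategies.
# NMB = QALYs * willingness-to-pay - cost; the best strategy maximises NMB.
# All numbers below are illustrative placeholders, not results from the thesis.

WTP = 50_000  # willingness-to-pay per QALY (AUD); assumed threshold

# strategy -> (total cost, total QALYs) for the modelled cohort
strategies = {
    "No AP":    (12_000_000, 21_000),
    "AP":       (10_400_000, 21_163),
    "AP & ABC": (9_900_000, 21_200),
    "AP & LOR": (13_500_000, 21_150),
}

def nmb(cost: float, qalys: float, wtp: float = WTP) -> float:
    """Net monetary benefit at a given willingness-to-pay threshold."""
    return qalys * wtp - cost

ranked = sorted(strategies.items(), key=lambda kv: nmb(*kv[1]), reverse=True)
for name, (cost, qalys) in ranked:
    print(f"{name:10s} NMB = ${nmb(cost, qalys):,.0f}")
print("Best strategy:", ranked[0][0])
```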

Relevance:

100.00%

Publisher:

Abstract:

Inadequate vitamin D levels have been linked to bone disease but more recently have been associated with wider health implications. Limited studies suggest a high prevalence of vitamin D deficiency in dialysis patients, although evidence is lacking on whether this is due to dietary restrictions, limited mobility and time outdoors, or a combination of these. The aim of this study was to assess the contributions of diet, supplements and sunlight exposure to serum vitamin D (25(OH)D) levels in dialysis patients. Cross-sectional data were obtained from 30 peritoneal dialysis (PD) patients (mean±SD age 56.9±16.2 y; n=13 male) and 22 haemodialysis (HD) patients (mean±SD age 65.4±14.0 y; n=18 male) between 2009 and 2010. Serum 25(OH)D was measured, and oral vitamin D intake was estimated through a food frequency questionnaire and quantification of inactive vitamin D supplementation. Sunlight exposure was assessed using a validated questionnaire. The prevalence of inadequate/insufficient vitamin D differed between dialysis modalities: 31% and 43% were insufficient (<50 nmol/L), and 4% and 34% deficient (<25 nmol/L), in HD and PD patients respectively (p=0.002). In HD patients, there was a significant correlation between diet plus supplemental vitamin D intake and 25(OH)D (ρ=0.84, p<0.001). Results suggest a higher frequency of 25(OH)D inadequacy/deficiency in PD compared to HD patients. No other relationships between intake, sun exposure and 25(OH)D were seen. This could reflect limitations of the study design or the importance of other factors such as age, ethnicity and sun protection as interactions in the analysis. Understanding these factors is important given vitamin D's emerging status as a biomarker of systemic ill health.
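For the HD group, the intake/serum association above is a Spearman rank correlation. A minimal sketch of that calculation using scipy follows; the intake and serum values are fabricated for illustration, not the study's measurements.

```python
# Spearman rank correlation between total vitamin D intake (diet + supplements)
# and serum 25(OH)D, as reported for the HD group (rho = 0.84 in the abstract).
# The arrays below are fabricated illustration data, not the study's data.
from scipy.stats import spearmanr

intake_iu_per_day = [120, 300, 150, 800, 400, 1000, 250, 600]   # oral vitamin D
serum_25ohd_nmol_l = [28, 45, 30, 78, 55, 90, 40, 65]           # serum 25(OH)D

rho, p_value = spearmanr(intake_iu_per_day, serum_25ohd_nmol_l)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```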

Relevance:

100.00%

Publisher:

Abstract:

This study assessed the health-related quality of life (HRQoL), fatigue and physical activity levels of 28 persons with chronic kidney disease (CKD) on initial administration of an erythropoietin-stimulating agent, and at 3 months, 6 months and 12 months. The sample comprised 15 females and 13 males whose ages ranged from 31 to 84 years. Physical activity was measured using the Human Activity Profile (HAP): self-care, personal/household work, entertainment/social, and independent exercise. Quality of life was measured using the SF-36, which gives scores on physical health (physical functioning, role-physical, bodily pain and general health) and mental health (vitality, social functioning, role-emotional and emotional well-being). Fatigue was measured by the Fatigue Severity Scale (FSS). Across all time points the renal sample engaged in considerably fewer HAP personal/household work activities and entertainment/social activities compared to healthy adults. The normative sample engaged in three times more independent exercise activities compared to renal patients. One-way repeated-measures ANOVAs indicated a significant change over time for the SF-36 scales of role-physical, vitality, emotional well-being and overall mental health. There was a significant difference in fatigue levels over time [F(3,11) = 3.78, p<.05]. Fatigue was highest at baseline and lowest at 6 months. The more breathlessness the CKD patient reported, the fewer activities undertaken and the greater the reported level of fatigue. There were no significant age differences over time for fatigue or physical activity. Age differences were only found for SF-36 mental health at 3 months (t=-2.41, df=14, p<.05): those younger than 65 years had lower emotional well-being compared to those aged over 65. Males had poorer physical health compared to females at 12 months. There were no significant gender differences in mental health at any time point. In the management of chronic kidney disease, early detection of a person's inability to engage in routine activities due to fatigue is necessary. Early detection would enable timely interventions to optimise HRQoL and independent exercise.
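The longitudinal comparison described above is a one-way repeated-measures ANOVA. A minimal sketch using statsmodels follows; the subjects, time points and fatigue scores are fabricated for illustration.

```python
# One-way repeated-measures ANOVA on fatigue scores across four time points
# (baseline, 3, 6 and 12 months), mirroring the analysis named in the abstract.
# The data frame below is fabricated for illustration.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

data = pd.DataFrame({
    "subject": [s for s in range(1, 9) for _ in range(4)],
    "time":    ["0m", "3m", "6m", "12m"] * 8,
    "fatigue": [5.1, 4.6, 3.9, 4.2,  6.0, 5.2, 4.4, 4.8,
                4.8, 4.5, 4.0, 4.1,  5.5, 5.0, 4.2, 4.6,
                5.9, 5.1, 4.5, 4.9,  4.9, 4.4, 3.8, 4.0,
                5.3, 4.8, 4.1, 4.4,  5.7, 5.0, 4.3, 4.7],
})

result = AnovaRM(data, depvar="fatigue", subject="subject", within=["time"]).fit()
print(result)  # F statistic and p-value for the within-subject time effect
```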

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Effective management of chronic diseases such as prostate cancer is important. Research suggests a tendency to use self-care treatment options such as over-the-counter (OTC) complementary medications among prostate cancer patients. The current trend in patient-driven recording of health data in an online Personal Health Record (PHR) presents an opportunity to develop new data-driven approaches for improving prostate cancer patient care. However, the ability of current online solutions to share patients' data for better decision support is limited. An informatics approach may improve online sharing of self-care interventions among these patients. It can also provide better evidence to support decisions made during their self-managed care. AIMS: To identify requirements for an online system and describe a new case-based reasoning (CBR) method for improving self-care of advanced prostate cancer patients in an online PHR environment. METHOD: A non-identifying online survey was conducted to understand self-care patterns among prostate cancer patients and to identify requirements for an online information system. The pilot study was carried out between August 2010 and December 2010. A case base of 52 patients was developed. RESULTS: The data analysis showed self-care patterns among the prostate cancer patients. Selenium (55%) was the most common complementary supplement used by the patients. Paracetamol (about 45%) was the most commonly used OTC medication. CONCLUSION: The results of this study specified requirements for an online case-based reasoning information system. The outcomes of this study are being incorporated in the design of the proposed Artificial Intelligence (AI) driven patient journey browser system. A basic version of the proposed system is currently being considered for implementation.
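At its core, case-based reasoning retrieves the stored cases most similar to a new one and reuses their outcomes. A minimal retrieval sketch follows; the case attributes, weights and values are invented for illustration and are not the system described in the paper.

```python
# Minimal case-based reasoning (CBR) retrieval: find the stored patient cases
# most similar to a new query case, using weighted attribute matching.
# Attributes, weights and cases are invented for illustration only.

CASES = [
    {"age": 68, "psa": 12.0, "supplement": "selenium", "otc": "paracetamol"},
    {"age": 74, "psa": 25.0, "supplement": "none",     "otc": "ibuprofen"},
    {"age": 61, "psa":  8.5, "supplement": "selenium", "otc": "none"},
]

WEIGHTS = {"age": 0.2, "psa": 0.4, "supplement": 0.2, "otc": 0.2}

def similarity(query: dict, case: dict) -> float:
    """Weighted similarity in [0, 1]; numeric attributes use scaled distance."""
    score = 0.0
    score += WEIGHTS["age"] * (1 - min(abs(query["age"] - case["age"]) / 50, 1))
    score += WEIGHTS["psa"] * (1 - min(abs(query["psa"] - case["psa"]) / 50, 1))
    score += WEIGHTS["supplement"] * (query["supplement"] == case["supplement"])
    score += WEIGHTS["otc"] * (query["otc"] == case["otc"])
    return score

query = {"age": 66, "psa": 10.0, "supplement": "selenium", "otc": "paracetamol"}
best = max(CASES, key=lambda c: similarity(query, c))
print("Most similar case:", best)
```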

Relevance:

100.00%

Publisher:

Abstract:

Background: Decreased ability to perform Activities of Daily Living (ADLs) during hospitalisation has negative consequences for patients and health service delivery. Objective: To develop an index to stratify patients at lower and higher risk of a significant decline in ability to perform ADLs at discharge. Design: Prospective two-cohort study comprising a derivation cohort (n=389; mean age 82.3 years; SD 7.1) and a validation cohort (n=153; mean age 81.5 years; SD 6.1). Patients and setting: General medical patients aged ≥70 years admitted to three university-affiliated acute care hospitals in Brisbane, Australia. Measurement and main results: The short ADL Scale was used to identify a significant decline in ability to perform ADLs from premorbid status to discharge. In the derivation cohort, 77 patients (19.8%) experienced a significant decline. Four significant factors were identified for patients independent at baseline: 'requiring moderate assistance to being totally dependent on others with bathing'; 'difficulty understanding others (frequently or all the time)'; 'requiring moderate assistance to being totally dependent on others with performing housework'; and a 'history of experiencing at least one fall in the 90 days prior to hospital admission'; 'independent at baseline' itself was protective against decline at discharge. 'Difficulty understanding others (frequently or all the time)' and 'requiring moderate assistance to being totally dependent on others with performing housework' were also predictors for patients dependent in ADLs at baseline. Sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of the DADLD dichotomised risk scores were 83.1% (95% CI 72.8–90.7), 60.5% (95% CI 54.8–65.9), 34.2% (95% CI 27.5–41.5) and 93.5% (95% CI 89.2–96.5) respectively. In the validation cohort, 47 patients (30.7%) experienced a significant decline; sensitivity, specificity, PPV and NPV of the DADLD were 78.7% (95% CI 64.3–89.3), 69.8% (95% CI 60.1–78.3), 53.6% (95% CI 41.2–65.7) and 88.1% (95% CI 79.2–94.1). Conclusions: The DADLD Index is a useful tool for identifying patients at higher risk of decline in ability to perform ADLs at discharge.
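The four metrics reported for the DADLD Index derive from a standard 2x2 confusion matrix. A minimal sketch of the calculations follows; the counts are invented, not the study's data.

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix, the four
# metrics reported for the DADLD Index. Counts below are invented, not the
# study's data.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard screening-test metrics from true/false positives/negatives."""
    return {
        "sensitivity": tp / (tp + fn),   # detected among true decliners
        "specificity": tn / (tn + fp),   # cleared among non-decliners
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

for name, value in diagnostic_metrics(tp=64, fp=123, fn=13, tn=189).items():
    print(f"{name}: {value:.1%}")
```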

Relevance:

100.00%

Publisher:

Abstract:

Background: Adolescent idiopathic scoliosis (AIS) is a deformity of the spine, which may require surgical correction by attaching a rod to the patient's spine using screws implanted in the vertebral bodies. Surgeons achieve an intra-operative reduction in the deformity by applying compressive forces across the intervertebral disc spaces while they secure the rod to the vertebra. We were interested to understand how the deformity correction is influenced by increasing magnitudes of surgical corrective forces, and what tissue-level stresses are predicted at the vertebral endplates due to the surgical correction. Methods: Patient-specific finite element models of the osseoligamentous spine and ribcage of eight AIS patients who underwent single rod anterior scoliosis surgery were created using pre-operative computed tomography (CT) scans. The surgically altered spine, including titanium rod and vertebral screws, was simulated. The models were analysed using data for intra-operatively measured compressive forces; three load profiles, representing the mean and the upper and lower standard deviation of these data, were analysed. Data for the clinically observed deformity correction (Cobb angle) were compared with the model-predicted correction, and the model results were investigated to better understand the influence of increased compressive forces on the biomechanics of the instrumented joints. Results: The predicted corrected Cobb angle for seven of the eight FE models was within the 5° clinical Cobb measurement variability for at least one of the force profiles. The largest portion of overall correction was predicted at or near the apical intervertebral disc for all load profiles. Model predictions for four of the eight patients showed endplate-to-endplate contact occurring on adjacent endplates of one or more intervertebral disc spaces in the instrumented curve following the surgical loading steps. Conclusion: This study demonstrated a direct relationship between intra-operative joint compressive forces and the degree of deformity correction achieved. The majority of the deformity correction occurs at, or in spinal levels adjacent to, the apex of the deformity. This study highlighted the importance of the intervertebral disc space anatomy in governing the coronal plane deformity correction; the limit of this correction is reached when bone-to-bone contact of the opposing vertebral endplates occurs.
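The Cobb angle used to validate the models is the coronal-plane angle between the most tilted vertebral endplates of the curve. A minimal geometric sketch follows; the endplate tilt values are invented for illustration.

```python
# Cobb angle: the angle between the superior endplate of the upper end vertebra
# and the inferior endplate of the lower end vertebra in the coronal plane,
# here taken as the difference of the two endplate tilts.
# Endplate tilt angles below are invented for illustration.

def cobb_angle(upper_endplate_deg: float, lower_endplate_deg: float) -> float:
    """Cobb angle as the absolute difference of the two endplate tilts (degrees)."""
    return abs(upper_endplate_deg - lower_endplate_deg)

pre_op = cobb_angle(22.0, -24.0)    # 46 degrees before surgery
post_op = cobb_angle(10.0, -12.0)   # 22 degrees after simulated correction
print(f"Correction: {pre_op:.0f} deg -> {post_op:.0f} deg "
      f"({pre_op - post_op:.0f} deg reduction)")
```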

Relevance:

100.00%

Publisher:

Abstract:

Aims and objectives. To examine Chinese cancer patients' fatigue self-management, including the types of self-management behaviours used, their confidence in using these behaviours, the degree of relief obtained and the factors associated with patients' use of fatigue self-management behaviours. Background. Fatigue places a significant burden on patients with cancer undergoing chemotherapy. While some studies have explored fatigue self-management in Western settings, very few have explored self-management behaviours in China. Design. Cross-sectional self- and/or interviewer-administered survey. Methods. A total of 271 participants with self-reported fatigue in the past week were recruited from a specialist cancer hospital in south-east China. Participants completed measures assessing the use of fatigue self-management behaviours, corresponding self-efficacy and perceived relief levels, plus items assessing demographic characteristics, fatigue experiences, distress and social support. Results. A mean of 4.94 (±2.07; range 1–10) fatigue self-management behaviours was reported. Most behaviours were rated as providing moderate relief and were implemented with moderate self-efficacy. Regression analyses identified that having more support from one's neighbourhood and better functional status predicted the use of a greater number of self-management behaviours. Separate regression analyses identified that greater neighbourhood support predicted greater relief from 'activity enhancement behaviours' and that better functional status predicted greater relief from 'rest and sleep behaviours'. Higher self-efficacy scores predicted greater relief from the corresponding behaviours. Conclusions. A range of fatigue self-management behaviours were initiated by Chinese patients with cancer. Individual, condition and environmental factors were found to influence engagement in, and relief from, fatigue self-management behaviours. Relevance to clinical practice. Findings highlight the need for nurses to explore patients' use of fatigue self-management behaviours and the effectiveness of these behaviours in reducing fatigue. Interventions that improve patients' self-efficacy and neighbourhood supports have the potential to improve outcomes from fatigue self-management behaviours.

Relevance:

100.00%

Publisher:

Abstract:

This paper examines the extent to which patients who have been diagnosed with terminal cancer choose to use non-medical therapies. In particular, it is concerned with the illness behaviour of patients who are receiving conventional cytotoxic drug and radiation treatments and who also decide to use a wide range of 'alternative' medications and therapies. The paper discusses the findings of a study of 152 patients with metastatic cancer that examined the extent to which they used alternative cancer therapies, as well as the beliefs and attitudes they held about their cancer, its treatment, and the practitioners providing that treatment. Four groups of users of alternative therapies, who differ according to their commitment to the therapies and the type of therapies they use, were identified. Results of logistic regression analyses indicate that those using alternative therapies differed across a range of social attitudes, primarily a greater reported 'will to live', a stronger desire for control over treatment decisions, and differing beliefs about their disease.
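The group comparisons above rest on logistic regression, modelling the odds of alternative therapy use from attitude measures. A minimal sketch using statsmodels on simulated data follows; the variables and coefficients are illustrative assumptions, not the study's data.

```python
# Logistic regression of alternative-therapy use (binary) on attitude measures,
# the kind of analysis named in the abstract. Data below are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 152
will_to_live = rng.normal(0, 1, n)      # standardised attitude scores (assumed)
desire_control = rng.normal(0, 1, n)
logit = -0.3 + 0.8 * will_to_live + 0.6 * desire_control
uses_alt_therapy = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([will_to_live, desire_control]))
model = sm.Logit(uses_alt_therapy, X).fit(disp=False)
print(model.summary(xname=["const", "will_to_live", "desire_control"]))
print("Odds ratios:", np.exp(model.params[1:]))
```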

Relevance:

100.00%

Publisher:

Abstract:

We have previously reported a preliminary taxonomy of patient error. However, approaches to managing patients' contribution to error have received little attention in the literature. This paper aims to assess how patients and primary care professionals perceive the relative importance of different patient errors as a threat to patient safety. It also attempts to suggest what these groups believe may be done to reduce the errors, and how. It addresses these aims through original research that extends the nominal group analysis used to generate the error taxonomy. Interviews were conducted with 11 purposively selected groups of patients and primary care professionals in Auckland, New Zealand, during late 2007. The total number of participants was 83, including 64 patients. Each group ranked the importance of possible patient errors identified through the nominal group exercise. Approaches to managing the most important errors were then discussed. There was considerable variation among the groups in the importance rankings of the errors. Our general inductive analysis of participants' suggestions revealed the content of four inter-related actions to manage patient error: Grow relationships; Enable patients and professionals to recognise and manage patient error; be Responsive to their shared capacity for change; and Motivate them to act together for patient safety. Cultivation of this GERM of safe care was suggested to benefit from 'individualised community care'. In this approach, primary care professionals individualise, in community spaces, population health messages about patient safety events. This approach may help to reduce patient error and the tension between personal and population health-care.

Relevance:

100.00%

Publisher:

Abstract:

Background: Women undergoing Cesarean Section (CS) are vulnerable to the adverse effects associated with perioperative core temperature drop, in part due to the tendency for CS to be performed under neuraxial anesthesia, blood and fluid loss, and vasodilation. Inadvertent perioperative hypothermia (IPH) is a common condition that affects patients undergoing surgery of all specialties and is detrimental to all age groups, including neonates. Previous systematic reviews on IPH prevention largely focus on either adult or all-ages populations, and have mainly overlooked pregnant or CS patients as a distinct group. Not all recommendations made by systematic reviews targeting all adult patients may be transferable to CS patients. Alternative, effective methods for preventing or managing hypothermia in this group would be valuable.

Objectives: To synthesize the best available evidence in relation to preventing and/or treating hypothermia in mothers after CS surgery.

Types of participants: Adult patients over the age of 18 years, of any ethnic background, with or without co-morbidities, undergoing any mode of anesthesia for any type of CS (emergency or planned) at healthcare facilities, who received interventions to limit or manage perioperative core heat loss.

Types of intervention(s): Active or passive warming methods, versus usual care or placebo, that aim to limit or manage core heat loss as applied to women undergoing CS.

Types of studies: Randomized controlled trials (RCTs) that met the inclusion criteria, with reduction of perioperative hypothermia as a primary or secondary outcome.

Types of outcomes: The primary outcome was maternal core temperature measured during the preoperative, intraoperative and postoperative phases of care. Secondary outcomes were newborn core temperature at birth, umbilical pH obtained immediately after birth, Apgar scores, length of Post Anesthetic Care Unit (PACU) stay, and maternal thermal comfort.

Search strategy: A comprehensive search was undertaken of the following databases from their inception until May 2012: ProQuest, Web of Science, Scopus, Dissertations and Theses PQDT (via ProQuest), Current Contents, CENTRAL, Mednar, OpenGrey, Clinical Trials. There were no language restrictions.

Methodological quality: Retrieved papers were assessed for methodological quality by two independent reviewers prior to inclusion, using JBI software. Disagreements were resolved via consultation with a third reviewer. An assessment of the quality of the included papers was also made in relation to five key quality factors.

Data collection: Two independent reviewers extracted data from the included papers using a previously piloted, customized data extraction tool.

Results: Twelve studies with a combined total of 719 participants were included. Three broad intervention groups were identified: intravenous (IV) fluid warming, warming devices, and leg wrapping. IV fluid warming, whether administered intraoperatively or preoperatively, was found to be effective at maintaining maternal (but not neonatal) temperature and preventing shivering, but did not improve thermal comfort. The effectiveness of IV fluid warming on Apgar scores and umbilical pH remains unclear. Warming devices, including forced air warming and under-body carbon polymer mattresses, were effective at preventing hypothermia and reduced shivering; however, they were most effective if applied preoperatively. The effectiveness of warming devices in improving thermal comfort remains unclear. Preoperative forced air warming appears to aid maintenance of neonatal temperature, while intraoperative forced air warming does not. Forced air warming was not effective at improving Apgar scores, and its effects on umbilical pH remain unclear.

Conclusions: Intravenous fluid warming, by any method, improves maternal temperature and reduces shivering for women undergoing CS. Preoperative body warming devices also improve maternal temperature, in addition to reducing shivering.
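Reviews of this kind typically pool treatment effects across trials by inverse-variance weighting. A minimal fixed-effect pooling sketch follows; whether this review pooled statistically is not stated in the abstract, and the study effects below are invented for illustration.

```python
# Inverse-variance fixed-effect pooling of mean differences in maternal core
# temperature (warming vs usual care), the kind of synthesis such reviews use.
# Study effects and standard errors below are invented for illustration.
import math

studies = [  # (mean difference in deg C, standard error)
    (0.40, 0.12),
    (0.25, 0.10),
    (0.55, 0.20),
]

weights = [1 / se**2 for _, se in studies]
pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled mean difference: {pooled:.2f} deg C (95% CI {lo:.2f} to {hi:.2f})")
```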

Relevance:

100.00%

Publisher:

Abstract:

The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the offered food. The ANCDS found a significant association between poor food intake and increased in-hospital mortality after controlling for confounders (nutritional status, age, disease type and severity) [1]. Evidence for the effectiveness of medical nutrition therapy (MNT) in hospital patients eating poorly is lacking. An exploratory study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, 24-hour food intake (0%, 25%, 50%, 75% or 100% of offered meals) was evaluated for patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT, with food intake re-evaluated on day 7. A total of 184 patients were observed over four weeks. Sixty-two patients (34%) consumed ≤50% of the offered meals. Simple interventions (feeding/menu assistance, diet texture modifications) improved intake to ≥75% in 30 patients, who did not require further MNT. Of the 32 patients referred for MNT, baseline and day-7 data were available for 20 patients (68±17 years, 65% female, BMI 22±5 kg/m²; median energy and protein intake 2250 kJ and 25 g respectively). On day 7, 17 participants (85%) demonstrated significantly higher consumption (4300 kJ, 53 g; p<0.01). Three participants demonstrated no improvement due to ongoing nutrition-impact symptoms. "Percentage food intake" was a quick tool to identify patients in whom simple interventions could enhance intake. MNT was associated with improved dietary intake in hospital patients. Further research is needed to establish a causal relationship.

Relevance:

100.00%

Publisher:

Abstract:

It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished, and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.

Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery. The pre-operative rate of malnutrition in a convenience sample of PWP scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0–78.0) years, median PD duration 6.75 (0.5–24.0) years) participated, and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, BMI) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B), while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m², p<.05). There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and that it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.

Phase I: Nutritional status in community-dwelling adults with Parkinson's disease. The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred and twenty-five PWP (74 male, median age 70.0 (35.0–92.0) years, median PD duration 6.0 (0.0–31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, BMI) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m²; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score also differed significantly between the well-nourished and the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated between well-nourished and malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in the community-dwelling elderly (2–11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.

Phase I: Nutrition assessment tools in people with Parkinson's disease. There are a number of validated and reliable nutrition screening and assessment tools available for use; none of these tools had been evaluated in PWP. In the sample described above, the World Health Organisation (WHO) BMI cut-off (≤18.5 kg/m²), age-specific BMI cut-offs (≤18.5 kg/m² for under 65 years, ≤23.5 kg/m² for 65 years and older) and the revised Mini Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate and takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.

Phase I: Predictors of poor nutritional status in people with Parkinson's disease. A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living; a higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B, and regression analysis was used to identify which factors were predictive of malnutrition (SGA-B). Differences between the groups included disease severity (4% more severe SGA-A vs 21% SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01–1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04–1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02–1.19), less anxiety (OR 0.90, 95% CI 0.82–0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07–1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01–0.26), more depressive symptoms (β=0.02, 95% CI 0.01–0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01–0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support.

Phase II: Nutrition intervention. Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only; the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. INT gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake increased in both groups (18.3 vs 3.8 kJ/kg and 0.3 vs 0.15 g/kg); the increase in protein intake was only significant in the SC group, and the changes in intake did not differ between the groups. There were no significant changes in any motor or non-motor symptoms, or in "off" times or dyskinesias, in either group. Aspects of quality of life also improved over the 12 weeks, especially emotional well-being.

This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease, as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high-protein, high-energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices, with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.

Relevance:

100.00%

Publisher:

Abstract:

Background: Nutrition screening is usually administered by nurses; however, most studies of nutrition screening tools have not used nurses to validate them. The 3-Minute Nutrition Screening (3-MinNS) tool assesses weight loss, dietary intake and muscle wastage, with the composite score used to determine risk of malnutrition. The aim of this study was to determine the validity and reliability of 3-MinNS administered by nurses, who are the intended assessors. Methods: In this cross-sectional study, three ward-based nurses screened 121 patients aged 21 years and over using 3-MinNS in three wards, within 24 hours of admission. A dietitian then assessed the patients' nutritional status using Subjective Global Assessment (SGA) within 48 hours of admission, blinded to the results of the screening. To assess the reliability of 3-MinNS, 37 patients screened by the first nurse were re-screened by a second nurse within 24 hours, blinded to the results of the first nurse. The sensitivity, specificity and best cutoff score for 3-MinNS were determined using the receiver operating characteristic (ROC) curve. Results: The best cutoff score to identify all patients at risk of malnutrition using 3-MinNS was three, with sensitivity of 89% and specificity of 88%. This cutoff point also identified all (100%) severely malnourished patients. There was a strong correlation between 3-MinNS and SGA (r=0.78, p<0.001). The agreement between the two nurses conducting 3-MinNS was 78.3%. Conclusion: 3-Minute Nutrition Screening is a valid and reliable tool for nurses to identify patients at risk of malnutrition.
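Selecting the best cutoff from an ROC analysis, as done for the 3-MinNS score, commonly maximises Youden's J (sensitivity + specificity - 1). A minimal sketch follows; the scores and reference labels are invented, and Youden's J is an assumed criterion since the abstract does not name one.

```python
# Choosing the best screening cutoff from an ROC analysis, as done for the
# 3-MinNS score: maximise Youden's J = sensitivity + specificity - 1.
# Scores and malnutrition labels below are invented for illustration.

scores = [1, 2, 3, 4, 5, 2, 6, 3, 1, 7, 4, 0, 5, 2, 6]          # 3-MinNS totals
malnourished = [0, 0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1]    # SGA reference

def sens_spec(cutoff: int) -> tuple[float, float]:
    """Sensitivity and specificity when scores >= cutoff flag malnutrition risk."""
    tp = sum(s >= cutoff and m == 1 for s, m in zip(scores, malnourished))
    fn = sum(s < cutoff and m == 1 for s, m in zip(scores, malnourished))
    tn = sum(s < cutoff and m == 0 for s, m in zip(scores, malnourished))
    fp = sum(s >= cutoff and m == 0 for s, m in zip(scores, malnourished))
    return tp / (tp + fn), tn / (tn + fp)

best = max(range(0, 8), key=lambda c: sum(sens_spec(c)) - 1)
sn, sp = sens_spec(best)
print(f"Best cutoff: {best} (sensitivity {sn:.0%}, specificity {sp:.0%})")
```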

Relevance:

100.00%

Publisher:

Abstract:

Purpose: The objective of the study was to assess the bioequivalence of two tablet formulations of capecitabine and to explore the effect of age, gender, body surface area and creatinine clearance on the systemic exposure to capecitabine and its metabolites. Methods: The study was designed as an open, randomized, two-way crossover trial. A single oral dose of 2000 mg capecitabine was administered on two separate days to 25 patients with solid tumors. On one day, the patients received four 500-mg tablets of formulation B (test formulation) and on the other day, four 500-mg tablets of formulation A (reference formulation). The washout period between the two administrations was between 2 and 8 days. After each administration, serial blood and urine samples were collected for up to 12 and 24 h, respectively. Unchanged capecitabine and its metabolites were determined in plasma using LC/MS-MS and in urine by NMRS. Results: Based on the primary pharmacokinetic parameter, AUC(0-∞) of 5'-DFUR, equivalence was concluded for the two formulations, since the 90% confidence interval for the estimate of formulation B relative to formulation A (97% to 107%) was within the acceptance region of 80% to 125%. There was no clinically significant difference between the t(max) values for the two formulations (median 2.1 versus 2.0 h). The estimate for C(max) of formulation B relative to formulation A was 111%, and the 90% confidence interval of 95% to 136% was within the reference region of 70% to 143%. Overall, these results suggest no relevant difference between the two formulations in either the extent to which 5'-DFUR reached the systemic circulation or the rate at which it appeared there. The overall urinary excretions were 86.0% and 86.5% of the dose, respectively, and the proportion recovered as each metabolite was similar for the two formulations. The majority of the dose was excreted as FBAL (61.5% and 60.3%), with all other chemical species making a minor contribution. Univariate and multivariate regression analyses exploring the influence of age, gender, body surface area and creatinine clearance on the log-transformed pharmacokinetic parameters AUC(0-∞) and C(max) of capecitabine and its metabolites revealed no clinically significant effects. The only statistically significant results were obtained for AUC(0-∞) and C(max) of the intact drug and for C(max) of FBAL, which were higher in females than in males. Conclusion: The bioavailability of 5'-DFUR in the systemic circulation was practically identical after administration of the two tablet formulations; the two formulations can therefore be regarded as bioequivalent. The variables investigated (age, gender, body surface area and creatinine clearance) had no clinically significant effect on the pharmacokinetics of capecitabine or its metabolites.
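The equivalence criterion applied here, a 90% confidence interval for the test/reference ratio lying within 80% to 125%, is computed on log-transformed AUC values. A simplified paired-data sketch follows (a full crossover analysis would also model period and sequence effects); the AUC values are invented for illustration.

```python
# Average bioequivalence on log-transformed AUC(0-inf): the 90% CI for the
# test/reference geometric mean ratio must lie within 80%-125%. This is a
# simplified paired analysis; AUC values below are invented for illustration.
import math
from statistics import mean, stdev
from scipy.stats import t

auc_ref = [28.1, 31.5, 25.9, 34.2, 29.8, 27.4, 33.0, 30.6]   # formulation A
auc_test = [29.0, 30.8, 27.1, 35.5, 30.2, 28.9, 32.1, 31.4]  # formulation B

diffs = [math.log(b) - math.log(a) for a, b in zip(auc_ref, auc_test)]
n = len(diffs)
se = stdev(diffs) / math.sqrt(n)
margin = t.ppf(0.95, n - 1) * se          # two-sided 90% confidence interval
lo, hi = mean(diffs) - margin, mean(diffs) + margin

ratio = math.exp(mean(diffs))
print(f"GMR = {ratio:.1%}, 90% CI = {math.exp(lo):.1%} to {math.exp(hi):.1%}")
print("Bioequivalent:", math.exp(lo) >= 0.80 and math.exp(hi) <= 1.25)
```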

Relevance:

100.00%

Publisher:

Abstract:

Background: The randomised phase 3 First-Line Erbitux in Lung Cancer (FLEX) study showed that the addition of cetuximab to cisplatin and vinorelbine significantly improved overall survival compared with chemotherapy alone in the first-line treatment of advanced non-small-cell lung cancer (NSCLC). The main cetuximab-related side-effect was acne-like rash. Here, we assessed the association of this acne-like rash with clinical benefit. Methods: We did a subgroup analysis of patients in the FLEX study, which enrolled patients with advanced NSCLC whose tumours expressed epidermal growth factor receptor. Our landmark analysis assessed whether the development of acne-like rash in the first 21 days of treatment (first-cycle rash) was associated with clinical outcome, on the basis of patients in the intention-to-treat population alive on day 21. The FLEX study is registered with ClinicalTrials.gov, number NCT00148798. Findings: 518 patients in the chemotherapy plus cetuximab group (290 of whom had first-cycle rash) and 540 patients in the chemotherapy alone group were alive on day 21. Patients in the chemotherapy plus cetuximab group with first-cycle rash had significantly prolonged overall survival compared with patients in the same treatment group without first-cycle rash (median 15·0 months [95% CI 12·8-16·4] vs 8·8 months [7·6-11·1]; hazard ratio [HR] 0·631 [0·515-0·774]; p<0·0001). Corresponding significant associations were also noted for progression-free survival (median 5·4 months [5·2-5·7] vs 4·3 months [4·1-5·3]; HR 0·741 [0·607-0·905]; p=0·0031) and response (rate 44·8% [39·0-50·8] vs 32·0% [26·0-38·5]; odds ratio 1·703 [1·186-2·448]; p=0·0039). Overall survival for patients without first-cycle rash was similar to that of patients who received chemotherapy alone (median 8·8 months [7·6-11·1] vs 10·3 months [9·6-11·3]; HR 1·085 [0·910-1·293]; p=0·36). The significant overall survival benefit for patients with first-cycle rash versus without was seen in all histology subgroups: adenocarcinoma (median 16·9 months [14·1-20·6] vs 9·3 months [7·7-13·2]; HR 0·614 [0·453-0·832]; p=0·0015), squamous-cell carcinoma (median 13·2 months [10·6-16·0] vs 8·1 months [6·7-12·6]; HR 0·659 [0·472-0·921]; p=0·014), and carcinomas of other histology (median 12·6 months [9·2-16·4] vs 6·9 months [5·2-11·0]; HR 0·616 [0·392-0·966]; p=0·033). Interpretation: First-cycle rash was associated with a better outcome in patients with advanced NSCLC who received cisplatin and vinorelbine plus cetuximab as first-line treatment. First-cycle rash might be a surrogate clinical marker that could be used to tailor cetuximab treatment for advanced NSCLC to those patients most likely to derive a significant benefit. Funding: Merck KGaA. © 2011 Elsevier Ltd.
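The landmark approach used here conditions on survival to day 21 and then compares outcomes by rash status from that point. A minimal sketch using the lifelines package on simulated data follows; the event times, censoring and group effect are illustrative assumptions, not the FLEX data.

```python
# Landmark analysis at day 21: keep only patients alive at the landmark, then
# compare overall survival from day 21 by first-cycle rash status.
# The simulated data below are illustrative, not the FLEX trial data.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 500
rash = rng.binomial(1, 0.55, n)
# longer mean survival for the rash group (illustrative assumption)
surv_days = rng.exponential(np.where(rash == 1, 450, 270))
event = rng.binomial(1, 0.8, n)  # 1 = death observed, 0 = censored

df = pd.DataFrame({"days": surv_days, "event": event, "rash": rash})
landmark = df[df["days"] > 21].copy()
landmark["days"] -= 21  # time is measured from the day-21 landmark

for group, label in [(1, "rash"), (0, "no rash")]:
    sub = landmark[landmark["rash"] == group]
    km = KaplanMeierFitter().fit(sub["days"], sub["event"], label=label)
    print(label, "median survival (days):", km.median_survival_time_)

res = logrank_test(
    landmark.loc[landmark.rash == 1, "days"], landmark.loc[landmark.rash == 0, "days"],
    landmark.loc[landmark.rash == 1, "event"], landmark.loc[landmark.rash == 0, "event"])
print("log-rank p =", res.p_value)
```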