271 results for Molière, Armande Claire Elisabeth Grésinde (Béjart) Poquelin, afterwards Guérin, 1643-1700.


Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE(S): An individual's risk of developing cardiovascular disease (CVD) is influenced by genetic factors. This study focussed on mapping genetic loci for CVD-risk traits in a unique population isolate derived from Norfolk Island. METHODS: The investigation included 377 individuals descended from the population founders. Principal component analysis was used to extract orthogonal components from 11 cardiovascular risk traits. Multipoint variance component methods, implemented in SOLAR, were used to assess genome-wide linkage to the derived factors. A total of 285 of the 377 related individuals were informative for linkage analysis. RESULTS: Four principal components accounting for 83% of the total variance were derived. Principal component 1 was loaded with body size indicators; principal component 2 with body size, cholesterol and triglyceride levels; principal component 3 with blood pressures; and principal component 4 with LDL-cholesterol and total cholesterol levels. Suggestive evidence of linkage for principal component 2 (h² = 0.35) was observed on chromosome 5q35 (LOD = 1.85; p = 0.0008), while peak regions on chromosomes 10p11.2 (LOD = 1.27; p = 0.005) and 12q13 (LOD = 1.63; p = 0.003) segregated with principal components 1 (h² = 0.33) and 4 (h² = 0.42), respectively. CONCLUSION(S): This study investigated a number of CVD risk traits in a unique isolated population. The findings support the clustering of CVD risk traits and provide evidence of a region on chromosome 5q35 segregating with weight, waist circumference, HDL-c and total triglyceride levels.
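As a rough illustration of the component-extraction step described above, the sketch below runs principal component analysis on standardised trait data and reports the variance explained per component. The data are simulated placeholders, not the study's measurements; only the subject and trait counts mirror the abstract.

```python
# Minimal sketch of deriving orthogonal components from correlated
# risk traits, in the spirit of the analysis described above.
# The data here are simulated; the study used 11 measured CVD traits.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_subjects, n_traits = 377, 11
traits = rng.normal(size=(n_subjects, n_traits))  # placeholder measurements

# Standardise traits so each contributes equally, then extract components.
scaled = StandardScaler().fit_transform(traits)
pca = PCA(n_components=4).fit(scaled)

print("Variance explained per component:", pca.explained_variance_ratio_)
print("Cumulative:", pca.explained_variance_ratio_.cumsum())
# Loadings (trait weights) indicate which traits dominate each component.
loadings = pca.components_
```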

Relevance:

10.00%

Publisher:

Abstract:

Objectives In 1856, 193 people from Pitcairn Island, all descended from 9 'Bounty' mutineers and 12 Tahitian women, moved to the then-uninhabited Norfolk Island. Our objective was to assess the population of Norfolk Island, about 1,500 km off the eastern coast of Australia, as a genetic isolate of potential use for cardiovascular disease (CVD) gene mapping. Methods A total of 602 participants, approximately two thirds of the island's present adult population, were characterized for a panel of CVD risk factors. Statistical power and heritability were calculated. Results Norfolk Islanders possess an increased prevalence of hypertension, obesity and multiple CVD risk factors when compared with outbred Caucasian populations. Sixty-four percent of the study participants were descendants of the island's original founder population. Triglycerides, cholesterol, and blood pressures all had heritabilities above 0.2. Conclusions The Norfolk Island population is a potentially useful genetic isolate for gene mapping studies aimed at identifying CVD risk factor quantitative trait loci (QTL).
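Heritability here is the proportion of phenotypic variance attributable to genetic effects. The snippet below illustrates the classical midparent-offspring regression estimator on simulated data; it is a didactic stand-in only, since isolate studies like this one typically estimate heritability with pedigree-based variance-component models.

```python
# Illustrative only: the classical midparent-offspring regression
# estimator of narrow-sense heritability (h^2 = slope of offspring
# phenotype on midparent phenotype). All data below are simulated.
import numpy as np

rng = np.random.default_rng(1)
n = 300
midparent = rng.normal(0, 1, n)
true_h2 = 0.3
offspring = true_h2 * midparent + rng.normal(0, np.sqrt(1 - true_h2**2), n)

slope = np.polyfit(midparent, offspring, 1)[0]
print(f"Estimated h^2 ~ {slope:.2f}")  # should land near 0.3
```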

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients, and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated, in hospitalised or community-dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between-group difference in the CRBSI rate (clinically indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61, but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again observed no difference between groups (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection and found no difference between the two groups (clinically indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
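For orientation, the risk ratio formula and its Wald-type confidence interval can be applied directly to the aggregate CRBSI counts quoted above. The sketch below is illustrative arithmetic only: the review's RR of 0.61 is a pooled meta-analytic estimate across trials, so this crude single-table figure will not match it exactly.

```python
# Crude risk ratio and 95% Wald CI from the aggregate CRBSI counts
# (clinically indicated 1/2365 vs routine change 2/2441). The review's
# pooled RR (0.61) comes from meta-analysis across trials, so this
# crude calculation illustrates the formula rather than replicating it.
import math

a, n1 = 1, 2365   # events, total: clinically-indicated group
c, n2 = 2, 2441   # events, total: routine-change group

rr = (a / n1) / (c / n2)
se_log_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"Crude RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")  # a wide interval
```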

Relevance:

10.00%

Publisher:

Abstract:

The International Classification of Diseases, Version 10, Australian Modification (ICD-10-AM) is commonly used to classify diseases in hospital patients. ICD-10-AM defines malnutrition as "BMI < 18.5 kg/m² or unintentional weight loss of ≥ 5% with evidence of suboptimal intake resulting in subcutaneous fat loss and/or muscle wasting". The Australasian Nutrition Care Day Survey (ANCDS) is the most comprehensive survey to evaluate malnutrition prevalence in acute care patients from Australian and New Zealand hospitals¹. This study determined whether malnourished participants were assigned malnutrition-related codes as per ICD-10-AM. The ANCDS recruited acute care patients from 56 hospitals. Hospital-based dietitians evaluated participants' nutritional status using BMI and Subjective Global Assessment (SGA). In keeping with the ICD-10-AM definition, malnutrition was defined as BMI < 18.5 kg/m², SGA-B (moderately malnourished) or SGA-C (severely malnourished). In this prospective cohort study, hospitals' health information/medical records departments provided coding results for malnourished participants after three months. Although malnutrition was prevalent in 32% (n = 993) of the cohort (N = 3122), significantly fewer were coded for malnutrition (n = 162, 16%; p < 0.001). In 21 hospitals, none of the malnourished participants were coded. This is the largest study to provide a snapshot of malnutrition coding in Australian and New Zealand hospitals. The findings highlight gaps in malnutrition documentation and/or subsequent coding, which could potentially result in significant loss of casemix-related revenue for hospitals. Dietitians must lead the way in developing structured processes for malnutrition identification, documentation and coding.
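The operational definition used in the study reduces to a simple decision rule, sketched below. The function name and the single-letter SGA encoding are illustrative, not part of the ANCDS protocol.

```python
# Sketch of the malnutrition criterion used in the study: BMI < 18.5
# kg/m2, or SGA rating B (moderate) or C (severe). Names and the SGA
# encoding are illustrative assumptions.
def is_malnourished(bmi: float, sga: str) -> bool:
    """Return True if the patient meets the study's malnutrition definition."""
    return bmi < 18.5 or sga.upper() in ("B", "C")

print(is_malnourished(bmi=17.9, sga="A"))  # True: low BMI
print(is_malnourished(bmi=24.0, sga="B"))  # True: SGA-B
print(is_malnourished(bmi=24.0, sga="A"))  # False
```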

Relevance:

10.00%

Publisher:

Abstract:

The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the offered food. The ANCDS found a significant association between poor food intake and increased in-hospital mortality after controlling for confounders (nutritional status, age, disease type and severity)¹. Evidence for the effectiveness of medical nutrition therapy (MNT) in hospital patients eating poorly is lacking. An exploratory study was conducted in respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, 24-hour food intake (0%, 25%, 50%, 75%, 100% of offered meals) was evaluated for patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT, with food intake re-evaluated on day 7. In total, 184 patients were observed over four weeks. Sixty-two patients (34%) consumed ≤50% of the offered meals. Simple interventions (feeding/menu assistance, diet texture modifications) improved intake to ≥75% in 30 patients, who did not require further MNT. Of the 32 patients referred for MNT, baseline and day-7 data were available for 20 patients (68 ± 17 years, 65% female, BMI 22 ± 5 kg/m²; median energy and protein intake 2250 kJ and 25 g, respectively). On day 7, 17 participants (85%) demonstrated significantly higher consumption (4300 kJ, 53 g; p < 0.01). Three participants demonstrated no improvement due to ongoing nutrition-impact symptoms. "Percentage food intake" was a quick tool to identify patients in whom simple interventions could enhance intake. MNT was associated with improved dietary intake in hospital patients. Further research is needed to establish a causal relationship.
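The screening step described above amounts to flagging patients whose observed 24-hour intake is at or below half of the offered food. A minimal sketch follows; the intake categories and the 50% threshold come from the abstract, while the aggregation (averaging across meals) and the function name are assumptions made for illustration.

```python
# Sketch of the screening rule described above: patients consuming
# <=50% of offered meals over 24 hours were flagged for dietitian
# referral. Categories and threshold follow the abstract; averaging
# across meals is one reasonable reading, not a protocol detail.
INTAKE_LEVELS = (0, 25, 50, 75, 100)  # % of each offered meal consumed

def needs_referral(meal_intakes_pct: list) -> bool:
    """Flag a patient whose mean 24-hour intake is at or below 50%."""
    assert all(p in INTAKE_LEVELS for p in meal_intakes_pct)
    return sum(meal_intakes_pct) / len(meal_intakes_pct) <= 50

print(needs_referral([25, 50, 50]))   # True -> refer for MNT
print(needs_referral([75, 100, 75]))  # False
```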

Relevance:

10.00%

Publisher:

Abstract:

Secretion of proinflammatory cytokines by LPS-activated endothelial cells contributes substantially to the pathogenesis of sepsis; however, the mechanism involved in this process is not well understood. In the present study, we determined the roles of GEF-H1 (guanine nucleotide exchange factor H1)-RhoA signalling in LPS-induced interleukin-8 (IL-8, CXCL8) production in endothelial cells. First, we observed that GEF-H1 expression was upregulated in a dose- and time-dependent manner after LPS stimulation, consistent with TLR4 (Toll-like receptor 4) expression. Clostridium difficile toxin B-10463 (TcdB-10463), an inhibitor of Rho activity, reduced LPS-induced NF-κB phosphorylation, and inhibition of GEF-H1 and RhoA expression reduced LPS-induced NF-κB and p38 phosphorylation. TLR4 knockout blocked LPS-induced RhoA activity, whereas MyD88 knockout did not; nevertheless, TLR4 and MyD88 knockout both significantly inhibited transactivation of NF-κB. Both GEF-H1-RhoA and MyD88 signalling produced significant changes in NF-κB transactivation and IL-8 synthesis. Co-inhibition of GEF-H1-RhoA and p38 expression produced inhibitory effects on LPS-induced NF-κB transactivation and IL-8 synthesis similar to those of inhibiting p38 expression alone, confirming that activation of p38 is essential for the GEF-H1-RhoA signalling pathway to induce NF-κB transactivation and IL-8 synthesis. Taken together, these results demonstrate that LPS-induced NF-κB activation and IL-8 synthesis in endothelial cells are regulated by both the MyD88 pathway and the GEF-H1-RhoA pathway.

Relevance:

10.00%

Publisher:

Abstract:

Results of recent studies suggest that circulating levels of vitamin D may play an important role in cancer-specific outcomes. The present systematic review was undertaken to determine the prevalence of vitamin D deficiency (<25 nmol/L) and insufficiency (25-50 nmol/L) in cancer patients, and to evaluate the association between circulating calcidiol (the indicator of vitamin D status) and clinical outcomes. A systematic search of original, peer-reviewed studies on calcidiol at cancer diagnosis, and throughout treatment and survival, was conducted, yielding 4,706 studies. A total of 37 studies met the inclusion criteria for this review. Reported mean blood calcidiol levels ranged from 24.7 to 87.4 nmol/L, with up to 31% of patients identified as deficient and 67% as insufficient. The efficacy of cholecalciferol supplementation for raising the concentration of circulating calcidiol is unclear; standard supplement regimens of <1,000 IU vitamin D3/day may not be sufficient to maintain adequate concentrations or to prevent declining calcidiol. Dose-response studies linking vitamin D status to musculoskeletal and survival outcomes in cancer patients are lacking.
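The review's status cut-offs translate into a one-line classification, sketched below. The "sufficient" label for values above 50 nmol/L is an assumption for illustration; the abstract defines only the deficient and insufficient bands.

```python
# Status cut-offs as used in the review: deficient < 25 nmol/L,
# insufficient 25-50 nmol/L (calcidiol, i.e. 25-hydroxyvitamin D).
# The label above 50 nmol/L is an assumed placeholder.
def vitamin_d_status(calcidiol_nmol_l: float) -> str:
    if calcidiol_nmol_l < 25:
        return "deficient"
    if calcidiol_nmol_l <= 50:
        return "insufficient"
    return "sufficient"  # not defined in the abstract; assumed label

for level in (18.0, 40.0, 87.4):
    print(level, vitamin_d_status(level))
```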

Relevance:

10.00%

Publisher:

Abstract:

A progressive global increase in the burden of allergic diseases has affected the industrialized world over the last half century, as widely reported in the literature. Clinical evidence reveals a general increase in both the incidence and the prevalence of respiratory diseases such as allergic rhinitis (common hay fever) and asthma. Such phenomena may be related not only to air pollution and changes in lifestyle, but also to an actual increase in airborne quantities of allergenic pollen. Experimental enhancement of carbon dioxide (CO2) has been shown to change pollen amount and allergenicity, but this has rarely been demonstrated in the wider environment. The present analysis of a continental-scale pollen data set reveals an increasing trend in the yearly amount of airborne pollen for many taxa in Europe, a trend more pronounced in urban than in semi-rural or rural areas. Climate change may contribute to these changes; however, increased temperatures do not appear to be a major influencing factor. Instead, we suggest that the anthropogenic rise of atmospheric CO2 levels may be influential.
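A building block of such an analysis is estimating the trend in annual pollen totals at a site. The sketch below fits a simple least-squares slope to simulated yearly counts; the study's actual methods and data are not reproduced here.

```python
# Minimal sketch of detecting a monotonic trend in yearly airborne
# pollen totals, one ingredient of the continental analysis described
# above. Data are simulated, not the study's multi-site counts.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1990, 2010)
pollen = 1000 + 12 * (years - years[0]) + rng.normal(0, 50, years.size)

slope, intercept = np.polyfit(years, pollen, 1)
print(f"Estimated trend: {slope:.1f} pollen grains per year")
```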

Relevance:

10.00%

Publisher:

Abstract:

Background and aims The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the offered food. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate whether medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly. Methods An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75% and 100%) was evaluated for each main meal and snack over a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT). Results In total, 184 patients were observed over four weeks; 32 were referred for MNT. Although baseline and post-MNT data for 20 participants (68 ± 17 years, 65% female) indicated a significant increase in median energy and protein intake post-MNT (3600 kJ/day, 40 g/day) versus baseline (2250 kJ/day, 25 g/day) (p < 0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition-impact symptoms affected intake. Conclusion In this pilot study, although dietary intake improved with MNT, it remained inadequate to meet participants' estimated requirements because of ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding could be a possible solution for such patients.
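The "met only 50% of dietary requirements" statement is straightforward arithmetic once an estimated requirement is fixed. In the sketch below the median post-MNT intakes come from the abstract, while the requirement figures are back-derived assumptions chosen so that the ratio is 50%; they are not reported values.

```python
# Arithmetic behind "met only 50% of dietary requirements": comparing
# median post-MNT intake with an assumed estimated requirement.
post_mnt_energy_kj = 3600   # median, from the abstract
post_mnt_protein_g = 40     # median, from the abstract
est_energy_req_kj = 7200    # assumed requirement (illustrative)
est_protein_req_g = 80      # assumed requirement (illustrative)

print(f"Energy:  {post_mnt_energy_kj / est_energy_req_kj:.0%} of requirement")
print(f"Protein: {post_mnt_protein_g / est_protein_req_g:.0%} of requirement")
```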

Relevance:

10.00%

Publisher:

Abstract:

Background The assessment of competence of health professionals, including nutrition and dietetics professionals, in work-based settings is challenging. The present study aimed to explore the experiences of educators involved in the assessment of nutrition and dietetics students in the practice setting, and to identify barriers and enablers to effective assessment. Methods A qualitative research approach using in-depth interviews was employed with a convenience sample of relatively inexperienced dietitian assessors: twelve practice educators took part in the present study. Interviews explored assessment practices and challenges, and data were analysed using a thematic approach within a phenomenological framework. Results Three themes emerged from the data. (i) Student learning, and thus assessment, is hindered by a number of barriers, including workload demands and case-mix; some workplaces struggle to provide appropriate learning opportunities and environments, while adequate planning and support for placement educators from the university, managers and peers enable effective assessment. (ii) The role of the assessor and their relationship with students affects competence assessment. (iii) There is a lack of clarity in the tasks and responsibilities of competency-based assessment. Conclusions The present study provides perspectives on barriers and enablers to effective assessment. It highlights the importance of reflective practice and feedback in assessment practices, consistent with evidence from other disciplines, which can be used to better support work-based competency assessment of student performance.

Relevance:

10.00%

Publisher:

Abstract:

Population ageing is one of the major challenges of the 21st century, and societies need to optimize opportunities for active ageing. This thesis explored how the built environment affects mobility and participation within the community. A combination of person-based GPS tracking and in-depth interviews was used to collect data on the transportation use and activity engagement of older people living in Brisbane. The results showed that the built environment has a strong impact on mobility: to enable healthy and active ageing, modern communities need to overcome car dependency and provide mobility options tailored towards older people's needs.
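One low-level ingredient of GPS-based mobility analysis is measuring the distance between successive fixes. The haversine sketch below is illustrative only; the thesis methodology is not specified at this level, and the coordinates are approximate points in Brisbane chosen for the example.

```python
# Great-circle distance between two GPS fixes via the haversine
# formula, a common building block in mobility analyses.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Distance in metres between two WGS84 points."""
    r = 6_371_000  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp/2)**2 + math.cos(p1)*math.cos(p2)*math.sin(dl/2)**2
    return 2 * r * math.asin(math.sqrt(a))

# Two points in Brisbane roughly 1 km apart (coordinates approximate).
print(f"{haversine_m(-27.4698, 153.0251, -27.4786, 153.0281):.0f} m")
```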

Relevance:

10.00%

Publisher:

Abstract:

Aim There is a growing population of people with cancer who experience physiological and psychological effects that persist long after treatment is complete. Interventions that enhance survivors' self-management abilities might help offset these effects. The aim of this pilot study was to develop, implement and evaluate interventions tailored to assist patients to manage post-treatment health issues effectively. Method In this pre-post intervention cohort study, participants were recruited on completion of cancer treatment. Participants recruited pre-implementation, who received usual care, comprised the control group; participants recruited later formed the intervention group, for whom the Cancer Care Coordinator developed an individualised, structured Cancer Survivor Self-Management Care Plan. Participants were interviewed on completion of treatment (baseline) and at three months. Assessments covered health needs (CaSUN), self-efficacy in adjusting and coping with cancer, and health-related quality of life (FACT-B or FACT-C). The impact of the intervention was determined by independent t-tests of change scores. Results The intervention (n = 32) and control (n = 35) groups were comparable on demographic and clinical characteristics. Sample mean age was 54 ± 10 years; cancer diagnoses were breast (82%) and colorectal (18%). Statistically significant differences (p < 0.05) indicated improvement in the intervention group for (a) functional well-being from the FACIT (control: M = −0.69, SE = 0.91; intervention: M = 3.04, SE = 1.13) and (b) self-efficacy in maintaining social relationships (control: M = −0.333, SE = 0.33; intervention: M = 0.621, SE = 0.27). No significant differences were found in health needs, other quality-of-life subscales, the extent and number of strategies used in coping and adjusting to cancer, or other domains of self-efficacy. Conclusions While the results should be interpreted with caution, given the non-randomised design and small sample size, they indicate that the potential benefits of tailored self-management interventions warrant further investigation in this context.
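The comparison reported above is an independent-samples t-test on change scores (three-month score minus baseline). The sketch below reproduces the shape of that analysis on simulated data: group sizes come from the abstract, and the standard deviations are back-derived from the reported means and standard errors for functional well-being; none of the simulated values are the study's data.

```python
# Independent-samples t-test on change scores, mirroring the analysis
# described above. Simulated data; SDs derived from reported M/SE
# (SD = SE * sqrt(n)) for the functional well-being outcome.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
control_change = rng.normal(-0.69, 0.91 * 35**0.5, 35)      # n = 35
intervention_change = rng.normal(3.04, 1.13 * 32**0.5, 32)  # n = 32

t, p = stats.ttest_ind(intervention_change, control_change)
print(f"t = {t:.2f}, p = {p:.3f}")
```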

Relevance:

10.00%

Publisher:

Abstract:

Objective: Modern series from high-volume esophageal centers report approximately 40% 5-year survival in patients treated with curative intent and postoperative mortality rates of less than 4%. An objective analysis of the factors that underpin current benchmarks within high-volume centers has not been performed. Methods: Three time periods were studied: 1990 to 1998 (period 1), 1999 to 2003 (period 2), and 2004 to 2008 (period 3), in which 471, 254, and 342 patients with esophageal cancer, respectively, were treated with curative intent. All data were prospectively recorded, and staging, pathology, treatment, operative, and oncologic outcomes were compared. Results: Five-year disease-specific survival was 28%, 35%, and 44%, and in-hospital postoperative mortality was 6.7%, 4.4%, and 1.7% for periods 1 to 3, respectively (P < .001). Period 3, compared with periods 1 and 2, was associated with significantly (P < .001) more early tumors (17% vs 4% and 6%), higher nodal yields (median 22 vs 11 and 18), and a higher R0 rate in surgically treated patients (81% vs 73% and 75%). The use of multimodal therapy increased (P < .05) across time periods. By multivariate analysis, age, T stage, N stage, vascular invasion, R status, and time period were significantly (P < .0001) associated with outcome. Conclusions: Improved survival with localized esophageal cancer in the modern era may reflect an increase in early tumors and optimized staging. Improvements in key surgical and pathologic standards, including a higher R0 resection rate, higher nodal yields, and lower postoperative mortality, were also observed. Copyright © 2012 by The American Association for Thoracic Surgery.
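The survival figures quoted are disease-specific survival estimates of the kind produced by the Kaplan-Meier method. For readers unfamiliar with the estimator, a minimal hand-rolled version is sketched below on simulated times; per-patient data from the study are not public, so nothing here reproduces the reported numbers.

```python
# Minimal Kaplan-Meier estimator, the standard tool behind 5-year
# survival figures. Times and events below are simulated.
import numpy as np

def kaplan_meier(times, events):
    """Return (event times, survival probabilities) for 1=event, 0=censored."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    s, out_t, out_s = 1.0, [], []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        s *= 1 - deaths / at_risk          # product-limit update
        out_t.append(t)
        out_s.append(s)
    return out_t, out_s

rng = np.random.default_rng(4)
event_t = rng.exponential(6.0, 200)        # years to event (simulated)
censor_t = rng.uniform(0, 10, 200)         # censoring times (simulated)
times = np.minimum(event_t, censor_t)
events = (event_t <= censor_t).astype(int)

ts, ss = kaplan_meier(times, events)
five_year = [s for tt, s in zip(ts, ss) if tt <= 5][-1]
print(f"Estimated 5-year survival ~ {five_year:.0%}")
```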

Relevance:

10.00%

Publisher:

Abstract:

Non-small cell lung cancer samples from the European Early Lung Cancer biobank were analysed to assess the prognostic significance of mutations in the TP53, KRAS and EGFR genes. The series included 11 never-smokers, 86 former smokers, 152 current smokers and one patient whose smoking status was not recorded. There were 110 squamous cell carcinomas (SCCs), 133 adenocarcinomas (ADCs) and seven large cell carcinomas or mixed histologies. Expression of p53 was analysed by immunohistochemistry, and DNA was extracted from frozen tumour tissues. TP53 mutations were detected in 48.8% of cases and were more frequent among SCCs than ADCs (p<0.0001). TP53 mutation status was not associated with prognosis. G to T transversions, known to be associated with smoking, were marginally more common among patients who developed a second primary lung cancer or recurrence/metastasis (progressive disease). EGFR mutations were almost exclusively found in never-smoking females (p=0.0067). KRAS mutations were detected in 18.5% of cases, mainly ADCs (p<0.0001), and showed a tendency toward association with progressive disease status. These results suggest that such mutations are good markers of the different aetiologies and histopathological forms of lung cancer but have little prognostic value, with the exception of KRAS mutations, which may have prognostic value in ADC. Copyright © ERS 2012.
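The histology association reported above (TP53 mutation more frequent in SCC than ADC) is the kind of comparison a chi-square test of a 2x2 table supports. The sketch below uses hypothetical per-group counts, since the abstract gives only the group sizes (110 SCC, 133 ADC) and the overall mutation rate (48.8%).

```python
# Chi-square test of TP53 mutation frequency by histology. The split
# below is hypothetical, chosen to be consistent with the reported
# group sizes and overall mutation rate; it is not the study's data.
from scipy.stats import chi2_contingency

#                 mutated  wild-type
table = [[72, 38],    # SCC (hypothetical split of 110)
         [45, 88]]    # ADC (hypothetical split of 133)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")
```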

Relevance:

10.00%

Publisher:

Abstract:

Do you need a practical guide to assessment, curriculum and policy? Are you also looking for a book that is firmly grounded in the theory of this subject? Assessment for Education combines both theory and practice, making it the perfect guide for students, researchers, academics and teachers. This book makes assessment processes transparent for practitioners, and shows how assessment should relate to education. It looks at evidence-informed decision-making and the interrelationships between standards, judgment and moderation practice for improved assessment, teacher quality, schools and systems. The book will provide you with:

- Knowledge about quality assessment and judgement practice
- Understanding of relationships across curriculum, assessment, teaching and learning
- Knowledge of the concept of front-ending assessment based on the learner's needs
- An analysis of practitioner judgement approaches
- Understanding of the conditions under which teacher assessment can be valid
- Principles derived from research of social moderation practices

Whether you are studying and researching assessment or working in curriculum and assessment policy, this book will show you how practitioner use of achievement standards can improve learning, equity, social justice and accountability.