799 results for "Portuguese Registry on Acute Coronary Syndromes"
Abstract:
Penetration of fractional flow reserve (FFR) in clinical practice varies extensively, and the applicability of results from randomized trials is understudied. We describe the extent to which the information gained from routine FFR affects patient management strategy and clinical outcome. METHODS AND RESULTS: Nonselected patients undergoing coronary angiography, in whom at least 1 lesion was interrogated by FFR, were prospectively enrolled in a multicenter registry. FFR-driven change in management strategy (medical therapy, revascularization, or additional stress imaging) was assessed per lesion and per patient, and the agreement between final and initial strategies was recorded. Cardiovascular death, myocardial infarction, or unplanned revascularization (MACE) at 1 year was recorded. A total of 1293 lesions were evaluated in 918 patients (mean FFR, 0.81±0.1). The management plan changed in 406 patients (44.2%) and for 584 lesions (45.2%). One-year MACE was 6.9%; patients in whom all lesions were deferred had a lower MACE rate (5.3%) than those with at least 1 lesion revascularized (7.3%) or left untreated despite FFR≤0.80 (13.6%; log-rank P=0.014). At the lesion level, deferral of lesions with an FFR≤0.80 was associated with a 3.1-fold increase in the hazard of cardiovascular death/myocardial infarction/target lesion revascularization (P=0.012). Independent predictors of target lesion revascularization among deferred lesions were proximal lesion location, B2/C lesion type, and FFR. CONCLUSIONS: Routine FFR assessment of coronary lesions safely changes the management strategy in almost half of cases. It also accurately identifies patients and lesions with a low likelihood of events, in which revascularization can be safely deferred, as opposed to those at high risk when ischemic lesions are left untreated, thus confirming results from randomized trials.
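The log-rank comparison and the 3.1-fold hazard reported above are standard survival-analysis outputs. As a point of reference only, the sketch below shows how such estimates can be produced in Python with the lifelines library on synthetic data; the column names and generated values are assumptions for illustration, not registry variables.

```python
# A minimal sketch, on synthetic data, of the survival comparisons reported
# above: a log-rank test between management groups and a Cox model for
# deferral of ischemic (FFR <= 0.80) lesions. All names and numbers here
# are illustrative, not registry data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    # 1 = ischemic lesion deferred despite FFR <= 0.80, 0 = otherwise
    "deferred_ischemic": rng.integers(0, 2, n),
    "event": rng.integers(0, 2, n),            # death/MI/TLR within 1 year
    "time_to_event": rng.uniform(30, 365, n),  # days of follow-up
})

# Log-rank comparison between the two lesion groups
a = df[df["deferred_ischemic"] == 1]
b = df[df["deferred_ischemic"] == 0]
lr = logrank_test(a["time_to_event"], b["time_to_event"],
                  event_observed_A=a["event"], event_observed_B=b["event"])
print(f"log-rank P = {lr.p_value:.3f}")

# Cox model: exp(coef) plays the role of the 3.1-fold hazard reported above
cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_event", event_col="event")
cph.print_summary()
```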
Abstract:
Background: Portugal's temperate climate and the low level of industrialization in the period after World War II, when asbestos materials were used worldwide, have contributed to the generalized belief that such materials saw little use in the country. That supposition lacks confirmation: there is no specific registry of asbestos-related diseases, of workers' asbestos exposure, or of industrial asbestos use. Mesotheliomas are rare neoplasms strongly related to asbestos exposure, so they can be used to gauge the possible dimension of past asbestos exposure. Under-notification of occupational diseases was estimated at up to 90% for asbestos-related diseases, mainly mesotheliomas.
Abstract:
In this age of evidence-based practice, nurses are increasingly expected to use research evidence in a systematic and judicious way when making decisions about patient care practices. Clinicians recognise the role of research when it provides valid, realistic answers in practical situations. Nonetheless, research is still perceived by some nurses as external to practice, and implementing research findings in practice is often difficult. Since their conceptual origins in the 1960s, the emergence and growth of Nursing Development Units and, later, Practice Development Units have been described in the literature as strategic, organisational vehicles for changing the way nurses think about nursing by promoting and supporting a culture of inquiry and research-based practice. Thus, some scholars argue that practice development is situated in the gap between research and practice. Since the 1990s, the discourse has shifted from the structure and outcomes of developing practice to the process of developing practice, using a Practice Development methodology, underpinned by critical social science theory, as a vehicle for changing the culture and context of care. The nursing and practice development literature is dominated by descriptive reports of local practice development activity, typically focusing on reflection on processes or outcomes of processes, and describing perceived benefits. However, despite the volume of published literature, there is little published empirical research in the Australian or international context on the effectiveness of Practice Development as a methodology for changing the culture and context of care - leaving a gap in the literature. The aim of this study was to develop, implement and evaluate the effectiveness of a Practice Development model for clinical practice review and change in shifting the culture and context of care for nurses working in an acute care setting. A longitudinal, pre-test/post-test, non-equivalent control group design was used to answer the following research questions: 1. Is there a relationship between nurses' perceptions of the culture and context of care and nurses' perceptions of research and evidence-based practice? 2. Is there a relationship between engagement in a facilitated process of Practice Development and change in nurses' perceptions of the culture and context of care? 3. Is there a relationship between engagement in a facilitated process of Practice Development and change in nurses' perceptions of research and evidence-based practice? Through a critical analysis of the literature and a synthesis of the findings of past evaluations of Nursing and Practice Development structures and processes, this research identified key attributes, consistent throughout the chronological and theoretical development of Nursing and Practice Development, that exemplify a culture and context of care conducive to creating a culture of inquiry and evidence-based practice. These findings were then used in the development, validation and testing of an instrument to measure change in the culture and context of care. Furthermore, this research provides empirical evidence of the relationship of the key attributes to each other and to barriers to research and evidence-based practice, as well as empirical evidence regarding the effectiveness of a Practice Development methodology in changing the culture and context of care.
This research is noteworthy in its contribution to advancing the discipline of nursing by providing evidence of the degree to which attributes of the culture and context of care, namely autonomy and control, workplace empowerment and constructive team dynamics, can be connected to engagement with research and evidence-based practice.
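The abstract above does not detail its validation analyses, but a standard first check when validating a survey instrument of this kind is internal consistency. The sketch below computes Cronbach's alpha on made-up Likert responses; it is a generic illustration under that assumption, not the study's actual analysis.

```python
# A generic internal-consistency check of the kind used when validating a
# survey instrument such as the one described above. Cronbach's alpha on
# invented Likert data (rows = respondents, columns = items).
import numpy as np

def cronbachs_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(7)
base = rng.integers(1, 6, size=(120, 1))                       # shared trait signal
responses = np.clip(base + rng.integers(-1, 2, size=(120, 10)), 1, 5)
print(f"alpha = {cronbachs_alpha(responses):.2f}")
```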
Abstract:
The effects of exercise and breakfast manipulations on mood and motivation to eat were assessed in 11 healthy females who were regular exercisers and habitual breakfast eaters. The study involved a two by two repeated-measures design, with exercise (or no exercise) and a high-energy breakfast (or low-energy breakfast) as the repeated measures. The exercise or no-exercise session (0800 h) was followed by consumption of the low- or high-energy breakfast (0900 h). An ad libitum lunch test meal was provided 4 hours after the beginning of the exercise session (1200 h). Mood and motivation to eat were continuously tracked from 0800 until 1700 h by an electronic appetite ratings system (EARS). In general, morning subjective mood states (e.g., contentment) were significantly lower in the low-energy breakfast condition, but exercise reversed this effect. Exercise also significantly decreased feelings of lethargy, independent of the breakfast condition. Desire-to-eat and fullness ratings were significantly increased in the low-energy breakfast and high-energy breakfast conditions, respectively. Impairments of mood disappeared in the afternoon after consumption of an ad libitum lunch. In these healthy young adults, the condition inducing the largest energy deficit (exercise and low-energy breakfast) was not associated with the lowest mental states.
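The design described above, with exercise and breakfast energy as two within-subject factors, is the classic layout for a two-way repeated-measures ANOVA. A minimal sketch follows, using statsmodels' AnovaRM on synthetic ratings; the column names and generated values are illustrative assumptions, not the study's data.

```python
# Sketch of how a 2x2 repeated-measures design like the one above
# (exercise x breakfast energy, all conditions within-subject) can be
# analyzed with a two-way repeated-measures ANOVA on synthetic ratings.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(3)
rows = []
for subject in range(11):                        # 11 participants, as above
    for exercise in ("exercise", "rest"):
        for breakfast in ("high_energy", "low_energy"):
            rows.append({
                "subject": subject,
                "exercise": exercise,
                "breakfast": breakfast,
                "contentment": rng.normal(60, 10),  # e.g., 0-100 VAS rating
            })
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="contentment", subject="subject",
              within=["exercise", "breakfast"]).fit()
print(res)
```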
Abstract:
Serotonergic hypofunction is associated with a depressive mood state, an increased drive to eat and a preference for sweet (SW) foods. High-trait anxiety individuals are characterised by a functional shortage of serotonin during stress, which in turn increases their susceptibility to experience a negative mood and an increased drive for SW foods. The present study examined whether an acute dietary manipulation, intended to increase circulating serotonin levels, alleviated the detrimental effects of a stress-inducing task on subjective appetite and mood sensations, and preference for SW foods in high-trait anxiety individuals. Thirteen high- (eleven females and two males; anxiety scores 45·5 (sd 5·9); BMI 22·9 (sd 3·0) kg/m2) and twelve low- (ten females and two males; anxiety scores 30·4 (sd 4·8); BMI 23·4 (sd 2·5) kg/m2) trait anxiety individuals participated in a placebo-controlled, two-way crossover design. Participants were provided with 40 g α-lactalbumin (LAC; L-tryptophan (Trp):large neutral amino acids (LNAA) ratio of 7·6) and 40 g casein (placebo; Trp:LNAA ratio of 4·0) in the form of a snack and lunch on two test days. On both test days, participants completed a stress-inducing task 2 h after the lunch. Mood and appetite were assessed using visual analogue scales. Changes in food hedonics for different taste and nutrient combinations were assessed using a computer task. The results demonstrated that the LAC manipulation did not exert any immediate effects on mood or appetite. However, LAC did have an effect on food hedonics in individuals with high-trait anxiety after acute stress. These individuals expressed a lower liking (P = 0·012) and SW food preference (P = 0·014) after the stressful task when supplemented with LAC.
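The Trp:LNAA ratios quoted above are quotients of tryptophan over the sum of the competing large neutral amino acids. The snippet below is a minimal illustration of that arithmetic with invented amino acid amounts; published ratios are sometimes reported on a scaled basis, so the exact convention of this study is not assumed here.

```python
# Illustration of the Trp:LNAA ratio used above: tryptophan divided by the
# sum of the large neutral amino acids that compete with it for brain
# uptake. Amounts below are invented for the example, not the study's assays.
def trp_lnaa_ratio(trp: float, lnaa: dict[str, float]) -> float:
    """Ratio of tryptophan to the summed large neutral amino acids."""
    return trp / sum(lnaa.values())

# Hypothetical composition (g per serving) of a protein fraction:
lnaa = {"valine": 4.7, "leucine": 10.4, "isoleucine": 6.8,
        "phenylalanine": 4.2, "tyrosine": 5.4}
print(f"Trp:LNAA = {trp_lnaa_ratio(2.4, lnaa):.3f}")
```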
Abstract:
Purpose of review: To examine the relationship between energy intake, appetite control and exercise, with particular reference to longer term exercise studies. This approach is necessary when exploring the benefits of exercise for weight control, as changes in body weight and energy intake are variable and reflect diversity in weight loss. Recent findings: Recent evidence indicates that longer term exercise is characterized by a highly variable response in eating behaviour. Individuals display susceptibility or resistance to exercise-induced weight loss, with changes in energy intake playing a key role in determining the degree of weight loss achieved. Marked differences in hunger and energy intake exist between those who are capable of tolerating periods of exercise-induced energy deficit, and those who are not. Exercise-induced weight loss can increase the orexigenic drive in the fasted state, but for some this is offset by improved postprandial satiety signalling. Summary: The biological and behavioural responses to acute and long-term exercise are highly variable, and these responses interact to determine the propensity for weight change. For some people, long-term exercise stimulates compensatory increases in energy intake that attenuate weight loss. However, favourable changes in body composition and health markers still exist in the absence of weight loss. The physiological mechanisms that confer susceptibility to compensatory overconsumption still need to be determined.
Abstract:
Background and purpose: The appropriate fixation method for hemiarthroplasty of the hip, as it relates to implant survivorship and patient mortality, is a matter of ongoing debate. We examined the influence of fixation method on revision rate and mortality. Methods: We analyzed approximately 25,000 hemiarthroplasty cases from the AOA National Joint Replacement Registry. Deaths at 1 day, 1 week, 1 month, and 1 year were compared for all patients and among subgroups based on implant type. Results: Patients treated with cemented monoblock hemiarthroplasty had a 1.7-times higher day-1 mortality compared to uncemented monoblock components (p < 0.001). This finding was reversed by 1 week, 1 month, and 1 year after surgery (p < 0.001). Modular hemiarthroplasties did not reveal a difference in mortality between fixation methods at any time point. Interpretation: This study shows lower (or similar) overall mortality with cemented hemiarthroplasty of the hip.
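The day-1 mortality comparison above is, at its core, a 2×2 contingency analysis of deaths by fixation method. A minimal sketch follows, with invented counts chosen only to illustrate how a roughly 1.7-fold relative day-1 mortality would be computed; these are not registry numbers.

```python
# Sketch of the day-1 mortality comparison described above: a 2x2 table of
# deaths vs survivors by fixation method, tested with a chi-square test.
# The counts below are invented placeholders, not registry data.
import numpy as np
from scipy.stats import chi2_contingency

#                 died day 1   survived day 1
table = np.array([[85,         12_400],    # cemented monoblock (hypothetical)
                  [50,         12_450]])   # uncemented monoblock (hypothetical)

chi2, p, dof, expected = chi2_contingency(table)
risk = table[:, 0] / table.sum(axis=1)
print(f"relative day-1 mortality = {risk[0] / risk[1]:.2f}, p = {p:.4f}")
```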
Abstract:
OBJECTIVES: To identify the prevalence of geriatric syndromes in the premorbid (preadmission for falls), admission, and discharge assessment periods, and the incidence of new syndromes and of significant worsening of existing syndromes at admission and discharge. DESIGN: Prospective cohort study. SETTING: Three acute care hospitals in Brisbane, Australia. PARTICIPANTS: Five hundred seventy-seven general medical patients aged 70 and older admitted to the hospital. MEASUREMENTS: Prevalence of syndromes in the premorbid (or preadmission for falls), admission, and discharge periods; incidence of new syndromes at admission and discharge; and significant worsening of existing syndromes at admission and discharge. RESULTS: The most frequently reported premorbid syndromes were bladder incontinence (44%) and impairment in any activity of daily living (ADL) (42%). A high proportion (42%) experienced at least one fall in the 90 days before admission. Two-thirds of the participants experienced between one and five syndromes (cognitive impairment, dependence in any ADL item, bladder and bowel incontinence, pressure ulcer) premorbidly, at admission, and at discharge. A majority experienced one or two syndromes during the premorbid (49.4%), admission (57.0%), or discharge (49.0%) assessment period. The syndromes with the highest incidence of significant worsening at discharge (out of the proportion with the syndrome present premorbidly) were ADL limitation (33%), cognitive impairment (9%), and bladder incontinence (8%). Of the syndromes examined at discharge, the highest proportions of patients experienced the following new syndromes (absent premorbidly): ADL limitation (22%) and bladder incontinence (13%). CONCLUSION: Geriatric syndromes were highly prevalent. Many patients did not return to their premorbid function and acquired new syndromes.
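The prevalence and incidence definitions above (a "new" syndrome at discharge being one present at discharge but absent premorbidly) reduce to simple boolean computations. The sketch below illustrates them on synthetic data for a single syndrome; the proportions and column names are assumptions for illustration only.

```python
# A small illustration of the definitions used above: prevalence at each
# assessment period, and incidence of a "new" syndrome at discharge
# (present at discharge, absent premorbidly). Data are invented booleans
# for one syndrome; names are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
n = 577                                  # cohort size, as above
df = pd.DataFrame({
    "premorbid": rng.random(n) < 0.44,   # hypothetical premorbid prevalence
    "discharge": rng.random(n) < 0.45,   # hypothetical discharge prevalence
})

prevalence_premorbid = df["premorbid"].mean()
# Incidence of a new syndrome: among those without it premorbidly,
# the proportion who have it at discharge.
new_at_discharge = (df["discharge"] & ~df["premorbid"]).sum() / (~df["premorbid"]).sum()

print(f"premorbid prevalence: {prevalence_premorbid:.0%}")
print(f"new at discharge:     {new_at_discharge:.0%}")
```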
Abstract:
Objectives: Despite many years of research, there is currently no treatment available that results in major neurological or functional recovery after traumatic spinal cord injury (tSCI). In particular, no conclusive data related to the role of the timing of decompressive surgery, and the impact of injury severity on its benefit, have been published to date. This paper presents a protocol that was designed to examine the hypothesized association between the timing of surgical decompression and the extent of neurological recovery in tSCI patients. Study design: The SCI-POEM study is a Prospective, Observational European Multicenter comparative cohort study. This study compares acute (<12 h) versus non-acute (>12 h, <2 weeks) decompressive surgery in patients with a traumatic spinal column injury and concomitant spinal cord injury. The sample size calculation was based on a representative European patient cohort of 492 tSCI patients. During a 4-year period, 300 patients will need to be enrolled from 10 trauma centers across Europe. The primary endpoint is the lower-extremity motor score as assessed according to the 'International standards for neurological classification of SCI' at 12 months after injury. Secondary endpoints include motor, sensory, imaging and functional outcomes at 3, 6 and 12 months after injury. Conclusion: In order to minimize bias and reduce the impact of confounders, special attention is paid to key methodological principles in this study protocol. A significant difference in safety and/or efficacy endpoints will provide meaningful information to clinicians, as this would confirm the hypothesis that rapid referral to and treatment in specialized centers result in important improvements in tSCI patients. Spinal Cord advance online publication, 17 April 2012; doi:10.1038/sc.2012.34.
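The abstract does not state the parameters behind the 300-patient target, so the sketch below is only a generic example of how a two-group sample-size calculation for a motor-score comparison can be run in Python; the effect size, alpha, and power are assumed values, not the protocol's.

```python
# Generic sample-size sketch for a two-group comparison of motor scores,
# in the spirit of the enrollment target above. Effect size, power, and
# alpha are assumptions for illustration; the protocol's actual
# calculation parameters are not given in the abstract.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.35, alpha=0.05, power=0.80,
                                    alternative="two-sided")
print(f"required per group: {n_per_group:.0f} (before attrition)")
```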
Abstract:
Purpose: To investigate the effects of an acute multinutrient supplement on game-based running performance, peak power output, anaerobic by-products, hormonal profiles, markers of muscle damage, and perceived muscular soreness before, immediately after, and 24 h following competitive rugby union games. Methods: Twelve male rugby union players ingested either a comprehensive multinutrient supplement (SUPP) [RE-ACTIVATE:01] or a placebo (PL) for 5 d. Participants then performed a competitive rugby union game (with global positioning system tracking), with associated blood draws and vertical jump assessments pre, immediately post, and 24 h following competition. Results: SUPP ingestion resulted in moderate to large effects for augmented 1st-half very high intensity running (VHIR) mean speed (5.9 ± 0.4 vs 4.8 ± 2.3 m·min−1; d = 0.93). Further, moderate increases in 2nd-half VHIR distance (137 ± 119 vs 83 ± 89 m; d = 0.73) and VHIR mean speed (5.9 ± 0.6 vs 5.3 ± 1.7 m·min−1; d = 0.56) in the SUPP condition were also apparent. Postgame aspartate aminotransferase (AST; 44.1 ± 11.8 vs 37.0 ± 3.2 U/L; d = 1.16) and creatine kinase (CK; 882 ± 472 vs 645 ± 123 U/L; d = 0.97) measures demonstrated increased values in the SUPP condition, while AST and CK values correlated with 2nd-half VHIR distance (r = −0.71 and r = −0.76, respectively). Elevated C-reactive protein (CRP) was observed postgame in both conditions; however, it was significantly blunted with SUPP (P = 0.05). Conclusions: These findings suggest SUPP may assist in the maintenance of VHIR during rugby union games, possibly via the buffering qualities of SUPP ingredients. However, correlations between increased work completed at very high intensities and muscular degradation in the SUPP condition may mask any anticatabolic properties of the supplement.
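The d values reported above are standardized mean differences. As a point of reference, the sketch below computes a pooled-standard-deviation Cohen's d on synthetic data; crossover designs often use other variance conventions, so the study's published values need not match this particular formula.

```python
# Minimal pooled-standard-deviation Cohen's d, the kind of standardized
# mean difference reported as "d" above. Input samples here are invented,
# not the study's measurements.
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Standardized mean difference with a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(0)
supp = rng.normal(5.9, 1.0, 12)   # hypothetical SUPP condition values
plac = rng.normal(4.8, 1.0, 12)   # hypothetical placebo condition values
print(f"d = {cohens_d(supp, plac):.2f}")
```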
Abstract:
Aim: To determine the effects of an acute multi-nutrient supplement on physiological, performance and recovery responses to intermittent-sprint running and muscular damage during rugby union matches. Methods: Using a randomised, double-blind, cross-over design, twelve male rugby union players ingested either 75 g of a comprehensive multi-nutrient supplement (SUPP) [Musashi] or 1 g of a taste- and carbohydrate-matched placebo (PL) for 5 days pre-competition. Competitive rugby union game running performance was then measured using 1 Hz GPS data (SPI10, SPI elite, GPSports), in addition to associated blood draws, vertical jump assessments and ratings of perceived muscular soreness (MS) pre, immediately post and 24 h post-competition. Baseline (BL) GPS data were collected during six competition rounds preceding data collection. Results: No significant differences were observed between supplement conditions for any game running, vertical jump, or perceived muscular soreness measures. However, effect size analysis indicated SUPP ingestion increased 1st-half very high intensity running (VHIR) mean speed (d = 0.93) and 2nd-half relative distance (m/min) (d = 0.97). Further, moderate increases in 2nd-half VHIR distance (d = 0.73), VHIR m/min (d = 0.70) and VHIR mean speed (d = 0.56) in the SUPP condition were also apparent. Moreover, SUPP demonstrated significant increases in 2nd-half distance (m/min), total game distance (m/min) and total game HIR (m/min) compared with BL data (P < 0.05). Further, large effect-size increases in VHIR time (d = 0.88) and moderate increases in 2nd-half HIR m/min (d = 0.65) and 2nd-half VHIR m/min (d = 0.74) were observed between SUPP and BL. Post-game aspartate aminotransferase (AST) (d = 1.16) and creatine kinase (CK) (d = 0.97) measures demonstrated increased effect-size values with SUPP, while AST and CK values correlated with 2nd-half VHIR distance (r = −0.71 and r = −0.76, respectively). Elevated C-reactive protein (CRP) was observed post-game in both conditions; however, it was significantly blunted with SUPP (P = 0.05). Additionally, pre-game (d = 0.98) and post-game (d = 0.96) increases in cortisol (CORT) were apparent with SUPP. No differences were apparent between conditions for pH, lactate, glucose, HCO3, vertical jump assessments and MS (P > 0.05). Conclusion: These findings suggest SUPP may assist in the maintenance of VHIR speeds and distances covered during rugby union games, possibly via the buffering qualities of SUPP ingredients (i.e. caffeine, creatine, bicarbonate). While the mechanisms for these findings are unclear, the similar pH between conditions despite additional VHIR with SUPP may support this conclusion. Finally, correlations between increased work completed at very high intensities and muscular degradation in the SUPP condition may mask any anti-catabolic properties of supplementation.
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake, recorded as 0, 25, 50, 75, or 100% intake. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65±18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than for well-nourished (WN) patients (15 vs. 10 days, p<0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≥50% (13 vs. 11 days, p<0.0001). MN patients had higher readmission rates (36% vs. 30%, p = 0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (CI: 1.03–3.22, p = 0.04) and 2.7 times greater for those consuming ≤25% of the offered food (CI: 1.54–4.68, p = 0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None Declared.
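The adjusted odds ratios above are the kind produced by a logistic regression with confounders, with exponentiated coefficients read as odds ratios. The sketch below shows that workflow on synthetic data; all variable names and coefficients are assumptions, not ANCDS variables.

```python
# Sketch of the adjusted analysis described above: logistic regression of
# 90-day mortality on malnutrition and low intake, controlling for age,
# with exponentiated coefficients read as odds ratios. Data are synthetic
# and column names are stand-ins, not ANCDS data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 3000
df = pd.DataFrame({
    "malnourished": rng.integers(0, 2, n),
    "low_intake": rng.integers(0, 2, n),   # consumed <= 25% of offered food
    "age": rng.normal(65, 18, n),
})
# Synthetic outcome whose log-odds rise with the two risk factors
logit = -4 + 0.6 * df["malnourished"] + 1.0 * df["low_intake"] + 0.01 * df["age"]
df["died_90d"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit("died_90d ~ malnourished + low_intake + age", data=df).fit()
print(np.exp(model.params))       # adjusted odds ratios
print(np.exp(model.conf_int()))   # 95% CIs on the odds-ratio scale
```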