894 results for tree mortality and recruitment
Abstract:
Background: Hypertrophic cardiomyopathy (HCM) is a common cardiac disease caused by a range of genetic and acquired disorders. The most common cause is genetic variation in sarcomeric protein genes. Current ESC guidelines suggest that particular clinical features (‘red flags’) assist in differential diagnosis. Aims: To test the hypothesis that left ventricular (LV) systolic dysfunction in the presence of increased wall thickness is an age-specific ‘red flag’ for aetiological diagnosis, and to determine long-term outcomes in adult patients with various types of HCM. Methods: A cohort of 1697 adult patients with HCM followed at two European referral centres was studied. Aetiological diagnosis was based on clinical examination, cardiac imaging, and targeted genetic and biochemical testing. Main outcomes were all-cause mortality or heart transplantation (HTx) and heart failure (HF)-related death. All-cause mortality included sudden cardiac death or equivalents, HF- and stroke-related death, and non-cardiovascular death. Results: Prevalence of the different aetiologies was as follows: sarcomeric HCM 1288 (76%); AL amyloidosis 115 (7%); hereditary TTR amyloidosis 86 (5%); Anderson-Fabry disease 85 (5%); wild-type TTR amyloidosis 48 (3%); Noonan syndrome 15 (0.9%); mitochondrial disease 23 (1%); Friedreich’s ataxia 11 (0.6%); glycogen storage disease 16 (0.9%); LEOPARD syndrome 7 (0.4%); FHL1 2 (0.1%); and CPT II deficiency 1 (0.1%). Systolic dysfunction at first evaluation was significantly more frequent in phenocopies than in sarcomeric HCM [105/409 (26%) versus 40/1288 (3%), p<0.0001]. All-cause mortality/HTx and HF-related death were both higher in phenocopies than in sarcomeric HCM (both p<0.001). Among specific aetiologies, all-cause mortality and HF-related death were highest in cardiac amyloidosis (both p<0.001). 
Conclusion: Systolic dysfunction at first evaluation is more common in phenocopies than in sarcomeric HCM, representing an age-specific ‘red flag’ for differential diagnosis. Long-term prognosis was more severe in phenocopies than in sarcomeric HCM and, among specific aetiologies, cardiac amyloidosis showed the worst outcomes.
Abstract:
Climate change is occurring at a faster rate than in the past, with mean sea surface temperatures expected to increase by up to 4.8°C by the end of this century. The actual capability of marine invertebrates to adapt to these rapid changes has yet to be fully understood. Adult echinoids play a crucial role in the tropical ecosystems where they live. Despite this role, few studies on the effect of temperature increase on their viability have been reported in the literature. This thesis reports a first systematic study of several Caribbean echinoids and their tolerance to temperature rise in the context of global warming. The research, carried out at the Bocas del Toro Station of the Smithsonian Tropical Research Institute in Panama, focused on six sea urchins (Lytechinus variegatus, L. williamsi, Echinometra lucunter, E. viridis, Tripneustes ventricosus and Eucidaris tribuloides) and two sand dollars (Clypeaster rosaceus and C. subdepressus). Mortality and neuromuscular well-being indicators, such as righting response, covering behaviour, adhesion to the substrate, and spine and tube feet movements, were analysed over the temperature range 28-38°C. The righting time RT (i.e., the time necessary for the animal to right itself completely after inversion), measured in the six sea urchin species, showed a clear dependence on water temperature. The experiments made it possible to determine the “thermal safety margin” (TSM) of each species. Echinometra lucunter and E. viridis proved the most tolerant to high temperatures, with a TSM of 5.5°C, while T. ventricosus was the most vulnerable, with a TSM of only 3°C. The study showed that all the species already live at temperatures close to their upper thermal limit. Their TSMs are comparable to the temperature increase predicted by 2100. Without acclimatization to such a temperature change, these species could experience severe die-offs, with important consequences for tropical marine ecosystems.
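The thermal safety margin described in this abstract is simply the gap between a species' upper thermal limit and the temperature it currently experiences; a species is at risk when projected warming exceeds that gap. A minimal sketch, not the thesis code: the TSM values are the ones reported above, while the upper thermal limits are back-calculated from them assuming a 28°C ambient baseline (an assumption for illustration).

```python
# Illustrative sketch, not the thesis code. TSM = upper thermal limit minus
# current habitat temperature; a species is at risk when projected warming
# exceeds its TSM. Upper limits are back-calculated from the reported TSMs
# assuming a 28 degC ambient baseline (assumption for illustration).

PROJECTED_WARMING_C = 4.8   # projected mean SST rise by 2100 (from the abstract)
AMBIENT_C = 28.0            # lower end of the tested 28-38 degC range

upper_thermal_limit_c = {
    "Echinometra lucunter": 33.5,     # reported TSM 5.5 degC
    "Echinometra viridis": 33.5,      # reported TSM 5.5 degC
    "Tripneustes ventricosus": 31.0,  # reported TSM 3.0 degC
}

def thermal_safety_margin(limit_c: float, ambient_c: float = AMBIENT_C) -> float:
    """Warming a species can absorb before reaching its upper thermal limit."""
    return limit_c - ambient_c

# Species whose margin is smaller than the projected warming
at_risk = sorted(
    sp for sp, limit in upper_thermal_limit_c.items()
    if thermal_safety_margin(limit) < PROJECTED_WARMING_C
)
print(at_risk)  # only T. ventricosus falls below the projected 4.8 degC rise
```

With these illustrative numbers only T. ventricosus (TSM 3°C) falls below the projected rise, matching the abstract's identification of it as the most vulnerable species.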
Abstract:
To prevent iatrogenic damage, unnecessary transfusions of red blood cells should be avoided. To this end, specific and reliable transfusion triggers must be defined. To date, the optimal hematocrit during the initial operating room (OR) phase remains unclear in patients with severe traumatic brain injury (TBI). We hypothesized that hematocrit values exceeding 28%, the local hematocrit target reached by the end of the initial OR phase, would result in more complications, increased mortality, and impaired recovery compared with patients whose hematocrit did not exceed 28%.
Abstract:
Chemotherapy-induced neutropenia is a major risk factor for infection-related morbidity and mortality and also a significant dose-limiting toxicity in cancer treatment. Patients developing severe (grade 3/4) or febrile neutropenia (FN) during chemotherapy frequently receive dose reductions and/or delays to their chemotherapy. This may compromise the success of treatment, particularly when treatment intent is either curative or to prolong survival. In Europe, prophylactic treatment with granulocyte colony-stimulating factors (G-CSFs), such as filgrastim (including approved biosimilars), lenograstim or pegfilgrastim, is available to reduce the risk of chemotherapy-induced neutropenia. However, the use of G-CSF prophylactic treatment varies widely in clinical practice, both in the timing of therapy and in the patients to whom it is offered. The need for generally applicable, European-focused guidelines led to the formation of a European Guidelines Working Party by the European Organisation for Research and Treatment of Cancer (EORTC) and the publication in 2006 of guidelines for the use of G-CSF in adult cancer patients at risk of chemotherapy-induced FN. A new systematic literature review has been undertaken to ensure that recommendations are current and provide guidance on clinical practice in Europe. We recommend that patient-related adverse risk factors, such as elderly age (≥65 years) and neutrophil count, be evaluated in the overall assessment of FN risk before administering each cycle of chemotherapy. It is important that, after a previous episode of FN, patients receive prophylactic administration of G-CSF in subsequent cycles. We provide an expanded list of common chemotherapy regimens considered to have a high (≥20%) or intermediate (10-20%) risk of FN. Prophylactic G-CSF continues to be recommended in patients receiving a chemotherapy regimen with a high risk of FN. 
When using a chemotherapy regimen associated with FN in 10-20% of patients, particular attention should be given to patient-related risk factors that may increase the overall risk of FN. In situations where dose-dense or dose-intense chemotherapy strategies have survival benefits, prophylactic G-CSF support is recommended. Similarly, if reductions in chemotherapy dose intensity or density are known to be associated with a poor prognosis, primary G-CSF prophylaxis may be used to maintain chemotherapy. Clinical evidence shows that filgrastim, lenograstim and pegfilgrastim have clinical efficacy, and we recommend the use of any of these agents to prevent FN and FN-related complications where indicated. Filgrastim biosimilars are also approved for use in Europe. While other forms of G-CSF, including biosimilars, are administered as a course of daily injections, pegfilgrastim allows once-per-cycle administration. Choice of formulation remains a matter for individual clinical judgement. Evidence from multiple low-level studies derived from audit data and clinical practice suggests that some patients receive suboptimal courses of daily G-CSFs; the use of pegfilgrastim may avoid this problem.
Abstract:
Objective. To assess differences in access to antiretroviral treatment (ART) and patient outcomes across public sector treatment facilities in the Free State province, South Africa. Design. Prospective cohort study with retrospective database linkage. We analysed data on patients enrolled in the treatment programme across 36 facilities between May 2004 and December 2007, and assessed percentage initiating ART and percentage dead at 1 year after enrolment. Multivariable logistic regression was used to estimate associations of facility-level and patient-level characteristics with both mortality and treatment status. Results. Of 44 866 patients enrolled, 15 219 initiated treatment within 1 year; 8 778 died within 1 year, 7 286 before accessing ART. Outcomes at 1 year varied greatly across facilities and more variability was explained by facility-level factors than by patient-level factors. The odds of starting treatment within 1 year improved over calendar time. Patients enrolled in facilities with treatment initiation available on site had higher odds of starting treatment and lower odds of death at 1 year compared with those enrolled in facilities that did not offer treatment initiation. Patients were less likely to start treatment if they were male, severely immunosuppressed (CD4 count ≤50 cells/μl), or underweight (<50 kg). Men were also more likely to die in the first year after enrolment. Conclusions. Although increasing numbers of patients started ART between 2004 and 2007, many patients died before accessing ART. Patient outcomes could be improved by decentralisation of treatment services, fast-tracking the most immunodeficient patients and improving access, especially for men.
Abstract:
Cerebral vasospasm after aneurysmal subarachnoid hemorrhage (aSAH) is a frequent but unpredictable complication associated with poor outcome. Current vasospasm therapies are suboptimal; new therapies are needed. Clazosentan, an endothelin receptor antagonist, has shown promise in phase 2 studies, and two randomized, double-blind, placebo-controlled phase 3 trials (CONSCIOUS-2 and CONSCIOUS-3) are underway to further investigate its impact on vasospasm-related outcome after aSAH. Here, we describe the design of these studies, which was challenging with respect to defining endpoints and standardizing endpoint interpretation and patient care. Main inclusion criteria are: age 18-75 years; SAH due to ruptured saccular aneurysm secured by surgical clipping (CONSCIOUS-2) or endovascular coiling (CONSCIOUS-3); substantial subarachnoid clot; and World Federation of Neurosurgical Societies grades I-IV prior to aneurysm-securing procedure. In CONSCIOUS-2, patients are randomized 2:1 to clazosentan (5 mg/h) or placebo. In CONSCIOUS-3, patients are randomized 1:1:1 to clazosentan 5, 15 mg/h, or placebo. Treatment is initiated within 56 h of aSAH and continued until 14 days after aSAH. Primary endpoint is a composite of mortality and vasospasm-related morbidity within 6 weeks of aSAH (all-cause mortality, vasospasm-related new cerebral infarction, vasospasm-related delayed ischemic neurological deficit, neurological signs or symptoms in the presence of angiographic vasospasm leading to rescue therapy initiation). Main secondary endpoint is extended Glasgow Outcome Scale at week 12. A critical events committee assesses all data centrally to ensure consistency in interpretation, and patient management guidelines are used to standardize care. Results are expected at the end of 2010 and 2011 for CONSCIOUS-2 and CONSCIOUS-3, respectively.
Abstract:
Purpose The purpose of this clinical study was to determine whether glucocorticoid use and immobility are associated with in-hospital nutritional risk. Methods One hundred and one patients consecutively admitted to the medical wards were enrolled. Current medical conditions, symptoms, medical history, eating and drinking habits, diagnosis, laboratory findings, medications, and anthropometrics were recorded. The Nutritional Risk Screening 2002 (NRS-2002) was used as a screening instrument to identify nutritional risk. Results The results confirmed that glucocorticoid use and immobility are independently associated with nutritional risk as determined by the NRS-2002. Constipation was identified as an additional cofactor independently associated with nutritional risk. Conclusions Glucocorticoid treatment, immobility, and constipation are associated with nutritional risk in a mixed hospitalized population. Long-term glucocorticoid use, immobility, or constipation should alert the clinician to check nutritional status, an important factor in mortality and morbidity.
Abstract:
1. There are a number of models describing population structure, many of which can incorporate spatial habitat effects. One such model is the source-sink model, which describes a system where some habitats have a natality that is higher than mortality (sources) and others have a mortality that exceeds natality (sinks). A source can be maintained in the absence of migration, whereas a sink will go extinct. 2. However, the interaction between population dynamics and habitat quality is complex, and concerns have been raised about the validity of published empirical studies addressing source-sink dynamics. In particular, some of these studies fail to provide data on survival, a significant component in distinguishing a sink from a low-quality source. Moreover, failing to account for a density-dependent increase in mortality, or decrease in fecundity, can result in a territory being falsely assigned as a sink when, in fact, this density-dependent suppression only decreases the population to a lower level, indicating a 'pseudo-sink'. 3. In this study, we investigate a long-term data set for key components of territory-specific demography (mortality and reproduction) and their relationship to habitat characteristics in the territorial, group-living Siberian jay (Perisoreus infaustus). We also assess territory-specific population growth rates (r) to test whether spatial population dynamics are consistent with source-sink dynamics. 4. Although average mortality did not differ between the sexes, habitat-specific mortality did. Female mortality was higher in older forests, a pattern not observed in males. Male mortality increased only with an increasing amount of open areas. Moreover, reproductive success was higher further away from human settlement, indicating a strong effect of human-associated nest predators. 5. Averaged over all years, 76% of the territories were sources. 
These territories generally contained fewer open areas and were located further away from human settlement. 6. The source-sink model provides a tool for modelling demography in distinct habitat patches of different quality, which can aid in identifying key habitats within the landscape and thus reduce the risk of implementing unsound management decisions.
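The source-sink criterion in point 1 reduces to a sign test on the territory-specific growth rate, with the pseudo-sink caveat from point 2. A hedged sketch with hypothetical per-capita rates, not the Siberian jay data:

```python
# Hedged sketch with hypothetical per-capita rates, not the Siberian jay data.
# A source has natality above mortality (r > 0); a sink has r < 0. As the
# abstract notes, density-dependent suppression can mimic a sink, so a real
# analysis must also check survival and density dependence before labelling.

def classify_territory(natality: float, mortality: float) -> str:
    """Naive source/sink label from per-capita natality and mortality."""
    r = natality - mortality  # territory-specific growth rate
    return "source" if r > 0 else "sink"

# Hypothetical territories echoing the habitat effects reported above
territories = {
    "far from settlement, closed forest": (0.45, 0.30),
    "near settlement, open areas": (0.20, 0.35),
}

labels = {name: classify_territory(b, d) for name, (b, d) in territories.items()}
# the territory away from human settlement classifies as a source
```

The caveat matters in practice: a territory with r < 0 at high density may still have r > 0 at equilibrium density, which is exactly the pseudo-sink the abstract warns against.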
Abstract:
Introduction Reduced left ventricular function in patients with severe symptomatic valvular aortic stenosis is associated with impaired clinical outcome in patients undergoing surgical aortic valve replacement (SAVR). Transcatheter aortic valve implantation (TAVI) has been shown to be non-inferior to SAVR in high-risk patients with respect to mortality and may result in faster left ventricular recovery. Methods We investigated clinical outcomes of high-risk patients with severe aortic stenosis undergoing medical treatment (n = 71) or TAVI (n = 256), stratified by left ventricular ejection fraction (LVEF), in a prospective single-center registry. Results Twenty-five patients (35%) in the medical cohort were found to have an LVEF ≤30% (mean 26.7±4.1%), as were 37 patients (14%) in the TAVI cohort (mean 25.2±4.4%). Estimated peri-interventional risk as assessed by logistic EuroSCORE was significantly higher in patients with severely impaired LVEF than in patients with LVEF >30% (medical/TAVI 38.5±13.8%/40.6±16.4% versus medical/TAVI 22.5±10.8%/22.1±12.8%, p<0.001). In patients undergoing TAVI, there was no significant difference in the combined endpoint of death, myocardial infarction, major stroke, life-threatening bleeding, major access-site complications, valvular re-intervention, or renal failure at 30 days between the two groups (21.0% versus 27.0%, p = 0.40). After TAVI, patients with LVEF ≤30% experienced a rapid improvement in LVEF (from 25±4% to 34±10% at discharge, p = 0.002) associated with improved NYHA functional class at 30 days (decrease of ≥1 NYHA class in 95%). During long-term follow-up, no difference in survival was observed among patients undergoing TAVI irrespective of baseline LVEF (p = 0.29), whereas mortality was significantly higher in medically treated patients with severely reduced LVEF (log-rank p = 0.001). 
Conclusion TAVI in patients with severely reduced left ventricular function may be performed safely and is associated with rapid recovery of systolic left ventricular function and improvement in heart failure symptoms.
Abstract:
Perinatal care of pregnant women at high risk for preterm delivery and of preterm infants born at the limit of viability (22-26 completed weeks of gestation) requires a multidisciplinary approach by an experienced perinatal team. Limited precision in the determination of both gestational age and foetal weight, as well as biological variability, may significantly affect the course of action chosen in individual cases. The decisions that must be taken with the pregnant women and on behalf of the preterm infant in this context are complex and have far-reaching consequences. When counselling pregnant women and their partners, neonatologists and obstetricians should provide them with comprehensive information in a sensitive and supportive way to build a basis of trust. The decisions are developed in a continuing dialogue between all parties involved (physicians, midwives, nursing staff and parents) with the principal aim of finding solutions that are in the infant's and pregnant woman's best interest. Knowledge of current gestational age-specific mortality and morbidity rates and how they are modified by prenatally known prognostic factors (estimated foetal weight, sex, exposure or non-exposure to antenatal corticosteroids, single or multiple births), as well as the application of accepted ethical principles, forms the basis for responsible decision-making. Communication between all parties involved plays a central role. The members of the interdisciplinary working group suggest that the care of preterm infants with a gestational age between 22 0/7 and 23 6/7 weeks should generally be limited to palliative care. Obstetric interventions for foetal indications, such as Caesarean section delivery, are usually not indicated. 
In selected cases, for example, when 23 weeks of pregnancy have been completed and several of the above-mentioned prenatally known prognostic factors are favourable, or when well-informed parents insist on the initiation of life-sustaining therapies, active obstetric interventions for foetal indications and provisional intensive care of the neonate may be reasonable. In preterm infants with a gestational age between 24 0/7 and 24 6/7 weeks, it can be difficult to determine whether the burden of obstetric interventions and neonatal intensive care is justified given the limited chances of success of such therapy. In such cases, the individual constellation of prenatally known prognostic factors can be helpful in the decision-making process with the parents. In preterm infants with a gestational age between 25 0/7 and 25 6/7 weeks, foetal surveillance, obstetric interventions for foetal indications and neonatal intensive care measures are generally indicated. However, if several prenatally known prognostic factors are unfavourable and the parents agree, primary non-intervention and neonatal palliative care can be considered. All pregnant women with threatening preterm delivery or premature rupture of membranes at the limit of viability must be transferred to a perinatal centre with a level III neonatal intensive care unit no later than 23 0/7 weeks of gestation, unless emergency delivery is indicated. An experienced neonatology team should be involved in all deliveries that take place after 23 0/7 weeks of gestation to help decide, together with the parents, whether the initiation of intensive care measures appears appropriate or whether preference should be given to palliative care (i.e., primary non-intervention). In doubtful situations, it can be reasonable to initiate intensive care and to admit the preterm infant to a neonatal intensive care unit (i.e., provisional intensive care). 
The infant's clinical evolution and additional discussions with the parents will help to clarify whether the life-sustaining therapies should be continued or withdrawn. Life support is continued as long as there is reasonable hope for survival and the infant's burden of intensive care is acceptable. If, on the other hand, the health car...
Abstract:
The examination of telomere dynamics is a recent technique in ecology for assessing physiological state and age-related traits in individuals of unknown age. Telomeres shorten with age in most species and are expected to reflect physiological state, reproductive investment, and chronological age. Loss of telomere length is used as an indicator of biological aging, as this detrimental deterioration is associated with lowered survival. Lifespan dimorphism and more rapid senescence in the larger, shorter-lived sex are predicted in species with sexual size dimorphism; however, little is known about the effects of behavioral dimorphism on senescence and life history traits in sexually monomorphic species. Here we compare telomere dynamics of thick-billed murres (Uria lomvia), a species with male-biased parental care, in two ways: 1) cross-sectionally in birds of known age (0-28 years) from one colony and 2) longitudinally in birds from four colonies. Telomere dynamics were compared using three measures: the telomere restriction fragment (TRF), a lower window of TRF (TOE), and qPCR. All showed age-related shortening of telomeres, but the TRF measure also indicated that adult female murres have shorter telomeres than adult males, consistent with sex-specific patterns of aging. Adult males had longer telomeres than adult females on all colonies examined, but chick telomere length did not differ by sex. Additionally, inter-annual telomere changes may be related to environmental conditions; birds from a potentially low-quality colony lost telomeres, while those at more hospitable colonies maintained telomere length. We conclude that sex-specific patterns of telomere loss exist in the sexually monomorphic thick-billed murre but are likely to arise between fledging and recruitment. Longer telomeres in males may be related to their homogamous sex chromosomes (ZZ) or to selection for longer life in the care-giving sex. 
Environmental conditions appeared to be the primary drivers of annual changes in adult birds.
Abstract:
Cardiovascular disease (CVD) due to atherosclerosis of the arterial vessel wall and to thrombosis is the foremost cause of premature mortality and of disability-adjusted life years (DALYs) in Europe, and is also increasingly common in developing countries [1]. In the European Union, the economic cost of CVD represents €192 billion annually in direct and indirect healthcare costs [1]. The main clinical entities are coronary artery disease (CAD), ischaemic stroke, and peripheral arterial disease (PAD). The causes of these CVDs are multifactorial. Some of these factors relate to lifestyle, such as tobacco smoking, lack of physical activity, and dietary habits, and are thus modifiable. Other risk factors are also modifiable, such as elevated blood pressure, type 2 diabetes, and dyslipidaemias, or non-modifiable, such as age and male gender. These guidelines deal with the management of dyslipidaemias as an essential and integral part of CVD prevention. Prevention and treatment of dyslipidaemias should always be considered within the broader framework of CVD prevention, which is addressed in guidelines of the Joint European Societies’ Task Forces on CVD prevention in clinical practice [2-5]. The latest version of these guidelines was published in 2007 [5]; an update will become available in 2012. These Joint ESC/European Atherosclerosis Society (EAS) guidelines on the management of dyslipidaemias are complementary to the guidelines on CVD prevention in clinical practice and address not only physicians [e.g. general practitioners (GPs) and cardiologists] interested in CVD prevention, but also specialists from lipid clinics or metabolic units who deal with dyslipidaemias that are more difficult to classify and treat.
Abstract:
Background— The age, creatinine, and ejection fraction (ACEF) score (age/left ventricular ejection fraction, +1 if creatinine >2.0 mg/dL) has been established as an effective predictor of clinical outcomes in patients undergoing elective coronary artery bypass surgery; however, its utility in “all-comer” patients undergoing percutaneous coronary intervention is as yet unexplored. Methods and Results— The ACEF score was calculated for 1208 of the 1707 patients enrolled in the LEADERS trial. Post hoc analysis was performed by stratifying clinical outcomes at the 1-year follow-up according to ACEF score tertiles: ACEF_low ≤1.0225, 1.0225 < ACEF_mid ≤1.277, and ACEF_high >1.277. At 1-year follow-up, major adverse cardiac event–free survival was significantly lower in the highest tertile of the ACEF score (ACEF_low = 92.1%, ACEF_mid = 89.5%, and ACEF_high = 86.1%; P=0.0218). Cardiac death was less frequent in ACEF_low than in ACEF_mid and ACEF_high (0.7% vs 2.2% vs 4.5%; hazard ratio = 2.22, P=0.002). Rates of myocardial infarction were significantly higher in patients with a high ACEF score (6.7% for ACEF_high vs 5.2% for ACEF_mid and 2.5% for ACEF_low; hazard ratio = 1.6, P=0.006). Clinically driven target-vessel revascularization also tended to be higher in the ACEF_high group, but the difference among the 3 groups did not reach statistical significance. The rate of composite definite, possible, and probable stent thrombosis was also higher in the ACEF_high group (ACEF_low = 1.2%, ACEF_mid = 3.5%, and ACEF_high = 6.2%; hazard ratio = 2.04, P<0.001). Conclusions— The ACEF score may be a simple way to stratify the risk of events in patients treated with percutaneous coronary intervention with respect to mortality and risk of myocardial infarction.
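The ACEF definition in parentheses translates directly into code. A minimal sketch: the formula and the tertile cut-offs are taken from the abstract, while the function names and example values are illustrative, not from the study.

```python
def acef_score(age_years: float, lvef_percent: float, creatinine_mg_dl: float) -> float:
    """ACEF = age / LVEF(%), plus 1 if serum creatinine exceeds 2.0 mg/dL."""
    score = age_years / lvef_percent
    if creatinine_mg_dl > 2.0:
        score += 1.0
    return score

def acef_tertile(score: float) -> str:
    """Tertile boundaries reported in the LEADERS post hoc analysis."""
    if score <= 1.0225:
        return "low"
    if score <= 1.277:
        return "mid"
    return "high"

# Illustrative patient: an 80-year-old with LVEF 65% and normal creatinine
example = acef_score(80, 65, 1.1)  # 80/65, no creatinine penalty
```

Note the asymmetry of the score: the creatinine term is a step penalty of a full point, so a patient crossing the 2.0 mg/dL threshold jumps at least one tertile regardless of age and ejection fraction.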
Abstract:
Objectives: To compare outcomes of antiretroviral therapy (ART) in South Africa, where viral load monitoring is routine, with those in Malawi and Zambia, where monitoring is based on CD4 cell counts. Methods: We included 18 706 adult patients starting ART in South Africa and 80 937 patients in Zambia or Malawi. We examined CD4 responses in models for repeated measures, and the probability of switching to second-line regimens, mortality and loss to follow-up in multistate models, measuring time from 6 months. Results: In South Africa, 9.8% [95% confidence interval (CI) 9.1–10.5] had switched at 3 years, 1.3% (95% CI 0.9–1.6) remained on failing first-line regimens, 9.2% (95% CI 8.5–9.8) were lost to follow-up and 4.3% (95% CI 3.9–4.8) had died. In Malawi and Zambia, more patients were on a failing first-line regimen [3.7% (95% CI 3.6–3.9)], fewer patients had switched [2.1% (95% CI 2.0–2.3)] and more patients were lost to follow-up [15.3% (95% CI 15.0–15.6)] or had died [6.3% (95% CI 6.0–6.5)]. Median CD4 cell counts were lower in South Africa at the start of ART (93 vs. 132 cells/μl; P < 0.001) but higher after 3 years (425 vs. 383 cells/μl; P < 0.001). The hazard ratio comparing South Africa with Malawi and Zambia, after adjusting for age, sex, first-line regimen and CD4 cell count, was 0.58 (0.50–0.66) for death and 0.53 (0.48–0.58) for loss to follow-up. Conclusion: Over 3 years of ART, mortality was lower in South Africa than in Malawi or Zambia. The more favourable outcome in South Africa might be explained by viral load monitoring leading to earlier detection of treatment failure, adherence counselling and timelier switching to second-line ART.