980 results for 612 Kielitieteet, kirjallisuus (Linguistics, literature)


Abstract:

BACKGROUND Newer-generation everolimus-eluting stents (EES) improve clinical outcomes compared with early-generation sirolimus-eluting stents (SES) and paclitaxel-eluting stents (PES). We investigated whether this advantage in safety and efficacy also holds in the high-risk population of diabetic patients during long-term follow-up. METHODS Between 2002 and 2009, a total of 1963 consecutive diabetic patients treated with the unrestricted use of EES (n=804), SES (n=612) and PES (n=547) were followed for three years for the occurrence of cardiac events at two academic institutions. The primary end point was the occurrence of definite stent thrombosis (ST). RESULTS The primary outcome occurred in 1.0% of EES-, 3.7% of SES- and 3.8% of PES-treated patients ([EES vs. SES] adjusted HR=0.58, 95% CI 0.39-0.88; [EES vs. PES] adjusted HR=0.29, 95% CI 0.13-0.67). Similarly, patients treated with EES had a lower risk of target-lesion revascularization (TLR) than patients treated with SES or PES ([EES vs. SES] 5.6% vs. 11.5%, adjusted HR=0.68, 95% CI 0.55-0.83; [EES vs. PES] 5.6% vs. 11.3%, adjusted HR=0.51, 95% CI 0.33-0.77). There were no differences in other safety end points, such as all-cause mortality, cardiac mortality, myocardial infarction (MI) and major adverse cardiac events (MACE). CONCLUSION In diabetic patients, the unrestricted use of EES appears to be associated with improved outcomes, specifically a significant reduction in TLR and ST compared with early-generation SES and PES through three years of follow-up.
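As a rough plausibility check, the crude event counts implied by the reported percentages can be reconstructed with simple arithmetic. The sketch below is illustrative only; the crude risk ratio it computes ignores the confounder adjustment behind the hazard ratios reported in the abstract.

```python
def crude_events(n_patients, pct):
    """Approximate event count implied by a reported percentage."""
    return round(n_patients * pct / 100)

# Definite stent thrombosis counts implied by the abstract's crude rates:
ees_st = crude_events(804, 1.0)   # EES
ses_st = crude_events(612, 3.7)   # SES
pes_st = crude_events(547, 3.8)   # PES

# Crude (unadjusted) risk ratio, EES vs. SES; this intentionally differs
# from the adjusted hazard ratio reported in the abstract:
rr_ees_vs_ses = (ees_st / 804) / (ses_st / 612)
```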

Abstract:

INTRODUCTION The aims of this study were to compare lateral cephalograms with other radiologic methods for diagnosing suspected fusions of the cervical spine and to validate the assessment of congenital fusions and osteoarthritic changes against the anatomic truth. METHODS Four cadaver heads were selected with fusion of vertebrae C2 and C3 seen on a lateral cephalogram. Multidetector computed tomography (MDCT) and cone-beam computed tomography (CBCT) were performed and assessed by 5 general radiologists and 5 oral radiologists, respectively. Vertebrae C2 and C3 were examined for osseous fusions, and the left and right facet joints were assessed for osteoarthritis. Subsequently, vertebrae C2 and C3 were macerated and appraised by a pathologist. Descriptive analysis was performed, and interrater agreement between and within the groups was computed. RESULTS All macerated specimens showed osteoarthritic findings of varying degrees, but no congenital bony fusion. All observers agreed that no fusion was present on MDCT or CBCT. They disagreed on the prevalence of osteoarthritic deformities (general radiologists/MDCT, 100%; oral radiologists/CBCT, 93.3%) and on joint-space assessment in the facet joints (kappa = 0.452). Agreement within the rater groups differed considerably (general radiologists/MDCT, kappa = 0.612; oral radiologists/CBCT, kappa = 0.240). CONCLUSIONS Lateral cephalograms do not provide dependable data for assessing the cervical spine for fusions and can cause false-positive detections. Both MDCT interpreted by general radiologists and CBCT interpreted by oral radiologists are reliable methods for excluding potential fusions. Degenerative osteoarthritic changes are diagnosed more accurately and consistently by general radiologists evaluating MDCT.
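The interrater agreement reported above is Cohen's kappa, which corrects the observed agreement between two raters for the agreement expected by chance. A minimal sketch of the standard formula follows; the example ratings are invented for illustration, not data from the study.

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e), where p_o is
    observed agreement and p_e is chance agreement from marginal frequencies."""
    n = len(ratings_a)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    categories = set(ratings_a) | set(ratings_b)
    p_e = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical facet-joint calls ("oa" = osteoarthritis) from two raters:
rater1 = ["oa", "oa", "none", "oa"]
rater2 = ["oa", "none", "none", "oa"]
```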

Abstract:

INTRODUCTION Low systolic blood pressure (SBP) is an important secondary insult following traumatic brain injury (TBI), but its exact relationship with outcome is not well characterised. Although an SBP of <90 mmHg represents the threshold for hypotension in consensus TBI treatment guidelines, recent studies suggest redefining hypotension at higher levels. This study therefore aimed to fully characterise the association between admission SBP and mortality to further inform resuscitation end points. METHODS We conducted a multicentre cohort study using data from the largest European trauma registry. Consecutive adult patients with AIS head scores >2 admitted directly to specialist neuroscience centres between 2005 and July 2012 were studied. Multilevel logistic regression models were developed to examine the association between admission SBP and 30-day inpatient mortality. Models were adjusted for confounders, including age and severity of injury, and accounted for differences in the quality of hospital care. RESULTS 5057 patients were included in complete-case analyses. Admission SBP showed a smooth U-shaped association with outcome in bivariate analysis, with increasing mortality at both lower and higher values and no evidence of a threshold effect. Adjusting for confounding slightly attenuated the association between mortality and SBP at levels <120 mmHg, and abolished the relationship at higher SBP values. Case-mix-adjusted odds of death were 1.5 times greater at <120 mmHg, doubled at <100 mmHg, tripled at <90 mmHg, and six times greater at SBP <70 mmHg (p<0.01). CONCLUSIONS These findings indicate that TBI studies should model SBP as a continuous variable and may suggest that current TBI treatment guidelines, which use a cut-off for hypotension at SBP <90 mmHg, should be reconsidered.
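One simple way to model SBP as a continuous variable with a U-shaped risk, as the conclusion recommends, is to include a quadratic term on the logit scale. The sketch below uses entirely hypothetical coefficients; the study itself used multilevel models, and splines are a common alternative to a plain quadratic.

```python
import math

def mortality_prob(sbp, intercept=-3.0, curvature=0.0004, nadir=120.0):
    """Toy logistic model: log-odds of death are quadratic in SBP, giving a
    U-shape with minimum risk at `nadir` mmHg. All parameters are hypothetical
    illustrative values, not estimates from the registry."""
    logit = intercept + curvature * (sbp - nadir) ** 2
    return 1.0 / (1.0 + math.exp(-logit))
```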

Abstract:

BACKGROUND Acute cardiogenic shock after myocardial infarction is associated with high in-hospital mortality attributable to persistent low cardiac output. The Impella-EUROSHOCK registry evaluates the safety and efficacy of the Impella-2.5 percutaneous left-ventricular assist device in patients with cardiogenic shock after acute myocardial infarction. METHODS AND RESULTS This multicenter registry retrospectively included 120 patients (63.6±12.2 years; 81.7% male) with cardiogenic shock from acute myocardial infarction receiving temporary circulatory support with the Impella-2.5 percutaneous left-ventricular assist device. The primary end point was mortality at 30 days. Secondary end points were the change in plasma lactate after institution of hemodynamic support, the rate of early major adverse cardiac and cerebrovascular events, and long-term survival. Thirty-day mortality was 64.2% in the study population. After Impella-2.5 implantation, lactate levels decreased from 5.8±5.0 mmol/L to 4.7±5.4 mmol/L (P=0.28) at 24 hours and 2.5±2.6 mmol/L (P=0.023) at 48 hours. Early major adverse cardiac and cerebrovascular events were reported in 18 (15%) patients. Major bleeding at the vascular access site, hemolysis, and pericardial tamponade occurred in 34 (28.6%), 9 (7.5%), and 2 (1.7%) patients, respectively. Age >65 years and lactate level >3.8 mmol/L at admission were identified as predictors of 30-day mortality. After 317±526 days of follow-up, survival was 28.3%. CONCLUSIONS In patients with acute cardiogenic shock from acute myocardial infarction, Impella-2.5 treatment is feasible and results in a reduction of lactate levels, suggesting improved organ perfusion. However, 30-day mortality remains high in these patients. This likely reflects the last-resort character of Impella-2.5 use in selected patients with a poor hemodynamic profile and a high imminent risk of death. Carefully conducted randomized controlled trials are necessary to evaluate the efficacy of Impella-2.5 support in this high-risk patient group.
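The lactate trajectory can be restated as a relative change from baseline. A quick arithmetic sketch using the mean values reported in the registry:

```python
def pct_change(baseline, value):
    """Relative change from baseline, in percent (negative = decrease)."""
    return 100.0 * (value - baseline) / baseline

baseline, h24, h48 = 5.8, 4.7, 2.5  # mean plasma lactate, mmol/L

drop_24h = pct_change(baseline, h24)  # roughly a 19% fall at 24 hours
drop_48h = pct_change(baseline, h48)  # roughly a 57% fall at 48 hours
```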

Abstract:

Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings need guidance on the most cost-effective use of resources in view of competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient-monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs, and quantified impact as disability-adjusted life-years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers a substantial benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell counts every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility.
Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, first at CD4 counts below 350 cells per μL and then at CD4 counts below 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other cost-reducing factors might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
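The comparison above rests on incremental cost-effectiveness analysis: each strategy is judged by the extra cost per extra DALY averted relative to the next-best alternative. A minimal sketch of the ratio, with invented numbers (not figures from the study):

```python
def icer(cost_b, cost_a, dalys_averted_b, dalys_averted_a):
    """Incremental cost-effectiveness ratio of strategy B over strategy A:
    extra cost per additional DALY averted."""
    return (cost_b - cost_a) / (dalys_averted_b - dalys_averted_a)

# Hypothetical programme costs (USD) and DALYs averted, for illustration only:
viral_load_vs_cd4 = icer(1_200_000, 1_000_000, 1_400, 1_000)
```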

Abstract:

A critical step for speciation in the face of gene flow is the origin of reproductive isolation. The evolution of assortative mating greatly facilitates this process. Assortative mating can be mediated by one or multiple cues across an array of sensory modalities. Here we explore the cues that may underlie female mate choice in a sympatric species pair of cichlid fish from Lake Victoria, Pundamilia pundamilia and Pundamilia nyererei. Previous studies identified species-specific female preferences for male coloration, but effects of other cues could not be ruled out. We therefore assessed female choice in a series of experiments in which we manipulated visual (color) and chemical cues. We show that the visibility of the difference in nuptial hue (blue versus red) between males of the two species is necessary and sufficient for assortative mating by female mate choice. Such assortment mediated by a single cue may evolve relatively quickly, but could leave reproductive isolation vulnerable to environmental change. These findings confirm the important role of female mate choice for male nuptial hue in promoting the explosive speciation of African haplochromine cichlids.
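The single-cue result can be caricatured with a toy simulation: when the color cue is visible, each female picks the conspecific hue with some preference strength; when the cue is masked, choice collapses to chance. All numbers below are hypothetical, not estimates from the experiments.

```python
import random

def assortative_fraction(pref, n_females=10_000, seed=42):
    """Toy model: each female chooses between one blue and one red male and
    picks the conspecific hue with probability `pref` (hypothetical).
    Returns the fraction of assortative (within-species) matings."""
    rng = random.Random(seed)
    hits = sum(rng.random() < pref for _ in range(n_females))
    return hits / n_females

color_visible = assortative_fraction(pref=0.9)  # strong assortment
color_masked = assortative_fraction(pref=0.5)   # random mating
```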

Abstract:

Voltage-dependent calcium channels (VDCCs) serve a wide range of physiological functions, and their activity is modulated by different neurotransmitter systems. GABAergic inhibition of VDCCs in neurons has an important impact on controlling transmitter release, neuronal plasticity, gene expression and neuronal excitability. We investigated the molecular signalling mechanisms by which GABAB receptors inhibit calcium-mediated electrogenesis (Ca2+ spikes) in the distal apical dendrite of cortical layer 5 pyramidal neurons. Ca2+ spikes are the basis of the coincidence detection and signal amplification of distal tuft synaptic inputs that are characteristic of the computational function of cortical pyramidal neurons. By combining dendritic whole-cell recordings with two-photon fluorescence Ca2+ imaging, we found that all subtypes of VDCCs were present in the Ca2+ spike initiation zone, but that they contributed differently to the initiation and sustaining of dendritic Ca2+ spikes. In particular, Cav1 channels were the most abundant VDCCs in this dendritic compartment and generated the sustained plateau potential characteristic of the Ca2+ spike. Activation of GABAB receptors specifically inhibited Cav1 channels. This inhibition of L-type Ca2+ currents was transiently relieved by strong depolarization but did not depend on protein kinase activity. Our findings therefore suggest a novel membrane-delimited interaction of the Gi/o βγ-subunit with Cav1 channels, identifying this mechanism as the general pathway of GABAB receptor-mediated inhibition of VDCCs. Furthermore, the characterization of the contributions of the different VDCCs to the generation of the Ca2+ spike provides new insight into the molecular mechanisms of dendritic computation.
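Voltage-dependent channel activation of the kind discussed here is conventionally summarized by a Boltzmann function of membrane potential. The sketch below uses generic, hypothetical parameters rather than measured Cav1 values:

```python
import math

def boltzmann_activation(v_mV, v_half=-20.0, slope=6.0):
    """Steady-state activation of a voltage-gated channel as a Boltzmann
    function of membrane potential: 1 / (1 + exp(-(V - V_half) / k)).
    v_half and slope are illustrative values, not Cav1 measurements."""
    return 1.0 / (1.0 + math.exp(-(v_mV - v_half) / slope))
```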

Abstract:

Identifying the drivers of species diversity is a major challenge in understanding and predicting the dynamics of species-rich semi-natural grasslands. In temperate grasslands in particular, changes in land use and their consequences, i.e. increasing fragmentation, ongoing habitat loss and the declining importance of regional processes such as seed dispersal by livestock, are considered key drivers of the diversity loss witnessed in recent decades. To what degree current temperate grassland communities already reflect a decline of regional processes such as longer-distance seed dispersal remains largely unresolved. Answering this question is challenging because it requires both a mechanistic approach to community dynamics and a data basis sufficient to identify general patterns. Here, we present results of a local individual- and trait-based community model initialized with plant functional types (PFTs) derived from an extensive empirical data set of species-rich grasslands within the 'Biodiversity Exploratories' in Germany. Model processes included above- and belowground competition, dynamic resource allocation to shoots and roots, clonal growth, grazing, and local seed dispersal. To test for the impact of regional processes, we also simulated seed input from a regional species pool. Model output, with and without regional seed input, was compared with empirical community response patterns along a grazing gradient. Simulated response patterns of changes in PFT richness, Shannon diversity, and biomass production matched the observed grazing responses surprisingly well when only local processes were considered. Even low levels of additional regional seed input led to stronger deviations from the empirical community patterns.
While these findings cannot rule out that regional processes other than those considered in the modelling study play a role in shaping local grassland communities, our comparison indicates that European grasslands are largely isolated, i.e. local mechanisms explain the observed community patterns to a large extent.
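The local-versus-regional contrast can be caricatured with a tiny zero-sum community model: individuals die at random and are replaced either by local recruitment or, with some probability, by a seed from a larger regional pool. This is a deliberately minimal toy, not the PFT-based model used in the study, and all parameters are invented.

```python
import random

def final_richness(regional_rate, n_individuals=200, n_species=20,
                   pool_species=60, steps=2000, seed=1):
    """Toy zero-sum dynamics: each step one random individual dies and is
    replaced by a copy of a random local individual (local recruitment) or,
    with probability `regional_rate`, by a random species drawn from a
    larger regional pool (regional seed input). Returns final richness."""
    rng = random.Random(seed)
    community = [i % n_species for i in range(n_individuals)]
    for _ in range(steps):
        dead = rng.randrange(n_individuals)
        if rng.random() < regional_rate:
            community[dead] = rng.randrange(pool_species)
        else:
            community[dead] = rng.choice(community)
    return len(set(community))

local_only = final_richness(regional_rate=0.0)  # drift can only lose species
with_input = final_richness(regional_rate=0.2)  # immigration replenishes
```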
