599 results for Rebecca Salsburry
Abstract:
OBJECTIVES This study sought to evaluate: 1) the effect of impaired renal function on long-term clinical outcomes in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES); and 2) the safety and efficacy of new-generation compared with early-generation DES in women with chronic kidney disease (CKD). BACKGROUND The prevalence and effect of CKD in women undergoing PCI with DES are unclear. METHODS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by creatinine clearance (CrCl): <45 ml/min, 45 to 59 ml/min, and ≥60 ml/min. The primary endpoint was the 3-year rate of major adverse cardiovascular events (MACE). Participants for whom baseline creatinine was missing were excluded from the analysis. RESULTS Of the 4,217 women in the pooled cohort treated with DES for whom serum creatinine was available, 603 (14%) had a CrCl <45 ml/min, 811 (19%) had a CrCl of 45 to 59 ml/min, and 2,803 (66%) had a CrCl ≥60 ml/min. A significant stepwise gradient in MACE risk was observed with worsening renal function (26.6% vs. 15.8% vs. 12.9%; p < 0.01). After multivariable adjustment, CrCl <45 ml/min was independently associated with a higher risk of MACE (adjusted hazard ratio: 1.56; 95% confidence interval: 1.23 to 1.98) and all-cause mortality (adjusted hazard ratio: 2.67; 95% confidence interval: 1.85 to 3.85). Compared with early-generation DES, new-generation DES were associated with a reduced risk of cardiac death, myocardial infarction, or stent thrombosis in women with CKD. The effect of new-generation DES on outcomes was uniform between women with and without CKD, with no evidence of interaction. CONCLUSIONS Among women undergoing PCI with DES, CKD is a common comorbidity associated with a strong, independent risk of MACE that persists over 3 years. The benefits of new-generation DES are uniform in women with and without CKD.
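The abstract does not state which estimating equation was used for creatinine clearance; the Cockcroft-Gault formula is the conventional estimator of CrCl (as opposed to eGFR equations) and is shown here purely as an illustration, with the 0.85 correction factor for women (weight in kg, serum creatinine in mg/dl):

\[ \mathrm{CrCl}\ (\text{ml/min}) = \frac{(140 - \text{age}) \times \text{weight}}{72 \times \text{serum creatinine}} \times 0.85 \]

For example, a hypothetical 70-year-old woman weighing 65 kg with a serum creatinine of 1.4 mg/dl would have an estimated CrCl of (140 − 70) × 65 / (72 × 1.4) × 0.85 ≈ 38 ml/min, placing her in the <45 ml/min stratum.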
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors are unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by the presence or absence of high ATR, defined as a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of the 10,449 women in the pooled database, 5,333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had a significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In women at high ATR, the use of new-generation DES was associated with a significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, with no evidence of interaction (P for interaction=0.14). In a landmark analysis of women at high ATR, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years stent thrombosis risk was lower with new-generation devices. CONCLUSIONS The use of new-generation DES, even in women at high ATR, is associated with a benefit that is consistent over 3 years of follow-up and with a substantial improvement in very late thrombotic safety.
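The adjusted hazard ratios and the landmark analysis described above follow a standard survival-analysis workflow. A minimal sketch using the lifelines library is given below; all column names (time_yr, event, new_gen_des, age, diabetes) and the file name are hypothetical, since the pooled patient-level dataset is not public.

```python
# Sketch of an adjusted Cox model and a 1-year landmark analysis.
# Column and file names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pooled_women_des.csv")  # hypothetical input

# Adjusted hazard ratio for new- vs. early-generation DES over 0-3 years
cph = CoxPHFitter()
cph.fit(df[["time_yr", "event", "new_gen_des", "age", "diabetes"]],
        duration_col="time_yr", event_col="event")
print("0-3 y adjusted HR:", cph.hazard_ratios_["new_gen_des"])

# Landmark analysis at 1 year: keep only patients still event-free at
# 1 year, reset the clock, and model events between years 1 and 3.
landmark = df[df["time_yr"] > 1.0].copy()
landmark["time_yr"] -= 1.0
cph_lm = CoxPHFitter()
cph_lm.fit(landmark[["time_yr", "event", "new_gen_des", "age", "diabetes"]],
           duration_col="time_yr", event_col="event")
print("1-3 y adjusted HR:", cph_lm.hazard_ratios_["new_gen_des"])
```

The landmark split is what lets the first-year and late (1-3 year) stent thrombosis risks be assessed separately, as reported in the abstract.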
Abstract:
QUESTIONS UNDER STUDY Many people travel all over the world; the elderly with pre-existing diseases also travel to places with less developed health systems. Reportedly, fewer than 0.5% of all travellers need repatriation. We aimed to examine travellers who were injured or fell ill while abroad, where they had travelled to, and by what means they were repatriated. METHODS Retrospective cross-sectional study of adult patients repatriated to a single level 1 trauma centre in Switzerland (2000-2011). RESULTS A total of 372 patients were repatriated, with an increasing trend per year. Of these, 67% were male; the median age was 56 years. Forty-nine percent had sustained an injury, 13% had surgical pathologies, and 38% had medical pathologies. Patients with medical conditions were older than those with injuries or surgical emergencies (p <0.001). Seventy-three percent were repatriated from Europe. For repatriations from Africa, trauma was slightly more frequent (53%, n = 17) than illness, whereas for most other countries illnesses and trauma were equally distributed. Injured patients had a median Injury Severity Score of 8. The majority of illnesses involved the nervous system (38%), mainly stroke. Forty-five percent were repatriated by Swiss Air Ambulance, 26% by ground ambulance, 18% by scheduled flights with or without medical assistance, and two patients injured near the Swiss border were repatriated by helicopter. The 28-day mortality was 4%. CONCLUSIONS The number of travellers repatriated increased from 2000 to 2011. About half of repatriations were due to illness and half to injury. The largest group were elderly Swiss nationals repatriated from European countries. As mortality is relatively high, this group of patients warrants special consideration.
Abstract:
BACKGROUND Since the introduction of helmets in winter sports, there has been ongoing debate about whether they decrease traumatic brain injuries (TBI). METHODS This cohort study included 117 adult (≥ 16 years) snowboarders with TBI admitted to a level I alpine trauma center in Switzerland between 2000/2001 and 2010/2011. The primary objective was to examine the association between helmet use and moderate-to-severe TBI. Secondary objectives were to describe the epidemiology of TBI during the past decade in relation to increased helmet use. RESULTS Of 691 injured snowboarders evaluated, 117 (17%) suffered TBI. Sixty-six percent were men (median age, 23 years). Two percent of accidents were fatal. Ninety-two percent of patients sustained minor, 1% moderate, and 7% severe TBI according to the Glasgow Coma Scale. Pathologic computed tomography findings were present in 16% of patients, 26% of whom required surgery. Eighty-three percent of TBIs occurred while riding on-slope. There was no trend in the TBI rate during the studied period, although helmet use increased from 10% to 69%. Comparing patients with and without a helmet showed no significant difference in the odds of moderate-to-severe TBI. However, of the 5 patients requiring surgery, only 1 was wearing a helmet. Off-piste snowboarders, compared with on-slope snowboarders, had an odds ratio of 26.5 (P = 0.003) for sustaining a moderate-to-severe TBI. CONCLUSIONS Despite increased helmet use, we found no decrease in TBI among snowboarders. The possibility of TBI despite helmet use and the dangers of riding off-piste should be a focus of future prevention programs.
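For reference, the odds ratio reported here is the standard cross-product ratio of a 2×2 table; the abstract does not give the individual cell counts, so the labels below are generic:

\[ \mathrm{OR} = \frac{a/b}{c/d} = \frac{a\,d}{b\,c} \]

where a and b are the numbers of off-piste riders with and without moderate-to-severe TBI, and c and d are the corresponding on-slope counts.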
Abstract:
Increasing time-on-task leads to fatigue and, as shown by previous research, differentially affects the deployment of visual attention towards the left and the right visual space. In healthy participants, an increasing rightward bias is commonly observed with increasing time-on-task. Yet, it is unclear whether specific mechanisms involved in the spatial deployment of visual attention are differentially affected by increasing time-on-task. The aim of the present study was to investigate whether prolonged time-on-task affects a specific mechanism of visuo-spatial attentional deployment, namely attentional disengagement, in an asymmetrical fashion. For this purpose, we administered a prolonged gap/overlap saccadic paradigm, with left- and right-sided target stimuli, to healthy participants. This oculomotor paradigm allowed us to quantify disengagement costs according to the direction of the subsequent attentional shift and to evaluate how disengagement costs developed with increasing time-on-task. Our results show that, with increasing time-on-task, participants demonstrated significantly lower disengagement costs for rightward than for leftward saccades. These effects were specific: corresponding side differences in saccadic latency were found for overlap trials (which require attentional disengagement) but not for gap trials (which require no or less attentional disengagement). Moreover, the results were paralleled by a non-lateralised decrease in saccadic peak velocity with increasing time-on-task, a common finding indicating an increasing level of fatigue. Our findings support the idea that non-spatial attentional aspects, such as fatigue due to increasing time-on-task, can have a substantial influence on the spatial deployment of visual attention, in particular on its disengagement, depending on the direction of the subsequent attentional shift.
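The abstract does not spell out how disengagement costs were computed; in gap/overlap paradigms they are conventionally defined as the latency difference between overlap trials (where a fixation stimulus must first be disengaged from) and gap trials, calculated separately for each saccade direction:

\[ \text{cost}_{\text{dir}} = \overline{\mathrm{SRT}}_{\text{overlap,dir}} - \overline{\mathrm{SRT}}_{\text{gap,dir}}, \quad \text{dir} \in \{\text{left}, \text{right}\} \]

where SRT denotes saccadic reaction time. Under this conventional definition, a progressively lower cost for rightward saccades with time-on-task corresponds to the asymmetry reported above.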
Abstract:
The European Eye Epidemiology (E3) consortium is a recently formed consortium of 29 groups from 12 European countries. It already comprises 21 population-based studies and 20 other studies (case-control, cases-only, randomized trials), providing ophthalmological data on approximately 170,000 European participants. The aim of the consortium is to promote and sustain collaboration and the sharing of data and knowledge in the field of ophthalmic epidemiology in Europe, with particular focus on: the harmonization of methods for future research; the estimation and projection of the frequency and impact of visual outcomes in European populations (including temporal trends and European subregions); the identification of risk factors and pathways for eye diseases (lifestyle, vascular and metabolic factors, genetics, epigenetics and biomarkers); and the development and validation of prediction models for eye diseases. Coordinating these existing data will allow a detailed study of the risk factors and consequences of eye diseases and visual impairment, including study of international geographical variation, which is not possible in individual studies. Collaborative work on these existing data is expected to provide additional knowledge, even though the risk factors and the methods for collecting them differ somewhat among the participating studies. Most studies also include biobanks of various biological samples, which will enable the identification of biomarkers to detect and predict the occurrence and progression of eye diseases. This article outlines the rationale and design of the consortium and presents a summary of its methodology.
Abstract:
Dysregulation of sleep or feeding has enormous health consequences. In humans, acute sleep loss is associated with increased appetite and insulin insensitivity, while chronically sleep-deprived individuals are more likely to develop obesity, metabolic syndrome, type 2 diabetes, and cardiovascular disease. Conversely, metabolic state potently modulates sleep and circadian behavior; yet the molecular basis for sleep-metabolism interactions remains poorly understood. Here, we describe the identification of translin (trsn), a highly conserved RNA/DNA binding protein, as essential for starvation-induced sleep suppression. Strikingly, trsn does not appear to regulate energy stores, free glucose levels, or feeding behavior, suggesting that the sleep phenotype of trsn mutant flies is not a consequence of general metabolic dysfunction or a blunted response to starvation. While broadly expressed in all neurons, trsn is transcriptionally upregulated in the heads of flies in response to starvation. Spatially restricted rescue and targeted knockdown localize trsn function to neurons that produce the tachykinin-family neuropeptide Leucokinin. Manipulation of neural activity in Leucokinin neurons revealed that these neurons are required for starvation-induced sleep suppression. Taken together, these findings establish trsn as an essential integrator of sleep and metabolic state, with implications for understanding the neural mechanisms underlying sleep disruption in response to environmental perturbation.
Abstract:
BACKGROUND: Bioluminescence imaging is widely used for cell-based assays and animal imaging studies, both in biomedical research and in drug development. Its main advantages include high-throughput applicability, affordability, high sensitivity, operational simplicity, and quantitative outputs. In malaria research, bioluminescence has been used for drug discovery in vivo and in vitro, for exploring host-pathogen interactions, and for studying multiple aspects of Plasmodium biology. While the number of fluorescent proteins available for imaging has expanded greatly over the last two decades, enabling simultaneous visualization of multiple molecular and cellular events, the expansion of available luciferases has lagged behind. The most widely used bioluminescent probe in malaria research is the Photinus pyralis firefly luciferase, followed by the more recently introduced click beetle and Renilla luciferases. Ultra-sensitive imaging of Plasmodium at low parasite densities has not previously been achieved. To overcome these challenges, a Plasmodium berghei line expressing the novel ultra-bright luciferase enzyme NanoLuc, called PbNLuc, has been generated and is presented in this work. RESULTS: NanoLuc shows a signal at least 150 times brighter than firefly luciferase in vitro, allowing single-parasite detection in mosquito, liver, and sexual and asexual blood stages. As a proof of concept, the PbNLuc parasites were used to image parasite development in the mosquito, liver, and blood stages of infection, and specifically to explore parasite liver-stage egress and the pre-patency period in vivo. CONCLUSIONS: PbNLuc is a suitable parasite line for sensitive imaging of the entire Plasmodium life cycle. Its sensitivity makes it a promising reference line for drug candidate testing, as well as for the characterization of mutant parasites to explore the function of parasite proteins and host-parasite interactions and to better understand Plasmodium biology. Since the substrate requirements of NanoLuc differ from those of firefly luciferase, dual bioluminescence imaging for the simultaneous characterization of two lines, or of two separate biological processes, is possible, as demonstrated in this work.
Abstract:
This protocol describes a method for obtaining rodent Plasmodium parasite clones with high efficiency, which takes advantage of the normal course of Plasmodium exoerythrocytic development in vitro. At the completion of development, detached cells/merosomes form, each containing hundreds to thousands of merozoites. As all parasites within a single detached cell/merosome derive from the same sporozoite, we predicted them to be genetically identical. To prove this, hepatoma cells were infected simultaneously with a mixture of Plasmodium berghei sporozoites expressing either GFP or mCherry. Subsequently, individual detached cells/merosomes from this mixed population were selected and injected into mice, resulting in clonal blood-stage parasite infections. Importantly, as a large majority of mice become successfully infected using this protocol, significantly fewer mice are necessary than for the widely used technique of limiting dilution cloning. Producing a clonal P. berghei blood-stage infection from a non-clonal infection using this procedure requires between 4 and 5 weeks.