824 results for Cohort Extinction
Abstract:
Infectious diseases (ID) are a major cause of morbidity and mortality after solid organ transplantation (SOT). Since May 2008, the Swiss Transplant Cohort Study (STCS) has registered 95% of all SOT recipients in Switzerland. The extensive data set includes pre- and post-transplant variables that are prospectively collected at transplantation, 6 months post-transplant, and yearly thereafter. All ID events are recorded using internationally validated definitions. We obtained data from 1101 patients (79 heart, 685 kidney, 29 kidney-pancreas, 212 liver, and 96 lung transplants). So far, the median observation times were 0.8 years (IQR 0.3-1.4) for heart, 1.1 (0.6-1.8) for kidney, 1.1 (0.6-1.9) for kidney-pancreas, 1.0 (0.5-1.7) for liver, and 0.9 (0.5-1.5) for lung transplants. The highest rate of proven or probable ID events was seen in lung transplant recipients (76%), followed by liver (64%), heart (62%), kidney-pancreas (62%), and kidney (58%) recipients. During the observation period, ID was the cause of death in 19 patients (1.7%). Rates of infection per person-year by pathogen and type of transplantation are shown in Figure 1. The data indicate that viral infections are second only to bacterial infections, whereas fungal infections occur at relatively low rates. This prospective and standardized long-term collection of all ID events will allow a comprehensive assessment of the burden of ID across all SOT types in Switzerland. Regular analysis will identify new trends, serve as quality control, and help design anti-infectious interventions aimed at increasing safety and improving overall transplantation outcomes.
Abstract:
BACKGROUND: Genotypes obtained with commercial SNP arrays have been extensively used in many large case-control or population-based cohorts for SNP-based genome-wide association studies for a multitude of traits. Yet, these genotypes capture only a small fraction of the variance of the studied traits. Genomic structural variants (GSV) such as Copy Number Variation (CNV) may account for part of the missing heritability, but their comprehensive detection requires either next-generation arrays or sequencing. Sophisticated algorithms that infer CNVs by combining the intensities from SNP probes for the two alleles can already be used to extract a partial view of such GSV from existing data sets. RESULTS: Here we present several advances to facilitate the latter approach. First, we introduce a novel CNV detection method based on a Gaussian Mixture Model. Second, we propose a new algorithm, PCA merge, for combining copy-number profiles from many individuals into consensus regions. We applied both our new methods as well as existing ones to data from 5612 individuals from the CoLaus study who were genotyped on Affymetrix 500K arrays. We developed a number of procedures to evaluate the performance of the different methods, including comparison with previously published CNVs and the use of a replication sample of 239 individuals genotyped on Illumina 550K arrays. We also established a new evaluation procedure that exploits the fact that related individuals are expected to share their CNVs more frequently than randomly selected individuals. The ability to detect both rare and common CNVs provides a valuable resource that will facilitate association studies exploring potential phenotypic associations with CNVs. CONCLUSION: Our new methodologies for CNV detection and their evaluation will help to extract additional information from the large amount of SNP-genotyping data available for various cohorts and to use it to explore structural variants and their impact on complex traits.
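The abstract does not spell out the model, but the general idea of GMM-based CNV detection can be sketched as follows: per-probe intensity summaries (here, simulated log R ratios) are clustered into copy-number states, and runs of probes assigned to non-diploid states become candidate CNV calls. This is only an illustrative sketch under assumed parameters, not the authors' implementation, and it omits the PCA merge step.

```python
# Minimal, illustrative sketch of GMM-based copy-number state assignment from
# SNP-array intensity summaries (log R ratio, LRR). This is NOT the authors'
# algorithm; the three-component model and all parameters are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulated per-probe LRR values for one genomic region in one individual:
# deletions shift LRR down, duplications shift it up, diploid probes sit near 0.
lrr = np.concatenate([
    rng.normal(-0.6, 0.15, 50),   # hemizygous deletion (copy number 1)
    rng.normal(0.0, 0.15, 400),   # normal diploid state (copy number 2)
    rng.normal(0.4, 0.15, 30),    # duplication (copy number 3)
]).reshape(-1, 1)

# Fit a three-state Gaussian mixture (loss / normal / gain).
gmm = GaussianMixture(n_components=3, n_init=5, random_state=0).fit(lrr)

# Order components by mean so that state 0 = loss, 1 = normal, 2 = gain.
order = np.argsort(gmm.means_.ravel())
state_of = {comp: state for state, comp in enumerate(order)}
states = np.array([state_of[c] for c in gmm.predict(lrr)])

# Probes assigned to non-diploid states are candidate CNV probes; consecutive
# runs of such probes would then be merged into CNV calls.
print("candidate deletion probes:", int(np.sum(states == 0)))
print("candidate duplication probes:", int(np.sum(states == 2)))
```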
Abstract:
Introduction: Venlafaxine (Efexor®) is a serotonin and noradrenaline reuptake inhibitor (SNRI) used for the treatment of depression and anxiety disorders. The limited data on the use of venlafaxine in human pregnancy do not indicate an increased risk of congenital malformations. The main purpose of this study was to assess the rate of major malformations after first-trimester exposure to venlafaxine. Methods: This multicenter, prospective cohort study was performed using data from nine centers that are members of the European Network of Teratology Information Services (ENTIS). Data on pregnancy and pregnancy outcome of women who used venlafaxine in pregnancy were collected during individual risk counseling. Standardized procedures for data collection and follow-up were used by each center. Results: Follow-up data were collected on 744 pregnancies of women who used venlafaxine during gestation. In 583 (78.4%) cases the exposure had occurred at least in the first trimester. In total, there were 600 live births (5 twins), 85 spontaneous abortions, 57 elective terminations of pregnancy, 5 fetal deaths, and 2 ectopic pregnancies. The overall rate of major malformations after first-trimester exposure, excluding chromosomal and genetic disorders, was 3.2% (16/500) in all pregnancies ending in delivery, pregnancy termination, or fetal death with fetal-pathological examination. Among live births, the malformation rate was 2.7% (13/490). We observed no increased risk of organ-specific malformations. Conclusions: The present study indicates that venlafaxine is not a major human teratogen.
Abstract:
BACKGROUND: Guidelines for the management of anaemia in patients with chronic kidney disease (CKD) recommend a minimal haemoglobin (Hb) target of 11 g/dL. Recent surveys indicate that this requirement is not met in many patients in Europe. In most studies, Hb is only assessed over a short-term period. The aim of this study was to examine the control of anaemia over a continuous long-term period in Switzerland. METHODS: A prospective multi-centre observational study was conducted in dialysed patients treated with recombinant human epoetin (EPO) beta, over a one-year follow-up period, with monthly assessments of anaemia parameters. RESULTS: Three hundred and fifty patients from 27 centres, representing 14% of the dialysis population in Switzerland, were included. Mean Hb was 11.9 ± 1.0 g/dL and remained stable over time. Eighty-five percent of the patients achieved a mean Hb ≥ 11 g/dL. The mean EPO dose was 155 ± 118 IU/kg/week, delivered mostly by the subcutaneous route (64-71%). Mean serum ferritin and transferrin saturation were 435 ± 253 µg/L and 30 ± 11%, respectively. At month 12, adequate iron stores were found in 72.5% of patients, whereas absolute and functional iron deficiencies were observed in only 5.1% and 17.8%, respectively. Multivariate analysis showed that diabetes unexpectedly influenced Hb towards higher levels (12.1 ± 0.9 g/dL; p = 0.02). One-year survival was significantly higher in patients with Hb ≥ 11 g/dL than in those with Hb < 11 g/dL (19.7% vs 7.3%, p = 0.006). CONCLUSION: In comparison to European reference studies, this survey shows remarkable and continuous control of anaemia in Swiss dialysis centres. These results were reached through moderately high EPO doses, mostly given subcutaneously, and careful iron therapy management.
Abstract:
The temporal dynamics of species diversity are shaped by variations in the rates of speciation and extinction, and there is a long history of inferring these rates using first and last appearances of taxa in the fossil record. Understanding diversity dynamics critically depends on unbiased estimates of the unobserved times of speciation and extinction for all lineages, but the inference of these parameters is challenging due to the complex nature of the available data. Here, we present a new probabilistic framework to jointly estimate species-specific times of speciation and extinction and the rates of the underlying birth-death process based on the fossil record. The rates are allowed to vary through time independently of each other, and the probability of preservation and sampling is explicitly incorporated in the model to estimate the true lifespan of each lineage. We implement a Bayesian algorithm to assess the presence of rate shifts by exploring alternative diversification models. Tests on a range of simulated data sets reveal the accuracy and robustness of our approach against violations of the underlying assumptions and various degrees of data incompleteness. Finally, we demonstrate the application of our method with the diversification of the mammal family Rhinocerotidae and reveal a complex history of repeated and independent temporal shifts of both speciation and extinction rates, leading to the expansion and subsequent decline of the group. The estimated parameters of the birth-death process implemented here are directly comparable with those obtained from dated molecular phylogenies. Thus, our model represents a step towards integrating phylogenetic and fossil information to infer macroevolutionary processes.
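The abstract does not give the model equations, but preservation in such frameworks is commonly modelled as a Poisson sampling process along each lineage's lifespan. As an illustrative sketch (an assumption about the general class of model, not necessarily the exact parameterization used here), the probability of observing k fossil occurrences for a lineage with speciation time s, extinction time e, and preservation rate q, and the corresponding probability of sampling the lineage at all, are:

```latex
% Illustrative Poisson preservation model (an assumption about the general
% class of model, not the paper's exact parameterization).
P(k \mid s, e, q) \;=\; \frac{\bigl[q\,(s - e)\bigr]^{k}\, e^{-q\,(s - e)}}{k!},
\qquad
P(k \ge 1 \mid s, e, q) \;=\; 1 - e^{-q\,(s - e)} .
```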
Abstract:
OBJECTIVES: Etravirine (ETV) is a novel nonnucleoside reverse transcriptase inhibitor (NNRTI) with reduced cross-resistance to first-generation NNRTIs, which has been primarily studied in randomized clinical trials and not in routine clinical settings. METHODS: ETV resistance-associated mutations (RAMs) were investigated by analysing 6072 genotypic tests. The antiviral activity of ETV was predicted using different interpretation systems: International AIDS Society-USA (IAS-USA), Stanford, Rega and Agence Nationale de Recherches sur le Sida et les hépatites virales (ANRS). RESULTS: The prevalence of ETV RAMs was higher in NNRTI-exposed patients [44.9%, 95% confidence interval (CI) 41.0-48.9%] than in treatment-naïve patients (9.6%, 95% CI 8.5-10.7%). ETV RAMs in treatment-naïve patients mainly represent natural polymorphisms, as prevalence estimates in genotypic tests from treatment-naïve patients with documented recent (<1 year) infection who had acquired HIV before the introduction of NNRTIs were almost identical (9.8%, 95% CI 3.3-21.4%). Discontinuation of NNRTI treatment led to a marked drop in the detection of ETV RAMs, from 51.7% (95% CI 40.8-62.6%) to 34.5% (95% CI 24.6-45.4%, P=0.032). Differences in prevalence among subtypes were found for V90I and V179T (P<0.001). Estimates of restricted virological response to ETV varied among algorithms in patients with exposure to efavirenz (EFV)/nevirapine (NVP), ranging from 3.8% (95% CI 2.5-5.6%) for ANRS to 56.2% (95% CI 52.2-60.1%) for Stanford. The predicted activity of ETV decreased as the sensitivity of potential optimized background regimens decreased. The presence of major IAS-USA mutations (L100I, K101E/H/P and Y181C/I/V) reduced the treatment response at week 24. CONCLUSIONS: Most ETV RAMs in drug-naïve patients are polymorphisms rather than transmitted RAMs. Uncertainty regarding predictions of antiviral activity for ETV in NNRTI-treated patients remains high. The lowest activity was predicted for patients harbouring extensive multidrug-resistant viruses, thus limiting the use of ETV in those who are most in need.
Abstract:
BACKGROUND: The optimal strategy for percutaneous coronary intervention (PCI) in ST-segment elevation myocardial infarction (STEMI) with multi-vessel disease (MVD), i.e. multi-vessel PCI (MV-PCI) vs. PCI of the infarct-related artery only (IRA-PCI), remains unknown. METHODS: Patients of the AMIS Plus registry admitted with an acute coronary syndrome were contacted after a median of 378 days (interquartile range 371-409). The primary endpoint was all-cause death. The secondary endpoint comprised all major adverse cardiovascular and cerebrovascular events (MACCE), including death, re-infarction, re-hospitalization for cardiac causes, any cardiac re-intervention, and stroke. RESULTS: Between 2005 and 2012, 8330 STEMI patients were identified, of whom 1909 (24%) had MVD. Of these, 442 (23%) received MV-PCI and 1467 (77%) IRA-PCI. While all-cause mortality was similar in both groups (2.7% in both, p>0.99), MACCE was significantly lower after MV-PCI than after IRA-PCI (15.6% vs. 20.0%, p=0.038), mainly driven by lower rates of cardiac re-hospitalization and cardiac re-intervention. Patients undergoing MV-PCI with drug-eluting stents had lower rates of all-cause mortality (2.1% vs. 7.4%, p=0.026) and MACCE (14.1% vs. 25.9%, p=0.042) than those receiving bare metal stents (BMS). In multivariate analysis, MV-PCI (odds ratio, OR 0.69, 95% CI 0.51-0.93, p=0.017) and comorbidities (Charlson index ≥ 2; OR 1.42, 95% CI 1.05-1.92, p=0.025) were independent predictors of 1-year MACCE. CONCLUSION: In an unselected nationwide real-world cohort, immediate complete revascularization may be beneficial in STEMI patients with MVD with respect to MACCE, particularly when drug-eluting stents are used, but not with respect to mortality. This needs to be tested in a randomized controlled trial.
Abstract:
BACKGROUND: Mental disorders, common in primary care, are often associated with physical complaints. While exposure to psychosocial stressors and the development or presence of principal mental disorders (i.e. depression, anxiety, and somatoform disorders defined as multisomatoform disorders) are commonly correlated, a temporal association remains unproven. This study explores the onset of such disorders after exposure to psychosocial stressors in a cohort of primary care patients with at least one physical symptom. METHOD: The cohort study SODA (SOmatization, Depression and Anxiety) was conducted by 21 private-practice GPs and three fellow physicians in a Swiss academic primary care centre. GPs included patients via randomized daily identifiers. Depression, anxiety, or somatoform disorders were identified with the full Patient Health Questionnaire (PHQ), a validated instrument for identifying mental disorders based on DSM-IV criteria. The PHQ was also used to investigate exposure to psychosocial stressors (before the index consultation and during follow-up) and the onset of principal mental disorders after one year of follow-up. RESULTS: From November 2004 to July 2005, 1020 patients were screened for inclusion; 627 were eligible and 482 completed the PHQ one year later and were included in the analysis (77%). At one year, the prevalence of principal mental disorders was 30/153 (19.6%, 95% CI 13.6-26.8) for those initially exposed to a major psychosocial stressor and 26/329 (7.9%, 95% CI 5.2-11.4) for those not exposed. The association with psychosocial stressors was stronger for depression (RR = 2.4) and anxiety (RR = 3.5) than for multisomatoform disorders (RR = 1.8). Patients who were "bothered a lot" by a stressor (subjective distress) were 2.5 times (95% CI 1.5-4.0) more likely to experience a mental disorder at one year. A history of psychiatric comorbidities or psychological treatment was not a confounding factor for developing a principal mental disorder after exposure to psychosocial stressors. CONCLUSION: This primary care study shows that patients with physical complaints who were exposed to psychosocial stressors had a higher risk of developing mental disorders one year later. This temporal association opens the field for further research on preventive care for mental disorders in primary care patients.
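As a worked illustration, the relative risk implied by the reported one-year counts (30/153 exposed to a major stressor vs 26/329 not exposed) can be computed as follows; the Wald log-RR confidence interval is a standard textbook approximation and is not necessarily the method used in the study.

```python
# Worked example: relative risk of a principal mental disorder at one year,
# from the counts reported in the abstract (exposed 30/153, unexposed 26/329).
# The Wald log-RR confidence interval is shown for illustration only.
import math

a, n_exposed = 30, 153      # disorders among patients exposed to a major stressor
b, n_unexposed = 26, 329    # disorders among patients not exposed

risk_exposed = a / n_exposed            # ~0.196
risk_unexposed = b / n_unexposed        # ~0.079
rr = risk_exposed / risk_unexposed      # ~2.5

se_log_rr = math.sqrt(1 / a - 1 / n_exposed + 1 / b - 1 / n_unexposed)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```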
Safety of artemether-lumefantrine exposure in first trimester of pregnancy: an observational cohort.
Abstract:
BACKGROUND: There are limited data available on the safety profile of artemisinins in early pregnancy. They are, therefore, not recommended by WHO as first-line treatment for malaria in the first trimester, owing to embryo-foetal toxicity observed in animal studies. This study assessed birth outcomes among pregnant women inadvertently exposed to artemether-lumefantrine (AL) during the first trimester, compared with those of women exposed to other anti-malarial drugs or to no drug at all during the same period of pregnancy. METHODS: Pregnant women with a gestational age <20 weeks were recruited from Maternal Health clinics or from monthly house visits (demographic surveillance) and followed prospectively until delivery. RESULTS: 2167 pregnant women were recruited and 1783 (82.3%) completed the study until delivery. 319 (17.9%) used anti-malarials in the first trimester, of whom 172 (53.9%) used AL, 78 (24.4%) quinine, 66 (20.7%) sulphadoxine-pyrimethamine (SP), and 11 (3.4%) amodiaquine. Quinine exposure in the first trimester was associated with an increased risk of miscarriage/stillbirth (OR 2.5; 1.3-5.1) and premature birth (OR 2.6; 1.3-5.3), in contrast to AL (OR 1.4; 0.8-2.5 for miscarriage/stillbirth and OR 0.9; 0.5-1.8 for preterm birth). Congenital anomalies were identified in four exposure groups, namely AL only (1/164 [0.6%]), quinine only (1/70 [1.4%]), SP (2/66 [3.0%]), and the non-anti-malarial exposure group (19/1464 [1.3%]). CONCLUSION: Exposure to AL in the first trimester was more common than exposure to any other anti-malarial drug. Quinine exposure was associated with adverse pregnancy outcomes, which was not the case for the other anti-malarials. Since AL and quinine were used according to their availability rather than to disease severity, it is likely that the effect observed was related to the drug and not to the disease itself. Even with this caveat, a change of policy from quinine to AL for the treatment of uncomplicated malaria during the whole pregnancy period could already be envisaged.
Abstract:
Purpose: Several scores have been developed to assess the global cardiovascular (CV) risk of an individual. However, their accuracy and comparability need to be evaluated in populations other than those from which they were derived. The aim of this study was to compare the predictive accuracy of four CV risk scores using data from a large population-based cohort. Methods: Prospective cohort study including 4980 participants (2698 women; mean age ± SD: 52.7 ± 10.8 years) in Lausanne, Switzerland, followed for an average of 5.5 years (range 0.2-8.5). Two endpoints were assessed: 1) coronary heart disease (CHD) and 2) CV disease (CVD). Four risk scores were compared: the original and recalibrated Framingham coronary heart disease scores (1998 and 2001); the original PROCAM score (2002) and its recalibrated version for Switzerland (IAS-AGLA); and the Reynolds risk score. Discrimination was assessed using Harrell's C statistic, model fit using Akaike's information criterion (AIC), and calibration using a pseudo Hosmer-Lemeshow test. Sensitivity, specificity, and the corresponding 95% confidence intervals were assessed for each risk score using the highest risk category (≥20% at 10 years) as the "positive" test. Results: The recalibrated and original 1998 and original 2001 Framingham scores showed better discrimination (>0.720) and model fit (lower AIC) for CHD and CVD. All four scores were correctly calibrated (χ2 < 20). The recalibrated Framingham 1998 score had the best sensitivities, 37.8% and 40.4%, for CHD and CVD, respectively. All scores presented specificities >90%. The Framingham 1998, PROCAM, and IAS-AGLA scores placed the greatest number of subjects (>200) in the high-risk category, whereas the recalibrated Framingham 2001 and Reynolds scores included ≤44 subjects. Conclusion: In this cohort, we observed variations in accuracy between risk scores, with the original Framingham 2001 score demonstrating the best compromise between accuracy and a limited selection of subjects in the highest risk category. We advocate that national guidelines, based on independently validated data, take into account CV risk scores calibrated for their respective countries.
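As a minimal illustration of the classification step described above, treating a predicted 10-year risk of 20% or more as a "positive" test, the sketch below computes sensitivity and specificity against simulated observed events, with an AUC as a simple stand-in for discrimination (Harrell's C generalizes this to censored time-to-event data). The data, risk distribution, and event model are illustrative assumptions, not the study's data.

```python
# Illustration of threshold-based classification of predicted 10-year risk
# (>=20% treated as a "positive" test) on simulated data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

n = 5000
predicted_risk = rng.beta(2, 12, size=n)                      # simulated 10-year CHD risk
event = rng.random(n) < np.clip(predicted_risk * 1.2, 0, 1)   # simulated outcomes

positive_test = predicted_risk >= 0.20                        # highest risk category

tp = np.sum(positive_test & event)
fn = np.sum(~positive_test & event)
tn = np.sum(~positive_test & ~event)
fp = np.sum(positive_test & ~event)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
auc = roc_auc_score(event, predicted_risk)                    # stand-in for discrimination

print(f"sensitivity={sensitivity:.1%} specificity={specificity:.1%} AUC={auc:.3f}")
```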
Abstract:
OBJECTIVES: Toll-like receptors (TLRs) are innate immune sensors that are integral to resisting chronic and opportunistic infections. Mounting evidence implicates TLR polymorphisms in susceptibilities to various infectious diseases, including HIV-1. We investigated the impact of TLR single nucleotide polymorphisms (SNPs) on clinical outcome in a seroincident cohort of HIV-1-infected volunteers. DESIGN: We analyzed TLR SNPs in 201 antiretroviral treatment-naive HIV-1-infected volunteers from a longitudinal seroincident cohort with regular follow-up intervals (median follow-up 4.2 years, interquartile range 4.4). Participants were stratified into two groups according to either disease progression, defined as peripheral blood CD4(+) T-cell decline over time, or peak and setpoint viral load. METHODS: Haplotype tagging SNPs from TLR2, TLR3, TLR4, and TLR9 were detected by mass array genotyping, and CD4(+) T-cell counts and viral load measurements were determined prior to antiretroviral therapy initiation. The association of TLR haplotypes with viral load and rapid progression was assessed by multivariate regression models using age and sex as covariates. RESULTS: Two TLR4 SNPs in strong linkage disequilibrium [1063 A/G (D299G) and 1363 C/T (T399I)] were more frequent among individuals with high peak viral load compared with low/moderate peak viral load (odds ratio 6.65, 95% confidence interval 2.19-20.46, P < 0.001; adjusted P = 0.002 for 1063 A/G). In addition, a TLR9 SNP previously associated with slow progression was found less frequently among individuals with high viral setpoint compared with low/moderate setpoint (odds ratio 0.29, 95% confidence interval 0.13-0.65, P = 0.003, adjusted P = 0.04). CONCLUSION: This study suggests a potentially new role for TLR4 polymorphisms in HIV-1 peak viral load and confirms a role for TLR9 polymorphisms in disease progression.
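The study's odds ratios come from multivariate regression models adjusted for age and sex; as a simplified, unadjusted illustration of the underlying association test, the sketch below computes an odds ratio from a 2x2 table of TLR4 variant carriage versus peak viral load category. All counts are invented for illustration and do not reproduce the study's data or analysis.

```python
# Simplified, unadjusted illustration of the kind of association behind the
# reported odds ratios: a 2x2 table of TLR4 variant carriage vs high peak
# viral load. The counts are hypothetical; the study itself used multivariate
# regression with age and sex as covariates, not this test.
from scipy.stats import fisher_exact

#                high peak VL   low/moderate peak VL
table = [[18,             12],    # carriers of 1063 A/G (D299G)  (hypothetical)
         [43,            128]]    # non-carriers                   (hypothetical)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```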
Abstract:
BACKGROUND AND OBJECTIVES: Combination antiretroviral therapy (cART) is changing, and this may affect the type and occurrence of side effects. We examined the frequency of lipodystrophy (LD) and weight changes in relation to the use of specific drugs in the Swiss HIV Cohort Study (SHCS). METHODS: In the SHCS, patients are followed twice a year and scored by the treating physician as having 'fat accumulation', 'fat loss', or neither. Treatments, and the reasons for changing them, are recorded. Our study sample included all patients treated with cART between 2003 and 2006 and, in addition, all patients who started cART between 2000 and 2003. RESULTS: From 2003 to 2006, the percentage of patients taking stavudine, didanosine and nelfinavir decreased, the percentage taking lopinavir, nevirapine and efavirenz remained stable, and the percentage taking atazanavir and tenofovir increased by 18.7% and 22.2%, respectively. In life-table Kaplan-Meier analysis, patients starting cART in 2003-2006 were less likely to develop LD than those starting cART from 2000 to 2002 (P<0.02). LD was quoted as the reason for treatment change or discontinuation for 4% of patients on cART in 2003, and for 1% of patients treated in 2006 (P for trend <0.001). In univariate and multivariate regression analyses, patients with a weight gain of ≥5 kg were more likely to be taking lopinavir or atazanavir than patients without such a weight gain [odds ratio (OR) 2.0, 95% confidence interval (CI) 1.3-2.9, and OR 1.7, 95% CI 1.3-2.1, respectively]. CONCLUSIONS: LD became less frequent in the SHCS from 2000 to 2006. A weight gain of more than 5 kg was associated with the use of atazanavir and lopinavir.
Abstract:
OBJECTIVES: Darunavir was designed for activity against HIV resistant to other protease inhibitors (PIs). We assessed the efficacy, tolerability and risk factors for virological failure of darunavir in treatment-experienced patients seen in clinical practice. METHODS: We included all patients in the Swiss HIV Cohort Study starting darunavir after a recorded viral load above 1000 HIV-1 RNA copies/mL and prior exposure to both PIs and nonnucleoside reverse transcriptase inhibitors. We followed these patients for up to 72 weeks, assessed virological failure using different loss-of-virological-response algorithms, and evaluated risk factors for virological failure using a Bayesian method to fit discrete Cox proportional hazards models. RESULTS: Among 130 treatment-experienced patients starting darunavir, the median age was 47 years, the median duration of HIV infection was 16 years, and 82% had received mono or dual antiretroviral therapy before starting highly active antiretroviral therapy. During a median follow-up of 45 weeks, 17% of patients stopped taking darunavir after a median exposure of 20 weeks. In patients followed beyond 48 weeks, the rate of virological failure at 48 weeks was at most 20%. Virological failure was more likely in patients who had previously failed on both amprenavir and saquinavir, and its likelihood increased with the number of previously failed PI regimens. CONCLUSIONS: As a component of therapy for treatment-experienced patients, darunavir can achieve efficacy and tolerability in clinical practice similar to that seen in clinical trials. Clinicians should consider whether a patient has previously failed on both amprenavir and saquinavir and the number of failed PI regimens before prescribing darunavir.
Abstract:
Molecular and genetic investigations in endometrial carcinogenesis may have prognostic and therapeutic implications. We studied the expression of EGFR, c-Met, PTEN and the mTOR signalling pathway (phospho-AKT/phospho-mTOR/phospho-RPS6) in 69 consecutive tumours and 16 tissue microarrays. We also analysed PIK3CA and K-Ras mutations and microsatellite instability (MSI). We distinguished two groups: group 1 (grade 1 and 2 endometrioid cancers) and group 2 (grade 3 endometrioid and type II clear and serous cell cancers). We hypothesised that these histological groups might have different features. We found that a) survival was higher in group 1, with less aggressive tumours (P<0.03); b) EGFR (P=0.01), PTEN and the AKT/mTOR/RPS6 signalling pathway were increased in group 1 versus group 2 (P=0.05 for phospho-mTOR); c) conversely, c-Met was higher (P<0.03) in group 2 than in group 1; d) in group 1, EGFR was correlated with c-Met, phospho-mTOR, phospho-RPS6 and the global activity of the phospho-AKT/phospho-mTOR/phospho-RPS6 pathway, whereas in group 2, EGFR was correlated only with the phospho-AKT/phospho-mTOR/phospho-RPS6 pathway and c-Met was correlated with PTEN; e) survival was higher for tumours with more than 50% PTEN-positive cells; f) K-Ras and PIK3CA mutations occurred in 10-12% of the available tumours and MSI in 40.4%, with a loss of MLH1 and PMS2 expression. Our results for endometrial cancers provide the first evidence of a difference in status between groups 1 and 2. The patients may benefit from different targeted treatments: anti-EGFR agents and rapamycin derivatives (anti-mTOR) for group 1, and agents targeting the c-Met/ligand complex for group 2.