824 results for Cohort Extinction
Abstract:
Background and Aims: The 2007 ECCO guidelines on anemia in inflammatory bowel disease (IBD) favour intravenous (iv) over oral (po) iron supplementation due to better effectiveness and tolerance. Application of guidelines in clinical practice may require time. We aimed to determine the percentage of IBD patients under iron supplementation therapy and its application mode over time in a large IBD cohort. Methods: Helsana, a leading Swiss health insurance company, provides coverage for approximately 18% of the Swiss population, corresponding to about 1.2 million enrollees. Patients with Crohn's disease (CD) and ulcerative colitis (UC) were identified by keyword search from the anonymised Helsana database. Results: In total, 629 CD (61% female) and 403 UC (56% female) patients were identified; mean retrospective observation time was 20.4 months for CD and 13 months for UC patients. Of the entire study population, 29.3% were prescribed iron. Occurrence of iron prescription was 21.3% in males and 31.2% in females (odds ratio [OR] 1.69, 95% confidence interval [CI] 1.26-2.28). The prescription of iv iron increased from 2006/2007 (48.8% with iv iron) to 2008/2009 (65.2% with iv iron) by a factor of 1.89. Conclusions: One third of the IBD population was treated with iron supplementation. A gradual shift from oral to iv iron was observed over time in a large Swiss IBD cohort. This switch in prescription habits goes along with the implementation of the ECCO consensus guidelines on anemia in IBD.
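As a quick arithmetic check, the sex-specific odds ratio reported above follows directly from the quoted prescription rates (31.2% in females vs. 21.3% in males):

\[ \mathrm{OR} = \frac{0.312/(1-0.312)}{0.213/(1-0.213)} = \frac{0.453}{0.271} \approx 1.68, \]

which matches the reported OR of 1.69 up to rounding of the quoted percentages.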
Abstract:
BACKGROUND AND PURPOSE: Intravenous thrombolysis for acute ischemic stroke is beneficial within 4.5 hours of symptom onset, but the effect rapidly decreases over time, necessitating quick diagnostic in-hospital work-up. Initial time strain occasionally results in treatment of patients with an alternate diagnosis (stroke mimics). We investigated whether intravenous thrombolysis is safe in these patients. METHODS: In this multicenter observational cohort study containing 5581 consecutive patients treated with intravenous thrombolysis, we determined the frequency and the clinical characteristics of stroke mimics. For safety, we compared the symptomatic intracranial hemorrhage (European Cooperative Acute Stroke Study II [ECASS-II] definition) rate of stroke mimics with ischemic strokes. RESULTS: One hundred stroke mimics were identified, resulting in a frequency of 1.8% (95% confidence interval, 1.5-2.2). Patients with a stroke mimic were younger, more often female, and had fewer risk factors except smoking and previous stroke or transient ischemic attack. The symptomatic intracranial hemorrhage rate in stroke mimics was 1.0% (95% confidence interval, 0.0-5.0) compared with 7.9% (95% confidence interval, 7.2-8.7) in ischemic strokes. CONCLUSIONS: In experienced stroke centers, among patients treated with intravenous thrombolysis, only a few had a final diagnosis other than stroke. The complication rate in these stroke mimics was low.
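As a side note, the quoted stroke-mimic frequency and its confidence interval can be reproduced from the raw counts. A minimal sketch, assuming a Clopper-Pearson exact interval (the abstract does not state which interval method was used):

```python
# Reproduce the stroke-mimic frequency and 95% CI from the reported counts.
# Clopper-Pearson ("beta") is an assumption; the abstract omits the method.
from statsmodels.stats.proportion import proportion_confint

mimics, treated = 100, 5581
lo, hi = proportion_confint(mimics, treated, alpha=0.05, method="beta")
print(f"frequency {mimics/treated:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
# -> frequency 1.8%, 95% CI 1.5%-2.2%, matching the abstract
```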
Abstract:
Background: Long-term treatment of primary HIV-1 infection (PHI) may allow immune reconstitution of responses lost during the acute viremic phase and a decrease of peripheral reservoirs. This in turn may represent the best setting for the use of therapeutic vaccines to lower the viral set-point or control viral rebound upon ART discontinuation. Methods: We investigated a cohort of 16 patients who started ART at PHI, with a treatment duration of ≥4 years and persistent aviremia (<50 HIV-1 copies/ml). The cohort was characterized in terms of viral subtype, cell-associated RNA, proviral DNA and HLA genotype. Secretion of IFN-γ, IL-2 and TNF-α by CD8 T-cells was analysed by polychromatic flow cytometry using a panel of 192 HIV-1-derived epitopes. Results: This cohort is highly homogeneous in terms of viral subtype: 81% clade B. We identified 44 epitope-specific responses: all patients had detectable responses to >1 epitope, and the mean number of responding epitopes per patient was 3. The mean frequency of cytokine-secreting CD8 T-cells was 0.32%. CD8 T-cells secreting IFN-γ, IL-2 and TNF-α simultaneously made up about 40% of the response, and cells secreting at least 2 cytokines about 80%, consistent with a highly polyfunctional CD8 T-cell profile. There was no difference in terms of polyfunctionality when HLA restriction or the recognized viral regions and epitopes were considered. Proviral DNA was detectable in all patients but at low levels (mean = 108 copies/1 million PBMCs), while cell-associated mRNA was not detectable in 19% of patients (mean = 11 copies/1 million PBMCs when detectable). Conclusion: Patients with sustained virological suppression after initiation of ART at PHI show polyfunctional CD8 T-cells and low levels of proviral DNA, with an absence of residual replication in a substantial percentage of patients. The use of therapeutic vaccines in this population may promote low levels of rebound viremia or control of viral replication upon ART cessation.
Abstract:
The end-Permian mass extinction greatly diminished marine diversity and brought about a wholesale restructuring of marine ecosystems; these ecosystem changes also profoundly affected the sedimentary record. Data presented here, obtained through facies analyses of strata deposited during the immediate aftermath of the end-Permian mass extinction (southern Turkey) and at the close of the Early Triassic (southwestern United States), in combination with a literature review, show that sedimentary systems were profoundly affected by: (1) a reduction in biotic diversity and abundance and (2) long-term environmental fluctuations that resulted from the end-Permian crisis. Lower Triassic strata display widespread microbialite and carbonate seafloor fan development and contain indicators of suppressed infaunal bioturbation such as flat-pebble conglomerates and wrinkle structures (facies considered unusual in post-Cambrian subtidal deposits). Our observations suggest that depositional systems, too, respond to biotic crises, and that certain facies may act as barometers of ecologic and environmental change independent of fossil assemblage analyses. Close investigation of facies changes during other critical times in Earth history may serve as an important tool in interpreting the ecology of metazoans and their environment.
Abstract:
Despite recent medical progress in patient support, the mortality of sepsis remains high. Recently, new supportive strategies have been proposed to improve outcome. Whereas such strategies are currently considered standard of care, their real impact on mortality, morbidity, length of stay, and hence health care resource utilization has so far been only weakly evaluated. There is thus a critical need for epidemiologic surveys of sepsis to better address these major issues. The Lausanne Cohort of septic patients aims at building a large clinical, biological and microbiological database that will be used as a multidisciplinary research platform to study the various pathogenic mechanisms of sepsis in collaboration with the relevant specialists. This could be an opportunity to strengthen collaboration within the Swiss Latin network of Intensive Care Medicine.
Abstract:
The aim of this study was to assess the prevalence of malignant lymphomas in patients with long-standing primary Sjögren's syndrome (pSS). We retrospectively studied a cohort of 55 patients with pSS over a mean follow-up period of 12 years. Five patients (9%) developed malignant lymphoma. The interval between the diagnoses of SS and lymphoma ranged from four to 12 years (mean = 6.5 years). The lymphoma arose in the lymph nodes in two cases, the parotid gland in one case, the lacrimal gland in one case, and the lung in one case. All five cases were B-cell low-grade lymphomas. Among our SS patients, those with extraglandular manifestations and/or a mixed cryoglobulin were at increased risk for lymphoma development. Secondary lymphoma carried a poor prognosis in our study. Three of the six SS patients who died during the follow-up period had lymphoma.
Abstract:
The Permo-Triassic crisis was a major turning point in geological history. Following the end-Guadalupian extinction phase, the Palaeozoic biota underwent a steady decline through the Lopingian (Late Permian), resulting in their decimation at the level that is adopted as the Permian-Triassic boundary (PTB). This trend coincided with the greatest Phanerozoic regression. The extinction at the end of the Guadalupian and that marking the end of the Permian are therefore related. The subsequent recovery of the biota occupied the whole of the Early Triassic. Several phases of perturbation in δ13Ccarb occurred through a similar period, from the late Wuchiapingian to the end of the Early Triassic. Therefore, the Permian-Triassic crisis was protracted, spanning Late Permian and Early Triassic time. The extinction associated with the PTB occurred in two episodes: the main act (with a prelude) and the epilogue. The prelude commenced prior to beds 25 and 26 at Meishan and coincided with the end-Permian regression. The main act itself happened in beds 25 and 26 at Meishan. The epilogue occurred in the late Griesbachian and coincided with the second volcanogenic layer (bed 28) at Meishan. The temporal distribution of these episodes constrains the interpretation of mechanisms responsible for the greatest Phanerozoic mass extinction, particularly the significance of a postulated bolide impact, which in our view may have occurred about 50,000 yr after the prelude. The prolonged and multi-phase nature of the Permo-Triassic crisis favours mechanisms of the Earth's intrinsic evolution rather than extraterrestrial catastrophe. The most significant regression in the Phanerozoic, the palaeomagnetic disturbance of the Permo-Triassic Mixed Superchron, widespread extensive volcanism, and other events may all be related through deep-seated processes that occurred during the integration of Pangea. These combined processes could be responsible for the profound changes in marine, terrestrial and atmospheric environments that resulted in the end-Permian mass extinction. Bolide impact is possible but is neither an adequate nor a necessary explanation for these changes.
Abstract:
OBJECTIVE(S): To investigate the relationship between detection of HIV drug resistance by 2 years from starting antiretroviral therapy and the subsequent risk of progression to AIDS and death. DESIGN: Virological failure was defined as experiencing two consecutive viral loads of more than 400 copies/ml in the time window between 0.5 and 2 years from starting antiretroviral therapy (baseline). Patients were grouped according to evidence of virological failure and whether International AIDS Society resistance mutations to one, two or three drug classes were detected in the time window. METHODS: Standard survival analysis using Kaplan-Meier curves and a Cox proportional hazards regression model with time-fixed covariates defined at baseline was employed. RESULTS: We studied 8229 patients in EuroSIDA who started antiretroviral therapy and who had at least 2 years of clinical follow-up. We observed 829 AIDS events and 571 deaths during 38,814 person-years of follow-up, resulting in an overall incidence of new AIDS and death of 3.6 per 100 person-years of follow-up [95% confidence interval (CI): 3.4-3.8]. By 96 months from baseline, the proportion of patients with a new AIDS diagnosis or death was 20.3% (95% CI: 17.7-22.9) in patients with no evidence of virological failure and 53% (95% CI: 39.3-66.7) in those with virological failure and mutations to three drug classes (P = 0.0001). An almost two-fold difference in risk was confirmed in the multivariable analysis (adjusted relative hazard = 1.8, 95% CI: 1.2-2.7, P = 0.005). CONCLUSION: Although this study shows an association between the detection of resistance at failure and the risk of clinical progression, further research is needed to clarify whether resistance reflects poor adherence or directly increases the risk of clinical events via exhaustion of drug options.
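The crude incidence quoted above can be reconstructed from the event and person-year counts. A minimal sketch, using a normal approximation to the Poisson rate (the exact method used by the authors is not stated):

```python
# Back-of-the-envelope check of the incidence of new AIDS or death.
import math

events = 829 + 571            # AIDS events + deaths
pyfu = 38_814                 # person-years of follow-up
rate = events / pyfu * 100    # per 100 person-years
se = math.sqrt(events) / pyfu * 100
print(f"{rate:.1f} per 100 person-years, "
      f"95% CI {rate - 1.96*se:.1f}-{rate + 1.96*se:.1f}")
# -> 3.6 per 100 person-years, 95% CI 3.4-3.8, as reported
```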
Abstract:
The objective of this study was to describe the all-cause mortality of participants in the Swiss Hepatitis C Cohort compared to the Swiss general population. We studied 1645 patients with hepatitis C virus (HCV) infection attending secondary and tertiary care centres in Switzerland, followed up for a mean of over 2 years. We calculated all-cause standardized mortality ratios (SMR) and 95% confidence intervals (CI) using age-, sex- and calendar-year-specific Swiss all-cause mortality rates. Multivariable Poisson regression was used to model the variability of the SMR by cirrhotic status, HCV genotype, infection with hepatitis B virus (HBV) or HIV, injection drug use and alcohol intake. Sixty-one deaths were recorded among the 1645 participants. The crude all-cause SMR was 4.5 (95% CI: 3.5-5.8). Patients co-infected with HIV had a crude SMR of 20 (95% CI: 11.1-36.1). The SMR of 1.1 (95% CI: 0.63-2.03) for patients who were not cirrhotic, were not infected with HBV or HIV, did not inject drugs, were not heavy alcohol consumers (≤40 g/day) and did not have genotype 3 indicated no strong evidence of excess mortality. We found little evidence of excess mortality in hepatitis C-infected patients who were not cirrhotic, in the absence of the selected risk factors. Our findings emphasize the importance of providing appropriate preventive advice, such as counselling to avoid alcohol intake, to those infected with HCV.
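The SMR logic above reduces to observed deaths divided by the deaths expected under Swiss population rates. A hedged sketch: the expected count below is back-solved from the reported crude SMR of 4.5 (it is not given in the abstract), and the exact Poisson interval is an assumed method:

```python
# Standardized mortality ratio with an exact Poisson 95% CI (chi-square link).
from scipy.stats import chi2

observed = 61
expected = observed / 4.5          # ~13.6 deaths, implied by the crude SMR
lo = chi2.ppf(0.025, 2 * observed) / 2 / expected
hi = chi2.ppf(0.975, 2 * (observed + 1)) / 2 / expected
print(f"SMR {observed/expected:.1f}, 95% CI {lo:.1f}-{hi:.1f}")
# -> SMR 4.5, 95% CI ~3.4-5.8, close to the reported 3.5-5.8
```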
Abstract:
Background and Aims: The 2007 European Crohn's and Colitis Organization (ECCO) guidelines on anemia in inflammatory bowel disease (IBD) favour intravenous (iv) over oral (po) iron supplementation due to better effectiveness and tolerance. We aimed to determine the percentage of IBD patients under iron supplementation therapy and the dynamics of prescription habits (iv versus po) over time. Methods: Helsana, a leading Swiss health insurance company, provides coverage for approximately 18% of the Swiss population, corresponding to about 1.2 million enrollees. Patients with Crohn's disease (CD) and ulcerative colitis (UC) were analyzed from the anonymised Helsana database. Results: In total, 629 CD (61% female) and 398 UC (57% female) patients were identified; mean observation time was 31.8 months for CD and 31.0 months for UC patients. Of the entire study population, 27.1% were prescribed iron (21.1% in males and 31.1% in females). Patients treated with IBD-specific drugs (steroids, immunomodulators, anti-TNF agents) were more frequently treated with iron than patients without any medication (35.0% vs. 20.9%, OR 1.91, 95% CI 1.41-2.61). The prescription of iv iron increased from 2006/2007 (48.8% of all patients receiving any iron prescription) to 2008/2009 (65.2%), a factor of 1.89. Conclusions: About one third of the IBD population was treated with iron supplementation. A gradual shift from oral to iv iron was observed over time. This switch in prescription habits goes along with the implementation of the ECCO consensus guidelines on anemia in IBD.
Abstract:
Background: The appropriateness of therapy for severe active luminal Crohn's disease (CD) has never been formally assessed. The European panel on the appropriateness of Crohn's disease therapy [EPACT (http://www.epact.ch)] developed appropriateness criteria. We applied these criteria to the EC-IBD cohort, a prospectively assembled, uniformly diagnosed European population-based inception cohort of inflammatory bowel disease (IBD) patients diagnosed between 1991 and 1993. Methods: 426 CD patients from 13 European participating centers (10 countries) were included at the time of diagnosis (first flare, naive patients, no maintenance treatment, no steroids). We used the EPACT definition of severe active luminal CD agreed upon by the panel experts (acute flare, hospitalized patient, without documented fistula or stenosis, and who did not undergo surgery for abscess drainage or a fistulectomy). The various treatments were analyzed to determine the appropriateness of the medical decision according to the EPACT criteria. Results: 84 (20%) patients met the inclusion criteria. Counting patients with at least one appropriate (A) treatment as appropriately treated, 60 patients (71%) received an appropriate treatment and 24 patients (29%) an inappropriate treatment (I). Furthermore, in 87% of the cases with at least one appropriate treatment, an additional, mostly inappropriate, treatment was added or continued. Conclusion: In the EC-IBD cohort, the treatment of severe active luminal CD was appropriate for more than 70% of the patients, but an inappropriate treatment was frequently continued or added, thus increasing the risk of adverse reactions, drug interactions and costs.
Abstract:
BACKGROUND: Little is known about time trends, predictors, and consequences of changes made to antiretroviral therapy (ART) regimens early after patients initially start treatment. METHODS: We compared the incidence of, reasons for, and predictors of treatment change within 1 year after starting combination ART (cART), as well as virological and immunological outcomes at 1 year, among 1866 patients from the Swiss HIV Cohort Study who initiated cART during 2000-2001, 2002-2003, or 2004-2005. RESULTS: The durability of initial regimens did not improve over time (P = .15): 48.8% of 625 patients during 2000-2001, 43.8% of 607 during 2002-2003, and 44.3% of 634 during 2004-2005 changed cART within 1 year; reasons for change included intolerance (51.1% of all patients), patient wish (15.4%), physician decision (14.8%), and virological failure (7.1%). An increased probability of treatment change was associated with larger CD4+ cell counts, larger human immunodeficiency virus type 1 (HIV-1) RNA loads, and receipt of regimens that contained stavudine or indinavir/ritonavir, whereas a decreased probability was associated with receipt of regimens that contained tenofovir. Treatment discontinuation was associated with larger CD4+ cell counts, current use of injection drugs, and receipt of regimens that contained nevirapine. One-year outcomes improved between 2000-2001 and 2004-2005: 84.5% and 92.7% of patients, respectively, reached HIV-1 RNA loads of <50 copies/mL, and median increases in CD4+ cell counts were 157.5 and 197.5 cells/μL, respectively (P < .001 for all comparisons). CONCLUSIONS: Virological and immunological outcomes of initial treatments improved between 2000-2001 and 2004-2005, despite uniformly high rates of early treatment change across the 3 study intervals.
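The improvement in 1-year viral suppression between the first and last study interval can be checked with a contingency-table test. A rough sketch: the counts are reconstructed from the quoted percentages and group sizes, so they are approximate, and the chi-square test is an assumed analysis, not necessarily the authors' method:

```python
# Approximate test of 84.5% (2000-2001) vs 92.7% (2004-2005) suppression.
from scipy.stats import chi2_contingency

n_early, n_late = 625, 634                  # patients starting cART per period
supp_early = round(0.845 * n_early)         # ~528 suppressed at 1 year
supp_late = round(0.927 * n_late)           # ~588 suppressed at 1 year
table = [[supp_early, n_early - supp_early],
         [supp_late, n_late - supp_late]]
chi2, p, _, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.1g}")    # p far below .001, as reported
```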
Abstract:
A scientific challenge is to assess the role of Deccan volcanism in the Cretaceous-Tertiary boundary (KTB) mass extinction. Here we report on the stratigraphy and biologic effects of Deccan volcanism in eleven deep wells from the Krishna-Godavari (K-G) Basin, Andhra Pradesh, India. In these wells, two phases of Deccan volcanism record the world's largest and longest lava mega-flows, interbedded in marine sediments in the K-G Basin about 1500 km from the main Deccan volcanic province. The main phase-2 eruptions (~80% of the total Deccan Traps) began in C29r and ended at or near the KTB, an interval that spans planktic foraminiferal zones CF1-CF2 and most of the nannofossil Micula prinsii zone, and is correlative with the rapid global warming and subsequent cooling near the end of the Maastrichtian. The mass extinction began in phase-2, preceding the first of four mega-flows. Planktic foraminifera suffered a 50% drop in species richness. Survivors suffered another 50% drop after the first mega-flow, leaving just 7 to 8 survivor species. No recovery occurred between the next three mega-flows, and the mass extinction was complete with the last phase-2 mega-flow at the KTB. The mass extinction was likely the consequence of rapid and massive volcanic CO2 and SO2 gas emissions, leading to high continental weathering rates, global warming, cooling, acid rains, ocean acidification and a carbon crisis in the marine environment. Deccan volcanism phase-3 began in the early Danian near the C29r/C29n boundary, correlative with the planktic foraminiferal zone P1a/P1b boundary, and accounts for ~14% of the total volume of Deccan eruptions, including four of Earth's longest and largest mega-flows. No major faunal changes are observed in the intertrappeans of zone P1b, which suggests that environmental conditions remained tolerable and that volcanic eruptions were less intense and/or separated by longer time intervals, thus preventing runaway effects. Alternatively, early Danian assemblages evolved in adaptation to high-stress conditions in the aftermath of the mass extinction and therefore survived phase-3 volcanism. Full marine biotic recovery did not occur until after Deccan phase-3. These data suggest that the catastrophic effects of phase-2 Deccan volcanism on the Cretaceous planktic foraminifera were a function of both the rapid and massive volcanic eruptions and the highly specialized faunal assemblages prone to extinction in a changing environment. Data from the K-G Basin indicate that Deccan phase-2 alone could have caused the KTB mass extinction and that impacts may have had secondary effects.
Abstract:
In the field of thrombosis and haemostasis, many preanalytical variables influence the results of coagulation assays and measures to limit potential results variations should be taken. To our knowledge, no paper describing the development and maintenance of a haemostasis biobank has been previously published. Our description of the biobank of the Swiss cohort of elderly patients with venous thromboembolism (SWITCO65+) is intended to facilitate the set-up of other biobanks in the field of thrombosis and haemostasis. SWITCO65+ is a multicentre cohort that prospectively enrolled consecutive patients aged ≥65 years with venous thromboembolism at nine Swiss hospitals from 09/2009 to 03/2012. Patients will be followed up until December 2013. The cohort includes a biobank with biological material from each participant taken at baseline and after 12 months of follow-up. Whole blood from all participants is assayed with a standard haematology panel, for which fresh samples are required. Two buffy coat vials, one PAXgene Blood RNA System tube and one EDTA-whole blood sample are also collected at baseline for RNA/DNA extraction. Blood samples are processed and vialed within 1 h of collection and transported in batches to a central laboratory where they are stored in ultra-low temperature archives. All analyses of the same type are performed in the same laboratory in batches. Using multiple core laboratories increased the speed of sample analyses and reduced storage time. After recruiting, processing and analyzing the blood of more than 1,000 patients, we determined that the adopted methods and technologies were fit-for-purpose and robust.
Abstract:
Adherence patterns and their influence on virologic outcome are well characterized for protease inhibitor (PI)- and non-nucleoside reverse transcriptase inhibitor (NNRTI)-based regimens. We aimed to determine how patterns of adherence to raltegravir influence the risk of virological failure. We conducted a prospective multicenter cohort study following 81 HIV-infected antiretroviral-naive or experienced subjects receiving or starting twice-daily raltegravir-based antiretroviral therapy. Adherence patterns were monitored using the Medication Event Monitoring System. During follow-up (188 ± 77 days), 12 (15%) of 81 subjects experienced virological failure. Longer treatment interruptions (adjusted odds ratio per 24-hour increase: 2.4; 95% confidence interval: 1.2 to 6.9; P < 0.02) and lower average adherence (odds ratio per 5% increase: 0.68; 95% confidence interval: 0.46 to 1.00; P < 0.05) were both independently associated with virological failure, controlling for prior duration of viral suppression. Timely interdose intervals and high levels of adherence to raltegravir are both necessary to control HIV replication.
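As an interpretation aid (not the study's fitted model), the two per-unit odds ratios compound multiplicatively on the odds scale, so the joint effect of a given interruption length and adherence change can be sketched as follows; the helper function and example values are illustrative assumptions:

```python
# Illustrative compounding of the reported per-unit odds ratios.
OR_PER_24H_INTERRUPTION = 2.4   # failure odds per extra 24 h of interruption
OR_PER_5PCT_ADHERENCE = 0.68    # failure odds per 5% higher average adherence

def failure_odds_multiplier(interruption_h: float, adherence_gain_pct: float) -> float:
    """Multiplicative change in failure odds implied by the two ORs."""
    return (OR_PER_24H_INTERRUPTION ** (interruption_h / 24)
            * OR_PER_5PCT_ADHERENCE ** (adherence_gain_pct / 5))

# Example: a 48-hour interruption despite 10% better average adherence
print(f"{failure_odds_multiplier(48, 10):.2f}x the baseline odds")  # ~2.66x
```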