946 results for Birth Cohort
Abstract:
Background and Aims: The 2007 European Crohn's and Colitis Organization (ECCO) guidelines on anemia in inflammatory bowel disease (IBD) favour intravenous (iv) over oral (po) iron supplementation due to better effectiveness and tolerance. We aimed to determine the percentage of IBD patients under iron supplementation therapy and the dynamics of prescription habits (iv versus po) over time. Methods: Helsana, a leading Swiss health insurance company, provides coverage for approximately 18% of the Swiss population, corresponding to about 1.2 million enrollees. Patients with Crohn's disease (CD) and ulcerative colitis (UC) were analyzed from the anonymised Helsana database. Results: In total, 629 CD (61% female) and 398 UC (57% female) patients were identified; mean observation time was 31.8 months for CD and 31.0 months for UC patients. Of the entire study population, 27.1% were prescribed iron (21.1% of males and 31.1% of females). Patients treated with IBD-specific drugs (steroids, immunomodulators, anti-TNF agents) were more frequently treated with iron than patients without any such medication (35.0% vs. 20.9%, OR 1.91, 95%-CI 1.41-2.61). The proportion of iv iron among all patients receiving any iron prescription increased from 48.8% in 2006/2007 to 65.2% in 2008/2009, a 1.89-fold increase in iv iron prescribing. Conclusions: Just over a quarter (27.1%) of the IBD population was treated with iron supplementation. A gradual shift from oral to iv iron was observed over time. This switch in prescription habits coincides with the implementation of the ECCO consensus guidelines on anemia in IBD.
Abstract:
Background: The appropriateness of therapy for severe active luminal Crohn's disease (CD) has never been formally assessed. The European panel on the appropriateness of Crohn's disease therapy [EPACT (http://www.epact.ch)] developed appropriateness criteria. We applied these criteria to the EC-IBD cohort, a prospectively assembled, uniformly diagnosed European population-based inception cohort of inflammatory bowel disease (IBD) patients diagnosed between 1991 and 1993. Methods: 426 CD patients from 13 participating European centers (10 countries) were included at the time of diagnosis (first flare, naive patients, no maintenance treatment, no steroids). We used the EPACT definition of severe active luminal CD agreed upon by the panel experts (acute flare, hospitalized patient, without documented fistula or stenosis, and no surgery for abscess drainage or fistulectomy). The treatments given were analyzed to determine the appropriateness of the medical decision according to the EPACT criteria. Results: 84 patients (20%) met the inclusion criteria. Counting patients who received at least one appropriate (A) treatment as appropriately treated, 60 patients (71%) received an appropriate treatment and 24 patients (29%) an inappropriate one (I). Furthermore, in 87% of the cases with one appropriate treatment, an additional, mostly inappropriate, treatment was added or continued. Conclusion: In the EC-IBD cohort, treatment for severe active luminal CD was appropriate for more than 70% of the patients, but an inappropriate treatment was frequently continued or added, increasing the risk of adverse reactions, drug interactions, and costs.
Abstract:
BACKGROUND: Little is known about time trends, predictors, and consequences of changes made to antiretroviral therapy (ART) regimens early after patients initially start treatment. METHODS: We compared the incidence of, reasons for, and predictors of treatment change within 1 year after starting combination ART (cART), as well as virological and immunological outcomes at 1 year, among 1866 patients from the Swiss HIV Cohort Study who initiated cART during 2000-2001, 2002-2003, or 2004-2005. RESULTS: The durability of initial regimens did not improve over time (P = .15): 48.8% of 625 patients during 2000-2001, 43.8% of 607 during 2002-2003, and 44.3% of 634 during 2004-2005 changed cART within 1 year; reasons for change included intolerance (51.1% of all patients), patient wish (15.4%), physician decision (14.8%), and virological failure (7.1%). An increased probability of treatment change was associated with larger CD4+ cell counts, larger human immunodeficiency virus type 1 (HIV-1) RNA loads, and receipt of regimens that contained stavudine or indinavir/ritonavir, whereas a decreased probability was associated with receipt of regimens that contained tenofovir. Treatment discontinuation was associated with larger CD4+ cell counts, current use of injection drugs, and receipt of regimens that contained nevirapine. One-year outcomes improved between 2000-2001 and 2004-2005: 84.5% and 92.7% of patients, respectively, reached HIV-1 RNA loads of <50 copies/mL and achieved median increases in CD4+ cell counts of 157.5 and 197.5 cells/microL, respectively (P < .001 for all comparisons). CONCLUSIONS: Virological and immunological outcomes of initial treatments improved between 2000-2001 and 2004-2005, despite uniformly high rates of early treatment change across the 3 study intervals.
Abstract:
In the field of thrombosis and haemostasis, many preanalytical variables influence the results of coagulation assays and measures to limit potential results variations should be taken. To our knowledge, no paper describing the development and maintenance of a haemostasis biobank has been previously published. Our description of the biobank of the Swiss cohort of elderly patients with venous thromboembolism (SWITCO65+) is intended to facilitate the set-up of other biobanks in the field of thrombosis and haemostasis. SWITCO65+ is a multicentre cohort that prospectively enrolled consecutive patients aged ≥65 years with venous thromboembolism at nine Swiss hospitals from 09/2009 to 03/2012. Patients will be followed up until December 2013. The cohort includes a biobank with biological material from each participant taken at baseline and after 12 months of follow-up. Whole blood from all participants is assayed with a standard haematology panel, for which fresh samples are required. Two buffy coat vials, one PAXgene Blood RNA System tube and one EDTA-whole blood sample are also collected at baseline for RNA/DNA extraction. Blood samples are processed and vialed within 1 h of collection and transported in batches to a central laboratory where they are stored in ultra-low temperature archives. All analyses of the same type are performed in the same laboratory in batches. Using multiple core laboratories increased the speed of sample analyses and reduced storage time. After recruiting, processing and analyzing the blood of more than 1,000 patients, we determined that the adopted methods and technologies were fit-for-purpose and robust.
Abstract:
Adherence patterns and their influence on virologic outcome are well characterized for protease inhibitor (PI)- and non-nucleoside reverse transcriptase inhibitor (NNRTI)-based regimens. We aimed to determine how patterns of adherence to raltegravir influence the risk of virological failure. We conducted a prospective multicenter cohort study following 81 HIV-infected antiretroviral-naive or experienced subjects receiving or starting twice-daily raltegravir-based antiretroviral therapy. Their adherence patterns were monitored using the Medication Events Monitoring System. During follow-up (188 ± 77 days), 12 (15%) of 81 subjects experienced virological failure. A longer treatment interruption (adjusted odds ratio per 24-hour increase: 2.4; 95% confidence interval: 1.2 to 6.9; P < 0.02) and average adherence (odds ratio per 5% increase: 0.68; 95% confidence interval: 0.46 to 1.00; P < 0.05) were both independently associated with virological failure, controlling for prior duration of viral suppression. Timely interdose intervals and high levels of adherence to raltegravir are both necessary to control HIV replication.
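The model described here is a logistic regression of virological failure on two continuous adherence predictors. The following is a minimal sketch of that type of analysis on synthetic data; the variable names and effect sizes are illustrative assumptions, not study data.

# A sketch of the kind of logistic model described above: virological
# failure regressed on the longest treatment interruption (per 24 h) and
# average adherence (per 5%). Synthetic data; all names and effect sizes
# are illustrative assumptions, not study results.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "interruption_24h": rng.exponential(1.0, n),  # longest dosing gap, 24-h units
    "adherence_5pct": rng.uniform(14, 20, n),     # average adherence, 5% units
})
# Simulate failure: longer gaps raise risk, higher adherence lowers it
logit_p = -2.0 + 0.9 * df["interruption_24h"] - 0.4 * (df["adherence_5pct"] - 17)
df["failure"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("failure ~ interruption_24h + adherence_5pct", data=df).fit()
print(np.exp(fit.params))  # odds ratios per 24-h gap and per 5% adherence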
Abstract:
BACKGROUND: Although methicillin-susceptible Staphylococcus aureus (MSSA) native bone and joint infection (BJI) is the most frequent clinical form of BJI, prognostic studies have mostly focused on methicillin-resistant S. aureus prosthetic joint infection. We aimed to assess the determinants of native MSSA BJI outcomes. METHODS: Retrospective cohort study (2001-2011) of patients admitted to a reference hospital centre for native MSSA BJI. Determinants of treatment failure were assessed using Kaplan-Meier curves and binary logistic regression. RESULTS: Sixty-six patients (42 males [63.6%]; median age 61.2 years; interquartile range [IQR] 45.9-71.9) presented with acute (n = 38; 57.6%) or chronic (n = 28; 42.4%) native MSSA arthritis (n = 15; 22.7%), osteomyelitis (n = 19; 28.8%), or spondylodiscitis (n = 32; 48.5%), considered "difficult-to-treat" in 61 cases (92.4%). All received prolonged (27.1 weeks; IQR, 16.9-36.1) combined antimicrobial therapy, after surgical management in 37 cases (56.1%). Sixteen treatment failures (24.2%) were observed during a median follow-up of 63.3 weeks (IQR, 44.7-103.1), including 13 persisting infections, 1 relapse after treatment discontinuation, and 2 superinfections. Independent determinants of treatment failure were the existence of a sinus tract (odds ratio [OR], 5.300; 95% confidence interval [CI], 1.166-24.103) and a prolonged delay to infectious disease specialist referral (OR, 1.134; 95% CI, 1.013-1.271). CONCLUSIONS: The high treatment failure rate underscores how difficult cure is in complicated native MSSA BJI. Early referral to an infectious disease specialist is essential, especially in debilitated patients or in the presence of a sinus tract.
Abstract:
Whole-grain foods are touted for multiple health benefits, including enhancing insulin sensitivity and reducing type 2 diabetes risk. Recent genome-wide association studies (GWAS) have identified several single nucleotide polymorphisms (SNPs) associated with fasting glucose and insulin concentrations in individuals free of diabetes. We tested the hypothesis that whole-grain food intake and genetic variation interact to influence concentrations of fasting glucose and insulin. Via meta-analysis of data from 14 cohorts comprising ∼48,000 participants of European descent, we studied interactions of whole-grain intake with loci previously associated in GWAS with fasting glucose (16 loci) and/or insulin (2 loci) concentrations. For tests of interaction, we considered a P value <0.0028 (0.05/18 tests, Bonferroni-corrected) statistically significant. Greater whole-grain food intake was associated with lower fasting glucose and insulin concentrations independent of demographics, other dietary and lifestyle factors, and BMI (β [95% CI] per 1-serving-greater whole-grain intake: -0.009 mmol/l glucose [-0.013 to -0.005], P < 0.0001 and -0.011 pmol/l [ln] insulin [-0.015 to -0.007], P = 0.0003). No interactions met our multiple-testing-adjusted significance threshold. The strongest SNP interaction with whole-grain intake was rs780094 (GCKR) for fasting insulin (P = 0.006), where greater whole-grain intake was associated with a smaller reduction in fasting insulin concentrations in carriers of the insulin-raising allele. Our results support the favorable association of whole-grain intake with fasting glucose and insulin and suggest a potential interaction between variation in GCKR and whole-grain intake in influencing fasting insulin concentrations.
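Each interaction test here amounts to a regression of the glycemic trait on the SNP, the diet variable, and their product term, judged against the Bonferroni threshold 0.05/18 = 0.0028. A minimal sketch of one such test on synthetic data follows; the variable names (wg_servings, snp_dosage) and simulated effects are illustrative assumptions.

# A sketch of a single SNP x whole-grain interaction test of the kind
# meta-analysed above, on synthetic data; names and effect sizes are
# illustrative assumptions, not study estimates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "wg_servings": rng.poisson(2, n),        # whole-grain servings per day
    "snp_dosage": rng.binomial(2, 0.4, n),   # count of glucose-raising alleles
})
# Fasting glucose with a main diet effect and a small gene-diet interaction
df["glucose"] = (5.2 - 0.009 * df["wg_servings"] + 0.05 * df["snp_dosage"]
                 + 0.002 * df["wg_servings"] * df["snp_dosage"]
                 + rng.normal(0, 0.5, n))

fit = smf.ols("glucose ~ wg_servings * snp_dosage", data=df).fit()
# Bonferroni threshold used in the abstract: 0.05 / 18 tests = 0.0028
print(fit.pvalues["wg_servings:snp_dosage"] < 0.05 / 18)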
Abstract:
OBJECTIVE: To assess the contribution of modifiable risk factors to social inequalities in the incidence of type 2 diabetes when these factors are measured at study baseline or repeatedly over follow-up and when long term exposure is accounted for. DESIGN: Prospective cohort study with risk factors (health behaviours (smoking, alcohol consumption, diet, and physical activity), body mass index, and biological risk markers (systolic blood pressure, triglycerides, and high density lipoprotein cholesterol)) measured four times and diabetes status assessed seven times between 1991-93 and 2007-09. SETTING: Civil service departments in London (Whitehall II study). PARTICIPANTS: 7237 adults without diabetes (mean age 49.4 years; 2196 women). MAIN OUTCOME MEASURES: Incidence of type 2 diabetes and contribution of risk factors to its association with socioeconomic status. RESULTS: Over a mean follow-up of 14.2 years, 818 incident cases of diabetes were identified. Participants in the lowest occupational category had a 1.86-fold (hazard ratio 1.86, 95% confidence interval 1.48 to 2.32) greater risk of developing diabetes relative to those in the highest occupational category. Health behaviours and body mass index explained 33% (-1% to 78%) of this socioeconomic differential when risk factors were assessed at study baseline (attenuation of hazard ratio from 1.86 to 1.51), 36% (22% to 66%) when they were assessed repeatedly over the follow-up (attenuated hazard ratio 1.48), and 45% (28% to 75%) when long term exposure over the follow-up was accounted for (attenuated hazard ratio 1.41). With additional adjustment for biological risk markers, a total of 53% (29% to 88%) of the socioeconomic differential was explained (attenuated hazard ratio 1.35, 1.05 to 1.72). CONCLUSIONS: Modifiable risk factors such as health behaviours and obesity, when measured repeatedly over time, explain almost half of the social inequalities in incidence of type 2 diabetes. This is more than was seen in previous studies based on a single measurement of risk factors.
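The "percentage explained" figures pair with the attenuated hazard ratios via the standard attenuation formula on the log hazard ratio scale; the short check below roughly reproduces the reported 33-53%. This is a sketch of one common computation, not necessarily the study's exact method (which also yields the confidence intervals).

# Attenuation of the hazard ratio on the log scale:
# (ln HR_unadjusted - ln HR_adjusted) / ln HR_unadjusted.
# Roughly reproduces the abstract's 33-53%; the study's exact
# computation may differ.
from math import log

hr_base = 1.86  # lowest vs highest occupational category, unadjusted
for label, hr_adj in [("baseline risk factors", 1.51),
                      ("repeated measurements", 1.48),
                      ("long-term exposure", 1.41),
                      ("+ biological risk markers", 1.35)]:
    explained = (log(hr_base) - log(hr_adj)) / log(hr_base)
    print(f"{label}: {explained:.0%} explained")  # ~34%, 37%, 45%, 52%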
Abstract:
The substantial recurrence rate of colorectal cancer following potentially curative resection has fuelled the search for effective adjuvant therapy. Previous trials using 5-fluorouracil (5-FU) as a single agent or in combination chemotherapy regimens did not demonstrate meaningful benefits, an impression reflected in the results of a meta-analysis encompassing large patient numbers. Newer developments utilizing intraportal chemotherapy and the combination of 5-FU and levamisole have resulted in lower recurrence rates and improved survival in patients with colon cancer. In advanced disease, the biochemical modulation of 5-FU by leucovorin has been shown to prolong survival in some studies. Combined chemotherapy and radiotherapy, or chemotherapy alone, have shown promising results in rectal cancer. These developments have now been incorporated into ongoing trials.
Abstract:
The success of combination antiretroviral therapy is limited by the evolutionary escape dynamics of HIV-1. We used Isotonic Conjunctive Bayesian Networks (I-CBNs), a class of probabilistic graphical models, to describe this process. We employed partial order constraints among viral resistance mutations, which give rise to a limited set of mutational pathways, and we modeled phenotypic drug resistance as monotonically increasing along any escape pathway. Using this model, the individualized genetic barrier (IGB) to each drug is derived as the probability of the virus not acquiring additional mutations that confer resistance. Drug-specific IGBs were combined to obtain the IGB to an entire regimen, which quantifies the virus's genetic potential for developing drug resistance under combination therapy. The IGB was tested as a predictor of therapeutic outcome using between 2,185 and 2,631 treatment change episodes of subtype B-infected patients from the Swiss HIV Cohort Study Database, a large observational cohort. Using logistic regression, significant univariate predictors included most of the 18 drugs and single-drug IGBs, the IGB to the entire regimen, the expert-rules-based genotypic susceptibility score (GSS), several individual mutations, and the peak viral load before treatment change. In the multivariate analysis, the only genotype-derived variables that remained significantly associated with virological success were the GSS and, with a 10-fold stronger association, the IGB to the regimen. When predicting suppression of viral load below 400 copies/ml, the IGB outperformed the GSS and also significantly improved GSS-containing predictors, but the difference was not significant for suppression below 50 copies/ml. Thus, the IGB to a regimen is a novel data-derived predictor of treatment outcome with the potential to improve the interpretation of genotypic drug resistance tests.
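The abstract defines per-drug IGBs as probabilities of not acquiring further resistance mutations and combines them into a regimen-level IGB. The toy sketch below multiplies per-drug probabilities under an independence assumption; this is a deliberate simplification for illustration only, since the actual I-CBN model derives these probabilities jointly from partial orders over mutational pathways, and its exact combination rule may differ.

# Toy illustration of combining drug-specific individualized genetic
# barriers (IGBs) into a regimen-level IGB. Independence across drugs is
# a simplifying assumption for illustration, not the paper's method.
def regimen_igb(drug_igbs):
    """P(no resistance develops to any drug), assuming independence."""
    p = 1.0
    for igb in drug_igbs.values():
        p *= igb
    return p

# Hypothetical per-drug IGBs: P(virus does NOT acquire resistance)
drugs = {"TDF": 0.92, "3TC": 0.70, "EFV": 0.65}
print(f"regimen IGB: {regimen_igb(drugs):.3f}")  # 0.92*0.70*0.65 = 0.419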
Abstract:
BACKGROUND: Elderly patients are emerging as a population at high risk for infective endocarditis (IE). However, adequately sized prospective studies on the features of IE in elderly patients are lacking. METHODS: In this multinational, prospective, observational cohort study within the International Collaboration on Endocarditis, 2759 consecutive patients were enrolled from June 15, 2000, to December 1, 2005; 1056 patients with IE aged 65 years or older were compared with 1703 patients younger than 65 years. Risk factors, predisposing conditions, origin, clinical features, course, and outcome of IE were comprehensively analyzed. RESULTS: Elderly patients more frequently reported a hospitalization or an invasive procedure before IE onset. Diabetes mellitus and genitourinary and gastrointestinal cancer were the major predisposing conditions. Blood culture yield was higher among elderly patients with IE. The leading causative organism was Staphylococcus aureus, with a higher rate of methicillin resistance. Streptococcus bovis and enterococci were also significantly more prevalent. The clinical presentation of elderly patients with IE was remarkable for lower rates of embolism, immune-mediated phenomena, and septic complications. At both echocardiography and surgery, fewer vegetations and more abscesses were found, and the gain in diagnostic yield from transesophageal echocardiography was significantly larger. Significantly fewer elderly patients underwent cardiac surgery (38.9% vs 53.5%; P < .001). Elderly patients with IE showed a higher rate of in-hospital death (24.9% vs 12.8%; P < .001), and age older than 65 years was an independent predictor of mortality. CONCLUSIONS: In this large prospective study, increasing age emerges as a major determinant of the clinical characteristics of IE. Lower rates of surgical treatment and high mortality are the most prominent features of elderly patients with IE. Efforts should be made to prevent health care-associated acquisition and improve outcomes in this major subgroup of patients with IE.
Abstract:
BACKGROUND: Whether nucleoside reverse transcriptase inhibitors increase the risk of myocardial infarction in HIV-infected individuals is unclear. Our aim was to explore whether exposure to such drugs was associated with an excess risk of myocardial infarction in a large, prospective observational cohort of HIV-infected patients. METHODS: We used Poisson regression models to quantify the relation between cumulative, recent (currently or within the preceding 6 months), and past use of zidovudine, didanosine, stavudine, lamivudine, and abacavir and the development of myocardial infarction in 33,347 patients enrolled in the D:A:D study. We adjusted for cardiovascular risk factors that are unlikely to be affected by antiretroviral therapy, cohort, calendar year, and use of other antiretrovirals. FINDINGS: Over 157,912 person-years, 517 patients had a myocardial infarction. We found no associations between the rate of myocardial infarction and cumulative or recent use of zidovudine, stavudine, or lamivudine. By contrast, recent, but not cumulative, use of abacavir or didanosine was associated with an increased rate of myocardial infarction (compared with those with no recent use of the drugs, relative rate 1.90, 95% CI 1.47-2.45 [p=0.0001] with abacavir and 1.49, 1.14-1.95 [p=0.003] with didanosine); rates were not significantly increased in those who had stopped these drugs more than 6 months previously compared with those who had never received them. After adjustment for predicted 10-year risk of coronary heart disease, recent use of both didanosine and abacavir remained associated with increased rates of myocardial infarction (1.49, 1.14-1.95 [p=0.004] with didanosine; 1.89, 1.47-2.45 [p=0.0001] with abacavir). INTERPRETATION: There is an increased risk of myocardial infarction in patients exposed to abacavir or didanosine within the preceding 6 months. The excess risk does not appear to be explained by underlying established cardiovascular risk factors and was not present beyond 6 months after drug cessation.
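The relative rates here come from Poisson regression of event counts on exposure with follow-up time as an offset. The sketch below shows that model structure on synthetic data; the covariate names and simulated relative rate are assumptions for illustration, with only the crude-rate arithmetic taken from the abstract.

# A sketch of a Poisson rate model with a person-years offset, the type
# of analysis named above. Data and effect sizes are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Crude rate implied by the abstract: 517 MIs over 157,912 person-years
print(f"crude rate: {517 / 157912 * 1000:.2f} MIs per 1000 person-years")

rng = np.random.default_rng(1)
n = 10000
df = pd.DataFrame({
    "recent_abacavir": rng.binomial(1, 0.3, n),
    "person_years": rng.uniform(0.5, 8.0, n),
})
# Simulate events at a base rate of 0.003/PY, doubled-ish under exposure
mu = 0.003 * df["person_years"] * 1.9 ** df["recent_abacavir"]
df["mi"] = rng.poisson(mu)

fit = smf.glm("mi ~ recent_abacavir", data=df, family=sm.families.Poisson(),
              offset=np.log(df["person_years"])).fit()
print(f"relative rate: {np.exp(fit.params['recent_abacavir']):.2f}")  # ~1.9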
Abstract:
BACKGROUND: The advent of highly active antiretroviral therapy (HAART) in 1996 led to a decrease in the incidence of Kaposi's sarcoma (KS) and non-Hodgkin's lymphoma (NHL), but not of other cancers, among people with HIV or AIDS (PWHA). It also led to marked increases in their life expectancy. METHODS: We conducted a record-linkage study between the Swiss HIV Cohort Study and nine Swiss cantonal cancer registries. In total, 9429 PWHA contributed 20,615, 17,690, and 15,410 person-years in the pre-, early-, and late-HAART periods, respectively. Standardised incidence ratios (SIRs) in PWHA versus the general population, as well as age-standardised and age-specific incidence rates, were computed for the different periods. RESULTS: The incidence of KS and NHL decreased severalfold between the pre- and early-HAART periods, and declined further from the early- to the late-HAART period. The incidence of cancers of the anus, liver, and non-melanomatous skin, and of Hodgkin's lymphoma, increased in the early-HAART period compared with the pre-HAART period, but not during the late-HAART period. The incidence of all non-AIDS-defining cancers (NADCs) combined was similar in all periods, and approximately double that in the general population. CONCLUSIONS: Increases in the incidence of selected NADCs after the introduction of HAART were largely accounted for by the ageing of PWHA.
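A standardised incidence ratio is observed cases divided by the cases expected if general-population, age-specific rates applied to the cohort's person-years. The toy computation below illustrates the definition; all numbers in it are made up, chosen only so the result lands near the "approximately double" SIR reported above.

# Toy standardised incidence ratio (SIR) computation; all inputs are
# made-up illustrative values, not Swiss registry data.
def sir(observed, person_years_by_age, population_rates_by_age):
    """SIR = observed / expected, where expected applies age-specific
    general-population rates to the cohort's person-years."""
    expected = sum(py * population_rates_by_age[age]
                   for age, py in person_years_by_age.items())
    return observed / expected

person_years = {"30-39": 8000.0, "40-49": 6000.0, "50-59": 3000.0}
rates = {"30-39": 0.0002, "40-49": 0.0005, "50-59": 0.0012}  # cases per PY
print(f"SIR: {sir(18, person_years, rates):.2f}")
# expected = 1.6 + 3.0 + 3.6 = 8.2, so SIR = 18 / 8.2 ≈ 2.20,
# i.e. roughly double the general-population incidence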
Abstract:
Background: Maturation of amplitude-integrated electroencephalogram (aEEG) activity is influenced by both gestational age (GA) and postmenstrual age. It is not fully known how this process is influenced by cerebral lesions. Objective: To compare early aEEG developmental changes between preterm newborns with different degrees of cerebral lesions on cranial ultrasound (cUS). Methods: Prospective cohort study of preterm newborns with GA <32.0 weeks undergoing continuous aEEG recording during the first 84 h after birth. aEEG characteristics were evaluated qualitatively and quantitatively using pre-established criteria. Based on cUS findings, three groups were formed: normal (n = 78), mild (n = 20), and severe cerebral lesions (n = 6). Linear mixed models for repeated measures were used to analyze aEEG maturational trajectories. Results: 104 newborns with a mean (range) GA of 29.5 (24.4-31.7) weeks and a birth weight of 1,220 (580-2,020) g were recruited. Newborns with severe brain lesions started with similar aEEG scores and a tendency toward lower aEEG amplitudes compared with newborns without brain lesions, and showed slower development of cyclic activity (p < 0.001) but a more rapid increase of the maximum and minimum aEEG amplitudes (p = 0.002 and p = 0.04). Conclusions: Preterm infants with severe cerebral lesions manifest a maturational delay in aEEG cyclic activity early after birth, but show a catch-up of aEEG amplitudes toward those of newborns without cerebral lesions. Changes in the maturational aEEG pattern may be a marker of severe neurological lesions in the preterm infant.
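Trajectory comparisons of this kind are typically fitted as a mixed model with a random intercept per infant and a group-by-time interaction for the difference in maturational slope. The sketch below shows that structure on synthetic data; the variable names, group sizes, and simulated slopes are illustrative assumptions, not the study's data.

# A sketch of a linear mixed model for repeated aEEG measures: random
# intercept per infant, group x time interaction for differing slopes.
# Synthetic data; names and effects are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for infant in range(60):
    severe = int(infant < 6)              # a few infants with severe lesions
    intercept = 4.0 + rng.normal(0, 0.5)  # infant-specific starting level
    slope = 0.04 - 0.02 * severe          # slower maturation if severe
    for hour in range(0, 84, 12):
        rows.append({"infant": infant, "severe": severe, "hour": hour,
                     "aeeg": intercept + slope * hour + rng.normal(0, 0.3)})
df = pd.DataFrame(rows)

fit = smf.mixedlm("aeeg ~ hour * severe", df, groups=df["infant"]).fit()
print(fit.params["hour:severe"])  # estimated difference in maturational slope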