896 results for Cohort-based supervision
Abstract:
Context: Daytime sleepiness in kidney transplant recipients has emerged as a potential predictor of impaired adherence to the immunosuppressive medication regimen. There is therefore a need to assess daytime sleepiness in clinical practice and transplant registries. Objective: To evaluate the validity of a single-item measure of daytime sleepiness integrated in the Swiss Transplant Cohort Study (STCS), using the American Educational Research Association framework. Methods: Using a cross-sectional design, we enrolled a convenience sample of 926 home-dwelling kidney transplant recipients (median age, 59.69 years; 25%-75% quartile [Q25-Q75], 50.27-59.69; 63% men; median time since transplant, 9.42 years [Q25-Q75, 4.93-15.85]). Daytime sleepiness was assessed using a single item from the STCS and the 8 items of the validated Epworth Sleepiness Scale. Receiver operating characteristic curve analysis was used to determine the cutoff for the STCS daytime sleepiness item against the Epworth Sleepiness Scale score. Results: Based on the receiver operating characteristic curve analysis, a score greater than 4 on the STCS daytime sleepiness item is recommended to detect daytime sleepiness. Content validity was high, as expert reviews were unanimous. Concurrent validity was moderate (Spearman ϱ = 0.531, P < .001), and convergent validity with depression and poor sleep quality, although low, was significant (ϱ = 0.235, P < .001 and ϱ = 0.318, P = .002, respectively). Regarding group-difference validity, kidney transplant recipients with moderate, severe, and extremely severe depressive symptom scores had 3.4, 4.3, and 5.9 times higher odds of daytime sleepiness, respectively, compared with recipients without depressive symptoms. Conclusion: The accumulated evidence supports the validity of the STCS daytime sleepiness item as a simple screening scale for daytime sleepiness.
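The cutoff determination described above reduces to a standard ROC analysis. Below is a minimal sketch in Python with simulated data, not the STCS dataset; the variable names and the Epworth-derived reference standard are assumptions for illustration.

```python
# Minimal sketch of ROC-based cutoff selection using Youden's index.
# Simulated data: `item` is a hypothetical 0-10 single-item sleepiness score,
# `sleepy` a binary reference standard (e.g. derived from the Epworth scale).
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
item = rng.integers(0, 11, size=500)
sleepy = (item + rng.normal(0, 2, size=500) > 6).astype(int)

fpr, tpr, thresholds = roc_curve(sleepy, item)
j = tpr - fpr                      # Youden's J = sensitivity + specificity - 1
cutoff = thresholds[np.argmax(j)]
print(f"classify as sleepy when the item score is >= {cutoff}")
```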
Abstract:
OBJECTIVE: Metabolic changes caused by antiretroviral therapy (ART) may increase the risk of coronary heart disease (CHD). We evaluated changes in the prevalence of cardiovascular risk factors (CVRFs) and 10-year risk of CHD in a large cohort of HIV-infected individuals. METHODS: All individuals from the Swiss HIV Cohort Study (SHCS) who completed at least one CVRF questionnaire and for whom laboratory data were available for the period February 2000 to February 2006 were included in the analysis. The presence of a risk factor was determined using cut-offs based on the guidelines of the National Cholesterol Education Program (NCEP ATP III), the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure (JNC7), the American Diabetes Association, and the Swiss Society for Cardiology. RESULTS: Overall, 8,033 individuals completed at least one CVRF questionnaire. The most common CVRFs in the first completed questionnaire were smoking (57.0%), low high-density lipoprotein (HDL) cholesterol (37.2%), high triglycerides (35.7%), and high blood pressure (26.1%). In total, 2.7% and 13.8% of patients were categorized as being at high (>20%) and moderate (10-20%) 10-year risk for CHD, respectively. Over 6 years the percentage of smokers decreased from 61.4% to 47.6% and the percentage of individuals with total cholesterol >6.2 mmol/L decreased from 21.1% to 12.3%. The prevalence of CVRFs and CHD risk was higher in patients currently on ART than in either pretreated or ART-naive patients. CONCLUSION: During the 6-year observation period, the prevalence of CVRFs remained high in the SHCS. Time trends indicate a decrease in the percentage of smokers and of individuals with high cholesterol.
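The risk-factor definitions are threshold checks against guideline cut-offs, which can be expressed as simple flagging logic. A sketch using commonly cited NCEP ATP III / JNC7 thresholds; the study's exact operational definitions may differ, and the input record is hypothetical.

```python
# Sketch of CVRF flagging with guideline-style cut-offs. The thresholds shown
# are the widely used NCEP ATP III / JNC7 values; the study's exact
# operational definitions (and sex-specific HDL cut-offs) may differ.
def cvrf_flags(p):
    return {
        "smoking": p["smoker"],
        "low HDL": p["hdl_mmol_l"] < 1.0,                          # ~40 mg/dl, men
        "high triglycerides": p["tg_mmol_l"] > 1.7,                # ~150 mg/dl
        "high blood pressure": p["sbp"] >= 140 or p["dbp"] >= 90,  # JNC7
        "high total cholesterol": p["tc_mmol_l"] > 6.2,            # as in the abstract
    }

print(cvrf_flags({"smoker": True, "hdl_mmol_l": 0.9, "tg_mmol_l": 2.0,
                  "sbp": 135, "dbp": 85, "tc_mmol_l": 5.4}))
```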
Abstract:
OBJECTIVE: To estimate the cumulative incidence of severe complications associated with genital chlamydia infection in the general female population. METHODS: The Uppsala Women's Cohort Study was a retrospective population-based cohort study in Sweden, linking laboratory, hospital, and population registers. We estimated the cumulative incidence of hospital-diagnosed pelvic inflammatory disease, ectopic pregnancy, and infertility, and used multivariable regression models to estimate hazard ratios according to screening status. RESULTS: We analysed complete data from 43,715 women in Uppsala aged 15-24 years between January 1985 and December 1989. Follow-up until the end of 1999 included 709,000 woman-years and 3,025 events. The cumulative incidence of pelvic inflammatory disease by age 35 years was 3.9% (95% CI 3.7% to 4.0%) overall: 5.6% (4.7% to 6.7%) in women who ever tested positive for chlamydia, 4.0% (3.7% to 4.4%) in those with negative tests, and 2.9% (2.7% to 3.2%) in those who were never screened. The corresponding figures were: for ectopic pregnancy, 2.3% (2.2% to 2.5%) overall, 2.7% (2.1% to 3.5%), 2.0% (1.8% to 2.3%), and 1.9% (1.7% to 2.1%); and for infertility, 4.1% (3.9% to 4.3%) overall, 6.7% (5.7% to 7.9%), 4.7% (4.4% to 5.1%), and 3.1% (2.8% to 3.3%). Low educational attainment was strongly associated with the development of all outcomes. CONCLUSIONS: The incidence of severe chlamydia-associated complications estimated from our study and other population-based studies was lower than expected. Studies that incorporate data about pelvic inflammatory disease diagnosed in primary care and behavioural risk factors would further improve our understanding of the natural history of chlamydia. Our results provide reassurance for patients but mean that the benefits of chlamydia screening programmes might have been overestimated.
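Cumulative incidence by a given age can be read off as the complement of a Kaplan-Meier survival curve. Below is a hedged sketch on simulated data using the lifelines package (an assumption; the study itself used linked registers and multivariable regression), with group risks loosely following the abstract's PID figures.

```python
# Sketch: cumulative incidence by a time horizon as 1 - S(t) from Kaplan-Meier,
# stratified by screening status. Simulated data; group risks loosely follow
# the abstract's PID figures (5.6% / 4.0% / 2.9%).
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
n = 5000
group = rng.choice(["ever positive", "negative", "never screened"], size=n)
risk = {"ever positive": 0.056, "negative": 0.040, "never screened": 0.029}
event = np.array([rng.random() < risk[g] for g in group])
time = np.where(event, rng.uniform(1, 20, n), 20.0)   # years from age 15 to 35

km = KaplanMeierFitter()
for g in risk:
    km.fit(time[group == g], event[group == g])
    ci = 1 - km.survival_function_at_times(20.0).iloc[0]
    print(f"{g}: cumulative incidence by age 35 ~ {ci:.1%}")
```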
Abstract:
BACKGROUND: No large clinical end-point trials have been conducted comparing regimens among human immunodeficiency virus type 1-positive persons starting antiretroviral therapy. We examined clinical progression according to initial regimen in the Antiretroviral Therapy Cohort Collaboration, which is based on 12 European and North American cohort studies. METHODS: We analyzed progression to death from any cause and to AIDS or death (AIDS/death), comparing efavirenz (EFV), nevirapine (NVP), nelfinavir, indinavir, ritonavir (RTV), RTV-boosted protease inhibitors (PIs), saquinavir, and abacavir. We also compared nucleoside reverse-transcriptase inhibitor pairs: zidovudine/lamivudine (AZT/3TC), stavudine (D4T)/3TC, D4T/didanosine (DDI), and others. RESULTS: A total of 17,666 treatment-naive patients, 55,622 person-years at risk, 1,617 new AIDS events, and 895 deaths were analyzed. Compared with EFV, the adjusted hazard ratio (HR) for AIDS/death was 1.28 (95% confidence interval [CI], 1.03-1.60) for NVP, 1.31 (95% CI, 1.01-1.71) for RTV, and 1.45 (95% CI, 1.15-1.81) for RTV-boosted PIs. For death, the adjusted HR for NVP was 1.65 (95% CI, 1.16-2.36). The adjusted HR for death for D4T/3TC was 1.35 (95% CI, 1.14-1.59), compared with AZT/3TC. CONCLUSIONS: Outcomes may vary across initial regimens. Results are observational and may have been affected by bias due to unmeasured or residual confounding. There is a need for large, randomized, clinical end-point trials.
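The regimen comparison is a Cox proportional hazards model with EFV as the reference category. A minimal sketch on simulated survival data using lifelines' CoxPHFitter; the ART-CC analysis adjusted for baseline covariates that are omitted here, so this only illustrates the model form.

```python
# Sketch of a Cox model comparing initial regimens against an EFV reference.
# Data are simulated so that the true hazard ratios match the abstract's
# adjusted estimates; no baseline covariates are modelled.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 3000
regimen = rng.choice(["EFV", "NVP", "boosted_PI"], size=n)
true_hr = {"EFV": 1.0, "NVP": 1.28, "boosted_PI": 1.45}
t_event = rng.exponential(scale=np.array([10.0 / true_hr[r] for r in regimen]))
time = np.minimum(t_event, 8.0)              # administrative censoring at 8 years
event = (t_event <= 8.0).astype(int)

df = pd.DataFrame({"T": time, "E": event, "regimen": regimen})
df = pd.get_dummies(df, columns=["regimen"], dtype=float).drop(columns="regimen_EFV")
cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(np.exp(cph.params_))                   # estimated HRs vs the EFV reference
```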
Abstract:
BACKGROUND: IL-18 is a pleiotropic cytokine involved in both T-helper type 1 (Th1) and Th2 differentiation. Recently, genetic variants in the IL-18 gene have been associated with an increased risk of atopy and asthma. OBJECTIVE: To examine the relationship of a haplotype-tagging promoter variant (-137G/C) in the IL-18 gene with atopic asthma in a large, well-characterized, population-based study of adults. METHODS: A prospective cohort study design was used to collect interview and biological measurement data at two examination time-points 11 years apart. Multivariate logistic regression analysis was used to assess the association of genotype with asthma and atopy. RESULTS: The G-allele of the IL-18 promoter variant (-137G/C) was associated with a markedly increased prevalence of physician-diagnosed asthma with concomitant skin reactivity to common allergens. Stratification of the asthma cases by skin reactivity to common allergens revealed an exclusive association of the IL-18 -137 G-allele with an increased prevalence of atopic asthma (adjusted odds ratio [OR] 3.63; 95% confidence interval 1.64-8.02 for GC or GG carriers vs. CC carriers) and no corresponding association with asthma with concomitant negative skin reactivity (adjusted OR 1.13; 95% CI 0.66-1.94). The interaction between the IL-18 -137G/C genotype and a positive skin prick test was statistically significant (P = 0.029). None of 74 incident asthma cases with atopy at baseline exhibited the CC genotype. CONCLUSION: Our results strongly suggest that this variant of the IL-18 gene is an important genetic determinant involved in the development of atopic asthma.
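The stratified analysis and the reported genotype-by-atopy interaction correspond to a logistic model with an interaction term. A sketch with simulated data using statsmodels; the effect sizes are chosen for illustration, not taken from the study.

```python
# Sketch of a logistic regression with a genotype-by-atopy interaction term.
# Simulated data; `g` is G-allele carriage (GC/GG vs CC), `atopy` a positive
# skin prick test. The original models adjusted for additional covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 4000
g = (rng.random(n) < 0.7).astype(int)
atopy = (rng.random(n) < 0.3).astype(int)
lin = -3 + 0.1 * g + 0.5 * atopy + 1.2 * g * atopy   # illustrative effects
asthma = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

df = pd.DataFrame({"asthma": asthma, "g": g, "atopy": atopy})
fit = smf.logit("asthma ~ g * atopy", data=df).fit(disp=0)
print(np.exp(fit.params))                  # odds ratios; "g:atopy" is the interaction
print("interaction P =", fit.pvalues["g:atopy"])
```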
Abstract:
In natural history studies of chronic disease, it is of interest to understand the evolution of key variables that measure aspects of disease progression. This is particularly true for immunological variables in persons infected with the Human Immunodeficiency Virus (HIV). The natural timescale for such studies is time since infection. However, most data available for analysis arise from prevalent cohorts, where the date of infection is unknown for most or all individuals. As a result, standard curve fitting algorithms are not immediately applicable. Here we propose two methods to circumvent this difficulty. The first uses repeated measurement data to provide information not only on the level of the variable of interest, but also on its rate of change, while the second uses an estimate of the expected time since infection. Both methods are based on the principal curves algorithm of Hastie and Stuetzle, and are applied to data from a prevalent cohort of HIV-infected homosexual men, giving estimates of the average pattern of CD4+ lymphocyte decline. These methods are applicable to natural history studies using data from prevalent cohorts where the time of disease origin is uncertain, provided certain ancillary information is available from external sources.
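For intuition, here is a heavily simplified sketch of the Hastie-Stuetzle principal curve iteration on toy 2D data: alternately smooth each coordinate against the current projection index, then re-project each point onto the fitted curve indexed by arc length. The paper's adaptations for repeated measurements and unknown infection times are not reproduced.

```python
# Simplified principal-curve iteration (Hastie & Stuetzle). Toy data only.
import numpy as np
import pandas as pd

def smooth(y, lam, span=0.15):
    # running-mean smoother of y against the projection index lam
    order = np.argsort(lam)
    k = max(5, int(span * len(y)))
    sm = pd.Series(y[order]).rolling(k, center=True, min_periods=1).mean().to_numpy()
    out = np.empty_like(sm)
    out[order] = sm
    return out

def principal_curve(X, n_iter=15):
    # initialize the index with first-principal-component scores
    Xc = X - X.mean(0)
    lam = Xc @ np.linalg.svd(Xc, full_matrices=False)[2][0]
    for _ in range(n_iter):
        curve = np.column_stack([smooth(X[:, j], lam) for j in range(X.shape[1])])
        order = np.argsort(lam)
        arc = np.concatenate([[0.0], np.cumsum(
            np.linalg.norm(np.diff(curve[order], axis=0), axis=1))])
        d = ((X[:, None, :] - curve[order][None, :, :]) ** 2).sum(axis=-1)
        lam = arc[np.argmin(d, axis=1)]   # nearest fitted point's arc length
    return curve, lam

t = np.linspace(0, np.pi, 300)
X = np.column_stack([np.cos(t), np.sin(t)])
X += np.random.default_rng(4).normal(0, 0.05, X.shape)
curve, lam = principal_curve(X)           # fitted curve and projection indices
```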
Abstract:
INTRODUCTION: The patterns of and reasons for antiretroviral therapy (ART) drug substitutions are poorly described in resource-limited settings. METHODS: Time to and reason for drug substitution were recorded in treatment-naive adults receiving ART in two primary care treatment programmes in Cape Town. The cumulative proportion of patients having therapy changed because of toxicity was described for each drug, and associations with these changes were explored in multivariate models. RESULTS: The analysis included 2,679 individuals followed for a median of 11 months. The median CD4+ T-cell count at baseline was 85 cells/µl. Mean weight was 59 kg, mean age was 32 years, and 71% were women. All started non-nucleoside reverse transcriptase inhibitor-based ART (60% on efavirenz) and 75% started on stavudine (d4T). After 3 years, 75% remained in care on-site, of whom 72% remained on their initial regimen. Substitutions due to toxicity of nevirapine (8% by 3 years), efavirenz (2%) and zidovudine (8%) occurred early. Substitution of d4T had occurred in 21% of patients by 3 years, due to symptomatic hyperlactataemia (5%), lipodystrophy (9%) or peripheral neuropathy (6%), and continued to accumulate over time. Those at greatest risk of hyperlactataemia or lipodystrophy were women on ART for ≥6 months who weighed ≥75 kg at baseline. DISCUSSION: A high proportion of adult patients are able to tolerate their initial ART regimen for up to 3 years. In most instances treatment-limiting toxicities occur early, but they continue to accumulate over time in patients on d4T. Whilst awaiting other treatment options, the risks of known toxicities could be minimized through early identification of the patients at highest risk.
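The per-drug cumulative proportions substituted for specific toxicities are cause-specific cumulative incidences, where other substitution reasons act as competing events. A sketch on simulated data, assuming lifelines' AalenJohansenFitter; the event probabilities are loosely modelled on the abstract's 3-year d4T figures.

```python
# Sketch of cause-specific cumulative incidence of d4T substitution, treating
# each toxicity as the event of interest and the others as competing risks.
# Simulated data; assumes lifelines' AalenJohansenFitter.
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(5)
n = 2000
time = rng.uniform(1, 36, n)   # months on d4T
# 0 = still on drug/censored, 1 = neuropathy, 2 = lipodystrophy, 3 = hyperlactataemia
event = rng.choice([0, 1, 2, 3], size=n, p=[0.79, 0.06, 0.09, 0.06])

for cause, label in [(1, "neuropathy"), (2, "lipodystrophy"), (3, "hyperlactataemia")]:
    ajf = AalenJohansenFitter().fit(time, event, event_of_interest=cause)
    print(f"{label}: ~{ajf.cumulative_density_.iloc[-1, 0]:.1%} by 36 months")
```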
Abstract:
AIMS: To investigate the relationship between extremely low frequency magnetic field (ELF-MF) exposure and mortality from leukaemia and brain tumour in a cohort of Swiss railway workers. METHODS: 20,141 Swiss railway employees with 464,129 person-years of follow-up between 1972 and 2002 were studied. Mortality rates for leukaemia and brain tumour of highly exposed train drivers (21 µT average annual exposure) were compared with those of medium and low exposed occupational groups (e.g. station masters, with an average exposure of 1 µT). In addition, individual cumulative exposure was calculated from on-site measurements and modelling of past exposures. RESULTS: The hazard ratio (HR) for leukaemia mortality of train drivers was 1.43 (95% CI 0.74 to 2.77) compared with station masters. For myeloid leukaemia the HR of train drivers was 4.74 (95% CI 1.04 to 21.60) and for Hodgkin's disease 3.29 (95% CI 0.69 to 15.63). Lymphoid leukaemia, non-Hodgkin's disease and brain tumour mortality were not associated with magnetic field exposure. Concordant results were obtained from analyses based on individual cumulative exposure. CONCLUSIONS: Some evidence of an exposure-response association was found for myeloid leukaemia and Hodgkin's disease, but not for other haematopoietic and lymphatic malignancies and brain tumours.
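The individual cumulative exposure metric is, in outline, a sum of modelled annual field strengths over each employee's job history. A toy sketch using the abstract's group averages as illustrative intensities:

```python
# Toy sketch of individual cumulative ELF-MF exposure: modelled average annual
# field strength per job, summed over a hypothetical job history.
annual_uT = {"train driver": 21.0, "station master": 1.0}   # from the abstract
career = [("station master", 1972, 1980), ("train driver", 1980, 2002)]
cumulative = sum(annual_uT[job] * (end - start) for job, start, end in career)
print(f"cumulative exposure: {cumulative:.0f} µT-years")    # 8 + 462 = 470
```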
Abstract:
BACKGROUND: A growing number of case reports have described tenofovir (TDF)-related proximal renal tubulopathy and impaired calculated glomerular filtration rates (cGFR). We assessed TDF-associated changes in cGFR in a large observational HIV cohort. METHODS: We compared treatment-naive patients, or patients with treatment interruptions of ≥12 months, starting either a TDF-based combination antiretroviral therapy (cART) (n = 363) or a TDF-sparing regimen (n = 715). The predefined primary endpoint was the time to a 10 ml/min reduction in cGFR, based on the Cockcroft-Gault equation, confirmed by a follow-up measurement at least 1 month later. In sensitivity analyses, secondary endpoints, including calculations based on the Modification of Diet in Renal Disease (MDRD) formula, were considered. Endpoints were modelled using pre-specified covariates in a multiple Cox proportional hazards model. RESULTS: Two-year event-free probabilities were 0.65 (95% confidence interval [CI] 0.58-0.72) and 0.80 (95% CI 0.76-0.83) for patients starting TDF-containing or TDF-sparing cART, respectively. In the multiple Cox model, diabetes mellitus (hazard ratio [HR] = 2.34 [95% CI 1.24-4.42]), higher baseline cGFR (HR = 1.03 [95% CI 1.02-1.04] per 10 ml/min), TDF use (HR = 1.84 [95% CI 1.35-2.51]) and boosted protease inhibitor use (HR = 1.71 [95% CI 1.30-2.24]) significantly increased the risk of reaching the primary endpoint. Sensitivity analyses showed high consistency. CONCLUSION: There is consistent evidence for a significant reduction in cGFR associated with TDF use in HIV-infected patients. Our findings call for strict monitoring of renal function in long-term TDF users with tests that distinguish between glomerular dysfunction and proximal renal tubulopathy, a known adverse effect of TDF.
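The Cockcroft-Gault estimate that underlies the primary endpoint is a closed-form formula. A minimal implementation, with a hypothetical patient to show how a confirmed 10 ml/min reduction would be flagged:

```python
# Minimal Cockcroft-Gault implementation (weight in kg, serum creatinine in
# mg/dl; the 0.85 factor applies to women), applied to a hypothetical patient
# to flag the study's >=10 ml/min reduction endpoint.
def cockcroft_gault(age, weight_kg, scr_mg_dl, female=False):
    cgfr = (140 - age) * weight_kg / (72.0 * scr_mg_dl)
    return 0.85 * cgfr if female else cgfr

baseline = cockcroft_gault(45, 70, 1.00)    # ~92 ml/min
follow_up = cockcroft_gault(45, 70, 1.15)   # ~80 ml/min
print(f"drop of {baseline - follow_up:.1f} ml/min;",
      "endpoint met" if baseline - follow_up >= 10 else "endpoint not met")
```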
Abstract:
OBJECTIVE: To investigate predictors of continued HIV RNA viral load suppression in individuals switched to abacavir (ABC), lamivudine (3TC) and zidovudine (ZDV) after successful previous treatment with a protease inhibitor or non-nucleoside reverse transcriptase inhibitor-based combination antiretroviral therapy. DESIGN AND METHODS: An observational cohort study, which included individuals in the Swiss HIV Cohort Study switching to ABC/3TC/ZDV following successful suppression of viral load. The primary endpoint was time to treatment failure, defined as the first of the following events: two consecutive viral load measurements > 400 copies/ml under ABC/3TC/ZDV, one viral load measurement > 400 copies/ml with subsequent discontinuation of ABC/3TC/ZDV within 3 months, AIDS, or death. RESULTS: We included 495 individuals; 47 experienced treatment failure in 1,459 person-years of follow-up [rate = 3.22 events/100 person-years; 95% confidence interval (95% CI), 2.30-4.14]. Of all failures, 62% occurred in the first year after switching to ABC/3TC/ZDV. In a Cox regression analysis, treatment failure was independently associated with earlier exposure to nucleoside reverse transcriptase inhibitor (NRTI) mono or dual therapy [hazard ratio (HR), 8.02; 95% CI, 4.19-15.35] and low CD4 cell count at the time of the switch (HR, 0.66; 95% CI, 0.51-0.87 per +100 cells/µl up to 500 cells/µl). In patients without earlier exposure to mono or dual therapy, AIDS prior to the switch to simplified maintenance therapy was an additional risk factor. CONCLUSIONS: The failure rate was low in patients with suppressed viral load who switched to ABC/3TC/ZDV treatment. Patients with earlier exposure to mono or dual NRTI therapy, a low CD4 cell count at the time of switch, or prior AIDS are at increased risk of treatment failure, limiting the use of ABC/3TC/ZDV in these patient groups.
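The composite failure definition can be made concrete as a small decision function. A sketch with hypothetical measurement series; AIDS and death, the remaining components of the endpoint, would be handled separately.

```python
# Sketch of the virological part of the composite endpoint: two consecutive
# viral loads > 400 copies/ml, or one such value followed by discontinuation
# of ABC/3TC/ZDV within 3 months. Inputs are hypothetical.
def virological_failure(vls, stop_month=None):
    # vls: chronologically ordered (month, copies_per_ml) measurements
    if any(v1 > 400 and v2 > 400 for (_, v1), (_, v2) in zip(vls, vls[1:])):
        return True
    return any(v > 400 and stop_month is not None and 0 <= stop_month - m <= 3
               for m, v in vls)

print(virological_failure([(0, 50), (3, 900), (6, 1200)]))     # True: consecutive
print(virological_failure([(0, 50), (3, 900)], stop_month=5))  # True: stop within 3 months
```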
Abstract:
PURPOSE: Antiretroviral therapy (ART) may induce metabolic changes and increase the risk of coronary heart disease (CHD). Based on a health care system approach, we investigated predictors of normalization of dyslipidemia in HIV-infected individuals receiving ART. METHOD: Individuals included in the study were registered in the Swiss HIV Cohort Study (SHCS), had dyslipidemia but were not on lipid-lowering medication, had been on potent ART for ≥3 months, and had ≥2 follow-up visits. Dyslipidemia was defined as two consecutive total cholesterol (TC) values above recommended levels. Predictors of achieving treatment goals for TC were assessed using Cox models. RESULTS: The analysis included 958 individuals with a median follow-up of 2.3 years (IQR 1.2-4.0); 454 patients (47.4%) achieved TC treatment goals. In adjusted analyses, variables significantly associated with a lower hazard of reaching TC treatment goals were: older age (compared with 18-37-year-olds: hazard ratio [HR] 0.62 for 45-52-year-olds, 95% CI 0.47-0.82; HR 0.40 for 53-85-year-olds, 95% CI 0.29-0.54), diabetes (HR 0.39, 95% CI 0.26-0.59), history of coronary heart disease (HR 0.27, 95% CI 0.10-0.71), higher baseline TC (HR 0.78, 95% CI 0.71-0.85), a baseline triple-nucleoside regimen (HR 0.12 compared with a PI-only regimen, 95% CI 0.07-0.21), longer time on a PI-only regimen (HR 0.39, 95% CI 0.33-0.46), longer time on an NNRTI-only regimen (HR 0.35, 95% CI 0.29-0.43), and longer time on a PI/NNRTI regimen (HR 0.34, 95% CI 0.26-0.43). Switching the ART regimen while viral load was undetectable was associated with a higher hazard of reaching TC treatment goals (HR 1.48, 95% CI 1.14-1.91). CONCLUSION: In SHCS participants on ART, several ART-related and non-ART-related epidemiological factors were associated with insufficient control of dyslipidemia. Control of dyslipidemia in ART recipients must be further improved.
Abstract:
OBJECT: The effect of normobaric hyperoxia (fraction of inspired O2 [FIO2] concentration 100%) in the treatment of patients with traumatic brain injury (TBI) remains controversial. The aim of this study was to investigate the effects of normobaric hyperoxia on five cerebral metabolic indices, which have putative prognostic significance following TBI in humans. METHODS: At two independent neurointensive care units, the authors performed a prospective study of 52 patients with severe TBI who were treated for 24 hours with 100% FIO2, starting within 6 hours of admission. Data for these patients were compared with data for a cohort of 112 patients who were treated in the past; patients in the historical control group matched the patients in our study according to their Glasgow Coma Scale scores after resuscitation and their intracranial pressure within the first 8 hours after admission. Patients were monitored with the aid of intracerebral microdialysis and tissue O2 probes. RESULTS: Normobaric hyperoxia treatment resulted in a significant improvement in biochemical markers in the brain compared with the baseline measures for patients treated in our study (patients acting as their own controls) and also compared with findings from the historical control group. In the dialysate the glucose levels increased (369.02 +/- 20.1 µmol/L in the control group and 466.9 +/- 20.39 µmol/L in the 100% O2 group, p = 0.001), whereas the glutamate and lactate levels significantly decreased (p < 0.005). There were also reductions in the lactate/glucose and lactate/pyruvate ratios. Intracranial pressure in the treatment group was reduced significantly both during and after hyperoxia treatment compared with the control groups (15.03 +/- 0.8 mm Hg in the control group and 12.13 +/- 0.75 mm Hg in the 100% O2 group, p < 0.005), with no changes in cerebral perfusion pressure. Outcomes of the patients in the treatment group improved. CONCLUSIONS: The results of the study support the hypothesis that normobaric hyperoxia in patients with severe TBI improves the indices of brain oxidative metabolism. Based on these data, further mechanistic studies and a prospective randomized controlled trial are warranted.
Abstract:
INTRODUCTION: The paucity of data on resource use in critically ill patients with hematological malignancy, and these patients' perceived poor outcome, can lead to uncertainty over the extent to which intensive care treatment is appropriate. The aim of the present study was to assess the amount of intensive care resources needed for, and the effect of treatment of, hemato-oncological patients in the intensive care unit (ICU) in comparison with a nononcological patient population with a similar degree of organ dysfunction. METHODS: A retrospective cohort study of 101 ICU admissions of 84 consecutive hemato-oncological patients and 3,808 ICU admissions of 3,478 nononcological patients over a period of 4 years was performed. RESULTS: As assessed by Therapeutic Intervention Scoring System (TISS) points, resource use was higher in hemato-oncological patients than in nononcological patients (median (interquartile range), 214 (102 to 642) versus 95 (54 to 224); P < 0.0001). Severity of disease at ICU admission was a less important predictor of ICU resource use than the necessity for specific treatment modalities. Hemato-oncological patients and nononcological patients with similar admission Simplified Acute Physiology Score values had the same ICU mortality. In hemato-oncological patients, improvement of organ function within the first 48 hours of the ICU stay was the best predictor of 28-day survival. CONCLUSION: The presence of a hemato-oncological disease per se is associated with higher ICU resource use but not with increased mortality. If withdrawal of treatment is considered, this decision should not be based on admission parameters but rather on the evolution of organ dysfunction during the ICU stay.
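The abstract summarizes skewed TISS point totals as median (IQR); a nonparametric two-group comparison is the natural companion analysis, although the abstract does not name the test used. A sketch on simulated data:

```python
# Sketch of a median (IQR) summary and a nonparametric group comparison for
# skewed TISS point totals. Simulated data centred on the abstract's medians;
# the abstract does not state which test the authors used.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(6)
hemato = rng.lognormal(mean=np.log(214), sigma=0.9, size=101)
other = rng.lognormal(mean=np.log(95), sigma=0.9, size=3808)

for name, x in [("hemato-oncological", hemato), ("non-oncological", other)]:
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    print(f"{name}: median {med:.0f} (IQR {q1:.0f}-{q3:.0f}) TISS points")
print("P =", mannwhitneyu(hemato, other, alternative="two-sided").pvalue)
```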
Abstract:
BACKGROUND: We aimed to assess the value of a structured clinical assessment and genetic testing for refining the diagnosis of abacavir hypersensitivity reactions (ABC-HSRs) in a routine clinical setting. METHODS: We performed a diagnostic reassessment using a structured patient chart review in individuals who had stopped ABC because of suspected HSR. Two HIV physicians blinded to the human leukocyte antigen (HLA) typing results independently classified these individuals on a scale between 3 (ABC-HSR highly likely) and -3 (ABC-HSR highly unlikely). Scoring was based on symptoms, onset of symptoms, and comedication use. Patients were classified as clinically likely (mean score ≥2), uncertain (mean score ≥-1 and ≤1), and unlikely (mean score ≤-2). HLA typing was performed using sequence-based methods. RESULTS: Of the 131 reassessed individuals, 27 (21%) were classified as likely, 43 (33%) as unlikely and 61 (47%) as uncertain ABC-HSR. Of the 131 individuals with suspected ABC-HSR, 31% were HLA-B*5701-positive, compared with 1% of 140 ABC-tolerant controls (P < 0.001). The HLA-B*5701 carriage rate was higher in individuals with likely ABC-HSR than in those with uncertain or unlikely ABC-HSR (78%, 30% and 5%, respectively; P < 0.001). Only six (7%) HLA-B*5701-negative individuals were classified as likely HSR after reassessment. CONCLUSIONS: HLA-B*5701 carriage is highly predictive of clinically diagnosed ABC-HSR. The high proportion of HLA-B*5701-negative individuals with minor symptoms among individuals with suspected HSR indicates overdiagnosis of ABC-HSR in the era preceding genetic screening. A structured clinical assessment and genetic testing could reduce the rate of inappropriate ABC discontinuation and identify individuals at high risk of ABC-HSR.
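The chart-review rule maps the mean of the two raters' scores to three bands. A direct transcription as code; note that means of exactly ±1.5 fall between the published bands, and this sketch assigns them to "uncertain" (an assumption):

```python
# Transcription of the published classification rule for two blinded raters,
# each scoring on the -3..3 scale. Means of +/-1.5 are not covered by the
# published bands and are treated as uncertain here (an assumption).
def classify(score_a, score_b):
    mean = (score_a + score_b) / 2
    if mean >= 2:
        return "likely ABC-HSR"
    if mean <= -2:
        return "unlikely ABC-HSR"
    return "uncertain"

print(classify(3, 2), "|", classify(-3, -2), "|", classify(1, 0))
```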
Abstract:
BACKGROUND: The aim of this study was to evaluate the effect of CD4+ T-cell counts and other characteristics of HIV-infected individuals on hepatitis C virus (HCV) RNA levels. METHODS: All HIV-HCV-coinfected Swiss HIV Cohort Study participants with available HCV RNA levels and concurrent CD4+ T-cell counts before starting HCV therapy were included. Potential predictors of HCV RNA levels were assessed by multivariate censored linear regression models that adjust for censored values. RESULTS: The study included 1,031 individuals. Low current and nadir CD4+ T-cell counts were significantly associated with higher HCV RNA levels (P = 0.004 and 0.001, respectively). In individuals with current CD4+ T-cell counts < 200/µl, median HCV RNA levels (6.22 log10 IU/ml) were +0.14 and +0.24 log10 IU/ml higher than in those with CD4+ T-cell counts of 200-500/µl and > 500/µl, respectively. Based on nadir CD4+ T-cell counts, median HCV RNA levels (6.12 log10 IU/ml) in individuals with < 200/µl CD4+ T-cells were +0.06 and +0.44 log10 IU/ml higher than in those with nadir T-cell counts of 200-500/µl and > 500/µl, respectively. Median HCV RNA levels were also significantly associated with HCV genotype: lower values were associated with genotype 4 and higher values with genotype 2, as compared with genotype 1. Additional significant predictors of lower HCV RNA levels were female gender and HIV transmission through male homosexual contacts. In multivariate analyses, only CD4+ T-cell counts and HCV genotype remained significant predictors of HCV RNA levels. CONCLUSIONS: Higher HCV RNA levels were associated with CD4+ T-cell depletion. This finding is in line with the crucial role of CD4+ T-cells in the control of HCV infection.
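The abstract's "censored linear regression" can be implemented as a Tobit-type likelihood that treats values below the assay's detection limit as left-censored; that specific formulation, the single covariate, and all numbers below are assumptions for illustration.

```python
# Tobit-type sketch for left-censored log10 HCV RNA (values below the assay
# limit are only known to be <= the limit). Data simulated with a single
# hypothetical covariate (current CD4 < 200/µl); the study's models included
# more covariates.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 1000
cd4_low = (rng.random(n) < 0.3).astype(float)
y_true = 5.8 + 0.3 * cd4_low + rng.normal(0, 0.7, n)   # log10 IU/ml
limit = 5.0
y = np.maximum(y_true, limit)
cens = y_true < limit                                   # left-censored flags

def negloglik(theta):
    b0, b1, log_s = theta
    s = np.exp(log_s)
    mu = b0 + b1 * cd4_low
    ll = np.where(cens,
                  norm.logcdf((limit - mu) / s),        # censored contribution
                  norm.logpdf((y - mu) / s) - np.log(s))
    return -ll.sum()

fit = minimize(negloglik, x0=np.array([5.0, 0.0, 0.0]))
b0, b1, log_s = fit.x
print(f"CD4<200 effect: +{b1:.2f} log10 IU/ml (sigma {np.exp(log_s):.2f})")
```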