899 results for blood cell count
Abstract:
BACKGROUND: CD4+ T-cell recovery in patients with continuous suppression of plasma HIV-1 viral load (VL) is highly variable. This study aimed to identify predictive factors for long-term CD4+ T-cell increase in treatment-naive patients starting combination antiretroviral therapy (cART). METHODS: Treatment-naive patients in the Swiss HIV Cohort Study reaching two VL measurements <50 copies/ml >3 months apart during the first year of cART were included (n=1,816). We studied CD4+ T-cell dynamics until the end of suppression or up to 5 years, subdivided into three periods: year 1, years 2-3 and years 4-5 of suppression. Multiple median regression adjusted for repeated CD4+ T-cell measurements was used to study the dependence of CD4+ T-cell slopes on clinical covariates and drug classes. RESULTS: Median CD4+ T-cell increases following VL suppression were 87, 52 and 19 cells/µl per year in the three periods, respectively. In the multiple regression model, median CD4+ T-cell increases over all three periods were significantly higher for female gender, lower age, higher VL at cART start, a CD4+ T-cell count <650 cells/µl at the start of the period and a low CD4+ T-cell increase in the previous period. Patients on tenofovir showed significantly lower CD4+ T-cell increases compared with stavudine. CONCLUSIONS: In our observational study, long-term CD4+ T-cell increase in drug-naive patients with suppressed VL was higher in regimens without tenofovir. The clinical relevance of these findings must be confirmed, ideally in clinical trials or large, collaborative cohort projects, but could influence treatment of older patients and those starting cART at low CD4+ T-cell counts.
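Median (quantile) regression of this kind can be sketched with statsmodels QuantReg. The block below is a minimal illustration, not the study's model: all column names and data are hypothetical placeholders, and the adjustment for repeated measurements used in the paper is not reproduced.

```python
# Minimal sketch: median regression of CD4 slopes on covariates.
# Hypothetical data; q=0.5 fits the conditional median, not the mean.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1816
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "age": rng.normal(38, 10, n),
    "log_vl_start": rng.normal(4.5, 0.8, n),   # log10 VL at cART start
    "cd4_start": rng.normal(400, 150, n),
})
df["cd4_slope"] = (60 + 15 * df["female"] - 0.5 * df["age"]
                   + rng.normal(0, 40, n))      # cells/ul per year

fit = smf.quantreg("cd4_slope ~ female + age + log_vl_start + cd4_start",
                   df).fit(q=0.5)
print(fit.params)
```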
Abstract:
INTRODUCTION: The incidence of bloodstream infection (BSI) in extracorporeal life support (ECLS) is reported to be between 0.9 and 19.5%. In January 2006, the Extracorporeal Life Support Organization (ELSO) reported an overall incidence of 8.78%, distributed as follows: respiratory: 6.5% (neonatal) and 20.8% (pediatric); cardiac: 8.2% (neonatal) and 12.6% (pediatric). METHOD: At BC Children's Hospital (BCCH), daily surveillance blood cultures (BC) are performed and antibiotic prophylaxis is not routinely recommended. Positive BC (BC+) were reviewed, including resistance profiles, collection time of BC+, time to positivity and mortality. White blood cell count, absolute neutrophil count, immature/total neutrophil ratio, platelet count, fibrinogen and lactate were analyzed 48, 24 and 0 h prior to BSI. A univariate linear regression analysis was performed. RESULTS: From 1999 to 2005, 89 patients underwent ECLS. After exclusions, 84 patients were reviewed. The attack rate was 22.6% (19 BSIs), or 13.1% after exclusion of coagulase-negative staphylococci (n = 8). BSI patients were on ECLS significantly longer (157 h) than the no-BSI group (127 h, 95% CI: 106-148). Six BSI patients died on ECLS (35%; 4 congenital diaphragmatic hernias, 1 hypoplastic left heart syndrome and 1 after tetralogy repair). BCCH survival was 71% on ECLS and 58% at discharge, which is comparable to previous reports. No patient died primarily because of BSI. No BSI predictor was identified, although lactate may show a decreasing trend before BSI (P = 0.102). CONCLUSION: Compared with the ELSO figures, the studied BSI incidence was higher, with comparable mortality. We speculate that our BSI rate is explained by underreporting of "contaminants" in the literature, the use of broad-spectrum antibiotic prophylaxis elsewhere and a higher yield with daily monitoring BC. We support daily surveillance blood cultures as an alternative to antibiotic prophylaxis in the management of patients on ECLS.
Abstract:
Erythropoietin (EPO) deficiency and iron deficiency as causes of anemia in patients with limited renal function or end-stage renal disease are well addressed. The concomitant impairment of red blood cell (RBC) survival has been largely neglected. Properties of the uremic environment, such as inflammation, increased oxidative stress and uremic toxins, seem to be responsible for premature changes in the RBC membrane and cytoskeleton. The exposure of antigenic sites and the breakdown of phosphatidylserine asymmetry promote RBC phagocytosis. While the individual response to treatment with erythropoiesis-stimulating agents (ESAs) depends on both RBC lifespan and RBC production rate, uniform dosing algorithms do not meet that demand. The clinical use of mathematical models predicting ESA-induced changes in hematocrit might be greatly improved once independent estimates of RBC production rate and/or lifespan become available, making the concomitant estimation of both parameters unnecessary. Since heme breakdown via the heme oxygenase pathway yields carbon monoxide (CO), which is exhaled, a simple CO breath test has been used to calculate hemoglobin turnover and therefore RBC survival and lifespan. Future research will have to validate and implement this method in patients with kidney failure, yielding new insights into RBC kinetics in renal patients. Eventually, these findings are expected to improve our understanding of hemoglobin variability in response to ESAs.
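Because heme oxygenase releases one molecule of CO per molecule of heme degraded, exhaled CO (corrected for ambient CO) tracks heme turnover, and RBC lifespan follows as total circulating heme divided by daily turnover. Below is a minimal sketch of that arithmetic; all parameter values are illustrative assumptions, not validated clinical constants.

```python
# Minimal sketch of the heme-turnover arithmetic behind the CO breath test.
HEMES_PER_HB = 4        # each hemoglobin tetramer carries 4 heme groups
HB_MOLAR_MASS = 64500   # g/mol, approximate

def rbc_lifespan_days(hb_g_per_l, blood_volume_l, co_umol_per_day):
    """Estimate RBC lifespan from endogenous CO production.

    One CO is released per heme degraded, so daily CO excretion
    (micromol/day, corrected for ambient CO) approximates daily
    heme turnover.
    """
    hb_mol = hb_g_per_l * blood_volume_l / HB_MOLAR_MASS
    heme_umol = hb_mol * HEMES_PER_HB * 1e6
    return heme_umol / co_umol_per_day

# Illustrative values: Hb 140 g/l, 5 l blood, 350 umol CO/day -> ~124 days
print(round(rbc_lifespan_days(140, 5.0, 350)))
```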
Abstract:
BACKGROUND: In recent years, treatment options for human immunodeficiency virus type 1 (HIV-1) infection have changed from nonboosted protease inhibitors (PIs) to nonnucleoside reverse-transcriptase inhibitors (NNRTIs) and boosted PI-based antiretroviral drug regimens, but the impact on immunological recovery remains uncertain. METHODS: All patients in the Swiss HIV Cohort Study who received their first combination antiretroviral therapy (cART) between January 1996 and December 2004 and had known baseline CD4+ T cell counts and HIV-1 RNA values were included (n = 3,293). For follow-up, we used the Swiss HIV Cohort Study database update of May 2007. The mean (±SD) duration of follow-up was 26.8 ± 20.5 months. The follow-up time was limited to the duration of the first cART. CD4+ T cell recovery was analyzed in 3 treatment groups: nonboosted PI, NNRTI, and boosted PI. The end point was the absolute increase of the CD4+ T cell count in the 3 treatment groups after the initiation of cART. RESULTS: 2,590 individuals (78.7%) initiated a nonboosted-PI regimen, 452 (13.7%) an NNRTI regimen, and 251 (7.6%) a boosted-PI regimen. Absolute CD4+ T cell count increases at 48 months were as follows: in the nonboosted-PI group, from 210 to 520 cells/µl; in the NNRTI group, from 220 to 475 cells/µl; and in the boosted-PI group, from 168 to 511 cells/µl. In a multivariate analysis, the treatment group did not affect the CD4+ T cell response; however, increased age, pretreatment with nucleoside reverse-transcriptase inhibitors, positive hepatitis C virus serology, Centers for Disease Control and Prevention stage C infection, lower baseline CD4+ T cell count, and lower baseline HIV-1 RNA level were risk factors for smaller increases in CD4+ T cell count. CONCLUSION: CD4+ T cell recovery was similar in patients receiving nonboosted PI-, NNRTI-, and boosted PI-based cART.
Abstract:
BACKGROUND: Estimates of the decrease in CD4+ cell counts in untreated patients with human immunodeficiency virus (HIV) infection are important for patient care and public health. We analyzed CD4+ cell count decreases in the Cape Town AIDS Cohort and the Swiss HIV Cohort Study. METHODS: We used mixed-effects models and joint models that allowed for the correlation between CD4+ cell count decreases and survival, and we stratified analyses by the initial cell count (50-199, 200-349, 350-499, and 500-750 cells/µl). Results are presented as the mean decrease in CD4+ cell count, with 95% confidence intervals (CIs), during the first year after the initial CD4+ cell count. RESULTS: A total of 784 South African (629 nonwhite) and 2,030 Swiss (218 nonwhite) patients with HIV infection contributed 13,388 CD4+ cell counts. Decreases in CD4+ cell count were steeper in white patients, patients with higher initial CD4+ cell counts, and older patients. Decreases ranged from a mean of 38 cells/µl (95% CI, 24-54 cells/µl) in nonwhite patients from the Swiss HIV Cohort Study aged 15-39 years with an initial CD4+ cell count of 200-349 cells/µl to a mean of 210 cells/µl (95% CI, 143-268 cells/µl) in white patients in the Cape Town AIDS Cohort aged ≥40 years with an initial CD4+ cell count of 500-750 cells/µl. CONCLUSIONS: In both Switzerland and South Africa, CD4+ cell count decreases were greater in white than in nonwhite patients with HIV infection.
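The longitudinal part of such an analysis can be sketched with a linear mixed model carrying a random intercept and slope per patient. The block below is a minimal illustration using statsmodels MixedLM on simulated data; the joint modeling of CD4 decline and survival described in the abstract is not reproduced here.

```python
# Minimal sketch: mixed-effects model of CD4 decline (random intercept
# and slope per patient). Simulated, hypothetical data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for pid in range(200):
    slope = rng.normal(-80, 30)                 # cells/ul per year
    for t in np.arange(0, 1.25, 0.25):          # first year of follow-up
        rows.append({"pid": pid, "years": t,
                     "cd4": 400 + slope * t + rng.normal(0, 40)})
df = pd.DataFrame(rows)

fit = smf.mixedlm("cd4 ~ years", df, groups=df["pid"],
                  re_formula="~years").fit()
print(fit.params["years"])   # mean CD4 change per year (negative = decline)
```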
Abstract:
OBJECTIVES: To examine the accuracy of the World Health Organization (WHO) immunological criteria for virological failure of antiretroviral treatment. METHODS: Analysis of 10 treatment programmes in Africa and South America that monitor both CD4 cell counts and HIV-1 viral load. Adult patients with at least two CD4 cell counts and viral load measurements between months 6 and 18 after starting a non-nucleoside reverse transcriptase inhibitor-based regimen were included. The WHO immunological criteria are a CD4 count persistently <100 cells/µl, a fall below the baseline CD4 count, or a fall of >50% from the peak value. Virological failure was defined as two measurements ≥10,000 copies/ml (higher threshold) or ≥500 copies/ml (lower threshold). Measures of accuracy with exact binomial 95% confidence intervals (CIs) were calculated. RESULTS: A total of 2,009 patients were included. During 1,856 person-years of follow-up, 63 patients met the immunological criteria, and 35 patients (higher threshold) and 95 patients (lower threshold) met the virological criteria. Sensitivity (95% CI) was 17.1% (6.6-33.6%) for the higher and 12.6% (6.7-21.0%) for the lower threshold. Corresponding results were 97.1% (96.3-97.8%) and 97.3% (96.5-98.0%) for specificity, 9.5% (3.6-19.6%) and 19.0% (10.2-30.9%) for positive predictive value, and 98.5% (97.9-99.0%) and 95.7% (94.7-96.6%) for negative predictive value. CONCLUSIONS: The positive predictive value of the WHO immunological criteria for virological failure of antiretroviral treatment in resource-limited settings is poor, but the negative predictive value is high. Immunological criteria are more appropriate for ruling out than for ruling in virological failure in resource-limited settings.
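For reference, the reported accuracy measures follow from a 2x2 table with exact (Clopper-Pearson) binomial confidence intervals. The sketch below back-calculates plausible higher-threshold counts from the percentages above (6 true positives among 63 immunological failures, 35 virological failures among 2,009 patients); treat the counts as a reconstruction, not study data.

```python
# Minimal sketch: sensitivity/specificity/PPV/NPV with exact
# (Clopper-Pearson) binomial confidence intervals.
from scipy.stats import beta

def exact_ci(k, n, alpha=0.05):
    """Clopper-Pearson exact binomial CI for k successes in n trials."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

def report(tp, fp, fn, tn):
    measures = {"sensitivity": (tp, tp + fn), "specificity": (tn, tn + fp),
                "PPV": (tp, tp + fp), "NPV": (tn, tn + fn)}
    for name, (k, n) in measures.items():
        lo, hi = exact_ci(k, n)
        print(f"{name}: {k / n:.1%} (95% CI {lo:.1%}-{hi:.1%})")

# Counts reconstructed from the higher-threshold results above.
report(tp=6, fp=57, fn=29, tn=1917)
```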
Abstract:
OBJECTIVES: CD4 cell count and plasma viral load are well-known predictors of AIDS and mortality in HIV-1-infected patients treated with combination antiretroviral therapy (cART). This study investigated, in patients treated for at least 3 years, the respective prognostic importance of values measured at cART initiation and 6 and 36 months later for AIDS and death. METHODS: Patients from 15 HIV cohorts included in the ART Cohort Collaboration who were aged at least 16 years, antiretroviral-naive when they started cART, and followed for at least 36 months after the start of cART were eligible. RESULTS: Among 14,208 patients, the median CD4 cell counts at 0, 6 and 36 months were 210, 320 and 450 cells/µl, respectively, and 78% of patients achieved a viral load of less than 500 copies/ml at 6 months. In models adjusted for characteristics at cART initiation and for values at all time points, the values at 36 months were the strongest predictors of subsequent rates of AIDS and death. Although CD4 cell count and viral load at cART initiation were no longer prognostic of AIDS or death after 36 months, viral load at 6 months and the change in CD4 cell count from 6 to 36 months were prognostic for rates of AIDS from 36 months onwards. CONCLUSIONS: Although current values of CD4 cell count and HIV-1 RNA are the most important prognostic factors for subsequent AIDS and death rates in HIV-1-infected patients treated with cART, the change in CD4 cell count from 6 to 36 months and the 6-month HIV-1 RNA value are also prognostic for AIDS.
Abstract:
A persistently low white blood cell count (WBC) and neutrophil count is a well-described phenomenon of unknown etiology in persons of African ancestry. We recently used admixture mapping to identify an approximately 1-megabase region on chromosome 1 where ancestry status (African or European) almost entirely accounted for the difference in WBC between African Americans and European Americans. To identify the specific genetic change responsible for this association, we analyzed genotype and phenotype data from 6,005 African Americans from the Jackson Heart Study (JHS), the Health, Aging and Body Composition (Health ABC) Study, and the Atherosclerosis Risk in Communities (ARIC) Study. We demonstrate that the causal variant must differ in frequency by at least 91% between West Africans and European Americans. An excellent candidate is the Duffy null polymorphism (SNP rs2814778 at chromosome 1q23.2), which is the only polymorphism in the region known to be so differentiated in frequency and is already known to protect against Plasmodium vivax malaria. We confirm that rs2814778 is predictive of WBC and neutrophil count in African Americans above and beyond the previously described admixture association (P = 3.8 × 10^-5), establishing a novel phenotype for this genetic variant.
Abstract:
Bovine mastitis is a frequent problem in Swiss dairy herds. One of the main pathogens causing significant economic loss is Staphylococcus aureus. Various Staph. aureus genotypes with different biological properties have been described, and genotype B (GTB) has been identified as the most contagious and one of the most prevalent strains in Switzerland. The aim of this study was to identify risk factors associated with the herd-level presence of Staph. aureus GTB and Staph. aureus non-GTB in Swiss dairy herds with an elevated yield-corrected herd somatic cell count (YCHSCC). One hundred dairy herds with a mean YCHSCC between 200,000 and 300,000 cells/mL in 2010 were recruited, and each farm was visited once during milking. A standardized protocol covering demography, mastitis management, cow husbandry, milking system, and milking routine was completed during the visit. A bulk tank milk (BTM) sample was analyzed by real-time PCR for the presence of Staph. aureus GTB to classify the herds into 2 groups: Staph. aureus GTB-positive and Staph. aureus GTB-negative. Moreover, quarter milk samples were aseptically collected for bacteriological culture from cows with a somatic cell count ≥150,000 cells/mL on the last test-day before the visit. The culture results allowed us to allocate the Staph. aureus GTB-negative farms to Staph. aureus non-GTB and Staph. aureus-free groups. Multivariable multinomial logistic regression models were built to identify risk factors associated with the herd-level presence of Staph. aureus GTB and Staph. aureus non-GTB. The prevalence of Staph. aureus GTB herds was 16% (n=16), and that of Staph. aureus non-GTB herds was 38% (n=38). Herds that sent lactating cows to seasonal communal pastures had significantly higher odds of being infected with Staph. aureus GTB (odds ratio: 10.2, 95% CI: 1.9-56.6) than herds without communal pasturing. Herds that purchased heifers had significantly higher odds of being infected with Staph. aureus GTB (rather than Staph. aureus non-GTB) than herds that did not purchase heifers. Furthermore, herds that did not use udder ointment as supportive therapy for acute mastitis had significantly higher odds of being infected with Staph. aureus GTB (odds ratio: 8.5, 95% CI: 1.6-58.4) or Staph. aureus non-GTB (odds ratio: 6.1, 95% CI: 1.3-27.8) than herds that used udder ointment occasionally or regularly. Herds in which the milker performed unrelated activities during milking had significantly higher odds of being infected with Staph. aureus GTB (rather than Staph. aureus non-GTB) than herds in which the milker did not. Awareness of the 4 potential risk factors identified in this study can guide intervention strategies to improve udder health in both Staph. aureus GTB and Staph. aureus non-GTB herds.
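A multivariable multinomial logistic model of this kind can be sketched with statsmodels MNLogit, with the Staph. aureus-free group as the reference outcome. Everything below (variable names, coding, data) is a hypothetical placeholder, not the study data.

```python
# Minimal sketch: multinomial logistic regression of herd infection status
# (0 = Staph. aureus-free, 1 = non-GTB, 2 = GTB) on herd-level risk factors.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 100
herds = pd.DataFrame({
    "communal_pasture": rng.integers(0, 2, n),
    "purchased_heifers": rng.integers(0, 2, n),
    "no_udder_ointment": rng.integers(0, 2, n),
})
status = rng.integers(0, 3, n)   # hypothetical outcome labels

X = sm.add_constant(herds)
fit = sm.MNLogit(status, X).fit(disp=False)
print(np.exp(fit.params))        # odds ratios vs. the Staph. aureus-free group
```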
Abstract:
INTRODUCTION Optimising the use of blood has become a core task of transfusion medicine. Because no general guidelines are available in Switzerland, we analysed the effects of introducing a guideline on red blood cell (RBC) transfusion for elective orthopaedic surgery. METHODS Prospective, multicentre, before-and-after study comparing the use of RBCs in adult elective hip or knee replacement before and after the implementation of a guideline, developed together with all participants, in 10 Swiss hospitals. RESULTS We included 2,134 patients: 1,238 in the 7 months before and 896 in the 6 months after the intervention. Fifty-seven patients (34, or 2.7%, before; 23, or 2.6%, after) were lost before the follow-up visit. The mean number of transfused RBC units decreased from 0.5 to 0.4 per patient (difference 0.1, 95% CI 0.08-0.2; p = 0.014), the proportion of transfused patients from 20.9% to 16.9% (difference 4%, 95% CI 0.7-7.4%; p = 0.02), and the pre-transfusion haemoglobin from 82.6 to 78.2 g/l (difference 4.4 g/l, 95% CI 2.15-6.62 g/l; p < 0.001). We did not observe any statistically significant changes in in-hospital mortality (0.4% vs. 0%), in-hospital morbidity (4.1% vs. 4.0%), median hospital length of stay (9 vs. 9 days), follow-up mortality (0.4% vs. 0.2%) or follow-up morbidity (6.9% vs. 6.0%). CONCLUSIONS The introduction of a simple transfusion guideline reduces and standardises the use of RBCs by decreasing the haemoglobin transfusion trigger, without negative effects on patient outcomes. Local support, training, and monitoring of the effects are requirements for programmes optimising the use of blood.
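The before-and-after comparison of transfusion proportions can be checked with a standard Wald interval for a difference in two proportions. The sketch below back-calculates the counts from the reported percentages (20.9% of 1,238 before, 16.9% of 896 after) and approximately reproduces the reported 4% difference (95% CI 0.7-7.4%); it is a reconstruction, not the study analysis.

```python
# Minimal sketch: Wald 95% CI for a difference in two proportions.
from math import sqrt

def diff_proportions_ci(k1, n1, k2, n2, z=1.96):
    p1, p2 = k1 / n1, k2 / n2
    d = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, d - z * se, d + z * se

# Counts reconstructed from the reported transfusion proportions.
d, lo, hi = diff_proportions_ci(259, 1238, 151, 896)
print(f"difference {d:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```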
Abstract:
OBJECTIVE To illustrate an approach to comparing CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and the Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared 3 CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS Among 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month and 0.82 (0.46 to 1.47) for the 9-12-month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (RNA >200 copies/ml) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3) cells/µl. The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because the effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
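Inverse-probability weighting of this kind proceeds in a few steps: model the probability of following each strategy given covariates, form stabilized weights, and fit a weighted outcome model. The block below is a minimal two-strategy illustration on simulated data; the column names, the single covariate, and the logistic outcome model are all simplifying assumptions, not the study's specification.

```python
# Minimal sketch: inverse-probability-weighted comparison of two
# monitoring strategies (1 = 6-month, 0 = 3-month). Simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "baseline_cd4": rng.normal(350, 100, n),
    "strategy_6mo": rng.integers(0, 2, n),
})
df["event"] = rng.binomial(1, 0.02, n)   # death or AIDS-defining illness

# 1. Probability of the observed strategy given covariates.
ps = smf.logit("strategy_6mo ~ baseline_cd4", data=df).fit(disp=False).predict(df)

# 2. Stabilized weights: marginal probability / conditional probability.
p_marg = df["strategy_6mo"].mean()
df["w"] = np.where(df["strategy_6mo"] == 1, p_marg / ps, (1 - p_marg) / (1 - ps))

# 3. Weighted outcome model approximates the marginal strategy effect.
out = smf.glm("event ~ strategy_6mo", data=df, freq_weights=df["w"],
              family=sm.families.Binomial()).fit()
print(np.exp(out.params["strategy_6mo"]))   # marginal odds ratio
```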
Abstract:
Patients who started HAART (highly active antiretroviral treatment) under the earlier, aggressive DHHS guidelines (1997) underwent life-long continuous HAART, which was associated with many short-term as well as long-term complications. Many interventions attempted to reduce those complications, including intermittent treatment, also called pulse therapy. Several studies have examined the determinants of the rate of fall in CD4 count after interruption, as these data would help guide treatment interruptions. The data set used here was part of a cohort study running at the Johns Hopkins AIDS Service since January 1984, in which data were collected both prospectively and retrospectively. The patients in this data set were 47 patients receiving pulse therapy with the aim of reducing long-term complications. The aim of this project was to study the impact of virologic and immunologic factors on the rate of CD4 loss after treatment interruption. The exposure variables of interest were age, race, gender, and CD4 cell count and HIV RNA level at HAART initiation. The rate of change of the CD4 cell count after treatment interruption was estimated from observed data using longitudinal data analysis methods (a linear mixed model); random effects accounted for the repeated CD4 measurements per person after treatment interruption. The regression coefficient estimates from the model were then used to produce subject-specific rates of CD4 change, accounting for group trends. The rate of fall of the CD4 count did not depend on CD4 cell count or viral load at treatment initiation, so these factors may not be useful for determining who has a chance of successful treatment interruption. CD4 count and viral load were also examined with t-tests and ANOVA after grouping on medians and quartiles to detect any difference in the mean rate of CD4 fall after interruption. There was no significant difference between the groups, suggesting no association between the rate of CD4 fall after treatment interruption and the above-mentioned exposure variables.
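The two analysis steps described above (subject-specific slopes from a random-slope mixed model, then ANOVA across baseline groups) can be sketched as follows. Everything here is simulated and hypothetical; each subject-specific slope is the fixed slope plus that patient's predicted random effect (BLUP).

```python
# Minimal sketch: subject-specific CD4 slopes from a mixed model,
# compared across baseline viral-load quartiles. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import f_oneway

rng = np.random.default_rng(4)
rows, baseline_vl = [], {}
for pid in range(47):
    baseline_vl[pid] = rng.normal(4.5, 0.7)    # log10 HIV RNA at HAART start
    slope = rng.normal(-8, 3)                  # cells/ul per month
    for month in range(0, 13, 3):
        rows.append({"pid": pid, "month": month,
                     "cd4": 500 + slope * month + rng.normal(0, 30)})
df = pd.DataFrame(rows)

fit = smf.mixedlm("cd4 ~ month", df, groups=df["pid"],
                  re_formula="~month").fit()

# Subject-specific slope = fixed slope + predicted random slope (BLUP).
slopes = pd.Series({pid: fit.params["month"] + re["month"]
                    for pid, re in fit.random_effects.items()})

quartile = pd.qcut(pd.Series(baseline_vl), 4, labels=False)
print(f_oneway(*[slopes[quartile == q] for q in range(4)]))
```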
Abstract:
Erythropoietin (EPO) is required for red blood cell development, but whether EPO-specific signals directly instruct erythroid differentiation is unknown. We used a dominant system in which constitutively active variants of the EPO receptor (EPOR) were introduced into erythroid progenitors in mice. Chimeric receptors were constructed by replacing the cytoplasmic tail of these constitutively active EPOR variants with the tails of diverse cytokine receptors. Receptors linked to granulocyte or platelet production supported complete erythroid development in vitro and in vivo, as did the growth hormone receptor, a nonhematopoietic receptor. Therefore, EPOR-specific signals are not required for the terminal differentiation of erythrocytes. Furthermore, we found that cellular context can influence cytokine receptor signaling.