852 results for HIV-1 epidemic
Abstract:
BACKGROUND: The CD4 cell count at which combination antiretroviral therapy should be started is a central, unresolved issue in the care of HIV-1-infected patients. In the absence of randomised trials, we examined this question in prospective cohort studies. METHODS: We analysed data from 18 cohort studies of patients with HIV. Antiretroviral-naive patients from 15 of these studies were eligible for inclusion if they had started combination antiretroviral therapy (while AIDS-free, with a CD4 cell count less than 550 cells per microL, and with no history of injecting drug use) on or after Jan 1, 1998. We used data from patients followed up in seven of the cohorts in the era before the introduction of combination therapy (1989-95) to estimate distributions of lead times (from the first CD4 cell count measurement in an upper range to the upper threshold of a lower range) and unseen AIDS and death events (occurring before the upper threshold of a lower CD4 cell count range is reached) in the absence of treatment. These estimations were used to impute completed datasets in which lead times and unseen AIDS and death events were added to data for treated patients in deferred therapy groups. We compared the effect of deferred initiation of combination therapy with immediate initiation on rates of AIDS and death, and on death alone, in adjacent CD4 cell count ranges of width 100 cells per microL. FINDINGS: Data were obtained for 21 247 patients who were followed up during the era before the introduction of combination therapy and 24 444 patients who were followed up from the start of treatment. Deferring combination therapy until a CD4 cell count of 251-350 cells per microL was associated with higher rates of AIDS and death than starting therapy in the range 351-450 cells per microL (hazard ratio [HR] 1.28, 95% CI 1.04-1.57). The adverse effect of deferring treatment increased with decreasing CD4 cell count threshold. Deferred initiation of combination therapy was also associated with higher mortality rates, although effects on mortality were less marked than effects on AIDS and death (HR 1.13, 0.80-1.60, for deferred initiation of treatment at CD4 cell count 251-350 cells per microL compared with initiation at 351-450 cells per microL). INTERPRETATION: Our results suggest that 350 cells per microL should be the minimum threshold for initiation of antiretroviral therapy, and should help to guide physicians and patients in deciding when to start treatment.
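The core analytic device in this study is imputation: for patients who deferred therapy, lead times and unseen AIDS/death events drawn from pre-cART-era cohorts are added to the treated follow-up, and event rates are then compared between strategies. The sketch below illustrates that idea only in outline; the distributions, probabilities, and the crude rate-ratio comparison are invented stand-ins for the ART-CC's adjusted Cox models.

```python
import numpy as np

rng = np.random.default_rng(0)

def impute_deferred(followup_years, events, lead_time_pool, p_unseen=0.02):
    """Complete deferred-group data with imputed lead times and unseen events."""
    n = len(followup_years)
    lead = rng.choice(lead_time_pool, size=n)      # time spent above the lower CD4 threshold
    unseen = rng.random(n) < p_unseen              # AIDS/death before crossing the threshold
    time = np.where(unseen, lead * rng.random(n), lead + followup_years)
    evt = np.where(unseen, 1, events)
    return time, evt

def crude_rate(time, evt):
    return evt.sum() / time.sum()                  # events per person-year

# invented example data for two strategies in adjacent CD4 ranges
immediate_t = rng.exponential(5.0, 1000)
immediate_e = (rng.random(1000) < 0.05).astype(int)
deferred_t = rng.exponential(5.0, 1000)
deferred_e = (rng.random(1000) < 0.05).astype(int)
lead_pool = rng.gamma(2.0, 0.8, 5000)              # stand-in for the pre-cART lead-time distribution

t_c, e_c = impute_deferred(deferred_t, deferred_e, lead_pool)
print("crude rate ratio (deferred / immediate):",
      round(crude_rate(t_c, e_c) / crude_rate(immediate_t, immediate_e), 2))
```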
Abstract:
OBJECTIVES: CD4 cell count and plasma viral load are well known predictors of AIDS and mortality in HIV-1-infected patients treated with combination antiretroviral therapy (cART). This study investigated, in patients treated for at least 3 years, the respective prognostic importance of values measured at cART initiation, and 6 and 36 months later, for AIDS and death. METHODS: Patients from 15 HIV cohorts included in the ART Cohort Collaboration, aged at least 16 years, antiretroviral-naive when they started cART and followed for at least 36 months after start of cART were eligible. RESULTS: Among 14 208 patients, the median CD4 cell counts at 0, 6 and 36 months were 210, 320 and 450 cells/microl, respectively, and 78% of patients achieved viral load less than 500 copies/ml at 6 months. In models adjusted for characteristics at cART initiation and for values at all time points, values at 36 months were the strongest predictors of subsequent rates of AIDS and death. Although CD4 cell count and viral load at cART initiation were no longer prognostic of AIDS or of death after 36 months, viral load at 6 months and change in CD4 cell count from 6 to 36 months were prognostic for rates of AIDS from 36 months. CONCLUSIONS: Although current values of CD4 cell count and HIV-1 RNA are the most important prognostic factors for subsequent AIDS and death rates in HIV-1-infected patients treated with cART, changes in CD4 cell count from 6 to 36 months and the value of 6-month HIV-1 RNA are also prognostic for AIDS.
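One way to read the "values at 36 months were the strongest predictors" finding is as a comparison of nested rate models whose covariates are measured at different time points. The sketch below fits such a comparison with a Poisson rate model and a likelihood-ratio test on simulated data; the variables, coefficients, and person-time setup are invented, and the published analysis used different (survival-type) models.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(1)
n = 5000
cd4_0 = rng.normal(210, 80, n)                    # CD4 at cART initiation
cd4_36 = rng.normal(450, 120, n)                  # CD4 at 36 months
log_vl_6 = rng.normal(2.0, 0.8, n)                # log10 viral load at 6 months
pyears = rng.uniform(1, 5, n)                     # follow-up beyond 36 months
lam = np.exp(-4.0 - 0.004 * cd4_36 + 0.3 * log_vl_6) * pyears
events = rng.poisson(lam)                         # simulated AIDS/death events

def fit(covariates):
    X = sm.add_constant(np.column_stack(covariates))
    return sm.GLM(events, X, family=sm.families.Poisson(),
                  offset=np.log(pyears)).fit()

base = fit([cd4_0])                               # baseline values only
full = fit([cd4_0, cd4_36, log_vl_6])             # plus later measurements
lr = 2 * (full.llf - base.llf)
print(f"LR chi2 = {lr:.1f}, p = {chi2.sf(lr, df=2):.2g}")
```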
Abstract:
The human immunodeficiency virus-1 reverse transcriptase inhibitory activity of 2-(2,6-disubstituted phenyl)-3-(substituted pyrimidin-2-yl)-thiazolidin-4-ones has been analyzed using the combinatorial protocol in multiple linear regression (CP-MLR) with several electronic and molecular surface area features of the compounds obtained from the Molecular Operating Environment (MOE) software. The study indicated the role of different charged molecular surface areas in modeling the inhibitory activity of the compounds. The derived models collectively suggested that the compounds should be compact, without bulky substitutions on their peripheries, for better HIV-1 RT inhibitory activity. The analysis also emphasized the necessity of hydrophobicity and compact structural features for their activity. The scope of the descriptors identified for these analogues has been verified by extending the dataset with different 2-(disubstituted phenyl)-3-(substituted pyridin-2-yl)-thiazolidin-4-ones. The joint analysis of the extended dataset highlighted the information content of the identified descriptors in modeling the HIV-1 RT inhibitory activity of the compounds.
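CP-MLR is essentially an exhaustive search over small descriptor subsets scored by regression quality. A toy version of that search is sketched below with random stand-in descriptors and activities (not MOE output), ranking three-descriptor models by adjusted R².

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
n_cmpd, n_desc = 40, 8
X = rng.normal(size=(n_cmpd, n_desc))             # stand-ins for charged surface areas etc.
y = 0.8 * X[:, 1] - 0.5 * X[:, 4] + rng.normal(0, 0.3, n_cmpd)   # pIC50-like activity

def adj_r2(Xs, y):
    A = np.column_stack([np.ones(len(y)), Xs])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r2 = 1 - ((y - A @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
    n, k = len(y), Xs.shape[1]
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

best = max(itertools.combinations(range(n_desc), 3),
           key=lambda cols: adj_r2(X[:, cols], y))
print("best 3-descriptor subset:", best,
      "adjusted R2 =", round(adj_r2(X[:, best], y), 3))
```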
Abstract:
BACKGROUND Human immunodeficiency virus type 1 (HIV-1) transmitted drug resistance (TDR) can compromise antiretroviral therapy (ART) and thus represents an important public health concern. Typically, sources of TDR remain unknown, but they can be characterized with molecular epidemiologic approaches. We used the highly representative Swiss HIV Cohort Study (SHCS) and linked drug resistance database (SHCS-DRDB) to analyze sources of TDR. METHODS ART-naive men who have sex with men with infection date estimates between 1996 and 2009 were chosen for surveillance of TDR in HIV-1 subtype B (N = 1674), as the SHCS-DRDB contains pre-ART genotypic resistance tests for >69% of this surveillance population. A phylogeny was inferred using pol sequences from surveillance patients and all subtype B sequences from the SHCS-DRDB (6934 additional patients). Potential sources of TDR were identified based on phylogenetic clustering, shared resistance mutations, genetic distance, and estimated infection dates. RESULTS One hundred forty of 1674 (8.4%) surveillance patients carried virus with TDR; 86 of 140 (61.4%) were assigned to clusters. Potential sources of TDR were found for 50 of 86 (58.1%) of these patients. ART-naive patients constitute 56 of 66 (84.8%) potential sources and were significantly overrepresented among sources (odds ratio, 6.43 [95% confidence interval, 3.22-12.82]; P < .001). Particularly large transmission clusters were observed for the L90M mutation, and the spread of L90M continued even after the near cessation of antiretroviral use selecting for that mutation. Three clusters showed evidence of reversion of K103N or T215Y/F. CONCLUSIONS Many individuals harboring viral TDR belonged to transmission clusters with other Swiss patients, indicating substantial domestic transmission of TDR in Switzerland. Most TDR in clusters could be linked to sources, indicating good surveillance of TDR in the SHCS-DRDB. Most TDR sources were ART naive. This, and the presence of long TDR transmission chains, suggests that resistance mutations are frequently transmitted among untreated individuals, highlighting the importance of early diagnosis and treatment.
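The overrepresentation of ART-naive patients among TDR sources is a 2×2 contingency question. A minimal sketch of that calculation follows; the source counts of 56 versus 10 mirror the abstract, but the comparator (non-source) counts are invented, so the resulting odds ratio will not match the published 6.43.

```python
import numpy as np
from scipy.stats import fisher_exact

#                    source  not-a-source
table = np.array([[56, 1000],    # ART-naive   (56 from the abstract; 1000 invented)
                  [10, 1150]])   # ART-treated (10 from the abstract; 1150 invented)
odds_ratio, p = fisher_exact(table)
se = np.sqrt((1 / table).sum())                          # Woolf standard error of log(OR)
ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se)
print(f"OR = {odds_ratio:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), p = {p:.1e}")
```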
Abstract:
BACKGROUND: Prognostic models for children starting antiretroviral therapy (ART) in Africa are lacking. We developed models to estimate the probability of death during the first year receiving ART in Southern Africa. METHODS: We analyzed data from children ≤10 years old who started ART in Malawi, South Africa, Zambia or Zimbabwe from 2004-2010. Children lost to follow-up or transferred were excluded. The primary outcome was all-cause mortality in the first year of ART. We used Weibull survival models to construct two prognostic models: one with CD4%, age, WHO clinical stage, weight-for-age z-score (WAZ) and anemia and one without CD4%, because it is not routinely measured in many programs. We used multiple imputation to account for missing data. RESULTS: Among 12655 children, 877 (6.9%) died in the first year of ART. 1780 children were lost to follow-up/transferred and excluded from main analyses; 10875 children were included. With the CD4% model probability of death at 1 year ranged from 1.8% (95% CI: 1.5-2.3) in children 5-10 years with CD4% ≥10%, WHO stage I/II, WAZ ≥-2 and without severe anemia to 46.3% (95% CI: 38.2-55.2) in children <1 year with CD4% <5%, stage III/IV, WAZ< -3 and severe anemia. The corresponding range for the model without CD4% was 2.2% (95% CI: 1.8-2.7) to 33.4% (95% CI: 28.2-39.3). Agreement between predicted and observed mortality was good (C-statistics=0.753 and 0.745 for models with and without CD4% respectively). CONCLUSION: These models may be useful to counsel children/caregivers, for program planning and to assess program outcomes after allowing for differences in patient disease severity characteristics.
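A Weibull survival model can be turned directly into a probability of death by one year for a given risk profile, which is how such prognostic models are used in counselling. The sketch below shows that arithmetic with an invented shape parameter and coefficients; the published models' actual parameters, and their multiple-imputation handling of missing data, are not reproduced.

```python
import numpy as np

shape = 0.7                                   # Weibull shape (invented)
coef = {"intercept": 5.0, "cd4_lt5": -0.9, "age_lt1": -0.7,
        "stage_iii_iv": -0.8, "waz_lt_m3": -0.6, "severe_anaemia": -0.5}

def prob_death_1yr(profile):
    """P(death by 1 year) under a Weibull AFT-style model (illustrative only)."""
    log_scale = coef["intercept"] + sum(coef[k] for k in profile)
    return 1 - np.exp(-(1.0 / np.exp(log_scale)) ** shape)

low_risk = []                                 # 5-10 y, CD4% >=10, stage I/II, WAZ >=-2, no anaemia
high_risk = ["cd4_lt5", "age_lt1", "stage_iii_iv", "waz_lt_m3", "severe_anaemia"]
print(f"low-risk profile:  {prob_death_1yr(low_risk):.1%}")
print(f"high-risk profile: {prob_death_1yr(high_risk):.1%}")
```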
Abstract:
HIV-1 sequence diversity is affected by selection pressures arising from host genomic factors. Using paired human and viral data from 1071 individuals, we ran >3000 genome-wide scans, testing for associations between host DNA polymorphisms, HIV-1 sequence variation and plasma viral load (VL), while considering human and viral population structure. We observed significant human SNP associations to a total of 48 HIV-1 amino acid variants (p < 2.4 × 10⁻¹²). All associated SNPs mapped to the HLA class I region. Clinical relevance of host and pathogen variation was assessed using VL results. We identified two critical advantages to the use of viral variation for identifying host factors: (1) association signals are much stronger for HIV-1 sequence variants than VL, reflecting the 'intermediate phenotype' nature of viral variation; (2) association testing can be run without any clinical data. The proposed genome-to-genome approach highlights sites of genomic conflict and is a strategy generally applicable to studies of host–pathogen interaction.
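Each cell of a genome-to-genome scan is a single association test between a host SNP and the presence of a viral amino-acid variant. The sketch below runs one such test as a logistic regression on simulated genotypes, without the adjustment for human and viral population structure that the real analysis requires; only the significance threshold is taken from the abstract.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1071
snp = rng.binomial(2, 0.3, n)                        # host SNP dosage (0/1/2)
p_variant = 1 / (1 + np.exp(-(-2.0 + 0.9 * snp)))    # invented true effect
aa_variant = rng.binomial(1, p_variant)              # viral amino-acid variant present?

res = sm.Logit(aa_variant, sm.add_constant(snp)).fit(disp=0)
p = res.pvalues[1]
threshold = 2.4e-12                                  # significance threshold quoted in the abstract
print(f"association p = {p:.2e}; significant at the genome-to-genome threshold: {p < threshold}")
```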
Abstract:
The success of combination antiretroviral therapy is limited by the evolutionary escape dynamics of HIV-1. We used Isotonic Conjunctive Bayesian Networks (I-CBNs), a class of probabilistic graphical models, to describe this process. We employed partial order constraints among viral resistance mutations, which give rise to a limited set of mutational pathways, and we modeled phenotypic drug resistance as monotonically increasing along any escape pathway. Using this model, the individualized genetic barrier (IGB) to each drug is derived as the probability of the virus not acquiring additional mutations that confer resistance. Drug-specific IGBs were combined to obtain the IGB to an entire regimen, which quantifies the virus' genetic potential for developing drug resistance under combination therapy. The IGB was tested as a predictor of therapeutic outcome using between 2,185 and 2,631 treatment change episodes of subtype B infected patients from the Swiss HIV Cohort Study Database, a large observational cohort. Using logistic regression, significant univariate predictors included most of the 18 drugs and single-drug IGBs, the IGB to the entire regimen, the expert rules-based genotypic susceptibility score (GSS), several individual mutations, and the peak viral load before treatment change. In the multivariate analysis, the only genotype-derived variables that remained significantly associated with virological success were GSS and, with 10-fold stronger association, IGB to regimen. When predicting suppression of viral load below 400 cps/ml, IGB outperformed GSS and also improved GSS-containing predictors significantly, but the difference was not significant for suppression below 50 cps/ml. Thus, the IGB to regimen is a novel data-derived predictor of treatment outcome that has potential to improve the interpretation of genotypic drug resistance tests.
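The individualized genetic barrier can be pictured as the probability that none of the currently accessible resistance mutations, i.e., those whose partial-order predecessors are already present, is acquired. The sketch below encodes that picture directly; the mutation network, acquisition probabilities, and the multiplication used to combine per-drug barriers into a regimen barrier are simplified stand-ins for the fitted I-CBN.

```python
from math import prod

# partial order among resistance mutations: mutation -> prerequisite mutations (illustrative)
predecessors = {"M184V": set(), "K65R": set(), "M41L": set(), "T215Y": {"M41L"}}
p_acquire = {"M184V": 0.30, "K65R": 0.10, "M41L": 0.12, "T215Y": 0.15}

def igb_drug(present, relevant):
    """P(no additional relevant resistance mutation is acquired), given those present."""
    accessible = [m for m in relevant
                  if m not in present and predecessors[m] <= present]
    return prod(1 - p_acquire[m] for m in accessible)

present = {"M41L"}                                   # mutations detected in the patient's virus
regimen = {"3TC": {"M184V"}, "TDF": {"K65R", "M184V"}, "AZT": {"M41L", "T215Y"}}
igb_regimen = prod(igb_drug(present, muts) for muts in regimen.values())
print({drug: round(igb_drug(present, muts), 3) for drug, muts in regimen.items()},
      "| regimen IGB ~", round(igb_regimen, 3))
```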
Abstract:
Background. Drug-resistant human immunodeficiency virus type 1 (HIV-1) minority variants (MVs) are present in some antiretroviral therapy (ART)–naive patients. They may result from de novo mutagenesis or transmission. To date, the latter has not been proven. Methods. MVs were quantified by allele-specific polymerase chain reaction in 204 acute or recent seroconverters from the Zurich Primary HIV Infection study and 382 ART-naive, chronically infected patients. Phylogenetic analyses identified transmission clusters. Results. Three lines of evidence were observed in support of transmission of MVs. First, potential transmitters were identified for 12 of 16 acute or recent seroconverters harboring M184V MVs. These variants were also detected in plasma and/or peripheral blood mononuclear cells at the estimated time of transmission in 3 of 4 potential transmitters who experienced virological failure accompanied by the selection of the M184V mutation before transmission. Second, prevalence between MVs harboring the frequent mutation M184V and the particularly uncommon integrase mutation N155H differed highly significantly in acute or recent seroconverters (8.2% vs 0.5%; P < .001). Third, the prevalence of less-fit M184V MVs is significantly higher in acutely or recently than in chronically HIV-1–infected patients (8.2% vs 2.5%; P = .004). Conclusions. Drug-resistant HIV-1 MVs can be transmitted. To what extent the origin—transmission vs sporadic appearance—of these variants determines their impact on ART needs to be further explored.
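The second and third lines of evidence rest on comparing prevalences between groups. The snippet below reconstructs the acute-versus-chronic M184V comparison as a two-proportion test; the counts are rounded back out of the quoted percentages, so the P value only approximates the published .004.

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

count = np.array([17, 10])      # patients with M184V minority variants (approximate)
nobs = np.array([204, 382])     # acute/recent seroconverters, chronically infected patients
z, p = proportions_ztest(count, nobs)
print(f"{count[0]/nobs[0]:.1%} vs {count[1]/nobs[1]:.1%}: z = {z:.2f}, p = {p:.3f}")
```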
Abstract:
Context: In virologically suppressed, antiretroviral-treated patients, the effect of switching to tenofovir (TDF) on bone biomarkers compared to patients remaining on stable antiretroviral therapy is unknown. Methods: We examined bone biomarkers (osteocalcin [OC], procollagen type 1 amino-terminal propeptide, and C-terminal cross-linking telopeptide of type 1 collagen) and bone mineral density (BMD) over 48 weeks in virologically suppressed patients (HIV RNA < 50 copies/ml) randomized to switch to TDF/emtricitabine (FTC) or remain on first-line zidovudine (AZT)/lamivudine (3TC). PTH was also measured. Between-group differences in bone biomarkers and associations between change in bone biomarkers and BMD measures were assessed by Student's t tests, Pearson correlation, and multivariable linear regression, respectively. All data are expressed as mean (SD), unless otherwise specified. Results: Of 53 subjects (aged 46.0 y; 84.9% male; 75.5% Caucasian), 29 switched to TDF/FTC. There were reductions in total hip and lumbar spine BMD in those switching to TDF/FTC (total hip, TDF/FTC, −1.73 (2.76)% vs AZT/3TC, −0.39 (2.41)%; between-group P = .07; lumbar spine, TDF/FTC, −1.50 (3.49)% vs AZT/3TC, +0.25 (2.82)%; between-group P = .06), but they did not reach statistical significance. Greater declines in lumbar spine BMD correlated with greater increases in OC (r = −0.28; P = .05). The effect of TDF/FTC on bone biomarkers remained significant when adjusted for baseline biomarker levels, gender, and ethnicity. There was no difference in change in PTH levels over 48 weeks between treatment groups (between-group P = .23). All biomarkers increased significantly from weeks 0 to 48 in the switch group, with no significant change in those remaining on AZT/3TC (between-group, all biomarkers, P < .0001). Conclusion: A switch to TDF/FTC compared to remaining on a stable regimen is associated with increases in bone turnover that correlate with reductions in BMD, suggesting that TDF exposure directly affects bone metabolism in vivo.
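The statistical toolkit here is deliberately simple: between-group Student's t tests on percentage BMD change and Pearson correlations between biomarker and BMD changes. The sketch below reproduces that workflow on simulated data with roughly the reported group sizes; none of the values are the trial's.

```python
import numpy as np
from scipy.stats import ttest_ind, pearsonr

rng = np.random.default_rng(4)
dbmd_tdf = rng.normal(-1.5, 3.5, 29)            # % lumbar spine BMD change, switch to TDF/FTC
dbmd_azt = rng.normal(0.25, 2.8, 24)            # % change, remain on AZT/3TC
t_stat, p_between = ttest_ind(dbmd_tdf, dbmd_azt)

doc_tdf = -0.3 * dbmd_tdf + rng.normal(0, 1.0, 29)   # simulated change in osteocalcin
r, p_corr = pearsonr(doc_tdf, dbmd_tdf)
print(f"between-group p = {p_between:.2f}; OC vs BMD change: r = {r:.2f}, p = {p_corr:.2f}")
```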
Abstract:
Objectives: To determine HIV-1 RNA in cerebrospinal fluid (CSF) of successfully treated patients and to evaluate if combination antiretroviral treatments with higher central nervous system penetration-effectiveness (CPE) achieve better CSF viral suppression. Methods: Viral loads (VLs) and drug concentrations of lopinavir, atazanavir, and efavirenz were measured in plasma and CSF. The CPE was calculated using 2 different methods. Results: The authors analyzed 87 CSF samples of 60 patients. In 4 CSF samples, HIV-1 RNA was detectable with 43–82 copies per milliliter. Median CPE in patients with detectable CSF VL was significantly lower compared with individuals with undetectable VL: CPE of 1.0 (range, 1.0–1.5) versus 2.3 (range, 1.0–3.5) using the method of 2008 (P = 0.011) and CPE of 6 (range, 6–8) versus 8 (range, 5–12) using the method of 2010 (P = 0.022). The extrapolated CSF trough levels for atazanavir (n = 12) were clearly above the 50% inhibitory concentration (IC50) in only 25% of samples; both patients on atazanavir/ritonavir with detectable CSF HIV-1 RNA had trough levels in the range of the presumed IC50. The extrapolated CSF trough level for lopinavir (n = 42) and efavirenz (n = 18) were above the IC50 in 98% and 78%, respectively, of samples, including the patients with detectable CSF HIV-1 RNA. Conclusions: This study suggests that treatment regimens with high intracerebral efficacy reflected by a high CPE score are essential to achieve CSF HIV-1 RNA suppression. The CPE score including all drug components was a better predictor for treatment failure in the CSF than the sole concentrations of protease inhibitor or nonnucleoside reverse transcriptase inhibitor in plasma or CSF.
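A regimen-level CPE score is simply the sum of per-drug CNS penetration ranks taken from a published table (the 2008 and 2010 versions use different rank scales). The snippet below shows the bookkeeping; the rank values are illustrative placeholders rather than the published tables.

```python
# placeholder CNS penetration ranks (2010-style 1-4 scale); not the published table
cpe_rank = {"zidovudine": 4, "lamivudine": 2, "efavirenz": 3,
            "tenofovir": 1, "emtricitabine": 3, "lopinavir/r": 3}

def regimen_cpe(drugs):
    return sum(cpe_rank[d] for d in drugs)

print("TDF/FTC/EFV  CPE:", regimen_cpe(["tenofovir", "emtricitabine", "efavirenz"]))
print("AZT/3TC/LPVr CPE:", regimen_cpe(["zidovudine", "lamivudine", "lopinavir/r"]))
```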
Abstract:
Objectives: Etravirine (ETV) is metabolized by cytochrome P450 (CYP) 3A, 2C9, and 2C19. Metabolites are glucuronidated by uridine diphosphate glucuronosyltransferases (UGT). To identify the potential impact of genetic and non-genetic factors involved in ETV metabolism, we carried out a two-step pharmacogenetics-based population pharmacokinetic study in HIV-1 infected individuals. Materials and methods: The study population included 144 individuals contributing 289 ETV plasma concentrations and four individuals contributing 23 ETV plasma concentrations collected in a rich sampling design. Genetic variants [n=125 single-nucleotide polymorphisms (SNPs)] in 34 genes with a predicted role in ETV metabolism were selected. A first step population pharmacokinetic model included non-genetic and known genetic factors (seven SNPs in CYP2C, one SNP in CYP3A5) as covariates. Post-hoc individual ETV clearance (CL) was used in a second (discovery) step, in which the effect of the remaining 98 SNPs in CYP3A, P450 cytochrome oxidoreductase (POR), nuclear receptor genes, and UGTs was investigated. Results: A one-compartment model with zero-order absorption best characterized ETV pharmacokinetics. The average ETV CL was 41 (l/h) (CV 51.1%), the volume of distribution was 1325 l, and the mean absorption time was 1.2 h. The administration of darunavir/ritonavir or tenofovir was the only non-genetic covariate influencing ETV CL significantly, resulting in a 40% [95% confidence interval (CI): 13–69%] and a 42% (95% CI: 17–68%) increase in ETV CL, respectively. Carriers of rs4244285 (CYP2C19*2) had 23% (8–38%) lower ETV CL. Co-administered antiretroviral agents and genetic factors explained 16% of the variance in ETV concentrations. None of the SNPs in the discovery step influenced ETV CL. Conclusion: ETV concentrations are highly variable, and co-administered antiretroviral agents and genetic factors explained only a modest part of the interindividual variability in ETV elimination. Opposing effects of interacting drugs effectively abrogate genetic influences on ETV CL, and vice-versa.
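The structural model reported is a one-compartment model with zero-order absorption, with clearance shifted multiplicatively by co-medication and CYP2C19*2 carriage. The sketch below writes that model out; the typical clearance, volume, and covariate effects are taken loosely from the abstract, while the dose, absorption duration (set to twice the reported mean absorption time), and sampling times are assumptions.

```python
import numpy as np

def clearance(base_cl=41.0, on_drv_r_or_tdf=False, cyp2c19_star2=False):
    """Typical ETV clearance (L/h) shifted by the covariate effects in the abstract."""
    cl = base_cl
    cl *= 1.40 if on_drv_r_or_tdf else 1.0       # ~40% higher CL with darunavir/r or tenofovir
    cl *= 0.77 if cyp2c19_star2 else 1.0         # ~23% lower CL in CYP2C19*2 carriers
    return cl

def concentration(t_h, dose_mg=200.0, cl=41.0, v=1325.0, dur_h=2.4):
    """One-compartment model with zero-order absorption over dur_h hours (assumed)."""
    ke, rate = cl / v, dose_mg / dur_h
    c_end = rate / cl * (1 - np.exp(-ke * dur_h))
    during = rate / cl * (1 - np.exp(-ke * np.minimum(t_h, dur_h)))
    after = c_end * np.exp(-ke * np.clip(t_h - dur_h, 0, None))
    return np.where(t_h <= dur_h, during, after)

t = np.array([1, 2, 4, 8, 12], dtype=float)
print(np.round(concentration(t, cl=clearance(on_drv_r_or_tdf=True)), 4))
```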
Abstract:
Alpha interferon (IFN-α) suppresses human immunodeficiency virus type 1 (HIV-1) replication in vitro by inducing cell-intrinsic retroviral restriction mechanisms. We investigated the effects of IFN-α/ribavirin (IFN-α/riba) treatment on 34 anti-HIV-1 restriction factors in vivo. Expression of several anti-HIV-1 restriction factors was significantly induced by IFN-α/riba in HIV/hepatitis C virus (HCV)-coinfected individuals. Fold induction of cumulative restriction factor expression in CD4+ T cells was significantly correlated with viral load reduction during IFN-α/riba treatment (r² = 0.649; P < 0.016). Exogenous IFN-α induces supraphysiologic restriction factor expression associated with a pronounced decrease in HIV-1 viremia.
Abstract:
The predominant route of human immunodeficiency virus type 1 (HIV-1) transmission is infection across the vaginal mucosa. Epithelial cells, which form the primary barrier of protection against pathogens, are the first cell type at these mucosal tissues to encounter the virus, but their role in HIV infection has not been clearly elucidated. Although mucosal epithelial cells express only low levels of the receptors required for successful HIV infection, productive infection does occur at these sites. The present work provides evidence to show that HIV exposure, without the need for productive infection, induces human cervical epithelial cells to produce Thymic Stromal Lymphopoietin (TSLP), an IL7-like cytokine, which potently activated human myeloid dendritic cells (mDC) to cause the homeostatic proliferation of autologous CD4+ T cells that serve as targets for HIV infection. Rhesus macaques inoculated with simian immunodeficiency virus (SIV) or with the simian-human immunodeficiency virus (SHIV) by the vaginal, oral or rectal route exhibited dramatic increases in TSLP expression, DC and CD4+ T cell numbers, and viral replication in the vaginal, oral, and rectal tissues, respectively, within the first 2 weeks after virus exposure. Evidence obtained showed that HIV-mediated TSLP production by cervical cells is dependent upon the expression of the cell surface salivary agglutinin (SAG) protein gp340. Epithelial cells expressing gp340 exhibited HIV endocytosis and TSLP expression, and genetic knockdown of gp340 or use of a gp340-blocking antibody inhibited HIV-induced TSLP expression. On the other hand, gp340-null epithelial cells failed to endocytose HIV and produce TSLP, but transfection of gp340 resulted in HIV-induced TSLP expression. Finally, HIV-induced TSLP expression was found to be mediated by TLR7/8 signaling and NF-κB activity, because silencing these pathways or using specific inhibitors abrogated TSLP expression in gp340-positive but not in gp340-null epithelial cells. Overall, these studies identify TSLP as a key player in the acute phase of HIV-1 infection: by creating a conducive environment that sustains the small amount of virus that initially crosses the mucosal barrier, TSLP permits HIV to maneuver the hostile vaginal mucosal microenvironment, establish infection, and spread to distal compartments of the body.
Abstract:
Next-generation sequencing (NGS) is a valuable tool for the detection and quantification of HIV-1 variants in vivo. However, these technologies require detailed characterization and control of artificially induced errors to be applicable for accurate haplotype reconstruction. To investigate the occurrence of substitutions, insertions, and deletions at the individual steps of RT-PCR and NGS, 454 pyrosequencing was performed on amplified and non-amplified HIV-1 genomes. Artificial recombination was explored by mixing five different HIV-1 clonal strains (5-virus-mix) and applying different RT-PCR conditions followed by 454 pyrosequencing. Error rates ranged from 0.04-0.66% and were similar in amplified and non-amplified samples. Discrepancies were observed between forward and reverse reads, indicating that most errors were introduced during the pyrosequencing step. Using the 5-virus-mix, non-optimized, standard RT-PCR conditions introduced artificial recombinants in a fraction of at least 30% of the reads that subsequently led to an underestimation of true haplotype frequencies. We minimized the fraction of recombinants down to 0.9-2.6% by optimized, artifact-reducing RT-PCR conditions. This approach enabled correct haplotype reconstruction and frequency estimations consistent with reference data obtained by single genome amplification. RT-PCR conditions are crucial for correct frequency estimation and analysis of haplotypes in heterogeneous virus populations. We developed an RT-PCR procedure to generate NGS data useful for reliable haplotype reconstruction and quantification.
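Two of the quantities discussed, per-base error rates against a known clonal reference and the fraction of artificial RT-PCR recombinants in a defined virus mix, can be illustrated with a toy calculation: count mismatches for the former, and flag reads whose alleles at strain-discriminating positions come from more than one strain for the latter. The sequences and marker positions below are made up and bear no relation to the 454 data.

```python
def error_rate(reads, reference):
    """Per-base mismatch rate of equal-length reads against a clonal reference."""
    mismatches = sum(r[i] != reference[i] for r in reads for i in range(len(reference)))
    return mismatches / (len(reads) * len(reference))

def is_recombinant(read, markers):
    """markers: {position: {strain: base}}; True if a read's alleles map to >1 strain."""
    strains = set()
    for pos, alleles in markers.items():
        for strain, base in alleles.items():
            if read[pos] == base:
                strains.add(strain)
    return len(strains) > 1

reference = "ACGTACGTAC"
reads = ["ACGTACGTAC", "ACGAACGTAC", "ACGTACGTTC"]
print("error rate:", round(error_rate(reads, reference), 3))

markers = {2: {"A": "G", "B": "T"}, 7: {"A": "T", "B": "C"}}
print("chimeric?", is_recombinant("ACGTACGTAC", markers),   # pure strain A -> False
      is_recombinant("ACGTACGCAC", markers))                # A at pos 2, B at pos 7 -> True
```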