927 results for HIV-POSITIVE WOMEN
Abstract:
BACKGROUND: According to current recommendations, HIV-infected women should have at least 1 gynecologic examination per year. OBJECTIVES: To analyze factors associated with the frequency of gynecologic follow-up and cervical cancer screening among HIV-infected women followed in the Swiss HIV Cohort Study (SHCS). METHODS: Half-yearly questionnaires were administered between April 2001 and December 2004. At every follow-up visit, the women were asked whether they had had a gynecologic examination and a cervical smear since their last visit. Longitudinal models were fitted with these variables as outcomes. RESULTS: A total of 2186 women were included in the analysis. Of the 1146 women with complete follow-up in the SHCS, 35.3% had a gynecologic examination in each time period, whereas 7.4% had never gone to a gynecologist. Factors associated with poor gynecologic follow-up were older age, nonwhite ethnicity, less education, underweight, obesity, being sexually inactive, intravenous drug use, smoking, having a private infectious disease specialist as a care provider, HIV viral load <400 copies/mL, and no previous cervical dysplasia. No association was seen for living alone, CD4 cell count, or positive serology for syphilis. CONCLUSIONS: Gynecologic care among well-followed HIV-positive women is poor and needs to be improved.
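The longitudinal models mentioned in the methods are essentially repeated-measures logistic regressions. A minimal sketch of one way such a model could be fitted is shown below; the simulated data, column names and effect sizes are illustrative assumptions, not the SHCS dataset.

```python
# Sketch of a longitudinal (repeated-measures) logistic model for a binary
# outcome such as "had a gynecologic examination in this half-year period".
# The simulated data and column names are illustrative, not the SHCS dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_women, n_periods = 300, 7          # e.g. half-yearly questionnaires, 2001-2004
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(n_women), n_periods),
    "age": np.repeat(rng.integers(18, 70, n_women), n_periods),
    "smoker": np.repeat(rng.integers(0, 2, n_women), n_periods),
})
# Simulated outcome: older age and smoking lower the odds of an examination
logit = 1.0 - 0.02 * (df["age"] - 40) - 0.5 * df["smoker"]
df["gyn_exam"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# GEE with an exchangeable working correlation accounts for the repeated
# half-yearly reports from the same woman.
model = smf.gee(
    "gyn_exam ~ age + smoker",
    groups="patient_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(np.exp(result.params))      # odds ratios per factor
print(np.exp(result.conf_int()))  # 95% confidence intervals
```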
Abstract:
OBJECTIVE: To determine characteristics and clinical course of high-grade anogenital intraepithelial neoplasia (AIN) in human immunodeficiency virus (HIV)-infected women. STUDY DESIGN: HIV-positive women with biopsy-proven high-grade (II and III) vulvar (VIN), vaginal (VAIN) or perianal intraepithelial neoplasia (PAIN) were identified in the electronic databases of 2 colposcopy clinics. RESULTS: A total of 31 patients were identified from 1992 to 2007, of whom 30 had a mean follow-up of 47.7 months (SD 46.0; range, 2.6-166.2). Of the patients, 77.4% had VIN, 12.9% VAIN and 9.7% PAIN at first diagnosis. Mean age at diagnosis of IN was 36.2 years (SD 5.2; range, 23.5-47.0). Ninety percent of patients received antiretroviral therapy at first diagnosis of IN; 65% (13 of 20) of patients with a follow-up of >2 years required a second treatment, and 2 developed invasive vulvar cancer (10%). CONCLUSION: AIN among HIV-positive women shows a high relapse rate regardless of the treatment modality used, and a substantial invasive potential.
Abstract:
Background: Few studies have monitored late presentation (LP) of HIV infection across the European continent, including Eastern Europe. The study objectives were to explore the impact of LP on AIDS and mortality. Methods and Findings: LP was defined in the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) as HIV diagnosis with a CD4 count <350/mm3 or an AIDS diagnosis within 6 months of HIV diagnosis among persons presenting for care between 1 January 2000 and 30 June 2011. Logistic regression was used to identify factors associated with LP, and Poisson regression to explore the impact on AIDS/death. 84,524 individuals from 23 cohorts in 35 countries contributed data; 45,488 (53.8%) were late presenters. LP was highest in heterosexual males (66.1%), Southern European countries (57.0%), and persons originating from Africa (65.1%). LP decreased from 57.3% in 2000 to 51.7% in 2010/2011 (adjusted odds ratio [aOR] 0.96; 95% CI 0.95–0.97). LP decreased over time in both Central and Northern Europe among homosexual men and male and female heterosexuals, but increased over time for female heterosexuals and male intravenous drug users (IDUs) from Southern Europe and for male and female IDUs from Eastern Europe. 8,187 AIDS/death events occurred during 327,003 person-years of follow-up. In the first year after HIV diagnosis, LP was associated with over a 13-fold increased incidence of AIDS/death in Southern Europe (adjusted incidence rate ratio [aIRR] 13.02; 95% CI 8.19–20.70) and over a 6-fold increased rate in Eastern Europe (aIRR 6.64; 95% CI 3.55–12.43). Conclusions: LP has decreased over time across Europe, but remains a significant issue in the region in all HIV exposure groups. LP increased in male IDUs and female heterosexuals from Southern Europe and in IDUs from Eastern Europe. LP was associated with an increased rate of AIDS/death, particularly in the first year after HIV diagnosis, with significant variation across Europe. Earlier and more widespread testing, timely referral after testing positive, and improved retention-in-care strategies are required to further reduce the incidence of LP.
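The adjusted incidence rate ratios quoted above come from Poisson regression on event counts and person-years; the crude version of the same quantity can be computed directly. A small worked example with invented counts (not COHERE data) follows; the adjusted aIRRs additionally control for covariates in the regression model.

```python
# Worked example of a crude incidence rate ratio (IRR) with a Wald 95% CI,
# the quantity that the adjusted aIRRs in the abstract refine via Poisson
# regression. The event counts and person-years below are invented.
import math

def crude_irr(events_exposed, pyrs_exposed, events_unexposed, pyrs_unexposed):
    rate1 = events_exposed / pyrs_exposed
    rate0 = events_unexposed / pyrs_unexposed
    irr = rate1 / rate0
    se_log = math.sqrt(1 / events_exposed + 1 / events_unexposed)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, (lo, hi)

# Invented example: AIDS/death events in the first year after diagnosis,
# late presenters vs. non-late presenters.
irr, ci = crude_irr(events_exposed=300, pyrs_exposed=10_000,
                    events_unexposed=40, pyrs_unexposed=12_000)
print(f"IRR {irr:.1f} (95% CI {ci[0]:.1f}-{ci[1]:.1f})")
```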
Abstract:
Because of the large variability in the pharmacokinetics of anti-HIV drugs, therapeutic drug monitoring in patients may help optimize the overall efficacy and safety of antiretroviral therapy. An LC–MS/MS method for the simultaneous assay of the novel antiretroviral agents rilpivirine (RPV) and elvitegravir (EVG) in plasma has been developed for that purpose. Plasma samples (100 μL) are extracted by protein precipitation with acetonitrile, and the supernatant is subsequently diluted 1:1 with 20 mM ammonium acetate/MeOH 50:50. After reverse-phase chromatography, RPV and EVG are quantified against matrix-matched calibration samples by electrospray ionization–triple quadrupole mass spectrometry with selected reaction monitoring detection in positive mode. The stable isotope-labeled compounds RPV-13C6 and EVG-D6 were used as internal standards. The method was validated according to FDA recommendations, including assessment of extraction yield, matrix effect variability (<6.4%), and the short- and long-term stability of EVG and RPV in plasma. Calibration curves were validated over the clinically relevant concentration ranges of 5–2500 ng/mL for RPV and 50–5000 ng/mL for EVG. The method is precise (inter-day CV 3–6.3%) and accurate (3.8–7.2%). Plasma samples were found to be stable (<15% variation) under all conditions considered (room temperature/48 h, +4°C/48 h, −20°C/3 months and 60°C/1 h). Analysis of selected metabolite profiles in patients' samples revealed the presence of EVG glucuronide, which was well separated from parent EVG, allowing potential interference through in-source dissociation of the glucuronide to the parent drug to be excluded. This new, rapid and robust LC–MS/MS assay for the simultaneous quantification of plasma concentrations of the two major new anti-HIV drugs EVG and RPV offers an efficient analytical tool for clinical pharmacokinetic studies and routine therapeutic drug monitoring.
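Quantification against matrix-matched calibration samples reduces to regressing the analyte/internal-standard peak-area ratio on nominal concentration and inverting the fit. A short sketch with invented numbers follows; it assumes an approximate 1/x weighting scheme, which the abstract does not specify.

```python
# Back-calculating plasma concentrations from a matrix-matched calibration
# curve, the final step of LC-MS/MS quantification. All numbers below are
# invented for illustration; they are not validated RPV or EVG assay data.
import numpy as np

# Nominal calibrator concentrations (ng/mL) and observed analyte/IS peak-area ratios
nominal = np.array([5.0, 25.0, 100.0, 500.0, 1000.0, 2500.0])   # e.g. the RPV range
ratio = np.array([0.011, 0.056, 0.220, 1.090, 2.210, 5.480])

# Approximate 1/x-weighted linear regression; numpy weights multiply the
# residuals, so 1/sqrt(x) corresponds to the usual 1/x weighting scheme.
slope, intercept = np.polyfit(nominal, ratio, 1, w=1.0 / np.sqrt(nominal))

def back_calculate(area_ratio: float) -> float:
    """Convert an analyte/internal-standard area ratio into a concentration (ng/mL)."""
    return (area_ratio - intercept) / slope

print(f"slope={slope:.5f}, intercept={intercept:.5f}")
print(f"estimated concentration: {back_calculate(0.85):.0f} ng/mL")
```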
Abstract:
OBJECTIVES: To assess health care utilisation for patients co-infected with TB and HIV (TB-HIV), and to develop a weighted health care index (HCI) score based on commonly used interventions and compare it with patient outcome. METHODS: A total of 1061 HIV patients diagnosed with TB in four regions (Central/Northern Europe, Southern Europe, Eastern Europe and Argentina) between January 2004 and December 2006 were enrolled in the TB-HIV study. A weighted HCI score (range 0–5) was constructed from independent prognostic factors identified in multivariable Cox models; the final score included performance of TB drug susceptibility testing (DST), an initial TB regimen containing a rifamycin, isoniazid and pyrazinamide, and start of combination antiretroviral treatment (cART). RESULTS: The mean HCI score was highest in Central/Northern Europe (3.2, 95%CI 3.1–3.3) and lowest in Eastern Europe (1.6, 95%CI 1.5–1.7). The cumulative probability of death 1 year after TB diagnosis decreased from 39% (95%CI 31–48) among patients with an HCI score of 0 to 9% (95%CI 6–13) among those with a score of ≥4. In an adjusted Cox model, a 1-unit increase in the HCI score was associated with 27% reduced mortality (relative hazard 0.73, 95%CI 0.64–0.84). CONCLUSIONS: Our results suggest that DST, standard anti-tuberculosis treatment and early cART may improve outcomes for TB-HIV patients. The proposed HCI score provides a tool for future research and monitoring of the management of TB-HIV patients. The highest HCI score may serve as a benchmark to assess TB-HIV management, encouraging continuous health care improvement.
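The HCI itself is a weighted sum of binary management indicators. The sketch below uses the three components named in the abstract but hypothetical point weights, since the abstract gives only the 0–5 range and not the exact allocation.

```python
# Sketch of a weighted health care index (HCI) for TB-HIV management.
# The three components come from the abstract; the point weights below are
# hypothetical placeholders chosen only so that the score spans 0-5.
import pandas as pd

WEIGHTS = {                      # hypothetical weights summing to 5
    "dst_performed": 1,          # TB drug susceptibility testing done
    "standard_tb_regimen": 2,    # initial regimen with rifamycin + isoniazid + pyrazinamide
    "cart_started": 2,           # combination antiretroviral treatment started
}

def hci_score(row: pd.Series) -> int:
    """Sum the weights of the interventions the patient actually received."""
    return sum(w for key, w in WEIGHTS.items() if row[key])

patients = pd.DataFrame({
    "dst_performed":       [1, 0, 1],
    "standard_tb_regimen": [1, 1, 0],
    "cart_started":        [1, 0, 0],
})
patients["hci"] = patients.apply(hci_score, axis=1)
print(patients)   # scores 5, 2, 1 for the three example patients
```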
Abstract:
Background: Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection. Methods: In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort. Results: A genetic risk score built from the 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9×10−4). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05–2.04). This effect was similar to that of hypertension (OR = 1.36; 95% CI, 1.06–1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16–1.96), diabetes (OR = 1.66; 95% CI, 1.10–2.49), ≥1 year of lopinavir exposure (OR = 1.36; 95% CI, 1.06–1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17–2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors and did not change after adjustment for family history of CAD. Conclusions: In the setting of HIV infection, the effect of an unfavorable genetic background was similar to that of traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
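A genetic risk score of this kind is typically a weighted count of risk alleles, with per-SNP weights taken from published general-population effect sizes, and the top quartile of the score then compared against the rest. A schematic sketch with invented SNP identifiers and weights (not the 23 SNPs used in the study):

```python
# Schematic genetic risk score: a weighted sum of risk-allele counts (0/1/2)
# across CAD-associated SNPs, with the top quartile flagged as the
# "unfavorable genetic background" group. SNP identifiers and weights below
# are invented; the study used 23 published CAD SNPs and their effect sizes.
import pandas as pd

weights = pd.Series({        # per-allele log odds ratios (illustrative values)
    "rs0000001": 0.10,
    "rs0000002": 0.07,
    "rs0000003": 0.12,
})

# Genotype matrix: rows = participants, values = risk-allele counts per SNP
genotypes = pd.DataFrame(
    [[0, 1, 2],
     [2, 2, 1],
     [1, 0, 0],
     [0, 0, 1]],
    columns=weights.index,
)

risk_score = (genotypes * weights).sum(axis=1)
top_quartile = (risk_score >= risk_score.quantile(0.75)).astype(int)
print(pd.DataFrame({"score": risk_score, "top_quartile": top_quartile}))
```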
Abstract:
Hereditary breast and ovarian cancer (HBOC) is caused by a mutation in the BRCA1 or BRCA2 genes. Women with a BRCA1/2 mutation are at increased risk for breast and ovarian cancer and often develop cancer at an earlier age than the general population. However, some women with a BRCA1/2 mutation do not develop breast or ovarian cancer under the age of 50 years. Because there have been no specific studies of BRCA-positive women with no cancer prior to age 50, this study investigated these women with respect to reproductive risk factors, BMI, tumor pathology, screening history, risk-reducing surgeries, and family history. In the study population, 241 women were diagnosed with cancer prior to age 50, 92 were diagnosed at age 50 or older, and 20 women were over age 50 with no cancer. Data were stratified by BRCA1 and BRCA2 mutation status. Within these cohorts we investigated differences between women who developed cancer prior to age 50 and those who developed cancer at age 50 or older, and between women who developed cancer at age 50 or older and those who were age 50 or older with no cancer. Of the 92 women with a BRCA1/2 mutation who developed cancer at age 50 or older, 46 developed ovarian cancer first, 45 developed breast cancer first, and one had breast and ovarian cancer diagnosed synchronously. BRCA2 carriers diagnosed at age 50 or older were more likely to have ER/PR-negative breast tumors than BRCA2 carriers diagnosed before age 50, which is consistent with the one other study that has examined this question. Ashkenazi Jewish women with a BRCA1 mutation were more likely than women of other ethnicities to be diagnosed at age 50 or older, and Hispanic women with a BRCA2 mutation were more likely than women of other ethnicities to be diagnosed prior to age 50. No differences in reproductive factors or BMI were observed. Further characterization of BRCA-positive women with no cancer prior to age 50 may help identify factors important in the development of breast or ovarian cancer.
Abstract:
Noncommunicable diseases (NCDs) account for a growing burden of morbidity and mortality among people living with HIV in low- and middle-income countries (LMICs). HIV infection and antiretroviral therapy interact with NCD risk factors in complex ways, and research into this "web of causation" has so far been largely based on data from high-income countries. However, improving the understanding, treatment, and prevention of NCDs in LMICs requires region-specific evidence. Priority research areas include: (1) defining the burden of NCDs among people living with HIV, (2) understanding the impact of modifiable risk factors, (3) evaluating effective and efficient care strategies at individual and health systems levels, and (4) evaluating cost-effective prevention strategies. Meeting these needs will require observational data, both to inform the design of randomized trials and to replace trials that would be unethical or infeasible. Focusing on Sub-Saharan Africa, we discuss data resources currently available to inform this effort and consider key limitations and methodological challenges. Existing data resources often lack population-based samples; HIV-negative, HIV-positive, and antiretroviral therapy-naive comparison groups; and measurements of key NCD risk factors and outcomes. Other challenges include loss to follow-up, competing risk of death, incomplete outcome ascertainment and measurement of factors affecting clinical decision making, and the need to control for (time-dependent) confounding. We review these challenges and discuss strategies for overcoming them through augmented data collection and appropriate analysis. We conclude with recommendations to improve the quality of data and analyses available to inform the response to HIV and NCD comorbidity in LMICs.
Abstract:
INTRODUCTION: Proteinuria (PTU) is an important marker for the development and progression of renal disease, cardiovascular disease and death, but there is limited information about the prevalence of, and factors associated with, confirmed PTU in predominantly white European HIV-positive persons, especially in those with an estimated glomerular filtration rate (eGFR) >60 mL/min/1.73 m(2). PATIENTS AND METHODS: Baseline was defined as the first of two consecutive dipstick urine protein (DPU) measurements during prospective follow-up after 1/6/2011 (when systematic data collection began). PTU was defined as two consecutive DPU measurements >1+ (>30 mg/dL) more than 3 months apart; persons with eGFR <60 at either DPU measurement were excluded. Logistic regression was used to investigate factors associated with PTU. RESULTS: A total of 1,640 persons were included; participants were mainly white (n=1,517, 92.5%), male (n=1,296, 79.0%) and men who have sex with men (n=809, 49.3%). Median age at baseline was 45 years (IQR 37-52), and median CD4 count was 570 cells/mm(3) (IQR 406-760). The median baseline date was 2/12 (IQR 11/11-6/12), and median eGFR was 99 mL/min/1.73 m(2) (IQR 88-109). Sixty-nine persons had PTU (4.2%, 95% CI 3.2-4.7%). Persons with diabetes had increased odds of PTU, as did those with a prior non-AIDS or AIDS event and those with prior exposure to indinavir. Females, those with a normal eGFR (>90) and those with prior abacavir use had lower odds of PTU (Figure 1). CONCLUSIONS: One in 25 persons with eGFR >60 had confirmed proteinuria at baseline. Factors associated with PTU were similar to those associated with CKD. The lack of association with antiretrovirals, particularly tenofovir, may be due to the cross-sectional design of this study, and additional follow-up is required to address progression to PTU in those without PTU at baseline. It may also suggest that other markers are needed to capture the deteriorating renal function associated with antiretrovirals at higher eGFRs. Our findings suggest PTU is an early marker for impaired renal function.
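The confirmed-PTU definition (two consecutive qualifying dipstick readings more than 3 months apart, with eGFR at or above 60 at both measurements) can be expressed programmatically. The sketch below reflects one reasonable reading of that definition over a hypothetical long-format visit table; the column names and data are invented.

```python
# Sketch of the confirmed-proteinuria (PTU) definition: two consecutive
# dipstick urine protein readings at/above the 1+ threshold, more than
# 3 months apart, with eGFR >= 60 at both measurements.
# Table layout, column names and values are hypothetical.
import pandas as pd

visits = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "date": pd.to_datetime(
        ["2011-07-01", "2011-11-15", "2012-04-01", "2011-08-01", "2011-09-01"]),
    "dipstick_grade": [1, 1, 0, 2, 2],   # 0 = negative/trace, 1 = 1+, 2 = 2+, ...
    "egfr": [95, 92, 90, 88, 85],
})

def has_confirmed_ptu(group: pd.DataFrame) -> bool:
    g = group.sort_values("date")
    prev = None
    for _, row in g.iterrows():
        if row["dipstick_grade"] >= 1 and row["egfr"] >= 60:
            if prev is not None and (row["date"] - prev) > pd.Timedelta(days=90):
                return True            # two consecutive readings >3 months apart
            prev = row["date"]
        else:
            prev = None                # a non-qualifying reading breaks the run
    return False

print(visits.groupby("patient_id").apply(has_confirmed_ptu))
```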
Abstract:
INTRODUCTION: HIV-infected pregnant women are very likely to engage in HIV medical care to prevent transmission of HIV to their newborn. After delivery, however, childcare and competing commitments might lead to disengagement from HIV care. The aim of this study was to quantify loss to follow-up (LTFU) from HIV care after delivery and to identify risk factors for LTFU. METHODS: We used data on 719 pregnancies within the Swiss HIV Cohort Study from 1996 to 2012 with information on follow-up visits available. Two LTFU events were defined: no clinical visit for >180 days and no visit for >360 days in the year after delivery. Logistic regression analysis was used to identify risk factors for an LTFU event after delivery. RESULTS: Median maternal age at delivery was 32 years (IQR 28-36); 357 (49%) women were black, 280 (39%) white, 56 (8%) Asian and 4% of other ethnicities. One hundred and seven (15%) women reported a history of intravenous drug use (IDU). The majority (524, 73%) of women received their HIV diagnosis before pregnancy; most of those (413, 79%) had lived with diagnosed HIV for longer than three years, and two-thirds (342, 65%) were already on antiretroviral therapy (ART) at the time of conception. Of the 181 women diagnosed during pregnancy by a screening test, 80 (44%) were diagnosed in the first trimester, 67 (37%) in the second and 34 (19%) in the third trimester. Of 357 (69%) women who had been seen in HIV medical care within the three months before conception, 93% achieved an undetectable HIV viral load (VL) at delivery. Of 62 (12%) women whose last medical visit was more than six months before conception, only 72% achieved an undetectable VL (p=0.001). Overall, 247 (34%) women were LTFU for over 180 days in the year after delivery and 86 (12%) were LTFU for over 360 days, with 43 (50%) of those women later returning to care. Being LTFU for 180 days was significantly associated with a history of intravenous drug use (aOR 1.73, 95% CI 1.09-2.77, p=0.021) and not achieving an undetectable VL at delivery (aOR 1.79, 95% CI 1.03-3.11, p=0.040) after adjusting for maternal age, ethnicity, time of HIV diagnosis and being on ART at conception. CONCLUSIONS: Women with a history of IDU and women with a detectable VL at delivery were more likely to be LTFU after delivery. This is of concern with regard to their own health, as well as the risk to sexual partners and in subsequent pregnancies. Further strategies should be developed to enhance retention in medical care beyond pregnancy.
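Both LTFU outcomes can be derived directly from visit dates in the year after delivery. The sketch below follows one reasonable reading of the definitions, counting gaps from delivery to the end of the post-partum year, and uses invented dates rather than cohort data.

```python
# Sketch of the two loss-to-follow-up definitions: no clinical visit for
# >180 days and no visit for >360 days within the year after delivery.
# The interpretation (gaps measured from delivery to the end of the year)
# and the example dates are assumptions for illustration.
import pandas as pd

def ltfu_flags(delivery_date: pd.Timestamp, visit_dates: pd.Series) -> dict:
    end = delivery_date + pd.Timedelta(days=365)
    visits = visit_dates[(visit_dates > delivery_date) & (visit_dates <= end)]
    # Gaps between consecutive time points from delivery to the end of the year
    points = pd.Series([delivery_date, *sorted(visits), end])
    max_gap = points.diff().max()
    return {
        "ltfu_180": bool(max_gap > pd.Timedelta(days=180)),
        "ltfu_360": bool(max_gap > pd.Timedelta(days=360)),
    }

delivery = pd.Timestamp("2010-03-01")
visits = pd.Series(pd.to_datetime(["2010-04-10", "2010-11-20"]))
print(ltfu_flags(delivery, visits))   # {'ltfu_180': True, 'ltfu_360': False}
```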
Abstract:
OBJECTIVES: HIV infection has been associated with an increased risk of chronic kidney disease (CKD). Little is known about the prevalence of CKD in individuals with high CD4 cell counts prior to initiation of antiretroviral therapy (ART). We sought to address this knowledge gap. METHODS: We describe the prevalence of CKD among 4637 ART-naïve adults (mean age 36.8 years) with CD4 cell counts > 500 cells/μL at enrolment in the Strategic Timing of AntiRetroviral Treatment (START) study. CKD was defined by estimated glomerular filtration rate (eGFR) < 60 mL/min/1.73 m(2) and/or dipstick urine protein ≥ 1+. Logistic regression was used to identify baseline characteristics associated with CKD. RESULTS: Among 286 [6.2%; 95% confidence interval (CI) 5.5%, 6.9%] participants with CKD, the majority had isolated proteinuria. A total of 268 participants had urine protein ≥ 1+, including 41 with urine protein ≥ 2+. Only 22 participants (0.5%) had an eGFR < 60 mL/min/1.73 m(2), including four who also had proteinuria. Baseline characteristics independently associated with CKD included diabetes [adjusted odds ratio (aOR) 1.73; 95% CI 1.05, 2.85], hypertension (aOR 1.82; 95% CI 1.38, 2.38), and race/ethnicity (aOR 0.59; 95% CI 0.37, 0.93 for Hispanic vs. white). CONCLUSIONS: We observed a low prevalence of CKD, associated with traditional CKD risk factors, among ART-naïve clinical trial participants with CD4 cell counts > 500 cells/μL.
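The CKD screening definition combines an eGFR cut-off with dipstick proteinuria. The sketch below assumes the CKD-EPI 2009 creatinine equation for the eGFR component, since the abstract does not state which formula START used.

```python
# Sketch of the CKD screening definition: eGFR < 60 mL/min/1.73 m^2 and/or
# dipstick urine protein >= 1+. The CKD-EPI 2009 creatinine equation is used
# here as an assumption; the abstract does not specify the eGFR formula.

def ckd_epi_2009(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2) from serum creatinine in mg/dL."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def has_ckd(scr_mg_dl, age, female, black, dipstick_protein_grade) -> bool:
    egfr = ckd_epi_2009(scr_mg_dl, age, female, black)
    return egfr < 60 or dipstick_protein_grade >= 1

print(ckd_epi_2009(0.9, 37, female=False, black=False))                        # ~109
print(has_ckd(0.9, 37, female=False, black=False, dipstick_protein_grade=1))   # True (isolated proteinuria)
```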
Abstract:
OBJECTIVES: Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ monitoring with viral load monitoring. DESIGN: A mathematical modelling study. METHODS: A simulation model of HIV progression over 5 years in children on ART, parameterized by data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ monitoring or 6- or 12-monthly viral load monitoring, and compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure and assumed that the rate is higher with CD4⁺ than with viral load monitoring. RESULTS: About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6% to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ monitoring compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ monitoring compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION: Viral load monitoring did not affect 5-year mortality, but it reduced time on failing ART, improved immunological response and increased switching to second-line ART.
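The mechanism behind these results, a failing regimen going undetected between or despite monitoring visits, can be illustrated with a very small toy simulation. The sketch below uses invented probabilities and is not the parameterized South African cohort model from the study.

```python
# Toy discrete-time simulation of the mechanism compared in the model:
# virological failure is only acted upon when a monitoring visit detects it,
# so children accumulate time on a failing regimen until detection.
# All probabilities are invented for illustration and are NOT the parameters
# of the published cohort model.
import random

MONTHS = 60                  # 5-year horizon
P_FAIL_PER_MONTH = 0.01      # invented monthly probability of virological failure

def mean_months_on_failing_art(interval_months: int, p_detect_per_visit: float,
                               n: int = 20000) -> float:
    total = 0.0
    for _ in range(n):
        fail_month = None
        detected = False
        for m in range(1, MONTHS + 1):
            if fail_month is None and random.random() < P_FAIL_PER_MONTH:
                fail_month = m
            # Failure can only be detected at a scheduled monitoring visit
            if fail_month is not None and m % interval_months == 0:
                if random.random() < p_detect_per_visit:
                    total += m - fail_month
                    detected = True
                    break
        if fail_month is not None and not detected:
            total += MONTHS - fail_month     # still failing at the end of follow-up
    return total / n

random.seed(1)
# Viral load monitoring detects virological failure reliably at each visit;
# CD4 monitoring only catches failures that have already become immunological.
print("12-monthly viral load:", round(mean_months_on_failing_art(12, 0.95), 1), "months")
print("6-monthly CD4        :", round(mean_months_on_failing_art(6, 0.20), 1), "months")
```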