44 results for intravenous drug users
Abstract:
BACKGROUND/AIMS: While several risk factors for the histological progression of chronic hepatitis C have been identified, the contribution of HCV genotypes to liver fibrosis evolution remains controversial. The aim of this study was to assess independent predictors for fibrosis progression. METHODS: We identified 1189 patients from the Swiss Hepatitis C Cohort database with at least one biopsy prior to antiviral treatment and an assessable date of infection. The stage-constant fibrosis progression rate was assessed using the ratio of fibrosis Metavir score to duration of infection. Stage-specific fibrosis progression rates were obtained using a Markov model. Risk factors were assessed by univariate and multivariate regression models. RESULTS: Independent risk factors for accelerated stage-constant fibrosis progression (>0.083 fibrosis units/year) included male sex (OR=1.60, [95% CI 1.21-2.12], P<0.001), age at infection (OR=1.08, [1.06-1.09], P<0.001), histological activity (OR=2.03, [1.54-2.68], P<0.001) and genotype 3 (OR=1.89, [1.37-2.61], P<0.001). Slower progression rates were observed in patients infected by blood transfusion (P=0.02) and invasive procedures or needle stick (P=0.03), compared to those infected by intravenous drug use. Maximum likelihood estimates (95% CI) of stage-specific progression rates (fibrosis units/year) for genotype 3 versus the other genotypes were: F0→F1: 0.126 (0.106-0.145) versus 0.091 (0.083-0.100), F1→F2: 0.099 (0.080-0.117) versus 0.065 (0.058-0.073), F2→F3: 0.077 (0.058-0.096) versus 0.068 (0.057-0.080) and F3→F4: 0.171 (0.106-0.236) versus 0.112 (0.083-0.142); overall P<0.001. CONCLUSIONS: This study shows a significant association of genotype 3 with accelerated fibrosis using both stage-constant and stage-specific estimates of fibrosis progression rates. This observation may have important consequences for the management of patients infected with this genotype.
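The stage-constant estimate above is a simple ratio, so it is easy to reproduce. A minimal Python sketch follows; the >0.083 fibrosis units/year cut-off is taken from the abstract, while the example patient values are hypothetical:

```python
# Minimal sketch of the stage-constant progression rate described above:
# Metavir fibrosis stage divided by estimated duration of infection.
# The 0.083 units/year cut-off is from the abstract; the patient is invented.
ACCELERATED_CUTOFF = 0.083  # fibrosis units/year

def stage_constant_rate(metavir_stage: int, years_infected: float) -> float:
    """Return fibrosis units per year for a Metavir stage (0-4)."""
    if not 0 <= metavir_stage <= 4:
        raise ValueError("Metavir stage must be 0-4")
    if years_infected <= 0:
        raise ValueError("duration of infection must be positive")
    return metavir_stage / years_infected

rate = stage_constant_rate(metavir_stage=2, years_infected=15.0)
print(f"{rate:.3f} fibrosis units/year")  # 0.133
print("accelerated" if rate > ACCELERATED_CUTOFF else "not accelerated")
```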
Abstract:
PURPOSE: The purpose of this study was to review the Chinese-language medical and dental literature from 1982 to 2008 on oral manifestations (OMs) in patients with HIV/AIDS, in order to characterize the spectrum of OMs among such patients in China. MATERIALS AND METHODS: All data were extracted from 18 references that applied diagnostic criteria for HIV/AIDS. Four of the references used the EC-Clearinghouse classification for oral lesions in HIV infection. Where feasible, the overall rate and 95% confidence interval (95% CI) for each OM were calculated. RESULTS: Risk group analysis revealed that, of 203 patients, 64.3% were men and 35.7% were women (age range, 5 months to 64 years; mean age in three studies, 34.0, 34.3, and 36.1 years). Of these patients, 22.2% were infected through sexual contact, 11.8% through intravenous drug use (IDU), 59.6% through blood or blood products, and 2.9% through mother-to-child transmission; in 3.4% the route was unclear. Among the 203 patients, oral candidiasis (OC) was the most common lesion (66%, 95% CI = 59.48-72.52%), followed by herpes simplex (HS) (22.2%, 95% CI = 16.48-27.92%), ulcerative stomatitis (14.8%, 95% CI = 9.92-19.68%), salivary gland disease (11.3%, 95% CI = 6.94-15.66%), oral hairy leukoplakia (OHL) (9.8%, 95% CI = 5.71-13.89%), necrotizing gingivitis (5.9%, 95% CI = 2.66-9.14%), Kaposi's sarcoma (2.9%, 95% CI = 0.59-5.21%), other malignant tumors (2.9%, 95% CI = 0.59-5.21%), and linear gingival erythema (2.0%, 95% CI = 0.07-3.93%). CONCLUSIONS: The spectrum of OMs reported from China is similar to that described in the international literature. These data usefully supplement international resources for HIV/AIDS research.
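The abstract does not state how the confidence intervals were computed, but the reported figures are consistent with a simple Wald interval for a proportion, p ± 1.96·sqrt(p(1−p)/n). A sketch that reproduces the oral candidiasis interval (66% of n = 203) under that assumption:

```python
# Sketch assuming a Wald (normal-approximation) interval; the abstract does
# not specify the method, so this is a plausibility check, not the authors' code.
import math

def wald_ci(p: float, n: int, z: float = 1.96) -> tuple[float, float]:
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p - half, p + half

lo, hi = wald_ci(p=0.66, n=203)
print(f"{lo:.2%} - {hi:.2%}")  # 59.48% - 72.52%, matching the reported 95% CI
```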
Abstract:
BACKGROUND AND OBJECTIVES: Tuberculosis (TB) is a leading cause of death in HIV-infected patients worldwide. We aimed to study clinical characteristics and outcome of 1075 consecutive patients diagnosed with HIV/TB from 2004 to 2006 in Europe and Argentina. METHODS: One-year mortality was assessed in patients stratified according to region of residence, and factors associated with death were evaluated in multivariable Cox models. RESULTS: At TB diagnosis, patients in Eastern Europe had less advanced immunodeficiency, whereas a greater proportion had a history of intravenous drug use, coinfection with hepatitis C, disseminated TB, and infection with drug-resistant TB (P < 0.0001). In Eastern Europe, fewer patients initiated TB treatment containing at least rifamycin, isoniazid, and pyrazinamide or combination antiretroviral therapy (P < 0.0001). Mortality at 1 year was 27% in Eastern Europe, compared with 7, 9 and 11% in Central/Northern Europe, Southern Europe, and Argentina, respectively (P < 0.0001). In a multivariable model, the adjusted relative hazard of death was significantly lower in each of the other regions compared with Eastern Europe: 0.34 (95% confidence interval 0.17-0.65), 0.28 (0.14-0.57), 0.34 (0.15-0.77) in Argentina, Southern Europe and Central/Northern Europe, respectively. Factors significantly associated with increased mortality were CD4 cell count less than 200 cells/µl [2.31 (1.56-3.45)], prior AIDS [1.74 (1.22-2.47)], disseminated TB [2.00 (1.38-2.85)], initiation of TB treatment not including rifamycin, isoniazid, and pyrazinamide [1.68 (1.20-2.36)], and rifamycin resistance [2.10 (1.29-3.41)]. Adjusting for these known confounders did not explain the increased mortality seen in Eastern Europe. CONCLUSION: The poor outcome of patients with HIV/TB in Eastern Europe deserves further study and urgent public health attention.
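The bracketed hazard ratios above come from multivariable Cox models. As a hedged illustration of that kind of analysis, here is a self-contained lifelines sketch on simulated data; only the cohort size (n = 1075) and the approximate effect sizes are taken from the abstract, and all variable names, prevalences and event rates are invented:

```python
# Sketch of a multivariable Cox model as described above, fit on simulated
# data with lifelines; covariate names, prevalences and the baseline hazard
# are invented, and the log-hazard coefficients are set near the reported
# hazard ratios (2.31, 2.00, 1.74) purely for illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1075  # cohort size from the abstract; everything else is simulated
df = pd.DataFrame({
    "cd4_below_200": rng.binomial(1, 0.5, n),
    "disseminated_tb": rng.binomial(1, 0.3, n),
    "prior_aids": rng.binomial(1, 0.3, n),
})
hazard = 0.02 * np.exp(0.84 * df.cd4_below_200
                       + 0.69 * df.disseminated_tb
                       + 0.55 * df.prior_aids)
t = rng.exponential(1.0 / hazard)            # latent survival times (months)
df["death"] = (t <= 12.0).astype(int)        # administrative censoring at 1 year
df["months"] = np.minimum(t, 12.0)

cph = CoxPHFitter().fit(df, duration_col="months", event_col="death")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```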
Abstract:
Switzerland has a complex human immunodeficiency virus (HIV) epidemic involving several populations. We examined transmission of HIV type 1 (HIV-1) in a national cohort study. Latent class analysis was used to identify socioeconomic and behavioral groups among 6,027 patients enrolled in the Swiss HIV Cohort Study between 2000 and 2011. Phylogenetic analysis of sequence data, available for 4,013 patients, was used to identify transmission clusters. Concordance between sociobehavioral groups and transmission clusters was assessed in correlation and multiple correspondence analyses. A total of 2,696 patients were infected with subtype B, 203 with subtype C, 196 with subtype A, and 733 with recombinant subtypes (mainly CRF02_AG and CRF01_AE). Latent class analysis identified 8 patient groups. Most transmission clusters of subtype B were shared between groups of gay men (groups 1-3) or between the heterosexual groups "heterosexual people of lower socioeconomic position" (group 4) and "injection drug users" (group 8). Clusters linking homosexual and heterosexual groups were associated with "older heterosexual and gay people on welfare" (group 5). "Migrant women in heterosexual partnerships" (group 6) and "heterosexual migrants on welfare" (group 7) shared non-B clusters with groups 4 and 5. Combining approaches from social and molecular epidemiology can provide insights into HIV-1 transmission and inform the design of prevention strategies.
Abstract:
Psychological and social factors have a profound impact on the treatment of HIV infection, from the readiness to start antiretroviral therapy to treatment adherence over time. Among psychological factors, anxiety may affect HIV-infected persons at all stages of disease, from the disclosure of the HIV diagnosis to the decision to start and maintain treatment; this is a lifelong challenge for both patients and doctors. Psychiatric comorbidities (depression, addiction) may amplify the negative psychological effects of HIV. Among social factors, stigma and discrimination may occur in families and at work, leading to a loss of social support that results in isolation and poverty and may prevent HIV-positive individuals from seeking medical care. These aspects are particularly important in some groups of patients, such as injecting drug users and migrants. Acknowledgment and consideration of psychosocial factors are therefore essential for the long-term success of antiretroviral therapy.
Abstract:
INTRODUCTION: HIV-infected pregnant women are very likely to engage in HIV medical care to prevent transmission of HIV to their newborn. After delivery, however, childcare and competing commitments might lead to disengagement from HIV care. The aim of this study was to quantify loss to follow-up (LTFU) from HIV care after delivery and to identify risk factors for LTFU. METHODS: We used data on 719 pregnancies within the Swiss HIV Cohort Study from 1996 to 2012 for which information on follow-up visits was available. Two LTFU events were defined: no clinical visit for >180 days and no visit for >360 days in the year after delivery. Logistic regression analysis was used to identify risk factors for an LTFU event after delivery. RESULTS: Median maternal age at delivery was 32 years (IQR 28-36); 357 (49%) women were black, 280 (39%) white, 56 (8%) Asian and 4% of other ethnicities. One hundred and seven (15%) women reported a history of intravenous drug use (IDU). The majority (524, 73%) of women had received their HIV diagnosis before pregnancy; most of those (413, 79%) had lived with diagnosed HIV for longer than three years and two-thirds (342, 65%) were already on antiretroviral therapy (ART) at the time of conception. Of the 181 women diagnosed during pregnancy by a screening test, 80 (44%) were diagnosed in the first trimester, 67 (37%) in the second and 34 (19%) in the third trimester. Of the 357 (69%) women who had been seen in HIV medical care during the three months before conception, 93% achieved an undetectable HIV viral load (VL) at delivery; of the 62 (12%) women whose last medical visit was more than six months before conception, only 72% achieved an undetectable VL (p=0.001). Overall, 247 (34%) women were LTFU for >180 days in the year after delivery and 86 (12%) were LTFU for >360 days, with 43 (50%) of the latter subsequently returning to care. Being LTFU for 180 days was significantly associated with a history of IDU (aOR 1.73, 95% CI 1.09-2.77, p=0.021) and not achieving an undetectable VL at delivery (aOR 1.79, 95% CI 1.03-3.11, p=0.040), after adjusting for maternal age, ethnicity, time of HIV diagnosis and being on ART at conception. CONCLUSIONS: Women with a history of IDU and women with a detectable VL at delivery were more likely to be LTFU after delivery. This is of concern both for their own health and because of the risk for sexual partners and subsequent pregnancies. Further strategies should be developed to enhance retention in medical care beyond pregnancy.
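The adjusted odds ratios above come from a logistic regression. A minimal, self-contained statsmodels sketch of that approach on simulated data follows; only the cohort size (n = 719) and the approximate effect sizes are taken from the abstract, and all column names and rates are invented:

```python
# Sketch of the logistic-regression analysis described above, on simulated
# data; n = 719 is from the abstract, and the coefficients are set near the
# reported aORs (log 1.73 = 0.55, log 1.79 = 0.58) purely for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 719
df = pd.DataFrame({
    "idu_history": rng.binomial(1, 0.15, n),   # ~15% reported an IDU history
    "detectable_vl": rng.binomial(1, 0.08, n), # invented prevalence
    "age": rng.normal(32, 5, n),               # median maternal age ~32 years
})
logit_p = -0.9 + 0.55 * df.idu_history + 0.58 * df.detectable_vl
df["ltfu_180"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

fit = smf.logit("ltfu_180 ~ idu_history + detectable_vl + age", data=df).fit(disp=0)
print(np.exp(fit.params))      # adjusted odds ratios (cf. aOR 1.73 and 1.79 above)
print(np.exp(fit.conf_int()))  # their 95% confidence intervals
```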
Abstract:
OBJECTIVES: Respondent-driven sampling (RDS) is a relatively new data collection methodology used to estimate the characteristics of hard-to-reach groups, such as HIV prevalence among drug users. Many national public health systems and international organizations rely on RDS data. However, RDS reporting quality and the available reporting guidelines are inadequate. We carried out a systematic review of RDS studies and present Strengthening the Reporting of Observational Studies in Epidemiology for RDS Studies (STROBE-RDS), a checklist of essential items to report in RDS publications, justified by an explanation and elaboration document. STUDY DESIGN AND SETTING: We searched the MEDLINE (1970-2013), EMBASE (1974-2013), and Global Health (1910-2013) databases to assess the number and geographical distribution of published RDS studies. STROBE-RDS was developed based on the STROBE guidelines, following the Guidance for Developers of Health Research Reporting Guidelines. RESULTS: RDS has been used in over 460 studies from 69 countries, including the USA (151 studies), China (70), and India (32). STROBE-RDS includes modifications to 12 of the 22 items on the STROBE checklist. The two key areas that required modification concerned the selection of participants and the statistical analysis of the sample. CONCLUSION: STROBE-RDS seeks to enhance the transparency and utility of research using RDS. If widely adopted, STROBE-RDS should improve public health decision making around global infectious diseases.
Abstract:
Ondansetron and tramadol have been suggested to exert opposing effects on the serotonin pathway, possibly increasing tramadol consumption and emesis when the two drugs are co-administered. In a randomized, double-blinded study, 179 patients received intravenous ondansetron, metoclopramide, or placebo for emesis prophylaxis. The analgesic regimen consisted of an intraoperative tramadol loading dose and subsequent patient-controlled analgesia. Tramadol consumption and response to antiemetic treatment were compared. Additionally, plasma concentrations of ondansetron and (+)O-demethyltramadol and CYP2D6 genetic variants were analyzed as possible confounders influencing analgesic and antiemetic efficacy. Tramadol consumption did not differ between the groups. The response rate to antiemetic prophylaxis was superior in patients receiving ondansetron (85.0%) compared with placebo (66.7%, P = .046), with no significant difference from metoclopramide (69.5%). Less vomiting was reported in the immediate postoperative hours in the active-treatment groups (ondansetron 5.0%, metoclopramide 5.1%) compared with placebo (18.6%; P = .01). Whereas plasma concentrations of (+)O-demethyltramadol were significantly correlated with CYP2D6 genotype, no such influence was detected for ondansetron. Co-administration of ondansetron increased neither tramadol consumption nor the frequency of postoperative nausea and vomiting (PONV) in this postoperative setting. PERSPECTIVE: Controversial findings have been reported for the efficacy of tramadol and ondansetron when co-administered, owing to their opposing serotonergic effects. Co-medication of these drugs increased neither postoperative analgesic consumption nor the frequency of emesis in this study enrolling patients recovering from major surgery.
Abstract:
Differentiation between external contamination and incorporation of drugs or their metabolites from inside the body via blood, sweat or sebum is a general issue in hair analysis and of high concern when interpreting analytical results. In hair analysis for cannabinoids the most common target is Δ9-tetrahydrocannabinol (THC); cannabidiol (CBD) and cannabinol (CBN) are sometimes determined additionally. After repeated external contamination by cannabis smoke these analytes are known to be found in hair even after multiple washing steps. A widely accepted strategy to unequivocally prove active cannabis consumption is the analysis of hair extracts for the oxidative metabolite 11-nor-9-carboxy-THC (THC-COOH). Although the acidic nature of this metabolite suggests a lower rate of incorporation into the hair matrix compared to THC, it is still not fully understood why hair concentrations of THC-COOH are generally much lower (mostly <10 pg/mg) than the corresponding THC concentrations. Δ9-Tetrahydrocannabinolic acid A (THCA A) is the preliminary end product of THC biosynthesis in the cannabis plant. Unlike THC it is non-psychoactive and can be regarded as a 'precursor' of THC, being largely decarboxylated when heated or smoked. The presented work shows for the first time that THCA A is detectable not only in blood and urine of cannabis consumers but also in THC-positive hair samples. A pilot experiment performed within this study showed that after regular oral intake of THCA A no relevant incorporation into hair occurred. It can be concluded that THCA A in hair derives almost exclusively from external contamination, e.g. by sidestream smoke. Elevated temperatures during the analytical procedure, particularly under alkaline conditions, can lead to decarboxylation of THCA A and accordingly increase THC concentrations in hair. Additionally, it has to be kept in mind that in hair samples tested positive for THCA A at least part of the 'non-artefact' THC probably derives from external contamination as well, because both THC and THCA A are present in relevant amounts in the condensate of cannabis smoke. External contamination by sidestream smoke could therefore explain the great differences between THC and THC-COOH hair concentrations commonly found in cannabis users.
Abstract:
The activation of 5-hydroxytryptamine-3 (5-HT-3) receptors in the spinal cord can enhance intrinsic spinal mechanisms of central hypersensitivity, possibly leading to exaggerated pain responses. Clinical studies suggest that 5-HT-3 receptor antagonists may have an analgesic effect. This randomized, double-blind, placebo-controlled crossover study tested the hypothesis that the 5-HT-3 receptor antagonist tropisetron attenuates pain and central hypersensitivity in patients with chronic low back pain. Thirty patients with chronic low back pain, 15 women (aged 53 ± 14 years) and 15 men (aged 48 ± 14 years), were studied. A single intravenous injection of 0.9% saline solution, tropisetron 2 mg, or tropisetron 5 mg was administered in 3 different sessions, in a double-blind crossover manner. The main outcome was the visual analogue scale (VAS) score of spontaneous low back pain before, and 15, 30, 60, and 90 minutes after, drug administration. Secondary outcomes were nociceptive withdrawal reflexes to single and repeated electrical stimulation, area of reflex receptive fields, pressure pain detection and tolerance thresholds, conditioned pain modulation, and area of clinical pain. The data were analyzed by analysis of variance and panel multiple regressions. All 3 treatments reduced VAS scores; however, there was no statistically significant difference between tropisetron and placebo in VAS scores. Compared to placebo, tropisetron produced a statistically significant increase in the pain threshold after single electrical stimulation, but no difference was found in any other secondary outcome. A single-dose intravenous administration of tropisetron in patients with chronic low back pain had no significant specific effect on pain intensity or on most parameters of central hypersensitivity.
Abstract:
The ability of anesthetic agents to provide adequate analgesia and sedation is limited by the ventilatory depression associated with overdosing in spontaneously breathing patients. Quantitation of drug-induced ventilatory depression is therefore a pharmacokinetic-pharmacodynamic problem relevant to the practice of anesthesia. Although several studies describe the effect of respiratory depressant drugs on isolated endpoints, an integrated description of drug-induced respiratory depression with parameters identifiable from clinically available data is not available. This study proposes a physiological model of CO2 disposition, ventilatory regulation, and the effects of anesthetic agents on the control of breathing. The predictive performance of the model is evaluated through simulations aimed at reproducing experimental observations of drug-induced hypercarbia and hypoventilation associated with intravenous administration of a fast-onset, highly potent anesthetic mu-agonist (including previously unpublished experimental data obtained after administration of a 1 mg alfentanil bolus). The proposed model structure has substantial descriptive capability and can provide clinically relevant predictions of respiratory inhibition in the non-steady state, enhancing the safety of drug delivery in anesthetic practice.
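The abstract gives no equations, but the class of model it describes, a CO2 mass balance closed by a chemoreflex ventilation controller with the drug depressing ventilatory drive, can be sketched in a few lines. Everything below (the structure and all parameter values) is an illustrative toy, not the published model:

```python
# Toy sketch of the model class described above: CO2 stores, a power-law
# chemoreflex controller, and an Emax drug effect on ventilatory drive.
# All equations and parameter values are illustrative, not the published model.
from scipy.integrate import solve_ivp

PACO2_0, VE_0 = 40.0, 6.0   # baseline PaCO2 (mmHg) and minute ventilation (L/min)
GAIN = 4.0                  # power-law gain of the CO2 response
TAU = 3.0                   # effective CO2-store time constant (min)
C50 = 50.0                  # concentration halving ventilatory drive (ng/mL)
KE = 0.2                    # drug elimination rate constant (1/min)

def rhs(t, y):
    paco2, c = y
    drive = 1.0 - c / (C50 + c)                    # Emax opioid effect on drive
    ve = VE_0 * drive * (paco2 / PACO2_0) ** GAIN  # chemoreflex controller
    dpaco2 = (VE_0 * PACO2_0 / ve - paco2) / TAU   # CO2 balance (toy form)
    return [dpaco2, -KE * c]

# IV bolus giving an initial drug concentration of 100 ng/mL (illustrative)
sol = solve_ivp(rhs, (0.0, 60.0), [PACO2_0, 100.0], max_step=0.5)
print(f"peak PaCO2 ~ {sol.y[0].max():.1f} mmHg")   # drug-induced hypercarbia
```

With these toy numbers PaCO2 rises toward roughly 50 mmHg and then recovers as the drug is eliminated, reproducing the hypercarbia-then-recovery pattern the abstract describes qualitatively.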
Abstract:
AIMS: In the Swiss heroin substitution trials, patients are treated with self-administered diacetylmorphine (heroin). Intravenous administration is not possible in patients with venosclerosis. Earlier studies have demonstrated that oral diacetylmorphine may be used, although it is completely converted to morphine presystemically. Morphine bioavailability after high-dose oral diacetylmorphine is considerably higher than would be predicted from low-dose trials. The aim was to investigate whether the unexpectedly high bioavailability is due to a difference in the drug examined, and whether it depends on previous exposure or on dose. METHODS: Opioid-naive healthy volunteers and dependent patients from the Swiss heroin trials (n = 8 per group) received low doses of intravenous and oral deuterium-labelled morphine and diacetylmorphine, respectively. Patients also received a high oral diacetylmorphine dose. RESULTS: The maximum plasma concentration (Cmax) of morphine was twofold higher after oral diacetylmorphine than after morphine administration in both groups. However, morphine bioavailability was considerably higher in chronic users [diacetylmorphine 45.6% (95% confidence interval 40.0, 51.3), morphine 37.2% (30.1, 44.3)] than in naive subjects [diacetylmorphine 22.9% (16.4, 29.4), morphine 23.9% (16.5, 31.2)] after low oral doses (48.5 µmol) of either diacetylmorphine or morphine. Morphine clearance was similar in both groups. Moreover, oral absorption of morphine from diacetylmorphine was found to be dose dependent, with bioavailability reaching 64.2% (55.3, 73.1) for high diacetylmorphine doses (1601 µmol). CONCLUSIONS: Oral absorption of opioids is substance-, dose- and patient-collective-dependent, suggesting a saturation of first-pass processes whose exact mechanism is not yet understood.
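The bioavailability figures above follow from the standard dose-normalised AUC comparison, F = (AUC_oral/D_oral)/(AUC_iv/D_iv). A small sketch; the 48.5 µmol dose is from the abstract, while the AUC values are invented and chosen to reproduce the 22.9% naive-subject estimate:

```python
# Absolute bioavailability as used in such crossover studies:
# F = (AUC_oral / dose_oral) / (AUC_iv / dose_iv).
# The 48.5 umol dose is from the abstract; the AUC values are invented.
def bioavailability(auc_oral: float, dose_oral: float,
                    auc_iv: float, dose_iv: float) -> float:
    return (auc_oral / dose_oral) / (auc_iv / dose_iv)

f = bioavailability(auc_oral=229.0, dose_oral=48.5,
                    auc_iv=1000.0, dose_iv=48.5)
print(f"F = {f:.1%}")  # 22.9%, matching the naive-subject estimate above
```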
Abstract:
Twenty-three hours after heart transplantation, life-threatening acute right heart failure was diagnosed in a patient requiring continuous venovenous hemodiafiltration (CVVHDF). Increasing doses of catecholamines, sedatives, and muscle relaxants administered through a central venous catheter were ineffective. However, a bolus of epinephrine injected through an alternative catheter provoked a hypertensive crisis. Thus, interference with the central venous infusion by the dialysis catheter was suspected. The catheters were changed, and hemodynamics stabilized at lower catecholamine doses. When the effects of IV drugs are inadequate in patients receiving CVVHDF, interference with adjacent catheters resulting in elimination of the drug by CVVHDF should be suspected.
Abstract:
BACKGROUND: Intravenous anaesthetic drugs are the primary means of producing general anaesthesia in equine practice. The ideal drug for intravenous anaesthesia has high reliability and pharmacokinetic properties indicating short elimination and lack of accumulation when administered for prolonged periods. Induction of general anaesthesia with racemic ketamine preceded by profound sedation already has an established place in equine field anaesthesia. Owing to potential advantages over racemic ketamine, S-ketamine has been employed in horses to induce general anaesthesia, but its optimal dose remains under investigation. The objective of this study was to evaluate whether 2.5 mg/kg S-ketamine could be used as a single intravenous bolus to provide short-term surgical anaesthesia in colts undergoing surgical castration, and to report its pharmacokinetic profile. RESULTS: After premedication with romifidine and L-methadone, the combination of S-ketamine and diazepam allowed surgical anaesthesia to be reached in all 28 colts. Induction of anaesthesia and recovery were good to excellent in the majority of colts (n = 22 and 24, respectively). Seven horses required additional administration of S-ketamine to prolong the duration of surgical anaesthesia; redosing did not compromise recovery quality. The plasma concentration of S-ketamine decreased rapidly after administration, following a two-compartmental model, suggesting considerable elimination of the unchanged parent compound into the urine in addition to its conversion to S-norketamine. The observed plasma concentrations of S-ketamine at the time of first movement varied widely and did not support a clear cut-off value for predicting the termination of the drug effect. CONCLUSIONS: The administration of 2.5 mg/kg IV S-ketamine after adequate premedication provided good quality of induction and recovery and a duration of action similar to that reported for racemic ketamine at a dose of 2.2 mg/kg. Until further data are available, close monitoring to adapt drug delivery is mandatory, particularly once the first 10 minutes after injection have elapsed. Given the rapid elimination of S-ketamine, significant inter-individual variability, and rapid loss of effect over a narrow range of concentrations, a sudden return of consciousness must be anticipated.
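The two-compartmental model mentioned above implies a biexponential plasma-concentration curve after an IV bolus. A short sketch of what that disposition function looks like; the macro-constants are illustrative placeholders, not the fitted equine S-ketamine parameters:

```python
# Biexponential (two-compartment) IV-bolus disposition:
# C(t) = A*exp(-alpha*t) + B*exp(-beta*t).
# A, B, alpha and beta are illustrative placeholders, not the fitted values.
import numpy as np

A, ALPHA = 4.0, 0.30   # fast distribution phase (ug/mL, 1/min), illustrative
B, BETA = 1.0, 0.03    # slower elimination phase (ug/mL, 1/min), illustrative

def conc(t_min):
    """Plasma concentration at t_min minutes after the bolus."""
    return A * np.exp(-ALPHA * t_min) + B * np.exp(-BETA * t_min)

for t in (0, 5, 10, 20, 40, 60):
    print(f"t = {t:2d} min   C = {conc(t):.2f} ug/mL")
```

The steep early drop of such a curve is what makes the clinical picture in the abstract plausible: concentrations fall quickly through the narrow effective range, so a sudden return of consciousness has to be anticipated.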