166 results for Krohn, Pietro
Abstract:
BACKGROUND: Little is known about time trends, predictors, and consequences of changes made to antiretroviral therapy (ART) regimens early after patients initially start treatment. METHODS: We compared the incidence of, reasons for, and predictors of treatment change within 1 year after starting combination ART (cART), as well as virological and immunological outcomes at 1 year, among 1866 patients from the Swiss HIV Cohort Study who initiated cART during 2000–2001, 2002–2003, or 2004–2005. RESULTS: The durability of initial regimens did not improve over time (P = .15): 48.8% of 625 patients during 2000–2001, 43.8% of 607 during 2002–2003, and 44.3% of 634 during 2004–2005 changed cART within 1 year; reasons for change included intolerance (51.1% of all patients), patient wish (15.4%), physician decision (14.8%), and virological failure (7.1%). An increased probability of treatment change was associated with higher CD4+ cell counts, higher human immunodeficiency virus type 1 (HIV-1) RNA loads, and receipt of regimens that contained stavudine or indinavir/ritonavir, whereas a decreased probability was associated with receipt of regimens that contained tenofovir. Treatment discontinuation was associated with higher CD4+ cell counts, current use of injection drugs, and receipt of regimens that contained nevirapine. One-year outcomes improved between 2000–2001 and 2004–2005: 84.5% and 92.7% of patients, respectively, reached HIV-1 RNA loads of <50 copies/mL, and median increases in CD4+ cell counts were 157.5 and 197.5 cells/µL, respectively (P < .001 for all comparisons). CONCLUSIONS: Virological and immunological outcomes of initial treatments improved between 2000–2001 and 2004–2005, despite uniformly high rates of early treatment change across the 3 study intervals.
Abstract:
BACKGROUND AND PURPOSE: To test the hypothesis that the National Institutes of Health Stroke Scale (NIHSS) score is associated with the findings of arteriography performed within the first hours after ischemic stroke. METHODS: We analyzed NIHSS scores on hospital admission and the clinical and arteriographic findings of 226 consecutive patients (94 women, 132 men; mean age 62±12 years) who underwent arteriography within 6 hours of symptom onset in carotid stroke and within 12 hours in vertebrobasilar stroke. RESULTS: A mean of 155±97 minutes elapsed from stroke onset to hospital admission, and 245±100 minutes from stroke onset to arteriography. The median NIHSS score was 14 (range 3 to 38), and scores differed depending on the arteriographic findings (P<0.001). NIHSS scores in basilar, internal carotid, and middle cerebral artery M1 and M2 segment occlusions (central occlusions) were higher than in more peripherally located, nonvisible, or absent occlusions. In patients with NIHSS scores ≥10, the positive predictive value (PPV) for an arterial occlusion was 97% in carotid and 96% in vertebrobasilar strokes. With an NIHSS score ≥12, the PPV for a central occlusion was 91%. In a multivariate analysis, the NIHSS subitems "level of consciousness questions," "gaze," "motor leg," and "neglect" were predictors of central occlusions. CONCLUSIONS: There is a significant association between NIHSS scores and the presence and location of a vessel occlusion. With an NIHSS score ≥10, a vessel occlusion will likely be seen on arteriography, and with a score ≥12, its location will probably be central.
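For readers unfamiliar with the metric, the short Python sketch below shows how a positive predictive value like the ones reported above is derived from a dichotomized NIHSS score and the arteriographic finding; the counts are hypothetical placeholders, not the study data.

def positive_predictive_value(true_positives: int, false_positives: int) -> float:
    """PPV = TP / (TP + FP): the proportion of test-positive patients
    (here, NIHSS >= threshold) who actually have an arterial occlusion."""
    return true_positives / (true_positives + false_positives)

# Hypothetical 2x2 counts for NIHSS >= 10 versus occlusion (illustration only)
tp = 97  # NIHSS >= 10 and occlusion seen on arteriography
fp = 3   # NIHSS >= 10 but no occlusion seen
print(f"PPV = {positive_predictive_value(tp, fp):.2f}")  # -> PPV = 0.97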
Abstract:
BACKGROUND: Endometrial stromal sarcoma (ESS) represents 0.2% of all uterine malignancies. Based on mitotic activity, a distinction is made between low-grade and high-grade ESS. Although the overall five-year survival rate for low-grade ESS exceeds 80%, about 50% of patients show tumor recurrence, mostly after a long latency period. Tumor invasion of the great vessels is extremely rare. We describe a patient with advanced low-grade ESS with tumor invasion of the infrarenal aorta and the inferior vena cava. The patient presented with a large tumor thrombus extending from the inferior vena cava into the right atrium. METHODS: Review of the literature and identification of 19 patients, including our own case, with advanced low-grade ESS with invasion of the great vessels and formation of an inferior vena cava tumor thrombus. RESULTS: All 19 patients presented with an abdominal tumor mass and a tumor thrombus protruding into the inferior vena cava. The tumor thrombus extended into the right heart cavities in nine patients, reaching the right atrium in four, the right ventricle in three, and the pulmonary artery in two patients. There were 5 patients with an advanced primary tumor and 14 patients with an advanced recurrent tumor. Seven patients presented with synchronous metastatic disease and six patients with a pelvic tumor infiltrating the bladder, the rectosigmoid colon, or the infrarenal aorta. Mean age at surgery was 45.9±12.3 years (median 47, range 25-65 years). Tumor thrombectomy was accomplished by cavotomy or by right atriotomy after installation of a cardiopulmonary bypass. There was no perioperative mortality, and morbidity was very low. Radical tumor resections were achieved in 10 patients. The follow-up for these 10 patients was 2±1.3 years (median 2, range 0.3-4.5 years). Nine patients remained recurrence-free, whereas one patient suffered an asymptomatic local recurrence. CONCLUSIONS: Low-grade ESS is a rare angioinvasive tumor with a high recurrence rate. Resection of an inferior vena cava tumor thrombus, even with extension into the right heart cavities, can be performed safely. Extensive radical surgery is therefore justified in the treatment of advanced manifestations of low-grade ESS and may improve recurrence-free survival.
Abstract:
BACKGROUND: Tenofovir (TDF) use has been associated with proximal renal tubulopathy, reduced calculated glomerular filtration rates (cGFR) and losses in bone mineral density. Bone resorption could result in a compensatory osteoblast activation indicated by an increase in serum alkaline phosphatase (sAP). A few small studies have reported a positive correlation between renal phosphate losses, increased bone turnover and sAP. METHODS: We analysed sAP dynamics in patients initiating (n = 657), reinitiating (n = 361) and discontinuing (n = 73) combined antiretroviral therapy with and without TDF and assessed correlations with clinical and epidemiological parameters. RESULTS: TDF use was associated with a significant increase of sAP from a median of 74 U/l (interquartile range 60-98) to a plateau of 99 U/l (82-123) after 6 months (P < 0.0001), with a prompt return to baseline upon TDF discontinuation. No change occurred with TDF-sparing regimens. Univariable and multivariable linear regression analyses revealed a positive correlation between sAP and TDF use (P ≤ 0.003), but no correlation with baseline cGFR, TDF-related cGFR reduction, changes in serum alanine aminotransferase (sALT) or active hepatitis C. CONCLUSIONS: We document a highly significant association between TDF use and increased sAP in a large observational cohort. The lack of correlation between TDF use and sALT suggests that the increase in sAP is due to the bone isoenzyme and indicates stimulated bone turnover. Together with published data on TDF-related renal phosphate losses, this finding raises concerns that TDF use could result in osteomalacia with a loss in bone mineral density, at least in a subset of patients. This potentially severe long-term toxicity should be addressed in future studies.
Abstract:
BACKGROUND: In recent years, treatment options for human immunodeficiency virus type 1 (HIV-1) infection have changed from nonboosted protease inhibitors (PIs) to nonnucleoside reverse-transcriptase inhibitors (NNRTIs) and boosted PI-based antiretroviral drug regimens, but the impact on immunological recovery remains uncertain. METHODS: All patients in the Swiss HIV Cohort Study who received their first combination antiretroviral therapy (cART) during January 1996 through December 2004 and had known baseline CD4+ T cell counts and HIV-1 RNA values were included (n = 3293). For follow-up, we used the Swiss HIV Cohort Study database update of May 2007. The mean (±SD) duration of follow-up was 26.8 ± 20.5 months. The follow-up time was limited to the duration of the first cART. CD4+ T cell recovery was analyzed in 3 different treatment groups: nonboosted PI, NNRTI, or boosted PI. The end point was the absolute increase of the CD4+ T cell count in the 3 treatment groups after the initiation of cART. RESULTS: Two thousand five hundred ninety individuals (78.7%) initiated a nonboosted-PI regimen, 452 (13.7%) initiated an NNRTI regimen, and 251 (7.6%) initiated a boosted-PI regimen. Absolute CD4+ T cell count increases at 48 months were as follows: in the nonboosted-PI group, from 210 to 520 cells/µL; in the NNRTI group, from 220 to 475 cells/µL; and in the boosted-PI group, from 168 to 511 cells/µL. In a multivariate analysis, the treatment group did not affect the CD4+ T cell response; however, increased age, pretreatment with nucleoside reverse-transcriptase inhibitors, positive serological tests for hepatitis C virus, Centers for Disease Control and Prevention stage C infection, lower baseline CD4+ T cell count, and lower baseline HIV-1 RNA level were risk factors for smaller increases in CD4+ T cell count. CONCLUSION: CD4+ T cell recovery was similar in patients receiving nonboosted PI-, NNRTI-, and boosted PI-based cART.
Abstract:
BACKGROUND: The human immunodeficiency virus type 1 reverse-transcriptase mutation K65R is a single-point mutation that has become more frequent after increased use of tenofovir disoproxil fumarate (TDF). We aimed to identify predictors for the emergence of K65R, using clinical data and genotypic resistance tests from the Swiss HIV Cohort Study. METHODS: A total of 222 patients with genotypic resistance tests performed while receiving treatment with TDF-containing regimens were stratified by detectability of K65R (K65R group, 42 patients; undetected K65R group, 180 patients). Patient characteristics at the start of that treatment were analyzed. RESULTS: In an adjusted logistic regression, TDF treatment with nonnucleoside reverse-transcriptase inhibitors and/or didanosine was associated with the emergence of K65R, whereas the presence of any of the thymidine analogue mutations D67N, K70R, T215F, or K219E/Q was protective. The previously undescribed mutational pattern K65R/G190S/Y181C was observed in 6 of 21 patients treated with efavirenz and TDF. Salvage therapy after TDF treatment was started in 36 patients with K65R and in 118 patients from the undetected K65R (wild-type) group. Proportions of patients attaining HIV-1 RNA loads <50 copies/mL after 24 weeks of continuous treatment were similar in the K65R group (44.1%; 95% confidence interval, 27.2%-62.1%) and the wild-type group (51.9%; 95% confidence interval, 42.0%-61.6%). CONCLUSIONS: In settings where thymidine analogue mutations are less likely to be present, such as at the start of first-line therapy or after extended treatment interruptions, combinations of TDF with other K65R-inducing components or with efavirenz or nevirapine may carry an enhanced risk of the emergence of K65R. The finding of a distinct mutational pattern selected by treatment with TDF and efavirenz suggests a potential fitness interaction between K65R and nonnucleoside reverse-transcriptase inhibitor-induced mutations.
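As an illustration of the kind of adjusted logistic regression described above, the Python sketch below regresses K65R emergence on co-medication and thymidine analogue mutation indicators; the data file and column names are hypothetical placeholders and do not come from the cohort.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data set: one row per patient, 0/1 indicator columns
df = pd.read_csv("k65r_cohort.csv")  # placeholder file name

outcome = df["k65r_detected"]        # 1 if K65R emerged on TDF
predictors = sm.add_constant(df[[
    "nnrti_coadministered",          # TDF combined with an NNRTI
    "ddi_coadministered",            # TDF combined with didanosine
    "any_tam_present",               # D67N, K70R, T215F or K219E/Q present
]])

fit = sm.Logit(outcome, predictors).fit()
print(fit.summary())
print(np.exp(fit.params).round(2))   # adjusted odds ratios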
Abstract:
BACKGROUND: The aim of this study was to explore the predictive value of longitudinal self-reported adherence data on viral rebound. METHODS: Individuals in the Swiss HIV Cohort Study on combined antiretroviral therapy (cART) with RNA <50 copies/ml over the previous 3 months and who were interviewed about adherence at least once prior to 1 March 2007 were eligible. Adherence was defined in terms of missed doses of cART (0, 1, 2 or >2) in the previous 28 days. Viral rebound was defined as RNA >500 copies/ml. Cox regression models with time-independent and -dependent covariates were used to evaluate time to viral rebound. RESULTS: A total of 2,664 individuals and 15,530 visits were included. Across all visits, non-adherence was reported as follows: 1 missed dose, 14.7%; 2 missed doses, 5.1%; >2 missed doses, 3.8%; taking <95% of doses, 4.5%; and missing ≥2 consecutive doses, 3.2%. In total, 308 (11.6%) patients experienced viral rebound. After controlling for confounding variables, self-reported non-adherence remained significantly associated with the rate of occurrence of viral rebound (compared with zero missed doses: 1 dose, hazard ratio [HR] 1.03, 95% confidence interval [CI] 0.72-1.48; 2 doses, HR 2.17, 95% CI 1.46-3.25; >2 doses, HR 3.66, 95% CI 2.50-5.34). Several variables significantly associated with an increased risk of viral rebound irrespective of adherence were identified: being on a protease inhibitor or triple nucleoside regimen (compared with a non-nucleoside reverse transcriptase inhibitor), >5 previous cART regimens, seeing a less-experienced physician, taking co-medication, and a shorter time virally suppressed. CONCLUSIONS: A simple self-report adherence questionnaire repeatedly administered provides a sensitive measure of non-adherence that predicts viral rebound.
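A minimal sketch, assuming a long-format data set with one row per patient-interval, of a Cox model with time-dependent adherence covariates like the one described above. It uses the lifelines package; the file and column names are placeholders rather than the cohort's actual variables.

import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format data: one row per patient-interval with columns
# id, start, stop, rebound (1 if RNA >500 copies/ml in the interval), and
# time-updated covariates such as missed_1, missed_2, missed_gt2, pi_regimen,
# prior_regimens_gt5, comedication, months_suppressed.
intervals = pd.read_csv("adherence_intervals.csv")  # placeholder file name

ctv = CoxTimeVaryingFitter()
ctv.fit(intervals, id_col="id", start_col="start", stop_col="stop",
        event_col="rebound")
ctv.print_summary()  # hazard ratios, e.g. for 2 or >2 missed doses vs none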
Abstract:
BACKGROUND: The outcome of Kaposi sarcoma varies. While many patients do well on highly active antiretroviral therapy, others have progressive disease and need chemotherapy. In order to predict which patients are at risk of an unfavorable evolution, we established a prognostic score. METHOD: A survival analysis (Kaplan-Meier method; Cox proportional hazards models) was conducted in 144 patients with Kaposi sarcoma prospectively included in the Swiss HIV Cohort Study from January 1996 to December 2004. OUTCOME ANALYZED: use of chemotherapy or death. VARIABLES ANALYZED: demographics, tumor staging [T0 or T1 (16)], CD4 cell counts and HIV-1 RNA concentration, human herpesvirus 8 (HHV8) DNA in plasma, and serological titers to latent and lytic antigens. RESULTS: Of 144 patients, 54 needed chemotherapy or died. In the univariate analysis, tumor stage T1, CD4 cell count below 200 cells/µl, positive HHV8 DNA, and absence of antibodies against the HHV8 lytic antigen at the time of diagnosis were significantly associated with a bad outcome. In the multivariate analysis, the following variables were associated with an increased risk of an unfavorable outcome: T1 [hazard ratio (HR) 5.22; 95% confidence interval (CI) 2.97-9.18], CD4 cell count below 200 cells/µl (HR 2.33; 95% CI 1.22-4.45), and positive HHV8 DNA (HR 2.14; 95% CI 1.79-2.85). We created a score with these variables ranging from 0 to 4: T1 stage counted for two points, CD4 cell count below 200 cells/µl for one point, and positive HHV8 viral load for one point. Each point increase was associated with an HR of 2.26 (95% CI 1.79-2.85). CONCLUSION: In the multivariate analysis, staging (T1), CD4 cell count (<200 cells/µl), and positive HHV8 DNA in plasma at the time of diagnosis predict evolution towards death or the need for chemotherapy.
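The scoring rule reported above is simple enough to express directly. The Python sketch below only illustrates the published point weights and the reported hazard ratio per point, under the assumption of a constant HR per additional point; it is not a validated clinical calculator.

def kaposi_prognostic_score(t1_stage: bool, cd4_below_200: bool,
                            hhv8_dna_positive: bool) -> int:
    """0-4 point score: T1 stage = 2 points, CD4 <200 cells/µl = 1 point,
    detectable plasma HHV8 DNA = 1 point."""
    return 2 * int(t1_stage) + int(cd4_below_200) + int(hhv8_dna_positive)

def relative_hazard(score: int, hr_per_point: float = 2.26) -> float:
    """Hazard relative to a score of 0, assuming a constant HR per point."""
    return hr_per_point ** score

score = kaposi_prognostic_score(t1_stage=True, cd4_below_200=True,
                                hhv8_dna_positive=False)
print(score, round(relative_hazard(score), 1))  # 3 points, ~11.5-fold hazard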
Abstract:
OBJECTIVE: To determine the effects of cognitive-behavioral stress management (CBSM) training on clinical and psychosocial markers in HIV-infected persons. METHODS: A randomized controlled trial in four HIV outpatient clinics of 104 HIV-infected persons taking combination antiretroviral therapy (cART), measuring HIV-1 surrogate markers, adherence to therapy, and well-being 12 months after 12 group sessions of 2-h CBSM training. RESULTS: Intent-to-treat analyses showed no effects on HIV-1 surrogate markers in the CBSM group compared with the control group: HIV-1 RNA <50 copies/ml in 81.1% [95% confidence interval (CI), 68.0-90.6] and 74.5% (95% CI, 60.4-85.7), respectively (P = 0.34), and mean CD4 cell change from baseline of 53.0 cells/µl (95% CI, 4.1-101.8) and 15.5 cells/µl (95% CI, -34.3 to 65.4), respectively (P = 0.29). Self-reported adherence to therapy did not differ between groups at baseline (P = 0.53) or at 12 months post-intervention (P = 0.47). Significant benefits of CBSM over no intervention were observed in mean change of quality-of-life scores: physical health 2.9 (95% CI, 0.7-5.1) and -0.2 (95% CI, -2.1 to 1.8), respectively (P = 0.05); mental health 4.8 (95% CI, 1.8-7.3) and -0.5 (95% CI, -3.3 to 2.2), respectively (P = 0.02); anxiety -2.1 (95% CI, -3.6 to -1.0) and 0.3 (95% CI, -0.7 to 1.4), respectively (P = 0.002); and depression -2.1 (95% CI, -3.2 to -0.9) and 0.02 (95% CI, -1.0 to 1.1), respectively (P = 0.001). Alleviation of depression and anxiety symptoms was most pronounced among participants with high psychological distress at baseline. CONCLUSION: CBSM training of HIV-infected persons taking cART does not improve clinical outcome but has lasting effects on quality of life and psychological well-being.
Abstract:
INTRODUCTION: A multi-centre study was conducted during 2005 by means of a questionnaire posted on the Italian Society of Emergency Medicine (SIMEU) web page. Our intention was to carry out an organisational and functional analysis of Italian Emergency Departments (ED) in order to pick out some macro-indicators of the activities performed. Participation was good, in that 69 ED (3,285,440 admissions to emergency services) responded to the questionnaire. METHODS: The study was based on 18 questions: 3 regarding the personnel of the ED, 2 regarding organisational and functional aspects, 5 on the activity of the ED, 7 on triage and 1 on the assessment of the quality perceived by the users of the ED. RESULTS AND CONCLUSION: The replies revealed that 91.30% of the ED were equipped with data-processing software, which, in 96.83% of cases, tracked the entire itinerary of the patient. About 48,000 patients/year used each ED: 76.72% were discharged and 18.31% were hospitalised. Observation Units were active in 81.16% of the ED examined. Triage programmes were in place in 92.75% of ED: in 75.81% of these, triage was performed throughout the entire itinerary of the patient; in 16.13% it was performed only on a symptom basis, and in 8.06% only on call. Of the patients arriving at the ED, 24.19% were assigned a non-urgent triage code, 60.01% an urgent code, 14.30% an emergent code and 1.49% a life-threatening code. Waiting times were 52.39 min for non-urgent patients, 40.26 min for urgent, 12.08 min for emergent, and 1.19 min for life-threatening patients.
Abstract:
Multi-parametric and quantitative magnetic resonance imaging (MRI) techniques have come into the focus of interest, both as research and diagnostic modalities, for the evaluation of patients suffering from mild cognitive decline and overt dementia. In this study we address the question of whether disease-related quantitative magnetization transfer (qMT) effects within the intra- and extracellular matrices of the hippocampus may aid in the differentiation between clinically diagnosed patients with Alzheimer disease (AD), patients with mild cognitive impairment (MCI), and healthy controls. We evaluated 22 patients with AD (n=12) and MCI (n=10) and 22 healthy elderly (n=12) and younger (n=10) controls with multi-parametric MRI. Neuropsychological testing was performed in patients and elderly controls (n=34). In order to quantify the qMT effects, the absorption spectrum was sampled at relevant off-resonance frequencies. The qMT parameters were calculated according to a two-pool spin-bath model, including the T1 and T2 relaxation parameters of the free pool determined in separate experiments. Histograms (fixed bin size) of the normalized qMT parameter values (z-scores) within the anterior and posterior hippocampus (hippocampal head and body) were subjected to a fuzzy c-means classification algorithm with downstream PCA projection. The within-cluster sums of point-to-centroid distances were used to examine the effects of qMT and diffusion anisotropy parameters on the discrimination of healthy volunteers, patients with Alzheimer disease, and patients with MCI. The qMT parameters T2(r) (T2 of the restricted pool) and F (fractional pool size) differentiated between the three groups (control, MCI and AD) in the anterior hippocampus. In our cohort, the MT ratio, as proposed in previous reports, did not differentiate between MCI and AD or between healthy controls and MCI, but did differentiate between healthy controls and AD.
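To make the classification step concrete, here is a minimal Python sketch of fuzzy c-means clustering on per-subject z-scored histogram features followed by a PCA projection. It is a generic re-implementation under assumed inputs (placeholder random data and an assumed feature layout), not the authors' pipeline.

import numpy as np
from sklearn.decomposition import PCA

def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Basic fuzzy c-means: X is (n_samples, n_features); m is the fuzzifier."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(n_clusters), size=len(X))   # membership matrix
    for _ in range(n_iter):
        Um = U ** m
        centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-12)                        # avoid division by zero
        inv = dist ** (-2.0 / (m - 1))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centroids, U

# Hypothetical input: one row per subject, columns are z-scored histogram bins
# of qMT parameters (e.g. F, T2 of the restricted pool) in the anterior hippocampus.
features = np.random.default_rng(1).normal(size=(44, 20))  # placeholder data
centroids, memberships = fuzzy_c_means(features, n_clusters=3)
labels = memberships.argmax(axis=1)                        # hard assignment

projection = PCA(n_components=2).fit_transform(features)   # 2-D view for plotting
print(labels[:10], projection.shape)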
Abstract:
Intestinal intraepithelial lymphocytes (IEL) are specialized subsets of T cells with distinct functional capacities. While some IEL subsets are circulating, others, such as CD8αα TCRαβ IEL, are believed to represent non-circulating resident T cell subsets [Sim, G.K., Intraepithelial lymphocytes and the immune system. Adv. Immunol., 1995. 58: 297-343]. Current methods to obtain enriched preparations of intraepithelial lymphocytes are mostly based on Percoll density gradient or magnetic bead-based technologies [Lundqvist, C., et al., Isolation of functionally active intraepithelial lymphocytes and enterocytes from human small and large intestine. J. Immunol. Methods, 1992. 152(2): 253-263]. However, these techniques are hampered by a generally low yield of isolated cells and by potential artifacts due to interference of the isolation procedure with subsequent functional assays, in particular when antibodies against cell surface markers are required. Here we describe a new method for obtaining relatively pure populations of intestinal IEL (55-75%) at a high yield (>85%) by elutriation centrifugation. This technique is equally suited for the isolation and enrichment of intraepithelial lymphocytes of both mouse and human origin. Time requirements for fractionating cell suspensions by elutriation centrifugation are comparable to those of Percoll- or MACS-based isolation procedures. Hence, the substantially higher yield and the consistently robust enrichment for intraepithelial lymphocytes, together with the gentle treatment of the cells during elutriation that does not interfere with subsequent functional assays, argue in favor of using this technique to obtain unmanipulated, unbiased populations of intestinal intraepithelial lymphocytes and, if desired, also pure epithelial cells.
Abstract:
BACKGROUND: Surfactant protein type B (SPB) is needed for alveolar gas exchange. SPB is increased in the plasma of patients with heart failure (HF), with higher concentrations in more severe HF. The aim of this study was to evaluate the relationship between plasma SPB and both alveolar-capillary diffusion at rest and ventilation versus carbon dioxide production during exercise. METHODS AND RESULTS: Eighty patients with chronic HF and 20 healthy controls were evaluated consecutively, but procedures of the required quality were obtained in only 71 patients with HF and 19 healthy controls. Each subject underwent pulmonary function measurements, including lung diffusion for carbon monoxide and membrane diffusion capacity, and a maximal cardiopulmonary exercise test. Plasma SPB was measured by immunoblotting. In patients with HF, SPB values were higher (4.5 [11.1] versus 1.6 [2.9], P=0.0006, median and 25th to 75th interquartile range), whereas lung diffusion for carbon monoxide (19.7±4.5 versus 24.6±6.8 mL/mm Hg per min, P<0.0001, mean±SD) and membrane diffusion capacity (28.9±7.4 versus 38.7±14.8, P<0.0001) were lower. Peak oxygen consumption and the ventilation/carbon dioxide production slope were 16.2±4.3 versus 26.8±6.2 mL/kg per min (P<0.0001) and 29.7±5.9 versus 24.5±3.2 (P<0.0001) in HF and controls, respectively. In the HF population, univariate analysis showed a significant relationship between plasma SPB and lung diffusion for carbon monoxide, membrane diffusion capacity, peak oxygen consumption, and the ventilation/carbon dioxide production slope (P<0.0001 for all). On multivariable regression analysis, membrane diffusion capacity (beta, -0.54; SE, 0.018; P<0.0001), peak oxygen consumption (beta, -0.53; SE, 0.036; P=0.004), and the ventilation/carbon dioxide production slope (beta, 0.25; SE, 0.026; P=0.034) were independently associated with SPB. CONCLUSIONS: Circulating plasma SPB levels are related to alveolar gas diffusion, overall exercise performance, and efficiency of ventilation, showing a link between alveolar-capillary barrier damage, gas exchange abnormalities, and exercise performance in HF.
Abstract:
The aim of this study was to investigate the effect of human recombinant erythropoietin (EPO) on the microcirculation and oxygenation of critically ischemic tissue and to elucidate the role of endothelial NO synthase in EPO-mediated tissue protection. Island flaps were dissected from the back skin of anesthetized male Syrian golden hamsters, including a critically ischemic, hypoxic area that was perfused via a collateralized vasculature. Before ischemia, animals received an injection of epoetin beta at a dose of 5,000 U/kg body weight with (n = 7) or without (n = 7) blockade of NO synthase by 30 mg/kg body weight L-NAME (Nω-nitro-L-arginine methyl ester hydrochloride). Saline-treated animals served as controls (n = 7). Ischemic tissue damage was characterized by severe hypoperfusion and inflammation, hypoxia, and accumulation of apoptotic cell nuclei after 5 h of collateralization. Erythropoietin pretreatment increased arteriolar and venular blood flow by 33% and 37%, respectively (P < 0.05), and attenuated leukocytic inflammation by approximately 75% (P < 0.05). Furthermore, partial tissue oxygen tension in the ischemic tissue increased from 8.2 to 15.8 mmHg (P < 0.05), which was paralleled by a 21% increase in the density of patent capillaries (P < 0.05) and a 50% reduction in the apoptotic cell count (P < 0.05). The improved microcirculation and oxygenation were associated with a 2.2-fold (P < 0.05) increase of endothelial NO synthase protein expression. Of interest, L-NAME completely abolished all the beneficial effects of EPO pretreatment. Our study demonstrates that, in critically ischemic and hypoxic collateralized tissue, EPO pretreatment improves tissue perfusion and oxygenation in vivo. This effect may be attributed to NO-dependent vasodilative effects and anti-inflammatory actions on the altered vascular endothelium.