335 results for ANKARA BOOST REGIMEN
Abstract:
Objective: To measure renal tissue oxygenation in young normo- and hypertensive volunteers under conditions of salt loading and depletion using blood oxygen level dependent magnetic resonance imaging (BOLD-MRI). Design and Methods: Ten normotensive (NT) male volunteers (age 26.5±7.4 y) and eight non-treated, hypertensive (HT) male volunteers (age 28.8±5.7 y) were studied after one week on a high salt (HS) regimen (6 g of salt/day added to their normal regimen) and again after one week of a low sodium diet (LS). On the 8th day, BOLD-MRI was performed under standard hydration conditions. Four coronal slices were selected in each kidney, and a combination sequence was used to acquire T2*-weighted images. The mean R2* (1/T2*) was measured to determine cortical and medullary oxygenation. Results: Baseline characteristics and their changes are shown in the table. The mean cortical R2* was not different under conditions of HS or LS (17.8±1.3 vs. 18.2±0.6, respectively, in the NT group, p=0.27; 17.4±0.6 vs. 17.8±0.9 in the HT group, p=0.16). However, the mean medullary R2* was significantly lower under LS conditions in both groups (31.3±0.6 vs. 28.1±0.8 in the NT group, p<0.05; 30.3±0.8 vs. 27.9±1.5 in the HT group, p<0.05), corresponding to higher medullary oxygenation as compared to HS conditions, without significant changes in hemoglobin or hematocrit values. The salt-induced changes in medullary oxygenation were comparable in the two groups (ANOVA, p>0.1). Conclusion: Dietary sodium restriction leads to increased renal medullary oxygenation compared to high sodium intake in normo- and hypertensive subjects. This observation may in part explain the potential renal benefits of a low sodium intake.
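The R2* values above are simply the reciprocal of the measured T2* relaxation time; since scanners typically report T2* in milliseconds while R2* is quoted in s⁻¹, the conversion involves a factor of 1000. A minimal sketch of this arithmetic (the function name and example value are illustrative, not taken from the study):

```python
def r2star_from_t2star(t2star_ms: float) -> float:
    """Convert a T2* relaxation time in milliseconds to R2* in s^-1.

    R2* = 1 / T2*, with a factor of 1000 for the ms -> s unit change.
    A higher R2* reflects more deoxyhemoglobin, i.e. lower tissue oxygenation.
    """
    return 1000.0 / t2star_ms

# A cortical T2* of ~56.2 ms corresponds to an R2* of ~17.8 s^-1,
# in the range of the cortical values reported above.
print(round(r2star_from_t2star(56.2), 1))
```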
Abstract:
IMPORTANCE: New data and antiretroviral regimens expand treatment choices in resource-rich settings and warrant an update of recommendations to treat adults infected with human immunodeficiency virus (HIV). OBJECTIVE: To provide updated treatment recommendations for adults with HIV, emphasizing when to start treatment; what treatment to start; the use of laboratory monitoring tools; and managing treatment failure, switches, and simplification. DATA SOURCES, STUDY SELECTION, AND DATA SYNTHESIS: An International Antiviral Society-USA panel of experts in HIV research and patient care considered previous data and reviewed new data since the 2012 update with literature searches in PubMed and EMBASE through June 2014. Recommendations and ratings were based on the quality of evidence and consensus. RESULTS: Antiretroviral therapy is recommended for all adults with HIV infection. Evidence for benefits of treatment and quality of available data increase at lower CD4 cell counts. Recommended initial regimens include 2 nucleoside reverse transcriptase inhibitors (NRTIs; abacavir/lamivudine or tenofovir disoproxil fumarate/emtricitabine) and a third single or boosted drug, which should be an integrase strand transfer inhibitor (dolutegravir, elvitegravir, or raltegravir), a nonnucleoside reverse transcriptase inhibitor (efavirenz or rilpivirine) or a boosted protease inhibitor (darunavir or atazanavir). Alternative regimens are available. Boosted protease inhibitor monotherapy is generally not recommended, but NRTI-sparing approaches may be considered. New guidance for optimal timing of monitoring of laboratory parameters is provided. Suspected treatment failure warrants rapid confirmation, performance of resistance testing while the patient is receiving the failing regimen, and evaluation of reasons for failure before consideration of switching therapy. Regimen switches for adverse effects, convenience, or to reduce costs should not jeopardize antiretroviral potency. 
CONCLUSIONS AND RELEVANCE: After confirmed diagnosis of HIV infection, antiretroviral therapy should be initiated in all individuals who are willing and ready to start treatment. Regimens should be selected or changed based on resistance test results with consideration of dosing frequency, pill burden, adverse toxic effect profiles, comorbidities, and drug interactions.
Abstract:
With the availability of new-generation sequencing technologies, bacterial genome projects have undergone a major boost. Still, chromosome completion requires costly and time-consuming gap closure, especially when the chromosome contains highly repetitive elements. However, incomplete genome data may be sufficiently informative to derive the pursued information. For emerging pathogens, i.e. newly identified pathogens, withholding genome data during the gap-closure stage is clearly medically counterproductive. We thus investigated the feasibility of a dirty genome approach, i.e. the release of unfinished genome sequences to develop serological diagnostic tools. We showed that almost the whole genome sequence of the emerging pathogen Parachlamydia acanthamoebae could be retrieved even with relatively short reads from the Genome Sequencer 20 and Solexa platforms. The bacterial proteome was analyzed to select immunogenic proteins, which were then expressed and used to elaborate the first steps of an ELISA. This work constitutes proof of principle for a dirty genome approach, i.e. the use of unfinished genome sequences of pathogenic bacteria, coupled with proteomics, to rapidly identify new immunogenic proteins useful for developing specific diagnostic tests such as ELISA, immunohistochemistry and direct antigen detection. Although applied here to an emerging pathogen, this combined dirty genome sequencing/proteomic approach may be used for any pathogen for which better diagnostics are needed. These genome sequences may also be very useful for developing DNA-based diagnostic tests. All these diagnostic tools will allow further evaluation of the pathogenic potential of this obligate intracellular bacterium.
Mycophenolic acid formulations in adult renal transplantation - update on efficacy and tolerability.
Abstract:
The description, more than 30 years ago, of the role of de novo purine synthesis in T and B lymphocyte clonal proliferation opened the possibility of selective immunosuppression by targeting specific enzymatic pathways. Mycophenolic acid (MPA) blocks the key enzyme inosine monophosphate dehydrogenase and the production of guanosine nucleotides required for DNA synthesis. Two MPA formulations are currently used in clinical transplantation as part of the maintenance immunosuppressive regimen. Mycophenolate mofetil (MMF) was the first MPA agent to be approved for the prevention of acute rejection following renal transplantation, in combination with cyclosporine and steroids. Enteric-coated mycophenolate sodium (EC-MPS) is an alternative MPA formulation available in clinical transplantation. In this review, we discuss the clinical trials that have evaluated the efficacy and safety of MPA in adult kidney transplantation for the prevention of acute rejection, and the use of MPA in new combination regimens aimed at minimizing calcineurin inhibitor toxicity and chronic allograft nephropathy. We also discuss MPA pharmacokinetics and the rationale for therapeutic drug monitoring in optimizing the balance between efficacy and safety in individual patients.
Abstract:
BACKGROUND: Early virological failure of antiretroviral therapy associated with the selection of drug-resistant human immunodeficiency virus type 1 in treatment-naive patients is critical, because virological failure significantly increases the risk of subsequent failures. We therefore evaluated the possible role, in early virological failure, of minority quasispecies of drug-resistant human immunodeficiency virus type 1 that are undetectable at baseline by population sequencing. METHODS: We studied 4 patients who experienced early virological failure of a first-line regimen of lamivudine, tenofovir, and either efavirenz or nevirapine, and 18 control patients undergoing similar treatment without virological failure. The key reverse-transcriptase mutations K65R, K103N, Y181C, M184V, and M184I were quantified by allele-specific real-time polymerase chain reaction performed on plasma samples obtained before and during early virological treatment failure. RESULTS: Before treatment, none of the viruses showed any evidence of drug resistance in the standard genotype analysis. Minority quasispecies with either the M184V mutation or the M184I mutation were detected in 3 of 18 control patients. In contrast, all 4 patients whose treatment failed had harbored drug-resistant viruses at low frequencies (range, 0.07%-2.0%) before treatment. Between 1 and 4 mutations were detected in viruses from each patient. Most of the minority quasispecies were rapidly selected and represented the major virus population within weeks after the patients started antiretroviral therapy. All 4 patients showed good adherence to treatment. Nonnucleoside reverse-transcriptase inhibitor plasma concentrations were in normal ranges for all 4 patients at 2 separate assessment times.
CONCLUSIONS: Minority quasispecies of drug-resistant viruses, detected at baseline, can rapidly outgrow and become the major virus population and subsequently lead to early therapy failure in treatment-naive patients who receive antiretroviral therapy regimens with a low genetic resistance barrier.
Abstract:
Background: In order to improve the immunogenicity of currently available non-replicating poxvirus HIV vaccine vectors, NYVAC was genetically modified through re-insertion of two host range genes (K1L and C7L), resulting in restored replicative capacity in human cells. Methods: In the present study these vectors, expressing either a combination of the HIV-1 clade C antigens Env, Gag, Pol and Nef, or a combination of Gag, Pol and Nef, were evaluated for safety and immunogenicity in rhesus macaques, which were immunized at weeks 0, 4 and 12 either by scarification (the conventional poxvirus route of immunization), by intradermal injection, or by intramuscular injection (the route used in previous vaccine studies). Results: Replication-competent NYVAC-C-KC vectors induced higher HIV-specific responses, as measured by IFN-γ ELISpot assay, than the replication-defective NYVAC-C vectors. Application through scarification required only one immunization to induce maximum HIV-specific immune responses, and simultaneously induced relatively lower anti-vector responses. In contrast, two to three immunizations were required when the NYVAC-C-KC vectors were given by intradermal or intramuscular injection, and these routes tended to generate slightly lower responses. Responses were predominantly directed against Env in the animals that received NYVAC-C-KC vectors expressing HIV-1 Env, Gag, Pol and Nef, while Gag responses were dominant in the animals immunized with NYVAC-C-KC expressing HIV-1 Gag, Pol and Nef. Conclusion: The current study demonstrates that replication-competent NYVAC vectors were well tolerated and showed increased immunogenicity compared with replication-defective vectors. Further studies are needed to evaluate the most efficient route of immunization and to explore the use of these replication-competent NYVAC vectors in prime/boost combination with gp120 protein-based vaccine candidates.
This study was performed within the Poxvirus T-cell Vaccine Discovery Consortium (PTVDC) which is part of the CAVD program.
Abstract:
Tumor vaccines may induce activation and expansion of specific CD8 T cells which can subsequently destroy tumor cells in cancer patients. This phenomenon can be observed in approximately 5-20% of vaccinated melanoma patients. We searched for factors associated with T cell responsiveness to peptide vaccines. Peptide antigen-specific T cells were quantified and characterized ex vivo before and after vaccination. T cell responses occurred primarily in patients with T cells that were already pre-activated before vaccination. Thus, peptide vaccines can efficiently boost CD8 T cells that are pre-activated by endogenous tumor antigen. Our results identify a new state of T cell responsiveness and help to explain and predict tumor vaccine efficacy.
Abstract:
Introduction: The last twenty years have witnessed important changes in the field of obstetric analgesia and anesthesia. In 2007, we conducted a survey to obtain information regarding the clinical practice of obstetric anesthesia in our country. The main objective was to ascertain whether recent developments in obstetric anesthesia had been adequately implemented in current clinical practice. Methodology: A confidential questionnaire was sent to 391 identified Swiss obstetric anesthetists. The questionnaire included 58 questions on 5 main topics: activity and organization of the obstetric unit, practice of labor analgesia, practice of anesthesia for cesarean section, prevention of aspiration syndrome, and pain treatment after cesarean section. Results: The response rate was 80% (311/391). 66% of the surveyed anesthetists worked in intermediate-size obstetric units (500-1500 deliveries per year). An anesthetist was on site 24 hours a day in only 53% of the obstetric units. Epidural labor analgesia with low-dose local anesthetics combined with opioids was used by 87%, but only 30% used patient-controlled epidural analgesia (PCEA). Spinal anesthesia was the first choice for elective and urgent cesarean section for 95% of the responders. Adequate prevention of aspiration syndrome was prescribed by 78%. After cesarean section, a multimodal analgesic regimen was prescribed by 74%. Conclusion: When comparing these results with those of the two previous Swiss surveys [1, 2], it clearly appears that Swiss obstetric anesthetists have progressively adapted their practice to current clinical recommendations. But this survey also revealed some insufficiencies: 1. In the public health system: a. An insufficient number of obstetric anesthetists on site 24 hours a day. b. A lack of budget in some hospitals to purchase PCEA pumps. 2. In individual medical practice: a. Frequent excessive dosage of hyperbaric bupivacaine during spinal anesthesia for cesarean section. b. Frequent use of crystalloid preload before spinal anesthesia for cesarean section. c. Frequent systematic use of opioids when inducing general anesthesia for cesarean section. d. Fentanyl as the first-choice opioid during induction of general anesthesia for severe preeclampsia. In the future, wider and more systematic information campaigns by means of the Swiss Association of Obstetric Anesthesia (SAOA) should be able to correct these points.
Abstract:
Background: This study explores significant others' involvement before and after transplantation. Methods: Longitudinal semi-structured interviews were conducted with 64 patients awaiting organ transplantation. Among them, 58 patients spontaneously discussed the importance of their significant other in their daily support. Discourse analysis was applied. Findings: During the pre-transplantation period, renal patients reported that significant others took part in dialysis treatment and participated in regimen adherence. After transplantation, quality of life improved and couple dynamics returned to normal. Patients awaiting lung or heart transplantation were more heavily impaired, and significant others had to take over abandoned roles. After transplantation, resuming normal life gradually became possible, but after one year either the health benefits of transplantation relieved physical, emotional and social loads, or complications maintained the level of stress on significant others. Discussion: Patients reported that significant others had to take over various responsibilities and were concerned about their long-term stress, which should be adequately supported.
Abstract:
BACKGROUND: In recent years, treatment options for human immunodeficiency virus type 1 (HIV-1) infection have changed from nonboosted protease inhibitors (PIs) to nonnucleoside reverse-transcriptase inhibitor (NNRTI)- and boosted PI-based antiretroviral drug regimens, but the impact on immunological recovery remains uncertain. METHODS: All patients in the Swiss HIV Cohort Study were included if they received their first combination antiretroviral therapy (cART) during January 1996 through December 2004 and had known baseline CD4+ T cell counts and HIV-1 RNA values (n = 3293). For follow-up, we used the Swiss HIV Cohort Study database update of May 2007. The mean (±SD) duration of follow-up was 26.8 ± 20.5 months. The follow-up time was limited to the duration of the first cART. CD4+ T cell recovery was analyzed in 3 different treatment groups: nonboosted PI, NNRTI, or boosted PI. The end point was the absolute increase of CD4+ T cell count in the 3 treatment groups after the initiation of cART. RESULTS: Two thousand five hundred ninety individuals (78.7%) initiated a nonboosted-PI regimen, 452 (13.7%) initiated an NNRTI regimen, and 251 (7.6%) initiated a boosted-PI regimen. Absolute CD4+ T cell count increases at 48 months were as follows: in the nonboosted-PI group, from 210 to 520 cells/μL; in the NNRTI group, from 220 to 475 cells/μL; and in the boosted-PI group, from 168 to 511 cells/μL. In a multivariate analysis, the treatment group did not affect the response of CD4+ T cells; however, increased age, pretreatment with nucleoside reverse-transcriptase inhibitors, positive serological tests for hepatitis C virus, Centers for Disease Control and Prevention stage C infection, lower baseline CD4+ T cell count, and lower baseline HIV-1 RNA level were risk factors for smaller increases in CD4+ T cell count. CONCLUSION: CD4+ T cell recovery was similar in patients receiving nonboosted PI-, NNRTI-, and boosted PI-based cART.
Abstract:
Glucocorticoids are used in an attempt to reduce brain edema secondary to head injury; nevertheless, the evidence for their usefulness remains uncertain and contradictory. In a randomized study of 24 children with severe head injury, urinary free cortisol was measured by radioimmunoassay. Twelve patients (group 1) received dexamethasone and 12 (group 2) did not. All patients were treated with a standardized regimen. In group 1 there was complete suppression of endogenous cortisol production. In group 2, free cortisol was up to 20-fold higher than under basal conditions and reached maximum values on days 1-3. Since the excretion of cortisol in urine closely reflects the production rate and is not influenced by liver function or barbiturates, the results in group 2 show that the endogenous production of steroids is an adequate reaction to severe head injury. Exogenous glucocorticoids are thus unlikely to have any more beneficial effect than endogenous cortisol.
Abstract:
Background: Enteropathy-associated T-cell lymphoma (EATL) is a rare subtype of peripheral T-cell lymphoma characterized by a primarily intestinal localization and a frequent association with celiac disease. The prognosis is considered poor with conventional chemotherapy, and limited data are available on the efficacy of autologous stem cell transplantation (ASCT) in this lymphoma subtype. Primary objective: To study the outcome of ASCT as a consolidation or salvage strategy for EATL. The primary endpoints were overall survival (OS) and progression-free survival (PFS). Eligible patients were >18 years of age, had received ASCT between 2000 and 2010 for EATL confirmed by review of written histopathology reports, and had sufficient information on disease history and follow-up available. The search strategy used the EBMT database to identify patients potentially fulfilling the eligibility criteria. An additional questionnaire was sent to individual transplant centres to confirm the histological diagnosis (histopathology report or pathology review) and to obtain updated follow-up data. Patient and transplant characteristics were compared between groups using the χ2 test or Fisher's exact test for categorical variables and the t-test or Mann-Whitney U-test for continuous variables. OS and PFS were estimated using the Kaplan-Meier product-limit estimate and compared by the log-rank test. Estimates for non-relapse mortality (NRM) and relapse or progression were calculated using cumulative incidence rates to accommodate competing risks, and compared using Gray's test. Results: Altogether 138 patients were identified. Updated follow-up data were received for 74 patients (54 %) and histology reports for 54 patients (39 %). In ten patients the diagnosis of EATL could not be adequately verified; the final analysis therefore included 44 patients. There were 24 males and 20 females with a median age of 56 (35-72) years at the time of transplant. Twenty-five patients (57 %) had a history of celiac disease.
Disease stage was I in nine patients (21 %), II in 14 patients (33 %) and IV in 19 patients (45 %). Twenty-four patients (55 %) were in first complete or partial remission (CR/PR) at the time of transplant. BEAM was used as the high-dose regimen in 36 patients (82 %), and all patients received peripheral blood grafts. The median follow-up for survivors was 46 (2-108) months from ASCT. Three patients died early of transplant-related causes, translating into a 2-year non-relapse mortality of 7 %. The relapse incidence at 4 years after ASCT was 39 %, with no events occurring beyond 2.5 years after ASCT. PFS and OS were 54 % and 59 % at four years, respectively. There was a trend towards better OS in patients transplanted in first CR or PR compared with those with more advanced disease (70 % vs. 43 %, p=0.053). Of note, patients with a history of celiac disease had superior PFS (70 % vs. 35 %, p=0.02) and OS (70 % vs. 45 %, p=0.052), whereas age, gender, disease stage, B-symptoms at diagnosis and high-dose regimen were not associated with OS or PFS. Conclusions: This study shows for the first time, in a larger patient sample, that ASCT is feasible in selected patients with EATL and can yield durable disease control in a significant proportion of patients. Patients transplanted in first CR or PR appear to do better than those transplanted later. ASCT should be considered in EATL patients responding to initial therapy.
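The survival figures above are based on the Kaplan-Meier product-limit estimate, which multiplies, at each observed event time, the conditional probability of surviving that time given the number of patients still at risk. A minimal, self-contained sketch of this estimator (toy follow-up data, not the EATL cohort):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate.

    times:  follow-up durations (e.g. months from ASCT)
    events: 1 if the event (death/progression) occurred, 0 if censored
    Returns a list of (time, S(time)) pairs at each event time,
    where S is the estimated survival probability.
    """
    data = sorted(zip(times, events))
    n = len(data)
    at_risk = n
    s = 1.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        d = c = 0  # events / censorings at this time point
        while i < n and data[i][0] == t:
            if data[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            s *= 1.0 - d / at_risk  # conditional survival at time t
            curve.append((t, s))
        at_risk -= d + c
    return curve

# Toy example: events at t=2, 4, 6; censoring at t=4 and t=8.
print(kaplan_meier([2, 4, 4, 6, 8], [1, 1, 0, 1, 0]))
```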
Abstract:
BACKGROUND: Combination antiretroviral treatment (cART) has been very successful, especially among selected patients in clinical trials. The aim of this study was to describe outcomes of cART at the population level in a large national cohort. METHODS: Characteristics of participants of the Swiss HIV Cohort Study on stable cART at two semiannual visits in 2007 were analyzed with respect to era of treatment initiation, number of previous virologically failed regimens and self-reported adherence. Starting ART in the mono/dual era, before HIV-1 RNA assays became available, was counted as one failed regimen. Logistic regression was used to identify risk factors for virological failure between the two consecutive visits. RESULTS: Of 4541 patients, 31.2% and 68.8% had initiated therapy in the mono/dual and cART eras, respectively, and had been on treatment for a median of 11.7 vs. 5.7 years. At visit 1 in 2007, the mean number of previous failed regimens was 3.2 vs. 0.5, and the viral load was undetectable (<50 copies/ml) in 84.6% vs. 89.1% of the participants, respectively. Adjusted odds ratios for a detectable viral load at visit 2 for participants from the mono/dual era with a history of 2, 3, 4 and >4 previous failures, compared to 1, were 0.9 (95% CI 0.4-1.7), 0.8 (0.4-1.6), 1.6 (0.8-3.2) and 3.3 (1.7-6.6), respectively, and 2.3 (1.1-4.8) for >2 missed cART doses during the last month, compared to perfect adherence. For the cART era, odds ratios with a history of 1, 2 and >2 previous failures, compared to none, were 1.8 (95% CI 1.3-2.5), 2.8 (1.7-4.5) and 7.8 (4.5-13.5), respectively, and 2.8 (1.6-4.8) for >2 missed cART doses during the last month, compared to perfect adherence. CONCLUSIONS: A higher number of previous virologically failed regimens and imperfect adherence to therapy were independent predictors of imminent virological failure.
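The adjusted odds ratios quoted above come from logistic regression: each fitted coefficient β is a difference in log odds, so the odds ratio is exp(β) and a Wald 95% confidence interval is exp(β ± 1.96·SE). A minimal sketch of this conversion (the β and SE values are illustrative, not the cohort's actual estimates):

```python
import math

def odds_ratio_with_ci(beta: float, se: float, z: float = 1.96):
    """Turn a logistic-regression coefficient (a log odds ratio) and its
    standard error into an odds ratio with a Wald 95% confidence interval.
    Returns (OR, lower bound, upper bound)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values: beta = 1.03, SE = 0.28 gives an OR of ~2.8
# with a 95% CI of roughly 1.6-4.9, i.e. the shape of the adherence
# estimates reported above (a CI excluding 1 indicates significance).
or_, lo, hi = odds_ratio_with_ci(1.03, 0.28)
print(round(or_, 1), round(lo, 1), round(hi, 1))
```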
Abstract:
The emergence of omics technologies, which allow the global analysis of a given biological or molecular system rather than the study of its individual components, has revolutionized biomedical research, including cardiovascular medicine research, over the past decade. These developments raised the prospect that classical, hypothesis-driven, single-gene-based approaches might soon become obsolete. The experience accumulated so far, however, indicates that omics technologies merely represent tools, similar to those scientists have classically used to generate hypotheses and build models, with the main difference that they produce large amounts of unbiased information. Thus, omics and classical hypothesis-driven research are complementary approaches with the potential to synergize effectively and boost research in many fields, including cardiovascular medicine. In this article we discuss some general aspects of omics approaches and review contributions in three areas of vascular biology in which omics approaches have already been applied (vasculomics): thrombosis and haemostasis, atherosclerosis, and angiogenesis.
Abstract:
Background: Based on several experimental results and on a preliminary study, a trial was undertaken to assess the efficacy of adalimumab, a TNF-α inhibitor, in patients with radicular pain due to lumbar disc herniation. Methods: A multicentre, double-blind, randomised controlled trial was conducted between May 2005 and December 2007 in Switzerland. Patients with acute (<12 weeks) and severe (Oswestry Disability Index >50) radicular leg pain and imaging-confirmed lumbar disc herniation were randomised to receive, as adjuvant therapy, either two subcutaneous injections of adalimumab (40 mg) at a 7-day interval or matching placebo. The primary outcome was leg pain, recorded every day for 10 days and at 6 weeks and 6 months on a visual analogue scale (0 to 100). Results: Of the 265 patients screened, 61 were enrolled (adalimumab, n = 31) and 4 were lost to follow-up. Over time, the evolution of leg pain was more favourable in the adalimumab group than in the placebo group (p<0.001). However, the effect size was relatively small, and at the last follow-up the difference was 13.8 (95% CI -11.5 to 39.0). In the adalimumab group, twice as many patients fulfilled the criteria for "responders" and for "low residual disease impact" (p<0.05), and fewer surgical discectomies were performed (6 versus 13, p=0.04). Conclusion: The addition of a short course of adalimumab to the treatment regimen of patients suffering from acute and severe sciatica resulted in a small decrease in leg pain and in significantly fewer surgical procedures.