838 results for First-line
Abstract:
Stenotrophomonas maltophilia has emerged as an important opportunistic pathogen in the debilitated host. S. maltophilia is not an inherently virulent pathogen, but its ability to colonise respiratory-tract epithelial cells and the surfaces of medical devices makes it a ready coloniser of hospitalised patients. S. maltophilia can cause bloodstream infections and pneumonia with considerable morbidity in immunosuppressed patients. Management of infection is hampered by high-level intrinsic resistance to many antibiotic classes and the increasing occurrence of acquired resistance to the first-line drug co-trimoxazole. Prevention of acquisition and infection depends on the application of modern infection-control practices, with emphasis on the control of antibiotic use and environmental reservoirs.
Abstract:
BACKGROUND: Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. METHODS: On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk-group-specific probabilities for the presence of drug resistance mutations (upper estimate). RESULTS: Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates ranging from 72% to 75%) and continued enrollment of low-risk patients taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates ranging from 7% to 12%). A subset of 4184 participants (52%) had ≥1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence became smaller in later years. CONCLUSIONS: Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level. Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
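The two estimators described above reduce to simple arithmetic: the lower bound counts only patients with observed virologic failure or confirmed resistance, while the upper bound is a frequency-weighted average of each risk group's estimated probability of harboring resistance. A minimal sketch follows; the group sizes, counts and probabilities are invented illustrations, not the study's data:

```python
# Sketch of the two prevalence estimators described above; group sizes,
# confirmed-resistance counts and probabilities are invented, not study data.

risk_groups = [
    # (n_patients, n_confirmed_failure_or_resistance, p_resistance_in_group)
    (3000, 2200, 0.80),   # e.g. mono/dual-NRTI-exposed (high risk)
    (5064,  600, 0.15),   # e.g. boosted-PI/NNRTI first-line (low risk)
]

total = sum(n for n, _, _ in risk_groups)

# Lower estimate: only patients with virologic failure or confirmed resistance.
lower = sum(k for _, k, _ in risk_groups) / total

# Upper estimate: frequency-weighted average of group-specific probabilities.
upper = sum(n * p for n, _, p in risk_groups) / total

print(f"lower = {lower:.0%}, upper = {upper:.0%}")  # lower = 35%, upper = 39%
```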
Abstract:
BACKGROUND: The development of arsenical and diamidine resistance in Trypanosoma brucei is associated with loss of drug uptake by the P2 purine transporter as a result of alterations in the corresponding T. brucei adenosine transporter 1 gene (TbAT1). Previously, specific mutant-type TbAT1 alleles linked to melarsoprol treatment failure were significantly more prevalent in T. b. gambiense from relapse patients at Omugo health centre in Arua district. Relapse rates of up to 30% prompted a shift from melarsoprol to eflornithine (alpha-difluoromethylornithine, DFMO) as first-line treatment at this centre. The aim of this study was to determine the status of TbAT1 in recent isolates collected from T. b. gambiense sleeping sickness patients from Arua and Moyo districts in Northwestern Uganda after this shift in first-line drug choice. METHODOLOGY AND RESULTS: Blood and cerebrospinal fluids of consenting patients were collected for DNA preparation and subsequent amplification. All of the 105 isolates from Omugo that we successfully analysed by PCR-RFLP possessed the wild-type TbAT1 allele. In addition, PCR-RFLP analysis was performed for 74 samples from Moyo, where melarsoprol is still the first-line drug; 61 samples displayed the wild-type genotype, while six were mutant and seven had a mixed pattern of both mutant and wild-type TbAT1. The melarsoprol treatment failure rate at Moyo over the same period was nine of 101 stage II cases that were followed up at least once. Five of the relapse cases harboured mutant TbAT1, one had the wild type, and no amplification was achieved from the remaining three samples. CONCLUSIONS/SIGNIFICANCE: The apparent disappearance of mutant alleles at Omugo may correlate with the withdrawal of melarsoprol as first-line treatment. Our results suggest that melarsoprol could successfully be reintroduced following a time lag after its replacement. A field-applicable test to predict melarsoprol treatment outcome and identify patients for whom the drug can still be beneficial is clearly required. This would facilitate cost-effective management of HAT in rural resource-poor settings, given that eflornithine has much higher logistical requirements for its application.
Abstract:
Dendritic cells (DCs) represent the first line of defence of the innate immune system following infection with pathogens. We investigated the invasion and survival ability of Neospora caninum, a parasite causing abortion in cattle, in mouse bone marrow DCs (BMDCs), and the respective cytokine expression patterns. Immature BMDCs were exposed to viable (untreated) and nonviable parasites that had been inactivated by different means. Invasion and/or internalization, as well as intracellular survival and proliferation of tachyzoites, were determined by NcGRA2-RT-PCR and transmission electron microscopy (TEM). Cytokine expression was evaluated by reverse transcription (RT)-PCR and cytokine ELISA. Transmission electron microscopy of DCs stimulated with untreated viable parasites revealed that N. caninum was able to invade and proliferate within BMDCs. This was confirmed by NcGRA2-RT-PCR. In contrast, no viable parasite organisms were revealed by TEM when BMDCs were exposed to inactivated parasites (nonviability demonstrated by NcGRA2-RT-PCR). Cytokine expression analysis (as assessed by both RT-PCR and ELISA) demonstrated that both viable and nonviable parasites stimulated mBMDCs to express IL-12p40, IL-10 and TNF-alpha, whereas IL-4 RNA expression was not detected. Thus, exposure of mBMDCs to both viable and nonviable parasites results in the expression of cytokines that are relevant for a mixed Th1/Th2 immune response.
Abstract:
OBJECTIVE To compare the effects of antiplatelets and anticoagulants on stroke and death in patients with acute cervical artery dissection. DESIGN Systematic review with Bayesian meta-analysis. DATA SOURCES The reviewers searched MEDLINE and EMBASE from inception to November 2012, checked reference lists, and contacted authors. STUDY SELECTION Studies were eligible if they were randomised, quasi-randomised or observational comparisons of antiplatelets and anticoagulants in patients with cervical artery dissection. DATA EXTRACTION Data were extracted by one reviewer and checked by another. Bayesian techniques were used to appropriately account for studies with scarce event data and imbalances in the size of comparison groups. DATA SYNTHESIS Thirty-seven studies (1991 patients) were included. We found no randomised trials. The primary analysis revealed a large treatment effect in favour of antiplatelets for preventing the primary composite outcome of ischaemic stroke, intracranial haemorrhage or death within the first 3 months after treatment initiation (relative risk 0.32, 95% credibility interval 0.12 to 0.63), while the degree of between-study heterogeneity was moderate (τ² = 0.18). In an analysis restricted to studies of higher methodological quality, the possible advantage of antiplatelets over anticoagulants was less obvious than in the main analysis (relative risk 0.73, 95% credibility interval 0.17 to 2.30). CONCLUSION In view of these results and the safety advantages, easier usage and lower cost of antiplatelets, we conclude that antiplatelets should be given precedence over anticoagulants as first-line treatment in patients with cervical artery dissection unless the results of an adequately powered randomised trial suggest the opposite.
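The quantities reported above (a pooled relative risk with a credibility interval, plus a between-study heterogeneity parameter τ²) are what a standard Bayesian random-effects meta-analysis produces. As a hedged sketch, the generic model looks like the following; the authors' exact parameterisation and priors are not given in the abstract:

```latex
% Generic Bayesian random-effects model for a pooled relative risk; the
% study's exact specification and priors are assumptions, not stated above.
\begin{align*}
  \log \widehat{\mathrm{RR}}_i \mid \theta_i &\sim \mathcal{N}(\theta_i,\ s_i^2) && \text{(observed log relative risk, study } i\text{)} \\
  \theta_i \mid \mu, \tau &\sim \mathcal{N}(\mu,\ \tau^2) && \text{(study-specific true effects)} \\
  \mu &\sim \mathcal{N}(0,\ 10^2), \quad \tau \sim \mathrm{HalfNormal}(1) && \text{(weakly informative priors)}
\end{align*}
```

Under this model the pooled relative risk is exp(μ), and the reported τ² = 0.18 measures how widely the study-specific true effects scatter around it.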
Abstract:
BACKGROUND Monitoring of HIV viral load in patients on combination antiretroviral therapy (ART) is not generally available in resource-limited settings. We examined the cost-effectiveness of qualitative point-of-care viral load tests (POC-VL) in sub-Saharan Africa. DESIGN Mathematical model based on longitudinal data from the Gugulethu and Khayelitsha township ART programmes in Cape Town, South Africa. METHODS Cohorts of patients on ART monitored by POC-VL, CD4 cell count or clinically were simulated. Scenario A considered only the more accurate detection of treatment failure with POC-VL, Scenario B also considered the effect on HIV transmission, and Scenario C further assumed that the risk of virologic failure is halved with POC-VL due to improved adherence. We estimated the change in costs per quality-adjusted life-year gained (incremental cost-effectiveness ratios, ICERs) of POC-VL compared with CD4 and clinical monitoring. RESULTS POC-VL tests with detection limits below 1000 copies/ml increased costs due to unnecessary switches to second-line ART, without improving survival. Assuming POC-VL unit costs between US$5 and US$20 and detection limits between 1000 and 10,000 copies/ml, the ICER of POC-VL was US$4010-US$9230 compared with clinical monitoring and US$5960-US$25,540 compared with CD4 cell count monitoring. In Scenario B, the corresponding ICERs were US$2450-US$5830 and US$2230-US$10,380. In Scenario C, the ICER ranged between US$960 and US$2500 compared with clinical monitoring and between cost-saving and US$2460 compared with CD4 monitoring. CONCLUSION The cost-effectiveness of POC-VL for monitoring ART is improved by a higher detection limit, by taking the reduction in new HIV infections into account, and by assuming that failure of first-line ART is reduced due to targeted adherence counselling.
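The ICER reported above is, by definition, the incremental cost divided by the incremental quality-adjusted life-years between two strategies. A minimal sketch with invented numbers (not model outputs):

```python
# Illustrative ICER computation as defined above (incremental cost per
# quality-adjusted life-year gained); all numbers are invented, not model outputs.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio of 'new' vs 'old' strategy."""
    d_cost, d_qaly = cost_new - cost_old, qaly_new - qaly_old
    if d_qaly <= 0:
        raise ValueError("no QALYs gained; the new strategy is dominated")
    if d_cost <= 0:
        return "cost-saving"  # cheaper and more effective, as in Scenario C
    return d_cost / d_qaly

# Hypothetical POC-VL vs clinical monitoring, per patient:
print(icer(cost_new=1150.0, qaly_new=8.42, cost_old=1000.0, qaly_old=8.40))  # 7500.0
```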
Abstract:
BACKGROUND Since 2005, increasing numbers of children have started antiretroviral therapy (ART) in sub-Saharan Africa and, in recent years, WHO and country treatment guidelines have recommended ART initiation for all infants and very young children, and at higher CD4 thresholds for older children. We examined temporal changes in patient and regimen characteristics at ART start using data from 12 cohorts in 4 countries participating in the IeDEA-SA collaboration. METHODOLOGY/PRINCIPAL FINDINGS Data from 30,300 ART-naïve children aged <16 years at ART initiation who started therapy between 2005 and 2010 were analysed. We examined changes in median values for continuous variables using Cuzick's test for trend over time. We also examined changes in the proportions of patients with particular disease severity characteristics (expressed as a binary variable, e.g. WHO Stage III/IV vs I/II) using logistic regression. Between 2005 and 2010 the number of children starting ART each year increased and the median age declined from 63 months (2006) to 56 months (2010). The proportions of children aged <1 year and ≥10 years both increased, from 12% to 19% and from 18% to 22%, respectively. Children had less severe disease at ART initiation in later years, with significant declines in the percentage with severe immunosuppression (81% to 63%), WHO Stage III/IV disease (75% to 62%), severe anemia (12% to 7%) and weight-for-age z-score < −3 (31% to 28%). Similar results were seen when the analysis was restricted to infants, with significant declines in the proportions with severe immunodeficiency (98% to 82%) and WHO Stage III/IV disease (81% to 63%). First-line regimen use followed country guidelines. CONCLUSIONS/SIGNIFICANCE Between 2005 and 2010, increasing numbers of children initiated ART, with a decline in disease severity at the start of therapy. However, even in 2010, a substantial number of infants and children started ART with advanced disease. These results highlight the importance of efforts to improve access to HIV diagnostic testing and ART in children.
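The trend analysis on binary severity indicators described above amounts to a logistic regression of the indicator on calendar year, where a negative year coefficient corresponds to declining severity. A sketch follows; the column names and counts are hypothetical illustrations, not IeDEA-SA data:

```python
# Sketch of the binary-severity trend analysis: logistic regression of a
# severity indicator on calendar year. Data below are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "years_since_2005": [0] * 100 + [5] * 100,
    "who_stage_3_4":    [1] * 75 + [0] * 25 + [1] * 62 + [0] * 38,  # 75% -> 62%
})

fit = smf.logit("who_stage_3_4 ~ years_since_2005", data=df).fit()
print(fit.params)  # a negative year coefficient indicates declining severity
```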
Abstract:
Since the publication of the European Respiratory Society Task Force report in 2008, significant new evidence has become available on the classification and management of preschool wheezing disorders. In this report, an international consensus group reviews this new evidence and proposes some modifications to the recommendations made in 2008. Specifically, the consensus group acknowledges that wheeze patterns in young children vary over time and with treatment, rendering the distinction between episodic viral wheeze and multiple-trigger wheeze unclear in many patients. Inhaled corticosteroids remain first-line treatment for multiple-trigger wheeze, but may also be considered in patients with episodic viral wheeze with frequent or severe episodes, or when the clinician suspects that interval symptoms are being underreported. Any controller therapy should be viewed as a treatment trial, with scheduled close follow-up to monitor treatment effect. The group recommends discontinuing treatment if there is no benefit and taking the favourable natural history into account when making decisions about long-term therapy. Oral corticosteroids are not indicated in mild-to-moderate acute wheeze episodes and should be reserved for severe exacerbations in hospitalised patients. Future research should focus on better clinical and genetic markers, as well as biomarkers, of disease severity.
Abstract:
Context: In virologically suppressed, antiretroviral-treated patients, the effect of switching to tenofovir (TDF) on bone biomarkers, compared to remaining on stable antiretroviral therapy, is unknown. Methods: We examined bone biomarkers (osteocalcin [OC], procollagen type 1 amino-terminal propeptide, and C-terminal cross-linking telopeptide of type 1 collagen) and bone mineral density (BMD) over 48 weeks in virologically suppressed patients (HIV RNA < 50 copies/ml) randomized to switch to TDF/emtricitabine (FTC) or remain on first-line zidovudine (AZT)/lamivudine (3TC). Parathyroid hormone (PTH) was also measured. Between-group differences in bone biomarkers and associations between change in bone biomarkers and BMD measures were assessed by Student's t tests, Pearson correlation, and multivariable linear regression, respectively. All data are expressed as mean (SD), unless otherwise specified. Results: Of 53 subjects (aged 46.0 y; 84.9% male; 75.5% Caucasian), 29 switched to TDF/FTC. There were reductions in total hip and lumbar spine BMD in those switching to TDF/FTC (total hip, TDF/FTC, −1.73 (2.76)% vs AZT/3TC, −0.39 (2.41)%; between-group P = .07; lumbar spine, TDF/FTC, −1.50 (3.49)% vs AZT/3TC, +0.25 (2.82)%; between-group P = .06), although these reductions did not reach statistical significance. Greater declines in lumbar spine BMD correlated with greater increases in OC (r = −0.28; P = .05). The effect of TDF/FTC on bone biomarkers remained significant when adjusted for baseline biomarker levels, gender, and ethnicity. There was no difference in the change in PTH levels over 48 weeks between treatment groups (between-group P = .23). All biomarkers increased significantly from weeks 0 to 48 in the switch group, with no significant change in those remaining on AZT/3TC (between-group, all biomarkers, P < .0001). Conclusion: A switch to TDF/FTC, compared to remaining on a stable regimen, is associated with increases in bone turnover that correlate with reductions in BMD, suggesting that TDF exposure directly affects bone metabolism in vivo.
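The correlation reported above (change in lumbar spine BMD against change in OC, r = −0.28) is a plain Pearson correlation on per-patient changes. A minimal sketch; the arrays below are invented illustrations, not trial data:

```python
# Sketch of the biomarker/BMD correlation analysis described above; the
# arrays are invented, not trial data.
import numpy as np
from scipy.stats import pearsonr

d_bmd_spine = np.array([-3.1, -1.5, -0.2, 0.4, -2.0, -0.8])  # % change in lumbar spine BMD
d_oc        = np.array([ 8.0,  4.5,  1.0, -0.5,  6.2,  3.1])  # change in osteocalcin

r, p = pearsonr(d_bmd_spine, d_oc)  # expect r < 0: larger BMD loss, larger OC rise
print(f"r = {r:.2f}, p = {p:.3f}")
```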
Abstract:
Background. Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods. We included patients aged ≥16 years who started ART from 2007 onward, with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min/1.73 m²), moderate (30-59 mL/min/1.73 m²) or severe (<30 mL/min/1.73 m²) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effects models, the odds of developing a moderate or severe eGFR decrease using logistic regression, and mortality using competing-risks regression. Results. We included 62,230 adults, of whom 38,716 (62%) initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF group than in the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop a moderate (adjusted OR: 3.11; 95% CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95% CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of ART regimen and mortality was similar in both treatment groups. Conclusions. TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
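For reference, a sketch of the 2009 CKD-EPI creatinine equation together with the dysfunction categories used above. The study's exact implementation is not described in the abstract, so details such as handling of the race coefficient are assumptions:

```python
# The 2009 CKD-EPI creatinine equation (race coefficient included, as in the
# original publication) plus the dysfunction categories used above. A sketch;
# the study's exact implementation is not given in the abstract.

def ckd_epi_2009(scr_mg_dl: float, age: int, female: bool, black: bool = False) -> float:
    """eGFR in mL/min/1.73 m^2 from serum creatinine (mg/dL), age and sex."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def dysfunction_category(egfr: float) -> str:
    if egfr >= 90: return "normal"
    if egfr >= 60: return "mild"      # 60-89
    if egfr >= 30: return "moderate"  # 30-59
    return "severe"                   # <30

egfr = ckd_epi_2009(scr_mg_dl=1.4, age=45, female=False)
print(round(egfr), dysfunction_category(egfr))  # ~60 "mild"
```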
Abstract:
Stroke in patients with atrial fibrillation (AF) is often associated with substantial morbidity and mortality. Oral anticoagulation remains the first-line approach to stroke prevention in such individuals; however, for a considerable proportion of patients, traditional treatment using warfarin is limited by a number of factors, such as the inconvenience of frequent therapeutic monitoring and the risk of haemorrhage. The development of new oral anticoagulants with improved efficacy and safety profiles has provided viable options for oral anticoagulation therapy in patients with nonvalvular (nonrheumatic) AF. Nonetheless, in patients who have an increased risk of major haemorrhage, a nonpharmacological approach to antithrombotic therapy remains an attractive alternative. The left atrial appendage (LAA) has been found to be the source of >90% of thrombi in patients with nonvalvular AF; thus, prevention of thrombus formation via transcatheter mechanical LAA occlusion is a novel therapeutic target for stroke prevention in this patient population. In this Review, we present the rationale for LAA occlusion in patients with AF, the available occlusion devices and the clinical evidence for them to date. We also discuss the roles of various imaging techniques in device implantation and the management strategy for associated procedural complications.
Abstract:
Activating epidermal growth factor receptor (EGFR) mutations are recognized biomarkers for patients with metastatic non-small cell lung cancer (NSCLC) treated with EGFR tyrosine kinase inhibitors (TKIs). EGFR TKIs can also have activity against NSCLC without EGFR mutations, which makes the identification of additional relevant biomarkers necessary. Previous studies on tumor EGFR protein levels and EGFR gene copy number yielded inconsistent results. The aim of this study was to identify novel biomarkers of the response to TKIs in NSCLC by investigating whole-genome expression at the exon level. We used exon arrays and clinical samples from a previous trial (SAKK19/05) to investigate exon-level expression variation in 3 genes potentially playing a key role in modulating treatment response: EGFR, V-Ki-ras2 Kirsten rat sarcoma viral oncogene homolog (KRAS) and vascular endothelial growth factor (VEGFA). We identified the expression of EGFR exon 18 as a new predictive marker for patients with untreated metastatic NSCLC treated with bevacizumab and erlotinib in the first-line setting. The overexpression of EGFR exon 18 in tumor was significantly associated with tumor shrinkage, independently of EGFR mutation status. A similar significant association was found in blood samples. In conclusion, exonic EGFR expression, particularly of exon 18, was found to be a relevant predictive biomarker of the response to bevacizumab and erlotinib. Based on these results, we propose a new model of EGFR testing in tumor and blood.
Abstract:
Pancreatic ductal adenocarcinoma (PDAC) is the fourth leading cause of cancer death in the US. Gemcitabine is the first-line therapy for this disease, but unfortunately it shows only very modest benefit. The focus of the current study was to investigate the role and regulation of EphA2, a receptor tyrosine kinase expressed in PDAC, to further understand this disease and identify new therapeutic targets. The role of EphA2 in PDAC was determined by siRNA-mediated silencing. In combination with gemcitabine, silencing of EphA2 caused a dramatic increase in apoptosis, even in highly resistant cells in vitro. Furthermore, EphA2 silencing was found to be useful in 2 orthotopic models in vivo: 1) shRNA-pretreated Miapaca-2 cells, and 2) in vivo delivery of siRNA to established MPanc96 tumors. Silencing of EphA2 alone reduced tumor growth in Miapaca-2 cells. In MPanc96, only the combination treatment of gemcitabine plus siEphA2 significantly reduced tumor growth, as well as the number of lung and liver metastases. Taken together, these observations support EphA2 as a target for combination therapies for PDAC. The regulation of EphA2 was further explored with a focus on the role of Ras. K-Ras activating mutations are the most important initiating event in PDAC. We demonstrated that Ras regulates EphA2 expression through activation of MEK2 and phosphorylation of ERK. Downstream of ERK, silencing of the transcription factor AP-1 subunit c-Jun or inhibition of the ERK effector RSK caused a decrease in EphA2 expression, supporting their roles in this process. Further examination of Ras/MEK/ERK pathway modulators revealed that PEA-15, a protein that sequesters ERK in the cytoplasm, inhibited expression of EphA2. A significant inverse correlation between EphA2 and PEA-15 levels was observed in mouse models of PDAC. In cells where an EGFR inhibitor reduced phospho-ERK, expression of EphA2 was also reduced, indicating that changes in EphA2 levels may allow monitoring of the effectiveness of anti-Ras/MEK/ERK therapies. In conclusion, EphA2 levels may be a good prognostic factor for anti-EGFR/anti-Ras therapies, and EphA2 itself is a relevant target for the development of new therapies.
Abstract:
The plasma membrane xc- cystine/glutamate transporter mediates cellular uptake of cystine in exchange for intracellular glutamate and is highly expressed by pancreatic cancer cells. The xCT gene, encoding the cystine-specific xCT protein subunit of xc-, is important in regulating intracellular glutathione (GSH) levels, which are critical for cancer cell protection against oxidative stress, tumor growth and resistance to chemotherapeutic agents including platinum. We examined 4 single nucleotide polymorphisms (SNPs) of the xCT gene in 269 advanced pancreatic cancer patients who received first-line gemcitabine with or without cisplatin or oxaliplatin. Genotyping was performed using TaqMan real-time PCR assays. A statistically significant correlation was noted between the 3' untranslated region (UTR) xCT SNP rs7674870 and overall survival (OS): median survival time (MST) was 10.9 and 13.6 months, respectively, for the TT and TC/CC genotypes (p = 0.027). Stratified analysis showed the genotype effect was significant in patients receiving gemcitabine in combination with platinum therapy (n = 145): MST was 10.5 versus 14.1 months for the TT and TC/CC genotypes, respectively (p = 0.013). The 3' UTR xCT SNP rs7674870 may correlate with OS in pancreatic cancer patients receiving gemcitabine and platinum combination therapy. Paraffin-embedded core and surgical biopsy tumor specimens from 98 patients with metastatic pancreatic adenocarcinoma were analyzed by immunohistochemistry (IHC) using an xCT-specific antibody. xCT IHC expression scores were analyzed in relation to overall survival in 86 patients and to genotype in 12 patients; no statistically significant association was found between xCT IHC expression score and overall survival (p = 0.514). When xCT expression was analyzed in terms of treatment response, no statistically significant association was found (p = 0.908). These data suggest that polymorphic variants of xCT may have predictive value, and that the xc- transporter may represent an important target for therapy in pancreatic cancer.
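The genotype survival comparison above (median survival for TT vs TC/CC with a p-value) is the kind of result a Kaplan-Meier estimate plus log-rank test produces. A minimal sketch using the lifelines package; the survival times and event flags below are invented, not patient data:

```python
# Sketch of a TT vs TC/CC survival comparison using a Kaplan-Meier estimate
# and a log-rank test (lifelines). All data below are invented illustrations.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

t_tt,  e_tt  = [6, 9, 10, 11, 12, 15], [1, 1, 1, 1, 1, 0]   # months, event flags
t_tcc, e_tcc = [9, 12, 13, 14, 18, 24], [1, 1, 1, 0, 1, 0]  # 1 = death observed

kmf = KaplanMeierFitter()
kmf.fit(t_tt, event_observed=e_tt, label="TT")
print("TT median OS:", kmf.median_survival_time_)

result = logrank_test(t_tt, t_tcc, event_observed_A=e_tt, event_observed_B=e_tcc)
print("log-rank p:", result.p_value)
```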
Abstract:
Natural killer (NK) cells may provide an important first line of defense against metastatic implantation of solid tumors. This antitumor function occurs during the intravascular and visceral lodgment phase of cancer dissemination, as demonstrated in small animal metastasis models. The role of the NK cell in controlling human tumor dissemination is more difficult to confirm, at least partially because of ethical restraints on experimental design. Nonetheless, a large number of solid tumor patient studies have demonstrated NK cell cytolysis of both autologous and allogeneic tumors. Of the major cancer therapeutic modalities, successful surgery in conjunction with other treatments offers the best possibility of cure. However, small animal experiments have demonstrated that surgical stress can lead to increased rates of primary tumor take, and increased incidence, size, and rapidity of metastasis development. Because the physiologic impact of surgical stress can also markedly impair perioperative antitumor immune function in humans, we examined the effect of surgical stress on perioperative NK cell cytolytic function in a murine preclinical model. Our studies demonstrated that hindlimb amputation led to a marked impairment of postoperative NK cell cytotoxicity. The mechanism underlying this process is complex and involves the postsurgical generation of splenic erythroblasts that successfully compete with NK cells for tumor target binding sites; NK cell-directed suppressor cell populations; and a direct impairment of NK cell recycling capacity. The observed postoperative NK cell suppression could be prevented by in vivo administration of pyrimidinone biologic response modifiers or by short-term in vitro exposure of effector cells to recombinant interleukin-2. It is hoped that insights gained from this research may help in the future development of NK cell-specific perioperative immunotherapy relevant to solid tumor patients undergoing cancer resection.