863 results for VERSUS-HOST-DISEASE
Abstract:
Sclerotinia sclerotiorum is a necrotrophic ascomycete fungus with an extremely broad host range. This pathogen produces the non-specific phytotoxin and key pathogenicity factor oxalic acid (OA). Our recent work indicated that this fungus, and more specifically OA, can induce apoptotic-like programmed cell death (PCD) in plant hosts; this induction of PCD and disease requires the generation of reactive oxygen species (ROS) in the host, a process triggered by fungal secreted OA. Conversely, during the initial stages of infection, OA also dampens the plant oxidative burst, an early host response generally associated with plant defense. This scenario presents a challenge regarding the mechanistic details of OA function, as OA both suppresses and induces host ROS during the compatible interaction. In the present study we generated transgenic plants expressing a redox-regulated GFP reporter. Results show that, initially, Sclerotinia (via OA) generates a reducing environment in host cells that suppresses host defense responses, including the oxidative burst and callose deposition, akin to compatible biotrophic pathogens. Once infection is established, however, this necrotroph induces the generation of plant ROS, leading to PCD of host tissue, the result of which directly benefits the pathogen. In contrast, a non-pathogenic OA-deficient mutant failed to alter host redox status. The mutant produced hypersensitive response-like features following host inoculation, including ROS induction, callose formation, restricted growth, and cell death. These results indicate active recognition of the mutant and further point to suppression of defenses by the wild-type necrotrophic fungus. Chemical reduction of host cells with dithiothreitol (DTT) or potassium oxalate (KOA) restored the ability of this mutant to cause disease. Thus, Sclerotinia uses a novel strategy involving regulation of host redox status to establish infection. These results address a long-standing issue concerning the ability of OA to both inhibit and promote ROS to achieve pathogenic success.
Abstract:
Selenium (Se) is an essential trace element, and the clinical consequences of Se deficiency have been well documented. Se is primarily obtained through the diet, and recent studies have suggested that the level of Se in Australian foods is declining. Currently there are limited data on the Se status of the Australian population, so the aim of this study was to determine the plasma concentration of Se and the activity of glutathione peroxidase (GSH-Px), a well-established biomarker of Se status. Furthermore, the effects of gender, age, and the presence of cardiovascular disease (CVD) were also examined. Blood plasma samples from healthy subjects (140 samples; mean age = 54 years; range, 20-86 years) and CVD patients (112 samples; mean age = 67 years; range, 40-87 years) were analysed for Se concentration and GSH-Px activity. The results revealed that the healthy Australian cohort had a mean plasma Se level of 100.2 ± 1.3 µg Se/L and a mean GSH-Px activity of 108.8 ± 1.7 U/L. Although the mean value for plasma Se reached the level required for optimal GSH-Px activity (i.e. 100 µg Se/L), 47% of the healthy individuals tested fell below this level. Further evaluation revealed that certain age groups were more at risk of lowered Se status, in particular the oldest age group of over 81 years (females = 97.6 ± 6.1 µg Se/L; males = 89.4 ± 3.8 µg Se/L). The difference in Se status between males and females was not significant. The presence of CVD did not appear to influence Se status, with the exception of the over-81 age group, which showed a trend toward a further decline in Se status with disease (plasma Se, 93.5 ± 3.6 µg Se/L for healthy versus 88.2 ± 5.3 µg Se/L for CVD; plasma GSH-Px, 98.3 ± 3.9 U/L for healthy versus 87.0 ± 6.5 U/L for CVD). These findings emphasise the importance of an adequate dietary intake of Se for the maintenance of a healthy ageing population, especially in terms of cardiovascular health.
Abstract:
BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients, and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated, in hospitalised or community-dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between-group difference in the CRBSI rate (clinically indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61, but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days, and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P < 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
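To make the headline statistic above concrete, the following sketch shows how a crude risk ratio and a Wald 95% confidence interval would be computed from the pooled CRBSI counts quoted in the abstract (clinically indicated 1/2365 versus routine change 2/2441). This is an illustrative calculation only, not the review's method: the reported RR of 0.61 (95% CI 0.08 to 4.68) is a meta-analytic estimate pooled across trials, so it differs from this naive single-table figure. All function and variable names are ours.

```python
# Illustrative sketch: crude risk ratio with a Wald 95% CI from a single
# 2x2 table. The review's RR of 0.61 is a pooled meta-analytic estimate,
# so it will not match this naive calculation exactly.
import math

def crude_risk_ratio(events_a, total_a, events_b, total_b):
    """Risk ratio of group A versus group B with a Wald 95% confidence interval."""
    rr = (events_a / total_a) / (events_b / total_b)
    # Standard error of log(RR) for one 2x2 table
    se = math.sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    lower = math.exp(math.log(rr) - 1.96 * se)
    upper = math.exp(math.log(rr) + 1.96 * se)
    return rr, lower, upper

# CRBSI counts quoted above: clinically indicated 1/2365 vs routine change 2/2441
rr, lower, upper = crude_risk_ratio(1, 2365, 2, 2441)
print(f"crude RR = {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")
# -> crude RR = 0.52 (95% CI 0.05 to 5.69): wide, consistent with the review's interval
```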
Abstract:
Chlamydia pneumoniae commonly causes respiratory tract infections in children, and epidemiological investigations strongly link infection to the pathogenesis of asthma. The immune system in early life is immature and may not respond appropriately to pathogens. Toll-like receptors (TLR)2 and 4 are regarded as the primary pattern recognition receptors that sense bacteria; however, their contribution to innate and adaptive immunity in early life remains poorly defined. We investigated the role of TLR2 and TLR4 in the induction of immune responses to Chlamydia muridarum respiratory infection in neonatal wild-type (Wt), TLR2-deficient (TLR2−/−), TLR4−/−, or TLR2/4−/− BALB/c mice. Wt mice had moderate disease and infection. TLR2−/− mice had more severe disease and more intense and prolonged infection compared with the other groups. TLR4−/− mice were asymptomatic. TLR2/4−/− mice had severe early disease and persistent infection, which resolved thereafter, consistent with the absence of symptoms in TLR4−/− mice. Wt mice mounted robust innate and adaptive responses, with an influx of natural killer (NK) cells, neutrophils, myeloid (mDCs) and plasmacytoid (pDCs) dendritic cells, and activated CD4+ and CD8+ T-cells into the lungs. Wt mice also had effective production of interferon (IFN)γ in the lymph nodes and lung, and proliferation of lymph node T-cells. TLR2−/− mice had more intense and persistent innate (particularly neutrophil) and adaptive cell responses and IL-17 expression in the lung; however, IFNγ responses and T-cell proliferation were reduced. TLR2/4−/− mice had reduced innate and adaptive responses. Most importantly, neutrophil phagocytosis was impaired in the absence of TLR2. Thus, TLR2 expression, particularly on neutrophils, is required for effective control of Chlamydia respiratory infection in early life. Loss of control of infection leads to enhanced but ineffective TLR4-mediated inflammatory responses that prolong disease symptoms. This indicates that TLR2 agonists may be beneficial in the treatment of early life Chlamydia infections and associated diseases.
Abstract:
Background: Phase III studies suggest that non-small-cell lung cancer (NSCLC) patients treated with cisplatin-docetaxel may have higher response rates and better survival compared with other platinum-based regimens. We report the final results of a randomised phase III study of docetaxel and carboplatin versus MIC or MVP in patients with advanced NSCLC. Patients and methods: Patients with biopsy-proven stage III-IV NSCLC not suitable for curative surgery or radiotherapy were randomised to receive four cycles of either DCb (docetaxel 75 mg/m², carboplatin AUC 6) or MIC/MVP (mitomycin 6 mg/m², ifosfamide 3 g/m² and cisplatin 50 mg/m², or mitomycin 6 mg/m², vinblastine 6 mg/m² and cisplatin 50 mg/m², respectively), given 3-weekly. The primary end point was survival; secondary end points included response rates, toxicity and quality of life. Results: The median follow-up was 17.4 months. The overall response rate was 32% for both arms (partial response = 31%, complete response = 1%); 32% of MIC/MVP and 26% of DCb patients had stable disease. One-year survival was 39% and 35% for DCb and MIC/MVP, respectively. Two-year survival was 13% in both arms. Grade 3/4 neutropenia (74% versus 43%, P < 0.005), infection (18% versus 9%, P = 0.01) and mucositis (5% versus 1%, P = 0.02) were more common with DCb than with MIC/MVP. The MIC/MVP arm showed significant worsening in overall EORTC score and global health status, whereas the DCb arm showed no significant change. Conclusions: The combination of DCb had similar efficacy to MIC/MVP, but quality of life was better maintained. © 2006 European Society for Medical Oncology.
Abstract:
BACKGROUND. The authors compared gemcitabine and carboplatin (GC) with mitomycin, ifosfamide, and cisplatin (MIC) or mitomycin, vinblastine, and cisplatin (MVP) in patients with advanced nonsmall cell lung carcinoma (NSCLC). The primary objective was survival. Secondary objectives were time to disease progression, response rates, evaluation of toxicity, disease-related symptoms, World Health Organization performance status (PS), and quality of life (QoL). METHODS. Three hundred seventy-two chemotherapy-naïve patients with International Staging System Stage III/IV NSCLC who were ineligible for curative radiotherapy or surgery were randomized to receive either 4 cycles of gemcitabine (1000 mg/m² on Days 1, 8, and 15) plus carboplatin (area under the serum concentration-time curve, 5; given on Day 1) every 4 weeks (the GC arm) or MIC/MVP every 3 weeks (the MIC/MVP arm). RESULTS. There was no significant difference in median survival (248 days in the MIC/MVP arm vs. 236 days in the GC arm) or time to progression (225 days in the MIC/MVP arm vs. 218 days in the GC arm) between the 2 treatment arms. The 2-year survival rate was 11.8% in the MIC/MVP arm and 6.9% in the GC arm. The 1-year survival rate was 32.5% in the MIC/MVP arm and 33.2% in the GC arm. In the MIC/MVP arm, 33% of patients responded (4 complete responses [CRs] and 57 partial responses [PRs]), whereas in the GC arm, 30% of patients responded (3 CRs and 54 PRs). Nonhematologic toxicity was comparable for patients with Grade 3-4 symptoms, except that there was more alopecia among patients in the MIC/MVP arm. GC appeared to produce more hematologic toxicity and necessitated more transfusions. There was no difference in performance status, disease-related symptoms, or QoL between patients in the two treatment arms. Fewer inpatient stays for complications were required with GC. CONCLUSIONS. The results of the current study failed to demonstrate any difference in efficacy between the newer regimen of GC and the older regimens of MIC and MVP. © 2003 American Cancer Society.
Abstract:
INTRODUCTION: Performance status (PS) 2 patients with non-small cell lung cancer (NSCLC) experience more toxicity, lower response rates, and shorter survival times than healthier patients treated with standard chemotherapy. Paclitaxel poliglumex (PPX), a macromolecular drug conjugate of paclitaxel and polyglutamic acid, reduces systemic exposure to peak concentrations of free paclitaxel and may lead to increased concentrations in tumors due to enhanced vascular permeability. METHODS: Chemotherapy-naive PS 2 patients with advanced NSCLC were randomized to receive carboplatin (area under the curve = 6) and either PPX (210 mg/m² over 10 minutes, without routine steroid premedication) or paclitaxel (225 mg/m² over 3 hours, with standard premedication) every 3 weeks. The primary end point was overall survival. RESULTS: A total of 400 patients were enrolled. Alopecia, arthralgias/myalgias, and cardiac events were significantly less frequent with PPX/carboplatin, whereas grade ≥3 neutropenia and grade 3 neuropathy showed a trend toward worsening. There was no significant difference in the incidence of hypersensitivity reactions despite the absence of routine premedication in the PPX arm. Overall survival was similar between treatment arms (hazard ratio, 0.97; log rank p = 0.769). Median survival and 1-year survival rates were 7.9 months and 31% for PPX versus 8.0 months and 31% for paclitaxel. Disease control rates were 64% and 69% for PPX and paclitaxel, respectively. Time to progression was similar: 3.9 months for PPX/carboplatin versus 4.6 months for paclitaxel/carboplatin (p = 0.210). CONCLUSION: PPX/carboplatin failed to provide superior survival compared with paclitaxel/carboplatin in the first-line treatment of PS 2 patients with NSCLC, but the results with respect to progression-free survival and overall survival were comparable, and the PPX regimen was more convenient. © 2008 International Association for the Study of Lung Cancer.
Abstract:
Purpose: Data from two randomized phase III trials were analyzed to evaluate prognostic factors and treatment selection in the first-line management of advanced non-small cell lung cancer patients with performance status (PS) 2. Patients and Methods: Patients randomized to combination chemotherapy (carboplatin and paclitaxel) in one trial and single-agent therapy (gemcitabine or vinorelbine) in the second were included in these analyses. Both studies had identical eligibility criteria and were conducted simultaneously. Comparison of efficacy and safety was performed between the two cohorts. A regression analysis identified prognostic factors and subgroups of patients that may benefit from combination or single-agent therapy. Results: Two hundred one patients were treated with combination therapy and 190 with single-agent therapy. Objective response rates were 37% and 15%, respectively. Median time to progression was 4.6 months in the combination arm and 3.5 months in the single-agent arm (p < 0.001). Median survival times were 8.0 and 6.6 months, and 1-year survival rates were 31% and 26%, respectively. Albumin <3.5 g, extrathoracic metastases, lactate dehydrogenase ≥200 IU, and 2 comorbid conditions predicted outcome. Patients with 0-2 risk factors had similar outcomes independent of treatment, whereas patients with 3-4 factors had a nonsignificant improvement in median survival with combination chemotherapy. Conclusion: Our results show that PS 2 non-small cell lung cancer patients are a heterogeneous group with significantly different outcomes. Patients treated with first-line combination chemotherapy had a higher response rate and longer time to progression, whereas overall survival did not appear significantly different. A prognostic model may be helpful in selecting PS 2 patients for either treatment strategy. © 2009 by the International Association for the Study of Lung Cancer.
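As an illustration of how the prognostic stratification described above could be applied, the hypothetical sketch below counts the four risk factors named in the abstract. It is not code from the study; the albumin unit (g/dL) and reading the comorbidity criterion as "two or more conditions" are our assumptions.

```python
# Hypothetical sketch of the four-factor prognostic count described above.
# Thresholds are quoted from the abstract; the albumin unit (g/dL) and the
# "two or more comorbid conditions" reading are assumptions.
def prognostic_risk_factors(albumin_g_dl: float,
                            extrathoracic_mets: bool,
                            ldh_iu: float,
                            comorbid_conditions: int) -> int:
    """Count how many of the four prognostic risk factors are present (0-4)."""
    count = 0
    if albumin_g_dl < 3.5:           # low serum albumin
        count += 1
    if extrathoracic_mets:           # extrathoracic metastases present
        count += 1
    if ldh_iu >= 200:                # elevated lactate dehydrogenase (IU)
        count += 1
    if comorbid_conditions >= 2:     # comorbid conditions (threshold assumed)
        count += 1
    return count

# Per the abstract: 0-2 factors -> similar outcomes with either strategy;
# 3-4 factors -> a nonsignificant median-survival gain with combination therapy.
print(prognostic_risk_factors(3.2, True, 250, 1))  # -> 3
```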
Abstract:
BACKGROUND: In single-group studies, chromosomal rearrangements of the anaplastic lymphoma kinase gene (ALK) have been associated with marked clinical responses to crizotinib, an oral tyrosine kinase inhibitor targeting ALK. Whether crizotinib is superior to standard chemotherapy with respect to efficacy is unknown. METHODS: We conducted a phase 3, open-label trial comparing crizotinib with chemotherapy in 347 patients with locally advanced or metastatic ALK-positive lung cancer who had received one prior platinum-based regimen. Patients were randomly assigned to receive oral treatment with crizotinib (250 mg) twice daily or intravenous chemotherapy with either pemetrexed (500 mg per square meter of body-surface area) or docetaxel (75 mg per square meter) every 3 weeks. Patients in the chemotherapy group who had disease progression were permitted to cross over to crizotinib as part of a separate study. The primary end point was progression-free survival. RESULTS: The median progression-free survival was 7.7 months in the crizotinib group and 3.0 months in the chemotherapy group (hazard ratio for progression or death with crizotinib, 0.49; 95% confidence interval [CI], 0.37 to 0.64; P<0.001). The response rates were 65% (95% CI, 58 to 72) with crizotinib, as compared with 20% (95% CI, 14 to 26) with chemotherapy (P<0.001). An interim analysis of overall survival showed no significant improvement with crizotinib as compared with chemotherapy (hazard ratio for death in the crizotinib group, 1.02; 95% CI, 0.68 to 1.54; P=0.54). Common adverse events associated with crizotinib were visual disorder, gastrointestinal side effects, and elevated liver aminotransferase levels, whereas common adverse events with chemotherapy were fatigue, alopecia, and dyspnea. Patients reported greater reductions in symptoms of lung cancer and greater improvement in global quality of life with crizotinib than with chemotherapy. CONCLUSIONS: Crizotinib is superior to standard chemotherapy in patients with previously treated, advanced non-small-cell lung cancer with ALK rearrangement. (Funded by Pfizer; ClinicalTrials.gov number, NCT00932893.) Copyright © 2013 Massachusetts Medical Society.
Abstract:
We present the treatment rationale and study design of the MetLung phase III study. This study will investigate onartuzumab (MetMAb) in combination with erlotinib, compared with erlotinib alone, as second- or third-line treatment in patients with advanced non-small-cell lung cancer (NSCLC) who are Met-positive by immunohistochemistry. Approximately 490 patients (245 per treatment arm) will receive erlotinib (150 mg orally, daily) plus onartuzumab or placebo (15 mg/kg intravenously, every 3 weeks) until disease progression, unacceptable toxicity, patient or physician decision to discontinue, or death. The efficacy objectives of this study are to compare overall survival (OS; the primary endpoint), progression-free survival, and response rates between the 2 treatment arms. In addition, safety, quality of life, pharmacokinetics, and translational research will be investigated across treatment arms. If the primary objective (OS) is achieved, this study will provide robust evidence for an alternative treatment option for patients with Met-positive NSCLC in the second- or third-line setting. © 2012 Elsevier Inc. All Rights Reserved.
Abstract:
Background The effects of extra-pleural pneumonectomy (EPP) on survival and quality of life in patients with malignant pleural mesothelioma have, to our knowledge, not been assessed in a randomised trial. We aimed to assess the clinical outcomes of patients who were randomly assigned to EPP or no EPP in the context of trimodal therapy in the Mesothelioma and Radical Surgery (MARS) feasibility study. Methods MARS was a multicentre randomised controlled trial in 12 UK hospitals. Patients aged 18 years or older who had pathologically confirmed mesothelioma and were deemed fit enough to undergo trimodal therapy were included. In a prerandomisation registration phase, all patients underwent induction platinum-based chemotherapy followed by clinical review. After further consent, patients were randomly assigned (1:1) to EPP followed by postoperative hemithorax irradiation or to no EPP. Randomisation was done centrally with computer-generated permuted blocks stratified by surgical centre. The main endpoints were feasibility of randomly assigning 50 patients in 1 year (results detailed in another report), proportion randomised who received treatment, proportion eligible (registered) who proceeded to randomisation, perioperative mortality, and quality of life. Patients and investigators were not masked to treatment allocation. This is the principal report of the MARS study; all patients have been recruited. Analyses were by intention to treat. This trial is registered, number ISRCTN95583524. Findings Between Oct 1, 2005, and Nov 3, 2008, 112 patients were registered and 50 were subsequently randomly assigned: 24 to EPP and 26 to no EPP. The main reasons for not proceeding to randomisation were disease progression (33 patients), inoperability (five patients), and patient choice (19 patients). EPP was completed satisfactorily in 16 of 24 patients assigned to EPP; in five patients EPP was not started and in three patients it was abandoned. Two patients in the EPP group died within 30 days and a further patient died without leaving hospital. One patient in the no EPP group died perioperatively after receiving EPP off trial in a non-MARS centre. The hazard ratio (HR) for overall survival between the EPP and no EPP groups was 1.90 (95% CI 0.92-3.93; exact p=0.082), and after adjustment for sex, histological subtype, stage, and age at randomisation the HR was 2.75 (1.21-6.26; p=0.016). Median survival was 14.4 months (5.3-18.7) for the EPP group and 19.5 months (13.4 to time not yet reached) for the no EPP group. Of the 49 randomly assigned patients who consented to quality of life assessment (EPP n=23; no EPP n=26), 12 patients in the EPP group and 19 in the no EPP group completed the quality of life questionnaires. Although median quality of life scores were lower in the EPP group than in the no EPP group, no significant differences between groups were reported in the quality of life analyses. There were ten serious adverse events reported in the EPP group and two in the no EPP group. Interpretation In view of the high morbidity associated with EPP in this trial and in other non-randomised studies, a larger study is not feasible. These data, although limited, suggest that radical surgery in the form of EPP within trimodal therapy offers no benefit and possibly harms patients. Funding Cancer Research UK (CRUK/04/003), the June Hancock Mesothelioma Research Fund, and Guy's and St Thomas' NHS Foundation Trust. © 2011 Elsevier Ltd.
Abstract:
Chlamydia trachomatis is the most common sexually transmitted bacterial infection worldwide. The impact of this pathogen on human reproduction has intensified research efforts to better understand chlamydial infection and pathogenesis. Whilst there are animal models available that mimic many aspects of human chlamydial infection, the mouse is regarded as the most practical and widely used of these models. Studies in mice have greatly contributed to our understanding of the host-pathogen interaction and provided an excellent medium for evaluating vaccines. Here we explore the advantages and disadvantages of all animal models of chlamydial genital tract infection, with a focus on the murine model and what we have learnt from it so far.
Abstract:
Hand, Foot and Mouth Disease (HFMD) is a self-limiting viral disease that mainly affects infants and children. In contrast with other HFMD-causing enteroviruses, Enterovirus 71 (EV71) has commonly been associated with severe clinical manifestations leading to death. Currently, owing to a lack of understanding of EV71 pathogenesis, there are no antiviral therapeutics for the treatment of HFMD patients; a better understanding of the mechanisms of EV71 pathogenesis is therefore warranted. We have previously reported a human colorectal adenocarcinoma cell line (HT29)-based model to study the pathogenesis of EV71. Using this system, we showed that knockdown of DGCR8, an essential cofactor for microRNA biogenesis, resulted in a reduction of EV71 replication. We also demonstrated that host miRNA levels change during EV71 pathogenesis and that EV71 utilises host miRNAs to attenuate antiviral pathways during infection. Together, the data from this study provide critical information on the role of miRNAs during EV71 infection.
Abstract:
In his letter, Cunha suggests that oral antibiotic therapy is safer and less expensive than intravenous therapy via central venous catheters (CVCs) (1). The implication is that costs will fall and increased health benefits will be enjoyed, resulting in a gain in efficiency within the healthcare system. CVCs are often used in critically ill patients to deliver antimicrobial therapy, but they expose patients to a risk of catheter-related bloodstream infection (CRBSI). Our current knowledge about the efficiency (i.e. cost-effectiveness) of allocating resources toward interventions that prevent CRBSI in patients requiring a CVC has already been reviewed (2). If, for some patient groups, antimicrobial therapy can be delivered orally instead of through a CVC, then the costs and benefits of this alternate strategy should be evaluated...
Abstract:
The chlamydiae are obligate intracellular parasites that have evolved specific interactions with their various hosts and host cell types to ensure their successful survival and consequential pathogenesis. The species Chlamydia pneumoniae is ubiquitous, with serological studies showing that most humans are infected at some stage in their lifetime. While most human infections are asymptomatic, C. pneumoniae can cause more-severe respiratory disease and pneumonia and has been linked to chronic diseases such as asthma, atherosclerosis, and even Alzheimer's disease. The widely dispersed animal-adapted C. pneumoniae strains cause an equally wide range of diseases in their hosts. It is emerging that the ability of C. pneumoniae to survive inside its target cells, including evasion of the host's immune attack mechanisms, is linked to the acquisition of key metabolites. Tryptophan and arginine are key checkpoint compounds in this host-parasite battle. Interestingly, the animal strains of C. pneumoniae have a slightly larger genome, enabling them to cope better with metabolite restrictions. It therefore appears that as the evolutionarily more ancient animal strains have evolved to infect humans, they have selectively become more "susceptible" to the levels of key metabolites, such as tryptophan. While this might initially appear to be a weakness, it allows these human C. pneumoniae strains to exquisitely sense host immune attack and respond by rapidly reverting to a persistent phase. During persistence, they reduce their metabolic levels, halting progression of their developmental cycle, waiting until the hostile external conditions have passed before they reemerge.