208 results for VERSUS-HOST-DISEASE


Relevance: 30.00%

Abstract:

Chlamydia trachomatis is an obligate intracellular bacterial pathogen that infects the genital and ocular mucosa of humans, causing infections that can lead to pelvic inflammatory disease, infertility, and blinding trachoma. C. pneumoniae is a respiratory pathogen that causes 12–15% of community-acquired pneumonia cases. Both chlamydial species were believed to be restricted to the epithelia of the genital, ocular, and respiratory mucosa; however, increasing evidence suggests that both these pathogens can be isolated from the peripheral blood of both healthy individuals and patients with inflammatory conditions such as coronary artery disease and asthma. Chlamydia can also be isolated from brain tissue of patients with degenerative neurological disorders such as Alzheimer’s disease and multiple sclerosis, as well as from certain lymphomas. An increasing number of in vitro studies suggest that some chlamydial species can infect immune cells, at least at low levels. These infections may alter immune cell function in a way that promotes chlamydial persistence in the host and contributes to the progression of several chronic inflammatory diseases. In this paper, we review the evidence for the growth of Chlamydia in immune cells, particularly monocytes/macrophages and dendritic cells, and describe how infection may affect the function of these cells.

Relevance: 30.00%

Abstract:

An SEI metapopulation model is developed for the spread of an infectious agent by migration. The model portrays two age classes on a number of patches connected by migration routes that are used as host animals mature. A feature of this model is that the basic reproduction ratio may be computed directly, using a scheme that separates topography, demography, and epidemiology. We also provide formulas for individual patch basic reproduction numbers and discuss their connection with the basic reproduction ratio for the system. The model is applied to the problem of spatial spread of bovine tuberculosis in a possum population. The temporal dynamics of infection are investigated for some generic networks of migration links, and the basic reproduction ratio is computed; its value is not greatly different from that for a homogeneous model. Three scenarios are considered for the control of bovine tuberculosis in possums, where the spatial aspect is shown to be crucial for the design of disease management operations.
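As a rough illustration of the kind of computation the abstract describes (not the paper's exact scheme), the basic reproduction ratio of a multi-patch model can be obtained as the spectral radius of a next-generation matrix K = F V⁻¹, with per-patch transmission (epidemiology) in F and removal plus migration (demography and topography) in V. All patch counts and rates below are invented, and the exposed class is collapsed for brevity:

```python
import numpy as np

# Illustrative sketch only: R0 for a simplified multi-patch infection model
# as the spectral radius of the next-generation matrix K = F @ inv(V).
# F carries per-patch transmission (epidemiology); V carries removal and
# migration of infectives (demography and topography). The exposed class of
# the full SEI model is collapsed here, and all numbers are made up.

F = np.diag([0.8, 0.5, 0.6])            # per-patch transmission rates
mu = 0.2                                # per-capita removal (death) rate
M = np.array([[0.0, 0.1, 0.0],          # M[i, j]: migration rate patch j -> i
              [0.1, 0.0, 0.1],
              [0.0, 0.1, 0.0]])

# Outflow from each patch's infective class (death + emigration) on the
# diagonal, minus inflow of infectives arriving from connected patches.
V = np.diag(mu + M.sum(axis=0)) - M

K = F @ np.linalg.inv(V)                # next-generation matrix
R0 = max(abs(np.linalg.eigvals(K)))     # spectral radius
print(f"R0 = {R0:.3f}")                 # > 1 implies spread through the network
```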

Relevance: 30.00%

Abstract:

Twin studies offer the opportunity to determine the relative contribution of genes versus environment in traits of interest. Here, we investigate the extent to which variance in brain structure is reduced in monozygotic (MZ) twins with identical genetic make-up. We investigate whether using twins as compared to a control population reduces variability in a number of common magnetic resonance (MR) structural measures, and we investigate the location of areas under major genetic influence. This is fundamental to understanding the benefit of using twins in studies where structure is the phenotype of interest. Twenty-three pairs of healthy MZ twins were compared to matched control pairs. Volume, T2 and diffusion MR imaging were performed, as well as MR spectroscopy (MRS). Images were compared using (i) global measures of standard deviation and effect size, (ii) voxel-based analysis of similarity and (iii) intra-pair correlation. Global measures indicated a consistent increase in structural similarity in twins. The voxel-based and correlation analyses indicated a widespread pattern of increased similarity in twin pairs, particularly in frontal and temporal regions. The areas of increased similarity were most widespread for the diffusion trace and least widespread for T2. MRS showed a consistent reduction in metabolite variation that was significant for temporal lobe N-acetylaspartate (NAA). This study has shown the distribution and magnitude of reduced variability in brain volume, diffusion, T2 and metabolites in twins. The data suggest that evaluation of twins discordant for disease is indeed a valid way to attribute genetic or environmental influences to observed abnormalities in patients, since evidence is provided for the underlying assumption of decreased variability in twins.
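A minimal sketch of the intra-pair correlation idea behind analysis (iii), on simulated data rather than the study's MR measurements: twin pairs share a latent structural component, matched control pairs do not, so the within-pair correlation is higher for twins.

```python
import numpy as np

# Simulated illustration of intra-pair correlation (analysis iii); these are
# not the study's MR measures. Twins share a latent structural component,
# matched control pairs do not, so twin pairs correlate more strongly.

rng = np.random.default_rng(42)
n_pairs = 23                                        # as in the study design

shared = rng.normal(size=n_pairs)                   # shared genetic component
twin_a = shared + 0.3 * rng.normal(size=n_pairs)    # plus individual noise
twin_b = shared + 0.3 * rng.normal(size=n_pairs)
r_twin = np.corrcoef(twin_a, twin_b)[0, 1]

ctrl_a = rng.normal(size=n_pairs)                   # unrelated individuals
ctrl_b = rng.normal(size=n_pairs)
r_ctrl = np.corrcoef(ctrl_a, ctrl_b)[0, 1]

print(f"twin pair r = {r_twin:.2f}, control pair r = {r_ctrl:.2f}")
```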

Relevance: 30.00%

Abstract:

Introduction: Smoking status in outpatients with chronic obstructive pulmonary disease (COPD) has been associated with a low body mass index (BMI) and reduced mid-arm muscle circumference (Cochrane & Afolabi, 2004). Individuals with COPD identified as malnourished have also been found to be twice as likely to die within 1 year compared to non-malnourished patients (Collins et al., 2010). Although malnutrition is both preventable and treatable, it is not clear what influence current smoking status, another modifiable risk factor, has on malnutrition risk. The current study aimed to establish the influence of smoking status on malnutrition risk and 1-year mortality in outpatients with COPD. Methods: A prospective nutritional screening survey was carried out between July 2008 and May 2009 at a large teaching hospital (Southampton General Hospital) and a smaller community hospital within Hampshire (Lymington New Forest Hospital). In total, 424 outpatients with a diagnosis of COPD were routinely screened using the ‘Malnutrition Universal Screening Tool’, ‘MUST’ (Elia, 2003): 222 males, 202 females; mean (SD) age 73 (9.9) years; mean (SD) BMI 25.9 (6.4) kg m⁻². Smoking status on the date of screening was obtained for 401 of the outpatients. Severity of COPD was assessed using the GOLD criteria, and social deprivation was determined using the Index of Multiple Deprivation (Noble et al., 2008). Results: The overall prevalence of malnutrition (medium + high risk) was 22%, with 32% of current smokers at risk (current smokers accounted for 19% of the total COPD population). In comparison, 19% of nonsmokers and ex-smokers were likely to be malnourished [odds ratio, 1.965; 95% confidence interval (CI), 1.133–3.394; P = 0.015]. Smoking status remained an independent risk factor for malnutrition even after adjustment for age, social deprivation and disease severity (odds ratio, 2.048; 95% CI, 1.085–3.866; P = 0.027) using binary logistic regression. After adjusting for age, disease severity, social deprivation and smoking status, malnutrition remained a significant predictor of 1-year mortality [odds ratio (medium + high risk versus low risk), 2.161; 95% CI, 1.021–4.573; P = 0.044], whereas smoking status did not (odds ratio for smokers versus ex-smokers + nonsmokers, 1.968; 95% CI, 0.788–4.913; P = 0.147). Discussion: This study highlights the potential importance of combining nutritional support with smoking cessation to treat malnutrition. The close association between smoking status and malnutrition risk in COPD suggests that smoking is an important consideration in the nutritional management of malnourished COPD outpatients. Conclusions: Smoking status in COPD outpatients is a significant independent risk factor for malnutrition and a weaker (non-significant) predictor of 1-year mortality. Malnutrition significantly predicted 1-year mortality. References: Cochrane, W.J. & Afolabi, O.A. (2004) Investigation into the nutritional status, dietary intake and smoking habits of patients with chronic obstructive pulmonary disease. J. Hum. Nutr. Diet. 17, 3–11. Collins, P.F., Stratton, R.J., Kurukulaaratchy, R., Warwick, H., Cawood, A.L. & Elia, M. (2010) ‘MUST’ predicts 1-year survival in outpatients with chronic obstructive pulmonary disease. Clin. Nutr. 5, 17. Elia, M. (Ed) (2003) The ‘MUST’ Report. BAPEN. http://www.bapen.org.uk (accessed 30 March 2011). Noble, M., McLennan, D., Wilkinson, K., Whitworth, A. & Barnes, H. (2008) The English Indices of Deprivation 2007. http://www.communities.gov.uk (accessed 30 March 2011).
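For readers unfamiliar with how an unadjusted odds ratio and its confidence interval are derived, here is a minimal sketch using Woolf's method. The 2x2 counts are reconstructed approximately from the percentages reported above, not taken from the study's data:

```python
import math

# Sketch of an unadjusted odds ratio with a Woolf 95% CI. The 2x2 counts are
# reconstructed approximately from the reported percentages (about 19% of the
# 401 screened outpatients were current smokers, 32% of whom were at risk,
# versus 19% of ex/non-smokers); they are illustrative, not the exact data.

a, b = 24, 52     # current smokers: at risk / not at risk
c, d = 62, 263    # ex/non-smokers:  at risk / not at risk

odds_ratio = (a / b) / (c / d)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)        # Woolf's method
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# ~ OR = 1.96 (95% CI 1.12 to 3.42), close to the unadjusted values reported
```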

Relevance: 30.00%

Abstract:

Maize streak virus (MSV; family Geminiviridae, genus Mastrevirus), the causal agent of maize streak disease, ranks amongst the most serious biological threats to food security in sub-Saharan Africa. Although five distinct MSV strains have been described to date, only one of these, MSV-A, causes severe disease in maize. Primarily because they pose no obvious threat to agriculture, very little is known about the 'grass-adapted' MSV strains MSV-B, -C, -D and -E. Since comparing the genetic diversities, geographical distributions and natural host ranges of MSV-A with those of the other MSV strains could provide valuable information on the epidemiology, evolution and emergence of MSV-A, we carried out a phylogeographical analysis of MSVs found in uncultivated indigenous African grasses. Amongst the 83 new MSV genomes presented here, we report the discovery of six new MSV strains (MSV-F to -K). The non-random recombination breakpoint distributions detectable with these and other available mastrevirus sequences partially mirror those seen in begomoviruses, implying that the forces shaping these breakpoint patterns have been largely conserved since the earliest geminivirus ancestors. We present evidence that the ancestor of all MSV-A variants was the recombinant progeny of ancestral MSV-B and MSV-G/-F variants. While it remains unknown whether recombination influenced the emergence of MSV-A in maize, our discovery that MSV-A variants may both move between and become established in different regions of Africa with greater ease, and infect more grass species, than other MSV strains goes some way towards explaining why MSV-A is such a successful maize pathogen. © 2008 SGM.

Relevance: 30.00%

Abstract:

Circoviruses lack an autonomous DNA polymerase and are dependent on the replication machinery of the host cell for de novo DNA synthesis. Accordingly, the viral DNA needs to cross both the plasma membrane and the nuclear envelope before replication can occur. Here we report on the subcellular distribution of the beak and feather disease virus (BFDV) capsid protein (CP) and replication-associated protein (Rep) expressed via recombinant baculoviruses in an insect cell system and test the hypothesis that the CP is responsible for transporting the viral genome, as well as Rep, across the nuclear envelope. The intracellular localization of the BFDV CP was found to be directed by three partially overlapping bipartite nuclear localization signals (NLSs) situated between residues 16 and 56 at the N terminus of the protein. Moreover, a DNA binding region was also mapped to the N terminus of the protein and falls within the region containing the three putative NLSs. The ability of CP to bind DNA, coupled with the karyophilic nature of this protein, strongly suggests that it may be responsible for nuclear targeting of the viral genome. Interestingly, whereas Rep expressed on its own in insect cells is restricted to the cytoplasm, coexpression with CP alters the subcellular localization of Rep to the nucleus, strongly suggesting that an interaction with CP facilitates movement of Rep into the nucleus. Copyright © 2006, American Society for Microbiology. All Rights Reserved.

Relevance: 30.00%

Abstract:

Background: Maize streak virus strain A (MSV-A; genus Mastrevirus, family Geminiviridae), the maize-adapted strain of MSV that causes maize streak disease throughout sub-Saharan Africa, probably arose between 100 and 200 years ago via homologous recombination between two MSV strains adapted to wild grasses. MSV recombination experiments and analyses of natural MSV recombination patterns have revealed that this recombination event entailed the exchange of the movement protein–coat protein gene cassette, bounded by the two genomic regions most prone to recombination in mastrevirus genomes: the first surrounding the virion-strand origin of replication, and the second around the interface between the coat protein gene and the short intergenic region. Therefore, aside from the likely adaptive advantages presented by a modular exchange of this cassette, these specific breakpoints may have been largely predetermined by the underlying mechanisms of mastrevirus recombination. To investigate this hypothesis, we constructed artificial, low-fitness, reciprocal chimaeric MSV genomes using alternating genomic segments from two MSV strains: a grass-adapted MSV-B and a maize-adapted MSV-A. Between them, each pair of reciprocal chimaeric genomes represented all of the genetic material required to reconstruct, via recombination, the highly maize-adapted MSV-A genotype MSV-MatA. We then co-infected a selection of differentially MSV-resistant maize genotypes with pairs of reciprocal chimaeras to determine the efficiency with which recombination would give rise to high-fitness progeny genomes resembling MSV-MatA. Results: Recombinants resembling MSV-MatA invariably arose in all of our experiments. However, the accuracy and efficiency with which the MSV-MatA genotype was recovered across all replicates of each experiment depended on the MSV susceptibility of the maize genotypes used and the precise positions, relative to known recombination hotspots, of the breakpoints required to re-create MSV-MatA. Although the MSV-sensitive maize genotype gave rise to the greatest variety of recombinants, the measured fitness of each of these recombinants correlated with their similarity to MSV-MatA. Conclusions: The mechanistic predispositions of different MSV genomic regions to recombination can strongly influence the accessibility of high-fitness MSV recombinants. The frequency with which the fittest recombinant MSV genomes arise also correlates directly with the escalating selection pressures imposed by increasingly MSV-resistant maize hosts.

Relevance: 30.00%

Abstract:

Currently there is confusion about the value of using nutritional support to treat malnutrition and improve functional outcomes in chronic obstructive pulmonary disease (COPD). This systematic review and meta-analysis of randomised controlled trials (RCTs) aimed to clarify the effectiveness of nutritional support in improving functional outcomes in COPD. A systematic review identified 12 RCTs (n = 448) in stable COPD patients investigating the effects of nutritional support [dietary advice (1 RCT), oral nutritional supplements (ONS; 10 RCTs), enteral tube feeding (1 RCT)] versus control on functional outcomes. Meta-analysis of the changes induced by intervention found that whilst respiratory function (FEV1, lung capacity, blood gases) was unresponsive to nutritional support, both inspiratory and expiratory muscle strength (PImax +3.86 SE 1.89 cm H2O, P = 0.041; PEmax +11.85 SE 5.54 cm H2O, P = 0.032) and handgrip strength (+1.35 SE 0.69 kg, P = 0.05) were significantly improved, and associated with weight gains of ≥ 2 kg. Nutritional support produced significant improvements in quality of life in some trials, although meta-analysis was not possible. It also led to improved exercise performance and enhancement of exercise rehabilitation programmes. This systematic review and meta-analysis demonstrates that nutritional support in COPD results in significant improvements in a number of clinically relevant functional outcomes, complementing a previous review showing improvements in nutritional intake and weight.
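As a sketch of the pooling step behind estimates like the PImax and handgrip figures above, here is fixed-effect inverse-variance meta-analysis. The per-trial mean differences are invented for illustration; only the method, not the numbers, mirrors the review:

```python
import math

# Fixed-effect inverse-variance pooling, the standard way mean differences
# such as those above are combined across RCTs. Per-trial values are invented
# for illustration; only the method mirrors a meta-analysis like this one.

trials = [        # (mean difference in handgrip strength, kg; standard error)
    (1.8, 1.1),
    (0.9, 1.2),
    (1.5, 1.4),
]

weights = [1.0 / se**2 for _, se in trials]
pooled = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
z = pooled / pooled_se

print(f"pooled MD = {pooled:.2f} kg (SE {pooled_se:.2f}), z = {z:.2f}")
```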

Relevance: 30.00%

Abstract:

Sclerotinia sclerotiorum is a necrotrophic ascomycete fungus with an extremely broad host range. This pathogen produces the non-specific phytotoxin and key pathogenicity factor oxalic acid (OA). Our recent work indicated that this fungus, and more specifically OA, can induce apoptotic-like programmed cell death (PCD) in plant hosts; this induction of PCD and disease requires generation of reactive oxygen species (ROS) in the host, a process triggered by fungus-secreted OA. Conversely, during the initial stages of infection, OA also dampens the plant oxidative burst, an early host response generally associated with plant defense. This scenario presents a challenge regarding the mechanistic details of OA function, as OA both suppresses and induces host ROS during the compatible interaction. In the present study we generated transgenic plants expressing a redox-regulated GFP reporter. Results show that initially, Sclerotinia (via OA) generates a reducing environment in host cells that suppresses host defense responses, including the oxidative burst and callose deposition, akin to compatible biotrophic pathogens. Once infection is established, however, this necrotroph induces the generation of plant ROS, leading to PCD of host tissue, the result of which is of direct benefit to the pathogen. In contrast, a non-pathogenic OA-deficient mutant failed to alter host redox status. The mutant produced hypersensitive response-like features following host inoculation, including ROS induction, callose formation, restricted growth and cell death. These results indicate active recognition of the mutant and further point to suppression of defenses by the wild-type necrotrophic fungus. Chemical reduction of host cells with dithiothreitol (DTT) or potassium oxalate (KOA) restored the ability of this mutant to cause disease. Thus, Sclerotinia uses a novel strategy involving regulation of host redox status to establish infection. These results address a long-standing issue involving the ability of OA to both inhibit and promote ROS to achieve pathogenic success.

Relevance: 30.00%

Abstract:

Selenium (Se) is an essential trace element, and the clinical consequences of Se deficiency have been well documented. Se is primarily obtained through the diet, and recent studies have suggested that the level of Se in Australian foods is declining. Currently there are limited data on the Se status of the Australian population, so the aim of this study was to determine the plasma concentration of Se and glutathione peroxidase (GSH-Px), a well-established biomarker of Se status. Furthermore, the effect of gender, age and the presence of cardiovascular disease (CVD) was also examined. Blood plasma samples from healthy subjects (140 samples; mean age 54 years; range, 20–86 years) and CVD patients (112 samples; mean age 67 years; range, 40–87 years) were analysed for Se concentration and GSH-Px activity. The results revealed that the healthy Australian cohort had a mean plasma Se level of 100.2 ± 1.3 µg Se/L and a mean GSH-Px activity of 108.8 ± 1.7 U/L. Although the mean value for plasma Se reached the level required for optimal GSH-Px activity (i.e. 100 µg Se/L), 47% of the healthy individuals tested fell below this level. Further evaluation revealed that certain age groups were more at risk of a lowered Se status, in particular the oldest age group of over 81 years (females, 97.6 ± 6.1 µg Se/L; males, 89.4 ± 3.8 µg Se/L). The difference in Se status between males and females was not found to be significant. The presence of CVD did not appear to influence Se status, with the exception of the over-81 age group, which showed a trend for a further decline in Se status with disease (plasma Se, 93.5 ± 3.6 µg Se/L for healthy versus 88.2 ± 5.3 µg Se/L for CVD; plasma GSH-Px, 98.3 ± 3.9 U/L for healthy versus 87.0 ± 6.5 U/L for CVD). These findings emphasise the importance of an adequate dietary intake of Se for the maintenance of a healthy ageing population, especially in terms of cardiovascular health.

Relevance: 30.00%

Abstract:

BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between group difference in the CRBSI rate (clinically-indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically-indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically-indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically-indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
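A minimal sketch of how a risk ratio and its 95% CI are computed from event counts like the phlebitis figures above. The review pooled across trials (Mantel-Haenszel), so its RR of 1.14 differs slightly from this crude single-table calculation:

```python
import math

# Crude risk ratio with a 95% CI from the phlebitis counts quoted above.
# The review's RR of 1.14 was pooled across trials (Mantel-Haenszel), so
# this single-table calculation is an approximation of that result.

e1, n1 = 186, 2365    # events / patients, clinically-indicated removal
e2, n2 = 166, 2441    # events / patients, routine 3-day change

rr = (e1 / n1) / (e2 / n2)
se_log_rr = math.sqrt(1/e1 - 1/n1 + 1/e2 - 1/n2)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")   # ~ RR 1.16 (0.95-1.41)
```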

Relevance: 30.00%

Abstract:

Chlamydia pneumoniae commonly causes respiratory tract infections in children, and epidemiological investigations strongly link infection to the pathogenesis of asthma. The immune system in early life is immature and may not respond appropriately to pathogens. Toll-like receptors (TLR)2 and 4 are regarded as the primary pattern recognition receptors that sense bacteria; however, their contribution to innate and adaptive immunity in early life remains poorly defined. We investigated the role of TLR2 and 4 in the induction of immune responses to Chlamydia muridarum respiratory infection in neonatal wild-type (Wt) or TLR2-deficient (−/−), 4−/− or 2/4−/− BALB/c mice. Wt mice had moderate disease and infection. TLR2−/− mice had more severe disease and more intense and prolonged infection compared to other groups. TLR4−/− mice were asymptomatic. TLR2/4−/− mice had severe early disease and persistent infection, which resolved thereafter, consistent with the absence of symptoms in TLR4−/− mice. Wt mice mounted robust innate and adaptive responses, with an influx of natural killer (NK) cells, neutrophils, myeloid (mDCs) and plasmacytoid (pDCs) dendritic cells, and activated CD4+ and CD8+ T-cells into the lungs. Wt mice also had effective production of interferon (IFN)γ in the lymph nodes and lung, and proliferation of lymph node T-cells. TLR2−/− mice had more intense and persistent innate (particularly neutrophil) and adaptive cell responses and IL-17 expression in the lung; however, IFNγ responses and T-cell proliferation were reduced. TLR2/4−/− mice had reduced innate and adaptive responses. Most importantly, neutrophil phagocytosis was impaired in the absence of TLR2. Thus, TLR2 expression, particularly on neutrophils, is required for effective control of Chlamydia respiratory infection in early life. Loss of control of infection leads to enhanced but ineffective TLR4-mediated inflammatory responses that prolong disease symptoms. This indicates that TLR2 agonists may be beneficial in the treatment of early life Chlamydia infections and associated diseases.

Relevance: 30.00%

Abstract:

Background: Phase III studies suggest that non-small-cell lung cancer (NSCLC) patients treated with cisplatin-docetaxel may have higher response rates and better survival compared with other platinum-based regimens. We report the final results of a randomised phase III study of docetaxel and carboplatin versus MIC or MVP in patients with advanced NSCLC. Patients and methods: Patients with biopsy-proven stage III-IV NSCLC not suitable for curative surgery or radiotherapy were randomised to receive four cycles of either DCb (docetaxel 75 mg/m², carboplatin AUC 6) or MIC/MVP (mitomycin 6 mg/m², ifosfamide 3 g/m² and cisplatin 50 mg/m², or mitomycin 6 mg/m², vinblastine 6 mg/m² and cisplatin 50 mg/m², respectively), every 3 weeks. The primary end point was survival; secondary end points included response rates, toxicity and quality of life. Results: The median follow-up was 17.4 months. The overall response rate was 32% for both arms (partial response, 31%; complete response, 1%); 32% of MIC/MVP and 26% of DCb patients had stable disease. One-year survival was 39% and 35% for DCb and MIC/MVP, respectively. Two-year survival was 13% in both arms. Grade 3/4 neutropenia (74% versus 43%, P < 0.005), infection (18% versus 9%, P = 0.01) and mucositis (5% versus 1%, P = 0.02) were more common with DCb than with MIC/MVP. The MIC/MVP arm had significant worsening in overall EORTC score and global health status, whereas the DCb arm showed no significant change. Conclusions: The combination of DCb had similar efficacy to MIC/MVP, but quality of life was better maintained. © 2006 European Society for Medical Oncology.

Relevance: 30.00%

Abstract:

BACKGROUND. The authors compared gemcitabine and carboplatin (GC) with mitomycin, ifosfamide, and cisplatin (MIC) or mitomycin, vinblastine, and cisplatin (MVP) in patients with advanced nonsmall cell lung carcinoma (NSCLC). The primary objective was survival. Secondary objectives were time to disease progression, response rates, evaluation of toxicity, disease-related symptoms, World Health Organization performance status (PS), and quality of life (QoL). METHODS. Three hundred seventy-two chemotherapy-naïve patients with International Staging System Stage III/IV NSCLC who were ineligible for curative radiotherapy or surgery were randomized to receive either 4 cycles of gemcitabine (1000 mg/m² on Days 1, 8, and 15) plus carboplatin (area under the serum concentration-time curve, 5; given on Day 1) every 4 weeks (the GC arm) or MIC/MVP every 3 weeks (the MIC/MVP arm). RESULTS. There was no significant difference in median survival (248 days in the MIC/MVP arm vs. 236 days in the GC arm) or time to progression (225 days in the MIC/MVP arm vs. 218 days in the GC arm) between the 2 treatment arms. The 2-year survival rate was 11.8% in the MIC/MVP arm and 6.9% in the GC arm. The 1-year survival rate was 32.5% in the MIC/MVP arm and 33.2% in the GC arm. In the MIC/MVP arm, 33% of patients responded (4 complete responses [CRs] and 57 partial responses [PRs]), whereas in the GC arm, 30% of patients responded (3 CRs and 54 PRs). Nonhematologic toxicity was comparable for patients with Grade 3-4 symptoms, except there was more alopecia among patients in the MIC/MVP arm. GC appeared to produce more hematologic toxicity and necessitated more transfusions. There was no difference in performance status, disease-related symptoms, or QoL between patients in the two treatment arms. Fewer inpatient stays for complications were required with GC. CONCLUSIONS. The results of the current study failed to demonstrate any difference in efficacy between the newer regimen of GC and the older regimens of MIC and MVP. © 2003 American Cancer Society.

Relevance: 30.00%

Abstract:

INTRODUCTION: Performance status (PS) 2 patients with non-small cell lung cancer (NSCLC) experience more toxicity, lower response rates, and shorter survival times than healthier patients treated with standard chemotherapy. Paclitaxel poliglumex (PPX), a macromolecule drug conjugate of paclitaxel and polyglutamic acid, reduces systemic exposure to peak concentrations of free paclitaxel and may lead to increased concentrations in tumors due to enhanced vascular permeability. METHODS: Chemotherapy-naive PS 2 patients with advanced NSCLC were randomized to receive carboplatin (area under the curve = 6) and either PPX (210 mg/m² over 10 min without routine steroid premedication) or paclitaxel (225 mg/m² over 3 h with standard premedication) every 3 weeks. The primary end point was overall survival. RESULTS: A total of 400 patients were enrolled. Alopecia, arthralgias/myalgias, and cardiac events were significantly less frequent with PPX/carboplatin, whereas grade ≥3 neutropenia and grade 3 neuropathy showed a trend toward worsening. There was no significant difference in the incidence of hypersensitivity reactions despite the absence of routine premedication in the PPX arm. Overall survival was similar between treatment arms (hazard ratio, 0.97; log-rank p = 0.769). Median and 1-year survival rates were 7.9 months and 31% for PPX versus 8.0 months and 31% for paclitaxel. Disease control rates were 64% and 69% for PPX and paclitaxel, respectively. Time to progression was similar: 3.9 months for PPX/carboplatin versus 4.6 months for paclitaxel/carboplatin (p = 0.210). CONCLUSION: PPX/carboplatin failed to provide superior survival compared with paclitaxel/carboplatin in the first-line treatment of PS 2 patients with NSCLC, but the results with respect to progression-free survival and overall survival were comparable, and the PPX regimen was more convenient. © 2008 International Association for the Study of Lung Cancer.
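As a hedged sketch of the kind of two-arm comparison behind the reported log-rank p value, here is a Kaplan-Meier and log-rank analysis using the lifelines library on simulated data; the survival times, censoring pattern, and arm sizes below are invented:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Simulated two-arm survival comparison of the kind reported above (hazard
# ratio near 1, log-rank p value). Survival times here are invented and
# uncensored for simplicity; only the method is illustrated, not the trial.

rng = np.random.default_rng(1)
n = 200                                          # patients per arm (invented)
t_ppx = rng.exponential(scale=7.9, size=n)       # months, PPX/carboplatin arm
t_pac = rng.exponential(scale=8.0, size=n)       # months, paclitaxel arm
observed = np.ones(n, dtype=bool)                # all events observed

km = KaplanMeierFitter()
km.fit(t_ppx, event_observed=observed, label="PPX/carboplatin")
print(f"median survival (PPX arm): {km.median_survival_time_:.1f} months")

result = logrank_test(t_ppx, t_pac,
                      event_observed_A=observed, event_observed_B=observed)
print(f"log-rank p = {result.p_value:.3f}")
```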