Abstract:
Acellular dermal matrices (ADM) are commonly used in reconstructive procedures and rely on host cell invasion to become incorporated into host tissues. We investigated different approaches to adipose-derived stem cell (ASC) engraftment into ADM to enhance this process. Lewis rat adipose-derived stem cells were isolated and grafted (3.0 × 10^5 cells) onto porcine ADM disks (1.5 mm thick × 6 mm diameter) using either passive onlay or interstitial injection seeding techniques. Following incubation, seeding efficiency and seeded-cell viability were measured in vitro. In addition, eighteen Lewis rats underwent subcutaneous placement of ADM disks, either unseeded as controls or seeded with PKH67-labeled ASCs using the onlay or injection technique. On day 7 or 14, ADM disks were harvested and analyzed for host cell infiltration. Onlay and injection techniques resulted in distinct seeding patterns; however, cell-seeding efficiency and cell viability were similar. In vivo studies showed significantly increased host cell infiltration toward the ASC foci following injection seeding in comparison with the control group (p < 0.05). Moreover, regional endothelial cell invasion was significantly greater in ASC-injected grafts than in onlay-seeded grafts (p < 0.05). ADM can successfully be engrafted with ASCs. Interstitial engraftment of ASCs into ADM via injection enhances regional infiltration of host cells and angiogenesis, whereas onlay seeding produced relatively broad and superficial cell infiltration. These findings may be applied to improve the incorporation of avascular engineered constructs.
Abstract:
Mesenchymal stem cells (MSCs) and endothelial progenitor cells (EPCs) represent promising cell sources for angiogenic therapies. There are, however, conflicting reports regarding the ability of MSCs to support network formation by endothelial cells. The goal of this study was to assess the ability of human bone marrow-derived MSCs to support network formation by endothelial outgrowth cells (EOCs) derived from umbilical cord blood EPCs. We hypothesized that upon in vitro coculture, MSCs and EOCs create a microenvironment conducive to EOC network formation without the addition of angiogenic growth supplements. EOC networks formed by coculture with MSCs underwent regression and cell loss by day 10, with nearly 4-fold and 2-fold reductions in branch points and mean segment length, respectively, in comparison with networks formed in vascular smooth muscle cell (SMC) cocultures. EOC network regression in MSC cocultures was not caused by a lack of vascular endothelial growth factor (VEGF)-A or by changes in TGF-β1 or Ang-2 supernatant concentrations relative to SMC cocultures. Removal of CD45+ cells from MSCs improved EOC network formation, with a 2-fold increase in total segment length and number of branch points compared with unsorted MSCs by day 6. These improvements, however, were not sustained by day 10. CD45 expression in MSC cocultures correlated with EOC network regression, increasing 5-fold between day 6 and day 10 of culture. The addition of supplemental factors (VEGF, fibroblast growth factor-2, EGF, hydrocortisone, insulin-like growth factor-1, ascorbic acid, and heparin) to MSC cocultures promoted stable EOC network formation over 2 weeks in vitro without affecting CD45 expression, as evidenced by a lack of significant differences in total segment length (p = 0.96). These findings demonstrate that the ability of MSCs to support EOC network formation correlates with the removal of CD45+ cells and improves with the addition of soluble growth factors.
Abstract:
CD133 is one of the most common stem cell markers, and functional single nucleotide polymorphisms (SNPs) of CD133 may modulate its gene functions and thus cancer risk and patient survival. We hypothesized that potentially functional CD133 SNPs are associated with gastric cancer (GC) risk and survival. To test this hypothesis, we conducted a case-control study of 371 GC patients and 313 cancer-free controls frequency-matched by age, sex, and ethnicity. We genotyped four selected, potentially functional CD133 SNPs (rs2240688A>C, rs7686732C>G, rs10022537T>A, and rs3130C>T) and used logistic regression analysis to assess associations of these SNPs with GC risk, and Cox proportional hazards regression analysis to assess survival. We found that, compared with the AA genotype of the miRNA-binding-site SNP rs2240688, the AC + CC genotypes were associated with significantly increased GC risk (adjusted OR = 1.52, 95% CI = 1.09-2.13); for rs3130C>T, another miRNA-binding-site SNP, the TT genotype was associated with significantly reduced GC risk (adjusted OR = 0.68, 95% CI = 0.48-0.97) compared with the CC + CT genotypes. In all patients, the rs3130 TT risk genotype was significantly associated with overall survival (OS) (adjusted P(trend) = 0.016 and 0.007 under additive and recessive models, respectively). These findings suggest that these two CD133 miRNA-binding-site variants, rs2240688 and rs3130, may be potential biomarkers of genetic susceptibility to GC and possible predictors of survival in GC patients, but they require further validation in larger studies.
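As an aside for readers unfamiliar with case-control statistics, the sketch below shows how an unadjusted odds ratio and its Wald 95% CI are computed from a 2 × 2 genotype table. The ORs reported above are covariate-adjusted estimates from logistic regression, so this is only a simplified illustration, and the counts used are hypothetical.

```python
# Illustrative only: unadjusted odds ratio with a Wald 95% CI for a
# case-control genotype comparison. The abstract's ORs are adjusted via
# logistic regression; the counts below are hypothetical.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a = variant-carrier cases, b = variant-carrier controls,
    c = reference-genotype cases, d = reference-genotype controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical AC+CC vs AA counts among 371 cases and 313 controls
or_, lo, hi = odds_ratio_ci(a=160, b=110, c=211, d=203)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```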
Abstract:
Emerging evidence suggests that microRNAs can initiate asymmetric division, but whether microRNA and protein cell fate determinants coordinate with each other remains unclear. Here, we show that miR-34a directly suppresses Numb in early-stage colon cancer stem cells (CCSCs), forming an incoherent feedforward loop (IFFL) targeting Notch that robustly separates stem and non-stem cell fates. Perturbation of the IFFL leads to a new intermediate cell population with a plastic and ambiguous identity. Lgr5+ mouse intestinal/colon stem cells (ISCs) predominantly undergo symmetric division but switch on asymmetric division to curb ISC numbers when a proinflammatory response causes excessive proliferation. Deletion of miR-34a inhibits asymmetric division and exacerbates Lgr5+ ISC proliferation under such stress. Collectively, our data indicate that microRNA and protein cell fate determinants coordinate to enhance the robustness of cell fate decisions, providing a safeguard mechanism against stem cell proliferation induced by inflammation or oncogenic mutation.
Abstract:
One way we keep track of our movements is by monitoring corollary discharges, or internal copies of movement commands. This study tested the hypothesis that the pathway from superior colliculus (SC) to mediodorsal thalamus (MD) to frontal eye field (FEF) carries a corollary discharge about saccades made into the contralateral visual field. We inactivated the MD relay node with muscimol in monkeys and measured corollary discharge deficits using a double-step task: two sequential saccades were made to the locations of briefly flashed targets. To make second saccades correctly, monkeys had to internally monitor their first saccades; therefore, deficits in the corollary discharge representation of first saccades should disrupt second saccades. We found, first, that monkeys seemed to misjudge the amplitudes of their first saccades; this was revealed by systematic shifts in second saccade end points. Thus corollary discharge accuracy was impaired. Second, monkeys were less able to detect trial-by-trial variations in their first saccades; this was revealed by reduced compensatory changes in second saccade angles. Thus corollary discharge precision also was impaired. Both deficits occurred only when first saccades went into the contralateral visual field. Single-saccade generation was unaffected. Additional deficits occurred in reaction time and overall performance, but these were bilateral. We conclude that the SC-MD-FEF pathway conveys a corollary discharge used for coordinating sequential saccades and possibly for stabilizing vision across saccades. This pathway is the first elucidated in what may be a multilevel chain of corollary discharge circuits extending from the extraocular motoneurons up into cerebral cortex.
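The double-step logic above lends itself to a small worked example. The sketch below is a toy model, not the authors' analysis; the gain parameter g is hypothetical. It shows that if the corollary discharge underestimates the first saccade by a factor g < 1, the second saccade endpoint shifts by (1 − g) × S1, the kind of systematic endpoint shift described above.

```python
# Toy model of the double-step task: not the authors' analysis.
# Assumption: the corollary-discharge estimate of the first saccade S1
# is scaled by a hypothetical gain g; g < 1 means underestimation.
import numpy as np

T1 = np.array([10.0, 0.0])   # first flashed target (deg)
T2 = np.array([10.0, 10.0])  # second flashed target (deg)

def second_saccade_endpoint(g):
    S1 = T1                      # assume the first saccade lands on T1
    eye_estimate = g * S1        # internally monitored eye displacement
    planned = T2 - eye_estimate  # second saccade planned from the estimate
    return S1 + planned          # actual endpoint in screen coordinates

for g in (1.0, 0.7):
    endpoint = second_saccade_endpoint(g)
    print(f"gain {g}: endpoint {endpoint}, shift from T2 {endpoint - T2}")
```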
Abstract:
Busulfan, cyclophosphamide, and etoposide (BuCyE) is a commonly used conditioning regimen for autologous stem cell transplantation (ASCT). This multicenter, phase II study examined the safety and efficacy of BuCyE with individually adjusted busulfan dosing based on preconditioning pharmacokinetics. The study initially enrolled Hodgkin lymphoma (HL) and non-Hodgkin lymphoma (NHL) patients ages 18 to 80 years but was amended because of high early treatment-related mortality (TRM) in patients > 65 years. BuCyE outcomes were compared with those of contemporaneous recipients of carmustine, etoposide, cytarabine, and melphalan (BEAM) from the Center for International Blood and Marrow Transplant Research. Two hundred seven subjects with HL (n = 66) or NHL (n = 141) were enrolled from 32 centers in North America, and 203 underwent ASCT. Day 100 TRM for all subjects (n = 203), patients > 65 years (n = 17), and patients ≤ 65 years (n = 186) was 4.5%, 23.5%, and 2.7%, respectively. The estimated rates of 2-year progression-free survival (PFS) were 33% for HL and 58%, 77%, and 43% for diffuse large B cell lymphoma (DLBCL; n = 63), mantle cell lymphoma (MCL; n = 29), and follicular lymphoma (FL; n = 23), respectively. The estimated rates of 2-year overall survival (OS) were 76% for HL and 65%, 89%, and 89% for DLBCL, MCL, and FL, respectively. In the matched analysis, rates of 2-year TRM were 3.3% for BuCyE and 3.9% for BEAM, and there were no differences in outcomes for NHL. Patients with HL had lower rates of 2-year PFS with BuCyE, 33% (95% CI, 21% to 46%), than with BEAM, 59% (95% CI, 52% to 66%), with no differences in TRM or OS. BuCyE provided adequate disease control and safety in B cell NHL patients ≤ 65 years but produced worse PFS in HL patients when compared with BEAM.
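For context, 2-year PFS and OS figures like those above are typically Kaplan-Meier estimates from censored follow-up data. A minimal sketch of how such an estimate is produced, using synthetic data and assuming the lifelines Python package is available:

```python
# Minimal Kaplan-Meier sketch with synthetic follow-up data;
# assumes the `lifelines` package. Not the study's actual data.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
n = 200
event_time = rng.exponential(scale=30.0, size=n)  # months to progression
censor_time = rng.uniform(12.0, 48.0, size=n)     # administrative censoring
observed = event_time <= censor_time
durations = np.minimum(event_time, censor_time)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
print(f"Estimated 2-year PFS: {kmf.predict(24.0):.0%}")
```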
The tithe: Public research university STEM faculty perspectives on sponsored research indirect costs
Abstract:
This study sought to understand the phenomenon of faculty involvement in indirect cost under-recovery. The focus of the study was on public research university STEM (science, technology, engineering and mathematics) faculty and their perspectives on, and behavior toward, a higher education fiscal policy. The explanatory scheme was derived from anthropological theory and incorporated organizational culture, faculty socialization, and political bargaining models in the conceptual framework. This study drew on two key assumptions. The first was that faculty understanding of, and behavior toward, indirect cost recovery represent values, beliefs, and choices drawn from the distinct professional socialization and distinct culture of faculty. The second was that when faculty and institutional administrators are in conflict over indirect cost recovery, the resultant formal administrative decision comes about through political bargaining over critical resources. The research design was a single-site, qualitative case study focused on learning the meaning of the phenomenon as understood by the informants. The informants were tenured and tenure-track research university faculty in the STEM fields who were highly successful at obtaining Federal sponsored research funds, with individual sponsored research portfolios of at least one million dollars. The data consisted of 11 informant interviews, bolstered by documentary evidence. The findings indicated that faculty socialization and organizational culture were the dominant themes, while political bargaining emerged as significantly less prominent. Public research university STEM faculty are most concerned about the survival of their research programs and the discovery those programs facilitate, and they resort to conjecture when confronted by the issue of indirect cost recovery. The findings suggest that institutional administrators place less emphasis on compliance and hierarchy when working with expert professionals such as science faculty; a more effective focus might be on communication and clarity in budget processes and organizational decision-making, together with critical administrative support that can relieve faculty administrative burdens. For higher education researchers, the findings suggest a need for more sophisticated models of organizations dependent on expert professionals.
Abstract:
The relative resistance of 15 winter barley, three winter wheat and three winter oat cultivars on the 2003 UK recommended list, and of two spring wheat cultivars on the 2003 Irish recommended list, was evaluated using Microdochium nivale in detached leaf assays to further understand components of partial disease resistance (PDR) and Fusarium head blight (FHB) resistance across cereal species. Barley cultivars showed incubation periods comparable to, and latent periods longer than, those of the most FHB-resistant Irish and UK wheat cultivars evaluated. In addition, lesions on barley differed from those on wheat: they were not visibly chlorotic when placed over a light box until sporulation occurred, in contrast to wheat cultivars, where chlorosis of the infected area appeared when lesions first developed. The pattern of delayed chlorosis of infected leaf tissue and longer latent periods indicates that resistances are expressed in barley after the end of the incubation period, and that these temporarily arrest the development of mycelium and sporulation. Incubation periods were longer for oats than for barley or wheat cultivars. However, oat cultivars differed from both wheat and barley in that mycelial growth was observed before obvious tissue damage was detected on macroscopic examination, indicating tolerance of infection rather than inhibition of pathogen development; the morphology of sporodochia also differed, appearing less well developed and being much less abundant. Longer latent periods have previously been related to greater FHB resistance in wheat. The present results suggest that the longer latent periods of barley and oat cultivars, relative to wheat, are likely to play a role in overall FHB resistance if under the same genetic control as PDR components expressed in the head. However, the limited range of incubation and latent periods observed within the barley and oat cultivars evaluated contrasted with wheat, where incubation and latent periods were shorter and more variable among genotypes. The significance of the various combinations of PDR components detected in the detached leaf assay as components of FHB resistance in each crop requires further investigation, particularly with regard to the apparent tolerance of infection in oats and the necrosis in barley that follows the incubation period and is associated with retardation of mycelial growth and sporulation.
Abstract:
Components of partial disease resistance (PDR) to fusarium head blight (FHB), detected in a seed-germination assay, were compared with whole-plant FHB resistance of 30 USA soft red winter wheat entries in the 2002 Uniform Southern FHB Nursery. Highly significant (P < 0·001) differences between cultivars in the in vitro seed-germination assay inoculated with Microdochium majus were correlated with FHB disease incidence (r = -0·41; P < 0·05), severity (r = -0·47; P < 0·01), FHB index (r = -0·46; P < 0·01), damaged kernels (r = -0·52; P < 0·01), grain deoxynivalenol (DON) concentration (r = -0·40; P < 0·05) and the incidence/severity/kernel-damage index (ISK) (r = -0·45; P < 0·01) caused by Fusarium graminearum. Multiple linear regression explained a greater percentage of the variation in FHB resistance when the seed-germination assay and the previously reported detached-leaf assay PDR components were used together as explanatory factors. Shorter incubation periods, longer latent periods and shorter lesion lengths in the detached-leaf assay, and higher germination rates in the seed-germination assay, were related to greater FHB resistance across all disease variables, collectively explaining 62% of the variation for incidence, 49% for severity, 56% for F. graminearum-damaged kernels (FDK), 39% for DON and 59% for the ISK index. Incubation period was most strongly related to disease incidence and the early stages of infection, while resistance detected in the seed-germination assay and latent period were more strongly related to FHB disease severity. Resistance detected using the seed-germination assay was notable in that it was related to a greater decline in FDK, and a smaller reduction in DON, than would have been expected from the reduction in visually assessed FHB symptoms.
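A brief sketch of the two analyses named above, Pearson correlation of a single PDR component with a disease variable and multiple linear regression to estimate the variation explained (R²), using synthetic cultivar-level data in place of the nursery measurements:

```python
# Sketch of the correlation and multiple-regression analyses described
# above, on synthetic cultivar-level data (numpy only; values made up).
import numpy as np

rng = np.random.default_rng(1)
n = 30  # entries, matching the nursery size described above
germination = rng.uniform(0.3, 0.9, n)     # seed-germination assay rate
latent_period = rng.uniform(5.0, 12.0, n)  # days, detached-leaf assay
severity = 80 - 40*germination - 2*latent_period + rng.normal(0, 5, n)

# Pearson correlation of one PDR component with FHB severity
r = np.corrcoef(germination, severity)[0, 1]
print(f"r = {r:.2f}")

# Multiple linear regression: severity ~ germination + latent period
X = np.column_stack([np.ones(n), germination, latent_period])
beta, *_ = np.linalg.lstsq(X, severity, rcond=None)
resid = severity - X @ beta
r2 = 1 - resid.var() / severity.var()
print(f"R^2 = {r2:.2f}")
```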
Abstract:
Purpose: We investigated the potential for improvement in disease control by use of autologous peripheral blood stem cell transplant (PBSCT) to permit administration of high activities of 186Re-hydroxyethylidene diphosphonate (HEDP) in patients with progressive hormone-refractory prostate cancer (HRPC).
Methods: Eligible patients had progressive HRPC metastatic to bone, good performance status and minimal soft tissue disease. Patients received 5,000 MBq of 186Re-HEDP i.v., followed 14 days later by PBSCT. Response was assessed using PSA, survival, pain scores and quality of life.
Results: Thirty-eight patients with a median age of 67 years (range 50–77) and a median PSA of 57 ng/ml (range 4–3,628) received a median activity of 4,978 MBq 186Re-HEDP (range 4,770–5,100 MBq). The most serious toxicity was short-lived grade 3 thrombocytopenia in 8 (21%) patients. The median survival of the group was 21 months (95% CI 18–24 months), with Kaplan-Meier estimated 1- and 2-year survival rates of 83% and 40%, respectively. Thirty-one patients (81%, 95% CI 66–90%) had stable or reduced PSA levels 3 months after therapy, while 11 (29%, 95% CI 15–49%) had PSA reductions of >50% lasting >4 weeks. Quality-of-life measures were stable or improved in 27 (66%) at 3 months.
Conclusion: We have shown that it is feasible and safe to deliver high-activity radioisotope therapy with PBSCT to men with metastatic HRPC. Response rates and survival data are encouraging; however, further research is needed to define the optimal role of this treatment approach.
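As a closing note on the interval estimates quoted in the Results, the sketch below computes a Wilson score 95% CI for a response proportion such as 31/38. The paper's exact CI method is not stated, so the result may differ slightly from the reported 66–90%.

```python
# Wilson score 95% CI for a response proportion, e.g. the 31/38
# PSA-response rate above. The authors' CI method is not stated,
# so this may not reproduce their interval exactly.
import math

def wilson_ci(k, n, z=1.96):
    p = k / n
    denom = 1 + z*z/n
    center = (p + z*z/(2*n)) / denom
    half = z * math.sqrt(p*(1-p)/n + z*z/(4*n*n)) / denom
    return center - half, center + half

lo, hi = wilson_ci(31, 38)
print(f"31/38 = {31/38:.0%}, 95% CI {lo:.0%}-{hi:.0%}")
```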