897 results for "Critical number of particles"
Abstract:
Estimation of the number of mixture components (k) is an unsolved problem. Available methods for estimating k include bootstrapping the likelihood ratio test statistic and optimizing a variety of validity functionals such as AIC, BIC/MDL, and ICOMP. We investigate minimization of the distance between the fitted mixture model and the true density as a method for estimating k. The distances considered are Kullback-Leibler (KL) and L2. We estimate these distances using cross-validation. A reliable estimate of k is obtained by a vote among B estimates of k corresponding to B cross-validation estimates of distance. This estimation method with the KL distance is very similar to the Monte Carlo cross-validated likelihood methods discussed by Smyth (2000). Focusing on univariate normal mixtures, we present simulation studies that compare the cross-validated distance method with AIC, BIC/MDL, and ICOMP. We also apply the cross-validated distance approach, along with AIC, BIC/MDL, and ICOMP, to data from an osteoporosis drug trial in order to find groups that respond differentially to treatment.
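As a rough illustration of the procedure this abstract describes, the following sketch (my construction, not the authors' code; all names and parameter values are illustrative) selects k for a univariate normal mixture by cross-validated held-out log-likelihood, a KL-type criterion, and then takes a majority vote over B repeated splits.

    # Minimal sketch, not the authors' implementation: pick the number of
    # mixture components k by cross-validated held-out log-likelihood
    # (a KL-type criterion), then majority-vote over B repeated splits.
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.model_selection import KFold

    def cv_estimate_k(x, k_max=6, B=20, n_folds=5, seed=0):
        rng = np.random.default_rng(seed)
        x = np.asarray(x).reshape(-1, 1)
        votes = []
        for _ in range(B):
            kf = KFold(n_splits=n_folds, shuffle=True,
                       random_state=int(rng.integers(1 << 31)))
            scores = np.zeros(k_max)
            for train, test in kf.split(x):
                for k in range(1, k_max + 1):
                    gm = GaussianMixture(n_components=k, n_init=3,
                                         random_state=0).fit(x[train])
                    # Mean held-out log-likelihood; maximizing it minimizes
                    # the KL distance to the true density up to a constant.
                    scores[k - 1] += gm.score(x[test])
            votes.append(int(np.argmax(scores)) + 1)
        return np.bincount(votes).argmax()   # majority vote over B estimates

    # Example: a two-component univariate normal mixture.
    rng = np.random.default_rng(42)
    data = np.concatenate([rng.normal(0, 1, 200), rng.normal(4, 1, 200)])
    print(cv_estimate_k(data))               # typically prints 2

Maximizing held-out log-likelihood is equivalent, up to an additive constant, to minimizing the KL distance from the true density to the fitted mixture, which is why the vote over splits mirrors the voting scheme described above.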
Abstract:
Simulation-based assessment is a popular and frequently necessary approach to the evaluation of statistical procedures. Sometimes overlooked is the ability to take advantage of underlying mathematical relations, and we focus on this aspect. We show how to exploit large-sample theory when conducting a simulation, using the analysis of genomic data as a motivating example. The approach uses convergence results to approximate smaller-sample results that are otherwise available only by simulation. We consider evaluating and comparing a variety of ranking-based methods for identifying the most highly associated SNPs in a genome-wide association study, derive integral-equation representations of the pre-posterior distribution of percentiles produced by three ranking methods, and provide examples comparing performance. These results are of interest in their own right and set the framework for a more extensive set of comparisons.
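The interplay the abstract describes, simulation on one side and a limiting mathematical representation on the other, can be illustrated with a toy version of the SNP-ranking problem (my construction under simple normal assumptions, not the paper's derivation): simulate the percentile rank attained by one truly associated SNP among m null SNPs, then check the answer against an integral computed from the normal model.

    # Toy sketch under assumed normal models (not the paper's setup):
    # percentile rank of one associated SNP among m null SNPs when SNPs
    # are ranked by |z|-statistic, by simulation and by the integral
    # E[F(|Z|)] with F(t) = P(|N(0,1)| < t) and Z ~ N(mu, 1).
    import numpy as np
    from scipy.stats import norm

    m, mu, reps = 5_000, 3.0, 1_000
    rng = np.random.default_rng(1)

    # Simulation: fraction of null statistics the associated SNP beats.
    null_z = rng.standard_normal((reps, m))
    assoc_z = mu + rng.standard_normal(reps)
    perc = (np.abs(null_z) < np.abs(assoc_z)[:, None]).mean(axis=1)
    print("simulated mean percentile:", perc.mean())

    # Integral representation: the density of |Z| is phi(t-mu) + phi(t+mu).
    t = np.linspace(0.0, 10.0, 2001)
    F = 2.0 * norm.cdf(t) - 1.0
    w = norm.pdf(t - mu) + norm.pdf(t + mu)
    dt = t[1] - t[0]
    print("integral mean percentile: ", (F * w).sum() * dt)

The two printed values agree up to Monte Carlo error, which is the point of the abstract: the limiting integral stands in for simulations that would otherwise have to be rerun at every configuration.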
Abstract:
PURPOSE: To identify groups of early breast cancer patients with substantial risk (10-year risk > 20%) of locoregional failure (LRF) who might benefit from postmastectomy radiotherapy (RT). PATIENTS AND METHODS: Prognostic factors for LRF were evaluated among 6,660 patients (2,588 node-negative, 4,072 node-positive) in International Breast Cancer Study Group Trials I to IX who were treated with chemotherapy and/or endocrine therapy and followed for a median of 14 years. In total, 1,251 LRFs were detected. All patients were treated with mastectomy without RT. RESULTS: No group with a 10-year LRF risk exceeding 20% was found among patients with node-negative disease. Among patients with node-positive breast cancer, increasing numbers of uninvolved nodes were significantly associated with decreased risk of LRF, even after adjustment for other prognostic factors. Comparing the highest quartile of uninvolved nodes with the lowest, LRF risk was decreased by 35% among premenopausal patients (P = .0010) and by 46% among postmenopausal patients (P < .0001). The 10-year cumulative incidence of LRF was 20% among patients with one to three involved lymph nodes and fewer than 10 uninvolved nodes. Age younger than 40 years and vessel invasion were also significantly associated with increased risk. Among patients with node-positive disease, overall survival was significantly greater in those with higher numbers of uninvolved nodes examined (P < .0001). CONCLUSION: Patients with one to three involved nodes combined with a low number of uninvolved nodes, vessel invasion, or young age have an increased risk of LRF and may be candidates for treatment similar to that given to patients with at least four involved lymph nodes.
Abstract:
The penetration, translocation, and distribution of ultrafine and nanoparticles in tissues and cells are challenging issues in aerosol research. This article describes a set of novel quantitative microscopic methods for evaluating particle distributions within sectional images of tissues and cells by addressing the following questions: (1) Is the observed distribution of particles between spatial compartments random? (2) Which compartments are preferentially targeted by particles? (3) Does the observed particle distribution shift between different experimental groups? Each of these questions can be addressed by testing an appropriate null hypothesis. The methods all require observed particle distributions to be estimated by counting the number of particles associated with each defined compartment. For studying preferential labeling of compartments, the size of each compartment must also be estimated by counting the number of points of a randomly superimposed test grid that hit the different compartments. The latter provides information about the particle distribution that would be expected if the particles were randomly distributed, that is, the expected number of particles. From these data, we can calculate a relative deposition index (RDI) by dividing the observed number of particles by the expected number of particles. The RDI indicates whether the observed number of particles corresponds to that predicted solely by compartment size (for which RDI = 1). Within one group, the observed and expected particle distributions are compared by chi-squared analysis. The total chi-squared value indicates whether an observed distribution is random; if not, the partial chi-squared values help to identify those compartments that are preferential targets of the particles (RDI > 1). Particle distributions between different groups can be compared in a similar way by contingency table analysis. We first describe the preconditions and the way to implement these methods, then provide three worked examples, and finally discuss the advantages, pitfalls, and limitations of these methods.
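The RDI and chi-squared computations described above are simple enough to state concretely; the following sketch uses hypothetical counts (three compartments, invented numbers), not data from the article.

    # Hypothetical example of the RDI / chi-squared analysis above
    # (invented counts, not data from the article).
    import numpy as np
    from scipy.stats import chi2

    observed = np.array([120, 40, 40])   # particles counted per compartment
    grid_hits = np.array([50, 30, 20])   # test-grid points hitting each compartment

    # Expected counts if particles landed at random, i.e. in proportion
    # to compartment size as estimated by the grid-point counts.
    expected = observed.sum() * grid_hits / grid_hits.sum()

    rdi = observed / expected            # RDI = 1 under random deposition
    partial = (observed - expected) ** 2 / expected
    total = partial.sum()
    p = chi2.sf(total, df=len(observed) - 1)

    print("RDI per compartment:", np.round(rdi, 2))   # RDI > 1: preferred target
    print("total chi-squared:", round(total, 2), " p =", p)

Here the first compartment receives 20% more particles than its size alone predicts (RDI = 1.2), and the total chi-squared rejects the hypothesis of random deposition.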
Abstract:
Enterohemorrhagic Escherichia coli (EHEC) is the causative agent of hemolytic-uremic syndrome. In the first stage of infection, EHEC interacts with human enterocytes to modulate the innate immune response. Inducible NO synthase (iNOS)-derived NO is a critical mediator of the inflammatory response of the infected intestinal mucosa. We therefore aimed to analyze the effect of EHEC on iNOS induction in human epithelial cell lines. We show that EHEC down-regulates IFN-gamma-induced iNOS mRNA expression and NO production in Hct-8, Caco-2, and T84 cells. This inhibitory effect occurs through decreased STAT-1 activation. In parallel, we demonstrate that EHEC stimulates rapid inducible expression of the hmox-1 gene, which encodes the enzyme heme oxygenase-1 (HO-1). Knockdown of hmox-1 expression by small interfering RNA, or blockade of HO-1 activity by zinc protoporphyrin IX, abrogated the EHEC-dependent inhibition of STAT-1 activation and iNOS mRNA expression in activated human enterocytes. These results highlight a new strategy elaborated by EHEC to control the host innate immune response.
Abstract:
In this dissertation, the National Survey of Student Engagement (NSSE) serves as a nodal point through which to examine the power relations shaping the direction and practices of higher education in the twenty-first century. Theoretically, my analysis is informed by Foucault’s concept of governmentality, briefly defined as a technology of power that influences or shapes behavior from a distance. This form of governance operates through apparatuses of security, which include higher education. Foucault identified three essential characteristics of an apparatus—the market, the milieu, and the processes of normalization—through which administrative mechanisms and practices operate and govern populations. In this project, my primary focus is on the governance of faculty and administrators, as a population, at residential colleges and universities. I argue that the existing milieu of accountability is one dominated by the neoliberal assumption that all activity—including higher education—works best when governed by market forces alone, reducing higher education to a market-mediated private good. Under these conditions, what many in the academy believe is an essential purpose of higher education—to educate students broadly, to contribute knowledge for the public good, and to serve as society’s critic and social conscience (Washburn 227)—is being eroded. Although NSSE emerged as a form of resistance to commercial college rankings, it did not challenge the forces that empowered the rankings in the first place. Indeed, NSSE data are now being used to make institutions even more responsive to market forces. Furthermore, NSSE’s use has a normalizing effect that tends to homogenize classroom practices and erode the autonomy of faculty in the educational process. It also positions students as part of the system of surveillance. In the end, if aspects of higher education that are essential to maintaining a civil society are left to be defined solely in market terms, the result may be a less vibrant and, ultimately, a less just society.
Abstract:
Understanding clouds and their role in climate depends in part on our ability to understand how individual cloud particles respond to environmental conditions. With this objective in mind, a quadrupole trap with thermodynamic control has been designed and constructed to create an environment conducive to studying clouds in the laboratory. The quadrupole trap allows a single cloud particle to be suspended for long times. The temperature and water vapor saturation ratio near the trapped particle are controlled by the flow of saturated air through a tube with a discontinuous wall temperature. The design has the unique aspect that the quadrupole electrodes are submerged in heat-transfer fluid, completely isolated from the cylindrical levitation volume. This fluid is used in the thermodynamic system to cool the chamber to realistic cloud temperatures, and a heated section of the tube provides the temperature discontinuity. Thus far, charged water droplets ranging from about 30 to 70 microns in diameter have been levitated. In addition, the thermodynamic system has been shown to establish the thermal conditions needed to produce supersaturation in subsequent experiments. These advances will help lead to the next generation of ice nucleation experiments, moving from hemispherical droplets on a substrate to a spherical droplet that is not in contact with any surface.
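The reason a wall-temperature discontinuity can drive supersaturation is the strong nonlinearity of the saturation vapor pressure e_s(T). As an idealized illustration (standard Magnus approximation and invented temperatures, not the authors' analysis), air saturated at a warm wall temperature and then cooled to a colder wall temperature exceeds saturation:

    # Idealized illustration (Magnus approximation, invented temperatures;
    # not the authors' model): saturated air cooled isobarically from a warm
    # to a cold wall temperature ends up supersaturated, since e_s(T) falls
    # faster than the vapor content the air carries.
    import numpy as np

    def e_s(t_celsius):
        """Saturation vapor pressure over liquid water (hPa), Magnus form."""
        return 6.112 * np.exp(17.62 * t_celsius / (243.12 + t_celsius))

    t_warm, t_cold = 0.0, -5.0            # hypothetical wall temperatures (C)
    S = e_s(t_warm) / e_s(t_cold)         # saturation ratio after cooling
    print(f"saturation ratio at {t_cold} C: {S:.2f}")   # S > 1: supersaturated

In the real tube, coupled heat and vapor diffusion moderate this ratio, so the printed value should be read only as an upper bound on the supersaturation the flow can reach.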
Abstract:
Simulations of forest stand dynamics in a modelling framework such as the Forest Vegetation Simulator (FVS) are diameter driven, so the diameter or basal area increment model needs special attention. This dissertation critically evaluates diameter and basal area increment models and modelling approaches in the context of the Great Lakes region of the United States and Canada. A set of related studies is presented that critically evaluates the sub-model for change in individual tree basal diameter used in FVS, a dominant forestry model in the Great Lakes region. Various historical implementations of the STEMS (Stand and Tree Evaluation and Modeling System) family of diameter increment models, including the current public release of the Lake States variant of FVS (LS-FVS), were tested for the 30 most common tree species using data from the Michigan Forest Inventory and Analysis (FIA) program. The results showed that the current public release of the LS-FVS diameter increment model over-predicts 10-year diameter increment by 17% on average. The study also shows that a simple adjustment factor as a function of a single predictor, dbh (diameter at breast height), as used in past versions, provides an inadequate correction of model prediction bias. In order to re-engineer the basal diameter increment model, the historical, conceptual, and philosophical differences among the individual tree increment model families and their modelling approaches were analyzed and discussed. Two underlying conceptual approaches to diameter or basal area increment modelling have often been used: the potential-modifier (POTMOD) and composite (COMP) approaches, exemplified by the STEMS/TWIGS and Prognosis models, respectively. It is argued that both approaches essentially use a similar base function and that neither is conceptually different from a biological perspective, even though their model forms look different. No matter which modelling approach is used, the base function is the foundation of an increment model. Two base functions, gamma and Box-Lucas, were identified as candidate base functions for forestry applications. A comparative analysis of empirical fits showed that the quality of fit is essentially similar and that both are sufficiently detailed and flexible for forestry applications; a worked comparison in this spirit appears below. The choice between the two base functions is largely a matter of preference; however, the gamma base function may be preferred over the Box-Lucas because it fits periodic increment data in both linear and nonlinear composite model forms. Finally, the utility of site index as a predictor variable is criticized: it has been widely used in models for complex, mixed-species forest stands even though it is not well suited to this purpose. An alternative to site index in an increment model was explored, comparing site index with a combination of climate variables and Forest Ecosystem Classification (FEC) ecosites, using data from the Province of Ontario, Canada. The results showed that a combination of climate and FEC ecosite variables can replace site index in the diameter increment model.
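For illustration only, the following sketch fits both candidate base functions to synthetic increment data using commonly cited parameterizations (gamma: a * dbh^b * exp(-c * dbh); Box-Lucas: a * (exp(-b * dbh) - exp(-c * dbh))); the functional forms, data, and starting values are my assumptions, not taken from the dissertation.

    # Illustrative fit of the two candidate base functions to synthetic
    # data (assumed parameterizations and made-up data, not the
    # dissertation's analysis).
    import numpy as np
    from scipy.optimize import curve_fit

    def gamma_fn(dbh, a, b, c):
        # Gamma base function: increment rises, peaks, then declines with dbh.
        return a * dbh**b * np.exp(-c * dbh)

    def box_lucas(dbh, a, b, c):
        # One common Box-Lucas form: a scaled difference of exponentials.
        return a * (np.exp(-b * dbh) - np.exp(-c * dbh))

    rng = np.random.default_rng(7)
    dbh = rng.uniform(5, 60, 300)            # diameters at breast height (cm)
    incr = gamma_fn(dbh, 0.8, 0.9, 0.06) + rng.normal(0, 0.1, dbh.size)

    for name, fn, p0 in [("gamma", gamma_fn, (1.0, 1.0, 0.05)),
                         ("Box-Lucas", box_lucas, (5.0, 0.01, 0.10))]:
        params, _ = curve_fit(fn, dbh, incr, p0=p0, maxfev=20000)
        rmse = np.sqrt(np.mean((fn(dbh, *params) - incr) ** 2))
        print(f"{name}: params={np.round(params, 3)}, RMSE={rmse:.3f}")

Both curves can reproduce a rise-and-decline increment pattern, so, as the abstract argues, the choice between them typically comes down to fit statistics such as the RMSE printed here and to convenience of the model form.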