943 results for Monte-Carlo simulation, Rod-coil block copolymer, Tetrapod polymer mixture


Relevance:

100.00%

Publisher:

Abstract:

Double-differential dijet cross-sections measured in pp collisions at the LHC with a 7 TeV centre-of-mass energy are presented as functions of dijet mass and half the rapidity separation of the two highest-pT jets. These measurements are obtained using data corresponding to an integrated luminosity of 4.5 fb−1, recorded by the ATLAS detector in 2011. The data are corrected for detector effects so that cross-sections are presented at the particle level. Cross-sections are measured up to 5 TeV dijet mass using jets reconstructed with the anti-kt algorithm for values of the jet radius parameter of 0.4 and 0.6. The cross-sections are compared with next-to-leading-order perturbative QCD calculations by NLOJet++, corrected to account for non-perturbative effects. Comparisons with POWHEG predictions, using a next-to-leading-order matrix element calculation interfaced to a parton-shower Monte Carlo simulation, are also shown. Electroweak effects are accounted for in both cases. The quantitative comparison of data and theoretical predictions obtained using various parameterizations of the parton distribution functions is performed using a frequentist method. In general, good agreement with the data is observed for the NLOJet++ theoretical predictions when using the CT10, NNPDF2.1 and MSTW 2008 PDF sets. Disagreement is observed when using the ABM11 and HERAPDF1.5 PDF sets for some ranges of dijet mass and half the rapidity separation. An example setting a lower limit on the compositeness scale for a model of contact interactions is presented, showing that the unfolded results can be used to constrain contributions to dijet production beyond those predicted by the Standard Model.
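
As a toy illustration of the kind of frequentist data/theory comparison described above, the following Python sketch computes a chi-square statistic between a set of measured cross-sections and a theoretical prediction with an assumed covariance matrix; the numbers and the 8% uncertainty are placeholders, not ATLAS data or NLOJet++ output.

```python
# Minimal sketch of a frequentist data/theory comparison via a chi-square
# statistic with a covariance matrix. All numbers are illustrative
# placeholders, not ATLAS measurements or NLOJet++ predictions.
import numpy as np
from scipy.stats import chi2

data = np.array([120.0, 45.0, 12.0, 3.1])      # measured cross-sections (pb), hypothetical
theory = np.array([115.0, 47.0, 11.5, 2.9])    # predicted cross-sections (pb), hypothetical
cov = np.diag((0.08 * data) ** 2)              # assume 8% uncorrelated uncertainties

residual = data - theory
chi2_obs = residual @ np.linalg.solve(cov, residual)
ndf = len(data)
p_value = chi2.sf(chi2_obs, ndf)
print(f"chi2/ndf = {chi2_obs:.2f}/{ndf}, p-value = {p_value:.3f}")
```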

Relevance:

100.00%

Publisher:

Abstract:

Long-term electrocardiogram (ECG) recordings often suffer from relevant noise. Baseline wander in particular is pronounced in ECG recordings using dry or esophageal electrodes, which are dedicated to prolonged registration. While analog high-pass filters introduce phase distortions, reliable offline filtering of the baseline wander implies a computational burden that has to be weighed against the increase in signal-to-baseline ratio (SBR). Here we present a graphics processing unit (GPU) based parallelization method to speed up offline baseline-wander filter algorithms, namely the wavelet, finite impulse response, infinite impulse response, moving-mean, and moving-median filters. Individual filter parameters were optimized with respect to the SBR increase based on ECGs from the Physionet database superimposed on auto-regressive modeled, real baseline wander. A Monte-Carlo simulation showed that for low input SBR the moving-median filter outperforms any other method but negatively affects ECG wave detection. In contrast, the infinite impulse response filter is preferred in the case of high input SBR. However, the parallelized wavelet filter is processed 500 and 4 times faster than these two algorithms on the GPU, respectively, and offers superior baseline-wander suppression in low-SBR situations. Using a signal segment of 64 megasamples that is filtered as an entire unit, wavelet filtering of a 7-day high-resolution ECG is computed in less than 3 seconds. Taking the high filtering speed into account, the GPU wavelet filter is the most efficient method to remove baseline wander present in long-term ECGs, and it strongly reduces the computational burden.
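
A minimal sketch of one of the filters compared above, a moving-median baseline-wander filter, is shown below; the synthetic ECG-like signal, the 250 Hz sampling rate, and the 0.6 s window are illustrative assumptions rather than the optimized parameters reported in the study.

```python
# Minimal sketch of offline baseline-wander removal with a moving-median
# filter. A synthetic spiky signal with a slow sinusoidal drift stands in
# for a real ECG recording.
import numpy as np
from scipy.signal import medfilt

fs = 250                                    # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
beats = np.sin(2 * np.pi * 1.2 * t)
ecg = beats * (np.abs(beats) > 0.99)        # crude spike train as an ECG stand-in
wander = 0.5 * np.sin(2 * np.pi * 0.3 * t)  # slow baseline drift
noisy = ecg + wander

win = int(0.6 * fs) | 1                     # odd window length required by medfilt
baseline = medfilt(noisy, kernel_size=win)  # moving-median estimate of the baseline
clean = noisy - baseline                    # baseline-corrected signal
```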

Relevance:

100.00%

Publisher:

Abstract:

The discoveries of the BRCA1 and BRCA2 genes have made it possible for women from families with hereditary breast/ovarian cancer to determine whether they carry cancer-predisposing genetic mutations. Women with germline mutations have significantly higher probabilities of developing both cancers than the general population. Since the presence of a BRCA1 or BRCA2 mutation does not guarantee future cancer development, the appropriate course of action remains uncertain for these women. Prophylactic mastectomy and oophorectomy remain controversial, since the underlying premise for surgical intervention is based more on a reduction in the estimated risk of cancer than on actual evidence of clinical benefit. Issues that enter a woman's decision-making process include quality of life without breasts or ovaries, attitudes toward possible surgical morbidity, and the remaining risk of future breast/ovarian cancer despite prophylactic surgery. The incorporation of patient preferences into decision-analysis models can determine the quality-adjusted survival of different prophylactic approaches to breast/ovarian cancer prevention. Monte Carlo simulation was conducted on four separate decision models representing prophylactic oophorectomy, prophylactic mastectomy, prophylactic oophorectomy/mastectomy, and screening. The use of three separate preference assessment methods across different populations of women allows researchers to determine how quality-adjusted survival varies according to clinical strategy, method of preference assessment, and the population from which preferences are assessed.
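
The following toy Monte Carlo sketch illustrates, in very reduced form, how quality-adjusted survival might be compared across prophylactic strategies; all risks, utilities, and survival distributions are invented placeholders and are not the dissertation's model inputs.

```python
# Toy Monte Carlo comparison of quality-adjusted survival between two
# hypothetical strategies. Every parameter here is made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def simulate(cancer_risk, utility):
    develops_cancer = rng.random(n) < cancer_risk
    years = rng.normal(35, 8, n)                      # remaining life-years, hypothetical
    years = np.where(develops_cancer, years - rng.normal(8, 3, n), years)
    return np.clip(years, 0, None) * utility          # quality-adjusted life-years

qaly_surgery = simulate(cancer_risk=0.05, utility=0.92)    # placeholder values
qaly_screening = simulate(cancer_risk=0.40, utility=0.98)  # placeholder values
print(qaly_surgery.mean(), qaly_screening.mean())
```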

Relevance:

100.00%

Publisher:

Abstract:

Monte Carlo simulation has been conducted to investigate parameter estimation and hypothesis testing in some well-known adaptive randomization procedures. The four urn models studied are the Randomized Play-the-Winner (RPW), Randomized Pólya Urn (RPU), Birth and Death Urn with Immigration (BDUI), and Drop-the-Loser (DL) urn. Two sequential estimation methods, sequential maximum likelihood estimation (SMLE) and the doubly adaptive biased coin design (DABC), are simulated at three optimal allocation targets that minimize the expected number of failures under the assumption of constant variance of the simple difference (RSIHR), the relative risk (ORR), and the odds ratio (OOR), respectively. The log likelihood ratio test and three Wald-type tests (simple difference, log of relative risk, log of odds ratio) are compared across the adaptive procedures.

Simulation results indicate that although RPW is slightly better at assigning more patients to the superior treatment, the DL method is considerably less variable and its test statistics have better normality. Compared with SMLE, DABC has a slightly higher overall response rate with lower variance, but larger bias and variance in parameter estimation. Additionally, the test statistics in SMLE have better normality and a lower type I error rate, and the power of hypothesis testing is more comparable with that of equal randomization. RSIHR usually has the highest power among the three optimal allocation ratios. However, the ORR allocation has better power and a lower type I error rate when the log of relative risk is the test statistic, and the expected number of failures in ORR is smaller than in RSIHR. It is also shown that the simple difference of response rates has the worst normality among all four test statistics, and the power of the hypothesis test is always inflated when the simple difference is used. On the other hand, the normality of the log likelihood ratio test statistic is robust against changes in the adaptive randomization procedure.
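
A minimal sketch of the Randomized Play-the-Winner urn, the simplest of the four designs studied, is given below; the response probabilities, the initial urn composition, and the sample size are illustrative assumptions, not the simulation settings used in the study.

```python
# Minimal sketch of the Randomized Play-the-Winner (RPW) urn: a success on a
# treatment adds a ball of the same type, a failure adds a ball of the other
# type. Parameters below are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
p_success = {"A": 0.7, "B": 0.5}        # hypothetical response rates
urn = {"A": 1, "B": 1}                  # initial urn composition (assumed)
assignments = []

for _ in range(200):                    # 200 patients, illustrative
    total = urn["A"] + urn["B"]
    arm = "A" if rng.random() < urn["A"] / total else "B"
    assignments.append(arm)
    success = rng.random() < p_success[arm]
    if success:
        urn[arm] += 1                   # reward the winning arm
    else:
        urn["B" if arm == "A" else "A"] += 1
print("proportion assigned to A:", assignments.count("A") / len(assignments))
```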

Relevance:

100.00%

Publisher:

Abstract:

Breast cancer is the most common non-skin cancer and the second leading cause of cancer-related death in women in the United States. Studies on ipsilateral breast tumor relapse (IBTR) status and disease-specific survival will help guide clinical treatment and predict patient prognosis.

After breast conservation therapy, patients with breast cancer may experience breast tumor relapse. This relapse is classified into two distinct types: true local recurrence (TR) and new ipsilateral primary tumor (NP). However, the methods used to classify the relapse types are imperfect and are prone to misclassification. In addition, some observed survival data (e.g., time to relapse and time from relapse to death) are strongly correlated with relapse type. The first part of this dissertation presents a Bayesian approach to (1) modeling the potentially misclassified relapse status and the correlated survival information, (2) estimating the sensitivity and specificity of the diagnostic methods, and (3) quantifying the covariate effects on event probabilities. A shared frailty was used to account for the within-subject correlation between survival times. The inference was conducted in a Bayesian framework via Markov chain Monte Carlo simulation implemented in the software WinBUGS. Simulation was used to validate the Bayesian method and assess its frequentist properties. The new model has two important innovations: (1) it utilizes the additional survival times correlated with the relapse status to improve the parameter estimation, and (2) it provides tools to address the correlation between the two diagnostic methods conditional on the true relapse types.

Prediction of the patients at highest risk for IBTR after local excision of ductal carcinoma in situ (DCIS) remains a clinical concern. The goals of the second part of this dissertation were to evaluate a published nomogram from Memorial Sloan-Kettering Cancer Center, to determine the risk of IBTR in patients with DCIS treated with local excision, and to determine whether there is a subset of patients at low risk of IBTR. Patients who had undergone local excision from 1990 through 2007 at MD Anderson Cancer Center with a final diagnosis of DCIS (n=794) were included in this part. Clinicopathologic factors and the performance of the Memorial Sloan-Kettering Cancer Center nomogram for prediction of IBTR were assessed for 734 patients with complete data. The nomogram's predictions of 5- and 10-year IBTR probabilities were found to demonstrate imperfect calibration and discrimination, with an area under the receiver operating characteristic curve of 0.63 and a concordance index of 0.63. In conclusion, predictive models for IBTR in DCIS patients treated with local excision are imperfect, and our current ability to accurately predict recurrence based on clinical parameters is limited.

The American Joint Committee on Cancer (AJCC) staging of breast cancer is widely used to determine prognosis, yet survival within each AJCC stage shows wide variation and remains unpredictable. For the third part of this dissertation, biologic markers were hypothesized to account for some of this variation, and the addition of biologic markers to current AJCC staging was examined for possible improvement in prognostication. The initial cohort included patients treated with surgery as the first intervention at MDACC from 1997 to 2006. Cox proportional hazards models were used to create prognostic scoring systems. AJCC pathologic staging parameters and biologic tumor markers were investigated to devise the scoring systems. Surveillance, Epidemiology, and End Results (SEER) data were used as the external cohort to validate the scoring systems. Binary indicators for pathologic stage (PS), estrogen receptor status (E), and tumor grade (G) were summed to create PS+EG scoring systems devised to predict 5-year patient outcomes. These scoring systems facilitated separation of the study population into more refined subgroups than the current AJCC staging system. The ability of the PS+EG score to stratify outcomes was confirmed in both internal and external validation cohorts. The current study proposes and validates a new staging system by incorporating tumor grade and ER status into current AJCC staging. We recommend that biologic markers be incorporated into revised versions of the AJCC staging system for patients receiving surgery as the first intervention.

Chapter 1 focuses on developing a Bayesian method to handle misclassified relapse status and its application to breast cancer data. Chapter 2 focuses on the evaluation of a breast cancer nomogram for predicting the risk of IBTR in patients with DCIS after local excision. Chapter 3 focuses on the validation of a novel staging system for disease-specific survival in patients with breast cancer treated with surgery as the first intervention.
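
As a rough illustration of a summed binary-indicator score in the spirit of the PS+EG system described above, the sketch below adds one point each for a pathologic-stage indicator, ER-negative status, and high grade; the cut-offs, point values, and example patients are placeholders, not the published staging definitions.

```python
# Hedged sketch of forming a summed binary-indicator score. Thresholds and
# point values are placeholders, not the published PS+EG definitions.
import pandas as pd

patients = pd.DataFrame({
    "pathologic_stage": ["I", "IIB", "IIIA"],
    "er_status": ["positive", "negative", "negative"],
    "grade": [1, 3, 3],
})

patients["score"] = (
    (patients["pathologic_stage"] != "I").astype(int)    # PS: 1 point beyond stage I (assumed cut-off)
    + (patients["er_status"] == "negative").astype(int)  # E: 1 point for ER-negative disease
    + (patients["grade"] == 3).astype(int)               # G: 1 point for high grade
)
print(patients)   # higher summed scores would define the higher-risk subgroups
```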

Relevance:

100.00%

Publisher:

Abstract:

In this dissertation, we propose a continuous-time Markov chain model to examine longitudinal data whose outcome variable has three categories. The advantage of this model is that it permits a different number of measurements for each subject, and the duration between two consecutive measurement time points can be irregular. Using the maximum likelihood principle, we can estimate the transition probabilities between two time points. By using the information provided by the independent variables, the model can also estimate the transition probabilities for each subject. The Monte Carlo simulation method will be used to investigate the goodness of fit of the model compared with that of other models. A public health example will be used to demonstrate the application of this method.
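
The core computation behind such a model is the map from a generator (intensity) matrix Q to the transition-probability matrix over an interval of length t, P(t) = exp(Qt); the sketch below shows this for three outcome categories with illustrative rates rather than estimates from the data.

```python
# Minimal sketch of the continuous-time Markov chain machinery: given a
# generator matrix Q for three outcome categories, the transition-probability
# matrix over an interval of length t is expm(Q * t). Rates are illustrative.
import numpy as np
from scipy.linalg import expm

Q = np.array([
    [-0.30,  0.20,  0.10],   # rates out of state 0 (hypothetical)
    [ 0.05, -0.15,  0.10],   # rates out of state 1
    [ 0.02,  0.08, -0.10],   # rates out of state 2
])

t = 2.5                      # irregular gap between two measurements
P = expm(Q * t)              # transition probabilities over the interval
print(P.round(3))            # each row sums to 1
```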

Relevance:

100.00%

Publisher:

Abstract:

This is the seventeenth of a series of symposia devoted to talks by students about their biochemical engineering research. The first, third, fifth, ninth, twelfth, and sixteenth were held at Kansas State University; the second and fourth at the University of Nebraska-Lincoln; the sixth in Kansas City, hosted by Iowa State University; the seventh, tenth, thirteenth, and seventeenth at Iowa State University; the eighth and fourteenth at the University of Missouri–Columbia; and the eleventh and fifteenth at Colorado State University. Next year's symposium will be at the University of Colorado. Symposium proceedings are edited by faculty of the host institution. Because final publication usually takes place elsewhere, papers here are brief and often cover work in progress.

Contents:
The Effect of Polymer Dosage Conditions on the Properties of Protein-Polyelectrolyte Precipitates, K. H. Clark and C. E. Glatz, Iowa State University
An Immobilized Enzyme Reactor/Separator for the Hydrolysis of Casein by Subtilisin Carlsberg, A. J. Bream, R. A. Yoshisato, and G. R. Carmichael, University of Iowa
Cell Density Measurements in Hollow Fiber Bioreactors, Thomas Blute, Colorado State University
The Hydrodynamics in an Air-Lift Reactor, Peter Sohn, George Y. Preckshot, and Rakesh K. Bajpai, University of Missouri–Columbia
Local Liquid Velocity Measurements in a Split Cylinder Airlift Column, G. Travis Jones, Kansas State University
Fluidized Bed Solid Substrate Trichoderma reesei Fermentation, S. Adisasmito, H. N. Karim, and R. P. Tengerdy, Colorado State University
The Effect of 2,4-D Concentration on the Growth of Streptanthus tortuosis Cells in Shake Flask and Air-Lift Fermenter Culture, I. C. Kong, R. D. Sjolund, and R. A. Yoshisato, University of Iowa
Protein Engineering of Aspergillus niger Glucoamylase, Michael R. Sierks, Iowa State University
Structured Kinetic Modeling of Hybridoma Growth and Monoclonal Antibody Production in Suspension Cultures, Brian C. Batt and Dhinakar S. Kompala, University of Colorado
Modelling and Control of a Zymomonas mobilis Fermentation, John F. Kramer, M. N. Karim, and J. Linden, Colorado State University
Modeling of Brettanomyces clausenii Fermentation on Mixtures of Glucose and Cellobiose, Max T. Bynum and Dhinakar S. Kompala, University of Colorado; Karel Grohmann and Charles E. Wyman, Solar Energy Research Institute
Master Equation Modeling and Monte Carlo Simulation of Predator-Prey Interactions, R. O. Fox, Y. Y. Huang, and L. T. Fan, Kansas State University
Kinetics and Equilibria of Condensation Reactions Between Two Different Monosaccharides Catalyzed by Aspergillus niger Glucoamylase, Sabine Pestlin, Iowa State University
Biodegradation of Metalworking Fluids, S. M. Lee, Ayush Gupta, L. E. Erickson, and L. T. Fan, Kansas State University
Redox Potential, Toxicity and Oscillations in Solvent Fermentations, Kim Joong, Rakesh Bajpai, and Eugene L. Iannotti, University of Missouri–Columbia
Using Structured Kinetic Models for Analyzing Instability in Recombinant Bacterial Cultures, William E. Bentley and Dhinakar S. Kompala, University of Colorado

Relevance:

100.00%

Publisher:

Abstract:

Consumption of organic products is growing worldwide, and organic milk is no exception. In Puno, because of its geographic, cultural, and historical characteristics, agricultural and livestock production still relies on ancestral production technologies, seeking balance with the environment and using minimal fertilizers and pesticides, which reflects a focus on sustainability and a tendency toward organic production. The objective of this work was to determine, by means of simulation, the economic viability and risk of producing organic milk as a sustainable development alternative. Production was assumed to take place above 3,000 meters of altitude on an area of 6.5 hectares, with one (criollo) cow per hectare producing 10 liters of milk per day. The economic indicators were positive: the NPV was S/. 2,916.38, the IRR 24%, the equivalent annual value (VAE) S/. 866.33, and the benefit/cost (B/C) ratio 1.48 (as clarified in the methodology), with a payback period of 5.88 years, showing that organic milk production is economically viable but carries high risk: the Monte Carlo simulation showed a 71.43% probability that the project is not viable.
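
A hedged sketch of this kind of Monte Carlo viability analysis is shown below: it simulates the project's net present value under uncertain milk price, yield, and costs and reports the probability that the NPV is negative; every distribution and cash-flow figure is a placeholder, not an input of the study.

```python
# Toy Monte Carlo viability analysis of a small dairy project. All
# distributions and cash-flow assumptions are placeholders.
import numpy as np

rng = np.random.default_rng(42)
n_sims, years, rate = 50_000, 10, 0.12
investment = 20_000                                           # initial investment (S/.), hypothetical

price = rng.normal(1.5, 0.3, (n_sims, years))                 # S/. per litre, hypothetical
litres = rng.normal(6.5 * 10 * 365, 2_000, (n_sims, years))   # 6.5 cows * 10 L/day * 365 days
costs = rng.normal(28_000, 4_000, (n_sims, years))            # annual operating costs, hypothetical

cash_flows = price * litres - costs
discount = (1 + rate) ** -np.arange(1, years + 1)
npv = cash_flows @ discount - investment
print("P(not viable) =", np.mean(npv < 0))
```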

Relevance:

100.00%

Publisher:

Abstract:

The decomposition technique introduced by Blinder (1973) and Oaxaca (1973) is widely used to study outcome differences between groups. For example, the technique is commonly applied to the analysis of the gender wage gap. However, despite the procedure's frequent use, very little attention has been paid to the issue of estimating the sampling variances of the decomposition components. We therefore suggest an approach that introduces consistent variance estimators for several variants of the decomposition. The accuracy of the new estimators under ideal conditions is illustrated with the results of a Monte Carlo simulation. As a second check, the estimators are compared to bootstrap results obtained using real data. In contrast to previously proposed statistics, the new method takes into account the extra variation imposed by stochastic regressors.
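
For reference, a minimal sketch of the two-fold Blinder-Oaxaca decomposition itself (the point estimates, not the proposed variance estimators) is given below, using simulated data and a single regressor.

```python
# Two-fold Oaxaca-Blinder decomposition of a mean outcome gap into an
# "explained" (endowments) and "unexplained" (coefficients) part, using an
# OLS fit per group. The data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 2_000
x_a = rng.normal(12.0, 2.0, n)                   # e.g. years of schooling, group A
x_b = rng.normal(11.0, 2.0, n)                   # group B
y_a = 1.5 + 0.10 * x_a + rng.normal(0, 0.3, n)   # simulated log wages
y_b = 1.3 + 0.08 * x_b + rng.normal(0, 0.3, n)

Xa = np.column_stack([np.ones(n), x_a])
Xb = np.column_stack([np.ones(n), x_b])
beta_a, *_ = np.linalg.lstsq(Xa, y_a, rcond=None)
beta_b, *_ = np.linalg.lstsq(Xb, y_b, rcond=None)

gap = y_a.mean() - y_b.mean()
explained = (Xa.mean(axis=0) - Xb.mean(axis=0)) @ beta_a   # endowments part
unexplained = Xb.mean(axis=0) @ (beta_a - beta_b)          # coefficients part
print(gap, explained + unexplained)                        # the two parts sum to the gap
```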

Relevance:

100.00%

Publisher:

Abstract:

All meta-analyses should include a heterogeneity analysis. Even so, it is not easy to decide whether a set of studies is homogeneous or heterogeneous because of the low statistical power of the statistics used (usually the Q test). Objective: Determine a set of rules enabling SE researchers to find out, based on the characteristics of the experiments to be aggregated, whether or not it is feasible to accurately detect heterogeneity. Method: Evaluate the statistical power of heterogeneity detection methods using a Monte Carlo simulation process. Results: The Q test is not powerful when the meta-analysis contains up to a total of about 200 experimental subjects and the effect-size difference is less than 1. Conclusions: The Q test cannot be used as a decision-making criterion for meta-analysis in small-sample settings like SE. Random-effects models should be used instead of fixed-effects models. Caution should be exercised when applying Q test-mediated decomposition into subgroups.
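
The power evaluation can be illustrated with a short simulation like the one below, which estimates how often Cochran's Q test rejects homogeneity when true effects vary between studies; the number of studies, group sizes, and between-study spread are illustrative settings, not the paper's simulation design.

```python
# Minimal Monte Carlo estimate of the power of Cochran's Q test for
# heterogeneity. Settings are illustrative, not the paper's design.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)

def q_test_power(k=5, n_per_group=20, tau=0.5, n_sims=2_000, alpha=0.05):
    rejections = 0
    for _ in range(n_sims):
        true_effects = rng.normal(0.5, tau, k)                         # heterogeneous true effects
        d = true_effects + rng.normal(0, np.sqrt(2 / n_per_group), k)  # observed effect sizes
        w = 1 / (2 / n_per_group * np.ones(k))                         # approx. inverse variances
        q = np.sum(w * (d - np.sum(w * d) / np.sum(w)) ** 2)           # Cochran's Q
        rejections += chi2.sf(q, k - 1) < alpha
    return rejections / n_sims

print(q_test_power())   # proportion of simulations in which heterogeneity is detected
```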

Relevance:

100.00%

Publisher:

Abstract:

Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference, statistical vote counting, the parametric response ratio and the non-parametric response ratio. The software engineering community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines to indicate which method is best for use in each case. Aim: Compile a set of rules that SE researchers can use to ascertain which aggregation method is best for use in the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects that they include, their variance and effect size. We empirically calculated the reliability and statistical power in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it does require more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable with other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods. If there is, software engineers should select the method that optimizes both parameters.
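
As a reference point for the first of these methods, the sketch below computes a fixed-effect weighted mean difference by weighting each experiment's mean difference by the inverse of its variance; the per-experiment summaries are invented for illustration.

```python
# Fixed-effect weighted mean difference (WMD) across three experiments.
# Per-experiment summaries are invented for illustration.
import numpy as np

mean_diff = np.array([0.40, 0.25, 0.55])          # treatment minus control means
var_t = np.array([1.1, 0.9, 1.3])                 # treatment group variances
var_c = np.array([1.0, 1.2, 1.1])                 # control group variances
n_t = np.array([15, 20, 12])
n_c = np.array([15, 18, 12])

var_d = var_t / n_t + var_c / n_c                 # variance of each mean difference
w = 1 / var_d                                     # inverse-variance weights
wmd = np.sum(w * mean_diff) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
print(f"WMD = {wmd:.3f} +/- {1.96 * se:.3f} (95% CI half-width)")
```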

Relevance:

100.00%

Publisher:

Abstract:

This project investigates the utility of differential algebra (DA) techniques applied to the problem of orbital dynamics with initial uncertainties in the orbit determination of the involved bodies. The use of DA theory allows a conventional Monte Carlo simulation to be split into two parts: the generation of a Taylor map of the final states with respect to perturbations of the initial coordinates, and the evaluation of that map at many points. A propagator exploiting DA techniques is implemented and tested in the field of asteroid impact risk monitoring, with the potentially hazardous asteroids 2011 AG5 and 2007 VK184 as test cases. Results show that the new method is able to simulate 2.5 million trajectories with a precision good enough for the impact probability to be accurately reproduced, while running much faster than a traditional Monte Carlo approach (1 day versus 2 days, respectively).
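
The two-step idea can be illustrated with a scalar toy problem: fit a polynomial (Taylor-like) map of the final state as a function of the initial perturbation around a nominal trajectory, then evaluate that cheap map for many Monte Carlo samples instead of propagating each one. The placeholder dynamics below stand in for the orbital propagator and are not the DA machinery used in the project.

```python
# Toy sketch: (1) build a polynomial surrogate of the final state vs. the
# initial perturbation, (2) evaluate it for many Monte Carlo samples.
import numpy as np

def propagate(x0, steps=1_000):
    x = x0
    for _ in range(steps):
        x = x + 1e-3 * np.sin(x)      # placeholder nonlinear dynamics
    return x

nominal = 1.0
deltas = np.linspace(-1e-2, 1e-2, 9)                     # small initial perturbations
finals = np.array([propagate(nominal + d) for d in deltas])
coeffs = np.polyfit(deltas, finals, deg=3)               # cubic surrogate of the map

samples = np.random.default_rng(3).normal(0, 3e-3, 1_000_000)
final_states = np.polyval(coeffs, samples)               # fast evaluation for all samples
print(final_states.mean(), final_states.std())
```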

Relevance:

100.00%

Publisher:

Abstract:

This thesis analyzes the factors affecting performance evaluation in positron emission tomography (PET) imaging, focusing on preclinical scanners. It explores the possibilities of standard evaluation protocols with regard to the following aspects: their use as tools to validate Monte Carlo simulation programs, their usefulness as a method for comparing scanners, and their validity in the study of the effect of alternative radioisotopes on image quality. Initially, we study the methods of performance evaluation oriented toward validating PET simulations. For this, we present the GAMOS program as a simulation framework and show the results of its validation based on the NEMA NU 4-2008 standard for preclinical PET scanners. This was accomplished by comparing simulated results against experimental acquisitions on the ClearPET scanner, describing the methodology for the evaluation and selection of the NEMA parameters. This section also mentions the contributions developed in GAMOS for PET applications, such as the inclusion of tools for image reconstruction. Furthermore, the NEMA evaluation of the ClearPET scanner is used to compare its performance against another preclinical scanner, the rPET-1 system. This is the first complete NEMA NU 4 characterization of both systems; at the same time, we analyze how the significant design differences between the two systems, especially the axial size of the field of view and the detector configuration, affect their performance characteristics. 68Ga is one of the unconventional radioisotopes in PET imaging whose use is currently increasing significantly; however, it has the disadvantage of a long positron range (the distance traveled by the emitted positron before annihilating with an electron). Besides the positron range, the emission of additional gamma photons is another physical property of PET radioisotopes that can affect the reconstructed image quality, as happens for the isotope 48V. In this thesis we assess these effects through studies of spatial resolution and NEMA image quality. Finally, we analyze the scope of the NEMA NU 4-2008 protocol when used for this purpose, adapting it accordingly and proposing possible modifications.
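
As a small illustration of the spatial-resolution figure of merit used in NEMA NU 4 evaluations, the sketch below estimates the full width at half maximum (FWHM) of a synthetic point-spread profile; the Gaussian profile and its 1.3 mm sigma are assumptions, not scanner data.

```python
# Coarse FWHM estimate of a synthetic point-spread profile, the kind of
# spatial-resolution figure reported in NEMA NU 4 evaluations.
import numpy as np

x = np.linspace(-10, 10, 401)              # position across the point source, in mm
profile = np.exp(-x**2 / (2 * 1.3**2))     # synthetic profile (sigma = 1.3 mm, assumed)

dx = x[1] - x[0]
fwhm = np.count_nonzero(profile >= profile.max() / 2) * dx   # width above half maximum
print(f"FWHM ~ {fwhm:.2f} mm")             # analytic value: 2.355 * 1.3 = 3.06 mm
```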