903 results for Simulation and modelling
Abstract:
Image-based modeling of tumor growth combines methods from cancer simulation and medical imaging. In this context, we present a novel approach to adapt a healthy brain atlas to MR images of tumor patients. In order to establish correspondence between a healthy atlas and a pathologic patient image, tumor growth modeling is employed in combination with registration algorithms. In the first step, the tumor is grown in the atlas based on a new multi-scale, multi-physics model including growth simulation from the cellular level up to the biomechanical level, accounting for cell proliferation and tissue deformations. Large-scale deformations are handled with an Eulerian approach for finite element computations, which can operate directly on the image voxel mesh. Subsequently, dense correspondence between the modified atlas and the patient image is established using nonrigid registration. The method offers opportunities for atlas-based segmentation of tumor-bearing brain images as well as for improved patient-specific simulation and prognosis of tumor progression.
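The abstract does not spell out its multi-scale, multi-physics model, but the macroscopic ingredient it names (cell proliferation plus tissue-level spread, computed directly on the image voxel mesh) is commonly written as a reaction-diffusion law. A minimal sketch, assuming a Fisher-KPP-type growth term, unit voxel spacing, and periodic boundaries (all illustrative, not from the paper):

```python
import numpy as np

def grow_tumor(c, D=0.1, rho=0.2, dt=0.1, steps=100):
    """Explicit time-stepping of dc/dt = D * laplacian(c) + rho * c * (1 - c)
    on a regular 3-D voxel grid; c is a normalized tumor-cell density."""
    for _ in range(steps):
        # 6-neighbour discrete Laplacian on the voxel mesh (unit spacing)
        lap = (-6.0 * c
               + np.roll(c, 1, 0) + np.roll(c, -1, 0)
               + np.roll(c, 1, 1) + np.roll(c, -1, 1)
               + np.roll(c, 1, 2) + np.roll(c, -1, 2))
        c = c + dt * (D * lap + rho * c * (1.0 - c))
    return c

# Seed a small tumor at the centre of a 32^3 voxel grid
c0 = np.zeros((32, 32, 32))
c0[16, 16, 16] = 1.0
c1 = grow_tumor(c0)
```

This sketch covers only the proliferation-diffusion level; the cellular-scale coupling and the biomechanical (mass-effect) deformation described in the abstract are beyond it.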
Abstract:
As atmospheric emissions of S have declined in the Northern Hemisphere, there has been an expectation of increased pH and alkalinity in streams believed to have been acidified by excess S and N. Many streams and lakes have not recovered. Evidence from East Bear Brook in Maine, USA and modelling with the groundwater acid-base model MAGIC (Cosby et al. 1985a,b) indicate that seasonal and yearly variations in soil PCO2 are adequate to enhance or even reverse the acid-base (alkalinity) changes anticipated from modest decreases of SO4 in surface waters. Alkalinity is generated in the soil by exchange of H+ from the dissociation of H2CO3, which in turn is derived from the dissolving of soil CO2. The variation in soil PCO2 produces an alkalinity variation of up to 15 µeq L-1 in stream water. Detecting and relating increases in alkalinity to decreases in stream SO4 are significantly more difficult in the short term because of this effect. For example, modelled alkalinity recovery at Bear Brook due to a decline of 20 µeq SO4 L-1 in soil solution is compensated by a decline in soil-air PCO2 from 0.4 to 0.2%. This compensating ability decays over time as base saturation declines. Variable PCO2 has less effect in more acidic soils. Short-term decreases of PCO2 below the long-term average value produce short-term decreases in alkalinity, whereas short-term increases in PCO2 produce short-term alkalization. Trend analysis for detecting recovery of streams and lakes from acidification after reduced atmospheric emissions will require a longer monitoring period for statistical significance than previously appreciated.
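The mechanism described above (H+ exchange supplied by H2CO3 from dissolved soil CO2) can be illustrated with open-system carbonate equilibria: Henry's law fixes dissolved CO2 from the gas-phase PCO2, and the first dissociation of carbonic acid supplies HCO3-. A minimal sketch, assuming standard 25 °C equilibrium constants and a fixed solution pH, both illustrative and not taken from the study:

```python
# Open-system carbonate equilibrium constants (approximate, 25 degC):
KH = 10**-1.47   # mol L-1 atm-1, Henry's law: CO2(g) <-> H2CO3*
K1 = 10**-6.35   # mol L-1, first dissociation: H2CO3* <-> H+ + HCO3-

def bicarbonate_ueq(pco2_atm, pH):
    """[HCO3-] in ueq L-1 at a given gas-phase PCO2 and solution pH:
    [HCO3-] = K1 * KH * PCO2 / [H+]."""
    h = 10**-pH
    return K1 * KH * pco2_atm / h * 1e6

# Halving soil-air PCO2 from 0.4% to 0.2% atm at pH 5 halves the
# bicarbonate it can supply, a shift of a few ueq L-1 -- the same
# order as the alkalinity variation discussed above.
hi = bicarbonate_ueq(0.004, 5.0)
lo = bicarbonate_ueq(0.002, 5.0)
```

MAGIC itself tracks many more processes (cation exchange, base saturation, SO4 adsorption); this only isolates the PCO2 lever highlighted in the abstract.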
Abstract:
This paper presents the electron and photon energy calibration achieved with the ATLAS detector using about 25 fb−1 of LHC proton–proton collision data taken at centre-of-mass energies of √s = 7 and 8 TeV. The reconstruction of electron and photon energies is optimised using multivariate algorithms. The response of the calorimeter layers is equalised in data and simulation, and the longitudinal profile of the electromagnetic showers is exploited to estimate the passive material in front of the calorimeter and reoptimise the detector simulation. After all corrections, the Z resonance is used to set the absolute energy scale. For electrons from Z decays, the achieved calibration is typically accurate to 0.05% in most of the detector acceptance, rising to 0.2% in regions with large amounts of passive material. The remaining inaccuracy is less than 0.2–1% for electrons with a transverse energy of 10 GeV, and is on average 0.3% for photons. The detector resolution is determined with a relative inaccuracy of less than 10% for electrons and photons up to 60 GeV transverse energy, rising to 40% for transverse energies above 500 GeV.
Abstract:
SOLUTIONS (2013 to 2018) is a European Union Seventh Framework Programme (EU-FP7) project. The project aims to deliver a conceptual framework to support the evidence-based development of environmental policies with regard to water quality. SOLUTIONS will develop tools for the identification, prioritisation and assessment of those water contaminants that may pose a risk to ecosystems and human health. To this end, a new generation of chemical and effect-based monitoring tools will be developed and integrated with a full set of exposure, effect and risk assessment models. SOLUTIONS addresses legacy, present and future contamination by integrating monitoring- and modelling-based approaches with scenarios on future developments in society, economy and technology, and thus in contamination. The project follows a solutions-oriented approach by addressing major problems of water and chemicals management and by assessing abatement options. SOLUTIONS takes advantage of access to the infrastructure necessary to investigate the large basins of the Danube and Rhine as well as relevant Mediterranean basins as case studies, and puts major effort into stakeholder dialogue and support. In particular, the EU Water Framework Directive (WFD) Common Implementation Strategy (CIS) working groups, International River Commissions, and water works associations are directly supported with consistent guidance for the early detection, identification, prioritisation, and abatement of chemicals in the water cycle. SOLUTIONS will place specific emphasis on concepts and tools for the impact and risk assessment of complex mixtures of emerging pollutants, their metabolites and transformation products. Analytical and effect-based screening tools will be applied together with ecological assessment tools for the identification of toxicants and their impacts.
The SOLUTIONS approach is expected to provide transparent and evidence-based candidates for River Basin Specific Pollutants in the case-study basins and to assist future reviews of priority pollutants under the WFD, as well as potential abatement options.
Abstract:
The majority of sensor network research deals with land-based networks, which are essentially two-dimensional, and thus the majority of simulation and animation tools also handle only such networks. Underwater sensor networks, on the other hand, are essentially 3D networks, because the depth at which a sensor node is located must be considered as well. Due to that additional dimension, specialized tools are needed when conducting simulations for experimentation. The School of Engineering's Underwater Sensor Network (UWSN) lab is conducting research on underwater sensor networks and requires simulation tools for 3D networks. The lab has extended NS-2, a widely used network simulator, so that it can simulate three-dimensional networks. However, NAM, a widely used network animator, currently supports only two-dimensional networks, and no extensions have been implemented to give it three-dimensional capabilities. In this project, we develop a network visualization tool that functions similarly to NAM but is able to render network environments in full 3D. It takes as input an NS-2 trace file (the same file taken as input by NAM), creates the environment, positions the sensor nodes, and animates the events of the simulation. Further, the visualization tool is easy to use, and especially friendly to NAM users, as it is designed to follow interfaces and functions similar to NAM's. So far, the development has fulfilled the basic functionality. Future work includes fully functional visualization capabilities and much-improved user interfaces.
Abstract:
My dissertation focuses on developing methods for detecting gene-gene/environment interactions and imprinting effects for human complex diseases and quantitative traits. It includes three sections: (1) generalizing the Natural and Orthogonal Interaction (NOIA) model, a coding technique originally developed for gene-gene (GxG) interaction, to reduced models; (2) developing a novel statistical approach that allows for modeling gene-environment (GxE) interactions influencing disease risk; and (3) developing a statistical approach for modeling genetic variants displaying parent-of-origin effects (POEs), such as imprinting. In the past decade, genetic researchers have identified a large number of causal variants for human genetic diseases and traits by single-locus analysis, and interaction has now become a hot topic in the effort to map the complex network among multiple genes or environmental exposures contributing to an outcome. Epistasis, also known as gene-gene interaction, is the departure from additivity of the genetic effects of several genes on a trait, meaning that the same alleles of one gene can display different genetic effects on different genetic backgrounds. In this study, we propose to implement the NOIA model for association studies with interaction for human complex traits and diseases. We compare the performance of the new statistical models we developed against the usual functional model in both simulation studies and real data analysis. Both simulation and real data analysis revealed higher power of the NOIA GxG interaction model for detecting both main genetic effects and interaction effects. Through application to a melanoma dataset, we confirmed the previously identified significant regions for melanoma risk at 15q13.1, 16q24.3 and 9p21.3. We also identified potential interactions with these significant regions that contribute to melanoma risk.
Based on the NOIA model, we developed a novel statistical approach that allows us to model the effects of a genetic factor and a binary environmental exposure that jointly influence disease risk. Both simulation and real data analyses revealed higher power of the NOIA model for detecting both main genetic effects and interaction effects for both quantitative and binary traits. We also found that estimates of the parameters from logistic regression for binary traits are no longer statistically uncorrelated under the alternative model when there is an association. Applying our novel approach to a lung cancer dataset, we confirmed four SNPs in the 5p15 and 15q25 regions to be significantly associated with lung cancer risk in the Caucasian population: rs2736100, rs402710, rs16969968 and rs8034191. We also validated that rs16969968 and rs8034191 in the 15q25 region interact significantly with smoking in the Caucasian population. Our approach identified a potential interaction of SNP rs2256543 in 6p21 with smoking in contributing to lung cancer risk. Genetic imprinting is the best-known cause of parent-of-origin effects (POEs), whereby a gene is differentially expressed depending on the parental origin of the same alleles. Genetic imprinting affects several human disorders, including diabetes, breast cancer, alcoholism, and obesity, and this phenomenon has been shown to be important for normal embryonic development in mammals. Traditional association approaches ignore this important genetic phenomenon. In this study, we propose a NOIA framework for single-locus association studies that estimates both main allelic effects and POEs. We develop statistical (Stat-POE) and functional (Func-POE) models, and demonstrate conditions for orthogonality of the Stat-POE model. We conducted simulations for both quantitative and qualitative traits to evaluate the performance of the statistical and functional models with different levels of POEs.
Our results showed that the newly proposed Stat-POE model, which ensures orthogonality of variance components when Hardy-Weinberg Equilibrium (HWE) holds or minor and major allele frequencies are equal, had greater power for detecting the main allelic additive effect than the Func-POE model, which codes according to allelic substitutions, for both quantitative and qualitative traits. The power for detecting the POE was the same for the Stat-POE and Func-POE models under HWE for quantitative traits.
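The orthogonality property central to the Stat-POE model can be illustrated with the analogous single-locus NOIA statistical coding for additive and dominance effects, whose covariates are uncorrelated under HWE (the POE covariate itself is not reproduced here). A minimal numerical sketch; the allele frequency is hypothetical:

```python
import numpy as np

def noia_statistical_coding(p_AA, p_Aa, p_aa):
    """NOIA statistical additive and dominance covariates for the
    three genotypes (AA, Aa, aa), given their population frequencies."""
    mu = p_Aa + 2 * p_aa                      # mean count of the 'a' allele
    x_add = np.array([0.0, 1.0, 2.0]) - mu    # centred allele count
    denom = p_AA + p_aa - (p_AA - p_aa) ** 2
    x_dom = np.array([-2 * p_Aa * p_aa,
                      4 * p_AA * p_aa,
                      -2 * p_AA * p_Aa]) / denom
    return x_add, x_dom

p = 0.3                                       # hypothetical frequency of allele 'a'
freqs = np.array([(1 - p) ** 2, 2 * p * (1 - p), p ** 2])   # HWE genotype freqs
x_add, x_dom = noia_statistical_coding(*freqs)

# Covariance of the two covariates over the genotype distribution:
# zero under HWE, so the variance components are orthogonal.
cov = (np.sum(freqs * x_add * x_dom)
       - np.sum(freqs * x_add) * np.sum(freqs * x_dom))
```

Orthogonality is what lets the statistical parameterization estimate the additive effect without contamination from the other terms, which is the source of the power advantage reported above.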
Abstract:
Conservative procedures in low-dose risk assessment are used to set safety standards for known or suspected carcinogens. However, the assumptions upon which the methods are based and the effects of these methods are not well understood. To minimize the number of false negatives and to reduce the cost of bioassays, animals are given very high doses of potential carcinogens. Results must then be extrapolated to much smaller doses to set safety standards for risks such as one per million. There are a number of competing methods that add a conservative safety factor into these calculations. A method of quantifying the conservatism of these methods was described and tested on eight procedures used in setting low-dose safety standards. The results using these procedures were compared by computer simulation and by the use of data from a large-scale animal study. The method consisted of determining a "true safe dose" (tsd) according to an assumed underlying model. If one assumes that Y = the probability of cancer = P(d), a known mathematical function of the dose, then by setting Y to some predetermined acceptable risk, one can solve for d, the model's "true safe dose". Simulations were generated, assuming a binomial distribution, for an artificial bioassay. The eight procedures were then used to determine a "virtual safe dose" (vsd) that estimates the tsd, assuming a risk of one per million. A ratio R = (tsd - vsd)/vsd was calculated for each "experiment" (simulation). The mean R of 500 simulations and the probability that R < 0 were used to measure the over- and under-conservatism of each procedure. The eight procedures included Weil's method, Hoel's method, the Mantel-Bryan method, the improved Mantel-Bryan method, Gross's method, fitting a one-hit model, Crump's procedure, and applying Rai and Van Ryzin's method to a Weibull model. None of the procedures performed uniformly well for all types of dose-response curves. When the data were linear, the one-hit model, Hoel's method, or the Gross-Mantel method worked reasonably well. However, when the data were non-linear, these same methods were overly conservative. Crump's procedure and the Weibull model performed better in these situations.
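The tsd and R quantities above are simple to state in code. A minimal sketch, assuming a one-hit dose-response P(d) = 1 - exp(-lam*d) (one of the models named in the abstract); the slope and the procedure's vsd estimate are hypothetical stand-ins:

```python
import math

def one_hit_tsd(lam, risk=1e-6):
    """'True safe dose' under a one-hit model P(d) = 1 - exp(-lam*d):
    set P(d) = risk and solve for d."""
    return -math.log(1.0 - risk) / lam

def conservatism_ratio(tsd, vsd):
    """R = (tsd - vsd) / vsd.  R > 0 means the procedure is conservative:
    its 'virtual safe dose' falls below the model's true safe dose."""
    return (tsd - vsd) / vsd

tsd = one_hit_tsd(lam=2.0)    # hypothetical dose-response slope
vsd = 0.8 * tsd               # hypothetical procedure estimate, 20% below tsd
R = conservatism_ratio(tsd, vsd)   # R = 0.25 for this stand-in
```

In the study itself, vsd comes from applying each of the eight procedures to binomially simulated bioassay data, and R is averaged over 500 such simulations.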
Abstract:
The nineteenth symposium was held at the University of Missouri–Columbia on April 22, 1989. A total of eighteen papers were scheduled for presentation, of which nine were in the poster session. In the end, fifteen papers were presented and sixteen were submitted for these proceedings. The symposium was attended by 53 participants from five institutions. A sixth group (from Colorado State University) was kept from attending by mechanical problems on the road, and we missed them. Since they had worked hard on their presentations, I asked the CSU group to submit their papers for the proceedings, and I am happy that they did.
Contents
Mathematical modelling of a flour milling system. K. Takahashi, Y. Chen, J. Hosokoschi, and L. T. Fan. Kansas State University
A novel solution to the problem of plasmid segregation in continuous bacterial fermentations. K. L. Henry, R. H. Davis, and A. L. Taylor. University of Colorado
Modelling of embryonic growth in avian and reptile eggs. C. L. Krause, R. C. Seagrave, and R. A. Ackerman. Iowa State University
Mathematical modeling of in situ biodegradation processes. J. C. Wu, L. T. Fan, and L. E. Erickson. Kansas State University
Effect of molecular changes on starch viscosity. C. H. Rosane and V. G. Murphy. Colorado State University
Analysis of two stage recombinant bacterial fermentations using a structured kinetic model. F. Miao and D. S. Kampala. University of Colorado
Lactic acid fermentation from enzyme-thinned starch by Lactobacillus amylovorus. P. S. Cheng, E. L. Iannotti, R. K. Bajpai, R. Mueller, and S. Yaeger. University of Missouri–Columbia
Solubilization of preoxidized Texas lignite by cell-free broths of Penicillium strains. R. Moolick, M. N. Karim, J. C. Linden, and B. L. Burback. Colorado State University
Separation of proteins from polyelectrolytes by ultrafiltration. A. G. Bazzano and C. E. Glatz. Iowa State University
Growth estimation and modelling of Rhizopus oligosporus in solid state fermentations. D.-H. Ryoo, V. G. Murphy, M. N. Karim, and R. P. Tengerdy. Colorado State University
Simulation of ethanol fermentations from sugars in cheese whey. C. J. Wang and R. K. Bajpai. University of Missouri–Columbia
Studies on protoplast fusion of B. licheniformis. B. Shi. Kansas State University
Cell separations of non-dividing and dividing yeasts using an inclined settler. C.-Y. Lee, R. H. Davis, and R. A. Sclafani. University of Colorado
Effect of serum upon local hydrodynamics within an airlift column. G. T. Jones, L. E. Erickson, and L. A. Glasgow. Kansas State University
Optimization of heterologous protein secretion in continuous culture. A. Chatterjee, W. F. Remirez, and R. H. Davis. University of Colorado
An improved model for lactic acid fermentation. P. Yeh, R. K. Bajpai, and E. L. Iannotti. University of Missouri–Columbia
Abstract:
The Est Constanta 1986-1994 dataset contains zooplankton data collected along a 5-station transect in front of the city of Constanta (44°10'N, 28°41.5'E - EC1; 44°10'N, 28°47'E - EC2; 44°10'N, 28°54'E - EC3; 44°10'N, 29°08'E - EC4; 44°10'N, 29°22'E - EC5). Zooplankton sampling was undertaken at 5 stations, where samples were collected using a Juday closing net in the 0-10, 10-25 and 25-50 m layers (depending also on the water masses). The dataset includes samples analysed for mesozooplankton species composition and abundance. Sampling volume was estimated by multiplying the mouth area by the wire length. Taxon-specific mesozooplankton abundance was counted under a microscope. Total abundance is the sum of the counted individuals. Total biomass of fodder zooplankton, Rotifera, Ctenophora and Noctiluca was estimated using a table of wet weights for each species and stage.
Abstract:
The Danubs 2002 dataset contains zooplankton data collected in April, June, September and October 2002 at 11 stations along 5 transects in front of the Romanian littoral. Zooplankton sampling was undertaken at 11 stations, where samples were collected using a Juday closing net in the 0-10, 10-25 and 25-50 m layers (depending also on the water masses). The dataset includes samples analysed for mesozooplankton species composition and abundance. Sampling volume was estimated by multiplying the mouth area by the wire length. Taxon-specific mesozooplankton abundance was counted under a microscope. Total abundance is the sum of the counted individuals. Total biomass of fodder zooplankton, Rotifera, Ctenophora and Noctiluca was estimated using a table of wet weights for each species and stage.
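For both zooplankton datasets, the volume and abundance arithmetic described above reduces to two one-line formulas: filtered volume is net mouth area times towed wire length, and abundance scales the microscope count to that volume. A minimal sketch; the mouth diameter, haul length and count below are hypothetical:

```python
import math

def sample_volume_m3(mouth_diameter_m, wire_length_m):
    """Filtered volume estimated as net mouth area x towed wire length,
    as described for the Juday closing net above."""
    r = mouth_diameter_m / 2.0
    return math.pi * r * r * wire_length_m

def abundance_per_m3(count, volume_m3):
    """Taxon abundance: microscope count scaled to individuals per m^3."""
    return count / volume_m3

# Hypothetical haul: a 0.36 m mouth net hauled vertically through 25 m
v = sample_volume_m3(0.36, 25.0)   # ~2.54 m^3
n = abundance_per_m3(127, v)       # ~50 individuals per m^3
```

Note that wire length is a proxy for the towed distance, so this volume is an upper bound that ignores net clogging and filtration efficiency.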