921 results for Computational Simulation


Relevance:

30.00%

Publisher:

Abstract:

The phosphine distribution in a cylindrical silo containing grain is predicted. A three-dimensional mathematical model, which accounts for multicomponent gas-phase transport and the sorption of phosphine into the grain kernel, is developed. In addition, a simple model is presented to describe the death of insects within the grain as a function of their exposure to phosphine gas. The proposed model is solved using the commercially available computational fluid dynamics (CFD) software FLUENT, together with our own C code that customizes the solver to incorporate the models for sorption and insect extinction. Two types of fumigant delivery are studied, namely, fan-forced from the base of the silo and tablet from the top of the silo. An analysis of the predicted phosphine distribution shows that during fan-forced fumigation, the position of the leaky area strongly influences the development of the gas flow field and the phosphine distribution in the silo. If the leak is in the lower section of the silo, insects near the top of the silo may not be eradicated. The position of a leak does not, however, affect the phosphine distribution during tablet fumigation. For such fumigation in a typical silo configuration, phosphine concentrations remain low near the base of the silo. Furthermore, we find that half-life pressure test readings are not an indicator of phosphine distribution during tablet fumigation.
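The abstract does not give the insect-mortality model itself; a minimal sketch of one common way such exposure-dependent death is modelled, a threshold on the accumulated concentration-time (Ct) product with an exponential kill law on the excess dose, might look like the following. The threshold `ct_lethal` and rate `k` are invented for illustration, not values from the paper.

```python
import math

# Toy exposure model (an illustrative assumption, not the paper's model):
# insects die off once the accumulated phosphine dose, i.e. the
# concentration-time (Ct) product, exceeds a lethal threshold, with an
# exponential kill law applied to the excess dose.

def surviving_fraction(conc_ppm, dt_h, ct_lethal=100.0, k=0.05):
    """Surviving fraction after exposure to a concentration history
    given as ppm samples spaced dt_h hours apart."""
    ct = sum(c * dt_h for c in conc_ppm)   # accumulated dose, ppm * h
    excess = max(0.0, ct - ct_lethal)
    return math.exp(-k * excess)

# A well-fumigated region eradicates the insects; a low-concentration
# region (e.g. near the top when the leak is low in the silo) does not.
well_fumigated = surviving_fraction([300.0] * 48, dt_h=1.0)   # 48 h at 300 ppm
poorly_fumigated = surviving_fraction([2.0] * 48, dt_h=1.0)   # 48 h at 2 ppm
```

Coupling such a local kill law to the CFD-predicted concentration field at each point of the silo is what turns a gas-transport solution into an eradication map.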

This work proposes a supermarket optimization simulation model, called Swarm-Moves, based on studies of self-organized complex systems, to identify the parameters and values that can influence customers to buy more on impulse in a given period of time. In the proposed model, customers are assumed to have trolleys equipped with technology such as RFID, which can pass product information directly from the store to them in real time and vice versa. Customers can therefore see other customers' purchase patterns while constantly informing the store of their own shopping behavior. This is easily achieved because the trolleys "know" what products they contain at any point. The Swarm-Moves simulation is a virtual supermarket that provides a visual display to run and test the proposed model. The simulation is also flexible enough to incorporate any given model of customer behavior tailored to a particular supermarket, its settings, events or promotions. The results, although preliminary, are promising for using RFID technology to market products in supermarkets, and they suggest several dimensions for influencing customers via feedback, real-time marketing, targeted advertisement and on-demand promotions. ©2009 IEEE.
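The core feedback loop such a model relies on can be sketched very simply: each customer's impulse-buy probability rises with the fraction of other trolleys currently reporting the product. The parameter values and linear influence rule below are assumptions for illustration, not the Swarm-Moves rules.

```python
import random

# Toy swarm-feedback loop (illustrative, not the paper's model): a customer's
# chance of an impulse buy grows with the fraction of other trolleys that
# already contain the product, as reported store-wide via RFID.

def simulate(n_customers=200, steps=50, base_p=0.02, influence=0.5, seed=1):
    rng = random.Random(seed)
    has_item = [False] * n_customers
    for _ in range(steps):
        frac = sum(has_item) / n_customers    # RFID feedback signal
        p = base_p + influence * frac         # social amplification
        for i in range(n_customers):
            if not has_item[i] and rng.random() < p:
                has_item[i] = True
    return sum(has_item) / n_customers

uptake_with_feedback = simulate(influence=0.5)
uptake_without = simulate(influence=0.0)
```

Comparing the two runs shows the qualitative effect the paper targets: positive feedback from visible purchase patterns amplifies impulse buying over the same period.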

Importance of the field: The shift in focus from ligand-based design approaches to target-based discovery over the last two to three decades has been a major milestone in drug discovery research. Currently, it is witnessing another major paradigm shift by leaning towards holistic systems-based approaches rather than the reductionist single-molecule-based methods. The effect of this new trend is likely to be felt strongly in terms of new strategies for therapeutic intervention, new targets individually and in combinations, and design of specific and safer drugs. Computational modeling and simulation form important constituents of new-age biology because they are essential to comprehend the large-scale data generated by high-throughput experiments and to generate hypotheses, which are typically iterated with experimental validation. Areas covered in this review: This review focuses on the repertoire of systems-level computational approaches currently available for target identification. The review starts with a discussion on levels of abstraction of biological systems and describes different modeling methodologies that are available for this purpose. The review then focuses on how such modeling and simulations can be applied for drug target discovery. Finally, it discusses methods for studying other important issues such as understanding targetability, identifying target combinations and predicting drug resistance, and considering them during the target identification stage itself. What the reader will gain: The reader will get an account of the various approaches for target discovery and the need for systems approaches, followed by an overview of the different modeling and simulation approaches that have been developed. An idea of the promise and limitations of the various approaches and perspectives for future development will also be obtained.
Take home message: Systems thinking has now come of age enabling a `bird's eye view' of the biological systems under study, at the same time allowing us to `zoom in', where necessary, for a detailed description of individual components. A number of different methods available for computational modeling and simulation of biological systems can be used effectively for drug target discovery.

An adaptive drug delivery design is presented in this paper using neural networks for effective treatment of infectious diseases. The generic mathematical model used describes the coupled evolution of the concentration of pathogens, plasma cells and antibodies, together with a numerical value that indicates the relative condition of an organ damaged by the disease, under the influence of external drugs. From a system-theoretic point of view, the external drugs can be interpreted as control inputs, which can be designed based on control-theoretic concepts. In this study, assuming a set of nominal parameters in the mathematical model, a nonlinear controller (drug administration plan) is first designed based on the principle of dynamic inversion. This nominal drug administration plan was found to be effective in curing "nominal model patients" (patients whose immunological dynamics conform exactly to the mathematical model used for the control design). However, it was found to be ineffective in general for curing "realistic model patients" (patients whose immunological dynamics may have off-nominal parameter values and possibly unwanted inputs). Hence, to make the drug dosage design more effective for realistic model patients, a model-following adaptive control design is carried out next with the help of neural networks that are trained online. Simulation studies indicate that the adaptive controller proposed in this paper holds promise for killing the invading pathogens and healing the damaged organ even in the presence of parameter uncertainties and continued pathogen attack. Note that the computational requirements for computing the control are minimal, and all associated computations (including the training of the neural networks) can be carried out online. The approach assumes, however, that the required diagnosis process can be carried out at a sufficiently fast rate so that all the states are available for control computation.
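The dynamic-inversion idea can be illustrated on a scalar toy model rather than the paper's four-state immunological model: for a pathogen load governed by x' = r·x − u·x, inverting the dynamics to impose the desired closed-loop behaviour x' = −k·x yields the control law u = r + k. All parameter values are illustrative assumptions.

```python
# Minimal dynamic-inversion sketch (scalar toy model, not the paper's system):
# pathogen load x evolves as x' = r*x - u*x, where u is the drug action.
# Choosing u to cancel the growth term r and impose the desired decay -k*x
# gives u = r + k, so the controlled load decays like exp(-k*t).

def run_treatment(r=0.5, k=2.0, x0=1.0, dt=0.01, steps=500):
    x = x0
    for _ in range(steps):
        u = r + k                  # dynamic inversion: cancel r, impose -k
        x += dt * (r * x - u * x)  # plant integrated with forward Euler
    return x

x_final = run_treatment()          # pathogen load after t = 5 time units
```

The fragility the paper addresses is visible here too: if the true growth rate differs from the nominal `r` used in the inversion, the imposed decay rate is wrong, which is what motivates the neural-network adaptation.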

Nucleation is the first step in the formation of a new phase inside a mother phase. Two main forms of nucleation can be distinguished. In homogeneous nucleation, the new phase is formed in a uniform substance. In heterogeneous nucleation, on the other hand, the new phase emerges on a pre-existing surface (nucleation site). Nucleation is the source of about 30% of all atmospheric aerosol, which in turn has noticeable health effects and a significant impact on climate. Nucleation can be observed in the atmosphere, studied experimentally in the laboratory, and is the subject of ongoing theoretical research. This thesis attempts to be a link between experiment and theory. By comparing simulation results to experimental data, the aim is to (i) better understand the experiments and (ii) determine where the theory needs improvement. Computational fluid dynamics (CFD) tools were used to simulate homogeneous one-component nucleation of n-alcohols in argon and helium as carrier gases, homogeneous nucleation in the water-sulfuric acid system, and heterogeneous nucleation of water vapor on silver particles. In the nucleation of n-alcohols, vapor depletion, the carrier gas effect and the carrier gas pressure effect were evaluated, with a special focus on the pressure effect, whose dependence on vapor and carrier gas properties could be specified. The investigation of nucleation in the water-sulfuric acid system included a thorough analysis of the experimental setup, determining the flow conditions, vapor losses, and nucleation zone. Experimental nucleation rates were compared to various theoretical approaches. We found that none of the considered theoretical descriptions of nucleation captured the role of water in the process at all relative humidities. Heterogeneous nucleation was studied in the activation of silver particles in a TSI 3785 particle counter, which uses water as its working fluid. The role of the contact angle was investigated, and the influence of incoming particle concentrations and homogeneous nucleation on the counting efficiency was determined.
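The classical-theory quantities behind such theory-experiment comparisons are compact enough to sketch: the critical cluster radius r* = 2σv/(kT ln S) and the formation barrier ΔG* = 16πσ³v²/(3(kT ln S)²). The water-like parameter values below are illustrative, not numbers from the thesis.

```python
import math

# Classical nucleation theory (CNT) estimates for homogeneous nucleation:
#   r*  = 2*sigma*v / (kT*ln S)                 (critical radius)
#   dG* = 16*pi*sigma^3*v^2 / (3*(kT*ln S)^2)   (formation barrier)
# sigma: surface tension, v: molecular volume, S: saturation ratio.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def cnt_critical_cluster(sigma, v_mol, T, S):
    """Return (critical radius in m, barrier height in units of kT)."""
    kt_lns = K_B * T * math.log(S)
    r_star = 2.0 * sigma * v_mol / kt_lns
    dg_star = 16.0 * math.pi * sigma**3 * v_mol**2 / (3.0 * kt_lns**2)
    return r_star, dg_star / (K_B * T)

# Illustrative water-like numbers: sigma = 0.0728 N/m, v = 3e-29 m^3
r_star, barrier_kt = cnt_critical_cluster(0.0728, 3.0e-29, 280.0, 5.0)
```

Sub-nanometre critical radii and barriers of a few tens of kT are typical of such conditions, which is why nucleation rates are so sensitive to the saturation ratio and to the details of the theory.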

This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis work to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, the computer technology of clusters, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation toolkit, and the significance of the new models for computing in the LHC era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled simulation-to-data-analysis cycle: typically, a Geant4 computer experiment is used to understand test beam measurements. Thus another aspect of this thesis is a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, a full CMS detector description, and event reconstruction. Using the ROOT data analysis framework, we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of NN methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.
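The signal-versus-background separation step can be illustrated in miniature: a single sigmoid neuron trained by gradient descent on a synthetic one-dimensional discriminant. The feature distributions and training setup are invented stand-ins, not the CMS/ROOT analysis.

```python
import math, random

# Toy stand-in for ANN-based tagging: one sigmoid neuron trained with
# plain gradient descent to separate a synthetic "signal" discriminant
# (Gaussian centred at +1.5) from "background" (centred at -1.5).

rng = random.Random(42)
data = ([(rng.gauss(1.5, 1.0), 1.0) for _ in range(500)] +    # signal
        [(rng.gauss(-1.5, 1.0), 0.0) for _ in range(500)])    # background
rng.shuffle(data)

w, b, lr = 0.0, 0.0, 0.1
for _ in range(200):                          # training epochs
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        w -= lr * (p - y) * x                 # cross-entropy gradient step
        b -= lr * (p - y)

accuracy = sum(
    ((1.0 / (1.0 + math.exp(-(w * x + b)))) > 0.5) == (y == 1.0)
    for x, y in data
) / len(data)
```

Real taggers feed many jet observables into multi-layer networks, but the training loop, the sigmoid output interpreted as a signal probability, and the cut on that output are the same ingredients.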

Nucleation is the first step of a first-order phase transition; in nucleation phenomena a new phase always springs up. The two main categories of nucleation are homogeneous nucleation, where the new phase is formed in a uniform substance, and heterogeneous nucleation, where nucleation occurs on a pre-existing surface. In this thesis the main attention is paid to heterogeneous nucleation. The thesis treats nucleation phenomena from two theoretical perspectives: the classical nucleation theory and the statistical mechanical approach. The formulation of the classical nucleation theory relies on equilibrium thermodynamics and the use of macroscopically determined quantities to describe the properties of small nuclei, sometimes consisting of just a few molecules. The statistical mechanical approach is based on interactions between single molecules and does not carry the same assumptions as the classical theory. This work gathers up the present theoretical knowledge of heterogeneous nucleation and utilizes it in computational model studies. A new exact molecular approach to heterogeneous nucleation was introduced and tested by Monte Carlo simulations. The results obtained from the molecular simulations were interpreted by means of the concepts of the classical nucleation theory. Numerical calculations were carried out for a variety of substances nucleating on different substrates. The classical theory of heterogeneous nucleation was employed in calculations of one-component nucleation of water on newsprint paper, Teflon and cellulose film, and of binary nucleation of water-n-propanol and water-sulphuric acid mixtures on silver nanoparticles. The results were compared with experimental results. The molecular simulation studies involved homogeneous nucleation of argon and heterogeneous nucleation of argon on a planar platinum surface. It was found that the use of a microscopic contact angle as a fitting parameter in calculations based on the classical theory of heterogeneous nucleation leads to fair agreement between theoretical predictions and experimental results. In the presented cases the microscopic angle was always smaller than the contact angle obtained from macroscopic measurements. Furthermore, the molecular Monte Carlo simulations revealed that the concept of the geometrical contact parameter in heterogeneous nucleation calculations can work surprisingly well even for very small clusters.
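The role of the contact angle in the classical theory can be made concrete: on a flat substrate the homogeneous barrier is reduced by the geometric factor f(θ) = (2 + cos θ)(1 − cos θ)²/4, so a smaller microscopic angle implies a much lower barrier. The angles below are illustrative, not fitted values from the thesis.

```python
import math

# Classical heterogeneous nucleation on a flat substrate: the homogeneous
# formation barrier dG*_hom is reduced to f(theta) * dG*_hom, where
#   f(theta) = (2 + cos(theta)) * (1 - cos(theta))**2 / 4
# and theta is the contact angle of the nucleating cap on the substrate.

def geometric_factor(theta_deg):
    m = math.cos(math.radians(theta_deg))
    return (2.0 + m) * (1.0 - m) ** 2 / 4.0

f_macroscopic = geometric_factor(90.0)  # f = 0.5: barrier halved
f_microscopic = geometric_factor(30.0)  # far smaller barrier
```

The limits behave as expected: f(0°) = 0 (perfect wetting, no barrier) and f(180°) = 1 (no help from the surface, the homogeneous barrier is recovered), which is why fitting a smaller microscopic angle systematically raises the predicted nucleation rate.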

Although simulation is an emerging research method that has shown promise in several research disciplines, it has received relatively little attention in information systems research. This paper illustrates a framework for employing simulation to study IT value cocreation. Although previous studies have identified factors driving IT value cocreation, the underlying process remains unclear. Simulation can address this limitation by exploring that underlying process through computational experiments. The simulation framework in this paper is based on an extended NK model, with agent-based modeling employed as the theoretical basis for the NK model extensions.
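The base NK model the paper extends can be sketched directly: N binary decision variables, each contributing a fitness term that depends on its own state and on K epistatic neighbours, with contributions drawn at random. Parameters below are illustrative.

```python
import random

# Minimal Kauffman-style NK fitness landscape: N binary loci, each locus's
# fitness contribution depends on itself and K randomly chosen other loci.
# Contribution values are drawn uniformly at random and cached so that the
# landscape is fixed once generated.

def make_nk(n=10, k=2, seed=7):
    rng = random.Random(seed)
    deps = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    table = {}  # lazily filled contribution table

    def fitness(bits):
        total = 0.0
        for i in range(n):
            key = (i, bits[i]) + tuple(bits[j] for j in deps[i])
            if key not in table:
                table[key] = rng.random()
            total += table[key]
        return total / n

    return fitness

fitness = make_nk()
f0 = fitness((0,) * 10)          # fitness of one configuration
f1 = fitness((1,) + (0,) * 9)    # fitness of a one-bit neighbour
```

Agents performing local search on such a landscape, as in agent-based NK extensions, repeatedly flip single bits and keep moves that raise fitness; raising K makes the landscape more rugged and local search less effective.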

We compare magnetovolume effects in bulk material and in nanoparticles by performing Monte Carlo simulations of a spin-analogous model with coupled spatial and magnetic degrees of freedom and chemical disorder. We find that correlations between surface and bulk atoms lead, with decreasing particle size, to a substantial modification of the magnetic and elastic behavior at low temperatures.
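The Metropolis Monte Carlo move underlying such spin-model simulations can be shown on a plain one-dimensional Ising chain; the paper's model additionally couples spatial (volume) and magnetic degrees of freedom, which this sketch deliberately omits.

```python
import math, random

# Core Metropolis Monte Carlo step for a spin model, illustrated on a
# 1-D Ising chain with periodic boundaries: propose a single-spin flip,
# accept it with probability min(1, exp(-beta * dE)).

def metropolis_sweep(spins, beta, rng):
    n = len(spins)
    for _ in range(n):
        i = rng.randrange(n)
        # energy change of flipping spin i (nearest-neighbour coupling J = 1)
        dE = 2.0 * spins[i] * (spins[(i - 1) % n] + spins[(i + 1) % n])
        if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]
    return spins

rng = random.Random(0)
spins = [1] * 64                 # start fully ordered
for _ in range(200):
    metropolis_sweep(spins, beta=2.0, rng=rng)   # low-temperature sweeps
magnetization = abs(sum(spins)) / len(spins)
```

Coupling a continuous displacement variable per site to the exchange interaction, and proposing moves in both spin and position, turns this same acceptance rule into a magnetovolume simulation.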

We offer a technique, motivated by feedback control and specifically by sliding mode control, for the simulation of differential-algebraic equations (DAEs) that describe common engineering systems such as constrained multibody mechanical structures and electric networks. Our algorithm exploits basic results from sliding mode control theory to establish a simulation environment that then requires only the most primitive of numerical solvers. We circumvent the most important requisite for the conventional simulation of DAEs: the calculation of a set of consistent initial conditions. Our algorithm, which relies on the enforcement and occurrence of sliding mode, ensures that the algebraic equation is satisfied by the dynamic system even for inconsistent initial conditions, and for all time thereafter. [DOI: 10.1115/1.4001904]
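The idea can be sketched on a toy semi-explicit DAE: drive the algebraic residual g = y − c(t) to zero with a discontinuous switching term, so the constraint is recovered even from an inconsistent initial value. The system, gain, and forward-Euler solver below are illustrative assumptions, not the paper's formulation.

```python
import math

# Sliding-mode enforcement of an algebraic constraint, sketched on a toy
# problem: force y to satisfy y = sin(t) by integrating
#   y' = -K * sign(y - sin(t))
# with a primitive explicit Euler solver. The gain K must dominate the
# constraint's rate of change (|cos(t)| <= 1), so any K > 1 works.

def track_constraint(y0=1.0, gain=3.0, dt=1e-3, steps=5000):
    y, t = y0, 0.0                       # y0 violates y = sin(0) = 0
    for _ in range(steps):
        g = y - math.sin(t)              # algebraic residual
        y += dt * (-gain * (1.0 if g > 0.0 else -1.0))
        t += dt
    return abs(y - math.sin(t))

residual = track_constraint()            # small once sliding is reached
```

After a finite reaching phase the solution "slides" along the constraint surface, with a residual bounded by roughly the gain times the step size, which is why only the most primitive solver is needed.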

Gene mapping is a systematic search for genes that affect observable characteristics of an organism. In this thesis we offer computational tools to improve the efficiency of (disease) gene-mapping efforts. In the first part of the thesis we propose an efficient simulation procedure for generating realistic genetic data from isolated populations. Simulated data are useful for evaluating hypothesised gene-mapping study designs and computational analysis tools. As an example of such evaluation, we demonstrate how a population-based study design can be a powerful alternative to traditional family-based designs in association-based gene-mapping projects. In the second part of the thesis we consider the prioritisation of a (typically large) set of putative disease-associated genes acquired from an initial gene-mapping analysis. Prioritisation is necessary to be able to focus on the most promising candidates. We show how to harness current biomedical knowledge for the prioritisation task by integrating various publicly available biological databases into a weighted biological graph. We then demonstrate how to find and evaluate connections between entities, such as genes and diseases, in this unified schema by graph-mining techniques. Finally, in the last part of the thesis, we define the concept of a reliable subgraph and the corresponding subgraph extraction problem. Reliable subgraphs concisely describe strong and independent connections between two given vertices in a random graph, and hence they are especially useful for visualising such connections. We propose novel algorithms for extracting reliable subgraphs from large random graphs. The efficiency and scalability of the proposed graph-mining methods are backed by extensive experiments on real data. While our application focus is in genetics, the concepts and algorithms can be applied to other domains as well.
We demonstrate this generality by considering coauthor graphs in addition to biological graphs in the experiments.
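The quantity that reliable subgraphs summarise, the probability that two terminals stay connected when each edge of a random graph exists independently, is easy to estimate by Monte Carlo. The tiny gene-disease example graph below is illustrative, not data from the thesis.

```python
import random
from collections import deque

# Monte Carlo estimate of two-terminal reliability in a random graph:
# each edge exists independently with probability p; we sample graph
# realisations and count how often the terminals are connected.

def connected(nodes, edges, s, t):
    adj = {v: [] for v in nodes}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, queue = {s}, deque([s])
    while queue:                         # breadth-first search from s
        u = queue.popleft()
        if u == t:
            return True
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return False

def two_terminal_reliability(nodes, edges, p, s, t, trials=20000, seed=3):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        sample = [e for e in edges if rng.random() < p]  # one realisation
        hits += connected(nodes, sample, s, t)
    return hits / trials

# Two independent two-edge paths from "gene" to "disease":
nodes = ["gene", "a", "b", "disease"]
edges = [("gene", "a"), ("a", "disease"), ("gene", "b"), ("b", "disease")]
rel = two_terminal_reliability(nodes, edges, p=0.9, s="gene", t="disease")
```

For this graph the exact value is 1 − (1 − 0.9²)² ≈ 0.964; a reliable subgraph would keep exactly these two independent paths while discarding edges that contribute little to the connection probability.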

An implicit sub-grid-scale model for large eddy simulation is presented by utilising the concept of a relaxation system for the one-dimensional Burgers' equation in a novel way. Burgers' equation is solved for three different unsteady flow situations by varying the ratio of the relaxation parameter (epsilon) to the time step. The coarse-mesh results obtained with the relaxation scheme are compared with the filtered DNS solution of the same problem on a fine mesh, computed using a fourth-order CWENO discretisation in space and a third-order TVD Runge-Kutta discretisation in time. The numerical solutions obtained through the relaxation system have the same order of accuracy in space and time, and they closely match the filtered DNS solutions.
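A relaxation system for Burgers' equation replaces u_t + (u²/2)_x = 0 by the linear hyperbolic pair u_t + v_x = 0, v_t + a·u_x = (f(u) − v)/ε, which can be advanced by upwinding the characteristic variables w = v + √a·u and z = v − √a·u. The sketch below uses first-order upwinding and an implicit relaxation step for brevity, whereas the paper uses CWENO/TVD Runge-Kutta discretisations; the grid and parameters are illustrative.

```python
import math

# Jin-Xin-style relaxation scheme for inviscid Burgers' equation on a
# periodic grid: transport the characteristic variables w = v + c*u and
# z = v - c*u (c = sqrt(a)) with first-order upwinding, then relax v
# implicitly toward the flux f(u) = u^2/2 (stiff source, eps << dt).

def burgers_relaxation(n=128, a=1.5, eps=1e-6, t_end=0.3):
    dx = 2.0 * math.pi / n
    c = math.sqrt(a)                       # a must exceed max f'(u)^2
    dt = 0.5 * dx / c                      # CFL-limited time step
    u = [math.sin(i * dx) for i in range(n)]
    v = [0.5 * ui * ui for ui in u]        # start in local equilibrium
    t = 0.0
    while t < t_end:
        w = [v[i] + c * u[i] for i in range(n)]
        z = [v[i] - c * u[i] for i in range(n)]
        lam = c * dt / dx
        w = [w[i] - lam * (w[i] - w[i - 1]) for i in range(n)]        # right-going
        z = [z[i] + lam * (z[(i + 1) % n] - z[i]) for i in range(n)]  # left-going
        u = [(w[i] - z[i]) / (2.0 * c) for i in range(n)]
        v = [(w[i] + z[i]) / 2.0 for i in range(n)]
        r = dt / eps                       # implicit relaxation of v -> f(u)
        v = [(v[i] + r * 0.5 * u[i] * u[i]) / (1.0 + r) for i in range(n)]
        t += dt
    return u, dx

u, dx = burgers_relaxation()
mass = sum(u) * dx   # conserved; zero for the sine initial condition
```

Varying the ratio dt/eps, as the paper does, moves the scheme between the stiff relaxed limit (which recovers Burgers' equation) and an under-relaxed regime whose extra dissipation acts as the implicit sub-grid-scale model.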

Discrete vortex simulations of the mixing layer carried out in the past have usually involved large induced velocity fluctuations, and have thus demanded rather long time-averaging to obtain satisfactory values of Reynolds stresses and third-order moments. This difficulty is traced here, in part, to the use of discrete vortices to model what are in actuality continuous vortex sheets. We propose a novel two-dimensional vortex sheet technique for computing mixing layer flow in the limit of infinite Reynolds number. The method divides the vortex sheet into constant-strength linear elements, whose motions are computed using the Biot-Savart law. The downstream far-field is modelled by a steady vorticity distribution derived by application of conical similarity from the solution obtained in a finite computational domain. The boundary condition on the splitter plate is satisfied rigorously using a doublet sheet. The computed large-scale roll-up of the vortex sheet is qualitatively similar to experimentally obtained shadowgraphs of the plane turbulent mixing layer. The mean streamwise velocity profile and the growth rate agree well with experimental data. The presently computed Reynolds stresses and third-order moments are comparable with experimental and previous vortex-dynamical results, without using any external parameter (such as the vortex core size) of the kind often used in the latter. The computed autocorrelations are qualitatively similar to experimental results along the top and bottom edges of the mixing layer, and show a well-defined periodicity along the centreline. The accuracy of the present computation is independently established by demonstrating negligibly small changes in the five invariants (including the Hamiltonian) of the vortex dynamics.
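The Biot-Savart induction used to advance vortex elements is easiest to see with point vortices (the discrete model the paper improves upon, rather than its linear sheet elements): two equal point vortices co-rotate about their centroid, and that centroid, one of the invariants of the dynamics, stays fixed. The setup below is illustrative.

```python
import math

# 2-D Biot-Savart induction for point vortices: a vortex of circulation
# gamma at (xv, yv) induces velocity (-gamma*dy, gamma*dx) / (2*pi*r^2)
# at a point a displacement (dx, dy) away. Self-induction is excluded.

def induced_velocity(x, y, vortices):
    u = v = 0.0
    for xv, yv, gamma in vortices:
        dx, dy = x - xv, y - yv
        r2 = dx * dx + dy * dy
        if r2 > 1e-12:                    # skip the vortex's own singularity
            u += -gamma * dy / (2.0 * math.pi * r2)
            v += gamma * dx / (2.0 * math.pi * r2)
    return u, v

def step(vortices, dt):
    vels = [induced_velocity(xv, yv, vortices) for xv, yv, _ in vortices]
    return [(xv + dt * u, yv + dt * v, g)
            for (xv, yv, g), (u, v) in zip(vortices, vels)]

vortices = [(-0.5, 0.0, 1.0), (0.5, 0.0, 1.0)]   # equal co-rotating pair
for _ in range(1000):
    vortices = step(vortices, dt=0.001)
centroid_x = sum(x for x, _, _ in vortices) / 2.0
centroid_y = sum(y for _, y, _ in vortices) / 2.0
```

Monitoring such invariants (circulation-weighted centroid, impulse, Hamiltonian) during long runs is exactly the accuracy check the paper applies to its sheet-element computation.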

An improved Monte Carlo technique is presented in this work to simulate nanoparticle formation through a micellar route. The technique builds on the simulation technique proposed by Bandyopadhyaya et al. (Langmuir 2000, 16, 7139), which is general and rigorous but at the same time very computation intensive, so much so that nanoparticle formation in low-occupancy systems cannot be simulated in reasonable time. In view of this, several strategies, rationalized by simple mathematical analyses, are proposed to accelerate the Monte Carlo simulations. These are the elimination of infructuous events, the removal of excess reactant after reaction, and the use of a smaller micelle population a large number of times. Infructuous events include the collision of an empty micelle with another empty one, or with one containing only a single molecule or only a solid particle. These strategies are incorporated in a new simulation technique which divides the entire micelle population into four classes and shifts micelles from one class to another as the simulation proceeds. The simulation results, thoroughly tested using chi-square and other tests, show that the predictions of the improved technique remain unchanged, but with more than an order of magnitude decrease in computational effort for some of the simulations reported in the literature. An a posteriori validation scheme for the correctness of the simulation results has been utilized to propose a new simulation strategy that arrives at converged simulation results with near-minimum computational effort.
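Why infructuous events dominate at low occupancy can be shown with a small counting experiment: draw Poisson-distributed micelle occupancies and measure how many uniformly sampled collision pairs could actually do anything. The fruitfulness rule and parameters below are toy stand-ins for the paper's four-class bookkeeping.

```python
import math, random

# Counting infructuous collisions in a low-occupancy micellar system:
# occupancies are Poisson-distributed; a collision is counted as fruitful
# here only if at least one colliding micelle carries >= 2 solute molecules
# (a toy stand-in for the paper's rules). Class-partitioned sampling picks
# fruitful pairs directly instead of rejecting the wasted majority.

def sample_poisson(rng, lam):
    """Knuth's product method, adequate for small lambda."""
    threshold = math.exp(-lam)
    k, p = 0, rng.random()
    while p > threshold:
        k += 1
        p *= rng.random()
    return k

def infructuous_fraction(mean_occupancy=0.3, n_micelles=10000, trials=20000):
    rng = random.Random(5)
    occ = [sample_poisson(rng, mean_occupancy) for _ in range(n_micelles)]
    wasted = 0
    for _ in range(trials):
        i, j = rng.randrange(n_micelles), rng.randrange(n_micelles)
        if max(occ[i], occ[j]) < 2:       # nothing can react or grow
            wasted += 1
    return wasted / trials

frac = infructuous_fraction()   # ~0.93 at mean occupancy 0.3
```

With over 90% of uniformly sampled pairs wasted at this occupancy, skipping infructuous events by construction is where the order-of-magnitude speed-up comes from.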

CFD investigations are carried out to study the heat flux and temperature distribution in the calandria using a three-dimensional RANS code. Internal flow computations and experimental studies are carried out for a calandria embedded with a matrix of tubes working together as a reactor. Numerical investigations are carried out on the calandria reactor vessel, with horizontal inlets and outlets located at the top and the bottom, to study the flow pattern and the associated temperature distribution. The computations simulate fluid flow and convective heat transfer for assigned near-working conditions with different moderator injection rates and reacting heat fluxes. The results of the computations provide an estimate of the tolerance bands for safe heat-dissipation limits at different working conditions by predicting the hot spots in the calandria. The isothermal CFD results are validated by a set of experiments on a specially designed scaled model conducted over a range of flows and simulation parameters. The comparison of CFD results with experiments shows good agreement.