988 results for Cellular-Automata


Relevance:

60.00%

Publisher:

Abstract:

In this work, some aspects of the erythrocyte cycle of the malaria parasite were incorporated into a cellular automata model to simulate the major factors leading to disruption of the erythrocyte cycle and the consequent appearance of gametocytes, which infect the mosquitoes. Furthermore, the time series of parasitaemia of infected patients were analyzed and compared to simulated data. The results suggest that differences in the temporal patterns of the asexual parasitaemia are associated with different degrees of effectiveness of the immune system in controlling the infection.
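As a rough illustration of the kind of probabilistic cellular automaton this abstract describes (a hypothetical sketch with made-up parameters, not the authors' model), infected erythrocytes can clear, commit to gametocytes, or infect neighbours on a ring:

```python
import random

# States: 0 = healthy erythrocyte, 1 = asexually infected, 2 = gametocyte.
# All probabilities below are illustrative assumptions, not fitted values.
def step(grid, p_infect=0.3, p_gameto=0.05, p_clear=0.1, rng=random):
    n = len(grid)
    new = grid[:]
    for i, s in enumerate(grid):
        if s == 1:
            if rng.random() < p_clear:
                new[i] = 0                            # immune clearance
            elif rng.random() < p_gameto:
                new[i] = 2                            # commitment to gametocyte
            else:
                for j in ((i - 1) % n, (i + 1) % n):  # infect ring neighbours
                    if grid[j] == 0 and rng.random() < p_infect:
                        new[j] = 1
    return new

rng = random.Random(42)
grid = [0] * 100
grid[50] = 1                     # single initial infection
for _ in range(30):
    grid = step(grid, rng=rng)
parasitaemia = grid.count(1)     # asexual time-series sample
gametocytes = grid.count(2)
```

Repeating the run while varying `p_clear` would mimic different immune effectiveness, as in the abstract's comparison of temporal patterns.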

Relevance:

60.00%

Publisher:

Abstract:

The market for digital games has grown in recent years, becoming popular across many age groups; the number of smartphone and tablet users has also shown a recent increase, including those using Android as the operating system. The main objective of a digital game is the ludic activity, but it can also be used as a tool for education, learning and even simulation. This work proposes the development of a game for smartphones or tablets running on the Android operating system. The game simulates living beings in an environment, each with different behaviors, based on the concepts of artificial life, cellular automata and emergence. In this way it simulates the behavior of a community of living beings on a computational basis of artificial life, following concepts of game design. The game can visually represent some characteristics of living beings, as well as behaviors and interactions between them, in a very simple way. The game can be upgraded in the future to represent living beings better, adding more detail to the simulation.
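The classic example of emergence from cellular automata in artificial-life games is Conway's Game of Life (a standard B3/S23 rule set, shown here as a generic sketch rather than the game described in this abstract):

```python
from collections import Counter

def life_step(cells):
    # cells: set of (x, y) live coordinates; standard B3/S23 rules:
    # a dead cell with exactly 3 live neighbours is born,
    # a live cell with 2 or 3 live neighbours survives.
    counts = Counter((x + dx, y + dy) for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in cells)}

blinker = {(0, 0), (1, 0), (2, 0)}   # horizontal triple
after = life_step(blinker)           # oscillates to a vertical triple
```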

Relevance:

60.00%

Publisher:

Abstract:

In epidemiology, the basic reproduction number R0 is usually defined as the average number of new infections caused by a single infective individual introduced into a completely susceptible population. According to this definition, R0 is related to the initial stage of the spreading of a contagious disease. However, in epidemiological models based on ordinary differential equations (ODE), R0 is commonly derived from a linear stability analysis and interpreted as a bifurcation parameter: typically, when R0 > 1, the contagious disease tends to persist in the population because the endemic stationary solution is asymptotically stable; when R0 < 1, the corresponding pathogen tends to naturally disappear because the disease-free stationary solution is asymptotically stable. Here we intend to answer the following question: do these two different approaches for calculating R0 give the same numerical values? In other words, is the number of secondary infections caused by a unique sick individual equal to the threshold obtained from stability analysis of the steady states of the ODE? To find the answer, we use a susceptible-infective-recovered (SIR) model described in terms of ODE and also in terms of a probabilistic cellular automaton (PCA), where each individual (corresponding to a cell of the PCA lattice) is connected to others by a random network favoring local contacts. The values of R0 obtained from both approaches are compared, showing good agreement. (C) 2012 Elsevier B.V. All rights reserved.
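The bifurcation-parameter interpretation can be checked numerically with a plain forward-Euler integration of the SIR equations (a textbook sketch, not the authors' PCA; the parameter values are assumptions): in this model R0 = β/γ, and the infection only takes off when R0 > 1.

```python
def sir_step(s, i, r, beta, gamma, dt=0.01):
    # Forward-Euler step of dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I,
    # dR/dt = gamma*I, with S + I + R = 1.
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    dr = gamma * i
    return s + ds * dt, i + di * dt, r + dr * dt

def peak_infected(beta, gamma, steps=50000):
    s, i, r = 0.99, 0.01, 0.0
    peak = i
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
        peak = max(peak, i)
    return peak

# R0 = beta/gamma: above 1 the epidemic grows, below 1 it fades out.
grow = peak_infected(beta=0.5, gamma=0.2)   # R0 = 2.5
fade = peak_infected(beta=0.1, gamma=0.2)   # R0 = 0.5
```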

Relevance:

60.00%

Publisher:

Abstract:

We study the firing rate properties of a cellular automaton model for a neuronal network with chemical synapses. We propose a simple mechanism in which nonlocal connections are included through electrical and chemical synapses. In the latter case, we introduce a time delay which produces self-sustained activity. Nonlocal connections, or shortcuts, are randomly introduced according to a specified connection probability. There is a range of connection probabilities for which neuron firing occurs, as well as a critical probability at which the firing ceases in the absence of time delay. The critical probability for nonlocal shortcuts depends on the network size according to a power law. We also compute the firing rate amplification factor by varying both the connection probability and the time delay for different network sizes. (C) 2011 Elsevier B.V. All rights reserved.
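A minimal sketch of the ingredients described here (ring lattice plus random shortcuts, excitable firing/refractory dynamics) might look as follows; the network construction, refractory period and probabilities are all assumptions, not the paper's model:

```python
import random

def build_network(n, p_shortcut, rng):
    # Ring lattice with two local neighbours per node, plus random
    # nonlocal shortcuts added with probability p_shortcut per node.
    adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
    for i in range(n):
        if rng.random() < p_shortcut:
            j = rng.randrange(n)
            if j != i:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def ca_step(states, adj, refractory=2):
    # States: 0 = resting, 1 = firing, 2..refractory+1 = refractory countdown.
    new = []
    for i, s in enumerate(states):
        if s == 0:
            new.append(1 if any(states[j] == 1 for j in adj[i]) else 0)
        elif s < refractory + 1:
            new.append(s + 1)
        else:
            new.append(0)
    return new

rng = random.Random(1)
adj = build_network(50, 0.1, rng)
states = [0] * 50
states[0] = 1                    # single initial firing neuron
for _ in range(10):
    states = ca_step(states, adj)
firing = states.count(1)         # instantaneous firing activity
```

Sweeping `p_shortcut` for several network sizes and recording whether `firing` dies out is the kind of experiment that would expose a critical connection probability.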

Relevance:

60.00%

Publisher:

Abstract:

In this work we present an agent-based model for the spread of tuberculosis in which individuals can be infected with either drug-susceptible or drug-resistant strains and can also receive treatment. The dynamics of the model and the role of each of the parameters are explained. The whole set of parameters is explored to check their importance for the numerical simulation results. The model captures the beneficial impact of adequate treatment on the prevalence of tuberculosis. Nevertheless, depending on the range of the treatment parameters, it also captures the emergence of drug resistance. Drug resistance emergence is particularly likely to occur for parameter values corresponding to less efficacious treatment, as usually found in developing countries.

Relevance:

60.00%

Publisher:

Abstract:

We investigate the nonequilibrium roughening transition of a one-dimensional restricted solid-on-solid model by directly sampling the stationary probability density of a suitable order parameter as the surface adsorption rate varies. The shapes of the probability density histograms suggest a typical Ginzburg-Landau scenario for the phase transition of the model, and estimates of the "magnetic" exponent seem to confirm its mean-field critical behavior. We also found that the flipping times between the metastable phases of the model scale exponentially with the system size, signaling the breaking of ergodicity in the thermodynamic limit. Incidentally, we discovered that a closely related model not considered before also displays a phase transition with the same critical behavior as the original model. Our results support the usefulness of off-critical histogram techniques in the investigation of nonequilibrium phase transitions. We also briefly discuss in the appendix a good and simple pseudo-random number generator used in our simulations.
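The appendix mentioned above presents the authors' own generator; purely as a generic illustration of the "simple pseudo-random number generator" idea, a minimal linear congruential generator with glibc-style constants (an assumption, not the paper's generator) is:

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    # Minimal linear congruential generator: state_{n+1} = (a*state_n + c) mod m.
    # Constants here are the classic glibc-style choice, shown for illustration.
    state = seed
    while True:
        state = (a * state + c) % m
        yield state

gen = lcg(seed=1)
samples = [next(gen) for _ in range(3)]
uniform = [x / 2**31 for x in samples]   # map raw states to [0, 1)
```

LCGs this small are fine for demonstrations but have known correlations; production Monte Carlo work typically uses longer-period generators.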

Relevance:

60.00%

Publisher:

Abstract:

The principles and guidelines of the Brazilian Unified Health System (SUS) impose a care structure based on public policy networks that, combined with the adopted financing model, leads to market failures. This creates barriers to the management of the public health system and to achieving the objectives of the SUS. The institutional characteristics and the heterogeneity of the actors, together with the existence of different health care networks, generate analytical complexity in the study of the global dynamics of the SUS network. There are limitations to the use of quantitative methods based on static analysis with retrospective data from the public health system. We therefore propose approaching the SUS as a complex system, using an innovative quantitative methodology based on computational simulation. This article analyzes the challenges and potential of cellular automata modeling combined with agent-based modeling to simulate the evolution of the SUS service network. Such an approach should allow a better understanding of the organization, heterogeneity and structural dynamics of the SUS service network, and make it possible to minimize the effects of market failures in the Brazilian health system.

Relevance:

60.00%

Publisher:

Abstract:

Water distribution network optimization is a challenging problem due to the dimension and complexity of these systems. Since the second half of the twentieth century this field has been investigated by many authors. Recently, to overcome the discrete nature of the variables and the nonlinearity of the equations, research has focused on the development of heuristic algorithms. These algorithms do not require continuity or linearity of the problem functions because they are linked to an external hydraulic simulator that solves the equations of mass continuity and energy conservation of the network. In this work, NSGA-II (Non-dominated Sorting Genetic Algorithm II) has been used. This is a heuristic multi-objective genetic algorithm based on the analogy of evolution in nature. Starting from an initial random set of solutions, called a population, it evolves them towards a front of solutions that minimize all the objectives separately and simultaneously. This can be very useful in practical problems where multiple and conflicting goals are common. Usually, one of the main drawbacks of these algorithms is the computation time: being a stochastic search, many solutions must be analyzed before good ones are found. The results of this thesis on the classical optimal design problem show that it is possible to improve the results by modifying the mathematical definition of the objective functions and the survival criterion, inserting good solutions created by a cellular automaton and using rules created by a classifier algorithm (C4.5). This part has been tested using the version of NSGA-II supplied by the Centre for Water Systems (University of Exeter, UK) in the MATLAB® environment. Even if orienting the search can constrain the algorithm, with the risk of not finding the optimal set of solutions, it can greatly improve the results.
Subsequently, thanks to CINECA's help, a version of NSGA-II has been implemented in the C language and parallelized: the results on global parallelization show the speed-up, while the results on island parallelization show that communication among islands can improve the optimization. Finally, some tests on the optimization of pump scheduling have been carried out. In this case, good results are found for a small network, while the solutions of a large problem are affected by the lack of constraints on the number of pump switches. Possible future research concerns the insertion of further constraints and the guidance of the evolution. In the end, the optimization of water distribution systems is still far from a definitive solution, but improvements in this field can be very useful in reducing the cost of solutions to practical problems, where the high number of variables makes their management very difficult from a human point of view.
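The core of NSGA-II's "front of solutions" is non-dominated sorting. A minimal sketch of extracting the first Pareto front (for minimization; the example objective values, a cost/pressure-deficit trade-off, are hypothetical):

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective
    # and strictly better in at least one (minimization).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # First front of NSGA-II-style non-dominated sorting:
    # keep every point that no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (network cost, pressure deficit) pairs for five designs.
solutions = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5), (4.0, 4.0)]
front = pareto_front(solutions)
```

Full NSGA-II additionally ranks the remaining points into successive fronts and breaks ties with crowding distance; this sketch shows only the dominance test at its heart.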

Relevance:

60.00%

Publisher:

Abstract:

Traditional logic gates are rapidly reaching the limits of miniaturization, and the overheating of these components is no longer negligible. A new physical approach to computation was proposed by Prof. C. S. Lent: molecular quantum cellular automata. Indeed, the quantum-dot cellular automata (QCA) approach offers an attractive alternative to diode or transistor devices. The units encode binary information by two polarizations, without current flow. The units of QCA theory are called QCA cells and can be realized in several ways; molecules can act as QCA cells at room temperature. In collaboration with STMicroelectronics, the electrochemistry group of Prof. Paolucci and the nanotechnology laboratory in Lecce, we synthesized and studied with many techniques surface-active chiral bis-ferrocenes, conveniently designed to act as prototypical units for molecular computing devices. The chemistry of ferrocene has been studied thoroughly, and we found the opportunity to promote substitution reactions of ferrocenyl alcohols with various nucleophiles without the aid of Lewis acids as catalysts. The only interaction between water and the two reagents is involved in the formation of a carbocation species, which is the true reactive species. We have generalized this concept to other benzyl alcohols which generate stabilized carbocations. Carbocations described on Mayr's scale were fundamental for our research. Finally, we used these alcohols to alkylate aldehydes in an enantioselective way via organocatalysis.

Relevance:

60.00%

Publisher:

Abstract:

The assessment of the RAMS (Reliability, Availability, Maintainability and Safety) performance of a system generally includes the evaluation of the "importance" of its components and/or of the basic parameters of the model through the use of importance measures. The analytical equations proposed in this study allow the estimation of the first-order Differential Importance Measure on the basis of the Birnbaum measures of the components, under the hypothesis of uniform percentage changes of the parameters. Aging phenomena are introduced into the model by assuming exponential-linear or Weibull distributions for the failure probabilities. An algorithm based on a combination of Monte Carlo simulation and cellular automata is applied in order to evaluate the performance of a networked system, made up of source nodes, user nodes and directed edges subject to failure and repair. Importance sampling techniques are used for the estimation of the first- and total-order Differential Importance Measures through only one simulation of the system's "operational life". All the output variables are computed simultaneously on the basis of the same sequence of the involved components, event types (failure or repair) and transition times. The failure/repair probabilities are forced to be the same for all components; the transition times are sampled from the unbiased probability distributions, or the sampling can be forced, for instance, by assuring the occurrence of at least one failure within the system's operational life. The algorithm allows considering different types of maintenance actions: corrective maintenance, which can be performed either immediately upon the component failure or, for hidden failures that are not detected until an inspection, upon finding that the component has failed; and preventive maintenance, which can be performed at a fixed interval. It is possible to use a restoration factor to determine the age of the component after a repair or any other maintenance action.
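The Birnbaum measure and its link to the first-order Differential Importance Measure can be illustrated on a toy system (the series-parallel structure and reliabilities below are assumptions, not the study's networked model): B_i = R(p_i = 1) - R(p_i = 0), and under uniform percentage changes DIM_i = B_i p_i / Σ_j B_j p_j.

```python
def series_parallel_reliability(p):
    # Hypothetical system: component 0 in series with the parallel pair (1, 2).
    p0, p1, p2 = p
    return p0 * (1 - (1 - p1) * (1 - p2))

def birnbaum(reliability, p, i):
    # Birnbaum measure B_i = R(p_i = 1) - R(p_i = 0).
    hi = list(p); hi[i] = 1.0
    lo = list(p); lo[i] = 0.0
    return reliability(hi) - reliability(lo)

p = [0.9, 0.8, 0.7]
B = [birnbaum(series_parallel_reliability, p, i) for i in range(3)]

# First-order DIM under uniform percentage changes (dp_i proportional to p_i):
total = sum(b * q for b, q in zip(B, p))
dim = [b * q / total for b, q in zip(B, p)]
```

By construction the DIM values sum to one, which makes them convenient for ranking components; the study's analytical equations generalize this to aging components and networked systems.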

Relevance:

60.00%

Publisher:

Abstract:

This technical report discusses the application of the Lattice Boltzmann Method (LBM) to fluid flow simulation through the porous filter-wall of disordered media. The diesel particulate filter (DPF) is an example of disordered media. The DPF has been developed as a cutting-edge technology to reduce harmful particulate matter in the engine exhaust; the porous filter-wall of the DPF traps these soot particles in the after-treatment of the exhaust gas. To examine the phenomena inside the DPF, researchers are turning to the Lattice Boltzmann Method as a promising alternative simulation tool. The lattice Boltzmann method is a comparatively new numerical scheme that can be used to simulate single-component single-phase and single-component multi-phase fluid flow. It is also an excellent method for modelling flow through disordered media. The current work focuses on a single-phase fluid flow simulation inside the porous micro-structure using LBM. First, the theory concerning the development of LBM is discussed. The evolution of LBM is often related to lattice gas cellular automata (LGCA), but it is also shown that the method is a special discretized form of the continuous Boltzmann equation. Since all the simulations are conducted in two dimensions, the equations developed refer to the D2Q9 (two-dimensional, nine-velocity) model. An artificially created porous micro-structure is used in this study. The flow simulations are conducted considering air and CO2 gas as fluids. The numerical model used in this study is explained with a flowchart and the coding steps. The numerical code is written in MATLAB. The different types of boundary conditions and their importance are discussed separately, and the equations specific to the boundary conditions are derived. The pressure and velocity contours over the porous domain are studied and recorded, and the results are compared with published work.
The permeability values obtained in this study fit the relation proposed by Nabovati [8], and the results are in excellent agreement within the porosity range of 0.4 to 0.8.
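For reference, the D2Q9 model mentioned above uses a standard set of nine discrete velocities and weights, with the equilibrium distribution truncated to second order in the flow velocity (a textbook sketch, not the report's MATLAB code):

```python
# D2Q9 lattice: rest velocity, 4 axis-aligned and 4 diagonal velocities,
# with the standard weights 4/9, 1/9 and 1/36.
velocities = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
              (1, 1), (-1, 1), (-1, -1), (1, -1)]
weights = [4/9] + [1/9] * 4 + [1/36] * 4

def equilibrium(rho, ux, uy):
    # Second-order truncated Maxwell-Boltzmann equilibrium, c_s^2 = 1/3:
    # f_i^eq = w_i rho (1 + 3 c_i.u + 9/2 (c_i.u)^2 - 3/2 u.u)
    feq = []
    for (cx, cy), w in zip(velocities, weights):
        cu = 3 * (cx * ux + cy * uy)
        feq.append(w * rho * (1 + cu + 0.5 * cu**2 - 1.5 * (ux**2 + uy**2)))
    return feq

feq = equilibrium(rho=1.0, ux=0.0, uy=0.0)   # at rest, f_i^eq = w_i * rho
```

A full LBM step then streams each `f_i` along its velocity and relaxes toward `feq` (BGK collision); density is recovered as the sum of the nine populations.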

Relevance:

60.00%

Publisher:

Abstract:

Empirical evidence and theoretical studies suggest that the phenotype, i.e., the cellular- and molecular-scale dynamics, including proliferation rate and adhesiveness due to microenvironmental factors and gene expression, that governs tumor growth and invasiveness also determines gross tumor-scale morphology. It has been difficult to quantify the relative effect of these links on disease progression and prognosis using conventional clinical and experimental methods and observables. As a result, successful individualized treatment of highly malignant and invasive cancers, such as glioblastoma, via surgical resection and chemotherapy cannot be offered, and outcomes are generally poor. What is needed is a deterministic, quantifiable method to enable understanding of the connections between phenotype and tumor morphology. Here, we critically assess the advantages and disadvantages of recent computational modeling efforts (e.g., continuum, discrete, and cellular automata models) that have pursued this understanding. Based on this assessment, we review a multiscale, i.e., from the molecular to the gross tumor scale, mathematical and computational "first-principle" approach based on mass conservation and other physical laws, such as employed in reaction-diffusion systems. Model variables describe known characteristics of tumor behavior, and parameters and functional relationships across scales are informed from in vitro, in vivo and ex vivo biology. We review the feasibility of this methodology which, once coupled to tumor imaging and tumor biopsy or cell culture data, should enable prediction of tumor growth and therapy outcome through quantification of the relation between the underlying dynamics and morphological characteristics.
In particular, morphologic stability analysis of this mathematical model reveals that tumor cell patterning at the tumor-host interface is regulated by cell proliferation, adhesion and other phenotypic characteristics: histopathology information on the tumor boundary can be input into the mathematical model and used as a phenotype-diagnostic tool to predict collective and individual tumor cell invasion of the surrounding tissue. This approach further provides a means to deterministically test the effects of novel and hypothetical therapy strategies on tumor behavior.
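The simplest reaction-diffusion model of the kind this review points to is the Fisher-KPP equation, u_t = D u_xx + r u(1 - u), where diffusion stands in for cell motility and the logistic term for proliferation. A minimal one-dimensional explicit-Euler sketch (all parameter values are assumptions chosen for numerical stability, not values from the review):

```python
def fisher_kpp_step(u, D=0.1, r=1.0, dt=0.01):
    # Explicit Euler step of u_t = D u_xx + r u (1 - u)
    # on a periodic 1-D grid with dx = 1.
    n = len(u)
    new = u[:]
    for i in range(n):
        lap = u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n]
        new[i] = u[i] + dt * (D * lap + r * u[i] * (1 - u[i]))
    return new

u = [0.0] * 60
u[30] = 0.5                      # small seed of tumor cell density
for _ in range(500):
    u = fisher_kpp_step(u)       # density saturates and a front spreads outward
```

The solution stays in [0, 1] and develops a traveling invasion front (speed 2*sqrt(D*r) in the continuum limit), which is the basic mechanism behind the growth and invasion predictions discussed above.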

Relevance:

60.00%

Publisher:

Abstract:

Agent-Based Modelling and simulation (ABM) is a rather new approach for studying complex systems with interacting autonomous agents that has lately undergone great growth in various fields such as biology, physics, social science, economics and business. Efforts to model and simulate the highly complex cement hydration process have been made over the past 40 years, with the aim of predicting the performance of concrete and designing innovative and enhanced cementitious materials. The ABM presented here, based on previous work, focuses on the early stages of cement hydration by modelling the physical-chemical processes at the particle level. The model treats the cement hydration process as a system evolving in time and 3D space, involving multiple diffusing and reacting species of spherical particles. Chemical reactions are simulated by adaptively selecting discrete stochastic simulation for the appropriate reactions whenever necessary. Interactions between particles are also considered. The model has been inspired by the reported cellular automata approach, which provides detailed predictions of cement microstructure at the expense of significant computational difficulty. The ABM approach herein seeks to strike an optimal balance between accuracy and computational efficiency.
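The "discrete stochastic simulation" mentioned here usually means a Gillespie-style algorithm. As a generic illustration only (a single first-order reaction A -> B with an assumed rate constant, not the cement chemistry of the abstract):

```python
import math
import random

def gillespie_decay(n_a, k, t_end, rng):
    # Gillespie SSA for the single reaction A -> B with rate constant k:
    # waiting times are exponential with rate k * n_a, and each event
    # converts one A molecule into one B molecule.
    t, n_b = 0.0, 0
    while n_a > 0:
        propensity = k * n_a
        t += -math.log(rng.random()) / propensity   # exponential waiting time
        if t > t_end:
            break
        n_a -= 1
        n_b += 1
    return n_a, n_b

rng = random.Random(0)
n_a, n_b = gillespie_decay(n_a=100, k=0.5, t_end=10.0, rng=rng)
```

With several competing reactions, the algorithm also samples which reaction fires in proportion to its propensity; adaptively switching between this exact scheme and cheaper approximations is the balance the abstract describes.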

Relevance:

60.00%

Publisher:

Abstract:

This paper introduces APA ("Artificial Prion Assembly"): a pattern recognition system based on artificial prion crystallization. Specifically, the system exhibits the capability to classify patterns according to the resulting prion self-assembly simulated with cellular automata. Our approach is inspired by the biological process of protein aggregation, in which prions assemble into amyloid fibers associated with neurodegenerative disorders.

Relevance:

60.00%

Publisher:

Abstract:

Landforms and earthquakes appear to be extremely complex; yet, there is order in the complexity. Both satisfy fractal statistics in a variety of ways. A basic question is whether the fractal behavior is due to scale invariance or is the signature of a broadly applicable class of physical processes. Both landscape evolution and regional seismicity appear to be examples of self-organized critical phenomena. A variety of statistical models have been proposed to model landforms, including diffusion-limited aggregation, self-avoiding percolation, and cellular automata. Many authors have studied the behavior of multiple slider-block models, both in terms of the rupture of a fault to generate an earthquake and in terms of the interactions between faults associated with regional seismicity. The slider-block models exhibit a remarkably rich spectrum of behavior; two slider blocks can exhibit low-order chaotic behavior. Large numbers of slider blocks clearly exhibit self-organized critical behavior.