928 results for Stochastic Model
Abstract:
Based on literature data from HT-29 cell monolayers, we develop a model for their growth, analogous to an epidemic model, that mixes local and global interactions. First, we propose and solve a deterministic equation for the progress of these colonies. Then, we add a stochastic (local) interaction and simulate the evolution of an Eden-like aggregate using dynamical Monte Carlo methods. The growth curves of both the deterministic and stochastic models are in excellent agreement with the experimental observations. The waiting-time distributions generated by our stochastic model allow us to analyze the role of mesoscopic events. We obtain log-normal distributions in the initial stages of growth and Gaussians at long times. We interpret these outcomes in the light of cellular division events: in the early stages the phenomena depend on each other in a multiplicative, geometric-based process, whereas at long times they are independent. We conclude that the main ingredients of a good minimalist model of tumor growth at the mesoscopic level are intrinsic cooperative mechanisms and a competitive search for space. © 2013 Elsevier Ltd.
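A minimal sketch of the stochastic (local) ingredient described above: an Eden-like aggregate grown on a square lattice with a dynamical (kinetic) Monte Carlo clock, so that waiting times between division events can be collected. The lattice size, per-site rate and stopping size are illustrative assumptions, not the authors' parameters, and the global interaction of the full model is omitted.

```python
import random

# Eden-like colony growth on a square lattice via dynamical (kinetic) Monte Carlo.
L = 201                       # lattice side (odd, so the seed sits at the centre)
RATE_PER_SITE = 1.0           # division attempts per perimeter site per unit time
TARGET_SIZE = 2000            # stop once the colony reaches this many cells

random.seed(0)
occupied = {(L // 2, L // 2)}                     # single seed cell

def empty_neighbours(site):
    x, y = site
    nbrs = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(i, j) for i, j in nbrs
            if 0 <= i < L and 0 <= j < L and (i, j) not in occupied]

perimeter = set(empty_neighbours((L // 2, L // 2)))   # current growth sites
t = 0.0
growth_curve = [(t, len(occupied))]
waiting_times = []

while len(occupied) < TARGET_SIZE and perimeter:
    # Exponential waiting time of the next division event
    # (Poisson clock running over all growth sites).
    dt = random.expovariate(RATE_PER_SITE * len(perimeter))
    t += dt
    waiting_times.append(dt)
    # Eden rule: occupy a randomly chosen perimeter site.
    new_site = random.choice(tuple(perimeter))
    occupied.add(new_site)
    perimeter.discard(new_site)
    perimeter.update(empty_neighbours(new_site))
    growth_curve.append((t, len(occupied)))

print(f"final colony size {len(occupied)} at t = {t:.2f}")
print(f"first / last waiting times: {waiting_times[0]:.4f}, {waiting_times[-1]:.6f}")
```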
Abstract:
The municipality of Marabá, Pará, located in the Amazon region in the southeast of the State of Pará, suffers every year from flood events caused by the periodic rise of the Tocantins River and by the vulnerability of the population living in risk areas. The state and municipal civil defence agencies plan and prepare teams for response actions in the municipality every year, and at this stage the monitoring and forecasting of flood events are important. Therefore, with the aim of reducing errors in hydrological forecasts for the municipality of Marabá, a stochastic model for forecasting the level of the Tocantins River was developed, based on the Box and Jenkins methodology. Daily level data observed at the Marabá, Carolina and Conceição do Araguaia hydrological stations of the Agência Nacional de Águas (ANA) were used, covering the period from 01/12/2008 to 31/03/2011. Three models (Mt, Nt and Yt) were fitted using different statistical packages, SAS and Gretl, with different interpretations of the behaviour of the series used to generate the model equations; the main difference between the packages is that SAS uses a transfer-function model in the fitting. The variability of the river level was classified using the quantile technique for the period from 1972 to 2011, examining only the level categories ABOVE and WELL ABOVE normal. For the analysis of socioeconomic impacts, data from the actions of the Civil Defence of the State of Pará during the 2009 and 2011 floods were used. The results showed that flood events with levels WELL ABOVE normal can generally be associated with La Niña events. Another important result is that the models simulated the river level very well over a seven-day period (01/04/2011 to 07/04/2011). The multivariate model Nt (with small errors) reproduced the behaviour of the original series, underestimating the observed values on 3, 4 and 5 April 2011, with a maximum error of 0.28 m on 4 April. The univariate model (Yt) performed well in the simulations, with absolute errors of around 0.12 m. The model with the smallest absolute error (0.08 m) for the same period was Mt, developed in SAS, which treats the original series as nonlinear and non-stationary. The quantitative analysis of the impacts of the 2009 and 2011 floods in the city of Marabá revealed that, on average, more than four thousand families are affected by these events, leading to high financial costs. It is therefore concluded that level-forecasting models are important tools for the Civil Defence in planning and preparing preventive actions for the municipality of Marabá.
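A minimal univariate Box and Jenkins sketch in the spirit of the Yt model, using the statsmodels ARIMA implementation on a synthetic stand-in for the daily level series; the (p, d, q) order, the synthetic data and the 7-day horizon are assumptions for illustration (the thesis built its Mt, Nt and Yt models in SAS and Gretl, including a transfer-function formulation).

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stand-in for the daily water-level series (metres); the real study
# used the Marabá, Carolina and Conceição do Araguaia gauges of ANA.
rng = np.random.default_rng(0)
days = np.arange(850)
level = (6.0 + 3.0 * np.sin(2 * np.pi * days / 365.0)
         + np.cumsum(rng.normal(0.0, 0.02, days.size)))

# Univariate, non-stationary series: difference once and fit an ARMA(2, 1)
# to the differences; the order is chosen here for illustration only.
fit = ARIMA(level, order=(2, 1, 1)).fit()

forecast = fit.get_forecast(steps=7)       # 7-day-ahead forecast, as in the study
print(np.round(forecast.predicted_mean, 2))
print(np.round(forecast.conf_int(), 2))
```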
Abstract:
In this paper, the optimal reactive power planning problem under risk is presented. The classical mixed-integer nonlinear model for reactive power planning is expanded into a two-stage stochastic model that considers risk. This new model accounts for uncertainty in the load demand. The risk is quantified by a factor introduced into the objective function and is identified with the variance of the random variables. Finally, numerical results illustrate the performance of the proposed model, which is applied to the IEEE 30-bus test system to determine the optimal amount and location of reactive power expansion.
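A toy sketch of the two-stage structure with a variance-based risk term in the objective: the first stage chooses which candidate reactive sources to install, and the second stage evaluates an operating cost per demand scenario. The shortfall-penalty cost model, candidate data and scenario probabilities are invented for illustration; this is not an optimal power flow and not the paper's mixed-integer nonlinear formulation.

```python
import itertools
import numpy as np

# Toy two-stage structure: the first stage picks which candidate VAr sources to
# install; the second stage evaluates an operating cost per load scenario.
CANDIDATE_COST = np.array([100.0, 80.0, 120.0])    # investment cost per candidate
CANDIDATE_MVAR = np.array([10.0, 6.0, 12.0])       # size of each candidate source
SCENARIO_DEMAND = np.array([20.0, 26.0, 33.0])     # uncertain reactive demand (MVAr)
SCENARIO_PROB = np.array([0.3, 0.5, 0.2])
PENALTY = 15.0                                     # cost per MVAr of unmet demand
BETA = 0.05                                        # weight of the variance (risk) term

best = None
for plan in itertools.product([0, 1], repeat=CANDIDATE_COST.size):
    x = np.array(plan)                             # first-stage binary decisions
    supplied = CANDIDATE_MVAR @ x
    stage2 = PENALTY * np.maximum(SCENARIO_DEMAND - supplied, 0.0)   # per scenario
    expected = SCENARIO_PROB @ stage2
    variance = SCENARIO_PROB @ (stage2 - expected) ** 2
    objective = CANDIDATE_COST @ x + expected + BETA * variance
    if best is None or objective < best[0]:
        best = (objective, x)

print("best plan:", best[1], "objective:", round(best[0], 2))
```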
Abstract:
This work aimed to apply genetic algorithms (GA) and particle swarm optimization (PSO) in cash balance management using Miller-Orr model, which consists in a stochastic model that does not define a single ideal point for cash balance, but an oscillation range between a lower bound, an ideal balance and an upper bound. Thus, this paper proposes the application of GA and PSO to minimize the Total Cost of cash maintenance, obtaining the parameter of the lower bound of the Miller-Orr model, using for this the assumptions presented in literature. Computational experiments were applied in the development and validation of the models. The results indicated that both the GA and PSO are applicable in determining the cash level from the lower limit, with best results of PSO model, which had not yet been applied in this type of problem.
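A sketch of the idea under stated assumptions: a stylised Miller-Orr daily cost as a function of the lower bound (using the classical spread and return-point formulas, an expected-transfer term, an opportunity cost on the average balance, and an invented shortfall-risk penalty), minimized with a small hand-rolled PSO. The cost constants and the risk term are illustrative, not the assumptions used in the paper.

```python
import numpy as np

# Stylised Miller-Orr daily cost as a function of the lower bound.
# F: fixed cost per transfer, K: daily opportunity cost rate, SIGMA2: daily
# variance of net cash flows. The shortfall-risk term is purely illustrative:
# without it, the classical cost is minimised trivially at a lower bound of 0.
F, K, SIGMA2 = 50.0, 0.0003, 40000.0
RISK_PENALTY = 2000.0

def total_cost(lower):
    spread = 3.0 * (3.0 * F * SIGMA2 / (4.0 * K)) ** (1.0 / 3.0)
    z = lower + spread / 3.0                           # return point
    h = lower + spread                                 # upper bound
    transfer = F * SIGMA2 / ((z - lower) * (h - z))    # expected transfer cost / day
    holding = K * (lower + z + h) / 3.0                # cost on the average balance
    risk = RISK_PENALTY / (1.0 + lower)                # assumed shortfall-risk term
    return transfer + holding + risk

def pso(cost, n_particles=30, n_iter=200, bounds=(0.0, 20000.0), seed=1):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, n_particles)               # candidate lower bounds
    v = np.zeros(n_particles)
    pbest = x.copy()
    pbest_cost = np.array([cost(xi) for xi in x])
    gbest = pbest[pbest_cost.argmin()]
    w, c1, c2 = 0.7, 1.5, 1.5                          # inertia / acceleration weights
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(xi) for xi in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()]
    return gbest, cost(gbest)

best_lower, best_cost = pso(total_cost)
print(f"lower bound ~ {best_lower:.2f}, daily cost ~ {best_cost:.4f}")
```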
Abstract:
Reproducing Fourier's law of heat conduction from a microscopic stochastic model is a long-standing challenge in statistical physics. As was shown by Rieder, Lebowitz and Lieb many years ago, a chain of harmonically coupled oscillators connected to two heat baths at different temperatures does not reproduce the diffusive behaviour of Fourier's law, but instead a ballistic one with an infinite thermal conductivity. Since then, there has been a substantial effort from the scientific community to identify the key mechanism necessary to reproduce such diffusivity, usually revolving around anharmonicity and the effect of impurities. Recently, it was shown by Dhar, Venkateshan and Lebowitz that Fourier's law can be recovered by introducing an energy-conserving noise, whose role is to simulate the elastic collisions between the atoms and other microscopic degrees of freedom that one would expect to be present in a real solid. For a one-dimensional chain this is accomplished numerically by randomly flipping the sign of the velocity of an oscillator, under the framework of a Poisson process with a variable "rate of collisions". In this poster we present Langevin simulations of a one-dimensional chain of oscillators coupled to two heat baths at different temperatures. We consider both harmonic and anharmonic (quartic) interactions, which are studied with and without the energy-conserving noise. With these results we are able to map in detail how the heat conductivity k is influenced by both anharmonicity and the energy-conserving noise. We also present a detailed analysis of the behaviour of k as a function of the system size and the rate of collisions, including a finite-size scaling method that enables us to extract the relevant critical exponents. Finally, we show that for harmonic chains k is independent of temperature, both with and without the noise. Conversely, for anharmonic chains we find that k increases roughly linearly with the temperature of a given reservoir, while the temperature difference is kept fixed.
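A minimal sketch of such a simulation: a chain with harmonic plus quartic couplings, Langevin baths on the end particles, Poissonian velocity flips as the energy-conserving noise, and a crude estimate of the conductivity from the heat current through the middle bond. All parameter values and the simple Euler-Maruyama integrator are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Chain and bath parameters (illustrative values, not those of the poster).
N, K_HARM, K_QUART = 32, 1.0, 1.0       # sites, harmonic and quartic couplings
T_LEFT, T_RIGHT, GAMMA = 1.5, 0.5, 1.0  # bath temperatures and friction
FLIP_RATE = 1.0                         # Poisson rate of velocity flips per site
DT, STEPS = 1e-3, 200_000

u = np.zeros(N)                         # displacements (fixed walls at both ends)
v = np.zeros(N)

def bond_force(stretch):
    # Derivative of the harmonic + quartic interparticle potential.
    return K_HARM * stretch + K_QUART * stretch ** 3

flux, samples = 0.0, 0
for step in range(STEPS):
    ext = np.concatenate(([0.0], u, [0.0]))          # wall - chain - wall
    f_bond = bond_force(np.diff(ext))                # N + 1 bond forces
    force = np.diff(f_bond)                          # net force on each particle

    # Langevin thermostats on the two end particles (Euler-Maruyama step).
    force[0] += -GAMMA * v[0] + np.sqrt(2.0 * GAMMA * T_LEFT / DT) * rng.normal()
    force[-1] += -GAMMA * v[-1] + np.sqrt(2.0 * GAMMA * T_RIGHT / DT) * rng.normal()
    v += force * DT
    u += v * DT

    # Microscopic heat current through the middle bond (particles b-1 and b).
    if step > STEPS // 2:
        b = N // 2
        flux += -0.5 * (v[b - 1] + v[b]) * f_bond[b]
        samples += 1

    # Energy-conserving noise: flip each velocity with probability FLIP_RATE * DT.
    flips = rng.random(N) < FLIP_RATE * DT
    v[flips] *= -1.0

current = flux / samples
kappa = current * N / (T_LEFT - T_RIGHT)             # crude conductivity estimate
print(f"mean heat current ~ {current:.4f}, k ~ {kappa:.3f}")
```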
Abstract:
Statistical modelling and statistical learning theory are two powerful analytical frameworks for analyzing signals and developing efficient processing and classification algorithms. In this thesis, these frameworks are applied to modelling and processing biomedical signals in two different contexts: ultrasound medical imaging systems and primate neural activity analysis and modelling. In the context of ultrasound medical imaging, two main applications are explored: deconvolution of signals measured from an ultrasonic transducer, and automatic image segmentation and classification of prostate ultrasound scans. In the former application, a stochastic model of the radio-frequency signal measured from an ultrasonic transducer is derived. This model is then employed to develop, within a statistical framework, a regularized deconvolution procedure for enhancing signal resolution. In the latter application, different statistical models are used to characterize images of prostate tissue, extracting different features. These features are then used to segment the images into regions of interest by means of an automatic procedure based on a statistical model of the extracted features. Finally, machine learning techniques are used for automatic classification of the different regions of interest. In the context of neural activity signals, a bio-inspired dynamical network was developed to support studies of motor-related processes in the brain of primate monkeys. The presented model aims to mimic the abstract functionality of a cell population in the 7a parietal region of primate monkeys during the execution of learned behavioural tasks.
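As an illustration of the first application, a minimal Wiener-type regularized deconvolution of a synthetic RF line; the pulse shape, noise level and regularization constant are assumptions, and the statistical model derived in the thesis is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic pulse-echo RF line: a sparse reflectivity sequence convolved
# (circularly, for simplicity) with a Gaussian-modulated pulse, plus noise.
n, fs = 1024, 40e6                                   # samples, sampling rate (assumed)
t = np.arange(n) / fs
pulse = (np.exp(-((t - 0.5e-6) ** 2) / (2 * (0.05e-6) ** 2))
         * np.cos(2 * np.pi * 5e6 * t))
pulse = np.roll(pulse, -int(0.5e-6 * fs))            # move the pulse peak to lag zero

reflectivity = np.zeros(n)
reflectivity[[200, 420, 430, 700]] = [1.0, 0.6, -0.8, 0.4]

H = np.fft.rfft(pulse)
rf = np.fft.irfft(np.fft.rfft(reflectivity) * H, n) + 0.02 * rng.normal(size=n)

# Wiener-type regularised deconvolution: R(f) = H*(f) Y(f) / (|H(f)|^2 + lam),
# where lam plays the role of a noise-to-signal ratio.
lam = 1e-2
estimate = np.fft.irfft(np.conj(H) * np.fft.rfft(rf) / (np.abs(H) ** 2 + lam), n)

print("strongest recovered reflectors at samples:",
      sorted(np.argsort(np.abs(estimate))[-4:].tolist()))
```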
Abstract:
Basic concepts and definitions relating to Lagrangian Particle Dispersion Models (LPDMs) for the description of turbulent dispersion are introduced. The study focuses on LPDMs that use, as input for the large-scale motion, fields produced by Eulerian models, with the small-scale motions described by Lagrangian Stochastic Models (LSMs). Data from two different dynamical models have been used: a Large Eddy Simulation (LES) and a General Circulation Model (GCM). After reviewing the small-scale closure adopted by the Eulerian model, the development and implementation of appropriate LSMs is outlined. The basic requirement of every LPDM used in this work is that it fulfil the Well-Mixed Condition (WMC). For the description of dispersion in the GCM domain, a stochastic model of Markov order 0, consistent with the eddy-viscosity closure of the dynamical model, is implemented. An LSM of Markov order 1, more suitable for shorter timescales, has been implemented for the description of the unresolved motion of the LES fields. Different assumptions on the small-scale correlation time are made. Tests of the LSM on GCM fields suggest that, if the model is to satisfy the WMC, it is mandatory to use an interpolation algorithm able to maintain analytical consistency between the diffusion coefficient and its derivative. A dynamical time-step selection scheme based on the shape of the diffusion coefficient is also introduced, and the criteria for the selection of the integration step are discussed. Absolute and relative dispersion experiments are made with various unresolved-motion settings for the LSM on LES data, and the results are compared with laboratory data. The study shows that the unresolved-turbulence parameterization has a negligible influence on absolute dispersion, while it affects the contributions of relative dispersion and meandering to absolute dispersion, as well as the Lagrangian correlation.
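A minimal sketch of a Markov order 0 (random-displacement) LSM that satisfies the WMC for a one-dimensional inhomogeneous diffusivity, dZ = K'(Z) dt + sqrt(2 K(Z)) dW, checked by verifying that an initially well-mixed particle population stays well mixed; the diffusivity profile and all parameter values are assumptions, not those of the GCM closure used in the work.

```python
import numpy as np

rng = np.random.default_rng(4)

# Markov order 0 (random-displacement) model: dZ = K'(Z) dt + sqrt(2 K(Z)) dW.
# The parabolic diffusivity profile below is an assumption for illustration.
H = 1000.0                                  # layer depth (m)
USTAR, KAPPA = 0.3, 0.4                     # friction velocity, von Karman constant

def K(z):
    return KAPPA * USTAR * z * (1.0 - z / H) + 1e-3   # small floor avoids K = 0

def dKdz(z):
    return KAPPA * USTAR * (1.0 - 2.0 * z / H)

n_particles, dt, n_steps = 20_000, 1.0, 3600
z = rng.uniform(0.0, H, n_particles)        # start from a well-mixed distribution

for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_particles)
    z = z + dKdz(z) * dt + np.sqrt(2.0 * K(z)) * dW
    z = np.where(z < 0.0, -z, z)            # reflective lower boundary
    z = np.where(z > H, 2.0 * H - z, z)     # reflective upper boundary

# WMC check: the occupancy of equal layers should remain close to uniform.
counts, _ = np.histogram(z, bins=10, range=(0.0, H))
print("relative occupancy per layer:", np.round(counts / (n_particles / 10), 3))
```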
Abstract:
The problem of estimating the number of motor units N in a muscle is embedded in a general stochastic model using the notion of thinning from point process theory. In the paper, a new moment-type estimator for the number of motor units in a muscle is defined, derived using random sums with independently thinned terms. Asymptotic normality of the estimator is shown, and its practical value is demonstrated with bootstrap and approximate confidence intervals for a data set from a 31-year-old, healthy, right-handed female volunteer. Moreover, simulation results are presented, and Monte Carlo based quantiles, means and variances are calculated for N ∈ {300, 600, 1000}.
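An illustrative simulation of the random-sum-with-thinning structure, with a naive moment estimate and a percentile bootstrap interval; the estimator below assumes known single-unit amplitude moments and a simple activation model, and is not the estimator defined in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# A muscle with N motor units; a stimulus activates each unit independently
# with probability p (independent thinning), and the compound response is the
# random sum of the activated units' amplitudes. The single-unit amplitude
# mean MU and standard deviation SIGMA are assumed known from separate data.
N_TRUE, P_TRUE = 600, 0.4
MU, SIGMA = 50.0, 10.0
N_TRIALS = 500

def compound_responses(n_trials):
    active = rng.random((n_trials, N_TRUE)) < P_TRUE          # thinning indicators
    amps = rng.normal(MU, SIGMA, (n_trials, N_TRUE))
    return (active * amps).sum(axis=1)

def moment_estimate(s):
    # Solve E[S] = N p MU and Var[S] = N p SIGMA^2 + N p (1 - p) MU^2 for N.
    m, v = s.mean(), s.var(ddof=1)
    n_times_p = m / MU
    p_hat = (n_times_p * (SIGMA ** 2 + MU ** 2) - v) / (m * MU)
    return n_times_p / p_hat

s = compound_responses(N_TRIALS)
n_hat = moment_estimate(s)

# Percentile bootstrap interval for N.
boot = np.array([moment_estimate(rng.choice(s, s.size, replace=True))
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"N_hat = {n_hat:.0f}, 95% bootstrap interval ~ ({lo:.0f}, {hi:.0f})")
```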
Abstract:
The AEGISS (Ascertainment and Enhancement of Gastrointestinal Infection Surveillance and Statistics) project aims to use spatio-temporal statistical methods to identify anomalies in the space-time distribution of non-specific gastrointestinal infections in the UK, using the Southampton area in southern England as a test case. In this paper, we use the AEGISS project to illustrate how spatio-temporal point process methodology can be used in the development of a rapid-response spatial surveillance system. Current surveillance of gastroenteric disease in the UK relies on general practitioners reporting cases of suspected food poisoning through a statutory notification scheme, voluntary laboratory reports of the isolation of gastrointestinal pathogens, and standard reports of general outbreaks of infectious intestinal disease by public health and environmental health authorities. However, most statutory notifications are made only after a laboratory reports the isolation of a gastrointestinal pathogen. As a result, detection is delayed and the ability to react to an emerging outbreak is reduced. For more detailed discussion, see Diggle et al. (2003). A new and potentially valuable source of data on the incidence of non-specific gastroenteric infections in the UK is NHS Direct, a 24-hour phone-in clinical advice service. NHS Direct data are less likely than reports by general practitioners to suffer from spatially and temporally localized inconsistencies in reporting rates. Also, reporting delays by patients are likely to be reduced, as no appointments are needed. Against this, NHS Direct data sacrifice specificity. Each call to NHS Direct is classified only according to the general pattern of reported symptoms (Cooper et al., 2003). The current paper focuses on the use of spatio-temporal statistical analysis for early detection of unexplained variation in the spatio-temporal incidence of non-specific gastroenteric symptoms, as reported to NHS Direct. Section 2 describes our statistical formulation of this problem, the nature of the available data and our approach to predictive inference. Section 3 describes the stochastic model. Section 4 gives the results of fitting the model to NHS Direct data. Section 5 shows how the model is used for spatio-temporal prediction. The paper concludes with a short discussion.
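A deliberately naive sketch of the surveillance idea (not the paper's stochastic model or its predictive inference): compare a kernel-smoothed map of today's calls against a kernel-smoothed per-day historical baseline and flag grid cells with an unusually high ratio. All locations, counts and bandwidths are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Compare today's kernel-smoothed call map against a per-day historical
# baseline and flag grid cells with an unusually high ratio. All coordinates
# (km on an arbitrary study region), counts and bandwidths are synthetic.
REGION, BANDWIDTH = 30.0, 2.0
N_BASELINE_DAYS = 20
GRID = np.linspace(0.0, REGION, 61)
GX, GY = np.meshgrid(GRID, GRID)

def kernel_map(points):
    # Gaussian kernel-smoothed intensity of case locations on the grid.
    z = np.zeros_like(GX)
    for x, y in points:
        z += np.exp(-((GX - x) ** 2 + (GY - y) ** 2) / (2 * BANDWIDTH ** 2))
    return z / (2 * np.pi * BANDWIDTH ** 2)

baseline_calls = rng.uniform(0.0, REGION, (400, 2))        # 20 days of history
today_calls = np.vstack([rng.uniform(0.0, REGION, (15, 2)),
                         rng.normal([22.0, 8.0], 1.0, (12, 2))])  # local cluster

baseline = kernel_map(baseline_calls) / N_BASELINE_DAYS + 1e-6
ratio = kernel_map(today_calls) / baseline

iy, ix = np.unravel_index(np.argmax(ratio), ratio.shape)
print(f"largest excess near ({GRID[ix]:.1f} km, {GRID[iy]:.1f} km), ratio {ratio.max():.1f}")
```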
Abstract:
Asthma is an increasing health problem worldwide, but the long-term temporal pattern of clinical symptoms is not understood and predicting asthma episodes is not generally possible. We analyse the time series of peak expiratory flows, a standard measurement of airway function that has been assessed twice daily in a large asthmatic population during a long-term crossover clinical trial. Here we introduce an approach to predict the risk of worsening airflow obstruction by calculating the conditional probability that, given the current airway condition, a severe obstruction will occur within 30 days. We find that, compared with a placebo, a regular long-acting bronchodilator (salmeterol) that is widely used to improve asthma control decreases the risk of airway obstruction. Unexpectedly, however, a regular short-acting beta2-agonist bronchodilator (albuterol) increases this risk. Furthermore, we find that the time series of peak expiratory flows show long-range correlations that change significantly with disease severity, approaching a random process with increased variability in the most severe cases. Using a nonlinear stochastic model, we show that both the increased variability and the loss of correlations augment the risk of unstable airway function. The characterization of fluctuations in airway function provides a quantitative basis for objective risk prediction of asthma episodes and for evaluating the effectiveness of therapy.
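A minimal empirical version of the risk measure described above: the conditional probability that peak expiratory flow drops below a severe-obstruction threshold within the next 30 days, given the current value, estimated from a synthetic correlated series; the AR(1) series, thresholds and bins are assumptions, not the study's data or its nonlinear stochastic model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Conditional probability of a severe obstruction (PEF below a threshold)
# within the next 30 days, given the current PEF, estimated from a synthetic
# correlated series.
N_DAYS, HORIZON = 2000, 30
SEVERE = 300.0                          # "severe obstruction" threshold (L/min)

pef = np.empty(N_DAYS)
pef[0] = 450.0
for t in range(1, N_DAYS):              # correlated fluctuations around 450 L/min
    pef[t] = 450.0 + 0.9 * (pef[t - 1] - 450.0) + rng.normal(0.0, 25.0)

severe_within = np.array([pef[t + 1:t + 1 + HORIZON].min() < SEVERE
                          for t in range(N_DAYS - HORIZON)])
current = pef[:N_DAYS - HORIZON]

bins = [0.0, 350.0, 400.0, 450.0, 500.0, np.inf]   # "current condition" classes
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (current >= lo) & (current < hi)
    if mask.any():
        print(f"PEF in [{lo:.0f}, {hi:.0f}): "
              f"P(severe within 30 days) = {severe_within[mask].mean():.2f}")
```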