928 results for Stochastic Model
Abstract:
We investigate the behavior of a single-cell protozoan in a narrow tubular ring. This environment forces the cells to swim under a one-dimensional periodic boundary condition. Above a critical density, single-cell protozoa aggregate spontaneously without external stimulation. The high-density zone of swimming cells exhibits characteristic collective dynamics, including translation and boundary fluctuation. We analyzed the velocity distribution and turn rate of swimming cells and found that regulation of the turn rate leads to stable aggregation, while acceleration of velocity triggers instability of the aggregation. These two opposing effects may help to explain the spontaneous dynamics of the collective behavior. We also propose a stochastic model for the mechanism underlying the collective behavior of swimming cells.
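A minimal sketch of the kind of stochastic model this abstract describes: run-and-tumble swimmers on a periodic ring whose reversal ("turn") rate grows with local density, so crowded regions trap cells and an aggregate can form. Every function name, parameter name, and value here is an illustrative assumption, not taken from the paper.

```python
import random

def simulate_ring(n_cells=200, ring_len=100.0, steps=2000, speed=0.5,
                  base_turn=0.1, crowd_turn=2.0, sense_radius=1.0,
                  dt=0.1, seed=1):
    """Toy run-and-tumble model on a periodic ring: each cell moves at a
    constant speed and reverses direction at a rate that increases with
    the local density within a sensing radius (all values hypothetical)."""
    rng = random.Random(seed)
    pos = [rng.uniform(0, ring_len) for _ in range(n_cells)]
    dirn = [rng.choice((-1, 1)) for _ in range(n_cells)]
    for _ in range(steps):
        for i in range(n_cells):
            # count neighbors within the sensing radius (periodic distance)
            neigh = 0
            for j in range(n_cells):
                if j == i:
                    continue
                d = abs(pos[i] - pos[j])
                if min(d, ring_len - d) < sense_radius:
                    neigh += 1
            turn_rate = base_turn + crowd_turn * neigh / n_cells
            if rng.random() < turn_rate * dt:
                dirn[i] = -dirn[i]          # tumble: reverse direction
            pos[i] = (pos[i] + dirn[i] * speed * dt) % ring_len
    return pos
```

Because a higher turn rate lowers the net drift out of crowded regions, this density-dependent reversal is one simple mechanism by which a stable high-density zone can emerge.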
Abstract:
The growing energy consumption of the residential sector represents about 30% of global demand. This calls for Demand Side Management solutions that propel changes in the behavior of end consumers, with the aim of reducing overall consumption and shifting it to periods in which demand, and the cost of generating energy, is lower. Demand Side Management solutions require detailed knowledge about the patterns of energy consumption. The profile of electricity demand in the residential sector is highly correlated with the times of active occupancy of the dwellings; therefore, in this study the occupancy patterns in Spanish dwellings were determined using the 2009–2010 Time Use Survey (TUS), conducted by the National Statistical Institute of Spain. The survey identifies three peaks in active occupancy, coinciding with morning, noon and evening. This information was used as input to a stochastic model which generates active-occupancy profiles of dwellings, with the aim of simulating domestic electricity consumption. TUS data were also used to identify which appliance-related activities could be considered for Demand Side Management solutions during the three occupancy peaks.
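Occupancy generators of this kind are often built as time-inhomogeneous Markov chains: the probability of becoming or staying actively occupied depends on the hour of day. The sketch below uses that standard construction with entirely hypothetical hourly probabilities shaped to show morning, noon and evening peaks; the real transition probabilities would be estimated from the TUS data.

```python
import random

# Illustrative hourly probabilities of BECOMING active (P01) and of
# STAYING active (P11); values are invented, shaped with three peaks.
P01 = [0.02, 0.02, 0.02, 0.02, 0.02, 0.10,   # 00-05
       0.40, 0.50, 0.30,                      # 06-08 morning peak
       0.10, 0.10, 0.10,                      # 09-11
       0.35, 0.30,                            # 12-13 noon peak
       0.10, 0.10, 0.10, 0.15,                # 14-17
       0.50, 0.60, 0.50, 0.30,                # 18-21 evening peak
       0.10, 0.05]                            # 22-23
P11 = [0.50] * 6 + [0.90] * 3 + [0.70] * 3 + [0.90] * 2 \
    + [0.70] * 4 + [0.95] * 4 + [0.60] * 2

def occupancy_profile(rng):
    """One simulated day of hourly active-occupancy states (0/1) from a
    first-order, time-inhomogeneous Markov chain."""
    state, profile = 0, []
    for h in range(24):
        p = P11[h] if state else P01[h]
        state = 1 if rng.random() < p else 0
        profile.append(state)
    return profile
```

Averaging many such profiles reproduces the three-peak daily occupancy curve, which can then drive an appliance-use model of electricity demand.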
Abstract:
Numerical experiments are described that pertain to the climate of a coupled atmosphere–ocean–ice system in the absence of land, driven by modern-day orbital and CO2 forcing. Millennial time-scale simulations yield a mean state in which ice caps reach down to 55° of latitude and both the atmosphere and ocean comprise eastward- and westward-flowing zonal jets, whose structure is set by their respective baroclinic instabilities. Despite the zonality of the ocean, it is remarkably efficient at transporting heat meridionally through the agency of Ekman transport and eddy-driven subduction. Indeed the partition of heat transport between the atmosphere and ocean is much the same as the present climate, with the ocean dominating in the Tropics and the atmosphere in the mid–high latitudes. Variability of the system is dominated by the coupling of annular modes in the atmosphere and ocean. Stochastic variability inherent to the atmospheric jets drives variability in the ocean. Zonal flows in the ocean exhibit decadal variability, which, remarkably, feeds back to the atmosphere, coloring the spectrum of annular variability. A simple stochastic model can capture the essence of the process. Finally, it is briefly reviewed how the aquaplanet can provide information about the processes that set the partition of heat transport and the climate of Earth.
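The mechanism invoked above, fast stochastic atmospheric forcing integrated by a slow ocean to give a reddened spectrum, is the classic Hasselmann picture, and a one-line discretized Ornstein–Uhlenbeck process captures its essence. The sketch below is that generic null model with illustrative parameters, not the paper's coupled simulation.

```python
import math
import random

def red_noise(n=5000, lam=0.05, sigma=1.0, dt=1.0, seed=0):
    """Discretized Ornstein-Uhlenbeck process: a slow 'ocean' variable x
    damped at rate lam while integrating white 'atmospheric' forcing.
    The output variance (~sigma^2 / (2*lam)) greatly exceeds that of the
    forcing, i.e. the spectrum is reddened at low frequencies."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x += (-lam * x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        out.append(x)
    return out
```

In the coupled aquaplanet the extra ingredient is the feedback of the slow oceanic jets onto the atmosphere, which this one-way sketch deliberately omits.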
Abstract:
We introduce jump processes in R^k, called density-profile processes, to model biological signaling networks. Our modeling setup describes the macroscopic evolution of a finite-size spin-flip model with k types of spins with an arbitrary number of internal states interacting through a non-reversible stochastic dynamics. We are mostly interested in the multi-dimensional empirical-magnetization vector in the thermodynamic limit, and prove that, within arbitrary finite time intervals, its path converges almost surely to a deterministic trajectory determined by a first-order (non-linear) differential equation, with explicit bounds on the distance between the stochastic and deterministic trajectories. As parameters of the spin-flip dynamics change, the associated dynamical system may go through bifurcations, associated with phase transitions in the statistical-mechanical setting. We present a simple example of a spin-flip stochastic model, associated with the synthetic biology model known as the repressilator, which leads to a dynamical system with Hopf and pitchfork bifurcations. Depending on the parameter values, the magnetization random path can either converge to a unique stable fixed point, converge to one of a pair of stable fixed points, or asymptotically evolve close to a deterministic orbit in R^k. We also discuss a simple signaling pathway related to cancer research, called the p53 module.
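The repressilator mentioned above has a well-known deterministic description: three genes in a ring, each repressing the next, whose ODEs undergo a Hopf bifurcation into oscillations for suitable parameters. A minimal Euler integration of that standard system is sketched below; the parameter values are illustrative and are not the spin-flip model of the paper, only its deterministic caricature.

```python
def repressilator(alpha=50.0, n=2, beta=0.2, dt=0.01, steps=20000):
    """Euler integration of the standard repressilator ODEs:
    dm_i/dt = -m_i + alpha / (1 + p_{i-1}^n)   (mRNA, repressed cyclically)
    dp_i/dt = beta * (m_i - p_i)               (protein tracks its mRNA)
    Returns the protein trajectory; parameters are illustrative."""
    m = [1.0, 0.5, 0.25]
    p = [1.0, 0.5, 0.25]
    traj = []
    for _ in range(steps):
        dm = [-m[i] + alpha / (1.0 + p[(i - 1) % 3] ** n) for i in range(3)]
        dp = [beta * (m[i] - p[i]) for i in range(3)]
        m = [m[i] + dt * dm[i] for i in range(3)]
        p = [p[i] + dt * dp[i] for i in range(3)]
        traj.append(tuple(p))
    return traj
```

In the paper's setting, the empirical-magnetization path of the finite spin system stays close to a trajectory of an ODE of this type, so the stochastic paths inherit the fixed points and orbits of the deterministic system.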
Abstract:
This paper studies the hourly electricity load demand in the area covered by a utility situated in the southeast of Brazil. We propose a stochastic model which employs generalized long memory (by means of Gegenbauer processes) to model the seasonal behavior of the load. The model is proposed for sectional data: each hour's load is studied separately as a single series. This approach avoids modeling the intricate intra-day pattern (load profile) displayed by the load, which varies throughout the days of the week and the seasons. The forecasting performance of the model is compared with a SARIMA benchmark using the years 1999 and 2000 as the out-of-sample period. The model clearly outperforms the benchmark. We conclude that the series exhibit generalized long memory.
Abstract:
This study evaluates the performance of the Brazilian Federal Revenue Offices (Delegacias da Receita Federal) by estimating a parametric, cost-based efficiency frontier, using a stochastic model that splits the error term into two components: one random and one arising from the inefficiency of each unit. The analysis is based on cross-sectional data for the years 2006 and 2008 and assesses the public policy of unifying the federal tax-collection bodies, the Secretaria da Receita Federal (SRF) and the Secretaria da Receita Previdenciária (SRP), enacted through Law 11.457 of 16 March 2007. The main objective of the research is to determine whether the decentralized units of the Receita Federal, notably the Delegacias da Receita Federal, operate efficiently in the task of collecting taxes, given the resources made available for their activities. Among the many activities carried out by the agency, the output evaluated here is tax revenue, which provides the State with resources for implementing public policies. The results indicate that regions with a large number of firms opting for the SIMPLES tax regime, as well as those whose jurisdictions include firms classified as DIFERENCIADAS because of their size, raise the costs of the Delegacias. Units located in state capitals improved their performance after the unification. In addition, a higher proportion of tax auditors (Auditores Fiscais) relative to the total staff of a Delegacia reduces inefficiency. This work aims to contribute to the evaluation of this new management model implemented in the country's federal tax administration.
Abstract:
In 1991 Gary S. Becker presented "A Note on Restaurant Pricing and Other Examples of Social Influences on Price", explaining why many successful restaurants, plays, sporting events, and other activities do not raise their prices even with persistent excess demand. In Becker's (1991) analysis, the main reason is the discontinuity of stable demands. In the present paper we construct a discrete-time stochastic model of socially interacting consumers deciding between two establishments. With this model we show that the discontinuity of stable demands proposed by Gary S. Becker depends crucially on an additional factor: the dispersion of the consumers' intrinsic preferences for the establishments.
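A toy version of such a model can make the role of preference dispersion concrete. In the sketch below, consumer i chooses establishment A iff an intrinsic preference plus a social term proportional to A's market share is positive, and the best-response map is iterated to a fixed point. All names, the functional form, and the parameter values are illustrative assumptions, not the paper's specification.

```python
import random

def equilibrium_share(social=2.0, dispersion=1.0, n=2000, iters=200, seed=0):
    """Toy Becker-style market with social interactions: consumer i picks
    establishment A iff pref_i + social * (share_A - 0.5) > 0, with pref_i
    drawn from a normal with the given dispersion. Iterates the
    best-response map from a slightly A-leaning start."""
    rng = random.Random(seed)
    prefs = [rng.gauss(0.0, dispersion) for _ in range(n)]
    share = 0.55
    for _ in range(iters):
        share = sum(1 for p in prefs if p + social * (share - 0.5) > 0) / n
    return share
```

With small dispersion, the social term dominates and the map tips to a corner (everyone in one establishment, the Becker discontinuity); with large dispersion, intrinsic tastes dominate and demand settles near an even split, illustrating the paper's point that dispersion governs whether the discontinuity survives.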
Abstract:
This work addresses the importance of modeling the dynamics of the order book in order to understand this aspect of market microstructure. To that end, it applies order-book modeling techniques, using the stochastic model proposed by Cont, Stoikov and Talreja (2010), to the Brazilian stock market. Once the model was applied, we analyzed the results in light of other empirical studies, such as Bouchaud et al. (2002). After estimation and analysis of the results, simulations were run to verify whether the estimated parameters reflect the dynamics of the local market under different scenarios for normalizing order sizes. Finally, the analysis of the results allowed us to conclude, with some caveats, that the proposed model is valid for the Brazilian stock market; we also present the impact of asset liquidity when comparing the parameters with those found in other international markets.
Abstract:
Modeling transport of particulate suspensions in porous media is essential for understanding various processes of industrial and scientific interest. During these processes, particles are retained by mechanisms such as size exclusion (straining), adsorption, sedimentation and diffusion. In this thesis, a mathematical model that takes pore and particle size distributions into account is proposed, and analytical solutions are obtained. These solutions were applied to predict particle retention, pore blocking and permeability reduction during dead-end microfiltration in membranes. Various scenarios, considering different particle and pore size distributions, were studied. The results showed that pore blocking and permeability reduction are highly influenced by the initial pore and particle size distributions. This feature was observed even when different initial pore and particle size distributions with the same average pore size and injected particle size were considered. Finally, a mathematical model for predicting equivalent permeability in porous media during particle retention (and pore blocking) is proposed, and the obtained solutions were applied to study permeability decline in different scenarios.
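The size-exclusion mechanism discussed above can be illustrated with a crude Monte Carlo filter: each arriving particle hits a random pore, blocks it permanently if the particle is larger than the pore, and the membrane permeability is taken as the sum of r^4 over open pores (Hagen-Poiseuille scaling). The distributions and values below are hypothetical and this is not the thesis's analytical model, only a sketch of why the initial size distributions matter.

```python
import random

def permeability_decline(n_pores=1000, n_particles=5000, seed=3):
    """Toy dead-end microfiltration: pore and particle radii drawn from
    lognormal distributions; a pore is blocked when it receives a particle
    larger than itself; permeability ~ sum of r^4 over open pores.
    Returns the permeability history normalized by its initial value."""
    rng = random.Random(seed)
    pores = [rng.lognormvariate(0.0, 0.3) for _ in range(n_pores)]
    open_pore = [True] * n_pores
    k0 = sum(r ** 4 for r in pores)
    k = k0
    history = []
    for _ in range(n_particles):
        i = rng.randrange(n_pores)                      # particle hits a random pore
        if open_pore[i] and rng.lognormvariate(0.0, 0.3) > pores[i]:
            open_pore[i] = False                        # size exclusion: pore blocked
            k -= pores[i] ** 4
        history.append(k / k0)
    return history
```

Because small pores are blocked first while the widest pores carry most of the flow (the r^4 weighting), two distributions with the same mean pore size can still yield very different permeability decline curves.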
Abstract:
The pair contact process (PCP) is a nonequilibrium stochastic model which, like the basic contact process (CP), exhibits a phase transition to an absorbing state. While the absorbing state of the CP corresponds to a unique configuration (the empty lattice), the PCP has infinitely many. Numerical and theoretical studies nevertheless indicate that the PCP belongs to the same universality class as the CP (directed percolation), but with anomalies in the critical spreading dynamics. The infinite number of absorbing configurations arises in the PCP because all processes (creation and annihilation) require a nearest-neighbor pair of particles. The pair contact process with diffusion (PCPD) was proposed by Grassberger in 1982, but interest in the problem grew after its rediscovery via the Langevin description. On the basis of numerical results and renormalization group arguments, Carlon, Henkel and Schollwöck (2001) suggested that certain critical exponents of the PCPD had values similar to those of the parity-conserving (PC) class. On the other hand, Hinrichsen (2001) reported simulation results inconsistent with the PC class and proposed that the PCPD belongs to a new universality class. The controversy regarding the universality of the PCPD remains unresolved. In the PCPD, a nearest-neighbor pair of particles is necessary for creation and annihilation, but particles diffuse individually. In this work we study the PCPD with pair diffusion, in which isolated particles cannot move; a nearest-neighbor pair diffuses as a unit. Using quasistationary simulation, we determined with good precision the critical point and critical exponents for two values of the diffusion probability, D=0.5 and D=0.1. For D=0.5: pc=0.89007(3), β/ν=0.252(9), z=1.573(1), 1.10(2), m=1.1758(24). For D=0.1: pc=0.9172(1), β/ν=0.252(9), z=1.579(11), 1.11(4), m=1.173(4).
Abstract:
In Brazil, there have been many applications of GPS, especially since the introduction of Law 10.267/2001, which, among other provisions, deals with the georeferencing of rural parcels. However, most commercial software for processing and adjusting GPS data does not let users evaluate their results reliably. For example, constraints are normally treated as absolute, which yields results with overly optimistic precision. The adoption of additional analyses and the implementation of software can reduce these problems. Thus, a software package for the adjustment of GPS networks was developed, aiming to meet the requirements of Law 10.267/2001 in a reliable way. In this context, this work analyzes the adjustment of GPS networks using absolute and relative constraints. In the latter case, the adjustments were carried out both considering and not considering the correlations among the coordinates.
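The absolute-versus-relative distinction can be sketched with a generic weighted least-squares adjustment: an absolute constraint treats the datum as error-free, while a relative (weighted) constraint enters it as a pseudo-observation with a finite weight, so its uncertainty propagates into the results. The tiny leveling-style network below is hypothetical and is not the software or data of the thesis.

```python
def wls(A, y, w):
    """Weighted least squares: solve (A^T W A) x = A^T W y by
    Gauss-Jordan elimination (W diagonal with weights w)."""
    m, n = len(A), len(A[0])
    N = [[sum(w[k] * A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
         for i in range(n)]
    b = [sum(w[k] * A[k][i] * y[k] for k in range(m)) for i in range(n)]
    for i in range(n):
        piv = N[i][i]
        N[i] = [v / piv for v in N[i]]
        b[i] /= piv
        for r in range(n):
            if r != i:
                f = N[r][i]
                N[r] = [vr - f * vi for vr, vi in zip(N[r], N[i])]
                b[r] -= f * b[i]
    return b

# Hypothetical network: unknown heights h0, h1, h2, three
# height-difference observations, and one datum pseudo-observation.
A = [[-1, 1, 0],    # observed h1 - h0
     [0, -1, 1],    # observed h2 - h1
     [-1, 0, 1],    # observed h2 - h0
     [1, 0, 0]]     # datum: h0 itself observed
y = [10.02, 5.01, 15.05, 100.00]
# "Relative" constraint: the datum gets a large but finite weight instead
# of being fixed, so it participates in the adjustment like any observation.
h = wls(A, y, [1.0, 1.0, 1.0, 1.0e4])
```

Treating the datum as absolute (fixing h0 and eliminating it from the unknowns) gives nearly the same coordinates here, but its claimed zero variance makes the formal precision of the adjusted points overly optimistic, which is precisely the problem the abstract raises.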
Abstract:
The diffusive epidemic process (PED) is a nonequilibrium stochastic model which exhibits a phase transition to an absorbing state. In the model, healthy (A) and sick (B) individuals diffuse on a lattice with diffusion constants DA and DB, respectively. According to a Wilson renormalization calculation, the system presents a first-order phase transition in the case DA > DB. Several researchers performed simulations to test this conjecture, but the first-order phase transition could not be observed; the explanation offered was that simulations in higher dimensions would be needed. This work was motivated by the investigation of the critical behavior of diffusive epidemic propagation with Lévy interaction (PEDL) in one dimension. The Lévy distribution allows diffusion jumps of all sizes, effectively taking the one-dimensional system toward higher-dimensional behavior. We try to resolve this controversy, which remains open for the case DA > DB. For this work, we use a Monte Carlo method with resuscitation: a sick individual is added to the system whenever the order parameter (the density of sick individuals) goes to zero. We apply finite-size scaling to estimate the critical point and the critical exponents, including z, for the case DA > DB.
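The "resuscitation" trick mentioned above is general to absorbing-state models: whenever the system falls into the absorbing (all-healthy) state, one sick site is reinserted, so the quasistationary regime can be sampled indefinitely. The sketch below applies it to a minimal 1D contact process as an illustrative stand-in for the PEDL dynamics; the lattice size, rate, and parametrization are all assumptions for the example.

```python
import random

def contact_process_resuscitation(L=100, lam=3.5, steps=20000, seed=2):
    """Minimal 1D contact process on a ring with resuscitation: a random
    sick site either infects a random neighbor (prob lam/(1+lam)) or
    recovers; if the sick population dies out, one sick site is reinserted.
    Returns the time-averaged sick density (the order parameter)."""
    rng = random.Random(seed)
    sick = {rng.randrange(L)}
    dens = 0.0
    for _ in range(steps):
        i = rng.choice(tuple(sick))
        if rng.random() < lam / (1.0 + lam):
            sick.add((i + rng.choice((-1, 1))) % L)   # infection of a neighbor
        else:
            sick.discard(i)                            # recovery
        if not sick:
            sick.add(rng.randrange(L))                 # resuscitation
        dens += len(sick) / L
    return dens / steps
```

Running this for several lattice sizes L and rates lam near the critical point, and collapsing the quasistationary densities, is the finite-size-scaling procedure the abstract describes.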
Abstract:
The effect of the ionosphere on the signals of Global Navigation Satellite Systems (GNSS), such as the Global Positioning System (GPS) and the proposed European Galileo, depends on the ionospheric electron density, given by its Total Electron Content (TEC). Ionospheric time-varying density irregularities may cause scintillations, which are fluctuations in the phase and amplitude of the signals. Scintillations occur more often at equatorial and high latitudes. They can degrade navigation and positioning accuracy and may cause loss of signal tracking, disrupting safety-critical applications such as marine navigation and civil aviation. This paper addresses the results of initial research carried out on two fronts that are relevant to GNSS users if they are to counter ionospheric scintillations: forecasting and mitigating their effects. On the forecasting front, the dynamics of scintillation occurrence were analysed during the severe ionospheric storm that took place on the evening of 30 October 2003, using data from a network of GPS Ionospheric Scintillation and TEC Monitor (GISTM) receivers set up in Northern Europe. Previous results [1] indicated that GPS scintillations in that region can originate from ionospheric plasma structures from the American sector. In this paper we describe experiments that enabled confirmation of those findings. On the mitigation front, we used the variance of the output error of the GPS receiver DLL (Delay Locked Loop) to modify the least squares stochastic model applied by an ordinary receiver to compute position. This error was modelled according to [2], as a function of the S4 amplitude scintillation index measured by the GISTM receivers. An improvement of up to 21% in relative positioning accuracy was achieved with this technique.
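The mitigation idea is inverse-variance weighting: observations from satellites with strong scintillation get a larger assumed variance and hence a smaller weight in the least-squares solution. The paper uses the specific DLL tracking-error model of its Ref. [2]; the monotone surrogate below only illustrates the shape of the idea, and both the function and its parameters are hypothetical.

```python
def s4_weights(s4_list, base_var=1.0, k=4.0):
    """Illustrative scintillation-driven stochastic model: observation
    variance grows with the S4 index, so the least-squares weight
    (inverse variance) shrinks. The quadratic form is a placeholder for
    the DLL error model of Ref. [2]."""
    return [1.0 / (base_var * (1.0 + k * s4 ** 2)) for s4 in s4_list]

def weighted_mean(values, weights):
    """Inverse-variance (weighted least squares) estimate of a single
    parameter observed several times."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)
```

In a full receiver the same weights would populate the diagonal of the observation weight matrix of the position solution; here they simply pull a weighted mean toward the least-disturbed measurements.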