49 results for Probability Distribution Function
Abstract:
A quantum random walk on the integers exhibits pseudo memory effects, in that its probability distribution after N steps is determined by reshuffling the first N distributions that arise in a classical random walk with the same initial distribution. In a classical walk, entropy increase can be regarded as a consequence of the majorization ordering of successive distributions. The Lorenz curves of successive distributions for a symmetric quantum walk reveal no majorization ordering in general. Nevertheless, entropy can increase, and computer experiments show that it does so on average. Varying the stages at which the quantum coin system is traced out leads to new quantum walks, including a symmetric walk for which majorization ordering is valid but the spreading rate exceeds that of the usual symmetric quantum walk.
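The on-average entropy increase described above is easy to probe numerically. Below is a minimal sketch of the standard symmetric Hadamard walk on the integers; the initial coin state (1, i)/√2 is a common symmetric choice, an assumption for illustration rather than a detail taken from the abstract.

```python
import math

def hadamard_walk(steps):
    """Symmetric Hadamard quantum walk on the integers.

    State: position -> (amplitude with coin 'up', amplitude with coin 'down').
    Initial coin state (1, i)/sqrt(2) gives a left-right symmetric walk.
    Returns the position probability distribution after `steps` steps.
    """
    s = 1 / math.sqrt(2)
    amp = {0: (s, 1j * s)}
    for _ in range(steps):
        new = {}
        for x, (u, d) in amp.items():
            cu, cd = s * (u + d), s * (u - d)   # Hadamard coin toss
            pu = new.get(x + 1, (0, 0))         # 'up' component moves right
            new[x + 1] = (pu[0] + cu, pu[1])
            pd = new.get(x - 1, (0, 0))         # 'down' component moves left
            new[x - 1] = (pd[0], pd[1] + cd)
        amp = new
    return {x: abs(u) ** 2 + abs(d) ** 2 for x, (u, d) in amp.items()}

def shannon_entropy(dist):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 1e-15)
```

Plotting `shannon_entropy(hadamard_walk(n))` against `n` illustrates the entropy growing on average even though, as noted above, successive distributions need not be majorization-ordered.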
Abstract:
The focus of the present work is a well-known feature of probability density function (PDF) transport equations in turbulent flows: the inverse parabolicity of the equations. While it is quite common in fluid mechanics to interpret equations with direct (forward-time) parabolicity as diffusive (or as a combination of diffusion, convection and reaction), the possibility of a similar interpretation for equations with inverse parabolicity is not clear. According to Einstein's point of view, a diffusion process is associated with the random walk of some physical or imaginary particles, which can be modelled by a Markov diffusion process. In the present paper it is shown that the Markov diffusion process directly associated with the PDF equation represents a reasonable model for dealing with the PDFs of scalars, but it significantly underestimates the diffusion rate required to simulate turbulent dispersion when the velocity components are considered.
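Einstein's association of diffusion with a particle random walk, invoked above, can be sketched in a few lines: independent particles taking Gaussian Markov steps dx = sqrt(2 D dt) ξ reproduce the variance growth 2 D t of the forward-parabolic diffusion equation. All numerical values below are illustrative assumptions.

```python
import math
import random

def simulate_dispersion(n_particles=10000, n_steps=100, dt=0.01, D=0.5, seed=1):
    """Einstein-style particle model of a forward-parabolic diffusion PDE.

    Each particle performs a Markov random walk dx = sqrt(2 D dt) * xi with
    xi ~ N(0, 1). Returns the ensemble variance of position and the value
    2 D t predicted by the diffusion equation.
    """
    rng = random.Random(seed)
    sigma = math.sqrt(2 * D * dt)
    xs = [0.0] * n_particles
    for _ in range(n_steps):
        xs = [x + sigma * rng.gauss(0, 1) for x in xs]
    var = sum(x * x for x in xs) / n_particles
    return var, 2 * D * n_steps * dt
```

The agreement between the two returned values is the forward-parabolic case; the point of the abstract is that no such particle construction is straightforward when the parabolicity is inverse.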
Abstract:
Statistical tests of Load-Unload Response Ratio (LURR) signals are carried out in order to verify the statistical robustness of previous studies using the Lattice Solid Model (MORA et al., 2002b). In each case 24 groups of samples with the same macroscopic parameters (tidal perturbation amplitude A, period T and tectonic loading rate k) but different particle arrangements are employed. Results of uni-axial compression experiments show that before the normalized time of catastrophic failure, the ensemble average LURR value rises significantly, in agreement with observations of high LURR prior to large earthquakes. In shearing tests, two parameters are found to control the correlation between earthquake occurrence and tidal stress. The first, A/(kT), controls the phase shift between the peak seismicity rate and the peak amplitude of the perturbation stress; as this parameter increases, the phase shift decreases. The second, AT/k, controls the height of the probability density function (PDF) of modeled seismicity; as this parameter increases, the PDF becomes sharper and narrower, indicating strong triggering. Statistical studies of LURR signals in shearing tests also suggest that, except in strong-triggering cases where LURR cannot be calculated owing to sparse data in unloading cycles, larger events are more likely than smaller ones to occur in high-LURR periods, supporting the LURR hypothesis.
Abstract:
Adsorption of binary mixtures onto activated carbon Norit R1 for the nitrogen-methane-carbon dioxide system was investigated over the pressure range up to 15 MPa. A new model is proposed to describe the experimental data. It is based on the assumption that an activated carbon can be characterized by a distribution function of elements of adsorption volume (EAV) over the solid-fluid potential. This function may be evaluated from pure-component isotherms using the equality of the chemical potentials in the adsorbed phase and in the bulk phase for each EAV. For mixture adsorption a simple combining rule is proposed, which allows the adsorbed-phase density and composition in each EAV to be determined at a given pressure and bulk-phase composition. The adsorbed concentration of each adsorbate is the integral of its density over the set of EAV. Comparison with experimental data on binary mixtures has shown that the approach works reasonably well. In the case of high-pressure binary mixture adsorption, when only the total amount adsorbed was measured, the proposed model allows the partial amounts of the adsorbed components to be determined reliably.
Abstract:
A dual resistance model with a distribution of either barrier or pore diffusional activation energy is proposed in this work for gas transport in carbon molecular sieve (CMS) micropores. This is a novel approach in which the equilibrium is homogeneous but the kinetics are heterogeneous. The model offers a possible explanation for the concentration dependence of the thermodynamically corrected barrier and pore diffusion coefficients observed in previous studies from this laboratory on gas diffusion in CMS [1,2]. The energy distribution is assumed to follow the gamma distribution function. It is shown that the energy distribution model can fully capture the behavior described by the empirical model established in earlier studies to account for the concentration dependence of the thermodynamically corrected barrier and pore diffusion coefficients. A methodology is proposed for extracting the energy distribution parameters, and it is further shown that the extracted parameters can effectively predict integral uptake and column breakthrough profiles over a wide range of operating pressures.
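As a hedged illustration of the gamma-energy-distribution idea (parameter values below are arbitrary, not fitted to any CMS data), one can sample activation energies from a gamma distribution and average the resulting Arrhenius diffusivities. For E ~ Gamma(shape k, scale θ), the ensemble mean of exp(-E/RT) has the closed form (1 + θ/RT)^(-k), which the Monte Carlo estimate should reproduce.

```python
import math
import random

def effective_diffusivity(shape=2.0, scale=4.0, D0=1.0, RT=2.5, n=50000, seed=7):
    """Heterogeneous-kinetics sketch with gamma-distributed activation energies.

    Each micropore contributes an Arrhenius diffusivity D = D0 * exp(-E / RT),
    with E drawn from Gamma(shape, scale). Returns the ensemble-mean D, which
    analytically equals D0 * (1 + scale/RT) ** (-shape). All parameter values
    are illustrative assumptions.
    """
    rng = random.Random(seed)
    ds = [D0 * math.exp(-rng.gammavariate(shape, scale) / RT) for _ in range(n)]
    return sum(ds) / n
```

The gap between the mean diffusivity and the diffusivity at the mean energy is one way such a distribution produces apparent concentration dependence in lumped coefficients.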
Abstract:
Nucleation is the first stage in any granulation process, where binder liquid first comes into contact with the powder. This paper investigates the nucleation process in which binder liquid is added to a fine powder with a spray nozzle. The dimensionless spray flux approach of Hapgood et al. (Powder Technol. 141 (2004) 20) is extended to account for nonuniform spray patterns and to allow for overlap of nuclei granules rather than spray drops. A dimensionless nuclei distribution function is developed which describes the effects of the design and operating parameters of the nucleation process (binder spray characteristics, the nucleation area ratio between droplets and nuclei, and the powder bed velocity) on the fractional surface area coverage of nuclei on a moving powder bed. From this starting point, a Monte Carlo nucleation model is developed that simulates full nuclei size distributions as a function of the same design and operating parameters. The nucleation model was then used to investigate the effects of these parameters on the formed nuclei size distributions and to correlate those effects with changes in the dimensionless nuclei distribution function. Model simulations also showed that it is possible to predict nuclei size distributions beyond the drop-controlled nucleation regime in Hapgood's nucleation regime map. Qualitative comparison of model simulations and experimental nucleation data showed similar shapes of the nuclei size distributions. In its current form, the nucleation model can replace the nucleation term in one-dimensional population balance models describing wet granulation processes. Implementation of more sophisticated nucleation kinetics can make the model applicable to multi-dimensional population balance models.
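The fractional surface coverage central to the spray flux approach can be sketched with a toy Monte Carlo (the geometry and all parameter values are illustrative assumptions, not the paper's model): circular nuclei landing uniformly at random on a periodic unit patch of bed cover, on average, a fraction close to 1 - exp(-n π r²), the usual random-overlap result.

```python
import math
import random

def coverage_fraction(n_nuclei=60, radius=0.08, trials=30, grid=50, seed=4):
    """Estimate the area fraction covered by randomly placed circular nuclei.

    n_nuclei discs of the given radius land uniformly at random on a unit
    square with periodic (wrap-around) boundaries; coverage is estimated on a
    grid of sample points and averaged over independent trials. For random
    placement the expected coverage is approximately 1 - exp(-n * pi * r^2).
    """
    rng = random.Random(seed)
    pts = [(i + 0.5) / grid for i in range(grid)]
    r2 = radius * radius
    covered_total = 0
    for _ in range(trials):
        centers = [(rng.random(), rng.random()) for _ in range(n_nuclei)]
        for px in pts:
            for py in pts:
                for cx, cy in centers:
                    dx = abs(px - cx); dx = min(dx, 1 - dx)  # periodic wrap
                    dy = abs(py - cy); dy = min(dy, 1 - dy)
                    if dx * dx + dy * dy < r2:
                        covered_total += 1
                        break
    return covered_total / (trials * grid * grid)
```

Deviations from this random-overlap baseline are exactly what a nonuniform spray pattern or correlated nuclei placement would introduce.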
Abstract:
The rate of generation of fluctuations with respect to the scalar values conditioned on the mixture fraction, which significantly affects turbulent nonpremixed combustion processes, is examined. Simulation of the rate in a major mixing model is investigated and the derived equations can assist in selecting the model parameters so that the level of conditional fluctuations is better reproduced by the models. A more general formulation of the multiple mapping conditioning (MMC) model that distinguishes the reference and conditioning variables is suggested. This formulation can be viewed as a methodology of enforcing certain desired conditional properties onto conventional mixing models. Examples of constructing consistent MMC models with dissipation and velocity conditioning as well as of combining MMC with large eddy simulations (LES) are also provided.
Abstract:
The performance of the maximum ratio combining method for combining antenna-diversity signals in correlated Rician-fading channels is rigorously studied. The distribution function of the normalized signal-to-noise ratio (SNR) is expanded in terms of a power series and calculated numerically. This power series can easily take into account signal correlations and antenna gains, and can be applied to any number of receiving antennas. Application of the method to dual-antenna diversity systems produces useful distribution curves for the normalized SNR, which can be used to find the diversity gain. It is revealed that signal correlation in Rician-fading channels helps to increase the diversity gain rather than decrease it, as happens in Rayleigh-fading channels. It is also shown that with a relatively strong direct signal component, the diversity gain can be much higher than without a direct signal component.
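The qualitative claims above (diversity reduces outage, and a strong direct component helps) can be checked with a small Monte Carlo sketch of maximum ratio combining over independent Rician channels; the paper's power-series method and its correlated-channel analysis are not reproduced here, and all parameter values are illustrative assumptions.

```python
import math
import random

def mrc_outage(K_factor=5.0, n_ant=2, snr_threshold=0.5, trials=20000, seed=3):
    """Outage probability of maximum ratio combining in Rician fading.

    Each antenna sees an independent channel h = sqrt(K/(K+1)) + CN(0, 1/(K+1)),
    normalized so E|h|^2 = 1; the MRC output SNR is proportional to the sum of
    |h_i|^2. Returns P(average channel gain < snr_threshold).
    """
    rng = random.Random(seed)
    los = math.sqrt(K_factor / (K_factor + 1))        # direct (LOS) component
    sig = math.sqrt(1 / (2 * (K_factor + 1)))         # scatter, per real dim
    outages = 0
    for _ in range(trials):
        gain = 0.0
        for _ in range(n_ant):
            re = los + rng.gauss(0, sig)
            im = rng.gauss(0, sig)
            gain += re * re + im * im                 # MRC adds branch gains
        if gain / n_ant < snr_threshold:
            outages += 1
    return outages / trials
```

Taking K_factor toward zero recovers Rayleigh fading, which is how the direct-component comparison in the abstract can be reproduced.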
Abstract:
Classical metapopulation theory assumes a static landscape. However, empirical evidence indicates many metapopulations are driven by habitat succession and disturbance. We develop a stochastic metapopulation model, incorporating habitat disturbance and recovery coupled with patch colonization and extinction, to investigate the effect of habitat dynamics on persistence. We discover that habitat dynamics play a fundamental role in metapopulation dynamics. The mean number of suitable habitat patches is not adequate for characterizing the dynamics of the metapopulation. For a fixed mean number of suitable patches, the details of how disturbance affects patches and how patches recover influence metapopulation dynamics in a fundamental way. Moreover, metapopulation persistence depends not only on the average lifetime of a patch, but also on the variance in patch lifetime and the synchrony in patch dynamics that results from disturbance. Finally, there is an interaction between habitat and metapopulation dynamics; for instance, declining metapopulations react differently to habitat dynamics than expanding metapopulations. We close by emphasizing the importance of using performance measures appropriate to stochastic systems when evaluating their behavior, such as the probability distribution of the state of the metapopulation conditional on it being extant (i.e., the quasi-stationary distribution).
Abstract:
We consider the problem of robust performance analysis of linear discrete-time-varying systems on a bounded time interval. The system is represented in state-space form. It is driven by a random input disturbance with imprecisely known probability distribution; this distributional uncertainty is described in terms of entropy. The worst-case performance of the system is quantified by its a-anisotropic norm. Computing the anisotropic norm reduces to solving a set of difference Riccati and Lyapunov equations together with an equation of special form.
Abstract:
The recurrence interval statistics for regional seismicity follow a universal distribution function, independent of the tectonic setting or average rate of activity (Corral, 2004). The universal function is a modified gamma distribution with power-law scaling for recurrence intervals shorter than the mean interval and exponential decay for longer intervals. We employ the method of Corral (2004) to examine the recurrence statistics of a range of cellular automaton earthquake models. The majority of models have an exponential distribution of recurrence intervals, the same as that of a Poisson process. One model, the Olami-Feder-Christensen automaton, has recurrence statistics consistent with regional seismicity for a certain range of that model's conservation parameter. For conservation parameters in this range, the event size statistics are also consistent with regional seismicity. Models whose dynamics are dominated by characteristic earthquakes do not appear to display universality of recurrence statistics.
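The rescaling step behind Corral's method is simple to sketch: recurrence intervals are multiplied by the mean event rate, making catalogs with different activity levels directly comparable. For a Poisson catalog, which is the behaviour found for the majority of the automaton models above, the rescaled intervals θ are exponential, so P(θ > 1) = e⁻¹ ≈ 0.368. The synthetic Poisson catalog below is an illustrative assumption, not one of the paper's automata.

```python
import random

def rescaled_tail(n=40000, rate=3.0, seed=5):
    """Corral-style rescaling applied to a synthetic Poisson catalog.

    Recurrence intervals tau are multiplied by the empirical mean event rate
    lambda to give dimensionless intervals theta = lambda * tau. Returns the
    fraction of rescaled intervals exceeding 1, which for a Poisson process
    should be close to exp(-1) ~= 0.368.
    """
    rng = random.Random(seed)
    taus = [rng.expovariate(rate) for _ in range(n)]
    lam = n / sum(taus)                 # empirical mean event rate
    thetas = [lam * t for t in taus]    # dimensionless recurrence intervals
    return sum(1 for th in thetas if th > 1.0) / n
```

A catalog following the modified gamma form would instead show an excess of short rescaled intervals (the power-law regime) relative to this exponential baseline.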
Abstract:
Stochastic models based on Markov birth processes are constructed to describe the process of invasion of a fly larva by entomopathogenic nematodes. Various forms for the birth (invasion) rates are proposed. These models are then fitted to data sets describing the observed numbers of nematodes that have invaded a fly larva after a fixed period of time. Non-linear birth rates are required to achieve good fits to these data, and their precise forms identify different patterns of invasion for the three nematode populations considered. One of these (Nemasys) showed the greatest propensity for invasion. This form of modelling may be useful more generally for analysing data that show variation different from that expected under a binomial distribution.
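A Markov birth-process model of invasion of the kind described can be simulated directly with a Gillespie-style loop. The linear-in-k invasion rate below is one illustrative member of the non-linear family, chosen for the sketch; it is not the paper's fitted form, and all parameter values are assumptions.

```python
import random

def invade(n_nematodes=30, t_end=1.0, seed=2, rate=lambda k: 0.8 + 0.15 * k):
    """Gillespie-style simulation of a Markov birth (invasion) process.

    With k nematodes already inside the larva and n_nematodes - k still
    outside, the next invasion occurs at total rate (n_nematodes - k) * rate(k).
    A rate(k) increasing in k models cooperative invasion. Returns the number
    of nematodes inside the larva at time t_end.
    """
    rng = random.Random(seed)
    t, k = 0.0, 0
    while k < n_nematodes:
        total_rate = (n_nematodes - k) * rate(k)
        t += rng.expovariate(total_rate)   # exponential waiting time
        if t > t_end:
            break                          # observation period ended
        k += 1
    return k
```

Running many replicates and comparing the spread of counts with a binomial of the same mean is one way to see the extra-binomial variation mentioned at the end of the abstract.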
Abstract:
A new lifetime distribution capable of modeling a bathtub-shaped hazard-rate function is proposed. The proposed model is derived as a limiting case of the Beta Integrated Model and has both the Weibull distribution and the Type I extreme value distribution as special cases. The model can be considered another useful 3-parameter generalization of the Weibull distribution. An advantage of the model is that its parameters can be estimated easily from a Weibull probability paper (WPP) plot, which serves as a tool for model identification. Model characterization based on the WPP plot is studied. A numerical example is provided, and a comparison with another Weibull extension, the exponentiated Weibull, is also discussed. The proposed model compares well with other competing models in fitting data that exhibit a bathtub-shaped hazard-rate function.
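To make the bathtub-shaped hazard concrete, here is one well-known three-parameter Weibull extension, shown only as a representative of the family discussed (it is not necessarily the paper's exact model): h(t) = λβ(t/α)^(β-1) exp((t/α)^β), which is bathtub-shaped for β < 1 with its minimum at t = α((1-β)/β)^(1/β).

```python
import math

def hazard(t, alpha=2.0, beta=0.5, lam=1.0):
    """Hazard rate of a three-parameter Weibull extension.

    h(t) = lam * beta * (t/alpha)**(beta-1) * exp((t/alpha)**beta).
    For beta < 1 this is bathtub-shaped: the (beta-1) power term dominates at
    small t (decreasing, 'infant mortality') and the exponential term dominates
    at large t (increasing, 'wear-out'), with the minimum at
    t = alpha * ((1 - beta) / beta) ** (1 / beta).
    """
    x = t / alpha
    return lam * beta * x ** (beta - 1) * math.exp(x ** beta)
```

With the default α = 2 and β = 0.5 the minimum sits at t = 2, so the hazard falls on (0, 2) and rises thereafter; setting β ≥ 1 removes the bathtub and gives a monotone hazard.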