920 results for probability distribution


Relevance: 60.00%

Publisher:

Abstract:

Stochastic processes with multiplicative white noise receive constant attention across a wide area of scientific research. The variety of possible prescriptions for defining these processes mathematically is an obstacle to developing general tools for their treatment. In this thesis, we study equilibrium properties of Markovian processes with multiplicative white noise. To do so, we define a time-reversal transformation of such processes, taking into account that the stationary probability distribution depends on the prescription. We derive a functional formalism to obtain the generating functional of the correlation and response functions of a multiplicative stochastic process represented by a Langevin equation. By representing the stochastic process in this (Grassmann) functional formalism, we circumvent the need to fix a particular prescription. In this context, we analyze the equilibrium properties and study the hidden symmetries of the process. We show that, using an appropriate definition of the equilibrium distribution and considering the adequate time-reversal transformation, the usual equilibrium properties are satisfied for any prescription. Finally, we present a detailed derivation of the covariant supersymmetric formulation of a Markovian process with multiplicative white noise and study some of the relations imposed on the correlation functions by the Ward-Takahashi identities.
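The prescription dependence described above can be sketched numerically. Below is a minimal, hypothetical Euler-Maruyama integration of a Langevin equation with multiplicative noise, in which the noise amplitude is evaluated at the alpha-prescription point (alpha = 0 for Ito, alpha = 1/2 for Stratonovich); the drift and noise functions are illustrative, not taken from the thesis.

```python
import math
import random

def simulate(alpha, steps=100_000, dt=1e-3, seed=0):
    """Euler-Maruyama for dx = -x dt + g(x) dW with multiplicative
    noise g(x) = sqrt(1 + x*x), where g is evaluated at the
    alpha-prescription point x + alpha*(x_pred - x)
    (alpha = 0: Ito, alpha = 1/2: Stratonovich)."""
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))
        # predictor step so the noise amplitude can be evaluated
        # at the prescription point (alpha = 0 recovers plain Ito)
        g0 = math.sqrt(1.0 + x * x)
        x_pred = x + (-x) * dt + g0 * dW
        g = math.sqrt(1.0 + ((1 - alpha) * x + alpha * x_pred) ** 2)
        x += (-x) * dt + g * dW
        xs.append(x)
    return xs

# The stationary statistics differ between prescriptions:
xs_ito = simulate(0.0)
xs_strat = simulate(0.5)
var_ito = sum(x * x for x in xs_ito) / len(xs_ito)
var_strat = sum(x * x for x in xs_strat) / len(xs_strat)
```

Comparing `var_ito` and `var_strat` illustrates, in this toy setting, why a prescription must be fixed before the stationary distribution is well defined.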

Relevance: 60.00%

Publisher:

Abstract:

Spanish Relativity Meeting (ERE 2014), Valencia, Spain, 1-5 September 2014

Relevance: 60.00%

Publisher:

Abstract:

Tracer-injection techniques have been widely used to investigate flows in porous media, mainly in problems involving the numerical simulation of miscible flows in petroleum reservoirs and the transport of contaminants in aquifers. Subsurface reservoirs are generally heterogeneous and may exhibit significant variations of their properties over several length scales. These spatial variations are incorporated into the equations governing flow within the porous medium by means of random fields. Such fields can provide a description of the heterogeneities of the subsurface formation in cases where geological knowledge does not supply the detail needed for a deterministic prediction of flow through the porous medium. In this thesis, a lognormal model is adopted for the permeability field in order to reproduce the permeability distribution of the real medium, and these random fields are generated numerically by the Successive Sum of Independent Gaussian Fields (SSCGI) method. The main goal of this work is the study of uncertainty quantification for the inverse problem of tracer transport in a heterogeneous porous medium, employing a Bayesian approach to update the permeability fields based on measurements of the spatial concentration of the tracer at specific times. A two-stage Markov Chain Monte Carlo method is used to sample the posterior probability distribution, and the Markov chain is built from random reconstructions of the permeability fields. The pressure-velocity problem governing the flow is solved with a Mixed Finite Element method suited to the accurate computation of fluxes in heterogeneous permeability fields, and a Lagrangian approach, the Forward Integral Tracking (FIT) method, is used in the numerical simulation of the tracer-transport problem. Numerical results are obtained and presented for a set of sample realizations of the permeability fields.

Relevance: 60.00%

Publisher:

Abstract:

When estimating parameters that constitute a discrete probability distribution {p_j}, it is difficult to determine how constraints should be imposed to guarantee that the estimated parameters {p̂_j} constitute a probability distribution (i.e., p̂_j ≥ 0, Σ p̂_j = 1). For age distributions estimated from mixtures of length-at-age distributions, the EM (expectation-maximization) algorithm (Hasselblad, 1966; Hoenig and Heisey, 1987; Kimura and Chikuni, 1987), restricted least squares (Clark, 1981), and weak quasisolutions (Troynikov, 2004) have all been used. Each of these methods appears to guarantee that the estimated distribution will be a true probability distribution, with all categories greater than or equal to zero and with individual probabilities that sum to one. In addition, all these methods appear to provide a theoretical basis for solutions that will be either maximum-likelihood estimates or at least convergent to a probability distribution.
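As a sketch of why the EM algorithm automatically satisfies the simplex constraints, here is a minimal mixing-proportion EM for a toy age-length problem; the length-at-age matrix and the counts are hypothetical, not data from any of the cited studies. Each M-step renormalizes nonnegative quantities, so p̂_j ≥ 0 and Σ p̂_j = 1 hold at every iteration by construction.

```python
# EM for mixing proportions p_j of known length-at-age densities.
# f[j][k] = probability that an age-j fish falls in length bin k
# (hypothetical values); n[k] = observed counts per length bin.
f = [[0.6, 0.3, 0.1],   # age 1
     [0.2, 0.5, 0.3],   # age 2
     [0.1, 0.2, 0.7]]   # age 3
n = [40, 35, 25]
N = sum(n)

p = [1.0 / len(f)] * len(f)          # uniform start on the simplex
for _ in range(2000):
    # E-step: expected number of fish of age j, accumulated from the
    # posterior probability that a bin-k fish has age j
    new = [0.0] * len(p)
    for k, nk in enumerate(n):
        denom = sum(p[j] * f[j][k] for j in range(len(p)))
        for j in range(len(p)):
            new[j] += nk * p[j] * f[j][k] / denom
    # M-step: renormalise -- p stays on the simplex automatically
    p = [v / N for v in new]

assert all(pj >= 0 for pj in p) and abs(sum(p) - 1.0) < 1e-9
```

Because the updates are products and sums of nonnegative terms divided by their total, no explicit constraint handling is needed, which is the practical appeal of EM noted above.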

Relevance: 60.00%

Publisher:

Abstract:

This work presents a study on the application of the Bayesian approach to the solution of the inverse problem of structural damage identification, in which the integrity of the structure is continuously described by a structural parameter called the cohesion parameter. The structure chosen for analysis is a simply supported Euler-Bernoulli beam. Damage identification is based on changes in the impulse response of the structure caused by the presence of damage. The direct problem is solved by the Finite Element Method (FEM), which in turn is parameterized by the cohesion parameter of the structure. The damage-identification problem is formulated as an inverse problem whose solution, from the Bayesian point of view, is a posterior probability distribution for each cohesion parameter of the structure, obtained using Markov Chain Monte Carlo sampling. The uncertainties inherent in the measured data are accounted for in the likelihood function. Three solution strategies are presented. In Strategy 1, the cohesion parameters of the structure are sampled from posterior probability density functions with the same standard deviation. In Strategy 2, after a preliminary analysis of the damage-identification process, potentially damaged regions of the beam are determined, and the cohesion parameters associated with these regions are sampled from posterior probability density functions with distinct standard deviations. In Strategy 3, after a preliminary analysis of the damage-identification process, only the parameters associated with the regions identified as potentially damaged are updated. A set of numerical results is presented considering different noise levels for the three solution strategies.
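The Markov Chain Monte Carlo step used above can be sketched with a random-walk Metropolis sampler. The forward model here is deliberately trivial (the FEM impulse response is replaced by direct noisy observations of a single hypothetical cohesion parameter); all numbers are illustrative.

```python
import math
import random

# Random-walk Metropolis for the posterior of a single "cohesion"
# parameter theta in (0, 1), given noisy measurements
# y_i = theta + noise.  A real application would replace the trivial
# forward model with an FEM impulse-response computation.
rng = random.Random(1)
theta_true, sigma = 0.8, 0.05
ys = [theta_true + rng.gauss(0, sigma) for _ in range(20)]

def log_post(theta):
    if not 0.0 < theta < 1.0:          # uniform prior on (0, 1)
        return -math.inf
    return -sum((y - theta) ** 2 for y in ys) / (2 * sigma ** 2)

theta, chain = 0.5, []
for _ in range(20_000):
    prop = theta + rng.gauss(0, 0.02)  # random-walk proposal
    # accept with probability min(1, posterior ratio)
    if math.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)

# discard burn-in, then summarise the posterior
post_mean = sum(chain[5000:]) / len(chain[5000:])
```

The retained chain approximates the posterior distribution of the parameter, which is exactly what each strategy above reports per cohesion parameter.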

Relevance: 60.00%

Publisher:

Abstract:

Atlantic Croaker (Micropogonias undulatus) production dynamics along the U.S. Atlantic coast are regulated by fishing and winter water temperature. Stakeholders for this resource have recommended investigating the effects of climate covariates in assessment models. This study used state-space biomass dynamic models without (model 1) and with (model 2) the minimum winter estuarine temperature (MWET) to examine MWET effects on Atlantic Croaker population dynamics during 1972-2008. In model 2, MWET was introduced into the intrinsic rate of population increase (r). For both models, a prior probability distribution (prior) was constructed for r or a scaling parameter (r0); inputs were the fishery removals and fall biomass indices developed using data from the Multispecies Bottom Trawl Survey of the Northeast Fisheries Science Center, National Marine Fisheries Service, and the Coastal Trawl Survey of the Southeast Area Monitoring and Assessment Program. Model sensitivity runs incorporated a uniform (0.01, 1.5) prior for r or r0 and bycatch data from the shrimp-trawl fishery. All model variants produced similar results and therefore supported the conclusion of low risk of overfishing for the Atlantic Croaker stock in the 2000s. However, the data statistically supported only model 1 and its configuration that included the shrimp-trawl fishery bycatch. The process errors of these models showed slightly positive and significant correlations with MWET, indicating that warmer winters would enhance Atlantic Croaker biomass production. These inconclusive, somewhat conflicting results indicate that biomass dynamic models should not integrate MWET, pending, perhaps, accumulation of longer time series of the variables controlling the production dynamics of Atlantic Croaker, preferably including estimates of winter-induced Atlantic Croaker kills.
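The deterministic core of such biomass dynamic models is the Schaefer surplus-production update; the state-space versions add process and observation error around this skeleton. A minimal sketch with hypothetical r, K, and catch values:

```python
# Schaefer surplus-production skeleton: biomass grows at intrinsic
# rate r toward carrying capacity K and is reduced by catch.
# r, K, the initial biomass, and the catch series are hypothetical.
r, K, B = 0.4, 100.0, 60.0
catches = [8.0, 10.0, 12.0, 9.0, 7.0]

traj = [B]
for C in catches:
    # logistic surplus production minus removals, floored to keep
    # the simulated biomass positive
    B = max(B + r * B * (1.0 - B / K) - C, 1e-6)
    traj.append(B)
```

A state-space model would treat each step's deviation from this update as process error and link `traj` to survey indices through an observation equation, which is where the MWET covariate on r enters in model 2.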

Relevance: 60.00%

Publisher:

Abstract:

We report a Monte Carlo representation of the long-term inter-annual variability of monthly snowfall on a detailed (1 km) grid of points throughout the Southwest. An extension of the local climate model of the southwestern United States (Stamm and Craig 1992) provides spatially based estimates of mean and variance of monthly temperature and precipitation. The mean is the expected value from a canonical regression using independent variables that represent controls on climate in this area, including orography. Variance is computed as the standard error of the prediction and provides site-specific measures of (1) natural sources of variation and (2) errors due to limitations of the data and poor distribution of climate stations. Simulation of monthly temperature and precipitation over a sequence of years is achieved by drawing from a bivariate normal distribution. The conditional expectation of precipitation, given temperature in each month, is the basis of a numerical integration of the normal probability distribution of log precipitation below a threshold temperature (3°C) to determine snowfall as a percent of total precipitation. Snowfall predictions are tested at stations for which long-term records are available. At Donner Memorial State Park (elevation 1811 meters), a 34-year simulation - matching the length of instrumental record - is within 15 percent of observed for mean annual snowfall. We also compute resulting snowpack using a variation of the model of Martinec et al. (1983). This allows additional tests by examining spatial patterns of predicted snowfall and snowpack and their hydrologic implications.
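The snow-fraction computation can be mimicked with a small Monte Carlo sketch: draw correlated (temperature, log-precipitation) pairs from a bivariate normal and count precipitation in months below the 3°C threshold as snow. The means, variances, and correlation below are hypothetical, not values from the local climate model.

```python
import math
import random

# Monte Carlo sketch of the snowfall-fraction idea: draw monthly
# (temperature, log-precipitation) from a correlated bivariate
# normal and count precipitation falling below a 3 degC threshold
# as snow.  All distribution parameters are hypothetical.
def snow_fraction(mu_t=1.0, sd_t=4.0, mu_lp=3.0, sd_lp=0.5,
                  rho=-0.3, n=100_000, seed=2):
    rng = random.Random(seed)
    snow = total = 0.0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        t = mu_t + sd_t * z1
        # conditional draw of log precipitation given temperature,
        # using the standard bivariate-normal construction
        lp = mu_lp + sd_lp * (rho * z1 + math.sqrt(1 - rho ** 2) * z2)
        p = math.exp(lp)
        total += p
        if t < 3.0:                    # threshold temperature for snow
            snow += p
    return snow / total

frac = snow_fraction()
```

The paper performs the equivalent computation by numerical integration of the conditional distribution rather than by sampling, but the quantity estimated - snowfall as a fraction of total precipitation - is the same.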

Relevance: 60.00%

Publisher:

Abstract:

Using geographic information system (GIS) techniques, landscape-ecology theory and methods, fractal theory, and statistical analysis, this study analyzes the spatial distribution characteristics of the vegetation landscape of the Beijing region and examines methods for analyzing landscape pattern and landscape diversity. The results show that: (1) For almost all patch types, the patch-size distribution is not symmetric but right-skewed. Each of four candidate probability distributions (gamma, lognormal, Weibull, and (negative) exponential) describes only some of the patch types; the lognormal fits the most patch types and the (negative) exponential the fewest. (2) As patch area increases, edge effects become smaller while patch shapes become less compact. (3) Fractal analysis identifies two scale domains in the vegetation landscape: patch areas below about 2.7 km² and patch areas above about 2.7 km². Patch complexity differs greatly between the two domains: patches in the latter domain are markedly more complex than those in the former, and shape complexity increases with patch area. (4) With patch number as the abundance measure, the patch-type abundance distribution follows the (truncated) lognormal and (truncated) negative binomial distributions, but not the log-series or geometric distributions. With patch area as the abundance measure, it follows the lognormal, Weibull, and gamma distributions, but not the normal distribution. The patch-type abundance distribution is therefore also right-skewed rather than symmetric. Among four dominance/diversity models, the niche-preemption model and the Zipf-Mandelbrot model describe the patch-type abundance relation of this landscape well. (5) Sample size directly affects diversity measures; a measure is stable if this effect is small. Of three richness indices, R1 is more stable than R2 and R3; of five diversity indices, D and D1 are the most stable and OD the least stable, making the sensitive OD the indicator of choice for landscape-diversity monitoring; of five evenness indices, Jgi is the most stable. According to three proposed methods for computing the critical number of quadrats (the number at which a diversity measure stabilizes), the most stable measures above normally stabilize with only a few quadrats (a total sampled area of several hundred km²). (6) The number of patch types increases with area; by four goodness-of-fit criteria, a hyperbolic curve fits the patch-type/area relation of this landscape best. (7) With large samples (more than 30 quadrats for the first-order jackknife, more than 60 for the second-order), the jackknife gives good estimates of the number of patch types (NPT); with small samples (fewer than 30 quadrats), the empirical Bayes method of Mingoti and Meeden estimates NPT better than the jackknife and the bootstrap. Extrapolation of the patch-type/area curve can also give good NPT estimates, but it must be used with caution and not carried far. (8) Contingency-table analysis shows that patch type is significantly correlated with soil type, rock type, elevation, and aspect. Vegetation-landscape diversity is significantly positively correlated with rock-type diversity and with topographic diversity, increasing as either increases, but shows no significant linear or rank correlation with soil-type diversity, possibly because the two classification systems do not correspond. Vegetation-landscape diversity is significantly negatively correlated with total road density and with second-class road density, but not with first- or third-class road density, indicating that the landscape sample unit (10 km × 10 km) corresponds to the influence scale of second-class roads. Since road density partly reflects the intensity of human activity, at the 10 km × 10 km scale the more intense the human activity, the lower the landscape diversity.

Relevance: 60.00%

Publisher:

Abstract:

A pivotal problem in Bayesian nonparametrics is the construction of prior distributions on the space M(V) of probability measures on a given domain V. In principle, such distributions on the infinite-dimensional space M(V) can be constructed from their finite-dimensional marginals---the most prominent example being the construction of the Dirichlet process from finite-dimensional Dirichlet distributions. This approach is both intuitive and applicable to the construction of arbitrary distributions on M(V), but also hamstrung by a number of technical difficulties. We show how these difficulties can be resolved if the domain V is a Polish topological space, and give a representation theorem directly applicable to the construction of any probability distribution on M(V) whose first moment measure is well-defined. The proof draws on a projective limit theorem of Bochner, and on properties of set functions on Polish spaces to establish countable additivity of the resulting random probabilities.
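The finite-dimensional construction mentioned above can be illustrated concretely: for a partition A_1, ..., A_k of V, the Dirichlet-process marginal (P(A_1), ..., P(A_k)) is Dirichlet distributed with parameters proportional to the base measure, and can be sampled by normalizing independent Gamma draws. The concentration parameter and base-measure weights below are hypothetical.

```python
import random

# Finite-dimensional marginal of a Dirichlet process: for a
# partition A_1..A_k of V with base-measure weights H(A_i), the
# random vector (P(A_1),...,P(A_k)) ~ Dirichlet(c*H(A_1),...,
# c*H(A_k)), sampled here via normalised Gamma draws.
def dirichlet_sample(alphas, rng):
    gs = [rng.gammavariate(a, 1.0) for a in alphas]
    s = sum(gs)
    return [g / s for g in gs]

rng = random.Random(3)
c, H = 5.0, [0.5, 0.3, 0.2]       # hypothetical concentration, base measure
draws = [dirichlet_sample([c * h for h in H], rng)
         for _ in range(20_000)]

# the first moment measure is the base measure: E[P(A_i)] = H(A_i)
means = [sum(d[i] for d in draws) / len(draws) for i in range(3)]
```

This is the marginal behaviour that the representation theorem extends to arbitrary distributions on M(V) with a well-defined first moment measure.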

Relevance: 60.00%

Publisher:

Abstract:

Accurate simulation of rolling-tyre vibrations, and the associated noise, requires knowledge of road-surface topology. Full scans of the surface types in common use are, however, not widely available, and are likely to remain so. Ways of producing simulated surfaces from incomplete starting information are thus needed. In this paper, a simulation methodology based solely on line measurements is developed, and validated against a full two-dimensional height map of a real asphalt surface. First the tribological characteristics of the real surface (asperity height, curvature, and nearest-neighbour distributions) are analysed. It is then shown that a standard simulation technique, which matches the (isotropic) spectrum and the probability distribution of the height measurements, is unable to reproduce these characteristics satisfactorily. A modification, whereby the inherent granularity of the surface is enforced at the initialisation stage, is introduced, and found to produce simulations whose tribological characteristics are in excellent agreement with the measurements. This method will thus make high-fidelity tyre-vibration calculations feasible for researchers with access to line-scan data only. In addition, the approach to surface tribological characterisation set out here provides a template for efficient cataloguing of road textures, as long as the resulting information can subsequently be used to produce sample realisations. A third simulation algorithm, which successfully addresses this requirement, is therefore also presented. © 2011 Elsevier B.V.
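The standard spectrum-matching step that the paper takes as its starting point can be sketched in one dimension: sum cosines with a prescribed amplitude spectrum and independent uniform random phases. The power-law spectrum used here is hypothetical, chosen only to make the sketch concrete.

```python
import math
import random

# Random-phase synthesis of a 1-D height profile with a target
# amplitude spectrum: each Fourier mode keeps a prescribed
# amplitude but receives an independent uniform random phase.
# The power-law amplitude spectrum a_k ~ k^(-1.5) is hypothetical.
def synth_profile(n=512, n_modes=64, seed=4):
    rng = random.Random(seed)
    amps = [1.0 / (k ** 1.5) for k in range(1, n_modes + 1)]
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in amps]
    return [sum(a * math.cos(2.0 * math.pi * (k + 1) * i / n + ph)
                for k, (a, ph) in enumerate(zip(amps, phases)))
            for i in range(n)]

h = synth_profile()
```

A profile built this way reproduces the target spectrum by construction, but - as the paper argues - not necessarily the asperity-level statistics, which is why the granularity-enforcing initialisation is needed.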

Relevance: 60.00%

Publisher:

Abstract:

Motor behavior may be viewed as a problem of maximizing the utility of movement outcome in the face of sensory, motor and task uncertainty. Viewed in this way, and allowing for the availability of prior knowledge in the form of a probability distribution over possible states of the world, the choice of a movement plan and strategy for motor control becomes an application of statistical decision theory. This point of view has proven successful in recent years in accounting for movement under risk, inferring the loss function used in motor tasks, and explaining motor behavior in a wide variety of circumstances.
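A minimal instance of this decision-theoretic view: choose a one-dimensional aim point that maximizes expected gain when the movement endpoint is the aim point perturbed by Gaussian motor noise, with a reward region adjoining a penalty region. The geometry, noise level, and payoffs below are hypothetical.

```python
import math

# Movement planning as statistical decision theory: pick a 1-D aim
# point maximising expected gain when the endpoint is the aim point
# plus Gaussian motor noise.  A reward region [0, 1] adjoins a
# penalty region [-0.5, 0); all numbers are hypothetical.
def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def expected_gain(aim, sd=0.2, reward=1.0, penalty=-5.0):
    p_hit = Phi((1.0 - aim) / sd) - Phi((0.0 - aim) / sd)
    p_pen = Phi((0.0 - aim) / sd) - Phi((-0.5 - aim) / sd)
    return reward * p_hit + penalty * p_pen

# grid search over candidate aim points
aims = [i / 1000.0 for i in range(-500, 1500)]
best = max(aims, key=expected_gain)
```

With the penalty on the left, the optimal aim point shifts away from the centre of the reward region, the signature behaviour this framework predicts for movement under risk.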

Relevance: 60.00%

Publisher:

Abstract:

© 2012 Elsevier Ltd. Motor behavior may be viewed as a problem of maximizing the utility of movement outcome in the face of sensory, motor and task uncertainty. Viewed in this way, and allowing for the availability of prior knowledge in the form of a probability distribution over possible states of the world, the choice of a movement plan and strategy for motor control becomes an application of statistical decision theory. This point of view has proven successful in recent years in accounting for movement under risk, inferring the loss function used in motor tasks, and explaining motor behavior in a wide variety of circumstances.

Relevance: 60.00%

Publisher:

Abstract:

Various packed beds of copper-based oxygen carriers (CuO on Al2O3) were tested over 100 cycles of low temperature (673K) Chemical Looping Combustion (CLC) with H2 as the fuel gas. The oxygen carriers were uniformly mixed with alumina (Al2O3) in order to investigate the level of separation necessary to prevent agglomeration. It was found that a mass ratio of 1:6 oxygen carrier to alumina gave the best performance in terms of stable, repeating hydrogen breakthrough curves over 100 cycles. In order to quantify the average separation achieved in the mixed packed beds, two sphere-packing models were developed. The hexagonal close-packing model assumed a uniform spherical packing structure, and based the separation calculations on a hypergeometric probability distribution. The more computationally intensive full-scale model used discrete element modelling to simulate random packing arrangements governed by gravity and contact dynamics. Both models predicted that average 'nearest neighbour' particle separation drops to near zero for oxygen carrier mass fractions of x≥0.25. For the packed bed systems studied, agglomeration was observed when the mass fraction of oxygen carrier was above this threshold. © 2013 Elsevier B.V.
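The hypergeometric step of the close-packing model can be sketched as follows: if a particle's m nearest neighbours are drawn without replacement from a bed of N spheres of which K are oxygen carrier, the probability that all m are alumina is the k = 0 term of the hypergeometric distribution. The bed size and coordination number below are hypothetical.

```python
from math import comb

# Hypergeometric sketch of the mixed-bed separation argument:
# P(k carrier neighbours) when m neighbours are drawn without
# replacement from N spheres, K of which are oxygen carrier.
def hypergeom_pmf(k, N, K, m):
    return comb(K, k) * comb(N - K, m - k) / comb(N, m)

N = 10_000          # total spheres in the bed (hypothetical)
m = 12              # nearest neighbours (hcp coordination number)

# probability that a carrier particle is fully isolated by alumina,
# for increasing carrier mass fractions
p_iso = {}
for frac in (0.05, 0.14, 0.25, 0.5):
    K = int(frac * N)
    p_iso[frac] = hypergeom_pmf(0, N, K, m)
```

The collapse of `p_iso` toward zero as the carrier fraction grows mirrors the models' prediction that average nearest-neighbour separation vanishes for mass fractions x ≥ 0.25, the threshold above which agglomeration was observed.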

Relevance: 60.00%

Publisher:

Abstract:

Vibration and acoustic analysis at higher frequencies faces two challenges: computing the response without using an excessive number of degrees of freedom, and quantifying its uncertainty due to small spatial variations in geometry, material properties and boundary conditions. Efficient models make use of the observation that when the response of a decoupled vibro-acoustic subsystem is sufficiently sensitive to uncertainty in such spatial variations, the local statistics of its natural frequencies and mode shapes saturate to universal probability distributions. This holds irrespective of the causes that underly these spatial variations and thus leads to a nonparametric description of uncertainty. This work deals with the identification of uncertain parameters in such models by using experimental data. One of the difficulties is that both experimental errors and modeling errors, due to the nonparametric uncertainty that is inherent to the model type, are present. This is tackled by employing a Bayesian inference strategy. The prior probability distribution of the uncertain parameters is constructed using the maximum entropy principle. The likelihood function that is subsequently computed takes the experimental information, the experimental errors and the modeling errors into account. The posterior probability distribution, which is computed with the Markov Chain Monte Carlo method, provides a full uncertainty quantification of the identified parameters, and indicates how well their uncertainty is reduced, with respect to the prior information, by the experimental data. © 2013 Taylor & Francis Group, London.

Relevance: 60.00%

Publisher:

Abstract:

The photoluminescence correlation from a single CdSe nanocrystal under pulsed excitation is studied, and a single photon is realized at wavelength 655 nm at room temperature. The single colloidal CdSe quantum dot is prepared on a SiO2/silicon surface by a drop-and-drag technique. The long-term stability of the single-photon source is investigated; it is found that the antibunching effect weakens with excitation time, and the reason for the weakening is attributed to photobleaching. The lifetimes of photoluminescence from a single quantum dot are analyzed at different excitation times. By analyzing the probability distribution of on and off times of photoluminescence, the Auger assisted tunneling and Auger assisted photobleaching models are applied to explain the antibunching phenomenon.