920 results for probability distribution


Relevance: 60.00%

Abstract:

This paper gives a theoretical review of the Poisson probability distribution as the function that assigns, to each event defined on a discrete random variable, its probability of occurrence within a time interval or disjoint region of space. It also reviews the negative exponential distribution, used to model the time between consecutive Poisson events that occur independently; that is, events whose probability of occurrence in one time interval does not depend on those occurring in other intervals, which is why the distribution is said to be memoryless. The Poisson process links the Poisson function, which describes a set of independent events occurring in a time interval or region of space, to the times between those events, which follow the negative exponential distribution. These concepts underpin queueing theory, the branch of operations research that describes and provides solutions for situations in which a set of individuals or items form queues while waiting for a service; application examples from the medical field are presented.
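As a brief illustration of the relationship described above, the following sketch simulates a Poisson process via its exponential interarrival times and checks the memoryless property numerically; the rate `lam` and the probe times `s` and `t` are illustrative values, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
lam = 4.0        # event rate per unit time (illustrative)
n = 200_000

# Interarrival times of a Poisson process follow an exponential(lam) law
gaps = rng.exponential(1.0 / lam, size=n)

# Memorylessness: P(T > s + t | T > s) should equal P(T > t)
s, t = 0.3, 0.5
p_cond = np.mean(gaps[gaps > s] > s + t)
p_marg = np.mean(gaps > t)

# Counts of events in disjoint unit intervals follow a Poisson(lam) law
arrivals = np.cumsum(gaps)
edges = np.arange(0, int(arrivals[-1]) + 1)
counts = np.histogram(arrivals, bins=edges)[0]
```

Both tail estimates converge to exp(-lam*t), and the interval counts average to lam, tying the two distributions together exactly as the Poisson process does.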

Relevance: 60.00%

Abstract:

This doctoral thesis, entitled "Computational development of quantum molecular similarity", deals fundamentally with the computation of similarity measures based on the comparison of electron density functions.

The first chapter, "Quantum similarity", is introductory. It describes electron probability density functions and their significance within quantum mechanics. Their essential aspects and the mathematical conditions they must satisfy are made explicit, for a better understanding of the electron density models proposed later. Electron densities are presented, with mention of the Hohenberg-Kohn theorems and an outline of Bader's theory, as fundamental magnitudes for describing molecules and understanding their properties.

The chapter "Models of molecular electron densities" presents original computational procedures for fitting density functions to models expanded in terms of 1s Gaussians centred on the nuclei. The physico-mathematical constraints associated with probability distributions are introduced rigorously in a procedure called the Atomic Shell Approximation (ASA). This procedure, implemented in the ASAC program, starts from a quasi-complete function space from which the functions, or shells, of the expansion are selected variationally according to non-negativity requirements. The quality of these densities, and of the similarity measures derived from them, is verified extensively. The ASA model is extended to dynamic representations, physically more accurate in that they are affected by nuclear vibrations, in order to explore the effect of the damping of the nuclear peaks on molecular similarity measures. Comparing dynamic with static densities reveals a reordering in the dynamic densities, in what would constitute a manifestation of the quantum Le Chatelier principle. The ASA procedure, explicitly consistent with the N-representability conditions, is also applied to the direct determination of hydrogen-like electron densities in a density functional theory context.

The chapter "Global maximization of the similarity function" presents original algorithms for determining the maximum overlap of molecular electron densities. Quantum molecular similarity measures are identified with the maximum overlap, so that the distance between molecules is measured independently of the reference frames in which the electron densities are defined. Starting from the global solution in the limit of densities infinitely compressed onto the nuclei, three levels of approximation are proposed for a systematic, non-stochastic exploration of the similarity function, enabling efficient identification of the global maximum as well as of the various local maxima. An original parameterization of the overlap integrals through fits to Lorentzian functions is also proposed as a computational acceleration technique. In structure-activity work, these advances enable the efficient implementation of quantitative similarity measures and, in parallel, provide a fully automatic molecular alignment methodology.

The chapter "Similarities of atoms in molecules" describes an algorithm for comparing Bader atoms, the three-dimensional regions bounded by zero-flux surfaces of the electron density function. The quantitative character of these similarities makes it possible to measure rigorously the chemical notion of transferability of atoms and functional groups. The zero-flux surfaces and the integration algorithms used have been published recently and constitute the most accurate approach for computing atomic properties.

Finally, in the chapter "Similarities in crystal structures", an original definition of similarity is proposed, specific to comparing notions of softness in the phonon distribution associated with a crystal structure. These concepts arise in superconductivity studies because of the influence of electron-phonon interactions on the transition temperature to the superconducting state. Applying this methodology to the analysis of BEDT-TTF salts reveals structural correlations between superconducting and non-superconducting salts, consistent with hypotheses in the literature on the relevance of certain interactions. The thesis concludes with an appendix containing the ASAC program, an implementation of the ASA algorithm, and a final chapter of bibliographic references.

Relevance: 60.00%

Abstract:

In the article the author considers and analyzes operations and functions on risk variables. She takes into account the following: the sum of risk variables, their product, multiplication by a constant, division, and the maximum, minimum, and median of a sum of random variables. She derives formulas for the probability distribution and basic distribution parameters, conducting the analysis for both dependent and independent random variables. She proposes examples of situations in economics and production management where risk is modelled by these operations. The analysis proceeds by mathematical proof. Some of the formulas presented are taken from the literature, but others are original results of the author.
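A minimal numerical sketch of one such operation, the distribution of a sum of two independent discrete risk variables obtained by convolving their probability mass functions (the supports and probabilities below are invented for illustration):

```python
import numpy as np

# Two independent discrete risk variables on integer supports (illustrative)
pmf_x = np.array([0.5, 0.3, 0.2])   # X takes values 0, 1, 2
pmf_y = np.array([0.6, 0.4])        # Y takes values 0, 1

# For independent variables, the pmf of X + Y is the convolution of the pmfs
pmf_sum = np.convolve(pmf_x, pmf_y)           # support 0..3
vals = np.arange(pmf_sum.size)

mean_sum = np.dot(vals, pmf_sum)
mean_x = np.dot(np.arange(3), pmf_x)
mean_y = np.dot(np.arange(2), pmf_y)          # E[X + Y] = E[X] + E[Y]
```

The convolution step relies on independence; for dependent variables the joint pmf is needed instead, which is one of the cases the article analyzes.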

Relevance: 60.00%

Abstract:

A stochastic parameterization scheme for deep convection is described, suitable for use in both climate and NWP models. Theoretical arguments and the results of cloud-resolving models are discussed in order to motivate the form of the scheme. In the deterministic limit, it tends to a spectrum of entraining/detraining plumes and is similar to other current parameterizations. The stochastic variability describes the local fluctuations about a large-scale equilibrium state. Plumes are drawn at random from a probability distribution function (pdf) that defines the chance of finding a plume of given cloud-base mass flux within each model grid box. The normalization of the pdf is given by the ensemble-mean mass flux, which is computed with a CAPE closure method. The characteristics of each plume produced are determined using an adaptation of the plume model from the Kain-Fritsch parameterization. Initial tests in the single-column version of the Unified Model verify that the scheme is effective in producing the desired distributions of convective variability without adversely affecting the mean state.
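The random draw at the heart of such a scheme can be sketched as follows, assuming (as one plausible choice) an exponential pdf of cloud-base mass flux and a Poisson-distributed plume count; all parameter values are illustrative rather than taken from the scheme itself:

```python
import numpy as np

rng = np.random.default_rng(7)

M_ens = 0.05      # ensemble-mean mass flux from a CAPE closure (illustrative)
m_mean = 2.0e-3   # mean cloud-base mass flux of a single plume (illustrative)

def draw_grid_box():
    """Stochastic mass flux for one grid box: a random number of plumes,
    each with a mass flux drawn from the assumed exponential pdf."""
    n = rng.poisson(M_ens / m_mean)
    return rng.exponential(m_mean, size=n).sum()

# Individual boxes fluctuate, but averaging many boxes recovers the closure mean
boxes = np.array([draw_grid_box() for _ in range(20_000)])
```

In the limit of many plumes per box, the relative fluctuations vanish and the draw reduces to its ensemble mean, mirroring the deterministic limit described in the abstract.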

Relevance: 60.00%

Abstract:

Many time series are measured monthly, either as averages or totals, and such data often exhibit seasonal variability: the values of the series are consistently larger for some months of the year than for others. A typical series of this type is the number of deaths each month attributed to SIDS (Sudden Infant Death Syndrome). Seasonality can be modelled in a number of ways. This paper describes and discusses various methods for modelling seasonality in SIDS data, though much of the discussion is relevant to other seasonally varying data. There are two main approaches: either fitting a circular probability distribution to the data, or using regression-based techniques to model the mean seasonal behaviour. Both are discussed in this paper.
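A compact sketch of the regression-based approach: fitting a first seasonal harmonic (a cosinor model) to synthetic monthly counts. The counts, rate, and peak month are invented for illustration; the paper applies such models to actual SIDS data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Ten years of synthetic monthly counts with a peak in month 0 (illustrative)
month = np.arange(120) % 12
true_rate = 30 + 10 * np.cos(2 * np.pi * month / 12)
counts = rng.poisson(true_rate)

# First-harmonic (cosinor) regression: mu = a + b*cos(w*m) + c*sin(w*m)
w = 2 * np.pi / 12
X = np.column_stack([np.ones(120), np.cos(w * month), np.sin(w * month)])
coef, *_ = np.linalg.lstsq(X, counts, rcond=None)

amplitude = np.hypot(coef[1], coef[2])                 # seasonal swing
peak_month = (np.arctan2(coef[2], coef[1]) / w) % 12   # phase of the peak
```

The fitted amplitude and phase recover the seasonal swing and its timing; a circular-distribution fit would instead model the month of each death as an angle.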

Relevance: 60.00%

Abstract:

This study investigates the response of the wintertime North Atlantic Oscillation (NAO) to increasing concentrations of atmospheric carbon dioxide (CO2) as simulated by 18 global coupled general circulation models that participated in phase 2 of the Coupled Model Intercomparison Project (CMIP2). NAO has been assessed in control and transient 80-year simulations produced by each model under constant forcing and 1% per year increasing concentrations of CO2, respectively. Although generally able to simulate the main features of NAO, the majority of models overestimate the observed mean wintertime NAO index of 8 hPa by 5-10 hPa. Furthermore, none of the models, in either the control or perturbed simulations, is able to reproduce decadal trends as strong as that seen in the observed NAO index from 1970 to 1995. Of the 15 models able to simulate the NAO pressure dipole, 13 predict a positive increase in NAO with increasing CO2 concentrations. The magnitude of the response is generally small and highly model-dependent, which leads to large uncertainty in multi-model estimates such as the median estimate of 0.0061 ± 0.0036 hPa per %CO2. Although an increase of 0.61 hPa in NAO for a doubling of CO2 represents only a relatively small shift of 0.18 standard deviations in the probability distribution of the winter mean NAO, this can cause large relative increases in the probabilities of extreme values of NAO associated with damaging impacts. Despite the large differences in NAO responses, the models robustly predict similar statistically significant changes in winter mean temperature (warmer over most of Europe) and precipitation (an increase over Northern Europe). Although these changes present a pattern similar to that expected from an increase in the NAO index, linear regression is used to show that the response is much greater than can be attributed to small increases in NAO.
NAO trends are not the key contributor to model-predicted climate change in wintertime mean temperature and precipitation over Europe and the Mediterranean region. However, the models' inability to capture the observed decadal variability in NAO might also signify a major deficiency in their ability to simulate the NAO-related responses to climate change.
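The scale of the effect on extremes can be reproduced with a back-of-the-envelope calculation: shift a standardized Gaussian NAO distribution by the 0.18 standard deviations quoted above and compare tail probabilities. The Gaussian assumption and the 2-sigma "extreme" threshold are illustrative choices, not from the paper:

```python
from statistics import NormalDist

nao = NormalDist(mu=0.0, sigma=1.0)   # standardized winter-mean NAO, assumed Gaussian
shift = 0.18                          # shift for a CO2 doubling, from the abstract

threshold = 2.0                       # illustrative "extreme" NAO value, in sigmas
p_before = 1.0 - nao.cdf(threshold)
p_after = 1.0 - nao.cdf(threshold - shift)

relative_increase = p_after / p_before
```

Even this small mean shift raises the frequency of 2-sigma exceedances by roughly half, which is the point the abstract makes about damaging extremes.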

Relevance: 60.00%

Abstract:

We use geomagnetic activity data to study the rise and fall over the past century of the solar wind flow speed VSW, the interplanetary magnetic field strength B, and the open solar flux FS. Our estimates include allowance for the kinematic effect of longitudinal structure in the solar wind flow speed. As well as solar cycle variations, all three parameters show a long-term rise during the first half of the 20th century followed by peaks around 1955 and 1986 and then a recent decline. Cosmogenic isotope data reveal that this constitutes a grand maximum of solar activity which began in 1920, using the definition that such grand maxima occur when 25-year averages of the heliospheric modulation potential exceed 600 MV. Extrapolating the linear declines seen in all three parameters since 1985 yields predictions that the grand maximum will end in the years 2013, 2014, or 2027 using VSW, FS, or B, respectively. These estimates are consistent with predictions based on the probability distribution of the durations of past grand solar maxima seen in cosmogenic isotope data. The data contradict any suggestion of a floor to the open solar flux: we show that the solar minimum open solar flux, kinematically corrected to allow for the excess flux effect, has halved over the past two solar cycles.

Relevance: 60.00%

Abstract:

A method was developed to evaluate crop disease predictive models for their economic and environmental benefits. Benefits were quantified as the value of a prediction, measured by costs saved and fungicide dose saved. The value of prediction was defined as the net gain made by using predictions, measured as the difference between a scenario where predictions are available and used and a scenario without prediction. Comparable 'with' and 'without' scenarios were created with the use of risk levels. These risk levels were derived from a probability distribution fitted to observed disease severities, which was used to calculate the probability that a certain disease-induced economic loss was incurred. The method was exemplified by using it to evaluate a model developed for Mycosphaerella graminicola risk prediction. Based on the value of prediction, the tested model may have economic and environmental benefits to growers if used to guide treatment decisions on resistant cultivars. It is shown that the value of prediction, measured by fungicide dose saved and costs saved, is constant across risk levels. The method could also be used to evaluate similar crop disease predictive models.

Relevance: 60.00%

Abstract:

A means of assessing, monitoring and controlling aggregate emissions from multi-instrument Emissions Trading Schemes is proposed. The approach allows contributions from different instruments with different forms of emissions targets to be integrated. Where Emissions Trading Schemes are helping meet specific national targets, the approach allows the entry requirements of new participants to be calculated and set at a level that will achieve these targets. The approach is multi-levelled, and may be extended downwards to support pooling of participants within instruments, or upwards to embed Emissions Trading Schemes within a wider suite of policies and measures with hard and soft targets. Aggregate emissions from each instrument are treated stochastically. Emissions from the scheme as a whole are then the joint probability distribution formed by integrating the emissions from its instruments. Because a Bayesian approach is adopted, qualitative and semi-qualitative data from expert opinion can be used where quantitative data is not currently available or is incomplete. This approach helps government retain sufficient control over emissions trading scheme targets to allow them to meet their emissions reduction obligations, while minimising the need for retrospectively adjusting existing participants' conditions of entry. This maintains participant confidence, while providing the necessary policy levers for good governance.

Relevance: 60.00%

Abstract:

In real-world environments it is usually difficult to specify the quality of a preventive maintenance (PM) action precisely. This uncertainty makes it problematic to optimise maintenance policy. This problem is tackled in this paper by assuming that the quality of a PM action is a random variable following a probability distribution. Two frequently studied PM models, a failure rate PM model and an age reduction PM model, are investigated. The optimal PM policies are presented and optimised. Numerical examples are also given.
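A toy sketch of the age-reduction idea with random PM quality: an assumed Weibull hazard, an assumed uniform quality distribution, and a grid search for the PM interval with the lowest expected cost rate. All parameters and the cycle-cost formula are invented for illustration; the paper's models and optimisation are more general:

```python
import numpy as np

# Assumed Weibull cumulative hazard H(t) = (t/eta)**beta, with minimal repair
# at failures; after a PM at interval T with random quality q in [0, 1],
# the effective age is reduced to (1 - q) * T (illustrative parameters).
beta, eta = 2.5, 10.0      # Weibull shape and scale
c_pm, c_f = 1.0, 5.0       # PM cost and failure (minimal repair) cost

def expected_cost_rate(T, n=50_000, seed=0):
    rng = np.random.default_rng(seed)
    q = rng.uniform(0.0, 1.0, size=n)          # random PM quality
    a = (1.0 - q) * T                          # effective age after PM
    H = lambda t: (t / eta) ** beta
    failures = H(a + T) - H(a)                 # expected failures per cycle
    return (c_pm + c_f * failures.mean()) / T  # cost per unit time

# Grid search for the PM interval minimizing the expected cost rate
grid = np.linspace(1.0, 20.0, 39)
T_opt = grid[np.argmin([expected_cost_rate(T) for T in grid])]
```

Too-frequent PM wastes PM cost, too-rare PM accumulates failures; the random quality simply enters the cost rate through an expectation over its distribution.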

Relevance: 60.00%

Abstract:

A new Bayesian algorithm for retrieving surface rain rate from Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) over the ocean is presented, along with validations against estimates from the TRMM Precipitation Radar (PR). The Bayesian approach offers a rigorous basis for optimally combining multichannel observations with prior knowledge. While other rain-rate algorithms have been published that are based at least partly on Bayesian reasoning, this is believed to be the first self-contained algorithm that fully exploits Bayes's theorem to yield not just a single rain rate, but rather a continuous posterior probability distribution of rain rate. To advance the understanding of the theoretical benefits of the Bayesian approach, sensitivity analyses have been conducted based on two synthetic datasets for which the "true" conditional and prior distributions are known. Results demonstrate that even when the prior and conditional likelihoods are specified perfectly, biased retrievals may occur at high rain rates. This bias is not the result of a defect of the Bayesian formalism, but rather represents the expected outcome when the physical constraint imposed by the radiometric observations is weak owing to saturation effects. It is also suggested that both the choice of the estimators and the prior information are crucial to the retrieval. In addition, the performance of the Bayesian algorithm herein is found to be comparable to that of other benchmark algorithms in real-world applications, while having the additional advantage of providing a complete continuous posterior probability distribution of surface rain rate.
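The saturation-induced bias can be illustrated with a tiny discrete Bayes calculation. The saturating forward model, noise level, prior, and rain-rate grid below are all invented for illustration; none of these numbers come from the TMI algorithm itself:

```python
import numpy as np

rates = np.linspace(0.0, 50.0, 501)      # rain-rate grid, mm/h (illustrative)
prior = np.exp(-rates / 8.0)
prior /= prior.sum()                     # assumed exponential prior

def forward(r):
    """Toy radiometer response that saturates at high rain rates."""
    return 1.0 - np.exp(-r / 10.0)

def posterior(obs, sigma=0.05):
    """Discrete Bayes: posterior(R) is proportional to likelihood * prior."""
    like = np.exp(-0.5 * ((obs - forward(rates)) / sigma) ** 2)
    post = like * prior
    return post / post.sum()

post_low = posterior(forward(2.0))       # observation from a true rate of 2 mm/h
post_high = posterior(forward(40.0))     # observation from a true rate of 40 mm/h
mean_low = float(np.dot(rates, post_low))
mean_high = float(np.dot(rates, post_high))
```

At low rates the posterior mean sits near the truth, but at high rates the saturated observation constrains the rate only weakly, the prior dominates, and the posterior-mean estimate is biased low, mirroring the behaviour reported in the sensitivity analyses.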

Relevance: 60.00%

Abstract:

We present extensive molecular dynamics simulations of the dynamics of diluted long probe chains entangled with a matrix of shorter chains. The chain lengths of both components are above the entanglement strand length, and the ratio of their lengths is varied over a wide range to cover the crossover from the chain reptation regime to the tube Rouse motion regime of the long probe chains. Reducing the matrix chain length results in a faster decay of the dynamic structure factor of the probe chains, in good agreement with recent neutron spin echo experiments. The diffusion of the long chains, measured by the mean square displacements of the monomers and the centers of mass of the chains, demonstrates a systematic speed-up relative to the pure reptation behavior expected for monodisperse melts of sufficiently long polymers. On the other hand, the diffusion of the matrix chains is only weakly perturbed by the diluted long probe chains. The simulation results are qualitatively consistent with the theoretical predictions based on the constraint release Rouse model, but a detailed comparison reveals the existence of a broad distribution of the disentanglement rates, which is partly confirmed by an analysis of the packing and diffusion of the matrix chains in the tube region of the probe chains. A coarse-grained simulation model based on the tube Rouse motion model with incorporation of the probability distribution of the tube segment jump rates is developed and shows results qualitatively consistent with the fine scale molecular dynamics simulations. However, we observe a breakdown in the tube Rouse model when the short chain length is decreased to around N_S = 80, which is roughly 3.5 times the entanglement spacing N_e(P) = 23. The location of this transition may be sensitive to the chain bending potential used in our simulations.

Relevance: 60.00%

Abstract:

Valuation is the process of estimating price. The methods used to determine value attempt to model the thought processes of the market and thus estimate price by reference to observed historic data. This can be done using either an explicit model, which models the worth calculation of the most likely bidder, or an implicit model, which uses historic data suitably adjusted as a shortcut to determine value by reference to previous similar sales. The former is generally referred to as the Discounted Cash Flow (DCF) model and the latter as the capitalisation (or All Risk Yield) model. Regardless of the technique used, however, the valuation will be affected by uncertainties: uncertainty in the comparable data available, in the current and future market conditions, and in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the estimate of price. In a previous paper, we considered the way in which uncertainty is allowed for in the capitalisation model in the UK. In this paper, we extend the analysis to look at the way in which uncertainty can be incorporated into the explicit DCF model. This is done by recognising that the input variables are uncertain and will each have a probability distribution pertaining to them. Thus, by utilising a probability-based valuation model (using Crystal Ball), it is possible to incorporate uncertainty into the analysis and address the shortcomings of the current model. Although the capitalisation model is discussed, the paper concentrates upon the application of Crystal Ball to the Discounted Cash Flow approach.
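The probability-based approach can be sketched with a plain Monte Carlo DCF in place of Crystal Ball. The distributions and parameter values below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Uncertain DCF inputs modelled as probability distributions (illustrative
# values standing in for market-derived estimates)
rent = rng.normal(100_000, 10_000, n)      # annual rent
growth = rng.normal(0.02, 0.01, n)         # annual rental growth rate
discount = rng.normal(0.08, 0.005, n)      # discount rate
exit_yield = rng.normal(0.07, 0.005, n)    # capitalisation rate at exit

# Five years of explicit cash flows, then a capitalised terminal value
years = np.arange(1, 6)
cf = rent[:, None] * (1 + growth[:, None]) ** (years - 1)
pv_cf = (cf / (1 + discount[:, None]) ** years).sum(axis=1)
terminal = cf[:, -1] * (1 + growth) / exit_yield
pv = pv_cf + terminal / (1 + discount) ** 5

# The valuation is now a distribution, not a point estimate
low, mid, high = np.percentile(pv, [5, 50, 95])
```

Instead of a single estimate of price, the output is a distribution whose spread directly reflects the input uncertainties, which is exactly what the probability-based model is meant to expose.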

Relevance: 60.00%

Abstract:

In this paper sequential importance sampling is used to assess the impact of observations on an ensemble prediction for the decadal path transitions of the Kuroshio Extension (KE). This particle filtering approach gives access to the probability density of the state vector, which allows us to determine the predictive power, an entropy-based measure, of the ensemble prediction. The proposed set-up makes use of an ensemble that, at each time, samples the climatological probability distribution. Then, in a post-processing step, the impact of different sets of observations is measured by the increase in predictive power of the ensemble over the climatological signal during one year. The method is applied in an identical-twin experiment for the Kuroshio Extension using a reduced-gravity shallow-water model. We investigate the impact of assimilating velocity observations from different locations during the elongated and the contracted meandering states of the KE. Optimal observation locations correspond to regions with strong potential vorticity gradients. For the elongated state the optimal location is in the first meander of the KE. During the contracted state of the KE it is located south of Japan, where the Kuroshio separates from the coast.
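One simple variant of an entropy-based predictive power, computed for a discretized state variable, is the fractional entropy reduction of the posterior relative to climatology. The distributions below are invented; the paper's measure is defined for the particle-filter ensemble of the KE state:

```python
import numpy as np

bins = 20
clim = np.full(bins, 1.0 / bins)        # flat climatological distribution

x = np.linspace(-1.0, 1.0, bins)        # discretized state variable (illustrative)
w = np.exp(-0.5 * (x / 0.2) ** 2)       # posterior sharpened by observations (assumed)
post = w / w.sum()

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Predictive power: fractional entropy reduction relative to climatology,
# 0 for no information gain, approaching 1 for a near-certain prediction
predictive_power = 1.0 - entropy(post) / entropy(clim)
```

Observations that sharpen the ensemble lower its entropy and raise this score, which is how competing observation locations can be ranked in post-processing.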

Relevance: 60.00%

Abstract:

In this paper, the statistical properties of tropical ice clouds (ice water content, visible extinction, effective radius, and total number concentration) derived from 3 yr of ground-based radar–lidar retrievals from the U.S. Department of Energy Atmospheric Radiation Measurement Climate Research Facility in Darwin, Australia, are compared with the same properties derived using the official CloudSat microphysical retrieval methods and from a simpler statistical method using radar reflectivity and air temperature. It is shown that the two official CloudSat microphysical products (2B-CWC-RO and 2B-CWC-RVOD) are statistically virtually identical. The comparison with the ground-based radar–lidar retrievals shows that all satellite methods produce ice water contents and extinctions in a much narrower range than the ground-based method and overestimate the mean vertical profiles of microphysical parameters below 10-km height by over a factor of 2. Better agreements are obtained above 10-km height. Ways to improve these estimates are suggested in this study. Effective radii retrievals from the standard CloudSat algorithms are characterized by a large positive bias of 8–12 μm. A sensitivity test shows that in response to such a bias the cloud longwave forcing is increased from 44.6 to 46.9 W m⁻² (implying an error of about 5%), whereas the negative cloud shortwave forcing is increased from −81.6 to −82.8 W m⁻². Further analysis reveals that these modest effects (although not insignificant) can be much larger for optically thick clouds. The statistical method using CloudSat reflectivities and air temperature was found to produce inaccurate mean vertical profiles and probability distribution functions of effective radius. This study also shows that the retrieval of the total number concentration needs to be improved in the official CloudSat microphysical methods prior to a quantitative use for the characterization of tropical ice clouds. 
Finally, the statistical relationship used to produce ice water content from extinction and air temperature obtained by the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite is evaluated for tropical ice clouds. It is suggested that the CALIPSO ice water content retrieval is robust for tropical ice clouds, but that the temperature dependence of the statistical relationship used should be slightly refined to better reproduce the radar–lidar retrievals.