46 results for Gaussian Probability Distribution

in CentAUR: Central Archive University of Reading - UK


Relevance: 100.00%

Abstract:

The co-polar correlation coefficient (ρhv) has many applications, including hydrometeor classification, ground clutter and melting layer identification, interpretation of ice microphysics and the retrieval of rain drop size distributions (DSDs). However, we currently lack the quantitative error estimates that are necessary if these applications are to be fully exploited. Previous error estimates of ρhv rely on knowledge of the unknown "true" ρhv and implicitly assume a Gaussian probability distribution function of ρhv samples. We show that frequency distributions of ρhv estimates are in fact highly negatively skewed. A new variable, L = -log10(1 - ρhv), is defined, which does have Gaussian error statistics and a standard deviation that depends only on the number of independent radar pulses. This is verified using observations of spherical drizzle drops, allowing, for the first time, the construction of rigorous confidence intervals in estimates of ρhv. In addition, we demonstrate how the imperfect co-location of the horizontal and vertical polarisation sample volumes may be accounted for. The possibility of using L to estimate the dispersion parameter (µ) in the gamma drop size distribution is investigated. We find that including drop oscillations is essential for this application, otherwise there could be biases in retrieved µ of up to ~8. Preliminary results in rainfall are presented. In a convective rain case study, our estimates show µ to be substantially larger than 0 (an exponential DSD). In this particular rain event, rain rate would be overestimated by up to 50% if a simple exponential DSD is assumed.
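As an illustration of how the transform described above could be used, the Python sketch below (numpy assumed) converts a ρhv estimate to L = -log10(1 - ρhv), builds a Gaussian confidence interval in L-space, and maps it back to ρhv. The value of sigma_L is a placeholder: the abstract states that it depends only on the number of independent radar pulses, but the formula is not reproduced here.

import numpy as np

def rho_to_L(rho_hv):
    # Transform a co-polar correlation estimate to L = -log10(1 - rho_hv)
    return -np.log10(1.0 - rho_hv)

def L_to_rho(L):
    # Invert the transform
    return 1.0 - 10.0 ** (-L)

def rho_confidence_interval(rho_hat, sigma_L, n_sigma=1.96):
    # Confidence interval on rho_hv built in L-space, where errors are Gaussian.
    # sigma_L is the standard deviation of L; in the paper it depends only on the
    # number of independent radar pulses (that formula is not reproduced here).
    L_hat = rho_to_L(rho_hat)
    return L_to_rho(L_hat - n_sigma * sigma_L), L_to_rho(L_hat + n_sigma * sigma_L)

# Example: a drizzle measurement with rho_hv = 0.995 and an assumed sigma_L = 0.05
print(rho_confidence_interval(0.995, 0.05))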

Relevance: 90.00%

Abstract:

The translation of an ensemble of model runs into a probability distribution is a common task in model-based prediction. Common methods for such ensemble interpretations proceed as if verification and ensemble were draws from the same underlying distribution, an assumption not viable for most, if any, real world ensembles. An alternative is to consider an ensemble as merely a source of information rather than the possible scenarios of reality. This approach, which looks for maps between ensembles and probability distributions, is investigated and extended. Common methods are revisited, and an improvement to standard kernel dressing, called ‘affine kernel dressing’ (AKD), is introduced. AKD assumes an affine mapping between ensemble and verification that typically acts not on individual ensemble members but on the entire ensemble as a whole; the parameters of this mapping are determined in parallel with the other dressing parameters, including a weight assigned to the unconditioned (climatological) distribution. These amendments to standard kernel dressing, albeit simple, can improve performance significantly and are shown to be appropriate for both overdispersive and underdispersive ensembles, unlike standard kernel dressing, which exacerbates overdispersion. Studies are presented using operational numerical weather predictions for two locations and data from the Lorenz63 system, demonstrating both effectiveness given operational constraints and statistical significance given a large sample.
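A minimal sketch of the dressed mixture density described above, assuming Gaussian kernels and a Gaussian climatological component. The parameter names (a, r, s, alpha) and the illustrative values are assumptions of this sketch; in the paper the dressing parameters are fitted jointly (e.g. by minimising a proper score), which is not shown.

import numpy as np
from scipy.stats import norm

def akd_density(y, ensemble, a, r, s, alpha, clim_mean, clim_std):
    # Affine-kernel-dressed predictive density evaluated at y. Each member x_i is
    # mapped to a + r*x_i and dressed with a Gaussian kernel of width s; the result
    # is blended with a climatological Gaussian using weight alpha.
    z = a + r * np.asarray(ensemble)
    kernel_part = norm.pdf(y, loc=z, scale=s).mean()
    clim_part = norm.pdf(y, loc=clim_mean, scale=clim_std)
    return (1.0 - alpha) * kernel_part + alpha * clim_part

# Example: a 5-member ensemble forecast of temperature
ens = [11.2, 12.5, 12.9, 13.4, 14.8]
print(akd_density(12.0, ens, a=0.3, r=0.95, s=1.1, alpha=0.1,
                  clim_mean=10.0, clim_std=4.0))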

Relevance: 90.00%

Abstract:

This paper proposes a method for describing the distribution of observed temperatures on any day of the year such that the distribution, and summary statistics of interest derived from it, vary smoothly through the year. The method removes the noise inherent in calculating summary statistics directly from the data, thus easing comparisons of distributions and summary statistics between different periods. The method is demonstrated using daily effective temperatures (DET) derived from observations of temperature and wind speed at De Bilt, the Netherlands. Distributions and summary statistics are obtained for 1985 to 2009 and compared to the period 1904–1984. A two-stage process first obtains parameters of a theoretical probability distribution, in this case the generalized extreme value (GEV) distribution, which describes the distribution of DET on any day of the year. Second, linear models describe seasonal variation in the parameters. Model predictions provide parameters of the GEV distribution, and therefore summary statistics, that vary smoothly through the year. There is evidence of an increasing mean temperature, a decrease in the variability of temperatures mainly in the winter, and more positive skew (more warm days) in the summer. In the winter, the 2% point, the value below which 2% of observations are expected to fall, has risen by 1.2 °C; in the summer the 98% point has risen by 0.8 °C. Medians have risen by 1.1 and 0.9 °C in winter and summer, respectively. The method can be used to describe distributions of future climate projections and other climate variables. Further extensions to the methodology are suggested.
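A simplified Python rendering of the two-stage idea, assuming doy and det are numpy arrays of day-of-year and DET values: stage one fits a GEV to values in a window around each day, stage two smooths each parameter with a low-order harmonic regression, after which any quantile (e.g. the 2% or 98% point) varies smoothly through the year. The window width, harmonic order, and use of a circular window are assumptions of this sketch, not the paper's exact linear models.

import numpy as np
from scipy.stats import genextreme

def seasonal_gev_quantiles(doy, det, q=0.98, half_window=15, n_harm=2):
    # Stage 1: fit a GEV to values within +/- half_window days of each day of year.
    # Stage 2: smooth each fitted parameter with a harmonic regression in day of year.
    # Returns the smoothed q-quantile for days 1..365.
    days = np.arange(1, 366)
    params = []
    for d in days:
        sel = np.abs(((doy - d + 182) % 365) - 182) <= half_window
        c, loc, scale = genextreme.fit(det[sel])
        params.append((c, loc, scale))
    params = np.array(params)

    t = 2 * np.pi * days / 365.0
    X = np.column_stack([np.ones_like(t)] +
                        [f(k * t) for k in range(1, n_harm + 1) for f in (np.sin, np.cos)])
    smooth = X @ np.linalg.lstsq(X, params, rcond=None)[0]
    c_s, loc_s, scale_s = smooth.T
    return genextreme.ppf(q, c_s, loc=loc_s, scale=scale_s)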

Relevance: 80.00%

Abstract:

A stochastic parameterization scheme for deep convection is described, suitable for use in both climate and NWP models. Theoretical arguments and the results of cloud-resolving models are discussed in order to motivate the form of the scheme. In the deterministic limit, it tends to a spectrum of entraining/detraining plumes and is similar to other current parameterizations. The stochastic variability describes the local fluctuations about a large-scale equilibrium state. Plumes are drawn at random from a probability distribution function (pdf) that defines the chance of finding a plume of given cloud-base mass flux within each model grid box. The normalization of the pdf is given by the ensemble-mean mass flux, which is computed with a CAPE closure method. The characteristics of each plume produced are determined using an adaptation of the plume model from the Kain-Fritsch parameterization. Initial tests in the single-column version of the Unified Model verify that the scheme is effective in producing the desired distributions of convective variability without adversely affecting the mean state.
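A hedged sketch of the core sampling step only: given an ensemble-mean mass flux from a CAPE-type closure, individual plume cloud-base mass fluxes are drawn at random for one grid box. The exponential form of the pdf, the Poisson plume count, and the parameter values are assumptions of this illustration rather than the scheme's actual pdf.

import numpy as np

def sample_plumes(mean_total_flux, mean_plume_flux, rng=np.random.default_rng()):
    # Draw a random set of plume cloud-base mass fluxes for one grid box.
    # mean_total_flux: ensemble-mean mass flux from the closure (normalises the pdf).
    # mean_plume_flux: mean flux of an individual plume (assumed).
    # Plume fluxes are sampled from an exponential pdf (an assumed form) and the
    # number of plumes is Poisson, so the expected total matches the closure value.
    expected_n = mean_total_flux / mean_plume_flux
    n_plumes = rng.poisson(expected_n)
    return rng.exponential(mean_plume_flux, size=n_plumes)

fluxes = sample_plumes(mean_total_flux=0.05, mean_plume_flux=0.002)
print(len(fluxes), fluxes.sum())   # number of plumes and their total mass flux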

Relevance: 80.00%

Abstract:

Many time series are measured monthly, either as averages or totals, and such data often exhibit seasonal variability: the values of the series are consistently larger for some months of the year than for others. A typical series of this type is the number of deaths each month attributed to SIDS (Sudden Infant Death Syndrome). Seasonality can be modelled in a number of ways. This paper describes and discusses various methods for modelling seasonality in SIDS data, though much of the discussion is relevant to other seasonally varying data. There are two main approaches: either fitting a circular probability distribution to the data, or using regression-based techniques to model the mean seasonal behaviour. Both are discussed in this paper.
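A minimal sketch of the regression-based approach, assuming monthly counts: the mean seasonal behaviour is modelled with a harmonic (sin/cos) regression on month of year. For count data a Poisson GLM would be more appropriate; ordinary least squares is used here only to keep the sketch dependency-free, and the synthetic example data are illustrative.

import numpy as np

def fit_seasonal_mean(months, counts, n_harm=1):
    # Fit mean seasonal behaviour of monthly counts with a harmonic regression.
    # months: 1..12 for each observation; counts: the observed monthly totals.
    t = 2 * np.pi * np.asarray(months) / 12.0
    X = np.column_stack([np.ones_like(t)] +
                        [f(k * t) for k in range(1, n_harm + 1) for f in (np.sin, np.cos)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(counts, dtype=float), rcond=None)
    return beta, X @ beta    # coefficients and fitted seasonal means

# Example: five years of synthetic monthly counts with a winter peak
rng = np.random.default_rng(1)
months = np.tile(np.arange(1, 13), 5)
counts = rng.poisson(20 + 8 * np.cos(2 * np.pi * months / 12.0))
beta, fitted = fit_seasonal_mean(months, counts)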

Relevance: 80.00%

Abstract:

This study investigates the response of the wintertime North Atlantic Oscillation (NAO) to increasing concentrations of atmospheric carbon dioxide (CO2) as simulated by 18 global coupled general circulation models that participated in phase 2 of the Coupled Model Intercomparison Project (CMIP2). NAO has been assessed in control and transient 80-year simulations produced by each model under constant forcing and under concentrations of CO2 increasing at 1% per year, respectively. Although generally able to simulate the main features of NAO, the majority of models overestimate the observed mean wintertime NAO index of 8 hPa by 5-10 hPa. Furthermore, none of the models, in either the control or perturbed simulations, are able to reproduce decadal trends as strong as that seen in the observed NAO index from 1970 to 1995. Of the 15 models able to simulate the NAO pressure dipole, 13 predict an increase in NAO with increasing CO2 concentrations. The magnitude of the response is generally small and highly model-dependent, which leads to large uncertainty in multi-model estimates such as the median estimate of 0.0061 ± 0.0036 hPa per %CO2. Although an increase of 0.61 hPa in NAO for a doubling in CO2 represents only a relatively small shift of 0.18 standard deviations in the probability distribution of winter mean NAO, this can cause large relative increases in the probabilities of extreme values of NAO associated with damaging impacts. Despite the large differences in NAO responses, the models robustly predict similar statistically significant changes in winter mean temperature (warmer over most of Europe) and precipitation (an increase over Northern Europe). Although these changes present a pattern similar to that expected due to an increase in the NAO index, linear regression is used to show that the response is much greater than can be attributed to small increases in NAO. NAO trends are therefore not the key contributor to model-predicted climate change in wintertime mean temperature and precipitation over Europe and the Mediterranean region. However, the models' inability to capture the observed decadal variability in NAO might also signify a major deficiency in their ability to simulate the NAO-related responses to climate change.
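A short arithmetic check of the point about extremes, using only the numbers quoted above: a 0.61 hPa shift corresponding to 0.18 standard deviations implies a standard deviation near 3.4 hPa, and assuming (for illustration only) a Gaussian winter-mean NAO, the probability of exceeding an arbitrary 2-standard-deviation threshold increases noticeably in relative terms even though the mean shift is small.

from scipy.stats import norm

sd = 0.61 / 0.18            # implied standard deviation of winter-mean NAO, ~3.4 hPa
shift = 0.61                # shift for a doubling of CO2 (0.0061 hPa per %CO2 x 100%)
threshold = 2.0 * sd        # an illustrative "extreme" NAO anomaly, 2 sd above the mean

p_before = norm.sf(threshold, loc=0.0, scale=sd)
p_after = norm.sf(threshold, loc=shift, scale=sd)
print(p_before, p_after, p_after / p_before)   # small mean shift, larger relative change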

Relevance: 80.00%

Abstract:

We use geomagnetic activity data to study the rise and fall over the past century of the solar wind flow speed VSW, the interplanetary magnetic field strength B, and the open solar flux FS. Our estimates include allowance for the kinematic effect of longitudinal structure in the solar wind flow speed. As well as solar cycle variations, all three parameters show a long-term rise during the first half of the 20th century followed by peaks around 1955 and 1986 and then a recent decline. Cosmogenic isotope data reveal that this constitutes a grand maximum of solar activity which began in 1920, using the definition that grand maxima are periods when 25-year averages of the heliospheric modulation potential exceed 600 MV. Extrapolating the linear declines seen in all three parameters since 1985 yields predictions that the grand maximum will end in the years 2013, 2014, or 2027 using VSW, FS, or B, respectively. These estimates are consistent with predictions based on the probability distribution of the durations of past grand solar maxima seen in cosmogenic isotope data. The data contradict any suggestions of a floor to the open solar flux: we show that the solar minimum open solar flux, kinematically corrected to allow for the excess flux effect, has halved over the past two solar cycles.
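A sketch of the extrapolation step only: fit a straight line to post-1985 annual values of one parameter and solve for the year at which it reaches a given threshold. The annual values and the threshold below are placeholders (the paper defines the grand maximum via the heliospheric modulation potential, and its actual series and thresholds are not reproduced here).

import numpy as np

def end_year_from_trend(years, values, threshold):
    # Fit a linear trend and extrapolate to the year in which the parameter
    # falls to the given threshold value.
    slope, intercept = np.polyfit(years, values, 1)
    return (threshold - intercept) / slope

# Placeholder annual values of a declining solar-wind parameter (illustrative only)
years = np.arange(1985, 2010)
values = 7.0 - 0.05 * (years - 1985)      # e.g. a field strength declining ~0.05 nT/yr
print(end_year_from_trend(years, values, threshold=5.6))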

Relevance: 80.00%

Abstract:

A method was developed to evaluate crop disease predictive models for their economic and environmental benefits. Benefits were quantified as the value of a prediction, measured by costs saved and fungicide dose saved. The value of prediction was defined as the net gain made by using predictions, measured as the difference between a scenario where predictions are available and used and a scenario without prediction. Comparable 'with' and 'without' scenarios were created with the use of risk levels. These risk levels were derived from a probability distribution fitted to observed disease severities. These distributions were used to calculate the probability that a certain disease-induced economic loss was incurred. The method was exemplified by using it to evaluate a model developed for Mycosphaerella graminicola risk prediction. Based on the value of prediction, the tested model may have economic and environmental benefits to growers if used to guide treatment decisions on resistant cultivars. It is shown that the value of prediction measured by fungicide dose saved and costs saved is constant with the risk level. The method could also be used to evaluate similar crop disease predictive models.
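A hedged Monte Carlo sketch of the with/without comparison: disease severity is drawn from a fitted distribution, and the value of prediction is the difference in expected cost between a grower who treats only when the (here assumed perfect) prediction flags high risk and one who always treats. The beta severity distribution, cost figures, threshold, and the perfect-prediction assumption are all placeholders, not the paper's model.

import numpy as np

def value_of_prediction(severity_dist, treat_cost, loss_per_severity, risk_threshold,
                        n=100_000, rng=np.random.default_rng(0)):
    # Monte Carlo estimate of the value of a (here perfect) disease prediction.
    # Without prediction: always treat. With prediction: treat only when severity
    # would exceed risk_threshold. Returns costs saved per field per season.
    severity = severity_dist(rng, n)
    cost_without = np.full(n, treat_cost)                      # routine treatment
    treat = severity > risk_threshold
    cost_with = np.where(treat, treat_cost, loss_per_severity * severity)
    return cost_without.mean() - cost_with.mean()

# Example: untreated severity (loss fraction) assumed to follow a beta distribution
sev = lambda rng, n: rng.beta(2.0, 8.0, size=n)
print(value_of_prediction(sev, treat_cost=60.0, loss_per_severity=400.0,
                          risk_threshold=0.15))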

Relevance: 80.00%

Abstract:

A means of assessing, monitoring and controlling aggregate emissions from multi-instrument Emissions Trading Schemes is proposed. The approach allows contributions from different instruments with different forms of emissions targets to be integrated. Where Emissions Trading Schemes are helping meet specific national targets, the approach allows the entry requirements of new participants to be calculated and set at a level that will achieve these targets. The approach is multi-levelled, and may be extended downwards to support pooling of participants within instruments, or upwards to embed Emissions Trading Schemes within a wider suite of policies and measures with hard and soft targets. Aggregate emissions from each instrument are treated stochastically. Emissions from the scheme as a whole are then the joint probability distribution formed by integrating the emissions from its instruments. Because a Bayesian approach is adopted, qualitative and semi-qualitative data from expert opinion can be used where quantitative data are not currently available or are incomplete. This approach helps governments retain sufficient control over emissions trading scheme targets to allow them to meet their emissions reduction obligations, while minimising the need for retrospectively adjusting existing participants’ conditions of entry. This maintains participant confidence, while providing the necessary policy levers for good governance.
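A minimal Monte Carlo sketch of the aggregation step: each instrument's aggregate emissions are represented by a probability distribution (here lognormal, with mean/sd pairs standing in for data or elicited expert opinion), and the scheme-wide distribution is the distribution of their sum, from which the probability of exceeding a national target follows. The distributional form, parameter values, and target are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Each instrument's annual emissions (MtCO2e) as a lognormal distribution; the
# (mean, sd) pairs stand in for quantitative data or elicited expert opinion.
instruments = [(10.0, 1.5), (4.0, 1.0), (2.5, 0.8)]

def lognormal_from_mean_sd(mean, sd, size):
    # Convert a target mean and standard deviation to lognormal (mu, sigma) parameters
    sigma2 = np.log(1.0 + (sd / mean) ** 2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mu, np.sqrt(sigma2), size)

total = sum(lognormal_from_mean_sd(m, s, n) for m, s in instruments)

target = 18.0                      # illustrative national cap in MtCO2e
print("P(scheme exceeds target) =", (total > target).mean())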

Relevance: 80.00%

Abstract:

In real-world environments it is usually difficult to specify the quality of a preventive maintenance (PM) action precisely. This uncertainty makes it problematic to optimise maintenance policy. This problem is tackled in this paper by assuming that the quality of a PM action is a random variable following a probability distribution. Two frequently studied PM models, a failure rate PM model and an age reduction PM model, are investigated. The optimal PM policies are presented and optimised. Numerical examples are also given.
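A hedged sketch of an age-reduction PM model with random maintenance quality: failures follow a power-law (Weibull-type) intensity with minimal repair, each PM multiplies the effective age by a random quality factor q (q = 0 perfect, q = 1 no effect), and the long-run cost rate is estimated by Monte Carlo and minimised over the PM interval by grid search. The intensity form, cost figures, and the distribution of q are assumptions of this illustration, not the paper's formulation.

import numpy as np

def expected_cost_rate(T, beta, eta, c_pm, c_fail, q_samples, n_pm=50):
    # Average cost per unit time under age-reduction PM with random quality.
    # Failures are minimally repaired with cumulative intensity (t/eta)**beta; the
    # k-th PM resets the effective age to q * (age + T), with q drawn at random.
    Lambda = lambda t: (t / eta) ** beta
    rates = []
    for q_path in q_samples:                      # one Monte Carlo path of PM qualities
        age, failures = 0.0, 0.0
        for q in q_path[:n_pm]:
            failures += Lambda(age + T) - Lambda(age)
            age = q * (age + T)
        rates.append((n_pm * c_pm + failures * c_fail) / (n_pm * T))
    return np.mean(rates)

rng = np.random.default_rng(0)
q_samples = rng.uniform(0.0, 0.4, size=(200, 50))       # assumed quality distribution
grid = np.linspace(0.5, 10.0, 40)
costs = [expected_cost_rate(T, beta=2.5, eta=8.0, c_pm=1.0, c_fail=10.0,
                            q_samples=q_samples) for T in grid]
print("optimal PM interval ~", grid[int(np.argmin(costs))])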

Relevance: 80.00%

Abstract:

A new Bayesian algorithm for retrieving surface rain rate from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) over the ocean is presented, along with validations against estimates from the TRMM Precipitation Radar (PR). The Bayesian approach offers a rigorous basis for optimally combining multichannel observations with prior knowledge. While other rain-rate algorithms have been published that are based at least partly on Bayesian reasoning, this is believed to be the first self-contained algorithm that fully exploits Bayes’s theorem to yield not just a single rain rate, but rather a continuous posterior probability distribution of rain rate. To advance the understanding of the theoretical benefits of the Bayesian approach, sensitivity analyses have been conducted based on two synthetic datasets for which the “true” conditional and prior distributions are known. Results demonstrate that even when the prior and conditional likelihoods are specified perfectly, biased retrievals may occur at high rain rates. This bias is not the result of a defect of the Bayesian formalism, but rather represents the expected outcome when the physical constraint imposed by the radiometric observations is weak owing to saturation effects. It is also suggested that both the choice of the estimators and the prior information are crucial to the retrieval. In addition, the performance of the Bayesian algorithm herein is found to be comparable to that of other benchmark algorithms in real-world applications, while having the additional advantage of providing a complete continuous posterior probability distribution of surface rain rate.
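A grid-based sketch of the Bayes step: a prior over rain rate is combined with a (here Gaussian) likelihood of an observed brightness temperature given rain rate, yielding a full posterior from which different estimators (posterior mean, mode) can be read off. The forward model, its saturation behaviour, the prior shape, and the observation error are placeholders; the saturating forward model also illustrates why the constraint weakens, and the posterior broadens, at high rain rates.

import numpy as np
from scipy.stats import norm

rain = np.linspace(0.0, 50.0, 501)                   # rain-rate grid (mm/h)
prior = np.exp(-rain / 5.0); prior /= prior.sum()    # assumed climatological prior

def forward_tb(r):
    # Placeholder forward model: brightness temperature saturating at high rain rate
    return 180.0 + 100.0 * (1.0 - np.exp(-r / 15.0))

tb_obs, tb_noise = 255.0, 4.0                        # observed Tb (K) and its error sd
likelihood = norm.pdf(tb_obs, loc=forward_tb(rain), scale=tb_noise)

posterior = prior * likelihood
posterior /= posterior.sum()

post_mean = (rain * posterior).sum()
post_mode = rain[np.argmax(posterior)]
print(post_mean, post_mode)       # different estimators from the same posterior pdf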

Relevance: 80.00%

Abstract:

Rainfall can be modeled as a spatially correlated random field superimposed on a background mean value; therefore, geostatistical methods are appropriate for the analysis of rain gauge data. Nevertheless, there are certain typical features of these data that must be taken into account to produce useful results, including the generally non-Gaussian mixed distribution, the inhomogeneity and low density of observations, and the temporal and spatial variability of spatial correlation patterns. Many studies show that rigorous geostatistical analysis performs better than other available interpolation techniques for rain gauge data. Important elements are the use of climatological variograms and the appropriate treatment of rainy and nonrainy areas. Benefits of geostatistical analysis for rainfall include ease of estimating areal averages, estimation of uncertainties, and the possibility of using secondary information (e.g., topography). Geostatistical analysis also facilitates the generation of ensembles of rainfall fields that are consistent with a given set of observations, allowing for a more realistic exploration of errors and their propagation in downstream models, such as those used for agricultural or hydrological forecasting. This article provides a review of geostatistical methods used for kriging, exemplified where appropriate by daily rain gauge data from Ethiopia.
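A minimal ordinary-kriging sketch under an exponential climatological variogram, solving the kriging system for the weights and the kriging variance at one target point. The variogram parameters, gauge coordinates, and rainfall values are illustrative; the fitting of climatological variograms, the rainy/non-rainy treatment, and the use of secondary information discussed in the review are not shown.

import numpy as np

def exp_variogram(h, nugget=0.1, sill=1.0, rng_par=50.0):
    # Exponential climatological variogram (h in km); parameters are assumed
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / rng_par))

def ordinary_kriging(xy_obs, z_obs, xy_tgt):
    # Ordinary kriging of gauge values z_obs at locations xy_obs to one target point
    n = len(z_obs)
    d_obs = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d_obs)
    np.fill_diagonal(A[:n, :n], 0.0)          # gamma(0) = 0 (exact interpolation)
    A[n, n] = 0.0
    d_tgt = np.linalg.norm(xy_obs - xy_tgt, axis=-1)
    b = np.append(exp_variogram(d_tgt), 1.0)
    w = np.linalg.solve(A, b)
    estimate = w[:n] @ z_obs
    kriging_var = w @ b                       # includes the Lagrange multiplier term
    return estimate, kriging_var

xy = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 40.0], [60.0, 60.0]])   # gauges (km)
z = np.array([12.0, 8.0, 0.0, 3.0])                                     # daily rain (mm)
print(ordinary_kriging(xy, z, np.array([20.0, 20.0])))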

Relevance: 80.00%

Abstract:

We present extensive molecular dynamics simulations of the dynamics of diluted long probe chains entangled with a matrix of shorter chains. The chain lengths of both components are above the entanglement strand length, and the ratio of their lengths is varied over a wide range to cover the crossover from the chain reptation regime to the tube Rouse motion regime of the long probe chains. Reducing the matrix chain length results in a faster decay of the dynamic structure factor of the probe chains, in good agreement with recent neutron spin echo experiments. The diffusion of the long chains, measured by the mean square displacements of the monomers and of the centers of mass of the chains, demonstrates a systematic speed-up relative to the pure reptation behavior expected for monodisperse melts of sufficiently long polymers. On the other hand, the diffusion of the matrix chains is only weakly perturbed by the diluted long probe chains. The simulation results are qualitatively consistent with the theoretical predictions based on the constraint-release Rouse model, but a detailed comparison reveals the existence of a broad distribution of the disentanglement rates, which is partly confirmed by an analysis of the packing and diffusion of the matrix chains in the tube region of the probe chains. A coarse-grained simulation model based on the tube Rouse motion model, with incorporation of the probability distribution of the tube segment jump rates, is developed and shows results qualitatively consistent with the fine-scale molecular dynamics simulations. However, we observe a breakdown in the tube Rouse model when the short chain length is decreased to around N_S = 80, which is roughly 3.5 times the entanglement spacing N_e^(P) = 23. The location of this transition may be sensitive to the chain bending potential used in our simulations.
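A toy sketch of the coarse-grained idea: a Rouse-like chain evolved with overdamped Langevin dynamics in which each segment's friction (inverse jump rate) is drawn from a broad distribution, so the monomer mean-square displacement reflects a spread of local rates. The lognormal rate distribution, chain length, and all numerical parameters are assumptions for illustration, not the paper's coarse-grained model.

import numpy as np

rng = np.random.default_rng(3)
N, n_steps, dt, k, kT = 64, 20000, 0.01, 1.0, 1.0

# Per-segment friction drawn from a broad (lognormal) distribution, standing in
# for a distribution of tube-segment jump rates.
zeta = rng.lognormal(mean=0.0, sigma=1.0, size=(N, 1))

x = np.zeros((N, 3))
x0 = x.copy()
msd = np.zeros(n_steps)
for step in range(n_steps):
    # Harmonic spring forces between neighbouring segments (free chain ends)
    f = np.zeros_like(x)
    bond = k * (x[1:] - x[:-1])
    f[:-1] += bond
    f[1:] -= bond
    # Euler-Maruyama step of overdamped Langevin dynamics with per-segment friction
    noise = rng.normal(size=x.shape) * np.sqrt(2.0 * kT * dt / zeta)
    x = x + dt * f / zeta + noise
    msd[step] = np.mean(np.sum((x - x0) ** 2, axis=1))

print(msd[::5000])    # monomer mean-square displacement at a few times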

Relevance: 80.00%

Abstract:

Valuation is the process of estimating price. The methods used to determine value attempt to model the thought processes of the market and thus estimate price by reference to observed historic data. This can be done using either an explicit model, which models the worth calculation of the most likely bidder, or an implicit model, which uses historic data, suitably adjusted, as a shortcut to determine value by reference to previous similar sales. The former is generally referred to as the Discounted Cash Flow (DCF) model and the latter as the capitalisation (or All Risk Yield) model. However, regardless of the technique used, the valuation will be affected by uncertainties: uncertainty in the comparable data available, uncertainty in the current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the estimate of price. In a previous paper, we considered the way in which uncertainty is allowed for in the capitalisation model in the UK. In this paper, we extend the analysis to look at the way in which uncertainty can be incorporated into the explicit DCF model. This is done by recognising that the input variables are uncertain and will each have a probability distribution pertaining to them. Thus, by utilising a probability-based valuation model (using Crystal Ball) it is possible to incorporate uncertainty into the analysis and address the shortcomings of the current model. Although the capitalisation model is discussed, the paper concentrates upon the application of Crystal Ball to the Discounted Cash Flow approach.
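A Python analogue of the Monte Carlo idea described (the paper uses Crystal Ball within a spreadsheet): uncertain DCF inputs are given probability distributions and the valuation is run many times, producing a distribution of value rather than a single figure. The chosen distributions, rent level, holding period, and exit treatment are illustrative assumptions only.

import numpy as np

rng = np.random.default_rng(7)
n_sims, years, rent0 = 10_000, 5, 100_000.0       # 5-year explicit DCF, initial rent p.a.

# Uncertain inputs expressed as probability distributions (illustrative choices)
growth = rng.normal(0.02, 0.01, n_sims)           # annual rental growth
exit_yield = rng.normal(0.065, 0.005, n_sims)     # capitalisation rate at exit
disc = rng.normal(0.08, 0.01, n_sims)             # discount rate

values = np.zeros(n_sims)
for t in range(1, years + 1):
    values += rent0 * (1 + growth) ** t / (1 + disc) ** t
exit_value = rent0 * (1 + growth) ** years / exit_yield
values += exit_value / (1 + disc) ** years

print(np.percentile(values, [5, 50, 95]))          # a distribution of value, not a point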

Relevance: 80.00%

Abstract:

In this paper, sequential importance sampling is used to assess the impact of observations on an ensemble prediction for the decadal path transitions of the Kuroshio Extension (KE). This particle filtering approach gives access to the probability density of the state vector, which allows us to determine the predictive power, an entropy-based measure, of the ensemble prediction. The proposed set-up makes use of an ensemble that, at each time, samples the climatological probability distribution. Then, in a post-processing step, the impact of different sets of observations is measured by the increase in predictive power of the ensemble over the climatological signal during one year. The method is applied in an identical-twin experiment for the Kuroshio Extension using a reduced-gravity shallow-water model. We investigate the impact of assimilating velocity observations from different locations during the elongated and the contracted meandering states of the KE. Optimal observation locations correspond to regions with strong potential vorticity gradients. For the elongated state the optimal location is in the first meander of the KE. During the contracted state of the KE it is located south of Japan, where the Kuroshio separates from the coast.
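A hedged sketch of the post-processing step: climatological ensemble members are weighted by the likelihood of a set of velocity observations (one importance-sampling analysis step, without resampling), and predictive power is taken as the relative entropy of the weighted ensemble with respect to the equal-weight climatological ensemble. The Gaussian observation error model, the identity observation operator, and the synthetic numbers are assumptions of this illustration.

import numpy as np

def importance_weights(ensemble_obs_equiv, obs, obs_std):
    # Weights proportional to a Gaussian likelihood of the observations given each
    # member's observation-equivalent values (single sequential importance sampling step)
    misfit = ensemble_obs_equiv - obs                       # shape (n_members, n_obs)
    logw = -0.5 * np.sum((misfit / obs_std) ** 2, axis=1)
    w = np.exp(logw - logw.max())
    return w / w.sum()

def predictive_power(weights):
    # Relative entropy of the weighted ensemble w.r.t. equal (climatological) weights
    n = len(weights)
    w = weights[weights > 0]
    return float(np.sum(w * np.log(w * n)))

# Example: 500 climatological members, 3 velocity observations with 0.1 m/s error
rng = np.random.default_rng(5)
ens = rng.normal(0.0, 0.5, size=(500, 3))
obs = np.array([0.4, -0.2, 0.1])
w = importance_weights(ens, obs, obs_std=0.1)
print(predictive_power(w))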