879 results for estimating conditional probabilities


Relevance: 20.00%

Abstract:

A key aspect in designing an efficient decadal prediction system is ensuring that the uncertainty in the ocean initial conditions is sampled optimally. Here, we consider one strategy to address this issue by investigating the growth of optimal perturbations in the HadCM3 global climate model (GCM). More specifically, climatically relevant singular vectors (CSVs) - the small perturbations which grow most rapidly for a specific initial condition - are estimated for decadal timescales in the Atlantic Ocean. It is found that reliable CSVs can be estimated by running a large ensemble of integrations of the GCM. Amplification of the optimal perturbations occurs for more than 10 years, and possibly up to 40 years. The identified regions for growing perturbations are found to be in the far North Atlantic, and these perturbations cause amplification through an anomalous meridional overturning circulation response. Additionally, this type of analysis potentially informs the design of future ocean observing systems by identifying the sensitive regions where small uncertainties in the ocean state can grow maximally. Although these CSVs are expensive to compute, we identify ways in which the process could be made more efficient in the future.
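The core linear-algebra step behind singular-vector estimation from an ensemble can be sketched in a few lines. The sketch below is a toy stand-in, not the paper's method: a fixed 6-dimensional linear propagator replaces the nonlinear GCM, and all dimensions and values are invented. The idea it illustrates is real, though: fit a linear propagator to ensemble perturbation pairs by least squares, then take its SVD, whose leading right singular vector is the fastest-growing initial perturbation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the GCM: a fixed linear propagator acting on a
# 6-dimensional "ocean state" (the real calculation uses full model fields
# and nonlinear integrations; everything here is invented for illustration).
n_state, n_ens = 6, 40
M_true = 0.5 * rng.normal(size=(n_state, n_state))

X0 = rng.normal(size=(n_state, n_ens))  # ensemble of small initial perturbations
Xt = M_true @ X0                        # the same perturbations after evolution

# Estimate the propagator from the ensemble by least squares, then take its
# SVD: the leading right singular vector is the fastest-growing (optimal)
# initial perturbation, and the leading singular value its amplification.
A, *_ = np.linalg.lstsq(X0.T, Xt.T, rcond=None)  # A approximates M_true.T
U, s, Vt = np.linalg.svd(A.T)
optimal_perturbation = Vt[0]
amplification = s[0]
```

With more ensemble members than state dimensions (and no model noise, as here), the least-squares fit recovers the propagator exactly; in practice the ensemble size and sampling noise limit how many singular vectors are reliable.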

Relevance: 20.00%

Abstract:

A Bayesian method of estimating multivariate sample selection models is introduced and applied to the estimation of a demand system for food in the UK to account for censoring arising from infrequency of purchase. We show how it is possible to impose identifying restrictions on the sample selection equations and that, unlike in a maximum likelihood framework, imposing adding-up restrictions at both the latent and observed levels is straightforward. Our results emphasise the role played by low incomes and socio-economic circumstances in leading to poor diets and also indicate that the presence of children in a household has a negative impact on dietary quality.
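The data-augmentation idea underlying Bayesian estimation of censored demand models can be illustrated on a deliberately simplified case. The sketch below is a univariate Tobit Gibbs sampler with known error variance and a flat prior, not the paper's multivariate selection system; all data and parameter values are invented. Censored observations are augmented by drawing the latent demand from a truncated normal, after which the regression coefficients have a standard conditional posterior.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(2)

# Hypothetical censored purchase data: latent demand y* = b0 + b1*x + eps,
# observed as y = max(y*, 0), with zeros arising from infrequency of purchase.
n = 500
x = rng.uniform(0.0, 2.0, n)
sigma = 0.5                                  # treated as known for brevity
y_star = -0.5 + 1.0 * x + rng.normal(0.0, sigma, n)
y = np.maximum(y_star, 0.0)
cens = y == 0.0
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)

# Gibbs sampler with data augmentation: censored latent demands are redrawn
# from a normal truncated above at zero, then beta is drawn from its
# conditional posterior under a flat prior.
beta = np.zeros(2)
z = y.copy()
draws = []
for it in range(600):
    mu_c = X[cens] @ beta
    z[cens] = truncnorm.rvs(-np.inf, (0.0 - mu_c) / sigma,
                            loc=mu_c, scale=sigma, random_state=rng)
    beta_hat = XtX_inv @ (X.T @ z)
    beta = rng.multivariate_normal(beta_hat, sigma**2 * XtX_inv)
    if it >= 300:                            # discard burn-in
        draws.append(beta)

post_mean = np.mean(draws, axis=0)           # should be near (-0.5, 1.0)
```

The multivariate version in the paper additionally handles the selection equations and cross-equation adding-up restrictions, but the augment-then-draw structure is the same.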

Relevance: 20.00%

Abstract:

Observations of a chemical at a point in the atmosphere typically show sudden transitions between episodes of high and low concentration. Often these are associated with a rapid change in the origin of air arriving at the site. Lagrangian chemical models riding along trajectories can reproduce such transitions, but small timing errors from trajectory phase errors dramatically reduce the correlation between modeled concentrations and observations. Here the origin averaging technique is introduced to obtain maps of average concentration as a function of air mass origin for the East Atlantic Summer Experiment 1996 (EASE96, a ground-based chemistry campaign). These maps are used to construct origin averaged time series which enable comparison between a chemistry model and observations with phase errors factored out. The amount of the observed signal explained by trajectory changes can be quantified, as can the systematic model errors as a function of air mass origin. The Cambridge Tropospheric Trajectory model of Chemistry and Transport (CiTTyCAT) can account for over 70% of the observed ozone signal variance during EASE96 when phase errors are side-stepped by origin averaging. The dramatic increase in correlation (from 23% without averaging) cannot be achieved by time averaging. The success of the model is attributed to the strong relationship between changes in ozone along trajectories and their origin and its ability to simulate those changes. The model performs less well for longer-lived chemical constituents because the initial conditions 5 days before arrival are insufficiently well known.
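Mechanically, origin averaging amounts to a group-by: map each origin region to its mean observed concentration, then rebuild the time series from those means so that trajectory timing errors drop out. A minimal sketch with invented data (the origin-region names and concentration levels are illustrative, not EASE96 values):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical observations: hourly ozone at the site, each tagged with a
# discretised back-trajectory origin region (names are invented).
origins = rng.choice(["polar", "atlantic", "european"], size=200)
base = {"polar": 45.0, "atlantic": 30.0, "european": 60.0}
ozone = pd.Series([base[o] for o in origins]) + rng.normal(0.0, 3.0, 200)
df = pd.DataFrame({"origin": origins, "ozone": ozone})

# Origin averaging: mean concentration per origin region, then a reconstructed
# series that depends only on which origin the air came from, not on exactly
# when the transition occurred -- so trajectory phase errors are factored out.
origin_mean = df.groupby("origin")["ozone"].mean()
origin_averaged_series = df["origin"].map(origin_mean)
```

Comparing a model's origin-averaged series against the observed one then isolates systematic model error per origin, as in the paper's comparison of CiTTyCAT with the EASE96 observations.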

Relevance: 20.00%

Abstract:

Many well-established statistical methods in genetics were developed in a climate of severe constraints on computational power. Recent advances in simulation methodology now bring modern, flexible statistical methods within the reach of scientists having access to a desktop workstation. We illustrate the potential advantages now available by considering the problem of assessing departures from Hardy-Weinberg (HW) equilibrium. Several hypothesis tests of HW have been established, as well as a variety of point estimation methods for the parameter which measures departures from HW under the inbreeding model. We propose a computational, Bayesian method for assessing departures from HW, which has a number of important advantages over existing approaches. The method incorporates the effects of uncertainty about the nuisance parameters (the allele frequencies) as well as the boundary constraints on f (which are functions of the nuisance parameters). Results are naturally presented visually, exploiting the graphics capabilities of modern computer environments to allow straightforward interpretation. Perhaps most importantly, the method is founded on a flexible, likelihood-based modelling framework, which can incorporate the inbreeding model if appropriate, but also allows the assumptions of the model to be investigated and, if necessary, relaxed. Under appropriate conditions, information can be shared across loci and, possibly, across populations, leading to more precise estimation. The advantages of the method are illustrated by application both to simulated data and to data analysed by alternative methods in the recent literature.
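A minimal version of such a computation can be sketched with a grid approximation rather than the paper's simulation machinery. The genotype counts below are invented; the genotype probabilities under the inbreeding model (P(AA) = p² + fpq, P(Aa) = 2pq(1−f), P(aa) = q² + fpq) and the boundary constraint that all three must be non-negative are standard.

```python
import numpy as np

# Hypothetical genotype counts at one biallelic locus: (n_AA, n_Aa, n_aa).
counts = np.array([50, 30, 20])

# Grid approximation to the joint posterior of the allele frequency p and the
# inbreeding coefficient f, with flat priors on the valid region.
p = np.linspace(0.01, 0.99, 199)
f = np.linspace(-0.99, 0.99, 199)
P, F = np.meshgrid(p, f)                 # rows index f, columns index p
Q = 1.0 - P

# Genotype probabilities under the inbreeding model.
pAA = P**2 + F * P * Q
pAa = 2.0 * P * Q * (1.0 - F)
paa = Q**2 + F * P * Q

# The boundary constraints on f appear naturally: cells where any genotype
# probability is non-positive get zero posterior mass.
valid = (pAA > 0) & (pAa > 0) & (paa > 0)
loglik = np.where(
    valid,
    counts[0] * np.log(np.clip(pAA, 1e-300, None))
    + counts[1] * np.log(np.clip(pAa, 1e-300, None))
    + counts[2] * np.log(np.clip(paa, 1e-300, None)),
    -np.inf,
)
post = np.exp(loglik - loglik.max())
post /= post.sum()

# Marginal posterior of f, integrating out the nuisance parameter p --
# this is the curve one would plot for visual interpretation.
f_marginal = post.sum(axis=1)
f_post_mean = float((f * f_marginal).sum())
```

For these counts the observed heterozygosity (0.30) is well below the Hardy-Weinberg expectation at the sample allele frequency (about 0.455), so the posterior for f concentrates on clearly positive values.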

Relevance: 20.00%

Abstract:

This article proposes a new model for autoregressive conditional heteroscedasticity and kurtosis. Via a time-varying degrees of freedom parameter, the conditional variance and conditional kurtosis are permitted to evolve separately. The model uses only the standard Student’s t-density and consequently can be estimated simply using maximum likelihood. The method is applied to a set of four daily financial asset return series comprising U.S. and U.K. stocks and bonds, and significant evidence in favor of the presence of autoregressive conditional kurtosis is observed. Various extensions to the basic model are proposed, and we show that the response of kurtosis to good and bad news is not significantly asymmetric.
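The likelihood such a model maximises can be sketched directly. The recursion for the degrees-of-freedom parameter below is an invented illustration (the parameter values and the exact functional form are not the paper's specification); what it shows is the general shape: a Student-t log-density evaluated with a GARCH-updated variance and a separately evolving ν_t that drives the conditional kurtosis.

```python
import numpy as np
from math import lgamma, log, log1p, pi, exp

def neg_loglik(params, r):
    """Negative log-likelihood of a GARCH(1,1) model with Student-t errors
    and a time-varying degrees-of-freedom parameter nu_t.  The nu recursion
    is illustrative only, not the paper's exact specification."""
    omega, alpha, beta, a0, a1, b1 = params
    h = float(np.var(r))      # initial conditional variance
    nu = 8.0                  # initial degrees of freedom (> 2)
    ll = 0.0
    for rt in r:
        z2 = rt * rt
        # Student-t log-density with variance h and df nu (standardised form).
        ll += (lgamma((nu + 1.0) / 2.0) - lgamma(nu / 2.0)
               - 0.5 * log(pi * (nu - 2.0) * h)
               - (nu + 1.0) / 2.0 * log1p(z2 / ((nu - 2.0) * h)))
        # GARCH(1,1) recursion for the conditional variance ...
        h = omega + alpha * z2 + beta * h
        # ... and an analogous recursion for nu, kept above 2 so the
        # conditional variance of the t-density stays finite.
        nu = 2.0 + exp(a0 + a1 * log(nu - 2.0) + b1 * z2 / h)
    return -ll

# Toy evaluation on simulated "returns" (invented parameter values).
rng = np.random.default_rng(4)
returns = rng.normal(0.0, 1.0, 300)
nll = neg_loglik((0.05, 0.05, 0.9, 0.2, 0.5, 0.05), returns)
```

Because only the standard t-density is involved, estimation reduces to minimising this function with any numerical optimiser, which is the simplicity the abstract emphasises.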

Relevance: 20.00%

Abstract:

Research on the topic of liquidity has greatly benefited from the improved availability of data. Researchers have addressed questions regarding the factors that influence bid-ask spreads and the relationship between spreads and risk, return and liquidity. Intra-day data have been used to measure the effective spread and researchers have been able to refine the concepts of liquidity to include the price impact of transactions on a trade-by-trade analysis. The growth in the creation of tax-transparent securities has greatly enhanced the visibility of securitized real estate, and has naturally led to the question of whether the increased visibility of real estate has caused market liquidity to change. Although the growth in the public market for securitized real estate has occurred in international markets, it has not been accompanied by universal publication of transaction data. Therefore this paper develops an aggregate daily data-based test for liquidity and applies the test to US data in order to check for consistency with the results of prior intra-day analysis. If the two approaches produce similar results, we can apply the same technique to markets in which less detailed data are available and offer conclusions on the liquidity of a wider set of markets.
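As one concrete example of a liquidity measure that needs only daily prices, the classic Roll (1984) estimator backs out an effective spread from the negative first-order autocovariance that bid-ask bounce induces in price changes. This is offered as an illustration of the daily-data approach, not as the specific test developed in the paper; the simulated prices and spread are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate daily closing prices under the bid-ask bounce model:
# observed price = efficient price + (spread/2) * trade direction.
true_spread = 0.10
mid = np.cumsum(rng.normal(0.0, 0.02, 2000)) + 50.0  # efficient price walk
q = rng.choice([-1.0, 1.0], size=2000)               # buy/sell indicator
p = mid + (true_spread / 2.0) * q

# Roll's estimator: bounce makes successive price changes negatively
# autocorrelated, and the effective spread is 2 * sqrt(-cov(dp_t, dp_{t-1})).
dp = np.diff(p)
autocov = np.cov(dp[1:], dp[:-1])[0, 1]
roll_spread = 2.0 * np.sqrt(max(-autocov, 0.0))
```

An aggregate daily measure like this can be benchmarked against intra-day effective spreads where both exist, which is the consistency check the paper performs on US data before extending to markets with less detailed data.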

Relevance: 20.00%

Abstract:

Various methods of assessment have been applied to the One Dimensional Time to Explosion (ODTX) apparatus and experiments with the aim of allowing an estimate of the comparative violence of the explosion event to be made. Non-mechanical methods used were a simple visual inspection, measuring the increase in the void volume of the anvils following an explosion and measuring the velocity of the sound produced by the explosion over 1 metre. Mechanical methods used included monitoring piezo-electric devices inserted in the frame of the machine and measuring the rotational velocity of a rotating bar placed on the top of the anvils after it had been displaced by the shock wave. This last method, which resembles original Hopkinson Bar experiments, seemed the easiest to apply and analyse, giving relative rankings of violence and the possibility of the calculation of a “detonation” pressure.

Relevance: 20.00%

Abstract:

Sting jets are transient mesoscale jets of air that descend from the tip of the cloud head towards the top of the boundary layer in severe extratropical cyclones and can lead to damaging surface wind gusts. This recently identified jet is distinct from the well-documented jets associated with the cold and warm conveyor belts. One mechanism proposed for their development is the release of conditional symmetric instability (CSI). Here the spatial distribution and temporal evolution of several CSI diagnostics in four severe storms are analysed. A sting jet has been identified in three of these storms; for comparison, we also analysed one storm that did not have a sting jet, even though it had many of the apparent features of sting-jet storms. The sting-jet storms are distinct from the non-sting-jet storms by having much greater and more extensive conditional instability (CI) and CSI. CSI is released by ascending air parcels in the cloud head in two of the sting-jet storms and by descending air parcels in the other sting-jet storm. By contrast, only weak CI to ascending air parcels is present at the cloud-head tip in the non-sting-jet storm. CSI released by descending air parcels, as diagnosed by decaying downdraught slantwise convective available potential energy (DSCAPE), is collocated with the sting jets in all three sting-jet storms and has a localised maximum in two of them. Consistent evolutions of saturated moist potential vorticity are found. We conclude that CSI release has a role in the generation of the sting jet, that the sting jet may be driven by the release of instability to both ascending and descending parcels, and that DSCAPE could be used as a discriminating diagnostic for the sting jet based on these four case-studies.

Relevance: 20.00%

Abstract:

We consider the finite sample properties of model selection by information criteria in conditionally heteroscedastic models. Recent theoretical results show that certain popular criteria are consistent in that they will select the true model asymptotically with probability 1. To examine the empirical relevance of this property, Monte Carlo simulations are conducted for a set of non-nested data generating processes (DGPs) with the set of candidate models consisting of all types of model used as DGPs. In addition, not only is the best model considered but also those with similar values of the information criterion, called close competitors, thus forming a portfolio of eligible models. To supplement the simulations, the criteria are applied to a set of economic and financial series. In the simulations, the criteria are largely ineffective at identifying the correct model, either as best or a close competitor, the parsimonious GARCH(1, 1) model being preferred for most DGPs. In contrast, asymmetric models are generally selected to represent actual data. This leads to the conjecture that the properties of parameterizations of processes commonly used to model heteroscedastic data are more similar than may be imagined and that more attention needs to be paid to the behaviour of the standardized disturbances of such models, both in simulation exercises and in empirical modelling.
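The selection rule itself is simple to state in code: score each fitted candidate by an information criterion, take the minimiser, and also collect any model within a tolerance of the best as a close competitor. The model names, log-likelihoods, and tolerance below are invented for illustration.

```python
import numpy as np

def information_criterion(loglik, k, n, kind="bic"):
    """AIC or BIC from a maximised log-likelihood, k parameters, n observations."""
    if kind == "aic":
        return -2.0 * loglik + 2.0 * k
    return -2.0 * loglik + k * np.log(n)

# Hypothetical fitted candidates: (name, maximised log-likelihood, #params).
n = 1000
candidates = [("ARCH(1)", -1420.0, 2),
              ("GARCH(1,1)", -1395.0, 3),
              ("GJR-GARCH(1,1)", -1392.5, 4),
              ("EGARCH(1,1)", -1396.5, 4)]

scores = {name: information_criterion(ll, k, n) for name, ll, k in candidates}
best_name = min(scores, key=scores.get)

# Close competitors: models whose criterion value is within a tolerance of
# the best, forming a portfolio of eligible models rather than one winner.
tol = 3.0
portfolio = sorted(name for name, s in scores.items()
                   if s - scores[best_name] <= tol)
```

In the paper's simulations it is precisely this portfolio, not just the single best model, that is checked against the true DGP.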

Relevance: 20.00%

Abstract:

Statistical graphics are a fundamental, yet often overlooked, set of components in the repertoire of data analytic tools. Graphs are quick and efficient, yet simple instruments of preliminary exploration of a dataset to understand its structure and to provide insight into influential aspects of inference such as departures from assumptions and latent patterns. In this paper, we present and assess a graphical device for choosing a method for estimating population size in capture-recapture studies of closed populations. The basic concept is derived from a homogeneous Poisson distribution where the ratios of neighboring Poisson probabilities multiplied by the value of the larger neighbor count are constant. This property extends to the zero-truncated Poisson distribution which is of fundamental importance in capture-recapture studies. In practice however, this distributional property is often violated. The graphical device developed here, the ratio plot, can be used for assessing specific departures from a Poisson distribution. For example, simple contaminations of an otherwise homogeneous Poisson model can be easily detected and a robust estimator for the population size can be suggested. Several robust estimators are developed and a simulation study is provided to give some guidance on which should be used in practice. More systematic departures can also easily be detected using the ratio plot. In this paper, the focus is on Gamma mixtures of the Poisson distribution which leads to a linear pattern (called structured heterogeneity) in the ratio plot. More generally, the paper shows that the ratio plot is monotone for arbitrary mixtures of power series densities.
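The ratio-plot statistic follows directly from the Poisson recurrence (x+1)p_{x+1}/p_x = λ. The sketch below computes it from invented capture-frequency data and uses a median over the ratios as one simple robust choice (the paper develops and compares several robust estimators; this particular combination is just an illustration).

```python
import numpy as np

# Hypothetical capture-recapture frequencies: f[x-1] = number of individuals
# captured exactly x times; the zero-count class is unobserved (truncated).
f = np.array([120.0, 61.0, 20.0, 5.0, 1.0])   # f_1 .. f_5
x = np.arange(1, len(f) + 1)

# Ratio plot statistic: r_x = (x + 1) * f_{x+1} / f_x.  Under a homogeneous
# Poisson model all ratios are (approximately) the Poisson mean lambda; a
# monotone increasing pattern instead signals a mixture (heterogeneity).
r = (x[:-1] + 1) * f[1:] / f[:-1]

# A simple robust estimate of lambda, then of the unobserved zero class via
# r_0 = f_1 / f_0 = lambda, giving a population size estimate.
lam_hat = np.median(r)            # median resists single contaminated cells
f0_hat = f[0] / lam_hat
N_hat = f.sum() + f0_hat
```

Plotting r against x is the graphical device itself: a flat line supports the homogeneous Poisson model, while a straight line with positive slope is the signature of the Gamma-mixed (structured heterogeneity) case discussed in the paper.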