74 results for Empirical distribution function


Relevance: 30.00%

Abstract:

Using the plausible model of activated carbon proposed by Harris and co-workers and grand canonical Monte Carlo simulations, we study the applicability of standard methods for describing adsorption data on microporous carbons widely used in adsorption science. Two carbon structures are studied: one with a narrow distribution of micropores up to 1 nm, and the other with micropores covering a wide range of porosity. For both structures, adsorption isotherms of noble gases (from Ne to Xe), carbon tetrachloride and benzene are simulated. The data obtained are considered in terms of Dubinin-Radushkevich plots. Moreover, for benzene and carbon tetrachloride the temperature invariance of the characteristic curve is also studied. We show that, using simulated data, some empirical relationships obtained from experiment can be successfully recovered. Next we test the applicability of related Dubinin-type models, including the Dubinin-Izotova, Dubinin-Radushkevich-Stoeckli, and Jaroniec-Choma equations. The results obtained demonstrate the limits and applications of the models studied in the field of carbon porosity characterization.
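
As an illustration of how such isotherm data are linearised, the sketch below builds a Dubinin-Radushkevich plot (ln W against ln^2(p0/p)) and reads the limiting micropore volume W0 off the intercept. The isotherm values and the Python implementation are illustrative assumptions, not data or code from the study.

    # Minimal sketch of a Dubinin-Radushkevich plot: ln W vs ln^2(p0/p).
    # The isotherm points below are placeholders, not values from the paper.
    import numpy as np

    p_rel = np.array([1e-5, 1e-4, 1e-3, 1e-2, 1e-1])   # relative pressure p/p0
    W = np.array([0.05, 0.11, 0.18, 0.25, 0.30])        # adsorbed volume, cm^3/g

    x = np.log(1.0 / p_rel) ** 2      # ln^2(p0/p)
    y = np.log(W)

    slope, intercept = np.polyfit(x, y, 1)
    W0 = np.exp(intercept)            # limiting micropore volume
    print(f"W0 = {W0:.3f} cm^3/g, slope = {slope:.3e}")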

Relevance: 30.00%

Abstract:

Ensemble forecasting of nonlinear systems involves the use of a model to run forward a discrete ensemble (or set) of initial states. Data assimilation techniques tend to focus on estimating the true state of the system, even though model error limits the value of such efforts. This paper argues for choosing the initial ensemble in order to optimise forecasting performance rather than estimate the true state of the system. Density forecasting and choosing the initial ensemble are treated as one problem. Forecasting performance can be quantified by some scoring rule. In the case of the logarithmic scoring rule, theoretical arguments and empirical results are presented. It turns out that, if the underlying noise dominates model error, we can diagnose the noise spread.
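
A minimal sketch of how forecasting performance can be quantified with the logarithmic scoring rule: the ensemble is turned into a density with a Gaussian kernel and the score is the negative log density at the verifying observation. The ensemble members, the verification value and the kernel-dressing choice are all illustrative assumptions.

    # Sketch: logarithmic score of an ensemble forecast via a Gaussian
    # kernel density ("dressed" ensemble). Ensemble and verification are made up.
    import numpy as np
    from scipy.stats import gaussian_kde

    ensemble = np.array([0.8, 1.1, 0.9, 1.3, 1.0, 0.7])  # forecast ensemble members
    verification = 1.05                                    # observed outcome

    density = gaussian_kde(ensemble)          # forecast density from the ensemble
    log_score = -np.log(density(verification)[0])
    print(f"logarithmic score: {log_score:.3f}  (lower is better)")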

Relevance: 30.00%

Abstract:

In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems that have a singular measure. Using the block maxima approach described in Faranda et al. [2011], we show numerically that the Extreme Value distribution for these maps can be associated with the Generalised Extreme Value family, where the parameters scale with the information dimension. The numerical analyses are performed on a few low-dimensional maps. For the middle-third Cantor set and the Sierpinski triangle obtained using Iterated Function Systems, the experimental parameters show very good agreement with the theoretical values. For strange attractors such as the Lozi and Hénon maps, a slower convergence to the Generalised Extreme Value distribution is observed. Even in the presence of large statistics, the observed convergence is slower than for maps that have an absolutely continuous invariant measure. Nevertheless, within the computed uncertainty range, the results are in good agreement with the theoretical estimates.
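
The block maxima procedure referred to above can be sketched as follows: the series of an observable along an orbit is split into blocks, block maxima are collected, and a GEV distribution is fitted to them. The orbit below is plain random noise standing in for a trajectory of one of the maps, and the distance-based observable is only an assumed example.

    # Sketch of the block maxima approach: split the series of an observable
    # along an orbit into blocks, take block maxima, and fit a GEV distribution.
    # The "orbit" here is random noise standing in for a trajectory of a map.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(0)
    orbit = rng.random(100_000)
    observable = -np.log(np.abs(orbit - 0.3) + 1e-12)   # assumed distance observable

    block_size = 1_000
    maxima = observable.reshape(-1, block_size).max(axis=1)

    shape, loc, scale = genextreme.fit(maxima)
    print(f"GEV shape = {shape:.3f}, loc = {loc:.3f}, scale = {scale:.3f}")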

Relevance: 30.00%

Abstract:

This paper reviews the literature on the distribution of commercial real estate returns. There is growing evidence that the assumption of normality in returns is not safe. Distributions are found to be peaked, fat-tailed and, tentatively, skewed. There is some evidence of compound distributions and non-linearity. Publicly traded real estate assets (such as property company or REIT shares) behave in a fashion more similar to other common stocks. However, as in equity markets, it would be unwise to assume normality uncritically. Empirical evidence for UK real estate markets is obtained by applying distribution-fitting routines to IPD Monthly Index data for the aggregate index and selected sub-sectors. It is clear that normality is rejected in most cases. It is often argued that the observed differences in real estate returns are a measurement issue resulting from appraiser behaviour. However, unsmoothing the series does not assist in modelling returns. A large proportion of returns are close to zero. This would be characteristic of a thinly traded market where new information arrives infrequently. Analysis of quarterly data suggests that, over longer trading periods, return distributions may conform more closely to those found in other asset markets. These results have implications for the formulation and implementation of a multi-asset portfolio allocation strategy.
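
A brief sketch of the kind of distribution-fitting check described above: compute skewness and excess kurtosis of a monthly return series and apply a Jarque-Bera normality test. The simulated fat-tailed returns stand in for the IPD index data, which are not reproduced here.

    # Sketch: skewness, excess kurtosis and a Jarque-Bera normality test for a
    # monthly return series. The fat-tailed returns are simulated, not IPD data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    returns = rng.standard_t(df=4, size=240) * 0.01

    jb_stat, jb_p = stats.jarque_bera(returns)
    print(f"skewness = {stats.skew(returns):.2f}, "
          f"excess kurtosis = {stats.kurtosis(returns):.2f}")
    print(f"Jarque-Bera p-value = {jb_p:.4f} (small p rejects normality)")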

Relevance: 30.00%

Abstract:

Volume determination of tephra deposits is necessary for the assessment of the dynamics and hazards of explosive volcanoes. Several methods have been proposed during the past 40 years that include the analysis of crystal concentration of large pumices, integrations of various thinning relationships, and the inversion of field observations using analytical and computational models. Regardless of their strong dependence on tephra-deposit exposure and distribution of isomass/isopach contours, empirical integrations of deposit thinning trends still represent the most widely adopted strategy due to their practical and fast application. The most recent methods involve the best fitting of thinning data using various exponential segments or a power-law curve on semilog plots of thickness (or mass/area) versus square root of isopach area. The exponential method is mainly sensitive to the number and the choice of straight segments, whereas the power-law method can better reproduce the natural thinning of tephra deposits but is strongly sensitive to the proximal or distal extreme of integration. We analyze a large data set of tephra deposits and propose a new empirical method for the determination of tephra-deposit volumes that is based on the integration of the Weibull function. The new method shows a better agreement with observed data, reconciling the debate on the use of the exponential versus power-law method. In fact, the Weibull best fitting only depends on three free parameters, can well reproduce the gradual thinning of tephra deposits, and does not depend on the choice of arbitrary segments or of arbitrary extremes of integration.
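
A hedged sketch of the Weibull-based idea: fit a three-parameter Weibull-type thinning curve to thickness versus the square root of isopach area and integrate it numerically to obtain a volume. The functional form, the field data and the integration limit are assumptions made for illustration, not the paper's calibrated values.

    # Sketch: fit a Weibull-type thinning curve T(x) to thickness versus the
    # square root of isopach area x, then integrate V = ∫ T dA = ∫ T(x) 2x dx.
    # The functional form and the field data are illustrative assumptions.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.integrate import quad

    def weibull_thinning(x, theta, lam, n):
        return theta * (x / lam) ** (n - 2.0) * np.exp(-((x / lam) ** n))

    x_obs = np.array([2.0, 5.0, 10.0, 20.0, 40.0])   # sqrt(isopach area), km
    t_obs = np.array([2.0, 0.9, 0.35, 0.08, 0.01])   # thickness, m

    popt, _ = curve_fit(weibull_thinning, x_obs, t_obs,
                        p0=(1.0, 10.0, 1.5), bounds=(1e-6, np.inf))
    vol, _ = quad(lambda x: weibull_thinning(x, *popt) * 2.0 * x, 0.0, 500.0)
    print(f"tephra volume ~ {vol * 1e-3:.2f} km^3")   # m·km^2 -> km^3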

Relevance: 30.00%

Abstract:

We employ a large dataset of physical inventory data on 21 different commodities for the period 1993–2011 to empirically analyze the behavior of commodity prices and their volatility as predicted by the theory of storage. We examine two main issues. First, we analyze the relationship between inventory and the shape of the forward curve. Low (high) inventory is associated with forward curves in backwardation (contango), as the theory of storage predicts. Second, we show that price volatility is a decreasing function of inventory for the majority of commodities in our sample. This effect is more pronounced in backwardated markets. Our findings are robust with respect to alternative inventory measures and over the recent commodity price boom.
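
One simple way to examine the volatility-inventory relationship described above is an ordinary least squares regression of realised volatility on an inventory measure; a negative slope is consistent with the theory of storage. The data below are simulated, and the specification is only an illustrative assumption.

    # Sketch: OLS regression of realised volatility on an inventory measure.
    # A negative slope is consistent with the theory of storage. Simulated data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    inventory = rng.uniform(0.2, 1.5, size=200)                     # scaled inventory
    volatility = 0.4 - 0.15 * inventory + rng.normal(0, 0.05, 200)  # annualised vol

    X = sm.add_constant(inventory)
    fit = sm.OLS(volatility, X).fit()
    print(fit.params)   # [intercept, slope]; slope < 0 supports the prediction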

Relevance: 30.00%

Abstract:

Let θ denote the level of quality inherent in a food product that is delivered to some terminal market. In this paper, I characterize allocations over θ and provide an economic rationale for regulating safety and quality standards in the food system. Zusman and Bockstael investigate the theoretical foundations for imposing standards and stress the importance of providing a tractable conceptual foundation. Despite a wealth of contributions that are mainly empirical (for reviews of these works see, respectively, Caswell and Antle), there have been relatively few attempts to model formally the linkages between farm and food markets when food quality and consumer safety are at issue. Here, I attempt to provide such a framework, building on key contributions in the theoretical literature and linking them in a simple model of quality determination in a vertically related marketing channel. The food-marketing model is due to Gardner. Spence provides a foundation for Pareto-improving intervention in a deterministic model of quality provision, and Leland, building on the classic paper by Akerlof, investigates licensing and minimum standards when the information structure is incomplete. Linking these ideas in a satisfactory model of the food markets is the main objective of the paper.

Relevance: 30.00%

Abstract:

We introduce a new methodology that allows the construction of wave frequency distributions due to growing incoherent whistler-mode waves in the magnetosphere. The technique combines the equations of geometric optics (i.e. raytracing) with the equation of transfer of radiation in an anisotropic lossy medium to obtain the spectral energy density as a function of frequency and wavenormal angle. We describe the method in detail and then demonstrate how it could be used in an idealised magnetosphere during quiet geomagnetic conditions. For a specific set of plasma conditions, we predict that the wave power peaks off the equator at ~15 degrees magnetic latitude. The new calculations predict that wave power as a function of frequency can be adequately described by a Gaussian function, but as a function of wavenormal angle it more closely resembles a skew normal distribution. The technique described in this paper yields the first known estimate of the parallel and oblique incoherent wave spectrum resulting from growing whistler-mode waves, and provides a means to incorporate self-consistent wave-particle interactions in a kinetic model of the magnetosphere over a large volume.
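
The two fitted forms mentioned above can be sketched as follows: a Gaussian describing wave power as a function of frequency and a skew-normal shape describing power as a function of wavenormal angle. The synthetic spectra and parameter values are placeholders, not output of the raytracing calculation.

    # Sketch: a Gaussian fit to wave power vs. frequency and a skew-normal shape
    # for power vs. wave-normal angle. Synthetic spectra stand in for model output.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import skewnorm

    rng = np.random.default_rng(3)
    freq = np.linspace(0.05, 0.95, 60)                    # normalised frequency (assumed)
    power_f = np.exp(-((freq - 0.35) / 0.10) ** 2) + 0.01 * rng.random(60)

    gauss = lambda f, a, mu, sig: a * np.exp(-0.5 * ((f - mu) / sig) ** 2)
    (a, mu, sig), _ = curve_fit(gauss, freq, power_f, p0=(1.0, 0.4, 0.1))
    print(f"Gaussian fit in frequency: mu = {mu:.2f}, sigma = {sig:.2f}")

    angle = np.linspace(0.0, 80.0, 81)                    # wave-normal angle, degrees
    power_a = skewnorm.pdf(angle, a=4.0, loc=10.0, scale=20.0)   # skewed angular spectrum
    print(f"angular spectrum peaks near {angle[np.argmax(power_a)]:.0f} degrees")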

Relevance: 30.00%

Abstract:

A primitive equation model is used to study the sensitivity of baroclinic wave life cycles to the initial latitude-height distribution of humidity. Diabatic heating is parametrized only as a consequence of condensation in regions of large-scale ascent. Experiments are performed in which the initial relative humidity is a simple function of model level, and in some cases latitude bands are specified which are initially relatively dry. It is found that the presence of moisture can either increase or decrease the peak eddy kinetic energy of the developing wave, depending on the initial moisture distribution. A relative abundance of moisture at mid-latitudes tends to weaken the wave, while a relative abundance at low latitudes tends to strengthen it. This sensitivity exists because competing processes are at work. These processes are described in terms of energy box diagnostics. The most realistic case lies on the cusp of this sensitivity. Further physical parametrizations are then added, including surface fluxes and upright moist convection. These have the effect of increasing wave amplitude, but the sensitivity to initial conditions of relative humidity remains. Finally, 'control' and 'doubled CO2' life cycles are performed, with initial conditions taken from the time-mean zonal-mean output of equilibrium GCM experiments. The attenuation of the wave resulting from reduced baroclinicity is more pronounced than any effect due to changes in initial moisture.

Relevance: 30.00%

Abstract:

We examine whether and under what circumstances World Bank and International Monetary Fund (IMF) programs affect the likelihood of major government crises. We find that crises are, on average, more likely as a consequence of World Bank programs. We also find that governments face an increasing risk of entering a crisis when they remain under an IMF or World Bank arrangement once the economy's performance improves. The international financial institution's (IFI) scapegoat function thus seems to lose its value when the need for financial support is less urgent. While the probability of a crisis increases when a government turns to the IFIs, programs inherited by preceding governments do not affect the probability of a crisis. This is in line with two interpretations. First, the conclusion of IFI programs can signal the government's incompetence, and second, governments that inherit programs might be less likely to implement program conditions agreed to by their predecessors.

Relevance: 30.00%

Abstract:

This contribution proposes a novel probability density function (PDF) estimation-based over-sampling (PDFOS) approach for two-class imbalanced classification problems. The classical Parzen-window kernel function is adopted to estimate the PDF of the positive class. According to the estimated PDF, synthetic instances are then generated as additional training data. The essential concept is to re-balance the class distribution of the original imbalanced data set under the principle that the synthetic data samples follow the same statistical properties. Based on the over-sampled training data, the radial basis function (RBF) classifier is constructed by applying the orthogonal forward selection procedure, in which the classifier's structure and the parameters of the RBF kernels are determined using a particle swarm optimisation algorithm based on the criterion of minimising the leave-one-out misclassification rate. The effectiveness of the proposed PDFOS approach is demonstrated by an empirical study on several imbalanced data sets.
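
The core of the PDFOS idea can be sketched with a Gaussian kernel density estimate of the minority class followed by resampling from it; the paper's kernel-width selection and the subsequent RBF/PSO classifier construction are not reproduced here, and the data are made up.

    # Sketch of the core PDFOS step: Parzen-window (Gaussian KDE) estimate of the
    # minority-class density, then synthetic samples drawn from that estimate.
    # Kernel-width selection and the RBF/PSO classifier stage are not reproduced.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(4)
    minority = rng.normal(loc=[0.0, 1.0], scale=0.3, size=(40, 2))  # minority class
    majority_size = 400

    kde = gaussian_kde(minority.T)                   # density estimate (features x samples)
    n_synthetic = majority_size - len(minority)
    synthetic = kde.resample(n_synthetic, seed=4).T  # new minority-class instances

    balanced_minority = np.vstack([minority, synthetic])
    print(balanced_minority.shape)                   # (400, 2)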

Relevance: 30.00%

Abstract:

We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross-validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual information (LOOMI) between the classifier's predicted class labels and the true class labels. We derive the formula of LOOMI within the OFS framework so that it can be evaluated efficiently for model term selection. Furthermore, a Bayesian procedure of hyperparameter fitting is integrated into each stage of the OFS to infer the l2-norm based local regularisation parameter from the data. Since each forward stage effectively fits a one-variable model, this task is very fast. The classifier construction procedure terminates automatically, without an additional stopping criterion, and yields very sparse RBF classifiers with excellent classification generalisation performance, which is particularly useful for noisy data sets with highly overlapping class distributions. A number of benchmark examples are employed to demonstrate the effectiveness of the proposed approach.
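
A loose sketch of forward selection scored by mutual information between predicted and true labels is given below. It substitutes ordinary K-fold cross-validation and a logistic model for the paper's closed-form leave-one-out mutual information and RBF weight fitting, so it should be read as an analogue of the procedure, not the procedure itself.

    # Loose analogue of forward selection scored by mutual information between
    # predicted and true labels; plain K-fold CV and a logistic model replace the
    # paper's closed-form LOOMI and RBF weight fitting.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import mutual_info_score
    from sklearn.model_selection import cross_val_predict

    X, y = make_classification(n_samples=300, n_features=2, n_informative=2,
                               n_redundant=0, random_state=0)

    def rbf_features(X, centres, width=1.0):
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * width ** 2))

    candidates = X[np.random.default_rng(0).choice(len(X), 30, replace=False)]
    selected = []
    for _ in range(5):                               # greedily pick five centres
        best = None
        for c in candidates:
            Phi = rbf_features(X, np.array(selected + [c]))
            pred = cross_val_predict(LogisticRegression(), Phi, y, cv=5)
            score = mutual_info_score(y, pred)
            if best is None or score > best[0]:
                best = (score, c)
        selected.append(best[1])
    print(f"selected {len(selected)} centres, final MI = {best[0]:.3f}")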

Relevance: 30.00%

Abstract:

Filamin A (FlnA) cross-links actin filaments and connects the Von Willebrand factor receptor GPIb-IX-V to the underlying cytoskeleton in platelets. Because FlnA deficiency is embryonic lethal, mice lacking FlnA in platelets were generated by breeding FlnA(loxP/loxP) females with GATA1-Cre males. FlnA(loxP/y) GATA1-Cre males have a macrothrombocytopenia and increased tail bleeding times. FlnA-null platelets have decreased expression and altered surface distribution of GPIbalpha because they lack the normal cytoskeletal linkage of GPIbalpha to underlying actin filaments. This results in approximately 70% less platelet coverage on collagen-coated surfaces at shear rates of 1,500/s, compared with wild-type platelets. Unexpectedly, however, immunoreceptor tyrosine-based activation motif (ITAM)- and ITAM-like-mediated signals are severely compromised in FlnA-null platelets. FlnA-null platelets fail to spread and have decreased alpha-granule secretion, integrin alphaIIbbeta3 activation, and protein tyrosine phosphorylation, particularly that of the protein tyrosine kinase Syk and phospholipase C-gamma2, in response to stimulation through the collagen receptor GPVI and the C-type lectin-like receptor 2. This signaling defect was traced to the loss of a novel FlnA-Syk interaction, as Syk binds to FlnA at immunoglobulin-like repeat 5. Our findings reveal that the interaction between FlnA and Syk regulates ITAM- and ITAM-like-containing receptor signaling and platelet function.

Relevance: 30.00%

Abstract:

Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors (BFs) for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating BFs that use simulation to circumvent the evaluation of the intractable likelihood, and compares them to existing methods. In some cases we observe an advantage in the use of biased weight estimates. An initial investigation into the theoretical and empirical properties of this class of methods is presented; it offers some support for the use of biased estimates, but we advocate caution in their use.
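
A toy illustration of the random-weight idea: in an importance-sampling estimate of the evidence, the likelihood inside each weight is replaced by a noisy but unbiased estimate. The Gaussian model, the log-normal noise and the prior proposal are all assumptions made for this sketch; the paper's sequential Monte Carlo schemes are not reproduced.

    # Toy random-weight importance sampling for the evidence: the likelihood in
    # each weight is replaced by a noisy but unbiased estimate. The Gaussian model,
    # prior proposal and log-normal noise are assumptions made for this sketch.
    import numpy as np

    rng = np.random.default_rng(5)
    data = rng.normal(1.0, 1.0, size=50)

    def noisy_likelihood(theta):
        loglik = -0.5 * np.sum((data - theta) ** 2) - 0.5 * len(data) * np.log(2 * np.pi)
        noise = np.exp(rng.normal(-0.005, 0.1))   # log-normal with mean 1 -> unbiased
        return np.exp(loglik) * noise

    thetas = rng.normal(0.0, 2.0, size=5_000)     # proposal = prior N(0, 2)
    weights = np.array([noisy_likelihood(t) for t in thetas])
    print("log evidence estimate:", np.log(weights.mean()))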