897 results for Unbiased estimating functions


Relevance: 20.00%

Abstract:

A Bayesian method of estimating multivariate sample selection models is introduced and applied to the estimation of a demand system for food in the UK, in order to account for censoring arising from infrequency of purchase. We show how identifying restrictions can be imposed on the sample selection equations and that, unlike in a maximum likelihood framework, imposing adding-up at both the latent and observed levels is straightforward. Our results emphasise the role played by low incomes and socio-economic circumstances in leading to poor diets, and also indicate that the presence of children in a household has a negative impact on dietary quality.
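
A common computational device in Bayesian treatments of censored observations is data augmentation: latent values are imputed for the censored cases inside an MCMC sampler, after which the regression parameters have standard conditional posteriors. The sketch below illustrates that idea in a deliberately simplified, univariate Tobit-style form; it is not the authors' multivariate sample-selection system, and the priors and the `tobit_gibbs` helper are illustrative assumptions.

```python
import numpy as np
from scipy.stats import truncnorm, invgamma

def tobit_gibbs(X, y, n_iter=2000, seed=0):
    """Gibbs sampler for a censored (Tobit-style) regression via data
    augmentation -- a simplified, univariate analogue of Bayesian estimation
    under purchase censoring, not the paper's multivariate system.

    Priors (illustrative): flat on beta, p(sigma^2) proportional to 1/sigma^2.
    Observations with y <= 0 are treated as censored latent values below 0.
    """
    rng = np.random.default_rng(seed)
    n, k = X.shape
    cens = (y <= 0)
    XtX_inv = np.linalg.inv(X.T @ X)
    beta, sigma2 = np.zeros(k), 1.0
    y_star = y.astype(float).copy()
    draws = np.empty((n_iter, k))
    for it in range(n_iter):
        # 1. impute latent values for censored observations (normal truncated above 0)
        mu, sd = X[cens] @ beta, np.sqrt(sigma2)
        y_star[cens] = truncnorm.rvs(-np.inf, (0.0 - mu) / sd, loc=mu, scale=sd,
                                     size=cens.sum(), random_state=rng)
        # 2. beta | y_star, sigma2 : normal around the OLS estimate on augmented data
        beta_hat = XtX_inv @ X.T @ y_star
        beta = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
        # 3. sigma2 | beta, y_star : inverse gamma
        resid = y_star - X @ beta
        sigma2 = invgamma.rvs(a=0.5 * n, scale=0.5 * resid @ resid, random_state=rng)
        draws[it] = beta
    return draws
```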

Relevance: 20.00%

Abstract:

Time correlation functions yield profound information about the dynamics of a physical system and hence are frequently calculated in computer simulations. For systems whose dynamics span a wide range of time scales, currently used methods require significant computer time and memory. In this paper, we discuss the multiple-tau correlator method for the efficient on-the-fly calculation of accurate time correlation functions during computer simulations. The multiple-tau correlator is economical in its computational requirements and can be tuned to the desired level of accuracy. Further, we derive estimates for the error arising from use of the multiple-tau correlator and extend it to the calculation of mean-square particle displacements and dynamic structure factors. The method described here is routinely used, in hardware implementations, in light scattering experiments, but has not yet found widespread use in computer simulations.
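
For readers who want the gist of the algorithm, the following is a minimal on-the-fly multiple-tau autocorrelator sketched in Python: level 0 keeps the last p samples at full resolution, and each higher level stores block averages of m samples from the level below, so the accessible lag times grow geometrically while memory stays modest. The class name and the defaults (p, m, number of levels) are illustrative choices rather than values from the paper, and the mean-square-displacement and dynamic-structure-factor extensions are omitted.

```python
import numpy as np

class MultipleTauCorrelator:
    """On-the-fly multiple-tau autocorrelator (minimal sketch).

    Level 0 stores the last `p` raw samples; level l stores values
    coarse-grained by averaging blocks of `m` samples from level l-1,
    so level l covers lags spaced m**l sample intervals apart.
    """

    def __init__(self, p=16, m=2, n_levels=10):
        self.p, self.m, self.n_levels = p, m, n_levels
        self.shift = np.full((n_levels, p), np.nan)   # ring buffers of past values
        self.corr = np.zeros((n_levels, p))           # accumulated products
        self.count = np.zeros((n_levels, p))          # number of products per lag
        self.accum = np.zeros(n_levels)               # block accumulators for coarse-graining
        self.naccum = np.zeros(n_levels, dtype=int)

    def _add(self, value, level):
        if level >= self.n_levels:
            return
        # push the new value into this level's buffer (most recent at index 0)
        self.shift[level] = np.roll(self.shift[level], 1)
        self.shift[level][0] = value
        buf = self.shift[level]
        # accumulate products with the stored history: contributes to C(j * m**level)
        valid = ~np.isnan(buf)
        self.corr[level, valid] += value * buf[valid]
        self.count[level, valid] += 1
        # coarse-grain: average m consecutive values and feed the block mean upwards
        self.accum[level] += value
        self.naccum[level] += 1
        if self.naccum[level] == self.m:
            self._add(self.accum[level] / self.m, level + 1)
            self.accum[level] = 0.0
            self.naccum[level] = 0

    def add(self, value):
        self._add(float(value), 0)

    def result(self):
        """Return (lag, C(lag)) arrays, skipping lags not yet sampled."""
        lags, c = [], []
        for level in range(self.n_levels):
            dt = self.m ** level
            start = 0 if level == 0 else self.p // self.m   # avoid duplicating small lags
            for j in range(start, self.p):
                if self.count[level, j] > 0:
                    lags.append(j * dt)
                    c.append(self.corr[level, j] / self.count[level, j])
        return np.array(lags), np.array(c)

# usage sketch: feed samples one at a time during the simulation loop
corr = MultipleTauCorrelator(p=16, m=2, n_levels=10)
rng = np.random.default_rng(0)
x = 0.0
for _ in range(100_000):
    x = 0.99 * x + rng.normal()      # AR(1) test signal with a long correlation time
    corr.add(x)
lags, c = corr.result()              # c[0] ~ variance of x; c decays with lag
```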

Relevance: 20.00%

Abstract:

Observations of a chemical at a point in the atmosphere typically show sudden transitions between episodes of high and low concentration. Often these are associated with a rapid change in the origin of air arriving at the site. Lagrangian chemical models riding along trajectories can reproduce such transitions, but small timing errors from trajectory phase errors dramatically reduce the correlation between modeled concentrations and observations. Here the origin averaging technique is introduced to obtain maps of average concentration as a function of air mass origin for the East Atlantic Summer Experiment 1996 (EASE96, a ground-based chemistry campaign). These maps are used to construct origin-averaged time series that enable comparison between a chemistry model and observations with phase errors factored out. The amount of the observed signal explained by trajectory changes can be quantified, as can the systematic model errors as a function of air mass origin. The Cambridge Tropospheric Trajectory model of Chemistry and Transport (CiTTyCAT) can account for over 70% of the observed ozone signal variance during EASE96 when phase errors are side-stepped by origin averaging. The dramatic increase in correlation (from 23% without averaging) cannot be achieved by time averaging. The success of the model is attributed to the strong relationship between changes in ozone along trajectories and their origin, and to its ability to simulate those changes. The model performs less well for longer-lived chemical constituents because the initial conditions 5 days before arrival are insufficiently well known.
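
Stripped of the meteorology, origin averaging amounts to grouping arrival times by an air-mass origin label (for example, the back-trajectory position several days before arrival, discretised onto a grid) and replacing each value by the mean over its group; comparing the origin-averaged model and observation series then side-steps trajectory phase errors. The helper below is a minimal sketch of that bookkeeping under those assumptions; the origin labels and variable names are hypothetical.

```python
import numpy as np

def origin_average(values, origin):
    """Replace each value by the mean over all times sharing its air-mass
    origin label (a minimal reading of the origin-averaging idea)."""
    values = np.asarray(values, dtype=float)
    origin = np.asarray(origin)
    averaged = np.empty_like(values)
    for label in np.unique(origin):
        mask = origin == label
        averaged[mask] = values[mask].mean()
    return averaged

# hypothetical usage: fraction of observed ozone variance explained once
# phase errors are factored out by origin averaging
# obs_avg = origin_average(o3_obs, origin_cell)
# mod_avg = origin_average(o3_model, origin_cell)
# r2 = np.corrcoef(obs_avg, mod_avg)[0, 1] ** 2
```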

Relevance: 20.00%

Abstract:

Neurofuzzy modelling systems combine fuzzy logic with quantitative artificial neural networks, typically through fuzzification based on B-spline membership functions and algebraic operators for inference. The paper introduces a neurofuzzy model construction algorithm that uses Bezier-Bernstein polynomial functions as basis functions. The new network maintains most of the properties of B-spline expansion based neurofuzzy systems, such as non-negativity of the basis functions and unity of support, but with the additional advantages of structural parsimony and Delaunay input space partitioning, avoiding the inherent computational problems of lattice networks. This new modelling network is based on the idea that an input vector can be mapped into barycentric co-ordinates with respect to a set of predetermined knots acting as the vertices of a polygon (a set of tiled Delaunay triangles) over the input space. The network is expressed as a Bezier-Bernstein polynomial function of the barycentric co-ordinates of the input vector. An inverse de Casteljau procedure using backpropagation is developed to obtain the input vector's barycentric co-ordinates, which form the basis functions. Extension of the Bezier-Bernstein neurofuzzy algorithm to n-dimensional inputs is discussed, followed by numerical examples that demonstrate the effectiveness of this new data based modelling approach.
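
The two geometric ingredients are easy to make concrete: mapping an input point to barycentric co-ordinates with respect to a triangle of knots, and evaluating the Bezier-Bernstein basis of those co-ordinates, which is non-negative and sums to one. The sketch below shows just those two steps for a single 2-D triangle; the Delaunay tiling, the inverse de Casteljau backpropagation procedure and the weight learning are not reproduced, and the function names are illustrative.

```python
import numpy as np
from math import factorial

def barycentric(x, tri):
    """Barycentric co-ordinates of 2-D point x w.r.t. triangle vertices tri (3x2)."""
    T = np.column_stack((tri[1] - tri[0], tri[2] - tri[0]))
    l1, l2 = np.linalg.solve(T, np.asarray(x, dtype=float) - tri[0])
    return np.array([1.0 - l1 - l2, l1, l2])

def bernstein_basis(lam, degree):
    """Degree-n Bernstein polynomials of the barycentric co-ordinates:
    B_ijk(lam) = n!/(i! j! k!) * lam0^i * lam1^j * lam2^k with i + j + k = n.
    They are non-negative and sum to one, so they can serve as fuzzy
    membership (basis) functions over the triangle."""
    l0, l1, l2 = lam
    basis = []
    for i in range(degree + 1):
        for j in range(degree + 1 - i):
            k = degree - i - j
            coef = factorial(degree) // (factorial(i) * factorial(j) * factorial(k))
            basis.append(coef * l0**i * l1**j * l2**k)
    return np.array(basis)

# one triangle of a (hypothetical) Delaunay tiling and a test input
tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
lam = barycentric([0.2, 0.3], tri)
phi = bernstein_basis(lam, degree=3)
assert abs(phi.sum() - 1.0) < 1e-12        # partition of unity
# the network output for this triangle is a weighted sum w @ phi of the basis values
```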

Relevance: 20.00%

Abstract:

This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems whose basis functions are Bezier-Bernstein polynomial functions. The construction is generalized to n-dimensional inputs by means of an additive decomposition, which overcomes the curse of dimensionality associated with large n. The algorithm also introduces univariate Bezier-Bernstein polynomial functions for completeness of the generalized procedure. Like B-spline expansion based neurofuzzy systems, Bezier-Bernstein polynomial function based neurofuzzy networks possess desirable properties such as non-negativity of the basis functions, unity of support, and interpretability of the basis functions as fuzzy membership functions, with the additional advantages of structural parsimony and Delaunay input space partitioning, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. The new modeling network combines the additive decomposition with two separate basis-function formation approaches, for the univariate and bivariate Bezier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data based modeling approach.
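
A bare-bones version of the construction, assuming the inputs have been scaled to the unit interval, expands each input dimension in a univariate Bernstein basis, stacks the per-dimension blocks additively, and solves for the weights by ordinary least squares as the abstract describes. The bivariate basis formation and any automatic structure selection are omitted, and the helper names are illustrative.

```python
import numpy as np
from math import comb

def bernstein_design(x, degree):
    """Univariate Bernstein basis B_i,n(x) = C(n, i) x^i (1 - x)^(n - i), x in [0, 1].
    Returns an (N, degree + 1) design matrix; each row is non-negative and sums to one."""
    x = np.asarray(x, dtype=float)
    cols = [comb(degree, i) * x**i * (1.0 - x)**(degree - i) for i in range(degree + 1)]
    return np.stack(cols, axis=1)

def fit_additive_bernstein(X, y, degree=4):
    """Additive model y ~ sum_d f_d(x_d), each f_d expanded in a univariate
    Bernstein basis; weights obtained by ordinary least squares."""
    blocks = [bernstein_design(X[:, d], degree) for d in range(X.shape[1])]
    Phi = np.hstack(blocks)        # columns grow linearly, not exponentially, in n
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w, Phi

# hypothetical usage on an n-input problem (X pre-scaled to the unit cube):
# w, Phi = fit_additive_bernstein(X, y, degree=4)
# y_hat = Phi @ w
```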

Relevance: 20.00%

Abstract:

Research on the topic of liquidity has greatly benefited from the improved availability of data. Researchers have addressed questions regarding the factors that influence bid-ask spreads and the relationship between spreads and risk, return and liquidity. Intra-day data have been used to measure the effective spread, and researchers have been able to refine the concepts of liquidity to include the price impact of transactions on a trade-by-trade basis. The growth in the creation of tax-transparent securities has greatly enhanced the visibility of securitized real estate, and has naturally led to the question of whether this increased visibility has caused market liquidity to change. Although the growth in the public market for securitized real estate has occurred in international markets, it has not been accompanied by universal publication of transaction data. This paper therefore develops a liquidity test based on aggregate daily data and applies it to US data in order to check for consistency with the results of prior intra-day analysis. If the two approaches produce similar results, we can apply the same technique to markets in which less detailed data are available and offer conclusions on the liquidity of a wider set of markets.