50 results for Maximum-likelihood-estimation


Relevance: 100.00%

Abstract:

In this paper, the problem of frame-level symbol timing acquisition for UWB signals is addressed. The main goal is the derivation of a frame-level timing estimator which requires no prior knowledge of either the transmitted symbols or the received template waveform. Independence with respect to the received waveform is of special interest in UWB communication systems, where fast and accurate estimation of the end-to-end channel response is a challenging and computationally demanding task. The proposed estimator is derived under the unconditional maximum likelihood criterion, and because of the low power of UWB signals, the low-SNR assumption is adopted. As a result, an optimal frame-level timing estimator is derived which outperforms existing acquisition methods in low-SNR scenarios.
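The key idea of a waveform-independent, non-data-aided timing search can be illustrated with a toy sketch (this is not the paper's estimator; frame length, pulse shape, and noise level below are invented): average the received energy across many frames and pick the offset whose window captures the most energy, without ever using the pulse shape.

```python
import random

random.seed(1)

# Made-up parameters for the illustration.
FRAME = 50          # samples per frame
N_FRAMES = 200      # frames averaged
TRUE_OFFSET = 17    # unknown timing to recover

# Received signal: a short pulse at TRUE_OFFSET in each frame, plus noise.
pulse = [1.0, 0.8, 0.5]          # the receiver does NOT use this shape
sig = []
for _ in range(N_FRAMES):
    frame = [random.gauss(0.0, 0.4) for _ in range(FRAME)]
    for i, p in enumerate(pulse):
        frame[(TRUE_OFFSET + i) % FRAME] += p
    sig.extend(frame)

# Non-coherent metric: accumulate energy in a short window at each
# candidate offset, summed over all frames.
WIN = 3
energy = [0.0] * FRAME
for k in range(N_FRAMES):
    for tau in range(FRAME):
        for w in range(WIN):
            energy[tau] += sig[k * FRAME + (tau + w) % FRAME] ** 2

tau_hat = max(range(FRAME), key=lambda t: energy[t])
print(tau_hat)  # should be close to TRUE_OFFSET
```

Averaging over many frames is what makes the search work at low SNR, which is the regime the abstract targets.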

Relevance: 100.00%

Abstract:

We develop a method for obtaining 3D polarimetric integral images from elemental images recorded under low-light illumination conditions. Since photon-counting images are very sparse, calculation of the Stokes parameters and the degree of polarization must be handled carefully. In our approach, polarimetric 3D integral images are generated using maximum likelihood estimation and subsequently reconstructed by means of a total variation denoising filter. In this way, polarimetric results are comparable to those obtained under conventional illumination conditions. We also show that polarimetric information retrieved from photon-starved images can be used in 3D object recognition problems. To the best of our knowledge, this is the first report on 3D polarimetric photon-counting integral imaging.
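A minimal single-pixel sketch of the idea, assuming linear polarization measured through polarizers at 0, 45, 90, and 135 degrees and Poisson-distributed photon counts (the intensities and frame count are invented): the ML estimate of a Poisson rate is simply the mean count, from which the linear Stokes parameters and the degree of linear polarization follow.

```python
import math
import random

random.seed(7)

def poisson(lam):
    # Knuth's method; adequate for small rates.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Hypothetical true intensities behind each polarizer angle.
true_I = {0: 8.0, 45: 6.0, 90: 2.0, 135: 4.0}
N_SHOTS = 400  # photon-counting frames averaged per angle

# ML estimate of a Poisson rate = mean count over frames.
I = {a: sum(poisson(lam) for _ in range(N_SHOTS)) / N_SHOTS
     for a, lam in true_I.items()}

# Linear Stokes parameters and degree of linear polarization.
S0 = I[0] + I[90]
S1 = I[0] - I[90]
S2 = I[45] - I[135]
dolp = (S1 ** 2 + S2 ** 2) ** 0.5 / S0
print(round(dolp, 3))
```

With sparse counts the division by S0 is what demands care: a denoising step (total variation in the paper) stabilizes the estimate across pixels.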

Relevance: 100.00%

Abstract:

We set up a dynamic model of firm investment in which liquidity constraints enter explicitly into the firm's maximization problem. The optimal policy rules are incorporated into a maximum likelihood procedure which estimates the structural parameters of the model. Investment is positively related to the firm's internal financial position when the firm is relatively poor. This relationship disappears for wealthy firms, which can reach their desired level of investment. Borrowing is an increasing function of financial position for poor firms. This relationship is reversed as a firm's financial position improves, and large firms hold little debt. Liquidity-constrained firms may have unused credit lines and the capacity to invest further if they desire. However, the fear that liquidity constraints will become binding in the future induces them to invest only when internal resources increase. We estimate the structural parameters of the model and use them to quantify the importance of liquidity constraints for firms' investment. We find that liquidity constraints matter significantly for the investment decisions of firms. If firms can finance investment by issuing fresh equity, rather than with internal funds or debt, the average capital stock is almost 35% higher over a period of 20 years. Transitory shocks to internal funds have a sustained effect on the capital stock. This effect lasts for several periods and is more persistent for small firms than for large firms. A 10% negative shock to firm fundamentals reduces the capital stock of firms which face liquidity constraints by almost 8% over a period, as opposed to only 3.5% for firms which do not face these constraints.

Relevance: 100.00%

Abstract:

Standard Indirect Inference (II) estimators take a given finite-dimensional statistic, Z_{n}, and then estimate the parameters by matching the sample statistic with the model-implied population moment. We here propose a novel estimation method that utilizes all available information contained in the distribution of Z_{n}, not just its first moment. This is done by computing the likelihood of Z_{n} and then estimating the parameters by either maximizing the likelihood or computing the posterior mean for a given prior of the parameters. These are referred to as the maximum indirect likelihood (MIL) and Bayesian indirect likelihood (BIL) estimators, respectively. We show that the IL estimators are first-order equivalent to the corresponding moment-based II estimator that employs the optimal weighting matrix. However, due to higher-order features of Z_{n}, the IL estimators are higher-order efficient relative to the standard II estimator. The likelihood of Z_{n} will in general be unknown, and so simulated versions of the IL estimators are developed. Monte Carlo results for a structural auction model and a DSGE model show that the proposed estimators indeed have attractive finite-sample properties.
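A toy sketch of a simulated MIL estimator, under invented settings (scalar statistic Z_n = sample mean, Gaussian data, Gaussian approximation to the simulated distribution of Z_n): at each candidate parameter, simulate the statistic many times, fit a density to those draws, evaluate it at the observed statistic, and maximize.

```python
import math
import random

random.seed(3)

n = 50    # sample size behind the statistic
S = 300   # simulations per candidate parameter

def draw_stat(theta):
    # Z_n = sample mean of n draws from N(theta, 1)
    return sum(random.gauss(theta, 1.0) for _ in range(n)) / n

z_obs = draw_stat(1.0)   # "observed" statistic; true theta = 1.0

def indirect_loglik(theta):
    # Simulate the distribution of Z_n at theta; fit a Gaussian to it.
    sims = [draw_stat(theta) for _ in range(S)]
    mu = sum(sims) / S
    var = sum((s - mu) ** 2 for s in sims) / (S - 1)
    return -0.5 * math.log(2 * math.pi * var) - (z_obs - mu) ** 2 / (2 * var)

# Grid-search MIL estimate.
grid = [i / 100 for i in range(50, 151)]
theta_hat = max(grid, key=indirect_loglik)
print(theta_hat)
```

Using the whole simulated distribution of Z_n, rather than matching its mean alone, is precisely what separates this from moment-based II.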

Relevance: 90.00%

Abstract:

This paper shows how a high level matrix programming language may be used to perform Monte Carlo simulation, bootstrapping, estimation by maximum likelihood and GMM, and kernel regression in parallel on symmetric multiprocessor computers or clusters of workstations. The implementation of parallelization is done in a way such that an investigator may use the programs without any knowledge of parallel programming. A bootable CD that allows rapid creation of a cluster for parallel computing is introduced. Examples show that parallelization can lead to important reductions in computational time. Detailed discussion of how the Monte Carlo problem was parallelized is included as an example for learning to write parallel programs for Octave.

Relevance: 90.00%

Abstract:

We use a threshold seemingly unrelated regressions specification to assess whether the Central and East European countries (CEECs) are synchronized in their business cycles to the Euro-area. This specification is useful in two ways: First, it takes into account the common institutional factors and the similarities across CEECs in their process of economic transition. Second, it captures business cycle asymmetries by allowing for the presence of two distinct regimes for the CEECs. As the CEECs are strongly affected by the Euro-area these regimes may be associated with Euro-area expansions and contractions. We discuss representation, estimation by maximum likelihood and inference. The methodology is illustrated by using monthly industrial production in 8 CEECs. The results show that apart from Lithuania the rest of the CEECs experience “normal” growth when the Euro-area contracts and “high” growth when the Euro-area expands. Given that the CEECs are “catching up” with the Euro-area this result shows that most CEECs seem synchronized to the Euro-area cycle. Keywords: Threshold SURE; asymmetry; business cycles; CEECs. JEL classification: C33; C50; E32.
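The regime idea can be sketched with a single-equation threshold regression (a deliberate simplification: the paper's SUR gains come from cross-equation error correlation, which a one-equation toy cannot show; all parameters below are invented): split the sample by the Euro-area regime and fit each regime separately.

```python
import random

random.seed(5)

# Toy data: a CEEC's growth responds differently when the Euro-area
# expands (e > 0) than when it contracts (e <= 0).
n = 400
euro = [random.gauss(0.0, 1.0) for _ in range(n)]
ceec = [(0.5 + 1.2 * e if e > 0 else 0.1 + 0.3 * e) + random.gauss(0, 0.2)
        for e in euro]

def ols(xs, ys):
    # Bivariate OLS: returns (intercept, slope).
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Split at the threshold and fit each regime separately.
expand = [(e, c) for e, c in zip(euro, ceec) if e > 0]
contract = [(e, c) for e, c in zip(euro, ceec) if e <= 0]
a1, b1 = ols([e for e, _ in expand], [c for _, c in expand])
a0, b0 = ols([e for e, _ in contract], [c for _, c in contract])
print(round(b1, 2), round(b0, 2))
```

Recovering a steeper slope in expansions than in contractions is the kind of asymmetry the two-regime specification is designed to capture.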

Relevance: 90.00%

Abstract:

This paper empirically studies the effects of service offshoring on white-collar employment, using data for more than one hundred U.S. occupations. A model of firm behavior based on separability allows us to derive the labor demand elasticity with respect to service offshoring for each occupation. Estimation is performed with quasi-maximum likelihood, to account for the high degree of censoring in the employment variable. The estimated elasticities are then related to proxies for the skill level and the degree of tradability of the occupations. Results show that service offshoring increases high-skilled employment and decreases medium- and low-skilled employment. Within each skill group, however, service offshoring penalizes tradable occupations and benefits non-tradable occupations.
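To illustrate how censoring enters a likelihood (a generic Tobit-type construction, not the paper's exact specification; the coefficients and the tiny dataset are invented): uncensored observations contribute a normal density term, while zero observations contribute the probability mass of falling at or below the censoring point.

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def tobit_loglik(y, x, beta, sigma):
    # Censored-at-zero Gaussian log-likelihood.
    ll = 0.0
    for yi, xi in zip(y, x):
        mu = beta[0] + beta[1] * xi           # linear index
        if yi > 0:                            # uncensored: density term
            ll += math.log(norm_pdf((yi - mu) / sigma) / sigma)
        else:                                 # censored: Phi(-mu/sigma)
            ll += math.log(norm_cdf(-mu / sigma))
    return ll

# Made-up data: offshoring exposure x, employment y (0 = censored).
x = [0.1, 0.5, 0.9, 1.3, 1.7]
y = [2.1, 1.4, 0.0, 0.6, 0.0]
print(round(tobit_loglik(y, x, beta=(2.0, -1.0), sigma=0.5), 3))  # → -6.367
```

Maximizing this function over (beta, sigma) would give the censored-data estimates; plain least squares on the same data would be biased by the zeros.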

Relevance: 90.00%

Abstract:

A new debate over the speed of convergence in per capita income across economies is going on. Cross-sectional estimates support the idea of slow convergence of about two percent per year. Panel data estimates support the idea of fast convergence of five, ten or even twenty percent per year. This paper shows that, if you "do it right", even the panel data estimation method yields the result of slow convergence of about two percent per year.

Relevance: 90.00%

Abstract:

In this paper we analyse the observed systematic differences in costs for teaching hospitals (TH henceforth) in Spain. Concern has been voiced regarding the existence of a bias in the financing of THs once prospective budgets are in the arena for hospital finance, and claims for adjusting to take into account the legitimate extra costs of teaching on hospital expenditure are well grounded. We focus on the estimation of the impact of teaching status on average cost. We used a version of a multiproduct hospital cost function taking into account some relevant factors from which to derive the observed differences. We assume that the relationship between the explanatory and the dependent variables follows a flexible form for each of the explanatory variables. We also model the underlying covariance structure of the data. We assumed two qualitatively different sources of variation: random effects and serial correlation. Random variation refers both to general level variation (through the random intercept) and to variation specifically related to teaching status. We postulate that the impact of the random effects is predominant over the impact of the serial correlation effects. The model is estimated by restricted maximum likelihood. Our results show that costs are 9% higher (15% in the case of median costs) in teaching than in non-teaching hospitals. That is, teaching status legitimately explains no more than half of the observed difference in actual costs. The impact on costs of the teaching factor depends on the number of residents, with an increase of 51.11% per resident for hospitals with fewer than 204 residents (third quartile of the number of residents) and 41.84% for hospitals with more than 204 residents. In addition, the estimated dispersion is higher among teaching hospitals. As a result, due to the considerable observed heterogeneity, results should be interpreted with caution. From a policy-making point of view, we conclude that since a higher relative burden for medical training is under public hospital command, an explicit adjustment for the extra costs that the teaching factor imposes on hospital finance is needed before hospital competition for inpatient services takes place.

Relevance: 90.00%

Abstract:

We study the statistical properties of three estimation methods for a model of learning that is often fitted to experimental data: quadratic deviation measures without unobserved heterogeneity, and maximum likelihood with and without unobserved heterogeneity. After discussing identification issues, we show that the estimators are consistent and provide their asymptotic distribution. Using Monte Carlo simulations, we show that ignoring unobserved heterogeneity can lead to seriously biased estimates in samples of the typical length of actual experiments. Better small-sample properties are obtained if unobserved heterogeneity is introduced. That is, rather than estimating the parameters for each individual, the individual parameters are treated as random variables, and the distribution of those random variables is estimated.

Relevance: 90.00%

Abstract:

This paper addresses the estimation of the code-phase (pseudorange) and the carrier-phase of the direct signal received from a direct-sequence spread-spectrum satellite transmitter. The signal is received by an antenna array in a scenario with interference and multipath propagation. These two effects are generally the limiting error sources in most high-precision positioning applications. A new estimator of the code- and carrier-phases is derived by using a simplified signal model and the maximum likelihood (ML) principle. The simplified model consists essentially of gathering all signals, except for the direct one, in a component with unknown spatial correlation. The estimator exploits knowledge of the direction-of-arrival of the direct signal and is much simpler than other estimators derived under more detailed signal models. Moreover, we present an iterative algorithm that is adequate for a practical implementation and explores an interesting link between the ML estimator and a hybrid beamformer. The mean squared error and bias of the new estimator are computed for a number of scenarios and compared with those of other methods. The presented estimator and the hybrid beamforming outperform the existing techniques of comparable complexity and attain, in many situations, the Cramér–Rao lower bound of the problem at hand.
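A drastically simplified single-antenna sketch of ML code-phase estimation (no array, no multipath or interference, which are the paper's whole point; code length, delay, and noise level are invented): under white Gaussian noise, the ML code-phase is the delay that maximizes the correlation of the received samples with a shifted local replica of the known spreading code.

```python
import random

random.seed(11)

# Known +/-1 spreading code (randomly generated here), cyclic delay,
# additive white Gaussian noise.
CODE_LEN = 64
code = [random.choice((-1.0, 1.0)) for _ in range(CODE_LEN)]
TRUE_DELAY = 23

rx = [code[(i - TRUE_DELAY) % CODE_LEN] + random.gauss(0.0, 1.0)
      for i in range(CODE_LEN)]

def corr(delay):
    # Correlation with the replica shifted by `delay`.
    return sum(rx[i] * code[(i - delay) % CODE_LEN] for i in range(CODE_LEN))

# ML code-phase under white noise = correlation peak.
delay_hat = max(range(CODE_LEN), key=corr)
print(delay_hat)
```

The paper's contribution is precisely what this toy omits: handling the non-white component (multipath and interference with unknown spatial correlation) through the array and the hybrid beamformer.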

Relevance: 90.00%

Abstract:

The main objective of this work is to analyse three stem taper functions for three Populus x euramericana clones (Canadá Blanco, I-214 and MC) in the Comunidad Foral de Navarra, in order to build a volume equation with volume classification. To minimize the effect of autocorrelation among the residuals, a continuous autoregressive error structure of order 2 or order 3 is used, depending on the clone analysed. In addition, the local form coefficient of each of the clones studied is compared using two methodologies: analysis of variance of the individual estimates of this coefficient, and the maximum likelihood statistic contrast between fits; the Canadá clone turns out to be the most conical of the three. The data come from 143 poplars from even-aged plantations with the same spacing (4.5 × 4.5 m square spacing).

Relevance: 90.00%

Abstract:

A BASIC computer program (REMOVAL) was developed to compute, in a VAX/VMS environment, all the calculations of the removal method for population size estimation (a catch-effort method for closed populations with constant sampling effort). The program follows the maximum likelihood methodology, checks the failure conditions, applies the appropriate formula, and displays the estimates of population size and catchability, with their standard deviations and coefficients of variation, and two goodness-of-fit statistics with their significance levels. Data from removal experiments on the cyprinodontid fish Aphanius iberus in the Alt Emporda wetlands are used to exemplify the use of the program.
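The simplest case of the removal method has a closed-form ML solution: with two passes of equal effort and catches c1 > c2, the ML catchability is p = (c1 - c2)/c1 and the ML population size is N = c1²/(c1 - c2). A minimal sketch with illustrative catch numbers (not the Aphanius iberus data from the paper):

```python
# Two-pass removal (Zippin) estimator: closed-form maximum likelihood for
# population size N and catchability p under constant sampling effort.
c1, c2 = 65, 43              # catches in passes 1 and 2 (requires c1 > c2)

p_hat = (c1 - c2) / c1       # ML catchability
N_hat = c1 ** 2 / (c1 - c2)  # ML population size

print(round(N_hat, 1), round(p_hat, 3))  # → 192.0 0.338
```

With three or more passes the likelihood must be maximized numerically, which is where a program like REMOVAL, with its failure-condition checks and goodness-of-fit statistics, earns its keep.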

Relevance: 80.00%

Abstract:

Ever since the appearance of the ARCH model [Engle (1982a)], an impressive array of variance specifications belonging to the same class of models has emerged [e.g. Bollerslev's (1986) GARCH; Nelson's (1990) EGARCH]. This recent domain has seen very successful developments. Nevertheless, several empirical studies seem to show that the performance of such models is not always appropriate [Boulier (1992)]. In this paper we propose a new specification: the Quadratic Moving Average Conditional Heteroskedasticity (QMACH) model. Its statistical properties, such as the kurtosis and the symmetry, as well as two estimators (method of moments and maximum likelihood), are studied. Two statistical tests are presented: the first one tests for homoskedasticity and the second one discriminates between the ARCH and QMACH specifications. A Monte Carlo study is presented in order to illustrate some of the theoretical results. An empirical study is undertaken for the DM–US exchange rate.
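To make the ML estimation concrete for the simplest member of this class (ARCH(1), not the paper's QMACH model; true parameters and sample size below are invented), one can simulate the process and maximize the Gaussian log-likelihood, here by a coarse grid search:

```python
import math
import random

random.seed(9)

# Simulate ARCH(1): h_t = omega + alpha * eps_{t-1}^2, eps_t ~ N(0, h_t).
OMEGA, ALPHA = 0.2, 0.5
T = 2000
eps = [0.0]
for _ in range(T):
    h = OMEGA + ALPHA * eps[-1] ** 2      # conditional variance
    eps.append(random.gauss(0.0, 1.0) * math.sqrt(h))
eps = eps[1:]

def loglik(omega, alpha):
    # Gaussian log-likelihood of the ARCH(1) model.
    ll, prev = 0.0, 0.0
    for e in eps:
        h = omega + alpha * prev ** 2
        ll += -0.5 * (math.log(2 * math.pi * h) + e ** 2 / h)
        prev = e
    return ll

# Coarse grid search over (omega, alpha).
grid_w = [0.05 * i for i in range(1, 11)]   # 0.05 .. 0.50
grid_a = [0.05 * i for i in range(0, 19)]   # 0.00 .. 0.90
w_hat, a_hat = max(((w, a) for w in grid_w for a in grid_a),
                   key=lambda wa: loglik(*wa))
print(w_hat, a_hat)
```

The same likelihood-building recipe (conditional variance recursion plus Gaussian density) carries over to GARCH, EGARCH, or a QMACH-type specification by changing the variance equation.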

Relevance: 80.00%

Abstract:

The Hausman (1978) test is based on the vector of differences of two estimators. It is usually assumed that one of the estimators is fully efficient, since this simplifies calculation of the test statistic. However, this assumption limits the applicability of the test, since widely used estimators such as the generalized method of moments (GMM) or quasi-maximum likelihood (QML) are often not fully efficient. This paper shows that the test may easily be implemented, using well-known methods, when neither estimator is efficient. To illustrate, we present both simulation results and empirical results for utilization of health care services.
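When neither estimator is efficient, the general form of the statistic is H = d' V⁻¹ d with d = b1 − b2 and V = V1 + V2 − C − C', where C is the cross-covariance of the two estimators; the efficiency assumption is what normally collapses V to V1 − V2. A minimal numeric sketch (all matrices below are made up for illustration):

```python
import math

# Two estimators of the same 2-vector of parameters.
b1 = [1.05, 0.48]                        # e.g. QML estimates
b2 = [0.98, 0.55]                        # e.g. GMM estimates
V1 = [[0.010, 0.002], [0.002, 0.008]]
V2 = [[0.012, 0.001], [0.001, 0.009]]
C  = [[0.004, 0.000], [0.000, 0.003]]    # cross-covariance (assumed known)

d = [b1[0] - b2[0], b1[1] - b2[1]]
# V = V1 + V2 - C - C'
V = [[V1[i][j] + V2[i][j] - C[i][j] - C[j][i] for j in range(2)]
     for i in range(2)]

# 2x2 inverse and quadratic form H = d' V^{-1} d.
det = V[0][0] * V[1][1] - V[0][1] * V[1][0]
Vinv = [[V[1][1] / det, -V[0][1] / det], [-V[1][0] / det, V[0][0] / det]]
H = sum(d[i] * Vinv[i][j] * d[j] for i in range(2) for j in range(2))

# p-value: the chi-square survival function with 2 df is exp(-x/2).
p_value = math.exp(-H / 2)
print(round(H, 2), round(p_value, 3))  # → 1.05 0.592
```

Estimating C in practice is the "well-known methods" part; given it, the statistic is just this quadratic form compared against a chi-square with degrees of freedom equal to the rank of V.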