116 results for frequency estimation
Abstract:
We propose a method to estimate time-invariant cyclical DSGE models using the information provided by a variety of filters. We treat data filtered with alternative procedures as contaminated proxies of the relevant model-based quantities and estimate structural and non-structural parameters jointly using a signal extraction approach. We employ simulated data to illustrate the properties of the procedure and compare our conclusions with those obtained when just one filter is used. We revisit the role of money in the transmission of monetary business cycles.
Abstract:
A new parametric minimum distance time-domain estimator for ARFIMA processes is introduced in this paper. The proposed estimator minimizes the sum of squared correlations of residuals obtained after filtering a series through ARFIMA parameters. The estimator is easy to compute and is consistent and asymptotically normally distributed for fractionally integrated (FI) processes with an integration order d strictly greater than -0.75. Therefore, it can be applied to both stationary and non-stationary processes. Deterministic components are also allowed in the DGP. Furthermore, as a by-product, the estimation procedure provides an immediate check on the adequacy of the specified model. This is so because the criterion function, when evaluated at the estimated values, coincides with the Box-Pierce goodness-of-fit statistic. Empirical applications and Monte Carlo simulations supporting the analytical results and showing the good performance of the estimator in finite samples are also provided.
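The criterion described in this abstract can be sketched in a few lines: filter the series through the fractional-difference operator (1 - B)^d, then minimize n times the sum of squared residual autocorrelations, which is the Box-Pierce statistic of the filtered series. The sketch below is illustrative only: numpy-only, a grid search instead of a proper optimizer, a truncated (type-II) filter, and a pure FI(d) model with no ARMA part.

```python
import numpy as np

def frac_diff(x, d):
    """Truncated fractional difference (1 - B)^d via the binomial
    expansion: pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    n = len(x)
    pi = np.ones(n)
    for k in range(1, n):
        pi[k] = pi[k - 1] * (k - 1 - d) / k
    # Truncated convolution: e_t = sum_{j<=t} pi_j * x_{t-j}
    return np.array([pi[:t + 1][::-1] @ x[:t + 1] for t in range(n)])

def criterion(d, x, m):
    """n times the sum of squared residual autocorrelations up to lag m:
    the Box-Pierce statistic of the filtered series."""
    e = frac_diff(x, d)
    e = e - e.mean()
    n = len(e)
    gamma0 = e @ e / n
    rho = np.array([(e[k:] @ e[:-k]) / (n * gamma0) for k in range(1, m + 1)])
    return n * np.sum(rho ** 2)

# Simulate an FI(0.3) series by applying (1 - B)^{-0.3} to white noise,
# then recover d by grid search on the criterion.
rng = np.random.default_rng(0)
x = frac_diff(rng.standard_normal(500), -0.3)
grid = np.linspace(-0.4, 0.9, 131)
d_hat = grid[np.argmin([criterion(d, x, m=20) for d in grid])]
```

Because the criterion evaluated at the minimizer is itself the Box-Pierce statistic, it doubles as the goodness-of-fit check the abstract mentions.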
Abstract:
A national survey designed to estimate a specific population quantity is sometimes also used to estimate that quantity for a small area, such as a province. Budget constraints do not allow a greater sample size for the small area, so other means of improving estimation have to be devised. We investigate such methods and assess them by a Monte Carlo study. We explore how a complementary survey can be exploited in small area estimation. We use the context of the Spanish Labour Force Survey (EPA) and the Barometer in Spain for our study.
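One standard device in small area estimation, and a natural benchmark for methods like those the abstract investigates, is a composite estimator that shrinks the high-variance direct estimate towards a biased but stable synthetic estimate borrowed from the larger survey. A minimal Monte Carlo sketch; all rates and sample sizes are illustrative, not the EPA/Barometer design:

```python
import numpy as np

def composite(direct, synthetic, var_direct, bias2_synthetic):
    """Variance-weighted composite of a direct small-area estimate and a
    synthetic estimate; the weight minimizes MSE when the synthetic
    estimator's own variance is negligible."""
    w = bias2_synthetic / (var_direct + bias2_synthetic)
    return w * direct + (1 - w) * synthetic

rng = np.random.default_rng(5)
theta_area, theta_national = 0.20, 0.17     # small-area vs national rate
n_small, n_big, reps = 200, 20000, 2000

# Monte Carlo: direct estimates from the small area, synthetic from
# the national survey (biased for the small area, tiny variance).
direct = rng.binomial(n_small, theta_area, reps) / n_small
synthetic = rng.binomial(n_big, theta_national, reps) / n_big
v_d = theta_area * (1 - theta_area) / n_small
b2_s = (theta_area - theta_national) ** 2   # squared bias of synthetic

comp = composite(direct, synthetic, v_d, b2_s)
mse_direct = np.mean((direct - theta_area) ** 2)
mse_comp = np.mean((comp - theta_area) ** 2)
```

With these illustrative numbers the composite trades a little bias for a large variance reduction, so its MSE falls below that of the direct estimator.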
Abstract:
This paper demonstrates that, contrary to conventional wisdom, measurement error biases in panel data estimation of convergence using OLS with fixed effects are huge, not trivial. It does so by way of the "skipping estimation": taking data from every m years of the sample (where m is an integer greater than or equal to 2), as opposed to every single year. It is shown that the estimated speed of convergence from OLS with fixed effects is biased upwards by as much as 7 to 15%.
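The mechanics of the skipping idea can be illustrated with a small simulation: classical measurement error attenuates the fixed-effects AR(1) coefficient, so the implied convergence speed -log(b) is biased upwards, and sampling only every m-th year raises the signal-to-noise ratio of the regressor. The numbers below are illustrative, not the paper's data or design:

```python
import numpy as np

def within_ar1(y):
    """Fixed-effects (within) OLS of y_t on y_{t-1}: demean each
    unit's series, then pool across units and periods."""
    x, z = y[:, :-1], y[:, 1:]
    x = x - x.mean(axis=1, keepdims=True)
    z = z - z.mean(axis=1, keepdims=True)
    return (x * z).sum() / (x * x).sum()

rng = np.random.default_rng(1)
N, T, beta = 200, 200, 0.95            # true per-year persistence
alpha = rng.standard_normal(N)         # unit fixed effects
y = np.zeros((N, T))
for t in range(1, T):
    y[:, t] = alpha * (1 - beta) + beta * y[:, t - 1] + 0.1 * rng.standard_normal(N)
y_obs = y + 0.1 * rng.standard_normal((N, T))   # classical measurement error

b_annual = within_ar1(y_obs)           # attenuated: implies too-fast convergence
m = 5                                  # "skipping": keep every m-th year only
b_skip = within_ar1(y_obs[:, ::m]) ** (1 / m)   # implied per-year persistence
```

In the simulation b_skip lands closer to the true beta than b_annual does; the within estimator's separate small-T (Nickell) bias is ignored here for brevity.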
Abstract:
Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability of the models. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternate finite-population model that avoids the problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of population size and the model parameters.
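The classical problem the abstract points to, estimating binomial (N, p) with both unknown, can be shown in a few lines via the profile likelihood: for each candidate N, plug in p_hat = mean/N and score the observed counts. The notorious flatness of this profile in N is what makes market-size estimation hard. A stdlib-only sketch with illustrative numbers, not the paper's heuristic (which additionally exploits the multinomial-logit structure and offer-set variety):

```python
import math
import random

def binom_logpmf(k, n, p):
    """Log of the binomial(n, p) pmf at k, via log-gamma."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

def profile_loglik(n, counts):
    """Profile log-likelihood in n: substitute the MLE p_hat = mean/n."""
    p_hat = sum(counts) / (len(counts) * n)
    return sum(binom_logpmf(k, n, p_hat) for k in counts)

# Simulated daily sale counts from binomial(N = 50, p = 0.2); the
# market size N (50) is what we pretend not to know.
random.seed(3)
counts = [sum(random.random() < 0.2 for _ in range(50)) for _ in range(60)]

candidates = range(max(counts), 400)
n_hat = max(candidates, key=lambda n: profile_loglik(n, counts))
```

The profile is typically very flat to the right of its maximum, and when the sample variance exceeds the sample mean the maximizer runs off to infinity; this instability is the challenge the abstract warns about.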
Abstract:
This paper considers a job search model where the environment is not stationary along the unemployment spell and where jobs do not last forever. Under this circumstance, reservation wages can be lower than without separations, as in a stationary environment, but they can also be initially higher because of the non-stationarity of the model. Moreover, the time-dependence of reservation wages is stronger than with no separations. The model is estimated structurally using Spanish data for the period 1985-1996. The main finding is that, although the decrease in reservation wages is the main determinant of the change in the exit rate from unemployment for the first four months, later on the only effect comes from the job offer arrival rate, given that acceptance probabilities are roughly equal to one.
Abstract:
We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function, and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
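The label-flipping equivalence mentioned at the end can be checked directly on a toy class. For one-dimensional decision stumps (a class closed under complement, with equal-sized halves), the maximal discrepancy between the two half-sample errors equals 1 - 2 * (minimal empirical error on the sample with the second half's labels flipped). A numpy sketch on synthetic data:

```python
import numpy as np

def all_stump_preds(x):
    """Predictions on x of every decision stump: thresholds at each data
    point (plus one below the minimum), both orientations."""
    thresholds = np.concatenate(([x.min() - 1.0], np.sort(x)))
    preds = [(x >= t).astype(int) for t in thresholds]
    return np.array(preds + [1 - p for p in preds])

rng = np.random.default_rng(7)
n = 40
x = rng.standard_normal(n)
y = (x + 0.5 * rng.standard_normal(n) > 0).astype(int)

P = all_stump_preds(x)                    # one row per stump
half = n // 2
err1 = (P[:, :half] != y[:half]).mean(axis=1)
err2 = (P[:, half:] != y[half:]).mean(axis=1)
disc_direct = (err1 - err2).max()         # maximal discrepancy, brute force

z = y.copy()
z[half:] = 1 - z[half:]                   # flip the second half's labels
erm_flipped = (P != z).mean(axis=1).min() # empirical risk minimization
disc_via_erm = 1 - 2 * erm_flipped        # equals disc_direct
```

So the penalty is computable with whatever ERM routine one already has, which is the practical appeal the abstract highlights.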
Abstract:
Precise estimation of propagation parameters in precipitation media is of interest to improve the performance of communications systems and in remote sensing applications. In this paper, we present maximum-likelihood estimators of specific attenuation and specific differential phase in rain. The model used for obtaining the cited estimators assumes coherent propagation, reflection symmetry of the medium, and Gaussian statistics of the scattering matrix measurements. No assumptions about the microphysical properties of the medium are needed. The performance of the estimators is evaluated through simulated data. Results show negligible estimator bias and variances close to Cramer–Rao bounds.
Abstract:
This work proposes novel network analysis techniques for multivariate time series. We define the network of a multivariate time series as a graph where vertices denote the components of the process and edges denote non-zero long-run partial correlations. We then introduce a two-step LASSO procedure, called NETS, to estimate high-dimensional sparse long-run partial correlation networks. This approach is based on a VAR approximation of the process and allows us to decompose the long-run linkages into the contributions of the dynamic and contemporaneous dependence relations of the system. The large sample properties of the estimator are analysed and we establish conditions for consistent selection and estimation of the non-zero long-run partial correlations. The methodology is illustrated with an application to a panel of U.S. blue chips.
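The quantity being estimated can be illustrated on a low-dimensional simulated VAR(1), replacing the paper's two LASSO steps with plain OLS (feasible here only because the dimension is far smaller than the sample; NETS exists precisely for the high-dimensional case). Long-run partial correlations come from inverting the VAR-implied long-run covariance; every parameter below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(11)
p, T = 4, 2000
A = np.diag([0.5, 0.4, 0.3, 0.2])
A[0, 1] = 0.3                              # sparse VAR(1) coefficient matrix
X = np.zeros((T, p))
for t in range(1, T):
    X[t] = X[t - 1] @ A.T + rng.standard_normal(p)

# Step 1 (OLS stand-in for the paper's LASSO step): fit the VAR(1)
A_hat = np.linalg.lstsq(X[:-1], X[1:], rcond=None)[0].T
U = X[1:] - X[:-1] @ A_hat.T               # residuals: contemporaneous part

# Step 2: long-run covariance implied by the VAR,
# Sigma_LR = (I - A)^{-1} Sigma_u (I - A)^{-T}, then invert it
Sigma_u = U.T @ U / len(U)
M = np.linalg.inv(np.eye(p) - A_hat)
K = np.linalg.inv(M @ Sigma_u @ M.T)       # long-run concentration matrix

# Long-run partial correlations and the implied network
D = np.sqrt(np.diag(K))
rho = -K / np.outer(D, D)
edges = (np.abs(rho) > 0.1) & ~np.eye(p, dtype=bool)
```

The 0.1 threshold crudely stands in for the selection that the LASSO performs; since the only off-diagonal coupling in A links series 0 and 1, that edge should dominate the recovered network.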
Abstract:
This editorial examines the actions undertaken in issue 5 and announces two important changes for issues 6 and 7, namely frequency and languages. Furthermore, the use of some bibliometric indicators, such as the impact factor and the rejection rate, is critically analyzed.
Abstract:
The relief of the seafloor is an important source of data for many scientists. In this paper we present an optical system for underwater 3D reconstruction. The system is formed by three cameras that capture images synchronously at a constant frame rate. We use the images taken by these cameras to compute dense 3D reconstructions. We use Bundle Adjustment to estimate the motion of the trinocular rig. Given the path followed by the system, we obtain a dense map of the observed scene by registering the different dense local reconstructions into a single, larger one.
Abstract:
We report millimetre-wave continuum observations of the X-ray binaries Cygnus X-3, SS 433, LSI+61 303, Cygnus X-1 and GRS 1915+105. The observations were carried out with the IRAM 30 m antenna at 250 GHz (1.25 mm) from 1998 March 14 to March 20. These millimetre measurements are complemented with centimetre observations from the Ryle Telescope, at 15 GHz (2.0 cm), and from the Green Bank Interferometer at 2.25 and 8.3 GHz (13 and 3.6 cm). Both Cygnus X-3 and SS 433 underwent moderate flaring events during our observations, whose main spectral evolution properties are described and interpreted. A significant spectral steepening was observed in both sources during the flare decay, which is likely to be caused by adiabatic expansion, inverse Compton and synchrotron losses. Finally, we also report 250 GHz upper limits for three additional undetected X-ray binary stars: LSI+65 010, LSI+61 235 and X Per.
Abstract:
Every year, flash floods cause economic losses and major problems for undertaking daily activity in the Catalonia region (NE Spain). Sometimes catastrophic damage and casualties occur. When a long-term analysis of floods is undertaken, a question arises regarding the changing role of vulnerability and hazard in risk evolution. This paper sets out to provide some information to deal with this question, on the basis of an analysis of all the floods that have occurred in Barcelona county (Catalonia) since the 14th century, as well as the flooded area, urban evolution, impacts and the weather conditions for the most severe events. With this objective, the identification and classification of historical floods, and the characterisation of flash floods among these, have been undertaken. Besides this, the main meteorological factors associated with recent flash floods in this city and neighbouring regions are well known. In addition, rainfall trends that could explain the historical evolution of flood hazard occurrence in this city have been analysed. Finally, the influence of urban development on the vulnerability to floods has been identified. Barcelona city has been selected thanks to its long continuous data series (daily rainfall data since 1854; one of the longest rainfall-rate series in Europe, since 1921) and for the accurate historical archive information that is available (since the Roman Empire for the urban evolution). The evolution of flood occurrence shows the existence of oscillations in the earlier and later modern-age periods that can be attributed to climatic variability, evolution of the perception threshold and changes in vulnerability. A great increase in vulnerability can be assumed for the period 1850–1900.
The analysis of the time evolution of the Barcelona rainfall series (1854–2000) shows that no trend exists, although, due to changes in urban planning, the impact of flash floods has altered over this time. The number of catastrophic flash floods has diminished, although the number of extraordinary ones has increased.
Measurement of cell microrheology by magnetic twisting cytometry with frequency domain demodulation.