983 results for Calibration estimators
Abstract:
In the fixed design regression model, additional weights are considered for the Nadaraya-Watson and Gasser-Müller kernel estimators. We study their asymptotic behavior and the relationships between the new and classical estimators. For a simple family of weights, and considering the IMSE as the global loss criterion, we show some possible theoretical advantages. An empirical study illustrates the performance of the weighted estimators in finite samples.
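For reference, the classical Nadaraya-Watson estimator and a weighted variant can be written as follows (notation assumed here for illustration; the paper's specific weight family is not reproduced):

$$\hat m_{\mathrm{NW}}(x)=\frac{\sum_{i=1}^{n}K\left(\frac{x-x_i}{h}\right)Y_i}{\sum_{i=1}^{n}K\left(\frac{x-x_i}{h}\right)},\qquad \hat m_{w}(x)=\frac{\sum_{i=1}^{n}w_i\,K\left(\frac{x-x_i}{h}\right)Y_i}{\sum_{i=1}^{n}w_i\,K\left(\frac{x-x_i}{h}\right)},$$

with fits compared globally through $\mathrm{IMSE}(\hat m)=\int \mathrm{E}\left[(\hat m(x)-m(x))^2\right]dx$.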
Abstract:
The present work aims to describe the faunal composition of drosophilids in forest areas of southern Brazil. In addition, estimation of species richness for this fauna is briefly discussed. Sampling was carried out in three well-preserved areas of the Atlantic Rain Forest in the State of Santa Catarina. In this study, 136,931 specimens were captured and 96.6% of them were identified to the species level. The observed species richness (153 species) is the largest that has been registered in faunal inventories conducted in Brazil. Sixty-three of the captured species did not fit the available descriptions, and we believe that most of them are undescribed species. The incidence-based estimators tended to give rise to the largest richness estimates, while the abundance-based ones gave rise to the smallest. These estimators suggest the presence of 172.28 to 220.65 species in the studied area; based on these values, from 69.35 to 88.81% of the expected species richness was sampled. We suggest that the large richness recorded in this study is a consequence of the large sampling effort, the capture method, recent advances in the taxonomy of drosophilids, the high preservation level and large extension of the sampled fragment, and the high complexity of the Atlantic Rain Forest. Finally, our data set suggests that the use of richness estimators for drosophilid assemblages is useful but requires caution.
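The abstract does not name the estimators employed; purely as an illustration, here is a minimal sketch of Chao1, a standard abundance-based richness estimator of the general kind discussed (bias-corrected form):

```python
from typing import Iterable

def chao1(abundances: Iterable[int]) -> float:
    """Bias-corrected Chao1 lower-bound estimate of species richness
    from per-species abundance counts (zeros are ignored)."""
    counts = [a for a in abundances if a > 0]
    s_obs = len(counts)                    # observed species richness
    f1 = sum(1 for a in counts if a == 1)  # singletons
    f2 = sum(1 for a in counts if a == 2)  # doubletons
    return s_obs + f1 * (f1 - 1) / (2.0 * (f2 + 1))

# Toy community: 6 observed species, 2 singletons, 1 doubleton.
print(chao1([12, 7, 3, 1, 1, 2]))  # 6 + 2*1/(2*2) = 6.5
```

Incidence-based analogues (e.g. Chao2, ICE) replace abundances with presence/absence counts across sampling units.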
Abstract:
This paper illustrates the philosophy which forms the basis of calibration exercises in general equilibrium macroeconomic models and the details of the procedure, the advantages and the disadvantages of the approach, with particular reference to the issue of testing "false" economic models. We provide an overview of the most recent simulation-based approaches to the testing problem and compare them to standard econometric methods used to test the fit of non-linear dynamic general equilibrium models. We illustrate how simulation-based techniques can be used to formally evaluate the fit of a calibrated model to the data and obtain ideas on how to improve the model design, using a standard problem in the international real business cycle literature: whether a model with complete financial markets and no restrictions on capital mobility is able to reproduce the second-order properties of aggregate saving and aggregate investment in an open economy.
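A minimal sketch of the simulation-based evaluation idea: simulate the calibrated model repeatedly, build the distribution of a second-order statistic (here the saving-investment correlation), and locate the data value in that distribution. The model below is a placeholder AR(1) system, not the paper's economy, and the data moment is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_model(n=120):
    """Placeholder for a calibrated model: returns simulated saving and
    investment series (toy AR(1) system, not the paper's economy)."""
    e = rng.normal(size=(2, n))
    s = np.zeros(n)
    inv = np.zeros(n)
    for t in range(1, n):
        s[t] = 0.9 * s[t - 1] + e[0, t]
        inv[t] = 0.9 * inv[t - 1] + 0.5 * e[0, t] + e[1, t]
    return s, inv

# Simulated distribution of the saving-investment correlation.
sims = np.array([np.corrcoef(*simulate_model())[0, 1] for _ in range(999)])
data_corr = 0.6  # hypothetical value measured in the data
# Simulation-based tail probability: how extreme is the data moment?
p = np.mean(np.abs(sims - sims.mean()) >= abs(data_corr - sims.mean()))
print(f"simulated mean corr = {sims.mean():.2f}, p-value = {p:.2f}")
```

The tail probability measures how extreme the data moment is relative to the model's simulated distribution, which is the spirit of the simulation-based tests surveyed.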
Abstract:
A procedure using a Chirobiotic V column is presented which allows separation of the enantiomers of citalopram and its two N-demethylated metabolites, and of the internal standard, alprenolol, in human plasma. Citalopram, demethylcitalopram and didemethylcitalopram, as well as the internal standard, were recovered from plasma by liquid-liquid extraction. The limits of quantification were found to be 5 ng/ml for each enantiomer of citalopram and demethylcitalopram, and 7.5 ng/ml for each enantiomer of didemethylcitalopram. Inter- and intra-day coefficients of variation ranged from 2.4% to 8.6% for S- and R-citalopram, from 2.9% to 7.4% for S- and R-demethylcitalopram, and from 5.6% to 12.4% for S- and R-didemethylcitalopram. No interference was observed from endogenous compounds following the extraction of plasma samples from 10 different patients treated with citalopram. This method allows accurate quantification of each enantiomer and is therefore well suited for pharmacokinetic and drug interaction investigations. It replaces a previously described highly sensitive and selective high-performance liquid chromatography procedure using an acetylated β-cyclobond column which, because of manufacturing problems, is no longer usable for the separation of the enantiomers of citalopram and its demethylated metabolites.
Abstract:
Statistical computing when input/output is driven by a Graphical User Interface is considered. A proposal is made for automatic control of the computational flow to ensure that only strictly required computations are actually carried out. The computational flow is modeled by a directed graph, for implementation in any object-oriented programming language with symbolic manipulation capabilities. A complete implementation example is presented to compute and display frequency-based piecewise linear density estimators such as histograms or frequency polygons.
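A minimal sketch of the idea in Python (the paper's actual implementation and class names are not reproduced): each computation is a node in a directed graph, changes propagate "dirty" flags downstream, and a value is recomputed only when read while stale:

```python
class Node:
    """One computation in a directed dependency graph.  A node is
    recomputed only when read while marked dirty, so GUI-driven
    requests touch the minimum set of computations."""

    def __init__(self, func, *parents):
        self.func = func
        self.parents = list(parents)
        self.children = []
        for p in self.parents:
            p.children.append(self)
        self.value = None
        self.dirty = True

    def invalidate(self):
        # Mark this node and everything downstream as stale.
        self.dirty = True
        for c in self.children:
            c.invalidate()

    def get(self):
        # Recompute only if stale; otherwise reuse the cached value.
        if self.dirty:
            self.value = self.func(*(p.get() for p in self.parents))
            self.dirty = False
        return self.value

# data -> frequency counts (the first step of a histogram).
data = Node(lambda: [1, 2, 2, 3, 3, 3])
counts = Node(lambda xs: {v: xs.count(v) for v in set(xs)}, data)
print(counts.get())
```

Reading counts.get() a second time recomputes nothing; after the underlying data changes (e.g. data.func is replaced), data.invalidate() marks the downstream nodes stale, and only the nodes actually read are recomputed.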
Abstract:
A previous study has shown the possibility of identifying methane (CH4) using headspace-GC-MS and quantifying it with a stable isotope as internal standard. The main drawback of the GC-MS methods discussed in the literature for CH4 measurement is the absence of a specific internal standard necessary to perform quantification. It therefore becomes essential to develop a safer method that limits the manipulation of gaseous CH4 and precisely controls the injected amount of gas for spiking and calibration, by comparison with external calibration. To avoid the manipulation of a stable isotope-labeled gas, we chose to generate a labeled gas as internal standard in a vial, on the basis of the formation of CH4 by the reaction of the Grignard reagent methylmagnesium chloride with deuterated water. This method allows precise measurement of CH4 concentrations in gaseous samples as well as in solid or liquid samples after a thermodesorption step in a headspace vial. A full accuracy profile validation of this method is then presented.
Abstract:
This work is part of a project studying the performance of model-based estimators in a small area context. We have chosen a simple statistical application in which we estimate the growth rate of occupation for several regions of Spain. We compare three estimators: the direct one, based on straightforward results from the survey (which is unbiased), and a third one which is based on a statistical model and minimizes the mean square error.
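The comparison rests on the standard decomposition (general background, not specific to this paper):

$$\mathrm{MSE}(\hat\theta)=\mathrm{Var}(\hat\theta)+\mathrm{Bias}(\hat\theta)^2,$$

so the unbiased direct estimator can still be dominated by a biased model-based estimator whose variance is sufficiently smaller.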
Abstract:
Accurate characterization of the spatial distribution of hydrological properties in heterogeneous aquifers at a range of scales is a key prerequisite for reliable modeling of subsurface contaminant transport, and is essential for designing effective and cost-efficient groundwater management and remediation strategies. To this end, high-resolution geophysical methods have shown significant potential to bridge a critical gap in subsurface resolution and coverage between traditional hydrological measurement techniques such as borehole log/core analyses and tracer or pumping tests. An important and still largely unresolved issue, however, is how to best quantitatively integrate geophysical data into a characterization study in order to estimate the spatial distribution of one or more pertinent hydrological parameters, thus improving hydrological predictions. Recognizing the importance of this issue, the aim of the research presented in this thesis was first to develop a strategy for the assimilation of several types of hydrogeophysical data having varying degrees of resolution, subsurface coverage, and sensitivity to the hydrological parameter of interest. In this regard, a novel simulated annealing (SA)-based conditional simulation approach was developed and then tested in its ability to generate realizations of porosity given crosshole ground-penetrating radar (GPR) and neutron porosity log data. This was done successfully for both synthetic and field data sets. A subsequent issue that needed to be addressed involved assessing the potential benefits and implications of the resulting porosity realizations in terms of groundwater flow and contaminant transport. This was investigated synthetically, assuming first that the relationship between porosity and hydraulic conductivity was well-defined. Then, the relationship was itself investigated in the context of a calibration procedure using hypothetical tracer test data. Essentially, the relationship best predicting the observed tracer test measurements was determined given the geophysically derived porosity structure. Both of these investigations showed that the SA-based approach, in general, allows much more reliable hydrological predictions than the other, more elementary techniques considered. Further, the developed calibration procedure was seen to be very effective, even at the scale of tomographic resolution, for predictions of transport. This also held true at locations within the aquifer where only geophysical data were available. This is significant because the acquisition of hydrological tracer test measurements is clearly more complicated and expensive than the acquisition of geophysical measurements. Although the above methodologies were tested using porosity logs and GPR data, the findings are expected to remain valid for a large number of pertinent combinations of geophysical and borehole log data of comparable resolution and sensitivity to the hydrological target parameter. Moreover, the obtained results give us confidence for future developments in integration methodologies for geophysical and hydrological data to improve the 3-D estimation of hydrological properties.
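As a hedged illustration of the general SA technique (the thesis's algorithm, data, and objective are not reproduced), the sketch below rearranges a pool of porosity values by random swaps, which preserve the histogram, accepting each swap by a Metropolis rule under a cooling temperature while leaving conditioning points untouched; the lag-1 correlation objective stands in for a full variogram match:

```python
import numpy as np

rng = np.random.default_rng(1)

def sa_conditional_sim(values, fixed_idx, target_rho, n_iter=20000, t0=1.0):
    """Toy SA-based conditional simulation: rearrange a pool of porosity
    values by swapping so the field matches a target lag-1 correlation,
    while conditioning points (fixed_idx) stay untouched."""
    x = values.copy()
    free = [i for i in range(len(x)) if i not in set(fixed_idx)]

    def objective(v):
        return (np.corrcoef(v[:-1], v[1:])[0, 1] - target_rho) ** 2

    obj = objective(x)
    for k in range(n_iter):
        t = t0 * (1.0 - k / n_iter) + 1e-9        # linear cooling schedule
        i, j = rng.choice(free, size=2, replace=False)
        x[i], x[j] = x[j], x[i]                   # propose a swap
        new = objective(x)
        if new < obj or rng.random() < np.exp((obj - new) / t):
            obj = new                             # accept
        else:
            x[i], x[j] = x[j], x[i]               # reject: undo the swap
    return x

pool = rng.uniform(0.1, 0.4, size=200)            # pool of porosity values
sim = sa_conditional_sim(pool, fixed_idx=[0, 50, 100], target_rho=0.7)
print(np.corrcoef(sim[:-1], sim[1:])[0, 1])       # should approach 0.7
```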
Abstract:
In this paper I explore the issue of nonlinearity (both in the data generation process and in the functional form that establishes the relationship between the parameters and the data) as regards the poor performance of the Generalized Method of Moments (GMM) in small samples. To this purpose I build a sequence of models, starting with a simple linear model and enlarging it progressively until I approximate a standard (nonlinear) neoclassical growth model. I then use simulation techniques to find the small-sample distribution of the GMM estimators in each of the models.
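As a stylized stand-in for the paper's exercise (its actual sequence of models is not reproduced), the sketch below traces the small-sample distribution of a just-identified GMM/IV estimator by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(2)

def gmm_iv(n):
    """One small-sample draw of the just-identified GMM/IV estimator
    in y = theta*x + e with instrument z (theta = 1)."""
    z = rng.normal(size=n)
    x = 0.5 * z + rng.normal(size=n)   # regressor correlated with instrument
    e = rng.normal(size=n)
    y = 1.0 * x + e
    return (z @ y) / (z @ x)           # solves the moment E[z(y - theta*x)] = 0

# Monte Carlo small-sample distribution of the estimator for n = 25.
draws = np.array([gmm_iv(25) for _ in range(5000)])
print(f"mean = {draws.mean():.3f}, sd = {draws.std():.3f}")  # vs theta = 1
```

The dispersion and bias visible at n = 25 illustrate the kind of small-sample behavior being studied.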
Abstract:
Existing models of equilibrium unemployment with endogenous labor market participation are complex, generate procyclical unemployment rates and cannot match unemployment variability relative to GDP. We embed endogenous participation in a simple, tractable job market matching model, show analytically how variations in the participation rate are driven by the cross-sectional density of home productivity near the participation threshold, and how this density translates into an extensive-margin labor supply elasticity. A calibration of the model to macro data not only matches employment and participation variabilities but also generates strongly countercyclical unemployment rates. With some wage rigidity the model also matches unemployment variations well. Furthermore, the labor supply elasticity implied by our calibration is consistent with microeconometric evidence for the US.
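The mechanism can be illustrated in generic notation (an assumption of this note, not the paper's): if agents participate whenever home productivity falls below a threshold $x^*$ with cross-sectional cdf $F$ and density $f$, the participation rate is $p=F(x^*)$ and its elasticity with respect to the threshold is

$$\varepsilon=\frac{d\log F(x^*)}{d\log x^*}=\frac{x^*f(x^*)}{F(x^*)},$$

so the density of home productivity near the threshold directly governs the extensive-margin response.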
Abstract:
A class of composite estimators of small area quantities that exploit spatial (distance-related) similarity is derived. It is based on a distribution-free model for the areas, but the estimators are aimed to have optimal design-based properties. Composition is applied also to estimate some of the global parameters on which the small area estimators depend. It is shown that the commonly adopted assumption of random effects is not necessary for exploiting the similarity of the districts (borrowing strength across the districts). The methods are applied in the estimation of the mean household sizes and the proportions of single-member households in the counties (comarcas) of Catalonia. The simplest version of the estimators is more efficient than the established alternatives, even though the extent of spatial similarity is quite modest.
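In generic notation (assumed here, not taken from the paper), a composite estimator for area $d$ is a convex combination $\hat\theta_d^{C}=\lambda\,\hat\theta_d^{\mathrm{dir}}+(1-\lambda)\,\tilde\theta_d$ of a direct and an indirect component. If the two components have independent errors with mean squared errors $M_1$ and $M_2$, then

$$\mathrm{MSE}(\lambda)=\lambda^2 M_1+(1-\lambda)^2 M_2$$

is minimized at $\lambda^*=M_2/(M_1+M_2)$, which is the design-based logic behind composition.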
Abstract:
A new integrated assessment model is presented for the north-central stock of the Peruvian anchoveta, which makes it possible to reconstruct and track the length structure of the stock from an age-based model. The model was calibrated using acoustic biomass estimates and size structures from scientific cruises and from fishery landings. Calibration used an evolutionary algorithm with different fitness functions for each calibrated variable (biomasses and catches). Monthly estimates of total biomass, spawning biomass, recruitment, and fishing mortality obtained by the integrated assessment model are presented for the period 1964-2008. Three qualitatively distinct periods were found in the dynamics of the anchoveta, 1961-1971, 1971-1991, and 1991 to the present, distinguished both by mean annual biomasses and by observed recruitment levels.
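As an illustration of the calibration machinery only (the assessment model itself is not reproduced; the toy model and all names below are assumptions of the sketch), a minimal (mu + lambda) evolutionary loop recovering a single growth parameter from noisy observations:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy calibration target: an observed "biomass" series generated by a
# model with unknown growth rate r = 0.05 (stand-in, not the stock model).
def model(r, n=20):
    return 100.0 * np.exp(r * np.arange(n))

obs = model(0.05) * rng.lognormal(0.0, 0.02, size=20)

def fitness(r):
    """Higher is better: negative squared log-error against observations."""
    return -np.mean((np.log(model(r)) - np.log(obs)) ** 2)

# (mu + lambda) evolutionary loop: mutate, evaluate, keep the best.
pop = rng.uniform(0.0, 0.2, size=30)
for gen in range(200):
    children = pop + rng.normal(0.0, 0.01, size=pop.size)     # mutation
    both = np.concatenate([pop, children])
    pop = both[np.argsort([-fitness(r) for r in both])][:30]  # selection
print(f"calibrated r = {pop[0]:.4f} (true value 0.05)")
```

A calibration with several target variables, as in the abstract, would use one fitness term per variable (or a vector-valued fitness in a multi-objective scheme).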
Abstract:
Several estimators of the expectation, median and mode of the lognormal distribution are derived. They aim to be approximately unbiased, efficient, or have a minimax property in the class of estimators we introduce. The small-sample properties of these estimators are assessed by simulations and, when possible, analytically. Some of these estimators of the expectation are far more efficient than the maximum likelihood or the minimum-variance unbiased estimator, even for substantial sample sizes.
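For background (standard lognormal facts, not the paper's derivations): if $\log X\sim N(\mu,\sigma^2)$, then

$$\mathrm{E}[X]=e^{\mu+\sigma^2/2},\qquad \mathrm{median}(X)=e^{\mu},\qquad \mathrm{mode}(X)=e^{\mu-\sigma^2},$$

and the maximum likelihood plug-in $e^{\hat\mu+\hat\sigma^2/2}$, with $\hat\mu$ and $\hat\sigma^2$ computed from the log data, is biased in small samples, which is the kind of deficiency alternative estimators aim to correct.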
Abstract:
We study the statistical properties of three estimation methods for a model of learning that is often fitted to experimental data: quadratic deviation measures without unobserved heterogeneity, and maximum likelihood with and without unobserved heterogeneity. After discussing identification issues, we show that the estimators are consistent and provide their asymptotic distribution. Using Monte Carlo simulations, we show that ignoring unobserved heterogeneity can lead to seriously biased estimates in samples of the length typical of actual experiments. Better small-sample properties are obtained if unobserved heterogeneity is introduced. That is, rather than estimating the parameters for each individual, the individual parameters are treated as random variables, and the distribution of those random variables is estimated.
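The heterogeneity treatment can be sketched as maximum simulated likelihood (a generic illustration with an invented binary-choice model, not the paper's learning model): the individual parameter is drawn from a distribution whose parameters are estimated after integrating each subject's likelihood over fixed simulation draws:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Invented data: 100 subjects, 20 binary choices each; each subject's
# log-odds theta_i ~ N(0.5, 0.8^2) is the unobserved heterogeneity.
theta = rng.normal(0.5, 0.8, size=100)
y = rng.random((100, 20)) < 1 / (1 + np.exp(-theta[:, None]))

z = rng.standard_normal(200)  # simulation draws, held fixed during optimization

def neg_loglik(params):
    """Negative simulated log-likelihood: average each subject's
    conditional likelihood over draws of the individual parameter."""
    mu, log_s = params
    d = mu + np.exp(log_s) * z          # draws of the individual parameter
    p = 1 / (1 + np.exp(-d))            # choice probability per draw
    k = y.sum(axis=1)                   # successes per subject
    n = y.shape[1]
    like = (p[None, :] ** k[:, None] * (1 - p[None, :]) ** (n - k[:, None])).mean(axis=1)
    return -np.log(like).sum()

res = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x[0], np.exp(res.x[1]))       # estimates of mu and s
```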
Abstract:
Reductions in firing costs are often advocated as a way of increasing the dynamism of labour markets in both developed and less developed countries. Evidence from Europe and the U.S. on the impact of firing costs has, however, been mixed. Moreover, legislative changes in both Europe and the U.S. have been limited. This paper, instead, examines the impact of the Colombian Labour Market Reform of 1990, which substantially reduced dismissal costs. I estimate the incidence of a reduction in firing costs on worker turnover by exploiting the temporal change in the Colombian labour legislation as well as the variability in coverage between formal and informal sector workers. Using a grouping estimator to control for common aggregate shocks and selection, I find that the exit hazard rates into and out of unemployment increased after the reform by over 1% for formal workers (covered by the legislation) relative to informal workers (uncovered). The increase in the hazards implies a net decrease in unemployment of a third of a percentage point, which accounts for about one quarter of the fall in unemployment during the period of study.
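The contrast behind the formal/informal comparison can be written as the generic difference-in-differences (an illustration; the paper's grouping estimator additionally controls for selection):

$$\hat\delta=\left(\bar y^{\mathrm{formal}}_{\mathrm{post}}-\bar y^{\mathrm{formal}}_{\mathrm{pre}}\right)-\left(\bar y^{\mathrm{informal}}_{\mathrm{post}}-\bar y^{\mathrm{informal}}_{\mathrm{pre}}\right),$$

where $y$ is the outcome of interest (here an exit hazard) and the informal group absorbs common aggregate shocks.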