61 results for Fantôme de calibration


Relevance:

10.00%

Publisher:

Abstract:

In this paper we propose a simple and general model for computing the Ramsey optimal inflation tax, which includes several models from the previous literature as special cases. We show that it cannot be claimed on theoretical grounds that the Friedman rule is always optimal (or always non-optimal). Whether the Friedman rule is optimal depends on conditions related to the shape of various relevant functions. One contribution of this paper is to relate these conditions to measurable variables such as the interest rate or the consumption elasticity of money demand. We find that it tends to be optimal to tax money when there are economies of scale in the demand for money (the scale elasticity is smaller than one) and/or when money is required for the payment of consumption or wage taxes. We find that it tends to be optimal to tax money more heavily when the interest elasticity of money demand is small. We present empirical evidence on the parameters that determine the optimal inflation tax. Calibrating the model to a variety of empirical studies yields an optimal nominal interest rate of less than 1% per year, although that finding is sensitive to the calibration.
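The conditions the paper emphasizes hinge on measurable elasticities of money demand. As a minimal illustration of how such elasticities are obtained, the Python sketch below (with entirely hypothetical series standing in for the empirical studies the paper surveys) estimates the consumption and interest elasticities by ordinary least squares on a log-linear money demand equation.

```python
import numpy as np

# Hypothetical series; in practice these would come from empirical studies
# like those the paper draws on.
rng = np.random.default_rng(0)
n = 200
log_c = np.linspace(0.0, 1.0, n) + 0.1 * rng.standard_normal(n)            # log consumption
log_i = np.linspace(-3.5, -2.5, n)[::-1] + 0.1 * rng.standard_normal(n)    # log nominal rate
log_m = 0.5 + 0.8 * log_c - 0.2 * log_i + 0.05 * rng.standard_normal(n)    # log real balances

# OLS on ln(M/P) = a + eps_c * ln(c) + eps_i * ln(i)
X = np.column_stack([np.ones(n), log_c, log_i])
beta, *_ = np.linalg.lstsq(X, log_m, rcond=None)
print(f"consumption elasticity ~ {beta[1]:.2f} (scale economies if < 1)")
print(f"interest elasticity    ~ {beta[2]:.2f}")
```

A consumption elasticity below one would indicate the economies of scale in money demand under which, by the paper's results, taxing money tends to be optimal.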

Relevance:

10.00%

Publisher:

Abstract:

Why was England first? And why Europe? We present a probabilistic model that builds on the big-push models of Murphy, Shleifer and Vishny (1989), combined with hierarchical preferences. The interaction of exogenous demographic factors (in particular the English low-pressure variant of the European marriage pattern) and redistributive institutions such as the old Poor Law combined to make an Industrial Revolution more likely. Essentially, industrialization is the result of having a critical mass of consumers that is rich enough to afford (potentially) mass-produced goods. Our model is then calibrated to match the main characteristics of the English economy in 1750 and the observed transition until 1850. This allows us to address explicitly one of the key features of the British Industrial Revolution unearthed by economic historians over the last three decades: the slowness of productivity and output change. In our calibration, we find that the probability of Britain industrializing is five times larger than France's. Contrary to the recent argument by Pomeranz, China in the 18th century had essentially no chance to industrialize at all. This difference is decomposed into a demographic and a policy component, with the former being far more important than the latter.

Relevance:

10.00%

Publisher:

Abstract:

This paper illustrates the philosophy which forms the basis of calibration exercises in general equilibrium macroeconomic models and the details of the procedure, the advantages and the disadvantages of the approach, with particular reference to the issue of testing "false" economic models. We provide an overview of the most recent simulation-based approaches to the testing problem and compare them to standard econometric methods used to test the fit of non-linear dynamic general equilibrium models. We illustrate how simulation-based techniques can be used to formally evaluate the fit of a calibrated model to the data and obtain ideas on how to improve the model design, using a standard problem in the international real business cycle literature, i.e. whether a model with complete financial markets and no restrictions on capital mobility is able to reproduce the second-order properties of aggregate saving and aggregate investment in an open economy.
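As a schematic of the simulation-based evaluation the paper describes, the Python sketch below simulates a calibrated model many times, computes a second-order property (here the saving-investment correlation), and locates the data moment in the simulated distribution. The driving process, the loadings, and the data moment are illustrative assumptions, not the paper's actual model or numbers.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_model(T=150):
    """Stand-in for one simulation of a calibrated open-economy RBC model:
    returns aggregate saving and investment series.  The AR(1) driver and
    the loadings are illustrative only."""
    z = np.zeros(T)
    for t in range(1, T):
        z[t] = 0.9 * z[t - 1] + 0.01 * rng.standard_normal()
    saving = z + 0.005 * rng.standard_normal(T)
    investment = 0.6 * z + 0.008 * rng.standard_normal(T)  # weak S-I link under complete markets
    return saving, investment

data_corr = 0.7   # illustrative stand-in for the measured saving-investment correlation

# Distribution of the moment implied by the calibrated model
sim_corrs = np.array([np.corrcoef(*simulate_model())[0, 1] for _ in range(500)])
p_value = np.mean(sim_corrs >= data_corr)
print(f"simulated corr(S, I): mean={sim_corrs.mean():.2f}, p(model >= data)={p_value:.3f}")
```

A small p-value here would formalize the familiar verdict that complete-markets models understate the observed saving-investment comovement.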

Relevance:

10.00%

Publisher:

Abstract:

Existing models of equilibrium unemployment with endogenous labor market participation are complex, generate procyclical unemployment rates and cannot match unemployment variability relative to GDP. We embed endogenous participation in a simple, tractable job market matching model, show analytically how variations in the participation rate are driven by the cross-sectional density of home productivity near the participation threshold, and how this density translates into an extensive-margin labor supply elasticity. A calibration of the model to macro data not only matches employment and participation variabilities but also generates strongly countercyclical unemployment rates. With some wage rigidity the model also matches unemployment variations well. Furthermore, the labor supply elasticity implied by our calibration is consistent with microeconometric evidence for the US.
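A toy numerical illustration of the analytical point: with an assumed lognormal cross-sectional distribution of home productivity (the distribution and all numbers are hypothetical), the participation rate is the mass below the threshold, and the extensive-margin elasticity is governed by the density sitting at the threshold.

```python
import numpy as np
from scipy.stats import lognorm

# Hypothetical cross-sectional distribution of home productivity
sigma, scale = 0.6, 1.0
dist = lognorm(s=sigma, scale=scale)

x_star = 1.2   # participation threshold: work in the market if home productivity < x_star

# Participation rate and the density mass sitting near the threshold
part_rate = dist.cdf(x_star)
density_at_threshold = dist.pdf(x_star)

# Extensive-margin elasticity of participation w.r.t. the threshold:
# d ln F(x*) / d ln x* = x* f(x*) / F(x*)
elasticity = x_star * density_at_threshold / part_rate
print(f"participation rate {part_rate:.2f}, extensive-margin elasticity {elasticity:.2f}")
```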

Relevance:

10.00%

Publisher:

Abstract:

How did Europe escape the "Iron Law of Wages"? We construct a simple Malthusian model with two sectors and multiple steady states, and use it to explain why European per capita incomes and urbanization rates increased during the period 1350-1700. Productivity growth can only explain a small fraction of the rise in output per capita. Population dynamics (changes in the birth and death schedules) were far more important determinants of steady states. We show how a major shock to population can trigger a transition to a new steady state with higher per capita income. The Black Death was such a shock, raising wages substantially. Because of Engel's Law, demand for urban products increased, and urban centers grew in size. European cities were unhealthy, and rising urbanization pushed up aggregate death rates. This effect was reinforced by diseases spread through war, financed by higher tax revenues. In addition, rising trade also spread diseases. In this way higher wages themselves reduced population pressure. We show in a calibration exercise that our model can account for the sustained rise in European urbanization as well as permanently higher per capita incomes in 1700, without technological change. Wars contributed importantly to the "Rise of Europe", even if they had negative short-run effects. We thus trace Europe's precocious rise to economic riches to interactions of the plague shock with the belligerent political environment and the nature of cities.
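A heavily stylized caricature of the mechanism, not the authors' calibrated two-sector model: wages fall with population, urbanization ratchets up when wages are high, and unhealthy cities keep death rates elevated, so a plague-sized mortality shock can move the economy to a steady state with permanently higher wages. All functional forms and numbers below are assumptions chosen only to make the dynamics visible.

```python
def simulate(L0, u0, T=1500):
    """Toy Malthusian dynamics with an urbanization ratchet."""
    L, u = L0, u0
    for _ in range(T):
        w = 2.0 * L ** -0.4                   # wage: diminishing returns to labor
        u = max(u, 3.0 * max(w - 0.7, 0.0))   # urbanization ratchets up with peak wages
        births = 0.03 * min(w, 1.5)           # births rise with wages, then saturate
        deaths = 0.02 + 0.015 * min(u, 1.0)   # cities are unhealthy
        L *= 1.0 + births - deaths
    return L, 2.0 * L ** -0.4, u

L1, w1, u1 = simulate(L0=16.0, u0=0.0)        # pre-plague Malthusian steady state
L2, w2, u2 = simulate(L0=L1 * 0.55, u0=u1)    # Black-Death-sized mortality shock
print(f"before: wage {w1:.2f}, urbanization index {u1:.2f}")
print(f"after : wage {w2:.2f}, urbanization index {u2:.2f}")
```

In this toy run the shock pushes wages above the urbanization trigger; the permanently higher urban death burden then prevents population from reverting, leaving wages roughly 75% above the old steady state without any technological change.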

Relevance:

10.00%

Publisher:

Abstract:

The spatial, spectral, and temporal resolutions of remote sensing images, acquired over a reasonably sized image extent, result in imagery that can be processed to represent land cover over large areas with an amount of spatial detail that is very attractive for monitoring, management, and scientific activities. With Moore's Law alive and well, more and more parallelism is being introduced into all computing platforms, at all levels of integration and programming, to achieve higher performance and energy efficiency. Since the geometric calibration process is one of the most time-consuming steps in the use of remote sensing images, the aim of this work is to accelerate it by taking advantage of new computing architectures and technologies, focusing especially on exploiting shared-memory multi-threaded hardware. A parallel implementation of the most time-consuming step in remote sensing geometric correction has been developed using OpenMP directives. This work compares the performance of the original serial binary against the parallelized implementation on several modern multi-threaded CPU architectures, and discusses how to find the optimum hardware for a cost-effective execution.
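The paper's parallel code uses OpenMP directives in a compiled implementation that is not reproduced here. As a language-consistent sketch of the same shared-memory strategy, the Python snippet below parallelizes the hot loop of a geometric correction, bilinear resampling at corrected coordinates, over output rows using numba's prange, which maps the loop onto OpenMP-style worker threads. The image, the distortion, and the sizes are placeholders.

```python
import numpy as np
from numba import njit, prange

@njit(parallel=True)
def resample_bilinear(src, rows, cols):
    """Resample src at real-valued (rows, cols) coordinates, the hot loop of
    a geometric correction.  prange distributes output rows over threads."""
    out = np.zeros(rows.shape, dtype=src.dtype)
    h, w = src.shape
    for i in prange(rows.shape[0]):            # parallel over output rows
        for j in range(rows.shape[1]):
            r, c = rows[i, j], cols[i, j]
            r0, c0 = int(r), int(c)
            if 0 <= r0 < h - 1 and 0 <= c0 < w - 1:
                dr, dc = r - r0, c - c0
                out[i, j] = ((1 - dr) * (1 - dc) * src[r0, c0]
                             + (1 - dr) * dc * src[r0, c0 + 1]
                             + dr * (1 - dc) * src[r0 + 1, c0]
                             + dr * dc * src[r0 + 1, c0 + 1])
    return out

# Toy usage: warp a random "image" with a small affine distortion
src = np.random.rand(2048, 2048).astype(np.float32)
rr, cc = np.meshgrid(np.arange(2048.0), np.arange(2048.0), indexing="ij")
out = resample_bilinear(src, rr * 0.999 + 1.3, cc * 1.001 + 0.7)
```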

Relevance:

10.00%

Publisher:

Abstract:

This letter discusses the detection and correction of residual motion errors that appear in airborne synthetic aperture radar (SAR) interferograms due to the lack of precision in the navigation system. As is shown, the effect of this lack of precision is twofold: azimuth registration errors and azimuth phase undulations. Up to now, the correction of the former was carried out by estimating the registration error and interpolating, while the latter was based on estimating the azimuth phase undulations to compensate the phase of the computed interferogram. In this letter, a new correction method is proposed, which avoids the interpolation step and corrects the azimuth phase undulations at the same time. Additionally, the spectral diversity technique, used to estimate registration errors, is critically analyzed. Airborne L-band repeat-pass interferometric data of the German Aerospace Center (DLR) experimental airborne SAR is used to validate the method.
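For orientation, the spectral diversity technique that the letter analyzes can be sketched as follows: form two azimuth looks from disjoint halves of the azimuth spectrum, and read the registration error from the phase of the differential interferogram between the looks. The snippet below is a schematic numpy rendering of that idea, with spectral weighting, topography, and noise handling omitted; it is not the letter's proposed correction method.

```python
import numpy as np

def spectral_diversity_shift(slc1, slc2, prf):
    """Estimate azimuth misregistration (in seconds of azimuth time) between
    two coregistered SLCs from the differential phase of two azimuth looks."""
    n = slc1.shape[0]
    f = np.fft.fftfreq(n, d=1.0 / prf)                # azimuth frequency axis (Hz)
    def look(slc, band):
        S = np.fft.fft(slc, axis=0)
        S[~band, :] = 0.0                             # keep one azimuth sub-band
        return np.fft.ifft(S, axis=0)
    low, high = f < 0, f >= 0                         # two disjoint half-bands
    i_low = look(slc1, low) * np.conj(look(slc2, low))
    i_high = look(slc1, high) * np.conj(look(slc2, high))
    dphi = np.angle(np.sum(i_low * np.conj(i_high)))  # differential phase
    df = prf / 2.0                                    # sub-band centre separation
    return dphi / (2.0 * np.pi * df)

# Toy check: the second SLC is the first shifted by 0.2 azimuth samples
rng = np.random.default_rng(5)
slc = rng.standard_normal((512, 64)) + 1j * rng.standard_normal((512, 64))
ramp = np.exp(2j * np.pi * np.fft.fftfreq(512)[:, None] * 0.2)
slc2 = np.fft.ifft(np.fft.fft(slc, axis=0) * ramp, axis=0)
print(spectral_diversity_shift(slc, slc2, prf=100.0))  # ~0.002 s = 0.2 samples
```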

Relevance:

10.00%

Publisher:

Abstract:

A new statistical parallax method using the Maximum Likelihood principle is presented, allowing the simultaneous determination of a luminosity calibration, kinematic characteristics and spatial distribution of a given sample. This method has been developed for the exploitation of the Hipparcos data and presents several improvements with respect to the previous ones: the effects of the selection of the sample, the observational errors, the galactic rotation and the interstellar absorption are taken into account as an intrinsic part of the formulation (as opposed to external corrections). Furthermore, the method is able to identify and characterize physically distinct groups in inhomogeneous samples, thus avoiding biases due to unidentified components. Moreover, the implementation used by the authors is based on the extensive use of numerical methods, so avoiding the need for simplification of the equations and thus the bias they could introduce. Several examples of application using simulated samples are presented, to be followed by applications to real samples in forthcoming articles.
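A drastically simplified rendering of the maximum-likelihood idea, for orientation only: stars drawn from a Gaussian luminosity function with Gaussian parallax errors, and none of the selection, kinematic, galactic-rotation, or absorption terms that are the actual contribution of the method. All numbers are synthetic stand-ins for Hipparcos-like data.

```python
import numpy as np
from scipy.optimize import minimize

def negloglik(theta, plx, plx_err, mag):
    """Simplified statistical-parallax likelihood: a shared Gaussian
    luminosity function N(M0, sigma_M) and Gaussian parallax errors."""
    M0, sigma_M = theta[0], abs(theta[1])
    # M = m + 5 + 5*log10(plx)  =>  plx_model = 10**((M0 - m - 5) / 5), arcsec
    plx_model = 10.0 ** ((M0 - mag - 5.0) / 5.0)
    # Luminosity scatter maps into parallax scatter: d(plx)/dM = plx*ln(10)/5
    var = plx_err**2 + (plx_model * np.log(10.0) / 5.0 * sigma_M) ** 2
    return 0.5 * np.sum((plx - plx_model) ** 2 / var + np.log(var))

# Synthetic sample standing in for the real data
rng = np.random.default_rng(2)
M_true = rng.normal(1.0, 0.3, 300)                  # absolute magnitudes
mag = rng.uniform(6.0, 9.0, 300)                    # apparent magnitudes
plx = 10.0 ** ((M_true - mag - 5.0) / 5.0) + rng.normal(0.0, 1e-3, 300)
res = minimize(negloglik, x0=[0.0, 0.5], args=(plx, 1e-3, mag),
               method="Nelder-Mead")
print("recovered M0, sigma_M:", res.x)
```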

Relevance:

10.00%

Publisher:

Abstract:

An instrument designed to measure the thermal conductivity of consolidated rocks, dry or saturated, using a transient method is presented. The instrument measures relative values of the thermal conductivity, and it needs calibration to obtain absolute values. The device can be used as a heat pulse line source and as a continuous heat line source. Two parameters to determine thermal conductivity are proposed: TMAX, in heat pulse line-source mode, and SLOPE, in continuous heat line-source mode. Its performance is better, and the operation simpler, in heat pulse line-source mode, with a measuring time of 170 s and a reproducibility better than 2.5%. The sample preparation is very simple in both modes. The performance has been tested with a set of ten rocks with thermal conductivity values between 1.4 and 5.2 W m−1 K−1, which covers the usual range for consolidated rocks.
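The SLOPE parameter has a textbook interpretation in continuous line-source mode: at late times the line-source solution grows as T(t) ≈ (q/4πλ)·ln t + C, so the slope of temperature against ln t gives the conductivity as λ = q/(4π·SLOPE). A minimal Python sketch, with assumed heater power and noise level (the real instrument reports relative values and is calibrated against standards):

```python
import numpy as np

rng = np.random.default_rng(3)
q = 5.0                                    # heater power per unit length (W/m), assumed
lam_true = 2.5                             # "true" conductivity (W m^-1 K^-1)
t = np.linspace(30.0, 170.0, 200)          # late-time window of the measurement (s)
T = q / (4 * np.pi * lam_true) * np.log(t) + 0.002 * rng.standard_normal(t.size)

slope = np.polyfit(np.log(t), T, 1)[0]     # the SLOPE parameter
lam = q / (4 * np.pi * slope)
print(f"recovered conductivity: {lam:.2f} W m^-1 K^-1")
```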

Relevance:

10.00%

Publisher:

Abstract:

We have developed a differential scanning calorimeter capable of working under applied magnetic fields of up to 5 T. The calorimeter is highly sensitive and operates over the temperature range 10-300 K. It is shown that, after a proper calibration, the system enables determination of the latent heat and entropy changes in first-order solid-solid phase transitions. The system is particularly useful for investigating materials that exhibit the giant magnetocaloric effect arising from a magnetostructural phase transition. Data for Gd5(Si0.1Ge0.9)4 are presented.
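As a reminder of the standard analysis such a calibrated trace permits (a sketch, not the instrument's own software): the latent heat is the time integral of the baseline-corrected heat flow across the transition peak, and the entropy change is the same integral weighted by 1/T.

```python
import numpy as np
from scipy.integrate import trapezoid

def transition_heat_entropy(time_s, temp_K, heatflow_W, baseline_W):
    """Latent heat (J) and entropy change (J/K) of a first-order transition
    from a baseline-corrected DSC trace."""
    dq = heatflow_W - baseline_W                      # excess heat flow (W)
    latent_heat = trapezoid(dq, time_s)               # integral of dQ/dt over time
    entropy_change = trapezoid(dq / temp_K, time_s)   # integral of (1/T) dQ
    return latent_heat, entropy_change

# Synthetic trace: a 2 K/min ramp through a transition near 250 K
t = np.linspace(0.0, 600.0, 2001)                     # s
T = 240.0 + (2.0 / 60.0) * t                          # K
hf = 1e-3 * np.exp(-0.5 * ((T - 250.0) / 0.5) ** 2)   # W, Gaussian excess peak
L, dS = transition_heat_entropy(t, T, hf, 0.0)
print(f"latent heat {L:.4f} J, entropy change {dS * 1e3:.3f} mJ/K")
```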

Relevance:

10.00%

Publisher:

Abstract:

We describe the design, calibration, and performance of a surface forces apparatus with the capability of illuminating the contact interface for spectroscopic investigation using optical techniques. The apparatus can be placed in the path of a Nd-YAG laser for studies of the linear response or of second harmonic and sum-frequency generation from a material confined between the two surfaces. In addition to the standard fringes of equal chromatic order technique, which we have digitized for accurate and fast analysis, the distance of separation can be measured with a fiber-optic interferometer during spectroscopic measurements (2 Å resolution and 10 ms response time). The sample approach is accomplished through application of a motor drive, piezoelectric actuator, or electromagnetic lever deflection for variable degrees of range, sensitivity, and response time. To demonstrate the operation of the instrument, the stepwise expulsion of discrete layers of octamethylcyclotetrasiloxane from the contact is shown. Lateral forces may also be studied by using piezoelectric bimorphs to induce and direct the motion of one surface.

Relevance:

10.00%

Publisher:

Abstract:

Application of semi-distributed hydrological models to large, heterogeneous watersheds raises several problems. On one hand, the spatial and temporal variability in catchment features should be adequately represented in the model parameterization, while keeping model complexity at an acceptable level so as to take advantage of state-of-the-art calibration techniques. On the other hand, model complexity increases the uncertainty in adjusted model parameter values, and therefore the uncertainty in the water routing across the watershed. This is critical for water quality applications, where not only streamflow but also a reliable estimation of the surface versus subsurface contributions to runoff is needed. In this study, we show how a regularized inversion procedure combined with a multiobjective calibration strategy successfully solves the parameterization of a complex application of a water quality-oriented hydrological model. The final values of several optimized parameters showed significant and consistent differences across geological and landscape features. Although the number of optimized parameters was significantly increased by the spatial and temporal discretization of adjustable parameters, the uncertainty in water routing results remained at reasonable values. In addition, a stepwise numerical analysis showed that the effects on calibration performance due to the inclusion of different data types in the objective function can be inextricably linked. Thus, caution should be taken when adding data to or removing data from an aggregated objective function.
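A compact sketch of the calibration structure described here: a multiobjective residual vector that stacks a streamflow block, a block constraining the surface/subsurface partition, and a Tikhonov regularization block pulling parameters toward prior values. The toy model, the weights, and the data below are placeholders; the study's actual model and regularized-inversion machinery are far richer.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(4)

# Toy stand-in for a semi-distributed model: streamflow and baseflow as
# simple functions of two parameters.
def model(p):
    t = np.arange(50.0)
    q = p[0] * np.exp(-t / 20.0) + p[1]       # total streamflow
    baseflow = 0.8 * p[1] * np.ones_like(t)   # subsurface contribution
    return q, baseflow

prior = np.array([1.0, 0.3])
true = np.array([1.4, 0.25])
obs_q, obs_b = (x + 0.01 * rng.standard_normal(50) for x in model(true))

def residuals(p):
    sim_q, sim_b = model(p)
    w_q, w_b, w_reg = 1.0, 0.5, 0.1           # objective weights (to be tuned)
    return np.concatenate([w_q * (sim_q - obs_q),
                           w_b * (sim_b - obs_b),
                           w_reg * (p - prior)])   # Tikhonov regularization block

fit = least_squares(residuals, x0=prior)
print("calibrated parameters:", fit.x)
```

The regularization block is what keeps the enlarged, spatially discretized parameter set well-posed; the study's warning about aggregated objective functions corresponds to the interaction between the weighted blocks above.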

Relevance:

10.00%

Publisher:

Abstract:

At present, cooperatives collect, select, treat, and separate fruit according to its caliber (weight; maximum, mean, and/or minimum diameter) so that it reaches the final consumer graded by category (caliber). To compete in a market ever more demanding in quality and price, automatic classification systems are required that deliver optimal results at high levels of production and productivity. For these tasks there are industrial graders that weigh the fruit with load cells and, based on the measured weight, classify the pieces by assigning them to the corresponding outlet (packing table) through a system of electromagnets. Unfortunately, grading fruit by weight alone is not at all reliable, since this process does not consider skin thickness, water content, sugar content, or other highly relevant factors that considerably influence the final results. The aim of this project is to upgrade existing fruit graders by installing a fast and robust industrial machine vision system working in the infrared range of the spectrum (for greater reliability), providing optimal final results in the classification of fruit and vegetables. In this way, the present project offers the opportunity to improve the throughput of the fruit classification line, increasing speed, reducing time losses and human error, and unquestionably improving the final product quality desired by consumers.
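To make the proposed improvement concrete, here is a toy grading rule combining the existing load-cell weight with a hypothetical NIR-derived water index from the vision system; all thresholds are illustrative, not taken from any deployed grader.

```python
# Toy grading rule: weight from the load cell plus a hypothetical
# near-infrared (NIR) water index from the vision system.
def grade(weight_g: float, nir_water_index: float) -> str:
    if nir_water_index < 0.35:        # dehydrated fruit: reject regardless of weight
        return "reject"
    if weight_g >= 180:
        return "category I"
    if weight_g >= 120:
        return "category II"
    return "category III"

for fruit in [(200, 0.6), (150, 0.3), (100, 0.5)]:
    print(fruit, "->", grade(*fruit))
```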