48 results for Radiometric calibration


Relevance: 10.00%

Abstract:

We investigate the determinants of regional development using a newly constructed database of 1569 sub-national regions from 110 countries covering 74 percent of the world's surface and 97 percent of its GDP. We combine the cross-regional analysis of geographic, institutional, cultural, and human capital determinants of regional development with an examination of productivity in several thousand establishments located in these regions. To organize the discussion, we present a new model of regional development that introduces into a standard migration framework elements of both the Lucas (1978) model of the allocation of talent between entrepreneurship and work and the Lucas (1988) model of human capital externalities. The evidence points to the paramount importance of human capital in accounting for regional differences in development, but model estimation and calibration also suggest that entrepreneurial inputs and possibly human capital externalities help explain the data.

Relevance: 10.00%

Abstract:

The 1994 Northridge earthquake sent ripples to insurance companies everywhere. It was one in a series of natural disasters, such as Hurricane Andrew, which, together with the problems at Lloyd's of London, have insurance companies running for cover. This paper presents a calibration of the U.S. economy in a model with financial markets for insurance derivatives, which suggests that the U.S. economy can deal with the damage of natural catastrophes far better than one might think.

Relevance: 10.00%

Abstract:

In this paper we propose a simple and general model for computing the Ramsey optimal inflation tax, which includes several models from the previous literature as special cases. We show that it cannot be claimed on theoretical grounds that the Friedman rule is always optimal (or always non-optimal). Whether the Friedman rule is optimal depends on conditions related to the shape of various relevant functions. One contribution of this paper is to relate these conditions to measurable variables such as the interest rate or the consumption elasticity of money demand. We find that it tends to be optimal to tax money when there are economies of scale in the demand for money (the scale elasticity is smaller than one) and/or when money is required for the payment of consumption or wage taxes. We find that it tends to be optimal to tax money more heavily when the interest elasticity of money demand is small. We present empirical evidence on the parameters that determine the optimal inflation tax. Calibrating the model to a variety of empirical studies yields an optimal nominal interest rate of less than 1% per year, although that finding is sensitive to the calibration.
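
As a concrete, hedged illustration of the measurable elasticities mentioned above (our notation, not necessarily the paper's specification), consider a constant-elasticity money demand:

\[
m = A\, c^{\eta}\, i^{-\varepsilon}, \qquad
\eta = \frac{\partial \ln m}{\partial \ln c}, \qquad
\varepsilon = -\frac{\partial \ln m}{\partial \ln i}.
\]

Here $\eta$ is the scale elasticity and $\varepsilon$ the interest elasticity of money demand, and the Friedman rule corresponds to a nominal interest rate $i = 0$; in the abstract's terms, taxing money tends to be optimal when $\eta < 1$, and more heavily so when $\varepsilon$ is small.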

Relevance: 10.00%

Abstract:

Why was England first? And why Europe? We present a probabilistic model that builds on big-push models by Murphy, Shleifer and Vishny (1989), combined with hierarchical preferences. The interaction of exogenous demographic factors (in particular the English low-pressure variant of the European marriage pattern) and redistributive institutions such as the old Poor Law combined to make an Industrial Revolution more likely. Essentially, industrialization is the result of having a critical mass of consumers that is rich enough to afford (potentially) mass-produced goods. Our model is then calibrated to match the main characteristics of the English economy in 1750 and the observed transition until 1850. This allows us to address explicitly one of the key features of the British Industrial Revolution unearthed by economic historians over the last three decades: the slowness of productivity and output change. In our calibration, we find that the probability of Britain industrializing is five times larger than France's. Contrary to the recent argument by Pomeranz, China in the 18th century had essentially no chance to industrialize at all. This difference is decomposed into a demographic and a policy component, with the former being far more important than the latter.

Relevance: 10.00%

Abstract:

This paper illustrates the philosophy which forms the basis of calibration exercises in general equilibrium macroeconomic models and the details of the procedure, the advantages and the disadvantages of the approach, with particular reference to the issue of testing "false" economic models. We provide an overview of the most recent simulation-based approaches to the testing problem and compare them to standard econometric methods used to test the fit of non-linear dynamic general equilibrium models. We illustrate how simulation-based techniques can be used to formally evaluate the fit of a calibrated model to the data and obtain ideas on how to improve the model design, using a standard problem in the international real business cycle literature: whether a model with complete financial markets and no restrictions on capital mobility is able to reproduce the second-order properties of aggregate saving and aggregate investment in an open economy.
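
A hedged sketch of the general logic of such simulation-based fit measures (the paper's exact statistics may differ): simulate the calibrated model $S$ times, compute a statistic of interest $T$ on each simulated path, and locate the data value in the simulated distribution,

\[
\hat{p} = \frac{1}{S} \sum_{s=1}^{S} \mathbf{1}\left\{ T\big(y^{(s)}\big) \ge T\big(y^{\mathrm{data}}\big) \right\},
\]

where $T$ could be, for example, the correlation of aggregate saving and investment; values of $\hat{p}$ near 0 or 1 indicate that the model has trouble reproducing that second-order property of the data.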

Relevance: 10.00%

Abstract:

Existing models of equilibrium unemployment with endogenous labor market participation are complex, generate procyclical unemployment rates, and cannot match unemployment variability relative to GDP. We embed endogenous participation in a simple, tractable job market matching model, show analytically how variations in the participation rate are driven by the cross-sectional density of home productivity near the participation threshold, and how this density translates into an extensive-margin labor supply elasticity. A calibration of the model to macro data not only matches employment and participation variabilities but also generates strongly countercyclical unemployment rates. With some wage rigidity the model also matches unemployment variations well. Furthermore, the labor supply elasticity implied by our calibration is consistent with microeconometric evidence for the US.
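
A hedged sketch of the mechanism described above (our notation): let home productivity $h$ have cross-sectional density $f$ and distribution $F$, and let $h^*$ be the threshold at which the value of market participation equals the value of home production, so the participation rate is $P = F(h^*)$. The response of participation to the returns to market work $R$ then runs through the density at the threshold,

\[
\frac{\partial \ln P}{\partial \ln R} = \frac{f(h^*)}{F(h^*)} \, \frac{\partial h^*}{\partial \ln R},
\]

so a thicker density of marginal participants translates directly into a larger extensive-margin labor supply elasticity.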

Relevance: 10.00%

Abstract:

How did Europe escape the "Iron Law of Wages"? We construct a simple Malthusian model with two sectors and multiple steady states, and use it to explain why European per capita incomes and urbanization rates increased during the period 1350-1700. Productivity growth can only explain a small fraction of the rise in output per capita. Population dynamics (changes in the birth and death schedules) were far more important determinants of steady states. We show how a major shock to population can trigger a transition to a new steady state with higher per capita income. The Black Death was such a shock, raising wages substantially. Because of Engel's Law, demand for urban products increased, and urban centers grew in size. European cities were unhealthy, and rising urbanization pushed up aggregate death rates. This effect was reinforced by diseases spread through war, financed by higher tax revenues. In addition, rising trade also spread diseases. In this way higher wages themselves reduced population pressure. We show in a calibration exercise that our model can account for the sustained rise in European urbanization as well as permanently higher per capita incomes in 1700, without technological change. Wars contributed importantly to the "Rise of Europe", even if they had negative short-run effects. We thus trace Europe's precocious rise to economic riches to interactions of the plague shock with the belligerent political environment and the nature of cities.
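
A hedged sketch of how multiple steady states can arise in such a model (our notation, not the paper's): with birth and death schedules $b(w)$ and $d(w)$ depending on the wage $w$, and the wage falling in population $N$ through diminishing returns, population evolves as

\[
\dot{N} = \big[ b(w(N)) - d(w(N)) \big] N, \qquad b(w^*) = d(w^*) \ \text{in steady state}.
\]

If urbanization raises death rates at higher incomes, the schedules can cross more than once, and a large mortality shock such as the Black Death can move the economy from a low-wage crossing to a high-wage one.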

Relevance: 10.00%

Abstract:

Remote sensing spatial, spectral, and temporal resolutions of images, acquired over a reasonably sized image extent, result in imagery that can be processed to represent land cover over large areas with an amount of spatial detail that is very attractive for monitoring, management, and scientific activities. With Moore's Law alive and well, more and more parallelism is introduced into all computing platforms, at all levels of integration and programming, to achieve higher performance and energy efficiency. Since the geometric calibration process is one of the most time-consuming steps when working with remote sensing images, the aim of this work is to accelerate it by taking advantage of new computing architectures and technologies, focusing especially on exploiting computation on shared-memory multi-threading hardware. A parallel implementation of the most time-consuming step in remote sensing geometric correction has been developed using OpenMP directives. This work compares the performance of the original serial binary against the parallelized implementation on several modern multi-threaded CPU architectures, and discusses how to find the optimum hardware for a cost-effective execution.
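
A minimal sketch of the pattern described above, in C with OpenMP (the function names, image layout, and nearest-neighbour resampling are illustrative assumptions, not the authors' code): the per-pixel loop of the geometric correction is independent across output rows, so a single work-sharing directive distributes it over the available threads.

    #include <math.h>

    /* Illustrative geometric correction: for each output pixel, invert the
       geometric model to find the source coordinates, then resample the
       input image (nearest neighbour here for brevity). */
    void geometric_correction(const float *in, int in_w, int in_h,
                              float *out, int out_w, int out_h,
                              void (*inverse_map)(int row, int col,
                                                  double *src_row, double *src_col))
    {
        /* Output rows are independent, so the loop parallelizes cleanly. */
        #pragma omp parallel for schedule(dynamic)
        for (int r = 0; r < out_h; r++) {
            for (int c = 0; c < out_w; c++) {
                double sr, sc;
                inverse_map(r, c, &sr, &sc);   /* model inversion: the hot spot */
                int ir = (int)lround(sr);
                int ic = (int)lround(sc);
                out[r * out_w + c] =
                    (ir >= 0 && ir < in_h && ic >= 0 && ic < in_w)
                        ? in[ir * in_w + ic]
                        : 0.0f;                /* outside the input footprint */
            }
        }
    }

The schedule(dynamic) clause keeps threads busy when the cost of the model inversion varies across the scene, which is one of the tuning choices a cost-effectiveness comparison across CPU architectures would exercise.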

Relevance: 10.00%

Abstract:

This letter discusses the detection and correction of residual motion errors that appear in airborne synthetic aperture radar (SAR) interferograms due to the lack of precision in the navigation system. As is shown, the effect of this lack of precision is twofold: azimuth registration errors and azimuth phase undulations. Up to now, the former were corrected by estimating the registration error and interpolating, while the latter were compensated by estimating the azimuth phase undulations and correcting the phase of the computed interferogram. In this letter, a new correction method is proposed which avoids the interpolation step and corrects the azimuth phase undulations at the same time. Additionally, the spectral diversity technique, used to estimate registration errors, is critically analyzed. Airborne L-band repeat-pass interferometric data from the German Aerospace Center (DLR) experimental airborne SAR are used to validate the method.
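
A hedged note on why a single navigation error produces both effects (standard repeat-pass InSAR reasoning, not the letter's derivation): a residual line-of-sight deviation $d(x)$ of the track along azimuth $x$ adds an interferometric phase

\[
\phi_{\mathrm{rme}}(x) = \frac{4\pi}{\lambda}\, d(x),
\]

and its variation along azimuth also misregisters the two images, which is what the spectral diversity technique estimates from the phase difference between interferograms formed with two separate azimuth sub-looks.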

Relevance: 10.00%

Abstract:

The impact of topography and mixed pixels on L-band radiometric observations over land needs to be quantified to improve the accuracy of soil moisture retrievals. For this purpose, a series of simulations has been performed with an improved version of the Soil Moisture and Ocean Salinity (SMOS) end-to-end performance simulator (SEPS). The brightness temperature generator of SEPS has been modified to include a 100-m-resolution land cover map and a 30-m-resolution digital elevation map of Catalonia (northeast of Spain). This high-resolution generator allows the assessment of the errors in soil moisture retrieval algorithms due to limited spatial resolution and provides a basis for the development of pixel disaggregation techniques. Variation of the local incidence angle, shadowing, and atmospheric effects (up- and downwelling radiation) due to surface topography has been analyzed. Results are compared to brightness temperatures computed under the assumption of an ellipsoidal Earth.
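
A hedged note on the geometry involved (a standard formulation, not necessarily the one implemented in SEPS): with $\hat{n}$ the DEM-derived surface normal and $\hat{s}$ the unit vector from the surface toward the radiometer, the local incidence angle satisfies

\[
\cos\theta_{\mathrm{loc}} = \hat{n} \cdot \hat{s},
\]

so slopes facing the instrument see a smaller incidence angle than the ellipsoidal-Earth value, which changes the modeled emissivity and hence the simulated brightness temperature.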

Relevance: 10.00%

Abstract:

A new statistical parallax method using the Maximum Likelihood principle is presented, allowing the simultaneous determination of a luminosity calibration, kinematic characteristics, and the spatial distribution of a given sample. The method has been developed for the exploitation of the Hipparcos data and presents several improvements over previous ones: the effects of sample selection, observational errors, galactic rotation, and interstellar absorption are taken into account as an intrinsic part of the formulation (as opposed to external corrections). Furthermore, the method is able to identify and characterize physically distinct groups in inhomogeneous samples, thus avoiding biases due to unidentified components. Moreover, the implementation used by the authors relies on the extensive use of numerical methods, avoiding the need to simplify the equations and thus the bias such simplification could introduce. Several examples of application using simulated samples are presented, to be followed by applications to real samples in forthcoming articles.
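
A hedged sketch of the Maximum Likelihood setup (our notation): with parameters $\theta$ covering the luminosity calibration, kinematics, and spatial distribution, the method maximizes

\[
\hat{\theta} = \arg\max_{\theta} \sum_{i=1}^{N} \ln p\big(\mathbf{x}_i \mid \theta\big),
\]

where $\mathbf{x}_i$ collects the observables of star $i$ (e.g. apparent magnitude, parallax, proper motions) and the per-star density $p$ is normalized under the sample selection function, so that selection effects, observational errors, galactic rotation, and absorption enter the formulation directly rather than as external corrections.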

Relevance: 10.00%

Abstract:

An instrument designed to measure the thermal conductivity of consolidated rocks, dry or saturated, using a transient method is presented. The instrument measures relative values of the thermal conductivity and needs calibration to obtain absolute values. The device can be used as a heat pulse line source and as a continuous heat line source. Two parameters to determine thermal conductivity are proposed: TMAX, in heat pulse line-source mode, and SLOPE, in continuous heat line-source mode. Performance is better, and operation simpler, in heat pulse line-source mode, with a measuring time of 170 s and a reproducibility better than 2.5%. Sample preparation is very simple in both modes. The performance has been tested with a set of ten rocks with thermal conductivity values between 1.4 and 5.2 W m⁻¹ K⁻¹, which covers the usual range for consolidated rocks.
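
A hedged note on why a SLOPE parameter determines conductivity (classical transient line-source theory, not necessarily the authors' exact formulation): for a continuous line source dissipating power $q$ per unit length, the late-time temperature rise grows logarithmically,

\[
\Delta T(t) \approx \frac{q}{4\pi\lambda} \ln t + C,
\]

so the slope of $\Delta T$ against $\ln t$ equals $q/(4\pi\lambda)$ and yields the thermal conductivity $\lambda$; in a relative instrument, calibration against samples of known $\lambda$ absorbs the remaining constants.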

Relevance: 10.00%

Abstract:

We have developed a differential scanning calorimeter capable of working under applied magnetic fields of up to 5 T. The calorimeter is highly sensitive and operates over the temperature range 10-300 K. It is shown that, after a proper calibration, the system enables determination of the latent heat and entropy changes in first-order solid-solid phase transitions. The system is particularly useful for investigating materials that exhibit the giant magnetocaloric effect arising from a magnetostructural phase transition. Data for Gd5(Si0.1Ge0.9)4 are presented.
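
A hedged sketch of the quantities extracted (standard calorimetric relations, assuming the baseline has been subtracted): integrating the calorimetric signal $\dot{Q}$ across the first-order transition at scanning rate $\dot{T}$ gives the latent heat and, weighted by $1/T$, the entropy change,

\[
L = \int_{T_1}^{T_2} \frac{\dot{Q}(T)}{\dot{T}}\, dT, \qquad
\Delta S = \int_{T_1}^{T_2} \frac{1}{T}\, \frac{\dot{Q}(T)}{\dot{T}}\, dT;
\]

repeating the scans at several fixed fields up to 5 T maps the field dependence of $\Delta S$ relevant to the giant magnetocaloric effect.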

Relevance: 10.00%

Abstract:

We describe the design, calibration, and performance of a surface forces apparatus with the capability of illuminating the contact interface for spectroscopic investigation using optical techniques. The apparatus can be placed in the path of a Nd:YAG laser for studies of the linear response or of second harmonic and sum-frequency generation from a material confined between the two surfaces. In addition to the standard fringes of equal chromatic order technique, which we have digitized for accurate and fast analysis, the distance of separation can be measured with a fiber-optic interferometer during spectroscopic measurements (2 Å resolution and 10 ms response time). The sample approach is accomplished through application of a motor drive, piezoelectric actuator, or electromagnetic lever deflection for variable degrees of range, sensitivity, and response time. To demonstrate the operation of the instrument, the stepwise expulsion of discrete layers of octamethylcyclotetrasiloxane from the contact is shown. Lateral forces may also be studied by using piezoelectric bimorphs to induce and direct the motion of one surface.