988 results for Fantôme de calibration


Relevance: 10.00%

Abstract:

We show a standard model where the optimal tax reform is to cut labor taxes and leave capital taxes very high in the short and medium run; only in the very long run would capital taxes be zero. Our model is a version of Chamley's, with heterogeneous agents, without lump-sum transfers, with an upper bound on capital taxes, and with a focus on Pareto-improving plans. For our calibration, labor taxes should be low for the first ten to twenty years, while capital taxes should be at their maximum. This policy ensures that all agents benefit from the tax reform and that capital grows quickly after the reform begins. The long-run optimal tax mix is therefore the opposite of the short- and medium-run tax mix. The initial labor tax cut is financed by deficits that lead to a positive long-run level of government debt, reversing the standard prediction that the government accumulates savings in models with optimal capital taxes. If labor supply is somewhat elastic, the benefits from tax reform are large and can be shifted entirely to capitalists or workers by varying the length of the transition. With inelastic labor supply, there is an increasing part of the equilibrium frontier, which means that the scope for benefiting the workers is limited and the total benefits from reforming taxes are much lower.

Relevance: 10.00%

Abstract:

In this project, we first developed a Matlab algorithm implementing the TRL calibration method, whose operation we verified initially through simulations and subsequently with a real example. We then developed another Matlab algorithm implementing the LRM calibration method; this one we were only able to verify at the simulation level. Next, using the two algorithms, we compared both calibration systems through simulations. Finally, by analysing the results of several simulations calibrated with our TRL program, we investigated the possible causes of the appearance of unwanted spikes and identified one of them.
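The core step of TRL calibration — recovering the line standard's propagation constant from the thru and line measurements — can be sketched as follows. This is a minimal illustration, not the project's Matlab code: the error boxes A and B and the propagation constant below are synthetic stand-ins chosen so the recovery can be checked.

```python
import cmath

# 2x2 complex matrix helpers (wave-cascading T-matrices).
def mmul(X, Y):
    return [[X[0][0]*Y[0][0] + X[0][1]*Y[1][0], X[0][0]*Y[0][1] + X[0][1]*Y[1][1]],
            [X[1][0]*Y[0][0] + X[1][1]*Y[1][0], X[1][0]*Y[0][1] + X[1][1]*Y[1][1]]]

def minv(X):
    d = X[0][0]*X[1][1] - X[0][1]*X[1][0]
    return [[X[1][1]/d, -X[0][1]/d], [-X[1][0]/d, X[0][0]/d]]

def trl_gamma(T_thru, T_line, length):
    """Recover the line's propagation constant gamma from the measured
    thru (T_thru = A*B) and line (T_line = A*L*B) cascades.
    T_line * T_thru^-1 = A*L*A^-1 shares eigenvalues with
    L = diag(e^{-gamma*l}, e^{+gamma*l})."""
    T = mmul(T_line, minv(T_thru))
    tr = T[0][0] + T[1][1]
    det = T[0][0]*T[1][1] - T[0][1]*T[1][0]
    disc = cmath.sqrt(tr*tr - 4*det)
    lam = [(tr + disc)/2, (tr - disc)/2]
    e_minus = min(lam, key=abs)          # |e^{-gamma*l}| < 1 for a lossy line
    return -cmath.log(e_minus) / length  # valid while |Im(gamma*l)| < pi

# Synthetic check: arbitrary error boxes A, B and a known gamma.
A = [[1, 0.2], [0.1, 1]]
B = [[1, -0.3], [0.05, 1]]
gamma_true = 50 + 200j            # 1/m
l = 0.01                          # 10 mm line standard
L = [[cmath.exp(-gamma_true*l), 0], [0, cmath.exp(gamma_true*l)]]
T_thru = mmul(A, B)
T_line = mmul(mmul(A, L), B)
print(trl_gamma(T_thru, T_line, l))   # ≈ 50+200j, the planted value
```

Because B cancels in T_line·T_thru⁻¹, the line's eigenvalues are exposed regardless of the error boxes, which is why TRL needs no fully known standards.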

Relevance: 10.00%

Abstract:

Research project carried out during a stay at the National Oceanography Centre, Southampton (NOCS), Great Britain, between May and July 2006. The ability to obtain a precise estimate of sea surface salinity (SSS) is important for investigating and predicting the extent of climate change. The Soil Moisture and Ocean Salinity (SMOS) mission was selected by the European Space Agency (ESA) to obtain sea surface salinity maps on a global scale with a short revisit time. Before the launch of SMOS, the horizontal variability of the SSS and the potential of data retrieved from SMOS measurements to reproduce known oceanographic behaviour are to be analysed. The overall objective is to fill the gap between reliable input/auxiliary data sources and the tools developed to simulate and process the data acquired with the SMOS configuration. The SMOS End-to-end Performance Simulator (SEPS) is an ad hoc simulator developed by the Universitat Politècnica de Catalunya (UPC) to generate data according to the SMOS configuration. SEPS input data came from the Ocean Circulation and Climate Advanced Modeling (OCCAM) project, used at NOCS, at different spatial resolutions. By modifying SEPS to accept the OCCAM data as input, simulated brightness temperatures were obtained over one month for different ascending passes covering the selected area. The tasks carried out during the stay at NOCS aimed to provide a reliable technique for external calibration, and hence bias cancellation; a methodology for temporally averaging the different acquisitions during the ascending passes; and the best configuration of the cost function, before exploiting and investigating the potential of the SEPS/OCCAM data for deriving retrieved SSS with high-resolution patterns.

Relevance: 10.00%

Abstract:

In order to reconstruct the climatic evolution of the Benguela Current (off the west coast of South Africa) from sea surface temperatures over the last two million years, 60 samples from core ODP 175-1084 were analysed. To do so, the new calibration of the TEX86 index (Kim et al., 2007) was used, and the results were plotted together with the values obtained from other indices relating to mean annual air temperature (MAAT) and to the degree of continental sedimentary input to marine sediments (BIT). They were also compared with temperature records from other studies in the same area obtained from different proxies. The TEX86 results show some agreement with values obtained in other studies and suggest hypotheses that directly relate the three indices calculated.
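The TEX86 index itself is a simple ratio of isoprenoid GDGT abundances; a minimal sketch of the computation follows. The peak areas and the linear calibration coefficients below are illustrative placeholders only — they are not the Kim et al. (2007) calibration values or data from this core.

```python
def tex86(gdgt1, gdgt2, gdgt3, cren_iso):
    """TEX86 = ([GDGT-2] + [GDGT-3] + [cren']) /
               ([GDGT-1] + [GDGT-2] + [GDGT-3] + [cren'])
    where cren' is the crenarchaeol regioisomer."""
    num = gdgt2 + gdgt3 + cren_iso
    return num / (gdgt1 + num)

def sst_from_tex86(t, slope=50.0, intercept=-10.0):
    """Linear SST calibration; slope and intercept are placeholder
    values, not published calibration coefficients."""
    return slope * t + intercept

# Illustrative relative peak areas for one sediment sample:
t = tex86(gdgt1=100.0, gdgt2=40.0, gdgt3=15.0, cren_iso=5.0)
print(round(t, 3))           # 0.375
print(sst_from_tex86(t))     # placeholder-calibrated SST in °C
```

Higher TEX86 values correspond to warmer waters; the choice of calibration line is what studies such as Kim et al. (2007) refine.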

Relevance: 10.00%

Abstract:

Labour market regulations aimed at enhancing job security are dominant in several OECD countries. These regulations seek to reduce dismissals of workers and fluctuations in employment. The main theoretical contribution is to gauge the effects of such regulations on labour demand across establishment sizes. To achieve this, we investigate an optimising model of labour demand under uncertainty through the application of real option theory. We also consider other forms of employment which increase the flexibility of the labour market. In particular, we model the contribution of temporary employment agencies (Zeitarbeit), which allow for quick personnel adjustments in client firms. The calibration results indicate that labour market rigidities may be crucial for understanding sluggishness in firms' labour demand and the emergence and growth of temporary work.

Relevance: 10.00%

Abstract:

The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in ASTM 1422-05 and 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repetitive analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
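The likelihood-ratio framework the abstract refers to can be illustrated with a toy numerical example. The probabilities below are invented for illustration; in practice the denominator would be estimated from the ink reference database the abstract describes.

```python
def likelihood_ratio(p_e_given_same_ink, p_e_given_different_ink):
    """LR = P(E | H1: same ink) / P(E | H2: different ink).
    P(E|H2) would in practice come from an ink reference database:
    how often a randomly chosen ink shows the observed characteristics."""
    return p_e_given_same_ink / p_e_given_different_ink

def posterior_odds(prior_odds, lr):
    # Bayes' theorem in odds form: posterior odds = prior odds * LR.
    return prior_odds * lr

# Hypothetical values: the observed dye pattern is near-certain if the
# inks share a source, but seen in only 1% of database inks otherwise.
lr = likelihood_ratio(0.95, 0.01)
print(lr)                       # ≈ 95: strong support for same source
print(posterior_odds(0.5, lr))  # ≈ 47.5, given prior odds of 1:2
```

The LR quantifies only the weight of the evidence; the prior odds remain the province of the court, which is what makes the framework attractive forensically.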

Relevance: 10.00%

Abstract:

We introduce duration dependent skill decay among the unemployed into a New-Keynesian model with hiring frictions developed by Blanchard/Gali (2008). If the central bank responds only to (current, lagged or expected future) inflation and quarterly skill decay is above a threshold level, determinacy requires a coefficient on inflation smaller than one. The threshold level is plausible with little steady-state hiring and firing ("Continental European Calibration") but implausibly high in the opposite case ("American calibration"). Neither interest rate smoothing nor responding to the output gap helps to restore determinacy if skill decay exceeds the threshold level. However, a modest response to unemployment guarantees determinacy. Moreover, under indeterminacy, both an adverse sunspot shock and an adverse technology shock increase unemployment extremely persistently.

Relevance: 10.00%

Abstract:

BACKGROUND: An LC-MS/MS method has been developed for the simultaneous quantification of P-glycoprotein (P-gp) and cytochrome P450 (CYP) probe substrates and their Phase I metabolites in DBS and plasma. P-gp (fexofenadine) and CYP-specific substrates (caffeine for CYP1A2, bupropion for CYP2B6, flurbiprofen for CYP2C9, omeprazole for CYP2C19, dextromethorphan for CYP2D6 and midazolam for CYP3A4) and their metabolites were extracted from DBS (10 µl) using methanol. Analytes were separated on a reversed-phase LC column followed by SRM detection within a 6 min run time. RESULTS: The method was fully validated over the expected clinical concentration range for all substances tested, in both DBS and plasma. The method has been successfully applied to a PK study in which healthy male volunteers received a low-dose cocktail of the P-gp and CYP probes described here. Good correlation was observed between capillary DBS and venous plasma drug concentrations. CONCLUSION: Due to its low invasiveness, simple sample collection and minimal sample preparation, DBS represents a suitable method to simultaneously monitor the in vivo activities of P-gp and CYP.

Relevance: 10.00%

Abstract:

A novel approach to measuring carbon dioxide (CO2) in gaseous samples is presented, based on precise and accurate quantification against a (13)CO2 internal standard generated in situ. The main goal of this study was to provide an innovative headspace-gas chromatography-mass spectrometry (HS-GC-MS) method applicable to the routine determination of CO2. The main drawback of the GC methods discussed in the literature for CO2 measurement is the lack of a specific internal standard needed for quantification. CO2 is still quantified by external calibration, without taking into account the analytical problems that often arise with gaseous samples. To avoid handling a stable isotope-labeled gas, we chose to generate the labeled internal standard gas ((13)CO2) in situ, on the basis of the stoichiometric formation of CO2 by the reaction of hydrochloric acid (HCl) with sodium hydrogen carbonate (NaH(13)CO3). This method allows a precise measurement of CO2 concentration and was validated on various human postmortem gas samples in order to assess its efficiency.
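The internal-standard quantification underlying the method reduces to a short calculation. The 1:1 stoichiometry (one mole of (13)CO2 per mole of NaH(13)CO3, with HCl in excess) is as the abstract states; the peak areas, the weighed mass, and the unit response factor below are illustrative assumptions.

```python
M_NAH13CO3 = 85.0  # g/mol, approximate molar mass of NaH(13)CO3

def mol_13co2_generated(mass_nah13co3_g):
    # 1:1 stoichiometry: NaH(13)CO3 + HCl -> NaCl + H2O + (13)CO2
    return mass_nah13co3_g / M_NAH13CO3

def mol_co2_in_sample(area_co2, area_13co2, mol_13co2, response_factor=1.0):
    """Internal-standard quantification: the analyte/standard peak-area
    ratio, scaled by the known amount of (13)CO2, gives the amount of
    CO2. response_factor corrects for any unequal MS response between
    the isotopologues (assumed 1 here for illustration)."""
    return (area_co2 / area_13co2) * mol_13co2 / response_factor

n_std = mol_13co2_generated(0.0085)     # 8.5 mg carbonate -> ~1e-4 mol standard
n_co2 = mol_co2_in_sample(area_co2=2.4e6, area_13co2=1.2e6, mol_13co2=n_std)
print(n_co2)   # ≈ 2e-4 mol (area ratio of 2 times the standard amount)
```

Because the standard is generated in the same vial and phase as the sample, losses and matrix effects largely cancel in the area ratio — the advantage over external calibration the abstract points to.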

Relevance: 10.00%

Abstract:

In this paper we examine the importance of imperfect competition in product and labour markets in determining the long-run welfare effects of tax reforms, assuming agent heterogeneity in capital holdings. Each of these market failures, independently, results in welfare losses for at least a segment of the population after a capital tax cut and a concurrent labour tax increase. However, when combined in a realistic calibration to the UK economy, they imply that a capital tax cut will be Pareto improving in the long run. Consistent with the theory of second-best, the two distortions in this context work to correct the negative distributional effects of a capital tax cut that each one, on its own, creates.

Relevance: 10.00%

Abstract:

This paper presents a DSGE model in which long run inflation risk matters for social welfare. Aggregate and welfare effects of long run inflation risk are assessed under two monetary regimes: inflation targeting (IT) and price-level targeting (PT). These effects differ because IT implies base-level drift in the price level, while PT makes the price level stationary around a target price path. Under IT, the welfare cost of long run inflation risk is equal to 0.35 percent of aggregate consumption. Under PT, where long run inflation risk is largely eliminated, it is lowered to only 0.01 percent. There are welfare gains from PT because it raises average consumption for the young and lowers consumption risk substantially for the old. These results are strongly robust to changes in the PT target horizon and fairly robust to imperfect credibility, fiscal policy, and model calibration. While the distributional effects of an unexpected transition to PT are sizeable, they are short-lived and not welfare-reducing.
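Consumption-equivalent welfare costs of the kind quoted here are conventionally computed as the uniform consumption cut that makes the agent indifferent between the risky and riskless paths. A minimal sketch with log utility and lognormal consumption risk follows; the sigma values are invented purely to land near the quoted magnitudes, and bear no relation to the paper's actual calibration.

```python
import math

def welfare_cost_log_utility(sigma):
    """Fraction lam of certain consumption a log-utility agent gives up
    to remove lognormal consumption risk with std. dev. sigma:
    E[ln(cbar * e^{eps - sigma^2/2})] = ln((1 - lam) * cbar)
    => lam = 1 - e^{-sigma^2/2}  (approximately sigma^2/2 when small)."""
    return 1.0 - math.exp(-sigma**2 / 2.0)

# Illustrative: moderate long-run consumption risk vs. almost none.
print(welfare_cost_log_utility(0.084))   # ≈ 0.0035, i.e. ~0.35% of consumption
print(welfare_cost_log_utility(0.014))   # ≈ 0.0001, i.e. ~0.01%
```

The point of the sketch is the mapping from a risk magnitude to a consumption-equivalent percentage, not the numbers themselves.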

Relevance: 10.00%

Abstract:

This paper presents a DSGE model in which long run inflation risk matters for social welfare. Optimal indexation of long-term government debt is studied under two monetary policy regimes: inflation targeting (IT) and price-level targeting (PT). Under IT, full indexation is optimal because long run inflation risk is substantial due to base-level drift, making indexed bonds a much better store of value than nominal bonds. Under PT, where long run inflation risk is largely eliminated, optimal indexation is substantially lower because nominal bonds become a better store of value relative to indexed bonds. These results are robust to the PT target horizon, imperfect credibility of PT and model calibration, but the assumption that indexation is lagged is crucial. From a policy perspective, a key finding is that accounting for optimal indexation has important welfare implications for comparisons of IT and PT.

Relevance: 10.00%

Abstract:

I develop a model of endogenous bounded rationality due to search costs, arising implicitly from the problem's complexity. The decision maker is not required to know the entire structure of the problem when making choices but can think ahead, through costly search, to reveal more of it. However, the costs of search are not assumed exogenously; they are inferred from revealed preferences through her choices. Thus, bounded rationality and its extent emerge endogenously: as problems become simpler or as the benefits of deeper search become larger relative to its costs, the choices more closely resemble those of a rational agent. For a fixed decision problem, the costs of search will vary across agents. For a given decision maker, they will vary across problems. The model explains, therefore, why the disparity between observed choices and those prescribed under rationality varies across agents and problems. It also suggests, under reasonable assumptions, an identifying prediction: a relation between the benefits of deeper search and the depth of the search. As long as calibration of the search costs is possible, this can be tested on any agent-problem pair. My approach provides a common framework for depicting the underlying limitations that force departures from rationality in different and unrelated decision-making situations. Specifically, I show that it is consistent with violations of timing independence in temporal framing problems, dynamic inconsistency and diversification bias in sequential versus simultaneous choice problems, and with plausible but contrasting risk attitudes across small- and large-stakes gambles.
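The cost-benefit logic of the model — search deeper only while the benefit of another step of lookahead exceeds its cost — can be illustrated with a toy sketch. The concave benefit curve and the unit cost below are hypothetical and stand in for the model's richer problem structure.

```python
def chosen_depth(benefit_at_depth, unit_cost, max_depth):
    """Pick the search depth maximizing net value: benefit(d) - c*d.
    unit_cost -> 0 recovers the rational agent (full lookahead);
    higher costs stop the search earlier, bounding rationality."""
    return max(range(max_depth + 1),
               key=lambda d: benefit_at_depth(d) - unit_cost * d)

# Hypothetical benefit of lookahead with diminishing returns.
benefit = lambda d: 10 * (1 - 0.5 ** d)

print(chosen_depth(benefit, unit_cost=0.0, max_depth=10))  # 10: fully rational
print(chosen_depth(benefit, unit_cost=1.5, max_depth=10))  # 2: search stops early
```

Holding the agent fixed and varying the benefit curve (the problem), or holding the problem fixed and varying the cost (the agent), reproduces the comparative statics the abstract describes.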

Relevance: 10.00%

Abstract:

Lean meat percentage (LMP) is an important carcass quality parameter. The aim of this work is to obtain a calibration equation for Computed Tomography (CT) scans with the Partial Least Squares Regression (PLS) technique in order to predict the LMP of the carcass and the different cuts, and to study and compare two different methodologies for selecting the variables (Variable Importance for Projection, VIP, and stepwise) to be included in the prediction equation. The error of prediction with cross-validation (RMSEPCV) of the LMP obtained with PLS and selection based on the VIP value was 0.82%, and for stepwise selection it was 0.83%. The prediction of the LMP scanning only the ham had an RMSEPCV of 0.97%; if the ham and the loin were scanned, the RMSEPCV was 0.90%. The results indicate that for CT data both VIP and stepwise selection are good methods. Moreover, scanning only the ham allowed us to obtain a good prediction of the LMP of the whole carcass.

Relevance: 10.00%

Abstract:

Modern sonic logging tools designed for shallow environmental and engineering applications allow for P-wave phase velocity measurements over a wide frequency band. Methodological considerations indicate that, for saturated unconsolidated sediments in the silt to sand range and source frequencies ranging from approximately 1 to 30 kHz, the observable poro-elastic P-wave velocity dispersion is sufficiently pronounced to allow for reliable first-order estimations of the underlying permeability structure. These predictions have been tested on and verified for a surficial alluvial aquifer. Our results indicate that, even without any further calibration, the thus obtained permeability estimates as well as their variabilities within the pertinent lithological units are remarkably close to those expected based on the corresponding granulometric characteristics.
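A first-order link between the observed dispersion and permeability, of the kind the abstract appeals to, comes from Biot's characteristic frequency f_c = φη/(2π ρ_f k), which separates the low- and high-frequency poro-elastic regimes; inverting it for k gives a rough permeability estimate. The numbers below are generic values for a water-saturated sand, not data from the aquifer in the study.

```python
import math

def permeability_from_biot_frequency(f_c, porosity, viscosity, rho_fluid):
    """Invert Biot's characteristic frequency f_c = phi*eta / (2*pi*rho_f*k)
    for the permeability k (m^2). A rough, first-order estimate only:
    tortuosity and dynamic-permeability corrections are neglected."""
    return porosity * viscosity / (2.0 * math.pi * rho_fluid * f_c)

# Generic water-saturated sand, with the dispersion inflection assumed
# to be observed near 10 kHz within the tool's ~1-30 kHz band:
k = permeability_from_biot_frequency(f_c=1.0e4, porosity=0.3,
                                     viscosity=1.0e-3, rho_fluid=1000.0)
print(k)   # ≈ 4.8e-12 m², i.e. a few darcys, in the coarse-sand range
```

The key point is the sensitivity direction: for a given porosity and fluid, a lower characteristic frequency (dispersion observed lower in the band) implies a higher permeability.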