814 results for swd: Benchmark
Abstract:
From the classical gold standard up to the current ERM2 arrangement of the European Union, target zones have been a widely used exchange-rate regime in contemporary history. This paper presents a benchmark model that rationalizes the choice of target zones over the alternative regimes: the fixed rate, the free float and the managed float. It is shown that the monetary authority may gain efficiency by reducing the volatility of both the exchange rate and the interest rate at the same time. Furthermore, the model is consistent with some well-known stylized facts in the empirical literature that previous models were not able to reproduce, namely the positive relation between the exchange rate and the interest-rate differential, the degree of non-linearity of the function linking the exchange rate to fundamentals, and the shape of the exchange rate's stochastic distribution.
Abstract:
We present a search and matching model with heterogeneous workers (entrants and incumbents) that replicates the stylized facts characterizing the US and Spanish labor markets. Under this benchmark, we find Post-Match Labor Turnover Costs (PMLTC) to be the centerpiece of the explanation of why the Spanish labor market is as volatile as the US one. The two driving forces governing this volatility are the gaps between entrants and incumbents in terms of separation costs and productivity. We use the model to analyze the cyclical implications of changes in labor market institutions affecting these two gaps. A scenario with a low degree of worker heterogeneity illustrates the model's suitability for understanding why the Spanish labor market has become as volatile as the US one.
Abstract:
This study examines the impact of active management strategies on the performance of fixed-income mutual funds, and proceeds in three stages. First, using the homogeneous information available to any investor, a risk profile is built for each fund from the types of risk associated with fixed income. Second, a performance measure is proposed that allows comparison across funds at two levels: on the one hand, taking a purely passive portfolio as the benchmark and, on the other, adapting the benchmark to the maturity of the portfolio. Third, a test is carried out to determine the impact on the performance of the funds studied of the strategy-activity indicators associated with each fund's risk profile.
Abstract:
Economic development goes hand in hand with an increase in the consumption of natural resources. Some analysts use material flows to describe this relationship [Eurostat 2001; Weisz et al., 2006], others exergy [Ayres et al., 2003]. This paper instead uses a characterisation of exosomatic energy metabolism based on expected benchmark values to describe possible constraints on economic development posed by available human time and energy. The aim of the paper is to identify types of exosomatic energy metabolism of different societies and to interpret their consequences for economic development. This is done by applying the accounting methodology called Multi-Scale Integrated Analysis of Societal Metabolism (MSIASM) to the particular case of energy metabolism in an analysis of the economies of Brazil, Chile and Venezuela.
Abstract:
This project carries out a study to determine whether a test application (benchmark) can be simulated in a reduced simulation time. To reduce simulation time, a set of significant fragments of the application's execution is selected. The goal is to obtain a simulation result as close as possible to that of the full simulation, but in less time. The method we use is called incremental and consists of dividing the simulation into intervals of one million instructions. Once divided, we simulate in steps. At each step intervals are added, and the simulation stops when the difference between the result of the current step and that of the previous one falls below a value chosen at the outset. An improvement is then proposed, implemented and its results reported: simulating a short interval prior to each significant interval to improve the result.
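The stopping rule of the incremental method can be sketched as follows. This is a minimal illustration, not the project's actual simulator: `run_interval` is a hypothetical callback that simulates one million-instruction interval and returns its metric (e.g. IPC), and the convergence threshold is the initially chosen value the abstract mentions.

```python
def incremental_simulation(run_interval, intervals, threshold):
    """Simulate in steps, adding one interval per step, and stop
    as soon as the running result changes by less than `threshold`
    between consecutive steps."""
    total = 0.0
    prev_avg = None
    for n, interval in enumerate(intervals, start=1):
        total += run_interval(interval)   # simulate one interval
        avg = total / n                   # result of the current step
        if prev_avg is not None and abs(avg - prev_avg) < threshold:
            return avg, n                 # converged: stop early
        prev_avg = avg
    return prev_avg, len(intervals)       # fell back to the full run
```

The trade-off is visible in the return value: the earlier the running average stabilizes, the fewer intervals need to be simulated, at the cost of some deviation from the full-simulation result.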
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computational assistance. In recent decades computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses a non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
Abstract:
Expectations about the future are central to the determination of current macroeconomic outcomes and to the formulation of monetary policy. Recent literature has explored ways of supplementing the benchmark of rational expectations with explicit models of expectations formation that rely on econometric learning. Some apparently natural policy rules turn out to imply expectational instability of private agents' learning. We use the standard New Keynesian model to illustrate this problem and survey the key results on interest-rate rules that deliver both uniqueness and stability of equilibrium under econometric learning. We then consider some practical concerns such as measurement errors in private expectations, observability of variables and learning of the structural parameters required for policy. We also discuss some recent applications, including policy design under perpetual learning, estimated models with learning, recurrent hyperinflations, and macroeconomic policy to combat liquidity traps and deflation.
Abstract:
In this paper a Social Accounting Matrix (SAM) is constructed for Libya for the year 2000. The procedure was divided into three steps. First, a macro SAM was constructed to consistently capture and represent the macroeconomic framework of the Libyan economy in 2000. Second, that macro SAM was disaggregated into a micro SAM incorporating the accounts for individual activities, primary factors and the main economic institutions. The SAM obtained in this way was not balanced, so in the final step we balanced it using a cross-entropy procedure in the General Algebraic Modelling System (GAMS). This SAM integrates national income, input-output, flow-of-funds and foreign trade statistics into a comprehensive and consistent dataset. The lack of coherent time-series data for Libya is a serious obstacle for applied research that uses econometric analysis. Our main intention in constructing this SAM has been to provide benchmark data for economy-wide analysis using CGE modelling for Libya.
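As a rough illustration of the balancing step, here is a minimal sketch in Python using simple iterative proportional scaling (an RAS-style heuristic) rather than the cross-entropy program the paper solves in GAMS; function names and tolerances are assumptions. The balance condition for a SAM is that each account's row total (incomes) equals its column total (expenditures).

```python
def balance_sam(sam, tol=1e-9, max_iter=1000):
    """Balance a social accounting matrix so each account's row
    total equals its column total, by repeatedly scaling rows and
    columns toward the average of the two totals."""
    n = len(sam)
    a = [row[:] for row in sam]          # work on a copy
    for _ in range(max_iter):
        rows = [sum(a[i]) for i in range(n)]
        cols = [sum(a[i][j] for i in range(n)) for j in range(n)]
        if max(abs(rows[i] - cols[i]) for i in range(n)) < tol:
            break                         # balanced to tolerance
        targets = [(rows[i] + cols[i]) / 2 for i in range(n)]
        for i in range(n):                # scale rows to targets
            if rows[i] > 0:
                f = targets[i] / rows[i]
                a[i] = [x * f for x in a[i]]
        cols = [sum(a[i][j] for i in range(n)) for j in range(n)]
        for j in range(n):                # then scale columns
            if cols[j] > 0:
                f = targets[j] / cols[j]
                for i in range(n):
                    a[i][j] *= f
    return a
```

Unlike this biproportional heuristic, the cross-entropy approach used in the paper minimizes the Kullback-Leibler distance to the unbalanced SAM subject to the same balance constraints, which makes the statistical assumptions explicit.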
Abstract:
We consider optimal monetary and fiscal policies in a New Keynesian model of a small open economy with sticky prices and wages. In this benchmark setting monetary policy is all we need: analytical results demonstrate that variations in government spending should play no role in the stabilization of shocks. In extensions we show, firstly, that this is true even when allowing for inflation inertia through backward-looking rule-of-thumb price and wage-setting, as long as there is no discrepancy between the private and social evaluation of the marginal rate of substitution between consumption and leisure. Secondly, the optimal neutrality of government spending is robust to the issuance of public debt. In the presence of debt, government spending will deviate from the optimal steady state, but only to the extent required to cover the deficit, not to provide any additional macroeconomic stabilization. However, unlike government spending, variations in tax rates can play a complementary role to monetary policy, as they change relative prices rather than demand.
Abstract:
We forecast quarterly US inflation based on the generalized Phillips curve using econometric methods which incorporate dynamic model averaging. These methods not only allow for coefficients to change over time, but also allow for the entire forecasting model to change over time. We find that dynamic model averaging leads to substantial forecasting improvements over simple benchmark regressions and more sophisticated approaches such as those using time-varying coefficient models. We also provide evidence on which sets of predictors are relevant for forecasting in each period.
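For readers unfamiliar with dynamic model averaging, the recursive weight updates can be sketched as below. This is a minimal illustration in the spirit of forgetting-factor DMA, not the authors' actual code: `pred_liks`, `alpha` and the function name are assumptions, and the models' predictive likelihoods are taken as given inputs.

```python
def dma_weights(pred_liks, alpha=0.99):
    """Recursive model-probability updates for dynamic model
    averaging.  pred_liks[t][k] is model k's predictive likelihood
    of the observation at time t; alpha < 1 is a forgetting factor
    that lets the favored model change over time.  Returns the
    predictive weights used to combine forecasts at each t."""
    n_models = len(pred_liks[0])
    post = [1.0 / n_models] * n_models        # flat initial weights
    weights_path = []
    for liks in pred_liks:
        # prediction step: forgetting flattens last period's weights
        pred = [p ** alpha for p in post]
        s = sum(pred)
        pred = [p / s for p in pred]
        weights_path.append(pred)             # weights used to forecast
        # updating step: reward models that predicted the data well
        post = [w * lik for w, lik in zip(pred, liks)]
        s = sum(post)
        post = [p / s for p in post]
    return weights_path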
Abstract:
A number of different models in behavioral economics have a reduced-form representation in which potentially boundedly rational decision-makers do not necessarily internalize all the consequences of their actions for payoff-relevant features of the choice environment (which we label psychological states). This paper studies the restrictions that such behavioral models impose on choice data and the implications they have for welfare analysis. First, we propose a welfare benchmark that is justified using standard axioms of rational choice and can be applied to a number of existing seminal behavioral economics models. Second, we show that Sen's axioms fully characterize choice data consistent with behavioral decision-makers. Third, we show how choice data can be used to infer information about the normative significance of psychological states, and we establish the possibility of identifying welfare-dominated choices.
Abstract:
An expanding literature articulates the view that Taylor rules are helpful in predicting exchange rates. In a changing world, however, Taylor rule parameters may be subject to structural instabilities, for example during the Global Financial Crisis. This paper forecasts exchange rates using such Taylor rules with Time-Varying Parameters (TVP) estimated by Bayesian methods. In core out-of-sample results, we improve upon a random walk benchmark for at least half, and for as many as eight out of ten, of the currencies considered. This contrasts with a constant-parameter Taylor rule model that yields a more limited improvement upon the benchmark. In further results, Purchasing Power Parity and Uncovered Interest Rate Parity TVP models beat a random walk benchmark, implying that our methods have some generality in exchange rate prediction.
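The random-walk comparison underlying these out-of-sample results can be illustrated with a minimal sketch; the helper names are hypothetical. The benchmark simply forecasts no change, so beating it means the model's one-step-ahead forecasts have a lower out-of-sample RMSE than last period's observed value.

```python
import math

def out_of_sample_rmse(actual, forecasts):
    """Root mean squared error of one-step-ahead forecasts."""
    errs = [(a - f) ** 2 for a, f in zip(actual, forecasts)]
    return math.sqrt(sum(errs) / len(errs))

def beats_random_walk(series, model_forecasts):
    """True if the model's forecasts of series[1:] achieve a lower
    RMSE than the no-change (random walk) benchmark series[:-1]."""
    actual = series[1:]
    rw = series[:-1]          # random walk: predict last observation
    return (out_of_sample_rmse(actual, model_forecasts)
            < out_of_sample_rmse(actual, rw))
```

In practice this raw RMSE comparison is usually accompanied by a formal test of equal predictive accuracy, since small RMSE gains over the random walk can be statistically fragile.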
Abstract:
We analyse the role of time-variation in coefficients and other sources of uncertainty in exchange rate forecasting regressions. Our techniques incorporate the notion that the relevant set of predictors, and their corresponding weights, change over time. We find that predictive models which allow for sudden, rather than smooth, changes in coefficients significantly beat the random walk benchmark in an out-of-sample forecasting exercise. Using an innovative variance decomposition scheme, we identify uncertainty in coefficient estimation and uncertainty about the precise degree of coefficient variability as the main factors hindering the models' forecasting performance. The uncertainty regarding the choice of predictors is small.
Abstract:
Background: Anaesthesia Databank Switzerland (ADS) is a voluntary data registry introduced in 1996. Its goal was to promote quality in anaesthesiology. Methods: Analysis of routinely recorded adverse events, with internal and external benchmark comparisons between anaesthesia departments. Results: In 2010, the database included 2,158,735 anaesthetic procedures. Forty-four anaesthesia departments were participating in the data collection in 2010. Over time, the number of patients in older age groups increased, the largest group being patients aged 50 to 64 years. Over time, the percentage of patients with ASA physical status score 1 decreased while the number of ASA 2 or 3 patients increased. The most frequent co-morbidities were hypertension (21%), smoking (16%), allergy (15%) and obesity (12%). Between 1996 and 2010, 146,459 adverse events were recorded, of which 34% were cardiovascular, 7% respiratory, 39% specific to anaesthesia and 17% nonspecific. The overall proportion of adverse events decreased over time, whatever their severity. Conclusion: The ADS routine data collection contributes to monitoring trends in anaesthesia care in Switzerland.