867 results for Time-Consistent Policy


Relevance:

30.00%

Publisher:

Abstract:

Transport is an essential sector in modern societies: it connects economic sectors and industries. Alongside its contribution to economic development and social interconnection, it also causes adverse environmental impacts and health hazards. Transport is a major source of ground-level air pollution, especially in urban areas, and therefore contributes to health problems such as cardiovascular and respiratory diseases, cancer, and physical injuries. This thesis presents the results of a health risk assessment that quantifies the mortality and disease associated with particulate matter pollution from urban road transport in Hai Phong City, Vietnam. The focus is on integrating modelling and GIS approaches in the exposure analysis to increase the accuracy of the assessment and to produce timely and consistent results. Modelling was used to estimate traffic conditions and particulate matter concentrations from geo-referenced data. A simplified health risk assessment was also carried out for Ha Noi based on monitoring data, allowing a comparison of results between the two cases. The case studies show that a health risk assessment based on modelled data can provide much more detailed results and allows the health impacts of different mobility development options to be assessed at the micro level. Using modelling and GIS as a common platform for integrating different assessments (environmental, health, socio-economic, etc.) offers several strengths, especially in capitalising on available data stored in different units and forms and in handling large amounts of data. From a decision-making point of view, the use of models and GIS in a health risk assessment can reduce processing and waiting time while providing views at different scales, from the micro scale (sections of a city) to the macro scale. It also helps visualise the links between air quality and health outcomes, which is useful when discussing different development options. However, a number of improvements can be made to further advance the integration. An improved data integration programme will facilitate the application of integrated models in policy-making. Data from mobility surveys and environmental monitoring and measurement must be standardised and given legal status. Various traffic models, together with emission and dispersion models, should be tested, and more attention should be given to their uncertainty and sensitivity.
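
As a rough illustration of the exposure-response step such assessments rely on, the sketch below computes PM-attributable mortality with a standard log-linear concentration-response function; the coefficient, baseline mortality rate, and population figures are hypothetical placeholders, not values from the thesis.

```python
import math

def attributable_deaths(concentration, threshold, beta, baseline_rate, population):
    """Deaths attributable to PM exposure above a threshold, using a
    log-linear concentration-response function RR = exp(beta * dC)."""
    if concentration <= threshold:
        return 0.0
    rr = math.exp(beta * (concentration - threshold))
    paf = (rr - 1.0) / rr        # population attributable fraction
    return baseline_rate * population * paf

# Hypothetical inputs: 65 ug/m3 annual mean PM10 against a 10 ug/m3
# threshold, an assumed epidemiological beta, and a city of 1.8 million
deaths = attributable_deaths(concentration=65.0, threshold=10.0,
                             beta=0.0008, baseline_rate=0.006,
                             population=1_800_000)
print(f"Estimated attributable deaths per year: {deaths:.0f}")
```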

Relevance:

30.00%

Publisher:

Abstract:

This paper studies the effects of monetary policy on mutual fund risk taking using a sample of Portuguese fixed-income mutual funds over the 2000-2012 period. First, I estimate time-varying measures of risk exposure (betas) for the individual funds, for the benchmark portfolio, and for a representative equally-weighted portfolio, through 24-month rolling regressions of a two-factor model with two systematic risk factors: interest rate risk (TERM) and default risk (DEF). Second, using the estimated betas, I examine what portion of the risk exposure is in excess of the benchmark (active risk) and how it relates to monetary policy proxies (the one-month rate, the Taylor residual, the real rate, and the first principal component of a cross-section of government yields and rates). Using this methodology, I provide empirical evidence that Portuguese fixed-income mutual funds respond to accommodative monetary policy by significantly increasing their exposure, in excess of their benchmarks, to default risk, and slightly to interest rate risk as well. I also find that the increase in funds' risk exposure to boost returns (search for yield) is more pronounced following the 2007-2009 global financial crisis, indicating that the current historically low interest rates may incentivize excessive risk taking. My results suggest that monetary policy affects the risk appetite of non-bank financial intermediaries.
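
A minimal sketch of the first-stage rolling estimation described above, assuming a monthly pandas DataFrame (here called df, with columns ret, TERM and DEF) that is not part of the abstract; the 24-month window follows the paper, everything else is illustrative.

```python
import pandas as pd
import statsmodels.api as sm

def rolling_betas(df: pd.DataFrame, window: int = 24) -> pd.DataFrame:
    """Rolling OLS of fund excess returns on the TERM and DEF factors,
    returning the time series of estimated betas."""
    rows = []
    for end in range(window, len(df) + 1):
        chunk = df.iloc[end - window:end]
        X = sm.add_constant(chunk[["TERM", "DEF"]])
        res = sm.OLS(chunk["ret"], X).fit()
        rows.append({"date": chunk.index[-1],
                     "beta_term": res.params["TERM"],
                     "beta_def": res.params["DEF"]})
    return pd.DataFrame(rows).set_index("date")

# Second stage (active risk): subtract the benchmark's betas and regress
# the difference on monetary policy proxies, e.g.
# active = rolling_betas(fund_df) - rolling_betas(benchmark_df)
```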

Relevance:

30.00%

Publisher:

Abstract:

Liver transplantation is now the standard treatment for end-stage liver disease. Given the shortage of liver donors and the progressively larger number of patients waiting for transplantation, improvements in patient selection and optimization of the timing of transplantation are needed. Several solutions have been suggested, including increasing the donor pool; a fair allocation policy that does not permit variables such as age, gender, race, or third-party payer status to play any role; and knowledge of the natural history of each liver disease for which transplantation is offered. To observe ethical rules and distributive justice (guaranteeing every citizen the same opportunity to receive an organ), the "sickest first" policy must be used. Studies have demonstrated that death is related not to waiting time but to the severity of liver disease at the time of inclusion. Thus, waiting time is no longer part of the United Network for Organ Sharing distribution criteria; it only differentiates between equally severely diseased patients. The authors analyzed waiting list mortality and 1-year survival for patients in the State of São Paulo from July 1997 through January 2001, when only the chronological criterion was used. According to data from the "Secretaria de Estado da Saúde de São Paulo", among all waiting list deaths, 82.2% occurred within the first year and 37.6% within the first 3 months following inclusion. The allocation of livers based on waiting time is neither fair nor ethical, impairs distributive justice and human rights, and does not occur in any other part of the world.

Relevance:

30.00%

Publisher:

Abstract:

The observational method in tunnel engineering allows the actual conditions of the ground to be evaluated in real time and measures to be taken if its behavior deviates considerably from predictions. However, it lacks a consistent and structured methodology for using monitoring data to adapt the support system in real time. Limit criteria above which adaptation is required are not defined, and complex inverse analysis procedures (Rechea et al. 2008, Levasseur et al. 2010, Zentar et al. 2001, Lecampion et al. 2002, Finno and Calvello 2005, Goh 1999, Cui and Pan 2012, Deng et al. 2010, Mathew and Lehane 2013, Sharifzadeh et al. 2012, 2013) may be needed to analyze the problem consistently. In this paper a methodology for the real-time adaptation of support systems during tunneling is presented. In a first step, limit criteria for displacements and stresses are proposed. The methodology uses graphics constructed during the project stage, based on parametric calculations, to assist in the process; when these graphics are not available, since it is not possible to predict every possible scenario, inverse analysis calculations are carried out. The methodology is applied to the "Bois de Peu" tunnel, which is composed of two tubes, each over 500 m long. High levels of uncertainty existed concerning the heterogeneity of the soil and, consequently, the geomechanical design parameters. The methodology was applied in four sections, and the results focus on two of them. It is shown that the methodology has the potential to be applied in real cases, contributing to a consistent approach to real-time adaptation of the support system, and the results highlight the importance of good-quality, specific monitoring data for improving the inverse analysis procedure.
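
The limit-criteria check at the heart of such a methodology can be pictured as the small routine below; the thresholds, names, and decision rules are hypothetical, intended only to show how monitoring data, project-stage parametric charts, and inverse analysis could be chained together.

```python
def check_section(displacement_mm, warning_mm=20.0, alarm_mm=35.0,
                  covered_by_charts=True):
    """Classify a monitored tunnel section against limit criteria and
    decide whether support adaptation or inverse analysis is needed."""
    if displacement_mm < warning_mm:
        return "OK: keep current support system"
    if displacement_mm < alarm_mm:
        action = "adapt support using project-stage parametric charts"
    else:
        action = "strengthen support immediately"
    if not covered_by_charts:
        # Scenario not anticipated at the project stage: back-calculate
        # ground parameters from the monitoring data instead
        action += "; run inverse analysis to update ground parameters"
    return f"LIMIT EXCEEDED ({displacement_mm:.1f} mm): {action}"

print(check_section(27.3, covered_by_charts=False))
```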

Relevance:

30.00%

Publisher:

Abstract:

Large-scale distributed data stores rely on optimistic replication to scale and remain highly available in the face of network partitions. Managing data without coordination results in eventually consistent data stores that allow concurrent data updates. These systems often use anti-entropy mechanisms (like Merkle trees) to detect and repair divergent data versions across nodes. However, in practice hash-based data structures are too expensive for large amounts of data and create too many false conflicts. Another aspect of eventual consistency is detecting write conflicts. Logical clocks are often used to track data causality, which is necessary to detect causally concurrent writes on the same key. However, there is a non-negligible metadata overhead per key, which also keeps growing with time, proportionally to the node churn rate. A further challenge is deleting keys while respecting causality: while the values can be deleted, per-key metadata cannot be permanently removed without coordination. We introduce a new causality management framework for eventually consistent data stores that leverages node logical clocks (Bitmapped Version Vectors) and a new key logical clock (Dotted Causal Container) to provide advantages on multiple fronts: 1) a new, efficient and lightweight anti-entropy mechanism; 2) greatly reduced per-key causality metadata size; 3) accurate key deletes without permanent metadata.
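
Bitmapped Version Vectors and Dotted Causal Containers are specific to the paper, so the sketch below only illustrates the primitive they build on: plain version-vector comparison for detecting causally concurrent writes on the same key.

```python
def compare(vv_a: dict, vv_b: dict) -> str:
    """Compare two version vectors (maps from node id to counter)."""
    nodes = set(vv_a) | set(vv_b)
    a_behind = any(vv_a.get(n, 0) < vv_b.get(n, 0) for n in nodes)
    b_behind = any(vv_b.get(n, 0) < vv_a.get(n, 0) for n in nodes)
    if a_behind and b_behind:
        return "concurrent"        # conflicting writes: keep both versions
    if a_behind:
        return "a happens-before b"
    if b_behind:
        return "b happens-before a"
    return "equal"

print(compare({"n1": 2, "n2": 1}, {"n1": 1, "n2": 3}))  # -> concurrent
```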

Relevance:

30.00%

Publisher:

Abstract:

This paper analyzes the propagation of monetary policy shocks through the creation of credit in an economy. Models of the monetary transmission mechanism typically feature responses which last only a few quarters, contrary to what the empirical evidence suggests. To propagate the impact of monetary shocks over time, these models introduce adjustment costs under which agents find it optimal to change their decisions slowly. This paper presents another explanation that does not rely on any sort of adjustment costs or stickiness. In our economy, agents own assets and make occupational choices. Banks intermediate between agents demanding and supplying assets. Our interpretation is based on the way banks create credit and on how the monetary authority affects the process of financial intermediation through its monetary policy. As the central bank lowers the interest rate by buying government bonds in exchange for reserves, highly productive entrepreneurs are able to borrow more resources from low-productivity agents. We show that this movement of capital among agents sets in motion a response of the economy that resembles an expansionary phase of the cycle.

Relevance:

30.00%

Publisher:

Abstract:

There are both theoretical and empirical reasons for believing that the parameters of macroeconomic models may vary over time. However, work with time-varying parameter models has largely involved vector autoregressions (VARs), ignoring cointegration, despite the fact that cointegration plays an important role in informing macroeconomists on a range of issues. In this paper we develop time-varying parameter models which permit cointegration. Time-varying parameter VARs (TVP-VARs) typically use state-space representations to model the evolution of parameters. We show that it is not sensible to use straightforward extensions of TVP-VARs when allowing for cointegration. Instead we develop a specification which allows the cointegrating space to evolve over time in a manner comparable to the random-walk variation used with TVP-VARs. The properties of our approach are investigated before developing a method of posterior simulation. We use our methods in an empirical investigation involving a permanent/transitory variance decomposition for inflation.
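
For orientation, the random-walk parameter evolution that the abstract refers to can be simulated in a few lines; this is the generic TVP state equation beta_t = beta_{t-1} + eta_t, not the paper's cointegration-space specification, and all dimensions and variances are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)
T, k = 200, 2                 # sample length and number of coefficients
Q = 0.01 * np.eye(k)          # assumed state innovation covariance

# State equation: coefficients follow a random walk
beta = np.zeros((T, k))
for t in range(1, T):
    beta[t] = beta[t - 1] + rng.multivariate_normal(np.zeros(k), Q)

# Measurement equation: y_t = x_t' beta_t + eps_t
x = rng.normal(size=(T, k))
y = (x * beta).sum(axis=1) + rng.normal(scale=0.1, size=T)
print(beta[-1], y[:3])
```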

Relevance:

30.00%

Publisher:

Abstract:

The possibility of low-probability extreme events has reignited the debate over the optimal intensity and timing of climate policy. In this paper we therefore contribute to the literature by assessing the implications of low-probability extreme events for environmental policy in a continuous-time real options model with "tail risk". In a nutshell, our results indicate the importance of tail risk and call for foresighted, pre-emptive climate policies.

Relevance:

30.00%

Publisher:

Abstract:

These notes try to clarify some discussions on the formulation of individual intertemporal behavior under adaptive learning in representative agent models. First, we discuss two suggested approaches and related issues in the context of a simple consumption-saving model. Second, we show that the analysis of learning in the New Keynesian monetary policy model based on "Euler equations" provides a consistent and valid approach.
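
As a minimal illustration of adaptive learning (not the notes' own model), the sketch below runs a constant-gain recursion in which agents update a scalar belief from each period's forecast error; the gain and the data-generating process are assumed.

```python
import numpy as np

rng = np.random.default_rng(1)
T, gain = 500, 0.05
mu_true = 2.0       # fixed point the learning should converge around
belief = 0.0        # agents' initial estimate

for t in range(T):
    y = mu_true + rng.normal(scale=0.5)   # observed outcome this period
    belief += gain * (y - belief)         # constant-gain update on the forecast error

print(f"Final belief {belief:.3f} vs fixed point {mu_true}")
```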

Relevance:

30.00%

Publisher:

Abstract:

Official calculations of automatic stabilizers are seriously flawed, since they rest on the assumption that the only element of social spending that reacts automatically to the cycle is unemployment compensation. This puts into question many estimates of discretionary fiscal policy. In response, we propose a simultaneous estimate of automatic and discretionary fiscal policy. This leads us, quite naturally, to a tripartite decomposition of the budget balance into revenues, social spending, and other spending as a bare minimum. Our headline result for a panel of 20 OECD countries in 1981-2003 is 0.59 percentage points of automatic stabilization in primary surplus balances. All of this stabilization remains after discretionary responses during contractions, but arguably only about three fifths of it remains so in expansions, while discretionary behavior cancels the rest. We pay close attention to the impact of the Maastricht Treaty and the SGP on the EU members of our sample, and to real-time data.
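
The simultaneous estimation sketched above reduces, in its simplest form, to regressing each budget component on the cycle; a stripped-down panel version could look like the following, where the DataFrame panel and its column names (output_gap, country, and the three components) are assumptions, and country fixed effects stand in for the paper's full specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

def cyclical_sensitivity(panel: pd.DataFrame, component: str) -> float:
    """OLS of a budget component (% of GDP) on the output gap with
    country fixed effects; the gap coefficient measures the automatic
    response of that component to the cycle."""
    res = smf.ols(f"{component} ~ output_gap + C(country)", data=panel).fit()
    return res.params["output_gap"]

# Tripartite decomposition of the budget balance:
# for comp in ["revenues", "social_spending", "other_spending"]:
#     print(comp, cyclical_sensitivity(panel, comp))
```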

Relevance:

30.00%

Publisher:

Abstract:

The key message of recent natural science studies is that the 2 °C target can only be achieved with massive emission reductions in the next few years. The central twist of this paper is to add this limited time to act to a non-perpetual real options framework analysing optimal climate policy under uncertainty. The window-of-opportunity modelling setup shows that the limited time to act may spark a trend reversal in the direction of low-carbon alternatives. However, the implementation of climate policy is impeded by high uncertainty about possible climate pathways.

Relevance:

30.00%

Publisher:

Abstract:

The monetary policy reaction function of the Bank of England is estimated by the standard GMM approach and the ex-ante forecast method developed by Goodhart (2005), with particular attention to the horizons for inflation and output at which each approach gives the best fit. The horizons for the ex-ante approach are much closer to what is implied by the Bank’s view of the transmission mechanism, while the GMM approach produces an implausibly slow adjustment of the interest rate, and suffers from a weak instruments problem. These findings suggest a strong preference for the ex-ante approach.
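
In the spirit of the ex-ante approach, the reaction function is estimated on the central bank's own published forecasts rather than on realized outcomes and instruments; a minimal sketch, assuming a hypothetical DataFrame df with rate, infl_fcast and gap_fcast columns (not data from the paper):

```python
import pandas as pd
import statsmodels.api as sm

def ex_ante_reaction_function(df: pd.DataFrame):
    """Forward-looking Taylor-type rule estimated on the Bank's forecasts:
    i_t = c + rho*i_{t-1} + b*E_t[pi_{t+h}] + g*E_t[y_{t+k}] + e_t."""
    X = pd.DataFrame({
        "lag_rate": df["rate"].shift(1),        # interest rate smoothing
        "infl_fcast": df["infl_fcast"],         # inflation forecast, horizon h
        "gap_fcast": df["gap_fcast"],           # output forecast, horizon k
    }).dropna()
    y = df["rate"].loc[X.index]
    return sm.OLS(y, sm.add_constant(X)).fit()

# res = ex_ante_reaction_function(df); print(res.summary())
```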

Relevance:

30.00%

Publisher:

Abstract:

There has been much debate regarding the electoral strategy adopted by New Labour in the lead-up to and during their time in government. This paper addresses the issue from the perspective of left/right and libertarian/authoritarian considerations by examining data on individual attitudes from the British Social Attitudes survey between 1986 and 2009. The analysis indicates that New Labour's move towards the right on economic and public policy was the main driver in attracting new centrist voters and could thus be labelled 'broadly' populist. The move towards a tougher stance on law and order was more 'narrowly' populist, in that it was used more to minimise the loss of support from Labour's traditional base on the left than to attract new votes.