Abstract:
Estimated Taylor rules became popular as a description of monetary policy conduct. There are numerous reasons why actual monetary policy can be asymmetric and the estimated Taylor rule nonlinear. This paper tests whether monetary policy can be described as asymmetric in three new European Union (EU) members (the Czech Republic, Hungary and Poland), which apply an inflation targeting regime. Two different empirical frameworks are
Abstract:
We examine whether and how main central banks responded to episodes of financial stress over the last three decades. We employ a new methodology for estimating monetary policy rules that allows for time-varying response coefficients and corrects for endogeneity. This flexible framework, applied to the U.S., U.K., Australia, Canada and Sweden together with a new financial stress dataset developed by the International Monetary Fund, allows us not only to test whether central banks responded to financial stress, but also to detect the periods and types of stress that most worried monetary authorities, and to quantify the intensity of the policy response. Our findings suggest that central banks often change policy
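The idea of a policy rule with time-varying response coefficients can be illustrated with a minimal rolling-OLS sketch on synthetic data. This is a simplification: the paper's actual estimator also corrects for endogeneity, which plain OLS does not, and all variable names and coefficient values here are illustrative, not taken from the paper.

```python
# Hypothetical sketch: detecting a time-varying response to financial
# stress in a Taylor-type rule via rolling OLS on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
T = 200
inflation = rng.normal(2.0, 1.0, T)
output_gap = rng.normal(0.0, 1.0, T)
stress = rng.normal(0.0, 1.0, T)
# The simulated stress response switches on in the second half of the sample.
beta_stress = np.where(np.arange(T) < 100, 0.0, -0.5)
rate = (1.0 + 1.5 * inflation + 0.5 * output_gap
        + beta_stress * stress + rng.normal(0.0, 0.1, T))

def rolling_taylor(rate, inflation, output_gap, stress, window=60):
    """Return per-window OLS coefficients [const, pi, gap, stress]."""
    coefs = []
    for start in range(len(rate) - window + 1):
        sl = slice(start, start + window)
        X = np.column_stack([np.ones(window), inflation[sl],
                             output_gap[sl], stress[sl]])
        beta, *_ = np.linalg.lstsq(X, rate[sl], rcond=None)
        coefs.append(beta)
    return np.array(coefs)

coefs = rolling_taylor(rate, inflation, output_gap, stress)
# Early windows recover a near-zero stress coefficient; late windows
# recover a coefficient near -0.5.
print(coefs[0, 3], coefs[-1, 3])
```

The window-by-window coefficient path is what lets one date when the stress response appears, which is the kind of detection the abstract describes.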
Abstract:
This paper investigates the effects of fiscal policy on the trade balance using a structural factor model. A fiscal policy shock worsens the trade balance and produces an appreciation of the domestic currency but the effects are quantitatively small. The findings match the theoretical predictions of the standard Mundell-Fleming model, although fiscal policy should not be considered one of the main causes of the large US external deficit. My conclusions differ from those reached using VAR models since the fiscal shock, possibly due to fiscal foresight, is nonfundamental for the variables typically used in open economy VARs.
Abstract:
This paper addresses the issue of policy evaluation in a context in which policymakers are uncertain about the effects of oil prices on economic performance. I consider models of the economy inspired by Solow (1980), Blanchard and Gali (2007), Kim and Loungani (1992) and Hamilton (1983, 2005), which incorporate different assumptions on the channels through which oil prices have an impact on economic activity. I first study the characteristics of the model space and I analyze the likelihood of the different specifications. I show that the existence of plausible alternative representations of the economy forces the policymaker to face the problem of model uncertainty. Then, I use the Bayesian approach proposed by Brock, Durlauf and West (2003, 2007) and the minimax approach developed by Hansen and Sargent (2008) to integrate this form of uncertainty into policy evaluation. I find that, in the environment under analysis, the standard Taylor rule is outperformed under a number of criteria by alternative simple rules in which policymakers introduce persistence in the policy instrument and respond to changes in the real price of oil.
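The class of simple rules being compared can be sketched as follows; the symbols (response coefficients \(\phi\), smoothing parameter \(\rho\), real oil price \(o_t\)) are illustrative notation, not taken from the paper.

```latex
% Standard Taylor rule:
i_t = r^* + \pi_t + \phi_\pi (\pi_t - \pi^*) + \phi_y y_t
% Alternative rule with interest-rate smoothing (persistence \rho)
% and a response to the real price of oil o_t:
i_t = \rho\, i_{t-1} + (1 - \rho)\left[ r^* + \pi_t
      + \phi_\pi (\pi_t - \pi^*) + \phi_y y_t + \phi_o o_t \right]
```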
Abstract:
We present a dynamic model where the accumulation of patents generates an increasing number of claims on sequential innovation. We compare innovation activity under three regimes (patents, no patents, and patent pools) and find that none of them can reach the first best. We find that the first best can be reached through a decentralized tax-subsidy mechanism, by which innovators receive a subsidy when they innovate, and are taxed with subsequent innovations. This finding implies that optimal transfers work in the exact opposite way as traditional patents. Finally, we consider patents of finite duration and determine the optimal patent length.
Abstract:
This paper characterizes a mixed strategy Nash equilibrium in a one-dimensional Downsian model of two-candidate elections with a continuous policy space, where candidates are office motivated and one candidate enjoys a non-policy advantage over the other candidate. We assume that voters have quadratic preferences over policies and that their ideal points are drawn from a uniform distribution over the unit interval. In our equilibrium the advantaged candidate chooses the expected median voter with probability one and the disadvantaged candidate uses a mixed strategy that is symmetric around it. We show that this equilibrium exists if the number of voters is large enough relative to the size of the advantage.
Abstract:
This dissertation focuses on the practice of regulatory governance, through a study of the functioning of formally independent regulatory agencies (IRAs), with special attention to their de facto independence. The research goals are grounded in a "neo-positivist" (or "reconstructed positivist") position (Hawkesworth 1992; Radaelli 2000b; Sabatier 2000). This perspective starts from the ontological assumption that even if subjective perceptions are constitutive elements of political phenomena, a real world exists beyond any social construction and can, however imperfectly, become the object of scientific inquiry. Epistemologically, it follows that hypothetical-deductive theories with explanatory aims can be tested by employing a proper methodology and set of analytical techniques. It is thus possible to make scientific inferences and general conclusions to a certain extent, according to a Bayesian conception of knowledge, in order to update the prior scientific beliefs in the truth of the related hypotheses (Howson 1998), while acknowledging the fact that the conditions of truth are at least partially subjective and historically determined (Foucault 1988; Kuhn 1970). At the same time, a sceptical position is adopted towards the supposed disjunction between facts and values and the possibility of discovering abstract universal laws in social science. It has been observed that the current version of capitalism corresponds to the golden age of regulation, and that since the 1980s no government activity in OECD countries has grown faster than regulatory functions (Jacobs 1999). Following an apparent paradox, the ongoing dynamics of liberalisation, privatisation, decartelisation, internationalisation, and regional integration hardly led to the crumbling of the state, but instead promoted a wave of regulatory growth in the face of new risks and new opportunities (Vogel 1996).
Accordingly, a new order of regulatory capitalism is rising, implying a new division of labour between state and society and entailing the expansion and intensification of regulation (Levi-Faur 2005). The previous order, relying on public ownership and public intervention and/or on sectoral self-regulation by private actors, is being replaced by a more formalised, expert-based, open, and independently regulated model of governance. Independent regulatory agencies (IRAs), that is, formally independent administrative agencies with regulatory powers that benefit from public authority delegated from political decision makers, represent the main institutional feature of regulatory governance (Gilardi 2008). IRAs constitute a relatively new technology of regulation in western Europe, at least for certain domains, but they are increasingly widespread across countries and sectors. For instance, independent regulators have been set up to regulate very diverse issues, such as general competition, banking and finance, telecommunications, civil aviation, railway services, food safety, the pharmaceutical industry, electricity, environmental protection, and personal data privacy. Two attributes of IRAs deserve special mention. On the one hand, they are formally separated from democratic institutions and elected politicians, thus raising normative and empirical concerns about their accountability and legitimacy. On the other hand, hard questions about their role as political actors remain unaddressed (IRAs often accumulate executive, (quasi-)legislative, and adjudicatory functions alongside their regulatory competencies), as do questions about their performance.
Abstract:
The aim of the study was to determine objective radiological signs of danger to life in survivors of manual strangulation and to establish a radiological scoring system for the differentiation between life-threatening and non-life-threatening strangulation by dividing the cross section of the neck into three zones (superficial, middle and deep). Forensic pathologists classified 56 survivors of strangulation into life-threatening and non-life-threatening cases by history and clinical examination alone, and two blinded radiologists evaluated the MRIs of the neck. In 15 cases, strangulation was life-threatening (27%), compared with 41 cases in which strangulation was non-life-threatening (73%). The best radiological signs on MRI to differentiate between the two groups were intramuscular haemorrhage/oedema, swelling of the platysma and intracutaneous bleeding (all p = 0.02), followed by subcutaneous bleeding (p = 0.034) and haemorrhagic lymph nodes (p = 0.04), all indicating life-threatening strangulation. The radiological scoring system showed a sensitivity and specificity of approximately 70% for life-threatening strangulation when at least two neck zones were affected. MRI is not only helpful in assessing the severity of strangulation, but is also an excellent documentation tool that is even admissible in court.
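The zone-count decision rule described in the abstract can be sketched as a tiny classifier: the neck cross section is divided into superficial, middle and deep zones, and a case is flagged as life-threatening when at least two zones show findings. The function and field names are illustrative, not from the paper.

```python
# Hypothetical sketch of the "at least two affected neck zones" rule.
def life_threatening(zone_findings, threshold=2):
    """zone_findings maps zone name -> bool (any MRI finding present).

    Returns True when the number of affected zones meets the threshold.
    """
    affected = sum(1 for present in zone_findings.values() if present)
    return affected >= threshold

# Example case: findings in two of the three zones.
case = {"superficial": True, "middle": True, "deep": False}
print(life_threatening(case))  # two zones affected -> True
```

The reported ~70% sensitivity and specificity refer to this two-zone cutoff as evaluated in the study, not to anything computed here.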
Abstract:
Purpose: Recently, morphometric measurements of the ascending aorta have been performed with ECG-gated MDCT to help the development of future endovascular therapies (TCT) [1]. However, the variability of these measurements remains unknown. It would be interesting to know the impact of CAD (computer-aided diagnosis), with automated segmentation of the vessel and automatic measurement of diameters, on the management of ascending aorta aneurysms. Methods and Materials: Thirty patients referred for ECG-gated CT thoracic angiography (64-row CT scanner) were evaluated. Measurements of the maximum and minimum ascending aorta diameters were obtained automatically with a commercially available CAD and semi-manually by two observers separately. The CAD algorithms segment the iv-enhanced lumen of the ascending aorta into perpendicular planes along the centreline. The CAD then determines the largest and the smallest diameters. Both observers repeated the automatic measurements and the semi-manual measurements during a different session at least one month after the first measurements. The Bland and Altman method was used to study the inter/intraobserver variability. A Wilcoxon signed-rank test was also used to analyse differences between observers. Results: Interobserver variability for semi-manual measurements between the first and second observers was 1.2 mm and 1.0 mm for the maximal and minimal diameters, respectively. Intraobserver variability of each observer ranged from 0.8 to 1.2 mm, the lowest variability being produced by the more experienced observer. CAD variability could be as low as 0.3 mm, showing that it can perform better than human observers. However, when used in non-optimal conditions (streak artefacts from contrast in the superior vena cava or weak lumen enhancement), CAD has a variability that can be as high as 0.9 mm, reaching the variability of semi-manual measurements.
Furthermore, there were significant differences between both observers for maximal and minimal diameter measurements (p<0.001). There was also a significant difference between the first observer and CAD for maximal diameter measurements, with the former underestimating the diameter compared to the latter (p<0.001). As for minimal diameters, they were higher when measured by the second observer than when measured by CAD (p<0.001). Neither the difference of mean minimal diameter between the first observer and CAD nor the difference of mean maximal diameter between the second observer and CAD was significant (p=0.20 and 0.06, respectively). Conclusion: CAD algorithms can lessen the variability of diameter measurements in the follow-up of ascending aorta aneurysms. Nevertheless, in non-optimal conditions, it may be necessary to correct the measurements manually. Improvements to the algorithms will help avoid such situations.
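The Bland and Altman method used for the inter/intra-observer comparisons above reduces to a short computation: the bias is the mean of the paired differences and the 95% limits of agreement lie 1.96 standard deviations on either side of it. The sketch below uses illustrative data, not the study's measurements.

```python
# Minimal Bland-Altman limits-of-agreement computation (illustrative data).
import statistics

def bland_altman(measure_a, measure_b):
    """Return (bias, lower and upper 95% limits of agreement)."""
    diffs = [a - b for a, b in zip(measure_a, measure_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

obs1 = [38.2, 40.1, 35.6, 42.3, 39.0]   # e.g. maximal diameters in mm
obs2 = [38.5, 39.8, 36.0, 42.0, 39.4]
bias, lower, upper = bland_altman(obs1, obs2)
print(round(bias, 2))
```

The "variability" figures quoted in the abstract correspond to the spread of such paired differences between observers or sessions.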
Abstract:
The policy analysis tool provides a framework to review government and other relevant agency policy related to the proposal.
Abstract:
This guidance manual explains what Health Impact Assessment (HIA) is and the steps involved in conducting it. It gives advice based on the experience of HIA practitioners and includes new tools which have been developed to assist each step of the HIA process and to adapt HIA to local circumstances. It aims to provide a user-friendly and practical framework to guide policy-makers and practitioners in undertaking HIA. All HIA tools contained in this guidance, and further information on HIA, may be found at http://www.publichealth.ie/hia
Abstract:
Community education needs to be supported by strong public policy if it is to be fully effective at tackling food poverty and obesity, a project evaluation by the Institute of Public Health in Ireland (IPH) has found. In its evaluation of Decent Food for All (DFfA) - a major project to improve community diet and health - IPH found that where people live and shop had a greater impact on their diet than their own individual awareness and attitudes. Access "Tackling Food Poverty: lessons from the Decent Food for All intervention" at www.publichealth.ie. DFfA was funded by safefood (the Food Safety Promotion Board) and the Food Standards Agency Northern Ireland. The project lasted four years and included hundreds of community education activities designed to improve diet in poorer parts of Armagh and South Tyrone. safefood commissioned IPH to undertake the evaluation of DFfA. Dr. Kevin Balanda, IPH Associate Director, said 'The aim of the project was to reduce food poverty (this is defined as not being able to consume adequate healthy food) and improve health in the target communities. DFfA delivered over 370 core activities to 3,100 residents including local education talks on diet, cookery workshops, fresh fruit in schools, healthy food tastings and information stands. One in eight residents in the target areas participated in at least one of these activities.' The evaluation found that over 1 in 5 adults in the target areas reported they had cut their weekly food spending in the last six months to pay other household bills such as rent, electricity and gas. During the four years of the DFfA activities, this percentage had not changed significantly. There were mixed changes in the nature of food in local stores. While the overall availability and price of food increased, both "healthier" food and "unhealthier" food were included in that increase. It was only in the larger "multiple/discount freezer" type of shops that the overall price of food had decreased.