931 results for Electoral volatility


Relevance: 10.00%

Publisher:

Abstract:

To assess the impact of electoral systems on voting turnout, cross-national studies can be usefully complemented by studies of turnout in local elections in countries using more than one electoral system at that level. In this article, we look at data from a 1998 survey of Swiss municipalities to revisit the findings of our earlier study. This previous study, based on a 1988 survey, concluded, in particular, that there exists a positive relationship between proportional representation elections, party politicization, and voter turnout. The moment is opportune since, in the interval, turnout has markedly declined in Swiss municipalities, as elsewhere. By testing whether municipalities with proportional representation voting were more or less successful in stemming the decline, we learn more about the relationship among these three phenomena. We use the results for those Swiss municipalities which participated in both surveys as our primary source.

Abstract:

BACKGROUND: Preventive treatment may avoid future cases of tuberculosis among asylum seekers. The effectiveness of preventive treatment depends in large part on treatment completion. METHODS: In a prospective cohort study, asylum seekers at two migration centres in the Swiss canton of Vaud were screened with the Interferon-Gamma Release Assay (IGRA). Those with a positive IGRA were referred for medical examination. Individuals with active or past tuberculosis were excluded. Preventive treatment was offered to all participants with a positive IGRA but without active tuberculosis. Adherence was assessed during monthly follow-up. RESULTS: From a population of 393 adult migrants, 98 (24.9%) had a positive IGRA. Eleven did not attend the initial medical assessment. Of the 87 examined, eight presented with pulmonary disease (five of them received a full course of antituberculous therapy), two had a history of prior tuberculosis treatment and two had contraindications to treatment. Preventive treatment was offered to 75 individuals (4 months of rifampicin in 74 and 9 months of isoniazid in one), of whom 60 (80%) completed the treatment. CONCLUSIONS: The vulnerability and volatility of this population make screening and treatment adherence difficult. It seems possible to obtain a high completion rate using a short course of treatment in a closely monitored population living in stable housing conditions.

Abstract:

Executive Summary The unifying theme of this thesis is the pursuit of a satisfactory way to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. 
The empirical investigation intended to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than realized returns from portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate those that result from optimization with respect to a single measure only, for example the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls for a range of quantiles. 
Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those of virtually all performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
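The pointwise Lorenz-curve comparison described above can be sketched in a few lines. This is a generic illustration, not code from the thesis; the function names and the quantile grid are assumptions:

```python
import numpy as np

def absolute_lorenz(returns, quantiles):
    """Absolute Lorenz curve: the integrated quantile function, i.e. the
    cumulative average of the worst p-fraction of returns at each quantile p
    (proportional to the expected-shortfall sequence)."""
    x = np.sort(returns)
    n = len(x)
    return np.array([x[: max(1, int(np.ceil(p * n)))].sum() / n for p in quantiles])

def second_order_dominates(a, b, num_q=99):
    """True if sample `a` second-order stochastically dominates sample `b`:
    a's absolute Lorenz curve lies weakly above b's at every checked quantile."""
    qs = np.linspace(0.01, 0.99, num_q)
    return bool(np.all(absolute_lorenz(a, qs) >= absolute_lorenz(b, qs)))
```

A pointwise check of this kind mirrors the plot-based comparison the chapter describes: if one curve lies above the other everywhere, the corresponding return distribution is preferred by every risk-averse investor.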


Abstract: Recruitment of MPs and determination of electoral success

Abstract:

Preface The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for their characteristic functions, made them the models of choice for many theoretical constructions and practical applications. At the same time, estimating the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem comes from the variance process, which is not observable. There are several estimation methodologies that deal with the estimation of latent variables. One appeared particularly interesting: the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function, which, in contrast to the other methods, requires neither discretization nor simulation of the process. However, the procedure was derived only for stochastic volatility models without jumps; thus, it became the subject of my research. This thesis consists of three parts, each written as an independent and self-contained article. At the same time, the questions answered by the second and third parts of this work arise naturally from the issues investigated and the results obtained in the first. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and the variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function for stochastic volatility jump-diffusion models. 
The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equation are relevant for modelling returns of the S&P500 index, which has been chosen as a general representative of the stock asset class. Hence, the next question is: what jump process should be used to model returns of the S&P500? The decision about the jump process in the framework of affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P500 index with a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either an exponential or a double-exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained. In the absence of a benchmark or any ground for comparison, it is unreasonable to be sure that our parameter estimates and the true parameters of the models coincide. The conclusion of the second chapter provides one more reason to perform that kind of test. Thus, the third part of this thesis concentrates on the estimation of the parameters of stochastic volatility jump-diffusion models on the basis of asset price time series simulated from various "true" parameter sets. 
The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of finding the true parameters, and the third chapter shows that our estimator indeed has the ability to do so. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question soon arises: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used in its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure is. In practice, however, this relationship is not so straightforward, due to increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, unconditional characteristic function. As a result, the preference for one or the other depends on the model to be estimated. Thus, the computational effort can be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of the estimators with bi- and three-dimensional unconditional characteristic functions on the simulated data. 
It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, due to the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for the estimation of the parameters of stochastic volatility jump-diffusion models.
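The idea behind a characteristic-function estimator can be illustrated on a deliberately simple case. The sketch below fits a plain normal model by minimizing the distance between the empirical and model characteristic functions over a parameter grid; it is a toy one-dimensional illustration of the principle, not the thesis's joint-CF estimator for jump-diffusions, and the function names and grid-search shortcut are assumptions:

```python
import numpy as np

def ecf(u, x):
    """Empirical characteristic function of sample x evaluated at points u."""
    return np.exp(1j * np.outer(u, x)).mean(axis=1)

def normal_cf(u, mu, sigma):
    """Characteristic function of N(mu, sigma^2)."""
    return np.exp(1j * mu * u - 0.5 * (sigma * u) ** 2)

def cf_estimate(x, mus, sigmas, u=np.linspace(-3, 3, 61)):
    """Grid-search CF estimator: pick (mu, sigma) minimizing the discretized
    distance between the empirical and model CFs over the grid of points u."""
    e = ecf(u, x)
    best, best_loss = None, np.inf
    for mu in mus:
        for s in sigmas:
            loss = np.abs(normal_cf(u, mu, s) - e).sum()
            if loss < best_loss:
                best, best_loss = (mu, s), loss
    return best
```

In the thesis's setting the model CF is the closed-form joint unconditional CF of the stochastic volatility jump-diffusion model, and a proper weighting function and continuous optimizer replace the grid search.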

Abstract:

The objective of this paper is to identify the political conditions that are most likely to be conducive to the development of social investment policies. It starts from the view, put forward by theorists of welfare retrenchment, that in the current context of permanent austerity, policy is likely to be dominated by retrenchment and implemented in a way that allows governments to minimise the risk of electoral punishment (blame avoidance). It is argued that this view is inconsistent with developments observed in several European countries, where some welfare state expansion has taken place, mostly in the fields of childcare and active labour market policy. An alternative model is put forward that emphasises the notion of "affordable credit claiming". It is argued that even under strong budgetary pressures, governments maintain a preference for policies that allow them to claim credit for their actions. Since the traditional redistributive policies tend to be off the menu for cost reasons, governments have tended to favour investments in childcare and active labour market policy as credit-claiming tools. Policies developed in this way, while they have a social investment flavour, tend to be rather limited in the extent to which they genuinely improve the prospects of disadvantaged people by investing in their human capital. A more ambitious strategy of social investment seems unlikely to develop on the basis of affordable credit claiming. The paper starts by presenting the theoretical argument, which is then illustrated with examples taken from European countries in both the pre-crisis and the post-crisis years.

Abstract:

The quadrennial need study was developed to assist in identifying county highway financial needs (construction, rehabilitation, maintenance, and administration) and in distributing the road use tax fund (RUTF) among the counties in the state. During the period since the need study was first conducted using HWYNEEDS software, between 1982 and 1998, there have been large fluctuations in the level of funds distributed to individual counties. A recent study performed by Jim Cable (HR-363, 1993) found that one of the major factors behind these fluctuations is the quality and accuracy of the pavement condition data collected. In 1998, Center for Transportation Research and Education researchers (Maze and Smadi) completed a project to study the feasibility of using the automated pavement condition data collected for the Iowa Pavement Management Program (IPMP) for the paved county roads in the HWYNEEDS software (TR-418). The automated condition data are objective and also more current, since they are collected on a two-year cycle compared with the 10-year cycle currently used by HWYNEEDS. The study proved that using the automated condition data in HWYNEEDS would be feasible and beneficial in reducing fluctuations when applied to a pilot study area. In another recommendation from TR-418, the researchers recommended a full analysis and investigation of the HWYNEEDS methodology and parameters (for more information, please review the TR-418 project report). The study reported in this document builds on that previous work and covers the analysis and investigation of the HWYNEEDS computer program methodology and parameters. 
The underlying hypothesis for this study is that, along with the IPMP automated condition data, some changes need to be made to the HWYNEEDS parameters to accommodate the use of the new data, which will stabilize the process of allocating resources and reduce fluctuations from one quadrennial need study to the next. Another objective of this research is to investigate gravel road needs and study the feasibility of developing a more objective approach to determining needs on the counties' gravel road networks. This study identifies new procedures by which the HWYNEEDS computer program is used to conduct the quadrennial need study on paved roads. Also, a new procedure is developed to determine gravel road needs outside of the HWYNEEDS program. Recommendations are made for the new procedures and also for changes to the current quadrennial need study. Future research areas are also identified.

Abstract:

In this paper we review level models of interest rates in Chile. In addition to the traditional level models of Chan, Karolyi, Longstaff and Sanders (1992) for the U.S. and Parisi (1998) for Chile, estimated by maximum likelihood, we allow the conditional volatility to incorporate unexpected information shocks (the GARCH model) and to be a function of the level of the interest rate (the TVP-LEVEL model), as in Brenner, Harjes and Kroner (1996). To do so we use yields on recognition bonds instead of average monthly PDBC auction yields, enlarging the size and frequency of the sample to four weekly yield series with different terms to maturity: 1, 5, 10 and 15 years. The main results of the study can be summarized as follows: the volatility of unexpected rate changes depends positively on the level of rates, especially in the TVP-LEVEL model. We find evidence of mean reversion, such that increments in interest rates were not independent, contrary to what Brenner et al. found for the U.S. The level models are not able to fit volatility adequately compared with a GARCH(1,1) model, and finally, the TVP-LEVEL model does not outperform the GARCH(1,1) model.
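The "level effect" that the TVP-LEVEL specification captures, volatility rising with the level of the rate, is easy to see in a simulated CKLS-type short-rate path. This is a minimal Euler-discretization sketch under assumed parameter names, not the estimation code of the paper:

```python
import numpy as np

def simulate_ckls(r0, kappa, theta, sigma, gamma, dt, n, seed=0):
    """Euler-discretized CKLS short rate:
    r_{t+1} = r_t + kappa*(theta - r_t)*dt + sigma * r_t**gamma * sqrt(dt) * eps,
    so the diffusion term (and hence the volatility of rate changes)
    grows with the level of the rate whenever gamma > 0."""
    rng = np.random.default_rng(seed)
    r = np.empty(n + 1)
    r[0] = r0
    for t in range(n):
        shock = sigma * max(r[t], 1e-8) ** gamma * np.sqrt(dt) * rng.standard_normal()
        r[t + 1] = r[t] + kappa * (theta - r[t]) * dt + shock
    return r
```

With gamma = 0 this collapses to a Vasicek-type model with level-independent volatility; testing whether gamma differs from zero is one way to frame the level-effect question the paper studies.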

Abstract:

In this work, the valuation methodology for a compound option written on a down-and-out call option, developed by Ericsson and Reneby (2003), is applied to derive a credit risk model. The firm is assumed to have a debt structure with two maturity dates, and the credit event takes place when the firm's asset value falls below a determined level called the barrier. An empirical application of the model to 105 firms in the Spanish continuous market is carried out. For each firm, its value at the date of analysis, its volatility and the critical value are obtained, and from these the short- and long-term default probabilities, as well as the probability implicit in the two previous ones, are deduced. The results are compared with those obtained from the Geske model (1977).
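The barrier mechanism in such structural models can be illustrated with the standard first-passage default probability for geometric Brownian motion assets, i.e. the probability that the asset value touches the barrier before the horizon. This is not Ericsson and Reneby's compound-option formula, only a sketch of the barrier idea under assumed names:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def first_passage_default_prob(V0, barrier, mu, sigma, T):
    """Probability that a GBM asset-value process starting at V0 hits
    `barrier` (< V0) at some point before time T: the classic reflection
    formula for the minimum of a Brownian motion with drift."""
    b = log(barrier / V0)            # log-distance to the barrier (negative)
    m = mu - 0.5 * sigma ** 2        # drift of log-assets
    s = sigma * sqrt(T)
    return norm_cdf((b - m * T) / s) + exp(2 * m * b / sigma ** 2) * norm_cdf((b + m * T) / s)
```

At barrier = V0 the probability is 1, it shrinks as the barrier moves further below current assets, and it rises with asset volatility, which is the qualitative behaviour a barrier-style credit model relies on.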

Abstract:

"IT'S THE ECONOMY, STUPID", BUT CHARISMA MATTERS TOO: A DUAL PROCESS MODEL OF PRESIDENTIAL ELECTION OUTCOMES. ABSTRACT Because charisma is assumed to be an important determinant of effective leadership, the extent to which a presidential nominee is more charismatic than his opponent should be an important determinant of voter choices. We computed a composite measure of the rhetorical richness of the acceptance speeches given by U.S. presidential candidates at their national party conventions. We added this marker of charisma to Ray C. Fair's presidential vote-share equation (1978; 2009). We theorized that voters decide using psychological attribution (i.e., due to macroeconomics and incumbency) as well as inferential processes (i.e., due to leader charismatic behavior). Controlling for the macro-level variables and incumbency in the Fair model, our results indicated that the difference between nominees' charisma is a significant determinant of electoral success, particularly in close elections. This extended model significantly improves the precision of the Fair model and correctly predicts 23 of the last 24 U.S. presidential elections. Paper 2: IT CEO LEADERSHIP, CORPORATE SOCIAL AND FINANCIAL PERFORMANCE. ABSTRACT We investigated whether CEO leadership predicted corporate financial performance (CFP) and corporate social performance (CSP). Using longitudinal data on 258 CEOs from 117 firms across 19 countries and 10 industry sectors, we found that determinants of CEO leadership (i.e., implicit motives) significantly predicted both CFP and CSP. As expected, the most consistent positive predictor was Responsibility Disposition when interacting with n (need for) Power. n Achievement and n Affiliation were generally negatively related or unrelated to outcomes. CSP was positively related to accounting measures of CFP. Our findings suggest that executive leader characteristics have important consequences for corporate-level outcomes. Paper 3. 
PUNISHING THE POWERFUL: ATTRIBUTIONS OF BLAME AND LEADERSHIP. ABSTRACT We propose that individuals are more lenient in attributing blame to leaders than to non-leaders. We advance a motivational explanation building on the perspective of punishment and on system justification theory. We conducted two scenario experiments which supported our proposition. In Study 1, wrongdoer leader status was negatively related to blame and to the perceived seriousness of the wrongdoing. In Study 2, controlling for the Big Five personality factors and individual differences in moral evaluation (i.e., moral foundations), wrongdoer leader status was negatively related to the desired severity of punishment, and punishments were perceived as more just for non-leaders than for leaders.

Abstract:

Based on the case of reforms aimed at integrating the provision of income protection and employment services for jobless people in Europe, this thesis seeks to understand the reasons that may prompt governments to engage in large-scale organisational reforms. Over the last 20 years, several European countries have indeed radically redesigned the organisational structure of their welfare states by merging or bundling existing front-line offices in charge of benefit payment and employment services into 'one-stop' agencies. Whereas in academic and political debates these reforms are generally presented as a necessary and rational response to the problems and inconsistencies induced by fragmentation in a context of the reorientation of welfare states towards labour market activation, this thesis shows that the agenda setting of these reforms is in fact the result of multidimensional political dynamics. More specifically, the main argument of this thesis is that these reforms are best understood not so much from the problems induced by organisational compartmentalism, whose political recognition is often controversial, as from the various goals that governments may simultaneously achieve by adopting them. This argument is tested by comparing the agenda-setting processes of large-scale coordination reforms in the United Kingdom (Jobcentre Plus), Germany (Hartz IV reform) and Denmark (2005 Jobcentre reform), and contrasting them with the Swiss case, where the government has so far rejected any coordination initiative involving organisational redesign. 
This comparison brings to light the importance, for the rise of organisational reforms, of the possibility of coupling them with three goals: first, goals related to the strengthening of activation policies; second, institutional goals seeking to redefine the balance of responsibilities between the central state and non-state actors; and finally, electoral goals for governments eager to maintain political credibility. The decisive role of electoral goals in the three countries suggests that these reforms are less bound by partisan politics than by the particular pressures facing governments that arrive in office after long periods in opposition.

Abstract:

Political participation is often very low in Switzerland, especially among students and young citizens. In the run-up to the Swiss parliamentary election in October 2007, several online tools and campaigns were developed with the aim of increasing not only the level of information about the political programs of parties and candidates, but also the electoral participation of younger citizens. From a practical point of view, this paper describes the development, marketing efforts, distribution and use of two of these tools: the so-called "Parteienkompass" (party compass) and the "myVote" tool, an online voting assistance tool based on an issue-matching system that compares policy preferences between voters and candidates at the individual level. We also take a look at similar Voting Advice Applications (VAAs) in other countries in Western Europe. The paper closes with the results of an evaluation and an outlook on further developments and ongoing projects in Switzerland.
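The core of an issue-matching system of this kind is a distance-based agreement score between a voter's and a candidate's answers to the same set of policy questions. The sketch below uses the simplest variant, mean absolute distance on a Likert scale; real VAAs such as myVote may weight issues or use other metrics, so the function name and scale here are assumptions:

```python
def match_score(voter, candidate, scale=4):
    """Agreement score in [0, 1] between a voter's and a candidate's positions.
    Each argument is a list of answers on a 0..scale Likert scale, one entry
    per issue; the score is 1 minus the mean normalized absolute distance."""
    assert len(voter) == len(candidate)
    dist = sum(abs(v - c) for v, c in zip(voter, candidate)) / (scale * len(voter))
    return 1.0 - dist
```

Given one such score per candidate, the tool can rank all candidates for a voter, which is essentially the advice a VAA displays.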

Abstract:

How do trade and financial openness affect macroeconomic volatility? The existing literature, both empirical and theoretical, has not yet reached a consensus. This article uses a micro-founded model of two symmetric countries with endogenous firm entry to study the question. The analysis is carried out for three economic regimes with different degrees of international integration: a closed economy, financial autarky, and full integration. Several levels of trade openness are considered, in the form of home bias in demand, and the economy can be hit by shocks to labour productivity and to innovation. The model concludes that macroeconomic uncertainty, represented mainly by the volatility of consumption, output and the international terms of trade, depends on the degree of openness and on the type of shock.
