938 results for Open Business Model


Relevance:

30.00%

Abstract:

This paper investigates the impact of a balanced budget fiscal policy expansion in a regional context within a numerical dynamic general equilibrium model. We take Scotland as an example where, recently, there has been extensive debate on greater fiscal autonomy. In response to a balanced budget fiscal expansion the model suggests that: an increase in current government purchases of goods and services has negative multiplier effects only if the elasticity of substitution between private and public consumption is high enough to drive down the marginal utility of private consumption; public capital expenditure crowds in consumption and investment even with a high level of congestion; but crowding-out effects might arise in the short run if agents are myopic.
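The substitution channel described above can be made concrete with a small worked example. The sketch below assumes a CES aggregator over private consumption C_t and public consumption G_t with elasticity of substitution σ; this is a standard formulation and not necessarily the paper's exact specification.

```latex
% Hypothetical CES bundle of private (C_t) and public (G_t) consumption
\[
  C^{*}_{t}
  = \Bigl[\alpha\, C_t^{\frac{\sigma-1}{\sigma}}
          + (1-\alpha)\, G_t^{\frac{\sigma-1}{\sigma}}\Bigr]^{\frac{\sigma}{\sigma-1}} .
\]
% With period utility u(C^{*}_t), the marginal utility of private consumption is
\[
  \frac{\partial u}{\partial C_t}
  = u'(C^{*}_{t})\,\alpha\,\Bigl(\tfrac{C^{*}_{t}}{C_t}\Bigr)^{1/\sigma} .
\]
```

A balanced-budget rise in G_t raises C*_t; when σ is large, so that private and public consumption are close substitutes, the fall in u'(C*_t) dominates, private consumption is crowded out, and the multiplier can turn negative, which is the condition the abstract refers to.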

Relevance:

30.00%

Abstract:

This paper investigates underlying changes in the UK economy over the past thirty-five years using a small open economy DSGE model. Using Bayesian analysis, we find that UK monetary policy, nominal price rigidity and exogenous shocks are all subject to regime shifting. A model incorporating these changes is used to estimate the realised monetary policy and to derive the optimal monetary policy for the UK. This allows us to assess the effectiveness of the realised policy in terms of stabilising economic fluctuations and, in turn, to provide an indication of whether there is room for monetary authorities to further improve their policies.

Relevance:

30.00%

Abstract:

In this paper we investigate the ability of a number of different ordered probit models to predict ratings based on firm-specific data on business and financial risks. We investigate models based on momentum, drift and ageing and compare them against alternatives that take into account the initial rating of the firm and its previous actual rating. Using data on US bond issuing firms rated by Fitch over the years 2000 to 2007 we compare the performance of these models in predicting the rating in-sample and out-of-sample using root mean squared errors, Diebold-Mariano tests of forecast performance and contingency tables. We conclude that initial and previous states have a substantial influence on rating prediction.
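A minimal sketch of this kind of exercise is given below, using synthetic data and statsmodels' OrderedModel. The regressors, the sample split and the single previous-rating control are illustrative stand-ins for the firm-specific business and financial risk measures and rating-history variables used in the paper, and the Diebold-Mariano and contingency-table comparisons are omitted.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
firms = pd.DataFrame({
    "business_risk": rng.normal(size=n),          # hypothetical risk proxies
    "financial_risk": rng.normal(size=n),
    "prev_rating": rng.integers(0, 5, size=n),    # previous state, as in the paper
})
latent = (0.8 * firms["prev_rating"] - 0.6 * firms["financial_risk"]
          - 0.3 * firms["business_risk"] + rng.normal(size=n))
firms["rating"] = pd.cut(latent, bins=5, labels=False)   # ordinal categories 0..4

train, test = firms.iloc[:400], firms.iloc[400:]
X_cols = ["business_risk", "financial_risk", "prev_rating"]

# Ordered probit estimated by maximum likelihood
res = OrderedModel(train["rating"], train[X_cols], distr="probit").fit(
    method="bfgs", disp=False)

# Out-of-sample prediction: take the most probable category, then score by RMSE
probs = np.asarray(res.predict(test[X_cols]))
pred = probs.argmax(axis=1)
rmse = np.sqrt(np.mean((pred - test["rating"].to_numpy()) ** 2))
print(f"out-of-sample RMSE: {rmse:.3f}")
```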

Relevance:

30.00%

Abstract:

Employing an endogenous growth model with human capital, this paper explores how productivity shocks in the goods and human capital producing sectors contribute to explaining aggregate fluctuations in output, consumption, investment and hours. Given the importance of accounting for both the dynamics and the trends in the data not captured by the theoretical growth model, we introduce a vector error correction model (VECM) of the measurement errors and estimate the model’s posterior density function using Bayesian methods. To contextualize our findings with those in the literature, we also assess whether the endogenous growth model or the standard real business cycle model better explains the observed variation in these aggregates. In addressing these issues we contribute to both the methods of analysis and the ongoing debate regarding the effects of innovations to productivity on macroeconomic activity.
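For readers unfamiliar with the error-correction component, the sketch below fits a small classical VECM to synthetic output, consumption and investment series with statsmodels. It is only a stand-in: in the paper the VECM is placed on the measurement errors of the DSGE model and the whole system is estimated by Bayesian methods.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(2)
T = 200
trend = np.cumsum(rng.normal(0.005, 0.01, size=T))      # common stochastic trend
data = pd.DataFrame({
    "log_output":      trend + rng.normal(scale=0.02, size=T),
    "log_consumption": 0.7 * trend + rng.normal(scale=0.02, size=T),
    "log_investment":  1.5 * trend + rng.normal(scale=0.05, size=T),
})

# The cointegration rank would normally be chosen with a Johansen test
# (statsmodels' select_coint_rank); rank 1 is imposed here for brevity.
res = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print(res.alpha)   # loadings on the error-correction term
print(res.beta)    # cointegrating vector
```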

Relevance:

30.00%

Abstract:

This paper is inspired by articles in the last decade or so that have argued for more attention to theory, and to empirical analysis, within the well-known and long-lasting contingency framework for explaining the organisational form of the firm. Its contribution is to extend contingency analysis in three ways: (a) by empirically testing it using explicit econometric modelling (rather than case study evidence), with estimation by ordered probit analysis; (b) by extending its scope from large firms to SMEs; (c) by extending its applications from Western economic contexts to an emerging economy context, using fieldwork evidence from China. It calibrates organisational form in a new way, as an ordinal dependent variable, and also utilises new measures of familiar contingency factors from the literature (i.e. Environment, Strategy, Size and Technology) as the independent variables. An ordered probit model of contingency was constructed and estimated by maximum likelihood using a cross-section of 83 private Chinese firms. The probit was found to fit the data well and displayed significant coefficients with plausible interpretations for key variables under all four categories of contingency analysis, namely Environment, Strategy, Size and Technology. Thus we have generalised the contingency model in terms of specification, interpretation and applications area.

Relevance:

30.00%

Abstract:

We study a business cycle model in which a benevolent fiscal authority must determine the optimal provision of government services while lacking credibility, lump-sum taxes, and the ability to bond-finance deficits. Households and the fiscal authority have risk-sensitive preferences. We find that outcomes are importantly affected by the household's risk sensitivity, but not by the fiscal authority's. Further, while household risk sensitivity induces a strong precautionary saving motive, which raises capital and lowers the return on assets, its effects on fluctuations and the business cycle are generally small, although more pronounced for negative shocks. Holding the stochastic steady state constant, increases in household risk sensitivity lower the risk-free rate and raise the return on equity, increasing the equity premium. Finally, although risk sensitivity has little effect on the provision of government services, it does cause the fiscal authority to lower the income tax rate. An additional contribution of this paper is to present a method for computing Markov-perfect equilibria in models where private agents and the government are risk-sensitive decision-makers.
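One common way to write risk-sensitive preferences is the Hansen-Sargent-type recursion sketched below; the paper's exact formulation may differ.

```latex
\[
  V_t = u(c_t) \;-\; \frac{\beta}{\theta}\,
        \log \mathbb{E}_t\!\bigl[\exp\bigl(-\theta\, V_{t+1}\bigr)\bigr],
  \qquad \theta > 0 .
\]
```

This collapses to the expected-utility recursion V_t = u(c_t) + β E_t[V_{t+1}] as θ → 0, and it penalises volatile continuation values more heavily as θ rises, which is the source of the strong precautionary saving motive discussed above.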

Relevance:

30.00%

Abstract:

This paper analyses optimal income taxes over the business cycle under a balanced-budget restriction, for low, middle and high income households. A model incorporating capital-skill complementarity in production and differential access to capital and labour markets is developed to capture the cyclical characteristics of the US economy, as well as the empirical observations on wage (skill premium) and wealth inequality. We find that the tax rate for high income agents is optimally the least volatile and the tax rate for low income agents the least countercyclical. In contrast, the path of optimal taxes for the middle income group is found to be very volatile and countercyclical. We further find that the optimal response to output-enhancing capital equipment technology and spending cuts is to increase the progressivity of income taxes. Finally, in response to positive TFP shocks, taxation becomes more progressive after about two years.

Relevance:

30.00%

Abstract:

This paper studies the wasteful effect of bureaucracy on the economy by addressing the link between the rent-seeking behavior of government bureaucrats and the public sector wage bill, which is taken to represent the rent component. In particular, public officials are modeled as individuals competing for a larger share of those public funds. The rent-seeking extraction technology in the government administration is modeled as in Murphy et al. (1991) and incorporated in an otherwise standard Real-Business-Cycle (RBC) framework with a public sector. The model is calibrated to German data for the period 1970-2007. The main findings are: (i) due to the existence of a significant public sector wage premium and the high level of public sector employment, a substantial amount of working time is spent rent-seeking, which in turn leads to significant losses in terms of output; (ii) the measures of the rent-seeking cost obtained from the model for the major EU countries are highly correlated with indices of bureaucratic inefficiency; (iii) under the optimal fiscal policy regime, steady-state rent-seeking is smaller relative to the exogenous policy case, as the government chooses a higher public wage premium but sets much lower public employment, thus achieving a decrease in rent-seeking.

Relevance:

30.00%

Abstract:

An important disconnect in the news-driven view of the business cycle formalized by Beaudry and Portier (2004) is the lack of agreement between the VAR and DSGE methodologies over the empirical plausibility of this view. We argue that this disconnect can be largely resolved once we augment a standard DSGE model with a financial channel that provides amplification to news shocks. Both methodologies then suggest that news shocks to the future growth prospects of the economy are significant drivers of U.S. business cycles in the post-Greenspan era (1990-2011), explaining as much as 50% of the forecast error variance in hours worked at cyclical frequencies.
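The forecast-error-variance statistic quoted above can be read off a decomposition like the one sketched below, which uses synthetic data and a small recursively identified VAR in statsmodels. The paper's identification of news shocks is different and more involved; this sketch only shows how such a share is computed and read.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
T = 300
news = rng.normal(size=T)                      # stand-in "news" disturbance
stock = np.convolve(news, [1.0, 0.6, 0.3], mode="same") + rng.normal(scale=0.5, size=T)
hours = np.convolve(news, [0.0, 0.4, 0.5], mode="same") + rng.normal(scale=0.5, size=T)
tfp   = np.convolve(news, [0.0, 0.0, 0.8], mode="same") + rng.normal(scale=0.5, size=T)

data = pd.DataFrame({"tfp": tfp, "stock": stock, "hours": hours})
res = VAR(data).fit(maxlags=4, ic="aic")

# Share of the h-step-ahead forecast-error variance of each variable attributed
# to each orthogonalised (Cholesky) shock; the paper reports the analogous
# share for hours worked at business cycle frequencies.
res.fevd(20).summary()
```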

Relevance:

30.00%

Abstract:

This paper studies unemployed workers' decisions to change occupations, and their impact on fluctuations in aggregate unemployment and its underlying duration distribution. We develop an analytically and computationally tractable stochastic equilibrium model with heterogeneous labor markets. In this model three different types of unemployment arise: search, rest and reallocation unemployment. We document new evidence on unemployed workers' gross occupational mobility and use it to calibrate the model. We show that rest unemployment is the main driver of unemployment fluctuations over the business cycle and causes cyclical unemployment to be highly volatile. The unemployment duration distribution generated by the model responds realistically to the business cycle, creating substantial longer-term unemployment in downturns. Finally, rest unemployment also makes our model simultaneously consistent with procyclical occupational mobility of the unemployed, countercyclical job separations into unemployment and a negatively sloped Beveridge curve.

Relevance:

30.00%

Abstract:

Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models: value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than those obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that the risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
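As a point of reference for the value-function-iteration method, the sketch below solves a plain stochastic growth model by discretised VFI. It is deliberately simpler than the paper's problem: the Markov-perfect equilibria discussed above additionally require iterating on the fiscal authority's policy functions jointly with the households' problem, and all parameter values here are illustrative.

```python
import numpy as np

alpha, beta, delta, sigma = 0.36, 0.96, 0.08, 2.0
z_grid = np.array([0.97, 1.03])                 # two-state productivity shock
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])                      # Markov transition matrix
k_grid = np.linspace(1.0, 8.0, 200)             # capital grid

def u(c):
    c = np.maximum(c, 1e-10)                    # penalise infeasible consumption
    return c ** (1 - sigma) / (1 - sigma)

V = np.zeros((len(z_grid), len(k_grid)))
for it in range(2000):
    EV = P @ V                                  # E[V(z', k') | z], for each k'
    V_new = np.empty_like(V)
    for iz, z in enumerate(z_grid):
        resources = z * k_grid[:, None] ** alpha + (1 - delta) * k_grid[:, None]
        values = u(resources - k_grid[None, :]) + beta * EV[iz][None, :]
        V_new[iz] = values.max(axis=1)          # Bellman maximisation over k'
    if np.max(np.abs(V_new - V)) < 1e-7:
        V = V_new
        break
    V = V_new

print(f"value function iteration converged after {it + 1} iterations")
```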

Relevance:

30.00%

Abstract:

This paper evaluates the effects of policy interventions on sectoral labour markets and the aggregate economy in a business cycle model with search and matching frictions. We extend the canonical model by including capital-skill complementarity in production, labour markets with skilled and unskilled workers, and on-the-job learning (OJL) within and across skill types. We first find that the model does a good job of matching the cyclical properties of sectoral employment and the wage-skill premium. We next find that vacancy subsidies for skilled and unskilled jobs lead to output multipliers that are greater than unity with OJL and less than unity without OJL. In contrast, the positive output effects from cutting skilled and unskilled income taxes are close to zero. Finally, we find that the sectoral and aggregate effects of vacancy subsidies do not depend on whether they are financed via public debt or distorting taxes.

Relevance:

30.00%

Abstract:

Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to reduce both local recurrence rates and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task because of the large uncertainties in the dose-response relationship.

In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), which are typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator was thoroughly validated against measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in the reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries. For large open fields (> 10 × 10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in the reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used only for conventional treatments using large open fields.

The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which lie mostly outside the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs.

Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distributions. Absolute risks varied substantially, as did the ratio of risks between two treatment techniques, reflecting the large uncertainties involved with current risk models.
Despite all these uncertainties, the hybrid IMRT technique investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
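To illustrate how such risk models are applied to a dose distribution, the sketch below combines hypothetical mean organ doses with hypothetical linear excess-risk coefficients to rank two techniques. None of the numerical values are from the thesis or the literature; they only show the arithmetic of the comparison.

```python
# Purely illustrative: placeholder out-of-field organ doses and placeholder
# linear excess-risk coefficients, used to rank two treatment techniques.
organ_dose_gy = {
    "contralateral breast": {"wedged 3D-CRT": 1.20, "hybrid IMRT": 0.70},
    "ipsilateral lung":     {"wedged 3D-CRT": 6.50, "hybrid IMRT": 5.90},
    "thyroid":              {"wedged 3D-CRT": 0.15, "hybrid IMRT": 0.08},
}
risk_per_gy = {                 # hypothetical excess risk per Gy (linear model)
    "contralateral breast": 5e-3,
    "ipsilateral lung": 4e-3,
    "thyroid": 1e-3,
}

for technique in ("wedged 3D-CRT", "hybrid IMRT"):
    total = sum(risk_per_gy[o] * organ_dose_gy[o][technique] for o in organ_dose_gy)
    print(f"{technique}: estimated excess risk {total:.4f}")
```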

Relevance:

30.00%

Abstract:

This paper contrasts the incentives for cronyism in business, the public sector and politics within an agency problem model with moral hazard. The analysis is focused on the institutional differences between private, public and political organizations. In business, when facing a residual claimant contract, a chief manager ends up with a relatively moderate first-best level of cronyism within a firm. The institutional framework of the public sector does not allow explicit contracting, which leads to a more severe cronyism problem within public organizations. Finally, it is shown that the nature of political appointments (such that the subordinate's reappointment is conditioned on the chief's re-election) together with implicit contracting makes political cronyism the most extreme case. JEL classification: D72, D73, D86. Keywords: Cronyism; Meritocracy; Manager; Bureaucrat; Politician.

Relevance:

30.00%

Abstract:

This paper uses a structural, large dimensional factor model to evaluate the role of 'news' shocks (shocks with a delayed effect on productivity) in generating the business cycle. We find that (i) existing small-scale VECM models are affected by 'non-fundamentalness' and therefore fail to recover the correct shocks and impulse response functions; (ii) news shocks have a limited role in explaining the business cycle; (iii) their effects are in line with what is predicted by standard neoclassical theory; (iv) the bulk of business cycle fluctuations is explained by shocks unrelated to technology.