974 results for business methods
Abstract:
We show that a flex-price two-sector open-economy DSGE model can explain the low degree of international risk sharing and the exchange rate disconnect. We use a suite of model evaluation measures and examine the role of (i) traded and non-traded sectors; (ii) financial market incompleteness; (iii) preference shocks; (iv) deviations from the UIP condition for exchange rates; and (v) creditor status in net foreign assets. We find that there is a good case for both traded and non-traded productivity shocks, as well as for UIP deviations, in explaining the puzzles.
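For reference, the UIP condition mentioned in (iv) is commonly written as below; the wedge term is one standard way of modelling the deviations the abstract refers to, not necessarily the paper's exact specification.

$$ i_t - i_t^{*} = \mathbb{E}_t\!\left[\Delta s_{t+1}\right] + \phi_t, $$

where $i_t$ and $i_t^{*}$ are the home and foreign nominal interest rates, $s_t$ is the log nominal exchange rate, and $\phi_t$ is a UIP-deviation (risk premium) shock; $\phi_t = 0$ recovers strict UIP.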
Abstract:
This paper shows that introducing weak property rights in the standard real business cycle (RBC) model can help to explain economic fluctuations. This is motivated by the empirical observation that changes in institutions in emerging markets are related to the evolution of the main macroeconomic variables. In particular, in Mexico, the movements in productivity in the data are associated with changes in institutions, so that we can explain productivity shocks to a large extent as shocks to the quality of institutions. We find that the model with shocks to the degree of protection of property rights only - without technology shocks - can match the second moments in the data for Mexico well. In particular, the fit is better than that of the standard neoclassical model with full protection of property rights regarding the auto-correlations and cross-correlations in the data, especially those related to labor. Viewing productivity shocks as shocks to institutions is also consistent with the stylized fact of falling productivity and non-decreasing labor hours in Mexico over 1980-1994, which is a feature that the neoclassical model cannot match.
Abstract:
Employing the financial accelerator (FA) model of Bernanke, Gertler and Gilchrist (1999), enhanced to include a shock to the FA mechanism, we construct and study shocks to the efficiency of the financial sector in post-war US business cycles. We find that financial shocks are very tightly linked with the onset of recessions, more so than TFP or monetary shocks. The financial shock invariably remains contractionary for some time after recessions have ended. The shock accounts for a large part of the variance of GDP and is strongly negatively correlated with the external finance premium. Second-moment comparisons across variants of the model with and without a (stochastic) FA mechanism suggest that the stochastic FA model helps us understand the data.
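The FA mechanism in Bernanke, Gertler and Gilchrist (1999) ties the external finance premium to borrower leverage; a shock to that mechanism can be sketched as a disturbance to this relation (the paper's exact formulation may differ):

$$ \mathbb{E}_t\, R^{k}_{t+1} = s\!\left(\frac{N_{t+1}}{Q_t K_{t+1}}\right) e^{\varepsilon^{fa}_t}\, R_{t+1}, \qquad s'(\cdot) < 0, $$

where $R^{k}$ is the return to capital, $R$ the riskless rate, $N$ entrepreneurial net worth, $Q K$ the value of the capital stock, and $\varepsilon^{fa}_t$ the financial efficiency shock.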
Abstract:
This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end we estimate the model's posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland's (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model's errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model's fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
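As a rough illustration of the error-specification comparison described here (using information criteria rather than the paper's marginal likelihood ratio tests), one could fit VAR and VARMA processes to the model's measurement errors with statsmodels; the residual series below is a hypothetical placeholder, not the paper's data.

```python
# Sketch: compare VAR(1) vs VARMA(1,1) specifications for the measurement
# errors left unexplained by the RBC model (illustrative data only).
import numpy as np
from statsmodels.tsa.statespace.varmax import VARMAX

rng = np.random.default_rng(0)
errors = rng.standard_normal((200, 3))  # placeholder for actual model residuals

var_fit = VARMAX(errors, order=(1, 0)).fit(disp=False)    # pure VAR errors
varma_fit = VARMAX(errors, order=(1, 1)).fit(disp=False)  # VARMA errors

# A lower AIC (or, as in the paper, a higher marginal likelihood) favours
# the richer VARMA description of the errors.
print(var_fit.aic, varma_fit.aic)
```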
Abstract:
The diagnosis and treatment of infectious diseases are today increasingly challenged by the emergence of difficult-to-manage situations, such as infections associated with medical devices and invasive fungal infections, especially in immunocompromised patients. The aim of this thesis was to address these challenges by developing new strategies for eradication of biofilms of difficult-to-treat microorganisms (treatment, part 1) and investigating innovative methods for microbial detection and antimicrobial susceptibility testing (diagnosis, part 2). The first part of the thesis investigates antimicrobial treatment strategies for infections caused by two less investigated microorganisms, Enterococcus faecalis and Propionibacterium acnes, which are important pathogens causing implant-associated infections. The treatment of implant-associated infections is difficult in general due to the reduced susceptibility of bacteria when present in biofilms. We demonstrated an excellent in vitro activity of gentamicin against E. faecalis in stationary growth phase and were able to confirm the activity against "young" biofilms (3 hours) in an experimental foreign-body infection model (cure rate 50%). The addition of gentamicin improved the activity of daptomycin and vancomycin in vitro, as determined by time-kill curves and microcalorimetry. In vivo, the most efficient combination regimen was daptomycin plus gentamicin (cure rate 55%). Despite a short duration of infection, the cure rates were low, highlighting that enterococcal biofilms remain difficult to treat despite administration of newer antibiotics, such as daptomycin. By establishing a novel in vitro assay for evaluation of anti-biofilm activity (microcalorimetry), we demonstrated that rifampin was the most active antimicrobial against P. acnes biofilms, followed by penicillin G, daptomycin and ceftriaxone. In animal studies we confirmed the anti-biofilm activity of rifampin (cure rate 36% when administered alone), as well as in combination with daptomycin (cure rate 63%), whereas in combination with vancomycin or levofloxacin it showed lower cure rates (46% and 25%, respectively). We further investigated the emergence of rifampin resistance in P. acnes in vitro. Rifampin resistance progressively emerged during exposure to rifampin when the bacterial concentration was high (10⁸ CFU/ml), with a mutation rate of 10⁻⁹. In resistant isolates, five point mutations of the rpoB gene were found in clusters I and II, as previously described for staphylococci and other bacterial species.
The second part of the thesis describes a novel real-time method for evaluation of antifungals against molds, based on measurements of the growth-related heat production by isothermal microcalorimetry. Current methods for evaluation of antifungal agents against molds have several limitations, especially when combinations of antifungals are investigated. We evaluated the activity of amphotericin B, triazoles (voriconazole, posaconazole) and echinocandins (caspofungin and anidulafungin) against Aspergillus spp. by microcalorimetry. The presence of amphotericin B or a triazole delayed the heat production in a concentration-dependent manner, and the minimal heat inhibition concentration (MHIC) was determined as the lowest concentration inhibiting 50% of the heat produced at 48 h. Due to the different mechanism of action of echinocandins, the MHIC for this antifungal class was determined as the lowest concentration lowering the heat-flow peak by 50%. Agreement within two 2-fold dilutions between MHIC and MIC or MEC (determined by CLSI M38A) was 90% for amphotericin B, 100% for voriconazole, 90% for posaconazole and 70% for caspofungin. We further evaluated our assay for antifungal susceptibility testing of non-Aspergillus molds. As determined by microcalorimetry, amphotericin B was the most active agent against Mucorales and Fusarium spp., whereas voriconazole was the most active agent against Scedosporium spp. Finally, we evaluated the activity of antifungal combinations against Aspergillus spp. Against A. fumigatus, an improved activity of amphotericin B and voriconazole was observed when combined with an echinocandin. Against A. terreus, an echinocandin showed a synergistic activity with amphotericin B, whereas in combination with voriconazole, no considerable improvement in activity was observed.
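A minimal sketch of the two MHIC read-outs described above (50% inhibition of cumulative heat at 48 h for amphotericin B and triazoles; 50% reduction of the heat-flow peak for echinocandins); the data layout and function names are illustrative assumptions, not the thesis' actual analysis code.

```python
# Sketch: determine the minimal heat inhibition concentration (MHIC)
# from isothermal microcalorimetry read-outs (illustrative data layout).
from typing import Dict, Optional

def mhic_from_heat(
    heat_48h: Dict[float, float],   # concentration (ug/ml) -> cumulative heat at 48 h (J)
    control_heat: float,            # cumulative heat of the drug-free control at 48 h
    inhibition: float = 0.5,        # 50% inhibition criterion (amphotericin B, triazoles)
) -> Optional[float]:
    """Lowest concentration whose cumulative heat is <= (1 - inhibition) * control."""
    threshold = (1.0 - inhibition) * control_heat
    candidates = [c for c, q in heat_48h.items() if q <= threshold]
    return min(candidates) if candidates else None

def mhic_from_peak(
    peak_flow: Dict[float, float],  # concentration -> peak heat flow (uW)
    control_peak: float,
    reduction: float = 0.5,         # 50% peak-reduction criterion (echinocandins)
) -> Optional[float]:
    """Lowest concentration lowering the heat-flow peak by >= reduction."""
    threshold = (1.0 - reduction) * control_peak
    candidates = [c for c, p in peak_flow.items() if p <= threshold]
    return min(candidates) if candidates else None

# Hypothetical example: drug-free control produces 4.0 J by 48 h.
print(mhic_from_heat({0.25: 3.9, 0.5: 2.6, 1.0: 1.1}, control_heat=4.0))  # -> 1.0
```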
Abstract:
The role of land cover change as a significant component of global change has become increasingly recognized in recent decades. Large databases measuring land cover change, and the data which can potentially be used to explain the observed changes, are also becoming more commonly available. When developing statistical models to investigate observed changes, it is important to be aware that the chosen sampling strategy and modelling techniques can influence results. We present a comparison of three sampling strategies and two forms of grouped logistic regression models (multinomial and ordinal) in the investigation of patterns of successional change after agricultural land abandonment in Switzerland. Results indicated that both ordinal and nominal transitional change occurs in the landscape and that the use of different sampling regimes and modelling techniques as investigative tools yields different results. Synthesis and applications. Our multimodel inference successfully identified a set of consistently selected indicators of land cover change, which can be used to predict further change, including annual average temperature, the number of already overgrown neighbouring areas of land and distance to historically destructive avalanche sites. This allows for more reliable decision making and planning with respect to landscape management. Although both model approaches gave similar results, ordinal regression yielded more parsimonious models that identified the important predictors of land cover change more efficiently. Thus, this approach is favourable where the land cover change pattern can be interpreted as an ordinal process. Otherwise, multinomial logistic regression is a viable alternative.
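A minimal sketch of the two grouped logistic regressions compared above, using statsmodels on a hypothetical table of land parcels; the file and column names are assumptions, not the study's data.

```python
# Sketch: ordinal vs multinomial logistic regression for successional stages.
# `parcels.csv` and all column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("parcels.csv")
# Assumed columns: succession_stage (0=open, 1=shrub, 2=forest),
#   mean_temp, overgrown_neighbours, dist_avalanche
X = df[["mean_temp", "overgrown_neighbours", "dist_avalanche"]]
y = df["succession_stage"]

# Ordinal (proportional-odds) logit: one coefficient per predictor plus thresholds.
ordinal_fit = OrderedModel(y, X, distr="logit").fit(method="bfgs", disp=False)

# Multinomial logit: a separate coefficient vector for each non-reference stage.
multinom_fit = sm.MNLogit(y, sm.add_constant(X)).fit(disp=False)

# The ordinal model is more parsimonious; compare fit, e.g. via AIC.
print(ordinal_fit.aic, multinom_fit.aic)
```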
Abstract:
The project aims to achieve two objectives. First, we are analysing the labour market implications of the assumption that firms cannot pay similarly qualified employees differently according to when they joined the firm. For example, if the general situation for workers improves, a firm that seeks to hire new workers may feel it has to pay more to new hires. However, if the firm must pay the same wage to new hires and incumbents due to equal treatment, it would either have to raise the wage of the incumbents, or offer new workers a lower wage than it otherwise would. This is very different from the standard assumption in economic analysis that firms are free to treat newly hired workers independently of existing hires. Second, we will use detailed data on individual wages to try to gauge whether (and to what extent) equity is a feature of actual labour markets. To investigate this, we are using two matched employer-employee panel datasets, one from Portugal and the other from Brazil. These unique datasets provide objective records on millions of workers and their firms over a long period of time, so that we can identify which firms employ which workers at each point in time. The datasets also include a large number of firm and worker variables.
Abstract:
In this paper, we consider a producer who faces uninsurable business risks due to incomplete spanning of asset markets over stochastic goods market outcomes, and examine how the presence of uninsurable business risks affects the producer's optimal pricing and production behaviour. We find three key (inter-related) results: (1) optimal prices in goods markets comprise a 'markup' reflecting market power and a 'premium' reflecting the shadow price of the risks; (2) price inertia, as observed in the data, can be explained by the joint working of a risk-neutralization motive and a marginal-cost-equalization condition; (3) the relative responsiveness of the risk-neutralization motive and of marginal-cost equalization at the optimum is central to the cyclical variation of markups, providing a consistent explanation for both procyclical and countercyclical movements. Together, these results imply that the proposed theory of the producer has important implications, both micro and macro, and both empirical and theoretical.
Abstract:
Achievement careers are regarded as a distinctive element of the post-war period in occidental societies. Such a career was at once a modal trajectory for men in the modern segments of the middle class and a social emblem of progress and success. However, while the achievement career came to be a biographical pattern with great normative power, its precise sequential course remained vague. Theories of the 1960s and 1970s described it as an orderly advancement within large firms. By the 1990s, scholars postulated an erosion of the organizational structures that once contributed to the institutionalization of careers, accompanied by a weakening of the normative weight of the achievement career by management discourse. We question the thesis of the corrosion of the achievement career by analysing the trajectories of 442 engineers and business economists in Switzerland with regard to their orderliness, loyalty, and temporal rhythm. An inspection of career types and cohorts reveals that even though loyalty declines over time, hierarchical orderliness is not touched by these changes. Above all, technical-industrial careers fit the loyal and regular pattern. Yet this trajectory type represents only a minority and is by far the slowest and least successful in terms of hierarchical ascent.
Abstract:
Employing an endogenous growth model with human capital, this paper explores how productivity shocks in the goods and human capital producing sectors contribute to explaining aggregate fluctuations in output, consumption, investment and hours. Given the importance of accounting for both the dynamics and the trends in the data not captured by the theoretical growth model, we introduce a vector error correction model (VECM) of the measurement errors and estimate the model’s posterior density function using Bayesian methods. To contextualize our findings with those in the literature, we also assess whether the endogenous growth model or the standard real business cycle model better explains the observed variation in these aggregates. In addressing these issues we contribute to both the methods of analysis and the ongoing debate regarding the effects of innovations to productivity on macroeconomic activity.
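For reference, a VECM of the measurement errors $e_t$ takes the standard form below; the lag order and cointegrating rank used in the paper are not specified in the abstract.

$$ \Delta e_t = \alpha \beta' e_{t-1} + \sum_{i=1}^{p-1} \Gamma_i\, \Delta e_{t-i} + \varepsilon_t, $$

where $\beta$ contains the cointegrating vectors capturing trends not explained by the growth model, $\alpha$ the adjustment loadings, and $\varepsilon_t$ is white noise.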
Abstract:
This paper is inspired by articles in the last decade or so that have argued for more attention to theory, and to empirical analysis, within the well-known and long-lasting contingency framework for explaining the organisational form of the firm. Its contribution is to extend contingency analysis in three ways: (a) by empirically testing it, using explicit econometric modelling (rather than case study evidence) involving estimation by ordered probit analysis; (b) by extending its scope from large firms to SMEs; (c) by extending its applications from Western economic contexts to an emerging-economy context, using fieldwork evidence from China. It calibrates organisational form in a new way, as an ordinal dependent variable, and also utilises new measures of familiar contingency factors from the literature (i.e. Environment, Strategy, Size and Technology) as the independent variables. An ordered probit model of contingency was constructed and estimated by maximum likelihood, using a cross-section of 83 private Chinese firms. The probit was found to be a good fit to the data, and displayed significant coefficients with plausible interpretations for key variables under all four categories of contingency analysis, namely Environment, Strategy, Size and Technology. Thus we have generalised the contingency model in terms of specification, interpretation and applications area.
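A minimal sketch of an ordered probit of this kind, again with statsmodels; the file and variable names standing in for the four contingency factors are hypothetical placeholders, not the paper's actual measures.

```python
# Sketch: ordered probit of organisational form on contingency factors,
# estimated by maximum likelihood (hypothetical variable names).
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

firms = pd.read_csv("firms.csv")  # hypothetical cross-section of firms
# org_form: ordinal code for organisational form (e.g. 0=simple ... 3=divisional)
X = firms[["environment", "strategy", "size", "technology"]]

probit_fit = OrderedModel(firms["org_form"], X, distr="probit").fit(
    method="bfgs", disp=False
)
print(probit_fit.summary())  # coefficients, thresholds, log-likelihood
```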
Abstract:
VAR methods have been used to model the inter-relationships between inflows and outflows into unemployment and vacancies using tools such as impulse response analysis. In order to investigate whether such impulse responses change over the course of the business cycle or over time, this paper uses TVP-VARs for US and Canadian data. For the US, we find interesting differences between the most recent recession and earlier recessions and expansions. In particular, we find the immediate effect of a negative shock on both inflow and outflow hazards to be larger in 2008 than in earlier times. Furthermore, the effect of this shock takes longer to decay. For Canada, we find less evidence of time-variation in impulse responses.
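The paper works with time-varying-parameter VARs; as a simpler point of reference, impulse responses from a constant-parameter VAR can be computed with statsmodels as below (the file and series names are hypothetical).

```python
# Sketch: impulse responses from a constant-parameter VAR as a baseline
# reference for the TVP-VAR analysis (hypothetical series names).
import pandas as pd
from statsmodels.tsa.api import VAR

data = pd.read_csv("flows.csv", index_col=0)  # columns: inflow_hazard, outflow_hazard, vacancies
var_fit = VAR(data).fit(maxlags=4, ic="aic")

irf = var_fit.irf(periods=12)   # orthogonalised impulse responses over 12 periods
print(irf.orth_irfs.shape)      # (periods + 1, n_vars, n_vars)
# A TVP-VAR instead re-estimates these responses at each date, allowing the
# size and persistence of a shock's effect to differ across recessions.
```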
Abstract:
We study a business cycle model in which a benevolent fiscal authority must determine the optimal provision of government services, while lacking credibility, lump-sum taxes, and the ability to bond-finance deficits. Households and the fiscal authority have risk-sensitive preferences. We find that outcomes are affected importantly by the household's risk sensitivity, but not by the fiscal authority's. Further, while household risk sensitivity induces a strong precautionary saving motive, which raises capital and lowers the return on assets, its effects on fluctuations and the business cycle are generally small, although more pronounced for negative shocks. Holding the stochastic steady state constant, increases in household risk sensitivity lower the risk-free rate and raise the return on equity, increasing the equity premium. Finally, although risk sensitivity has little effect on the provision of government services, it does cause the fiscal authority to lower the income tax rate. An additional contribution of this paper is to present a method for computing Markov-perfect equilibria in models where private agents and the government are risk-sensitive decision-makers.
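Risk-sensitive preferences are commonly written in the Hansen-Sargent recursive form below; this is a reference sketch, and the paper's exact specification may differ.

$$ V_t = u(c_t) - \frac{\beta}{\theta} \log \mathbb{E}_t\!\left[ \exp\!\left(-\theta\, V_{t+1}\right) \right], $$

where $\theta > 0$ indexes risk sensitivity and the limit $\theta \to 0$ recovers standard expected utility.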
Abstract:
This paper analyses optimal income taxes over the business cycle under a balanced-budget restriction, for low-, middle- and high-income households. A model incorporating capital-skill complementarity in production and differential access to capital and labour markets is developed to capture the cyclical characteristics of the US economy, as well as the empirical observations on wage (skill premium) and wealth inequality. We find that the tax rate for high-income agents is optimally the least volatile and the tax rate for low-income agents the least countercyclical. In contrast, the path of optimal taxes for the middle-income group is found to be very volatile and countercyclical. We further find that the optimal response to output-enhancing capital-equipment technology shocks and to spending cuts is to increase the progressivity of income taxes. Finally, in response to positive TFP shocks, taxation becomes more progressive after about two years.
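Capital-skill complementarity is commonly modelled with a nested CES technology of the Krusell et al. (2000) type; a sketch is given below, though the paper's exact functional form is not stated in the abstract.

$$ Y_t = K_{s,t}^{\alpha} \left[ \mu\, U_t^{\sigma} + (1-\mu) \left( \lambda\, K_{e,t}^{\rho} + (1-\lambda)\, S_t^{\rho} \right)^{\sigma/\rho} \right]^{\frac{1-\alpha}{\sigma}}, $$

where $K_s$ is capital structures, $K_e$ capital equipment, $U$ and $S$ unskilled and skilled labour, and $\sigma > \rho$ delivers capital-skill complementarity (equipment is more complementary with skilled than with unskilled labour).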
Abstract:
Motivated by the highly-unionized public sectors, the high public shares in total employment, and the public sector wage premia observed in Europe, this paper examines the importance of public sector unions for macroeconomic theory. The model generates cyclical behavior in hours and wages that is consistent with the data for an economy with a highly-unionized public sector, namely Germany during the period 1970-2007. The union model is a significant improvement over a model with exogenous public employment. In addition, endogenously-determined public wages and hours add to the distortionary effect of contractionary tax reforms by generating greater tax rate changes, thus producing significantly higher welfare losses.