48 results for Mergers and acquisitions, analysts, consensus forecast error
in Aston University Research Archive
Abstract:
This empirical study employs an alternative methodology to examine the change in wealth associated with mergers and acquisitions (M&As) for US firms. Specifically, we employ the standard CAPM, the Fama-French three-factor model and the Carhart four-factor model within the OLS and GJR-GARCH estimation methods to test the behaviour of cumulative abnormal returns (CARs). Whilst the standard CAPM captures the variability of stock returns with the overall market, the Fama-French factors capture risk factors that are important to investors. Additionally, augmenting the Fama-French three-factor model with the Carhart momentum factor to generate the four-factor model captures additional pricing elements that may affect stock returns. Traditionally, estimates of abnormal returns (ARs) in M&A situations rely on the standard OLS estimation method. However, standard OLS will provide inefficient estimates of the ARs if the data contain ARCH and asymmetric effects. To minimise this problem of estimation efficiency, we re-estimated the ARs using the GJR-GARCH estimation method. We find that the results vary with both the choice of model and the estimation method. Beyond these variations, we also tested whether the ARs are affected by the degree of liquidity of the stocks and the size of the firm. We document significant positive post-announcement CARs for target-firm shareholders under both the OLS and GJR-GARCH methods across all three models. However, post-event CARs for acquiring-firm shareholders were insignificant under both estimation methods across the three models. The GJR-GARCH method appears to generate larger CARs than the OLS method.
Using market capitalization and trading volume as measures of liquidity and firm size, we observed strong return continuations for target shareholders in medium-sized firms relative to small and large firms. We consistently observed market efficiency in small and large firms. This implies that small and large target firms overreact to new information, resulting in a more efficient market. For acquirer firms, our measure of liquidity captures strong return continuations for small firms under the OLS estimates for both the CAPM and Fama-French three-factor models, but under the GJR-GARCH estimates only for the Carhart model. Bootstrapped post-announcement simulated CARs confirmed our earlier results.
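The event-study logic behind these CARs can be sketched with a minimal market-model (CAPM-style) OLS estimation on synthetic data. This is an illustrative simplification, not the study's multi-factor or GJR-GARCH specification, and every number below is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily returns: 250-day estimation window + 21-day event window.
n_est, n_evt = 250, 21
mkt = rng.normal(0.0004, 0.01, n_est + n_evt)            # market returns
true_alpha, true_beta = 0.0001, 1.1
stock = true_alpha + true_beta * mkt + rng.normal(0, 0.015, n_est + n_evt)
stock[n_est:] += 0.002                                    # inject a post-announcement drift

# Step 1: estimate the market model on the estimation window by OLS.
X = np.column_stack([np.ones(n_est), mkt[:n_est]])
alpha, beta = np.linalg.lstsq(X, stock[:n_est], rcond=None)[0]

# Step 2: abnormal returns in the event window = actual - model-expected.
ar = stock[n_est:] - (alpha + beta * mkt[n_est:])

# Step 3: cumulate the abnormal returns over the event window.
car = ar.cumsum()
print(f"CAR over {n_evt} days: {car[-1]:.4f}")
```

Replacing the OLS step with a GJR-GARCH fit of the residual variance, as the study does, changes the efficiency of the estimates but not this basic AR/CAR structure.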
Abstract:
Mergers and acquisitions (M&As) are increasingly becoming a strategy of choice for companies attempting to achieve and sustain competitive advantage. However, not all M&As are a success. In this paper, we examine the three main reasons highlighted in the literature as major causes of M&A failure (clashing corporate cultures, absence of clear communication, and lack of employee involvement) in three Indian pharmaceutical companies, and we analyze the role played by the HR function in addressing them. We also discuss the importance of gaining the commitment and focus of the workforce during the acquisition process through employee involvement.
Abstract:
This empirical study investigates the performance of cross-border M&A. The first stage is to identify the determinants of completing a cross-border M&A deal. One focus here is to extend the existing empirical evidence in the field of cross-border M&A and to examine the likelihood of M&A from a different perspective. Given the determinants of cross-border M&A completion, the second stage is to investigate the effects of cross-border M&A on post-acquisition firm performance for both targets and acquirers. The thesis exploits a hitherto unused database, which consists of firms rumoured to be undertaking M&A, and follows each deal to completion or abandonment. This approach highlights a number of limitations of the previous literature, which relies on statistical methodology to identify potential but non-existent mergers. This thesis challenges some conventional understanding of M&A activity. Cross-border M&A activity is underpinned by various motives, such as synergy, management discipline, and the acquisition of complementary resources. Traditionally, it is believed that these motives boost international M&A activity and improve firm performance after takeovers. However, this thesis shows that factors based on these motives, such as the acquirer's profitability and liquidity and the target's intangible resources, actually deterred the completion of cross-border M&A over the period 2002-2011. The overall finding suggests that cross-border M&A is an efficiency-seeking rather than a resource-seeking activity. Furthermore, compared with firms in takeover rumours, the completion of M&A lowers firm performance. More specifically, difficulties in the transfer of competitive advantages and the integration of strategic assets lead to low firm performance in terms of productivity. Moreover, firms cannot realise the synergistic and managerial disciplinary effects once a cross-border M&A is completed, which suggests a low post-acquisition profitability level.
Abstract:
This article compares the cases of ozone layer protection and climate change. In both cases, scientific expertise has played a comparatively important role in the policy process. The author argues that, against conventional assumptions, scientific consensus is not necessary to achieve ambitious political goals. However, the architects of the Intergovernmental Panel on Climate Change operated under such assumptions. The author argues that this is problematic both from a theoretical viewpoint and in light of the empirical evidence. Contrary to conventional assumptions, ambitious political regulations in the ozone case were agreed under scientific uncertainty, whereas the negotiations on climate change were much more modest, albeit based on a large scientific consensus. On the basis of a media analysis, the author shows that the creation of a climate of expectation, plus pressure from leader countries, is crucial for success. © 2006 Sage Publications.
Abstract:
There are conflicting predictions in the literature about the relationship between foreign direct investment (FDI) and entrepreneurship. This paper explores how FDI inflows, measured by lagged cross-border mergers and acquisitions (M&A), affect entrepreneurial entry in the host economy. We constructed a micro-panel of more than two thousand individuals in each of seventy countries over 2000-2009, linked to FDI by matching sectors. We find the relationship between FDI inflows and domestic entrepreneurship to be negative across all economies. This negative effect is much more pronounced in developed than in developing economies and is also identified within industries, notably in manufacturing. Policies to encourage FDI via M&A need to consider how to counteract this prevailing adverse effect on domestic entrepreneurship.
Abstract:
Decades of costly failures in translating drug candidates from preclinical disease models to human therapeutic use warrant reconsideration of the priority placed on animal models in biomedical research. Following an international workshop attended by experts from academia, government institutions, research funding bodies, and the corporate and nongovernmental organisation (NGO) sectors, in this consensus report, we analyse, as case studies, five disease areas with major unmet needs for new treatments. In view of the scientifically driven transition towards a human pathway-based paradigm in toxicology, a similar paradigm shift appears to be justified in biomedical research. There is a pressing need for an approach that strategically implements advanced, human biology-based models and tools to understand disease pathways at multiple biological scales. We present recommendations to help achieve this.
Abstract:
This study suggests a novel application of Inverse Data Envelopment Analysis (InvDEA) to strategic decision making about mergers and acquisitions in banking. Conventional DEA assesses the efficiency of banks based on the quantities of inputs used to realize the observed level of outputs produced. The decision maker of a banking unit that intends to merge with or acquire another banking unit needs to decide on the input and/or output levels if an efficiency target for the new banking unit is set. In this paper, a new InvDEA-based approach is developed to suggest the required levels of inputs and outputs for the merged bank to reach a predetermined efficiency target. The study illustrates the novelty of the proposed approach through the case of a bank considering merging with or acquiring one of its competitors to synergize and realize a higher level of efficiency. A real data set of 42 banking units in Gulf Cooperation Council countries is used to show the practicality of the proposed approach.
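The conventional DEA step that the InvDEA approach builds on can be sketched as a standard input-oriented CCR linear program. This is not the paper's inverse formulation, and the toy bank data below are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k.
    X: (n_units, n_inputs), Y: (n_units, n_outputs).
    Solves: min theta  s.t.  sum_j lam_j x_j <= theta x_k,
                             sum_j lam_j y_j >= y_k,  lam >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimise theta
    # Input rows: -x_ik * theta + sum_j lam_j x_ij <= 0
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])
    # Output rows: -sum_j lam_j y_rj <= -y_rk
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                  bounds=[(0, None)] * (1 + n))
    return res.fun

# Toy data: 4 bank branches, inputs = (staff, assets), output = (loans,)
X = np.array([[2.0, 5.0], [3.0, 6.0], [4.0, 8.0], [5.0, 4.0]])
Y = np.array([[4.0], [5.0], [5.0], [6.0]])
for k in range(4):
    print(f"unit {k}: efficiency = {ccr_efficiency(X, Y, k):.3f}")
```

The inverse problem then runs the other way: fix a target efficiency for a merged unit and solve for admissible input/output levels, which is what the paper's InvDEA model provides.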
Abstract:
Three experiments are reported which examine the effects of consensus information on majority and minority influence. In all experiments, two levels of consensus difference were examined: large (82% versus 18%) and small (52% versus 48%). Experiment 1 showed that a majority source had more influence than a minority source, irrespective of consensus level. Experiment 2 examined the cause of this effect by presenting only the source label ('majority' versus 'minority'), only the consensus information (percentages), or both. The superior influence of the majority was again found when (a) both the source label and consensus information were given (replicating Experiment 1) and (b) only the consensus information was given, but not when (c) only the source label was given. The results showed that majority influence was due to the consensus information indicating that more than 50% of the population supported that position. Experiment 3 also manipulated message quality (strong versus weak arguments) to identify whether systematic processing had occurred. Message quality had an impact only with the minority of 18%. These studies show that consensus information has different effects on majority and minority influence. For majority influence, having over 50% support is sufficient to cause compliance, while for a minority there are advantages to being numerically small, in terms of leading to detailed processing of its message.
Abstract:
We analyze theoretically the interplay between optical return-to-zero signal degradation due to timing jitter and additive amplified-spontaneous-emission noise. The impact of these two factors on the performance of a square-law direct detection receiver is also investigated. We derive an analytical expression for the bit-error probability and quantitatively determine the conditions when the contributions of the effects of timing jitter and additive noise to the bit error rate can be treated separately. The analysis of patterning effects is also presented. © 2007 IEEE.
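As a rough orientation only: under a Gaussian approximation, timing jitter can be folded into the mark-level variance and combined in quadrature with the ASE contribution. This is a drastic simplification of the paper's joint analysis, and every number below is an assumed illustration:

```python
import numpy as np
from scipy.special import erfc

def ber_gaussian(mu1, mu0, sigma1, sigma0):
    """BER under the Gaussian approximation with an optimally placed
    threshold: Q = (mu1 - mu0)/(sigma1 + sigma0), BER = 0.5*erfc(Q/sqrt(2))."""
    q = (mu1 - mu0) / (sigma1 + sigma0)
    return 0.5 * erfc(q / np.sqrt(2))

# Assumed mark/space levels and ASE noise (arbitrary units).
mu1, mu0 = 1.0, 0.05
sig_ase1, sig_ase0 = 0.12, 0.04

# Treat timing jitter as an extra, independent amplitude fluctuation on the
# marks: sigma_jitter ~ |pulse slope at sampling instant| * rms jitter.
slope = 0.8
sigma_t = 0.05
sig1_total = np.hypot(sig_ase1, slope * sigma_t)   # variances add in quadrature

print(f"BER, ASE only:     {ber_gaussian(mu1, mu0, sig_ase1, sig_ase0):.3e}")
print(f"BER, ASE + jitter: {ber_gaussian(mu1, mu0, sig1_total, sig_ase0):.3e}")
```

The paper's point is precisely that this separable treatment is only valid under conditions it quantifies; outside them, the joint distribution of jitter and additive noise must be used.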
Abstract:
Introduction: There is a growing public perception that serious medical error is commonplace and largely tolerated by the medical profession. The Government and medical establishment's response to this perceived epidemic of error has included tighter controls over practising doctors and individual stick-and-carrot reforms of medical practice. Discussion: This paper critically reviews the literature on medical error, professional socialization and medical student education, and suggests that common themes such as uncertainty, necessary fallibility, exclusivity of professional judgement and extensive use of medical networks find their genesis, in part, in aspects of medical education and socialization into medicine. The nature and comparative failure of recent reforms of medical practice and the tension between the individualistic nature of the reforms and the collegiate nature of the medical profession are discussed. Conclusion: A more theoretically informed and longitudinal approach to decreasing medical error might be to address the genesis of medical thinking about error through reforms to the aspects of medical education and professional socialization that help to create and perpetuate the existence of avoidable error, and reinforce medical collusion concerning error. Further changes in the curriculum to emphasize team working, communication skills, evidence-based practice and strategies for managing uncertainty are therefore potentially key components in helping tomorrow's doctors to discuss, cope with and commit fewer medical errors.
Abstract:
We present three jargonaphasic patients who made phonological errors in naming, repetition and reading. We analyse target/response overlap using statistical models to answer three questions: 1) Is there a single phonological source for errors or two sources, one for target-related errors and a separate source for abstruse errors? 2) Can correct responses be predicted by the same distribution used to predict errors or do they show a completion boost (CB)? 3) Is non-lexical and lexical information summed during reading and repetition? The answers were clear. 1) Abstruse errors did not require a separate distribution created by failure to access word forms. Abstruse and target-related errors were the endpoints of a single overlap distribution. 2) Correct responses required a special factor, e.g., a CB or lexical/phonological feedback, to preserve their integrity. 3) Reading and repetition required separate lexical and non-lexical contributions that were combined at output.
Abstract:
The thrust of this report concerns spline theory and some of its background, and follows the development in Wahba (1991). We also review methods for determining hyper-parameters, such as the smoothing parameter, by Generalised Cross-Validation. Splines have an advantage over Gaussian-Process-based procedures in that we can readily impose atmospherically sensible smoothness constraints while maintaining computational efficiency. Vector splines enable us to penalise gradients of vorticity and divergence in wind fields. Two similar techniques are summarised, and improvements based on robust error functions and restricted numbers of basis functions are given. A final, brief discussion of the application of vector splines to the problem of scatterometer data assimilation highlights the problem of ambiguous solutions.
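Smoothing-parameter selection by Generalised Cross-Validation can be illustrated with a simplified one-dimensional discrete (Whittaker-style) smoother standing in for the report's vector splines; the penalty, data, and grid below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = np.linspace(0, 1, n)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)   # noisy signal

# Second-difference penalty matrix D (a discrete analogue of the cubic-spline
# roughness penalty), so the smoother is y_hat = (I + lam * D'D)^{-1} y.
D = np.diff(np.eye(n), n=2, axis=0)
P = D.T @ D
I = np.eye(n)

def gcv(lam):
    """GCV(lam) = n * RSS / (n - tr(S_lam))^2 for the linear smoother S_lam."""
    S = np.linalg.inv(I + lam * P)       # smoother ("hat") matrix
    y_hat = S @ y
    rss = np.sum((y - y_hat) ** 2)
    edf = np.trace(S)                    # effective degrees of freedom
    return n * rss / (n - edf) ** 2

lams = np.logspace(-2, 4, 30)
scores = [gcv(l) for l in lams]
best = lams[int(np.argmin(scores))]
print(f"GCV-selected smoothing parameter: {best:.3g}")
```

The vector-spline case in the report replaces the second-difference penalty with penalties on gradients of vorticity and divergence, but the GCV criterion for choosing the smoothing parameter has the same form.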
Abstract:
We present a theoretical method for a direct evaluation of the average error exponent in Gallager error-correcting codes using methods of statistical physics. Results for the binary symmetric channel (BSC) are presented for codes of both finite and infinite connectivity.
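The statistical-physics calculation itself is not reproduced here, but a closely related classical quantity, Gallager's random-coding exponent for the BSC, can be evaluated directly as a point of reference:

```python
import numpy as np

def e0_bsc(rho, p):
    """Gallager's E0 for a BSC(p) with uniform inputs:
    E0(rho) = rho - (1+rho) * log2(p^(1/(1+rho)) + (1-p)^(1/(1+rho)))."""
    a = p ** (1.0 / (1.0 + rho))
    b = (1.0 - p) ** (1.0 / (1.0 + rho))
    return rho - (1.0 + rho) * np.log2(a + b)

def random_coding_exponent(R, p, n_grid=1000):
    """E_r(R) = max over rho in [0,1] of E0(rho) - rho*R (grid search)."""
    rho = np.linspace(0.0, 1.0, n_grid)
    return float(np.max(e0_bsc(rho, p) - rho * R))

p = 0.1
capacity = 1 + p * np.log2(p) + (1 - p) * np.log2(1 - p)   # C = 1 - H(p)
print(f"capacity of BSC({p}) ~ {capacity:.3f} bits/use")
for R in (0.1, 0.3, 0.5):
    print(f"E_r({R}) = {random_coding_exponent(R, p):.4f}")
```

The exponent is positive for rates below capacity and vanishes above it; the paper's average exponent for finite-connectivity Gallager codes plays the analogous role for that structured ensemble.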