939 results for Financial analysis
Abstract:
Australian banks are currently generating huge profits, but are they sustainable? NECMI AVKIRAN suggests that banks will need to scrutinise the performance of their networks to ensure future profits.
Abstract:
Whilst financial markets are not strangers to academic and professional scrutiny, they remain epistemologically contested. For individuals trying to profit by trading shares, this uncertainty is manifested in the varying trading styles they are able to utilize. This paper examines one trading style commonly used by non-professional share traders: technical analysis. Using research data obtained from individuals who identify themselves as technical analysts, the paper seeks to explain the ways in which individuals understand and use the technique in an attempt to make trading profits. In particular, four distinct subcategories, or ideal types, of technical analysis can be identified, each providing an alternative perceptual form for participating in financial markets. Each of these types relies upon a particular method for seeing the market, with these visualization techniques highlighting the existence of forms of professional vision (as originally identified by Goodwin (1994)) in the way the trading styles are comprehended and acted upon.
Abstract:
This study attempts to assess the role of perceived risk in air passenger behaviour. A survey of 889 respondents is used to investigate a multidimensional concept of perceived risk and to analyse how passengers' risk assessments differ across socio-demographic characteristics. The results indicate that financial risk and temporal risk are the most important in the context of commercial air travel. All perceived risk dimensions differ according to gender, age, cultural background, income, previous experience, and reason for travelling.
Abstract:
A new methodology is proposed for the analysis of generation capacity investment in a deregulated market environment, in which the investment appraisal is carried out within a probabilistic framework. The probabilistic production simulation (PPC) algorithm is used to compute the expected energy generated, taking into account system load variations and plant forced outage rates, while the Monte Carlo approach is applied to model the electricity price variability seen in a realistic network. The model is able to capture the price and hence the profitability uncertainties for generator companies. Seasonal variation in the electricity prices and the system demand are independently modeled. The method is validated on the IEEE RTS system, augmented with realistic market and plant data, by using it to compare the financial viability of several generator investments applying either conventional or directly connected generator (powerformer) technologies. The significance of the results is assessed using several financial risk measures.
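To make the flavour of such a probabilistic appraisal concrete, the Python sketch below simulates hourly electricity prices around an assumed seasonal mean, applies a crude forced-outage rate, and summarises a hypothetical generator's annual profit with its expectation and a 5% value-at-risk. The price model, capacity, costs and outage rate are invented for illustration and are not the paper's PPC model or market data.

```python
# Minimal sketch of a Monte Carlo appraisal of a generator investment.
# All figures (capacity, costs, price model, outage rate) are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

HOURS = 8760                 # one year of hourly operation
CAPACITY_MW = 100.0
MARGINAL_COST = 35.0         # $/MWh, assumed
FIXED_COST = 18e6            # $/year, assumed
N_SIM = 2000

# seasonal mean price ($/MWh): two peaks per year, purely illustrative
hour = np.arange(HOURS)
seasonal_mean = 45.0 + 10.0 * np.cos(2 * np.pi * hour / HOURS * 2)

profits = np.empty(N_SIM)
for i in range(N_SIM):
    # lognormal noise around the seasonal mean stands in for price volatility
    price = seasonal_mean * rng.lognormal(mean=0.0, sigma=0.25, size=HOURS)
    # plant runs only when price exceeds marginal cost; crude 10% forced-outage rate
    available = rng.random(HOURS) > 0.10
    dispatch = available & (price > MARGINAL_COST)
    energy_margin = np.where(dispatch, (price - MARGINAL_COST) * CAPACITY_MW, 0.0)
    profits[i] = energy_margin.sum() - FIXED_COST

print(f"expected annual profit : ${profits.mean():,.0f}")
print(f"5% value-at-risk       : ${np.percentile(profits, 5):,.0f}")
```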
Abstract:
In the analysis and prediction of many real-world time series, the assumption of stationarity is not valid. A special form of non-stationarity, where the underlying generator switches between (approximately) stationary regimes, seems particularly appropriate for financial markets. We introduce a new model which combines dynamic switching (controlled by a hidden Markov model) with a non-linear dynamical system. We show how to train this hybrid model in a maximum likelihood approach and evaluate its performance on both synthetic and financial data.
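As a minimal sketch of the switching idea (not the authors' hybrid model, which adds a non-linear dynamical system per regime), the Python below simulates returns from two assumed Gaussian regimes governed by a Markov chain and evaluates the model log-likelihood with the scaled forward algorithm, the quantity maximised in maximum-likelihood training.

```python
# Hidden-Markov "regime switching" sketch: calm / volatile Gaussian regimes.
# All parameter values are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)

# transition matrix and per-regime return distributions (assumed)
A = np.array([[0.98, 0.02],
              [0.05, 0.95]])
mu = np.array([0.0005, -0.001])     # mean daily return per regime
sigma = np.array([0.005, 0.02])     # volatility per regime

# simulate a switching return series
T = 1000
states = np.empty(T, dtype=int)
states[0] = 0
for t in range(1, T):
    states[t] = rng.choice(2, p=A[states[t - 1]])
returns = rng.normal(mu[states], sigma[states])

def forward_loglik(x, A, mu, sigma, pi0=np.array([0.5, 0.5])):
    """Scaled forward algorithm: log p(x | model)."""
    def emis(xt):
        return np.exp(-0.5 * ((xt - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    alpha = pi0 * emis(x[0])
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for xt in x[1:]:
        alpha = (alpha @ A) * emis(xt)   # predict with the chain, weight by emission
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

print("log-likelihood under the true parameters:", forward_loglik(returns, A, mu, sigma))
```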
Abstract:
This paper reports on an assessment of an ongoing 6-Sigma program conducted within a UK-based (US-owned) automotive company. It gives an overview of the management of the 6-Sigma programme and the in-house methodology used. The analysis given in the paper focuses in particular on the financial impacts that individual projects have had. Three projects are chosen from the hundreds that have been completed and are discussed in detail, including which specific techniques have been used and how financially successful the projects were. Commentary is also given on the effectiveness of the overall program along with a critique of how the implementation of 6-Sigma could be more effectively managed in the future. This discussion particularly focuses upon issues such as: project selection and scoping, financial evaluation and data availability, organisational awareness, commitment and involvement, middle management support, functional variation, and maintaining momentum during the rollout of a lengthy program.
Abstract:
One of the key policy objectives of government at national and regional level is to overcome the constraints preventing local industry from achieving greater competitiveness in the international market-place. This paper examines the impact of grant assistance to Northern Ireland small firms delivered over the period 1994-97 by the former Local Enterprise Development Unit through its Growth Business Support Programme (GBSP). Previous work by the authors showed that there was some tentative evidence to suggest a link between employment growth and grant aid provided to very small firms (fewer than 10 employees) assisted under the GBSP. The central objective of the empirical work reported in this paper is to extend the previous analysis by understanding the extent to which the value of financial assistance influences growth (employment, turnover, and productivity measures) and if differential impacts arise depending on the nature and timing (lag structures) of the grant assistance.
Abstract:
The techniques and insights from two distinct areas of financial economic modelling are combined to provide evidence of the influence of firm size on the volatility of stock portfolio returns. Portfolio returns are characterized by positive serial correlation induced by the varying levels of non-synchronous trading among the component stocks. This serial correlation is greatest for portfolios of small firms. The conditional volatility of stock returns has been shown to be well represented by the GARCH family of statistical processes. Using a GARCH model of the variance of capitalization-based portfolio returns, conditioned on the autocorrelation structure in the conditional mean, striking differences related to firm size are uncovered.
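A minimal sketch of the AR(1)-GARCH(1,1) structure described above, using simulated data and assumed parameter values rather than the paper's capitalization-based portfolios: the AR term stands in for the serial correlation induced by non-synchronous trading, and the GARCH recursion filters the conditional variance whose log-likelihood an estimator would maximise.

```python
# Simulate and filter an AR(1)-GARCH(1,1) return series; parameters are assumed.
import numpy as np

rng = np.random.default_rng(7)

T = 2000
mu, phi = 0.0002, 0.15                 # conditional mean parameters (assumed)
omega, alpha, beta = 1e-6, 0.08, 0.90  # GARCH(1,1) parameters (assumed)
r = np.zeros(T)
sigma2 = np.full(T, omega / (1 - alpha - beta))   # start at unconditional variance
eps_prev = 0.0
for t in range(1, T):
    sigma2[t] = omega + alpha * eps_prev**2 + beta * sigma2[t - 1]
    eps_prev = np.sqrt(sigma2[t]) * rng.standard_normal()
    r[t] = mu + phi * r[t - 1] + eps_prev

def garch_loglik(r, mu, phi, omega, alpha, beta):
    """Run the conditional-variance recursion and return the Gaussian log-likelihood."""
    eps = r[1:] - mu - phi * r[:-1]              # AR(1) residuals
    s2 = np.empty(len(eps))
    s2[0] = eps.var()
    for t in range(1, len(eps)):
        s2[t] = omega + alpha * eps[t - 1] ** 2 + beta * s2[t - 1]
    return -0.5 * np.sum(np.log(2 * np.pi * s2) + eps**2 / s2)

print("log-likelihood at the true parameters:", garch_loglik(r, mu, phi, omega, alpha, beta))
```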
Abstract:
The use of Diagnosis Related Groups (DRG) as a mechanism for hospital financing is a currently debated topic in Portugal. The DRG system was scheduled to be initiated by the Health Ministry of Portugal on January 1, 1990 as an instrument for the allocation of public hospital budgets funded by the National Health Service (NHS), and as a method of payment for other third party payers (e.g., Public Employees (ADSE), private insurers, etc.). Based on experience from other countries such as the United States, it was expected that implementation of this system would result in more efficient hospital resource utilisation and a more equitable distribution of hospital budgets. However, in order to minimise the potentially adverse financial impact on hospitals, the Portuguese Health Ministry decided to gradually phase in the use of the DRG system for budget allocation by using blended hospital-specific and national DRG casemix rates. Since implementation in 1990, the percentage of each hospital's budget based on hospital-specific costs was to decrease, while the percentage based on DRG casemix was to increase. This was scheduled to continue until 1995, when the plan called for allocating yearly budgets on a 50% national and 50% hospital-specific cost basis. While all other non-NHS third party payers are currently paying based on DRGs, the adoption of DRG casemix as a National Health Service budget setting tool has been slower than anticipated. There is now some argument in both the political and academic communities as to the appropriateness of DRGs as a budget setting criterion as well as to their impact on hospital efficiency in Portugal. This paper uses a two-stage procedure to assess the impact of actual DRG payment on the productivity (through its components, i.e., technological change and technical efficiency change) of diagnostic technology in Portuguese hospitals during the years 1992–1994, using both parametric and nonparametric frontier models. We find evidence that the DRG payment system does appear to have had a positive impact on productivity and technical efficiency of some commonly employed diagnostic technologies in Portugal during this time span.
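For reference, the decomposition of productivity change into the two components mentioned above is usually written in Malmquist-index form as efficiency change times technological change; the textbook statement (not reproduced from the paper itself) is:

```latex
M^{t,t+1} \;=\;
\underbrace{\frac{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D^{t}\!\left(x^{t},y^{t}\right)}}_{\text{technical efficiency change}}
\times
\underbrace{\left[\frac{D^{t}\!\left(x^{t+1},y^{t+1}\right)}{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}\cdot
\frac{D^{t}\!\left(x^{t},y^{t}\right)}{D^{t+1}\!\left(x^{t},y^{t}\right)}\right]^{1/2}}_{\text{technological change}}
```

where D^t denotes the distance function measured against the period-t frontier, estimated here with either parametric or non-parametric methods.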
Abstract:
Objective: The number of pharmaceutical items issued on prescription is continually rising and contributing to spiralling healthcare costs. Although there is some data highlighting the quantity, in terms of weight, of medicines returned specifically to community pharmacies, little is known about the specific details of such returns or other destinations for wasted medications. This pilot study has been designed to investigate the types and amounts of medicines returned to both general practices (GPs) and associated local community pharmacies, determining the reasons why these medicines have been returned. Method: The study was conducted in eight community pharmacies and five GP surgeries within East Birmingham over a 4-week period. Main outcome measure: Reason for return and details of returned medication. Results: A total of 114 returns were made during the study: 24 (21.1%) to GP surgeries and 90 (78.9%) to community pharmacies. The total returns comprised 340 items, of which 42 (12.4%) were returned to GPs and 298 (87.6%) to pharmacies, with the mean number of items per return being 1.8 and 3.3, respectively. Half of the returns in the study were attributed to the doctor changing or stopping the medicine; 23.7% of returns were recorded as excess supplies or clearout, often associated with patients' death, and 3.5% of returns were related to adverse drug reactions. Cardiovascular drugs were most commonly returned, amounting to 28.5% of the total drugs returned during the study. Conclusions: The results from this pilot study indicate that unused medicines impose a significant financial burden on the National Health Service as well as a social burden on the United Kingdom population. Further studies are examining the precise nature of returned medicines and possible solutions to these issues.
Abstract:
This chapter provides the theoretical foundation and background on the data envelopment analysis (DEA) method. We first introduce the basic DEA models. The balance of the chapter focuses on evidence showing that DEA has been extensively applied for measuring the efficiency and productivity of services, including financial services (banking, insurance, securities, and fund management), professional services, health services, education services, environmental and public services, energy services, logistics, tourism, information technology, telecommunications, transport, distribution, audio-visual, media, entertainment, cultural and other business services. Finally, we provide information on the use of the Performance Improvement Management Software (PIM-DEA). A free limited version of this software and the downloading procedure are also included in this chapter.
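As a hedged illustration of the basic envelopment model referred to above, the Python sketch below solves the input-oriented CCR (constant returns to scale) DEA programme for each decision-making unit using scipy's linear-programming routine; the four "bank branch" DMUs and their inputs and outputs are made-up figures, not data from the chapter or from the PIM-DEA software.

```python
# Minimal input-oriented CCR DEA model solved as a linear programme.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (e.g. bank branches); columns = inputs / outputs (hypothetical figures)
X = np.array([[20.0, 300.0],   # inputs: staff, operating cost
              [15.0, 200.0],
              [30.0, 400.0],
              [25.0, 350.0]])
Y = np.array([[1000.0, 50.0],  # outputs: loans issued, new accounts
              [ 900.0, 40.0],
              [1100.0, 45.0],
              [1200.0, 60.0]])

def ccr_efficiency(X, Y, o):
    """Efficiency score theta for DMU o (1.0 = on the efficient frontier)."""
    n, m = X.shape          # n DMUs, m inputs
    s = Y.shape[1]          # s outputs
    c = np.r_[1.0, np.zeros(n)]                 # decision vars: [theta, lambda_1..lambda_n]
    # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[[o]].T, X.T])
    b_in = np.zeros(m)
    # outputs: -sum_j lambda_j * y_rj <= -y_ro   (i.e. at least DMU o's output)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```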
Abstract:
A systematic analysis is presented of the economic consequences of the abnormally high concentration of Zambia's exports on a commodity whose price is exceptionally unstable. Zambian macro-economic variables in the post-independence years are extensively documented, showing acute instability and decline, particularly after the energy price revolution and the collapse of copper prices. The relevance of stabilization policies designed to correct short-term disequilibrium is questioned. It is, therefore, a pathological case study of externally induced economic instability, complementing other studies in this area which use cross-country analysis of a few selected variables. After a survey of theory and issues pertaining to development, finance and stabilization, the emergence of domestic and foreign financial constraints on the Zambian economy is described. The world copper industry is surveyed and an examination of commodity and world trade prices concludes that copper showed the highest degree of price instability. Specific aspects of Zambia's economy identified for detailed analysis include: its unprofitable mining industry, external payments disequilibrium, a constrained government budget, potentially inflationary monetary growth, and external indebtedness. International comparisons are used extensively, but major copper exporters are subjected to closer scrutiny. An appraisal of policy options concludes the study.
Abstract:
This thesis presents the results from an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study of both the methods for the measurement of minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - by directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, being hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc., that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear. This is despite a variety of reasons which suggest that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. One of the main objectives of this thesis will be to prove that much more effective and powerful analysis of MEG can be achieved if one were to assume the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in the MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal to noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are high costs and low portability of state-of-the-art multichannel machines. The result of this is that the use of MEG has, hitherto, been restricted to large institutions which are able to afford the high costs associated with the procurement and maintenance of these machines.
In this project, we seek to address these issues by working almost exclusively with single channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks, to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
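As one concrete example of the dynamical-systems toolkit applied to a single unaveraged channel, the Python sketch below builds a delay-coordinate (Takens) embedding of a synthetic signal, the state-space reconstruction step on which many of the nonlinear methods mentioned above operate; the signal, embedding dimension and delay are assumptions chosen for illustration, not values from the thesis.

```python
# Delay-coordinate (Takens) embedding of a single-channel time series.
# The signal here is synthetic; real MEG data would replace x.
import numpy as np

def delay_embed(x, dim, tau):
    """Return an (N, dim) matrix whose rows are [x(t), x(t-tau), ..., x(t-(dim-1)*tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[(dim - 1 - k) * tau : (dim - 1 - k) * tau + n]
                            for k in range(dim)])

rng = np.random.default_rng(0)
t = np.arange(5000) * 0.01
# two oscillatory components plus observation noise stand in for a recording
x = (np.sin(2 * np.pi * 1.3 * t) + 0.4 * np.sin(2 * np.pi * 4.1 * t)
     + 0.1 * rng.standard_normal(t.size))

Z = delay_embed(x, dim=5, tau=8)   # reconstructed state vectors
print(Z.shape)                     # (number of points, embedding dimension)
```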