733 results for Financial analysis


Relevance: 30.00%

Publisher:

Abstract:

Objective: The number of pharmaceutical items issued on prescription is continually rising, contributing to spiralling healthcare costs. Although there are some data on the quantity, in terms of weight, of medicines returned specifically to community pharmacies, little is known about the specific details of such returns or about other destinations for wasted medications. This pilot study was designed to investigate the types and amounts of medicines returned to both general practices (GPs) and associated local community pharmacies, and to determine the reasons why these medicines were returned. Method: The study was conducted in eight community pharmacies and five GP surgeries within East Birmingham over a 4-week period. Main outcome measure: Reason for return and details of returned medication. Results: A total of 114 returns were made during the study: 24 (21.1%) to GP surgeries and 90 (78.9%) to community pharmacies. The total returns comprised 340 items, of which 42 (12.4%) were returned to GPs and 298 (87.6%) to pharmacies; the mean number of items per return was 1.8 and 3.3, respectively. Half of the returns in the study were attributed to the doctor changing or stopping the medicine; 23.7% of returns were recorded as excess supplies or clearout, often associated with a patient's death, and 3.5% of returns were related to adverse drug reactions. Cardiovascular drugs were most commonly returned, amounting to 28.5% of the total drugs returned during the study. Conclusions: The results from this pilot study indicate that unused medicines impose a significant financial burden on the National Health Service as well as a social burden on the United Kingdom population. Further studies are examining the precise nature of returned medicines and possible solutions to these issues. © Springer 2005.
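The headline figures can be cross-checked with a few lines of arithmetic; every input number below is taken from the abstract itself, and only the arithmetic is ours:

```python
# Cross-check of the return statistics reported in the abstract.

gp_returns, pharmacy_returns = 24, 90     # returns to GP surgeries / pharmacies
gp_items, pharmacy_items = 42, 298        # items within those returns

total_returns = gp_returns + pharmacy_returns   # 114
total_items = gp_items + pharmacy_items         # 340

# Shares as percentages, rounded to one decimal place
gp_return_share = round(100 * gp_returns / total_returns, 1)        # 21.1
pharmacy_item_share = round(100 * pharmacy_items / total_items, 1)  # 87.6

# Mean items per return for each destination
mean_gp = round(gp_items / gp_returns, 1)                   # 1.8
mean_pharmacy = round(pharmacy_items / pharmacy_returns, 1) # 3.3
```

The computed shares and means match the 21.1%, 87.6%, 1.8 and 3.3 reported in the results.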

Relevance: 30.00%

Publisher:

Abstract:

This chapter provides the theoretical foundation and background of the data envelopment analysis (DEA) method. We first introduce the basic DEA models. The balance of the chapter presents evidence that DEA has been extensively applied to measure the efficiency and productivity of services, including financial services (banking, insurance, securities and fund management), professional services, health services, education services, environmental and public services, energy services, logistics, tourism, information technology, telecommunications, transport, distribution, audio-visual, media, entertainment, cultural and other business services. Finally, we provide information on the use of the Performance Improvement Management Software (PIM-DEA); a free, limited version of this software and the procedure for downloading it are also included in this chapter.
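As a minimal illustration of the constant-returns-to-scale (CCR) idea behind the basic DEA models: with a single input and a single output, the efficiency linear program reduces to comparing each unit's output/input ratio against the best performer. The branch names and figures below are invented; real applications solve a linear program per decision-making unit over multiple inputs and outputs.

```python
# Single-input, single-output DEA sketch: under constant returns to scale,
# a unit's CCR efficiency is its output/input ratio divided by the best
# ratio in the sample. All data are hypothetical.

branches = {            # name: (input, e.g. staff hours; output, e.g. loans issued)
    "A": (100.0, 80.0),
    "B": (120.0, 120.0),
    "C": (150.0, 90.0),
}

best_ratio = max(y / x for x, y in branches.values())

efficiency = {name: (y / x) / best_ratio for name, (x, y) in branches.items()}
# Branch B sits on the efficient frontier (score 1.0); A and C fall short.
```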

Relevance: 30.00%

Publisher:

Abstract:

A systematic analysis is presented of the economic consequences of the abnormally high concentration of Zambia's exports on a commodity whose price is exceptionally unstable. Zambian macro-economic variables in the post-independence years are extensively documented, showing acute instability and decline, particularly after the energy price revolution and the collapse of copper prices. The relevance of stabilization policies designed to correct short-term disequilibrium is questioned. It is, therefore, a pathological case study of externally induced economic instability, complementing other studies in this area which use cross-country analysis of a few selected variables. After a survey of theory and issues pertaining to development, finance and stabilization, the emergence of domestic and foreign financial constraints on the Zambian economy is described. The world copper industry is surveyed and an examination of commodity and world trade prices concludes that copper showed the highest degree of price instability. Specific aspects of Zambia's economy identified for detailed analysis include: its unprofitable mining industry, external payments disequilibrium, a constrained government budget, potentially inflationary monetary growth, and external indebtedness. International comparisons are used extensively, but major copper exporters are subjected to closer scrutiny. An appraisal of policy options concludes the study.

Relevance: 30.00%

Publisher:

Abstract:

This thesis presents the results of an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of methods for measuring the minute magnetic flux variations at the scalp that result from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. Because of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities that could make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands commonly referred to in both academic and lay publications. Other efforts have centred on generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics giving rise to the observed time series are linear, despite a variety of reasons to suspect that nonlinearity is present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators.
One of the main objectives of this thesis is to show that a much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratio that is obtained. Because the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are necessarily extremely sensitive. The unfortunate side effect is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp the signals of interest. This problem is commonly addressed by averaging over a large number of recordings, but averaging has notable drawbacks: in particular, it is difficult to synchronise high-frequency activity that might be of interest, and such signals are often cancelled out by the averaging process. Other obstacles are the high cost and low portability of state-of-the-art multichannel machines, which have hitherto restricted the use of MEG to large institutions able to afford their procurement and maintenance. In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data.
It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
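A standard first step when treating a single observed channel with dynamical-systems methods is delay-coordinate (Takens) embedding, which reconstructs a state space from a scalar time series. The sketch below is one representative technique from that toolbox, run on a synthetic signal rather than real MEG data; it is not claimed to be the thesis's exact pipeline.

```python
import math

def delay_embed(series, dim, tau):
    """Build delay vectors (s[t], s[t-tau], ..., s[t-(dim-1)*tau]) from a
    scalar time series: the state-space reconstruction step that precedes
    nonlinear-dynamics analysis of a single channel."""
    start = (dim - 1) * tau
    return [tuple(series[t - k * tau] for k in range(dim))
            for t in range(start, len(series))]

# Synthetic single-channel signal standing in for unaveraged MEG data
signal = [math.sin(0.3 * t) for t in range(200)]

vectors = delay_embed(signal, dim=3, tau=5)
# Each vector is one point in the reconstructed 3-dimensional state space.
```

The embedding dimension and delay are free parameters; in practice they are chosen with criteria such as false nearest neighbours and the first minimum of the mutual information.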

Relevance: 30.00%

Publisher:

Abstract:

Product design decisions can have a significant impact on the financial and operational performance of manufacturing companies. Good analysis of the financial impact of design decisions is therefore required if the profitability of the business is to be maximised. The product design process can be viewed as a chain of decisions linking decisions about the concept to decisions about the detail. The idea of decision chains can be extended to include the design and operation of the 'downstream' business processes which manufacture and support the product. These chains of decisions are not independent but are interrelated in a complex manner. Dealing with the interdependencies requires a modelling approach which represents all the chains of decisions, to a level of detail not normally considered in the analysis of product design. The operational, control and financial elements of a manufacturing business constitute a dynamic system: these elements interact with each other and with external elements (i.e. customers and suppliers). Analysing the chain of decisions for such an environment requires the application of simulation techniques, not just to any one area of interest but to the whole business, i.e. an enterprise simulation. To investigate the capability and viability of enterprise simulation, an experimental 'Whole Business Simulation' system has been developed. This system combines specialist simulation elements with standard operational applications software packages to create a model that incorporates all the key elements of a manufacturing business, including its customers and suppliers. By means of a series of experiments, the performance of this system was compared with a range of existing analysis tools (i.e. DFX, capacity calculation, a shop floor simulator, and a business planner driven by a shop floor simulator).
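The coupling of operational and financial elements can be caricatured in a few lines: demand, capacity-limited production and the resulting cash position evolving together period by period. This toy loop, with entirely invented parameters, only sketches the kind of interaction an enterprise simulation models at far greater fidelity.

```python
# Toy "whole business" loop: operations (capacity) and finance (cash)
# evolve together, period by period. All parameters are invented.

capacity = 100                  # units the factory can make per period
price, unit_cost = 12.0, 7.0    # selling price and variable cost per unit
fixed_cost = 300.0              # overheads per period
cash = 1000.0                   # opening cash position

demand = [80, 130, 90, 150, 60]           # hypothetical orders per period
for orders in demand:
    produced = min(orders, capacity)      # operational constraint bites here
    cash += produced * price - produced * unit_cost - fixed_cost
    # A product design decision (say, a cheaper-to-make variant) would enter
    # this loop by changing unit_cost and capacity together, which is why
    # design and 'downstream' process decisions cannot be analysed separately.
```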

Relevance: 30.00%

Publisher:

Abstract:

The thesis investigates the value of quantitative analyses for historical studies of science through an examination of research trends in insect pest control, or economic entomology. Quantitative studies of science and historical studies of pest control are reviewed. The methodological strengths and weaknesses of bibliometric techniques are examined in a dedicated chapter; the techniques examined include productivity measures such as paper counts, and relational techniques such as co-citation and co-word analysis. Insect pest control is then described. This includes a discussion of the socio-economic basis of the concept of 'pest'; a series of classifications of pest control techniques is provided and analysed with respect to their utility for scientometric studies. The chemical and biological approaches to control are discussed as scientific and technological paradigms. Three case studies of research trends in economic entomology are provided. The first is a scientometric analysis of samples of chemical control and biological control papers, providing quantitative data on the institutional, financial, national and journal structures associated with pest control research fields. The second is a content analysis of a core journal, the Journal of Economic Entomology, over the period 1910-1985; this identifies the main research innovations and trends, in particular the changing balance between chemical and biological control. The third is an analysis of historical research trends in insecticide research, showing the rise, maturity and decline of research on many groups of compounds. These are supplemented by a collection of seven papers on scientometric studies of pest control and quantitative techniques for analysing science.
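Co-word analysis, one of the relational techniques examined in the methodology chapter, reduces at its core to counting how often keyword pairs co-occur across documents. A minimal sketch with invented keyword sets (not the thesis's actual corpus):

```python
from itertools import combinations
from collections import Counter

# Hypothetical keyword sets, one per indexed paper
papers = [
    {"DDT", "resistance", "chemical control"},
    {"biological control", "parasitoid"},
    {"DDT", "chemical control", "residues"},
]

# Co-word analysis in miniature: count keyword-pair co-occurrences
cooccurrence = Counter()
for keywords in papers:
    for pair in combinations(sorted(keywords), 2):
        cooccurrence[pair] += 1

# ("DDT", "chemical control") co-occurs in two of the three papers; mapping
# such counts over time reveals the shifting structure of a research field.
```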

Relevance: 30.00%

Publisher:

Abstract:

This thesis investigates corporate financial disclosure practices on Web sites and their impact. First, it examines the views of various Saudi user groups (institutional investors, financial analysts and private investors) on the disclosure of financial reporting on the Internet and assesses differences, if any, in the perceptions of these groups. In total, 303 individuals from the three groups responded to a questionnaire. Views were elicited regarding: users' attitudes to the Internet infrastructure in Saudi Arabia, users' sources of information about companies in Saudi Arabia, respondents' perceptions of the advantages and disadvantages of Internet financial reporting (IFR), respondents' attitudes to the quality of IFR provided by Saudi public companies, and the impact of IFR on users' information needs. Overall, it was found that the professional groups (institutional investors and financial analysts) hold similar views on many issues, while the opinions of private investors differ considerably. Second, the thesis examines the use of the Internet for the disclosure of financial and investor-related information by Saudi public companies (113 companies) and seeks to identify reasons for differences in the online disclosure practices of companies by testing the association between eight firm-specific factors and the level of online disclosure. A financial disclosure index (167 items) is used to measure public company disclosure in Saudi Arabia. The descriptive part of the study reveals that 95 (84%) of the Saudi public companies in the sample had a website and 51 (45%) had a financial information section of some description. Furthermore, none of the sample companies provided 100% of the 167 index items applicable to the company. Results of multivariate analysis show that firm size and stock market listing are significant explanatory variables for the amount of information disclosed on corporate Web sites. The thesis also finds a significant negative relationship between the proportion of institutional ownership of a company's shares and the level of IFR.
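A disclosure index of this kind scores each company as the proportion of applicable index items it actually discloses. A minimal sketch with a hypothetical four-item checklist standing in for the thesis's 167-item index:

```python
# Disclosure-index scoring in miniature: score = items disclosed on the
# company's website / items applicable to that company.
# The checklist and flags below are hypothetical.

applicable = {"annual_report", "quarterly_report", "share_price", "dividends"}
disclosed = {"annual_report", "share_price"}

score = len(disclosed & applicable) / len(applicable)   # 0.5

# Such scores become the dependent variable in the multivariate analysis,
# regressed on firm-specific factors like size and stock market listing.
```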

Relevance: 30.00%

Publisher:

Abstract:

In the wake of the global financial crisis, several macroeconomic contributions have highlighted the risks of excessive credit expansion. In particular, too much finance can have a negative impact on growth. We examine the microeconomic foundations of this argument, positing a non-monotonic relationship between leverage and firm-level productivity growth in the spirit of the trade-off theory of capital structure. A threshold regression model estimated on a sample of Central and Eastern European countries confirms that TFP growth increases with leverage until the latter reaches a critical threshold beyond which leverage lowers TFP growth. This estimate can provide guidance to firms and policy makers on identifying "excessive" leverage. We find similar non-monotonic relationships between leverage and proxies for firm value. Our results are a first step in bridging the gap between the literature on optimal capital structure and the wider macro literature on the finance-growth nexus. © 2012 Elsevier Ltd.

Relevance: 30.00%

Publisher:

Abstract:

Energy prices account for more than half of the total life cycle cost of asphalt pavements. Furthermore, energy prices have fluctuated far more than general inflation and interest rates. This makes energy price inflation an important variable that should be addressed when performing life cycle cost (LCC) studies regarding asphalt pavements. The present value of future costs is highly sensitive to the selected discount rate; the choice of discount rate is therefore the most critical element in an LCC analysis over the lifetime of a project. The objective of this paper is to present a discount rate for asphalt pavement projects as a function of the interest rate, general inflation and energy price inflation. The discount rate is defined on the basis of the portion of energy-related costs during the lifetime of the pavement, so it can reflect the financial risks related to energy prices in asphalt pavement projects. It is suggested that a discount rate sensitivity analysis for asphalt pavements in Sweden should range between –20% and 30%.
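One plausible reading of such a discount rate (our reconstruction for illustration, not necessarily the paper's exact formula) combines the Fisher relation with an inflation rate blended from general and energy price inflation, weighted by the energy-related share of life cycle cost. All figures below are invented.

```python
# Energy-weighted real discount rate: blend general and energy inflation
# by the energy share of life cycle cost, then apply the Fisher relation.
# Illustrative figures only.

interest = 0.05            # nominal interest rate
general_inflation = 0.02
energy_inflation = 0.08    # historically more volatile than general inflation
energy_share = 0.5         # energy-related portion of life cycle cost

blended_inflation = (energy_share * energy_inflation
                     + (1 - energy_share) * general_inflation)   # 0.05

discount_rate = (1 + interest) / (1 + blended_inflation) - 1     # ~0 here

# Present value of a future cost C in year t: C / (1 + discount_rate) ** t
resurfacing_cost_pv = 500_000 / (1 + discount_rate) ** 20
```

With these numbers the blended inflation exactly offsets the nominal interest rate, so discounting leaves the year-20 resurfacing cost unchanged; a higher energy share pushes the discount rate negative, inflating the present weight of future, energy-intensive costs.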

Relevance: 30.00%

Publisher:

Abstract:

This paper examines problems in the definition of the General Non-Parametric Corporate Performance (GNCP) measure and introduces a multiplicative linear programming model as an alternative model of corporate performance. We verify and test a statistically significant difference between the two models in an application to 27 UK industries using six performance ratios. Our new model is found to be a more robust performance model than the previous standard Data Envelopment Analysis (DEA) model.

Relevance: 30.00%

Publisher:

Abstract:

Purpose: In today's competitive scenario, effective supply chain management is increasingly dependent on the capabilities and performance of third-party logistics (3PL) companies. The dissemination of information technology (IT) has contributed to changing the supply chain role of 3PL companies, and IT is considered an important element influencing the performance of modern logistics companies. The purpose of this paper is therefore to explore the relationship between IT and 3PLs' performance, assuming that logistics capabilities play a mediating role in this relationship. Design/methodology/approach: Empirical evidence based on a questionnaire survey of logistics service companies operating in the Italian market was used to test a conceptual resource-based view (RBV) framework linking IT adoption, logistics capabilities and firm performance. Factor analysis and ordinary least squares (OLS) regression analysis were used to test the hypotheses. The paper is multidisciplinary in nature: approaches from management of information systems, strategy, logistics and supply chain management are combined in the analysis. Findings: The results indicate strong relationships among data-gathering technologies, transactional capabilities and firm performance, in terms of both efficiency and effectiveness. Moreover, a positive correlation between enterprise information technologies and 3PL financial performance was found. Originality/value: The paper successfully uses the concept of logistics capabilities as a mediating factor between IT adoption and firm performance. Objective measures are proposed for IT adoption and logistics capabilities, and direct and indirect relationships among the variables are tested. © Emerald Group Publishing Limited.
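The mediation claim, that IT adoption works on performance through logistics capabilities, can be illustrated in the classic Baron-Kenny fashion: regress performance on IT alone (total effect), then on IT and the mediator together (direct effect). The synthetic data below are built so that capability fully mediates, so the direct effect collapses to zero; the paper itself estimates this on survey data with factor scores, not on toy numbers.

```python
# Mediation in miniature on synthetic data: IT adoption (X) raises
# logistics capability (M), and performance (Y) depends only on M.
# Controlling for the mediator should drive X's direct effect to ~0.

X = list(range(1, 11))                          # IT adoption score
e = [0.5 if x % 2 else -0.5 for x in X]         # deterministic wobble
M = [0.8 * x + d for x, d in zip(X, e)]         # capability, driven by IT
Y = [0.5 * m for m in M]                        # performance, driven by capability

def centred(v):
    mean = sum(v) / len(v)
    return [a - mean for a in v]

cx, cm, cy = centred(X), centred(M), centred(Y)
sxx = sum(a * a for a in cx); smm = sum(a * a for a in cm)
sxm = sum(a * b for a, b in zip(cx, cm))
sxy = sum(a * b for a, b in zip(cx, cy)); smy = sum(a * b for a, b in zip(cm, cy))

total_effect = sxy / sxx                        # Y on X alone: clearly positive

# Y on X and M together: solve the 2x2 normal equations
det = sxx * smm - sxm * sxm
direct_effect = (smm * sxy - sxm * smy) / det   # ~0: capability fully mediates
```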

Relevance: 30.00%

Publisher:

Abstract:

Since its introduction in 1978, data envelopment analysis (DEA) has become one of the preeminent nonparametric methods for measuring efficiency and productivity of decision making units (DMUs). Charnes et al. (1978) provided the original DEA constant returns to scale (CRS) model, later extended to variable returns to scale (VRS) by Banker et al. (1984). These ‘standard’ models are known by the acronyms CCR and BCC, respectively, and are now employed routinely in areas that range from assessment of public sectors, such as hospitals and health care systems, schools, and universities, to private sectors, such as banks and financial institutions (Emrouznejad et al. 2008; Emrouznejad and De Witte 2010). The main objective of this volume is to publish original studies that are beyond the two standard CCR and BCC models with both theoretical and practical applications using advanced models in DEA.

Relevance: 30.00%

Publisher:

Abstract:

The central goal of this research is to explore the approach of the Islamic banking industry to defining and implementing religious compliance at the regulatory, institutional and individual levels within the Islamic Banking and Finance (IBF) industry. It also examines the discrepancies, ambiguities and paradoxes exhibited in individual and institutional behaviour in relation to the infusion and enactment of religious exigencies into compliance processes in IBF. Through the combined lenses of institutional work and a sensemaking perspective, this research portrays the practice of infusing Islamic law into Islamic banks as ambiguous and as drifting down to the institutional and actor levels. In instances of both well-codified and non-codified regulatory frameworks for Shariah compliance, ambiguity in institutional rules and in their interpretation and enactment was found to be prevalent. Individual IBF professionals performed retrospective and prospective actions to adjust role and rule boundaries, in the cases of both a Muslim and a non-Muslim country. The sensitising concept of religious compliance is the primary theoretical contribution of this research; it provides a tool for understanding what constitutes Shariah compliance and the dynamics of its implementation. It helps to explain the empirical consequences of the lack of a clear definition of Shariah compliance in the regulatory frameworks and standards available to the industry, and it addresses calls for a clear reference on what constitutes Shariah compliance in IBF, as proposed in previous studies (Hayat, Butter, & Kock, 2013; Maurer, 2003, 2012; Pitluck, 2012). The methodological and theoretical perspectives of this research are unique in their use of multi-level analysis and of approaches that blend micro and macro perspectives of the research field, illuminating and providing a more complete picture of religious compliance infusion and enactment in IBF.

Relevance: 30.00%

Publisher:

Abstract:

Purpose - The paper aims to examine the role of market orientation (MO) and innovation capability in determining business performance during an economic upturn and downturn. Design/methodology/approach - The data comprise two national-level surveys conducted in Finland in 2008, representing an economic boom, and in 2010 when the global economic crisis had hit the Finnish market. Partial least square path analysis is used to test the potential mediating effect of innovation capability on the relationship between MO and business performance during economic boom and bust. Findings - The results show that innovation capability fully mediates the performance effects of a MO during an economic upturn, whereas the mediation is only partial during a downturn. Innovation capability also mediates the relationship between a customer orientation and business performance during an upturn, whereas the mediating effect culminates in a competitor orientation during a downturn. Thus, the role of innovation capability as a mediator between the individual market-orientation components varies along the business cycle. Originality/value - This paper is one of the first studies that empirically examine the impact of the economic cycle on the relationship between strategic marketing concepts, such as MO or innovation capability, and the firm's business performance.

Relevance: 30.00%

Publisher:

Abstract:

This paper critically reviews the evolution of financial reporting in the banking sector with specific reference to the reporting of market risk and the growing use of the measure known as Value at Risk (VaR). The paper investigates the process by which VaR became 'institutionalised'. The analysis highlights a number of inherent limitations of VaR as a risk measure and questions the usefulness of published VaR disclosures, concluding that risk 'disclosure' might be more apparent than real. It also looks at some of the implications for risk reporting practice and the accounting profession more generally.
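The measure under discussion is easy to state: the 95% VaR is the loss exceeded on only 5% of past trading days. The simplest estimator behind published VaR figures is historical simulation, sketched below on an invented daily P&L series; as the paper's critique suggests, the resulting single number says nothing about the size of losses beyond it.

```python
# Historical-simulation Value at Risk (one simple convention): take the
# empirical 95th percentile of past daily losses. The P&L data are invented.

pnl = [-2.1, 0.4, 1.3, -0.7, 0.9, -3.5, 0.2, 1.1, -1.2, 0.6,
       -0.3, 2.0, -0.9, 0.8, -1.8, 0.5, 1.4, -0.6, 0.1, -2.9]

confidence = 0.95
losses = sorted(-p for p in pnl)                 # losses, ascending
index = round(confidence * len(losses)) - 1      # position of the 95% quantile
var_95 = losses[index]                           # here: 2.9
```

On 95% of comparable days the loss should not exceed `var_95`, yet the worst observation in this sample (a loss of 3.5) lies beyond it, illustrating why VaR 'disclosure' can be more apparent than real.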