22 results for Financial institutions -- Australia -- Problems, exercises, etc.
Abstract:
Since its introduction in 1978, data envelopment analysis (DEA) has become one of the preeminent nonparametric methods for measuring the efficiency and productivity of decision-making units (DMUs). Charnes et al. (1978) provided the original DEA constant returns to scale (CRS) model, later extended to variable returns to scale (VRS) by Banker et al. (1984). These ‘standard’ models are known by the acronyms CCR and BCC, respectively, and are now employed routinely in settings that range from the public sector, such as hospitals and health care systems, schools, and universities, to the private sector, such as banks and financial institutions (Emrouznejad et al. 2008; Emrouznejad and De Witte 2010). The main objective of this volume is to publish original studies that go beyond the two standard CCR and BCC models, with both theoretical and practical applications using advanced models in DEA.
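For readers new to these models, a minimal statement of the input-oriented CCR model in envelopment form (following the description above; the notation is ours) is:

\[
\min_{\theta,\lambda}\ \theta
\quad \text{s.t.} \quad
\sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io} \ (i = 1,\dots,m), \qquad
\sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro} \ (r = 1,\dots,s), \qquad
\lambda_j \ge 0,
\]

where DMU $o$ uses inputs $x_{io}$ to produce outputs $y_{ro}$, and the optimal $\theta^*$ is its efficiency score. Adding the convexity constraint $\sum_{j} \lambda_j = 1$ relaxes CRS to VRS, which turns the CCR model into the BCC model of Banker et al. (1984).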
Abstract:
The profusion of performance measurement models suggested in the Management Accounting literature of the 1990s is one illustration of the substantial changes in Management Accounting teaching materials since the publication of “Relevance Lost” in 1987. At the same time, in the general context of increasing competition and globalisation, it is widely thought that national cultural differences are tending to disappear, meaning that the management techniques used in large companies, including performance measurement and management instruments (PMS), tend to be the same irrespective of the company’s nationality or location. North American management practice is traditionally described as a contractually based model, mainly focused on financial performance information and measures (FPMs) and more shareholder-focused than French practice. In France, the literature has historically defined performance as broadly multidimensional, driven by the idea that there are no universal rules of management and that efficient management takes local culture and traditions into account. Unlike their North American counterparts, French companies are pressured more by the financial institutions that fund them than by capital markets. They can therefore pay greater attention to the long term because they are not subject to quarterly capital market objectives. Hence, management in France might be expected to rely on longer-term, more qualitative, less financial, and more multidimensional information to assess performance than its North American counterpart. The objective of this research is to investigate whether the practices of large French and US companies have changed in the way the textbooks have changed with regard to performance measurement and management, or whether cultural differences still drive differences in performance measurement and management between them. The research findings support the idea that large US and French companies share the same PMS features, influenced by ‘universal’ PM models.
Abstract:
Risk management and knowledge management have so far been studied almost independently. The evolution of risk management towards the holistic view of Enterprise Risk Management requires breaking down the barriers between organizational silos and exchanging and applying knowledge from different risk management areas. However, knowledge management has received little or no attention in risk management. This paper examines possible relationships between knowledge management constructs related to knowledge sharing and two risk management concepts: perceived quality of risk control and perceived value of enterprise risk management. From a literature review, relationships with eight knowledge management variables covering people, process, and technology aspects were hypothesised. A survey was administered to risk management employees in financial institutions. The results showed that perceived quality of risk control is significantly associated with four knowledge management variables: perceived quality of risk knowledge sharing, perceived quality of communication among people, web channel functionality, and risk management information system functionality. However, the relationships of the knowledge management variables to the perceived value of enterprise risk management are not significant. We conclude that better knowledge management is associated with better risk control, but that more effort needs to be made to break down organizational silos in order to support true Enterprise Risk Management.
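As a purely illustrative sketch of how associations between such survey constructs can be checked (the paper's actual estimation procedure is not given here; the variable names, sample size, and data below are invented), a rank correlation between two Likert-scale constructs might be computed as follows:

```python
# Hypothetical illustration: rank correlation between two survey constructs.
# Data, sample size, and variable names are invented for the sketch.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 120  # hypothetical number of risk management respondents

# Invented 5-point Likert scores for two constructs.
risk_knowledge_sharing = rng.integers(1, 6, n)
quality_of_risk_control = np.clip(
    risk_knowledge_sharing + rng.integers(-1, 2, n), 1, 5)

# Spearman's rho assesses a monotonic association between the two constructs.
rho, p = spearmanr(risk_knowledge_sharing, quality_of_risk_control)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```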
Abstract:
Nigeria is richly endowed with oil and gas resources, but the country’s continued reliance on loans from international financial institutions raises questions about the transparency and accountability with which it uses the huge revenues these two resources generate. To help Nigeria attract international capital and bolster its revenues from sales of oil and gas, a huge proportion of which continues to be used corruptly, the World Bank has encouraged the Nigerian government to subscribe to neoliberal economic policies by enlisting accounting firms and privatising state-owned enterprises. Key justifications for this have included enhancing accountability, reducing public-sector corruption, promoting market efficiency, and attracting international capital. However, this paper presents evidence of the role of accounting in the undervaluation of assets, the concealment of possible malpractice, and the subversion of the accountability that it should have delivered in the privatisation process. The assumption that accounting will enhance accountability, reduce public-sector corruption, and promote market efficiency in privatisation, and ultimately attract investment into a crony capitalist Nigerian state, appears to be an illusion created partly through the apparent legitimacy of accounting.
Abstract:
This thesis presents the results of an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of methods for measuring the minute magnetic flux variations at the scalp that result from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, its progress hindered by a variety of factors.

Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which give rise to the observed time series are linear. This is despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions.

A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to show that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings.

Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. Because the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are necessarily extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the Earth's geomagnetic field can easily swamp the signals of interest. This problem is commonly addressed by averaging over a large number of recordings, but averaging has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and such signals are often cancelled out by the averaging process. Further problems are the high cost and low portability of state-of-the-art multichannel machines. The result is that the use of MEG has hitherto been restricted to large institutions able to afford the high costs associated with procuring and maintaining these machines.
In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability to MEG analysis of a variety of methods originating from the fields of signal processing, dynamical systems, information theory, and neural networks. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps, and latent variable modelling have enjoyed extensive success in research areas ranging from financial time series modelling to the analysis of sunspot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
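As one concrete instance of the dynamical-systems viewpoint referred to above, state-space reconstruction from a single observed channel is commonly performed by time-delay embedding. The sketch below is our own illustration, with an invented signal and placeholder delay and dimension (not parameters taken from the thesis), showing the basic construction:

```python
# Sketch of time-delay (Takens) embedding for a single-channel series;
# the signal, delay (tau) and dimension (dim) are illustrative placeholders.
import numpy as np

def delay_embed(x, dim, tau):
    """Return the delay-embedding matrix: each row is
    (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Stand-in for a single-channel, unaveraged MEG trace:
# a few frequency generators plus observational noise.
t = np.arange(0, 10, 0.001)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 22 * t)
x += 0.2 * np.random.default_rng(1).standard_normal(t.size)

X = delay_embed(x, dim=3, tau=25)  # reconstructed state vectors
print(X.shape)  # (n_points, 3)
```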
Abstract:
Microfinance has developed over the last 30 years as an alternative approach in the global poverty alleviation effort. Microfinance institutions (MFIs) have the unique characteristic of facing the double bottom line objectives of outreach to the poor and financial sustainability. This study proposes a two-stage analysis to measure the performance of Islamic microfinance institutions (IMFIs) by comparing them with conventional MFIs. First, we develop a Data Envelopment Analysis (DEA) framework to measure MFIs' efficiency against these double bottom line objectives, i.e. in terms of social and financial efficiency. In the second stage, non-parametric tests are used to compare performance and to identify factors that contribute to the efficiency of IMFIs and conventional MFIs.
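A minimal sketch of such a two-stage pipeline, using toy data and assumed details (the input and output variables, the input-oriented CCR formulation, and the Mann-Whitney test are our illustrative choices, not necessarily those of the study):

```python
# Illustrative two-stage sketch with toy data: stage 1 computes
# input-oriented CCR efficiency scores by linear programming;
# stage 2 compares the two groups with a Mann-Whitney U test.
import numpy as np
from scipy.optimize import linprog
from scipy.stats import mannwhitneyu

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR score of DMU o.
    X: (m inputs, n DMUs), Y: (s outputs, n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]            # minimise theta
    A_in = np.c_[-X[:, [o]], X]            # X @ lam <= theta * x_o
    A_out = np.c_[np.zeros((s, 1)), -Y]    # Y @ lam >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

rng = np.random.default_rng(42)
n = 30                                     # toy sample of MFIs
X = rng.uniform(1, 10, (2, n))             # e.g. staff, operating cost
Y = rng.uniform(1, 10, (2, n))             # e.g. borrowers, revenue

scores = np.array([ccr_efficiency(X, Y, o) for o in range(n)])
is_islamic = np.arange(n) < 15             # toy group labels

u, p = mannwhitneyu(scores[is_islamic], scores[~is_islamic])
print(f"Mann-Whitney U = {u:.1f}, p = {p:.3f}")
```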