6 results for Non-performing loan
Abstract:
Is there evidence that market forces effectively discipline risk management behaviour within Chinese financial institutions? This study analyses information from a comprehensive sample of Chinese banks over the 1998-2008 period. Market discipline is captured through the impact of four sets of factors, namely market concentration, interbank deposits, information disclosure, and ownership structure. We find some evidence of a market disciplining effect in that: (i) higher (lower) levels of market concentration lead banks to operate with a lower (higher) capital buffer; (ii) joint-equity banks that disclose more information to the public maintain larger capital ratios; (iii) full state ownership reduces the sensitivity of changes in a bank's capital buffer to its level of risk; (iv) banks that release more transparent financial information hold more capital against their non-performing loans. © 2010 Springer Science+Business Media, LLC.
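By way of illustration only, here is a minimal sketch of the kind of fixed-effects panel regression such a study might run, relating a bank's capital buffer to the four sets of market-discipline factors. The input file and every column name (capital_buffer, concentration, interbank_deposits, disclosure_index, state_owned, npl_ratio) are hypothetical, not the authors' variables.

```python
# A hedged sketch, NOT the authors' code: fixed-effects panel regression
# of bank capital buffers on market-discipline factors. All names are
# hypothetical placeholders for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("china_banks_1998_2008.csv")  # hypothetical input file

model = smf.ols(
    "capital_buffer ~ concentration + interbank_deposits"
    " + disclosure_index + state_owned * npl_ratio"
    " + C(bank_id) + C(year)",          # bank and year fixed effects
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["bank_id"]})

print(model.summary())
```

The state_owned * npl_ratio interaction term mirrors finding (iii): under full state ownership, the capital buffer should respond less to the bank's level of risk.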
Abstract:
This study examines the relative performance of Japanese cooperative banks between 1998 and 2009, explicitly modeling non-performing loans as an undesirable output. Three key findings emerge. First, the sector is characterized by increasing returns to scale, which supports the ongoing amalgamation process within the sector. Second, although restricted in their product offerings, markets and membership base, Japanese cooperatives secured both technical progress (a positive shift in the frontier) and a decrease in technical inefficiency (distance from the frontier). Third, the analysis highlights that regulatory pressure to reduce non-performing loans will have an adverse impact on both output and performance.
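As a rough illustration of what "modeling non-performing loans as an undesirable output" can mean in practice, the sketch below solves a generic directional distance function DEA, in which desirable output is expanded and NPLs are contracted by a common proportion beta. This is a textbook formulation with synthetic numbers, not the paper's estimator.

```python
# A hedged sketch (not the paper's model) of a directional distance
# function DEA: for each bank, find the largest beta such that the
# reference technology can produce (1+beta) times its desirable output
# and (1-beta) times its NPLs with no more input. Data are synthetic.
import numpy as np
from scipy.optimize import linprog

X = np.array([[100.0], [120.0], [90.0]])   # inputs (e.g. deposits)
Y = np.array([[80.0], [95.0], [60.0]])     # desirable outputs (e.g. loans)
B = np.array([[8.0], [15.0], [5.0]])       # undesirable outputs (NPLs)
n = X.shape[0]

def directional_inefficiency(o):
    # decision variables: [beta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = -1.0                             # maximise beta
    A_ub, b_ub = [], []
    for k in range(X.shape[1]):             # sum(lam*x) <= x_o
        A_ub.append(np.concatenate(([0.0], X[:, k])))
        b_ub.append(X[o, k])
    for m in range(Y.shape[1]):             # sum(lam*y) >= (1+beta)*y_o
        A_ub.append(np.concatenate(([Y[o, m]], -Y[:, m])))
        b_ub.append(-Y[o, m])
    A_eq, b_eq = [], []
    for p in range(B.shape[1]):             # sum(lam*b) = (1-beta)*b_o
        A_eq.append(np.concatenate(([B[o, p]], B[:, p])))
        b_eq.append(B[o, p])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]                          # beta = 0 means efficient

for o in range(n):
    print(f"bank {o}: directional inefficiency = {directional_inefficiency(o):.3f}")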
Abstract:
Over the last thirty years, there has been an increased demand for better management of public sector organisations (PSOs). This requires that they are answerable not only for the inputs that they are given but also for what they achieve with these inputs (Hood 1991; Hood 1995). It is suggested that this will improve the management of the organisation through better planning and control, and the achievement of greater accountability (Smith 1995). However, such a rational approach, with clear goals and the means to measure achievement, can cause difficulties for many PSOs. These difficulties include the distinctive nature of the public sector, due to the political environment within which the public sector manager operates (Stewart and Walsh 1992), and the fact that PSOs have many stakeholders, each of whom has their own specific objectives based on their own perspective (Boyle 1995). This can result in goal ambiguity, which means that there is leeway in interpreting the results of the PSO. The National Asset Management Agency (NAMA) was set up to bring stability to the financial system by buying loans from the banks (which were, in most cases, non-performing loans). The intention was to cleanse the banks of these loans so that they could return to their normal business of taking deposits and making loans. However, the legislation also gave NAMA a wide range of other responsibilities, including responsibility for facilitating credit in the economy and protecting the interests of taxpayers. In more recent times, NAMA has been given responsibility for building social housing. This wide range of activities is a clear example of a PSO being given multiple goals which may conflict and is therefore likely to lead to goal ambiguity. This makes it very difficult to evaluate NAMA's performance, as it is attempting to meet numerous goals at the same time, and also highlights the complexity of policy making in the public sector. The purpose of this paper is to examine how NAMA dealt with goal ambiguity. This will be done through a thematic analysis of its annual reports over the last five years. The paper will contribute to the ongoing debate about the evaluation of PSOs and the complex environment within which they operate, which makes evaluation difficult as they are answerable to multiple stakeholders who have different objectives and different criteria for measuring success.
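Thematic analysis is a qualitative method, but a simple counting pass is sometimes used to support it. The sketch below tallies how often candidate theme keywords appear in each annual-report text; the file names, year range and keyword lists are all assumptions for illustration, not the paper's coding scheme.

```python
# A crude illustrative sketch (not the paper's method): keyword counts
# per report as a supporting signal for a thematic analysis. File names,
# years and theme keywords are hypothetical.
import re
from collections import Counter

themes = {
    "financial_stability": ["stability", "recovery", "debt"],
    "credit_supply": ["credit", "lending", "sme"],
    "taxpayer_interest": ["taxpayer", "return", "proceeds"],
    "social_housing": ["housing", "residential", "units"],
}

for year in range(2012, 2017):                 # hypothetical five-year window
    text = open(f"nama_annual_report_{year}.txt").read().lower()
    words = Counter(re.findall(r"[a-z]+", text))
    counts = {t: sum(words[w] for w in kws) for t, kws in themes.items()}
    print(year, counts)
```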
Abstract:
The problems related to the management of large quantum registers can be handled in the context of distributed quantum computation: unitary non-local transformations among spatially separated local processors are realized by performing local unitary transformations and exchanging classical communication. In this paper, a scheme is proposed for the implementation of universal non-local quantum gates such as the controlled NOT (CNOT) and the controlled quantum phase gate (CQPG). The system chosen for their physical implementation is a cavity-quantum-electrodynamics (CQED) system formed by two spatially separated microwave cavities and two trapped Rydberg atoms. The procedures required to realize each step of a specific non-local operation are described.
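The paper's cavity-QED scheme is not reproduced here, but the generic teleportation-based protocol for a non-local CNOT (one shared entangled pair plus two classically communicated bits, in the style of Eisert et al.) can be simulated directly. The sketch below checks that local gates, measurements and classical bits reproduce a CNOT between Alice's qubit A and Bob's qubit B.

```python
# A generic non-local CNOT via LOCC plus one shared entangled pair --
# an illustrative simulation, not the paper's Rydberg-atom scheme.
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def apply_1q(state, gate, q, n=4):
    """Apply a single-qubit gate to qubit q (big-endian ordering)."""
    U = np.array([[1.0]], dtype=complex)
    for i in range(n):
        U = np.kron(U, gate if i == q else I2)
    return U @ state

def apply_cnot(state, ctrl, tgt, n=4):
    """Apply CNOT(ctrl -> tgt) by permuting basis amplitudes."""
    new = np.zeros_like(state)
    for idx in range(len(state)):
        j = idx ^ (1 << (n - 1 - tgt)) if (idx >> (n - 1 - ctrl)) & 1 else idx
        new[j] = state[idx]
    return new

def measure(state, q, rng, n=4):
    """Projective measurement of qubit q; returns (outcome, collapsed state)."""
    bits = (np.arange(len(state)) >> (n - 1 - q)) & 1
    p1 = np.sum(np.abs(state[bits == 1]) ** 2)
    m = int(rng.random() < p1)
    out = np.where(bits == m, state, 0.0)
    return m, out / np.linalg.norm(out)

rng = np.random.default_rng(1)
psi_A = np.array([0.6, 0.8], dtype=complex)         # Alice's data qubit A
psi_B = np.array([0.8, 0.6j], dtype=complex)        # Bob's data qubit B
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # shared pair (a, b)

state = np.kron(np.kron(psi_A, bell), psi_B)        # qubit order: A, a, b, B

state = apply_cnot(state, 0, 1)                     # Alice: CNOT A -> a
m1, state = measure(state, 1, rng)                  # Alice measures a, sends m1
if m1:
    state = apply_1q(state, X, 2)                   # Bob: X on b if m1 = 1
state = apply_cnot(state, 2, 3)                     # Bob: CNOT b -> B
state = apply_1q(state, H, 2)                       # Bob: H on b
m2, state = measure(state, 2, rng)                  # Bob measures b, sends m2
if m2:
    state = apply_1q(state, Z, 0)                   # Alice: Z on A if m2 = 1

# the surviving (A, B) state should equal a direct CNOT(A -> B)
sub = state.reshape(2, 2, 2, 2)[:, m1, m2, :].reshape(4)
direct = apply_cnot(np.kron(psi_A, psi_B), 0, 1, n=2)
print(np.allclose(sub, direct))                     # True
```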
Abstract:
A computationally affordable method for introducing correlations between electrons and ions is described. The central assumption is that the ionic wavefunctions are narrow, which makes possible a moment expansion for the full density matrix. To make the problem tractable, we reduce the remaining many-electron problem to a single-electron problem by performing a trace over all electronic degrees of freedom except one. This introduces both one- and two-electron quantities into the equations of motion. Quantities depending on more than one electron are removed by making a Hartree-Fock approximation. Using the first-moment approximation, we perform a number of tight-binding simulations of the effect of an electric current on a mobile atom. The classical contribution to the ionic kinetic energy exhibits cooling and is independent of the bias. The quantum contribution exhibits strong heating, with the heating rate proportional to the bias. However, increased scattering of electrons with increasing ionic kinetic energy is not observed; capturing this effect requires the introduction of the second moment.
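The object the method builds on is the one-electron density matrix of a tight-binding model. The sketch below propagates only that mean-field object under the Liouville-von Neumann equation, d(rho)/dt = -i[H, rho]/hbar; it is far simpler than the paper's moment expansion and all parameters (chain length, hopping, filling, perturbation) are illustrative.

```python
# A minimal sketch: exact propagation of the one-electron density matrix
# of a tight-binding chain. This is only the uncorrelated baseline that
# the paper's moment expansion extends; numbers are illustrative.
import numpy as np
from scipy.linalg import expm

hbar = 1.0
n_sites = 20
t_hop = 1.0

# nearest-neighbour tight-binding Hamiltonian
H = -t_hop * (np.eye(n_sites, k=1) + np.eye(n_sites, k=-1))

# half-filled ground state: occupy the lowest n_sites // 2 eigenstates
eps, V = np.linalg.eigh(H)
occ = V[:, : n_sites // 2]
rho = occ @ occ.conj().T            # one-electron density matrix

# perturb the on-site energy at the "mobile atom" site
H_pert = H.copy()
H_pert[n_sites // 2, n_sites // 2] += 0.5

dt, steps = 0.05, 200
U = expm(-1j * H_pert * dt / hbar)  # one-step propagator
for step in range(steps):
    rho = U @ rho @ U.conj().T      # rho(t + dt) = U rho(t) U^dagger
    if step % 50 == 0:
        # occupation of the perturbed site over time
        print(step, rho[n_sites // 2, n_sites // 2].real)
```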
Abstract:
Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with a variety of SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM), two contrasting SD methods, in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, the maximum, a selection of distribution statistics and the cumulative frequencies of dry and wet spells at four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate them. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate them. GPCC performs better than SDSM in reproducing wet- and dry-day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
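One of the validation statistics named above, the cumulative frequency of wet and dry spells, is straightforward to compute from a daily series. The sketch below does so for a synthetic series; the 1 mm wet-day threshold and the gamma-distributed test data are assumptions for illustration, not values from the paper.

```python
# A small sketch of the wet/dry spell statistic: run lengths of
# consecutive wet (or dry) days and their cumulative frequencies.
# The threshold and synthetic data are assumptions.
import numpy as np
from itertools import groupby

rng = np.random.default_rng(42)
precip = rng.gamma(shape=0.4, scale=5.0, size=3650)  # synthetic daily series

WET = 1.0  # mm: assumed wet-day threshold

def spell_lengths(series, wet=True):
    """Lengths of maximal runs of wet (or dry) days."""
    flags = series >= WET if wet else series < WET
    return [sum(1 for _ in run) for val, run in groupby(flags) if val]

wet_spells = np.array(spell_lengths(precip, wet=True))
dry_spells = np.array(spell_lengths(precip, wet=False))

for L in (1, 3, 5, 10):
    print(f"P(wet spell >= {L} d) = {np.mean(wet_spells >= L):.3f}, "
          f"P(dry spell >= {L} d) = {np.mean(dry_spells >= L):.3f}")
```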