861 results for BANKING
Abstract:
In this paper we study both the level of Value-at-Risk (VaR) disclosure and the accuracy of the disclosed VaR figures for a sample of US and international commercial banks. To measure the level of VaR disclosures, we develop a VaR Disclosure Index that captures many different facets of market risk disclosure. Using panel data over the period 1996–2005, we find an overall upward trend in the quantity of information released to the public. We also find that Historical Simulation is by far the most popular VaR method. We assess the accuracy of VaR figures by studying the number of VaR exceedances and whether actual daily VaRs contain information about the volatility of subsequent trading revenues. Unlike the level of VaR disclosure, the quality of VaR disclosure shows no sign of improvement over time. We find that VaR computed using Historical Simulation contains very little information about future volatility.
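As an illustration of the backtesting idea described above, the sketch below computes a one-day Historical Simulation VaR from a rolling window of past P&L and counts exceedances. This is a minimal sketch, not the authors' implementation; the 99% confidence level, the 250-day window and the simulated trading revenues are assumptions made only for illustration.

```python
import numpy as np

def historical_var(pnl_history, confidence=0.99):
    """One-day VaR by Historical Simulation: the empirical loss quantile
    of past daily P&L, reported as a positive number."""
    return -np.quantile(pnl_history, 1.0 - confidence)

def count_exceedances(daily_pnl, daily_var):
    """An exceedance occurs when the realised daily loss exceeds the
    VaR reported for that day."""
    daily_pnl = np.asarray(daily_pnl)
    daily_var = np.asarray(daily_var)
    return int(np.sum(-daily_pnl > daily_var))

# Illustrative use on simulated trading revenues (not real bank data).
rng = np.random.default_rng(0)
pnl = rng.normal(0.0, 1.0, size=750)       # roughly three years of daily P&L
window = 250                               # one-year rolling window (assumed)
vars_ = [historical_var(pnl[t - window:t]) for t in range(window, len(pnl))]
exc = count_exceedances(pnl[window:], vars_)
print(f"exceedances: {exc} over {len(vars_)} days "
      f"(expected about {0.01 * len(vars_):.1f} at the 99% level)")
```

Comparing the observed exceedance count with the count expected at the stated confidence level is the basic check on VaR accuracy that the paper applies to the disclosed figures.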
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering that occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only impacted traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the Lecture Notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, Discrete-time Fourier, and Discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. Design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II (which is suitable for an advanced signal processing course) considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
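As a minimal illustration of two of the Part I core topics mentioned above (convolution and the Discrete Fourier Transform), the numpy sketch below is not drawn from the book's lecture notes; the signal and filter values are arbitrary.

```python
import numpy as np

def convolve_direct(x, h):
    """Direct evaluation of discrete convolution: y[n] = sum_k x[k] * h[n-k]."""
    y = np.zeros(len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(x)):
            if 0 <= n - k < len(h):
                y[n] += x[k] * h[n - k]
    return y

x = np.array([1.0, 2.0, 3.0])      # arbitrary short signal
h = np.array([0.5, 0.5])           # two-tap averaging filter
print(convolve_direct(x, h))       # [0.5, 1.5, 2.5, 1.5]
print(np.convolve(x, h))           # same result via the library routine

# Frequency-domain view of the signal via the Discrete Fourier Transform
print(np.abs(np.fft.fft(x)))       # magnitude spectrum
```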
Abstract:
The primary focus of corruption studies and anti-corruption activism has been corruption within sovereign states. However, over the last twenty years ‘globalization’, the flow of money, goods, people and ideas across borders, has threatened to overwhelm the system of sovereign states. Much activity has moved outside the control of nation states at the same time as nation states have ‘deregulated’, and in so doing have transferred power from those exercising governmental power at the nominal behest of the majority of their citizens to those with greater wealth and/or greater knowledge in markets in which knowledge is typically asymmetric. It is now recognized that many governance problems have arisen because of globalization and can only be addressed by global solutions. It must also be recognized that governance problems at the national level contribute to governance problems at the global level and vice versa. Nevertheless, many of the lessons learned in combating corruption at the national level are relevant to a globalized world: in particular, the need for ethics and leadership in addition to legal and institutional reform; the need to integrate these measures into integrity systems; and an awareness of corruption systems. These lessons are applied to areas of concern within sustainable globalization raised by the conference, including peace and security, extractive industries, climate change and sustainable banking.
Abstract:
In late 2009, Sandra Haukka secured funding from the auDA Foundation to explore what older Australians who never or rarely use the Internet (referred to as ‘non-users’) know about the types of online products and services available to them, and how they might use these products and services to improve their daily life. This project aims to support current and future strategies and initiatives by: a) exploring the extent to which non-users are aware of the types and benefits of online products and services (such as e-shopping, e-banking, e-health, social networking, and general browsing and research), as well as their interest in them; b) identifying how the Internet can improve the daily life of older Australians; c) reviewing the effectiveness of support and services designed to educate and encourage older people to engage with the Internet; and d) recommending strategies that aim to raise non-user awareness of current and emerging online products and services, and provide non-users with the skills and knowledge needed to use those products and services that they believe can improve their daily life. The Productive Ageing Centre at National Seniors Australia, and Professor Trevor Barr from Swinburne University, provided the project with in-kind support.
Abstract:
In 2001, the Malaysian Code on Corporate Governance (MCCG) became an integral part of the Bursa Malaysia Listing Rules, which require all listed firms to disclose the extent of their compliance with the MCCG. Our panel analysis of 440 firms from 1999 to 2002 finds that corporate governance reform in Malaysia has been successful, with a significant improvement in governance practices. The relationship between ownership by the Employees Provident Fund (EPF) and corporate governance has strengthened during the period subsequent to the reform, in line with the lead role taken by the EPF in establishing the Minority Shareholders Watchdog Group. The implementation of the MCCG has had a substantial effect on shareholders' wealth, increasing stock prices by an average of about 4.8%. Although there is no evidence that politically connected firms perform better, political connections do have a significantly negative effect on corporate governance, which is mitigated by institutional ownership.
Abstract:
Estimates of the half-life to convergence of prices across a panel of cities are subject to bias from three potential sources: inappropriate cross-sectional aggregation of heterogeneous coefficients, presence of lagged dependent variables in a model with individual fixed effects, and time aggregation of commodity prices. This paper finds no evidence of heterogeneity bias in annual CPI data for 17 U.S. cities from 1918 to 2006, but correcting for the “Nickell bias” and time aggregation bias produces a half-life of 7.5 years, shorter than estimates from previous studies.
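For readers unfamiliar with the half-life measure used above, the sketch below shows the standard conversion between the persistence of a price deviation and its half-life. The AR(1) framing and the 0.912 coefficient are illustrative assumptions chosen to reproduce a half-life of roughly 7.5 years, not the paper's estimates.

```python
import math

def half_life(rho):
    """Years for half of a price-level deviation to die out, assuming the
    deviation follows an AR(1) process q_t = rho * q_{t-1} + e_t, so that
    rho**T = 0.5 at the half-life T."""
    return math.log(0.5) / math.log(rho)

# An annual persistence parameter near 0.912 corresponds to a half-life of
# roughly 7.5 years, the order of magnitude reported above.
print(round(half_life(0.912), 1))   # about 7.5
```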
Abstract:
We analyze the puzzling behavior of the volatility of individual stock returns over the past few decades. The literature has provided many different explanations for the trend in volatility, and this paper tests the viability of these explanations. Virtually all current theoretical arguments for the trend in the average level of volatility over time also lend themselves to explanations of the difference in volatility levels between firms in the cross-section. We therefore focus separately on the cross-sectional and time-series explanatory power of the different proxies. We fail to find a proxy that explains both dimensions well. In particular, we find that the market-to-book ratio of Cao et al. [Cao, C., Simin, T.T., Zhao, J., 2008. Can growth options explain the trend in idiosyncratic risk? Review of Financial Studies 21, 2599–2633] tracks average volatility levels well but has no cross-sectional explanatory power. On the other hand, the low-price proxy suggested by Brandt et al. [Brandt, M.W., Brav, A., Graham, J.R., Kumar, A., 2010. The idiosyncratic volatility puzzle: time trend or speculative episodes. Review of Financial Studies 23, 863–899] has much cross-sectional explanatory power but virtually no time-series explanatory power. We also find that the different proxies do not explain the trend in volatility in the period prior to 1995 (R-squared of virtually zero), but explain rather well the trend in volatility at the turn of the millennium (1995–2005).
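To make the distinction between time-series and cross-sectional explanatory power concrete, the sketch below shows one way the two R-squared measures could be computed on a firm-year panel. The column names (year, vol, proxy) are hypothetical placeholders, and this is not the paper's estimation code.

```python
import numpy as np
import pandas as pd

def r_squared(y, x):
    """R-squared from a univariate OLS regression of y on x (with intercept)."""
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

def time_series_r2(panel):
    """How well the yearly average proxy tracks the yearly average volatility."""
    yearly = panel.groupby("year")[["vol", "proxy"]].mean()
    return r_squared(yearly["vol"].to_numpy(), yearly["proxy"].to_numpy())

def cross_sectional_r2(panel):
    """Average across years of the within-year, across-firm R-squared."""
    per_year = panel.groupby("year").apply(
        lambda g: r_squared(g["vol"].to_numpy(), g["proxy"].to_numpy()))
    return per_year.mean()
```

A proxy like the market-to-book ratio would then score high on the first measure and low on the second, while a low-price proxy would show the reverse pattern, matching the contrast described in the abstract.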