748 results for Financial information
Abstract:
Research objective: The aim of the study was to gather information for FI's management on how staff relate to the organisational change. The study serves as a basis for a critical review of the change process and for possible adjustments to the allocation of resources. Research method: The literature on organisational change formed the theoretical basis of the thesis. The data were collected through a questionnaire to FI's staff and four interviews with staff and management. Conclusions: Overall, the results show that respondents initially resisted the change, even though the change itself was seen as a positive development. Resistance was caused mainly by the style in which the change was managed and by communication that was perceived as insufficient.
Abstract:
The main objective of the study is to examine the integration of a financial perspective into a quality-based management system in a case company. The management system must produce information for management's strategic decision-making and also meet the requirements of the quality standard (ISO 9001:2000). The tool under study is the balanced scorecard. The aim of the thesis is to propose balanced scorecard financial measures for the case company. The objectives are pursued through a theoretical framework constructed for the empirical study. Empirical data are collected through participant observation, interviews and discussions. The research method is a qualitative case study. Measures were proposed for the different balanced scorecard perspectives on the basis of the empirical study, and the financial perspective was examined in more detail. The study concludes that the financial measures primarily measure strategy rather than quality. It also notes that the scorecards should be piloted before bonus pay is linked to the balanced scorecard.
Abstract:
For predicting future volatility, empirical studies find mixed results on two issues: (1) whether model-free implied volatility has more information content than Black-Scholes model-based implied volatility; (2) whether implied volatility outperforms historical volatilities. In this thesis, we address these two issues using Canadian financial data. First, we examine the information content and forecasting power of VIXC, a model-free implied volatility, and MVX, a model-based implied volatility. The GARCH in-sample test indicates that VIXC subsumes all information that is reflected in MVX. The out-of-sample examination indicates that VIXC is superior to MVX for predicting the realized volatility of the next 1, 5, 10, and 22 trading days. Second, we investigate the predictive power of VIXC against alternative volatility forecasts derived from historical index prices. We find that for time horizons shorter than 10 trading days, VIXC provides more accurate forecasts. However, for longer horizons, the historical volatilities, particularly the random walk, provide better forecasts. We conclude that VIXC cannot incorporate all information contained in historical index prices for predicting future volatility.
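As a rough illustration of the out-of-sample comparison described above, the following is a minimal sketch of how k-day realized volatility and a forecast's error might be computed. The return series and forecast values are hypothetical placeholders, not the VIXC/MVX data used in the thesis:

```python
import math

def realized_vol(returns, start, k):
    """Annualized realized volatility over the k trading days from `start`."""
    window = returns[start:start + k]
    return math.sqrt(252.0 / k * sum(r * r for r in window))

def rmse(forecasts, realized):
    """Root mean squared error of a volatility forecast series."""
    n = len(forecasts)
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecasts, realized)) / n)

# Hypothetical daily returns (placeholder data, not a real index).
rets = [0.01, -0.02, 0.015, 0.0, -0.01, 0.02, -0.005, 0.01, -0.015, 0.005]
rv5 = realized_vol(rets, 0, 5)  # 5-trading-day realized volatility
err = rmse([0.20, 0.25], [rv5, realized_vol(rets, 5, 5)])
```

A study of this kind would compute such errors for each candidate forecast (implied, historical, random walk) over each horizon and compare them.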
Abstract:
Loans are illiquid assets that can be sold in a secondary market even though buyers have no certainty about their quality. I study a model in which a lender has access to new investment opportunities when all her assets are illiquid. To raise funds, the lender may either borrow using her assets as collateral or sell them in a secondary market. Given asymmetric information about asset quality, the lender cannot recover the full value of her assets. There is then a role for the government to correct the information problem using fiscal tools.
Abstract:
This paper analyzes the measure of systemic importance ΔCoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ΔCoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
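ΔCoVaR can be sketched as the difference between two conditional quantiles of system returns: the system's value-at-risk when the institution is in its own tail, minus the system's value-at-risk when the institution is near its median. The following is a minimal empirical sketch under simplifying assumptions (plain empirical quantiles, simulated Gaussian returns with a common factor), not the estimation strategy of the paper:

```python
import random

def quantile(xs, q):
    """Empirical quantile by linear interpolation on sorted data."""
    s = sorted(xs)
    pos = q * (len(s) - 1)
    lo, hi = int(pos), min(int(pos) + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

def delta_covar(system, inst, q=0.05):
    """Empirical ΔCoVaR sketch: the system's q-quantile conditional on the
    institution being in its own q-tail, minus the system's q-quantile
    conditional on the institution being near its median."""
    var_q = quantile(inst, q)
    med_lo, med_hi = quantile(inst, 0.45), quantile(inst, 0.55)
    tail = [s for s, i in zip(system, inst) if i <= var_q]
    mid = [s for s, i in zip(system, inst) if med_lo <= i <= med_hi]
    return quantile(tail, q) - quantile(mid, q)

# Simulated returns: the institution co-moves with the system.
random.seed(1)
common = [random.gauss(0, 0.02) for _ in range(5000)]
system = [c + random.gauss(0, 0.01) for c in common]
inst = [c + random.gauss(0, 0.01) for c in common]
dcv = delta_covar(system, inst)  # negative here: system tail worsens when the bank is distressed
```

The testing procedures the paper develops would then ask whether such a difference is statistically distinguishable from zero and across institutions, which a point estimate alone cannot show.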
Abstract:
We propose and estimate a financial distress model that explicitly accounts for the interactions, or spill-over effects, between financial institutions, through the use of a spatial contiguity matrix that is built from financial network data on interbank transactions. This setup of the financial distress model allows for the empirical validation of the importance of network externalities in determining financial distress, in addition to institution-specific and macroeconomic covariates. The relevance of this specification is that it simultaneously incorporates micro-prudential factors (Basel II) as well as macro-prudential and systemic factors (Basel III) as determinants of financial distress. Results indicate that network externalities are an important determinant of the financial health of financial institutions. The parameter that measures the effect of network externalities is both economically and statistically significant, and its inclusion as a risk factor reduces the importance of firm-specific variables such as the size or degree of leverage of the financial institution. In addition, we analyze the policy implications of the network factor model for capital requirements and deposit insurance pricing.
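A spatial contiguity matrix of the kind described is commonly built by row-normalizing a matrix of interbank exposures, so each row gives the shares of an institution's exposure to its counterparties; the network term in the model is then the weighted average of neighbours' distress. The abstract does not specify the construction, so the convention and the three-bank exposure figures below are illustrative assumptions:

```python
def spatial_weights(exposures):
    """Row-normalize an interbank exposure matrix into a spatial weight
    matrix W: W[i][j] is bank i's exposure to bank j as a share of i's
    total interbank exposure (diagonal forced to zero)."""
    n = len(exposures)
    W = [[0.0] * n for _ in range(n)]
    for i in range(n):
        row_sum = sum(exposures[i][j] for j in range(n) if j != i)
        if row_sum > 0:
            for j in range(n):
                if j != i:
                    W[i][j] = exposures[i][j] / row_sum
    return W

# Hypothetical exposure matrix for three banks (not real data).
E = [[0, 40, 10],
     [20, 0, 20],
     [0, 30, 0]]
W = spatial_weights(E)

d = [0.1, 0.5, 0.2]  # hypothetical distress scores of the three banks
# Network externality term: each bank's exposure-weighted neighbour distress.
lag = [sum(W[i][j] * d[j] for j in range(3)) for i in range(3)]
```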
Abstract:
This dissertation studies the effects of Information and Communication Technologies (ICT) on the banking sector and the payments system. It provides insight into how technology-induced changes occur, by exploring both the nature and scope of the main technology innovations and evidencing their economic implications for banks and payment systems. Some parts of the dissertation are descriptive. They summarise the main technological developments in the field of finance and link them to economic policies. These parts are complemented by sections that focus on assessing the extent of technology application to banking and payment activities. Finally, it also includes some work that borrows from the economic literature on banking. The need for an interdisciplinary approach arises from the complexity of the topic and the rapid pace of change to which it is subject. The first chapter provides an overview of the influence of developments in ICT on the evolution of financial services and international capital flows. We include main indicators and discuss innovation in the financial sector, exchange rates and international capital flows. The chapter concludes with impact analysis and policy options regarding the international financial architecture, some monetary policy issues and the role of international institutions. The second chapter is a technology assessment study that focuses on the relationship between technology and money. The application of technology to payment systems is transforming the way we use money and, in some instances, is blurring the definition of what constitutes money. This chapter surveys the developments in electronic forms of payment and their relationship to the banking system. It also analyses the challenges posed by electronic money for regulators and policy makers, and in particular the opportunities created by two simultaneous processes: the Economic and Monetary Union and the increasing use of electronic payment instruments.
The third chapter deals with the implications of developments in ICT for relationship banking. The financial intermediation literature explains relationship banking as a type of financial intermediation characterised by proprietary information and multiple interactions with customers. This form of banking is important for the financing of small and medium-sized enterprises. We discuss the effects of ICT on the banking sector as a whole and then apply these developments to the case of relationship banking. The fourth chapter is an empirical study of the effects of technology on the banking business, using a sample of data from the Spanish banking industry. The design of the study is based on some of the events described in the previous chapters, and also draws from the economic literature on banking. The study shows that developments in information management have differential effects on wholesale and retail banking activities. Finally, the last chapter is a technology assessment study on electronic payment systems in Spain and the European Union. It contains an analysis of existing payment systems and ongoing or planned initiatives in Spain. It forms part of a broader project comprising a series of country-specific analyses covering ten European countries. The main issues raised across the countries serve as the starting point to discuss implications of the development of electronic money for regulation and policies, and in particular, for monetary-policy making.
Abstract:
There are a number of challenges associated with managing knowledge and information in construction organizations delivering major capital assets. These include the ever-increasing volumes of information, the loss of people to retirement or competitors, the continuously changing nature of information, the lack of methods for eliciting useful knowledge, the development of new information technologies and changes in management and innovation practices. Existing tools and methodologies for valuing intangible assets in fields such as engineering, project management, finance and accounting do not fully address the issues associated with the valuation of information and knowledge. Information is rarely recorded in a way that allows a document to be valued, either when produced or when subsequently retrieved and re-used. In addition, there is a wealth of tacit personal knowledge which, if codified into documentary information, may prove very valuable to operators of the finished asset or future designers. This paper addresses the problem of information overload and identifies the differences between data, information and knowledge. An exploratory study was conducted with a leading construction consultant, examining three perspectives (business, project management and document management) through structured interviews, and specifically how to value information in practical terms. Major challenges in information management are identified. A through-life Information Evaluation Methodology (IEM) is presented to reduce information overload and to make information more valuable in the future.
Abstract:
This paper is concerned with the use of a genetic algorithm to select financial ratios for corporate distress classification models. For this purpose, the fitness value associated with a set of ratios is made to reflect the requirements of maximizing the amount of information available to the model and minimizing the collinearity between the model inputs. A case study involving 60 failed and continuing British firms in the period 1997-2000 is used for illustration. The classification model based on ratios selected by the genetic algorithm compares favorably with a model employing ratios usually found in the financial distress literature.
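A fitness function of the kind described, rewarding information content while penalizing collinearity, might look like the following sketch. The per-ratio information scores, correlation matrix and penalty weight are hypothetical placeholders, not the paper's actual specification:

```python
def fitness(subset, info, corr, penalty=1.0):
    """Fitness of a candidate ratio subset (a 0/1 mask): total information
    content of the selected ratios minus a penalty on their average
    pairwise absolute correlation (the collinearity term)."""
    selected = [i for i, bit in enumerate(subset) if bit]
    if len(selected) < 2:
        return 0.0
    gain = sum(info[i] for i in selected)
    pairs = [(i, j) for i in selected for j in selected if i < j]
    collin = sum(abs(corr[i][j]) for i, j in pairs) / len(pairs)
    return gain - penalty * collin

# Hypothetical per-ratio information scores and correlation matrix.
info = [0.9, 0.8, 0.7, 0.6]
corr = [[1.0, 0.9, 0.1, 0.2],
        [0.9, 1.0, 0.2, 0.1],
        [0.1, 0.2, 1.0, 0.3],
        [0.2, 0.1, 0.3, 1.0]]
# Two highly informative but collinear ratios vs. a diverse pair:
f_collinear = fitness([1, 1, 0, 0], info, corr)  # 1.7 - 0.9 = 0.8
f_diverse = fitness([1, 0, 1, 0], info, corr)    # 1.6 - 0.1 = 1.5
```

A genetic algorithm would evolve a population of such 0/1 masks by selection, crossover and mutation, so a diverse pair of ratios can outrank a collinear pair even when the latter is individually more informative.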
Abstract:
Despite continuing developments in information technology and the growing economic significance of the emerging Eastern European, South American and Asian economies, international financial activity remains strongly concentrated in a relatively small number of international financial centres. That concentration of financial activity requires a critical mass of office occupation and creates demand for high-specification, high-cost space. The demand for that space is increasingly linked to the fortunes of global capital markets. That linkage has been emphasised by developments in real estate markets, notably the development of global real estate investment, innovation in property investment vehicles and the growth of debt securitisation. The resultant interlinking of occupier, asset, debt and development markets within and across global financial centres is a source of potential volatility and risk. The paper sets out a broad conceptual model of the linkages and their implications for systemic market risk and presents preliminary empirical results that provide support for the model proposed.
Abstract:
We consider the finite sample properties of model selection by information criteria in conditionally heteroscedastic models. Recent theoretical results show that certain popular criteria are consistent, in that they will select the true model asymptotically with probability 1. To examine the empirical relevance of this property, Monte Carlo simulations are conducted for a set of non-nested data generating processes (DGPs), with the set of candidate models consisting of all model types used as DGPs. In addition, not only is the best model considered but also those with similar values of the information criterion, called close competitors, thus forming a portfolio of eligible models. To supplement the simulations, the criteria are applied to a set of economic and financial series. In the simulations, the criteria are largely ineffective at identifying the correct model, either as best or as a close competitor, the parsimonious GARCH(1,1) model being preferred for most DGPs. In contrast, asymmetric models are generally selected to represent actual data. This leads to the conjecture that the parameterizations of processes commonly used to model heteroscedastic data behave more similarly than may be imagined, and that more attention needs to be paid to the behaviour of the standardized disturbances of such models, both in simulation exercises and in empirical modelling.
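The mechanics of information-criterion selection with close competitors can be sketched as follows. The log-likelihoods, parameter counts and the 2-unit closeness threshold are hypothetical, not values taken from the paper's simulations:

```python
import math

def aic(loglik, k):
    """Akaike information criterion: 2k - 2*log-likelihood."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion: k*ln(n) - 2*log-likelihood."""
    return k * math.log(n) - 2 * loglik

# Hypothetical fitted log-likelihoods and parameter counts per model.
candidates = {
    "ARCH(1)":    (-1412.3, 3),
    "GARCH(1,1)": (-1398.7, 4),
    "GJR-GARCH":  (-1397.9, 5),
}
n = 1000  # sample size
aics = {m: aic(ll, k) for m, (ll, k) in candidates.items()}
bics = {m: bic(ll, k, n) for m, (ll, k) in candidates.items()}
best = min(aics, key=aics.get)
# "Close competitors": models within 2 AIC units of the best model.
close = [m for m in aics if aics[m] - aics[best] <= 2.0]
```

Here the richer GJR-GARCH fits slightly better but is a close competitor rather than the winner; the extra parameter is penalized, which mirrors the abstract's finding that the parsimonious GARCH(1,1) tends to be preferred.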
Abstract:
‘Bilingual’ documents, with text in both Demotic and Greek, can be of several sorts, ranging from complete translations of the same information (e.g. Ptolemaic decrees) to those where the information presented in the two languages is complementary (e.g. mummy labels). The texts discussed in this paper consist of a number of examples of financial records where a full account in one language (L1) is annotated with brief pieces of information in a second language (L2). These L2 ‘tags’ are designed to facilitate extraction of summary data at another level of the administration, functioning in a different language, and probably also to make the document accessible to those who are not literate in the L1.
Abstract:
The financial crisis of 2007–2009 and the resultant pressures exerted on policymakers to prevent future crises have precipitated coordinated regulatory responses globally. A key focus of the new wave of regulation is to ensure the removal of practices now deemed problematic with new controls for conducting transactions and maintaining holdings. There is increasing pressure on organizations to retire manual processes and adopt core systems, such as Investment Management Systems (IMS). These systems facilitate trading and ensure transactions are compliant by transcribing regulatory requirements into automated rules and applying them to trades. The motivation of this study is to explore the extent to which such systems may enable the alteration of previously embedded practices. We researched implementations of an IMS at eight global financial organizations and found that overall the IMS encourages responsible trading through surveillance, monitoring and the automation of regulatory rules and that such systems are likely to become further embedded within financial organizations. We found evidence that some older practices persisted. Our study suggests that the institutionalization of technology-induced compliant behaviour is still uncertain.
Abstract:
An efficient market incorporates news into prices immediately and fully. Tests for efficiency in financial markets have been undermined by information leakage. We test for efficiency in sports betting markets – real-world markets where news breaks remarkably cleanly. Applying a novel identification to high-frequency data, we investigate the reaction of prices to goals scored on the ‘cusp’ of half-time. This strategy allows us to separate the market's response to major news (a goal), from its reaction to the continual flow of minor game-time news. On our evidence, prices update swiftly and fully.
Abstract:
Purpose – Investors are now able to analyse more noise-free news to inform their trading decisions than ever before. Their expectation that more information means better performance is not supported by previous psychological experiments, which argue that too much information actually impairs performance. The purpose of this paper is to examine whether the degree of information explicitness improves stock market performance.
Design/methodology/approach – An experiment is conducted in a computer laboratory to examine a trading simulation manipulated from a real market shock. Participants' performance efficiency and effectiveness are measured separately.
Findings – The results indicate that the explicitness of information neither improves nor impairs participants' performance effectiveness from the perspectives of returns, share and cash positions, and trading volumes. However, participants' performance efficiency is significantly affected by information explicitness.
Originality/value – The novel approach and findings of this research add to the knowledge of the impact of information explicitness on the quality of decision making in a financial market environment.