997 results for Financial Econometrics
Abstract:
The importance of financial market reforms in combating corruption has been highlighted in the theoretical literature but has not been systematically tested empirically. In this study we provide a first pass at testing this relationship, using both linear and nonmonotonic specifications of the relationship between corruption and financial intermediation. Our study finds a negative and statistically significant impact of financial intermediation on corruption. Specifically, the results imply that a one standard deviation increase in financial intermediation is associated with a decrease in corruption of 0.20 points, or 16 percent of the standard deviation in the corruption index, and this relationship is shown to be robust to a variety of specification changes, including: (i) different sets of control variables; (ii) different econometric techniques; (iii) different sample sizes; (iv) alternative corruption indices; (v) removal of outliers; (vi) different sets of panels; and (vii) allowing for cross-country interdependence (contagion effects) in corruption.
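The headline effect can be read off a linear specification of the kind the abstract implies (notation ours, not the paper's):

\[
CORR_{it} = \alpha + \beta\, FI_{it} + X_{it}'\gamma + \varepsilon_{it},
\qquad
\hat{\beta}\,\sigma_{FI} = -0.20 \approx -0.16\,\sigma_{CORR},
\]

so the reported numbers jointly imply a corruption-index standard deviation of roughly 0.20 / 0.16 ≈ 1.25 points.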
Abstract:
In this paper we propose a subsampling estimator for the distribution of statistics diverging at either known or unknown rates when the underlying time series is strictly stationary and strong mixing. Based on our results we provide a detailed discussion of how to estimate extreme order statistics with dependent data and present two applications to assessing financial market risk. Our method performs well in estimating Value at Risk and provides a superior alternative to Hill's estimator in operationalizing Safety First portfolio selection.
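A minimal Python sketch of the subsampling idea, assuming an empirical-quantile VaR estimator and a rule-of-thumb block length; the paper's actual procedure also normalizes the subsample statistics by the (known or estimated) divergence rate, which is omitted here:

```python
import numpy as np

def subsample_var(x, q=0.01, b=None):
    """Subsample distribution of an empirical Value-at-Risk estimator.

    x : returns, assumed strictly stationary and strong mixing
    q : tail probability of the VaR
    b : block length; n**(2/3) is a common rule of thumb
    """
    x = np.asarray(x)
    n = len(x)
    b = b or int(n ** (2 / 3))
    var_hat = np.quantile(x, q)                 # full-sample estimate
    # Recompute the statistic on every overlapping block of length b.
    reps = np.array([np.quantile(x[i:i + b], q) for i in range(n - b + 1)])
    return var_hat, reps

# Toy usage: a 90% subsampling band for the 1% VaR of heavy-tailed returns.
rng = np.random.default_rng(0)
returns = 0.01 * rng.standard_t(df=4, size=2000)
var_hat, reps = subsample_var(returns)
lo, hi = np.quantile(reps, [0.05, 0.95])
print(f"VaR(1%) = {var_hat:.4f}, subsample band [{lo:.4f}, {hi:.4f}]")
```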
Abstract:
This paper analyzes the measure of systemic importance ΔCoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ΔCoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
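For reference, CoVaR is defined by Adrian and Brunnermeier as the value at risk of the system conditional on institution i being at its own VaR, and ΔCoVaR is the difference between that and the CoVaR conditional on the institution's median state:

\[
\Pr\!\left( X^{system} \le \mathrm{CoVaR}_q^{system\mid i} \;\middle|\; X^{i} = \mathrm{VaR}_q^{i} \right) = q,
\qquad
\Delta\mathrm{CoVaR}_q^{system\mid i} = \mathrm{CoVaR}_q^{system\mid i} - \mathrm{CoVaR}_q^{system\mid i,\,median}.
\]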
Abstract:
We propose and estimate a financial distress model that explicitly accounts for the interactions or spill-over effects between financial institutions, through the use of a spatial contiguity matrix that is built from financial network data on interbank transactions. This setup of the financial distress model allows for the empirical validation of the importance of network externalities in determining financial distress, in addition to institution-specific and macroeconomic covariates. The relevance of this specification is that it simultaneously incorporates micro-prudential factors (Basel II) as well as macro-prudential and systemic factors (Basel III) as determinants of financial distress. Results indicate that network externalities are an important determinant of the financial health of financial institutions. The parameter that measures the effect of network externalities is both economically and statistically significant, and its inclusion as a risk factor reduces the importance of firm-specific variables such as the size or degree of leverage of the financial institution. In addition, we analyze the policy implications of the network factor model for capital requirements and deposit insurance pricing.
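Schematically (our notation, not necessarily the paper's exact specification), such a model has a spatial-autoregressive form in which distress y spills over through the interbank weight matrix W:

\[
y = \rho\, W y + X\beta + \varepsilon,
\]

where the rows of W are built from interbank exposures, X collects the institution-specific (micro-prudential) and macroeconomic covariates, and ρ is the network-externality parameter whose significance the paper reports.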
Abstract:
A number of methods of evaluating the validity of interval forecasts of financial data are analysed, and illustrated using intraday FTSE100 index futures returns. Some existing interval forecast evaluation techniques, such as the Markov chain approach of Christoffersen (1998), are shown to be inappropriate in the presence of periodic heteroscedasticity. Instead, we consider a regression-based test, and a modified version of Christoffersen's Markov chain test for independence, and analyse their properties when the financial time series exhibit periodic volatility. These approaches lead to different conclusions when interval forecasts of FTSE100 index futures returns generated by various GARCH(1,1) and periodic GARCH(1,1) models are evaluated.
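As context for the tests being compared, a minimal Python sketch of Christoffersen's (1998) Markov-chain independence test applied to a 0/1 sequence of interval-forecast violations; this is the baseline version that the paper shows to be inappropriate under periodic heteroscedasticity, and it assumes all four transition counts are positive:

```python
import numpy as np
from scipy.stats import chi2

def christoffersen_independence(hits):
    """LR test of violation independence against a first-order Markov chain."""
    hits = np.asarray(hits, dtype=int)
    prev, curr = hits[:-1], hits[1:]
    n00 = np.sum((prev == 0) & (curr == 0))
    n01 = np.sum((prev == 0) & (curr == 1))
    n10 = np.sum((prev == 1) & (curr == 0))
    n11 = np.sum((prev == 1) & (curr == 1))
    pi01 = n01 / (n00 + n01)              # P(violation | none yesterday)
    pi11 = n11 / (n10 + n11)              # P(violation | violation yesterday)
    pi = (n01 + n11) / (n00 + n01 + n10 + n11)
    ll_null = (n00 + n10) * np.log(1 - pi) + (n01 + n11) * np.log(pi)
    ll_alt = (n00 * np.log(1 - pi01) + n01 * np.log(pi01)
              + n10 * np.log(1 - pi11) + n11 * np.log(pi11))
    lr = -2.0 * (ll_null - ll_alt)
    return lr, chi2.sf(lr, df=1)          # asymptotically chi-squared(1)

# Clustered violations, as periodic volatility tends to produce: rejects.
hits = np.zeros(250, dtype=int); hits[100:106] = 1
print(christoffersen_independence(hits))
```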
Abstract:
This paper presents a methodology to estimate and identify different kinds of economic interaction, whenever these interactions can be established in the form of spatial dependence. First, we apply the semi-parametric approach of Chen and Conley (2001) to the estimation of reaction functions. The methodology is then applied to the analysis of financial providers in Thailand. Based on a sample of financial institutions, we provide an economic framework to test whether the actual spatial pattern is compatible with strategic competition (local interactions) or social planning (global interactions). Our estimates suggest that the provision of credit access by commercial banks and suppliers is determined by spatial competition, while the Thai Bank of Agriculture and Agricultural Cooperatives is distributed as in a social planner's problem.
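Schematically (notation ours), the object estimated is a spatial reaction function in which provider i's decision responds to the distance-weighted decisions of its neighbours:

\[
y_i = x_i'\beta + g\!\Big( \sum_{j \ne i} w_{ij}\, y_j \Big) + \varepsilon_i,
\]

with weights w_{ij} declining in geographic distance; the sign and shape of the estimated interaction g is what separates strategic competition (local interactions) from a social planner's allocation (global interactions).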
Abstract:
This article focuses on the deviations from normality of stock returns before and after a financial liberalisation reform, and shows the extent to which inference based on statistical measures of stock market efficiency can be affected by not controlling for breaks. Drawing on recent advances in the econometrics of structural change, it compares the distribution of the returns of five East Asian emerging markets when breaks in the mean and variance are either (i) imposed using certain official liberalisation dates or (ii) detected non-parametrically using a data-driven procedure. The results suggest that measuring deviations from normality of stock returns with no provision for potentially existing breaks introduces substantial bias. This is likely to severely affect any inference based on the corresponding descriptive or test statistics.
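A toy Python illustration of the bias described, assuming a single known break in variance: pooling across the break makes returns look strongly non-normal even though each regime is Gaussian (function and data are illustrative, not the paper's procedure):

```python
import numpy as np
from scipy.stats import jarque_bera

def normality_with_break(returns, break_idx):
    """Jarque-Bera statistics with and without allowing a break at break_idx."""
    return {"full sample": jarque_bera(returns),
            "pre-break": jarque_bera(returns[:break_idx]),
            "post-break": jarque_bera(returns[break_idx:])}

# Volatility jumps at t = 500, as after a liberalisation-type event.
rng = np.random.default_rng(1)
r = np.concatenate([rng.normal(0, 1.0, 500), rng.normal(0, 2.5, 500)])
print(normality_with_break(r, 500))   # full sample rejects; subsamples do not
```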
Abstract:
The literature on bond markets and interest rates has focused largely on the term structure of interest rates, specifically, on the so-called expectations hypothesis. At the same time, little is known about the nature of the spread of interest rates in the money market beyond the fact that such spreads are generally unstable. However, with the evolution of complex financial instruments, it has become imperative to identify the time series process that can help one accurately forecast such spreads into the future. This article explores the nature of the time series process underlying the spread between three-month and one-year US rates, and concludes that the movements in this spread over time are best captured by a GARCH(1,1) process. It also suggests the use of a relatively long-term measure of interest rate volatility as an explanatory variable. This exercise has gained added importance in view of the evidence that GARCH-based estimates of option prices consistently outperform the corresponding estimates based on the stylized Black-Scholes algorithm.
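For reference, the GARCH(1,1) process referred to, written for the spread s_t with the simplest possible mean equation (the article's exact conditional-mean specification may differ):

\[
s_t = \mu + \varepsilon_t, \qquad \varepsilon_t = \sigma_t z_t, \quad z_t \sim \mathrm{i.i.d.}(0,1),
\qquad
\sigma_t^2 = \omega + \alpha\, \varepsilon_{t-1}^2 + \beta\, \sigma_{t-1}^2 .
\]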
Abstract:
The role, modelling and management of financial risks have become increasingly prominent over recent decades, in theory and in financial practice alike, and the inadequate assessment of risks was among the causes contributing to the eruption of the financial crisis that began in 2007. One lesson of the crisis is that although mathematics and physics have supplied an exceptionally deep methodological apparatus for quantifying risks, applying these results in finance succeeds only if the assumptions and limits of the models are precisely understood. This paper introduces the main concepts of derivatives pricing, no-arbitrage pricing and risk-neutral valuation, reviews and quantifies the risks of some derivative products, and presents the sources of uncertainty that call the objectivity of valuation into question. I argue that the assumptions of the Black-Scholes and Merton model are violated at several points, so pricing cannot be fully purged of risk preferences; in practice, all the risks stemming from the gap between reality and the model are priced into the volatility parameter.
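The two principles named in the abstract can be stated compactly: under no arbitrage there exists a risk-neutral measure Q under which a derivative's value is its discounted expected payoff, and in the Black-Scholes-Merton model this expectation yields the familiar closed form for a European call:

\[
V_0 = e^{-rT}\, \mathbb{E}^{\mathbb{Q}}\!\left[ \mathrm{payoff}(S_T) \right],
\qquad
C_0 = S_0 N(d_1) - K e^{-rT} N(d_2),
\quad
d_{1,2} = \frac{\ln(S_0/K) + (r \pm \sigma^2/2)\, T}{\sigma \sqrt{T}} .
\]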
Abstract:
The aim of this article is to give an overview of some major milestones of the process set in motion by Black, Scholes and Merton's papers on option pricing in the early 1970s, a process that simultaneously revolutionized the developed Western financial markets and financial theory. The review compares the development of financial theory inside and outside Hungary over the last three decades, starting with the Black-Scholes revolution: problems such as the term structure of interest rate volatilities, a focus of much international research, have not received proper attention among Hungarian economists. The article surveys no-arbitrage pricing, the partial differential equation approach and the related numerical techniques, such as lattice methods, in the pricing of financial derivatives, and reviews the relevant concepts of the martingale approach, with special attention to the HJM framework for the evolution of interest rates. The idea that volatility and correlation can be traded opens a new horizon for the Hungarian capital market.
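As a concrete instance of the lattice methods the overview mentions, a short Cox-Ross-Rubinstein binomial pricer in Python (a standard textbook sketch, not code from the article):

```python
import math

def crr_call(S0, K, r, sigma, T, n=500):
    """European call priced on a Cox-Ross-Rubinstein binomial lattice."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))      # up factor
    d = 1 / u                                # down factor
    q = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Terminal payoffs, then backward induction under the measure q.
    v = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for _ in range(n):
        v = [disc * (q * v[j + 1] + (1 - q) * v[j]) for j in range(len(v) - 1)]
    return v[0]

print(crr_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0))  # ~10.45
```

As n grows the lattice price converges to the Black-Scholes value, which is the link between the PDE, lattice and martingale views the article surveys.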
Abstract:
Savings and investments in the American money market by emerging countries, primarily China, financed the excessive consumption of the United States in the early 2000s, which indirectly led to a global financial crisis that started in the real-estate mortgage market. Balance-disrupting processes began on the American financial market that contradicted all previously known equilibrium theories of every school of economics, and economics has yet to produce models or empirical theories for this new disequilibrium; this is why the outbreak of the crisis could not be prevented, or at least predicted. The question is to what extent existing market theories, calculation methods and the latest financial products can be held responsible for the new situation. This paper studies the influence of the efficient-market and modern portfolio theories, as well as Li's copula function, on the American investment market. Naturally, the issues of moral hazard and greed, credit ratings and shareholder control, limited liability and market regulation cannot be ignored. In summary, the author outlines potential measures that could be applied to prevent a new crisis, defines new directions for economic research and draws conclusions for Hungarian economic policy.
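For reference, Li's Gaussian copula couples two marginal default-time distributions u and v through a single correlation parameter:

\[
C_\rho(u, v) = \Phi_2\!\left( \Phi^{-1}(u), \Phi^{-1}(v);\, \rho \right),
\]

where Φ is the standard normal cdf and Φ_2 the bivariate normal cdf; compressing all dependence between defaults into the one parameter ρ is the simplification most often blamed in the pricing of structured credit.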
Abstract:
The author responds to critical observations levelled at financial mathematics in connection with the financial crisis. In his view, no negative effect of financial mathematics can be detected in Hungary's financial problems, since financial decisions in Hungary were made without any serious quantitative foundation, largely on political grounds, so the blame must go to the economic policymakers involved in that politics. The main area of application of mathematical models is accordingly not the underpinning of specific financial decisions but, far more, education.
Resumo:
A Szolvencia II néven említett új irányelv elfogadása az Európai Unióban új helyzetet teremt a biztosítók tőkeszükséglet-számításánál. A tanulmány a biztosítók működését modellezve azt elemzi, hogyan hatnak a biztosítók állományának egyes jellemzői a tőkeszükséglet értékére egy olyan elméleti modellben, amelyben a tőkeszükséglet-értékek a Szolvencia II szabályok alapján számolhatók. A modellben biztosítási illetve pénzügyi kockázati "modul" figyelembevételére kerül sor külön-külön számolással, illetve a két kockázatfajta közös modellben való együttes figyelembevételével (a Szolvencia II eredményekkel való összehasonlításhoz). Az elméleti eredmények alapján megállapítható, hogy a tőkeszükségletre vonatkozóan számolható értékek eltérhetnek e két esetben. Az eredmények alapján lehetőség van az eltérések hátterében álló tényezők tanulmányozására is. ____ The new Solvency II directive results in a new environment for calculating the solvency capital requirement of insurance companies in the European Union. By modelling insurance companies the study analyses the impact of certain characteristics of insurance population on the solvency capital based on Solvency II rules. The model includes insurance and financial risk module by calculating solvency capital for the given risk types separately and together, respectively. Based on the theoretical results the difference between these two approaches can be observed. Based on the results the analysis of factors in°uencing the differences is also possible.
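The separate-versus-joint comparison mirrors the Solvency II standard formula, which aggregates module-level capital charges through a prescribed correlation rather than from a joint model (subscripts ours):

\[
SCR = \sqrt{ SCR_{ins}^2 + 2\rho\, SCR_{ins}\, SCR_{fin} + SCR_{fin}^2 },
\]

a square-root formula that agrees with capital computed from the joint distribution only under special distributional assumptions, which is one source of the differences the model documents.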
Abstract:
We consider a situation in which agents have mutual claims on each other, summarized in a liability matrix. Agents' assets might be insufficient to satisfy their liabilities, leading to defaults. In case of default, bankruptcy rules are used to specify the way agents are going to be rationed. A clearing payment matrix is a payment matrix consistent with the prevailing bankruptcy rules that satisfies limited liability and priority of creditors. Since clearing payment matrices and the corresponding values of equity are not uniquely determined, we provide bounds on the possible levels equity can take. Unlike the existing literature, which studies centralized clearing procedures, we introduce a large class of decentralized clearing processes. We show the convergence of any such process in finitely many iterations to the least clearing payment matrix. When the unit of account is sufficiently small, all decentralized clearing processes lead essentially to the same value of equity as a centralized clearing procedure. As a policy implication, it is not necessary to collect and process all the sensitive data of all the agents simultaneously and run a centralized clearing procedure.
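A minimal Python sketch of such a process under the proportional bankruptcy rule (the paper covers a much larger class of rules and works with a discrete unit of account, under which convergence is finite): starting from zero payments and repeatedly letting every agent pay out what it can approaches the least clearing payment matrix monotonically.

```python
import numpy as np

def least_clearing(L, e, max_iter=100_000):
    """Iterate toward the least clearing payment vector.

    L : liability matrix, L[i, j] = nominal amount agent i owes agent j
    e : vector of outside assets (endowments), assumed non-negative
    Limited liability and creditor priority hold because nobody ever
    pays out more than min(total owed, total available).
    """
    L = np.asarray(L, dtype=float)
    e = np.asarray(e, dtype=float)
    pbar = L.sum(axis=1)                        # total nominal liabilities
    with np.errstate(divide="ignore", invalid="ignore"):
        Pi = np.where(pbar[:, None] > 0, L / pbar[:, None], 0.0)
    p = np.zeros_like(pbar)                     # start from zero payments
    for _ in range(max_iter):
        p_new = np.minimum(pbar, e + Pi.T @ p)  # pay in full, or all you have
        if np.allclose(p_new, p):
            break
        p = p_new                               # monotone: p_new >= p
    return p                                    # payment matrix: p[:, None] * Pi

# Two agents owing each other; the first defaults, paying 1.5 of the 2 owed.
print(least_clearing(L=[[0, 2], [1, 0]], e=[0.5, 0]))   # -> [1.5 1.]
```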