917 results for Inflation (Finance) - Mathematical models
Abstract:
Financial modelling in the area of option pricing involves understanding the correlations between asset prices and buy/sell movements in order to reduce investment risk. Such activities depend on financial analysis tools being available to the trader, with which they can make rapid and systematic evaluations of buy/sell contracts. In turn, analysis tools rely on fast numerical algorithms for the solution of financial mathematical models. There are many financial activities apart from the buying and selling of shares. The main aim of this chapter is to discuss a distributed algorithm for the numerical solution of a European option; both linear and non-linear cases are considered. The algorithm is based on the Laplace transform and its numerical inverse. The scalability of the algorithm is examined, and numerical tests are used to demonstrate its effectiveness for financial analysis. Time-dependent functions for volatility and interest rates are also discussed, and applications of the algorithm to the non-linear Black-Scholes equation, where the volatility and the interest rate are functions of the option value, are included. Some qualitative results on the convergence behaviour of the algorithm are examined. The chapter also examines various computational issues of the Laplace transformation method in terms of distributed computing, and introduces the idea of using a two-level temporal mesh to achieve distributed computation along the temporal axis. Finally, the chapter ends with some conclusions.
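To illustrate the kind of numerical Laplace inversion such algorithms rest on, the following is a minimal sketch of the Gaver-Stehfest method applied to a transform with a known inverse. The method choice, the number of terms, and the test function are illustrative assumptions, not the chapter's specific algorithm.

```python
from math import factorial, log, exp

def stehfest_coefficients(N):
    """Gaver-Stehfest weights V_k for an even number of terms N."""
    half = N // 2
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            s += (j ** half * factorial(2 * j)
                  / (factorial(half - j) * factorial(j) * factorial(j - 1)
                     * factorial(k - j) * factorial(2 * j - k)))
        V.append((-1) ** (k + half) * s)
    return V

def invert_laplace(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s) via Gaver-Stehfest."""
    ln2 = log(2.0)
    V = stehfest_coefficients(N)
    return (ln2 / t) * sum(Vk * F((k + 1) * ln2 / t) for k, Vk in enumerate(V))

# Sanity check against a transform with a known inverse:
# F(s) = 1/(s + a)  <->  f(t) = exp(-a*t)
a = 0.5
approx = invert_laplace(lambda s: 1.0 / (s + a), t=1.0)
exact = exp(-a * 1.0)
```

The alternating, rapidly growing weights make the method sensitive to rounding error, which is why N is usually kept modest (around 10-14) in double precision.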
Abstract:
Since the establishment of the first stock exchanges as marketplaces for fungible goods, market participants and researchers have sought to explain how market prices come about. Over time, various models have been developed, foremost among them the neoclassical Capital Asset Pricing Model (CAPM). Neoclassical theory views the actor in financial markets as an emotionless and strictly rational decision-maker, the so-called homo oeconomicus. Psychological factors influencing price formation are disregarded. With behavioral finance, a new branch has developed to explain stock prices and their movements. Behavioral finance breaks open the narrow neoclassical view and assumes that psychological effects influence the decisions of financial actors, leading in part to irrationally and emotionally driven price changes. One of the main problems of behavioral finance, however, lies in the lack of formal measurability and testability of the individual psychological effects. Unlike the CAPM, whose individual parameters can be determined mathematically, behavioral finance consists essentially of psychological definitions of price-influencing effects. The precise direction and intensity of these effects cannot be determined for lack of suitable models. The aim of this work is to derive a modification of the CAPM that makes it possible to supplement neoclassical assumptions with the findings of behavioral finance. Using the technical analysis of market prices, an attempt is made to render the effects of behavioral finance formally representable and computable. Practitioners use technical analysis to infer the moods and intentions of market participants from price charts; a scientific foundation has so far been lacking.
Building on the findings of behavioral finance and technical analysis, the classical CAPM is extended by psychological factors through the definition of a multi-beta CAPM (Behavioral-Finance-CAPM) into which psychologically grounded parameters from technical analysis are incorporated. Following the CAPM test of FAMA and FRENCH (1992), the classical CAPM and the Behavioral-Finance-CAPM are tested, and the psychological explanatory power of technical analysis is examined. Over the study period, the Behavioral-Finance-CAPM shows considerably higher explanatory power than the classical CAPM.
Abstract:
We examine how the accuracy of real-time forecasts from models that include autoregressive terms can be improved by estimating the models on ‘lightly revised’ data instead of using data from the latest-available vintage. The benefits of estimating autoregressive models on lightly revised data are related to the nature of the data revision process and the underlying process for the true values. Empirically, we find improvements in root mean square forecasting error of 2–4% when forecasting output growth and inflation with univariate models, and of 8% with multivariate models. We show that multiple-vintage models, which explicitly model data revisions, require large estimation samples to deliver competitive forecasts. Copyright © 2012 John Wiley & Sons, Ltd.
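As a minimal sketch of the autoregressive models in question, the following fits an AR(1) by least squares on simulated data. It does not reproduce the paper's real-time data-vintage setup; the series and parameter values are invented for illustration.

```python
import numpy as np

# Minimal AR(1) fit by least squares on simulated data. The paper's point
# concerns *which vintage* of real data to estimate on, which this
# illustration does not attempt to reproduce.
rng = np.random.default_rng(1)
n, phi = 500, 0.7
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

# Regress y_t on y_{t-1} (with intercept) to estimate the AR coefficient.
X = np.column_stack([np.ones(n - 1), y[:-1]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
intercept, phi_hat = coef

# One-step-ahead forecast from the last observation.
forecast = intercept + phi_hat * y[-1]
```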
Abstract:
In this review article, we explore several recent advances in the quantitative modeling of financial markets. We begin with the Efficient Markets Hypothesis and describe how this controversial idea has stimulated a number of new directions of research, some focusing on more elaborate mathematical models that are capable of rationalizing the empirical facts, others taking a completely different tack in rejecting rationality altogether. One of the most promising directions is to view financial markets from a biological perspective and, specifically, within an evolutionary framework in which markets, instruments, institutions, and investors interact and evolve dynamically according to the “law” of economic selection. Under this view, financial agents compete and adapt, but they do not necessarily do so in an optimal fashion. Evolutionary and ecological models of financial markets are truly a new frontier whose exploration has just begun.
Abstract:
v. 1. Multicomponent methods.--v. 2. Mathematical models.
Abstract:
Title page also includes summary of paper.
Abstract:
To account for the preponderance of zero counts and the simultaneous correlation of observations, a class of zero-inflated Poisson mixed regression models can be applied to accommodate within-cluster dependence. In this paper, a score test for zero-inflation is developed for assessing correlated count data with excess zeros. The sampling distribution and the power of the test statistic are evaluated by simulation studies. The results show that the test statistic performs satisfactorily under a wide range of conditions. The test procedure is further illustrated using a data set on recurrent urinary tract infections. Copyright (c) 2005 John Wiley & Sons, Ltd.
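The "excess zeros" that motivate such models can be sketched with a simulation: with probability pi an observation is a structural zero, otherwise it is Poisson. The parameter values below are arbitrary illustrative choices, and no score test is implemented here.

```python
import numpy as np

# Illustrative zero-inflated Poisson (ZIP) simulation: with probability pi
# an observation is a structural zero, otherwise it is Poisson(lam). The
# parameters are arbitrary, chosen only to show the excess of zeros
# relative to a plain Poisson model.
rng = np.random.default_rng(2)
pi, lam, n = 0.3, 2.0, 10_000
structural_zero = rng.random(n) < pi
counts = np.where(structural_zero, 0, rng.poisson(lam, n))

# Observed zero proportion vs. what a plain Poisson(lam) would predict.
observed_zeros = np.mean(counts == 0)
poisson_zeros = np.exp(-lam)                      # P(Poisson(lam) = 0)
zip_zeros = pi + (1 - pi) * np.exp(-lam)          # ZIP zero probability
```

A score test for zero-inflation essentially asks whether the observed zero proportion is compatible with the Poisson prediction once covariates and clustering are accounted for.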
Abstract:
Previous work by Professor John Frazer on Evolutionary Architecture provides a basis for the development of a system evolving architectural envelopes in a generic and abstract manner. Recent research by the authors has focused on the implementation of a virtual environment for the automatic generation and exploration of complex forms and architectural envelopes based on solid modelling techniques and the integration of evolutionary algorithms, enhanced computational and mathematical models. Abstract data types are introduced for genotypes in a genetic algorithm in order to develop complex models using generative and evolutionary computing techniques. Multi-objective optimisation techniques are employed for defining the fitness function in the evaluation process.
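The general technique, a genetic algorithm whose fitness combines multiple objectives, can be sketched as follows. This is a toy bit-string example with a weighted-sum fitness, not the authors' system, whose genotypes encode architectural envelopes via solid modelling.

```python
import random

# Minimal genetic algorithm with a weighted-sum multi-objective fitness.
# Genomes, objectives, and all parameters are illustrative assumptions.
random.seed(3)

def fitness(genome):
    # Two hypothetical objectives: maximise the number of 1s, and
    # maximise alternation between neighbouring genes.
    ones = sum(genome)
    alternation = sum(a != b for a, b in zip(genome, genome[1:]))
    return 0.5 * ones + 0.5 * alternation

def evolve(pop_size=40, length=20, generations=60, mut_rate=0.05):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, length)      # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (random.random() < mut_rate) for g in child]  # mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

A weighted sum is the simplest way to fold several objectives into one fitness value; Pareto-based methods are the usual alternative when the trade-off between objectives should be kept explicit.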
Abstract:
Modern Engineering Asset Management (EAM) requires the accurate assessment of current and the prediction of future asset health condition. Suitable mathematical models that are capable of predicting Time-to-Failure (TTF) and the probability of failure in future time are essential. In traditional reliability models, the lifetime of assets is estimated using failure time data. However, in most real-life situations and industry applications, the lifetime of assets is influenced by different risk factors, which are called covariates. The fundamental notion in reliability theory is the failure time of a system and its covariates. These covariates change stochastically and may influence and/or indicate the failure time. Many statistical models have been developed to estimate the hazard of assets or individuals with covariates, and an extensive literature on hazard models with covariates (also termed covariate models), including theory and practical applications, has emerged. This paper is a state-of-the-art review of the existing literature on these covariate models in both the reliability and biomedical fields. One of the major purposes of this expository paper is to synthesise these models from the industrial reliability and biomedical fields and then contextually group them into non-parametric and semi-parametric models. Comments on their merits and limitations are also presented. Another main purpose of this paper is to comprehensively review and summarise the current research on the development of covariate models so as to facilitate the application of more covariate modelling techniques in prognostics and asset health management.
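Many of the semi-parametric covariate models reviewed share the proportional hazards (Cox-type) form: a baseline hazard scaled by an exponential function of the covariates. A minimal sketch, with a hypothetical baseline, coefficients, and covariate values:

```python
import math

# Proportional hazards form: h(t | x) = h0(t) * exp(beta . x).
# Baseline hazard, coefficients, and covariates are hypothetical.
def proportional_hazard(t, covariates, betas, baseline=lambda t: 0.01 * t):
    linear_predictor = sum(b * x for b, x in zip(betas, covariates))
    return baseline(t) * math.exp(linear_predictor)

# An asset operating under higher load and temperature (the covariates)
# has a proportionally higher hazard than the baseline at the same age t.
betas = [0.8, 0.3]                     # hypothetical covariate coefficients
h_base = proportional_hazard(100, [0.0, 0.0], betas)
h_stressed = proportional_hazard(100, [1.0, 1.0], betas)
ratio = h_stressed / h_base            # = exp(0.8 + 0.3), independent of t
```

The key property of this family is that the hazard ratio between two covariate profiles is constant in time, which is what makes the baseline hazard nuisance rather than a quantity that must be specified.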
Abstract:
Three particular geometrical shapes, parallelepiped, cylindrical and spherical, were selected from potatoes (aspect ratio = 1:1, 2:1, 3:1), cut beans (length:diameter = 1:1, 2:1, 3:1) and peas respectively. The density variation of the food particulates was studied in a batch fluidised bed dryer connected to a heat pump dehumidifier system. Apparent density and bulk density were evaluated against non-dimensional moisture at three drying temperatures of 30, 40 and 50 °C. The relative humidity of the hot air was kept at 15% at all drying temperatures. Several empirical relationships were developed for determining the changes in densities with moisture content, and simple mathematical models were obtained relating apparent density and bulk density to moisture content.
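An empirical density-moisture relationship of the kind described can be obtained by least-squares fitting. The sketch below assumes a linear model rho = a + b*M on invented data points; the actual study's data and fitted model forms are not reproduced here.

```python
import numpy as np

# Illustrative fit of a simple empirical density-moisture relationship.
# The data points are invented, not measurements from the study, and a
# linear model rho = a + b * M is assumed purely for the sketch.
moisture = np.array([1.0, 0.8, 0.6, 0.4, 0.2])   # non-dimensional moisture
density = np.array([1050, 1010, 965, 930, 890])  # apparent density, kg/m^3

b, a = np.polyfit(moisture, density, 1)          # least-squares line
predicted = a + b * 0.5                          # density at M = 0.5
```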