339 results for Entropy -- Mathematical theory
Abstract:
See the abstract at the beginning of the document in the attached file
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has grown to account for up to 70% of the trading volume in some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in the so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. It can nevertheless be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtesting is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. For this reason we emphasize the calibration process of the strategies' parameters to adapt them to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of the quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtesting with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
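The Ornstein-Uhlenbeck dynamics mentioned above can be illustrated with a minimal Python sketch (not the thesis's MATLAB code; the parameters theta, mu, sigma and the entry band are arbitrary illustrative choices): simulate the spread dX = θ(μ − X)dt + σdW by Euler-Maruyama and derive a naive mean-reversion signal from it.

```python
import math
import random

def simulate_ou(theta, mu, sigma, x0, dt, n, seed=42):
    """Euler-Maruyama simulation of dX = theta*(mu - X) dt + sigma dW."""
    rng = random.Random(seed)
    x = x0
    path = [x0]
    for _ in range(n):
        x += theta * (mu - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def trading_signal(spread, mu, band):
    """Naive mean-reversion rule: short above mu + band, long below mu - band, flat otherwise."""
    if spread > mu + band:
        return -1
    if spread < mu - band:
        return 1
    return 0

# Illustrative parameters only: fast reversion toward mu = 0 from a displaced start.
path = simulate_ou(theta=2.0, mu=0.0, sigma=0.3, x0=1.0, dt=0.01, n=500)
signals = [trading_signal(s, mu=0.0, band=0.2) for s in path]
```

In a real pairs-trading setting the spread would come from two co-integrated price series and the parameters would be calibrated, as the abstract stresses, on a rolling high-frequency window.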
Abstract:
Power-law distributions, a well-known model in the theory of real random variables, characterize a wide variety of natural and man-made phenomena. The intensities of earthquakes, word frequencies, solar flares and the sizes of power outages are distributed according to a power law. Recently, given the widespread use of power laws in the scientific community, several articles have been published criticizing the statistical methods used to estimate power-law behaviour and establishing new estimation techniques of proven reliability. The main object of the present study is to gain a deeper understanding of this kind of distribution and its analysis, and to introduce the half-lives of radioactive isotopes as a new candidate for a power-law distribution in nature, as well as a "canonical laboratory" in which to test statistical methods appropriate for long-tailed distributions.
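The modern estimation technique alluded to above is, in its simplest continuous form, the maximum-likelihood estimator α̂ = 1 + n / Σ ln(x_i / x_min). A minimal sketch (the synthetic sample and its parameters are illustrative, not data from the study):

```python
import math
import random

def powerlaw_alpha_mle(data, xmin):
    """Continuous power-law MLE: alpha_hat = 1 + n / sum(ln(x / xmin)) over the tail x >= xmin."""
    tail = [x for x in data if x >= xmin]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / xmin) for x in tail)

# Draw power-law samples by inverse transform: x = xmin * (1 - u)^(-1/(alpha-1)).
rng = random.Random(0)
alpha_true, xmin = 2.5, 1.0
sample = [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(50000)]
alpha_hat = powerlaw_alpha_mle(sample, xmin)
```

The standard error of this estimator scales as (α − 1)/√n, so with 50,000 samples the recovered exponent should sit very close to the true value 2.5.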
Abstract:
HEMOLIA (a project under the European Community's 7th Framework Programme) is a new-generation Anti-Money Laundering (AML) intelligent multi-agent alert and investigation system which, in addition to traditional financial data, makes extensive use of modern society's huge telecom data source, thereby opening up a new dimension of capabilities to all money-laundering fighters (FIUs, LEAs) and financial institutes (banks, insurance companies, etc.). This Master's thesis project was done at AIA, one of the partners of the HEMOLIA project in Barcelona. The objective of this thesis is to find the clusters in a network drawn using the financial data. An extensive literature survey has been carried out, and several standard network algorithms have been studied and implemented. The clustering problem is NP-hard, and algorithms such as K-Means and hierarchical clustering have been applied to problems in sociology, evolution, anthropology, etc. However, these algorithms have certain drawbacks which make them difficult to apply here. The thesis suggests (a) a possible improvement to the K-Means algorithm, (b) a novel approach to the clustering problem using genetic algorithms and (c) a new algorithm for finding the cluster of a node using a genetic algorithm.
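Since the thesis's own modifications are not described in the abstract, a minimal sketch of the standard K-Means (Lloyd's) algorithm it builds on may help fix ideas; the toy two-blob data and the deterministic initialization are illustrative assumptions, not the thesis's method.

```python
import random

def kmeans(points, k, iters=20):
    """Plain Lloyd's algorithm on 2-D points: assign each point to its
    nearest centroid, then recompute each centroid as its cluster mean."""
    # Deterministic, spread-out initialization (illustrative; k-means++ is common in practice).
    centroids = [points[i * len(points) // k] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: (p[0] - centroids[i][0]) ** 2 + (p[1] - centroids[i][1]) ** 2)
            clusters[j].append(p)
        for i, c in enumerate(clusters):
            if c:  # keep the old centroid if a cluster empties out
                centroids[i] = (sum(p[0] for p in c) / len(c),
                                sum(p[1] for p in c) / len(c))
    return centroids, clusters

# Two well-separated synthetic blobs: the centroids should land near (0, 0) and (10, 10).
rng = random.Random(1)
blob_a = [(rng.gauss(0, 0.5), rng.gauss(0, 0.5)) for _ in range(100)]
blob_b = [(rng.gauss(10, 0.5), rng.gauss(10, 0.5)) for _ in range(100)]
centroids, clusters = kmeans(blob_a + blob_b, k=2)
```

The sensitivity of this loop to initialization and to the choice of k is among the drawbacks the abstract mentions, and is what a genetic-algorithm approach can sidestep by searching over partitions globally.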
Abstract:
There are many factors that influence the day-ahead market bidding strategies of a generation company (GenCo) in the current energy market framework. Environmental policy issues have become increasingly important for fossil-fuelled power plants and must be considered in their management, giving rise to emission limitations. This work investigates the influence of both the allowances and emission reduction plans and the incorporation of medium-term derivatives commitments on the optimal generation bidding strategy in the day-ahead electricity market. Two different technologies have been considered: coal thermal units, a high-emission technology, and combined-cycle gas turbine units, a low-emission technology. The Iberian Electricity Market and the Spanish National Emissions and Allocation Plans are the framework used to deal with the environmental issues in the day-ahead market bidding strategies. To address emission limitations, some of the standard risk-management methodologies developed for financial markets, such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR), have been extended. This study offers electricity generation utilities a mathematical model to determine, for each of their generation units, the individual optimal generation bid to the wholesale electricity market that maximizes the long-run profits of the utility while abiding by the Iberian Electricity Market rules, the environmental restrictions set by the EU Emission Trading Scheme, and the restrictions set by the Spanish National Emissions Reduction Plan. The economic implications for a GenCo of including the environmental restrictions of these National Plans are analyzed and the most remarkable results are presented.
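As a rough illustration of the risk measures mentioned (not the extended model of the thesis), empirical VaR and CVaR at level β can be computed from a sample of losses: VaR is the empirical β-quantile, and CVaR averages the losses at or beyond it. The toy loss sample is purely illustrative.

```python
def var_cvar(losses, beta=0.95):
    """Empirical VaR and CVaR at level beta from a sample of losses.

    VaR is the beta-quantile of the empirical loss distribution; CVaR is
    the mean of the losses at or beyond the VaR (the expected tail loss).
    """
    s = sorted(losses)
    idx = int(beta * len(s))      # index of the beta-quantile
    var = s[idx]
    tail = s[idx:]                # the worst (1 - beta) fraction of outcomes
    cvar = sum(tail) / len(tail)
    return var, cvar

# Toy loss sample: the integers 1..100, so the quantiles are easy to read off.
losses = list(range(1, 101))
var, cvar = var_cvar(losses, beta=0.95)
```

CVaR is always at least VaR and, unlike VaR, is a coherent risk measure that stays convex inside an optimization, which is why it lends itself to bidding models with emission constraints.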
Abstract:
Parallel subtitles of the document: Teoria TK i lleis del moviment 2012 = Théorie TK et lois du mouvement 2012 = Theory TK and laws of movement 2012 = Teoría TK y leyes del movimiento 2012
Abstract:
We study the equidistribution of Fekete points in a compact complex manifold. These are extremal point configurations defined through sections of powers of a positive line bundle. Their equidistribution is a known result. The novelty of our approach is that we relate them to the problem of sampling and interpolation on line bundles, which allows us to estimate the equidistribution of the Fekete points quantitatively. In particular we estimate the Kantorovich-Wasserstein distance of the Fekete points to its limiting measure. The sampling and interpolation arrays on line bundles are a subject of independent interest, and we provide necessary density conditions through the classical approach of Landau, that in this context measures the local dimension of the space of sections of the line bundle. We obtain a complete geometric characterization of sampling and interpolation arrays in the case of compact manifolds of dimension one, and we prove that there are no arrays of both sampling and interpolation in the more general setting of semipositive line bundles.
Abstract:
In this paper a one-phase supercooled Stefan problem, with a nonlinear relation between the phase-change temperature and the front velocity, is analysed. The model with the standard linear approximation, valid for small supercooling, is first examined asymptotically. The nonlinear case is more difficult to analyse and only two simple asymptotic results are found. We then apply an accurate heat balance integral method to make further progress. Finally, we compare the results against numerical solutions. The results show that for large supercooling the linear model may be highly inaccurate and even qualitatively incorrect. Similarly, as the Stefan number β → 1⁺, the classic Neumann solution, which exists down to β = 1, is far from the linear and nonlinear supercooled solutions and can significantly overpredict the solidification rate.
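For orientation, a common dimensionless formulation of such a one-phase supercooled problem couples the heat equation in the liquid to a Stefan condition at the moving front, with the front temperature depending on the front speed (signs and scalings vary between papers; this is a generic sketch, not necessarily the paper's exact system):

```latex
\begin{align*}
  u_t &= u_{xx}, && x > s(t),\ t > 0,\\
  u(s(t),t) &= u_I(\dot{s}), && \text{front temperature--velocity relation},\\
  \beta\,\dot{s}(t) &= -u_x(s(t),t), && \text{Stefan condition},\\
  u(x,t) &\to -1, && x \to \infty \quad \text{(supercooled far field)},
\end{align*}
```

where the standard linear approximation takes $u_I = -\epsilon\,\dot{s}$ for small supercooling, and the nonlinear model replaces this with a nonlinear kinetic law.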
Abstract:
Pippenger [Pi77] showed the existence of a (6m,4m,3m,6)-concentrator for each positive integer m using a probabilistic method. We generalize his approach and prove the existence of a (6m,4m,3m,5.05)-concentrator (which is no longer regular, but has fewer edges). We apply this result to improve the constant of approximation of almost additive set functions by additive set functions from 44.5 (established by Kalton and Roberts in [KaRo83]) to 39. We show a more direct connection of the latter problem to the Whitney-type estimate for the approximation of continuous functions on a cube in ℝ^d by linear functions, and improve the estimate of this Whitney constant from 802 (proved by Brudnyi and Kalton in [BrKa00]) to 73.
Abstract:
In this article we compare the performance of two systems for detecting feature points in images: the first uses the basic Random Ferns technique, while the second (which we call Ferns with Mutual Information, or FIM) applies a technique for building Ferns using a simplified mutual-information criterion.
Abstract:
This work gives a computational treatment to the search for a particular type of digraph known as "radial Moore digraphs". In certain cases, the algorithms developed produce a complete enumeration.
Abstract:
Implementation of a Java library capable of computing connected graphs, including all non-isomorphic connected graphs of a given order, together with their tables of eccentricity sequences (for small orders). In addition, the Nauty system has been studied and its auxiliary files have been used.
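As a small illustration of the eccentricity sequences mentioned (a Python sketch, not the Java library described), the eccentricity of each vertex can be computed by breadth-first search, detecting disconnected graphs along the way:

```python
from collections import deque

def eccentricities(n, edges):
    """BFS from every vertex of an undirected graph on vertices 0..n-1.

    Returns the sorted eccentricity sequence (max distance from each
    vertex), or None if the graph is disconnected."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    ecc = []
    for s in range(n):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        if len(dist) < n:
            return None  # some vertex is unreachable: disconnected
        ecc.append(max(dist.values()))
    return sorted(ecc)

# Path P4 has eccentricity sequence [2, 2, 3, 3]; cycle C4 has [2, 2, 2, 2].
path4 = eccentricities(4, [(0, 1), (1, 2), (2, 3)])
cycle4 = eccentricities(4, [(0, 1), (1, 2), (2, 3), (3, 0)])
```

Enumerating all non-isomorphic connected graphs of a given order, as the library does, additionally requires an isomorphism filter, which is exactly the role of the Nauty system mentioned in the abstract.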