68 results for probability of informed trading
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
In networks with small buffers, such as optical packet switching (OPS) based networks, the convolution approach (CA) is one of the most accurate methods for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks. With heterogeneous traffic and bufferless networks, the enhanced convolution approach (ECA) is a good solution. However, both methods (CA and ECA) present a high computational cost for a large number of connections. Two new mechanisms (UMCA and ISCA), based on the Monte Carlo method, are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with a small stochastic error in the probability estimation.
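As a rough illustration of the Monte Carlo idea behind such admission control mechanisms (the connection model, rates and capacity below are illustrative assumptions, not the paper's UMCA/ISCA algorithms), the following sketch estimates the probability that the aggregate rate of heterogeneous on/off connections exceeds the capacity of a bufferless link, together with the stochastic error of the estimate:

```python
import random

def mc_overflow_probability(connections, capacity, samples=100_000, seed=1):
    """Estimate P(aggregate instantaneous rate > capacity) for a bufferless link.

    connections: list of (activity_probability, peak_rate) pairs, one per
                 admitted connection (heterogeneous traffic is allowed).
    Returns the Monte Carlo estimate and its standard error.
    """
    rng = random.Random(seed)
    overflow = 0
    for _ in range(samples):
        rate = sum(r for p, r in connections if rng.random() < p)
        if rate > capacity:
            overflow += 1
    p_hat = overflow / samples
    std_err = (p_hat * (1 - p_hat) / samples) ** 0.5
    return p_hat, std_err

if __name__ == "__main__":
    # 40 low-rate and 10 high-rate connections on a 1 Gb/s link (illustrative numbers).
    conns = [(0.3, 20e6)] * 40 + [(0.1, 150e6)] * 10
    p_loss, err = mc_overflow_probability(conns, capacity=1e9)
    print(f"estimated overflow probability: {p_loss:.4f} +/- {err:.4f}")
    # An admission test would accept a new connection only if the estimate
    # (plus a margin for the stochastic error) stays below the QoS target.
```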
Abstract:
We propose a new econometric estimation method for analyzing the probability of leaving unemployment using uncompleted spells from repeated cross-section data, which can be especially useful when panel data are not available. The proposed method-of-moments-based estimator has two important features: (1) it estimates the exit probability at the individual level and (2) it does not rely on the stationarity assumption of the inflow composition. We illustrate and gauge the performance of the proposed estimator using the Spanish Labor Force Survey data, and analyze the changes in distribution of unemployment between the 1980s and 1990s during a period of labor market reform. We find that the relative probability of leaving unemployment of the short-term unemployed versus the long-term unemployed becomes significantly higher in the 1990s.
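For intuition only, the sketch below shows the simplest aggregate (synthetic-cohort) way of backing out exit probabilities from stocks of uncompleted spells in two consecutive cross-sections; the paper's method-of-moments estimator instead works at the individual level and does not need the stationary-inflow assumption this shortcut relies on. All numbers are made up.

```python
# Aggregate approximation of the exit probability: the share of spells of
# duration d observed at survey t that are NOT observed with duration d+1
# one period later. Stocks below are illustrative, not Spanish LFS figures.
unemployed_by_duration_t  = {1: 1000, 2: 700, 3: 500}   # stocks at survey t
unemployed_by_duration_t1 = {2: 560, 3: 430, 4: 340}    # stocks at survey t+1

for d, stock in unemployed_by_duration_t.items():
    survivors = unemployed_by_duration_t1.get(d + 1, 0)
    exit_prob = 1 - survivors / stock
    print(f"duration {d}: exit probability ~ {exit_prob:.2f}")
```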
Abstract:
The economic literature on crime and punishment focuses on the trade-off between the probability and the severity of punishment, and suggests that detection probability and fines are substitutes. In this paper it is shown that, in the presence of substantial underdeterrence caused by costly detection and punishment, these instruments may become complements. When offenders are poor, the deterrent value of monetary sanctions is low; thus, the government does not invest much in detection. If offenders are rich, however, the deterrent value of monetary sanctions is high, so it is more profitable to prosecute them.
Abstract:
We study the interaction between insurance and capital markets within a single but general framework. We show that capital markets greatly enhance the risk-sharing capacity of insurance markets and the scope of risks that are insurable, because efficiency does not depend on the number of agents at risk, nor on risks being independent, nor on the preferences and endowments of agents at risk being the same. We show that agents share risks by buying full coverage for their individual risks and provide insurance capital through stock markets. We show that aggregate risk enters private insurance as a positive loading on insurance prices, and that despite this agents will buy full coverage. The loading is determined by the risk premium of investors in the stock market and hence does not depend on the agents' willingness to pay. Agents provide insurance capital by trading an equally weighted portfolio of insurance company shares and a riskless asset. We are able to construct agents' optimal trading strategies explicitly and for very general preferences.
Abstract:
In this paper, differences in return autocorrelation across weekdays have been investigated. Our research provides strong evidence of the importance of non-trading periods, not only weekends and holidays but also overnight closings, in explaining return autocorrelation anomalies. While stock returns are highly autocorrelated, especially on Mondays, when daily returns are computed on an open-to-close basis they do not exhibit any significant level of autocorrelation. Our results are compatible with the information processing hypotheses as an explanation of the weekend effect.
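A minimal sketch of the kind of comparison described above, assuming a price table with 'open' and 'close' columns (the column names are assumptions of this sketch): it contrasts the first-order autocorrelation of close-to-close returns, which span the non-trading period, with that of open-to-close returns, split by weekday.

```python
import numpy as np
import pandas as pd

def weekday_autocorrelation(prices: pd.DataFrame) -> pd.DataFrame:
    """First-order autocorrelation of daily returns, split by weekday.

    prices: DataFrame indexed by trading date with 'open' and 'close' columns.
    Close-to-close returns include the overnight/weekend non-trading period,
    open-to-close returns do not, which is the comparison made above.
    """
    close_to_close = prices["close"].pct_change()
    open_to_close = prices["close"] / prices["open"] - 1.0
    result = {}
    for name, ret in [("close-to-close", close_to_close),
                      ("open-to-close", open_to_close)]:
        lagged = ret.shift(1)
        by_day = {}
        for weekday in range(5):  # Monday = 0 ... Friday = 4
            mask = ((ret.index.weekday == weekday)
                    & ret.notna().to_numpy()
                    & lagged.notna().to_numpy())
            by_day[weekday] = np.corrcoef(ret[mask], lagged[mask])[0, 1]
        result[name] = by_day
    return pd.DataFrame(result)
```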
Abstract:
We develop a model of insider trading where agents have private information either about liquidation value or about supply and behave strategically to maximize their profits. The supply-informed trader plays a dual role in market making and in information revelation. This trader not only reveals a part of the information he owns, but he also induces the other traders to reveal more of their private information. The presence of different types of information decreases market liquidity and induces non-monotonicity of the market indicators with respect to the variance of liquidation value. Replacing the noise introduced by liquidity traders with a random supply also allows us to study the effect that shocks to different components of supply have on prices and quantities.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, which are necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in the so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting equation and its variations. A model for forecasting any economic or financial magnitude could be defined with scientific rigor but still lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtesting is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of the technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtesting with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
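As a hedged illustration of the market-neutral side of the project (the thesis's own code is in MATLAB; this Python sketch uses illustrative window lengths and thresholds rather than calibrated values), the function below generates a pairs-trading position from the rolling z-score of the spread between two assets. Because every decision uses only information available up to the current step, the entry/exit rule is a Markov time in the sense discussed above.

```python
import numpy as np

def pairs_trading_signal(price_a, price_b, window=60, entry_z=2.0, exit_z=0.5):
    """Toy market-neutral (pairs trading) signal on the spread of two assets.

    The hedge ratio and the spread's mean/volatility are re-estimated from a
    rolling window, so every decision uses only past information.
    Returns an array of positions: +1 long spread, -1 short spread, 0 flat.
    Window length and z-score thresholds are illustrative choices.
    """
    price_a, price_b = np.asarray(price_a, float), np.asarray(price_b, float)
    n = len(price_a)
    positions = np.zeros(n)
    pos = 0
    for t in range(window, n):
        a, b = price_a[t - window:t], price_b[t - window:t]
        beta = np.polyfit(b, a, 1)[0]            # rolling hedge ratio
        spread = a - beta * b
        z = (price_a[t] - beta * price_b[t] - spread.mean()) / (spread.std() + 1e-12)
        if pos == 0 and abs(z) > entry_z:
            pos = -int(np.sign(z))               # bet on reversion of the spread
        elif pos != 0 and abs(z) < exit_z:
            pos = 0
        positions[t] = pos
    return positions
```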
Abstract:
Our task in this paper is to analyze the organization of trading in the era of quantitative finance. To do so, we conduct an ethnography of arbitrage, the trading strategy that best exemplifies finance in the wake of the quantitative revolution. In contrast to value and momentum investing, we argue, arbitrage involves an art of association: the construction of equivalence (comparability) of properties across different assets. In place of essential or relational characteristics, the peculiar valuation that takes place in arbitrage is based on an operation that makes something the measure of something else, associating securities to each other. The process of recognizing opportunities and the practices of making novel associations are shaped by the specific socio-spatial and socio-technical configurations of the trading room. Calculation is distributed across persons and instruments as the trading room organizes interaction among diverse principles of valuation.
Abstract:
This paper analyzes the linkages between the credibility of a target zone regime, the volatility of the exchange rate, and the width of the band within which the exchange rate is allowed to fluctuate. These three concepts should be related, since the band width induces a trade-off between credibility and volatility. Narrower bands should give less scope for the exchange rate to fluctuate, but may make agents perceive a larger probability of realignment, which by itself should increase the volatility of the exchange rate. We build a model where this trade-off is made explicit. The model is used to understand the reduction in volatility experienced by most EMS countries after their target zones were widened in August 1993. As a natural extension, the model also rationalizes the existence of non-official, implicit target zones (or fear of floating), suggested by some authors.
Abstract:
The aim of this paper is to analyze the causes leading to social exclusion dynamics. In particular, we wish to understand why any individual experiencing social exclusion today is much more likely to experience it again. In fact, there are two distinct processes that may generate persistence of social exclusion: heterogeneity (individuals are heterogeneous with respect to some observed and/or unobserved adverse characteristics that are relevant for the chance of experiencing social exclusion, and these persist over time) and true state dependence (experiencing social exclusion in a specific time period, in itself, increases the probability of undergoing social exclusion in subsequent periods). Distinguishing between the two processes is crucial, since the policy implications are very different.
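To make the distinction concrete, the following toy simulation (all parameters are illustrative, not estimates from the paper) shows that both mechanisms produce the same observable pattern, a higher probability of exclusion tomorrow conditional on exclusion today, even though only the second involves true state dependence:

```python
import random

def simulate(n=10_000, rng=random.Random(0)):
    """Two toy mechanisms that both produce persistence of social exclusion.

    Heterogeneity: each individual has a fixed risk (0.05 or 0.40) and the two
    periods are independent given that risk. State dependence: everyone starts
    with the same risk, but being excluded raises next period's risk.
    """
    def persistence(outcomes):
        excluded_t = [o for o in outcomes if o[0]]
        base = sum(o[1] for o in outcomes) / len(outcomes)
        cond = sum(o[1] for o in excluded_t) / len(excluded_t)
        return base, cond

    hetero = []
    for _ in range(n):
        p = rng.choice([0.05, 0.40])
        hetero.append((rng.random() < p, rng.random() < p))

    state_dep = []
    for _ in range(n):
        first = rng.random() < 0.20
        second = rng.random() < (0.45 if first else 0.15)
        state_dep.append((first, second))

    return persistence(hetero), persistence(state_dep)

if __name__ == "__main__":
    for name, (base, cond) in zip(["heterogeneity", "state dependence"], simulate()):
        print(f"{name}: P(excluded in t+1) = {base:.2f}, "
              f"P(excluded in t+1 | excluded in t) = {cond:.2f}")
```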
Abstract:
As computer chip implementation technologies evolve to obtain more performance, chips use smaller components, with a higher density of transistors, and work at lower supply voltages. All these factors make computer chips less robust and increase the probability of transient faults. A transient fault may occur once and never happen again in the same way during a computer system's lifetime. There are distinct consequences when a transient fault occurs: the operating system might abort the execution if the change produced by the fault is detected through bad behavior of the application, but the biggest risk is that the fault produces an undetected data corruption that modifies the application's final result without warning (for example, a bit flip in some crucial data). With the objective of researching transient faults in a computer system's processor registers and memory, we have developed an extension of HP's and AMD's joint full-system simulation environment, named COTSon. This extension allows the injection of faults that change a single bit in the processor registers and memory of the simulated computer. The developed fault injection system makes it possible to: evaluate the effects of single-bit-flip transient faults on an application, analyze an application's robustness against single-bit-flip transient faults, and validate fault detection mechanisms and strategies.
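The core operation of such an injector can be sketched in a few lines; the snippet below is an illustrative stand-alone Python version of a single-bit flip on a register- or memory-sized word, not the actual COTSon extension, which applies the flip inside the simulated machine's state.

```python
import random

def flip_single_bit(value: int, width_bits: int = 64, rng=random) -> tuple[int, int]:
    """Return the value with one randomly chosen bit inverted, plus the bit index.

    This mimics the effect of a single-bit-flip transient fault in a register
    or memory word.
    """
    bit = rng.randrange(width_bits)
    return value ^ (1 << bit), bit

if __name__ == "__main__":
    original = 0x0000_0000_DEAD_BEEF          # illustrative register content
    corrupted, bit = flip_single_bit(original)
    print(f"bit {bit} flipped: {original:#018x} -> {corrupted:#018x}")
```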
Abstract:
The Pyrenean chamois (Rupicapra pyrenaica pyrenaica) is a mountain-dwelling ungulate with an extensive presence in open areas. Optimal group size results from the trade-off between the advantages (a reduction in the risk of predation) and disadvantages (competition between members of the herd) of group living. In addition, the advantages and disadvantages of group living may vary depending on the position of each individual within the herd. Our objective was to study the effect of central vs. peripheral position in the herd on feeding and vigilance behavior in male and female Pyrenean chamois, and to ascertain whether a group size effect existed. We used focal animal sampling and recorded social interactions when a focal animal was involved. In males, vigilance rate was higher in the central part of the group than at the periphery, probably due to a higher density of animals in the central part of the herd and a higher probability of being disturbed by conspecifics. In females, vigilance rate did not differ according to position in the herd. Females spent more time feeding than males, and males showed a higher frequency of vigilance behavior than females. We did not observe a clear relationship between group size and vigilance behavior. The differences in vigilance behavior might be due to social interactions.
Abstract:
Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently, when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined, so that the probability density would represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities, obtained by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
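A minimal numerical sketch of this construction (the partition and the densities are illustrative): discretizing two densities over the same partition of the support yields two compositions, and the Aitchison distance between them is the Euclidean distance between their centred log-ratio (clr) images.

```python
import numpy as np

def closure(x):
    """Rescale a vector of positive parts so they sum to one (a composition)."""
    x = np.asarray(x, float)
    return x / x.sum()

def clr(x):
    """Centred log-ratio transform, mapping the simplex into Euclidean space."""
    logx = np.log(x)
    return logx - logx.mean()

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between the clr images."""
    return np.linalg.norm(clr(closure(x)) - clr(closure(y)))

if __name__ == "__main__":
    # Discretize two densities on (0, 1) over the same 10-interval partition:
    # each composition holds the probability mass of one interval per part.
    grid = np.linspace(0, 1, 11)
    uniform = np.diff(grid)        # uniform density f(t) = 1
    tilted = np.diff(grid ** 2)    # density f(t) = 2t (CDF t^2)
    print(aitchison_distance(uniform, tilted))
```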
Abstract:
In this paper a novel methodology is introduced, aimed at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, some features for reducing the failure impact while offering minimum failure probabilities are also analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
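As a hedged illustration of the failure-probability part of the objective (this is not the paper's two-step routing algorithm; the per-link probabilities and the independence assumption are illustrative), the sketch below computes the end-to-end failure probability of a path from its link failure probabilities and picks the candidate path that minimizes it:

```python
from math import prod

def path_failure_probability(link_failure_probs):
    """Failure probability of a path whose links fail independently:
    1 minus the product of the individual link availabilities."""
    return 1.0 - prod(1.0 - p for p in link_failure_probs)

def least_failure_path(candidate_paths):
    """Among pre-computed candidate paths, pick the one with the lowest failure probability."""
    return min(candidate_paths, key=path_failure_probability)

if __name__ == "__main__":
    working = [0.001, 0.002, 0.001]   # per-link failure probabilities (illustrative)
    backup = [0.003, 0.001]
    print(path_failure_probability(working), path_failure_probability(backup))
    print(least_failure_path([working, backup]))
```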