937 results for Algorithmic pairs trading, statistical arbitrage, Kalman filter, mean reversion.
Abstract:
A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false positive rates when they ignore mean-variance relationships intrinsic to the data generation process. We conclude that choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit with the assumptions of the statistical model.
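The mean-variance caveat above can be illustrated with a minimal, hypothetical simulation (not one of the surveyed methods): a locus with a pure mean effect gets flagged by a Brown-Forsythe variance-heterogeneity test when the trait's variance scales with its mean, while a roughly variance-stabilizing log transform removes the spurious signal.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# A biallelic locus with a pure mean effect: no true vQTL, but the trait's
# standard deviation scales with its mean (mean-variance coupling).
n = 600
genotype = rng.integers(0, 3, size=n)        # 0, 1 or 2 copies of the allele
mu = 10.0 + 2.0 * genotype
trait = rng.normal(mu, 0.2 * mu)             # sd proportional to the mean

groups = [trait[genotype == g] for g in range(3)]

# Brown-Forsythe test on the raw scale: flags spurious variance heterogeneity.
_, p_raw = stats.levene(*groups, center='median')

# After a (roughly) variance-stabilizing log transform, the signal vanishes.
_, p_log = stats.levene(*[np.log(g) for g in groups], center='median')
print(f"raw-scale p = {p_raw:.2e}, log-scale p = {p_log:.3f}")
```

On the raw scale the test rejects strongly even though the locus has no variance effect of its own, which is exactly the elevated-false-positive behaviour the survey warns about.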
Abstract:
This paper investigates the impact of price limits on the Brazilian futures markets using high-frequency data. The aim is to identify whether there is a cool-off or a magnet effect. For that purpose, we examine a tick-by-tick data set that includes all contracts on the São Paulo stock index futures traded on the Brazilian Mercantile and Futures Exchange from January 1997 to December 1999. Our main finding is that price limits drive prices back as they approach the lower limit. There is a strong cool-off effect of the lower limit on the conditional mean, whereas the upper limit seems to entail a weak magnet effect on the conditional variance. We then build a trading strategy that accounts for the cool-off effect so as to demonstrate that the latter has not only statistical but also economic significance. The resulting Sharpe ratio is indeed well above the buy-and-hold benchmarks we consider.
Abstract:
Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, managing risk, and studying monetary policy implications. In turn, dynamic term structure models, equipped with stronger economic structure, have been adopted mainly to price derivatives and explain empirical stylized facts. In this paper, we combine flavors of those two classes of models to test whether no-arbitrage affects forecasting. We construct cross-section (allowing arbitrage) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve overall smaller biases and root mean square errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding return premia indicates that the superior performance of the no-arbitrage versions is due to better identification of the bond risk premium.
Abstract:
This paper investigates the impact of price limits on the Brazilian futures markets using high-frequency data. The aim is to identify whether there is a cool-off or a magnet effect. For that purpose, we examine a tick-by-tick data set that includes all contracts on the São Paulo stock index futures traded on the Brazilian Mercantile and Futures Exchange from January 1997 to December 1999. The results indicate that the conditional mean features a floor cool-off effect, whereas the conditional variance significantly increases as the price approaches the upper limit. We then build a trading strategy that accounts for the cool-off effect in the conditional mean so as to demonstrate that the latter has not only statistical but also economic significance. The in-sample Sharpe ratio is indeed well above the buy-and-hold benchmarks we consider, whereas out-of-sample results evince similar performances.
Abstract:
This dissertation studies the movements of the Brazilian stock market with the aim of testing the price trajectories of pairs of stocks, as applied to a pairs-trading strategy. The assets studied comprise the stocks in the Ibovespa index, and pair selection is purely statistical, based on cointegration between assets, with no fundamental analysis involved in the choice. The theory applied here concerns the similar price movements of pairs of stocks that evolve so as to return to equilibrium; this evolution is measured by the instantaneous price difference compared with its historical mean. The strategy produces positive results when mean reversion takes place within a predetermined time window. The data cover the years 2006 to 2010, with intraday prices for the Ibovespa stocks. The tools used for pair selection and for simulating market operation were MATLAB (selection) and StreamBase (operation). Selection was performed with the augmented Dickey-Fuller test, run in MATLAB to check for a unit root in the residuals of the linear combination of the prices of the stocks forming each pair. Trading was simulated by back-testing on the intraday data mentioned above. Within the interval tested, the strategy proved profitable in 2006, 2007 and 2010, with returns above the Selic rate. The parameters calibrated on the first month of 2006 could be applied successfully to the rest of the interval: a return of Selic + 5.8% in 2006, a return very close to the Selic in 2007, and Selic + 10.8% in 2010. In the more volatile years (2008 and 2009), tests with the same 2006 parameters showed losses, indicating that the strategy is strongly affected by the volatility of stock price returns. This behavior suggests that, in live trading, the parameters should be recalibrated periodically in order to adapt them to more volatile scenarios.
Abstract:
Pairs trading is an old and well-known technique among traders. In this paper, we discuss an important element not commonly debated in Brazil: cointegration between pairs, which would guarantee the stability of the spread. We run the Dickey-Fuller test to check for cointegration and then compare the results with those of non-cointegrated pairs. We find that the Sharpe ratio of cointegrated pairs is greater than that of non-cointegrated ones. We also use the Ornstein-Uhlenbeck equation to calculate the half-life of the pairs, which again improves performance. Finally, we apply the leverage suggested by the Kelly formula, once more improving the results.
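The two quantitative ingredients named above can be sketched as follows, under common textbook conventions rather than the paper's exact procedure: the half-life comes from an AR(1) discretization of the Ornstein-Uhlenbeck spread, and the Kelly leverage uses the continuous-time approximation f* = mean/variance of per-period returns (the return numbers are illustrative).

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate an Ornstein-Uhlenbeck spread: ds = theta * (mu - s) dt + sigma dW.
theta, mu, sigma, dt, n = 2.0, 0.0, 0.3, 1 / 252, 5000
s = np.zeros(n)
for t in range(1, n):
    s[t] = s[t - 1] + theta * (mu - s[t - 1]) * dt \
           + sigma * np.sqrt(dt) * rng.normal()

# Discrete AR(1) regression ds_t = a + b * s_{t-1}: theta_hat = -b / dt,
# and the half-life of mean reversion is ln(2) / theta_hat.
b = np.polyfit(s[:-1], np.diff(s), 1)[0]
half_life = np.log(2) / (-b / dt)            # in the same units as dt (years)

# Continuous-time Kelly approximation: f* = mean / variance of the strategy's
# per-period excess returns (0.1% mean, 2% st. dev. here, purely illustrative).
kelly = 0.001 / 0.02 ** 2
print(f"half-life ~ {half_life:.2f} years, Kelly leverage ~ {kelly:.1f}")
```

With theta = 2 the true half-life is ln(2)/2, about 0.35 years; the regression estimate should land nearby. Practitioners typically trade a fraction (e.g. half-Kelly) of the computed leverage.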
Abstract:
This dissertation aims to understand the consumption habits of women of the new working class, in order to learn the aspirations, motivations and desires that influence their purchase decisions, and to identify what the products characteristic of the "new luxury" mean to them. The problem addressed involves understanding the consumption behavior of the new working class (Souza, 2012), with the goal of grasping this class's new consumption habits with respect to goods belonging to the so-called new luxury (Silverstein & Fiske, 2008). The results of this research shed light on the resignification of new-luxury products for the new working class, on the preferences and priorities of this class, and on the symbolic value of consuming this type of product. The first chapter addresses consumer behavior, showing the importance of studying consumption behavior for marketing strategies and exploring the influence of culture on consumers' decision-making; the second chapter covers the concepts of habitus and of symbolic and cultural capital, exploring questions related to values, attitudes and habits and their importance in the individual's expression in society and in the formation of identity; the third chapter discusses the concept of social class, working through the main divergences in the premises each author uses to identify its distinguishing characteristics and mentioning the main arguments related to the concepts of the new middle class (Neri, 2011) and the new working class (Souza, 2012); finally, the fourth chapter deals with the trading-up phenomenon (Silverstein & Fiske, 2008), which shows that consumers have been opting for products considered new luxury even when they pay higher prices to obtain them.
The authors define a new-luxury product as a premium product, with improvements and features superior to those of similar products, yet at more accessible prices than traditional luxury goods. The methodology chosen for this work was qualitative research of an exploratory-descriptive character, with a non-probabilistic sample selected by judgment. The research results showed that the trading-up phenomenon is indeed present in the daily lives of women of the new working class, as they prioritize certain items they deem important for their comfort, well-being and improved quality of life.
Abstract:
Extreme rainfall events have triggered a significant number of flash floods in Madeira Island throughout its past and recent history. Madeira is a volcanic island where the spatial rainfall distribution is strongly affected by its rugged topography. In this thesis, annual maxima of daily rainfall data from 25 rain gauge stations located in Madeira Island were modelled by the generalised extreme value distribution. The hypothesis of a Gumbel distribution was also tested by two methods, and the existence of a linear trend in the parameters of both distributions was analysed. Estimates for the 50- and 100-year return levels were also obtained. Still in a univariate context, the assumption that a distribution function belongs to the domain of attraction of an extreme value distribution was tested for monthly maximum rainfall data from the rainy season. The available data were then analysed in order to find the most suitable domain of attraction for the sampled distribution. In a different approach, a search for thresholds was also performed for daily rainfall values through a graphical analysis. In a multivariate context, the dependence between extreme rainfall values from the considered stations was studied based on Kendall's measure. This study suggests the influence of factors such as altitude, slope orientation, distance between stations and proximity to the sea on the spatial distribution of extreme rainfall. Groups of three pairwise-associated stations were also obtained, and an adjustment was made to a family of extreme value copulas involving the Marshall-Olkin family, whose parameters can be written as a function of Kendall's association measures of the obtained pairs.
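The univariate step described above, fitting a GEV to annual maxima and reading off 50- and 100-year return levels, can be sketched with scipy. The data below are synthetic stand-ins for one station's record, and note scipy's shape convention c = -xi.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Stand-in for one station's annual maxima of daily rainfall (mm).
annual_max = stats.genextreme.rvs(c=-0.1, loc=80, scale=25, size=60,
                                  random_state=rng)

# Fit the GEV by maximum likelihood (scipy parametrizes the shape as c = -xi).
c, loc, scale = stats.genextreme.fit(annual_max)

# The T-year return level is the quantile exceeded once every T years on average.
rl50 = stats.genextreme.ppf(1 - 1 / 50, c, loc, scale)
rl100 = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)
print(f"50-year level {rl50:.1f} mm, 100-year level {rl100:.1f} mm")
```

The Gumbel hypothesis mentioned in the abstract corresponds to c = 0; a likelihood-ratio or score test of that restriction is one of the two testing routes such analyses commonly take.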
Abstract:
OBJECTIVE: to identify risk factors for fetal macrosomia in a population of pregnant women with diabetes or daily hyperglycemia. METHODS: retrospective case-control study including 803 mother-newborn pairs from this specific population, distributed into two groups: macrosomic (cases, n=242) and non-macrosomic (controls, n=561). The variables compared were age, parity, weight and body mass index (BMI), weight gain (WG), history of diabetes, arterial hypertension and smoking, type and classification of diabetes, and indicators of glycemic control in the third trimester. Means were evaluated by the F-test, and categorical variables underwent univariate analysis using the chi-square test. Significant results were included in a multiple regression model to identify independent risk factors for macrosomia, considering the OR, 95% CI and p-value. A 5% statistical significance threshold (p<0.05) was set for all analyses. RESULTS: a significant association was observed between macrosomia and WG greater than 16 kg, BMI >=25 kg/m², personal and obstetric history (specifically of macrosomia), classification in the Rudge groups (IB and IIA + IIB), glycemic mean (GM) >120 mg/dL, and mean postprandial glycemia >130 mg/dL in the third trimester. In the multiple regression analysis, WG >16 kg (OR=1.79; 95% CI: 1.23-1.60), BMI >25 kg/m² (OR=1.83; 95% CI: 1.27-2.64), personal history of diabetes (OR=1.56; 95% CI: 1.05-2.31) and of macrosomia (OR=2.37; 95% CI: 1.60-3.50), and GM >120 mg/dL in the third trimester (OR=1.78; 95% CI: 1.13-2.80) were confirmed as independent risk factors for macrosomia in these high-risk pregnancies. CONCLUSION: WG above 16 kg, BMI of 25 kg/m² or more, GM above 120 mg/dL in the third trimester, and a personal history of diabetes or macrosomia were identified as risk factors for fetal macrosomia in pregnant women with diabetes or daily hyperglycemia.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Statistical analysis of data is crucial in cephalometric investigations. There are certainly excellent examples of good statistical practice in the field, but some articles published worldwide have carried out inappropriate analyses. Objective: The purpose of this study was to show that when the double records of each patient are traced on the same occasion, a control chart for differences between readings needs to be drawn, and limits of agreement and coefficients of repeatability must be calculated. Material and methods: Data from a well-known paper in Orthodontics were used to illustrate common statistical practices in cephalometric investigations and to propose a new technique of analysis. Results: A scatter plot of the two radiograph readings and the two model readings with the respective regression lines are shown. Also, a control chart for the mean of the differences between radiograph readings was obtained and a coefficient of repeatability was calculated. Conclusions: A standard error assuming that mean differences are zero, which is referred to in Orthodontics and Facial Orthopedics as the Dahlberg error, can be calculated only for estimating precision if accuracy is already proven. When double readings are collected, limits of agreement and coefficients of repeatability must be calculated. A graph with differences of readings should be presented and outliers discussed.
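The quantities the paper calls for can be computed directly from a set of double readings. The numbers below are made-up illustrations, and the repeatability coefficient uses one common definition (1.96 times the SD of the differences).

```python
import numpy as np

# Double readings of one cephalometric measurement (illustrative values, mm).
first  = np.array([23.1, 25.4, 22.8, 24.9, 26.2, 23.7, 25.0, 24.3])
second = np.array([23.4, 25.1, 23.0, 25.3, 26.0, 23.5, 25.4, 24.1])

d = first - second
mean_d, sd_d = d.mean(), d.std(ddof=1)

# Bland-Altman limits of agreement: mean difference +/- 1.96 SD.
loa = (mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d)

# Coefficient of repeatability (one common definition): 1.96 * SD of differences.
cr = 1.96 * sd_d

# Dahlberg error: sqrt(sum(d^2) / 2n). As the paper stresses, this is a valid
# precision measure only when the mean difference is effectively zero (no bias).
dahlberg = np.sqrt((d ** 2).sum() / (2 * len(d)))
print(f"bias {mean_d:+.3f}, LoA ({loa[0]:.3f}, {loa[1]:.3f}), "
      f"CR {cr:.3f}, Dahlberg {dahlberg:.3f}")
```

Plotting d against the reading means (with the limits of agreement as horizontal lines) gives the control-chart-style graph the paper recommends for spotting bias and outliers.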
Abstract:
The possibility of kaon condensation in high-density symmetric nuclear matter is investigated, including both s- and p-wave kaon-baryon interactions within relativistic mean-field (RMF) theory. Above a certain density, a collective K̄_S state appears carrying the same quantum numbers as the antikaon. The appearance of the K̄_S state is caused by the time component of the axial-vector interaction between kaons and baryons. It is shown that the system becomes unstable with respect to condensation of K-K̄_S pairs. We consider how the effective baryon masses affect the kaon self-energy coming from the time component of the axial-vector interaction. The role of the spatial component of the axial-vector interaction in the possible existence of the collective kaonic states is also discussed in connection with Λ-mixing effects in the ground state of high-density matter. Implications of K-K̄_S condensation for high-energy heavy-ion collisions are briefly mentioned.
Abstract:
Traditionally, an X̄ chart is used to control the process mean and an R chart to control the process variance. However, these charts are not sensitive to small changes in the process parameters. The adaptive X̄ and R charts might be considered if the aim is to detect small disturbances. Due to the statistical character of the joint X̄ and R charts with fixed or adaptive parameters, they are not reliable in identifying the nature of the disturbance: whether it shifts the process mean, increases the process variance, or leads to a combination of both effects. In practice, the speed with which control charts detect process changes may matter more than their ability to identify the nature of the change. Under these circumstances, it seems advantageous to consider a single chart, based on only one statistic, to monitor the process mean and variance simultaneously. In this paper, we propose the adaptive non-central chi-square statistic chart. This new chart is more effective than the adaptive X̄ and R charts in detecting disturbances that shift the process mean, increase the process variance, or lead to a combination of both effects.
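The core idea of a single statistic that reacts to both mean shifts and variance increases can be sketched as follows. This is a plausible reading of a non-central chi-square monitoring statistic, not the paper's exact chart design or its adaptive scheme: standardized observations are offset by a constant d before squaring, so the sum follows a non-central chi-square distribution in control and grows under either kind of disturbance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

mu0, sigma0, n, d = 10.0, 1.0, 5, 1.0   # in-control parameters, sample size, offset

def t_stat(sample):
    # Offsetting by d makes one sum of squares react to mean shifts in
    # either direction as well as to variance increases.
    return np.sum(((sample - mu0) / sigma0 + d) ** 2)

# In control, T ~ noncentral chi-square with n df and noncentrality n*d^2,
# which supplies the upper control limit.
ucl = stats.ncx2.ppf(0.995, df=n, nc=n * d * d)

in_control = t_stat(rng.normal(mu0, sigma0, n))
shifted  = np.mean([t_stat(rng.normal(mu0 + 1.5, sigma0, n)) for _ in range(200)])
inflated = np.mean([t_stat(rng.normal(mu0, 2.0 * sigma0, n)) for _ in range(200)])
print(f"UCL {ucl:.1f}, mean T after mean shift {shifted:.1f}, "
      f"after variance increase {inflated:.1f}")
```

The in-control expectation of T is n(1 + d²); both disturbance types push T above it, which is why one chart can replace the X̄/R pair when identification of the disturbance type is not required.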
Abstract:
A body of research has developed within the context of nonlinear signal and image processing that deals with the automatic, statistical design of digital window-based filters. Based on pairs of ideal and observed signals, a filter is designed in an effort to minimize the error between the ideal and filtered signals. The goodness of an optimal filter depends on the relation between the ideal and observed signals, but the goodness of a designed filter also depends on the amount of sample data from which it is designed. In order to lessen the design cost, a filter is often chosen from a given class of filters, thereby constraining the optimization and increasing the error of the optimal filter. To a great extent, the problem of filter design concerns striking the correct balance between the degree of constraint and the design cost. From a different perspective and in a different context, the problem of constraint versus sample size has been a major focus of study within the theory of pattern recognition. This paper discusses the design problem for nonlinear signal processing, shows how the issue naturally transitions into pattern recognition, and then provides a review of salient related pattern-recognition theory. In particular, it discusses classification rules, constrained classification, the Vapnik-Chervonenkis theory, and implications of that theory for morphological classifiers and neural networks. The paper closes by discussing some design approaches developed for nonlinear signal processing, and how the nature of these naturally leads to a decomposition of the error of a designed filter into a sum of the following components: the Bayes error of the unconstrained optimal filter, the cost of constraint, the cost of reducing complexity by compressing the original signal distribution, the design cost, and the contribution of prior knowledge to a decrease in the error.
The main purpose of the paper is to present fundamental principles of pattern recognition theory within the framework of active research in nonlinear signal processing.
Abstract:
Genomic sequence comparison across species has enabled the elucidation of important coding and regulatory sequences encoded within DNA. Of particular interest are the noncoding regulatory sequences, which influence gene transcriptional and posttranscriptional processes. A phylogenetic footprinting strategy was employed to identify noncoding conservation patterns of 39 human and bovine orthologous genes. Seventy-three conserved noncoding sequences were identified that shared greater than 70% identity over at least 100 bp. Thirteen of these conserved sequences were also identified in the mouse genome. Evolutionary conservation of noncoding sequences across diverse species may have functional significance, and these conserved sequences may be good candidates for regulatory elements.
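The conservation criterion stated above (at least 100 bp at greater than 70% identity) can be expressed as a simple sliding-window scan. `conserved_windows` is a hypothetical helper for illustration; a real phylogenetic footprinting analysis would operate on genome alignments rather than raw strings.

```python
import random

def conserved_windows(a, b, min_len=100, min_identity=0.70):
    """Report (start, identity) for aligned windows of min_len bases whose
    identity exceeds min_identity; assumes a and b are already aligned."""
    assert len(a) == len(b)
    hits = []
    for start in range(len(a) - min_len + 1):
        matches = sum(x == y for x, y in zip(a[start:start + min_len],
                                             b[start:start + min_len]))
        if matches / min_len > min_identity:
            hits.append((start, matches / min_len))
    return hits

# Toy data: an identical 120-bp core flanked by unrelated sequence, standing in
# for a conserved noncoding element shared by two orthologous loci.
random.seed(0)
core = ''.join(random.choice('ACGT') for _ in range(120))
human  = ''.join(random.choice('ACGT') for _ in range(50)) + core + 'A' * 30
bovine = ''.join(random.choice('ACGT') for _ in range(50)) + core + 'C' * 30
print(conserved_windows(human, bovine)[:3])
```

Overlapping qualifying windows would be merged into a single reported element in practice; the scan here only locates them.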