50 results for Following distance.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions following a fixed or dynamic set of rules to determine trading orders. It has grown to account for up to 70% of the trading volume in some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped into the so-called technical models and the latter into so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; more technically, if the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is on the other hand fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.
A model for forecasting any economic or financial magnitude can be properly defined with scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtest of the strategies mentioned. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving. This is why we emphasize the calibration of the strategies' parameters to adapt to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The data sources used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
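To illustrate the pairs-trading idea discussed above, the sketch below simulates an Ornstein-Uhlenbeck spread and derives market-neutral entry/exit signals from a rolling z-score. This is a minimal sketch, not the thesis's MATLAB code; the window length and threshold values are hypothetical choices.

```python
import numpy as np

def simulate_ou(theta, mu, sigma, s0, n, dt, seed=0):
    """Euler-Maruyama simulation of the Ornstein-Uhlenbeck spread
    dS = theta*(mu - S)*dt + sigma*dW."""
    rng = np.random.default_rng(seed)
    s = np.empty(n)
    s[0] = s0
    for i in range(1, n):
        ds = theta * (mu - s[i - 1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        s[i] = s[i - 1] + ds
    return s

def zscore_signals(spread, window=60, entry=2.0, exit=0.5):
    """Rolling z-score signals: +1 = long the spread, -1 = short, 0 = flat.
    Enter when the z-score leaves the entry band, hold until it reverts
    inside the exit band."""
    pos = np.zeros(len(spread), dtype=int)
    for i in range(window, len(spread)):
        w = spread[i - window:i]
        z = (spread[i] - w.mean()) / w.std()
        if pos[i - 1] == 0:
            if z > entry:
                pos[i] = -1   # spread unusually high: short it
            elif z < -entry:
                pos[i] = 1    # spread unusually low: long it
        elif abs(z) > exit:
            pos[i] = pos[i - 1]  # hold the position until mean reversion
    return pos
```

In a real pairs trade the spread would be built from two co-integrated price series rather than simulated, but the signal logic is the same.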
Abstract:
One of the criticisms leveled at the dispersed-city model found all over the world is its unarticulated, random, and undifferentiated nature. To test this idea in the Barcelona Metropolitan Region, we estimated the impact of urban spatial structure (CBD, subcenters, and transportation infrastructures) on population density and commuting distance. The results are unfavorable to the hypothesis of the increasing destructuring of cities, given that the explanatory capacity of both functions improves over time, both with and without other control variables included.
Abstract:
Background and purpose: Individual rupture risk assessment of intracranial aneurysms is a major issue in the clinical management of asymptomatic aneurysms. Aneurysm rupture occurs when wall tension exceeds the strength limit of the wall tissue. At present, aneurysmal wall mechanics are poorly understood and, consequently, risk assessment involving mechanical properties is nonexistent. Computational hemodynamics studies of aneurysms assume rigid walls, an arguable simplification. We therefore aim to assess the mechanical properties of ruptured and unruptured intracranial aneurysms in order to provide a foundation for future patient-specific aneurysmal risk assessment. This work also challenges some currently held hypotheses in computational flow hemodynamics research. Methods: A specific conservation protocol was applied to aneurysmal tissues following clipping and resection in order to preserve their mechanical properties. Sixteen intracranial aneurysms (11 female, 5 male) underwent mechanical uniaxial stress tests under physiological conditions of temperature and isotonic saline solution. These comprised 11 unruptured and 5 ruptured aneurysms. Stress/strain curves were then obtained for each sample, and a fitting algorithm was applied following a three-parameter (C(10), C(01), C(11)) Mooney-Rivlin hyperelastic model. Each aneurysm was classified according to its biomechanical properties and (un)rupture status. Results: Tissue testing demonstrated three main tissue classes: Soft, Rigid, and Intermediate. All unruptured aneurysms presented more rigid tissue than ruptured or pre-ruptured aneurysms within each gender subgroup. Wall thickness was not correlated with aneurysmal status (ruptured/unruptured). An Intermediate subgroup of unruptured aneurysms with softer tissue characteristics was identified and correlated with multiple documented risk factors of rupture.
Conclusion: There is a significant difference in biomechanical properties between ruptured aneurysms, presenting soft tissue, and unruptured aneurysms, presenting rigid material. This finding strongly supports the idea that an assessment based on biomechanical risk factors should be used to improve therapeutic decision making.
Abstract:
This study investigates the development of fluency in 30 advanced L2 learners of English over a period of 15 months. In order to measure fluency, several temporal variables and hesitation phenomena are analyzed and compared. Oral competence is assessed by means of an oral interview carried out by the learners. Data collection takes place at three different times: before (T1) and after (T2) a six-month period of formal instruction (FI, 80 hours) at the home university, and after a three-month study-abroad (SA) term (T3). The data are analyzed quantitatively. Developmental gains in fluency are measured for the whole period, adopting a view of complementarity between the two learning contexts. From these results, a group of high-fluency speakers is identified. Correlations between fluency gains and individual and contextual variables are computed, and a more qualitative analysis is performed on the high-fluency speakers' performance and behavior. Results show an overall development of students' oral fluency over the 15-month period, favored by the combination of a period of FI at home followed by a three-month SA.
Abstract:
A new direction of research in Competitive Location theory incorporates theories of Consumer Choice Behavior in its models. Following this direction, this paper studies the importance of consumer behavior with respect to distance or transportation costs in the optimality of locations obtained by traditional Competitive Location models. To do this, it considers different ways of defining a key parameter in the basic Maximum Capture model (MAXCAP). This parameter reflects various ways of taking distance into account, based on several Consumer Choice Behavior theories. The optimal locations, and the deviation in captured demand when the optimal locations of the other models are used instead of the true ones, are computed for each model. A metaheuristic based on GRASP and Tabu search procedures is presented to solve all the models. Computational experience and an application to a 55-node network are also presented.
Abstract:
A new parametric minimum-distance time-domain estimator for ARFIMA processes is introduced in this paper. The proposed estimator minimizes the sum of squared autocorrelations of the residuals obtained after filtering a series through the ARFIMA parameters. The estimator is easy to compute and is consistent and asymptotically normally distributed for fractionally integrated (FI) processes with an integration order d strictly greater than -0.75. It can therefore be applied to both stationary and non-stationary processes. Deterministic components are also allowed in the DGP. Furthermore, as a by-product, the estimation procedure provides an immediate check on the adequacy of the specified model, because the criterion function, evaluated at the estimated values, coincides with the Box-Pierce goodness-of-fit statistic. Empirical applications and Monte Carlo simulations supporting the analytical results and showing the good performance of the estimator in finite samples are also provided.
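For intuition, the sketch below implements the core of such a criterion for the simplest fractionally integrated case, ARFIMA(0, d, 0): filter the series with the fractional-difference operator (1-L)^d expanded in its binomial weights, then sum the squared residual autocorrelations, a Box-Pierce-type quantity. This is an illustrative reconstruction under stated assumptions, not the paper's estimator; the `n_lags` choice is hypothetical.

```python
import numpy as np

def frac_diff(x, d):
    """Apply the fractional-difference filter (1-L)^d using the binomial
    expansion w_0 = 1, w_j = w_{j-1} * (j - 1 - d) / j."""
    n = len(x)
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (j - 1 - d) / j
    out = np.empty(n)
    for t in range(n):
        m = t + 1
        out[t] = np.dot(w[:m], x[t::-1])  # convolve weights with past values
    return out

def md_criterion(d, x, n_lags=20):
    """Box-Pierce-type minimum-distance criterion: n times the sum of
    squared autocorrelations of the fractionally differenced residuals."""
    e = frac_diff(x, d)
    e = e - e.mean()
    denom = np.dot(e, e)
    acf = np.array([np.dot(e[l:], e[:-l]) / denom for l in range(1, n_lags + 1)])
    return len(e) * np.sum(acf ** 2)
```

The estimate of d would then be obtained by minimizing `md_criterion` over d, for example by grid search or a scalar optimizer, and the criterion value at the minimum doubles as a goodness-of-fit check.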
Abstract:
The set covering problem is an NP-hard combinatorial optimization problem that arises in applications ranging from crew scheduling in airlines to driver scheduling in public mass transport. In this paper we analyze the search-space characteristics of a widely used set of benchmark instances through an analysis of the fitness-distance correlation. This analysis shows that there exist several classes of set covering instances with largely different behavior. For instances with high fitness-distance correlation, we propose new ways of generating core problems and analyze the performance of algorithms exploiting these core problems.
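The fitness-distance correlation used in this kind of landscape analysis is the Pearson correlation between solution costs and their distances to a best-known solution; for binary set-covering solutions, Hamming distance is a natural choice. A minimal sketch, not the paper's exact experimental setup:

```python
import numpy as np

def fitness_distance_correlation(fitnesses, distances):
    """Pearson correlation between solution cost and distance to the
    best-known solution. On a minimization problem, a high positive value
    suggests the landscape guides local search toward the optimum."""
    f = np.asarray(fitnesses, dtype=float) - np.mean(fitnesses)
    d = np.asarray(distances, dtype=float) - np.mean(distances)
    return float(np.dot(f, d) / (np.linalg.norm(f) * np.linalg.norm(d)))

def hamming(a, b):
    """Hamming distance between two binary incidence vectors of selected sets."""
    return int(np.sum(np.asarray(a) != np.asarray(b)))
```

In practice one would sample many local optima, compute each one's cost and Hamming distance to the best-known solution, and feed those two vectors to `fitness_distance_correlation`.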
Abstract:
Distance-based regression is a prediction method consisting of two steps: from the distances between observations we obtain latent variables, which then become the regressors in an ordinary least squares linear model. The distances are computed from the original predictors using a suitable dissimilarity function. Since, in general, the regressors are related nonlinearly to the response, they cannot be selected with the usual F test. In this work we propose a solution to this predictor-selection problem by defining generalized test statistics and adapting a nonparametric bootstrap method to estimate their p-values. A numerical example with automobile insurance data is included.
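The two-step procedure can be sketched as follows: classical multidimensional scaling turns a dissimilarity matrix into latent coordinates, which then serve as OLS regressors. This is a minimal illustration assuming Euclidean dissimilarities; the generalized tests and bootstrap selection are not reproduced, and `n_latent` is a hypothetical choice.

```python
import numpy as np

def db_regression_fit(X, y, n_latent=2):
    """Two-step distance-based regression sketch:
    (1) squared Euclidean dissimilarities -> latent coordinates via
        classical multidimensional scaling (double-centred distances);
    (2) ordinary least squares of y on those latent regressors."""
    n = len(X)
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)  # squared distances
    J = np.eye(n) - np.ones((n, n)) / n                        # centring matrix
    B = -0.5 * J @ sq @ J                                      # Gram matrix of centred scores
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:n_latent]                    # leading eigenpairs
    Z = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))     # latent coordinates
    Z1 = np.column_stack([np.ones(n), Z])                      # add intercept
    beta, *_ = np.linalg.lstsq(Z1, y, rcond=None)
    return Z1 @ beta                                           # fitted values
```

With Euclidean distances and enough latent dimensions this reduces to ordinary linear regression on the original predictors; the interest of the method lies in swapping in a non-Euclidean dissimilarity suited to mixed or categorical predictors.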
Abstract:
Background: There is little information about the effect of infliximab on the clinical course of liver disease in Crohn's disease patients with concomitant hepatitis B virus (HBV) infection. Theoretically, the immunosuppression induced by infliximab will facilitate viral replication, which could be followed by a flare or exacerbation of disease when therapy is discontinued. There are no specific recommendations on surveillance and treatment of HBV before infliximab infusion. Two cases of severe hepatic failure related to infliximab infusions have been described in patients with rheumatic diseases. Patients and methods: Hepatitis markers (B and C) and liver function tests were prospectively determined in 80 Crohn's disease patients requiring infliximab infusion in three hospitals in Spain. Results: Three Crohn's disease patients with chronic HBV infection were identified. Two of the three suffered severe reactivation of chronic hepatitis B after withdrawal of infliximab therapy, and one died. The third patient, who was treated with lamivudine at the time of infliximab therapy, had no clinical or biochemical worsening of liver disease during or after therapy. Of the remaining patients, six had received the hepatitis B vaccine. Three patients had antibodies to both hepatitis B surface antigen (anti-HBs) and hepatitis B core protein (anti-HBc) with normal aminotransferase levels, and one patient had positive anti-hepatitis C virus (HCV) antibodies, negative HCV RNA, and normal aminotransferase levels. Except for the patients with chronic HBV infection, no significant changes in hepatic function were detected. Conclusions: Patients with Crohn's disease who are candidates for infliximab therapy should be tested for hepatitis B serological markers before treatment and, if positive, considered for antiviral prophylaxis against reactivation.
Abstract:
The observation of coherent tunnelling in Cu2+- and Ag2+-doped MgO and in CaO:Cu2+ was a crucial discovery in the realm of the Jahn-Teller (JT) effect. The main reasons favoring this dynamic behavior are now clarified through ab initio calculations on Cu2+- and Ag2+-doped cubic oxides. Small JT distortions and an unexpectedly low anharmonicity of the e_g JT mode are behind energy barriers smaller than 25 cm^-1 derived through CASPT2 calculations for Cu2+- and Ag2+-doped MgO and CaO:Cu2+. The low anharmonicity is shown to come from a strong vibrational coupling of the MO6^10- units (M = Cu, Ag) to the host lattice. The average distance between the d^9 impurity and its ligands is found to vary significantly on passing from MgO to SrO, following to a good extent the lattice parameter.
Abstract:
The prediction of rockfall travel distance below a rock cliff is an indispensable activity in rockfall susceptibility, hazard, and risk assessment. Although the size of the detached rock mass may differ considerably at each specific rock cliff, small rockfalls (<100 m3) are the most frequent process. Empirical models may provide suitable information for predicting the travel distance of small rockfalls over an extensive area at a medium scale (1:100 000-1:25 000). "Solà d'Andorra la Vella" is a rocky slope located close to the town of Andorra la Vella, where the government has been documenting rockfalls since 1999. This documentation consists of mapping the release point and the individual fallen blocks immediately after each event. Documentation of historical rockfalls through morphological analysis, eyewitness accounts, and historical images serves to increase the available information. In total, data from twenty small rockfalls have been gathered, comprising about one hundred individual fallen rock blocks. The data acquired have been used to check the reliability of the main empirical models widely adopted (reach and shadow angle models) and to analyse the influence of the parameters affecting travel distance (rockfall size, height of fall along the rock cliff, and volume of the individual fallen rock block). For predicting travel distances on medium-scale maps, a method based on the "reach probability" concept has been proposed. The accuracy of the results has been tested against the line enclosing the farthest fallen boulders, which represents the maximum travel distance of past rockfalls. The paper concludes with a discussion of the application of both empirical models to other study areas.
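Both empirical models reduce to a simple geometric rule: a minimum empirical angle, measured from the release point (reach angle) or from the cliff base (shadow angle) to the stopping point, bounds how far a block can travel, so the maximal horizontal reach is L = H / tan(angle). A minimal sketch; the 28-degree angle in the comment is a hypothetical value, not a figure from the study.

```python
import math

def max_horizontal_reach(height_m, angle_deg):
    """Horizontal travel distance bounded by the reach/shadow-angle rule:
    L = H / tan(alpha), where H is the height of the release point (reach
    angle) or of the cliff base (shadow angle) above the stopping point."""
    return height_m / math.tan(math.radians(angle_deg))

# e.g. a 50 m fall with a hypothetical 28-degree shadow angle:
# max_horizontal_reach(50.0, 28.0) -> roughly 94 m of horizontal reach
```

A shallower empirical angle implies a longer reach, which is why calibrating the angle against the farthest observed boulders is central to the method.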
Abstract:
Aim: The aim of this study was to assess quality of life (QoL) and degree of satisfaction among outpatients subjected to surgical extraction of all four third molars under conscious sedation. A second objective was to describe the evolution of self-reported pain, measured on a visual analogue scale (VAS), over the 7 days after extraction. Study design: Fifty patients received a questionnaire assessing social isolation, working isolation, eating and speaking ability, diet modifications, sleep impairment, changes in physical appearance, discomfort at suture removal, and overall satisfaction at days 4 and 7 after surgery. Pain was recorded by patients on a 100-mm VAS every day after extraction until day 7. Results: Thirty-nine patients correctly completed the questionnaire. Postoperative pain values showed small fluctuations until day 5 (range: 23 to 33 mm on the 100-mm VAS), when they decreased significantly. A positive association was observed between surgeries ranked as difficult and higher postoperative pain levels. The average number of days for which patients stopped working was 4.9. Conclusion: The removal of all four third molars in a single appointment causes an important deterioration of the patient's QoL during the first postoperative week, especially due to local pain and eating discomfort.
Abstract:
Finding an adequate paraphrase representation formalism is a challenging issue in Natural Language Processing. In this paper, we analyse the performance of Tree Edit Distance as a paraphrase representation baseline. Our experiments using the Edit Distance Textual Entailment Suite show that, since Tree Edit Distance is a purely syntactic approach, paraphrase alternations not based on structural reorganizations do not find an adequate representation. They also show that there is much scope for better modelling of the way trees are aligned.
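For readers unfamiliar with the measure, ordered tree edit distance counts the minimum number of node relabellings, deletions, and insertions needed to turn one tree into another. The sketch below is a plain unit-cost implementation via the classic forest recursion (exponential, but fine for small trees); it is a generic illustration, not the configuration used with the Edit Distance Textual Entailment Suite.

```python
def size(forest):
    """Number of nodes in a forest of (label, children) trees."""
    return sum(1 + size(children) for _, children in forest)

def ted(f1, f2):
    """Exact ordered tree edit distance with unit costs, computed by the
    classic forest recursion: delete the last root, insert the last root,
    or match the two roots and recurse on their child forests."""
    f1, f2 = list(f1), list(f2)
    if not f1:
        return size(f2)          # insert everything remaining in f2
    if not f2:
        return size(f1)          # delete everything remaining in f1
    (l1, c1), (l2, c2) = f1[-1], f2[-1]
    return min(
        ted(f1[:-1] + c1, f2) + 1,                        # delete root of last tree in f1
        ted(f1, f2[:-1] + c2) + 1,                        # insert root of last tree in f2
        ted(f1[:-1], f2[:-1]) + ted(c1, c2) + (l1 != l2)  # match the two roots
    )
```

Trees are nested `(label, [children])` tuples and a forest is a list of trees; production implementations memoize over subforests (as in the Zhang-Shasha algorithm) to make this polynomial.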