67 results for Point of view


Relevance: 100.00%

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions following a fixed or dynamic set of rules to determine trading orders. It has grown to account for up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the proprietary nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modelling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics, all of which are necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behaviour: trend following and mean reversion. The former is grouped in the so-called technical models and the latter in the so-called pairs trading. Technical models have been dismissed by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time, that is, if we can tell whether the signal has occurred by examining the information up to the current time only; more technically, if the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is on the other hand fairly simple, yet it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to formulations involving stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting equation and its variations. A model for forecasting any economic or financial magnitude may be properly defined with scientific rigour and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtest of the strategies mentioned. Conducting a useful and realistic backtest is by no means a trivial exercise, since the “laws” that govern financial markets are constantly evolving in time. For this reason we emphasize the calibration of the strategies' parameters to adapt them to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts in market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The data sources used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
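
As an illustration of the mean-reversion logic behind pairs trading discussed above, the following minimal Python sketch simulates an Ornstein-Uhlenbeck spread and derives entry/exit signals from its z-score. It is a toy example, not the thesis's MATLAB implementation; the parameter values and thresholds are arbitrary choices made for the illustration.

    import numpy as np

    # Simulate an Ornstein-Uhlenbeck spread dX = theta*(mu - X) dt + sigma dW
    # with an Euler-Maruyama scheme, then trade its z-score mean-reversion.
    rng = np.random.default_rng(0)
    theta, mu, sigma, dt, n = 2.0, 0.0, 0.3, 1.0 / 252, 2520
    x = np.empty(n)
    x[0] = mu
    for t in range(1, n):
        x[t] = (x[t - 1] + theta * (mu - x[t - 1]) * dt
                + sigma * np.sqrt(dt) * rng.standard_normal())

    z = (x - x.mean()) / x.std()   # full-sample z-score (illustration only)
    entry, exit_ = 2.0, 0.5        # arbitrary entry/exit thresholds
    position = np.zeros(n)         # +1 long the spread, -1 short the spread
    for t in range(1, n):
        if position[t - 1] == 0:
            position[t] = -1 if z[t] > entry else (1 if z[t] < -entry else 0)
        else:
            position[t] = 0 if abs(z[t]) < exit_ else position[t - 1]

    pnl = np.sum(position[:-1] * np.diff(x))  # naive P&L on the spread
    print(f"Naive P&L on the simulated spread: {pnl:.3f}")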

Relevance: 100.00%

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as “a trace too small to measure”, it seems reasonable to replace it by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts, and thus the metric properties, should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is “natural” in the sense that it recovers the “true” composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
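
A minimal sketch of the multiplicative replacement idea for rounded zeros, assuming compositions closed to a total c and a user-chosen small value delta for each zero part (illustrative only; see Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003) for the exact formulation and its properties):

    import numpy as np

    def multiplicative_replacement(x, delta, c=1.0):
        """Replace rounded zeros in a composition x (closed to total c).

        Zero parts receive the small value delta; non-zero parts are rescaled
        multiplicatively, so the total c and the ratios between non-zero
        parts are preserved.
        """
        x = np.asarray(x, dtype=float)
        zeros = x == 0
        r = x.copy()
        r[zeros] = delta
        r[~zeros] = x[~zeros] * (1.0 - delta * zeros.sum() / c)
        return r

    # Example: a 4-part composition with one rounded zero
    print(multiplicative_replacement([0.55, 0.30, 0.15, 0.0], delta=0.005))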

Relevance: 100.00%

Abstract:

The identification of compositional changes in the fumarolic gases of active and quiescent volcanoes is one of the most important targets of monitoring programmes. From a general point of view, many systematic (often cyclic) and random processes control the chemistry of gas discharges, making it difficult to produce a convincing mathematical-statistical modelling. Changes in the chemical composition of volcanic gases sampled at Vulcano Island (Aeolian Arc, Sicily, Italy) from eight different fumaroles located in the northern sector of the summit crater (La Fossa) have been analysed by considering their dependence on time over the period 2000-2007. Each intermediate chemical composition has been considered as potentially derived from the contributions of the two temporal extremes, represented by the 2000 and 2007 samples respectively, by using inverse modelling methodologies for compositional data. Data pertaining to fumaroles F5 and F27, located on the rim and in the inner part of the La Fossa crater respectively, have been used to achieve the proposed aim. The statistical approach has allowed us to highlight the presence of random and non-random fluctuations, features that are useful for understanding how the volcanic system works, opening new perspectives in sampling strategies and in the evaluation of the natural risk related to a quiescent volcano.
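
As a schematic illustration of treating an intermediate composition as a contribution of the two temporal end-members, the sketch below fits a single mixing fraction by least squares on closed proportions. The compositions are invented, and the procedure is only a simplified stand-in for the inverse modelling methodologies for compositional data used in the study.

    import numpy as np

    def mixing_fraction(x, a, b):
        """Least-squares estimate of lam in x ≈ lam*a + (1-lam)*b,
        where a and b are end-member compositions and x is an observed one."""
        x, a, b = (np.asarray(v, dtype=float) for v in (x, a, b))
        d = a - b
        lam = np.dot(x - b, d) / np.dot(d, d)
        return float(np.clip(lam, 0.0, 1.0))

    # Made-up 3-part gas compositions (closed to 1), purely for illustration
    endmember_2000 = np.array([0.80, 0.15, 0.05])
    endmember_2007 = np.array([0.60, 0.30, 0.10])
    observed       = np.array([0.70, 0.22, 0.08])
    print(mixing_fraction(observed, endmember_2000, endmember_2007))  # about 0.5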

Relevance: 100.00%

Abstract:

In this work a detailed investigation of the exohedral reactivity of the most important and abundant endohedral metallofullerenes (EMFs) is provided, namely Sc3N@Ih-C80 and its D5h counterpart Sc3N@D5h-C80, together with the (bio)chemically relevant lutetium- and gadolinium-based analogues of the M3N@Ih/D5h-C80 series (M = Sc, Lu, Gd). In particular, we analyze the thermodynamics and kinetics of the Diels–Alder cycloaddition of s-cis-1,3-butadiene on all the different bonds of the Ih-C80 and D5h-C80 cages and their endohedral derivatives. First, we discuss the thermodynamic and kinetic aspects of the cycloaddition reaction on the hollow fullerenes and the two isomers of Sc3N@C80. Afterwards, the effect of the nature of the metal nitride is analyzed in detail. In general, our BP86/TZP//BP86/DZP calculations indicate that [5,6] bonds are more reactive than [6,6] bonds for the two isomers. The [5,6] bond D5h-b, which is the most similar to the unique [5,6] bond type in the icosahedral cage, Ih-a, is the most reactive bond in M3N@D5h-C80 regardless of M. Sc3N@C80 and Lu3N@C80 give similar results; the regioselectivity is, however, significantly reduced for the larger and more electropositive M = Gd, as previously found in similar metallofullerenes. Calculations also show that the D5h isomer is more reactive from the kinetic point of view than the Ih one in all cases, which is in good agreement with experiments.
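
For reference, the kinetic preference quoted above follows from the computed Gibbs activation barriers through the standard transition-state-theory relation (a textbook expression, not specific to this paper); for two competing addition sites 1 and 2,

    \frac{k_1}{k_2} \;=\; \exp\!\left(-\,\frac{\Delta G^{\ddagger}_1-\Delta G^{\ddagger}_2}{RT}\right),

so a difference of about 1.4 kcal/mol in the activation barriers at 298 K already corresponds to roughly a factor of ten between the rate constants, i.e. a strong regioselectivity.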

Relevance: 100.00%

Abstract:

Why does gravity not make drops slip down inclined surfaces such as plant leaves? The current explanation is based on the existence of surface inhomogeneities, which cause a sustaining force that pins the contact line. According to this theory, the drop remains in equilibrium until a critical value of the sustaining force is reached. We propose an alternative analysis, from the point of view of energy balance, for the particular case in which the drop leaves a liquid film behind. The critical angle of the inclined surface at which the drop slips down is predicted. This result does not depend explicitly on surface inhomogeneities, but only on the drop size and the surface tensions. There is good agreement with experiments for contact angles below 90°, where the formation of the film is expected, whereas for greater contact angles large discrepancies arise.
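
One schematic way to see how an energy balance can yield a critical angle that depends only on drop size and surface tensions (a rough reconstruction under standard wetting assumptions, not necessarily the authors' exact expression): when a drop of volume V with a rear contact line of width w moves a distance dx down the slope, gravity releases rho*V*g*sin(alpha)*dx, while leaving a film behind replaces solid-vapour interface by solid-liquid plus liquid-vapour interface at a cost per unit area of gamma_SL + gamma - gamma_SV = gamma*(1 - cos theta) by Young's relation. Balancing the two terms gives

    \rho V g \sin\alpha_c \;\approx\; w\,\gamma\,(1-\cos\theta).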

Relevance: 100.00%

Abstract:

This work presents an application of multilevel analysis techniques to the study of abstention in the 2000 Spanish general election. The interest of the study is both substantive and methodological. From the substantive point of view, the article intends to explain the causes of abstention and to analyze the impact of associationism on it. From the methodological point of view, it is intended to analyze the interaction between individual and context with a modelling that takes into account the hierarchical structure of the data. The multilevel study in this paper validates the single-level results obtained in previous analyses of abstention and shows that only a fraction of the differences in abstention is explained by the individual characteristics of the electors. Another important fraction of these differences is due to the political and social characteristics of the context. Regarding associationism, the data suggest that individual participation in associations decreases the probability of abstention. However, better indicators are needed in order to capture more properly the effect of associationism on electoral behaviour.
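
A schematic form of the hierarchical specification described above is a two-level logistic model with electors i nested in contexts j (the covariates are placeholders, not the article's actual indicators):

    \operatorname{logit}\Pr(\text{abstain}_{ij}=1) \;=\; \beta_0 + \beta_1 x_{ij} + \gamma_1 z_j + u_j,
    \qquad u_j \sim N(0,\sigma_u^2),

where x_{ij} collects individual characteristics (e.g. membership in associations), z_j contextual characteristics, and the random intercept u_j captures the between-context variation in abstention.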

Relevance: 100.00%

Abstract:

The generalization of simple correspondence analysis, for two categorical variables, to multiple correspondence analysis, where there may be three or more variables, is not straightforward, from both a mathematical and a computational point of view. In this paper we detail the exact computational steps involved in performing a multiple correspondence analysis, including the special aspects of adjusting the principal inertias to correct the percentages of inertia, supplementary points, and subset analysis. Furthermore, we give the algorithm for joint correspondence analysis, where the cross-tabulations of all unique pairs of variables are analysed jointly. The code in the R language for every step of the computations is given, as well as the results of each computation.
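
The paper gives R code for every step; as a language-neutral illustration of the core computation (correspondence analysis applied to the indicator matrix), here is a minimal NumPy sketch that omits the inertia adjustment, supplementary points, and subset analysis treated in the paper:

    import numpy as np

    def correspondence_analysis(N):
        """Basic correspondence analysis of a nonnegative matrix N
        (for MCA, N is the cases-by-categories indicator matrix)."""
        P = N / N.sum()                                     # correspondence matrix
        r, c = P.sum(axis=1), P.sum(axis=0)                 # row and column masses
        S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
        U, s, Vt = np.linalg.svd(S, full_matrices=False)
        rows = (U * s) / np.sqrt(r)[:, None]                # principal row coordinates
        cols = (Vt.T * s) / np.sqrt(c)[:, None]             # principal column coordinates
        return rows, cols, s**2                             # principal inertias

    # Toy example: 5 cases, 2 categorical variables dummy-coded into an indicator matrix
    Z = np.array([[1, 0, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 1, 1, 0],
                  [1, 0, 0, 1]], dtype=float)
    rows, cols, inertias = correspondence_analysis(Z)
    print(np.round(inertias, 4))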

Relevance: 100.00%

Abstract:

The influence of the basis set size and the correlation energy on the static electrical properties of the CO molecule is assessed. In particular, we have studied both the nuclear relaxation and the vibrational contributions to the static molecular electrical properties, the vibrational Stark effect (VSE) and the vibrational intensity effect (VIE). From a mathematical point of view, when a static and uniform electric field is applied to a molecule, the energy of this system can be expressed in terms of a double power series with respect to the bond length and the field strength. From the power series expansion of the potential energy, field-dependent expressions for the equilibrium geometry, the potential energy, and the force constant are obtained. The nuclear relaxation and vibrational contributions to the molecular electrical properties are analyzed in terms of the derivatives of the electronic molecular properties. In general, the results presented show that accurate inclusion of the correlation energy and large basis sets are needed to calculate the molecular electrical properties and their derivatives with respect to nuclear displacements and/or field strength. With respect to experimental data, the calculated power series coefficients are overestimated by the SCF, CISD, and QCISD methods; on the contrary, the perturbation methods (MP2 and MP4) tend to underestimate them. On average, using the 6-311+G(3df) basis set for the CO molecule, the nuclear relaxation and vibrational contributions to the molecular electrical properties amount to 11.7%, 3.3%, and 69.7% of the purely electronic μ, α, and β values, respectively.
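
Schematically, the double power series referred to above can be written as (standard notation, with R_e the field-free equilibrium bond length and F the field strength)

    E(R,F) \;=\; \sum_{i,j}\frac{1}{i!\,j!}
    \left.\frac{\partial^{\,i+j}E}{\partial R^{\,i}\,\partial F^{\,j}}\right|_{R_e,\,F=0}(R-R_e)^{i}F^{j},

and the field derivatives give the electronic electrical properties, e.g. \mu = -\partial E/\partial F, \alpha = -\partial^{2}E/\partial F^{2}, \beta = -\partial^{3}E/\partial F^{3}, evaluated at F = 0.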

Relevance: 100.00%

Abstract:

The interest in solar ultraviolet (UV) radiation from the scientific community and the general population has risen significantly in recent years because of the link between increased UV levels at the Earth's surface and depletion of ozone in the stratosphere. As a consequence of recent research, UV radiation climatologies have been developed, and the effects of some atmospheric constituents (such as ozone or aerosols) have been studied broadly. Correspondingly, there are well-established relationships between, for example, total ozone column and UV radiation levels at the Earth's surface. The effects of clouds, however, are not so well described, given the intrinsic difficulties in properly describing cloud characteristics. Nevertheless, the effect of clouds cannot be neglected, and the variability that clouds induce in UV radiation is particularly significant when short timescales are involved. In this review we present, summarize, and compare several works that deal with the effect of clouds on UV radiation. Specifically, the works reviewed here approach the issue from the empirical point of view: some relationship between measured UV radiation in cloudy conditions and cloud-related information is given in each work. Basically, there are two groups of methods: techniques based on observations of cloudiness (either from human observers or by using devices such as sky cameras) and techniques that use measurements of broadband solar radiation as a surrogate for cloud observations. Some techniques combine both types of information. Comparison of results from different works is addressed by using the cloud modification factor (CMF), defined as the ratio between the measured UV radiation in a cloudy sky and the calculated radiation for a cloudless sky. Typical CMF values for overcast skies range from 0.3 to 0.7, depending on both cloud type and cloud characteristics. Despite this large dispersion of values corresponding to the same cloud cover, it is clear that the cloud effect on UV radiation is 15–45% lower than the cloud effect on total solar radiation. The cloud effect is usually a reducing effect, but a significant number of works report an enhancement effect (that is, increased UV radiation levels at the surface) due to the presence of clouds. The review concludes with some recommendations for future studies aimed at further analyzing the effects of clouds on UV radiation.
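
In symbols, the cloud modification factor used for these comparisons is

    \mathrm{CMF} \;=\; \frac{\mathrm{UV}_{\text{measured, cloudy sky}}}{\mathrm{UV}_{\text{modelled, cloudless sky}}},

so CMF < 1 indicates attenuation by clouds and CMF > 1 the enhancement effect mentioned above.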

Relevance: 100.00%

Abstract:

In this note we give a numerical characterization of hypersurface singularities in terms of the normalized Hilbert-Samuel coefficients, and we interpret this result from the point of view of rigid polynomials.

Relevance: 100.00%

Abstract:

The 10 June 2000 event was the largest flash flood event that occurred in the Northeast of Spain in the late 20th century, both as regards its meteorological features and its considerable social impact. This paper focuses on the analysis of the structures that produced the heavy rainfall, especially from the point of view of meteorological radar. Because this case is a good example of a Mediterranean flash flood event, a final objective of this paper is to provide a description of the evolution of the rainfall structure that is sufficiently clear to be understood by an interdisciplinary forum. It could then be useful not only for improving conceptual meteorological models, but also for application in downscaling models. The main precipitation structure was a Mesoscale Convective System (MCS) that crossed the region and developed as a consequence of the merging of two previous squall lines. The paper analyses the main meteorological features that led to the development and triggering of the heavy rainfall, with special emphasis on the features of this MCS, its life cycle, and its dynamics. To this end, 2-D and 3-D algorithms were applied to the imagery recorded over the complete life cycle of the structures, which lasted approximately 18 h. Mesoscale and synoptic information were also considered. Results show that it was an NS-MCS, quasi-stationary during its mature stage as a consequence of the formation of a convective train, the different displacement directions of the 2-D and 3-D structures, including the propagation of new cells, and the slow movement of the convergence line associated with the Mediterranean mesoscale low.

Relevance: 100.00%

Abstract:

Most optimistic views, based on the Optimum Currency Areas (OCA) literature, have concluded that the probability of asymmetric shocks occurring at a national level will tend to diminish in the Economic and Monetary Union (EMU) as a result of the intensification of the integration process in recent years. However, since Economic Geography theories predict a higher specialisation of regions, it is expected that asymmetric shocks will increase. Previous studies have examined to what extent asymmetric shocks have been relevant in the past using, mainly, static measures of asymmetries, such as the correlation coefficients between series of shocks previously calculated from a structural VAR model (Bayoumi and Eichengreen, 1992). In this paper, we study the evolution of manufacturing-specific asymmetries in Europe from a dynamic point of view (applying the model proposed by Haldane and Hall, 1991) in order to obtain new evidence about the potential risks of EMU.
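
Schematically, the dynamic measure follows the time-varying-coefficient logic of the Kalman-filter approach of Haldane and Hall (1991); a generic state-space sketch with placeholder variables, rather than the paper's exact specification, is

    y_t \;=\; \alpha_t + \beta_t x_t + \varepsilon_t, \qquad \beta_t \;=\; \beta_{t-1} + \eta_t,

so that the filtered path of \beta_t traces how the degree of asymmetry evolves over time, instead of being summarized by a single correlation coefficient as in the static approach.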

Relevance: 100.00%

Abstract:

The turn-on process of a multimode VCSEL is investigated from a statistical point of view. Special attention is paid to quantities such as time jitter and bit error rate. The single-mode performance of VCSELs during current modulation is compared to that of edge-emitting lasers.

Relevance: 100.00%

Abstract:

The aim of this article is to trace the centuries-old Western misogynist tradition back to its origins in Greece by analysing a text by Philo of Alexandria, the allegorical interpreter of the Bible: his De opificio mundi, in which the biblical account is on many occasions read from a Platonic point of view. A careful analysis of the chapters devoted to the creation of woman by God shows to what extent this text cannot be understood without taking into account a Greek philosophical tradition that was already centuries old.