63 results for SUN: FUNDAMENTAL PARAMETERS


Relevance:

20.00%

Publisher:

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions according to a fixed or dynamic set of rules that determine trading orders. It now accounts for up to 70% of the trading volume on some of the largest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped into the so-called technical models and the latter into so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signals they generate pass the test of being Markov times. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; more technically, the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, on the other hand, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor and yet lack any economic value, rendering it useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving. This is why we emphasize the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of the quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The data sources used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
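The mean-reverting spread dynamics and the threshold rule behind pairs trading can be sketched in a few lines. This is an illustrative Python sketch, not the thesis's MATLAB code; the function names, the Euler-Maruyama discretization, and the entry band are our own illustrative choices:

```python
import random

def simulate_ou(theta, mu, sigma, x0, dt, n, seed=0):
    """Euler-Maruyama simulation of an Ornstein-Uhlenbeck spread:
    dX = theta * (mu - X) dt + sigma dW."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n):
        dw = rng.gauss(0.0, dt ** 0.5)
        x += theta * (mu - x) * dt + sigma * dw
        path.append(x)
    return path

def pairs_signal(spread, mu, band):
    """Toy market-neutral rule: short the spread when it is above
    mu + band, long when below mu - band, flat otherwise."""
    if spread > mu + band:
        return -1
    if spread < mu - band:
        return 1
    return 0
```

Because the signal depends only on the spread observed so far, it is trivially a Markov time in the sense discussed above.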

Relevance:

20.00%

Publisher:

Abstract:

An alternative approach to fundamental concepts of general physics is proposed. We argue that the electrostatic potential energy of a discrete or continuous system of charges should be attributed to the charges and not to the field. It follows that the electric field may carry no energy density, and likewise the magnetic field. We also find no direct relation between electric or magnetic energy and photons. An alternative derivation of the blackbody radiation formula is proposed, and it is argued that the zero-point energy of electromagnetic radiation may not exist.
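The discrete-system energy the abstract refers to is the standard pairwise Coulomb sum U = (1/4πε₀) Σ_{i<j} q_i q_j / r_ij, which is well defined whichever way one attributes it. A minimal sketch (the function name and input format are ours):

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
K = 1.0 / (4.0 * math.pi * EPS0)  # Coulomb constant, ~8.9875e9 N m^2/C^2

def electrostatic_energy(charges):
    """charges: list of (q, (x, y, z)) with q in coulombs and
    positions in meters.  Returns the pairwise potential energy in
    joules -- the quantity the paper argues is carried by the
    charges themselves rather than by the field."""
    u = 0.0
    for i in range(len(charges)):
        qi, ri = charges[i]
        for j in range(i + 1, len(charges)):
            qj, rj = charges[j]
            u += K * qi * qj / math.dist(ri, rj)
    return u
```

For two 1 nC charges 1 m apart this gives U = K × 10⁻¹⁸ ≈ 8.99 × 10⁻⁹ J.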

Relevance:

20.00%

Publisher:

Abstract:

For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable-bandwidth kernel estimates, where the bandwidth is allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
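The two estimators being compared can be written down directly. A minimal sketch with Gaussian kernels (the per-point bandwidths are supplied externally here; in practice they would be data-based, which is exactly the tuning the paper shows cannot be done with a uniformly bounded L1 penalty):

```python
import math

def kde(data, x, h):
    """Fixed-bandwidth Gaussian kernel density estimate at x."""
    n = len(data)
    return sum(math.exp(-0.5 * ((x - xi) / h) ** 2)
               / (h * math.sqrt(2 * math.pi)) for xi in data) / n

def variable_kde(data, x, bandwidths):
    """Variable-bandwidth estimate: data point xi carries its own
    bandwidth h_i, so the smoothing can depend on location."""
    n = len(data)
    return sum(math.exp(-0.5 * ((x - xi) / hi) ** 2)
               / (hi * math.sqrt(2 * math.pi))
               for xi, hi in zip(data, bandwidths)) / n
```

With all h_i equal, the variable estimate reduces to the fixed-bandwidth one, and both integrate to one.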

Relevance:

20.00%

Publisher:

Abstract:

Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator, with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is, at face value, more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population, a labor-force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: (a) those whose weights involve area-specific estimates of bias and variance; and (b) those whose weights involve a common variance and a common squared-bias estimate for all areas. We assess their precision and discuss alternatives for optimizing composite estimation in applications.
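The basic composite form is a one-liner. Below is an illustrative sketch using the usual MSE-minimizing weight, not the paper's exact weight estimates (the function name and argument names are ours):

```python
def composite_estimate(direct, indirect, var_direct, msq_bias_indirect):
    """Composite small-area estimator: convex combination of a direct
    and an indirect estimator.  The weight
        w = MSE(indirect) / (Var(direct) + MSE(indirect))
    shrinks toward the indirect estimator as the direct one gets
    noisier.  Illustrative MSE-minimizing form only."""
    w = msq_bias_indirect / (var_direct + msq_bias_indirect)
    return w * direct + (1.0 - w) * indirect
```

With a noiseless direct estimator (variance 0) the weight is 1 and the composite returns the direct value; with equal variance and squared bias it averages the two.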

Relevance:

20.00%

Publisher:

Abstract:

Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternative finite-population model that avoids fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations showing that the estimator is very promising for obtaining unbiased estimates of the population size and the model parameters.
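The classical binomial problem the abstract alludes to can be sketched with a profile likelihood over the unknown population size N: for each candidate N, the MLE of p is mean(counts)/N. This is a hypothetical sketch of the textbook problem, not the paper's heuristic, and the well-known flatness of this likelihood is precisely why the problem is "likely to be challenging":

```python
import math

def binom_loglik(n, p, counts):
    """Log-likelihood of iid Binomial(n, p) observations."""
    if any(c > n for c in counts) or not (0.0 < p < 1.0):
        return float("-inf")
    ll = 0.0
    for c in counts:
        ll += (math.lgamma(n + 1) - math.lgamma(c + 1)
               - math.lgamma(n - c + 1)
               + c * math.log(p) + (n - c) * math.log(1.0 - p))
    return ll

def fit_binomial_n(counts, n_max):
    """Profile likelihood over N: for each candidate N the MLE of p
    is mean(counts)/N.  Returns (N_hat, p_hat)."""
    mean_c = sum(counts) / len(counts)
    best = max(range(max(counts), n_max + 1),
               key=lambda n: binom_loglik(n, min(mean_c / n, 1 - 1e-12), counts))
    return best, mean_c / best
```

By construction N_hat × p_hat equals the sample mean, so the data pin down the product far better than the individual parameters.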

Relevance:

20.00%

Publisher:

Abstract:

Intonation is a fundamental music concept with special relevance in Indian art music. It is characteristic of the rāga and intrinsic to the musical expression of the performer. Describing intonation is important for several music information retrieval tasks, such as the development of rāga and artist similarity measures. In our previous work, we proposed a compact representation of intonation based on a parametrization of the pitch histogram of a performance, and demonstrated its usefulness through an exploratory rāga recognition task in which we classified 42 vocal performances belonging to 3 rāgas using the parameters of a single svara. In this paper, we extend this representation to employ context-based svara distributions, obtained with a different approach to finding the pitches belonging to each svara. We quantitatively compare this method with our previous one, and discuss its advantages and the melodic analysis to be carried out in future work.
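The first step of such a representation, the octave-folded pitch histogram, can be sketched directly; the parametrization of each svara's distribution would then be fitted on top of it. An illustrative sketch only (function name and bin width are our own choices, not the paper's):

```python
def pitch_histogram(cents, bin_width=10):
    """Octave-folded pitch histogram: fold pitch values (in cents
    relative to the tonic) into [0, 1200) and count per bin.
    Python's % returns a non-negative result, so pitches below the
    tonic fold correctly as well."""
    nbins = 1200 // bin_width
    hist = [0] * nbins
    for c in cents:
        hist[int(c % 1200) // bin_width] += 1
    return hist
```

Peaks of this histogram correspond to svaras; per-svara parameters (peak position, spread, skew) give the compact representation described above.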

Relevance:

20.00%

Publisher:

Abstract:

A digital game was created as a resource for cognitive learning and was subsequently used in primary schools in order to survey its active users. The methods used to collect data were observation, in-depth interviews, and focus groups. The main aim of this study is to gather the points of view of different primary school teachers. The conclusions show how the members of the study group perceive the use of digital games in the classroom.

Relevance:

20.00%

Publisher:

Abstract:

Today, the demands and competitiveness of the market force industries to modernize and automate all of their production processes. In these processes, control data and parameters are fundamental quantities to verify. This final-year project (TFC) sets out to build a digital-input module for handling the data received from an automated process. The objective has been to design a digital-input module capable of handling data from any kind of automated process and transmitting it to a master over a Modbus communication bus. The project, however, has focused on the specific case of an automated wood-treatment process. The development of this system comprises the circuit design, the fabrication of the board, the data-acquisition software, and the implementation of the Modbus protocol. The entire input module is controlled by a PIC 18F4520 microcontroller. The design is a multi-platform system intended to adapt to any automatic process; some of its most relevant features are isolated multi-voltage inputs, leakage monitoring, relay outputs, and external data memory, among others. In conclusion, the proposed objectives were achieved successfully: the result is a robust, reliable, versatile, and highly competitive design. At the academic level, the project has broadened our knowledge in the fields of hardware design and programming.
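A core piece of any Modbus RTU implementation like the one described is the frame checksum. The CRC-16/MODBUS algorithm is fixed by the protocol (initial value 0xFFFF, reflected polynomial 0xA001, appended little-endian to every frame); a reference sketch in Python rather than the project's PIC firmware:

```python
def crc16_modbus(data: bytes) -> int:
    """CRC-16/MODBUS as specified for Modbus RTU frames:
    init 0xFFFF, reflected polynomial 0xA001, bit-by-bit."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc
```

The standard check value for this CRC over the ASCII bytes "123456789" is 0x4B37, which makes a convenient self-test when porting the routine to a microcontroller.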

Relevance:

20.00%

Publisher:

Abstract:

The development and testing of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts is described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images through the choice of one adjustable parameter. A feasible image is defined as one that is consistent with the initial data (i.e., an image that, if it were truly the source of radiation in a patient, could have generated the initial data through the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of the likelihood with data-increment parameters as the conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the method of successive substitutions. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure, in the absence of a priori knowledge about the image configuration, is a uniform field.
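The likelihood part of such algorithms is the classical ML-EM update for Poisson emission data; the FMAPE algorithm described above adds an entropy prior on top of it. A sketch of one plain ML-EM step only (dense matrix, our own naming), started as recommended from a uniform image:

```python
def mlem_step(lam, a, y):
    """One ML-EM update for emission tomography:
        lam_j <- (lam_j / s_j) * sum_i a[i][j] * y[i] / (A lam)_i,
    where a[i][j] is the system matrix, y the measured counts, and
    s_j = sum_i a[i][j] the sensitivity of pixel j."""
    n_pix = len(lam)
    proj = [sum(a[i][j] * lam[j] for j in range(n_pix))
            for i in range(len(y))]            # forward projection A lam
    new = []
    for j in range(n_pix):
        s = sum(a[i][j] for i in range(len(y)))
        back = sum(a[i][j] * y[i] / proj[i]    # backproject data ratios
                   for i in range(len(y)) if proj[i] > 0)
        new.append(lam[j] * back / s if s > 0 else lam[j])
    return new
```

With an identity system matrix, a single step starting from a uniform image already reproduces the data, which is a handy sanity check for an implementation.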

Relevance:

20.00%

Publisher:

Abstract:

uvby H-beta photometry has been obtained for a sample of 93 selected main-sequence A stars. The purpose was to determine accurate effective temperatures, surface gravities, and absolute magnitudes for an individual determination of ages and parallaxes, to be included in a more extensive work analyzing the kinematic properties of A V stars. Several calibrations and methods for determining the above-mentioned parameters have been reviewed, allowing the design of a new algorithm for their determination. The results obtained using this procedure were tested in a previous paper using uvby H-beta data from the Hauck and Mermilliod catalogue, comparing the resulting temperatures, surface gravities, and absolute magnitudes with empirical determinations of these parameters.

Relevance:

20.00%

Publisher:

Abstract:

The theory of a self-gravitating gas sphere is presented. The gravitational field is generated by two components, each of which is an independent isothermal gas. Various quantities of interest, such as density profiles, the core radii of both components, masses, free-free luminosity, surface brightness, central surface density, and the overestimate of the central mass density, are given for different values of the two parameters that arise naturally (the ratio of central densities and the ratio of rms velocities). Fundamental changes appear when comparison is made with a theory in which the second component is treated as a 'test component'. Procedures are given for the complete analysis of real astrophysical configurations such as clusters of galaxies or globular clusters.
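The density profile of a single isothermal component is governed by the isothermal Lane-Emden equation, (1/x²) d/dx (x² dψ/dx) = e^(-ψ) with ψ(0) = ψ'(0) = 0 and ρ/ρ₀ = e^(-ψ). A one-component Euler-integration sketch only; the paper couples two such gases, and the function name and step sizes here are our own:

```python
import math

def isothermal_sphere(x_max=10.0, dx=1e-3):
    """Integrate the isothermal Lane-Emden equation
        (1/x^2) d/dx (x^2 dpsi/dx) = exp(-psi),
    psi(0) = psi'(0) = 0, returning the dimensionless radius grid
    and the normalized density rho/rho_0 = exp(-psi)."""
    xs, rhos = [], []
    psi, dpsi = 0.0, 0.0
    x = dx
    while x <= x_max:
        # second-order term rewritten as dpsi' = exp(-psi) - (2/x) dpsi
        d2 = math.exp(-psi) - 2.0 * dpsi / x
        psi += dpsi * dx
        dpsi += d2 * dx
        xs.append(x)
        rhos.append(math.exp(-psi))
        x += dx
    return xs, rhos
```

The density is unity at the center, decreases monotonically, and approaches the singular-sphere asymptote e^(-ψ) ≈ 2/x² at large radius, which is what produces the flat core plus power-law envelope used to model clusters.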

Relevance:

20.00%

Publisher:

Abstract:

We work out a semiclassical theory of shot noise in ballistic n+-i-n+ semiconductor structures, aiming to study two fundamental physical correlations arising from the Pauli exclusion principle and the long-range Coulomb interaction. The theory provides a unifying scheme which, in addition to the current-voltage characteristics, describes the suppression of shot noise due to Pauli and Coulomb correlations over the whole range of system parameters and applied bias. The full scenario is summarized by a phase diagram in the plane of two dimensionless variables related to the sample length and the contact chemical potential. Different regions of physical interest can be identified where only Coulomb or only Pauli correlations are active, or where both are present with differing relevance. The predictions of the theory are fully corroborated by Monte Carlo simulations.
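Shot-noise suppression is conventionally quantified by the Fano factor F = Var(N)/⟨N⟩ of the transmitted-carrier counts: F = 1 for uncorrelated (Poissonian) injection, F < 1 when Pauli or Coulomb correlations are active. A minimal Monte Carlo sketch of the uncorrelated baseline (our own function names, not the paper's simulator):

```python
import random

def fano_factor(counts):
    """Fano factor F = Var(N) / <N> of carrier counts per window."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean

def poisson_counts(rate, windows, seed=1):
    """Carriers injected as a Poisson process (exponential
    interarrival times) counted over unit observation windows:
    full shot noise, so F should be close to 1."""
    rng = random.Random(seed)
    out = []
    for _ in range(windows):
        t, k = 0.0, 0
        while True:
            t += rng.expovariate(rate)
            if t > 1.0:
                break
            k += 1
        out.append(k)
    return out
```

Replacing the Poisson injection with a correlated one (e.g. rejecting carriers that would violate a Pauli occupation constraint) drives F below unity, which is the suppression mapped by the phase diagram above.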

Relevance:

20.00%

Publisher:

Abstract:

This paper estimates a model of airline competition for the Spanish air transport market. I test the explanatory power of alternative oligopoly models with capacity constraints. In addition, I analyze the degree of density economies. Results show that Spanish airlines' conduct follows a price-leadership scheme, so that it is less competitive than the Cournot solution. I also find evidence that thin routes can be considered natural monopolies.

Relevance:

20.00%

Publisher:

Abstract:

Two trends that presently exist in relation to the concept of Paleontology are analyzed, pointing out some of the aspects that exert a negative influence. Various reflections are offered based on examples of some of the principal points of paleontological method, such as the influence of punctual sampling, the meaning of size-frequency distributions, and subjectivity in the identification of fossils. Topics with marked repercussions on diverse aspects of Paleontology are discussed.

Relevance:

20.00%

Publisher:

Abstract:

We show that solar neutrino experiments set an upper limit of 7.8% (7.3% including the recent KamLAND measurements) to the fraction of energy that the Sun produces via the CNO fusion cycle, which is an order of magnitude improvement upon the previous limit. New experiments are required to detect CNO neutrinos corresponding to the 1.5% of the solar luminosity that the standard solar model predicts is generated by the CNO cycle.