63 results for Quantitative micrographic parameters
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions according to a fixed or dynamic set of rules that determine trading orders. It has grown to account for up to 70% of the trading volume of some of the largest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a field where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics, all necessary for building quantitative strategies. We also contrast these models with real market data at one-minute sampling frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped into the so-called technical models and the latter into so-called pairs trading. Technical models have been dismissed by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is $\mathcal{F}_t$-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a cointegration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck stochastic differential equation and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtest of the strategies mentioned. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. For this reason we emphasize the calibration of the strategies' parameters to adapt them to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of the quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB® from scratch as part of this thesis. No other mathematical or statistical software was used.
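To make the mean-reversion machinery concrete, the following is a minimal sketch (in Python, rather than the MATLAB used in the thesis) of an Ornstein-Uhlenbeck spread simulated via Euler-Maruyama, together with a naive pairs-trading rule; the parameters and the entry threshold are hypothetical, not calibrated values from the project:

    # Minimal sketch: Ornstein-Uhlenbeck spread + naive pairs-trading signal.
    # All parameters (theta, mu, sigma, threshold) are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)

    theta, mu, sigma = 2.0, 0.0, 0.3   # mean-reversion speed, long-run mean, volatility
    dt, n = 1.0 / 390, 390             # one trading day at 1-minute sampling

    # Euler-Maruyama discretization of dX = theta*(mu - X)*dt + sigma*dW
    x = np.empty(n)
    x[0] = 0.5
    for t in range(1, n):
        x[t] = x[t-1] + theta * (mu - x[t-1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

    # Naive market-neutral rule: short the spread when rich, long when cheap.
    threshold = 0.25
    position = np.where(x > mu + threshold, -1, np.where(x < mu - threshold, 1, 0))
    pnl = np.sum(position[:-1] * np.diff(x))   # one-period-lagged P&L of the spread trades
    print(f"toy P&L over one simulated day: {pnl:.4f}")

In the cointegration framework mentioned in the abstract, x would instead be the residual spread of a regression between the two legs of the pair.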
Abstract:
For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates, where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
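For reference, one common form of the location-dependent (variable bandwidth) kernel estimate studied in this line of work is the following; the notation is the standard one, not necessarily the paper's own:

    f_n(x) = \frac{1}{n} \sum_{i=1}^{n} \frac{1}{h(x)} \, K\!\left( \frac{x - X_i}{h(x)} \right),
    \qquad
    \text{with } L_1 \text{ error } \int_{\mathbb{R}} \lvert f_n(x) - f(x) \rvert \, dx,

where K >= 0 is the kernel, X_1, ..., X_n are the data, and h(x) > 0 is the bandwidth at location x; the standard estimate is recovered when h(x) is constant.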
Abstract:
Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator, with weights that usually depend on unknown parameters that need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: (a) those whose weights involve area-specific estimates of bias and variance; and (b) those whose weights involve a common variance and a common squared bias estimate for all the areas. We assess their precision and discuss alternatives for optimizing composite estimation in applications.
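As a point of reference, the generic composite estimator described above can be written as follows; the weight shown is the textbook MSE-minimizing choice under independence, not necessarily the one used in the paper:

    \hat{\theta}_a^{C} = w_a \, \hat{\theta}_a^{\text{dir}} + (1 - w_a) \, \hat{\theta}_a^{\text{ind}},
    \qquad
    w_a^{*} = \frac{\mathrm{MSE}(\hat{\theta}_a^{\text{ind}})}{\mathrm{MSE}(\hat{\theta}_a^{\text{dir}}) + \mathrm{MSE}(\hat{\theta}_a^{\text{ind}})},

where \hat{\theta}_a^{\text{dir}} and \hat{\theta}_a^{\text{ind}} are the direct and indirect estimators for area a; in practice w_a^{*} must itself be estimated, which is the source of the bias and variance estimates mentioned above.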
Abstract:
The magnetic coupling constant of selected cuprate superconductor parent compounds has been determined by means of embedded cluster model and periodic calculations carried out at the same level of theory. The agreement between both approaches validates the cluster model. This model is subsequently employed in state-of-the-art configuration interaction calculations aimed at obtaining accurate values of the magnetic coupling constant and hopping integral for a series of superconducting cuprates. Likewise, a systematic study of the performance of different ab initio explicitly correlated wave function methods and of several density functional approaches is presented. The accurate determination of the parameters of the t-J Hamiltonian has several consequences. First, it suggests that the appearance of high-Tc superconductivity in existing monolayered cuprates occurs with J/t in the 0.20–0.35 regime. Second, J/t = 0.20 is predicted to be the threshold for the existence of superconductivity and, third, a simple and accurate relationship between the critical temperatures at optimum doping and these parameters is found. However, this quantitative electronic structure versus Tc relationship is only found when both J and t are obtained at the most accurate level of theory.
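For context, the t-J Hamiltonian whose parameters are determined in this work is conventionally written as follows (the standard form, not quoted from the paper):

    H_{t\text{-}J} = -t \sum_{\langle i,j \rangle, \sigma} \left( \tilde{c}^{\dagger}_{i\sigma} \tilde{c}_{j\sigma} + \text{h.c.} \right) + J \sum_{\langle i,j \rangle} \left( \mathbf{S}_i \cdot \mathbf{S}_j - \tfrac{1}{4} n_i n_j \right),

where the tilded operators act in the subspace with no double occupancy, t is the hopping integral, and J the magnetic coupling constant; the ratio J/t is the quantity correlated with Tc in the abstract.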
Abstract:
Several methods and approaches for measuring parameters to determine fecal sources of pollution in water have been developed in recent years. No single microbial or chemical parameter has proved sufficient to determine the source of fecal pollution. Combinations of parameters involving at least one discriminating indicator and one universal fecal indicator offer the most promising solutions for qualitative and quantitative analyses. The universal (nondiscriminating) fecal indicator provides quantitative information regarding the fecal load. The discriminating indicator contributes to the identification of a specific source. The relative values of the parameters derived from both kinds of indicators could provide information regarding the contribution of each origin to the total fecal load. It is also essential that both parameters characteristically persist in the environment for similar periods. Numerical analysis, such as inductive learning methods, could be used to select the smallest and most suitable set of parameters for developing predictive models. These combinations of parameters provide information on factors affecting the models, such as dilution, specific types of animal source, persistence of microbial tracers, and complex mixtures from different sources. The combined use of the enumeration of somatic coliphages and the enumeration of Bacteroides phages using different host-specific strains (one from humans and another from pigs), both selected using the suggested approach, provides a feasible model for quantitative and qualitative analyses of fecal source identification.
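As an illustration of the kind of numerical analysis alluded to above, the following hypothetical Python sketch combines a universal indicator (somatic coliphage counts) with a discriminating one (host-specific Bacteroides phage counts) and fits a simple classifier; the simulated data, effect sizes, and choice of model are assumptions for illustration, not the authors' protocol:

    # Hypothetical sketch: combining a universal and a discriminating fecal
    # indicator to classify the source of pollution. Data and model choice
    # are illustrative only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Simulated log10 counts: somatic coliphages (universal) and
    # human-specific Bacteroides phages (discriminating), for 100 samples.
    n = 100
    is_human = rng.integers(0, 2, n)                  # 1 = human source, 0 = animal
    somatic = rng.normal(4.0, 0.5, n)                 # fecal load, source-independent
    bacteroides = rng.normal(1.0 + 1.5 * is_human, 0.4, n)

    # The relative value (difference in log space) of discriminating to
    # universal indicator normalizes for dilution and total fecal load.
    ratio = bacteroides - somatic

    model = LogisticRegression().fit(ratio.reshape(-1, 1), is_human)
    print("training accuracy:", model.score(ratio.reshape(-1, 1), is_human))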
Abstract:
The aim of our study was to assess the diagnostic usefulness of gray level parameters to distinguish osteolytic lesions using radiological images. Materials and Methods: A retrospective study was carried out. A total of 76 skeletal radiographs of osteolytic metastases and 67 radiographs of multiple myeloma were used. The cases were classified into nonflat (MM1 and OL1) and flat bones (MM2 and OL2). These radiological images were analyzed by using a computerized method. The parameters calculated were the mean, standard deviation, and coefficient of variation (MGL, SDGL, and CVGL) based on gray level histogram analysis of a region of interest. Diagnostic utility was quantified by measurement of the parameters on osteolytic metastases and multiple myeloma, yielding quantification of the area under the receiver operating characteristic (ROC) curve (AUC). Results: Flat bone groups (MM2 and OL2) showed significant differences in mean values of MGL (p = 0.048) and SDGL (p = 0.003). Their corresponding values of AUC were 0.758 for MGL and 0.883 for SDGL in flat bones. In nonflat bones these gray level parameters did not show diagnostic ability. Conclusion: The gray level parameters MGL and SDGL show good discriminatory diagnostic ability to distinguish between multiple myeloma and lytic metastases in flat bones.
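A minimal sketch of the computerized gray-level measurement described above, assuming the region of interest is available as a NumPy array; the simulated ROIs and their parameters are illustrative assumptions, not the study's data:

    # Minimal sketch: gray-level parameters of a region of interest (ROI) and AUC.
    # The ROI arrays and labels are simulated; details are illustrative assumptions.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    def gray_level_params(roi):
        """Mean (MGL), standard deviation (SDGL) and coefficient of
        variation (CVGL) of the gray levels inside an ROI."""
        mgl = roi.mean()
        sdgl = roi.std(ddof=1)
        return mgl, sdgl, sdgl / mgl

    rng = np.random.default_rng(2)
    # Simulated ROIs: 0 = multiple myeloma, 1 = osteolytic metastasis.
    labels = rng.integers(0, 2, 60)
    sdgl_values = [gray_level_params(rng.normal(120 + 15 * y, 10 + 5 * y, (32, 32)))[1]
                   for y in labels]

    print("AUC of SDGL:", roc_auc_score(labels, sdgl_values))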
Abstract:
This study was carried out with the main objective of locating, analyzing, and diagnosing the singular trees eligible to be declared monumental within the Alt Pirineu Natural Park. Specifically, the Vall Ferrera and the Vall de Cardós were inventoried. The secondary objective was to make an innovative proposal for environmental education, using the tree as a pedagogical instrument. Twenty-three trees were inventoried, one of which, the "Avet del Pla de la Selva", had already been declared a Monumental Tree. First, the trees were located with the help of the Park's technicians, popular knowledge, and documentation. A methodology based on previous studies was used, by means of field forms that gather all the ecological and sociocultural characteristics of each tree. The data obtained were then analyzed and the diagnosis was carried out. A quantitative method and a qualitative method (the "Rànquing d'Arbres Monumentals", or Monumental Tree Ranking) were proposed. The latter rates each tree by comparing it against a list, compiled by the Generalitat de Catalunya, of all the monumental trees of the same species in Catalan territory, according to three parameters: height, trunk girth, and crown diameter. Finally, the appropriate protection is proposed for each tree according to its state of conservation and other parameters. One of the outcomes of this study was the production of a folder of educational material using each tree as the central axis for explaining the natural environment around it. This initiative seeks to highlight the important role of monumental trees as connectors with the natural and sociocultural environment, and the need to protect singular trees in all Natural Parks.
Abstract:
In this paper we simulate and analyse the economic impact that sectoral productivity gains have on two regional Spanish economies (Catalonia and Extremadura). In particular, we study the quantitative effect that each sector's productivity gain has on household welfare (real disposable income and equivalent variation), on the consumption price indices and relative factor prices, on real production (GDP), and on the government's net income (net tax revenues net of social transfers to households). The analytical approach consists of a computable general equilibrium model, in which we assume perfect competition and cleared markets, including factor markets. All the parameters and exogenous variables of the model are calibrated by means of two social accounting matrices, one for each region under study. The results allow us to identify the sectors with the greatest impact on consumer welfare as the key sectors in the regional economies.
Keywords: productivity gains, key sectors, computable general equilibrium
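For reference, the equivalent variation mentioned above is conventionally defined through the expenditure function e(p, u) (the standard definition, not specific to this paper):

    EV = e(p^0, u^1) - e(p^0, u^0),

where p^0 are benchmark prices and u^0, u^1 are household utility before and after the simulated productivity gain; a positive EV values the welfare improvement at benchmark prices.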
Abstract:
In this work we discuss the use of the standard model for the calculation of the solvency capital requirement (SCR) when the company aims to use the model's specific parameters on the basis of the experience of its own portfolio. In particular, this analysis focuses on the formula presented in the latest quantitative impact study (CEIOPS, 2010) for non-life underwriting premium and reserve risk. One of the keys of the standard model for premium and reserve risk is the correlation matrix between lines of business. In this work we present how the correlation matrix between lines of business could be estimated from a quantitative perspective, as well as the possibility of using a credibility model for estimating this matrix, merging the qualitative and quantitative perspectives.
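For context, in the standard formula the correlation matrix enters the aggregation of per-line volatilities roughly as follows (a simplified schematic of the QIS5 structure, not the full specification):

    \sigma = \frac{1}{V} \sqrt{ \sum_{r,c} \mathrm{Corr}_{r,c} \, \sigma_r V_r \, \sigma_c V_c },
    \qquad
    V = \sum_{r} V_r,

where V_r is the volume measure and \sigma_r the standard deviation for premium and reserve risk in line of business r; the capital charge is then an increasing function of the aggregate \sigma.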
Abstract:
The tourism consumer's purchase decision process is, to a great extent, conditioned by the image the tourist has of the different destinations that make up his or her choice set. In a highly competitive international tourist market, those responsible for destinations' promotion and development policies seek differentiation strategies so that they may position the destinations in the market segments most suitable for their product, in order to improve their attractiveness to visitors and to increase or consolidate the economic benefits that tourism activity generates in their territory. To this end, the main objective we set ourselves in this paper is the empirical analysis of the factors that determine the image formation of the city of Tarragona as a cultural heritage destination. Without a doubt, UNESCO's declaration of Tarragona's artistic and monumental legacy as a World Heritage Site in the year 2000 meant important international recognition of the quality of the cultural and patrimonial elements the city offers to the visitors who choose it as a tourist destination. It also represents a strategic opportunity to boost the city's promotion of tourism and its consolidation as a unique destination, given its cultural and patrimonial characteristics. Our work is based on the use of structured and unstructured techniques to identify the factors that determine Tarragona's tourist destination image and that have a decisive influence on visitors' choice of destination. In addition to ascertaining Tarragona's global tourist image, we consider that the heterogeneity of its visitors requires a more detailed study that enables us to segment visitor typology. We consider that the information provided by these results may prove of great interest to those responsible for local tourism policy, both when designing products and when promoting the destination.
Abstract:
Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capabilities of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below the detection limit (nondetects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large source of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, the best results are obtained when imputations are made using the distribution that best fits the readings above the detection limit, and exposes the problems of other more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
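A minimal sketch of the imputation-plus-bootstrap procedure described above, assuming a lognormal model fitted to the readings above the detection limit; the distributions, sample sizes, and the parameter inferred (the mean) are illustrative assumptions, and the spatially correlated case, which would require stochastic simulation, is ignored here:

    # Minimal sketch: impute nondetects from the distribution fitted to readings
    # above the detection limit, then bootstrap a distributional parameter (the mean).
    # Distributions and sizes are illustrative assumptions, not the study's data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    true = rng.lognormal(mean=0.0, sigma=1.0, size=500)   # "exhaustive" concentrations
    dl = 0.5                                              # detection limit
    observed = true[true >= dl]                           # reported readings
    n_censored = int(np.sum(true < dl))                   # nondetects

    # Fit a lognormal to the readings above the detection limit (a full
    # treatment would use a truncated likelihood to account for censoring).
    shape, loc, scale = stats.lognorm.fit(observed, floc=0)

    # Pool of imputation values below the detection limit, by rejection
    # sampling from the fitted model (vectorized for speed).
    pool = stats.lognorm.rvs(shape, loc=loc, scale=scale, size=200_000, random_state=rng)
    pool = pool[pool < dl]

    boot_means = []
    for _ in range(1000):
        imputed = rng.choice(pool, size=n_censored, replace=True)
        sample = np.concatenate([observed, imputed])
        boot_means.append(rng.choice(sample, size=sample.size, replace=True).mean())

    print("bootstrap mean and 90% interval:",
          np.mean(boot_means), np.percentile(boot_means, [5, 95]))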
Abstract:
After certain food processing and preservation treatments are applied, injured bacteria remain. These bacteria lose the ability to grow on conventional selective culture media, so their counts are underestimated. Nevertheless, they can recover in food and pose a health risk, since some may still retain metabolic activity and structural integrity. In this project, sample preparation protocols for flow cytometry (FC) were optimized to assess the physiological state of stressed foodborne pathogens (Escherichia coli O157:H7, Salmonella Enteritidis, and Listeria monocytogenes). Two physiological parameters were mainly studied: membrane integrity, using propidium iodide and fluorochromes of the SYTO family; and respiratory activity, through the intracellular reduction of a tetrazolium salt, CTC. First, protocol variables such as dye concentration, dye ratio, staining solution, and incubation time were evaluated in control samples (healthy and dead cells). The optimized protocols were then applied to bacterial suspensions in culture medium that had previously been subjected to physical and physicochemical stresses. During the final stage of the project, the knowledge acquired on sample preparation for FC was applied to the analysis of samples with a complex matrix: commercial salads inoculated with E. coli O157:H7. In the assays with membrane integrity indicators on stressed bacterial suspensions, cells with partially damaged membranes (presumably injured cells) could be quantified. The count of cells that retained respiratory activity after being stressed was higher than that obtained by conventional plate counting, evidencing the presence of active but non-culturable cells. The introduction of strategies to reduce the interference caused by food particles, together with the use of a fluorescently labeled antibody, made it possible to selectively detect E. coli O157:H7 cells and simultaneously assess their membrane integrity. The analysis of bacterial cells by FC requires thorough optimization of the protocols, which are specific to each strain and matrix. Nevertheless, unlike the conventional plate-count method, it offers the possibility of obtaining a great deal of information on the often complex physiological state of a sample.
Abstract:
Despite North Korea's strategic relevance and its destabilizing role in the most economically dynamic region in the world, the EU has no clear strategy for engaging with this country. Combining qualitative and quantitative analysis techniques, this work aims to uncover possible internal contradictions that prevent the definition of a coherent and effective European foreign policy towards North Korea, as well as discrepancies between the perceptions of actors inside the EU and those of external actors. Significant differences in expectations and shortcomings in terms of coherence have been detected, both among the views expressed by internal actors and between the opinions of those actors and the views of the future South Korean leaders surveyed; these differences even affect the promotion of human rights.