995 results for CURE FRACTION MODELS


Relevance:

20.00%

Publisher:

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions, obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject in which publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, which are necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following and mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signals they generate pass the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; more technically, the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is on the other hand fairly simple, yet it can be cast in a variety of mathematical models: a method based on a simple Euclidean distance, a co-integration framework, or stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.
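
The Markov-time requirement above can be made concrete with a toy signal. The following sketch (our illustration, not code from the thesis; the window lengths and simulated price path are arbitrary) computes a moving-average crossover using only prices observed up to each time t, so that the event "the signal fires at t" is F_t-measurable:

```python
import numpy as np

# Toy illustration (ours, not code from the thesis) of a technical signal that
# qualifies as a Markov time: a moving-average crossover decided using only
# prices observed up to time t, so the event {signal fires at t} is
# F_t-measurable. Window lengths and the simulated price path are arbitrary.

def crossover_signal(prices, short=5, long=20):
    """Return +1 (long), -1 (short) or 0 (warm-up) at each t, with no look-ahead."""
    signals = []
    for t in range(len(prices)):
        window = prices[:t + 1]          # the information set available at t
        if len(window) < long:
            signals.append(0)            # not enough history yet
            continue
        s_ma = np.mean(window[-short:])  # short moving average
        l_ma = np.mean(window[-long:])   # long moving average
        signals.append(1 if s_ma > l_ma else -1)
    return np.array(signals)

rng = np.random.default_rng(3)
prices = 100 + np.cumsum(rng.standard_normal(300))  # random-walk price path
sig = crossover_signal(prices)
```

Because every decision depends only on the prefix `prices[:t + 1]`, recomputing the signal on a truncated series reproduces the same prefix of decisions; a rule that peeked at future prices would fail this check.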
A model for forecasting any economic or financial magnitude can be defined with scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why this project would not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. For this reason we emphasize the calibration process of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
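
As an illustration of the kind of calibration discussed above, the following sketch (ours, not the thesis's MATLAB code; all parameter values are invented) simulates an Ornstein-Uhlenbeck spread and recovers its parameters from the exact AR(1) discretization by ordinary least squares:

```python
import numpy as np

# Sketch (ours, not the thesis's MATLAB code; all parameter values invented) of
# calibrating an Ornstein-Uhlenbeck spread dX = theta*(mu - X) dt + sigma dW.
# Its exact discretization is the AR(1) process X_{t+1} = a + b*X_t + eps_t with
# b = exp(-theta*dt) and a = mu*(1 - b), so OLS on (X_t, X_{t+1}) recovers
# theta and mu -- the quantities a pairs-trading rule needs.

rng = np.random.default_rng(0)
theta, mu, sigma, dt, n = 2.0, 0.5, 0.3, 1.0 / 252, 50_000

b = np.exp(-theta * dt)
eps_sd = sigma * np.sqrt((1.0 - b**2) / (2.0 * theta))  # exact innovation sd
x = np.empty(n)
x[0] = mu
for t in range(n - 1):                                   # simulate the spread
    x[t + 1] = mu * (1.0 - b) + b * x[t] + eps_sd * rng.standard_normal()

# OLS of X_{t+1} on (1, X_t) gives (a_hat, b_hat), hence (theta_hat, mu_hat)
A = np.column_stack([np.ones(n - 1), x[:-1]])
a_hat, b_hat = np.linalg.lstsq(A, x[1:], rcond=None)[0]
theta_hat = -np.log(b_hat) / dt      # estimated mean-reversion speed
mu_hat = a_hat / (1.0 - b_hat)       # estimated long-run mean of the spread
```

In practice the same regression, rerun on a rolling window, is one way the parameter volatility discussed above could be tracked.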

Relevance:

20.00%

Publisher:

Abstract:

In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid-search procedure. In addition, it is easy to guarantee a positive definite correlation matrix, because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test for, and allow, different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano-type test. We conclude that threshold correlation modelling gives rise to a significant reduction in the portfolio's variance.
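
A minimal sketch of the idea (our Python illustration, not the authors' implementation; the two-regime data-generating process and all numbers are invented) estimates the regime correlations as plain sample correlations and locates the threshold by a grid search over quantiles of the transition variable, maximizing a Gaussian quasi-likelihood:

```python
import numpy as np

# Illustrative TCC sketch (ours, not the authors' implementation; the
# two-regime data-generating process and all numbers are invented).
# Correlations switch according to an observable transition variable z; the
# regime estimators are plain sample correlations (positive semi-definite by
# construction) and the threshold is found by a simple grid search.

rng = np.random.default_rng(1)
n = 4000
z = rng.standard_normal(n)                 # observable transition variable
rho = np.where(z > 0.5, 0.8, 0.1)          # true correlation in each regime
e1, e2 = rng.standard_normal(n), rng.standard_normal(n)
r1 = e1                                    # returns of asset 1
r2 = rho * e1 + np.sqrt(1 - rho**2) * e2   # returns of asset 2

def regime_corrs(c):
    """Sample correlation of (r1, r2) below and above threshold c."""
    lo, hi = z <= c, z > c
    return (np.corrcoef(r1[lo], r2[lo])[0, 1],
            np.corrcoef(r1[hi], r2[hi])[0, 1])

def gaussian_loglik(c):
    """Bivariate Gaussian quasi-log-likelihood with regime correlations."""
    ll = 0.0
    for mask, rho_hat in zip((z <= c, z > c), regime_corrs(c)):
        u, v = r1[mask], r2[mask]
        det = 1.0 - rho_hat**2
        quad = (u**2 - 2.0 * rho_hat * u * v + v**2) / det
        ll -= 0.5 * (mask.sum() * np.log(det) + quad.sum())
    return ll

grid = np.quantile(z, np.linspace(0.15, 0.85, 29))  # candidate thresholds
c_star = max(grid, key=gaussian_loglik)             # grid-search estimate
```

Because each regime's estimator is just a sample correlation matrix, positive semi-definiteness comes for free, which is the feature the abstract highlights.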

Relevance:

20.00%

Publisher:

Abstract:

This study evaluated parasitological and molecular techniques for the diagnosis and assessment of cure of schistosomiasis mansoni. A population-based study was performed in 201 inhabitants of a low-transmission locality named Pedra Preta, municipality of Montes Claros, state of Minas Gerais, Brazil. Four stool samples were analysed using two techniques, the Kato-Katz® (KK) technique (18 slides) and the TF-Test®, to establish the infection rate. The positivity rate of 18 KK slides from four stool samples was 28.9% (58/201), and the combined parasitological techniques (KK + TF-Test®) produced a 35.8% positivity rate (72/201). Furthermore, a polymerase chain reaction (PCR)-ELISA assay produced a positivity rate of 23.4% (47/201) using the first sample. All 72 patients with positive parasitological exams were treated with a single dose of Praziquantel® and were followed up 30, 90 and 180 days after treatment to establish the cure rate. Cure rates obtained by the analysis of 12 KK slides were 100%, 100% and 98.4% at 30, 90 and 180 days after treatment, respectively. PCR-ELISA revealed cure rates of 98.5%, 95.5% and 96.5%, respectively. The diagnosis and assessment of cure of schistosomiasis may require an increased number of KK slides or a test with higher sensitivity, such as PCR-ELISA, in situations of very low parasite load, such as after therapeutic interventions.
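
The positivity rates quoted above follow directly from the counts; a short check (illustrative only, counts copied from the abstract) confirms the arithmetic:

```python
# Quick check of the positivity rates quoted in the abstract (counts out of 201).
positives = {"KK (18 slides)": 58, "KK + TF-Test": 72, "PCR-ELISA": 47}
n_inhabitants = 201
rates = {name: round(100.0 * count / n_inhabitants, 1)
         for name, count in positives.items()}
# rates -> 28.9, 35.8 and 23.4, matching the reported percentages
```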

Relevance:

20.00%

Publisher:

Abstract:

Functional divergence between homologous proteins is expected to affect amino acid sequences in two main ways, which can be considered proxies of biochemical divergence: a "covarion-like" pattern of correlated changes in evolutionary rates, and switches in conserved residues ("conserved but different"). Although these patterns have been used in case studies, a large-scale analysis is needed to estimate their frequency and distribution. We use a phylogenomic framework of animal genes to answer three questions: 1) What is the prevalence of such patterns? 2) Can we link such patterns at the amino acid level with selection inferred at the codon level? 3) Are the patterns different between paralogs and orthologs? We find that covarion-like patterns are detected more frequently than "conserved but different" patterns, but that only the latter are correlated with signals of positive selection. Finally, there is no obvious difference in patterns between orthologs and paralogs.

Relevance:

20.00%

Publisher:

Abstract:

Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency, and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators who evaluate methods for global gene expression analysis.

Relevance:

20.00%

Publisher:

Abstract:

The objective of the EU-funded integrated project "ACuteTox" is to develop a strategy in which general cytotoxicity, together with organ-specific endpoints and biokinetic features, is taken into consideration in the in vitro prediction of oral acute systemic toxicity. With regard to the nervous system, the effects of 23 reference chemicals were tested with approximately 50 endpoints, using a neuronal cell line, primary neuronal cell cultures, brain slices and aggregated brain cell cultures. Comparison of the in vitro neurotoxicity data with general cytotoxicity data generated in a non-neuronal cell line, and with in vivo data such as the acute human lethal blood concentration, revealed that GABA(A) receptor function, acetylcholinesterase activity, cell membrane potential, glucose uptake, total RNA expression and altered gene expression of NF-H, GFAP, MBP, HSP32 and caspase-3 were the best endpoints to use for further testing with 36 additional chemicals. The results of the second analysis showed that no single neuronal endpoint could give a perfect improvement in the in vitro-in vivo correlation, indicating that several specific endpoints need to be analysed and combined with biokinetic data to obtain the best correlation with in vivo acute toxicity.

Relevance:

20.00%

Publisher:

Abstract:

According to the World Health Organization, 5.1% of cases of blindness or visual impairment are related to corneal opacification. The cornea is a transparent tissue located at the front of the eye, in front of the iris (the coloured part of the eye), and its transparency is mandatory for vision. The ocular surface is a functional unit including the cornea and all the elements involved in maintaining its transparency, i.e., the eyelids, the conjunctiva, the lymphoid tissue of the conjunctiva, the limbus, the lacrimal glands and the tear film. Destruction of the ocular surface is a disease caused by trauma, infections, chronic inflammation, cancers, toxic agents, congenital abnormalities or unknown causes. Treating the destruction of the ocular surface requires a global strategy encompassing all the elements involved in its physiology. The microenvironment of the ocular surface must first be restored, i.e., the lids, the conjunctiva, the limbus and the structures that secrete the different layers of the tear film. In a second step, the transparency of the cornea can be reconstructed: a corneal graft performed in a healthy ocular surface microenvironment will have a better survival rate. To achieve these goals, a thorough understanding of epithelial renewal and of the role of epithelial stem cells is mandatory.

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data-generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium, and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
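
The kind of statistical learning algorithm the agents employ can be sketched as decreasing-gain recursive least squares (RLS), the standard device in this literature. This is our illustration only: the one-equation regression and every number below are invented and far simpler than the paper's New Keynesian model.

```python
import numpy as np

# Decreasing-gain recursive least squares (RLS): agents regress the observed
# outcome y on regressors x and update their beliefs phi each period.
# Invented one-equation example, not the paper's model.

rng = np.random.default_rng(2)
true_phi = np.array([0.5, -0.2])   # coefficients of the (unknown) true model

phi = np.zeros(2)                  # agents' initial beliefs
R = np.eye(2)                      # estimate of the regressor moment matrix
for t in range(1, 20_001):
    x = np.array([1.0, rng.standard_normal()])       # regressors observed at t
    y = x @ true_phi + 0.1 * rng.standard_normal()   # realized outcome
    gain = 1.0 / (t + 10)          # decreasing gain (offset keeps R invertible)
    R += gain * (np.outer(x, x) - R)                 # update moment matrix
    phi += gain * np.linalg.solve(R, x) * (y - x @ phi)  # update beliefs
```

In the paper's asymmetric-information setting, each side of the economy would run its own recursion on its own data set; the interaction of the two recursions is where the failure of convergence discussed above can arise.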

Relevance:

20.00%

Publisher:

Abstract:

The paper discusses the maintenance challenges of organisations with a huge number of devices and proposes the use of probabilistic models to assist monitoring and maintenance planning. The proposal assumes connectivity of the instruments, so that they can report relevant features for monitoring. The existence of enough historical registers with diagnosed breakdowns is also required to make the probabilistic models reliable and useful for the predictive maintenance strategies based on them. Regular Markov models based on estimated failure and repair rates are proposed to calculate the availability of the instruments, and Dynamic Bayesian Networks are proposed to model cause-effect relationships that trigger predictive maintenance services, based on the influence between observed features and previously documented diagnostics.
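
The regular Markov availability model mentioned above can be sketched as a two-state (up/down) chain with estimated failure and repair rates (our illustration; the rates are hypothetical):

```python
import numpy as np

# Two-state (up/down) Markov availability model (ours; rates are hypothetical).
# A device fails at rate lam and is repaired at rate mu; the steady-state
# availability has the closed form mu / (lam + mu), which we confirm by
# iterating the discretized chain.

lam, mu = 0.01, 0.5    # failures/hour and repairs/hour (hypothetical estimates)
dt = 0.01              # discretization step in hours

# one-step transition matrix of the discretized two-state chain [up, down]
P = np.array([[1.0 - lam * dt, lam * dt],
              [mu * dt,        1.0 - mu * dt]])

pi = np.array([1.0, 0.0])          # start in the "up" state
for _ in range(10_000):            # iterate to the long-run distribution
    pi = pi @ P

availability = mu / (lam + mu)     # closed-form steady-state availability
```

With these illustrative rates the device is available about 98% of the time; in the paper's setting, lam and mu would be estimated from the historical breakdown registers.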

Relevance:

20.00%

Publisher:

Abstract:

Polymorphisms in IL28B were shown to affect clearance of hepatitis C virus (HCV) infection in genome-wide association (GWA) studies. Only a fraction of patients with chronic HCV infection develop liver fibrosis, a process that might also be affected by genetic factors. We performed a 2-stage GWA study of liver fibrosis progression related to HCV infection. We studied well-characterized HCV-infected patients of European descent who underwent liver biopsies before treatment. We defined various liver fibrosis phenotypes on the basis of METAVIR scores, with and without taking the duration of HCV infection into account. Our GWA analyses were conducted on a filtered primary cohort of 1161 patients using 780,650 single nucleotide polymorphisms (SNPs). We genotyped 96 SNPs with P values < 5 × 10^-5 in an independent replication cohort of 962 patients. We then assessed the most interesting replicated SNPs using DNA samples collected from 219 patients who participated in separate GWA studies of HCV clearance. In the combined cohort of 2342 HCV-infected patients, the SNPs rs16851720 (in the total sample) and rs4374383 (in patients who received blood transfusions) were associated with fibrosis progression (P_combined = 8.9 × 10^-9 and 1.1 × 10^-9, respectively). The SNP rs16851720 is located within RNF7, which encodes an antioxidant that protects against apoptosis. The SNP rs4374383, together with another replicated SNP, rs9380516 (P_combined = 5.4 × 10^-7), were linked to the functionally related genes MERTK and TULP1, which encode factors involved in phagocytosis of apoptotic cells by macrophages. Our GWA study identified several susceptibility loci for HCV-induced liver fibrosis; these were linked to genes that regulate apoptosis. Apoptotic control might therefore be involved in liver fibrosis.

Relevance:

20.00%

Publisher:

Abstract:

Our work is concerned with user modelling in open environments. Our proposal follows the line of contributions to advances in user modelling in open environments enabled by Agent Technology, in what has been called the Smart User Model (SUM). Our research contains a holistic study of user modelling in several research areas related to users. We have developed a conceptualization of user modelling by means of examples from a broad range of research areas, with the aim of improving our understanding of user modelling and its role in the next generation of open and distributed service environments. This report is organized as follows. In chapter 1 we introduce our motivation and objectives. Chapters 2, 3, 4 and 5 provide the state of the art on user modelling: in chapter 2 we give the main definitions of the elements described in the report; in chapter 3 we present a historical perspective on user models; in chapter 4 we review user models from the perspective of different research areas, with special emphasis on the give-and-take relationship between Agent Technology and user modelling; and in chapter 5 we describe the main challenges that, from our point of view, need to be tackled by researchers wanting to contribute to advances in user modelling. From this study of the state of the art follows the exploratory work of chapter 6, where we define a SUM and a methodology to deal with it, and present some case studies to illustrate the methodology. Finally, we present the thesis proposal to continue the work, together with its corresponding work schedule and timeline.

Relevance:

20.00%

Publisher:

Abstract:

Customer satisfaction and retention are key issues for organizations in today's competitive marketplace. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the inception of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation models (SEM), because it does not rely on strict assumptions about the data. However, this choice was based upon some misconceptions about the use of SEMs and does not take into consideration more recent advances in SEM, including estimation methods that are robust to non-normality and missing data. In this paper, the SEM and PLS approaches are compared by evaluating perceptions of the Isle of Man Post Office's products and customer service using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.

Relevance:

20.00%

Publisher:

Abstract:

Brake wear particulate matter (PM) may provoke cardiovascular effects. A system was developed to expose cells to airborne PM from brakes. Six car models were tested, each with full-stop and normal deceleration. PM number, mass and surface concentrations, metals, and carbon compounds were measured. Full stops produced higher PM number and mass concentrations than normal deceleration (up to 10 million particles/cm³ in a 0.2 m³ volume). 87% of the PM mass was in the fine (100 nm to 2.5 µm) and 12% in the coarse (2.5 to 10 µm) fraction, whereas 74% of the PM number was nanoscaled (ultrafine, < 0.1 µm) and 26% fine PM. Elemental concentrations were 2,364, 236, and 18 µg/m³ of iron, copper and manganese, respectively, and 664 and 36 µg/m³ of organic and elemental carbon. PM release differed between cars and braking behaviour. Temperature and humidity were stable. In conclusion, the established system seems feasible for exposing cell cultures to brake wear PM.