998 results for envelope models
Abstract:
In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006) but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts in the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano-type test. We conclude that threshold correlation modelling gives rise to a significant reduction in portfolio variance.
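The grid-search idea described above can be sketched as follows, assuming standardized returns `u` (T × N) and a single observable transition variable `z`. The function names, the Gaussian likelihood criterion and the trimming fraction are illustrative choices, not the paper's actual specification:

```python
# Minimal sketch of threshold conditional correlation (TCC) estimation.
# All names and tuning choices here are illustrative assumptions.
import numpy as np

def gaussian_loglik(u, R):
    """Log-likelihood of the rows of u under N(0, R)."""
    _, logdet = np.linalg.slogdet(R)
    quad = np.einsum('ti,ij,tj->t', u, np.linalg.inv(R), u)
    return -0.5 * (u.shape[1] * np.log(2 * np.pi) + logdet + quad).sum()

def fit_tcc(u, z, trim=0.15):
    """Grid search over threshold c: regime 1 where z <= c, regime 2 where z > c.
    Each regime's correlation matrix is the sample correlation of its own
    observations, so positive definiteness holds by construction."""
    candidates = np.quantile(z, np.linspace(trim, 1 - trim, 25))
    best = (-np.inf, None, None, None)
    for c in candidates:
        lo, hi = z <= c, z > c
        R1 = np.corrcoef(u[lo], rowvar=False)
        R2 = np.corrcoef(u[hi], rowvar=False)
        ll = gaussian_loglik(u[lo], R1) + gaussian_loglik(u[hi], R2)
        if ll > best[0]:
            best = (ll, c, R1, R2)
    return best  # (log-likelihood, threshold, R_regime1, R_regime2)
```

Because the grid is one-dimensional per threshold, the search stays cheap even when the number of assets N grows, which is the dimensionality advantage the abstract points to.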
Abstract:
Functional divergence between homologous proteins is expected to affect amino acid sequences in two main ways, which can be considered as proxies of biochemical divergence: a "covarion-like" pattern of correlated changes in evolutionary rates, and switches in conserved residues ("constant but different"). Although these patterns have been used in case studies, a large-scale analysis is needed to estimate their frequency and distribution. We use a phylogenomic framework of animal genes to answer three questions: 1) What is the prevalence of such patterns? 2) Can we link such patterns at the amino acid level with selection inferred at the codon level? 3) Are patterns different between paralogs and orthologs? We find that covarion-like patterns are more frequently detected than "constant but different," but that only the latter are correlated with signal for positive selection. Finally, there is no obvious difference in patterns between orthologs and paralogs.
Abstract:
Prevention of Trypanosoma cruzi infection in mammals likely depends on either prevention of the invading trypomastigotes from infecting host cells or the rapid recognition and killing of the newly infected cells by T. cruzi-specific T cells. We show here that multiple rounds of infection and cure (by drug therapy) fail to protect mice from reinfection, despite the generation of potent T cell responses. This disappointing result is similar to that obtained with many other vaccine protocols used in attempts to protect animals from T. cruzi infection. We have previously shown that immune recognition of T. cruzi infection is significantly delayed both at the systemic level and at the level of the infected host cell. The systemic delay appears to be the result of a stealth infection process that fails to trigger substantial innate recognition mechanisms, while the delay at the cellular level is related to the immunodominance of highly variable gene family proteins, in particular those of the trans-sialidase family. Here we discuss how these previous studies and the new findings herein impact our thoughts on the potential of prophylactic vaccination to serve a productive role in the prevention of T. cruzi infection and Chagas disease.
Abstract:
It can be assumed that the composition of Mercury's thin gas envelope (exosphere) is related to the composition of the planet's crustal materials. If this relationship is true, then inferences regarding the bulk chemistry of the planet might be made from a thorough exospheric study. The most vexing of all unsolved problems is the uncertainty in the source of each component. Historically, it has been believed that H and He come primarily from the solar wind, while Na and K originate from volatilized materials partitioned between Mercury's crust and meteoritic impactors. The processes that eject atoms and molecules into the exosphere of Mercury are generally considered to be thermal vaporization, photon-stimulated desorption (PSD), impact vaporization, and ion sputtering. Each of these processes has its own temporal and spatial dependence. The exosphere is strongly influenced by Mercury's highly elliptical orbit and rapid orbital speed. As a consequence, the surface undergoes large fluctuations in temperature and experiences differences of insolation with longitude. We will discuss these processes but focus more on the expected surface composition and solar wind particle sputtering, which releases material such as Ca and other elements from the surface minerals, and discuss the relevance of composition modelling.
Abstract:
Hepatitis C virus (HCV) envelope protein 2 (E2) is involved in viral binding to host cells. The aim of this work was to produce recombinant E2B and E2Y HCV proteins in Escherichia coli and Pichia pastoris, respectively, and to study their interactions with low-density lipoprotein receptor (LDLr) and CD81 in human umbilical vein endothelial cells (HUVEC) and the ECV304 bladder carcinoma cell line. To investigate the effects of human LDL and differences in protein structure (glycosylated or not) on binding efficiency, the recombinant proteins were either associated or not associated with lipoproteins before being assayed. The immunoreactivity of the recombinant proteins was analysed using pooled serum samples that were either positive or negative for hepatitis C. The cells were immunophenotyped for LDLr and CD81 using flow cytometry. Binding and binding inhibition assays were performed in the presence of LDL, foetal calf serum (FCS) and specific antibodies. The results revealed that binding was reduced in the absence of FCS, but that the addition of human LDL rescued and increased binding capacity. In HUVEC cells, the use of antibodies to block LDLr led to a significant reduction in the binding of E2B and E2Y. CD81 antibodies did not affect E2B and E2Y binding. In ECV304 cells, blocking LDLr and CD81 produced similar effects, but they were not as marked as those that were observed in HUVEC cells. In conclusion, recombinant HCV E2 is dependent on LDL for its ability to bind to LDLr in HUVEC and ECV304 cells. These findings are relevant because E2 acts to anchor HCV to host cells; therefore, high blood levels of LDL could enhance viral infectivity in chronic hepatitis C patients.
Abstract:
Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators who evaluate methods for global gene expression analysis.
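The held-out evaluation principle the abstract emphasizes — fit on training samples only, then score on data never used for fitting — can be sketched as follows. The nearest-centroid classifier and the synthetic two-class data are purely illustrative, not any MAQC-II team's actual method:

```python
# Illustrative sketch of held-out evaluation with a toy classifier.
import numpy as np

def fit_centroids(X, y):
    # class centroid = mean feature vector of that class's training rows
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, X):
    classes = sorted(centroids)
    dists = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array(classes)[dists.argmin(axis=0)]

# synthetic two-class data; split once, fit on train only, score on test only
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 5)), rng.normal(2.0, 1.0, (50, 5))])
y = np.repeat([0, 1], 50)
idx = rng.permutation(100)
train, test = idx[:70], idx[70:]
centroids = fit_centroids(X[train], y[train])
accuracy = (predict(centroids, X[test]) == y[test]).mean()
```

The key discipline is that `X[test]` never influences `fit_centroids`; the reported `accuracy` is therefore an estimate of performance on unseen samples rather than of fit to the training data.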
Abstract:
The objective of the EU funded integrated project "ACuteTox" is to develop a strategy in which general cytotoxicity, together with organ-specific endpoints and biokinetic features, are taken into consideration in the in vitro prediction of oral acute systemic toxicity. With regard to the nervous system, the effects of 23 reference chemicals were tested with approximately 50 endpoints, using a neuronal cell line, primary neuronal cell cultures, brain slices and aggregated brain cell cultures. Comparison of the in vitro neurotoxicity data with general cytotoxicity data generated in a non-neuronal cell line and with in vivo data such as acute human lethal blood concentration, revealed that GABA(A) receptor function, acetylcholine esterase activity, cell membrane potential, glucose uptake, total RNA expression and altered gene expression of NF-H, GFAP, MBP, HSP32 and caspase-3 were the best endpoints to use for further testing with 36 additional chemicals. The results of the second analysis showed that no single neuronal endpoint could give a perfect improvement in the in vitro-in vivo correlation, indicating that several specific endpoints need to be analysed and combined with biokinetic data to obtain the best correlation with in vivo acute toxicity.
Abstract:
This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
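In the adaptive-learning literature the statistical learning algorithm mentioned above is typically recursive least squares (RLS). The sketch below shows one decreasing-gain RLS step for an illustrative regression y_t = x_t'β + e_t; it is a generic sketch of the updating scheme, not this paper's model:

```python
# Generic decreasing-gain recursive least squares (RLS) belief updating,
# as in adaptive-learning models; the regression here is illustrative.
import numpy as np

def rls_update(beta, R, x, y, gain):
    """One RLS step: move the second-moment matrix R and the belief
    vector beta toward the latest observation (x, y)."""
    R = R + gain * (np.outer(x, x) - R)
    beta = beta + gain * np.linalg.solve(R, x * (y - x @ beta))
    return beta, R

# On a stable data-generating process, gain_t = 1/(t+1) makes the beliefs
# converge toward the true coefficients (1/(t+1) also keeps R nonsingular
# at the first step).
rng = np.random.default_rng(1)
true_beta = np.array([0.5, -0.2])
beta, R = np.zeros(2), np.eye(2)
for t in range(1, 20001):
    x = np.array([1.0, rng.standard_normal()])
    y = x @ true_beta + 0.1 * rng.standard_normal()
    beta, R = rls_update(beta, R, x, y, 1.0 / (t + 1))
```

In self-referential models like the one in the abstract, each agent's `true_beta` is itself a function of everyone's current beliefs, which is what makes convergence (or its failure, as here) a substantive question.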
Abstract:
The paper discusses maintenance challenges of organisations with a huge number of devices and proposes the use of probabilistic models to assist monitoring and maintenance planning. The proposal assumes connectivity of instruments to report relevant features for monitoring. Also, the existence of enough historical registers with diagnosed breakdowns is required to make probabilistic models reliable and useful for predictive maintenance strategies based on them. Regular Markov models based on estimated failure and repair rates are proposed to calculate the availability of the instruments, and Dynamic Bayesian Networks are proposed to model cause-effect relationships to trigger predictive maintenance services based on the influence between observed features and previously documented diagnostics.
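For the simplest two-state (working/failed) Markov model of this kind, steady-state availability follows directly from the estimated failure and repair rates; in closed form it is A = μ/(λ+μ). A small sketch, where the rates `lam` and `mu` are illustrative values rather than figures from the paper:

```python
# Steady-state availability of a two-state continuous-time Markov chain,
# solved numerically and checked against the closed form mu / (lam + mu).
import numpy as np

def steady_state_availability(lam, mu):
    """Solve pi @ Q = 0 with generator Q = [[-lam, lam], [mu, -mu]]
    (state 0 = up, state 1 = down); availability is pi[0]."""
    Q = np.array([[-lam, lam], [mu, -mu]])
    A = np.vstack([Q.T, np.ones(2)])   # append normalization sum(pi) == 1
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi[0]

lam, mu = 0.01, 0.5   # e.g. failures and repairs per hour (illustrative)
availability = steady_state_availability(lam, mu)   # ≈ mu / (lam + mu)
```

The numerical route generalizes to instruments with more than two states (degraded modes, standby, under repair), where no simple closed form is available.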
Abstract:
Our work is concerned with user modelling in open environments. Our proposal is in line with contributions to advances in user modelling in open environments enabled by Agent Technology, in what has been called the Smart User Model (SUM). Our research contains a holistic study of user modelling in several research areas related to users. We have developed a conceptualization of user modelling by means of examples from a broad range of research areas, with the aim of improving our understanding of user modelling and its role in the next generation of open and distributed service environments. This report is organized as follows: In chapter 1 we introduce our motivation and objectives. Then, in chapters 2, 3, 4 and 5, we provide the state of the art on user modelling. In chapter 2, we give the main definitions of elements described in the report. In chapter 3, we present a historical perspective on user models. In chapter 4, we provide a review of user models from the perspective of different research areas, with special emphasis on the give-and-take relationship between Agent Technology and user modelling. In chapter 5, we describe the main challenges that, from our point of view, need to be tackled by researchers wanting to contribute to advances in user modelling. From the study of the state of the art follows an exploratory work in chapter 6, where we define a SUM and a methodology to deal with it. We also present some case studies in order to illustrate the methodology. Finally, we present the thesis proposal to continue the work, together with its corresponding work schedule and timeline.
Abstract:
Customer satisfaction and retention are key issues for organizations in today's competitive market place. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the inception of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation models (SEM) because it does not rely on strict assumptions about the data. However, this choice was based upon some misconceptions about the use of SEMs and does not take into consideration more recent advances in SEM, including estimation methods that are robust to non-normality and missing data. In this paper, both SEM and PLS approaches were compared by evaluating perceptions of the Isle of Man Post Office products and customer service using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.
Abstract:
Sackung is a widespread post-glacial morphological feature affecting Alpine mountains and creating a characteristic geomorphological expression that can be detected from topography. Over long-term evolution, internal deformation can lead to the formation of rapidly moving phenomena such as rockslides or rock avalanches. In this study, a detailed description of the Sierre rock avalanche (SW Switzerland) is presented. This convex-shaped postglacial instability is one of the largest rock avalanches in the Alps, involving more than 1.5 billion m³ with a run-out distance of about 14 km and an extremely low Fahrböschung angle. This study presents comprehensive analyses of the structural and geological characteristics leading to the development of the Sierre rock avalanche. In particular, by combining field observations, digital elevation model analyses and numerical modelling, the strong influence of both ductile and brittle tectonic structures on the failure mechanism and on the failure surface geometry is highlighted. The detection of pre-failure deformation indicates that the development of the rock avalanche corresponds to the last evolutionary stage of a pre-existing deep-seated gravitational slope instability. These analyses, accompanied by the dating and characterization of rock avalanche deposits, allow the proposal of a destabilization model that clarifies the different phases leading to the development of the Sierre rock avalanche.
Abstract:
Calculating explicit closed-form solutions of Cournot models where firms have private information about their costs is, in general, very cumbersome. Most authors therefore consider linear demands and constant marginal costs. However, within this framework, the nonnegativity constraint on prices (and quantities) has been ignored or not properly dealt with, and the correct calculation of all Bayesian Nash equilibria is more complicated than expected. Moreover, multiple symmetric and interior Bayesian equilibria may exist for an open set of parameters. The reason for this is that linear demand is not really linear, since there is a kink at zero price: the general "linear" inverse demand function is P(Q) = max{a - bQ, 0} rather than P(Q) = a - bQ.
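The kink can be illustrated numerically. The parameter values and the helper names below are hypothetical, and the best-response formula is the standard textbook one, valid only on the interior (positive-price) branch:

```python
# Numerical illustration of the kink in "linear" inverse demand:
# P(Q) = max(a - b*Q, 0) stops at zero instead of going negative.
a, b = 10.0, 1.0   # illustrative demand parameters

def price(Q):
    # the true "linear" inverse demand, kinked at Q = a / b
    return max(a - b * Q, 0.0)

def best_response_interior(c_i, Q_others):
    # textbook Cournot best response for a firm with marginal cost c_i,
    # valid only while the resulting price stays on the interior branch
    return max((a - c_i) / (2 * b) - Q_others / 2.0, 0.0)

print(price(8.0))    # interior branch: a - b*Q = 2.0
print(price(12.0))   # beyond the kink at Q = a/b = 10: price is 0.0, not -2.0
```

Under incomplete information a firm must average over cost types of its rivals, so aggregate output can cross the kink with positive probability; ignoring this and working with P(Q) = a - bQ throughout is exactly the error the abstract describes.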