987 results for GAUSSIAN NUCLEUS MODELS


Relevance: 20.00%

Abstract:

Peroxisome Proliferator-Activated Receptors (PPARs) form a family of three nuclear receptors regulating important cellular and metabolic functions. PPARs control gene expression by directly binding to target promoters as heterodimers with the Retinoid X Receptor (RXR), and their transcriptional activity is enhanced upon activation by natural or pharmacological ligands. The binding of PPAR/RXR heterodimers on target promoters allows the anchoring of a series of coactivators and corepressors involved in promoter remodeling and the recruitment of the transcription machinery. The transcriptional output finally depends on a complex interplay between (i) the respective expression levels of PPARs, RXRs and of other nuclear receptors competing for DNA binding and RXR recruitment, (ii) the availability and the nature of PPAR and RXR ligands, (iii) the expression levels and the nature of the different coactivators and corepressors and (iv) the sequence and the epigenetic status of the promoter. Understanding how all these factors and signals integrate and fine-tune transcription remains a challenge but is necessary to understand the specificity of the physiological functions regulated by PPARs. The work presented herein focuses on the molecular mechanisms of PPAR action and aims at understanding how the interactions and mobility of the receptor modulate transcription in the physiological context of a living cell. Such observations in vivo rely on the use of engineered fluorescent protein chimeras and require the development and the application of complementary imaging techniques such as Fluorescence Recovery After Photobleaching (FRAP), Fluorescence Resonance Energy Transfer (FRET) and Fluorescence Correlation Spectroscopy (FCS).
Using such techniques, we show that PPARs reside exclusively in the nucleus, where they are constitutively associated with RXR, and that transcriptional activation by ligand binding does not promote the formation of sub-nuclear structures as observed with other nuclear receptors. In addition, the engagement of unliganded PPARs in large cofactor complexes in living cells provides a molecular basis for their ligand-independent activity. Ligand binding reduces receptor diffusion by promoting the recruitment of coactivators, which further enlarges PPAR complexes as they acquire full transcriptional competence. Using these molecular approaches, we deciphered the molecular mechanisms through which phthalates, a class of pollutants from the plastics industry, interfere with PPARγ signaling. Mono-ethyl-hexyl-phthalate (MEHP) binding induces the recruitment of a specific subset of cofactors and translates into the expression of a specific subset of target genes, the transcriptional output being strongly conditioned by the differentiation status of the cell. This selective PPARγ modulation induces limited adipogenic effects in cellular models, while exposure to phthalates in animal models leads to protective effects on glucose tolerance and diet-induced obesity. These results demonstrate that phthalates influence lipid and carbohydrate metabolism through complex mechanisms which most likely involve PPARγ but probably also PPARα and PPARβ. Altogether, the molecular and physiological demonstration of the interference of pollutants with PPAR action outlines an important role of chemical exposure in metabolic regulation.

Relevance: 20.00%

Abstract:

This is a retrospective study comparing mobility and scapulohumeral impingement between two different models of reverse shoulder prosthesis. These prostheses were implanted in patients with irreparable rotator cuff tears. This surgery is not free of complications, and one of the most common is scapulohumeral impingement, or notching.

Relevance: 20.00%

Abstract:

Cloud computing and its three facets (Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS)) are terms that denote new developments in the software industry. In particular, PaaS solutions, also referred to as cloud platforms, are changing the way software is being produced, distributed, consumed, and priced. Software vendors have started considering cloud platforms as a strategic option but are battling to redefine their offerings to embrace PaaS. In contrast to SaaS and IaaS, PaaS allows for value co-creation with partners to develop complementary components and applications. It thus requires multisided business models that bring together two or more distinct customer segments. Understanding how to design PaaS business models to establish a flourishing ecosystem is crucial for software vendors. This doctoral thesis aims to address this issue in three interrelated research parts. First, based on case study research, the thesis provides a deeper understanding of current PaaS business models and their evolution. Second, it analyses and simulates consumers' preferences regarding PaaS business models, using a conjoint approach to find out what determines the choice of cloud platforms. Finally, building on the previous research outcomes, the third part introduces a design theory for the emerging class of PaaS business models, which is grounded on an extensive action design research study with a large European software vendor. Understanding PaaS business models from a market as well as a consumer perspective will, together with the design theory, inform and guide decision makers in their business model innovation plans. It also closes gaps in the research related to PaaS business model design and more generally related to platform business models.

Relevance: 20.00%

Abstract:

Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the nature of each gene's interactions is believed to play a key role in the stability of the structure. With advances in biology, efforts have been made to develop update functions for Boolean models that incorporate recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. Results of this validation hint at the increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos into the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows regimes to be discriminated quantitatively. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful for guiding experimental research.
The update function confers additional realism to the model, while reducing the complexity and solution space, thus making it easier to investigate.
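A threshold-based Boolean update of the kind described can be sketched in a few lines. The network, interaction signs and state below are illustrative inventions, not the yeast or stem-cell networks used in the study:

```python
import numpy as np

# Hypothetical 4-gene network: signed interaction matrix W, where
# W[i, j] = +1 if gene j promotes gene i, -1 if it represses it, 0 otherwise.
W = np.array([
    [ 0,  1,  0, -1],
    [ 1,  0, -1,  0],
    [ 0,  1,  0,  1],
    [-1,  0,  1,  0],
])

def threshold_update(state, W, theta=0.0):
    """Synchronous threshold update: a gene switches on when the summed
    promoting-minus-repressing input from its regulators exceeds theta,
    switches off when the input falls below theta, and keeps its current
    value at exactly theta."""
    field = W @ state
    nxt = state.copy()
    nxt[field > theta] = 1
    nxt[field < theta] = 0
    return nxt

state = np.array([1, 0, 1, 0])
for _ in range(5):  # iterate the dynamics for a few steps
    state = threshold_update(state, W)
```

Iterating the map from different initial states reveals fixed points and cycles; sweeping the parameters while measuring state divergence is how Derrida-style order/chaos diagnostics are typically obtained.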

Relevance: 20.00%

Abstract:

See the abstract at the beginning of the attached document.

Relevance: 20.00%

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions that obey a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics, all necessary for building quantitative strategies. We also contrast these models with real market data, sampled at one-minute frequency, from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped under so-called technical models and the latter under so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signals they generate pass the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.
A model for forecasting any economic or financial magnitude can be defined with scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving. This is why we emphasize the calibration of the strategies' parameters to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented from scratch in MATLAB as part of this thesis. No other mathematical or statistical software was used.
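The Ornstein-Uhlenbeck spread and the resulting market-neutral signal can be sketched as follows. The project itself used MATLAB; this Python version with invented parameter values (kappa, mu, sigma) is only an illustration of the mechanics, not the thesis code:

```python
import numpy as np

rng = np.random.default_rng(42)

# Euler-Maruyama simulation of an Ornstein-Uhlenbeck spread
#   dX = kappa * (mu - X) dt + sigma dW
# with illustrative parameter values (not taken from the thesis).
kappa, mu, sigma = 5.0, 0.0, 0.3
dt, n = 1.0 / 252, 2520
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = x[t - 1] + kappa * (mu - x[t - 1]) * dt \
         + sigma * np.sqrt(dt) * rng.standard_normal()

# Market-neutral signal on the spread: short when the z-score is
# stretched upward, long when stretched downward, flat otherwise.
z = (x - x.mean()) / x.std()
position = np.where(z > 1.5, -1, np.where(z < -1.5, 1, 0))
```

In a pairs-trading setting the spread x would be the (co-integrated) combination of two asset prices rather than a simulated path, and the entry/exit bands would be calibrated on rolling windows.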

Relevance: 20.00%

Abstract:

In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006) but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano type test. We conclude that threshold correlation modelling gives rise to a significant reduction in portfolio's variance.
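The core of the TCC estimator — regime-wise sample correlation matrices selected by a grid search over the threshold — can be sketched as below. The data are synthetic and the grid-search objective here (distance between the two regime matrices) is a stand-in for the paper's actual criterion:

```python
import numpy as np

def tcc_correlations(returns, transition, threshold):
    """Sample correlation matrix of the returns in each regime defined by
    thresholding the transition variable; sample correlation matrices are
    positive (semi-)definite by construction, which is the appealing
    property of the TCC estimator noted above."""
    low = returns[transition <= threshold]
    high = returns[transition > threshold]
    return np.corrcoef(low, rowvar=False), np.corrcoef(high, rowvar=False)

# Illustrative data: two assets whose correlation switches on when a
# hypothetical transition variable exceeds 0.5.
rng = np.random.default_rng(0)
n = 1000
transition = rng.standard_normal(n)
noise = rng.standard_normal((n, 2))
returns = noise.copy()
mask = transition > 0.5
returns[mask, 1] = 0.8 * noise[mask, 0] + 0.2 * noise[mask, 1]

# Simple grid search over candidate thresholds (sample quantiles of the
# transition variable), keeping the split with the most distinct regimes.
grid = np.quantile(transition, np.linspace(0.2, 0.8, 13))
best = max(grid, key=lambda c: np.abs(
    np.subtract(*tcc_correlations(returns, transition, c))).sum())
```

A multi-threshold specification would simply apply different transition variables and thresholds to different blocks of the correlation matrix.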

Relevance: 20.00%

Abstract:

Functional divergence between homologous proteins is expected to affect amino acid sequences in two main ways, which can be considered as proxies of biochemical divergence: a "covarion-like" pattern of correlated changes in evolutionary rates, and switches in conserved residues ("conserved but different"). Although these patterns have been used in case studies, a large-scale analysis is needed to estimate their frequency and distribution. We use a phylogenomic framework of animal genes to answer three questions: 1) What is the prevalence of such patterns? 2) Can we link such patterns at the amino acid level with selection inferred at the codon level? 3) Are patterns different between paralogs and orthologs? We find that covarion-like patterns are more frequently detected than "conserved but different," but that only the latter are correlated with signal for positive selection. Finally, there is no obvious difference in patterns between orthologs and paralogs.

Relevance: 20.00%

Abstract:

Prevention of Trypanosoma cruzi infection in mammals likely depends on either preventing the invading trypomastigotes from infecting host cells or the rapid recognition and killing of the newly infected cells by T. cruzi-specific T cells. We show here that multiple rounds of infection and cure (by drug therapy) fail to protect mice from reinfection, despite the generation of potent T cell responses. This disappointing result is similar to that obtained with many other vaccine protocols used in attempts to protect animals from T. cruzi infection. We have previously shown that immune recognition of T. cruzi infection is significantly delayed both at the systemic level and at the level of the infected host cell. The systemic delay appears to be the result of a stealth infection process that fails to trigger substantial innate recognition mechanisms, while the delay at the cellular level is related to the immunodominance of highly variable gene family proteins, in particular those of the trans-sialidase family. Here we discuss how these previous studies and the new findings herein impact our thoughts on the potential of prophylactic vaccination to serve a productive role in the prevention of T. cruzi infection and Chagas disease.

Relevance: 20.00%

Abstract:

In the histomorphological grading of prostate carcinoma, pathologists have regularly assigned comparable scores for the architectural Gleason and the now-obsolete nuclear World Health Organization (WHO) grading systems. Although both systems demonstrate good correspondence between grade and survival, they are based on fundamentally different biological criteria. We tested the hypothesis that this apparent concurrence between the two grading systems originates from an interpretation bias in the minds of diagnostic pathologists, rather than reflecting a biological reality. Three pathologists graded 178 prostatectomy specimens, assigning Gleason and WHO scores on glass slides and on digital images of nuclei isolated out of their architectural context. The results were analysed with respect to interdependencies among the grading systems, to tumour recurrence (PSA relapse > 0.1 ng/ml at 48 months) and robust nuclear morphometry, as assessed by computer-assisted image analysis. WHO and Gleason grades were strongly correlated (r = 0.82) and demonstrated identical prognostic power. However, WHO grades correlated poorly with nuclear morphology (r = 0.19). Grading of nuclei isolated out of their architectural context significantly improved accuracy for nuclear morphology (r = 0.55), but the prognostic power was virtually lost. In conclusion, the architectural organization of a tumour, which the pathologist cannot avoid noticing during initial slide viewing at low magnification, unwittingly influences the subsequent nuclear grade assignment. In our study, the prognostic power of the WHO grading system was dependent on visual assessment of tumour growth pattern. We demonstrate for the first time the influence a cognitive bias can have in the generation of an error in diagnostic pathology and highlight a considerable problem in histopathological tumour grading.

Relevance: 20.00%

Abstract:

Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis.

Relevance: 20.00%

Abstract:

The objective of the EU funded integrated project "ACuteTox" is to develop a strategy in which general cytotoxicity, together with organ-specific endpoints and biokinetic features, are taken into consideration in the in vitro prediction of oral acute systemic toxicity. With regard to the nervous system, the effects of 23 reference chemicals were tested with approximately 50 endpoints, using a neuronal cell line, primary neuronal cell cultures, brain slices and aggregated brain cell cultures. Comparison of the in vitro neurotoxicity data with general cytotoxicity data generated in a non-neuronal cell line and with in vivo data such as acute human lethal blood concentration, revealed that GABA(A) receptor function, acetylcholine esterase activity, cell membrane potential, glucose uptake, total RNA expression and altered gene expression of NF-H, GFAP, MBP, HSP32 and caspase-3 were the best endpoints to use for further testing with 36 additional chemicals. The results of the second analysis showed that no single neuronal endpoint could give a perfect improvement in the in vitro-in vivo correlation, indicating that several specific endpoints need to be analysed and combined with biokinetic data to obtain the best correlation with in vivo acute toxicity.

Relevance: 20.00%

Abstract:

This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data-generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium, and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
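The statistical learning algorithm in this literature is commonly recursive least squares (RLS). The sketch below shows a constant-gain RLS update of an agent's belief about a hypothetical linear data-generating process; the model, parameters and gain are illustrative, not the paper's:

```python
import numpy as np

def rls_update(beta, R, x, y, gain):
    """One step of constant-gain recursive least squares: the belief beta
    about the linear relation y = x @ beta is revised as the new
    observation (x, y) arrives; R tracks the regressors' second moments."""
    R = R + gain * (np.outer(x, x) - R)
    beta = beta + gain * np.linalg.solve(R, x * (y - x @ beta))
    return beta, R

# Illustrative data-generating process y = 1.0 + 0.5 * x + small noise;
# the agent starts from uninformative beliefs and learns the coefficients.
rng = np.random.default_rng(1)
beta, R = np.zeros(2), np.eye(2)
for _ in range(5000):
    x = np.array([1.0, rng.standard_normal()])
    y = 1.0 + 0.5 * x[1] + 0.05 * rng.standard_normal()
    beta, R = rls_update(beta, R, x, y, gain=0.02)
```

In the two-sided setup, both private agents and the central bank run such a recursion on their own (possibly different) observables, and their actions feed back into the data each side learns from, which is what generates the extra volatility and persistence.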

Relevance: 20.00%

Abstract:

The paper discusses the maintenance challenges of organisations with a huge number of devices and proposes the use of probabilistic models to assist monitoring and maintenance planning. The proposal assumes connectivity of instruments to report relevant features for monitoring. The existence of sufficient historical registers with diagnosed breakdowns is also required to make the probabilistic models reliable and useful for the predictive maintenance strategies based on them. Regular Markov models based on estimated failure and repair rates are proposed to calculate the availability of the instruments, and Dynamic Bayesian Networks are proposed to model cause-effect relationships that trigger predictive maintenance services based on the influence between observed features and previously documented diagnostics.
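For a single instrument, the Markov availability calculation reduces to the steady state of a two-state (up/down) chain. A minimal sketch, with illustrative rates rather than values estimated from any real breakdown register:

```python
def availability(lam, mu):
    """Steady-state availability of a two-state continuous-time Markov
    chain with failure rate lam and repair rate mu:
        A = mu / (lam + mu),
    equivalently MTBF / (MTBF + MTTR)."""
    return mu / (lam + mu)

lam = 1 / 1000  # illustrative: one failure per 1000 hours on average
mu = 1 / 10     # illustrative: repairs take 10 hours on average
print(availability(lam, mu))  # ≈ 0.990
```

In practice lam and mu would be estimated from the historical registers of diagnosed breakdowns, and the Dynamic Bayesian Network layer would sit on top, linking observed instrument features to the probability of each documented failure mode.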