942 results for Attainable Sets


Relevance:

20.00%

Publisher:

Abstract:

See the abstract at the beginning of the document in the attached file.

Astrocytes play active roles in brain physiology by dynamic interactions with neurons. Connexin 30, one of the two main astroglial gap-junction subunits, is thought to be involved in behavioral and basic cognitive processes. However, the underlying cellular and molecular mechanisms are unknown. We show here in mice that connexin 30 controls hippocampal excitatory synaptic transmission through modulation of astroglial glutamate transport, which directly alters synaptic glutamate levels. Unexpectedly, we found that connexin 30 regulated cell adhesion and migration and that connexin 30 modulation of glutamate transport, occurring independently of its channel function, was mediated by morphological changes controlling insertion of astroglial processes into synaptic clefts. By setting excitatory synaptic strength, connexin 30 plays an important role in long-term synaptic plasticity and in hippocampus-based contextual memory. Taken together, these results establish connexin 30 as a critical regulator of synaptic strength by controlling the synaptic location of astroglial processes.

BACKGROUND: The evolutionary lineage leading to the teleost fish underwent a whole-genome duplication termed FSGD or 3R, in addition to two earlier genome duplications during vertebrate evolution (termed 1R and 2R). As a result of the FSGD, fish carry additional gene copies compared to tetrapods, whose lineage did not experience the 3R duplication. Interestingly, despite this additional genome duplication, ParaHox genes do not differ in number between extant teleost fishes and mammals, but they are distributed over twice as many paralogous regions in fish genomes. RESULTS: We determined the DNA sequence of the entire ParaHox C1 paralogon in the East African cichlid fish Astatotilapia burtoni and compared it to orthologous regions in other vertebrate genomes as well as to the paralogous vertebrate ParaHox D paralogons. Evolutionary relationships among genes from these four chromosomal regions were studied with several phylogenetic algorithms. We provide evidence that the genes of the ParaHox C paralogous cluster are duplicated in teleosts, as was shown previously for the D paralogon genes. Overall, however, synteny and cluster integrity seem to be less conserved in ParaHox gene clusters than in Hox gene clusters. Comparative analyses of non-coding sequences uncovered conserved, possibly co-regulatory elements, which are likely to contain promoter motifs of the genes belonging to the ParaHox paralogons. CONCLUSION: There seems to be strong stabilizing selection for gene order as well as gene orientation in the ParaHox C paralogon, since, with a few exceptions, only the lengths of the introns and intergenic regions differ between the distantly related species examined.
The high degree of evolutionary conservation of this gene cluster's architecture in particular, and possibly of gene clusters more generally, might be linked to the presence of promoter, enhancer or inhibitor motifs that regulate more than one gene. Deletions, inversions or relocations of individual genes could therefore destroy the regulation of the clustered genes in this region. The existence of such a regulatory network might explain the evolutionary conservation of gene order and orientation over hundreds of millions of years of vertebrate evolution. Another possible explanation for the highly conserved gene order is the existence of a regulator located not immediately next to its corresponding gene but further away, since a relocation or inversion would interrupt this interaction. Different ParaHox clusters were found to have experienced differential gene loss in teleosts. Yet the complete set of these homeobox genes was maintained, albeit distributed over almost twice the number of chromosomes. Selection due to dosage effects and/or stoichiometric disturbance might act more strongly to maintain a modal number of homeobox genes (and possibly of transcription factors more generally) per genome, yet permit the accumulation of other (non-regulatory) genes associated with these homeobox gene clusters.

Given the very large amount of data obtained every day through population surveys, much new research could use this information instead of collecting new samples. Unfortunately, the relevant data are often scattered across different files obtained through different sampling designs. Data fusion is a set of methods for combining information from different sources into a single dataset. In this article, we are interested in a specific problem: the fusion of two data files, one of which is quite small. We propose a model-based procedure combining a logistic regression with an Expectation-Maximization algorithm. Results show that, despite the lack of data, this procedure can outperform standard matching procedures.
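The combination of a logistic regression with an EM loop can be sketched on toy data: a large file observes both covariates and a binary outcome, a small file observes covariates only, and the missing outcome is imputed in the E-step while a weighted logistic regression is refit in the M-step. This is a minimal illustration, not the article's actual model; the simulated data, variable names and the IRLS fitting routine are all our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the two survey files (assumed data, not the article's):
# file A (large) observes covariates X and a binary outcome Y;
# file B (small) observes X only, so Y must be inferred.
n_a, n_b = 1000, 60
coef_true = np.array([1.5, -1.0])
X_a = rng.normal(size=(n_a, 2))
y_a = (X_a @ coef_true + rng.normal(size=n_a) > 0).astype(float)
X_b = rng.normal(size=(n_b, 2))
y_b_true = (X_b @ coef_true + rng.normal(size=n_b) > 0).astype(float)

def fit_logistic(X, y, w):
    """Weighted logistic regression via Newton's method (IRLS)."""
    A = np.c_[np.ones(len(X)), X]
    beta = np.zeros(A.shape[1])
    for _ in range(25):
        p = 1.0 / (1.0 + np.exp(-np.clip(A @ beta, -30, 30)))
        grad = A.T @ (w * (y - p))
        hess = (A * (w * p * (1.0 - p))[:, None]).T @ A
        beta = beta + np.linalg.solve(hess, grad)
    return beta

def proba(X, beta):
    A = np.c_[np.ones(len(X)), X]
    return 1.0 / (1.0 + np.exp(-np.clip(A @ beta, -30, 30)))

# Initialize on the fully observed file, then alternate E- and M-steps.
beta = fit_logistic(X_a, y_a, np.ones(n_a))
for _ in range(20):
    p_b = proba(X_b, beta)  # E-step: posterior P(Y=1 | X) on file B
    # M-step: each file-B row enters twice, once per label, weighted by its posterior.
    X_all = np.vstack([X_a, X_b, X_b])
    y_all = np.concatenate([y_a, np.ones(n_b), np.zeros(n_b)])
    w_all = np.concatenate([np.ones(n_a), p_b, 1.0 - p_b])
    beta = fit_logistic(X_all, y_all, w_all)

y_b_hat = (proba(X_b, beta) > 0.5).astype(float)
accuracy = (y_b_hat == y_b_true).mean()
```

The fractional-weight trick in the M-step is a standard way to express the expected complete-data log-likelihood for a binary latent variable without modifying the fitting routine.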

Topological indices have been applied to build QSAR models for a set of 20 antimalarial cyclic peroxy ketals. In order to evaluate the reliability of the proposed linear models, leave-n-out and Internal Test Set (ITS) approaches have been considered. The proposed procedure resulted in a robust and consensus prediction equation, and we show why it is superior to the standard cross-validation algorithms employed with multilinear regression models.

This paper presents several algorithms for joint estimation of the target number and state in a time-varying scenario. Building on the results presented in [1], which considers estimation of the target number only, we assume that not only the target number but also the targets' state evolution must be estimated. In this context, we extend the Rao-Blackwellization procedure of [1] to this new scenario in order to compute the Bayes recursions, thus defining reduced-complexity solutions for the multi-target set estimator. A performance assessment is finally given both in terms of Circular Position Error Probability, aimed at evaluating the accuracy of the estimated tracks, and in terms of Cardinality Error Probability, aimed at evaluating the reliability of the target number estimates.

We consider the application of normal-theory methods to the estimation and testing of a general type of multivariate regression model with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables possibly deviate from normality. The various samples to be merged can differ in the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equation models, such as LISREL, EQS and LISCOMP, among others. An illustration with Monte Carlo data is presented.

In the homogeneous case of one-dimensional objects, we show that any preference relation that is positive and homothetic can be represented by a quantitative utility function and a unique bias. The bias may favor or disfavor the preference for an object. In the first case, preferences are complete but not transitive, and an object may be preferred even when its utility is lower. In the second case, preferences are asymmetric and transitive but not negatively transitive, and a greater utility may not be sufficient for an object to be preferred. In this manner, the bias reflects the extent to which preferences depart from the maximization of a utility function.
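Schematically, such a utility-plus-bias representation can be written as follows (the notation and the multiplicative form are our own reading of the abstract, not necessarily the paper's):

```latex
% x is preferred to y when its utility clears a biased threshold:
x \succ y \iff u(x) > \beta\, u(y), \qquad \beta > 0.
% \beta < 1: the bias favors preference, so x may be preferred even
%            with u(x) < u(y); preferences are complete but not transitive.
% \beta > 1: the bias disfavors preference, so u(x) > u(y) alone is not
%            sufficient; preferences are asymmetric and transitive but
%            not negatively transitive.
```

The unbiased case β = 1 collapses to ordinary utility maximization, which is why the distance of β from 1 measures the departure from it.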

This article examines the extent and limits of nonstate forms of authority in international relations. It analyzes how the information and communication technology (ICT) infrastructure for the tradability of services in a global knowledge-based economy relies on informal regulatory practices for the adjustment of ICT-related skills. By focusing on the challenge that highly volatile and short-lived cycles of demand for this type of knowledge pose for ensuring the right qualification of the labor force, the article explores how companies and associations provide training and certification programs as part of a growing market for educational services in which they set their own standards. The existing literature on non-conventional forms of authority in the global political economy has emphasized that the consent of actors, subject to informal rules and some form of state support, remains crucial for the effectiveness of those new forms of power. However, analyses based on a limited sample of actors tend toward a narrow understanding of the issues concerned and fail to fully explore the differentiated space in which nonstate authority is emerging. This article develops a three-dimensional analytical framework that brings together the scope of the issues involved, the range of nonstate actors concerned, and the spatial scope of their authority. The empirical findings highlight the limits of these new forms of nonstate authority and shed light on the role of the state and international governmental organizations in this new context.

This article gives a necessary condition for the bargaining sets defined by Shimomura (1997) and the core of a cooperative game with transferable utility to coincide. To this end, the concept of maximal-payoff vectors is introduced. The necessary condition consists in verifying that these vectors belong to the core of the game.

Naturally acquired immune responses against human cancers often include CD8(+) T cells specific for the cancer testis antigen NY-ESO-1. Here, we studied T cell receptor (TCR) primary structure and function of 605 HLA-A*0201/NY-ESO-1(157-165)-specific CD8 T cell clones derived from five melanoma patients. We show that a large proportion of tumor-reactive T cells preferentially use TCR AV3S1/BV8S2 chains, with remarkably conserved CDR3 amino acid motifs and lengths in both chains. All remaining T cell clones belong to two additional sets expressing BV1 or BV13 TCRs, associated with alpha-chains with highly diverse VJ usage, CDR3 amino acid sequence, and length. Yet all T cell clonotypes recognize tumor antigen with similar functional avidity. Two residues, Met-160 and Trp-161, located in the middle region of the NY-ESO-1(157-165) peptide, are critical for recognition by most of the T cell clonotypes. Collectively, our data show that a large number of αβ TCRs, belonging to three distinct sets (AVx/BV1, AV3/BV8, AVx/BV13), bind pMHC with equal antigen sensitivity and recognize the same peptide motif. Finally, this in-depth study of the recognition of a self-antigen suggests that partly similar biophysical mechanisms shape TCR repertoires toward foreign and self-antigens.

Preface. The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, made them the models of choice for many theoretical constructions and practical applications. At the same time, estimating the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem comes from the variance process, which is not observable. Several estimation methodologies deal with the estimation of latent variables; one appeared particularly interesting. It proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process: the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function. However, the procedure was derived only for stochastic volatility models without jumps, and thus it became the subject of my research. This thesis consists of three parts, each written as an independent and self-contained article. At the same time, the questions answered by the second and third parts of this work arise naturally from the issues investigated and the results obtained in the first one. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function for stochastic volatility jump-diffusion models.
The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equation are relevant for modelling returns of the S&P500 index, which has been chosen as a general representative of the stock asset class. Hence, the next question is: what jump process should be used to model returns of the S&P500? In the framework of affine jump-diffusion models, the decision about the jump process boils down to defining the intensity of the compound Poisson process, either a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P500 index with a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either the exponential or the double exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained. In the absence of a benchmark or any other ground for comparison, there is no reason to be sure that our parameter estimates and the true parameters of the models coincide. The conclusion of the second chapter provides one more reason to perform such a test. Thus, the third part of this thesis concentrates on estimating the parameters of stochastic volatility jump-diffusion models from asset price time series simulated from various "true" parameter sets.
The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of finding the true parameters, and the third chapter proves that our estimator indeed has this ability. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question immediately appears: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used for its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure is. In practice, however, this relationship is not so straightforward, due to increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, unconditional characteristic function. As a result, the preference for one or the other depends on the model to be estimated; thus, the computational effort can in some cases be reduced without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of the estimators with bi- and three-dimensional unconditional characteristic functions on simulated data.
It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, due to the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for estimating the parameters of stochastic volatility jump-diffusion models.
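To make the idea of a characteristic-function-based estimator concrete, here is a deliberately simplified sketch: recovering the two parameters of an i.i.d. normal sample by minimizing the distance between the empirical and the model characteristic functions over a coarse grid search. The thesis's estimator targets the joint unconditional characteristic function of a stochastic volatility jump-diffusion model, which is substantially more involved; the sample, the grid of CF arguments and the distance below are our illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# I.i.d. normal "returns" stand in for the richer jump-diffusion dynamics.
mu_true, sigma_true = 0.05, 0.2
x = rng.normal(loc=mu_true, scale=sigma_true, size=5000)

t = np.linspace(-10.0, 10.0, 81)                 # grid of CF arguments
ecf = np.exp(1j * np.outer(t, x)).mean(axis=1)   # empirical characteristic function

def distance(mu, sigma):
    """Squared distance between the model CF (normal) and the empirical CF."""
    cf = np.exp(1j * t * mu - 0.5 * (sigma * t) ** 2)
    return np.sum(np.abs(cf - ecf) ** 2)

# Crude grid search in place of a proper optimizer.
mus = np.linspace(-0.2, 0.3, 51)
sigmas = np.linspace(0.05, 0.5, 46)
_, mu_hat, sigma_hat = min(
    (distance(m, s), m, s) for m in mus for s in sigmas
)
```

Since the empirical CF converges to the true CF as the sample grows, the minimizer of this distance is a consistent estimator; the thesis's continuous version replaces the grid sum with an integral over the CF argument and a numerical optimizer over the model parameters.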