77 results for "Missing values structures"
Abstract:
In this paper, we address this problem through the design of a semiactive controller based on mixed H2/H∞ control theory. The vibrations caused by seismic motions are mitigated by a semiactive damper installed at the bottom of the structure. By semiactive damper we mean a device that can absorb energy from the system but cannot inject energy into it. Sufficient conditions for the design of the desired control are given in terms of linear matrix inequalities (LMIs). A controller that guarantees asymptotic stability and a mixed H2/H∞ performance is then developed. An algorithm is proposed to handle the semiactive nature of the actuator. The performance of the controller is experimentally evaluated in a real-time hybrid testing facility that consists of a physical specimen (a small-scale magnetorheological damper) and a numerical model (a large-scale three-story building).
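The semiactive constraint described in this abstract (a damper can only absorb energy, never inject it) is commonly enforced by clipping the desired control force. A minimal sketch of such a clipping rule follows; the function name, threshold and capacity value are illustrative assumptions, not taken from the paper:

```python
def clip_semiactive(f_desired, velocity, f_max=1000.0):
    """Clip a desired control force so a semiactive damper can realize it.

    A damper can only dissipate energy: the applied force must oppose the
    relative velocity across the device (f * v <= 0). If the desired force
    would inject energy, the best a semiactive device can do is apply no
    force. The magnitude is also saturated at the device capacity f_max.
    All names and values here are illustrative, not from the paper.
    """
    if f_desired * velocity > 0:   # force would inject energy -> not realizable
        return 0.0
    # Dissipative: apply it, saturated at the device capacity.
    return max(-f_max, min(f_max, f_desired))

# A damper moving with positive velocity can only push back (negative force):
print(clip_semiactive(-500.0, 0.2))   # dissipative, applied: -500.0
print(clip_semiactive(+500.0, 0.2))   # would inject energy: 0.0
```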
Abstract:
This short paper addresses the problem of designing QFT (quantitative feedback theory) based controllers for vibration reduction in a 6-story building structure equipped with shear-mode magnetorheological dampers. A new methodology is proposed for characterizing the nonlinear hysteretic behavior of the MR damper through an uncertainty template in the Nichols chart. The design procedure for QFT control design is briefly presented.
Abstract:
The objective of the present research is to find control design strategies that are effective and close to real operating conditions. As a novel contribution to structural control strategies, the theories of Interval Modal Arithmetic, Backstepping Control and QFT (Quantitative Feedback Theory) will be studied. The steps to follow are first to develop new controllers based on the above theories and then to apply the proposed control strategies to different kinds of structures. The report is organized as follows. Chapter 2 presents the state of the art on structural control systems. Chapter 3 presents the most important open problems found in the field of structural control. The exploratory work done by the author, the research proposal and the working plan are given in Chapter 4.
Abstract:
Research project carried out during a stay at the Max Planck Institute for Human Cognitive and Brain Sciences, Germany, between 2010 and 2012. The main objective of this project was to study subcortical structures in detail, specifically the role of the basal ganglia in cognitive control during linguistic and non-linguistic processing. To achieve a fine-grained differentiation of the individual basal ganglia nuclei, ultra-high-field, high-resolution magnetic resonance imaging (7T-MRI) was used. The lateral prefrontal cortex and the basal ganglia work together to mediate working memory and the top-down regulation of cognition. This circuit regulates the balance between automatic and higher-order cognitive responses. Three main experimental conditions were created: non-ambiguous, ungrammatical and ambiguous sentences/sequences. Non-ambiguous sentences/sequences should elicit an automatic response, whereas ambiguous and ungrammatical sentences/sequences produce a conflict with the automatic response and therefore require a higher-order cognitive response. Within the domain of controlled responding, ambiguity and ungrammaticality represent two different dimensions of conflict resolution: while a temporarily ambiguous sentence/sequence has a correct interpretation, this is not the case for ungrammatical sentences/sequences. In addition, the experimental design included a linguistic and a non-linguistic manipulation, which tested the hypothesis that the effects are domain-general, as well as a semantic and a syntactic manipulation that assessed the differences between the processing of "intrinsic" vs. "structural" ambiguity/error. The results of the first experiment (linguistic syntax) showed a rostroventral-caudodorsal gradient of cognitive control within the caudate nucleus, that is, the more rostral regions supporting the higher levels of cognitive processing.
Abstract:
Background: Recent advances in high-throughput technologies have produced a vast amount of protein sequences, while the number of high-resolution structures has seen only a limited increase. This has impelled the development of many strategies to build protein structures from their sequences, generating a considerable number of alternative models. The selection of the model closest to the native conformation has thus become crucial for structure prediction. Several methods have been developed to score protein models by energies, knowledge-based potentials, or a combination of both. Results: Here, we present and demonstrate a theory to split knowledge-based potentials into biologically meaningful scoring terms and to combine them into new scores to predict near-native structures. Our strategy allows circumventing the problem of defining the reference state. In this approach we give the proof for a simple, linear application that can be further improved by optimizing the combination of Z-scores. Using the simplest composite score, we obtained predictions similar to state-of-the-art methods. Besides, our approach has the advantage of identifying the most relevant terms involved in the stability of the protein structure. Finally, we also use the composite Z-scores to assess the conformation of models and to detect local errors. Conclusion: We have introduced a method to split knowledge-based potentials and to solve the problem of defining a reference state. The new scores have detected near-native structures as accurately as state-of-the-art methods and have been successful in identifying wrongly modeled regions of many near-native conformations.
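The core idea of the abstract above, standardizing individual scoring terms as Z-scores and combining them linearly to rank candidate models, can be sketched as follows. The term names, toy values and the equal-weight sum are illustrative assumptions, not the authors' actual potentials:

```python
from statistics import mean, stdev

def composite_zscore(models, terms):
    """Combine several scoring terms into one composite Z-score per model.

    models: dict model_name -> dict term_name -> raw score (lower = better).
    Each term is standardized across the candidate models (Z-score) so that
    terms on different scales contribute comparably; the composite is their
    sum. Equal weights are an illustrative choice, not the paper's.
    """
    z = {m: 0.0 for m in models}
    for t in terms:
        vals = [models[m][t] for m in models]
        mu, sd = mean(vals), stdev(vals)
        for m in models:
            z[m] += (models[m][t] - mu) / sd
    return z

# Hypothetical scores for three candidate models of the same protein:
scores = {
    "model_A": {"pair_energy": -120.0, "solvation": -30.0},
    "model_B": {"pair_energy": -100.0, "solvation": -25.0},
    "model_C": {"pair_energy": -140.0, "solvation": -35.0},
}
z = composite_zscore(scores, ["pair_energy", "solvation"])
best = min(z, key=z.get)   # lowest composite Z-score ~ most native-like
print(best)                # -> model_C
```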
Abstract:
Background: Alternatively spliced exons play an important role in the diversification of gene function in most metazoans and are highly regulated by conserved motifs in exons and introns. Two contradictory properties have been associated with evolutionarily conserved alternative exons: higher sequence conservation and a higher rate of non-synonymous substitutions, relative to constitutive exons. In order to clarify this issue, we have performed an analysis of the evolution of alternative and constitutive exons, using a large set of protein-coding exons conserved between human and mouse and taking into account the conservation of the transcript exonic structure. Further, we have defined a measure of the variation of the arrangement of exonic splicing enhancers (the ESE-conservation score) to study the evolution of splicing regulatory sequences. We have used this measure to correlate the changes in the arrangement of ESEs with the divergence of exon and intron sequences. Results: We find evidence for a relation between the lack of conservation of the exonic structure and the weakening of the sequence evolutionary constraints in alternative and constitutive exons. Exons in transcripts with non-conserved exonic structures have higher synonymous (dS) and non-synonymous (dN) substitution rates than exons in conserved structures. Moreover, alternative exons in transcripts with non-conserved exonic structure are the least constrained in sequence evolution, and at high EST-inclusion levels they are found to be very similar to constitutive exons, whereas alternative exons in transcripts with conserved exonic structure have a dS significantly lower than average at all EST-inclusion levels. We also find higher conservation in the arrangement of ESEs in constitutive exons compared to alternative ones.
Additionally, the sequence conservation at flanking introns remains constant for constitutive exons at all ESE-conservation values, but increases for alternative exons at high ESE-conservation values. Conclusion: We conclude that most of the differences in dN observed between alternative and constitutive exons can be explained by the conservation of the transcript exonic structure. Low dS values are more characteristic of alternative exons with conserved exonic structure, but not of those with non-conserved exonic structure. Additionally, constitutive exons are characterized by a higher conservation in the arrangement of ESEs, and alternative exons with an ESE-conservation similar to that of constitutive exons are characterized by a conservation of the flanking intron sequences higher than average, indicating the presence of more intronic regulatory signals.
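To make the synonymous/non-synonymous (dS/dN) contrast used above concrete, here is a toy count of the two kinds of codon differences between two aligned coding sequences. The tiny codon table covers only the codons in the example, and the counting ignores multiple-hit pathways and site normalization, so this is a didactic sketch, not a substitute for proper dS/dN estimation:

```python
# Minimal codon -> amino acid table (only the codons used below;
# a real implementation would use the full standard genetic code).
CODON_AA = {
    "TTA": "L", "TTG": "L",   # synonymous pair (both leucine)
    "GAT": "D", "GAA": "E",   # non-synonymous pair (Asp vs Glu)
    "AAA": "K", "AAG": "K",   # synonymous pair (both lysine)
}

def count_substitutions(seq1, seq2):
    """Count synonymous vs non-synonymous codon differences in an alignment."""
    syn = nonsyn = 0
    for i in range(0, len(seq1), 3):
        c1, c2 = seq1[i:i+3], seq2[i:i+3]
        if c1 == c2:
            continue
        if CODON_AA[c1] == CODON_AA[c2]:
            syn += 1          # nucleotide change, same amino acid
        else:
            nonsyn += 1       # amino acid changes
    return syn, nonsyn

syn, nonsyn = count_substitutions("TTAGATAAA", "TTGGAAAAG")
print(syn, nonsyn)   # 2 synonymous (L->L, K->K), 1 non-synonymous (D->E)
```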
Abstract:
Background: The understanding of whole-genome sequences in higher eukaryotes depends to a large degree on the reliable definition of transcription units, including exon/intron structures, translated open reading frames (ORFs) and flanking untranslated regions. The best currently available chicken transcript catalog is the Ensembl build, based on the mapping of a relatively small number of full-length cDNAs and ESTs to the genome, as well as in silico gene predictions derived from the genome sequence. Results: We use Long Serial Analysis of Gene Expression (LongSAGE) in bursal lymphocytes and the DT40 cell line to verify the quality and completeness of the annotated transcripts. 53.6% of the more than 38,000 unique SAGE tags (unitags) match full-length bursal cDNAs, the Ensembl transcript build or the genome sequence. The majority of the matching unitags show single matches to the genome, but no matches to the genome-derived Ensembl transcript build. Nevertheless, most of these tags map close to the 3' boundaries of annotated Ensembl transcripts. Conclusions: These results suggest that rather few genes are missing in the current Ensembl chicken transcript build, but that the 3' ends of many transcripts may not have been accurately predicted. The tags with no match in the transcript sequences can now be used to improve gene predictions, pinpoint the genomic location of entirely missed transcripts and optimize the accuracy of gene-finder software.
Abstract:
This work is a review of some Machine Translation systems that follow the Transfer strategy and use feature structures as a representation tool. The work is part of project MLAP-9315, a project that investigates the reuse of the linguistic specifications of the EUROTRA project for industrial standards.
Abstract:
We analyze a standard environment of adverse selection in credit markets. In our environment, entrepreneurs who are privately informed about the quality of their projects need to borrow from banks. As is generally the case in economies with adverse selection, the competitive equilibrium of our economy is shown to be inefficient. Under adverse selection, the choices made by one type of agents limit what can be offered to other types in an incentive-compatible manner. This gives rise to an externality, which cannot be internalized in a competitive equilibrium. We show that, in this type of environment, the inefficiency associated with adverse selection is the consequence of one implicit assumption: entrepreneurs can only borrow from banks. If an additional market is added (say, a "security market"), in which entrepreneurs can obtain funds beyond those offered by banks, we show that the efficient allocation is an equilibrium of the economy. In such an equilibrium, all entrepreneurs borrow at a pooling rate in the security market. When they apply for bank loans, though, only entrepreneurs with good projects pledge these additional funds as collateral. This equilibrium thus simultaneously entails cross-subsidization and separation between different types of entrepreneurs.
Abstract:
We analyze a standard environment of adverse selection in credit markets. In our environment, entrepreneurs who are privately informed about the quality of their projects need to borrow in order to invest. Conventional wisdom says that, in this class of economies, the competitive equilibrium is typically inefficient. We show that this conventional wisdom rests on one implicit assumption: entrepreneurs can only access monitored lending. If a new set of markets is added to provide entrepreneurs with additional funds, efficiency can be attained in equilibrium. An important characteristic of these additional markets is that lending in them must be unmonitored, in the sense that it does not condition total borrowing or investment by entrepreneurs. This makes it possible to attain efficiency by pooling all entrepreneurs in the new markets while separating them in the markets for monitored loans.
Abstract:
In this paper, we develop a general equilibrium model of crime and show that law enforcement has different roles depending on the equilibrium characterization and the value of social norms. When an economy has a unique stable equilibrium in which a fraction of the population is productive and the remainder engages in predation, the government can choose an optimal law enforcement policy to maximize a welfare function evaluated at the steady state. If the steady state is not unique, law enforcement is still relevant, but in a completely different way, because the steady state that prevails depends on the initial proportions of productive and predatory individuals in the economy. The relative importance of these proportions can be changed through law enforcement policy.
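The dependence of the long-run outcome on initial proportions described above can be illustrated with a stylized dynamic for the productive share of the population. The functional form, the threshold parameter and the way enforcement would enter are illustrative assumptions, not the paper's model:

```python
def simulate(p0, theta=0.5, alpha=0.1, steps=2000):
    """Stylized dynamic of the productive share p in a bistable economy.

    Being productive pays off relative to predation when the productive
    share exceeds a threshold theta; law enforcement can be thought of as
    lowering theta. With this form, p converges to 1 if p0 > theta and to
    0 if p0 < theta: two stable steady states, selected by the initial
    condition. Purely illustrative, not the paper's model.
    """
    p = p0
    for _ in range(steps):
        p += alpha * p * (1.0 - p) * (p - theta)
    return p

print(round(simulate(0.6), 3))   # above the threshold: converges near 1
print(round(simulate(0.4), 3))   # below the threshold: converges near 0
```

The same code makes the policy point of the abstract visible: lowering `theta` (more enforcement) enlarges the set of initial conditions from which the economy converges to the productive steady state.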
Abstract:
Using data for all the fixtures from the 1972-73 to 2002-03 seasons, we estimate a dynamic model of demand for football pools in Spain, paying attention to whether the main economic explanatory variable is the effective price of a ticket or the jackpot. Additionally, we evaluate the importance of the composition of the list of games in terms of whether First Division matches are included or not. Results show that the jackpot model is preferred to the effective-price model, which has important implications for how the structure of the game should be changed in order to increase demand.
Abstract:
Structural equation models are widely used in economic, social and behavioral studies to analyze linear interrelationships among variables, some of which may be unobservable or subject to measurement error. Alternative estimation methods that exploit different distributional assumptions are now available. The present paper deals with issues of asymptotic statistical inference, such as the evaluation of standard errors of estimates and chi-square goodness-of-fit statistics, in the general context of mean and covariance structures. The emphasis is on drawing correct statistical inferences regardless of the distribution of the data and the method of estimation employed. A (distribution-free) consistent estimate of $\Gamma$, the matrix of asymptotic variances of the vector of sample second-order moments, will be used to compute robust standard errors and a robust chi-square goodness-of-fit statistic. Simple modifications of the usual estimate of $\Gamma$ will also permit correct inferences in the case of multi-stage complex samples. We also discuss the conditions under which, regardless of the distribution of the data, one can rely on the usual (non-robust) inferential statistics. Finally, a multivariate regression model with errors-in-variables will be used to illustrate, by means of simulated data, various theoretical aspects of the paper.
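The contrast between model-based and distribution-free standard errors discussed above can be sketched in the simpler setting of ordinary least squares, where the "sandwich" estimator plays the role of the robust covariance built from an estimate of $\Gamma$. This is a generic illustration of the robust-inference idea, not the paper's mean-and-covariance-structure machinery:

```python
import numpy as np

def ols_with_robust_se(X, y):
    """OLS estimates with classical and sandwich (HC0) standard errors.

    Classical SEs assume homoskedastic errors; the sandwich estimator
    (bread @ meat @ bread) remains consistent under general error
    distributions, mirroring the 'correct inferences regardless of the
    distribution' theme. Generic illustration, not the paper's SEM setup.
    """
    n, k = X.shape
    bread = np.linalg.inv(X.T @ X)
    beta = bread @ X.T @ y
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)
    se_classical = np.sqrt(np.diag(sigma2 * bread))          # model-based
    meat = X.T @ np.diag(resid**2) @ X                       # empirical score variance
    se_robust = np.sqrt(np.diag(bread @ meat @ bread))       # distribution-free
    return beta, se_classical, se_robust

# Simulated data with heteroskedastic (non-normal-variance) noise:
rng = np.random.default_rng(0)
x = rng.normal(size=500)
X = np.column_stack([np.ones(500), x])
y = 1.0 + 2.0 * x + rng.normal(size=500) * (1 + np.abs(x))
beta, se_c, se_r = ols_with_robust_se(X, y)
```

Under heteroskedasticity the robust and classical standard errors diverge, which is exactly when relying on the usual (non-robust) statistics becomes misleading.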
Abstract:
In moment structure analysis with non-normal data, asymptotically valid inferences require the computation of a consistent (under general distributional assumptions) estimate of the matrix $\Gamma$ of asymptotic variances of sample second-order moments. Such a consistent estimate involves the fourth-order sample moments of the data. In practice, the use of fourth-order moments leads to computational burden and a lack of robustness in small samples. In this paper we show that, under certain assumptions, correct asymptotic inferences can be attained when $\Gamma$ is replaced by a matrix $\Omega$ that involves only the second-order moments of the data. The present paper extends to the context of multi-sample analysis of second-order moment structures results derived in the context of (single-sample) covariance structure analysis (Satorra and Bentler, 1990). The results apply to a variety of estimation methods and general types of statistics. An example involving a test of equality of means under covariance restrictions illustrates theoretical aspects of the paper.
Abstract:
We extend to score, Wald and difference test statistics the scaled and adjusted corrections to goodness-of-fit test statistics developed in Satorra and Bentler (1988a,b). The theory is framed in the general context of multi-sample analysis of moment structures, under general conditions on the distribution of the observable variables. Computational issues, as well as the relation of the scaled and corrected statistics to the asymptotically robust ones, are discussed. A Monte Carlo study illustrates the comparative performance in finite samples of corrected score test statistics.