951 results for Energetic Heterogeneity
Abstract:
We focus on the comparison of three statistical models used to estimate the treatment effect in meta-analysis when individually pooled data are available. The models are two conventional ones, namely a multi-level model and a model based on an approximate likelihood, and a newly developed profile likelihood model, which might be viewed as an extension of the Mantel-Haenszel approach. To exemplify these methods, we use results from a meta-analysis of 22 trials to prevent respiratory tract infections. We show that, when using the multi-level approach in the case of baseline heterogeneity, the number of clusters or components is considerably over-estimated. The approximate and profile likelihood methods showed nearly the same pattern for the treatment effect distribution. To provide more evidence, two simulation studies were conducted. The profile likelihood can be considered a clear alternative to the approximate likelihood model. In the case of strong baseline heterogeneity, the profile likelihood method shows superior behaviour compared with the multi-level model. Copyright (C) 2006 John Wiley & Sons, Ltd.
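For context, the "approximate likelihood" formulation referred to above is commonly set up by treating each trial's estimated effect (e.g., a log odds ratio) as normally distributed around the pooled effect with a known within-trial variance plus a shared heterogeneity variance. The sketch below illustrates that generic formulation with invented numbers; it is not the authors' model or code.

```python
# Minimal sketch of an approximate-likelihood random-effects meta-analysis:
# each trial i contributes an estimate y_i with known within-trial variance s2_i,
# and marginally y_i ~ N(mu, s2_i + tau2). All numbers below are hypothetical.
import numpy as np
from scipy.optimize import minimize

y  = np.array([-0.45, -0.10, -0.80, 0.05, -0.30])   # hypothetical log odds ratios, one per trial
s2 = np.array([0.10, 0.05, 0.20, 0.08, 0.12])       # hypothetical within-trial variances

def neg_loglik(params):
    mu, log_tau2 = params                            # tau2 parameterised on the log scale
    v = s2 + np.exp(log_tau2)                        # marginal variance of each trial estimate
    return 0.5 * np.sum(np.log(2 * np.pi * v) + (y - mu) ** 2 / v)

fit = minimize(neg_loglik, x0=[0.0, np.log(0.01)])
mu_hat, tau2_hat = fit.x[0], np.exp(fit.x[1])
print(f"pooled treatment effect: {mu_hat:.3f}, heterogeneity tau^2: {tau2_hat:.3f}")
```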
Abstract:
We describe a general likelihood-based 'mixture model' for inferring phylogenetic trees from gene-sequence or other character-state data. The model accommodates cases in which different sites in the alignment evolve in qualitatively distinct ways, but does not require prior knowledge of these patterns or partitioning of the data. We call this qualitative variability in the pattern of evolution across sites "pattern-heterogeneity" to distinguish it both from a homogeneous process of evolution and from one characterized principally by differences in rates of evolution. We present studies to show that the model correctly retrieves the signals of pattern-heterogeneity from simulated gene-sequence data, and we apply the method to protein-coding genes and to a ribosomal 12S data set. The mixture model outperforms conventional partitioning in both these data sets. We implement the mixture model such that it can simultaneously detect rate- and pattern-heterogeneity. The model simplifies to a homogeneous model or a rate-variability model as special cases, and therefore always performs at least as well as these two approaches, and often considerably improves upon them. We make the model available within a Bayesian Markov-chain Monte Carlo framework for phylogenetic inference, as an easy-to-use computer program.
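As a point of reference, the site likelihood in a mixture model of this kind is typically a weighted sum over the component substitution processes; in generic notation (assumed here, not taken from the paper):

```latex
L(D \mid T) \;=\; \prod_{i=1}^{N} \sum_{k=1}^{K} w_k \, \Pr\!\left(D_i \mid Q_k, T\right),
\qquad \sum_{k=1}^{K} w_k = 1
```

where D_i is the character data at site i, Q_k is the k-th substitution-rate matrix with weight w_k, and T is the tree with its branch lengths; setting K = 1 recovers the homogeneous model, which is why the mixture can only match or improve on it.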
Abstract:
Background: MHC Class I molecules present antigenic peptides to cytotoxic T cells, a process that forms an integral part of the adaptive immune response. Peptides are bound within a groove formed by the MHC heavy chain. Previous approaches to MHC Class I-peptide binding prediction have largely concentrated on the peptide anchor residues located at the P2 and C-terminus positions. Results: A large dataset comprising MHC-peptide structural complexes was created by remodelling pre-determined X-ray crystallographic structures. Static energetic analysis, following energy minimisation, was performed on the dataset in order to characterise interactions between bound peptides and the MHC Class I molecule, partitioning the interactions within the groove into van der Waals, electrostatic and total non-bonded energy contributions. Conclusion: The QSAR techniques of Genetic Function Approximation (GFA) and Genetic Partial Least Squares (G/PLS) were used to identify key interactions between the two molecules by comparing the calculated energy values with experimentally determined BL50 data. Although the peptide termini binding interactions help ensure the stability of the MHC Class I-peptide complex, the central region of the peptide is also important in defining the specificity of the interaction. As thermodynamic studies indicate that peptide association and dissociation may be driven entropically, it may be necessary to incorporate entropic contributions into future calculations.
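As an illustration of the regression step, the sketch below uses an ordinary partial least squares fit (standing in for the GFA and G/PLS algorithms named above) to relate per-position interaction-energy descriptors to binding data; all names, shapes, and values are invented.

```python
# Hedged sketch of a QSAR-style regression: per-position van der Waals and
# electrostatic energies as descriptors, a binding measure (e.g. log BL50) as
# the response. The data are random placeholders, not the study's dataset.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_peptides, n_positions = 50, 9
X = rng.normal(size=(n_peptides, 2 * n_positions))   # hypothetical energy descriptors
y = rng.normal(size=n_peptides)                      # hypothetical log(BL50) values

pls = PLSRegression(n_components=3).fit(X, y)
print("training R^2:", round(pls.score(X, y), 3))
# Large-magnitude coefficients would point to peptide positions whose
# interaction energies track binding most strongly (anchors vs. central residues).
```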
Abstract:
A critical analysis of single-crystal X-ray diffraction studies on a series of terminally protected tripeptides containing a centrally positioned Aib (alpha-aminoisobutyric acid) residue is reported. For the tripeptide series containing Boc-Ala-Aib as corner residues, all the reported peptides formed distorted type II beta-turn structures. Moreover, a series of Phe-substituted analogues (tripeptides with Boc-Phe-Aib) have also shown different beta-turn conformations. However, the Leu-modified analogues (tripeptides with Boc-Leu-Aib) do not conform to beta-turn formation and instead adopt various conformations in the solid state. X-ray crystallography sheds some light on this conformational heterogeneity at atomic resolution. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Williams syndrome (WS) is a rare genetic disorder resulting from a deletion on chromosome 7. A number of studies have shown that individuals with WS have a superior linguistic profile compared to their non-verbal abilities; however, the evidence has been inconclusive, as many studies have disputed such a profile. The vast majority of studies on WS have assumed a single, homogeneous WS linguistic profile in order to support various theoretical viewpoints. The present study investigated the linguistic profiles of 5 individuals with WS on a number of standardized verbal measures and in conversational settings. The results indicated substantially variable performance in all aspects of the verbal domain. This supports the view that WS is, linguistically, a rather heterogeneous condition, which should be taken into consideration when referring to it in theoretical accounts of language acquisition and debates on modularity.
Abstract:
The performance benefit when using Grid systems comes from different strategies, among which partitioning the applications into parallel tasks is the most important. However, in most cases the enhancement coming from partitioning is offset by synchronization overhead, mainly due to the high variability of the completion times of the different tasks, which, in turn, is due to the large heterogeneity of Grid nodes. For this reason, it is important to have models which capture the performance of such systems. In this paper we describe a queueing-network-based performance model able to accurately analyze Grid architectures, and we use the model to study a real parallel application executed in a Grid. The proposed model improves on classical modelling techniques and highlights the impact of resource heterogeneity and network latency on application performance.
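The effect described above can be illustrated with a small Monte Carlo sketch (not the paper's queueing-network model): a partitioned job finishes only when its slowest task does, so greater heterogeneity in node speeds inflates the expected completion time. The node-speed distribution and workload figures are assumptions.

```python
# Toy model: a job of fixed total work is split evenly over n_tasks nodes whose
# speeds vary; completion time is the maximum task time (synchronization barrier).
import numpy as np

rng = np.random.default_rng(1)
total_work, n_tasks, n_runs = 100.0, 16, 10_000

def mean_completion(sigma):
    speeds = rng.lognormal(mean=0.0, sigma=sigma, size=(n_runs, n_tasks))  # hypothetical node speeds
    task_times = (total_work / n_tasks) / speeds
    return task_times.max(axis=1).mean()     # the job waits for the slowest task

for sigma in (0.0, 0.3, 0.8):
    print(f"node-speed dispersion {sigma:.1f}: mean completion time {mean_completion(sigma):.1f}")
```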
Abstract:
The performance benefit when using grid systems comes from different strategies, among which partitioning the applications into parallel tasks is the most important. However, in most cases the enhancement coming from partitioning is offset by synchronization overheads, mainly due to the high variability in the execution times of the different tasks, which, in turn, is accentuated by the large heterogeneity of grid nodes. In this paper we design hierarchical queueing-network performance models able to accurately analyze grid architectures and applications. Based on the model results, we introduce a new allocation policy that combines task partitioning with task replication. The models are used to study two real applications and to evaluate the performance benefits obtained with allocation policies based on task replication.
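A companion sketch to the allocation idea above (again an illustration, not the hierarchical queueing model itself): each task is replicated on r nodes and the first copy to finish is kept, which shortens the wait for the slowest task at the cost of extra resource usage. The node-speed distribution and task sizes are assumptions.

```python
# Toy model of partitioning plus replication: per task, keep the fastest of r
# replicas; the job still synchronizes on its slowest task.
import numpy as np

rng = np.random.default_rng(2)
n_tasks, n_runs, work_per_task = 16, 10_000, 6.25

def mean_completion(replicas, sigma=0.8):
    speeds = rng.lognormal(mean=0.0, sigma=sigma, size=(n_runs, n_tasks, replicas))  # hypothetical
    task_times = (work_per_task / speeds).min(axis=2)   # fastest replica of each task wins
    return task_times.max(axis=1).mean()                # wait for the slowest task

for r in (1, 2, 3):
    print(f"{r} replica(s) per task: mean completion time {mean_completion(r):.2f}")
```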
Abstract:
This study presents a systematic and quantitative analysis of the effect of inhomogeneous surface albedo on shortwave cloud absorption estimates. We used 3D radiative transfer modeling over a checkerboard surface albedo to calculate cloud absorption. We have found that accounting for surface heterogeneity enhances cloud absorption. However, the enhancement is not sufficient to explain the reported difference between measured and modeled cloud absorption.
Abstract:
We develop a database of 110 gradual solar energetic particle (SEP) events, over the period 1967–2006, providing estimates of event onset, duration, fluence, and peak flux for protons of energy E > 60 MeV. The database is established mainly from the energetic proton flux data distributed in the OMNI 2 data set; however, we also utilize the McMurdo neutron monitor and the energetic proton flux from GOES missions. To aid the development of the gradual SEP database, we establish a method with which the homogeneity of the energetic proton flux record is improved. A comparison between other SEP databases and the database developed here is presented, discussing the different algorithms used to define an event. Furthermore, we investigate the variation of gradual SEP occurrence and fluence with solar cycle phase, sunspot number (SSN), and interplanetary magnetic field intensity (Bmag) over solar cycles 20–23. We find that the occurrence and fluence of SEP events vary with the solar cycle phase. Correspondingly, we find a positive correlation between SEP occurrence and solar activity as determined by SSN and Bmag, while the mean fluence in individual events decreases with the same measures of solar activity. Therefore, although the number of events decreases when solar activity is low, the events that do occur at such times have higher fluence. Thus, large events such as the “Carrington flare” may be more likely at lower levels of solar activity. These results are discussed in the context of other similar investigations.
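The per-event bookkeeping described above (onset, duration, peak flux, and fluence) can be sketched as a simple threshold scan over a flux time series; the threshold, cadence, and data below are invented, and this is not the authors' event-definition algorithm.

```python
# Toy event extraction: flag contiguous intervals where the proton flux exceeds
# a threshold, then record onset, duration, peak flux, and a crude fluence.
import numpy as np

dt_hours = 1.0
rng = np.random.default_rng(3)
flux = np.abs(rng.normal(0.05, 0.02, size=500))   # hypothetical >60 MeV proton flux
flux[100:130] += 2.0                              # inject one hypothetical SEP event
threshold = 0.5                                   # assumed event-definition threshold

above = flux > threshold                          # series starts and ends below threshold here
edges = np.flatnonzero(np.diff(above.astype(int)))
starts, ends = edges[::2] + 1, edges[1::2] + 1
for s, e in zip(starts, ends):
    fluence = flux[s:e].sum() * dt_hours * 3600.0           # crude time integral of flux
    print(f"onset index {s}, duration {(e - s) * dt_hours:.0f} h, "
          f"peak flux {flux[s:e].max():.2f}, fluence {fluence:.0f}")
```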
Abstract:
The recent solar minimum was the longest and deepest of the space age, with the lowest average sunspot numbers for nearly a century. The Sun appears to be exiting a grand solar maximum (GSM) of activity which has persisted throughout the space age, and is headed into a significantly quieter period. Indeed, initial observations of solar cycle 24 (SC24) continue to show a relatively low heliospheric magnetic field strength and sunspot number (R), despite the average latitude of sunspots and the inclination of the heliospheric current sheet showing that the rise to solar maximum is well underway. We extrapolate the available SC24 observations forward in time by assuming R will continue to follow a similar form to previous cycles, despite the end of the GSM, and predict a very weak cycle 24, with R peaking at ∼65–75 around the middle/end of 2012. Similarly, we estimate the heliospheric magnetic field strength will peak around 6 nT. We estimate that average galactic cosmic ray fluxes above 1 GV rigidity will be ∼10% higher in SC24 than in SC23 and that the probability of a large SEP event during this cycle is 0.8, compared to 0.5 for SC23. Comparison of the SC24 R estimates with previous ends of GSMs inferred from 9300 years of cosmogenic isotope data places the current evolution of the Sun and heliosphere in the lowest 5% of cases, suggesting Maunder Minimum conditions are likely within the next 40 years.
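The extrapolation step ("assuming R will continue to follow a similar form to previous cycles") can be illustrated by scaling a template cycle shape to the observations available early in a cycle; the toy template and numbers below are invented and do not reproduce the authors' method or prediction.

```python
# Toy extrapolation: fit the amplitude of a normalized cycle-shape template to
# early-cycle sunspot numbers by least squares, then read off the implied peak.
import numpy as np

t = np.arange(0, 132)                                 # months since cycle start
template = (t / 50.0) ** 2 * np.exp(-t / 30.0)        # toy rise-and-decay shape
template /= template.max()                            # normalized so its peak is 1

rng = np.random.default_rng(4)
n_obs = 30                                            # months of data assumed available
R_obs = 70 * template[:n_obs] + rng.normal(0, 3, n_obs)   # hypothetical early-cycle R

amplitude = (R_obs @ template[:n_obs]) / (template[:n_obs] @ template[:n_obs])
print(f"implied peak R: {amplitude:.0f} at month {int(template.argmax())} of the cycle")
```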
Abstract:
Energetic constraints on precipitation are useful for understanding the response of the hydrological cycle to ongoing climate change, its response to possible geoengineering schemes, and the limits on precipitation in very warm climates of the past. Much recent progress has been made in quantifying the different forcings and feedbacks on precipitation and in understanding how the transient responses of precipitation and temperature might differ qualitatively. Here, we introduce the basic ideas and review recent progress. We also examine the extent to which energetic constraints on precipitation may be viewed as radiative constraints and the extent to which they are confirmed by available observations. Challenges remain, including the need to better demonstrate the link between energetics and precipitation in observations and to better understand energetic constraints on precipitation at sub-global length scales.
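The energetic constraint at the heart of this literature is usually stated as an atmospheric energy budget in which latent heating from precipitation balances the net energy sink of the atmospheric column; in generic notation (assumed here):

```latex
L_v \, \bar{P} \;\approx\; Q_{\mathrm{cool}} - \mathrm{SH}
```

where \bar{P} is the global-mean precipitation rate, L_v the latent heat of vaporization, Q_{\mathrm{cool}} the net radiative cooling of the atmospheric column, and SH the surface sensible heat flux into the atmosphere; changes in precipitation are then tied to changes in the terms on the right-hand side, which is the sense in which the constraint can be viewed as radiative.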