997 results for Statistical structures
Abstract:
Computed tomography (CT) is an imaging technique in which interest has been growing steadily since it first came into use in the early 1970s. In the clinical environment, this imaging system has become a gold-standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure that the benefit-risk balance works in favor of the patient, it is important to balance image quality and dose in order to avoid unnecessary patient exposure.

If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases requiring several follow-up examinations over the patient's lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults. For this population, the risk of developing a radiation-induced cancer, whose latency period can exceed 20 years, is significantly higher than for adults. Assuming that each examination is justified, it then becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient.

CT technology has been advancing at a rapid pace. Since 2009, new iterative image reconstruction techniques, called statistical iterative reconstructions, have been introduced in order to decrease patient exposure and improve image quality. The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce dose as much as possible in examinations of children and young adults while maintaining diagnostic image quality, in order to propose optimized protocols.

The optimization step requires evaluating both the delivered dose and the image quality needed for diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first, the "physical approach", computes physical metrics (SD, MTF, NPS, etc.) measured on phantoms under well-defined conditions. Although this approach is limited because it does not take the radiologist's perception into account, it enables the physical characterization of image properties in a simple and timely way. The second, the "clinical approach", is based on the evaluation of anatomical structures (diagnostic criteria) present on patient images. Radiologists involved in the assessment step score the quality of these structures for diagnostic purposes using a simple rating scale. This approach is relatively complicated to implement and time-consuming; nevertheless, it has the advantage of being very close to the radiologist's practice and can be considered a reference method.

Among the main results of this work, the statistical iterative algorithms studied clinically (ASIR and Veo) were shown to have a strong potential to reduce CT dose (by up to 90%). However, by their very mechanism they modify the appearance of the image, producing a change in texture that may affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise power spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This work also demonstrates that integrating these new reconstruction techniques into clinical practice cannot be done simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the image quality tools developed can also guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
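The noise power spectrum (NPS) analysis mentioned above can be illustrated with a short sketch. This is not the author's pipeline, just a minimal example assuming square noise-only ROIs extracted from uniform phantom slices; the detrending and normalisation choices are assumptions.

```python
import numpy as np

def nps_2d(rois, pixel_size=1.0):
    """Ensemble estimate of the 2D noise power spectrum from noise-only ROIs.

    rois: array of shape (n_rois, N, N) taken from uniform phantom images.
    pixel_size: pixel spacing (e.g. in mm); returned NPS has units HU^2 * mm^2.
    """
    rois = np.asarray(rois, dtype=float)
    n_rois, N, _ = rois.shape
    # Remove the per-ROI mean (a low-order polynomial detrend is often added)
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
    # Average the 2D periodograms over all ROIs
    ft = np.fft.fftshift(np.fft.fft2(detrended), axes=(1, 2))
    return (np.abs(ft) ** 2).mean(axis=0) * pixel_size ** 2 / (N * N)
```

Integrating the NPS over frequency recovers the noise variance, which gives a quick sanity check; iterative reconstructions typically shift the NPS peak toward lower frequencies, which is the texture change discussed above.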
Abstract:
Structural equation models are widely used in economic, social and behavioral studies to analyze linear interrelationships among variables, some of which may be unobservable or subject to measurement error. Alternative estimation methods that exploit different distributional assumptions are now available. The present paper deals with issues of asymptotic statistical inference, such as the evaluation of standard errors of estimates and chi-square goodness-of-fit statistics, in the general context of mean and covariance structures. The emphasis is on drawing correct statistical inferences regardless of the distribution of the data and the method of estimation employed. A (distribution-free) consistent estimate of $\Gamma$, the matrix of asymptotic variances of the vector of sample second-order moments, will be used to compute robust standard errors and a robust chi-square goodness-of-fit statistic. Simple modifications of the usual estimate of $\Gamma$ will also permit correct inferences in the case of multi-stage complex samples. We also discuss the conditions under which, regardless of the distribution of the data, one can rely on the usual (non-robust) inferential statistics. Finally, a multivariate regression model with errors-in-variables will be used to illustrate, by means of simulated data, various theoretical aspects of the paper.
Abstract:
We present a study of the influence of atomic order on the relative stability of the bcc and the 18R martensitic structures in a Cu2.96Al0.92Be0.12 crystal. Calorimetric measurements have shown that disorder increases the stability of the 18R phase, contrary to what happens in Cu-Zn-Al alloys for which it is the bcc phase that is stabilized by disordering the system. This different behavior has been explained in terms of a model recently reported. We have also proved that the entropy change at the martensitic transition is independent of the state of atomic order of the crystal, as predicted theoretically. Our results suggest that differences in the vibrational spectrum of the crystal due to different states of atomic order must be equal in the bcc and in the close-packed phases.
Abstract:
We discuss the dynamics of the transient pattern formation process corresponding to the splay Fréedericksz transition. The emergence and subsequent evolution of the spatial periodicity is here described in terms of the temporal dependence of the wave numbers corresponding to the maxima of the structure factor. Situations of perpendicular as well as oblique field-induced stripes relative to the initial orientation of the director are both examined with explicit indications of the time scales needed for their appearance and posterior development.
Abstract:
OBJECTIVE: The purpose of this article is to assess the effect of the adaptive statistical iterative reconstruction (ASIR) technique on image quality in hip MDCT arthrography and to evaluate its potential for reducing radiation dose. SUBJECTS AND METHODS: Thirty-seven patients examined with hip MDCT arthrography were prospectively randomized into three different protocols: one with a regular dose (volume CT dose index [CTDIvol], 38.4 mGy) and two with a reduced dose (CTDIvol, 24.6 or 15.4 mGy). Images were reconstructed using filtered back projection (FBP) and four increasing percentages of ASIR (30%, 50%, 70%, and 90%). Image noise and contrast-to-noise ratio (CNR) were measured. Two musculoskeletal radiologists independently evaluated several anatomic structures and image quality parameters using a 4-point scale. They also jointly assessed acetabular labrum tears and articular cartilage lesions. RESULTS: With decreasing radiation dose level, image noise statistically significantly increased (p=0.0009) and CNR statistically significantly decreased (p=0.001). We also found a statistically significant reduction in noise (p=0.0001) and increase in CNR (p≤0.003) with increasing percentage of ASIR; in addition, we noted statistically significant increases in image quality scores for the labrum and cartilage, subchondral bone, overall diagnostic quality (up to 50% ASIR), and subjective noise (p≤0.04), and statistically significant reductions for the trabecular bone and muscles (p≤0.03). Regardless of the radiation dose level, there were no statistically significant differences in the detection and characterization of labral tears (n=24; p=1) and cartilage lesions (n=40; p≥0.89) depending on the ASIR percentage. CONCLUSION: The use of up to 50% ASIR in hip MDCT arthrography helps to reduce radiation dose by approximately 35-60%, while maintaining diagnostic image quality comparable to that of a regular-dose protocol using FBP.
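As a rough illustration of the noise and CNR measurements reported above, here is a minimal sketch. The exact ROI placement and CNR definition used in the study are not specified here, so the formula (mean difference over background standard deviation) is an assumption.

```python
import numpy as np

def contrast_to_noise(roi_signal, roi_background):
    """Contrast-to-noise ratio between a signal ROI and a background ROI.

    Assumed definition (one of several in use): absolute difference of the
    ROI means divided by the standard deviation of the background ROI.
    """
    mu_s = np.mean(roi_signal)
    mu_b = np.mean(roi_background)
    sigma_b = np.std(roi_background, ddof=1)  # image noise estimate
    return abs(mu_s - mu_b) / sigma_b
```

With contrast medium in the joint space as the signal and muscle as the background, a falling CNR at reduced dose directly reflects the rising image noise reported in the Results.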
Abstract:
Purpose: Although several approaches have already been used to reduce radiation dose, CT doses remain among the highest in diagnostic radiology. Recently, General Electric introduced a new image reconstruction technique, adaptive statistical iterative reconstruction (ASIR), which takes the statistical fluctuation of noise into account. The benefits of the ASIR method were assessed through classic metrics and through the evaluation of cardiac structures by radiologists. Methods and materials: A 64-row CT (MDCT) was employed. Catphan600 phantom acquisitions and 10 routine-dose CT examinations performed at 80 kVp were reconstructed with FBP and with 50% ASIR. Six radiologists then assessed the visibility of the main cardiac structures using the visual grading analysis (VGA) method. Results: On phantoms, for a constant value of SD (25 HU), the CTDIvol is halved (8 mGy to 4 mGy) when 50% ASIR is used. At constant CTDIvol, the MTF at medium frequencies was also significantly improved. First results indicated that clinical images reconstructed with ASIR had better overall image quality than conventional reconstructions, meaning that at constant image quality the radiation dose can be strongly reduced. Conclusion: The first results of this study show that the ASIR method improves image quality on phantoms with respect to classical reconstruction, decreasing noise and improving resolution; moreover, the benefit is greater at lower doses. In the clinical environment, a dose reduction can still be expected for 80 kVp low-dose pediatric protocols using 50% iterative reconstruction. The best ASIR percentage as a function of cardiac structure, and detailed protocols, will be presented for cardiac examinations.
Abstract:
To permit the tracking of turbulent flow structures in an Eulerian frame from single-point measurements, we make use of a generalization of conventional two-dimensional quadrant analysis to three-dimensional octants. We characterize flow structures using the sequences of these octants and show how significance may be attached to particular sequences using statistical null models. We analyze an example experiment and show how a particular dominant flow structure can be identified from the conditional probability of octant sequences. The frequency of this structure corresponds to the dominant peak in the velocity spectra, and the structure contributes a high proportion of the total shear stress. We link this structure explicitly to the propensity for sediment entrainment and show that greater insight into sediment entrainment can be obtained by disaggregating those octants that occur within the identified macroturbulence structure from those that do not. Hence, this work goes beyond critiques of Reynolds stress approaches to bed-load entrainment that highlight the importance of outward interactions, to identifying and prioritizing the quadrants/octants that define particular flow structures. Key Points:
- A new method for analysing single-point velocity data is presented
- Flow structures are identified by a sequence of flow states (termed octants)
- The identified structure exerts high stresses and causes bed-load entrainment
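The generalization from quadrants to octants can be sketched as follows; the 0-7 labelling of octants and the use of simple mean removal to obtain the fluctuations are assumptions for illustration, not the authors' convention.

```python
import numpy as np

def octant_sequence(u, v, w):
    """Map a single-point velocity record to a sequence of octant labels.

    u, v, w: 1-D arrays of streamwise, lateral and vertical velocity.
    Fluctuations are taken about the record mean; octant k in 0..7 encodes
    the signs (bit 0: u' > 0, bit 1: v' > 0, bit 2: w' > 0).
    """
    u, v, w = (np.asarray(a, dtype=float) for a in (u, v, w))
    up, vp, wp = u - u.mean(), v - v.mean(), w - w.mean()
    return ((up > 0).astype(int)
            | ((vp > 0).astype(int) << 1)
            | ((wp > 0).astype(int) << 2))
```

Runs of identical labels can then be collapsed into octant sequences, whose frequencies are compared against a null model (e.g. phase-randomized surrogate records) to decide which sequences mark coherent structures.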
Abstract:
Many social phenomena involve a set of dyadic relations among agents whose actions may be dependent. Although individualistic approaches have frequently been applied to analyze social processes, these are not generally concerned with dyadic relations nor do they deal with dependency. This paper describes a mathematical procedure for analyzing dyadic interactions in a social system. The proposed method mainly consists of decomposing asymmetric data into their symmetrical and skew-symmetrical parts. A quantification of skew-symmetry for a social system can be obtained by dividing the norm of the skew-symmetrical matrix by the norm of the asymmetric matrix. This calculation makes available to researchers a quantity related to the amount of dyadic reciprocity. Regarding agents, the procedure enables researchers to identify those whose behavior is asymmetric with respect to all agents. It is also possible to derive symmetric measurements among agents and to use multivariate statistical techniques.
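The decomposition described above can be sketched in a few lines; following the abstract, the skew-symmetry quantity is taken as the norm of the skew-symmetric part divided by the norm of the original asymmetric matrix.

```python
import numpy as np

def skew_symmetry_index(X):
    """Proportion of skew-symmetry of a square asymmetric interaction matrix X.

    X is split into a symmetric part S and a skew-symmetric part K with
    X = S + K; the index is ||K|| / ||X|| (Frobenius norms). Values near 0
    indicate reciprocal dyads, values near 1 highly asymmetric ones.
    """
    X = np.asarray(X, dtype=float)
    S = (X + X.T) / 2.0   # symmetric part (reciprocated interaction)
    K = (X - X.T) / 2.0   # skew-symmetric part (directional imbalance)
    return np.linalg.norm(K) / np.linalg.norm(X)
```

The row profiles of K can then be inspected to identify the agents whose behavior is asymmetric with respect to all partners, and S provides the symmetric measurements usable with multivariate techniques.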
Abstract:
Flood simulation studies use spatial-temporal rainfall data as input to distributed hydrological models. A correct description of rainfall in space and time contributes to improvements in hydrological modelling and design. This work focuses on the analysis of 2-D convective structures (rain cells), whose contribution is especially significant in most flood events. The objective of this paper is to provide statistical descriptors and distribution functions for the characteristics of convective structures of precipitation systems producing floods in Catalonia (NE Spain). To this end, heavy rainfall events recorded between 1996 and 2000 have been analysed. By means of weather radar, and applying 2-D radar algorithms, a distinction between convective and stratiform precipitation is made. These data are introduced into and analyzed with a GIS. In a first step, groups of connected pixels with convective precipitation are identified. Only convective structures with an area greater than 32 km² are selected. Then, geometric characteristics (area, perimeter, orientation and dimensions of the fitted ellipse) and rainfall statistics (maximum, mean, minimum, range, standard deviation, and sum) of these structures are obtained and stored in a database. Finally, descriptive statistics for selected characteristics are calculated and statistical distributions are fitted to the observed frequency distributions. The statistical analyses reveal that the Generalized Pareto distribution for the area, and the Generalized Extreme Value distribution for the perimeter, dimensions, orientation and mean areal precipitation, best fit the observed frequency distributions of these parameters. The statistical descriptors and probability distribution functions obtained are of direct use as input to spatial rainfall generators.
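As an illustration of the distribution fitting described above, the following sketch fits a Generalized Pareto distribution to cell-area exceedances over the 32 km² threshold and a GEV distribution to perimeters, using `scipy.stats`. The data are synthetic stand-ins, not the Catalan radar data, and the parameter values are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-ins for the cell characteristics (NOT the radar data):
# areas (km^2) of cells above the 32 km^2 threshold, and perimeters (km).
areas = 32.0 + stats.genpareto.rvs(c=0.2, scale=40.0, size=500, random_state=rng)
perims = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=10.0, size=500, random_state=rng)

# Generalized Pareto fitted to the area exceedances over the threshold
c_a, _, scale_a = stats.genpareto.fit(areas - 32.0, floc=0.0)
# Generalized Extreme Value fitted to the perimeters
c_p, loc_p, scale_p = stats.genextreme.fit(perims)
```

Goodness of fit can then be checked, e.g. with a Kolmogorov-Smirnov test, before the fitted parameters feed a spatial rainfall generator.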
Redox dysregulation in schizophrenia: effect on myelination of cortical structures and connectivity
Abstract:
This doctoral thesis is concerned with the role that a genetic risk factor for the development of schizophrenia, namely a deficit in glutathione synthesis, may play in the anomalies of brain connectivity found in patients. Most of the effort was devoted to a whole-brain assessment of white matter structure in the glutamate-cysteine ligase modulatory subunit knockout mouse model (Gclm KO) using magnetic resonance imaging (MRI) techniques similar to those used in state-of-the-art clinical research. This reverse translational approach, taking brain imaging from the bedside to the bench, aimed to investigate the role that deficient redox defenses may play in the development of brain connections while excluding all influencing factors besides the genotype. After establishing the protocol, the influence of an additional environmental manipulation was also studied.

Analysis of the MRI images acquired in vivo was one of the main challenges of the project. Our strategy consisted in creating an atlas of the mouse brain to use as a segmentation guide and then analyzing the data from each region of interest separately. The quality of this method was assessed in a simulation experiment by calculating the statistical power achievable in each brain region at different sample sizes. These analysis tools enabled us to assess white matter integrity in the mouse brain along development in a longitudinal experiment using diffusion tensor imaging (DTI). We discovered anomalies in the tensor-derived parameters (diffusivity and anisotropy) in the Anterior Commissure and Fimbria/Fornix of Gclm KO mice compared to wild-type animals, suggesting that the structure of these tracts is compromised in the KO mice. In an electrophysiological experiment, Pascal Steullet provided evidence that these anomalies have functional consequences in the form of reduced conduction velocity in the concerned tracts, thus supporting the DTI findings. The mechanism by which redox dysregulation affects white matter structure remains unknown, as an immunohistochemical analysis of myelin constituent proteins in the concerned tracts produced inconclusive results.

Our experiments also detected an enlargement of the lateral ventricles in young but not adult Gclm KO mice, and confirmed neurochemical anomalies already known to affect these animals (Duarte et al. 2011), namely a reduction in glutathione and increases in the Glutamine/Glutamate ratio, N-acetylaspartate and Alanine. Using the same methods, we tested the effect of an additional environmental stress on the observed phenotype: rearing in social isolation had no effect on white matter structure as assessed by DTI, but it reduced the concentration of myo-Inositol and increased the Glutamine/Glutamate ratio in the frontal cortex. We also replicated, in this separate group of animals, the effects of genotype on the frontal neurochemical profile, on ventricular size and on the diffusivity parameters in the Fimbria/Fornix, but not in the Anterior Commissure.

Our data show that a redox dysregulation of genetic origin may disrupt white matter structure and function in specific tracts and cause ventricular enlargement, phenotypes that resemble some neuroanatomical features of schizophrenia; the mechanisms responsible remain unknown. Social isolation did not affect white matter structure as assessed by DTI, even though it is known to affect oligodendrocyte maturation. Cortical neurochemistry, and specifically the Glutamine/Glutamate balance, was affected both by redox dysregulation and by social isolation, and is thus a promising target for further research on the interaction of redox imbalance and environmental stress in schizophrenia.
Abstract:
Phase-encoded nanostructures such as Quick Response (QR) codes made of metallic nanoparticles have been suggested for use in security and authentication applications. We present a polarimetric optical method able to authenticate random phase-encoded QR codes. The system is illuminated with polarized light and the QR code is encoded using a phase-only random mask. Using classification algorithms, it is possible to validate the QR code from the examination of the polarimetric signature of the speckle pattern. We used the Kolmogorov-Smirnov statistical test and Support Vector Machine algorithms to authenticate the phase-encoded QR codes from their polarimetric signatures.
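A minimal sketch of the two classification ingredients named above (two-sample Kolmogorov-Smirnov test, Support Vector Machine) applied to stand-in speckle statistics. The exponential intensity model and the summary features are assumptions for illustration, not the authors' polarimetric signatures.

```python
import numpy as np
from scipy.stats import ks_2samp
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Stand-in speckle statistics: fully developed speckle has exponentially
# distributed intensity; a counterfeit mask is assumed to change the mean level.
genuine = [rng.exponential(1.0, 2000) for _ in range(40)]
fake = [rng.exponential(1.6, 2000) for _ in range(40)]

# Two-sample Kolmogorov-Smirnov test against a genuine reference pattern
ref = genuine[0]
p_genuine = ks_2samp(ref, genuine[1]).pvalue  # same distribution: large p
p_fake = ks_2samp(ref, fake[0]).pvalue        # different distribution: tiny p

# SVM trained on simple summary features of each speckle pattern
X = np.array([[s.mean(), s.std()] for s in genuine + fake])
y = np.array([1] * 40 + [0] * 40)
clf = SVC(kernel="rbf").fit(X, y)
```

The KS test gives a per-pattern accept/reject decision against a reference signature, while the SVM learns a decision boundary from labelled genuine and counterfeit examples.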
Abstract:
The interatomic potential of the ion-atom scattering system I^N+-I at small and intermediate internuclear distances is calculated for different charge states N from atomic Dirac-Fock-Slater (DFS) electron densities within a statistical model. The behaviour of the potential structures, due to ionized electronic shells, is studied by calculations of classical elastic differential scattering cross-sections.
Abstract:
This thesis presents a statistical framework for object recognition. The framework is motivated by the pictorial structure models introduced by Fischler and Elschlager nearly 30 years ago. The basic idea is to model an object by a collection of parts arranged in a deformable configuration. The appearance of each part is modeled separately, and the deformable configuration is represented by spring-like connections between pairs of parts. These models allow for qualitative descriptions of visual appearance, and are suitable for generic recognition problems. The problem of detecting an object in an image and the problem of learning an object model using training examples are naturally formulated under a statistical approach. We present efficient algorithms to solve these problems in our framework. We demonstrate our techniques by training models to represent faces and human bodies. The models are then used to locate the corresponding objects in novel images.
Abstract:
The preceding two editions of CoDaWork included talks on the possible consideration of densities as infinite compositions: Egozcue and Díaz-Barrero (2003) extended the Euclidean structure of the simplex to a Hilbert space structure on the set of densities within a bounded interval, and van den Boogaart (2005) generalized this to the set of densities bounded by an arbitrary reference density. From the many variations of the Hilbert structures available, we work with three cases. For bounded variables, a basis derived from Legendre polynomials is used. For variables with a lower bound, we standardize them with respect to an exponential distribution and express their densities as coordinates in a basis derived from Laguerre polynomials. Finally, for unbounded variables, a normal distribution is used as reference, and coordinates are obtained with respect to a basis derived from Hermite polynomials. To get the coordinates, several approaches can be considered. A numerical accuracy problem occurs if one estimates the coordinates directly by using discretized scalar products. We therefore propose a weighted linear regression approach, in which all polynomials up to order k are used as predictor variables and the weights are proportional to the reference density. Finally, for the case of 2nd-order Hermite polynomials (normal reference) and 1st-order Laguerre polynomials (exponential reference), one can also derive the coordinates from their relationships to the classical mean and variance. Apart from these theoretical issues, this contribution focuses on the application of this theory to two main problems in sedimentary geology: the comparison of several grain size distributions, and the comparison, among different rocks, of the empirical distribution of a property measured on a batch of individual grains from the same rock or sediment, such as their composition.
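The weighted-regression computation of coordinates described above can be sketched for the unbounded (normal-reference) case; the grid, basis normalisation and example density are illustrative choices. For a N(0.5, 1) density relative to a N(0, 1) reference, the log-ratio is exactly 0.5x − 0.125, so the first-order Hermite coordinate recovers the mean shift, consistent with the mean/variance relationship mentioned in the abstract.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

x = np.linspace(-5.0, 5.0, 2001)
ref = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)        # N(0, 1) reference
f = np.exp(-(x - 0.5) ** 2 / 2) / np.sqrt(2 * np.pi)  # density to encode

logratio = np.log(f / ref)   # log-ratio of the density to the reference
B = hermevander(x, deg=2)    # He_0, He_1, He_2 (probabilists' Hermite basis)
w = ref                      # weights proportional to the reference density

# Weighted least squares for the coordinates in the Hermite basis
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(sw[:, None] * B, sw * logratio, rcond=None)
```

Here `coef` comes out approximately as (-0.125, 0.5, 0): the first-order coordinate carries the mean shift, while a nonzero second-order coordinate would signal a variance change relative to the reference.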
Abstract:
Background: MHC Class I molecules present antigenic peptides to cytotoxic T cells, which forms an integral part of the adaptive immune response. Peptides are bound within a groove formed by the MHC heavy chain. Previous approaches to MHC Class I-peptide binding prediction have largely concentrated on the peptide anchor residues located at the P2 and C-terminus positions. Results: A large dataset comprising MHC-peptide structural complexes was created by remodelling pre-determined x-ray crystallographic structures. Static energetic analysis, following energy minimisation, was performed on the dataset in order to characterise interactions between bound peptides and the MHC Class I molecule, partitioning the interactions within the groove into van der Waals, electrostatic and total non-bonded energy contributions. Conclusion: The QSAR techniques of Genetic Function Approximation (GFA) and Genetic Partial Least Squares (G/PLS) algorithms were used to identify key interactions between the two molecules by comparing the calculated energy values with experimentally-determined BL50 data. Although the peptide termini binding interactions help ensure the stability of the MHC Class I-peptide complex, the central region of the peptide is also important in defining the specificity of the interaction. As thermodynamic studies indicate that peptide association and dissociation may be driven entropically, it may be necessary to incorporate entropic contributions into future calculations.