123 results for vector quantization based Gaussian modeling


Relevance: 30.00%

Abstract:

The availability of high resolution Digital Elevation Models (DEM) at a regional scale enables the analysis of topography with a high level of detail, making a DEM-based geomorphometric approach more accurate for detecting potential rockfall sources. Potential rockfall source areas are identified from the slope angle distribution deduced from the high resolution DEM, crossed with other information extracted from geological and topographic maps in GIS format. The slope angle distribution can be decomposed into several Gaussian distributions that can be considered characteristic of morphological units: rock cliffs, steep slopes, footslopes and plains. A terrain unit is considered a potential rockfall source when its slope angle lies above an angle threshold, defined as the angle at which the Gaussian distribution of the morphological unit "Rock cliffs" becomes dominant over that of "Steep slopes". In addition to this analysis, the cliff outcrops indicated by the topographic maps were added. However, these outcrops also contain "flat areas", so only the slope angle values above the mode of the Gaussian distribution of the morphological unit "Steep slopes" were considered. An application of this method is presented over the entire Canton of Vaud (3200 km2), Switzerland. The results were compared with rockfall sources observed in the field and by orthophoto analysis in order to validate the method. Finally, the influence of the cell size of the DEM is examined by applying the methodology to six different DEM resolutions.
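
As a rough illustration of the decomposition step described above, the sketch below fits a four-component Gaussian mixture to synthetic slope angles and locates the threshold where the weighted density of the steepest component ("rock cliffs") overtakes the next one ("steep slopes"). All angles, spreads and component weights are invented for the example, not taken from the study.

```python
# Illustrative sketch (synthetic data, assumed parameters): decompose a
# slope-angle distribution into four Gaussians and find the angle at which
# the "rock cliffs" component becomes dominant over "steep slopes".
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# synthetic slope angles (degrees) for four morphological units
angles = np.concatenate([
    rng.normal(2, 1.5, 4000),    # plains
    rng.normal(12, 4.0, 3000),   # footslopes
    rng.normal(32, 6.0, 2000),   # steep slopes
    rng.normal(55, 7.0, 1000),   # rock cliffs
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=4, random_state=0).fit(angles)
order = np.argsort(gmm.means_.ravel())
steep, cliff = order[-2], order[-1]            # two steepest components

# smallest angle beyond the steep-slope mean at which the weighted density
# of the cliff component exceeds that of the steep-slope component
x = np.linspace(0, 90, 9001)
def comp_density(k):
    return gmm.weights_[k] * norm.pdf(x, gmm.means_[k, 0],
                                      np.sqrt(gmm.covariances_[k, 0, 0]))
mask = (comp_density(cliff) > comp_density(steep)) & (x > gmm.means_[steep, 0])
threshold = x[mask][0]
```

With real data, `angles` would come from the DEM slope raster; the crossover search transcribes the dominance criterion stated in the abstract.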

Relevance: 30.00%

Abstract:

Mountains and mountain societies provide a wide range of goods and services to humanity, but they are particularly sensitive to the effects of global environmental change. Thus, the definition of appropriate management regimes that maintain the multiple functions of mountain regions in a time of greatly changing climatic, economic, and societal drivers constitutes a significant challenge. Management decisions must be based on a sound understanding of the future dynamics of these systems. The present article reviews the elements required for an integrated effort to project the impacts of global change on mountain regions, and recommends tools that can be used at 3 scientific levels (essential, improved, and optimum). The proposed strategy is evaluated with respect to UNESCO's network of Mountain Biosphere Reserves (MBRs), with the intention of implementing it in other mountain regions as well. First, methods for generating scenarios of key drivers of global change are reviewed, including land use/land cover and climate change. This is followed by a brief review of the models available for projecting the impacts of these scenarios on (1) cryospheric systems, (2) ecosystem structure and diversity, and (3) ecosystem functions such as carbon and water relations. Finally, the cross-cutting role of remote sensing techniques is evaluated with respect to both monitoring and modeling efforts. We conclude that a broad range of techniques is available for both scenario generation and impact assessments, many of which can be implemented without much capacity building across many or even most MBRs. However, to foster implementation of the proposed strategy, further efforts are required to establish partnerships between scientists and resource managers in mountain areas.

Relevance: 30.00%

Abstract:

PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice which relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses, since it depends on assessing the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away the statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular-level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was randomly allowed to decay for each model size and for seven different ratios of number of decays to number of cells, Nr: 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and each cell was attributed an absorbed dose equal to the bin-average absorbed dose with a randomly determined adjustment, based on a Gaussian probability distribution whose width equals the statistical uncertainty consistent with the ratio of decays to cells, i.e., equal to Nr^(-1/2).
From dose volume histograms, the surviving fraction of cells, the equivalent uniform dose (EUD), and the TCP were calculated for the different scenarios. Comparably sized spherical models containing individual spherical cells (15 µm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for the same scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values agreed to within 2% between the adjusted simple sphere and full cellular models. Models were also generated for a nonuniform distribution of activity, and the adjusted spherical and cellular models showed similar agreement. The TCP values predicted for macroscopic tumors were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
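
The per-cell adjustment described in METHODS can be sketched in a few lines. The bin-average dose, cell count and linear-quadratic parameters below are placeholders, and the survival model is a generic linear-quadratic choice rather than the paper's exact end point calculation.

```python
# Sketch of the adjustment step (illustrative values, assumed alpha/beta):
# each cell in a radial bin gets the bin-average absorbed dose perturbed by a
# Gaussian of relative width Nr^(-1/2), where Nr is decays per cell.
import numpy as np

rng = np.random.default_rng(42)

def cell_doses(bin_mean_dose, n_cells, nr):
    """Per-cell doses in one bin; relative sigma = nr ** -0.5."""
    sigma = bin_mean_dose / np.sqrt(nr)
    return rng.normal(bin_mean_dose, sigma, size=n_cells)

def surviving_fraction(doses, alpha=0.3, beta=0.03):
    """Linear-quadratic cell survival averaged over the bin (assumed params)."""
    return float(np.mean(np.exp(-alpha * doses - beta * doses ** 2)))

doses = cell_doses(bin_mean_dose=4.0, n_cells=10_000, nr=100)  # Gy, Nr = 100
sf = surviving_fraction(doses)
```

For small Nr the Gaussian jitter widens, so some cells receive substantially lower doses and the surviving fraction rises relative to a uniform-dose bin, which is exactly the statistical effect the adjusted model is meant to preserve.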

Relevance: 30.00%

Abstract:

Monitoring of posture allocations and activities enables accurate estimation of energy expenditure and may aid in obesity prevention and treatment. At present, accurate devices rely on multiple sensors distributed on the body and thus may be too obtrusive for everyday use. This paper presents a novel wearable sensor capable of very accurate recognition of common postures and activities. The patterns of heel acceleration and plantar pressure uniquely characterize postures and typical activities while requiring minimal preprocessing and no feature extraction. The shoe sensor was tested in nine adults performing sitting and standing postures and walking, running, stair ascent/descent and cycling. Support vector machines (SVMs) were used for classification. A fourfold validation of a six-class subject-independent group model showed 95.2% average accuracy of posture/activity classification with the full sensor set and over 98% with an optimized sensor set. Using the combination of acceleration and pressure also enabled a pronounced reduction of the sampling frequency (from 25 Hz to 1 Hz) without significant loss of accuracy (98% versus 93%). Subjects had shoe sizes (US) M9.5-11 and W7-9 and body mass indexes from 18.1 to 39.4 kg/m2, suggesting that the device can be used by individuals with varying anthropometric characteristics.
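
A minimal sketch of the classification stage with fourfold cross-validation, using synthetic stand-ins for the acceleration/pressure features; the class means, feature count and SVM hyperparameters are assumptions, not the paper's settings.

```python
# Sketch of the SVM classification stage only; the six activity classes are
# simulated as Gaussian clusters standing in for real sensor features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_classes, per_class, n_features = 6, 200, 8   # sit, stand, walk, run, stairs, cycle
X = np.vstack([rng.normal(c, 0.5, size=(per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), per_class)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
accuracy = cross_val_score(clf, X, y, cv=4).mean()   # fourfold validation
```

In the paper's setting, a subject-independent model would hold out whole subjects per fold rather than random samples; `GroupKFold` on subject IDs would be the analogous scikit-learn tool.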

Relevance: 30.00%

Abstract:

This paper presents a review of methodology for semi-supervised modeling with kernel methods when the manifold assumption is guaranteed to be satisfied. It concerns environmental data modeling on natural manifolds, such as the complex topography of mountainous regions, where environmental processes are highly influenced by the relief. These relations, possibly regionalized and nonlinear, can be modeled from data with machine learning, using digital elevation models in semi-supervised kernel methods. The range of tools and methodological issues discussed in the study includes feature selection and semi-supervised Support Vector algorithms. A real case study devoted to data-driven modeling of meteorological fields illustrates the discussed approach.
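
As a loose illustration of the semi-supervised idea (scikit-learn's LabelSpreading standing in for the semi-supervised Support Vector algorithms the paper discusses), the sketch below propagates a relief-dependent class from a few labeled terrain points through coordinates plus DEM elevation. All data and parameters are synthetic.

```python
# Loose illustration: unlabeled terrain points shape the decision function.
# The small RBF gamma makes the elevation axis (largest scale) dominate the
# similarity graph, mimicking a relief-driven process.
import numpy as np
from sklearn.semi_supervised import LabelSpreading

rng = np.random.default_rng(0)
n = 600
X = np.column_stack([
    rng.uniform(0, 50, n),          # x coordinate (km)
    rng.uniform(0, 50, n),          # y coordinate (km)
    rng.uniform(200, 3000, n),      # DEM elevation (m)
])
y_true = (X[:, 2] > 1500).astype(int)       # relief-driven class (e.g. snow/rain)
y = np.full(n, -1)                           # -1 marks unlabeled points
labeled = rng.choice(n, 40, replace=False)
y[labeled] = y_true[labeled]

model = LabelSpreading(kernel="rbf", gamma=1e-5, max_iter=100).fit(X, y)
accuracy = (model.transduction_ == y_true).mean()
```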

Relevance: 30.00%

Abstract:

The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessments of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, while no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data.
Finally, a separate uncertainty analysis was conducted to evaluate the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available at certain populated sites. The analysis illustrates the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, more generally, demonstrates that the BME method performs well in terms of both estimation accuracy and estimation error assessment, both useful features for the Chernobyl fallout study.
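
BME itself requires dedicated machinery, but the role of soft data can be loosely illustrated with a Gaussian process in which uncertain observations enter as high-noise data points. This is an analogy only, not the BME formalism; the plume shape, noise levels and kernel below are invented.

```python
# Analogy only (not BME): "soft" data modeled as high-noise observations in a
# GP, showing how extra uncertain measurements can sharpen a contamination map
# built from sparse precise ("hard") samples.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
def truth(x):
    return np.exp(-((x - 30.0) / 10.0) ** 2)       # synthetic contamination plume

xh = rng.uniform(0, 100, 15)[:, None]              # hard (precise) sample sites
xs = rng.uniform(0, 100, 60)[:, None]              # soft (uncertain) data sites
X = np.vstack([xh, xs])
noise_sd = np.r_[np.full(15, 0.01), np.full(60, 0.2)]
y = truth(X.ravel()) + rng.normal(0.0, 1.0, 75) * noise_sd

gp_hard = GaussianProcessRegressor(RBF(10.0), alpha=1e-4).fit(xh, y[:15])
gp_both = GaussianProcessRegressor(RBF(10.0), alpha=noise_sd ** 2).fit(X, y)

grid = np.linspace(0, 100, 201)[:, None]
err_hard = np.abs(gp_hard.predict(grid) - truth(grid.ravel())).mean()
err_both = np.abs(gp_both.predict(grid) - truth(grid.ravel())).mean()
```

The per-point `alpha` array is how scikit-learn accepts observation-specific noise variances, playing the role that soft-data uncertainty plays in the BME comparison.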

Relevance: 30.00%

Abstract:

Huntington's disease (HD) is an autosomal dominant neurodegenerative disorder caused by an expansion of CAG repeats in the huntingtin (Htt) gene. Despite intensive efforts devoted to investigating the mechanisms of its pathogenesis, effective treatments for this devastating disease remain unavailable. The lack of suitable models recapitulating the entire spectrum of the degenerative process has severely hindered the identification and validation of therapeutic strategies. The discovery that the degeneration in HD is caused by a mutation in a single gene has offered new opportunities to develop experimental models of HD, ranging from in vitro models to transgenic primates. However, recent advances in viral-vector technology provide promising alternatives based on the direct transfer of genes to selected sub-regions of the brain. Rodent studies have shown that overexpression of mutant human Htt in the striatum using adeno-associated virus or lentivirus vectors induces progressive neurodegeneration that resembles that seen in HD. This article highlights progress made in modeling HD using viral vector gene transfer. We describe data obtained with this highly flexible approach for the targeted overexpression of a disease-causing gene. The ability to deliver mutant Htt to specific tissues has opened pathological processes to experimental analysis and allowed targeted therapeutic development in rodent and primate pre-clinical models.

Relevance: 30.00%

Abstract:

In recent years, interest in the potential of type-2 fuzzy sets for managing high levels of uncertainty in the subjective knowledge of experts or in numerical information has focused on control and pattern classification systems. One of the main challenges in designing a type-2 fuzzy logic system is how to estimate the parameters of the type-2 fuzzy membership functions (T2MFs) and the footprint of uncertainty (FOU) from imperfect and noisy datasets. This paper presents an automatic approach for learning and tuning Gaussian interval type-2 membership functions (IT2MFs), with application to multi-dimensional pattern classification problems. T2MFs and their FOUs are tuned according to the uncertainties in the training dataset by a combination of genetic algorithm (GA) and cross-validation techniques. In our GA-based approach, the chromosome has fewer genes than in other GA methods, and chromosome initialization is more precise. The proposed approach applies the interval type-2 fuzzy logic system (IT2FLS) to the problem of nodule classification in a lung Computer Aided Detection (CAD) system. The designed IT2FLS is compared with its type-1 fuzzy logic system (T1FLS) counterpart. The results demonstrate that the IT2FLS outperforms the T1FLS by more than 30% in terms of classification accuracy.
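
A minimal sketch of a Gaussian IT2MF with an uncertain mean, whose upper and lower membership curves bound the FOU; the interval [m1, m2] and sigma below are arbitrary, and the GA/cross-validation tuning loop from the paper is omitted.

```python
# Sketch: Gaussian interval type-2 membership function with uncertain mean
# in [m1, m2] and fixed sigma. The band between the upper and lower curves
# is the footprint of uncertainty (FOU). Parameters are arbitrary.
import numpy as np

def gauss(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2)

def it2mf(x, m1, m2, s):
    """Upper/lower membership curves of a Gaussian IT2MF, uncertain mean."""
    x = np.asarray(x, dtype=float)
    upper = np.where(x < m1, gauss(x, m1, s),
                     np.where(x > m2, gauss(x, m2, s), 1.0))
    lower = np.minimum(gauss(x, m1, s), gauss(x, m2, s))
    return upper, lower

x = np.linspace(-4, 8, 121)
up, lo = it2mf(x, m1=1.0, m2=3.0, s=1.0)
```

In a GA-tuned system of the kind described, m1, m2 and s would be the genes optimized against cross-validated classification error.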

Relevance: 30.00%

Abstract:

Recently, kernel-based Machine Learning methods have gained great popularity in many data analysis and data mining fields: pattern recognition, biocomputing, speech and vision, engineering, remote sensing, etc. The paper describes the use of kernel methods for processing large datasets from environmental monitoring networks. Several typical problems of the environmental sciences and their solutions provided by kernel-based methods are considered: classification of categorical data (soil type classification), mapping of environmental and pollution continuous information (pollution of soil by radionuclides), and mapping with auxiliary information (climatic data from the Aral Sea region). Promising developments, such as automatic emergency hot-spot detection and monitoring network optimization, are discussed as well.

Relevance: 30.00%

Abstract:

The interpretation of the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) is based on a 4-factor model, which is only partially compatible with the mainstream Cattell-Horn-Carroll (CHC) model of intelligence measurement. The structure of cognitive batteries is frequently analyzed via exploratory factor analysis and/or confirmatory factor analysis. With classical confirmatory factor analysis, almost all cross-loadings between latent variables and measures are fixed to zero in order to allow the model to be identified. However, inappropriate zero cross-loadings can contribute to poor model fit, distorted factors, and biased factor correlations; most importantly, they do not necessarily faithfully reflect theory. To deal with these methodological and theoretical limitations, we used a new statistical approach, Bayesian structural equation modeling (BSEM), with a sample of 249 French-speaking Swiss children (aged 8-12 years). With BSEM, zero-fixed cross-loadings between latent variables and measures are replaced by approximate zeros, based on informative, small-variance priors. Results indicated that a direct hierarchical CHC-based model with 5 factors plus a general intelligence factor represented the structure of the WISC-IV better than the 4-factor structure and the higher-order models did. Because the direct hierarchical CHC model was more adequate, it was concluded that the general factor should be considered a breadth factor rather than a superordinate factor. Because we could estimate the influence of each latent variable on the 15 subtest scores, BSEM improved both the understanding of the structure of intelligence tests and the clinical interpretation of the subtest scores.
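
The "approximate zero" idea can be illustrated outside a full BSEM: with factor scores treated as known, a small-variance Gaussian prior on a cross-loading is equivalent to a strong ridge penalty, sitting between fixing the loading to zero and leaving it free. All numbers below (loadings 0.8 and 0.15, residual sd 0.5) are made up for the illustration.

```python
# Conceptual illustration only (not a full BSEM): an N(0, tau^2) prior on a
# cross-loading acts as a ridge penalty sigma^2 / tau^2 on that coefficient.
import numpy as np

rng = np.random.default_rng(0)
n = 500
F = rng.normal(size=(n, 2))                  # latent factor scores (known here)
y = 0.8 * F[:, 0] + 0.15 * F[:, 1] + rng.normal(0, 0.5, n)  # one indicator

def map_loadings(tau2):
    """MAP loadings with an N(0, tau2) prior on the factor-2 cross-loading."""
    penalty = np.diag([0.0, 0.5 ** 2 / tau2])    # ridge term sigma^2 / tau^2
    return np.linalg.solve(F.T @ F + penalty, F.T @ y)

free_fit = map_loadings(np.inf)      # unconstrained cross-loading
approx_fit = map_loadings(0.01)      # BSEM-style "approximate zero" prior
fixed_fit = float((F[:, 0] @ y) / (F[:, 0] @ F[:, 0]))  # cross-loading at 0
```

A real BSEM additionally treats the factor scores as latent and samples the full posterior; the point here is only how a small prior variance shrinks, rather than eliminates, a cross-loading.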

Relevance: 30.00%

Abstract:

The activation of the specific immune response against tumor cells is based on the recognition by CD8+ Cytotoxic T Lymphocytes (CTL) of antigenic peptides (p) presented at the surface of the cell by the class I major histocompatibility complex (MHC). The ability of the so-called T-Cell Receptors (TCR) to discriminate between self and non-self peptides constitutes the most important specific control mechanism against infected cells. The TCR/pMHC interaction has been the subject of much attention in cancer therapy since the design of the adoptive transfer approach, in which T lymphocytes showing a promising response against tumor cells are extracted from the patient, expanded in vitro, and reinfused after immunodepletion, possibly leading to cancer regression. In the last decade, major progress has been achieved by the introduction of engineered lymphocytes. In the meantime, understanding the molecular aspects of the TCR/pMHC interaction has become essential to guide in vitro and in vivo studies. In 1996, the determination of the first structure of a TCR/pMHC complex by X-ray crystallography revealed the molecular basis of the interaction. Since then, molecular modeling techniques have taken advantage of crystal structures to study the conformational space of the complex and understand the specificity of the recognition of the pMHC by the TCR. Meanwhile, experimental techniques used to determine the sequences of TCRs that bind a given pMHC complex have been used intensively, leading to the collection of large repertoires of TCR sequences that are specific for a given pMHC. There is a growing need for computational approaches capable of predicting the molecular interactions that occur upon TCR/pMHC binding without relying on the time-consuming resolution of a crystal structure. This work presents new approaches to analyze the molecular principles that govern the recognition of the pMHC by the TCR and the subsequent activation of the T-cell.
We first introduce TCRep 3D, a new method to model and study the structural properties of TCR repertoires, based on homology and ab initio modeling. We discuss the methodology in detail and demonstrate that it outperforms state-of-the-art modeling methods in predicting relevant TCR conformations. Two successful applications of TCRep 3D that supported experimental studies on TCR repertoires are presented. Second, we present a rigid-body study of TCR/pMHC complexes that gives insight into how the TCR approaches the pMHC. We show that the binding mode of the TCR is correctly described by long-distance interactions. Finally, the last section is dedicated to a detailed analysis of an experimental hydrogen-exchange study, which suggests that some regions of the constant domain of the TCR undergo conformational changes upon binding to the pMHC. Based on the analysis of correlated motions in the TCR/pMHC structure, we propose a hypothesis for the structural signaling of TCR molecules leading to the activation of the T-cell.

Relevance: 30.00%

Abstract:

The detection of Parkinson's disease (PD) in its preclinical stages, prior to outright neurodegeneration, is essential to the development of neuroprotective therapies and could reduce the number of misdiagnosed patients. However, early diagnosis is currently hampered by a lack of reliable biomarkers. 1H magnetic resonance spectroscopy (MRS) offers a noninvasive measure of brain metabolite levels that allows the identification of such potential biomarkers. This study aimed at using MRS on an ultrahigh-field 14.1 T magnet to explore the striatal metabolic changes occurring in two different rat models of the disease. Rats lesioned by the injection of 6-hydroxydopamine (6-OHDA) in the medial forebrain bundle were used to model a complete nigrostriatal lesion, while a genetic model based on the nigral injection of an adeno-associated viral (AAV) vector coding for human α-synuclein was used to model progressive neurodegeneration and dopaminergic neuron dysfunction, thereby replicating conditions closer to the early pathological stages of PD. MRS measurements in the striatum of the 6-OHDA rats revealed significant decreases in glutamate and N-acetyl-aspartate levels and a significant increase in the GABA level in the ipsilateral hemisphere compared with the contralateral one, while the αSyn-overexpressing rats showed a significant increase in the striatal GABA level only. We therefore conclude that MRS measurements of striatal GABA levels could allow for the detection of early nigrostriatal defects prior to outright neurodegeneration and, as such, offer great potential as a sensitive biomarker of presymptomatic PD.

Relevance: 30.00%

Abstract:

Protein-protein interactions encode the wiring diagram of cellular signaling pathways, and their deregulation underlies a variety of diseases, such as cancer. Inhibiting protein-protein interactions with peptide derivatives is a promising way to develop new biological and therapeutic tools. Here, we develop a general framework to computationally handle hundreds of non-natural amino acid sidechains and predict the effect of inserting them into peptides or proteins. We first generate all structural files (pdb and mol2), as well as parameters and topologies for standard molecular mechanics software (CHARMM and Gromacs). Accurate predictions of rotamer probabilities are provided using a novel combined knowledge- and physics-based strategy. Non-natural sidechains are useful for increasing peptide ligand binding affinity. Our results on non-natural mutants of a BCL9 peptide targeting beta-catenin show very good correlation between predicted and experimental binding free energies, indicating that such predictions can be used to design new inhibitors. Data generated in this work, as well as PyMOL and UCSF Chimera plug-ins for user-friendly visualization of non-natural sidechains, are all available at http://www.swisssidechain.ch. Our results enable researchers to rapidly and efficiently work with hundreds of non-natural sidechains.

Relevance: 30.00%

Abstract:

The complex network dynamics that arise from the interaction of the brain's structural and functional architectures give rise to mental function. Theoretical models demonstrate that the structure-function relation is maximal when the global network dynamics operate at a critical point of state transition. In the present work, we used a dynamic mean-field neural model to fit empirical structural connectivity (SC) and functional connectivity (FC) data acquired in humans and macaques and developed a new iterative-fitting algorithm to optimize the SC matrix based on the FC matrix. A dramatic improvement of the fitting of the matrices was obtained with the addition of a small number of anatomical links, particularly cross-hemispheric connections, and reweighting of existing connections. We suggest that the notion of a critical working point, where the structure-function interplay is maximal, may provide a new way to link behavior and cognition, and a new perspective to understand recovery of function in clinical conditions.
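
A much-simplified sketch of the iterative-fitting idea: a linear Ornstein-Uhlenbeck network model stands in for the dynamic mean-field model, modeled FC is derived from SC via a Lyapunov equation, and SC links are reweighted where modeled FC underestimates empirical FC. The matrices, step size and iteration count are all illustrative, and candidate links are restricted to a known support for simplicity.

```python
# Simplified sketch of iterative SC refitting against FC (not the paper's
# dynamic mean-field model): FC is the correlation of the stationary
# covariance of a linear OU network, obtained from a Lyapunov equation.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def model_fc(sc, g=0.5):
    """FC from the linear OU model dx = (-x + g*SC x) dt + noise."""
    n = sc.shape[0]
    A = -np.eye(n) + g * sc
    P = solve_continuous_lyapunov(A, -np.eye(n))   # A P + P A^T = -I
    d = np.sqrt(np.diag(P))
    return P / np.outer(d, d)                      # covariance -> correlation

rng = np.random.default_rng(1)
n = 20
true_sc = rng.random((n, n)) * (rng.random((n, n)) < 0.2)
true_sc = (true_sc + true_sc.T) / 2
np.fill_diagonal(true_sc, 0)
true_sc /= 2 * true_sc.sum(axis=1).max()           # keep the OU system stable
fc_emp = model_fc(true_sc)                         # "empirical" FC

sc = true_sc * (rng.random((n, n)) < 0.8)          # initial SC, links missing
sc = (sc + sc.T) / 2
iu = np.triu_indices(n, 1)
fits = []
for _ in range(20):                                # iterative refitting
    fc_mod = model_fc(sc)
    fits.append(np.corrcoef(fc_emp[iu], fc_mod[iu])[0, 1])
    # add/reweight links where the model underestimates empirical FC
    sc = sc + 0.05 * np.maximum(fc_emp - fc_mod, 0) * (true_sc > 0)
```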

Relevance: 30.00%

Abstract:

Neuroimaging studies typically compare experimental conditions using average brain responses, thereby overlooking the stimulus-related information conveyed by distributed spatio-temporal patterns of single-trial responses. Here, we take advantage of this rich information at the single-trial level to decode stimulus-related signals in two event-related potential (ERP) studies. Our method models the statistical distribution of the voltage topographies with a Gaussian Mixture Model (GMM), which reduces the dataset to a small number of representative voltage topographies. The degree to which these topographies are present across trials at specific latencies is then used to classify experimental conditions. We tested the algorithm using a cross-validation procedure in two independent EEG datasets. In the first ERP study, we classified left- versus right-hemifield checkerboard stimuli for upper and lower visual hemifields. In the second ERP study, where functional differences cannot be assumed, we classified initial versus repeated presentations of visual objects. With minimal a priori information, the GMM provides neurophysiologically interpretable features, namely voltage topographies, as well as dynamic information about brain function. This method can in principle be applied to any ERP dataset to test the functional relevance of specific time periods for stimulus processing, the predictability of subjects' behavior and cognitive states, and the discrimination between healthy and clinical populations.
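
The decoding pipeline can be caricatured as follows: fit a GMM to single-trial topographies pooled over trials and time, then classify trials from how often each template topography appears. The synthetic EEG dimensions, component count and classifier below are assumptions made for the sketch, not the study's settings.

```python
# Sketch of the decoding idea: GMM templates over pooled single-trial
# topographies, per-trial template-occupancy features, then classification.
# The "EEG" data are synthetic (trials x timepoints x electrodes).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
trials, times, chans = 200, 50, 16
y = np.repeat([0, 1], trials // 2)
X = rng.normal(0.0, 1.0, (trials, times, chans))
X[y == 1, 20:30] += rng.normal(0.0, 1.0, chans)    # condition-specific topography

# reduce all single-trial topographies to a few representative templates
gmm = GaussianMixture(n_components=4, covariance_type="diag",
                      random_state=0).fit(X.reshape(-1, chans))
labels = gmm.predict(X.reshape(-1, chans)).reshape(trials, times)

# per-trial feature: fraction of timepoints assigned to each template
features = np.stack([(labels == k).mean(axis=1) for k in range(4)], axis=1)
accuracy = cross_val_score(LogisticRegression(max_iter=1000),
                           features, y, cv=5).mean()
```

The study additionally exploits *when* each topography occurs; a straightforward extension would compute occupancy per latency window rather than over the whole trial.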