954 results for Scale-Invariant Feature Transform (SIFT)
Abstract:
High-throughput technologies are now used to generate more than one type of data from the same biological samples. To properly integrate such data, we propose using co-modules, which describe coherent patterns across paired data sets, and conceive several modular methods for their identification. We first test these methods using in silico data, demonstrating that the integrative scheme of our Ping-Pong Algorithm uncovers drug-gene associations more accurately when considering noisy or complex data. Second, we provide an extensive comparative study using the gene-expression and drug-response data from the NCI-60 cell lines. Using information from the DrugBank and the Connectivity Map databases, we show that the Ping-Pong Algorithm predicts drug-gene associations significantly better than other methods. Co-modules provide insights into possible mechanisms of action for a wide range of drugs and suggest new targets for therapy.
Abstract:
Primary brain tumours are heterogeneous in histology, genetics, and outcome. Although WHO's classification of tumours of the CNS has greatly helped to standardise diagnostic criteria worldwide, it does not consider the substantial progress that has been made in the molecular classification of many brain tumours. Recent practice-changing clinical trials have defined a role for routine assessment of MGMT promoter methylation in glioblastomas in elderly people, and 1p and 19q codeletions in anaplastic oligodendroglial tumours. Moreover, large-scale molecular profiling approaches have identified new mutations in gliomas, affecting IDH1, IDH2, H3F3, ATRX, and CIC, which has allowed subclassification of gliomas into distinct molecular subgroups with characteristic features of age, localisation, and outcome. However, these molecular approaches cannot yet predict patients' benefit from therapeutic interventions. Similarly, transcriptome-based classification of medulloblastoma has delineated four variants that might now be candidate diseases in which to explore novel targeted agents.
Abstract:
An observational study of patients with SLE seen at University College London Hospital between 1976 and 2005 was carried out to review the differences between men and women with lupus with regard to clinical characteristics, serology, and outcomes. 439 women and 45 men were identified. The mean age at diagnosis was 29.3 years (12.6), with no significant differences between men and women. Female sex was significantly associated with the presence of oral ulcers and IgM ACA. There were no significant differences in the comparison of the other variables. Over this thirty-year follow-up period, relatively few differences have emerged when comparing the frequencies of the clinical and serological characteristics in men and women with lupus.
Abstract:
Let A be a simple, separable C*-algebra of stable rank one. We prove that the Cuntz semigroup of C(T, A) is determined by its Murray-von Neumann semigroup of projections and a certain semigroup of lower semicontinuous functions (with values in the Cuntz semigroup of A). This result has two consequences. First, specializing to the case that A is simple, finite, separable and Z-stable, this yields a description of the Cuntz semigroup of C(T, A) in terms of the Elliott invariant of A. Second, suitably interpreted, it shows that the Elliott functor and the functor defined by the Cuntz semigroup of the tensor product with the algebra of continuous functions on the circle are naturally equivalent.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
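The gradual-deformation-based perturbation mentioned above can be sketched in a few lines. The following is a minimal illustration of the standard gradual deformation combination for mean-zero, unit-variance Gaussian fields; the function and variable names are mine, not the thesis code, and real applications would embed this in a full MCMC acceptance loop:

```python
import numpy as np

def gradual_deformation_proposal(m_current, rng, theta):
    """Propose a perturbed Gaussian field realization by gradual deformation.

    Combines the current realization with an independent draw:
        m_new = cos(theta) * m_current + sin(theta) * m_independent
    which preserves mean-zero, unit-variance Gaussian statistics for any
    deformation strength theta; a small theta yields a small perturbation,
    while theta = pi/2 yields a fully independent realization.
    """
    m_independent = rng.standard_normal(m_current.shape)
    return np.cos(theta) * m_current + np.sin(theta) * m_independent
```

Within an MCMC chain, theta plays the role of the perturbation strength referred to in the abstract: it trades the acceptance rate against how quickly successive samples decorrelate.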
Abstract:
BACKGROUND: The recurrent ~600 kb 16p11.2 BP4-BP5 deletion is among the most frequent known genetic aetiologies of autism spectrum disorder (ASD) and related neurodevelopmental disorders. OBJECTIVE: To define the medical, neuropsychological, and behavioural phenotypes in carriers of this deletion. METHODS: We collected clinical data on 285 deletion carriers and performed detailed evaluations on 72 carriers and 68 intrafamilial non-carrier controls. RESULTS: When compared to intrafamilial controls, full scale intelligence quotient (FSIQ) is two standard deviations lower in carriers, and there is no difference between carriers referred for neurodevelopmental disorders and carriers identified through cascade family testing. Verbal IQ (mean 74) is lower than non-verbal IQ (mean 83) and a majority of carriers require speech therapy. Over 80% of individuals exhibit psychiatric disorders including ASD, which is present in 15% of the paediatric carriers. Increase in head circumference (HC) during infancy is similar to the HC and brain growth patterns observed in idiopathic ASD. Obesity, a major comorbidity present in 50% of the carriers by the age of 7 years, does not correlate with FSIQ or any behavioural trait. Seizures are present in 24% of carriers and occur independently of other symptoms. Malformations are infrequently found, confirming only a few of the previously reported associations. CONCLUSIONS: The 16p11.2 deletion impacts FSIQ, behaviour, and body mass index in a quantitative and independent manner, possibly through direct influences on neural circuitry. Although non-specific, these features are clinically significant and reproducible. Lastly, this study demonstrates the necessity of studying large patient cohorts ascertained through multiple methods to characterise the clinical consequences of rare variants involved in common diseases.
Abstract:
A morphological study of the midgut of Lutzomyia intermedia, the primary vector of cutaneous leishmaniasis in southeast Brazil, was conducted by light, scanning and transmission electron microscopy. The midgut is formed by a layer of columnar epithelial cells on a non-cellular basal lamina, beneath which lies a musculature consisting of circular and longitudinal muscle fibers. A tracheolar network is observed surrounding and penetrating the musculature. Females were examined 12, 24, 48 and 72 h and 5 days following a blood meal and were compared with starved females by transmission electron microscopy. In starved females, the epithelium of both the anterior and posterior sections of the midgut presents whorl-shaped rough endoplasmic reticulum. The posterior section does not present well-developed cellular structures such as mitochondria. Observations performed at 12, 24, 48 and 72 h after the blood meal showed morphological changes in the cellular structures in this section, and the presence of the peritrophic matrix up to 48 h after the blood meal. Digestion is almost complete and only a few residues are detected in the lumen 72 h after blood feeding. Finally, on the 5th day after the blood meal all cellular structures present their original features, resembling those seen in starved sand flies. Morphometric data confirmed the morphological observations. Mitochondria, nuclei and microvilli of midgut epithelial cells differ between starved and blood-fed females. The mitochondria present a similar profile in the epithelium of both the anterior and posterior sections of the midgut, with larger dimensions in starved females. The cell microvilli in the posterior section of the midgut of starved females are twice the size of those in females that had taken a blood meal. We conclude that there are changes in the midgut cellular structures of L. intermedia during the digestion of blood, which are in agreement with those described for other hematophagous Diptera.
Abstract:
In the PhD thesis “Sound Texture Modeling” we deal with the statistical modelling of textural sounds like water, wind, rain, etc., for synthesis and classification. Our initial model is based on a wavelet tree signal decomposition and the modeling of the resulting coefficient sequence by means of a parametric probabilistic model that can be situated within the family of models trainable via expectation maximization (the hidden Markov tree model). Our model is able to capture key characteristics of the source textures (water, rain, fire, applause, crowd chatter), and faithfully reproduces some of the sound classes. In terms of the more general taxonomy of natural events proposed by Gaver, we worked on models for natural event classification and segmentation. While the event labels comprise physical interactions between materials that do not have textural properties in their entirety, those segmentation models can help in identifying textural portions of an audio recording useful for analysis and resynthesis. Following our work on concatenative synthesis of musical instruments, we have developed a pattern-based synthesis system that allows sonic exploration of a database of units by means of their representation in a perceptual feature space. Concatenative synthesis with “molecules” built from sparse atomic representations also allows capturing low-level correlations in perceptual audio features, while facilitating the manipulation of textural sounds based on their physical and perceptual properties. We have approached the problem of sound texture modelling for synthesis from different directions, namely a low-level signal-theoretic point of view through a wavelet transform, and a more high-level point of view driven by perceptual audio features in the concatenative synthesis setting. The developed framework provides a unified approach to the high-quality resynthesis of natural texture sounds.
Our research is embedded within the Metaverse 1 European project (2008-2011), where our models are contributing as low-level building blocks within a semi-automated soundscape generation system.
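The wavelet tree decomposition underlying the initial model can be illustrated with a minimal Haar analysis. This sketch is only a toy stand-in (plain Haar filters, power-of-two signal lengths) for the decomposition used in the thesis; the function name is mine:

```python
import numpy as np

def haar_wavelet_tree(signal):
    """Decompose a power-of-two-length signal into a Haar wavelet tree.

    Returns (details, approximation): one detail-coefficient array per
    scale (finest first) plus the final coarse approximation.  Energy is
    preserved because the Haar filter pair is orthonormal.
    """
    s = np.asarray(signal, dtype=float)
    details = []
    while s.size > 1:
        details.append((s[0::2] - s[1::2]) / np.sqrt(2.0))  # high-pass branch
        s = (s[0::2] + s[1::2]) / np.sqrt(2.0)              # low-pass branch
    return details, s
```

A hidden Markov tree model, as used in the thesis, would then couple the hidden states of parent and child coefficients across these scales.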
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. Then a stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure allows obtaining remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
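To illustrate how a kernel-based relation between collocated conductivities might be exploited, here is a hedged sketch: a simple Nadaraya-Watson conditional mean with a Gaussian kernel. The function name, the fixed bandwidth, and the univariate simplification are all my assumptions, not the authors' implementation, which uses a full non-parametric multivariate kernel density within a Bayesian sequential simulation:

```python
import numpy as np

def conditional_mean_log_k(log_sigma_obs, log_sigma_data, log_k_data,
                           bandwidth=0.1):
    """Nadaraya-Watson estimate of E[log K | log sigma] from collocated
    borehole samples of electrical (sigma) and hydraulic (K) conductivity.

    Gaussian-kernel weights emphasize samples whose logged electrical
    conductivity is close to the observed value at the target location.
    """
    w = np.exp(-0.5 * ((log_sigma_obs - log_sigma_data) / bandwidth) ** 2)
    return float((w * log_k_data).sum() / w.sum())
```

In the full procedure, a conditional distribution (not just its mean) would be sampled sequentially at each non-sampled location, which is what makes the simulation stochastic.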
Abstract:
A sizable fraction of T cells expressing the NK cell marker NK1.1 (NKT cells) bear a very conserved TCR, characterized by homologous invariant (inv.) TCR V alpha 24-J alpha Q and V alpha 14-J alpha 18 rearrangements in humans and mice, respectively, and are thus defined as inv. NKT cells. Because human inv. NKT cells recognize mouse CD1d in vitro, we wondered whether a human inv. V alpha 24 TCR could be selected in vivo by mouse ligands presented by CD1d, thereby supporting the development of inv. NKT cells in mice. Therefore, we generated transgenic (Tg) mice expressing the human inv. V alpha 24-J alpha Q TCR chain in all T cells. The expression of the human inv. V alpha 24 TCR in TCR C alpha(-/-) mice indeed rescues the development of inv. NKT cells, which home preferentially to the liver and respond to the CD1d-restricted ligand alpha-galactosylceramide (alpha-GalCer). However, unlike inv. NKT cells from non-Tg mice, the majority of NKT cells in V alpha 24 Tg mice display a double-negative phenotype, as well as a significant increase in TCR V beta 7 and a corresponding decrease in TCR V beta 8.2 use. Despite the forced expression of the human CD1d-restricted TCR in C alpha(-/-) mice, staining with mCD1d-alpha-GalCer tetramers reveals that the absolute numbers of peripheral CD1d-dependent T lymphocytes increase at most by 2-fold. This increase is accounted for mainly by an increased fraction of NK1.1(-) T cells that bind CD1d-alpha-GalCer tetramers. These findings indicate that human inv. V alpha 24 TCR supports the development of CD1d-dependent lymphocytes in mice, and argue for a tight homeostatic control on the total number of inv. NKT cells. Thus, human inv. V alpha 24 TCR-expressing mice are a valuable model to study different aspects of the inv. NKT cell subset.
Abstract:
Invariant Valpha14 (Valpha14i) NKT cells are a murine CD1d-dependent regulatory T cell subset characterized by a Valpha14-Jalpha18 rearrangement and expression of mostly Vbeta8.2 and Vbeta7. Whereas the TCR Vbeta domain influences the binding avidity of the Valpha14i TCR for CD1d-alpha-galactosylceramide complexes, with Vbeta8.2 conferring higher avidity binding than Vbeta7, a possible impact of the TCR Vbeta domain on Valpha14i NKT cell selection by endogenous ligands has not been studied. In this study, we show that thymic selection of Vbeta7(+), but not Vbeta8.2(+), Valpha14i NKT cells is favored in situations where endogenous ligand concentration or TCRalpha-chain avidity are suboptimal. Furthermore, thymic Vbeta7(+) Valpha14i NKT cells were preferentially selected in vitro in response to CD1d-dependent presentation of endogenous ligands or exogenously added self ligand isoglobotrihexosylceramide. Collectively, our data demonstrate that the TCR Vbeta domain influences the selection of Valpha14i NKT cells by endogenous ligands, presumably because Vbeta7 confers higher avidity binding.