58 results for INDEPENDENT COMPONENT ANALYSIS


Relevance:

80.00%

Publisher:

Abstract:

Mismatch negativity (MMN) overlaps with other auditory event-related potential (ERP) components. We examined the ERPs of 50 9- to 11-year-old children for the vowels /i/ and /y/ and for equivalent complex tones. The goal was to separate MMN from obligatory ERP components using principal component analysis and an equal-probability control condition. In addition to the deviant-minus-standard contrast, we employed the deviant-minus-control contrast to test whether obligatory processing contributes to MMN in children. In the speech deviant-minus-standard contrast, MMN starts around 112 ms. However, when both contrasts are examined, MMN emerges for speech at 160 ms, whereas for nonspeech MMN is observed at 112 ms regardless of contrast. We argue that this discriminative response to speech stimuli at 112 ms is obligatory in nature rather than reflecting change-detection processing.
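The PCA step for separating overlapping ERP components can be sketched in a few lines. The following Python simulation is purely illustrative: the component shapes, latencies and amplitudes are invented, not taken from the study.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.4, 200)                    # 0-400 ms epoch

# Two overlapping "components": an obligatory response peaking near 112 ms
# and an MMN-like wave peaking near 160 ms (shapes are invented)
obligatory = np.exp(-((t - 0.112) / 0.02) ** 2)
mmn = np.exp(-((t - 0.160) / 0.03) ** 2)

# Simulated ERPs: each of 50 "children" mixes the two with varying amplitude
n_subjects = 50
A = rng.normal(1.0, 0.3, (n_subjects, 2))
erps = A @ np.vstack([obligatory, mmn]) + rng.normal(0, 0.05, (n_subjects, 200))

# PCA decomposes the waveforms into orthogonal temporal components
pca = PCA(n_components=2)
scores = pca.fit_transform(erps)
print(pca.explained_variance_ratio_)
```

In this toy setup the two leading components carry most of the variance, mirroring how PCA can isolate temporally overlapping contributions that a simple difference wave would confound.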

To study the stress-induced effects of wounding from a new perspective, a metabolomic strategy based on HPLC-MS was devised for the model plant Arabidopsis thaliana. To detect induced metabolites and precisely localise these compounds among the numerous constitutive metabolites, HPLC-MS analyses were performed in a two-step strategy. In the first step, rapid direct TOF-MS measurements of the crude leaf extract were performed with a ballistic gradient on a short LC column. The HPLC-MS data were analysed by multivariate methods as total mass spectra (TMS). Principal component analysis (PCA) and hierarchical cluster analysis (HCA) on principal coordinates were combined for data treatment. PCA and HCA demonstrated a clear clustering of plant specimens when the most discriminating ions from the complete data analysis were selected, leading to the specific detection of discrete induced ions (m/z values). Furthermore, pools of plants with homogeneous behaviour were constituted for confirmatory analysis. In the second step, long high-resolution LC profilings on a UPLC-TOF-MS system were run on the pooled samples. This allowed the putative biological markers induced by wounding to be precisely localised by targeted extraction of the accurate m/z values detected in the screening procedure with the TMS spectra.
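The PCA-plus-HCA screening step can be illustrated with a small simulation; the "total mass spectra" below are synthetic Poisson counts with a few invented induced ions, not real HPLC-MS data.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(1)
n_mz = 300                                   # number of m/z channels

# Two groups of plants: controls and wounded, the latter with a few
# strongly induced ions (channel positions and intensities are invented)
control = rng.poisson(5.0, (8, n_mz)).astype(float)
wounded = rng.poisson(5.0, (8, n_mz)).astype(float)
wounded[:, [40, 120, 250]] += 60.0           # discrete induced ions

X = np.vstack([control, wounded])

# PCA on the spectra, then HCA on the principal coordinates (scores)
scores = PCA(n_components=3).fit_transform(X)
labels = fcluster(linkage(scores, method="average"), t=2, criterion="maxclust")
print(labels)
```

Clustering on PCA scores rather than on the raw channels filters most of the counting noise, which is why the induced ions dominate the grouping even though they occupy only three of 300 channels.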

Tire traces can be observed at many crime scenes, as vehicles are often used by criminals. Tread abrasion on the road while braking or skidding produces small rubber particles which can be collected for comparison purposes. This research focused on the statistical comparison of Py-GC/MS profiles of tire traces and tire treads. The analytical method was optimised using experimental designs, the aim being to determine the pyrolysis parameters giving the most repeatable results; the effect of each pyrolysis factor could thus also be calculated. The pyrolysis temperature was found to be five times more important than the pyrolysis time. Finally, pyrolysis at 650 °C for 15 s was selected. Ten tires of different manufacturers and models were used for this study. Several samples were collected from each tire, and several replicates were carried out to study the variability within each tire (intravariability). More than eighty compounds were integrated for each analysis, and the variability study showed that more than 75% of them presented a relative standard deviation (RSD) below 5% for the ten tires, supporting a low intravariability. The variability between the ten tires (intervariability) was higher, and the ten most variable compounds had RSD values above 13%, supporting their high potential for discriminating between the tires tested. Principal component analysis (PCA) was able to fully discriminate the ten tires using the first three principal components. The ten tires were finally used to perform braking tests on a racetrack with a vehicle equipped with an anti-lock braking system. The resulting tire traces were collected using sheets of white gelatine. As for the tires, the intravariability of the traces was found to be lower than the intervariability.
Clustering methods were then applied, and Ward's method based on the squared Euclidean distance correctly grouped all replicates of each tire trace in the same cluster as the replicates of their corresponding tire. Blind tests on traces were performed, and the traces were correctly assigned to their source tire. These results support the hypothesis that the tested tires, of different manufacturers and models, can be discriminated by statistical comparison of their chemical profiles. The traces were found to be indistinguishable from their source but differentiable from all the other tires in the subset. These promising results will be extended to a larger sample set.
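The grouping of trace replicates with their source tire can be sketched with a small simulation. The compound profiles below are random stand-ins for real Py-GC/MS peak areas; note that SciPy's Ward implementation works from Euclidean distances, a close analogue of the squared-Euclidean Ward procedure described above.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(2)
n_compounds = 80
n_tires, n_replicates = 10, 5

# Each tire gets its own mean chemical profile; replicates vary slightly
# (low intravariability) while tires differ strongly (high intervariability)
tire_means = rng.uniform(10, 100, (n_tires, n_compounds))
profiles = np.vstack([
    mean + rng.normal(0, 1.0, (n_replicates, n_compounds))
    for mean in tire_means
])

# Ward's criterion (SciPy computes it from Euclidean distances)
Z = linkage(profiles, method="ward")
clusters = fcluster(Z, t=n_tires, criterion="maxclust")
print(clusters.reshape(n_tires, n_replicates))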

A multivariate morphometric study of the greater white-toothed shrew (C. russula) throughout its Palaearctic range was carried out to search for patterns of geographic variation within the species boundary. Burnaby's method and multiple-group principal component analysis allowed the raw data to be adjusted for within-sample allometric variation. The multivariate 'size-free' results show a stepped cline, with phenotypic trait reduction and shape change from the eastern to the western Maghreb. Pleistocene fossil mandibles proved to have low phenetic distances to eastern populations (Tunisia, eastern Algeria), and it is argued that their character set represents the primitive condition. The ancestral Mid-Pleistocene shrews lived in a relatively more humid climate. Geo-climatic changes in the North African range during the Quaternary provoked phenetic variation in C. russula and, it can be argued, the evolution of the modern western C. r. yebalensis. A historical process can thus be assumed to be the main cause of this categorical variation, through segmentation of the species range by geo-climatic events. The morphometric discontinuity within the Maghreb range of C. russula is shown to be congruent with karyological and biochemical studies: Moroccan and Tunisian shrews differ, for example, in NFa chromosome numbers and electrophoretic traits. A stasipatric process can be invoked to explain categorical variation in the Maghreb range. Colonization and divergence of insular populations have resulted in more or less differentiated geographic races; the populations of Ibiza and Pantelleria are close to the species threshold (Nei's D ≥ 0.1). The process of speciation undergone by the greater white-toothed shrew has produced a complex pattern of geographic variation, including both allopatric and non-allopatric modes.
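Burnaby's size adjustment amounts to projecting the measurements onto the orthogonal complement of an allometric growth vector. A minimal numpy sketch, with an invented allometric vector and simulated measurements:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 40, 6                               # specimens x measurements

size = rng.normal(0, 1, n)                 # latent body size per specimen
allom = np.array([1.0, 0.8, 1.2, 0.9, 1.1, 0.7])   # invented loadings
X = np.outer(size, allom) + rng.normal(0, 0.2, (n, p))  # raw data

# Burnaby's projector: P = I - g g^T, with g the unit allometric vector
g = allom / np.linalg.norm(allom)
P = np.eye(p) - np.outer(g, g)
X_sizefree = X @ P

# All size-axis variation is removed from the adjusted data
print(np.abs(X_sizefree @ g).max())
```

After projection, the adjusted data are exactly orthogonal to the size axis, so subsequent ordinations compare 'size-free' shape only.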

The current study investigated cognitive resource allocation in discourse processing by means of pupil dilation and behavioral measures. Short question-answer dialogs were presented to listeners. The context question either queried a new information focus in the subsequent answer, or was corrected in the answer sentence (correction information). The information foci contained in the answer sentences were either adequately highlighted by prosodic means or not. Participants had to judge the adequacy of the focus prosody with respect to the preceding context question. Prosodic judgment accuracy was higher in the conditions with adequate focus prosody than in those with inadequate focus prosody. Latency to peak pupil dilation was longer when new information foci were perceived than for correction foci. Moreover, for peak dilation, an interaction of focus type and prosody was found: post hoc statistical tests revealed that prosodically adequate correction foci were processed with a smaller peak dilation than all other dialog conditions. Thus, the pupil dilation data and the results of a principal component analysis suggest an interaction of focus type and focus prosody in discourse processing.

Raman spectroscopy combined with chemometrics has recently become a widespread technique for the analysis of pharmaceutical solid forms. The application presented in this paper is the investigation of counterfeit medicines. This increasingly serious issue involves networks that are an integral part of industrialized organized crime, and efficient analytical tools are consequently required to fight it. Quick and reliable authentication means are needed to allow the deployment of countermeasures by the company and the authorities. For this purpose a two-step method has been implemented here. The first step enables the identification of pharmaceutical tablets and capsules and the detection of their counterfeits. A nonlinear classification method, support vector machines (SVM), is combined with a correlation against the database and the detection of active pharmaceutical ingredient (API) peaks in the suspect product. If a counterfeit is detected, the second step allows its chemical profiling among former counterfeits in a forensic intelligence perspective. For this second step, a classification based on principal component analysis (PCA) and correlation distance measurements is applied to the Raman spectra of the counterfeits.
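The first-step classification can be sketched with scikit-learn. The "Raman spectra" below are synthetic Gaussian bands with an invented API peak at 1450 cm^-1, so this only illustrates the SVM mechanics, not the published method.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(4)
wn = np.linspace(400, 1800, 200)           # wavenumber axis (cm^-1)

def spectrum(api_peak):
    """Toy spectrum: broad excipient band plus an optional API peak."""
    base = np.exp(-((wn - 1000) / 300) ** 2)
    api = api_peak * np.exp(-((wn - 1450) / 10) ** 2)
    return base + api + rng.normal(0, 0.02, wn.size)

genuine = np.array([spectrum(1.0) for _ in range(20)])
counterfeit = np.array([spectrum(0.0) for _ in range(20)])  # API missing

X = np.vstack([genuine, counterfeit])
y = np.array([0] * 20 + [1] * 20)          # 0 = genuine, 1 = counterfeit

# Nonlinear (RBF-kernel) SVM as in the paper's first step
clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
suspects = np.array([spectrum(1.0), spectrum(0.0)])
print(clf.predict(suspects))
```

A real deployment would combine such a classifier with spectral correlation against a reference database and explicit API-peak checks, as the abstract describes.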

To assess religious coping in schizophrenia, we developed and tested a clinical grid, as no validated questionnaire exists for this population. One hundred and fifteen outpatients were interviewed, and the results obtained by two clinicians were compared. Religion was central in the lives of 45% of patients, and 60% used religion extensively to cope with their illness. Religion is a multifaceted construct; principal component analysis elicited four factors: a subjective dimension, a collective dimension, synergy with psychiatric treatment, and ease of talking about religion with a psychiatrist. Different associations were found between these factors and psychopathology, substance abuse, and psychosocial adaptation. The high prevalence of spirituality and religious coping clearly indicates the necessity of addressing spirituality in patient care. Our clinical grid is suitable for this purpose: it proved applicable to a broad diversity of religious beliefs, even pathological ones, interjudge reliability and construct validity were high, and no specific training is required.

Aim: To understand the interplay of heterogeneous climatic and spatial landscapes in shaping the distribution of nuclear microsatellite variation in the burrowing parrot, Cyanoliseus patagonus. Given the marked phenotypic differences between populations of burrowing parrots, we hypothesized an important role of geographical as well as climatic heterogeneity in the population structure of this species. Location: Southern South America. Methods: We applied a landscape genetics approach to investigate explicit patterns of genetic spatial autocorrelation based on both geography and climate using spatial principal component analysis (sPCA). This necessitated a novel statistical estimation of the species' climatic landscape, considering temperature- and precipitation-based variables separately to evaluate their weight in shaping the distribution of genetic variation in our model system. Results: Geographical and climatic heterogeneity successfully explained molecular variance in burrowing parrots. sPCA divided the species distribution into two main areas, Patagonia and the pre-Andes, connected by an area of geographical and climatic transition. Moreover, sPCA revealed cryptic and conservation-relevant genetic structure: the pre-Andean populations and the transition localities were each divided into two groups, each constituting a management unit for conservation. Main conclusions: sPCA, a method originally developed for spatial genetics, allowed us to unravel the genetic structure related to spatial and climatic landscapes and to visualize these patterns in landscape space. These novel climatic inferences underscore the importance of our modified sPCA approach in revealing how climatic variables can drive cryptic patterns of genetic structure, making the approach potentially useful for studying any species distributed over a climatically heterogeneous landscape.
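sPCA itself is implemented in the R package adegenet; its core idea, an eigenanalysis rewarding axes with both high genetic variance and high spatial autocorrelation, can be sketched in numpy. Everything below (genotypes, coordinates, neighbour graph) is simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 30, 50                                    # individuals x loci

# Two spatially separated groups with a modest allele-frequency shift
coords = np.vstack([rng.normal(0, 1, (15, 2)), rng.normal(5, 1, (15, 2))])
X = rng.normal(0, 1, (n, p))
X[15:, :5] += 2.5                                # differentiated loci
X -= X.mean(axis=0)                              # centre the data

# Row-standardised spatial weights from 3 nearest neighbours
d = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
W = np.zeros((n, n))
for i in range(n):
    W[i, np.argsort(d[i])[1:4]] = 1.0
W /= W.sum(axis=1, keepdims=True)

# sPCA-style eigenanalysis of the spatially lagged covariance:
# large positive eigenvalues correspond to global spatial structure
C = X.T @ (W + W.T) @ X / (2 * n)
eigvals, eigvecs = np.linalg.eigh(C)
scores = X @ eigvecs[:, -1]                      # leading global axis
print(eigvals[-1])
```

In this toy case the leading global axis separates the two spatial groups, the analogue of the Patagonia/pre-Andes division reported above.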

In traffic accidents involving motorcycles, paint traces can be transferred from the rider's helmet or smeared onto its surface. These traces, usually in the form of chips or smears, are frequently collected for comparison purposes. This research investigates the physical and chemical characteristics of the coatings found on motorcycle helmets; an evaluation of the similarities between helmet and automotive coating systems was also performed. Twenty-seven helmet coatings from 15 different brands and 22 models were considered. One sample per helmet was collected and observed using optical microscopy. FTIR spectroscopy was then used, with seven replicate measurements per layer, to study the variability of each coating system (intravariability). Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were also performed on the infrared spectra of the clearcoats and basecoats of the data set. The most common systems were composed of two or three layers, always including a clearcoat and a basecoat. The coating systems of helmets with composite shells systematically contained at least three layers. FTIR spectroscopy showed that acrylic urethane and alkyd urethane were the most frequent binders in the clearcoats and basecoats. A high proportion of the coatings (more than 95%) were differentiated on the basis of microscopic examinations. The chemical and physical characteristics of the coatings together allowed the differentiation of all but one pair of helmets of the same brand, model and color. Chemometrics (PCA and HCA) corroborated the classification based on visual comparison of the spectra and allowed the whole data set to be studied at once (i.e., all spectra of the same layer). Thus, the intravariability of each helmet and its proximity to the others (intervariability) could be more readily assessed. It was also possible to determine the most discriminative chemical variables from the PCA loadings.
Chemometrics could therefore be used as a complementary decision-making tool when many spectra and replicates have to be taken into account. Similarities between automotive and helmet coating systems were highlighted, in particular with regard to automotive coatings on plastic substrates (microscopy and FTIR). However, the primer layer of helmet coatings was shown to differ from automotive primers; if the paint trace contains this layer, the risk of misclassification (i.e., helmet versus vehicle) is reduced. Nevertheless, a paint examiner should pay close attention to these similarities when analyzing paint traces, especially smears or paint chips presenting an incomplete layer system.
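Reading the most discriminative variables off PCA loadings, as described above, can be illustrated with a toy example; the synthetic "spectra" carry three diagnostic absorption bands at invented channel positions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
n_wn = 120                                   # wavenumber channels

# Three coating formulations differing at three absorption bands
# (band positions 20, 60 and 100 are invented for the example)
means = np.zeros((3, n_wn))
means[0, 20], means[1, 60], means[2, 100] = 2.0, 2.0, 2.0
spectra = np.vstack([m + rng.normal(0, 0.1, (7, n_wn)) for m in means])

pca = PCA(n_components=2).fit(spectra)

# Channels with the largest combined loadings drive the separation
importance = np.linalg.norm(pca.components_, axis=0)
top = np.argsort(importance)[::-1][:3]
print(sorted(top.tolist()))
```

The three diagnostic bands dominate the loadings, which is exactly how loading inspection points an examiner to the chemically discriminative wavenumbers.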

Combining theories on social trust and social capital with sociopsychological approaches and applying contextual analyses to Swiss and European survey data, this thesis examines under what circumstances generalised trust, often understood as a public good, may not benefit everyone, but instead amplify inequality. The empirical investigation focuses on the Swiss context, but considers different scales of analysis. Two broader questions are addressed. First, might generalised trust imply more or less narrow visions of community and solidarity in different contexts? Applying nonlinear principal component analysis to aggregate indicators, Study 1 explores inclusive and exclusive types of social capital in Europe, measured as regional configurations of generalised trust, civic participation and attitudes towards diversity. Study 2 employs multilevel models to examine how generalised trust, as an individual predisposition and an aggregate climate at the level of Swiss cantons, is linked to equality-directed collective action intention versus radical right support. Second, might high-trust climates impact negatively on disadvantaged members of society, precisely because they reflect a normative discourse of social harmony that impedes recognition of inequality? Study 3 compares how climates of generalised trust at the level of Swiss micro-regions and subjective perceptions of neighbourhood cohesion moderate the negative relationship between socio-economic disadvantage and mental health. Overall, demonstrating beneficial as well as counterintuitive effects of social trust, this thesis proposes a critical and contextualised approach to the sources and dynamics of social cohesion in democratic societies.

Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays we face many problems, ranging from water prospection to sustainable management and the remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of geostatistical realizations. The main limitation of these approaches is the computational cost of performing complex flow simulations for each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization. Due to computational constraints, state-of-the-art methods use approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run only for this subset, on the basis of which inference is made. Our objective is to improve the performance of this approach by using all of the available information, not solely the subset of exact responses. Error models are proposed to correct the approximate responses following a machine-learning approach: for the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known, and this information is used to construct an error model that corrects the remaining approximate responses and predicts the "expected" responses of the exact model. The proposed methodology exploits all the available information without perceptible additional computational cost, making the uncertainty propagation more accurate and more robust.
The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between the approximate and exact flow models. In the second part of the thesis, this strategy is formalized in a rigorous mathematical framework by defining a regression model between functional responses. As this problem is ill-posed, its dimensionality must be reduced. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows the quality of the error model to be diagnosed in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid; the error model strongly reduces the computational cost while correctly estimating the uncertainty.
The individual correction of each proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only for uncertainty propagation but also, and maybe even more so, for Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from low acceptance rates in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, the approximate flow model coupled with an error model serves as the preliminary evaluation for two-stage MCMC, and we demonstrate an increase in acceptance rate by a factor of 1.5 to 3 with respect to a classical one-stage MCMC implementation. An open question remains: how to choose the size of the learning set and identify the realizations that optimize the construction of the error model. This requires an iterative strategy such that, as new flow simulations are performed, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, in which the methodology is applied to a problem of saline intrusion in a coastal aquifer.
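A toy version of the functional error model can be sketched with numpy and scikit-learn: PCA stands in for the FPCA basis, and a linear regression maps proxy scores to exact scores, learned on a small training subset. The curve family and the "proxy error" below are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(8)
t = np.linspace(0, 1, 100)                       # normalised time axis
n = 60                                           # geostatistical realizations

# Proxy curves: breakthrough-like responses parameterised by amplitude and
# arrival time; the "exact" response distorts the proxy systematically
amp = rng.uniform(0.5, 1.5, n)
lag = rng.uniform(0.2, 0.5, n)
proxy = amp[:, None] / (1 + np.exp(-(t - lag[:, None]) / 0.1))
exact = 1.1 * proxy ** 1.2 + 0.05

# FPCA surrogate: PCA scores of the discretised curves. Proxy runs are cheap
# (all available); exact runs exist only for a 15-realization training subset
pca_p = PCA(n_components=3).fit(proxy)
sp = pca_p.transform(proxy)
pca_e = PCA(n_components=3).fit(exact[:15])

# Error model: regression from proxy scores to exact scores
reg = LinearRegression().fit(sp[:15], pca_e.transform(exact[:15]))

# Predict exact curves for the remaining realizations from their proxy only
pred = pca_e.inverse_transform(reg.predict(sp[15:]))
rmse = np.sqrt(np.mean((pred - exact[15:]) ** 2))
print(rmse)
```

The corrected curves are markedly closer to the exact responses than the raw proxies, which is the mechanism that makes both the uncertainty propagation and the two-stage MCMC screening described above cheaper and more accurate.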

PURPOSE: Statistical shape and appearance models play an important role in reducing the segmentation processing time of a vertebra and in improving results for 3D model development. Here, we describe the steps in generating a statistical shape model (SSM) of the second cervical vertebra (C2) and provide the shape model for general use by the scientific community. The main difficulties in its construction are the morphological complexity of the C2 and its variability in the population. METHODS: The input dataset is composed of manually segmented, anonymized patient computed tomography (CT) scans. The different datasets are aligned by Procrustes alignment of the surface models, and the registration is then cast as a model-fitting problem using a Gaussian process. A principal component analysis (PCA)-based model is generated which captures the variability of the C2. RESULTS: The SSM was generated using 92 CT scans. The resulting SSM was evaluated for specificity, compactness and generalization ability. The SSM of the C2 is freely available to the scientific community in Slicer (an open-source software package for image analysis and scientific visualization), with a module created to visualize the SSM using Statismo, a framework for statistical shape modeling. CONCLUSION: The SSM represents the shape variability of the C2. Moreover, the SSM will enable semi-automatic segmentation and 3D model generation of the vertebra, which would greatly benefit surgery planning.
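The alignment-then-PCA core of an SSM pipeline can be sketched in 2-D. The landmark "shapes" below are synthetic ellipse outlines standing in for segmented C2 surfaces, and the Gaussian-process registration step is omitted.

```python
import numpy as np
from scipy.spatial import procrustes
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
theta = np.linspace(0, 2 * np.pi, 30, endpoint=False)

def shape(elong):
    """Toy 'vertebra' outline: an ellipse with patient-specific elongation."""
    return np.column_stack([np.cos(theta) * (1 + elong), np.sin(theta)])

shapes = [shape(e) for e in rng.normal(0, 0.2, 25)]

# Procrustes alignment to a reference removes position, scale and rotation
ref = shapes[0]
aligned = [procrustes(ref, s)[1] for s in shapes]
X = np.array([a.ravel() for a in aligned])       # flattened shape vectors

# The SSM: mean shape plus principal modes of variation
pca = PCA(n_components=3).fit(X)
mode1 = (pca.mean_ + 2 * np.sqrt(pca.explained_variance_[0])
         * pca.components_[0]).reshape(-1, 2)    # +2 SD along mode 1
print(pca.explained_variance_ratio_[0])
```

Sampling along the leading modes, as done for `mode1`, is how an SSM generates plausible new anatomies for segmentation initialisation.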

Aim: To disentangle the effects of environmental and geographical processes driving phylogenetic distances among clades of maritime pine (Pinus pinaster), and to assess the implications for conservation management of combining molecular information with species distribution models (SDMs), which predict species distributions from known occurrence records and environmental variables. Location: Western Mediterranean Basin and European Atlantic coast. Methods: We undertook two cluster analyses for eight genetically defined pine clades, based on climatic niche and on genetic similarities. We assessed niche similarity by means of a principal component analysis and Schoener's D metric. To calculate genetic similarity, we used the unweighted pair-group method with arithmetic mean (UPGMA) based on Nei's distance, using 266 single nucleotide polymorphisms. We then assessed the contribution of environmental and geographical distances to phylogenetic distance by means of Mantel regression with variance partitioning. Finally, we compared the projection obtained from SDMs fitted at the species level (SDMsp) with that composed from the eight clade-level models (SDMcm). Results: Genetically and environmentally defined clusters were identical. Environmental and geographical distances explained 12.6% of the phylogenetic distance variation and, overall, geographical and environmental overlap among clades was low. Large differences were detected between SDMsp and SDMcm (57.75% disagreement in the areas predicted as suitable). Main conclusions: The genetic structure within the maritime pine subspecies complex is primarily a consequence of its demographic history, as seen from the high proportion of unexplained variation in phylogenetic distances.
Nevertheless, our results highlight the contribution of local environmental adaptation in shaping the lower-order, phylogeographical distribution patterns and spatial genetic structure of maritime pine: (1) genetically and environmentally defined clusters are consistent, and (2) environment, rather than geography, explained a higher proportion of variation in phylogenetic distance. SDMs, key tools in conservation management, better characterize the fundamental niche of the species when they include molecular information.
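The Mantel regression idea, correlating two distance matrices and assessing significance by permuting one of them, can be sketched in numpy. The distances below are simulated with invented coefficients; the study used clade-level genetic and climatic data.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 8                                            # clades

# Toy climatic positions, plus a phylogenetic distance matrix correlated
# with climatic distance (coefficients are invented)
env = rng.uniform(0, 1, (n, 2))
env_d = np.linalg.norm(env[:, None] - env[None, :], axis=2)
phylo_d = 0.6 * env_d + rng.normal(0, 0.1, (n, n))
phylo_d = (phylo_d + phylo_d.T) / 2              # symmetrise
np.fill_diagonal(phylo_d, 0)

iu = np.triu_indices(n, k=1)                     # unique pairs

def mantel(a, b, n_perm=999):
    """Mantel correlation of two distance matrices with a permutation test."""
    r_obs = np.corrcoef(a[iu], b[iu])[0, 1]
    exceed = 0
    for _ in range(n_perm):
        p = rng.permutation(n)                   # relabel one matrix
        exceed += np.corrcoef(a[p][:, p][iu], b[iu])[0, 1] >= r_obs
    return r_obs, (exceed + 1) / (n_perm + 1)

r, pval = mantel(phylo_d, env_d)
print(r, pval)
```

Permuting whole rows and columns together preserves the distance structure under the null, which is what makes the test valid for non-independent pairwise distances.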