138 results for Maximum Ratio Combining
Abstract:
Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence-environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence-environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building underfitted models, with insufficient flexibility to describe observed occurrence-environment relationships, we risk misunderstanding the factors shaping species distributions. By building overfitted models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing underfitting against overfitting and, consequently, how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species ranges.
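The under- versus over-fitting trade-off described above can be illustrated outside the SDM context with a toy curve-fitting sketch. Nothing below comes from the paper: the "response curve", sample sizes, noise level, and polynomial degrees are all invented for illustration. Training error can only improve as flexibility grows, while held-out error typically bottoms out at intermediate flexibility.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" occurrence-environment response: a smooth hump.
def true_response(x):
    return np.exp(-(x - 0.5) ** 2 / 0.05)

x_train = rng.uniform(0, 1, 40)
y_train = true_response(x_train) + rng.normal(0, 0.15, 40)
x_test = rng.uniform(0, 1, 200)
y_test = true_response(x_test) + rng.normal(0, 0.15, 200)

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

train_err, test_err = {}, {}
for degree in (1, 3, 10):
    coefs = np.polyfit(x_train, y_train, degree)  # least-squares polynomial fit
    train_err[degree] = rmse(y_train, np.polyval(coefs, x_train))
    test_err[degree] = rmse(y_test, np.polyval(coefs, x_test))
# Training error is non-increasing with degree (nested model spaces);
# held-out error is the quantity that exposes overfitting.
```

Because the degree-10 model space contains the degree-3 and degree-1 spaces, the fitted training error can never get worse as the degree rises; only the held-out error reveals when the added parameters are fitting noise.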
Abstract:
BACKGROUND: In vitro aggregating brain cell cultures containing all types of brain cells have been shown to be useful for neurotoxicological investigations. The cultures are used for the detection of nervous system-specific effects of compounds by measuring multiple endpoints, including changes in enzyme activities. Concentration-dependent neurotoxicity is determined at several time points. METHODS: A Markov model was set up to describe the dynamics of brain cell populations exposed to potentially neurotoxic compounds. Brain cells were assumed to be either in a healthy or a stressed state, with only stressed cells being susceptible to cell death. Cells can switch between these states or die, with concentration-dependent transition rates. Since cell numbers are not directly measurable, intracellular lactate dehydrogenase (LDH) activity was used as a surrogate. Assuming that changes in cell numbers are proportional to changes in intracellular LDH activity, stochastic enzyme activity models were derived. Maximum likelihood and least squares regression techniques were applied to estimate the transition rates. Likelihood ratio tests were performed to test hypotheses about the transition rates. Simulation studies were used to investigate the performance of the transition rate estimators and to analyze the error rates of the likelihood ratio tests. The stochastic time-concentration activity model was applied to intracellular LDH activity measurements after 7 and 14 days of continuous exposure to propofol. The model describes transitions from healthy to stressed cells and from stressed cells to death. RESULTS: The model predicted that propofol would affect stressed cells more than healthy cells. Increasing the propofol concentration from 10 to 100 μM reduced the mean waiting time for the transition to the stressed state by 50%, from 14 to 7 days, whereas the mean time to cell death fell more dramatically, from 2.7 days to 6.5 hours.
CONCLUSION: The proposed stochastic modeling approach can be used to discriminate between different biological hypotheses regarding the effect of a compound on the transition rates. The effects of different compounds on the transition rate estimates can be quantitatively compared. Data can be extrapolated to late measurement time points to investigate whether costly and time-consuming long-term experiments could possibly be eliminated.
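The healthy → stressed → dead structure described above can be written as a linear compartment model; for exponential waiting times, the mean time to leave a state is simply the reciprocal of the transition rate. The sketch below uses the rates implied by the abstract's quoted waiting times (14 days to the stressed state at 10 µM, 7 days at 100 µM); the stressed-to-death rate chosen here is illustrative, and the closed-form solution is the standard one for two sequential first-order transitions, not the authors' full stochastic estimator.

```python
import math

def expected_fractions(t, k1, k2, h0=1.0):
    """Expected healthy/stressed fractions at time t (days) for
    healthy --k1--> stressed --k2--> dead, rates per day (k1 != k2)."""
    healthy = h0 * math.exp(-k1 * t)
    # Standard two-compartment solution for the intermediate state.
    stressed = h0 * k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))
    return healthy, stressed

def mean_waiting_time(rate):
    """Mean of an exponential waiting time with the given rate."""
    return 1.0 / rate

k1_low = 1.0 / 14.0   # 10 uM: mean 14 days to the stressed state (from abstract)
k1_high = 1.0 / 7.0   # 100 uM: mean halved to 7 days (from abstract)
k2 = 1.0 / 2.7        # illustrative stressed-to-death rate

h, s = expected_fractions(7.0, k1_high, k2)
live = h + s  # intracellular LDH assumed proportional to live cells
```

Doubling a transition rate halves the corresponding mean waiting time, which is exactly the 14-day to 7-day effect the abstract reports for the healthy-to-stressed transition.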
Abstract:
The hydrogen isotope ratio (HIR) of body water and, therefore, of all endogenously synthesized compounds in humans is mainly affected by the HIR of ingested drinking water. As a consequence, the entire organism and all of its synthesized substrates will reflect alterations in the isotope ratio of drinking water, depending on the duration of exposure. To investigate the effect of such a change on endogenous urinary steroids relevant to doping-control analysis, the hydrogen isotope composition of potable water was suddenly enriched from -50 to 200 ‰ and maintained at this level for two weeks for two individuals. The steroids under investigation were 5β-pregnane-3α,20α-diol, 5α-androst-16-en-3α-ol, 3α-hydroxy-5α-androstan-17-one (ANDRO), 3α-hydroxy-5β-androstan-17-one (ETIO), 5α-androstane-3α,17β-diol, and 5β-androstane-3α,17β-diol (excreted as glucuronides), and ETIO, ANDRO and 3β-hydroxyandrost-5-en-17-one (excreted as sulfates). The HIR of body water was estimated from the HIR of total native urine, to trace the induced changes. The hydrogen in steroids is partly derived from the total amount of body water, and cholesterol enrichment could be calculated by use of these data. Although the overall change in the isotopic composition of body water was 150 ‰, shifts of approximately 30 ‰ were observed for urinary steroids. Parallel enrichment in HIR was observed for most of the steroids, and none of the differences between the HIRs of individual steroids exceeded recently established thresholds. This finding is important to sports drug testing because it supports the intended use of this novel and complementary methodology even in cases where athletes have drunk water of different HIR, a plausible and presumably inevitable scenario while traveling.
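Hydrogen isotope ratios are conventionally reported in delta notation relative to a standard (VSMOW). From the shifts quoted above, a rough fraction of steroid hydrogen that tracks body water can be read off as the ratio of the two shifts (~30 ‰ over 150 ‰). That 20% figure is arithmetic on the abstract's numbers, not a value the study claims; the sketch below just shows the conventions.

```python
def delta_per_mil(r_sample, r_standard):
    """Delta notation: deviation of an isotope ratio from a standard, in per mil:
    delta = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# 2H/1H ratio of the VSMOW standard (approx. 155.76 ppm).
R_VSMOW = 155.76e-6

# A sample depleted by 50 per mil relative to VSMOW:
r_sample = R_VSMOW * (1.0 - 50.0 / 1000.0)
d = delta_per_mil(r_sample, R_VSMOW)  # -> -50.0

# Illustrative back-of-envelope from the abstract's shifts:
# steroids moved ~30 per mil while body water moved ~150 per mil.
water_derived_fraction = 30.0 / 150.0  # ~0.2
```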
Abstract:
Recently, the spin-echo full-intensity acquired localized (SPECIAL) spectroscopy technique was proposed to unite the advantages of short TEs on the order of milliseconds with full sensitivity, and was applied to in vivo rat brain. In the present study, SPECIAL was adapted and optimized for use on a clinical platform at 3T and 7T by combining interleaved water suppression (WS) and outer volume saturation (OVS), optimized sequence timing, and improved shimming using FASTMAP. High-quality single-voxel spectra of human brain were acquired at TEs at or below 6 ms on clinical 3T and 7T systems for six volunteers. Narrow linewidths (6.6 +/- 0.6 Hz at 3T and 12.1 +/- 1.0 Hz at 7T for water) and the high signal-to-noise ratio (SNR) of the artifact-free spectra enabled the quantification of a neurochemical profile consisting of 18 metabolites with Cramér-Rao lower bounds (CRLBs) below 20% at both field strengths. The enhanced sensitivity and increased spectral resolution at 7T compared to 3T allowed a two-fold reduction in scan time, increased precision of quantification for 12 metabolites, and the additional quantification of lactate with a CRLB below 20%. Improved sensitivity at 7T was also demonstrated by a 1.7-fold increase in average SNR (peak height divided by the root-mean-square noise) per unit time.
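The SNR definition used above (peak height over the RMS of the noise) is simple to compute from a spectrum once a signal-free region is identified. A minimal sketch, with a deterministic stand-in for the noise trace (values invented for illustration):

```python
import math

def snr(peak_height, noise_samples):
    """SNR = peak height / root-mean-square of a signal-free noise region."""
    rms = math.sqrt(sum(v * v for v in noise_samples) / len(noise_samples))
    return peak_height / rms

# Deterministic stand-in for a signal-free region of the spectrum:
noise = [0.1, -0.1, 0.1, -0.1, 0.1, -0.1]  # RMS = 0.1 by construction
value = snr(1.0, noise)                     # -> 10.0
```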
Abstract:
BACKGROUND: We estimated the heritability of three measures of glomerular filtration rate (GFR) in hypertensive families of African descent in the Seychelles (Indian Ocean). METHODS: Families with at least two hypertensive siblings and an average of two normotensive siblings were identified through a national hypertension register. Using the ASSOC program in SAGE (Statistical Analysis in Genetic Epidemiology), the age- and gender-adjusted narrow-sense heritability of GFR was estimated by maximum likelihood, assuming multivariate normality after power transformation. ASSOC can calculate the additive polygenic component of the variance of a trait from pedigree data in the presence of other familial correlations. The effects of body mass index (BMI), blood pressure, natriuresis, urinary sodium-to-potassium ratio, and diabetes were also tested as covariates. RESULTS: Inulin clearance, 24-hour creatinine clearance, and GFR based on the Cockcroft-Gault formula were available for 348 persons from 66 pedigrees. The age- and gender-adjusted correlations (+/- SE) were 0.51 (+/- 0.04) between inulin clearance and creatinine clearance, 0.53 (+/- 0.04) between inulin clearance and the Cockcroft-Gault formula, and 0.66 (+/- 0.03) between creatinine clearance and the Cockcroft-Gault formula. The age- and gender-adjusted heritabilities (+/- SE) of GFR were 0.41 (+/- 0.10) for inulin clearance, 0.52 (+/- 0.13) for creatinine clearance, and 0.82 (+/- 0.09) for the Cockcroft-Gault formula. Adjustment for BMI slightly lowered the correlations and heritabilities for all measurements, whereas adjustment for blood pressure had virtually no effect. CONCLUSION: The significant heritability estimates of GFR in our sample of families of African descent confirm the familial aggregation of this trait and justify further analyses aimed at discovering genetic determinants of GFR.
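Narrow-sense heritability is the additive genetic share of the phenotypic variance, h² = V_A/(V_A + V_E), and under a purely additive model with no shared-environment term the expected full-sib correlation is h²/2. The variance components below are invented to reproduce one of the reported point estimates (0.41) for illustration; they are not the study's actual variance decomposition:

```python
def narrow_sense_h2(v_additive, v_residual):
    """h^2 = V_A / V_P with V_P = V_A + V_E (no shared-environment term)."""
    return v_additive / (v_additive + v_residual)

def expected_fullsib_correlation(h2):
    """Full sibs share half their additive genetic effects on average."""
    return h2 / 2.0

# Hypothetical variance components chosen so that h^2 = 0.41:
h2 = narrow_sense_h2(2.05, 2.95)
r_sib = expected_fullsib_correlation(h2)  # -> 0.205
```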
Abstract:
The study of wave propagation at sonic frequencies in soil allows the determination of elasticity parameters. These parameters are consistent with those measured simultaneously by static loading. The acquisition of in situ elasticity parameters, combined with a laboratory description of the elastoplastic behaviour, can lead to in situ elastoplastic curves.
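For an isotropic elastic medium, the measured compression- and shear-wave velocities determine the small-strain elastic parameters directly: G = ρVs², ν = (Vp² − 2Vs²)/(2(Vp² − Vs²)), E = 2G(1 + ν). These are the standard elastodynamic relations, not formulas quoted by the abstract, and the soil values below are illustrative:

```python
def elastic_parameters(rho, vp, vs):
    """Small-strain elastic parameters of an isotropic medium from
    P- and S-wave velocities (SI units: kg/m^3, m/s)."""
    g = rho * vs ** 2                                          # shear modulus [Pa]
    nu = (vp ** 2 - 2 * vs ** 2) / (2 * (vp ** 2 - vs ** 2))   # Poisson's ratio
    e = 2 * g * (1 + nu)                                       # Young's modulus [Pa]
    return g, nu, e

# Illustrative loose-soil values: rho = 1800 kg/m^3, Vp = 200 m/s, Vs = 100 m/s.
g, nu, e = elastic_parameters(1800.0, 200.0, 100.0)
```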
Abstract:
In the field of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms allow, among other things, a fingermark to be compared against a fingerprint database and therefore a link to be established between the mark and a known source. With the growth of the capacities of these systems and of data storage, as well as increasing collaboration between police services at the international level, the size of these databases increases. The current challenge for the field of fingerprint identification is the growth of these databases, which makes it possible to find impressions that are very similar but come from distinct fingers. At the same time, however, these data and systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be utilized in a statistical approach to interpretation. The computation of a likelihood ratio, employing simultaneously the comparison between the mark and the print of the case, the within-variability of the suspect's finger, and the between-variability of the mark with respect to a database, can then be based on representative data. Thus, these data allow an evaluation that may be more detailed than that obtained by applying rules established long before the advent of these large databases, or from the specialist's experience alone. The goal of the present thesis is to evaluate likelihood ratios computed from the scores of an automated fingerprint identification system (AFIS) when the source of the tested and compared marks is known. These ratios must support the hypothesis that is known to be true. Moreover, they should support this hypothesis more and more strongly with the addition of information in the form of additional minutiae.
For the modeling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600,000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors which influence these distributions were also analyzed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. In the present study, the only factor for which no influence was shown is the orientation of minutiae. The results show that the likelihood ratios resulting from the use of the scores of an AFIS can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that the two impressions were left by the same finger, when the impressions in fact came from different fingers, is 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained are on average of the order of 100, 1,000, and 10,000 for 6, 7, and 8 minutiae when the two impressions come from the same finger. These likelihood ratios can therefore be an important aid for decision making. Both positive effects linked to the addition of minutiae (a drop in the rates of likelihood ratios that can lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study.
Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found, and showed satisfactory results.
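The score-based likelihood ratio at the core of the thesis can be sketched by modeling the within-finger and between-finger score distributions and evaluating their densities at the observed comparison score. The Gaussian score models and all parameter values below are illustrative placeholders, not the thesis's fitted distributions:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def score_lr(score, within, between):
    """LR = p(score | same finger) / p(score | different fingers),
    each distribution given as a (mean, sd) pair."""
    return normal_pdf(score, *within) / normal_pdf(score, *between)

# Hypothetical AFIS score distributions for a 6-minutiae configuration:
within = (100.0, 15.0)   # scores when mark and print share a source
between = (30.0, 10.0)   # scores against non-matching fingers

lr_match = score_lr(95.0, within, between)     # high score: supports same source
lr_nonmatch = score_lr(25.0, within, between)  # low score: supports different sources
```

An LR above 1 supports the same-source hypothesis and an LR below 1 the different-source hypothesis; the further from 1, the stronger the support.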
Abstract:
The large spatial inhomogeneity in the transmit B1 field (B1+) observable in human MR images at high static magnetic fields (B0) severely impairs image quality. To overcome this effect in brain T1-weighted images, the MPRAGE sequence was modified to generate two different images at different inversion times (MP2RAGE). By combining the two images in a novel fashion, it was possible to create T1-weighted images in which the result was free of proton density contrast, T2* contrast, reception bias field and, to first order, transmit field inhomogeneity. MP2RAGE sequence parameters were optimized using Bloch equations to maximize the contrast-to-noise ratio per unit time between brain tissues and to minimize the effect of B1+ variations through space. Images of high anatomical quality and excellent brain tissue differentiation, suitable for applications such as segmentation and voxel-based morphometry, were obtained at 3 and 7 T. From such T1-weighted images, acquired within 12 min, high-resolution 3D T1 maps were routinely calculated at 7 T with sub-millimeter voxel resolution (0.65-0.85 mm isotropic). The T1 maps were validated in phantom experiments. In humans, the T1 values obtained at 7 T were 1.15 +/- 0.06 s for white matter (WM) and 1.92 +/- 0.16 s for grey matter (GM), in good agreement with literature values obtained at lower spatial resolution. At 3 T, where whole-brain acquisitions with 1 mm isotropic voxels were acquired in 8 min, the T1 values obtained (0.81 +/- 0.03 s for WM and 1.35 +/- 0.05 s for GM) were once again found to be in very good agreement with values in the literature. (C) 2009 Elsevier Inc. All rights reserved.
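The "novel fashion" of combining the two inversion-time images is commonly written as a bias-cancelling ratio of the two complex GRE images, Re(conj(S1)·S2)/(|S1|² + |S2|²), whose output is bounded to [-0.5, 0.5] and cancels any common receive-sensitivity, proton-density, and T2* factor. The formula is the widely cited MP2RAGE combination; the voxel values below are invented toy inputs:

```python
def mp2rage_combine(s1, s2):
    """Uniform MP2RAGE voxel value from the two complex inversion-time images:
    Re(conj(S1) * S2) / (|S1|^2 + |S2|^2), bounded in [-0.5, 0.5]."""
    num = (s1.conjugate() * s2).real
    den = abs(s1) ** 2 + abs(s2) ** 2
    return num / den

# Toy voxel values (arbitrary units); a common multiplicative bias cancels:
v = mp2rage_combine(1.0 + 0.5j, -0.8 + 0.2j)
v_scaled = mp2rage_combine(3.0 * (1.0 + 0.5j), 3.0 * (-0.8 + 0.2j))
```

Scaling both inputs by the same real factor (e.g. a coil sensitivity) leaves the combined value unchanged, which is exactly why the reception bias field drops out.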
Abstract:
Altitudinal tree lines are mainly constrained by temperature, but can also be influenced by factors such as human activity, particularly in the European Alps, where centuries of agricultural use have affected the tree-line. Over the last decades this trend has been reversed due to changing agricultural practices and land-abandonment. We aimed to combine a statistical land-abandonment model with a forest dynamics model, to take into account the combined effects of climate and human land-use on the Alpine tree-line in Switzerland. Land-abandonment probability was expressed by a logistic regression function of degree-day sum, distance from forest edge, soil stoniness, slope, proportion of employees in the secondary and tertiary sectors, proportion of commuters and proportion of full-time farms. This was implemented in the TreeMig spatio-temporal forest model. Distance from forest edge and degree-day sum vary through feed-back from the dynamics part of TreeMig and climate change scenarios, while the other variables remain constant for each grid cell over time. The new model, TreeMig-LAb, was tested on theoretical landscapes, where the variables in the land-abandonment model were varied one by one. This confirmed the strong influence of distance from forest and slope on the abandonment probability. Degree-day sum has a more complex role, with opposite influences on land-abandonment and forest growth. TreeMig-LAb was also applied to a case study area in the Upper Engadine (Swiss Alps), along with a model where abandonment probability was a constant. Two scenarios were used: natural succession only (100% probability) and a probability of abandonment based on past transition proportions in that area (2.1% per decade). The former showed new forest growing in all but the highest-altitude locations. The latter was more realistic as to numbers of newly forested cells, but their location was random and the resulting landscape heterogeneous. 
Using the logistic regression model gave results consistent with observed patterns of land-abandonment: existing forests expanded and gaps closed, leading to an increasingly homogeneous landscape.
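The land-abandonment component described above is a standard logistic regression: a linear predictor over the listed covariates passed through the logistic function. The coefficients and covariate values below are invented placeholders, not TreeMig-LAb's fitted values; the sketch just shows how abandonment probability would respond to, for example, distance from the forest edge:

```python
import math

def abandonment_probability(coeffs, intercept, covariates):
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum_i b_i * x_i)))."""
    eta = intercept + sum(coeffs[name] * covariates[name] for name in coeffs)
    return 1.0 / (1.0 + math.exp(-eta))

# Hypothetical coefficients (signs chosen for illustration only):
coeffs = {
    "degree_day_sum": -0.002,    # warmer cells are abandoned less
    "dist_forest_edge": 0.004,   # remote parcels are abandoned more
    "slope": 0.05,               # steep parcels are abandoned more
}
near = abandonment_probability(coeffs, -1.0,
    {"degree_day_sum": 900, "dist_forest_edge": 50, "slope": 10})
far = abandonment_probability(coeffs, -1.0,
    {"degree_day_sum": 900, "dist_forest_edge": 500, "slope": 10})
```

In the coupled model, only distance from the forest edge and the degree-day sum change over time through feedback from TreeMig's dynamics; the remaining covariates stay fixed per grid cell, as the abstract states.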
Abstract:
We extend pseudo-maximum likelihood (PML) theory to account for information on the conditional moments up to order four, but without assuming a parametric model, to avoid a risk of misspecification of the conditional distribution. The key statistical tool is the quartic exponential family, which allows us to generalize the PML2 and QGPML1 methods proposed in Gourieroux et al. (1984) to PML4 and QGPML2 methods, respectively. An asymptotic theory is developed. The key numerical tool is the Gauss-Freud integration scheme, which solves a computational problem that has previously been raised in several fields. Simulation exercises demonstrate the feasibility and robustness of the methods.
Abstract:
In gynodioecious species, sex expression is generally determined through cytoplasmic male sterility genes interacting with nuclear restorers of the male function. With dominant restorers, there may be an excess of females in the progeny of self-fertilized compared with cross-fertilized hermaphrodites. Moreover, the effect of inbreeding on late stages of the life cycle remains poorly explored. Here, we used hermaphrodites of the gynodioecious Silene vulgaris originating from three populations located in different valleys in the Alps to investigate the effects of two generations of self- and cross-fertilization on sex ratio and gender variation. We detected an increase in females in the progeny of selfed compared with outcrossed hermaphrodites and inbreeding depression for female and male fertility. Male fertility correlated positively with sex ratio differences between outbred and inbred progeny, suggesting that dominant restorers are likely to influence male fertility qualitatively and quantitatively in S. vulgaris. We argue that the excess of females in the progeny of selfed compared with outcrossed hermaphrodites and inbreeding depression for gamete production may contribute to the maintenance of females in gynodioecious populations of S. vulgaris because purging of the genetic load is less likely to occur.
Abstract:
Since GHB (gamma-hydroxybutyric acid) is naturally produced in the human body, clinical and forensic toxicologists must be able to discriminate between endogenous levels and a concentration resulting from exposure. To suggest an alternative to the use of interpretative concentration cut-offs, the detection of exogenous GHB in urine specimens was investigated by means of gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS). GHB was isolated from the urinary matrix by successive purification on Oasis MCX and Bond Elut SAX solid-phase extraction (SPE) cartridges prior to high-performance liquid chromatography (HPLC) fractionation using an Atlantis dC18 column eluted with a mixture of formic acid and methanol. Subsequent intramolecular esterification of GHB, leading to the formation of gamma-butyrolactone (GBL), was carried out to avoid the introduction of additional carbon atoms for carbon isotope ratio analysis. A precision of 0.3 ‰ was determined using this IRMS method for samples at GHB concentrations of 10 mg/L. The 13C/12C ratios of GHB in samples from subjects exposed to the drug ranged from -32.1 to -42.1 ‰, whereas the results obtained for samples containing GHB of endogenous origin at concentration levels below 10 mg/L were in the range -23.5 to -27.0 ‰. Therefore, these preliminary results show that a discrimination between endogenous and exogenous GHB can be made using carbon isotope ratio analyses.
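The reported ranges suggest a simple discrimination rule on the carbon delta value (VPDB scale): endogenous GHB clustered in -23.5 to -27.0 ‰, exogenous in -32.1 to -42.1 ‰. The -30 ‰ cut-off below is a hypothetical midpoint chosen for illustration, not a validated forensic threshold, and the VPDB ratio is an approximate literature value:

```python
def delta13c_per_mil(r_sample, r_vpdb=0.011180):
    """delta 13C = (R_sample / R_VPDB - 1) * 1000, in per mil.
    Default R_VPDB is an approximate 13C/12C value for the VPDB standard."""
    return (r_sample / r_vpdb - 1.0) * 1000.0

def classify_ghb(delta13c, cutoff=-30.0):
    """Toy rule derived from the reported ranges; cutoff is illustrative only."""
    return "exogenous" if delta13c < cutoff else "endogenous"

labels = [classify_ghb(d) for d in (-25.0, -35.0, -41.0)]
```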
Abstract:
The increasing number of bomb attacks involving improvised explosive devices, as well as the nature of the explosives, gives rise to concern among safety and law enforcement agencies. The substances used in explosive charges are often everyday products diverted from their primary licit applications. Thus, reducing or limiting their accessibility for prevention purposes is difficult. Ammonium nitrate, employed in agriculture as a fertiliser, is used worldwide in small and large homemade bombs. Black powder, dedicated to hunting and shooting sports, is used illegally as a filling in pipe bombs, causing extensive damage. Although the main developments of instrumental techniques in explosive analysis have constantly pushed the limits of detection, their actual contribution to the investigation of explosives in terms of source discrimination is limited. Forensic science has seen the emergence of a new technology, isotope ratio mass spectrometry (IRMS), which shows promising results. Its very first application in forensic science dates back to 1979, when Liu et al. analysed cannabis plants coming from different countries [Liu et al. 1979]. This preliminary study highlighted the technique's potential to discriminate specimens coming from different sources. Thirty years later, the keen interest in this technology has given rise to a flourishing number of publications in forensic science. The countless applications of IRMS to a wide range of materials and substances attest to its success and suggest that the technique is ready to be used in forensic science. However, many studies are characterised by a lack of methodology and fundamental data. They have been undertaken in a top-down approach, applying the technique in an exploratory manner on a restricted sampling. This way of proceeding often does not allow the researcher to answer a number of questions, such as: do the specimens come from the same source? What do we mean by source? What is the inherent variability of a substance?
The production of positive results has prevailed at the expense of forensic fundamentals. This research focused on evaluating the contribution of the information provided by isotopic analysis to the investigation of explosives. More specifically, this evaluation was based on a sampling of black powders and ammonium nitrate fertilisers coming from known sources. The methodology developed in this work enabled us not only to highlight crucial elements inherent to the methods themselves, but also to evaluate both the longitudinal and transversal variabilities of the information. First, the variability of the profile over time was studied. Secondly, the variability of black powders and ammonium nitrate fertilisers within the same source and between different sources was evaluated. The contribution of this information to the investigation of explosives was then evaluated and discussed.
Abstract:
The spontaneous activity of the brain shows different features at different scales. On the one hand, neuroimaging studies show that long-range correlations are highly structured in spatiotemporal patterns, known as resting-state networks; on the other hand, neurophysiological reports show that short-range correlations between neighboring neurons are low, despite a large amount of shared presynaptic input. Different dynamical mechanisms of local decorrelation have been proposed, among which is feedback inhibition. Here, we investigated the effect of locally regulating feedback inhibition on the global dynamics of a large-scale brain model in which the long-range connections are given by diffusion imaging data of human subjects. We used simulations and analytical methods to show that locally constraining the feedback inhibition to compensate for the excess of long-range excitatory connectivity, so as to preserve the asynchronous state, crucially changes the characteristics of the emergent resting and evoked activity. First, it significantly improves the model's prediction of the empirical human functional connectivity. Second, relaxing this constraint leads to unrealistic network evoked activity, with systematic coactivation of cortical areas that are components of the default-mode network, whereas regulation of feedback inhibition prevents this. Finally, information-theoretic analysis shows that regulation of the local feedback inhibition increases both the entropy and the Fisher information of the network evoked responses. Hence, it enhances the information capacity and the discrimination accuracy of the global network. In conclusion, the local excitation-inhibition ratio impacts the structure of the spontaneous activity and the information transmission at the large-scale brain level.
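The local regulation described here (often called feedback inhibition control) can be sketched with a single mean-field node: the inhibitory weight J is adjusted until the excitatory rate sits at a target value, regardless of how much long-range excitation the node receives. The linearized rate equation and all parameter values below are a toy sketch of the tuning principle, not the paper's dynamic mean-field model:

```python
def tune_feedback_inhibition(i_background, i_longrange, target_rate,
                             gain=1.0, eta=0.05, steps=5000):
    """Homeostatically adjust the local inhibitory weight J so that the
    steady state of the linear rate equation r = gain * (I_bg + I_lr - J * r)
    settles at target_rate, whatever the long-range input I_lr."""
    j = 0.0
    for _ in range(steps):
        # Closed-form steady state of the linear rate equation for current J:
        r = gain * (i_background + i_longrange) / (1.0 + gain * j)
        j += eta * (r - target_rate)  # firing too fast -> raise inhibition
        j = max(j, 0.0)
    return j, r

# Nodes with weak and strong long-range excitation both reach the target rate,
# but the strongly driven node needs more local inhibition:
j_weak, r_weak = tune_feedback_inhibition(1.0, 2.0, target_rate=3.0)
j_strong, r_strong = tune_feedback_inhibition(1.0, 6.0, target_rate=3.0)
```

This mirrors the constraint in the abstract: local inhibition is set to compensate for each region's excess long-range excitatory input, preserving a common asynchronous operating point across the network.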