974 results for Symmetry Ratio Algorithm


Abstract:

The increasing number of bomb attacks involving improvised explosive devices, as well as the nature of the explosives, gives rise to concern among safety and law enforcement agencies. The substances used in explosive charges are often everyday products diverted from their primary licit applications, so reducing or limiting their accessibility for prevention purposes is difficult. Ammonium nitrate, employed in agriculture as a fertiliser, is used worldwide in small and large homemade bombs. Black powder, dedicated to hunting and shooting sports, is used illegally as a filling in pipe bombs, causing extensive damage. While developments in instrumental techniques for explosive analysis have constantly pushed the limits of detection, their actual contribution to the investigation of explosives in terms of source discrimination is limited. Forensic science has seen the emergence of a new technology, isotope ratio mass spectrometry (IRMS), that shows promising results. Its very first application in forensic science dates back to 1979, when Liu et al. analysed cannabis plants coming from different countries [Liu et al. 1979]. This preliminary study highlighted the technique's potential to discriminate specimens coming from different sources. Thirty years later, the keen interest in this technology has given rise to a flourishing number of publications in forensic science. The many applications of IRMS to a wide range of materials and substances attest to its success and suggest that the technique is ready to be used in forensic science. However, many studies are characterised by a lack of methodology and fundamental data: they have been undertaken in a top-down approach, applying the technique in an exploratory manner to a restricted sampling. This way of proceeding often does not allow the researcher to answer questions such as: do two specimens come from the same source? What do we mean by source? What is the inherent variability of a substance? The production of positive results has prevailed at the expense of forensic fundamentals. This research focused on evaluating the contribution of the information provided by isotopic analysis to the investigation of explosives. More specifically, the evaluation was based on a sampling of black powders and ammonium nitrate fertilisers coming from known sources. The methodology developed in this work not only highlighted crucial elements inherent to the analytical methods themselves, but also made it possible to evaluate both the longitudinal and transversal variability of the information. First, the variability of the profile over time was studied. Second, the variability of black powders and ammonium nitrate fertilisers within the same source and between different sources was evaluated. The contribution of this information to the investigation of explosives was then evaluated and discussed.

Abstract:

The spontaneous activity of the brain shows different features at different scales. On the one hand, neuroimaging studies show that long-range correlations are highly structured in spatiotemporal patterns known as resting-state networks; on the other hand, neurophysiological reports show that short-range correlations between neighboring neurons are low, despite a large number of shared presynaptic inputs. Different dynamical mechanisms of local decorrelation have been proposed, among which is feedback inhibition. Here, we investigated the effect of locally regulating the feedback inhibition on the global dynamics of a large-scale brain model in which the long-range connections are given by diffusion imaging data of human subjects. We used simulations and analytical methods to show that locally constraining the feedback inhibition to compensate for the excess of long-range excitatory connectivity, so as to preserve the asynchronous state, crucially changes the characteristics of the emergent resting and evoked activity. First, it significantly improves the model's prediction of the empirical human functional connectivity. Second, relaxing this constraint leads to unrealistic network evoked activity, with systematic coactivation of cortical areas that are components of the default-mode network, whereas regulation of feedback inhibition prevents this. Finally, information-theoretic analysis shows that regulation of the local feedback inhibition increases both the entropy and the Fisher information of the network evoked responses. Hence, it enhances the information capacity and the discrimination accuracy of the global network. In conclusion, the local excitation-inhibition ratio impacts the structure of the spontaneous activity and the information transmission at the large-scale brain level.
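
The regulation described above can be pictured with a toy rate network: each node carries a tunable self-inhibition weight that is adjusted until its steady activity sits at a common target, however much long-range excitation it receives. A minimal sketch of that tuning loop, in which the connectivity matrix, gain function, and target value are invented stand-ins rather than the paper's dynamic mean-field model:

```python
import numpy as np

# Toy illustration of local feedback-inhibition regulation: each node
# receives long-range excitation weighted by C and inhibits itself with
# a tunable weight J[i]; J is adjusted until every node settles at the
# same target activity despite unequal long-range input.
# All values here are illustrative assumptions, not the paper's model.
rng = np.random.default_rng(0)
n = 8
C = rng.random((n, n)) * (1 - np.eye(n))   # heterogeneous coupling
G, I_ext, target = 0.5, 0.3, 0.5

def steady_state(J, iters=3000, dt=0.1):
    """Relax the rate equations to a fixed point for a given J."""
    r = np.zeros(n)
    for _ in range(iters):
        drive = I_ext + G * (C @ r) - J * r
        r += dt * (-r + np.tanh(np.maximum(drive, 0.0)))
    return r

J = np.ones(n)
for _ in range(300):               # outer loop: tune the inhibition
    r = steady_state(J)
    J = np.maximum(J + 0.5 * (r - target), 0.0)

print(np.round(steady_state(J), 3))   # all entries close to `target`
```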

Abstract:

BACKGROUND: We sought to improve upon previously published statistical modeling strategies for binary classification of dyslipidemia for general population screening purposes, based on the waist-to-hip circumference ratio and body mass index anthropometric measurements. METHODS: Study subjects were participants in WHO-MONICA population-based surveys conducted in two Swiss regions. Outcome variables were based on the ratio of total serum cholesterol to high density lipoprotein cholesterol. The other potential predictor variables were gender, age, current cigarette smoking, and hypertension. The models investigated were: (i) linear regression; (ii) logistic classification; (iii) regression trees; (iv) classification trees (iii and iv are collectively known as "CART"). The binary classification performance of the region-specific models was externally validated by classifying the subjects from the other region. RESULTS: Waist-to-hip circumference ratio and body mass index remained modest predictors of dyslipidemia. Correct classification rates for all models were 60-80%, with marked gender differences. Gender-specific models provided only small gains in classification. The external validations provided assurance about the stability of the models. CONCLUSIONS: There were no striking differences between either the algebraic (i, ii) vs. non-algebraic (iii, iv), or the regression (i, iii) vs. classification (ii, iv) modeling approaches. The advantages anticipated for CART over simple additive linear and logistic models did not materialise in this application, with its relatively small set of predictor variables. CART models may be more useful when considering main effects and interactions between larger sets of predictor variables.
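
The kind of comparison reported here, fitting an algebraic and a tree-based classifier and validating each on held-out subjects, can be mimicked on synthetic data. A sketch using scikit-learn, where the predictor distributions, effect sizes, and the split standing in for the two regions are all invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the survey data: anthropometric predictors and
# a binary dyslipidemia label loosely driven by them. Effect sizes are
# made up purely to illustrate the modeling comparison.
rng = np.random.default_rng(1)
n = 2000
whr = rng.normal(0.9, 0.08, n)        # waist-to-hip circumference ratio
bmi = rng.normal(25.0, 4.0, n)        # body mass index
age = rng.integers(25, 75, n).astype(float)
male = rng.integers(0, 2, n).astype(float)
logit = -14 + 10 * whr + 0.08 * bmi + 0.02 * age + 0.4 * male
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([whr, bmi, age, male])
# Mimic the external validation: fit in one "region", test in the other.
X_a, X_b, y_a, y_b = train_test_split(X, y, test_size=0.5, random_state=0)

for model in (LogisticRegression(max_iter=1000),
              DecisionTreeClassifier(max_depth=4)):
    acc = model.fit(X_a, y_a).score(X_b, y_b)
    print(type(model).__name__, f"correct classification: {acc:.1%}")
```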

Abstract:

This paper focuses on likelihood ratio based evaluations of fibre evidence in cases in which there is uncertainty about whether or not the reference item available for analysis - that is, an item typically taken from the suspect or seized at his home - is the item actually worn at the time of the offence. A likelihood ratio approach is proposed that, for situations in which certain categorical assumptions can be made about the additionally introduced parameters, converges to formulae described in the existing literature. The properties of the proposed likelihood ratio approach are analysed through sensitivity analyses and discussed with respect to possible argumentative implications that arise in practice.
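
One way such uncertainty can enter a likelihood ratio is as a mixing probability in the numerator. The sketch below is a hypothetical illustration of that idea, not the paper's formulae: gamma stands for the population frequency of the matching fibre type, and p for the probability that the reference item is the item actually worn.

```python
# Minimal numerical sketch: fold the uncertainty about whether the
# seized garment is the garment actually worn into the numerator of
# the likelihood ratio. Both parameter values are illustrative.
def likelihood_ratio(gamma: float, p: float) -> float:
    # Hp: the suspect wore the offender garment. With probability p the
    # seized item is that garment (a match is then expected); with
    # probability 1 - p the fibres can only match by chance.
    numerator = p * 1.0 + (1.0 - p) * gamma
    denominator = gamma          # Hd: the match arises by chance alone
    return numerator / denominator

for p in (1.0, 0.9, 0.5, 0.1):
    print(f"p = {p:.1f}  ->  LR = {likelihood_ratio(0.01, p):,.1f}")
# p = 1.0 recovers the classical 1/gamma = 100; smaller p shrinks the LR.
```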

Abstract:

Using numerical simulations, we investigate the shapes of random equilateral open and closed chains, one of the simplest models of freely fluctuating polymers in a solution. We are interested in the 3D density distribution of the modeled polymers, where the polymers have been aligned with respect to their three principal axes of inertia. This type of approach was pioneered by Theodorou and Suter in 1985. While individual configurations of the modeled polymers are almost always nonsymmetric, the approach of Theodorou and Suter results in cumulative shapes that are highly symmetric. By taking advantage of asymmetries within the individual configurations, we modify the procedure of aligning independent configurations in a way that preserves their asymmetry. This approach reveals, for example, that the 3D density distribution for linear polymers has the bean shape predicted theoretically by Kuhn. The symmetry-breaking approach provides information complementary to the traditional, symmetrical 3D density distributions originally introduced by Theodorou and Suter.
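
The first step of such a study, generating a random equilateral chain and rotating it onto its principal axes of inertia before densities are accumulated, is easy to reproduce. A minimal sketch of that alignment step (the chain length is arbitrary, and the symmetry-breaking refinement is not included):

```python
import numpy as np

# Generate a random equilateral open chain (unit-length steps with
# isotropic random directions) and align it with its principal axes
# of inertia via the eigenvectors of its gyration tensor.
rng = np.random.default_rng(42)

def random_open_chain(n_steps: int) -> np.ndarray:
    steps = rng.normal(size=(n_steps, 3))
    steps /= np.linalg.norm(steps, axis=1, keepdims=True)  # unit edges
    return np.vstack([np.zeros(3), np.cumsum(steps, axis=0)])

def align_to_principal_axes(pts: np.ndarray) -> np.ndarray:
    pts = pts - pts.mean(axis=0)              # centroid to the origin
    gyration = pts.T @ pts / len(pts)         # 3x3 gyration tensor
    eigvals, eigvecs = np.linalg.eigh(gyration)
    # Sort axes from largest to smallest moment, then rotate onto x, y, z.
    order = np.argsort(eigvals)[::-1]
    return pts @ eigvecs[:, order]

chain = align_to_principal_axes(random_open_chain(100))
print(np.round(np.var(chain, axis=0), 3))  # variances ordered x >= y >= z
```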

Abstract:

BACKGROUND: The accumulation of mutations after long-lasting exposure to a failing combination antiretroviral therapy (cART) is problematic and severely reduces the options for further successful treatments. METHODS: We studied patients from the Swiss HIV Cohort Study who failed cART with nucleoside reverse transcriptase inhibitors (NRTIs) and either a ritonavir-boosted PI (PI/r) or a non-nucleoside reverse transcriptase inhibitor (NNRTI). The loss of genotypic activity <3, 3-6, and >6 months after virological failure was analyzed with the Stanford algorithm. Risk factors associated with the early emergence of drug resistance mutations (<6 months after failure) were identified with multivariable logistic regression. RESULTS: Ninety-nine genotypic resistance tests from PI/r-treated and 129 from NNRTI-treated patients were analyzed. The risk of losing the activity of ≥1 NRTI was lower among PI/r-treated than among NNRTI-treated individuals <3, 3-6, and >6 months after failure: 8.8% vs. 38.2% (p = 0.009), 7.1% vs. 46.9% (p<0.001), and 18.9% vs. 60.9% (p<0.001). The percentages of patients who had lost PI/r activity were 2.9%, 3.6%, and 5.4% at <3, 3-6, and >6 months after failure, compared to 41.2%, 49.0%, and 63.0% of those who had lost NNRTI activity (all p<0.001). The risk of accumulating an early NRTI mutation was strongly associated with NNRTI-containing cART (adjusted odds ratio: 13.3 (95% CI: 4.1-42.8), p<0.001). CONCLUSIONS: The loss of activity of PIs and NRTIs was low among patients treated with PI/r, even after long-lasting exposure to a failing cART; thus, more options remain for second-line therapy. This finding is potentially of high relevance, in particular for settings with limited or no virological monitoring.

Abstract:

It is well known that dichotomizing continuous data has the effect of decreasing statistical power when the goal is to test for a statistical association between two variables. Modern researchers, however, focus not only on statistical significance but also on an estimate of the "effect size" (i.e., the strength of association between the variables) to judge whether a significant association is also clinically relevant. In this article, we are interested in the consequences of dichotomizing continuous data for the value of an effect size in some classical settings. It turns out that the conclusions are not the same whether one uses a correlation or an odds ratio to summarize the strength of association between the variables: whereas the value of a correlation is typically decreased by a factor pi/2 after each dichotomization, the value of an odds ratio is at the same time raised to the power 2. From a descriptive statistical point of view, it is thus not clear whether dichotomizing continuous data leads to a decrease or to an increase in the effect size, as illustrated using a data set investigating the relationship between motor and intellectual functions in children and adolescents.
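
The contrast is easy to check empirically: draw a large bivariate normal sample, split both variables at their medians, and compare the correlation before and after with the odds ratio from the resulting 2x2 table. A sketch (the correlation value and sample size are arbitrary choices):

```python
import numpy as np

# Dichotomizing both variables of a bivariate normal sample at their
# medians shrinks the correlation, while the odds ratio computed from
# the resulting 2x2 table can look impressively large.
rng = np.random.default_rng(7)
rho, n = 0.6, 100_000
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

r_continuous = np.corrcoef(x, y)[0, 1]
a = x > np.median(x)                       # dichotomize at the median
b = y > np.median(y)
r_dichotomized = np.corrcoef(a, b)[0, 1]   # phi coefficient

# 2x2 table counts and the corresponding odds ratio.
n11 = np.sum(a & b); n10 = np.sum(a & ~b)
n01 = np.sum(~a & b); n00 = np.sum(~a & ~b)
odds_ratio = (n11 * n00) / (n10 * n01)

print(f"r before: {r_continuous:.3f}, after: {r_dichotomized:.3f}")
print(f"odds ratio from the 2x2 table: {odds_ratio:.1f}")
```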

Abstract:

The most valuable pigment of the Roman wall paintings was the red color obtained from powdered cinnabar (Minium Cinnabaris pigment), the red mercury sulfide (HgS), which was brought from mercury (Hg) deposits in the Roman Empire. To address the question of whether sulfur isotope signatures can serve as a rapid method to establish the provenance of the red pigment in Roman frescoes, we measured the sulfur isotope composition (the δ34S value, in per mil relative to VCDT) in samples of wall painting from the Roman city of Aventicum (Avenches, Vaud, Switzerland) and compared them with values for cinnabar from European mercury deposits (Almaden in Spain, Idria in Slovenia, Monte Amiata in Italy, Moschellandsberg in Germany, and Genepy in France). Our study shows that the δ34S values of cinnabar from the studied Roman wall paintings fall within or near the range of Almaden cinnabar; thus, the provenance of the raw material may be deduced. This approach may provide information on provenance and authenticity in archaeological, restoration, and forensic studies of Roman and Greek frescoes.
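
For reference, the δ34S value quoted above is the standard delta notation for isotope ratios: the per-mil deviation of the sample's 34S/32S ratio from that of the Vienna Canyon Diablo Troilite (VCDT) reference standard:

```latex
% Delta notation: per-mil deviation of the sample's isotope ratio
% from the VCDT reference standard.
\delta^{34}\mathrm{S} \;=\;
  \left( \frac{(^{34}\mathrm{S}/^{32}\mathrm{S})_{\text{sample}}}
              {(^{34}\mathrm{S}/^{32}\mathrm{S})_{\text{VCDT}}} - 1 \right)
  \times 1000\ \text{‰}
```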

Abstract:

The multiscale finite volume (MsFV) method has been developed to efficiently solve large heterogeneous problems (elliptic or parabolic); it is usually employed for pressure equations and delivers conservative flux fields to be used in transport problems. The method essentially relies on the hypothesis that the (fine-scale) problem can be reasonably described by a set of local solutions coupled by a conservative global (coarse-scale) problem. In most cases, the boundary conditions assigned to the local problems are satisfactory and the approximate conservative fluxes provided by the method are accurate. In numerically challenging cases, however, a more accurate localization is required to obtain a good approximation of the fine-scale solution. In this paper we develop a procedure to iteratively improve the boundary conditions of the local problems. The algorithm relies on the data structure of the MsFV method and employs a Krylov-subspace projection method to obtain an unconditionally stable scheme and to accelerate convergence. Two variants are considered: in the first, only the MsFV operator is used; in the second, the MsFV operator is combined in a two-step method with an operator derived from the problem solved to construct the conservative flux field. The resulting iterative MsFV algorithms allow arbitrary reduction of the solution error without compromising the construction of a conservative flux field, which is guaranteed at any iteration. Since it converges to the exact solution, the method can be regarded as a linear solver. In this context, the schemes proposed here can be viewed as preconditioned versions of the Generalized Minimal Residual method (GMRES), with the peculiar characteristic that the residual on the coarse grid is zero at any iteration (thus conservative fluxes can be obtained).
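
The "multiscale operator as a GMRES preconditioner" viewpoint can be illustrated with a generic two-level stand-in: a coarse-grid correction plus one smoothing step preconditioning GMRES on a 1D Poisson problem. The aggregation-based prolongation below is an assumption made for brevity, not the MsFV basis construction, and this toy does not enforce the zero coarse-grid residual property of the paper's schemes:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, gmres, splu

# Two-level preconditioner standing in for a multiscale operator:
# a coarse-grid correction followed by a cheap smoothing step.
n, nc = 256, 16                       # fine and coarse problem sizes
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")

# Piecewise-constant prolongation: each coarse cell covers n // nc
# fine cells (an aggregation stand-in for multiscale basis functions).
P = sp.kron(sp.eye(nc), np.ones((n // nc, 1)), format="csc")
Ac = splu((P.T @ A @ P).tocsc())      # factorized coarse operator

def precondition(r):
    x = P @ Ac.solve(P.T @ r)         # coarse-grid correction
    x += 0.5 * (r - A @ x)            # cheap local smoothing step
    return x

M = LinearOperator((n, n), matvec=precondition)
b = np.ones(n)
x, info = gmres(A, b, M=M)
print("converged" if info == 0 else "failed",
      "residual:", np.linalg.norm(b - A @ x))
```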

Abstract:

We compare correspondence analysis to the logratio approach based on compositional data. We also compare correspondence analysis with an alternative approach using the Hellinger distance for representing categorical data in a contingency table. We propose a coefficient which globally measures the similarity between these approaches. This coefficient can be decomposed into several components, one for each principal dimension, indicating the contribution of each dimension to the difference between the two representations. The three methods of representation can produce quite similar results. An illustrative example is given.
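
A crude numerical stand-in for such a comparison: compute row coordinates for the same contingency table from correspondence analysis and from a Hellinger-distance representation, then correlate the two solutions dimension by dimension. The table values and the particular weighted-SVD construction of the Hellinger coordinates are assumptions for illustration, not the paper's coefficient:

```python
import numpy as np

# Compare row coordinates from correspondence analysis (CA) with those
# from a Hellinger-distance representation of the same (invented) table.
N = np.array([[30, 12,  8],
              [10, 25, 15],
              [ 5, 10, 40],
              [20, 18, 12]], dtype=float)
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)

# CA: SVD of the matrix of standardized residuals.
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)
F_ca = (U * sv) / np.sqrt(r)[:, None]          # row principal coordinates

# Hellinger: SVD of row-weighted, centered square-root row profiles.
H = np.sqrt(P / r[:, None])
Hc = (H - r @ H) * np.sqrt(r)[:, None]
Uh, svh, _ = np.linalg.svd(Hc, full_matrices=False)
F_he = (Uh * svh) / np.sqrt(r)[:, None]

for k in range(2):                              # first two dimensions
    corr = abs(np.corrcoef(F_ca[:, k], F_he[:, k])[0, 1])
    print(f"dimension {k + 1}: |correlation| = {corr:.3f}")
```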

Abstract:

A low digit ratio (2D:4D) and low 2D:4D in the right hand compared with the left (right-left 2D:4D) are thought to be determined by high in utero concentrations of testosterone, and are related to "masculine" traits such as aggression and performance in sports like running and rugby. Low right-left 2D:4D is also related to sensitivity to testosterone, as measured by the number of cytosine-adenine-guanine triplet repeats in exon 1 of the androgen receptor gene. Here we show that low right-left 2D:4D is associated with high maximal oxygen uptake (VO2(max)), high velocity at VO2(max), and high maximum lactate concentration in a sample of teenage boys. We suggest that low right-left 2D:4D is linked to performance in some sports because it is a proxy for high sensitivity to prenatal, and perhaps also circulating, testosterone and for high VO2(max).

Abstract:

This paper proposes a parallel architecture for estimating the motion of an underwater robot. It is well known that image processing requires a huge amount of computation, mainly at the low level, where the algorithms deal with large amounts of data. In a motion estimation algorithm, correspondences between two images have to be established at the low level. In underwater imaging, normalised correlation can be a good matching criterion in the presence of non-uniform illumination. Due to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reducing the computation time. Taking into consideration the complexity of the normalised correlation criterion, a new approach using the parallel organisation of every processor in the architecture is proposed.
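
The matching criterion in question is classical and easy to state in code. A minimal serial sketch of normalised correlation matching, which is insensitive to uniform gain changes and hence suited to non-uniform illumination; the paper's contribution, the parallel organisation, is not reproduced here:

```python
import numpy as np

# Normalised correlation between an image patch and candidate windows.
# Mean subtraction and normalisation make the score invariant to
# uniform brightness and gain changes in the window.
def ncc(patch: np.ndarray, window: np.ndarray) -> float:
    p = patch - patch.mean()
    w = window - window.mean()
    denom = np.sqrt((p * p).sum() * (w * w).sum())
    return float((p * w).sum() / denom) if denom > 0 else 0.0

def best_match(image: np.ndarray, patch: np.ndarray):
    """Exhaustive search for the window maximising the NCC score."""
    ph, pw = patch.shape
    scores = np.array([[ncc(patch, image[i:i + ph, j:j + pw])
                        for j in range(image.shape[1] - pw + 1)]
                       for i in range(image.shape[0] - ph + 1)])
    return np.unravel_index(np.argmax(scores), scores.shape), scores.max()

rng = np.random.default_rng(3)
img = rng.random((64, 64))
tpl = img[20:28, 30:38] * 1.7          # same patch under a gain change
print(best_match(img, tpl))            # -> ((20, 30), ~1.0)
```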

Abstract:

Quantitatively assessing the importance or criticality of each link in a network is of practical value to operators, as it can help them to increase the network's resilience, provide more efficient services, or improve some other aspect of the service. Betweenness is a graph-theoretical measure of centrality that can be applied to communication networks to evaluate link importance. However, as we illustrate in this paper, the basic definition of betweenness centrality produces inaccurate estimations, as it does not take into account aspects relevant to networking such as the heterogeneity in link capacity or the differences between node pairs in their contribution to the total traffic. A new algorithm for discovering link centrality in transport networks is proposed in this paper. It requires only static or semi-static network and topology attributes, and yet produces estimations of good accuracy, as verified through extensive simulations. Its potential value is demonstrated by an example application, in which the simple shortest-path routing algorithm is improved in such a way that it outperforms other, more advanced algorithms in terms of blocking ratio.
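
The correction suggested by this critique can be sketched directly: weight each node pair's shortest path by the traffic it contributes, and normalise each link's accumulated load by its capacity. The topology, demands, and capacities below are invented, and this is only an illustration of the idea, not the paper's algorithm:

```python
import itertools
import networkx as nx

# Traffic-weighted link load: each pair's shortest path counts in
# proportion to the traffic the pair exchanges, and each link's load is
# normalised by its capacity (plain edge betweenness treats all pairs
# and links alike).
G = nx.Graph()
G.add_weighted_edges_from(
    [("a", "b", 1), ("b", "c", 1), ("a", "c", 3),
     ("c", "d", 1), ("b", "d", 2)], weight="length")
capacity = {e: 10 for e in G.edges}
capacity[("b", "c")] = 40                       # one fat link
demand = {(u, v): 1.0 for u, v in itertools.combinations(G, 2)}
demand[("a", "d")] = 8.0                        # one heavy pair

load = {tuple(sorted(e)): 0.0 for e in G.edges}
for (u, v), t in demand.items():
    path = nx.shortest_path(G, u, v, weight="length")
    for edge in zip(path, path[1:]):
        load[tuple(sorted(edge))] += t

for e, l in sorted(load.items(), key=lambda kv: -kv[1]):
    cap = capacity.get(e, capacity.get(e[::-1]))
    print(f"{e}: traffic-weighted load {l:.0f}, load/capacity {l / cap:.2f}")
```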

Abstract:

In computer graphics, global illumination algorithms take into account not only the light that comes directly from the sources, but also the light interreflections. Such algorithms produce very realistic images, but at a high computational cost, especially when dealing with complex environments. Parallel computation has been successfully applied to such algorithms in order to make it possible to compute highly realistic images in a reasonable time. We introduce here a speculation-based parallel solution for a global illumination algorithm in the context of radiosity, in which we have taken advantage of the hierarchical nature of the algorithm.
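
Behind such algorithms sits the radiosity system B = E + ρ F B: each patch's radiosity is its emission plus the light it reflects after gathering from every other patch. A tiny gathering iteration on an invented three-patch scene (the hierarchical and speculative parallel aspects of the paper are not reproduced):

```python
import numpy as np

# Jacobi-style gathering iteration for the radiosity system
# B = E + rho * F @ B, on a made-up three-patch scene.
E = np.array([1.0, 0.0, 0.0])          # only patch 0 emits light
rho = np.array([0.0, 0.7, 0.5])        # patch reflectivities
F = np.array([[0.0, 0.4, 0.3],         # form factors F[i, j]: fraction
              [0.4, 0.0, 0.3],         # of light leaving i reaching j
              [0.3, 0.3, 0.0]])

B = E.copy()
for sweep in range(50):                # gathering iterations
    B_new = E + rho * (F @ B)
    if np.max(np.abs(B_new - B)) < 1e-9:
        break
    B = B_new
print(np.round(B, 4))                  # steady radiosities per patch
```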

Abstract:

In this paper, different recovery methods applied at different network layers and time scales are used in order to enhance network reliability. Each layer deploys its own fault management methods; however, current recovery methods are applied at only one specific layer. New protection schemes, based on the proposed partial disjoint path algorithm, are defined in order to avoid duplicated protection in a multi-layer scenario. The new protection schemes also encompass shared segment backup computation and shared risk link group identification. A complete set of experiments demonstrates the efficiency of the proposed methods in relation to previous ones, in terms of the resources used to protect the network, the failure recovery time, and the request rejection ratio.
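
The flavour of SRLG-aware partial disjoint protection can be sketched as follows: route a primary path, then search for a backup that avoids the primary's links and any link sharing a risk group with them, falling back to a partially disjoint backup at a high penalty when full disjointness is impossible, as it is in the toy topology below. The topology, risk groups, and penalty are invented; this is not the paper's algorithm:

```python
import networkx as nx

# Both of node s's outgoing links sit in the same shared risk link
# group (e.g. the same duct), so a fully disjoint backup cannot exist;
# the penalty weighting yields a partially disjoint one instead.
G = nx.Graph()
G.add_edges_from([("s", "a"), ("a", "t"), ("s", "b"), ("b", "t"),
                  ("a", "b")])
srlg = {("s", "a"): 1, ("s", "b"): 1}      # two links in the same duct

def backup_path(G, primary, penalty=100.0):
    primary_edges = {frozenset(e) for e in zip(primary, primary[1:])}
    risky = {srlg[e] for e in srlg if frozenset(e) in primary_edges}
    H = G.copy()
    for u, v in H.edges:
        shared = frozenset((u, v)) in primary_edges or \
                 srlg.get((u, v), srlg.get((v, u))) in risky
        H[u][v]["w"] = penalty if shared else 1.0   # partial disjointness
    return nx.shortest_path(H, primary[0], primary[-1], weight="w")

primary = nx.shortest_path(G, "s", "t")
print("primary:", primary, "backup:", backup_path(G, primary))
```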