946 results for incremental computation


Relevance: 10.00%

Abstract:

To determine the diagnostic accuracy of physicians' prior probability estimates of serious infection in critically ill neonates and children, we conducted a prospective cohort study in 2 intensive care units. Using available clinical, laboratory, and radiographic information, 27 physicians provided 2567 probability estimates for 347 patients (follow-up rate, 92%). The median probability estimate of infection increased from 0% (i.e., no antibiotic treatment or diagnostic work-up for sepsis), to 2% on the day preceding initiation of antibiotic therapy, to 20% at initiation of antibiotic treatment (P<.001). At initiation of treatment, predictions discriminated well between episodes subsequently classified as proven infection and episodes ultimately judged unlikely to be infection (area under the curve, 0.88). Physicians also showed a good ability to predict blood culture-positive sepsis (area under the curve, 0.77). Treatment and testing thresholds were derived from the provided predictions and treatment rates. Physicians' predictions regarding the presence of serious infection were remarkably accurate. Studies investigating the value of new tests for the diagnosis of sepsis should establish that they add incremental value to physicians' judgment.
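The discrimination reported above is the area under the ROC curve (AUC). As an illustration only of how such a figure is computed from probability estimates, here is a minimal rank-based AUC in Python; the labels and estimates below are invented examples, not study data.

```python
# Illustrative AUC computation from probability estimates.
# Labels and scores are made-up, not data from the study.

def auc(labels, scores):
    """Rank-based AUC: probability that a randomly chosen positive
    receives a higher score than a randomly chosen negative
    (ties count as half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 0]                 # 1 = proven infection
scores = [0.9, 0.6, 0.4, 0.3, 0.2, 0.1, 0.4]   # probability estimates
print(round(auc(labels, scores), 3))
```

An AUC of 0.5 corresponds to chance-level discrimination; values like the 0.88 reported above indicate that higher estimates were reliably assigned to the truly infected episodes.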

Relevance: 10.00%

Abstract:

The process of merging two or more images of the same scene into a single, larger one is known as image mosaicing. Once a mosaic has been built, the boundaries between the images are usually visible, owing to inaccuracies in the photometric and geometric registration. Image blending is the stage of the mosaicing pipeline in which these artifacts are minimized or suppressed. Several approaches in the literature address these problems, but most are geared towards terrestrial panoramas, high-resolution artistic images, or other applications in which camera positioning and image acquisition are not critical stages. Working with underwater imagery poses significant challenges, due to scattering (reflections off suspended particles), light attenuation, and the extreme physical conditions found at depths of thousands of metres, with limited control over the acquisition systems and the use of costly technology. Images lit by similar artificial illumination, without the global lighting provided by the sun, must be stitched without a perceptible seam. Images acquired at great depth have a quality that depends strongly on depth, and their degradation with this factor is very significant. The main objective of this work is to present the principal problems of underwater imaging, select the most suitable strategies, and address the full acquisition-processing-visualization sequence. The results show that the developed solution, based on an optimal seam selection strategy, gradient-domain fusion in the overlapping regions, and adaptive emphasis of images with a low level of detail, yields high-quality results.
A strategy amenable to parallel implementation has also been proposed, making it possible to process mosaics kilometres in extent at a resolution of centimetres per pixel.

Relevance: 10.00%

Abstract:

Analysis of classical profitability criteria, such as the Internal Rate of Return and the Benefit/Cost Ratio, reveals that, contrary to what was assumed, they agree with the Net Present Value criterion when applied correctly. The same holds for the older Net Final Value and Equivalent Annuity criteria and for the newer Maximum Delay of Benefits and Cost Recovery Period criteria. It is further shown that, when choosing between two mutually exclusive projects, applying these criteria to the difference (incremental) project is a sufficient condition for agreement with the Net Present Value criterion.
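The incremental-project argument above can be illustrated numerically: for two mutually exclusive projects, comparing the IRR of the difference (incremental) cash flows against the discount rate gives the same choice as ranking by NPV. The following sketch uses hypothetical cash flows and a 10% discount rate; it is an illustration of the general principle, not the paper's own computation.

```python
# Illustrative only: NPV ranking vs. IRR applied to the incremental project.
# Cash flows and the 10% discount rate are hypothetical.

def npv(rate, flows):
    """Net present value of cash flows at t = 0, 1, 2, ..."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Bisection on npv(rate) = 0; assumes a single sign change."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

a = [-100, 60, 60]                            # project A
b = [-150, 85, 90]                            # project B
delta = [bi - ai for ai, bi in zip(a, b)]     # incremental project B - A

r = 0.10
prefer_b_by_npv = npv(r, b) > npv(r, a)
prefer_b_by_incremental_irr = irr(delta) > r
print(prefer_b_by_npv, prefer_b_by_incremental_irr)  # both point the same way
```

Here the incremental IRR (about 6.4%) falls below the 10% discount rate, so the incremental project is rejected, matching the NPV ranking that also favours project A.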

Relevance: 10.00%

Abstract:

This study examined the effects of intermittent hypoxic training (IHT) on skeletal muscle monocarboxylate transporter (MCT) expression and anaerobic performance in trained athletes. Cyclists were assigned to one of two interventions, normoxic (N; n = 8; 150 mmHg PIO2) or hypoxic (H; n = 10; ∼3000 m, 100 mmHg PIO2), over a three-week training period (5 sessions of 1 h to 1 h 30 min per week). Before and after training, an incremental exercise test to exhaustion (EXT) was performed in normoxia, together with a 2-min time trial (TT). Biopsy samples from the vastus lateralis were analyzed for MCT1 and MCT4 using immunoblotting. Peak power output (PPO) increased (p<0.05) after training (7.2% and 6.6% for N and H, respectively), but VO2max showed no significant change. Average power output in the TT improved significantly (7.3% and 6.4% for N and H, respectively). No differences were found in MCT1 or MCT4 protein content before and after training in either group. These results indicate no additional benefit of IHT compared with similar normoxic training: adding a hypoxic stimulus to a three-week training period is ineffective for improving anaerobic performance or MCT expression.

Relevance: 10.00%

Abstract:

This work analyses the performance of the Needleman-Wunsch sequence-alignment algorithm on three different multiprocessor computing systems. The serial algorithm is analysed and coded in the C programming language, and a series of optimizations is proposed with the aim of minimizing the computation volume and time. The program's performance is then analysed on the different computing systems. In the second part of the work, the serial algorithm is parallelized and coded with OpenMP. The result is two variants of the program that differ in the ratio between computation and communication. In the first variant, communication between processors is infrequent and takes place after long periods of execution (coarse granularity). In the second variant, individual tasks are relatively small in terms of execution time and communication between processors is frequent (fine granularity). Both variants are executed and analysed on multicore architectures that exploit thread-level parallelism. The results show the importance of understanding and knowing how to analyse the effect of multicore and multithreading on performance.
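For readers unfamiliar with the algorithm being parallelized, here is a minimal serial sketch of the Needleman-Wunsch dynamic program (the thesis itself used C and OpenMP; this Python version is only an illustration, with hypothetical scoring: match +1, mismatch -1, gap -1).

```python
# Minimal serial Needleman-Wunsch: global alignment score of two sequences.
# Scoring scheme (match/mismatch/gap) is illustrative.

def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    n, m = len(a), len(b)
    # score[i][j] = best alignment score of a[:i] against b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap          # a[:i] aligned against gaps
    for j in range(1, m + 1):
        score[0][j] = j * gap          # b[:j] aligned against gaps
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            score[i][j] = max(diag,
                              score[i-1][j] + gap,   # gap in b
                              score[i][j-1] + gap)   # gap in a
    return score[n][m]

print(needleman_wunsch("GATTACA", "GCATGCU"))
```

The dependency structure visible in the inner loop (each cell needs its left, upper, and upper-left neighbours) is what drives the granularity trade-off discussed above: cells on the same anti-diagonal are mutually independent, so they can be computed in parallel, either in large blocks (coarse granularity) or cell by cell (fine granularity).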

Relevance: 10.00%

Abstract:

This study describes the validation of a new wearable system for the assessment of 3D spatial parameters of gait. The method is based on the detection of temporal parameters, coupled with optimized fusion and de-drifted integration of inertial signals. Composed of two wireless inertial modules attached to the feet, the system provides stride length, stride velocity, foot clearance, and turning angle at each gait cycle, based on the computation of 3D foot kinematics. The accuracy and precision of the proposed system were assessed against an optical motion capture system as reference, and its repeatability across measurements (test-retest reliability) was also evaluated. Measurements were performed in 10 young volunteers (mean age 26.1±2.8 years) and 10 elderly volunteers (mean age 71.6±4.6 years), who were asked to perform U-shaped and 8-shaped walking trials and then a 6-min walking test (6MWT). A total of 974 gait cycles were used to compare gait parameters with the reference system. Mean accuracy±precision was 1.5±6.8 cm for stride length, 1.4±5.6 cm/s for stride velocity, 1.9±2.0 cm for foot clearance, and 1.6±6.1° for turning angle. A difference in gait performance was observed between young and elderly volunteers during the 6MWT, particularly in foot clearance. The proposed method allows analysis of various aspects of gait, including turns, gait initiation and termination, and inter-cycle variability. The system is lightweight, easy to wear and use, and suitable for clinical applications requiring objective evaluation of gait outside the lab environment.
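The "de-drifted integration" referred to above can be sketched in one dimension: integrating a biased accelerometer signal accumulates drift, and the zero-velocity assumption at the foot-flat instants of each gait cycle lets one subtract the linear trend that the bias introduced. The signal and bias below are synthetic; the real system fuses full 3D inertial data.

```python
# Toy 1D sketch of de-drifted integration over one gait cycle.
# Synthetic signal with an invented constant sensor bias of 0.2 m/s^2.

def integrate(acc, dt):
    """Cumulative trapezoid-free (rectangle) integration to velocity."""
    v, out = 0.0, [0.0]
    for a in acc:
        v += a * dt
        out.append(v)
    return out

def dedrift(vel):
    """Subtract the linear trend so velocity is zero at both
    foot-flat instants (start and end of the cycle)."""
    n = len(vel) - 1
    return [v - vel[-1] * (i / n) for i, v in enumerate(vel)]

dt = 0.01
true_acc = [1.0] * 50 + [-1.0] * 50   # speeds up, slows back to rest
acc = [a + 0.2 for a in true_acc]     # add the sensor bias
vel = integrate(acc, dt)
print(round(vel[-1], 3))              # residual drift: 0.2 m/s
vel_corrected = dedrift(vel)
print(round(vel_corrected[-1], 3))    # 0.0 after de-drifting
```

Stride velocity and length then follow from the corrected velocity (and its integral), which is why reliable detection of the temporal foot-flat events is the first step of the method.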

Relevance: 10.00%

Abstract:

We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton (TN) methods, which have been an effective approach for large-scale unconstrained optimization, we develop efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse-grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and the quality of the optical flow estimation.
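The unidirectional multiresolution idea can be conveyed with a toy 1D analogue: estimate a displacement on a downsampled signal first, then refine at full resolution starting from the upscaled coarse estimate, so the fine-level search explores only a small neighbourhood. Everything below (signals, shift, search ranges) is synthetic; real optical flow is a dense 2D field and the paper's schemes operate on variational energies, not a brute-force search.

```python
# Toy coarse-to-fine (multiresolution) shift estimation on 1D signals.
import math

def cost(a, b, shift):
    """Sum of squared differences between a and b cyclically shifted."""
    n = len(a)
    return sum((a[i] - b[(i + shift) % n]) ** 2 for i in range(n))

def best_shift(a, b, candidates):
    return min(candidates, key=lambda s: cost(a, b, s))

def downsample(x):
    """Halve the resolution by averaging adjacent samples."""
    return [(x[i] + x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]

n = 64
true_shift = 10
a = [math.sin(2 * math.pi * i / n) for i in range(n)]
b = [a[(i - true_shift) % n] for i in range(n)]

# Coarse level: exhaustive search on the half-resolution signals.
coarse = best_shift(downsample(a), downsample(b), range(-16, 16))
# Fine level: refine only around the upscaled coarse estimate.
fine = best_shift(a, b, range(2 * coarse - 2, 2 * coarse + 3))
print(fine)  # recovers the true shift
```

The FMG/OPT scheme discussed above is bidirectional: instead of a single coarse-to-fine sweep, it also transfers residuals back down to coarse grids and treats the resulting coarse-grid correction as a search direction, scaled by a line search.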

Relevance: 10.00%

Abstract:

Summary (in English): Computer simulations provide a practical way to address scientific questions that would otherwise be intractable. In evolutionary biology, and in population genetics in particular, the investigation of evolutionary processes frequently involves the implementation of complex models, making simulations a particularly valuable tool in the area. In this thesis, I explored three questions involving the geographical range expansion of populations, taking advantage of spatially explicit simulations coupled with approximate Bayesian computation. First, the neutral evolutionary history of the human spread around the world was investigated, leading to a surprisingly simple model: a straightforward diffusion process of migrations from east Africa across a world map with homogeneous landmasses replicated to a very large extent the complex patterns observed in real human populations, suggesting a more continuous (as opposed to structured) view of the distribution of modern human genetic diversity, which may serve better as a base model for further studies. Second, the postglacial evolution of the European barn owl, with the formation of a remarkable coat-color cline, was examined with two rounds of simulations: (i) to determine the background demographic history and (ii) to test the probability that a phenotypic cline like the one observed in natural populations could appear without natural selection. We verified that the modern barn owl population originated from a single Iberian refugium and that its color cline formed not through neutral evolution but with the necessary participation of selection. The third and last part of this thesis is a simulation-only study inspired by the barn owl case above. In this chapter, we showed that selection is indeed effective during range expansions and that it leaves a distinctive signature, which can then be used to detect and measure natural selection in range-expanding populations.

Relevance: 10.00%

Abstract:

Anti-self/tumor T cell function can be improved by increasing TCR-peptide MHC (pMHC) affinity within physiological limits, but paradoxically further increases (K(d) < 1 μM) lead to drastic functional declines. Using human CD8(+) T cells engineered with TCRs of incremental affinity for the tumor antigen HLA-A2/NY-ESO-1, we investigated the molecular mechanisms underlying this high-affinity-associated loss of function. As compared with cells expressing TCR affinities generating optimal function (K(d) = 5 to 1 μM), those with supraphysiological affinity (K(d) = 1 μM to 15 nM) showed impaired gene expression, signaling, and surface expression of activatory/costimulatory receptors. Preferential expression of the inhibitory receptor programmed cell death-1 (PD-1) was limited to T cells with the highest TCR affinity, correlating with full functional recovery upon PD-1 ligand 1 (PD-L1) blockade. In contrast, upregulation of the Src homology 2 domain-containing phosphatase 1 (SHP-1/PTPN6) was broad, with gradually enhanced expression in CD8(+) T cells with increasing TCR affinities. Consequently, pharmacological inhibition of SHP-1 with sodium stibogluconate augmented the function of all engineered T cells, and this correlated with the TCR affinity-dependent levels of SHP-1. These data highlight an unexpected and global role of SHP-1 in regulating CD8(+) T cell activation and responsiveness and support the development of therapies inhibiting protein tyrosine phosphatases to enhance T cell-mediated immunity.

Relevance: 10.00%

Abstract:

Introduction. Respiratory difficulties are common in athletes, especially adolescents, even in the absence of exercise-induced bronchoconstriction. Immaturity of respiratory muscle coupling at high respiratory rates is a potential mechanism, and whether respiratory muscle training (RMT) can positively influence it is not yet known. Goal. We investigated the effects of RMT on ventilation and performance parameters in adolescent athletes, hypothesizing that RMT would enhance respiratory capacity. Methods. 12 healthy subjects (8 male, 4 female, 17±0.5 years) from a sports/study high school class, competitively involved in various sports (minimum of 10 hours per week), underwent respiratory function testing, maximal minute ventilation (MMV) measurements, and a maximal incremental treadmill test with determination of VO2max and ventilatory thresholds (VT1 and VT2). They then underwent one month of RMT (4 times/week) using a eucapnic hyperventilation device with an incremental training program. The same tests were repeated after RMT. Results. Subjects completed on average 14.8 sessions of RMT, with an increase in total ventilation per session of 211±29% during training. Borg scale ratings of the RMT sessions were unchanged or reduced in all subjects, despite the increase in total respiratory work. No changes (p>0.05) were observed pre/post RMT in VO2max (53.4±7.5 vs 51.6±7.7 ml/kg/min), VT2 (14.4±1.4 vs 14.0±1.1 km/h), or maximal speed at the end of the test (16.1±1.7 vs 15.8±1.7 km/h). MMV increased by 9.2% (176.7±36.9 vs 192.9±32.6 l/min, p<0.001) and FVC by 3.3% (4.70±0.75 vs 4.85±0.76 litres, p<0.05). Subjective evaluations of respiratory sensations during exercise and daily living also improved. Conclusions. RMT improves MMV and FVC in adolescent athletes, along with notable subjective respiratory benefits, although no changes are seen in maximal treadmill performance or VO2max.
RMT can be performed easily in adolescents without side effects, with potential for improvement in training capacity and overall well-being.

Relevance: 10.00%

Abstract:

Background: Simultaneous polydrug use (SPU) may represent a greater incremental risk factor for human health than concurrent polydrug use (CPU). However, few studies have examined these patterns of use in relation to health issues, particularly with regard to the number of drugs used. Methods: In the present study, we analyzed data from a representative sample of 5734 young Swiss males from the Cohort Study on Substance Use Risk Factors. Exposure to drugs (i.e., alcohol, tobacco, cannabis, and 15 other illicit drugs), as well as mental, social, and physical factors, was studied through regression analysis. Results: We found that individuals engaging in CPU and SPU followed the known stages of drug use, involving initial experiences with licit drugs (e.g., alcohol and tobacco), followed by use of cannabis and then other illicit drugs. In this regard, two classes of illicit drugs were identified: first, uppers, hallucinogens, and sniffed drugs; and then "harder" drugs (ketamine, heroin, and crystal meth), which were consumed only by polydrug users who were already taking numerous drugs. Moreover, we observed an association between the number of drugs used simultaneously and social issues (i.e., social consequences and aggressiveness): the more often participants used substances simultaneously, the more likely they were to experience social problems. In contrast, we did not find any relationship between SPU and depression, anxiety, or health consequences. Conclusions: We identified some associations with SPU that were independent of CPU. Moreover, we found that the number of concurrently used drugs can be a strong factor associated with mental and physical health, although simultaneous use may not significantly contribute to this association. Finally, the negative effects related to the use of one substance might be counteracted by the use of an additional substance.

Relevance: 10.00%

Abstract:

Measuring the contribution of individual transactions to the total risk of a credit portfolio is a major issue in financial institutions. VaR Contributions (VaRC) and Expected Shortfall Contributions (ESC) have become two popular ways of quantifying these risks. However, the usual Monte Carlo (MC) approach is known to be very time-consuming for computing these risk contributions. In this paper, we consider the Wavelet Approximation (WA) method for Value at Risk (VaR) computation presented in [Mas10] in order to calculate the Expected Shortfall (ES) and the risk contributions under the Vasicek one-factor model framework. We decompose the VaR and the ES as a sum of sensitivities representing the marginal impact on the total portfolio risk. Moreover, we present technical improvements to the WA that considerably reduce the computational effort of the approximation while, at the same time, increasing its accuracy.
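The quantities in question can be illustrated with the Monte Carlo baseline that the paper seeks to replace: simulate portfolio losses under a Vasicek one-factor model, take the tail beyond the VaR quantile, and average per-obligor losses over those tail scenarios to obtain ES contributions that sum to the total ES. The three-obligor portfolio, default probabilities, and correlation below are hypothetical.

```python
# MC baseline for ES and ES contributions under a Vasicek one-factor model.
# Portfolio parameters are invented for illustration.
import random
from statistics import NormalDist

random.seed(0)
norm = NormalDist()

exposures = [1.0, 2.0, 3.0]
pd = [0.02, 0.05, 0.10]                      # default probabilities
rho = 0.2                                    # common asset correlation
thresholds = [norm.inv_cdf(p) for p in pd]   # default thresholds

def simulate_losses(n_sims):
    sims = []
    for _ in range(n_sims):
        y = random.gauss(0, 1)               # systematic factor
        losses = []
        for k in range(len(exposures)):
            eps = random.gauss(0, 1)         # idiosyncratic factor
            asset = rho ** 0.5 * y + (1 - rho) ** 0.5 * eps
            losses.append(exposures[k] if asset < thresholds[k] else 0.0)
        sims.append(losses)
    return sims

sims = simulate_losses(20000)
totals = sorted((sum(l), l) for l in sims)
alpha = 0.99
tail = totals[int(alpha * len(totals)):]     # scenarios beyond the VaR level
es = sum(t for t, _ in tail) / len(tail)     # Expected Shortfall
esc = [sum(l[k] for _, l in tail) / len(tail) for k in range(3)]
print(round(es, 3), [round(c, 3) for c in esc])  # ESC sums to ES
```

The contributions sum to the total ES by construction, which is the additive decomposition into marginal impacts mentioned above; the cost of this MC step for large portfolios is what motivates the wavelet approximation.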

Relevance: 10.00%

Abstract:

Report on a scientific sojourn at the University of Reading, United Kingdom, from January until May 2008. The main objectives have been, firstly, to infer population structure and demographic parameters using a total of 13 microsatellite loci to genotype approximately 30 individuals per population in 10 Palinurus elephas populations from both Mediterranean and Atlantic waters; and secondly, to develop statistical methods to identify discrepant loci, possibly under selection, and to implement those methods in the R software environment. It is important to consider that calculating the probability distribution of the demographic and mutational parameters for a full genetic data set is numerically difficult for complex demographic histories (Stephens 2003). Approximate Bayesian Computation (ABC), which uses summary statistics to infer posterior distributions of variable parameters without explicit likelihood calculations, can surmount this difficulty. This makes it possible to gather information on different demographic priors (i.e., effective population sizes, migration rates, microsatellite mutation rates, mutational processes) and to assess the sensitivity of inferences to the demographic priors by assuming different priors.
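The ABC rejection idea described above (summary statistics instead of explicit likelihoods) can be sketched with a deliberately simple toy model: draw a parameter from its prior, simulate data, and keep the draw when the simulated summary statistic falls close to the observed one. The binomial "model" and tolerance below are invented for illustration; real applications simulate genetic data under demographic models.

```python
# Minimal ABC rejection sampler on a toy binomial model.
import random

random.seed(1)

def simulate(p, n=100):
    """Toy simulator: number of successes out of n trials."""
    return sum(1 for _ in range(n) if random.random() < p)

observed = 37                       # observed summary statistic
tolerance = 2
accepted = []
for _ in range(20000):
    p = random.random()             # draw from a uniform prior on [0, 1]
    if abs(simulate(p) - observed) <= tolerance:
        accepted.append(p)          # keep draws close to the observation

posterior_mean = sum(accepted) / len(accepted)
print(round(posterior_mean, 2))     # close to 37/100
```

The accepted draws approximate the posterior; shrinking the tolerance (or using better summary statistics) tightens the approximation at the cost of more rejected simulations, which is the trade-off ABC studies must manage.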

Relevance: 10.00%

Abstract:

Research project carried out during a stay at the National University of Singapore, Singapore, between July and October 2007. Given the explosion of music on the internet and the rapid expansion of digital music collections, a key challenge in the field of music information retrieval is the development of efficient and reliable music processing systems. The goal of the proposed research has been to work on different aspects of the extraction, modelling, and processing of musical content. In particular, work has addressed the extraction, analysis, and manipulation of low-level audio descriptors, the modelling of musical processes, the study and development of machine learning techniques for audio processing, and the identification and extraction of high-level musical attributes. Some audio analysis components have been reviewed and improved, and components for extracting inter-note and intra-note descriptors from monophonic audio recordings have been revised. Previous work on tempo has been applied to the formalization of different musical tasks. Finally, high-level, content-based music processing has been investigated. As an example of this, we studied how professional musicians express and communicate their interpretation of the musical and emotional content of musical pieces, and used this information to automatically identify performers. Deviations in parameters such as pitch, timing, amplitude, and timbre have been studied at the inter-note and intra-note level.

Relevance: 10.00%

Abstract:

Abstract: In the field of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms allow, inter alia, the comparison of a fingermark to a fingerprint database and therefore the establishment of a link between the mark and a known source. With the growth of the capacities of these systems and of data storage, as well as increasing international collaboration between police services, the size of these databases increases. The current challenge for the field of fingerprint identification is the growth of these databases, which makes it possible to find impressions that are very similar but come from distinct fingers. Simultaneously, however, these data and systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be utilized in a statistical approach to interpretation. The computation of a likelihood ratio, employing simultaneously the comparison between the mark and the print of the case, the within-variability of the suspect's finger, and the between-variability of the mark with respect to a database, can then be based on representative data. These data thus allow an evaluation that may be more detailed than one obtained by applying rules established long before the advent of these large databases, or by the specialist's experience alone. The goal of the present thesis is to evaluate likelihood ratios computed from the scores of an automated fingerprint identification system (AFIS) when the source of the tested and compared marks is known. These ratios must support the hypothesis known to be true, and they should support this hypothesis more and more strongly as information is added in the form of additional minutiae.
For the modeling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors influencing these distributions were also analyzed: the number of minutiae included in the configuration and the configuration as such, for both distributions; the finger number and the general pattern, for between-variability; and the orientation of the minutiae, for within-variability. In the present study, the only factor for which no influence was shown is the orientation of minutiae. The results show that likelihood ratios derived from the scores of an AFIS can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that two impressions were left by the same finger, when the impressions actually came from different fingers, was 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained when the two impressions come from the same finger are on average of the order of 100, 1000, and 10000 for 6, 7, and 8 minutiae respectively. These likelihood ratios can therefore be an important aid for decision making. Both positive trends linked to the addition of minutiae (a drop in the rate of likelihood ratios that could lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study.
Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found, and showed satisfactory results.
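The score-based likelihood ratio described above can be sketched as the ratio of a comparison score's density under the within-finger (same source) and between-finger (different sources) distributions. The scores and the normal models below are invented for illustration; the thesis estimated these distributions from real AFIS data, with the between-variability database alone holding some 600000 prints.

```python
# Illustrative score-based likelihood ratio with invented score samples
# and simple normal models for the two score distributions.
from statistics import NormalDist, mean, stdev

within_scores = [420, 450, 480, 500, 510, 470, 490]    # same finger
between_scores = [100, 140, 120, 160, 130, 110, 150]   # different fingers

within = NormalDist(mean(within_scores), stdev(within_scores))
between = NormalDist(mean(between_scores), stdev(between_scores))

def likelihood_ratio(score):
    """Density of the score under the same-source hypothesis divided
    by its density under the different-sources hypothesis."""
    return within.pdf(score) / between.pdf(score)

# A high comparison score should support the same-source hypothesis
# (LR > 1); a low score should support different sources (LR < 1).
print(likelihood_ratio(460) > 1)
print(likelihood_ratio(150) < 1)
```

Adding minutiae to a configuration shifts the two score distributions further apart, which is what produces the behaviour reported above: same-finger comparisons yield ever larger likelihood ratios while the rate of misleadingly large ratios from different fingers drops.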