69 results for FINGERPRINT VERIFICATION


Relevance:

20.00%

Publisher:

Abstract:

The purpose of this research is to assess the vulnerabilities of a high-resolution fingerprint sensor when confronted with fake fingerprints. The study did not focus on the decision outcome of the biometric device, but essentially on the scores obtained from the comparison between a query (genuine or fake) and a template using an AFIS system. To do this, fake fingerprints of 12 subjects were produced with and without their cooperation. These fake fingerprints were used alongside real fingers. The study led to three major observations. First, genuine fingerprints produced higher scores than fake fingers (indicating closer proximity), and this tendency holds for each subject considered separately. Second, scores alone are nevertheless insufficient to differentiate fake from genuine samples, given the variation due to the donors themselves; this explains why fingerprint readers without vitality detection can be fooled. Third, production methods and subjects greatly influence the scores obtained for fake fingerprints.
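The first two observations can be sketched with made-up AFIS scores (none of these numbers come from the study): genuine scores average higher than fake ones, yet the two ranges overlap, so no single threshold cleanly separates them.

```python
# Hypothetical AFIS scores (not from the study) illustrating the first two
# observations: genuine scores run higher on average, yet the ranges overlap,
# so no single score threshold cleanly separates fake from genuine.
genuine = [412, 388, 455, 397, 430]  # scores for real fingers
fake = [301, 352, 289, 399, 340]     # scores for fake fingers

mean_genuine = sum(genuine) / len(genuine)
mean_fake = sum(fake) / len(fake)
overlap = min(genuine) < max(fake)  # True -> a threshold alone cannot separate

print(mean_genuine > mean_fake, overlap)
```

With these invented values both checks print `True`: the means differ, but the score ranges still overlap.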

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a statistical model for quantifying the weight of fingerprint evidence. In contrast to previous (generative and score-based) models, our model estimates the probability distributions of the spatial relationships, directions and types of minutiae observed on fingerprints for any given fingermark. It relies on an AFIS algorithm provided by 3M Cogent and on a dataset of more than 4,000,000 fingerprints to represent a sample from a relevant population of potential sources. The performance of the model was tested using several hundred minutiae configurations observed on a set of 565 fingermarks. In particular, we investigated the effects of various sub-populations of fingers (i.e., finger number, finger general pattern) on the expected evidential value of our test configurations. The results indicate that the spatial relationship between minutiae carries more evidential weight than their type or direction. They also indicate that the AFIS component of the model directly enables us to assign weight to fingerprint evidence without the additional layer of complex statistical modeling involved in estimating the probability distributions of fingerprint features. In fact, the AFIS component appears more sensitive to the sub-population effects than the other components of the model. Overall, the data generated during this research project support the idea that fingerprint evidence is a valuable forensic tool for the identification of individuals.
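For orientation, "weight of evidence" is commonly expressed as a likelihood ratio. The sketch below is a generic *score-based* formulation, the kind of model this paper contrasts with its own feature-based approach; both distributions are hypothetical, not fitted to any data.

```python
from statistics import NormalDist

# Generic score-based likelihood-ratio sketch. NB: this is the kind of model
# the paper argues against in favour of a feature-based approach; the two
# distributions below are hypothetical, not fitted to any real scores.
same_source = NormalDist(mu=420, sigma=40)  # scores when query and template match
diff_source = NormalDist(mu=180, sigma=50)  # scores for non-matching sources

def likelihood_ratio(score):
    """Weight of evidence: same-source density over different-source density."""
    return same_source.pdf(score) / diff_source.pdf(score)

print(likelihood_ratio(400) > 1, likelihood_ratio(150) < 1)
```

A high score yields LR > 1 (supporting same source); a low score yields LR < 1 (supporting different sources).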

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses the major developments in the area of fingerprint identification that followed the publication of the 2009 report of the National Research Council (NRC, of the US National Academies of Sciences) entitled Strengthening Forensic Science in the United States: A Path Forward. The report portrayed an image of a field of expertise used for decades without the necessary scientific, research-based underpinning. The advances since the report and the needs in selected areas of fingerprinting are detailed, including the measurement of the accuracy, reliability, repeatability and reproducibility of the conclusions offered by fingerprint experts. The paper also pays attention to the development of statistical models allowing assessment of fingerprint comparisons. As a corollary of these developments, the next challenge is to reconcile a traditional practice dominated by deterministic conclusions with the probabilistic logic of any statistical model. There is a call for greater candour, and fingerprint experts will need to communicate differently on the strengths and limitations of their findings. Their testimony will have to go beyond the blunt assertion of the uniqueness of fingerprints or opinions delivered ipse dixit.

Relevance:

20.00%

Publisher:

Abstract:

The interpretation of fingerprint evidence depends on the judgments of fingerprint examiners. This study assessed the accuracy of the judgments fingerprint examiners make during the Analysis, Comparison, and Evaluation (ACE) process. Each examiner was given five marks for analysis, comparison, and evaluation. We compared the experts' judgments against the ground truth and used an annotation platform to evaluate how Chinese fingerprint examiners document their comparisons during the identification process. The results showed that examiners differed both in the accuracy of their judgments and in the mechanisms they used to reach them.
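Scoring an examiner against ground truth reduces to a simple comparison per mark. A toy sketch (all conclusions invented, not taken from the study):

```python
# Toy scoring of one examiner's conclusions against ground truth for five
# marks (all data invented); the study computes this kind of per-examiner
# accuracy across the ACE process.
ground_truth = ["same", "different", "same", "same", "different"]
examiner = ["same", "different", "different", "same", "different"]

correct = sum(g == e for g, e in zip(ground_truth, examiner))
accuracy = correct / len(ground_truth)
print(accuracy)  # 0.8 -> one erroneous exclusion out of five marks
```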

Relevance:

10.00%

Publisher:

Abstract:

Intraoperative imaging, in particular intraoperative MRI, is a developing area in neurosurgery and its role is currently being evaluated. Its role in epilepsy surgery has not yet been defined and its use has been limited. In our experience with a compact, mobile, low-field intraoperative MRI system, a few epilepsy surgeries have been performed with this technique. As the integration of imaging and functional data plays an important role in the planning of epilepsy surgery, intraoperative verification of the surgical result may be highly valuable. Therefore, teams that have access to intraoperative MRI should be encouraged to use this technique prospectively to evaluate its current relevance in epilepsy surgery.

Relevance:

10.00%

Publisher:

Abstract:

Nanoparticles: a new tool to deter crime? The detection of fingermarks at a crime scene or on evidence related to a criminal case is one of the main tasks of investigators. Fingerprints, due to their uniqueness and invariability over time, remain a key element of the identification process (whether of suspects or victims). The main difficulty is that, most of the time, fingermarks are not visible to the naked eye because of their chemical composition and the small amount of material left at the scene. They are said to be latent, and their detection requires the application of specific (optical or chemical) techniques. Although numerous efficient techniques currently exist, there is a continuing quest to develop new techniques or reagents with enhanced sensitivity towards secretions and increased efficiency. This article outlines some current research based on the use of functionalized nanoparticles to detect latent fingermarks.

Relevance:

10.00%

Publisher:

Abstract:

Determining the time since deposition of fingermarks may prove necessary in order to assess their relevance to criminal investigations. The crucial factor is the initial composition of fingermarks, because it represents the starting point of any ageing model. This study mainly aimed to characterize the initial composition of fingerprints, which shows high variability between donors (inter-variability), but also to investigate the variations among fingerprints from the same donor (intra-variability). Solutions to reduce this initial variability using squalene and cholesterol as target compounds are proposed and should be investigated further. The influence of substrates was also evaluated, and the initial amounts recovered were observed to be larger on porous than on non-porous surfaces. Preliminary ageing of fingerprints over 30 days was finally studied on a porous and a non-porous substrate to evaluate the potential for dating fingermarks. Squalene was observed to decrease at a faster rate on the non-porous substrate.
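One simple ageing model for a target compound such as squalene is first-order (exponential) decay. The sketch below uses hypothetical rate constants, chosen only to mirror the qualitative finding that loss is faster on the non-porous substrate.

```python
import math

# First-order (exponential) decay as one simple ageing model for a target
# compound such as squalene. The rate constants are hypothetical, chosen
# only to mirror the faster decrease observed on the non-porous substrate.
def remaining(initial, k, days):
    """Amount left after `days`, assuming C(t) = C0 * exp(-k * t)."""
    return initial * math.exp(-k * days)

k_porous, k_nonporous = 0.03, 0.08  # per-day rate constants (made up)
print(remaining(100, k_porous, 30))     # slower loss on the porous substrate
print(remaining(100, k_nonporous, 30))  # faster loss on the non-porous one
```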

Relevance:

10.00%

Publisher:

Abstract:

INTRODUCTION: The phase III EORTC 22033-26033/NCIC CE5 intergroup trial compares 50.4 Gy radiotherapy with up-front temozolomide in previously untreated low-grade glioma. We describe the digital EORTC individual case review (ICR) performed to evaluate compliance with protocol radiotherapy (RT). METHODS: Fifty-eight institutions were asked to submit 1-2 randomly selected cases. Digital ICR datasets were uploaded to the EORTC server and accessed by three central reviewers. Twenty-seven parameters were analysed, including volume delineation, treatment planning, organ-at-risk (OAR) dosimetry and verification. Consensus reviews were collated and summary statistics calculated. RESULTS: Fifty-seven of seventy-two requested datasets from forty-eight institutions were technically usable. Thirty-one of the 57 received a major deviation for at least one section. Relocation accuracy was according to protocol in 45 cases. Just over 30% had acceptable target volumes. OAR contours were missing in an average of 25% of cases, and up to one-third of those present were incorrectly drawn, while dosimetry was largely protocol compliant. Beam energy was acceptable in 97%, and 48 patients had per-protocol beam arrangements. CONCLUSIONS: Digital RT plan submission and review within the EORTC 22033-26033 ICR provide a solid foundation for future quality assurance procedures. Strict evaluation resulted in overall grades of minor and major deviation for 37% and 32% of cases, respectively.
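A back-of-envelope check of the reported deviation grades (the counts below are invented, chosen only to be consistent with the 57 usable datasets and the 37%/32% figures in the abstract):

```python
# Counts invented to be consistent with the abstract's 57 usable datasets
# and the reported overall grades of minor (37%) and major (32%) deviation.
usable = 57
minor, major = 21, 18

print(round(minor / usable * 100), round(major / usable * 100))  # 37 32
```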

Relevance:

10.00%

Publisher:

Abstract:

This article describes the composition of fingermark residue as a complex system of numerous compounds coming from different sources and evolving over time from the initial composition (right after deposition) to the aged composition (the evolution of the initial composition over time). This system additionally varies under numerous influence factors, grouped into five classes: donor characteristics, deposition conditions, substrate nature, environmental conditions and the applied enhancement techniques. The initial and aged compositions as well as the influence factors are thus considered in this article to provide a qualitative and quantitative review of all compounds identified in fingermark residue to date. The analytical techniques used to obtain these data are also enumerated. This review highlights the fact that, despite the numerous analytical processes already proposed and tested to elucidate fingermark composition, advanced knowledge is still missing. There is thus a real need for future research on the composition of fingermark residue, focusing particularly on quantitative measurements, ageing kinetics and the effects of influence factors. Such results are particularly important for advances in the development of fingermark enhancement and dating techniques.

Relevance:

10.00%

Publisher:

Abstract:

There is enormous interest in designing training methods to reduce cognitive decline in healthy older adults. Because it is impaired with ageing, multitasking has often been targeted and has been shown to be malleable with appropriate training. Investigating the effects of cognitive training on functional brain activation might provide critical indications of the mechanisms that underlie those positive effects, as well as provide models for selecting appropriate training methods. The few studies that have looked at brain correlates of cognitive training indicate a variable pattern and location of brain changes, a result that might relate to differences in training formats. The goal of this study was to measure the neural substrates as a function of whether divided-attention training programs induced the use of alternative processes or relied on repeated practice. Forty-eight older adults were randomly allocated to one of three training programs. In the SINGLE REPEATED training, participants practiced an alphanumeric equation task and a visual detection task, each under focused attention. In the DIVIDED FIXED training, participants practiced combining verification and detection under divided attention, with equal attention allocated to both tasks. In the DIVIDED VARIABLE training, participants completed the tasks under divided attention, but were taught to vary the attentional priority allocated to each task. Brain activation was measured with fMRI pre- and post-training while completing each task individually and the two tasks combined. The three training programs resulted in markedly different brain changes. Practice on individual tasks in the SINGLE REPEATED training resulted in reduced brain activation, whereas DIVIDED VARIABLE training resulted in larger recruitment of the right superior and middle frontal gyri, a region implicated in multitasking. The type of training is a critical factor in determining the pattern of brain activation.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Despite the continuous production of genome sequence for a number of organisms, reliable, comprehensive, and cost-effective gene prediction remains problematic. This is particularly true for genomes for which there is not a large collection of known gene sequences, such as the recently published chicken genome. We used the chicken sequence to test comparative and homology-based gene-finding methods, followed by experimental validation, as an effective genome-annotation strategy. RESULTS: We performed experimental evaluation by RT-PCR of three different computational gene finders, Ensembl, SGP2 and TWINSCAN, applied to the chicken genome. A Venn diagram was computed and each of its components was evaluated. The results showed that de novo comparative methods can identify up to about 700 chicken genes with no previous evidence of expression, and can correctly extend about 40% of homology-based predictions at the 5' end. CONCLUSIONS: De novo comparative gene prediction followed by experimental verification is effective at enhancing the annotation of newly sequenced genomes provided by standard homology-based methods.
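The Venn-diagram components reduce to set arithmetic over the three finders' predicted gene sets. A toy sketch with invented gene IDs (the real sets are genome-scale):

```python
# Toy gene IDs (invented) showing the set arithmetic behind the Venn
# diagram of the three gene finders' predictions.
ensembl = {"g1", "g2", "g3", "g5"}   # homology-based predictions
sgp2 = {"g2", "g3", "g4", "g6"}      # de novo comparative
twinscan = {"g3", "g5", "g6", "g7"}  # de novo comparative

all_three = ensembl & sgp2 & twinscan       # predicted by every finder
only_de_novo = (sgp2 | twinscan) - ensembl  # comparative-only candidates

print(sorted(all_three), sorted(only_de_novo))
```

Each such component can then be sampled for RT-PCR validation, as the authors did.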

Relevance:

10.00%

Publisher:

Abstract:

Summary: Forensic science, both as a source of and as a remedy for error potentially leading to judicial error, has been studied empirically in this research. The tools used were a comprehensive literature review, experimental tests on the influence of observational biases in fingermark comparison, and semi-structured interviews with heads of forensic science laboratories/units in Switzerland and abroad. The literature review covered, among other areas, the quality of forensic science work in general, the complex interaction between science and law, and specific propositions as to error sources not directly related to that interaction. A list of potential error sources all the way from the crime scene to the writing of the report was also established. For the empirical tests, the ACE-V (Analysis, Comparison, Evaluation, and Verification) process of fingermark comparison was selected as an area of special interest for the study of observational biases, owing to its heavy reliance on visual observation and to recent cases of misidentification. Results of the tests performed with forensic science students suggest that the decision-making stages are the most vulnerable to stimuli inducing observational biases. In the semi-structured interviews, eleven senior forensic scientists answered questions on several subjects, for example potential and existing error sources in their work, the limitations of what can be done with forensic science, and the possibilities and tools for minimising errors. Training and education to improve the quality of forensic science were discussed, together with possible solutions to minimise the risk of errors. In addition, the time for which samples of physical evidence are kept was determined. The results show considerable agreement on most subjects among the international participants. Their opinions on possible explanations for the occurrence of such problems, and on the relative weight of such errors in the three stages of crime scene, laboratory, and report writing, disagree, however, with opinions widely represented in the existing literature. Through the present research it was therefore possible to obtain a better view of the interaction between forensic science and judicial error and to propose practical recommendations to minimise its occurrence.

Relevance:

10.00%

Publisher:

Abstract:

Purpose: To evaluate the diagnostic value and image quality of CT with filtered back projection (FBP) compared with adaptive statistical iterative reconstruction (ASIR) in body stuffers with ingested cocaine-filled packets. Methods and Materials: Twenty-nine body stuffers (mean age 31.9 years, 3 women) suspected of ingesting cocaine-filled packets underwent routine-dose 64-row multidetector CT with FBP (120 kV, pitch 1.375, 100-300 mA with automatic tube current modulation (auto mA), rotation time 0.7 s, collimation 2.5 mm), secondarily reconstructed with 30% and 60% ASIR. In 13 (44.83%) of the body stuffers, cocaine-filled packets were detected, confirmed by exact analysis of the faecal content including verification of the number of packets (range 1-25). Three radiologists independently and blindly evaluated anonymised CT examinations (29 FBP-CT and 68 ASIR-CT) for the presence and number of cocaine-filled packets, indicating their confidence, and graded them for diagnostic quality, image noise, and sharpness. Sensitivity, specificity, area under the receiver operating characteristic (ROC) curve (Az) and interobserver agreement among the three radiologists were calculated for FBP-CT and ASIR-CT. Results: Increasing the percentage of ASIR significantly diminished the objective image noise (p < 0.001). Overall sensitivity and specificity for the detection of the cocaine-filled packets were 87.72% and 76.15%, respectively. The difference in ROC area Az between the reconstruction techniques was significant (p = 0.0101): 0.938 for FBP-CT, 0.916 for 30% ASIR-CT, and 0.894 for 60% ASIR-CT. Conclusion: Despite the evident image-noise reduction obtained by ASIR, the diagnostic value for detecting cocaine-filled packets decreases with the applied ASIR percentage.
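The pooled sensitivity and specificity follow directly from a reader-level confusion matrix. The counts below are invented, chosen only so the resulting figures land near the values reported in the abstract:

```python
# Pooled-reader confusion-matrix sketch; all counts are invented, chosen so
# the figures land near the reported ~87.72% sensitivity and ~76.15%
# specificity for packet detection.
tp, fn = 50, 7   # reads on examinations that truly contained packets
tn, fp = 33, 10  # reads on examinations that did not

sensitivity = tp / (tp + fn)  # true positives over all packet-positive reads
specificity = tn / (tn + fp)  # true negatives over all packet-negative reads
print(round(sensitivity * 100, 2), round(specificity * 100, 2))
```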

Relevance:

10.00%

Publisher: