936 results for Forensic analysis
Abstract:
This paper describes the development and application of a simple, cheap, and clean method for the quantification of furosemide in urine samples from athletes, to detect doping, using a combined spot test/diffuse reflectance spectroscopy procedure. The method is based on the complexation reaction of furosemide (5-(aminosulfonyl)-4-chloro-2-((furanylmethyl)amino)benzoic acid), dissolved in ethanol, with FeCl3 and the surfactant dodecyltrimethylammonium bromide (DTAB) in aqueous solution, yielding a colored compound on the surface of a filter paper. The reagent concentrations were optimized using a chemometric experimental design. The reflectometric measurements of the complex formed were carried out at 477 nm. The linear range obtained was 1.65×10⁻³ to 9.00×10⁻³ mol L⁻¹ of furosemide (R = 0.997), and the detection and quantification limits were 4.9×10⁻⁴ and 1.62×10⁻³ mol L⁻¹, respectively. The proposed method was successfully applied in the analysis of furosemide in spiked urine, demonstrating that it is a reliable alternative method for the detection of furosemide doping in sport. © 2012 Elsevier B.V.
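The figures of merit reported above follow the usual linear-calibration treatment. A minimal sketch, assuming made-up reflectance readings and blank noise (none of these numbers come from the paper), shows how the slope, linearity, and 3.3σ/10σ detection and quantification limits are obtained:

```python
import numpy as np

# Hypothetical calibration data within the reported linear range
# (illustrative values only, not the paper's measurements).
conc = np.array([2.0e-3, 4.0e-3, 6.0e-3, 8.0e-3])   # furosemide, mol L^-1
signal = np.array([0.11, 0.21, 0.31, 0.41])          # reflectance signal at 477 nm

slope, intercept = np.polyfit(conc, signal, 1)       # least-squares calibration line
r = np.corrcoef(conc, signal)[0, 1]                  # linearity check

sigma_blank = 0.005                                  # assumed blank standard deviation
lod = 3.3 * sigma_blank / slope                      # limit of detection
loq = 10.0 * sigma_blank / slope                     # limit of quantification

# Quantifying an unknown (e.g. a spiked urine extract) from its signal
c_unknown = (0.26 - intercept) / slope
```

The same interpolation step is all that is needed to read a spiked-urine concentration off the calibration line.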
Abstract:
As a positive response to requests from the legal world, often too distant from the scientific one, the aim is to develop a system that is technically sound and legally clear, directed at a better search for the truth. The goal is to create a versatile, easy-to-use tool to be made available to the judicial authority (A.G.) and, where appropriate, to the judicial police (P.G.), allowing investigative activity to continue very rapidly and with a considerable reduction in justice costs compared with an ordinary court-appointed expert examination (CTU). The project will focus on forensic computer analysis of digital media involved in various types of proceedings for which a CTU or an expert report would otherwise be required. The scientific trial provides for direct participation of the P.G. and the A.G. in the computer analysis, making the content of the seized media available as a virtual machine, so that it can be examined just like the original device. In this way the technical consultant (CT) becomes a mere guide for the P.G. and the A.G. in the digital forensic investigation, accompanying the judge and the parties toward a better understanding of the information requested by the court's questions. The key phases of the trial are: • repeatability of the operations performed • clear guidelines for the chain of custody from the moment the media are taken into charge • methods of data storage and transmission able to guarantee their integrity and confidentiality • reduced time and cost compared with ordinary CTU/expert reports • direct viewing by the Parties and the Judge of the contents of the analysed media, restricted to the information relevant for the purposes of justice
Abstract:
The analysis of tandem-repetitive DNA sequences is firmly established as a genetic typing method in phylogenetic studies, kinship analysis and, above all, forensic stain analysis, where multiplex PCR analysis of short tandem repeat (STR) systems brought a breakthrough in the resolution and reliable attribution of biological crime-scene stains. In the sequencing of the human genome, particular attention is paid to genetically polymorphic sequence variations, the SNPs (single nucleotide polymorphisms). Two of their properties, their frequent occurrence throughout the human genome and their comparatively low mutation rate, make them particularly well-suited tools for both forensics and population genetics. The goal of the EU project "SNPforID", from which the present work arose, was the establishment of new methods for the valid typing of SNPs in multiplex assays, with emphasis on sensitivity in the examination of stains and on statistical power in forensic analysis. For this purpose, 52 autosomal SNPs were selected and examined for their maximum individualization power. The investigation of the first 23 selected markers forms the first part of the present work; it covers the establishment of the multiplex assay and of the SNaPshot™ typing method, as well as their statistical evaluation. The results of this investigation form part of the subsequent study of the 52-SNP multiplex method, carried out in close cooperation with the partner laboratories.
Also within the project, and as the main goal of this dissertation, a microarray-based single-base extension assay on glass slides was established and evaluated. Starting from a limited amount of DNA, the possibility of simultaneously hybridizing as many SNP systems as possible was investigated. The SNP markers used here were selected on the basis of the preparatory work successfully carried out for the establishment of the 52-SNP multiplex. Among the many methods for genotyping biallelic markers, this assay stands out for its parallelism and the simplicity of its experimental approach, offering considerable savings in time and cost. In the present work, the "array of arrays" principle was used to type twelve DNA samples on one glass slide at the same time under uniform experimental conditions. On the basis of a total of 1419 typed alleles from 33 markers, the validation was completed with a typing success rate of 86.75%. In addition, a series of boundary conditions relating to probe and primer design, hybridization conditions, and physical parameters of the laser-induced fluorescence measurement of the signals were tested and optimized.
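The individualization power of such a SNP panel is conventionally summarized by the random match probability, the product over loci of the genotype frequencies of an observed profile. A small sketch under Hardy-Weinberg assumptions (the allele frequencies below are simulated, not the project's values) illustrates why 52 well-chosen biallelic markers are forensically useful:

```python
import numpy as np

rng = np.random.default_rng(0)
n_snps = 52
p = rng.uniform(0.3, 0.7, n_snps)          # simulated allele frequencies per SNP

# Hardy-Weinberg genotype frequencies per marker: rows = AA, Aa, aa
geno = np.stack([p**2, 2 * p * (1 - p), (1 - p)**2])

# Draw one profile and compute its random match probability
profile = np.array([rng.choice(3, p=geno[:, i]) for i in range(n_snps)])
rmp = float(np.prod(geno[profile, np.arange(n_snps)]))
```

With balanced allele frequencies, each locus contributes a factor of at most about 0.5, so 52 independent SNPs push the match probability below roughly 10⁻¹⁶, which is the kind of statistical power the project targeted.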
Abstract:
Digital evidence requires the same precautions as any other scientific examination. An overview is given of the methodological and practical aspects of digital forensics in light of the recent ISO/IEC 27037:2012 standard on the handling of digital exhibits in the phases of identification, collection, acquisition, and preservation of digital data. These methodologies adhere scrupulously to the requirements of integrity and authenticity set out by the rules on digital forensics, in particular Law 48/2008 ratifying the Budapest Convention on Cybercrime. With regard to the offence of child pornography, a review of EU and national legislation is offered, with emphasis on the aspects relevant to forensic analysis. Since file sharing on peer-to-peer networks is the channel on which the exchange of illicit material is most concentrated, an overview of the most widespread protocols and systems is provided, with emphasis on the eDonkey network and the eMule software, which are widely used in Italy. The problems encountered in the investigation and repression of the phenomenon, which fall to the police forces, are touched upon, before concentrating on the main contribution, the forensic analysis of computer systems seized from persons under investigation (or prosecution) for child pornography offences: the design and implementation of eMuleForensic makes it possible to analyse, extremely precisely and rapidly, the events that occur when the eMule file-sharing software is used; the software is available both online at http://www.emuleforensic.com and as a tool within the DEFT forensic distribution. Finally, an operational protocol is proposed for the forensic analysis of computer systems involved in child pornography investigations.
Abstract:
The aim of this research is to determine whether the forensic analysis of traffic forecast studies for tolled highways can bring to light the reasons behind their lack of accuracy. The methodology is empirical, based on an ex-post facto comparison of the traffic forecasts contained in the preliminary designs of the Radial 3 and Radial 5 tolled highways with the traffic actually observed. After an introductory chapter presenting the main features of tolled highways, a broad review of the literature on the accuracy of traffic forecasts is presented, from both a global and a Spanish perspective. From this review, a list of the main causes behind systematic inaccuracy is drawn up, together with measures to improve future traffic forecasting exercises. The core of the research focuses on the ratios of actual to forecast traffic for the Radial 3 and Radial 5 tolled highways on the outskirts of Madrid. The methodology, variables, and hypotheses used in the traffic studies are analysed critically. In a further step, the trip assignment stage is scrutinised to quantify, against real traffic data for the year 2006, the influence on forecast accuracy of the input variables on the one hand and of the assignment model (diversion curve) itself on the other. Finally, a series of limitations is identified in the model used to assign traffic between alternative routes in urban settings. Its underlying assumptions, the rational agent and expected utility maximisation theories, are questioned from the perspective of decision theory under risk as formulated by Kahneman and Tversky (Prospect Theory). To overcome these limitations, a new semi-empirical diversion curve is proposed that links the proportion of traffic using the tolled highway to the average speed on the toll-free alternative highway.
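A diversion curve of the kind described can be sketched as an S-shaped function of the speed on the free alternative. The parameters below are arbitrary placeholders for illustration, not the thesis's calibrated values:

```python
import math

def toll_share(v_free_kmh, a=-6.0, b=0.08):
    """Illustrative S-shaped diversion curve: proportion of traffic
    choosing the tolled highway as a function of the average speed
    (km/h) on the toll-free alternative. a and b are made-up
    placeholder parameters, not calibrated values."""
    return 1.0 / (1.0 + math.exp(a + b * v_free_kmh))

# The faster the free alternative flows, the smaller the tolled share:
share_congested = toll_share(60.0)    # congested free road
share_free_flow = toll_share(100.0)   # free-flowing free road
```

The design choice captured here is the core of the proposal: the explanatory variable is the observable average speed on the free road, rather than a modelled generalised-cost difference under rational-agent assumptions.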
Abstract:
A new methodology is proposed to produce subsidence activity maps based on the geostatistical analysis of persistent scatterer interferometry (PSI) data. PSI displacement measurements are interpolated by conditional Sequential Gaussian Simulation (SGS) to calculate multiple equiprobable realizations of subsidence. The result is a series of interpolated subsidence values, with an estimate of the spatial variability and a confidence level for the interpolation. These maps complement the PSI displacement map, improving the identification of wide subsiding areas at a regional scale. At a local scale, they can be used to identify buildings susceptible to subsidence-related damage. To do so, the maximum differential settlement and the maximum angular distortion must be calculated for each building in the study area. Based on these PSI-derived parameters, buildings in which the serviceability limit state has been exceeded, and where in situ forensic analysis should be carried out, can be identified automatically. The methodology has been tested in the city of Orihuela (SE Spain) for the study of historical buildings damaged during the last two decades by subsidence due to aquifer overexploitation. The qualitative evaluation of the results for buildings where damage has been reported shows a success rate of 100%.
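The building-level screening step reduces to simple geometry over the PSI-derived settlement points on each footprint. A hedged sketch (the helper below is illustrative, not the authors' implementation; the 1/500 limiting distortion is a commonly used serviceability threshold, not necessarily the one adopted in the paper):

```python
import numpy as np

def building_distortion(points):
    """points: (x, y, settlement) tuples for the settlement estimates over
    one building footprint (units: m). Returns the maximum differential
    settlement and the maximum angular distortion over all point pairs."""
    pts = np.asarray(points, dtype=float)
    max_diff = 0.0
    max_beta = 0.0
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            L = np.hypot(*(pts[i, :2] - pts[j, :2]))   # horizontal distance
            d = abs(pts[i, 2] - pts[j, 2])             # differential settlement
            max_diff = max(max_diff, d)
            if L > 0:
                max_beta = max(max_beta, d / L)
    return max_diff, max_beta

# Screening one hypothetical building against a 1/500 limiting distortion
pts = [(0.0, 0.0, 0.010), (10.0, 0.0, 0.040), (10.0, 8.0, 0.035)]
dmax, beta = building_distortion(pts)
exceeded = beta > 1 / 500   # flag for in situ forensic inspection
```

Buildings flagged this way are the ones the methodology routes to in situ forensic analysis.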
Abstract:
The Santas Justa and Rufina Gothic church (fourteenth century) has suffered several physical, mechanical, chemical, and biochemical pathologies throughout its history: rock alveolization, efflorescence, biological activity, and capillary rise of groundwater. During the last two decades, however, a new phenomenon has seriously affected the church: ground subsidence caused by aquifer overexploitation. Subsidence affects the whole Vega Baja of the Segura River basin and consists of gradual sinking of the ground surface caused by soil consolidation due to a decrease in pore pressure. The phenomenon has been studied by differential synthetic aperture radar interferometry techniques, which show settlements of up to 100 mm across Orihuela city for the 1993–2009 period. Although no differential synthetic aperture radar interferometry information is available for the church itself, due to loss of interferometric coherence, the spatial analysis of nearby deformation combined with fieldwork has advanced the current understanding of the mechanisms affecting the Santas Justa and Rufina church. These results show both the potential and the limitations of this remote sensing technique as a complementary tool for the forensic analysis of building structures.
Abstract:
We have recently proposed the framework of independent blind source separation as an advantageous approach to steganography. Among the several characteristics noted was a sensitivity of message reconstruction to small perturbations in the sources, a characteristic not shared by most other approaches to steganography. In this paper we discuss how this sensitivity relates to the joint diagonalisation inside the independent component approach and to the reliance on exact knowledge of the secret information, and how it can be used as an additional, inherent security mechanism against malicious attempts to discover the hidden messages. The paper therefore provides an enhanced mechanism that can be used for e-document forensic analysis and applied to digital data media of different dimensionality. We use a low-dimensional example of biomedical time series, as might occur in an electronic patient health record, where protection of private patient information is paramount.
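The key-sensitivity property can already be seen in a toy linear mixing model (a sketch of the general principle with numpy, not the authors' actual embedding scheme): the exact unmixing matrix recovers the hidden sequence, while a slightly perturbed one leaks the cover signals into the estimate and destroys the message.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
cover = np.vstack([np.sin(np.linspace(0.0, 60.0, n)),   # cover "biomedical-like" signal
                   rng.standard_normal(n)])             # second cover source
secret = np.sign(rng.standard_normal(n)) * 0.1          # hidden +/-0.1 message sequence
S = np.vstack([cover, secret])

A = rng.standard_normal((3, 3))   # secret mixing matrix: this is the key
X = A @ S                         # publicly visible mixtures

# Exact knowledge of the key recovers the message essentially perfectly...
rec_good = np.linalg.inv(A) @ X
err_good = np.abs(rec_good[2] - secret).max()

# ...while a small perturbation of the key does not.
A_bad = A + 0.05 * rng.standard_normal((3, 3))
rec_bad = np.linalg.inv(A_bad) @ X
err_bad = np.abs(rec_bad[2] - secret).max()
```

This is the security mechanism in miniature: without exact knowledge of the secret information, reconstruction fails, which is precisely what makes the approach attractive for forensic protection of hidden marks.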
Abstract:
This paper presents the digital imaging results of a collaborative research project working toward the generation of an on-line interactive digital image database of signs from ancient cuneiform tablets. An important aim of the project is the application of forensic analysis to the cuneiform symbols to identify scribal hands. Cuneiform tablets are among the earliest records of written communication and could be considered one of the original information technologies: an accessible, portable, and robust medium for communication across distance and time. The earliest examples are up to 5,000 years old, and the writing technique remained in use for some 3,000 years. Unfortunately, only a small fraction of these tablets can be made available for display in museums, and much important academic work has yet to be performed on the very large numbers of tablets to which there is necessarily restricted access. The paper describes the challenges encountered in the 2D image capture of a sample set of tablets held in the British Museum, explains the motivation for attempting 3D imaging, and reports the results of initial experiments scanning the smaller, more densely inscribed cuneiform tablets. We also discuss the tractability of 3D digital capture, representation, and manipulation, and investigate the requirements for scalable data compression and transmission methods. Additional information can be found on the project website: www.cuneiform.net
Abstract:
The objective of this research is to develop nanoscale ultrasensitive transducers for the detection of biological species at the molecular level, using carbon nanotubes as nanoelectrodes. Rapid detection of ultra-low concentrations, or even of single DNA molecules, is essential for medical diagnosis and treatment, pharmaceutical applications, gene sequencing, and forensic analysis. Here, the use of functionalized single-walled carbon nanotubes (SWNT) as a nanoscale platform for rapid detection of single DNA molecules is demonstrated. The detection principle is based on obtaining an electrical signal from a single amine-terminated DNA molecule covalently bridged between the two ends of an SWNT separated by a nanoscale gap. The synthesis, fabrication, and chemical functionalization of the nanoelectrodes, as well as the DNA attachment, were optimized to allow reliable electrical characterization of these molecules. Using this detection system, a fundamental study of charge transport in DNA molecules of both genomic and non-genomic sequences was performed. We measured an electrical signal of about 30 pA through a hybridized DNA molecule, 80 base pairs in length, encoding a portion of the sequence of the H5N1 gene of avian influenza A virus. Owing to the dynamic nature of DNA molecules, the local environment (ion concentration, pH, and temperature) significantly influences their physical properties. We observed a decrease in DNA conductance of about 33% under high-vacuum conditions. The effect of counterion variation was analyzed by changing the buffer from sodium acetate to tris(hydroxymethyl)aminomethane, which resulted in a two-orders-of-magnitude increase in the conductivity of the DNA. The fabrication of large arrays of identical SWNT nanoelectrodes was achieved by using ultralong SWNTs. Using these nanoelectrode arrays, we investigated the sequence-dependent charge transport in DNA.
A systematic study performed on a PolyG-PolyC sequence with a varying number of intervening PolyA-PolyT pairs showed a decrease in electrical signal from 180 pA (PolyG-PolyC) to 30 pA with an increasing number of PolyA-PolyT pairs. This work also led to the development of ultrasensitive nanoelectrodes based on enzyme-functionalized, vertically aligned, high-density multiwalled CNTs for the electrochemical detection of cholesterol. These nanoelectrodes selectively detected cholesterol in the presence of common interferents found in human blood.
Abstract:
Recent advances in the massively parallel computational abilities of graphical processing units (GPUs) have increased their use for general-purpose computation, as companies look to take advantage of big data processing techniques. This has given rise to the potential for malicious software targeting GPUs, which is of interest to forensic investigators examining the operation of software. The ability to carry out reverse engineering of software is of great importance within the security and forensics fields, particularly when investigating malicious software or carrying out forensic analysis following a successful security breach. Due to the complexity of the Nvidia CUDA (Compute Unified Device Architecture) framework, it is not clear how best to approach the reverse engineering of a piece of CUDA software. We carry out a review of the different binary output formats which may be encountered from the CUDA compiler, and their implications for reverse engineering. We then demonstrate the process of carrying out disassembly of an example CUDA application, to establish the various techniques available to forensic investigators carrying out black-box disassembly and reverse engineering of CUDA binaries. We show that the Nvidia compiler, using default settings, leaks useful information. Finally, we demonstrate techniques to better protect intellectual property in CUDA algorithm implementations from reverse engineering.
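The information-leak finding can be illustrated with a toy scan for embedded PTX: by default, nvcc fat binaries carry the PTX intermediate representation alongside the native SASS, so even a naive string scan can recover readable IR and mangled kernel names. The blob below is synthetic and the regexes are a heuristic sketch, not a full fatbinary parser (a real analysis would use Nvidia's cuobjdump):

```python
import re

def find_ptx_blobs(data: bytes):
    """Scan a binary blob for embedded PTX text. PTX modules contain a
    '.version' directive; grab everything up to the next NUL byte
    (heuristic, illustrative only)."""
    return [m.group(0) for m in re.finditer(rb"\.version[^\x00]*", data)]

def find_kernel_names(ptx: bytes):
    """Extract kernel entry names from a PTX module."""
    return re.findall(rb"\.visible\s+\.entry\s+(\w+)", ptx)

# Synthetic stand-in for the bytes of a CUDA executable
blob = (b"\x7fELF...host-code-junk...\x00"
        b".version 7.0\n.target sm_52\n.address_size 64\n"
        b".visible .entry _Z6squarePfi(...)\n\x00more-junk")
ptx_blobs = find_ptx_blobs(blob)
names = find_kernel_names(ptx_blobs[0])
```

Mangled names such as the hypothetical `_Z6squarePfi` demangle to full C++ signatures, which is exactly the kind of leaked information the paper refers to.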
Abstract:
Security Onion is a Network Security Monitoring (NSM) platform that provides multiple Intrusion Detection Systems (IDS), including host-based IDS (HIDS) and network-based IDS (NIDS). Many types of data can be acquired using Security Onion for analysis, including data related to hosts, networks, sessions, assets, alerts, and protocols. Security Onion can be implemented as a standalone deployment, with server and sensor included, or with a master server and multiple sensors, allowing the system to be scaled as required. Many interfaces and tools are available for management of the system and analysis of data, such as Sguil, Snorby, Squert, and Enterprise Log Search and Archive (ELSA). These interfaces can be used for analysis of alerts and captured events, which can then be exported for analysis in Network Forensic Analysis Tools (NFAT) such as NetworkMiner, CapME, or Xplico. The platform also provides various methods of management, such as Secure Shell (SSH) access to the server and sensors and remote access via a Web client. All of this, together with the ability to replay and analyse sample malicious traffic, makes Security Onion a suitable low-cost alternative for network security monitoring. In this paper, we present a feature and functionality review of Security Onion in terms of types of data, configuration, interfaces, tools, and system management.
Abstract:
Forensic speaker comparison examinations have complex characteristics and demand a long time for manual analysis. A method for automatic recognition of vowels, providing feature extraction for acoustic analysis, is proposed as a support tool for these examinations. The proposal is based on formant measurements by LPC (Linear Predictive Coding), selected according to fundamental frequency detection, zero-crossing rate, bandwidth, and continuity, with clustering performed by the k-means method. Experiments using samples from three different databases have shown promising results, in which the regions corresponding to five of the Brazilian Portuguese vowels were successfully located, providing visualization of a speaker's vocal tract behavior as well as detection of segments corresponding to target vowels.
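A minimal sketch of the formant-plus-clustering pipeline just described, using only numpy (Levinson-Durbin LPC, root-angle formant estimates, and a tiny k-means). The selection by fundamental frequency, zero-crossing rate, bandwidth, and continuity is omitted, and none of this is the authors' implementation:

```python
import numpy as np

def lpc_coeffs(x, order):
    """LPC by the autocorrelation method (Levinson-Durbin recursion)."""
    r = np.correlate(x, x, mode="full")[len(x) - 1 : len(x) + order]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err
        a[1:i + 1] += k * a[i - 1::-1][:i]   # a[j] += k * a[i-j]; sets a[i] = k
        err *= 1.0 - k * k
    return a

def formants(frame, fs, order=10):
    """Formant estimates (Hz) from the angles of the LPC polynomial roots."""
    a = lpc_coeffs(frame * np.hamming(len(frame)), order)
    roots = np.roots(a)
    roots = roots[np.imag(roots) > 0.01]        # keep one root per conjugate pair
    freqs = np.angle(roots) * fs / (2 * np.pi)
    return np.sort(freqs[freqs > 90.0])

def kmeans(X, k, iters=25, seed=0):
    """Tiny k-means for grouping (F1, F2) formant pairs into vowel regions."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Synthetic vowel-like frame with two resonances at 700 Hz and 1200 Hz
fs = 8000
t = np.arange(400) / fs
frame = np.sin(2 * np.pi * 700 * t) + 0.8 * np.sin(2 * np.pi * 1200 * t)
f = formants(frame, fs, order=4)
```

In the full method, many such (F1, F2) measurements per speaker would be fed to `kmeans`, and the resulting cluster centers visualize the vowel regions of the speaker's vocal tract.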
Abstract:
This article addresses topics related to forensic analysis applied to mobile devices, together with the proposal and testing of a methodology to support these activities effectively. The objectives to be pursued in such a study are described, and the search for evidence stored on mobile devices is framed within a scenario in which a crime has been committed. Emphasis is placed on the evolution and the multiplicity of uses of today's devices. Finally, the need for standards that guarantee the integrity of the evidence found is discussed; to this end, the methodology developed is described, which allows the forensic process on mobile devices to be carried out properly, with the aspiration that it become a standard for this type of investigation.
Abstract:
Effective management of invasive fishes depends on the availability of updated information about their distribution and spatial dispersion. Forensic analysis was performed using online and published data on the European catfish, Silurus glanis L., a recent invader in the Tagus catchment (Iberian Peninsula). Eighty records were obtained mainly from anglers’ fora and blogs, and more recently from www.youtube.com. Since the first record in 1998, S. glanis expanded its geographic range by 700 km of river network, occurring mainly in reservoirs and in high-order reaches. Human-mediated and natural dispersal events were identified, with the former occurring during the first years of invasion and involving movements of >50 km. Downstream dispersal directionality was predominant. The analysis of online data from anglers was found to provide useful information on the distribution and dispersal patterns of this non-native fish, and is potentially applicable as a preliminary, exploratory assessment tool for other non-native fishes.