945 results for digital forensic tool testing
Abstract:
Cybercrime and related malicious activity in our increasingly digital world have become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with challenges, the most prominent of which are the increasing volumes of data and the diversity of digital evidence sources appearing in investigations. Mobile devices and cloud infrastructures are an interesting case, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. They also embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not helped by the fact that digital forensics today still involves manual, time-consuming tasks in identifying evidence, performing evidence acquisition and correlating multiple diverse sources of evidence during the analysis phase. Furthermore, industry-standard tools are largely evidence-oriented, have limited support for evidence integration and only automate certain precursory tasks, such as indexing and text searching. In this study, efficiency, in the form of reduced time and human labour, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture is designed for an automated system that performs digital forensics in highly networked mobile and cloud environments. Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements are proposed to further automate it by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves machine-to-machine communication, supporting several digital investigation processes enabled by the architecture through harnessing the properties of various peer-to-peer overlays. Remote evidence acquisition improves the efficiency (time and effort involved) of digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition and that a multi-TCP-connection paradigm is required. The automated integration, correlation and reasoning over multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing the need for time-consuming manual correlation.
Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication, thereby enabling automation and reducing the need for manual human intervention.
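The abstract's conclusion that a single TCP connection does not scale for remote evidence acquisition can be illustrated with a small sketch of the multi-connection alternative. The sketch below is not the LEIA implementation; the host address, port, ranged-read request format (a struct-packed offset and length) and chunk size are all hypothetical assumptions, and the code simply fetches chunks of a remote evidence image over several parallel TCP connections and hashes the result for integrity.

```python
# Minimal sketch (not the LEIA implementation): fetching a remote evidence
# image over several parallel TCP connections instead of a single stream.
# Host, port, the ranged-read wire format and the chunk size are assumptions.
import hashlib
import socket
import struct
from concurrent.futures import ThreadPoolExecutor

HOST, PORT = "192.0.2.10", 9000        # hypothetical acquisition agent
IMAGE_SIZE = 64 * 1024 * 1024          # size reported by the agent (example)
CHUNK = 4 * 1024 * 1024                # bytes fetched per connection

def fetch_range(offset: int, length: int) -> bytes:
    """Open a dedicated TCP connection and read one byte range."""
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(struct.pack("!QQ", offset, length))   # assumed request format
        buf = bytearray()
        while len(buf) < length:
            data = sock.recv(min(65536, length - len(buf)))
            if not data:
                raise ConnectionError("agent closed the stream early")
            buf.extend(data)
        return bytes(buf)

def acquire_image(workers: int = 8) -> bytes:
    """Fetch all ranges concurrently, reassemble them and record a hash."""
    ranges = [(off, min(CHUNK, IMAGE_SIZE - off))
              for off in range(0, IMAGE_SIZE, CHUNK)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        chunks = list(pool.map(lambda r: fetch_range(*r), ranges))
    image = b"".join(chunks)
    print("sha256:", hashlib.sha256(image).hexdigest())    # integrity record
    return image

if __name__ == "__main__":
    acquire_image()
```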
Abstract:
Introduced in Brazil in the 1950s, TV, the medium with the greatest reach and ideological power over its audiences, is undergoing its second great transformation: it is ceasing to be analogue and becoming digital. This brings new possibilities of reception and a possible convergence of media. In this context, the objective of this study was to analyze the roll-out of the Brazilian Digital Terrestrial Television System (SBTVD-t) through the online clipping service of the National Forum for the Democratization of Communication (FNDC). The analysis of this material focused on determining whether or not the FNDC was biased towards coverage of the technical aspects of the new technology, to the detriment of its social potential. To this end, a quantitative study was chosen, and the information and data collected led to the finding that the FNDC is of limited effectiveness as a critical-appraisal apparatus for the mainstream media, besides failing to meet some of its own objectives by reproducing the discourses and ideologies of other outlets. Likewise, it was also found that the Federal Government departed from the objectives listed in the presidential decrees that establish and regulate the implementation of the SBTVD.
Abstract:
Our paper presents the work of the Cuneiform Digital Forensic Project (CDFP), an interdisciplinary project at The University of Birmingham concerned with the development of a multimedia database to support scholarly research into cuneiform, the wedge-shaped writing imprinted onto clay tablets and indeed the earliest real form of writing. We describe the evolutionary design process and the dynamic research and development cycles associated with the database. Unlike traditional publications, the electronic publication of resources offers the possibility of almost continuous revision, with the integration and support of new media and interfaces. However, if on-line resources are to win the favor and confidence of their respective communities, there must be a clear distinction between published, maintainable resources and developmental content. Published material should, ideally, be supported via standard web-browser interfaces with fully integrated tools so that users receive a reliable, homogeneous and intuitive flow of information and media relevant to their needs. We discuss the inherent dynamics of the design and publication of our on-line resource, starting with the basic design and maintenance aspects of the electronic database, which includes photographic instances of cuneiform signs, and show how the continuous review process identifies areas for further research and development, for example the “sign processor” graphical search tool and three-dimensional content, the results of which then feed back into the maintained resource.
Abstract:
Acknowledgements: This research has been supported by the Leverhulme Trust International Network Grant IN-2012-140. The processing and collection of ground-penetrating data at Forgefonna were part of Elend Førre's master's project, completed in 2009 at the Department of Geography, University of Bergen. We also acknowledge Dr Andreas Bauder for providing the subglacial topography data for Griessgletscher and Simone Tarquini for granting access to the high-resolution TIN of Italy, a cut of which is provided to the reader to practice the tools (see Appendix). Referees Dr. Iestyn Barr, Dr. Jeremy Ely and Dr. Marc Oliva are thanked for their constructive comments and tool testing, which significantly improved the final output.
Abstract:
Digital forensics as a field has progressed alongside technological advancements over the years, just as digital devices have become more robust and sophisticated. However, criminals and attackers have devised means of exploiting the vulnerabilities or sophistication of these devices to carry out malicious activities in unprecedented ways, in the belief that electronic crimes can be committed without identities being revealed or trails being established. Several applications of artificial intelligence (AI) have demonstrated interesting and promising solutions to seemingly intractable societal challenges. This thesis aims to advance the application of AI techniques in digital forensic investigation. Our approach involves experimenting with a complex case scenario in which suspects corresponded by e-mail and suspiciously deleted certain communications, presumably to conceal evidence. The purpose is to demonstrate the efficacy of Artificial Neural Networks (ANN) in learning and detecting communication patterns over time, and then predicting the possibility of missing communication(s) along with potential topics of discussion. To do this, we developed a novel approach and included other existing models for comparison. The accuracy of our results is evaluated, and their performance on previously unseen data is measured. Second, we proposed conceptualizing the term “Digital Forensics AI” (DFAI) to formalize the application of AI in digital forensics, highlighting the instruments that facilitate the best evidential outcomes and presentation mechanisms that are adaptable to the probabilistic output of AI models. Finally, we strengthened the case for applying AI in digital forensics by recommending methodologies and approaches for bridging trust gaps through the development of interpretable models that facilitate the admissibility of digital evidence in legal proceedings.
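As a rough illustration of the kind of pattern learning the abstract describes, and not the thesis' actual model, the sketch below trains a small feed-forward network on weekly e-mail counts between two parties and flags weeks whose observed traffic falls well below the prediction as candidates for deleted communications. The synthetic traffic, window length and residual threshold are assumptions made for the example.

```python
# Illustrative sketch only (not the thesis' model): a small feed-forward
# network learns weekly e-mail traffic between two suspects and flags weeks
# where the observed count is far below what the model expects.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
weeks = 120
traffic = rng.poisson(lam=5, size=weeks).astype(float)   # messages per week (synthetic)

LAG = 4  # predict a week's volume from the previous four weeks
X = np.array([traffic[i:i + LAG] for i in range(weeks - LAG)])
y = traffic[LAG:]

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X[:80], y[:80])                                 # train on the early weeks

pred = model.predict(X[80:])
residual = y[80:] - pred
# Weeks whose observed traffic falls well below the prediction are candidates
# for "missing" (possibly deleted) communications.
suspicious = np.where(residual < -3.0)[0] + 80 + LAG
print("weeks flagged as possibly missing messages:", suspicious)
```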
Abstract:
Stable carbon and nitrogen isotope signatures (delta C-13 and delta N-15) of Cannabis sativa were assessed for their usefulness to trace seized Cannabis leaves to the country of origin and to source crops by determining how isotope signatures relate to plant growth conditions. The isotopic composition of Cannabis examined here covered nearly the entire range of values reported for terrestrial C-3 plants. The delta C-13 values of Cannabis from Australia, Papua New Guinea and Thailand ranged from -36 to -25 parts per thousand, and delta N-15 values ranged from -1.0 to 15.8 parts per thousand. The stable isotope content did not allow differentiation between Cannabis originating from the three countries, but delta C-13 values of plantation-grown Cannabis differed between well-watered plants (average delta C-13 of -30.0 parts per thousand) and plants that had received little irrigation (average delta C-13 of -26.4 parts per thousand). Cannabis grown under controlled conditions had delta C-13 values of -32.6 and -30.6 parts per thousand with high and low water supply, respectively. These results indicate that water availability determines leaf C-13 in plants grown under similar conditions of light, temperature and air humidity. The delta C-13 values also distinguished between indoor- and outdoor-grown Cannabis; indoor-grown plants had overall more negative delta C-13 values (average -31.8 parts per thousand) than outdoor-grown plants (average -27.9 parts per thousand). Contributing to the strong C-13-depletion of indoor-grown plants may be high relative humidity, poor ventilation and recycling of C-13-depleted respired CO2. Mineral fertilizers had mostly lower delta N-15 values (-0.2 to 2.2 parts per thousand) than manure-based fertilizers (7.6 to 22.7 parts per thousand). It was possible to link delta N-15 values of fertilizers associated with a crop site to soil and plant delta N-15 values. The strong relationship between soil, fertilizer, and plant delta N-15 suggests that Cannabis delta N-15 is determined by the isotopic composition of the nitrogen source. The distinct delta N-15 values measured in Cannabis crops make delta N-15 an excellent tool for matching seized Cannabis with a source crop. A case study is presented that demonstrates how delta C-13 and delta N-15 values can be used as a forensic tool.
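For readers unfamiliar with the notation, the delta values reported above follow the standard definition of isotope ratios expressed relative to an international reference (VPDB for carbon, atmospheric N2 for nitrogen), in parts per thousand:

```latex
\delta^{13}\mathrm{C} = \left(\frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1\right) \times 1000 \ \text{parts per thousand},
\qquad R = \frac{{}^{13}\mathrm{C}}{{}^{12}\mathrm{C}} \ (\text{VPDB});
\qquad
\delta^{15}\mathrm{N} \text{ is defined analogously with } R = \frac{{}^{15}\mathrm{N}}{{}^{14}\mathrm{N}} \ (\text{atmospheric } \mathrm{N}_2).
```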
Abstract:
Master's dissertation in Business Management/MBA.
Abstract:
In this paper a new simulation environment for a virtual laboratory for educational purposes is presented. The Logisim platform was adopted as the base digital simulation tool, since it has a modular implementation in Java. All the hardware devices used in the laboratory course were designed as components accessible from the simulation tool and integrated as a library. Moreover, this new library allows the user to access an external interface. This work was motivated by the need to achieve better learning times in co-design projects, based on hardware and software implementations, and to reduce laboratory time, decreasing the operational costs of engineering teaching. Furthermore, the use of virtual laboratories in educational environments allows students to perform functional tests before they go to a real laboratory. These functional tests also speed up learning when a problem-based approach is adopted. © 2014 IEEE.
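The component-library idea described above (lab hardware devices exposed to the simulator through a uniform interface) can be sketched conceptually. The actual work extends Logisim in Java; the Python below is only an illustration of the idea under assumed names (LabComponent, propagate, SevenSegmentDriver), and is not Logisim's API.

```python
# Conceptual sketch only: the paper packages the course's hardware devices as
# a Logisim (Java) component library; this Python mirrors the idea of a device
# exposed to a simulator through named pins and a propagate step.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class LabComponent:
    """A simulated lab device: named input and output pins plus a propagate step."""
    inputs: Dict[str, int] = field(default_factory=dict)
    outputs: Dict[str, str] = field(default_factory=dict)

    def propagate(self) -> None:
        raise NotImplementedError

@dataclass
class SevenSegmentDriver(LabComponent):
    """Example library entry: drives a seven-segment display from a 4-bit value."""
    SEGMENTS = ["0111111", "0000110", "1011011", "1001111", "1100110",
                "1101101", "1111101", "0000111", "1111111", "1101111"]

    def propagate(self) -> None:
        value = self.inputs.get("bcd", 0) & 0xF
        self.outputs["segments"] = self.SEGMENTS[value] if value < 10 else "0000000"

driver = SevenSegmentDriver(inputs={"bcd": 7})
driver.propagate()
print(driver.outputs["segments"])   # -> "0000111"
```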
Abstract:
This paper presents a statistical model for the quantification of the weight of fingerprint evidence. Contrary to previous models (generative and score-based models), our model estimates the probability distributions of spatial relationships, directions and types of minutiae observed on fingerprints for any given fingermark. Our model relies on an AFIS algorithm provided by 3M Cogent and on a dataset of more than 4,000,000 fingerprints to represent a sample from a relevant population of potential sources. The performance of our model was tested using several hundred minutiae configurations observed on a set of 565 fingermarks. In particular, the effects of various sub-populations of fingers (i.e., finger number, finger general pattern) on the expected evidential value of our test configurations were investigated. The performance of our model indicates that the spatial relationship between minutiae carries more evidential weight than their type or direction. Our results also indicate that the AFIS component of our model directly enables us to assign weight to fingerprint evidence without the need for the additional layer of complex statistical modeling involved in estimating the probability distributions of fingerprint features. In fact, it seems that the AFIS component is more sensitive to the sub-population effects than the other components of the model. Overall, the data generated during this research project support the idea that fingerprint evidence is a valuable forensic tool for the identification of individuals.
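For readers unfamiliar with how the "weight" of evidence is expressed, the sketch below shows the generic likelihood-ratio formulation, LR = P(E | same source) / P(E | different source), using kernel density estimates of same-source and different-source comparison scores. This is only a generic, score-based illustration on synthetic data; the paper's model is feature-based and estimates distributions of minutia positions, directions and types rather than scores.

```python
# Generic illustration of evidential weight as a likelihood ratio (not the
# paper's feature-based model): a single comparison score is evaluated against
# same-source and different-source score distributions estimated by KDE.
# All data below are synthetic.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
same_source_scores = rng.normal(loc=80, scale=10, size=5000)   # mated comparisons
diff_source_scores = rng.normal(loc=30, scale=12, size=5000)   # non-mated comparisons

f_same = gaussian_kde(same_source_scores)
f_diff = gaussian_kde(diff_source_scores)

def likelihood_ratio(score: float) -> float:
    """LR = P(score | same source) / P(score | different source)."""
    return float(f_same(score)[0] / f_diff(score)[0])

for s in (40, 55, 70):
    print(f"score={s}: LR = {likelihood_ratio(s):.2f}")
```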
Abstract:
Almost everyone sketches. People use sketches day in and day out in many different and heterogeneous fields, for example to share their thoughts and clarify ambiguous interpretations. The media used for sketching vary from analog tools like flipcharts to digital tools like smartboards. Whereas analog tools usually suffer from insufficient editing capabilities such as cut/copy/paste, digital tools support these scenarios well. Digital tools can be grouped into informal and formal tools. Informal tools can be understood as simple drawing environments, whereas formal tools offer sophisticated support to create, optimize and validate diagrams of a certain application domain. Most digital formal tools force users to stick to a concrete syntax and editing workflow, limiting the user’s creativity. For that reason, many people first sketch their ideas using the flexibility of analog or digital informal tools. Subsequently, the sketch is "portrayed" in an appropriate digital formal tool. This work presents Scribble, a highly configurable and extensible sketching framework which allows sketching features to be dynamically injected into existing graphical diagram editors based on Eclipse GEF. This combines the flexibility of informal tools with the power of formal tools without any extra effort: no additional code is required to augment a GEF editor with sophisticated sketching features. Scribble recognizes drawn elements as well as handwritten text and automatically generates the corresponding domain elements. A local training data library is created dynamically by incrementally learning shapes drawn by the user. Training data can be shared with others using the WebScribble web application, which has been created as part of this work.
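A minimal sketch of the incremental shape-learning idea (a local library of user-drawn templates matched by nearest neighbour) is given below. It is not Scribble's code, which is a Java/Eclipse GEF framework; the resampling, normalisation and distance measure are simple stand-ins chosen for the example.

```python
# Minimal sketch of incremental shape learning by template matching, in the
# spirit of the training library described for Scribble (not Scribble's code).
import numpy as np

def resample(points: np.ndarray, n: int = 32) -> np.ndarray:
    """Resample a stroke to n evenly spaced points and normalise position/scale."""
    d = np.cumsum(np.r_[0, np.linalg.norm(np.diff(points, axis=0), axis=1)])
    t = np.linspace(0, d[-1], n)
    resampled = np.column_stack([np.interp(t, d, points[:, 0]),
                                 np.interp(t, d, points[:, 1])])
    resampled -= resampled.mean(axis=0)              # translate to the origin
    scale = np.abs(resampled).max() or 1.0
    return resampled / scale                         # normalise size

class ShapeLibrary:
    def __init__(self):
        self.templates = []                          # (label, normalised stroke)

    def learn(self, label: str, stroke: np.ndarray) -> None:
        """Incrementally add a user-drawn example to the local training library."""
        self.templates.append((label, resample(stroke)))

    def recognise(self, stroke: np.ndarray) -> str:
        """Return the label of the closest stored template."""
        probe = resample(stroke)
        dists = [(np.linalg.norm(probe - tpl), label) for label, tpl in self.templates]
        return min(dists)[1]

lib = ShapeLibrary()
lib.learn("rectangle", np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]], float))
lib.learn("edge", np.array([[0, 0], [1, 1]], float))
print(lib.recognise(np.array([[0, 0], [2, 0], [2, 2], [0, 2], [0, 0]], float)))  # rectangle
```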
Abstract:
Considering the transition from an industrial society to an information society, we realize that the digital training currently offered is insufficient for navigating a digitized reality. To help minimize this problem, this paper assesses, validates and further develops the RoboEduc software for working with educational robotics, whose main differential is the programming of robotic devices at several levels, taking into account the specifics of the training context. One of the emphases of this work is the presentation of the materials and procedures involved in the development, analysis and evolution of this software. For validation, usability tests were performed; based on the analysis of these tests, version 4.0 of RoboEduc was developed.
Abstract:
Postgraduate program in Dental Sciences - FOAR.
Abstract:
AIM This study aimed to evaluate the effect of a digital learning tool on undergraduate dental students' performance in detecting dental caries using ICDAS. METHODS An experimental digital learning tool (DLT) was created using digital photographs of sound and carious teeth. Thirty-nine students were divided into three groups (n = 13), and each assessed 12 randomly allocated patients before and after the learning strategies: G1, ICDAS e-learning program; G2, ICDAS e-learning program plus DLT; G3, no learning strategy. Students (n = 32) reassessed patients 2 weeks after training. RESULTS Comparing assessments before and after the learning strategies, no difference in specificity or area under the ROC curve was found for any group. Sensitivity was statistically significantly higher for G1 and G2. Comparing the groups, G2 showed a significant increase in sensitivity at the D2 and D3 thresholds. Spearman's correlations with the gold standard before and after the learning strategy were 0.60 and 0.61 for G1, 0.57 and 0.63 for G2, and 0.54 and 0.54 for G3, respectively. The Wilcoxon test showed a statistically significant difference between the values obtained before and after the learning strategies for G1 and G2. CONCLUSIONS Use of the DLT after the ICDAS e-learning program tended to increase the sensitivity of ICDAS scoring by undergraduate dental students. The DLT appeared to improve dental students' ability to use ICDAS.
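As a reminder of how the reported metrics are computed, the sketch below derives sensitivity, specificity and the area under the ROC curve from a dichotomised examiner call against a gold standard. The scores and gold standard are synthetic; they are not the study's data.

```python
# Quick illustration of the diagnostic metrics reported in the study
# (sensitivity, specificity, area under the ROC curve) on synthetic data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
gold = rng.integers(0, 2, size=200)                 # 1 = carious at the chosen threshold
# Simulated examiner scores: higher for truly carious surfaces.
scores = gold * rng.normal(2.5, 1.0, 200) + rng.normal(0, 1.0, 200)
calls = (scores > 1.0).astype(int)                  # examiner's dichotomised call

tp = np.sum((calls == 1) & (gold == 1))
tn = np.sum((calls == 0) & (gold == 0))
fp = np.sum((calls == 1) & (gold == 0))
fn = np.sum((calls == 0) & (gold == 1))

print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("AUC:", roc_auc_score(gold, scores))
```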
Abstract:
The extraordinary growth of new information technologies, the development of the Internet of Things, electronic commerce, social networks, mobile telephony, and cloud computing and storage has provided great benefits in all areas of society. Alongside these benefits come new challenges for the protection and privacy of information and its content, such as identity impersonation and the loss of confidentiality and integrity of electronic documents and communications. This is exacerbated by the lack of a clear boundary between the personal and the business world, as the differences between them become ever narrower. In both worlds, Cryptography has played a key role by providing the tools needed to ensure the confidentiality, integrity and availability of both personal data and information. On the other hand, Biometrics has proposed and offered different techniques aimed at authenticating individuals through biometric traits such as fingerprints, iris, hand geometry, voice, gait, etc. Each of these two sciences, Cryptography and Biometrics, provides solutions to specific problems of data protection and user authentication, which would be greatly strengthened if certain characteristics of both were combined towards common objectives. It is therefore imperative to intensify research in this area by combining the mathematical algorithms and primitives of Cryptography with Biometrics to meet the growing demand for more secure and usable techniques that simultaneously improve data protection and user identification. In this combination, the concept of cancelable biometrics is a cornerstone of the user authentication and identification process, since it adds revocation and cancellation properties to biometric traits. The contributions of this thesis address the main aspect of Biometrics, namely the secure and efficient authentication of users through their biometric templates, from three different approaches. The first is the design of a fuzzy crypto-biometric scheme that applies cancelable biometric principles to exploit the fuzziness of biometric templates while dealing with intra- and inter-user variability, without compromising the templates extracted from legitimate users. The second is the design of a new Similarity Preserving Hash Function (SPHF); such functions are currently widely used in the digital forensics field to find similarities among different but related files and to quantify their degree of similarity. 
The function designed in this research work improves on the results of the two main functions currently used in this field and, in addition, extends their use to iris template comparison. Finally, the third approach is the development of a new mechanism for handling iris templates that treats them as signals and compares them using the Walsh-Hadamard transform (complemented with three other algorithms). The results obtained are excellent in view of the security and privacy requirements mentioned above. Each of the three schemes designed has been implemented in order to run experiments and test its operational efficacy in scenarios that simulate real situations: the fuzzy crypto-biometric scheme and the SPHF were implemented in Java, while the process based on the Walsh-Hadamard transform was implemented in Matlab. The experiments used a database of iris images (CASIA-IrisV2) to simulate a population of system users. The new SPHF is a special case: before being applied to the Biometrics field, it was also tested to determine its applicability in digital forensics by comparing similar and dissimilar files and images. For each of the schemes, the user-authentication effectiveness ratios, i.e. the False Non-Match and False Match Rates, were calculated with different parameters and cases to analyse their behaviour.
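A minimal sketch of the signal-style iris comparison mentioned above is given below: two binary templates are mapped to plus/minus one, transformed with a fast Walsh-Hadamard transform, and compared by normalised correlation of their spectra. The thesis complements the transform with three other algorithms, so this only illustrates the core idea on synthetic codes.

```python
# Minimal sketch of comparing two binary iris codes via the Walsh-Hadamard
# transform; the thesis combines the transform with three other algorithms,
# so this only illustrates the core signal-comparison idea, on synthetic data.
import numpy as np

def fwht(a: np.ndarray) -> np.ndarray:
    """Fast Walsh-Hadamard transform; input length must be a power of two."""
    a = a.astype(float).copy()
    h = 1
    while h < len(a):
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

def similarity(code1: np.ndarray, code2: np.ndarray) -> float:
    """Normalised correlation of the two spectra (+1 means identical signals)."""
    s1, s2 = fwht(2 * code1 - 1), fwht(2 * code2 - 1)   # map bits {0,1} -> {-1,+1}
    return float(np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2)))

rng = np.random.default_rng(3)
template = rng.integers(0, 2, size=256)                 # enrolled iris code
noisy = template.copy()
flips = rng.choice(256, size=20, replace=False)         # same eye, 20 noisy bits
noisy[flips] ^= 1
impostor = rng.integers(0, 2, size=256)                 # different eye

print("genuine similarity:", round(similarity(template, noisy), 3))
print("impostor similarity:", round(similarity(template, impostor), 3))
```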
Abstract:
Under the framework of the ANDRILL Southern McMurdo Sound (SMS) Project, successful downhole experiments were conducted in the 1138.54 m deep AND-2A borehole. Wireline logs successfully recorded were: magnetic susceptibility, spectral gamma ray, sonic velocity, borehole televiewer, neutron porosity, density, calliper, geochemistry, temperature and dipmeter. A resistivity tool and its backup both failed to operate, so resistivity data were not collected. Due to hole conditions, logs were collected in several passes from the total depth at ~1138 metres below sea floor (mbsf) up to ~230 mbsf, except for some intervals that were either inaccessible due to bridging or shielded by the drill string. Furthermore, a Vertical Seismic Profile (VSP) was acquired from ~1000 mbsf up to the sea floor. The first hydraulic fracturing stress measurements in Antarctica were conducted in the interval 1000-1138 mbsf. This extensive data set will allow the SMS Science Team to reach some of the ambitious objectives of the SMS Project. Valuable contributions can be expected for the following topics: cyclicity and climate change, heat flux and fluid flow, seismic stratigraphy in the Victoria Land Basin, and the structure and state of the modern crustal stress field.