887 results for Computer forensic analysis
Abstract:
We have recently proposed the framework of independent blind source separation as an advantageous approach to steganography. Amongst the several characteristics noted was a sensitivity of message reconstruction to small perturbations in the sources, a characteristic not shared by most other approaches to steganography. In this paper we discuss how this sensitivity relates to the joint diagonalisation inside the independent component approach and to the reliance on exact knowledge of secret information, and how it can be used as an additional, inherent security mechanism against malicious attempts to discover the hidden messages. The paper therefore provides an enhanced mechanism for e-document forensic analysis that can be applied to digital data media of different dimensionality. Here we use a low-dimensional example of biomedical time series, as might occur in the electronic patient health record, where protection of private patient information is paramount.
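The key-sensitivity property noted above can be illustrated with a toy linear-mixing sketch (NumPy; the signals and the 2x2 mixing matrix standing in for the secret key are hypothetical, and this is deliberately simplified relative to the joint-diagonalisation machinery of the paper):

```python
import numpy as np

# Toy illustration: a binary message hidden in a linear mixture.
t = np.linspace(0.0, 1.0, 500)
cover = np.sin(2 * np.pi * 5 * t)              # innocuous cover signal
message = np.sign(np.sin(2 * np.pi * 13 * t))  # crude +/-1 message
S = np.vstack([cover, message])

# Hypothetical secret mixing matrix: the message enters at low amplitude.
A = np.array([[1.0, 0.02],
              [0.3, 0.01]])
X = A @ S                                      # publicly visible mixtures

# Exact knowledge of the key recovers the message essentially perfectly...
recovered = np.linalg.solve(A, X)
exact_err = np.mean((recovered[1] - message) ** 2)

# ...but a perturbation of only 1e-3 in each entry garbles the reconstruction.
garbled = np.linalg.solve(A + 1e-3, X)
perturbed_err = np.mean((garbled[1] - message) ** 2)
```

With these invented numbers the perturbed reconstruction error is many orders of magnitude larger than the exact one, mirroring the inherent security property the paper exploits.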
Abstract:
In this thesis, standard algorithms are used to carry out the optimisation of cold-formed steel purlins such as zed, channel and sigma sections, which are assumed to be simply supported and subjected to a gravity load. For the zed, channel and sigma sections, local buckling, distortional buckling and lateral-torsional buckling are considered, respectively. Local buckling is treated according to BS 5950-5:1998 and EN 1993-1-3:2006. Distortional buckling is calculated by the direct strength method, with the elastic distortional buckling load obtained from three available approaches: Hancock (1995), Schafer and Pekoz (1998), and Yu (2005). In the optimisation program, lateral-torsional buckling based on BS 5950-5:1998, on AISI and on the analytical model of Li (2004) is investigated. Programming codes are written for the optimisation of channel, zed and sigma beams, and the full study has been coded into a computer-based analysis program (MATLAB).
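The direct strength method step mentioned above can be sketched as a small function (the slenderness limit 0.561 and the 0.25/0.6 coefficients follow the published DSM distortional-strength expression; the function and variable names and the example loads are our own, and real design work must use the governing code clause):

```python
import math

def dsm_distortional_strength(py, pcrd):
    """Nominal distortional buckling strength by the Direct Strength Method.

    py   -- yield (squash) load of the section
    pcrd -- elastic distortional buckling load, e.g. from Hancock (1995),
            Schafer and Pekoz (1998) or Yu (2005)
    """
    lam_d = math.sqrt(py / pcrd)       # distortional slenderness
    if lam_d <= 0.561:
        return py                      # stocky: section reaches yield
    ratio = (pcrd / py) ** 0.6
    return (1.0 - 0.25 * ratio) * ratio * py

# Hypothetical loads in kN: a stocky case and a slender case.
stocky = dsm_distortional_strength(100.0, 1000.0)  # lam_d ~ 0.32
slender = dsm_distortional_strength(100.0, 50.0)   # lam_d ~ 1.41
```

The slender case returns a strength well below the squash load, which is the reduction an optimiser must trade off against section weight.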
Abstract:
The thesis presents a theoretical and practical study of the dynamic behaviour of electromagnetic relays. After discussing the problem of solving the dynamic equations analytically and presenting a historical survey of earlier work on the relay and its dynamics, the simulation of a relay on the analogue computer is discussed. It is shown that the simulation may be used to obtain specific solutions to the dynamic equations. The computer analysis provides the dynamic characteristics for design purposes and may be used in the study of bouncing, rebound oscillations and stability of the armature motion. An approximate analytical solution to the two dynamic equations is given, based on the assumption that the dynamic variation of the pull with the position of the armature is linear. The assumption is supported by the computer-aided analysis and by experimental results. The solution is intended to provide a basis for rational design. A rigorous method of analysing the dynamic performance by using Ahlberg's theory is also presented. This method may be regarded as an extension of Ahlberg's theory that takes the mass and frictional damping forces into account. While calculating the armature motion mathematically, Ahlberg considers the equilibrium of two kinds of forces, namely pull and load, and disregards the mass and friction forces, whereas the present method deals with the equilibrium of all four kinds of forces. It is shown how this can be utilised to calculate the dynamic characteristics for a specific design. The utility of this method also extends to the study of stability, contact bounce and armature rebound. The magnetic circuit and other related topics which are essential to the study of relay dynamics are discussed, and some necessary experimental results are given.
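The linear-pull assumption lends itself to direct numerical integration of the armature equation of motion. The sketch below (all parameter values are hypothetical, chosen only to illustrate the procedure) integrates m*x'' + f*x' = (F0 + c*x) - k*(x + x0), i.e. a pull varying linearly with armature position against a preloaded return spring with frictional damping:

```python
# Hypothetical relay parameters (SI units).
m, f = 0.01, 0.05      # armature mass, frictional damping coefficient
k, x0 = 200.0, 0.002   # return-spring stiffness and preload deflection
F0, c = 1.0, 400.0     # linearized magnetic pull: F(x) = F0 + c*x
gap = 0.003            # armature travel until the contacts close

# Explicit Euler integration of m*x'' + f*x' = F0 + c*x - k*(x + x0).
x, v, t, dt = 0.0, 0.0, 0.0, 1e-5
while x < gap and t < 0.1:
    a = (F0 + c * x - k * (x + x0) - f * v) / m
    v += a * dt
    x += v * dt
    t += dt
closing_time = t
```

Because the pull coefficient c exceeds the spring stiffness k here, the net force grows as the armature closes and the motion accelerates; the operate time is the kind of dynamic characteristic such a design study would extract.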
Abstract:
This paper presents the digital imaging results of a collaborative research project working toward the generation of an on-line interactive digital image database of signs from ancient cuneiform tablets. An important aim of this project is the application of forensic analysis to the cuneiform symbols to identify scribal hands. Cuneiform tablets are amongst the earliest records of written communication, and could be considered one of the original information technologies: an accessible, portable and robust medium for communication across distance and time. The earliest examples are up to 5,000 years old, and the writing technique remained in use for some 3,000 years. Unfortunately, only a small fraction of these tablets can be made available for display in museums, and much important academic work has yet to be performed on the very large numbers of tablets to which there is necessarily restricted access. Our paper will describe the challenges encountered in the 2D image capture of a sample set of tablets held in the British Museum, explaining the motivation for attempting 3D imaging and the results of initial experiments scanning the smaller, more densely inscribed cuneiform tablets. We will also discuss the tractability of 3D digital capture, representation and manipulation, and investigate the requirements for scalable data compression and transmission methods. Additional information can be found on the project website: www.cuneiform.net
Abstract:
The objective of this research is to develop nanoscale ultrasensitive transducers for the detection of biological species at the molecular level using carbon nanotubes as nanoelectrodes. Rapid detection of ultra-low concentrations, or even single DNA molecules, is essential for medical diagnosis and treatment, pharmaceutical applications, gene sequencing as well as forensic analysis. Here the use of functionalized single-walled carbon nanotubes (SWNTs) as a nanoscale detection platform for rapid detection of single DNA molecules is demonstrated. The detection principle is based on obtaining an electrical signal from a single amine-terminated DNA molecule which is covalently bridged between two ends of an SWNT separated by a nanoscale gap. The synthesis, fabrication and chemical functionalization of the nanoelectrodes and the DNA attachment were optimized to perform reliable electrical characterization of these molecules. Using this detection system, a fundamental study of charge transport in DNA molecules of both genomic and non-genomic sequences is performed. We measured an electrical signal of about 30 pA through a hybridized DNA molecule, 80 base pairs in length, which encodes a portion of the sequence of the H5N1 gene of avian Influenza A virus. Due to the dynamic nature of DNA molecules, the local environment, such as ion concentration, pH and temperature, significantly influences their physical properties. We observed a decrease in DNA conductance of about 33% under high-vacuum conditions. The counterion variation was analyzed by changing the buffer from sodium acetate to tris(hydroxymethyl)aminomethane, which resulted in a two orders of magnitude increase in the conductivity of the DNA. The fabrication of large arrays of identical SWNT nanoelectrodes was achieved by using ultralong SWNTs. Using these nanoelectrode arrays, we have investigated the sequence-dependent charge transport in DNA.
A systematic study performed on a PolyG-PolyC sequence with a varying number of intervening PolyA-PolyT pairs showed a decrease in electrical signal from 180 pA (PolyG-PolyC) to 30 pA with an increasing number of PolyA-PolyT pairs. This work also led to the development of ultrasensitive nanoelectrodes based on enzyme-functionalized, vertically aligned, high-density multiwalled CNTs for the electrochemical detection of cholesterol. The nanoelectrodes exhibited selective detection of cholesterol in the presence of common interferents found in human blood.
Abstract:
Forensic speaker comparison exams have complex characteristics, demanding a long time for manual analysis. A method for automatic recognition of vowels, providing feature extraction for acoustic analysis, is proposed, aiming to serve as a support tool in these exams. The proposal is based on formant measurements by LPC (Linear Predictive Coding), selected by fundamental frequency detection, zero-crossing rate, bandwidth and continuity, with the clustering done by the k-means method. Experiments using samples from three different databases have shown promising results, in which the regions corresponding to five of the Brazilian Portuguese vowels were successfully located, providing visualization of a speaker's vocal tract behavior, as well as the detection of segments corresponding to target vowels.
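The clustering stage can be sketched as follows (NumPy; the (F1, F2) formant values below are hypothetical clouds loosely placed where three oral vowels might fall, not data from the paper, and the k-means routine is a plain textbook implementation rather than the authors' code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical (F1, F2) formant samples in Hz for three vowel-like clusters.
true_centers = np.array([[300.0, 2300.0],   # /i/-like
                         [700.0, 1200.0],   # /a/-like
                         [350.0,  800.0]])  # /u/-like
samples = np.vstack([c + rng.normal(0.0, 40.0, size=(60, 2))
                     for c in true_centers])

def kmeans(points, k, iters=50):
    """Plain k-means: alternate nearest-centroid assignment and mean update."""
    cents = points[:: max(1, len(points) // k)][:k].copy()  # strided init
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - cents[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        cents = np.array([points[labels == j].mean(axis=0)
                          if np.any(labels == j) else cents[j]
                          for j in range(k)])
    return cents, labels

found, labels = kmeans(samples, 3)
```

Each recovered centroid then marks a candidate vowel region in the F1-F2 plane, supporting the kind of visualization of vocal tract behavior described above.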
Abstract:
This article addresses topics related to forensic analysis applied to mobile devices, together with the proposal and testing of a methodology that effectively supports these activities. The objectives to be pursued in such a study are described, and the search for evidence stored on mobile devices is considered under a scenario in which a crime has been committed. Emphasis is placed on the evolution and the multiplicity of uses that these devices currently have. Finally, the need for standards that guarantee the integrity of the evidence found is addressed; to that end, the methodology developed is described, which allows the forensic process on mobile devices to be carried out properly, and it is hoped that it will become a standard for conducting this type of investigation.
Abstract:
In the medical field, images obtained from high-definition cameras and other medical imaging systems are an integral part of medical diagnosis. The analysis of these images is usually performed by physicians, who sometimes spend long hours reviewing the images before they are able to arrive at a diagnosis and decide on a course of action. In this dissertation we present a framework for computer-aided analysis of medical imagery via the use of an expert system. While this problem has been discussed before, we consider a system based on mobile devices. Since the release of the iPhone in 2007, the popularity of mobile devices has increased rapidly and our lives have become more reliant on them. This popularity and the ease of development of mobile applications have now made it possible to perform on these devices many of the image analyses that previously required a personal computer. All of this has opened the door to a whole new set of possibilities and freed physicians from their reliance on desktop machines. The approach proposed in this dissertation aims to capitalize on these newfound opportunities by providing a framework for the analysis of medical images that physicians can use from their mobile devices, thus removing their reliance on desktop computers. We also provide an expert system to aid in the analysis and advise on the selection of medical procedures. Finally, we allow for other mobile applications to be developed by providing a generic mobile application development framework that opens the mobile domain to other applications. In this dissertation we outline our work towards the development of the proposed methodology and the remaining work needed to find a solution to the problem.
In order to make this difficult problem tractable, we divide it into three parts: the development of a user interface modeling language and tooling, the creation of a game development modeling language and tooling, and the development of a generic mobile application framework. To make the problem more manageable, we narrow the initial scope to the hair transplant and glaucoma domains.
Abstract:
Conventional rockmass characterization and analysis methods for geotechnical assessment in mining, civil tunnelling, and other excavations consider only the intact rock properties and the discrete fractures that are present and form blocks within rockmasses. Field logging and classification protocols are based on historically useful but highly simplified design techniques, including direct empirical design and empirical strength assessment for simplified ground reaction and support analysis. As modern underground excavations go deeper and enter more high-stress environments with complex excavation geometries and associated stress paths, healed structures within initially intact rock blocks, such as sedimentary nodule boundaries and hydrothermal veins, veinlets and stockwork (termed intrablock structure), are having an increasing influence on rockmass behaviour and should be included in modern geotechnical design. Due to the reliance on geotechnical classification methods which predate computer-aided analysis, these complexities are ignored in conventional design. Given the comparatively complex, sophisticated and powerful numerical simulation and analysis techniques now practically available to the geotechnical engineer, this research is driven by the need for enhanced characterization of intrablock structure for application to numerical methods. Intrablock structure governs stress-driven behaviour at depth and gravity-driven disintegration for large shallow spans, and controls ultimate fragmentation. This research addresses the characterization of intrablock structure and the understanding of its behaviour at laboratory testing and excavation scales, and presents new methodologies and tools to incorporate intrablock structure into geotechnical design practice.
A new field characterization tool, the Composite Geological Strength Index, is used for outcrop or excavation face evaluation and provides direct input to continuum numerical models with implicit rockmass structure. A brittle overbreak estimation tool for complex rockmasses is developed using field observations. New methods to evaluate geometrical and mechanical properties of intrablock structure are developed. Finally, laboratory direct shear testing protocols for interblock structure are critically evaluated and extended to intrablock structure for the purpose of determining input parameters for numerical models with explicit structure.
Abstract:
Effective management of invasive fishes depends on the availability of updated information about their distribution and spatial dispersion. Forensic analysis was performed using online and published data on the European catfish, Silurus glanis L., a recent invader in the Tagus catchment (Iberian Peninsula). Eighty records were obtained mainly from anglers’ fora and blogs, and more recently from www.youtube.com. Since the first record in 1998, S. glanis expanded its geographic range by 700 km of river network, occurring mainly in reservoirs and in high-order reaches. Human-mediated and natural dispersal events were identified, with the former occurring during the first years of invasion and involving movements of >50 km. Downstream dispersal directionality was predominant. The analysis of online data from anglers was found to provide useful information on the distribution and dispersal patterns of this non-native fish, and is potentially applicable as a preliminary, exploratory assessment tool for other non-native fishes.
Abstract:
The ethical and social responsibility of citing the sources in a scientific or artistic work is undeniable. This paper explores, in a preliminary way, academic plagiarism in its various forms. It includes findings based on a forensic analysis. The purpose of this paper is to raise awareness of the importance of considering these details when writing and publishing a text. It is hoped that this analysis will put the issue under discussion.
Abstract:
Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or the same source, and thus to support forensic intelligence efforts. Inspired by previous research work on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from the images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by the comparison of profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first triage step that helps target more resource-intensive profiling methods (based, for instance, on a visual, physical or chemical examination of documents).
Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be presented in a forthcoming article (Part II).
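The profile-comparison step can be sketched as follows (only the Canberra distance itself comes from the text; the hue-profile numbers below are invented for illustration):

```python
import numpy as np

def canberra(p, q):
    """Canberra distance: sum of |p_i - q_i| / (|p_i| + |q_i|),
    skipping terms where both entries are zero."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    den = np.abs(p) + np.abs(q)
    mask = den > 0.0
    return float(np.sum(np.abs(p - q)[mask] / den[mask]))

# Hypothetical hue-filter profiles of three scanned documents;
# doc_a and doc_b mimic a common source, doc_c a different one.
doc_a = [0.10, 0.40, 0.30, 0.20]
doc_b = [0.12, 0.38, 0.31, 0.19]
doc_c = [0.45, 0.05, 0.10, 0.40]

same_source = canberra(doc_a, doc_b)
diff_source = canberra(doc_a, doc_c)
```

A small distance then flags a candidate link between documents, to be confirmed by the more resource-intensive examinations mentioned above.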
Abstract:
A two-dimensional numeric simulator is developed to predict the nonlinear, convective-reactive oxygen mass exchange in a cross-flow hollow-fiber blood oxygenator. The numeric simulator also calculates the carbon dioxide mass exchange, as hemoglobin affinity to oxygen is affected by the local pH value, which depends mostly on the local carbon dioxide content in blood. Blood pH calculation inside the oxygenator is performed by the simultaneous solution of an equation that takes into account the blood buffering capacity and the classical Henderson-Hasselbalch equation. The modeling of the mass transfer conductance in the blood comprises a global factor, which is a function of the Reynolds number, and a local factor, which takes into account the amount of oxygen reacted with hemoglobin. The simulator is calibrated against experimental data for an in-line fiber bundle. The results are: (i) the calibration process allows the precise determination of the mass transfer conductance for both oxygen and carbon dioxide; (ii) very alkaline pH values occur in the blood path at the gas inlet side of the fiber bundle; (iii) the parametric analysis of the effect of the blood base excess (BE) shows that V(CO2) is similar in the case of blood metabolic alkalosis, metabolic acidosis, or normal BE, for a similar blood inlet P(CO2), although the condition of metabolic alkalosis is the worst case, as the pH in the vicinity of the gas inlet is the most alkaline; (iv) the parametric analysis of the effect of the gas flow to blood flow ratio (Q(G)/Q(B)) shows that the V(CO2) variation with the gas flow is almost linear up to Q(G)/Q(B) = 2.0. V(O2) is not affected by the gas flow: it was observed that by increasing the gas flow up to eight times, V(O2) grows only 1%. The mass exchange of carbon dioxide uses the full length of the hollow fiber only if Q(G)/Q(B) > 2.0, as it was observed that only in this condition does the local variation of pH and blood P(CO2) span the whole fiber bundle.
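The bicarbonate form of the Henderson-Hasselbalch relation used in the pH calculation can be sketched on its own (the constants pKa = 6.1 and CO2 solubility 0.03 mmol/(L·mmHg) are the standard textbook values, not parameters quoted from the paper, and the full simulator couples this with a buffering-capacity equation):

```python
import math

def blood_ph(hco3, pco2):
    """Henderson-Hasselbalch equation for the bicarbonate buffer:
    pH = 6.1 + log10([HCO3-] / (0.03 * PCO2)),
    with [HCO3-] in mmol/L and PCO2 in mmHg."""
    return 6.1 + math.log10(hco3 / (0.03 * pco2))

normal = blood_ph(24.0, 40.0)     # typical arterial values -> pH near 7.40
alkalotic = blood_ph(30.0, 40.0)  # metabolic alkalosis: higher pH
```

The relation also shows why the gas-inlet side of the bundle, where the local P(CO2) is swept low, is driven toward alkaline pH values, as reported in result (ii).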
Abstract:
The XSophe-Sophe-XeprView® computer simulation software suite enables scientists to easily determine spin Hamiltonian parameters from isotropic, randomly oriented and single-crystal continuous wave electron paramagnetic resonance (CW EPR) spectra from radicals and isolated paramagnetic metal ion centers or clusters found in metalloproteins, chemical systems and materials science. XSophe provides an X-windows graphical user interface to the Sophe programme and allows: creation of multiple input files, local and remote execution of Sophe, and the display of sophelog (output from Sophe) and input parameters/files. Sophe is a sophisticated computer simulation software programme employing a number of innovative technologies including: the Sydney OPera HousE (SOPHE) partition and interpolation schemes, a field segmentation algorithm, the mosaic misorientation linewidth model, parallelization and spectral optimisation. In conjunction with the SOPHE partition scheme and the field segmentation algorithm, the SOPHE interpolation scheme and the mosaic misorientation linewidth model greatly increase the speed of simulations for most spin systems. Employing brute-force matrix diagonalization in the simulation of an EPR spectrum from a high-spin Cr(III) complex with the spin Hamiltonian parameters g_e = 2.00, D = 0.10 cm^-1, E/D = 0.25, A_x = 120.0, A_y = 120.0, A_z = 240.0 × 10^-4 cm^-1 requires a SOPHE grid size of N = 400 (to produce a good signal-to-noise ratio) and takes 229.47 s. In contrast, the use of either the SOPHE interpolation scheme or the mosaic misorientation linewidth model requires a SOPHE grid size of only N = 18 and takes 44.08 s and 0.79 s, respectively. Results from Sophe are transferred via the Common Object Request Broker Architecture (CORBA) to XSophe and subsequently to XeprView®, where the simulated CW EPR spectra (1D and 2D) can be compared to the experimental spectra.
Energy level diagrams, transition roadmaps and transition surfaces aid the interpretation of complicated randomly oriented CW EPR spectra and can be viewed with a web browser and an OpenInventor scene graph viewer.