887 results for Computer forensic analysis
Abstract:
Forensic archaeologists and criminal investigators employ many different techniques for the location, recovery, and analysis of clandestine graves. Many of these techniques are based upon the premise that a grave is an anomaly and therefore differs physically, biologically, or chemically from its surroundings. The work reviewed in this communication demonstrates how and why field mycology might provide a further tool for the investigation of scenes of crime concealed in forest ecosystems. The fruiting structures of certain fungi, the ammonia fungi and the postputrefaction fungi, have been recorded repeatedly in association with decomposed mammalian cadavers in disparate regions of the world. The ecology and physiology of these fungi are reviewed briefly with a view to their potential as a forensic tool. This application of mycology lies at the interface of forensic archaeology and forensic taphonomy; it may provide a means to detect graves and has the potential to estimate the postburial interval.
Abstract:
Mites are a highly diversified group of chelicerates (arthropods) adapted to a broad spectrum of habitats and diets, often with extreme habitat specificity. They are considered important indicators of environmental conditions, including those modified by human beings. They can therefore provide information about the environment to which a corpse has been exposed, about the route travelled by specific merchandise, and about other applied aspects of forensic entomology. Species adapted to cadaveric environments are not rare. Jean Pierre Mégnin, the forensic veterinarian considered a pioneer in the development of forensic entomology, was aware of the importance of mites as forensic indicators and was the first to include mites in the description of the decomposition process; for Mégnin, the sixth wave of cadaver colonization was formed by mites alone. Given the increasing interest of forensic experts in including these organisms, among the most ubiquitous of all, in their analyses of trace evidence, we have developed standards for the sampling, conservation, and custody of mite evidence of forensic interest.
Abstract:
In the analysis of complex nuclear forensic samples containing lanthanides, actinides and matrix elements, rapid selective extraction of Am/Cm for quantification is challenging, in particular due to the difficulty of separating Am/Cm from the lanthanides. Here we present a separation process for Am/Cm(III) that uses AG1-X8 chromatography followed by Am/Cm extraction with a triazine ligand. The ligands tested in our process were CyMe4-BTPhen, CyMe4-BTBP, CA-BTP and CA-BTPhen. Our process allows purification and quantification of Am and Cm (recoveries 80%-100%) and other major actinides in <2 d without the use of multiple columns or thiocyanate. The process is unaffected by high levels of Ca(II)/Fe(III)/Al(III) (10 mg mL−1) and thus requires little pre-treatment of samples.
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model’s output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model’s simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution implied by a Bayesian analysis combining the spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, the largest belonging to non-vegetated surfaces. Among vegetated surfaces, the most prominent area of uncertainty is Southern Africa, a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
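As a rough illustration of this kind of confusion-matrix-driven Monte Carlo scheme, the sketch below resamples the "true" class of each mapped pixel and aggregates the draws into proportions. It is a minimal, non-spatial stand-in for the paper's method (which also uses spatial information); the confusion matrix, labels, and function names are hypothetical.

```python
import numpy as np

# Assumes a confusion matrix C where C[i, j] counts pixels mapped as class i
# whose reference (true) class is j.
def sample_proportions(mapped_labels, confusion, n_draws=1000, rng=None):
    rng = np.random.default_rng(rng)
    n_classes = confusion.shape[0]
    # P(true class j | mapped class i), estimated row-wise from the matrix.
    cond = confusion / confusion.sum(axis=1, keepdims=True)
    counts = np.bincount(mapped_labels, minlength=n_classes)
    draws = np.empty((n_draws, n_classes))
    for d in range(n_draws):
        realisation = np.zeros(n_classes)
        for i in range(n_classes):
            # Resample the "true" classes of all pixels mapped as class i.
            realisation += rng.multinomial(counts[i], cond[i])
        draws[d] = realisation / counts.sum()
    return draws  # each row: one realisation of land cover proportions

# Example: 3 classes, a diagonal-dominant confusion matrix.
C = np.array([[90, 5, 5], [10, 80, 10], [2, 8, 90]])
labels = np.repeat([0, 1, 2], [500, 300, 200])
props = sample_proportions(labels, C, n_draws=200)
print(props.mean(axis=0), props.std(axis=0))
```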
Abstract:
Objective. This study was designed to determine the precision and accuracy of angular measurements made using three-dimensional computed tomography (3D-CT) volume rendering. Study design. The study population consisted of 28 dried skulls that were scanned with a 64-row multislice CT, from which 3D-CT images were generated. Conventional craniometric anatomical landmarks (n = 9) were identified independently in the 3D-CT images by 2 radiologists, twice each, and angular measurements (n = 6) based on these landmarks were then performed on the 3D-CT images. Subsequently, physical measurements were made by a third examiner using a Beyond Crysta-C9168 series 900 device. Results. The results demonstrated no statistically significant difference between interexaminer and intraexaminer analyses. The mean differences between the physical and 3D-based angular measurements were -1.18% and -0.89% for the 2 examiners, respectively, demonstrating high accuracy. Conclusion. Maxillofacial analysis of angular measurements using 3D-CT volume rendering by 64-row multislice CT is established and can be used for orthodontic and dentofacial orthopedic applications.
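For readers unfamiliar with how such angular measurements are derived from landmarks, the snippet below computes the angle at a vertex landmark from three 3D points. It is a generic illustration; the landmark names and coordinates are invented, and the study's actual software is not reproduced.

```python
import numpy as np

# Angle (in degrees) at vertex b, formed by the rays b->a and b->c.
def angle_deg(a, b, c):
    u, v = np.asarray(a) - b, np.asarray(c) - b
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical landmark coordinates (mm).
nasion, sella, basion = [0.0, 95.0, 10.0], [0.0, 70.0, 25.0], [0.0, 60.0, 0.0]
print(f"{angle_deg(nasion, sella, basion):.1f} degrees")
```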
Abstract:
The genome of the most virulent among 22 Brazilian geographical isolates of Spodoptera frugiperda nucleopolyhedrovirus, isolate 19 (SfMNPV-19), was completely sequenced and shown to comprise 132,565 bp and 141 open reading frames (ORFs). A total of 11 ORFs with no homology to genes in the GenBank database were found. Of those, four had typical baculovirus promoter motifs and polyadenylation sites. Computer-simulated restriction enzyme cleavage patterns of SfMNPV-19 were compared with published physical maps of other SfMNPV isolates. Differences were observed in the restriction profiles and genome size. Comparison of SfMNPV-19 with the sequence of the SfMNPV isolate 3AP2 indicated that they differed by a 1427 bp deletion, as well as by a series of smaller deletions and point mutations. The majority of genes of SfMNPV-19 were conserved in the closely related Spodoptera exigua NPV (SeMNPV) and Agrotis segetum NPV (AgseMNPV-A), but a few regions experienced major changes and rearrangements. Syntenic maps for the genomes of group II NPVs revealed that gene collinearity was observed only within certain clusters. Analysis of the dynamics of gene gain and loss along the phylogenetic tree of the NPVs showed that group II had only five defining genes and supported the hypothesis that these viruses form ten highly divergent ancient lineages. Crucially, more than 60% of the gene gain events followed a power-law relation to genetic distance among baculoviruses, indicative of temporal organization in the gene accretion process.
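The "computer-simulated restriction enzyme cleavage patterns" mentioned above amount to an in-silico digestion. The toy sketch below cuts a short sequence at every occurrence of a recognition site and reports fragment sizes; the real analysis operated on the full genome with the enzymes used in the published physical maps.

```python
# Toy in-silico restriction digestion of a linear sequence.
def digest(sequence, site, cut_offset):
    """Return fragment lengths after cutting at every occurrence of `site`."""
    cut_positions, start = [], 0
    while (i := sequence.find(site, start)) != -1:
        cut_positions.append(i + cut_offset)
        start = i + 1
    edges = [0] + cut_positions + [len(sequence)]
    return [b - a for a, b in zip(edges, edges[1:])]

# EcoRI recognizes GAATTC and cuts after the G (offset 1).
seq = "ATGAATTCGGCTAGAATTCTTAGCGAATTCAA"
print(digest(seq, "GAATTC", 1))  # fragment sizes in bp
```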
Abstract:
Developed countries show an even distribution of published papers across the seventeen model organisms considered. Developing countries show biased preferences for a few model organisms, which are associated with endemic human diseases. A variant of the Hirsch index, which we call the mean (mo)h-index ("model organism h-index"), shows an exponential relationship with the number of papers published in each country on the selected model organisms. Developing countries cluster together with low mean (mo)h-indexes, even those with high numbers of publications. The growth curves of publications on the recent model organism Caenorhabditis elegans in developed countries show different shapes. We also analyzed the growth curves of indexed publications originating from developing countries; Brazil and South Korea were selected for this comparison. The most prevalent model organisms in those countries show different growth curves when compared to a global analysis, reflecting the size and composition of their research communities.
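As a sketch of the metric involved: the h-index of a citation list is the largest h such that h papers have at least h citations each, and a mean (mo)h-index averages per-organism h-indexes for a country. The code below illustrates that idea with invented citation counts; the paper's exact definition and normalization are not reproduced.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

# Hypothetical per-organism citation counts for one country.
by_organism = {
    "C. elegans": [25, 12, 8, 5, 3, 1],
    "D. melanogaster": [40, 22, 9, 7, 2],
    "S. cerevisiae": [15, 4, 4, 1],
}
mo_h = {org: h_index(cites) for org, cites in by_organism.items()}
mean_mo_h = sum(mo_h.values()) / len(mo_h)
print(mo_h, mean_mo_h)
```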
Abstract:
Multidimensional visualization techniques are invaluable tools for the analysis of structured and unstructured data with variable dimensionality. This paper introduces PEx-Image (Projection Explorer for Images), a tool aimed at supporting the analysis of image collections. The tool supports a methodology that employs interactive visualizations to aid user-driven feature detection and classification tasks, thus offering improved analysis and exploration capabilities. The visual mappings employ similarity-based multidimensional projections and point placement to lay out the data on a plane for visual exploration. In addition to its application to image databases, we also illustrate how the proposed approach can be successfully employed in the simultaneous analysis of different data types, such as text and images, offering a common visual representation for data expressed in different modalities.
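A similarity-based multidimensional projection maps high-dimensional feature vectors to 2-D points whose distances approximate the original similarities. The sketch below uses metric MDS from scikit-learn as a stand-in; PEx-Image employs its own projection techniques, and the feature vectors here are random placeholders.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
features = rng.normal(size=(60, 128))  # e.g., 60 images, 128-d feature vectors
# Project to the plane so that 2-D distances approximate feature distances.
xy = MDS(n_components=2, random_state=0).fit_transform(features)
print(xy.shape)  # (60, 2): one 2-D point per image, ready to plot
```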
Abstract:
In this paper, we present an algorithm for cluster analysis that integrates aspects of cluster ensembles and multi-objective clustering. The algorithm is based on a Pareto-based multi-objective genetic algorithm with a special crossover operator, which uses clustering validation measures as objective functions. The proposed algorithm can deal with data sets presenting different types of clusters, without requiring expertise in cluster analysis. Its result is a concise set of partitions representing alternative trade-offs among the objective functions. We compare the results obtained with our algorithm, in the context of gene expression data sets, to those achieved with Multi-Objective Clustering with automatic K-determination (MOCK), the algorithm most closely related to ours. (C) 2009 Elsevier B.V. All rights reserved.
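To make the idea concrete, the sketch below runs a drastically simplified Pareto-based loop: individuals are cluster label vectors, two validation measures (negative silhouette and the Davies-Bouldin index, both minimized) act as objectives, and selection keeps the non-dominated set. It omits the paper's special crossover operator and is in no way the published algorithm or MOCK.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score, davies_bouldin_score

X, _ = make_blobs(n_samples=120, centers=3, random_state=1)
rng = np.random.default_rng(1)

def objectives(labels):
    if len(set(labels)) < 2:
        return (np.inf, np.inf)  # degenerate partition: dominated by all
    return (-silhouette_score(X, labels), davies_bouldin_score(X, labels))

def mutate(labels, rate=0.05):
    child = labels.copy()
    flip = rng.random(len(child)) < rate
    child[flip] = rng.integers(0, 4, flip.sum())  # reassign a few points
    return child

def non_dominated(pop, scores):
    keep = [i for i, s in enumerate(scores)
            if not any(all(o <= x for o, x in zip(other, s)) and other != s
                       for other in scores)]
    return [pop[i] for i in keep]

population = [rng.integers(0, 4, len(X)) for _ in range(20)]
for _ in range(30):
    population += [mutate(p) for p in population]
    scores = [objectives(p) for p in population]
    front = non_dominated(population, scores)
    population = (front + population)[:20]
print(f"{len(front)} trade-off partitions on the final Pareto front")
```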
Abstract:
A conceptual problem that appears in different contexts of clustering analysis is that of measuring the degree of compatibility between two sequences of numbers. This problem is usually addressed by means of numerical indexes referred to as sequence correlation indexes. This paper elaborates on why some specific sequence correlation indexes may not be good choices depending on the application scenario at hand. A variant of the product-moment correlation coefficient and a weighted formulation for the Goodman-Kruskal and Kendall's indexes are derived that may be more appropriate for some particular application scenarios. The proposed and existing indexes are analyzed from different perspectives, such as their sensitivity to the ranks and magnitudes of the sequences under evaluation, among other relevant aspects of the problem. The results help suggest scenarios within the context of clustering analysis that are possibly more appropriate for the application of each index. (C) 2008 Elsevier Inc. All rights reserved.
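For reference, the classical unweighted forms of two of the indexes under discussion are shown below; the paper's contribution lies in variants and weighted formulations of these, which are not reproduced here.

```latex
% Product-moment correlation between sequences x and y, and Kendall's tau
% in terms of concordant (n_c) and discordant (n_d) pairs.
\[
  r = \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}
           {\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^{2}}\,
            \sqrt{\sum_{i=1}^{n}(y_i-\bar{y})^{2}}},
  \qquad
  \tau = \frac{n_c - n_d}{\binom{n}{2}}.
\]
```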
Abstract:
Successful classification, information retrieval and image analysis tools are intimately related to the quality of the features employed in the process. Pixel intensities, color, texture and shape are, generally, the basis from which most features are computed and used in such fields. This paper presents a novel shape-based feature extraction approach in which an image is decomposed into multiple contours, which are further characterized by Fourier descriptors. Unlike traditional approaches, we make use of topological knowledge to generate well-defined closed contours, which are efficient signatures for image retrieval. The method has been evaluated in the context of content-based image retrieval (CBIR) and image analysis. The results have shown that the multi-contour decomposition, as opposed to a single shape signature, introduced a significant improvement in discrimination power. (c) 2008 Elsevier B.V. All rights reserved.
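In outline, Fourier descriptors treat the boundary points of a closed contour as complex numbers and keep low-order FFT coefficients, normalized for translation, scale, and rotation. The sketch below shows that standard construction for a single contour; the paper's multi-contour decomposition and topological contour extraction are not reproduced.

```python
import numpy as np

def fourier_descriptors(contour_xy, n_coeffs=16):
    # Represent boundary points as complex numbers and take the FFT.
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
    coeffs = np.fft.fft(z)
    # Drop the DC term (translation) and normalize by the first harmonic
    # (scale); keeping magnitudes only discards rotation/start-point phase.
    return np.abs(coeffs[1:n_coeffs + 1]) / np.abs(coeffs[1])

# Example: a noisy circle sampled at 64 boundary points.
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.c_[np.cos(t), np.sin(t)]
circle += np.random.default_rng(0).normal(0, 0.01, (64, 2))
print(fourier_descriptors(circle)[:4])
```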
Abstract:
In this work we introduce a new hierarchical surface decomposition method for multiscale analysis of surface meshes. In contrast to other multiresolution methods, our approach relies on spectral properties of the surface to build a binary hierarchical decomposition. Namely, we utilize the first nontrivial eigenfunction of the Laplace-Beltrami operator to recursively decompose the surface. For this reason we coin our surface decomposition the Fiedler tree. Using the Fiedler tree ensures a number of attractive properties, including mesh-independent decomposition, well-formed and nearly equi-areal surface patches, and noise robustness. We show how the evenly distributed patches can be exploited for generating multiresolution, high-quality uniform meshes. Additionally, our decomposition provides a natural means of carrying out wavelet methods, resulting in an intuitive method for producing feature-sensitive meshes at multiple scales. Published by Elsevier Ltd.
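One level of such a spectral split can be sketched on a mesh adjacency graph: compute the first nontrivial eigenvector of the graph Laplacian (the Fiedler vector) and partition vertices by its sign. The code below uses a plain graph Laplacian as a stand-in for the Laplace-Beltrami operator and does not recurse, so it is only a hint of the full Fiedler tree construction.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import laplacian
from scipy.sparse.linalg import eigsh

def fiedler_split(adjacency):
    L = laplacian(csr_matrix(adjacency, dtype=float))
    # Two smallest eigenpairs; the second eigenvector is the Fiedler vector.
    vals, vecs = eigsh(L, k=2, which="SM")
    fiedler = vecs[:, np.argsort(vals)[1]]
    return fiedler >= 0  # boolean mask: which half each vertex falls in

# Example: a path graph of 6 vertices splits into its two halves.
A = np.diag(np.ones(5), 1); A += A.T
print(fiedler_split(A))
```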
Abstract:
The use of inter-laboratory test comparisons to determine the performance of individual laboratories for specific tests or calibrations [ISO/IEC Guide 43-1, 1997. Proficiency testing by interlaboratory comparisons - Part 1: Development and operation of proficiency testing schemes] is called proficiency testing (PT). In this paper we propose the use of the generalized likelihood ratio test to compare the performance of a group of laboratories for specific tests relative to the assigned value, and we illustrate the procedure with actual data from a PT program in the area of volume. The proposed test extends the test criteria in use, allowing one to test for the consistency of the group of laboratories. Moreover, the class of elliptical distributions is considered for the obtained measurements. (C) 2008 Elsevier B.V. All rights reserved.
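For reference, the generalized likelihood ratio statistic has the classical form below, where the consistency hypothesis restricts the parameters to a subset Theta_0 of the full parameter space Theta; the chi-square limit is the standard asymptotic result under regularity conditions, while the paper works with the wider class of elliptical distributions.

```latex
% Classical generalized likelihood ratio statistic; H0 (consistency of the
% group with the assigned value) restricts the parameters to Theta_0.
\[
  \lambda(\mathbf{x})
    = \frac{\sup_{\theta \in \Theta_0} L(\theta \mid \mathbf{x})}
           {\sup_{\theta \in \Theta} L(\theta \mid \mathbf{x})},
  \qquad
  -2\log \lambda(\mathbf{x}) \xrightarrow{d} \chi^{2}_{r},
\]
% where r is the number of parameters fixed by H0.
```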
Abstract:
Aspect-oriented programming (AOP) is a promising technology that supports the separation of crosscutting concerns (i.e., functionality that tends to be tangled with, and scattered through, the rest of the system). In AOP, a method-like construct named advice is applied to join points in the system through a special construct named pointcut. This mechanism supports the modularization of crosscutting behavior; however, since the added interactions are not explicit in the source code, it is hard to ensure their correctness. To tackle this problem, this paper presents a rigorous coverage analysis approach to ensure that the logic of each advice - statements, branches, and def-use pairs - is exercised at each affected join point. To make this analysis possible, a structural model based on Java bytecode - called the PointCut-based Def-Use Graph (PCDU) - is proposed, along with three integration testing criteria. Theoretical, empirical, and exploratory studies involving 12 aspect-oriented programs and several fault examples present evidence of the feasibility and effectiveness of the proposed approach. (C) 2010 Elsevier Inc. All rights reserved.
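The flavor of per-join-point advice coverage can be mimicked loosely in plain Python, with a decorator standing in for advice and decorated functions for join points; the tracker below records which advice branch runs at which join point. This is an analogy only, not AspectJ and not the paper's PCDU model.

```python
import functools

coverage = {}  # (join_point, branch) -> hit count

def logging_advice(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # The advice has two branches; record which one ran at this join point.
        branch = "negative" if args and args[0] < 0 else "normal"
        key = (func.__name__, branch)
        coverage[key] = coverage.get(key, 0) + 1
        return func(*args, **kwargs)
    return wrapper

@logging_advice
def deposit(amount): return amount

@logging_advice
def withdraw(amount): return -amount

deposit(10); withdraw(5); withdraw(-1)
# The "negative" branch of the advice was never exercised at the deposit
# join point, which is exactly the kind of gap per-join-point coverage finds.
print(coverage)
```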
Abstract:
The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as the programming environment. Since application parameters and hardware in a joint experiment are complex, with a large variability of components, the requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, the systems are developed using the eXtensible Markup Language (XML). The communication between clients and servers uses remote procedure calls (RPC) based on XML (XML-RPC). The integration of the Java language, XML and XML-RPC makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides simple graphical user interface (GUI) access. The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusão Nuclear, Instituto Superior Técnico, Universidade Técnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
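Since the abstract centers on XML-RPC as the client-server layer, here is a minimal sketch of that protocol using the Python standard library. The method name and payload are invented; the actual TCABR system is written in Java, and its API is not reproduced here.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def get_signal(shot, name):
    """Stand-in for a data-retrieval method exposed by a laboratory server."""
    return {"shot": shot, "name": name, "samples": [0.0, 0.1, 0.4]}

# Laboratory side: expose the method over XML-RPC.
server = SimpleXMLRPCServer(("localhost", 8001), logRequests=False)
server.register_function(get_signal)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: every participating laboratory uses the same call signature.
client = ServerProxy("http://localhost:8001")
print(client.get_signal(31415, "plasma_current"))
server.shutdown()
```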