887 results for Computer forensic analysis
Abstract:
Objectives: The aim of this study was to determine the precision of measurements at 2 craniometric anatomic points, the glabella and the anterior nasal spine, in order to verify their suitability as potential locations for placing implants for nasal prosthesis retention. Methods: Twenty-six dry human skulls were scanned in a high-resolution spiral tomograph with 1-mm axial slice thickness and 1-mm interval reconstruction using a bone tissue filter. The images obtained were stored and transferred to an independent workstation containing e-film imaging software. The measurements (in the glabella and anterior nasal fossa) were made independently by 2 observers, twice for each measurement. Data were submitted to statistical analysis (parametric t test). Results: The results demonstrated no statistically significant difference between interobserver and intraobserver measurements (P > .05). The standard error was found to be between 0.49 mm and 0.84 mm for measurements in bone protocol, indicating a high level of precision. Conclusions: The measurements obtained at the anterior nasal spine and glabella were considered precise and reproducible. Mean values of these measurements pointed to the possibility of implant placement in these regions, particularly in the anterior nasal spine.
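The interobserver comparison described above reduces to paired t tests on repeated measurements. A minimal sketch, using hypothetical distances in millimetres rather than the study's data, might look like this:

```python
# A minimal sketch (hypothetical values, not the study's data): paired t test
# between two observers plus the standard error of measurement (SEM).
import numpy as np
from scipy import stats

# Hypothetical distances (mm) at the anterior nasal spine, two observers
obs1 = np.array([7.1, 6.8, 7.4, 7.0, 6.9, 7.2])
obs2 = np.array([7.0, 6.9, 7.3, 7.1, 6.8, 7.3])

# Parametric paired t test: no significant interobserver difference if P > .05
t, p = stats.ttest_rel(obs1, obs2)

# Standard error of measurement from the paired differences
sem = np.std(obs1 - obs2, ddof=1) / np.sqrt(2)
print(f"t = {t:.3f}, P = {p:.3f}, SEM = {sem:.2f} mm")
```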
Abstract:
Background: Understanding how clinical variables affect stress distribution facilitates optimal prosthesis design and fabrication and may lead to a decrease in mechanical failures as well as improved implant longevity. Purpose: In this study, the many clinical variations present in implant-supported prostheses were analyzed by the 3-D finite element method. Materials and Methods: A geometric model representing the anterior segment of a human mandible treated with 5 implants supporting a framework was created to perform the tests. The variables introduced in the computer model were cantilever length, elastic modulus of cancellous bone, abutment length, implant length, and framework alloy (AgPd or CoCr). The computer was programmed with the physical properties of the materials as derived from the literature, and a 100 N vertical load was used to simulate the occlusal force. Images with the stress fringes were obtained, and the maximum stress at each site was plotted in graphs for comparison. Results: Stresses clustered at the elements closest to the loading point. The stress increase was found to be proportional to the increase in cantilever length and inversely proportional to the increase in the elastic modulus of cancellous bone. Increasing the abutment length resulted in a decrease of stress on implants and framework. A stress decrease could not be demonstrated with implants longer than 13 mm. A stiffer framework may allow better stress distribution. Conclusion: The relative physical properties of the many materials involved in an implant-supported prosthesis system affect the way stresses are distributed.
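The reported proportionality between stress and cantilever length follows from elementary beam statics: the bending moment at the most distal implant is M = F x L. A toy calculation, not the study's 3-D finite element model, with hypothetical section properties:

```python
# Toy beam statics (not the study's 3-D FEM): the bending moment at the most
# distal implant, M = F * L, grows linearly with cantilever length L, so the
# peak bending stress sigma = M * c / I does too. Section values hypothetical.
F = 100.0                        # vertical occlusal load, N (as in the study)
c, I = 2e-3, 1e-9                # hypothetical half-height (m) and inertia (m^4)
for L in (0.010, 0.015, 0.020):  # cantilever lengths, m
    sigma = F * L * c / I        # peak bending stress, Pa
    print(f"L = {L * 1000:.0f} mm -> sigma = {sigma / 1e6:.1f} MPa")
```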
Abstract:
The catalytic properties of enzymes are usually evaluated by measuring and analyzing reaction rates. However, analyzing the complete time course can be advantageous because it contains additional information about the properties of the enzyme. Moreover, for systems that are not at steady state, the analysis of time courses is the preferred method. One of the major barriers to the wide application of time courses is that it may be computationally more difficult to extract information from these experiments. Here the basic approach to analyzing time courses is described, together with some examples of the essential computer code to implement these analyses. A general method that can be applied to both steady-state and non-steady-state systems is recommended. (C) 2001 Academic Press.
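As an illustration of the kind of essential code the paper refers to, here is a minimal sketch that fits a progress curve by numerically integrating the Michaelis-Menten rate law; the function names, parameter values, and simulated data are illustrative, not the code published with the paper:

```python
# A minimal time-course (progress-curve) fit: integrate the Michaelis-Menten
# rate law numerically and fit Vmax and Km by least squares. Names and values
# are illustrative, not the code published with the paper.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import curve_fit

def progress_curve(t, vmax, km, s0=100.0):
    """Substrate concentration over time from d[S]/dt = -Vmax*[S]/(Km + [S])."""
    rate = lambda s, _t: -vmax * s / (km + s)
    return odeint(rate, s0, t).ravel()

# Simulated "observed" progress curve with measurement noise
rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 60.0, 13)
s_obs = progress_curve(t_obs, vmax=5.0, km=20.0) + rng.normal(0.0, 1.0, t_obs.size)

# Fitting the full time course uses all of the data, not just initial rates
(vmax_fit, km_fit), _ = curve_fit(progress_curve, t_obs, s_obs, p0=[1.0, 10.0])
print(f"Vmax = {vmax_fit:.2f}, Km = {km_fit:.2f}")
```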
Abstract:
Map algebra is a data model and simple functional notation for studying the distribution and patterns of spatial phenomena. It uses a uniform representation of space as discrete grids, which are organized into layers. This paper discusses extensions to map algebra to handle neighborhood operations with a new data type called a template. Templates provide general windowing operations on grids to enable spatial models for cellular automata, mathematical morphology, and local spatial statistics. A programming language for map algebra that incorporates templates and special processing constructs, called MapScript, is described. Example program scripts are presented that perform diverse and interesting neighborhood analyses for descriptive, model-based, and process-based analysis.
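A template in this sense is a weighted window applied at every grid cell. MapScript itself is not reproduced here, but a minimal NumPy sketch of a template-style focal-mean operation, with illustrative data, conveys the idea:

```python
# A template-style neighborhood operation on a grid layer: the template's
# weights define a windowing operation applied at every cell. Generic
# Python/NumPy with illustrative data; MapScript itself is not shown.
import numpy as np
from scipy.ndimage import convolve

grid = np.arange(25, dtype=float).reshape(5, 5)  # one raster layer

# A 3x3 template of equal weights implements a focal (neighborhood) mean
template = np.ones((3, 3)) / 9.0

# Applying the template to every cell of the layer yields a new layer
focal_mean = convolve(grid, template, mode="nearest")
print(focal_mean)
```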
Abstract:
A combination of modelling and analysis techniques was used to design a six-component force balance. The balance was designed specifically for the measurement of the impulsive aerodynamic forces and moments characteristic of hypervelocity shock tunnel testing, using the stress wave force measurement technique. Aerodynamic modelling was used to estimate the magnitude and distribution of forces, and finite element modelling to determine the mechanical response of proposed balance designs. Simulation of balance performance was based on aerodynamic loads and mechanical responses using convolution techniques. Deconvolution was then used to assess balance performance and to guide further design modifications leading to the final balance design. (C) 2001 Elsevier Science Ltd. All rights reserved.
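The simulate-then-recover loop described above can be sketched compactly: the measured signal is the applied load convolved with the balance's impulse response, and deconvolution (here regularized, in the frequency domain) recovers the load. The impulse response and load below are hypothetical stand-ins, not the balance's actual responses:

```python
# Convolution/deconvolution sketch: the measured signal is the applied load
# convolved with the balance's impulse response; FFT-domain deconvolution
# (with a small regularization term) recovers the load. The impulse response
# and load are hypothetical stand-ins, not the balance's actual responses.
import numpy as np

n, dt = 512, 1e-5
t = np.arange(n) * dt
impulse = np.exp(-t / 2e-4) * np.sin(2 * np.pi * 5e3 * t)  # hypothetical response
load = np.where((t > 5e-4) & (t < 1.5e-3), 100.0, 0.0)     # step-like test load

# Forward simulation of the measured strain signal (convolution)
measured = np.convolve(load, impulse)[:n] * dt

# Regularized frequency-domain deconvolution to recover the load
H = np.fft.rfft(impulse) * dt
Y = np.fft.rfft(measured)
eps = 1e-3 * np.max(np.abs(H))
recovered = np.fft.irfft(Y * np.conj(H) / (np.abs(H) ** 2 + eps ** 2), n)
print(f"peak load: applied {load.max():.1f} N, recovered {recovered.max():.1f} N")
```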
Abstract:
The use of gate-to-drain capacitance (C-gd) measurement as a tool to characterize hot-carrier-induced charge centers in submicron n- and p-MOSFETs has been reviewed and demonstrated. By analyzing the change in C-gd measured at room and cryogenic temperature before and after high gate-to-drain transverse field (high-field) and maximum substrate current (I-bmax) stress, it is concluded that the degradation is mostly due to trapping of majority carriers and generation of interface states. These interface states were found to be acceptor states in the top half of the band gap for n-MOSFETs and donor states in the bottom half of the band gap for p-MOSFETs. In general, hot electrons are more likely to be trapped in the gate oxide than hot holes, while the presence of hot holes generates more interface states. Also, we have demonstrated a new method for extracting the spatial distribution of oxide trapped charge, Q(ot), through gate-to-substrate capacitance (C-gb) measurement. This method is simple to implement and does not require additional information from simulation or detailed knowledge of the device's structure. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
Understanding the genetic architecture of quantitative traits can greatly assist the design of strategies for their manipulation in plant-breeding programs. For a number of traits, genetic variation can be the result of segregation of a few major genes and many polygenes (minor genes). Joint segregation analysis (JSA) is a maximum-likelihood approach for fitting segregation models through the simultaneous use of phenotypic information from multiple generations. Our objective in this paper was to use computer simulation to quantify the power of the JSA method for testing the mixed-inheritance model for quantitative traits when applied to the six basic generations: both parents (P-1 and P-2), F-1, F-2, and both backcross generations (B-1 and B-2) derived from crossing the F-1 to each parent. A total of 1968 genetic model-experiment scenarios were considered in the simulation study to quantify the power of the method. The factors that interacted to influence the power of the JSA method to correctly detect genetic models were: (1) whether there were one or two major genes in combination with polygenes, (2) the heritability of the major genes and polygenes, (3) the level of dispersion of the major genes and polygenes between the two parents, and (4) the number of individuals examined in each generation (population size). The greatest levels of power were observed for the genetic models defined with simple inheritance; e.g., the power was greater than 90% for the one-major-gene model, regardless of the population size and major-gene heritability. Lower levels of power were observed for the genetic models with complex inheritance (major genes and polygenes), low heritability, small population sizes, and a large dispersion of favourable genes between the two parents; e.g., the power was less than 5% for the two-major-gene model with a heritability value of 0.3 and population sizes of 100 individuals. The JSA methodology was then applied to a previously studied sorghum data set to investigate the genetic control of the putative drought-resistance trait osmotic adjustment in three crosses. The previous study concluded that there were two major genes segregating for osmotic adjustment in the three crosses. Application of the JSA method resulted in a change in the proposed genetic model: the presence of the two major genes was confirmed, with the addition of an unspecified number of polygenes.
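A minimal Monte Carlo sketch of such a power calculation, greatly simplified from JSA (which jointly fits all six generations), simulates F2 phenotypes under one major gene plus polygenic noise and counts how often model selection prefers a mixture over a single normal; all parameter values are illustrative:

```python
# Monte Carlo power sketch, greatly simplified from JSA: simulate F2
# phenotypes under one additive major gene plus polygenic/environmental
# noise, and count how often model selection (BIC) prefers a 3-component
# mixture over a single normal. All parameter values are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

def simulate_f2(n, a=1.5, sigma=1.0):
    """F2 phenotypes: genotypes aa/Aa/AA in a 1:2:1 ratio, additive effect a."""
    g = rng.choice([-a, 0.0, a], size=n, p=[0.25, 0.5, 0.25])
    return (g + rng.normal(0.0, sigma, n)).reshape(-1, 1)

def detects_major_gene(y):
    """True if a 3-component normal mixture beats a single normal by BIC."""
    bic1 = GaussianMixture(n_components=1, random_state=0).fit(y).bic(y)
    bic3 = GaussianMixture(n_components=3, random_state=0).fit(y).bic(y)
    return bic3 < bic1

# Empirical power: fraction of simulated experiments in which the major
# gene is detected, here for a population size of 200 individuals
power = np.mean([detects_major_gene(simulate_f2(200)) for _ in range(100)])
print(f"empirical power: {power:.2f}")
```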
Abstract:
Qualitative data analysis (QDA) is often a time-consuming and laborious process usually involving the management of large quantities of textual data. Recently developed computer programs offer great advances in the efficiency of the processes of QDA. In this paper we report on an innovative use of a combination of extant computer software technologies to further enhance and simplify QDA. Used in appropriate circumstances, we believe that this innovation greatly enhances the speed with which theoretical and descriptive ideas can be abstracted from rich, complex, and chaotic qualitative data. © 2001 Human Sciences Press, Inc.
Abstract:
The personal computer revolution has resulted in the widespread availability of low-cost image analysis hardware. At the same time, new graphic file formats have made it possible to handle and display images at resolutions beyond the capability of the human eye. Consequently, there has been a significant research effort in recent years aimed at making use of these hardware and software technologies for flotation plant monitoring. Computer-based vision technology is now moving out of the research laboratory and into the plant to become a useful means of monitoring and controlling flotation performance at the cell level. This paper discusses the metallurgical parameters that influence surface froth appearance and examines the progress that has been made in image analysis of flotation froths. The texture spectrum and pixel tracing techniques developed at the Julius Kruttschnitt Mineral Research Centre are described in detail. The commercial implementation, JKFrothCam, is one of a number of froth image analysis systems now reaching maturity. In plants where it is installed, JKFrothCam has shown a number of performance benefits. Flotation runs more consistently, meeting product specifications while maintaining high recoveries. The system has also shown secondary benefits in that reagent costs have been significantly reduced as a result of improved flotation control. (C) 2002 Elsevier Science B.V. All rights reserved.
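The texture spectrum technique codes each pixel's 3x3 neighborhood against its centre value and histograms the resulting texture-unit numbers. A generic sketch of that idea, not the JKFrothCam implementation, applied to a random stand-in image:

```python
# Texture spectrum sketch (texture units): each pixel's eight neighbors are
# coded 0/1/2 against the centre pixel and combined base-3 into a texture
# unit number; the spectrum is the histogram of those numbers. Generic code,
# not the JKFrothCam implementation; the input image is a random stand-in.
import numpy as np

def texture_spectrum(img):
    """Histogram of texture-unit numbers (0 .. 3**8 - 1) over the image."""
    centre = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    unit = np.zeros_like(centre, dtype=np.int64)
    for i, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code = np.where(nb < centre, 0, np.where(nb == centre, 1, 2))
        unit += code * 3 ** i  # neighbor coded 0 (less), 1 (equal), 2 (greater)
    return np.bincount(unit.ravel(), minlength=3 ** 8)

froth = np.random.randint(0, 256, (64, 64))  # stand-in for a froth image
spectrum = texture_spectrum(froth)
print(spectrum.argmax(), spectrum.max())
```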
Abstract:
A field matching method is described to analyze a recessed circular cavity radiating into a radial waveguide. Using the wall impedance approach, the analysis is divided into two separate problems of the cavity and its external environment. Based on this analysis, a computer algorithm is developed for determining wall admittances as seen at the edge of the patch in the cavity, the radial admittance matrix for the two-probe feed arrangement, and the input impedance as observed from the coaxial line feeding the cavity. This algorithm is tested against the general-purpose Hewlett-Packard finite-element High Frequency Structure Simulator as well as against measured results. Good agreement in all considered cases is noted.
Abstract:
Observations of an insect's movement lead to theory on the insect's flight behaviour and the role of movement in the species' population dynamics. This theory leads to predictions of the way the population changes in time under different conditions. If a hypothesis on movement predicts a specific change in the population, then the hypothesis can be tested against observations of population change. Routine pest monitoring of agricultural crops provides a convenient source of data for studying movement into a region and among fields within a region. Examples of the use of statistical and computational methods for testing hypotheses with such data are presented. The types of questions that can be addressed with these methods and the limitations of pest monitoring data when used for this purpose are discussed. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
In this paper we use sensor-annotated abstraction hierarchies (AHs; Reising & Sanderson, 1996, 2002a,b) to show that, unless appropriately instrumented, configural displays designed according to the principles of ecological interface design (EID) might be vulnerable to misinterpretation when sensors become unreliable or are unavailable. Building on foundations established in Reising and Sanderson (2002a), we use a pasteurization process control example to show how sensor-annotated AHs help the analyst determine the impact of different instrumentation engineering policies on a configural display that is part of an ecological interface. Our analyses suggest that configural displays showing higher-order properties of a system are especially vulnerable under some conservative instrumentation configurations. However, sensor-annotated AHs can be used to indicate where corrective instrumentation might be placed. We argue that if EID is to be effectively employed in the design of displays for complex systems, then the information needs of the human operator must be considered while instrumentation requirements are being formulated. Rasmussen's abstraction hierarchy, particularly its extension to the analysis of information captured by sensors and derived from sensors, may therefore be a useful adjunct to upstream instrumentation design. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
In this paper we establish a foundation for understanding the instrumentation needs of complex dynamic systems if ecological interface design (EID)-based interfaces are to be robust in the face of instrumentation failures. EID-based interfaces often include configural displays which reveal the higher-order properties of complex systems. However, concerns have been expressed that such displays might be misleading when instrumentation is unreliable or unavailable. Rasmussen's abstraction hierarchy (AH) formalism can be extended to include representations of sensors near the functions or properties about which they provide information, resulting in what we call a sensor-annotated abstraction hierarchy. Sensor-annotated AHs help the analyst determine the impact of different instrumentation engineering policies on higher-order system information by showing how the data provided from individual sensors propagates within and across levels of abstraction in the AH. The use of sensor-annotated AHs with a configural display is illustrated with a simple water reservoir example. We argue that if EID is to be effectively employed in the design of interfaces for complex systems, then the information needs of the human operator need to be considered at the earliest stages of system development while instrumentation requirements are being formulated. In this way, Rasmussen's AH promotes a formative approach to instrumentation engineering. (C) 2002 Elsevier Science Ltd. All rights reserved.
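The propagation idea can be sketched as a small data structure: each node in the hierarchy is either read directly from a sensor or derived from lower-level nodes, so the loss of one sensor can be traced upward through the levels of abstraction. The reservoir nodes below are illustrative, not the paper's worked example:

```python
# Sensor-annotated hierarchy sketch: a node is either read directly from a
# sensor or derived from lower-level nodes, so the loss of one sensor can be
# traced upward through the levels of abstraction. The reservoir nodes are
# illustrative, not the paper's worked example.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Node:
    name: str
    sensor_ok: Optional[bool] = None      # None means derived, not sensed
    inputs: list["Node"] = field(default_factory=list)

    def available(self) -> bool:
        """A value is available if its sensor works or all inputs are available."""
        if self.sensor_ok is not None:
            return self.sensor_ok
        return bool(self.inputs) and all(n.available() for n in self.inputs)

# Higher-order mass balance derived from two flow sensors
inflow = Node("inflow rate", sensor_ok=True)
outflow = Node("outflow rate", sensor_ok=False)   # failed sensor
mass_balance = Node("mass balance", inputs=[inflow, outflow])

print(mass_balance.available())  # False: the failure propagates up a level
```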
Abstract:
Solid earth simulations have recently been developed to address issues such as natural disasters, global environmental destruction, and the conservation of natural resources. The simulation of solid earth phenomena involves the analysis of complex structures including strata, faults, and heterogeneous material properties. Simulation of the generation and cycle of earthquakes is particularly important, but such simulations require the analysis of complex fault dynamics. GeoFEM is a parallel finite-element analysis system intended for problems in solid earth field phenomena. This paper describes recent developments in the GeoFEM project for the simulation of earthquake generation and cycles.