887 results for Computer forensic analysis


Relevance: 30.00%

Abstract:

Northern hardwood management was assessed throughout the state of Michigan using data collected on recently harvested stands in 2010 and 2011. Methods of forensic estimation of diameter at breast height (DBH) were compared, and an ideal, localized equation form was selected for use in reconstructing pre-harvest stand structures. Comparisons showed differences in predictive ability among the available equation forms, which led to substantial financial differences when the equations were used to estimate the value of removed timber. Management on all stands was then compared among state, private, and corporate landowners. Comparisons of harvest intensities against a liberal interpretation of a well-established management guideline showed that approximately one third of harvests were conducted in a manner consistent with the guideline having been followed; one third showed higher levels of removals than recommended, and one third were less intensive than recommended. Multiple management guidelines and postulated objectives were then synthesized into a novel harvest taxonomy, against which all harvests were compared. This further comparison showed approximately the same proportions of harvests while also distinguishing sanitation cuts and the future productive potential of harvests cut more intensively than the guidelines suggest. Stand structures are commonly represented using diameter distributions. Parametric and nonparametric techniques for describing diameter distributions were applied to the pre-harvest and post-harvest data. A common polynomial regression procedure was found to be highly sensitive to the method of histogram construction that provides the data points for the regression, and the discriminative ability of kernel density estimation differed substantially from that of the polynomial regression technique.
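
To illustrate the histogram sensitivity described above, here is a minimal Python sketch (not the thesis's code; the sample data and bin choices are hypothetical) contrasting histogram-based density estimates of a DBH distribution with a kernel density estimate:

```python
# Minimal sketch: histogram bin-width sensitivity vs. kernel density
# estimation for a diameter-at-breast-height (DBH) distribution.
# Synthetic data for illustration only, not the thesis's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
dbh_cm = rng.gamma(shape=4.0, scale=8.0, size=300)  # synthetic DBH sample

# The same data histogrammed with different bin counts can suggest
# different distribution shapes -- the sensitivity noted above.
for bins in (5, 25):
    density, edges = np.histogram(dbh_cm, bins=bins, density=True)
    peak = edges[np.argmax(density)]
    print(f"{bins:2d} bins: peak density {density.max():.4f} near {peak:.1f} cm")

# A Gaussian KDE avoids discrete binning (its bandwidth still matters,
# but it varies smoothly rather than jumping between bin layouts).
kde = stats.gaussian_kde(dbh_cm)
grid = np.linspace(dbh_cm.min(), dbh_cm.max(), 200)
print(f"KDE peak near {grid[np.argmax(kde(grid))]:.1f} cm")
```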

Relevance: 30.00%

Abstract:

The electric utility business is an inherently dangerous field, with employees exposed to many potential hazards daily. One such hazard is an arc flash: a rapid release of energy, referred to as incident energy, caused by an electric arc. Because an arc flash is random in nature and occurrence, one can only prepare for and minimize the harm to oneself and other employees, and the damage to equipment, caused by such a violent event. Effective January 1, 2009, the National Electrical Safety Code (NESC) requires that companies whose employees work on or near energized equipment perform an arc-flash assessment to determine the potential exposure to an electric arc. To comply with the NESC requirement, Minnesota Power's (MP's) existing short-circuit and relay-coordination software package, ASPEN OneLiner™, one of the first packages to implement an arc-flash module, is used to conduct an arc-flash hazard analysis. The package is also benchmarked against the equations provided in IEEE Std 1584-2002 and ultimately used to determine the incident energy levels on the MP transmission system. This report covers the history of arc-flash hazards; analysis methods, both software-based and empirically derived equations; issues of concern with the calculation methods; and the work conducted at MP. This work also produced two offline software products to conduct and verify an offline arc-flash hazard analysis.
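
As an illustration of the empirically derived equations mentioned above, here is a minimal Python sketch of the IEEE Std 1584-2002 incident-energy calculation for systems above 1 kV; the coefficients are those published in the standard as best I can render them, the example inputs are hypothetical, and any real analysis should be verified against the standard itself:

```python
# Minimal sketch of the IEEE Std 1584-2002 empirical model (> 1 kV case).
# Example inputs are hypothetical; verify coefficients and applicability
# limits against IEEE Std 1584-2002 before any real use.
import math

def arcing_current_kA(i_bf_kA: float) -> float:
    """Predicted arcing current for systems of 1 kV and above."""
    return 10 ** (0.00402 + 0.983 * math.log10(i_bf_kA))

def incident_energy_J_cm2(i_a_kA: float, t_s: float, d_mm: float,
                          gap_mm: float = 153.0, x: float = 0.973,
                          k1: float = -0.555, k2: float = 0.0) -> float:
    """Incident energy in J/cm^2 (divide by 4.184 for cal/cm^2).

    k1: -0.792 open air, -0.555 enclosed (box); k2: 0 ungrounded or
    high-resistance grounded, -0.113 solidly grounded; gap_mm: conductor
    gap; x: distance exponent (equipment dependent); t_s: arc duration;
    d_mm: working distance.
    """
    # Normalized incident energy (0.2 s arc at 610 mm working distance)
    lg_en = k1 + k2 + 1.081 * math.log10(i_a_kA) + 0.0011 * gap_mm
    e_n = 10 ** lg_en
    # Scale to the actual duration and distance; Cf = 1.0 above 1 kV
    return 4.184 * 1.0 * e_n * (t_s / 0.2) * (610.0 ** x / d_mm ** x)

i_arc = arcing_current_kA(20.0)  # hypothetical 20 kA bolted-fault current
energy = incident_energy_J_cm2(i_arc, t_s=0.5, d_mm=910.0)
print(f"arcing current {i_arc:.1f} kA, incident energy {energy:.1f} J/cm^2")
```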

Relevance: 30.00%

Abstract:

The following is an analysis of the role of computer-aided surgery in the infralabyrinthine-subcochlear approach to the petrous apex for cholesterol granulomas with hearing preservation. In a retrospective case review covering 1996 to 2008, six patients were analysed at the otorhinolaryngology outpatient clinic of our tertiary referral centre. The navigation system provided excellent intraoperative localisation of the carotid artery, the facial nerve, and the bony entrance into the cholesterol cyst. Additionally, the operation time decreased from an initial 4 h down to 2 h. Computer-aided surgery allows intraoperative monitoring of the position of the tip of the microsurgical instruments in this rare disease and in the delicate area of the petrous apex, providing a high level of safety.

Relevance: 30.00%

Abstract:

When a single brushless DC motor is fed by an inverter with a sensorless algorithm embedded in the switching controller, the system exhibits a linear and stable output in terms of speed and torque. However, with two motors modulated by the same inverter, the system is unstable and unusable for steady operation unless some resistive damping is provided on the supply lines. The project discusses and analyzes the stability of such a system through simulations and hardware demonstrations, and also presents a method to derive the values of this damping resistance.
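
As a rough illustration of why resistive damping stabilizes such a system, the sketch below (a generic linearized supply-filter argument with hypothetical parameter values, not the project's actual model) checks the eigenvalues of an LC supply filter feeding a constant-power load for several damping resistances:

```python
# Illustrative (not from the project): eigenvalues of a linearized LC
# supply filter feeding a constant-power load. A constant-power load
# looks like a negative incremental resistance, which destabilizes the
# filter unless enough series damping resistance is added.
# All parameter values are hypothetical.
import numpy as np

L = 1e-3        # supply-line inductance (H)
C = 100e-6      # DC-link capacitance (F)
V0 = 48.0       # operating-point bus voltage (V)
P = 200.0       # constant power drawn by the motor drive (W)

for r_damp in (0.0, 0.2, 1.0):  # series damping resistance (ohms)
    # States: inductor current i, capacitor voltage v.
    # L di/dt = -r_damp*i - v + V_s;  C dv/dt = i - P/v (linearized at V0).
    A = np.array([[-r_damp / L, -1.0 / L],
                  [1.0 / C, P / (V0**2 * C)]])
    eig = np.linalg.eigvals(A)
    stable = all(e.real < 0 for e in eig)
    print(f"R = {r_damp:4.1f} ohm -> max Re(eig) = "
          f"{max(e.real for e in eig):9.1f}, stable: {stable}")
```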

Relevance: 30.00%

Abstract:

Nitrogen and water are essential for plant growth and development. In this study, we designed experiments to produce gene expression data from poplar roots under nitrogen starvation and water deprivation. We found that a low concentration of nitrogen led first to increased root elongation, followed by lateral root proliferation, and eventually to increased root biomass. To identify genes regulating root growth and development under nitrogen starvation and water deprivation, we designed a series of data analysis procedures through which we successfully identified biologically important genes. Differentially expressed gene (DEG) analysis identified the genes that are differentially expressed under nitrogen starvation or drought. Protein domain enrichment analysis identified enriched themes (genes sharing the same domains) that are highly interactive during the treatment. Gene Ontology (GO) enrichment analysis allowed us to identify biological processes that changed during nitrogen starvation. Based on the above analyses, we examined the local gene regulatory network (GRN) and identified a number of transcription factors; after testing, one of them proved to be a highly ranked transcription factor in the regulatory hierarchy that affects root growth under nitrogen starvation. Because analyzing gene expression data manually is tedious and time-consuming, we also automated a computational pipeline that can now perform DEG identification and protein domain analysis in a single run. It is implemented in Perl and R scripts.
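
The study's pipeline is implemented in Perl and R; purely as an illustration of the DEG step, here is a minimal Python sketch using synthetic data, a simple per-gene t-test, and Benjamini-Hochberg correction, which may differ from the methods the pipeline actually uses:

```python
# Minimal sketch of a differential-expression test: per-gene t-test
# between two conditions plus Benjamini-Hochberg FDR correction.
# (The study's actual pipeline is in Perl and R; data here are synthetic.)
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes, n_reps = 1000, 4
control = rng.normal(8.0, 1.0, size=(n_genes, n_reps))  # log2 expression
starved = rng.normal(8.0, 1.0, size=(n_genes, n_reps))
starved[:50] += 2.0                    # 50 genes truly up-regulated

t, p = stats.ttest_ind(starved, control, axis=1)

# Benjamini-Hochberg: largest k such that p_(k) <= (k/m) * alpha;
# reject the k smallest p-values.
order = np.argsort(p)
m, alpha = len(p), 0.05
bh_line = (np.arange(1, m + 1) / m) * alpha
passed = p[order] <= bh_line
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
deg_idx = order[:k]
print(f"{len(deg_idx)} genes called differentially expressed")
```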

Relevance: 30.00%

Abstract:

This thesis develops high-performance real-time signal processing modules for direction-of-arrival (DOA) estimation in localization systems. It proposes highly parallel algorithms for subspace decomposition and polynomial rooting, operations that are traditionally implemented using sequential algorithms. The proposed algorithms address the emerging need for real-time localization in a wide range of applications. As the antenna array grows, the complexity of the signal processing algorithms increases, making it increasingly difficult to satisfy real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that offer considerable improvement over traditional algorithms, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are two computationally complex steps and are the bottlenecks to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single-instruction multiple-data (SIMD) hardware, or application-specific integrated circuits (ASICs), all of which offer large numbers of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable, and easy to implement. First, the thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations needed to converge to the correct singular values, bringing performance closer to real time. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable operating frequency, and the system was developed with the objective of achieving high throughput; the modern cores available in FPGAs that were used to maximize performance are described in detail. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial's complex dynamics were exploited to derive this rooting method. The technique is parallel and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time the complex dynamics of the root-MUSIC polynomial have been analyzed to propose an algorithm. In all, the thesis addresses two major bottlenecks in a direction-of-arrival estimation system by providing simple, high-throughput, parallel algorithms.
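
For context, the standard sequential root-MUSIC estimator that these parallel algorithms accelerate can be sketched in a few lines of numpy; this is the textbook baseline under a hypothetical uniform-linear-array model, not the thesis's implementation:

```python
# Standard (sequential) root-MUSIC DOA estimation for a uniform linear
# array -- the baseline whose SVD and rooting steps the thesis parallelizes.
import numpy as np

def root_music(X, n_sources, d=0.5):
    """X: (n_antennas, n_snapshots) complex; d: element spacing (wavelengths)."""
    M, N = X.shape
    R = X @ X.conj().T / N                   # sample covariance matrix
    _, V = np.linalg.eigh(R)                 # eigenvalues in ascending order
    En = V[:, : M - n_sources]               # noise subspace
    C = En @ En.conj().T
    # Root-MUSIC polynomial coefficients: sums of the diagonals of C
    coeffs = np.array([np.trace(C, offset=k) for k in range(M - 1, -M, -1)])
    roots = np.roots(coeffs)
    roots = roots[np.abs(roots) < 1.0]       # keep roots inside unit circle
    picked = roots[np.argsort(1.0 - np.abs(roots))[:n_sources]]
    return np.degrees(np.arcsin(np.angle(picked) / (2 * np.pi * d)))

# Hypothetical example: two sources at -10 and 25 degrees, 8-element ULA
rng = np.random.default_rng(1)
M, N, doas = 8, 200, np.radians([-10.0, 25.0])
A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(M), np.sin(doas)))
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
X = A @ S + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
print(np.sort(root_music(X, n_sources=2)))   # approximately [-10. 25.]
```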

Relevance: 30.00%

Abstract:

In the realm of computer programming, the experience of writing a program is used to reinforce concepts and evaluate ability. This research uses three case studies to evaluate the introduction of testing through Kolb's Experiential Learning Model (ELM). We then analyze the impact of those testing experiences to determine methods for improving future courses. The first testing experience students encounter is unit test reports in their early courses. This course demonstrates that automating and improving feedback can provide more ELM iterations. The JUnit Generation (JUG) tool also provided a positive experience for the instructor by reducing the overall workload. Later, undergraduate and graduate students have the opportunity to work together in a multi-role Human-Computer Interaction (HCI) course. The interactions use usability analysis techniques, with graduate students acting as usability experts and undergraduate students as design engineers. Students gain experience testing the user experience of their product prototypes using methods ranging from heuristic analysis to user testing. From this course, we learned the importance of the instructor's role in the ELM. As more roles were added to the HCI course, a desire arose to deliver more complete, quality-assured software, which inspired the addition of unit testing experiences to the course. However, we learned that significant preparation must be made to apply the ELM when students are resistant. The research presented through these courses was driven by the recognition of a need for testing in a Computer Science curriculum. Our understanding of the ELM suggests the need for student experience when testing concepts are introduced. We learned that experiential learning, when appropriately implemented, can benefit the Computer Science classroom. Examined together, these course-based research projects provide insight into building strong testing practices into a curriculum.

Relevance: 30.00%

Abstract:

This morning Dr. Risser will introduce you to the basic ideas of social network analysis. You will learn some of the history behind the study of social networks. Dr. Risser will introduce mathematical measures of social networks, including centrality measures and measures of spread and cohesion. You will also learn how to use a computer program to analyze social network data.
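
As a taste of the kind of computer-based analysis the session covers, here is a minimal Python sketch using the networkx library (one of many possible tools, not necessarily the program used in the session):

```python
# Minimal sketch: computing centrality measures with networkx.
# (Illustrative only; not necessarily the tool used in the session.)
import networkx as nx

# Zachary's karate club: a classic small social network
G = nx.karate_club_graph()

degree = nx.degree_centrality(G)             # share of possible ties per node
betweenness = nx.betweenness_centrality(G)   # how often a node bridges others

top = sorted(betweenness, key=betweenness.get, reverse=True)[:3]
for node in top:
    print(f"node {node}: degree {degree[node]:.2f}, "
          f"betweenness {betweenness[node]:.2f}")
```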

Relevance: 30.00%

Abstract:

Stable Isotope Ratio Analysis (SIRA) is the measurement of variation in the ratios of different isotopes of the same element in a material. This technique is well established in the natural sciences and has long been part of the methodological arsenal in fields such as geology and biology. More recently it has begun to be utilized in the social sciences, moving from initial applications in anthropology to potential uses in geography, public health, forensic science, and others. This presentation will discuss the techniques behind SIRA, examples of current applications in the natural and social sciences, and potential avenues of future research.
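
For reference, isotope ratios measured by SIRA are conventionally reported in delta notation (a standard convention in the field, not specific to this presentation), expressed in per mil (‰) relative to a reference standard:

\[
\delta = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right) \times 1000
\]

where R is the ratio of the heavy to the light isotope (e.g., ¹³C/¹²C).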

Relevance: 30.00%

Abstract:

PURPOSE: To assess the literature on the accuracy and clinical performance of computer technology applications in surgical implant dentistry. MATERIALS AND METHODS: Electronic and manual literature searches were conducted to collect information about (1) the accuracy and (2) the clinical performance of computer-assisted implant systems. Meta-regression analysis was performed to summarize the accuracy studies. Failure/complication rates were analyzed using random-effects Poisson regression models to obtain summary estimates of 12-month proportions. RESULTS: Twenty-nine different image guidance systems were included. From 2,827 articles, 13 clinical and 19 accuracy studies were included in this systematic review. The meta-analysis of accuracy (19 clinical and preclinical studies) revealed a total mean error of 0.74 mm (maximum 4.5 mm) at the entry point in the bone and 0.85 mm (maximum 7.1 mm) at the apex. For the 5 included clinical studies (506 implants in total) using computer-assisted implant dentistry, the mean failure rate was 3.36% (range 0% to 8.45%) after an observation period of at least 12 months. Intraoperative complications were reported in 4.6% of the treated cases; these included interocclusal distances too limited to perform guided implant placement, limited primary implant stability, and the need for additional grafting procedures. CONCLUSION: Differing levels and quantities of evidence were available for computer-assisted implant placement, revealing high implant survival rates after only 12 months of observation across different indications, and a reasonable level of accuracy. However, long-term clinical data are needed to identify clinical indications and to justify the additional radiation doses, effort, and costs associated with computer-assisted implant surgery.
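
To make the statistical machinery concrete, the sketch below pools hypothetical per-study failure counts with a Poisson GLM in Python via statsmodels; note that this intercept-only, fixed-effect version is a simplification of the random-effects models the review used:

```python
# Simplified sketch of pooling study failure rates with a Poisson GLM.
# The review used random-effects Poisson models; this intercept-only,
# fixed-effect version only shows the mechanics. Counts are hypothetical.
import numpy as np
import statsmodels.api as sm

failures = np.array([2, 0, 5, 1, 3])            # failures per study
exposure = np.array([80, 40, 150, 60, 120.0])   # implant-years per study

X = np.ones((len(failures), 1))                 # intercept only
model = sm.GLM(failures, X, family=sm.families.Poisson(),
               offset=np.log(exposure))
fit = model.fit()

rate = np.exp(fit.params[0])                    # failures per implant-year
print(f"pooled rate: {rate:.4f} per implant-year, "
      f"~{100 * (1 - np.exp(-rate)):.2f}% at 12 months")
```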

Relevance: 30.00%

Abstract:

Introduction: The aim of this systematic review was to analyze the dental literature regarding the accuracy and clinical application of computer-guided, template-based implant dentistry. Materials and methods: An electronic literature search, complemented by manual searching, was performed to gather data on accuracy and on surgical, biological, and prosthetic complications in connection with computer-guided implant treatment. For the assessment of accuracy, meta-regression analysis was performed; complication rates are summarized descriptively. Results: Of 3,120 titles retrieved by the literature search, eight articles met the inclusion criteria regarding accuracy and 10 regarding clinical performance. Meta-regression analysis revealed a mean deviation of 1.07 mm (95% CI: 0.76-1.22 mm) at the entry point and 1.63 mm (95% CI: 1.26-2.00 mm) at the apex. No significant differences between the studies were found regarding the method of template production or of template support and stabilization. Early surgical complications occurred in 9.1% of cases, early prosthetic complications in 18.8%, and late prosthetic complications in 12%. Implant survival rates of 91-100% after an observation time of 12-60 months are reported in six clinical studies with 537 implants, mainly restored immediately after flapless implantation procedures. Conclusion: Computer-guided, template-based implant placement showed high implant survival rates, ranging from 91% to 100%. However, a considerable number of technique-related perioperative complications were observed. Preclinical and clinical studies indicated reasonable mean accuracy but relatively high maximum deviations. Future research should be directed toward increasing the number of clinical studies with longer observation periods and toward improving the systems in terms of perioperative handling, accuracy, and prosthetic complications.

Relevance: 30.00%

Abstract:

PURPOSE: To evaluate multislice spiral computed tomography (MSCT) and magnetic resonance imaging (MRI) findings in hanging and manual strangulation cases and compare them with forensic autopsy results. MATERIALS AND METHODS: Postmortem MSCT and MRI were performed on nine persons who died from hanging or manual strangulation, and the neck findings were compared with those discovered during forensic autopsy. In addition, two living patients underwent imaging and clinical examination following severe manual strangulation and near-hanging, respectively. For evaluation, the findings were divided into "primary" signs (the strangulation mark and subcutaneous desiccation, i.e., soft-tissue thinning as a result of tissue fluids being driven out by mechanical compression, in hanging; subcutaneous and intramuscular hemorrhage in manual strangulation) and "collateral" signs. A two-tailed Wilcoxon test was used for statistical analysis of the lymph node and salivary gland findings. RESULTS: In hanging, imaging revealed the primary signs and the most frequent collateral signs. In manual strangulation, the primary findings were accurately depicted, with the exception of one slight hemorrhage. Apart from a vocal cord hemorrhage, all frequent collateral signs could be diagnosed radiologically. Traumatic lymph node hemorrhage (P = 0.031) was found in all of the manual strangulation cases. CONCLUSION: MSCT and MRI revealed strangulation signs concordant with forensic pathology findings. Imaging offers great potential for the forensic examination of strangulation injuries in both clinical and postmortem settings.
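
For illustration of the statistical step only, a two-tailed Wilcoxon signed-rank test can be run in Python with scipy; the paired scores below are hypothetical, not the study's data:

```python
# Minimal sketch of a two-tailed Wilcoxon signed-rank test with scipy.
# (Scores are hypothetical; the study's actual data are not reproduced.)
from scipy import stats

# e.g., paired severity scores for a finding on imaging vs. autopsy, per case
imaging = [1.2, 2.5, 3.15, 0.8, 2.2, 1.95, 3.9, 2.6, 1.75]
autopsy = [1.0, 2.9, 3.0, 1.5, 2.5, 2.0, 3.4, 3.2, 2.0]

stat, p = stats.wilcoxon(imaging, autopsy, alternative="two-sided")
print(f"W = {stat}, p = {p:.3f}")
```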