956 results for Mathematical Techniques--Error Analysis
Abstract:
Modern imaging technologies, such as computed tomography (CT), represent a great challenge in forensic pathology. The field of forensics has seen a rapid increase in the use of these techniques to support investigations of critical cases, as indicated by the adoption of CT scanning by forensic institutions worldwide. Advances in CT imaging over the past few decades have led some authors to propose that virtual autopsy, a radiological method applied to post-mortem analysis, is a reliable alternative to traditional autopsy, at least in certain cases. The authors investigate the occurrence and causes of errors in diagnostic imaging applied to virtual autopsy. A case of suicide by gunshot wound was submitted to full-body CT scanning before autopsy. Comparing the initial reading of the sectional images with the autopsy findings revealed a preliminary misdiagnosis: a peritoneal lesion caused by the gunshot wound was missed because of a radiologist's error. We then discuss an emerging issue, the risk of diagnostic failure in virtual autopsy due to radiologist error, which parallels what occurs in clinical radiology practice.
Abstract:
In this study, the effect of including time derivatives of flow rate and rotational speed in the mathematical modeling of a rotary blood pump (RBP) was investigated. The basic model estimates the pressure head of the pump as a dependent variable using measured flow and speed as predictive variables. Model performance was evaluated after adding time-derivative terms for flow and speed. First, to create a realistic working condition, the Levitronix CentriMag RBP was implanted in a sheep. All model parameters were physically measured and digitally acquired over a wide range of conditions, including pulsatile speed. Second, a statistical analysis of the different variables (flow, speed, and their time derivatives) based on multiple regression was performed to determine the significant variables for pressure head estimation. Finally, different mathematical models were compared to show the effect of the time-derivative terms on model performance. Root mean square error and the correlation coefficient were used to evaluate how well the pressure head estimated by each model fit the measured pressure head. The results indicate that including the time derivatives of flow and speed can improve model accuracy, but only minimally.
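A minimal sketch of the modeling comparison described above, with synthetic signals standing in for the measured ones; the regressor set and coefficients are illustrative, not the authors' exact model:

```python
# Compare a basic pressure-head model (flow and speed only) against an
# extended model that adds time-derivative terms, scoring both by RMSE
# and correlation coefficient. All data below are synthetic placeholders.
import numpy as np

def fit_and_score(X, h):
    """Least-squares fit h ~ X @ beta; return (rmse, r)."""
    beta, *_ = np.linalg.lstsq(X, h, rcond=None)
    h_hat = X @ beta
    rmse = np.sqrt(np.mean((h - h_hat) ** 2))
    r = np.corrcoef(h, h_hat)[0, 1]
    return rmse, r

t = np.linspace(0, 10, 1000)
q = 5 + np.sin(2 * np.pi * t)                    # synthetic flow [L/min]
w = 3000 + 200 * np.sin(2 * np.pi * t)           # synthetic speed [rpm]
h = 0.02 * w - 4 * q + np.random.normal(0, 1, t.size)  # synthetic head

dq = np.gradient(q, t)                           # time derivative of flow
dw = np.gradient(w, t)                           # time derivative of speed

ones = np.ones_like(t)
basic = np.column_stack([ones, q, w])            # flow and speed only
extended = np.column_stack([ones, q, w, dq, dw]) # + derivative terms

print("basic   : RMSE=%.3f r=%.3f" % fit_and_score(basic, h))
print("extended: RMSE=%.3f r=%.3f" % fit_and_score(extended, h))
```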
Abstract:
In low-income settings, treatment failure is often identified using CD4 cell count monitoring, which detects virological failure later than viral load testing. Consequently, patients remain on a failing regimen for longer, resulting in a higher risk of transmission. We investigated the benefit of routine viral load monitoring for reducing HIV transmission.
Abstract:
The Simulation Automation Framework for Experiments (SAFE) streamlines the design and execution of experiments with the ns-3 network simulator. SAFE ensures that best practices are followed throughout the workflow of a network simulation study, guaranteeing that results are both credible and reproducible by third parties. Data analysis is a crucial part of this workflow, and one where mistakes are often made. Even in highly regarded venues, scientific graphics in numerous network simulation publications fail to include graphic titles, units, legends, and confidence intervals. After studying the literature on network simulation methodology and information graphics visualization, I developed a visualization component for SAFE to help users avoid these errors in their scientific workflow. The functionality of this new component includes support for interactive visualization through a web-based interface and for the generation of high-quality static plots that can be included in publications. The overarching goal of my contribution is to help users create graphics that follow best practices in visualization and thereby succeed in conveying the right information about simulation results.
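As an illustration of the conventions the component enforces, here is a minimal matplotlib sketch of a simulation plot carrying the four elements the abstract says are often missing: a title, axis labels with units, a legend, and confidence intervals. The data and names are placeholders, not SAFE's actual API:

```python
# Plot mean end-to-end delay vs. offered load with 95% confidence
# intervals, a title, labeled axes with units, and a legend.
import numpy as np
import matplotlib.pyplot as plt

loads = np.array([10, 20, 30, 40, 50])            # offered load [pkt/s]
mean_delay = np.array([1.2, 1.5, 2.1, 3.4, 5.8])  # mean delay [ms]
ci95 = np.array([0.1, 0.2, 0.3, 0.5, 0.9])        # 95% CI half-widths

fig, ax = plt.subplots()
ax.errorbar(loads, mean_delay, yerr=ci95, capsize=3,
            label="mean of 30 replications (95% CI)")
ax.set_title("End-to-end delay vs. offered load")  # graphic title
ax.set_xlabel("Offered load (packets/s)")          # units on both axes
ax.set_ylabel("End-to-end delay (ms)")
ax.legend()                                        # legend
fig.savefig("delay_vs_load.pdf")   # static, publication-quality output
```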
Abstract:
PURPOSE: Two noninvasive methods to measure dental implant stability are damping capacity assessment (Periotest) and resonance frequency analysis (Osstell). The objective of the present study was to assess the correlation of these 2 techniques in clinical use. MATERIALS AND METHODS: Implant stability of 213 clinically stable loaded and unloaded 1-stage implants in 65 patients was measured in triplicate by means of resonance frequency analysis and Periotest. Descriptive statistics as well as Pearson's, Spearman's, and intraclass correlation coefficients were calculated with SPSS 11.0.2. RESULTS: The mean values were an implant stability quotient (ISQ) of 57.66 ± 8.19 for the resonance frequency analysis and −5.08 ± 2.02 for the Periotest. The correlation of the two measuring techniques was −0.64 (Pearson) and −0.65 (Spearman). The single-measure intraclass correlation coefficients for the ISQ and Periotest values were 0.99 and 0.88, respectively (95% CI). No significant correlation of implant length with either resonance frequency analysis or Periotest could be found. However, a significant correlation of implant diameter with both techniques was found (P < .005). The correlation of the two measuring systems is moderate to good. It seems that the Periotest is more susceptible to clinical measurement variables than the Osstell device. The intraclass correlation indicated lower measurement precision for the Periotest technique. Additionally, the Periotest values deviated more from the normal (Gaussian) distribution than the ISQs. Both measurement techniques show a significant correlation with implant diameter. CONCLUSION: Resonance frequency analysis appeared to be the more precise technique.
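For concreteness, a hedged sketch of the statistics reported here, with placeholder readings standing in for the clinical data; the ICC below is the standard one-way, single-measure form, which may differ from the exact SPSS variant used in the study:

```python
# Pearson and Spearman correlations between the two devices, plus a
# single-measure intraclass correlation ICC(1,1) for triplicate readings.
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Per-implant readings from the two devices (placeholder values).
isq = np.array([55.0, 60.2, 48.9, 63.1, 57.4])   # Osstell ISQ
ptv = np.array([-4.1, -6.0, -2.5, -7.2, -5.0])   # Periotest value

print("Pearson :", pearsonr(isq, ptv)[0])
print("Spearman:", spearmanr(isq, ptv)[0])

def icc_1_1(ratings):
    """ICC(1,1) from an (n subjects x k repeats) array via one-way ANOVA."""
    n, k = ratings.shape
    grand = ratings.mean()
    msb = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((ratings - ratings.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Triplicate ISQ readings per implant (placeholder values).
triplicates = np.array([[55, 56, 55], [60, 61, 60], [49, 48, 49]], float)
print("ICC(1,1):", icc_1_1(triplicates))
```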
Abstract:
From the customer-satisfaction point of view, the sound quality of any product has become an important factor these days. The primary objective of this research is to determine the factors that affect the acceptability of impulse noise. Though the analysis is based on a sample impulse sound file from a commercial printer, the results can be applied to other, similar impulsive noise. It is assumed that impulsive noise can be tuned to meet the acceptability criteria; it is therefore necessary to find the most significant factors that can be controlled physically. The analysis is based on a single impulse. A sample impulsive sound file is modified for different amplitudes, background noise levels, attack times, release times, and spectral content. A two-level factorial design of experiments (DOE) is applied to study the significant effects and interactions. For each impulse file modified per the DOE, the magnitude of perceived annoyance is calculated from an objective metric developed recently at Michigan Technological University. This metric is based on psychoacoustic criteria such as loudness, sharpness, roughness, and loudness-based impulsiveness; the software 'Artemis V11.2' developed by HEAD Acoustics is used to calculate these psychoacoustic terms. From the two-level factorial analysis, a new objective model of perceived annoyance is developed in terms of the physical parameters mentioned above: amplitude, background noise, impulse attack time, impulse release time, and spectral content. The effects of the significant individual factors, as well as their two-way interactions, are also studied. The results show that all five factors significantly affect the annoyance level of an impulsive sound; the annoyance level can thus be brought within the acceptability criteria by optimizing these factor levels. An additional analysis studies the effect of these five significant parameters on the individual psychoacoustic metrics.
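A minimal sketch of a two-level full factorial analysis over the five factors named above; the responses are synthetic stand-ins for the perceived-annoyance metric, and the effect formula is the standard contrast for a 2^k design:

```python
# Estimate main effects and two-factor interaction effects from a 2^5
# full factorial design: effect = mean(y | +1) - mean(y | -1).
import itertools
import numpy as np

factors = ["amplitude", "background", "attack", "release", "spectrum"]
design = np.array(list(itertools.product([-1, 1], repeat=5)))  # 32 runs

# Placeholder responses; in the study each row would be the annoyance
# value computed by the psychoacoustic metric for that modified file.
rng = np.random.default_rng(0)
y = 10 + 2 * design[:, 0] + 1.5 * design[:, 1] + rng.normal(0, 0.5, 32)

# Main effects.
for j, name in enumerate(factors):
    effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
    print(f"main effect {name:10s}: {effect:+.2f}")

# Two-factor interactions: the same contrast on the product column.
for (j, a), (k, b) in itertools.combinations(enumerate(factors), 2):
    col = design[:, j] * design[:, k]
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"interaction {a} x {b}: {effect:+.2f}")
```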
Abstract:
Reuse distance analysis, the prediction of how many distinct memory addresses will be accessed between two accesses to a given address, has been established as a useful technique in profile-based compiler optimization, but the cost of collecting the memory reuse profile has been prohibitive for some applications. In this report, we propose using the hardware monitoring facilities available in existing CPUs to gather an approximate reuse distance profile. The difficulties associated with this monitoring technique are discussed, most importantly that there is no obvious link between the reuse profile produced by hardware monitoring and the actual reuse behavior. Potential applications which would be made viable by a reliable hardware-based reuse distance analysis are identified.
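For reference, a straightforward (unoptimized) implementation of exact reuse distance over an address trace; a hardware-based profile would only approximate these counts by sampling:

```python
# Exact reuse (stack) distance: for each access, the number of distinct
# addresses touched since the previous access to the same address.
def reuse_distances(trace):
    stack = []       # addresses ordered from most- to least-recently used
    distances = []
    for addr in trace:
        if addr in stack:
            d = stack.index(addr)   # distinct addresses since last use
            stack.remove(addr)
        else:
            d = float("inf")        # first access: infinite distance
        distances.append(d)
        stack.insert(0, addr)       # move to most-recent position
    return distances

print(reuse_distances(["a", "b", "c", "a", "b", "b"]))
# -> [inf, inf, inf, 2, 2, 0]
```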
Abstract:
Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aero- and hydro-dynamical systems that dominate various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while helping users observe and understand the flow field clearly. My research focuses mainly on the analysis and visualization of flow fields using techniques such as information-theoretic measures and graph-based representations.

Since streamline visualization is a popular technique in flow field visualization, selecting good streamlines to capture flow patterns and picking good viewpoints to observe flow fields both become critical. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using a dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates as the view changes gradually.

When 3D streamlines are projected to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we designed FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. It enables observation and exploration of the relationships among field line clusters, spatiotemporal regions, and their interconnections in the transformed space.

Most viewpoint selection methods consider only external viewpoints outside the flow field, which cannot convey a clear view when the field is cluttered near the boundary. We therefore propose exploring flow fields by selecting several internal viewpoints around the flow features inside the field and then generating a B-spline curve path traversing these viewpoints, providing users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work has also been extended to handle unsteady flow fields.

Beyond flow field visualization, other visualization topics also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus: since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it, so we developed a set of visualization tools that give users an intuitive way to learn and understand these algorithms.
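As a small illustration of the camera-path idea above, here is a sketch that fits a B-spline through a few internal viewpoints with SciPy and samples it into a smooth traversal path; the viewpoint coordinates are hypothetical, not taken from the cited work:

```python
# Fit a cubic B-spline through internal viewpoints and sample it densely
# to obtain a smooth camera path for close-up flow field exploration.
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical internal viewpoints near flow features, as (x, y, z).
viewpoints = np.array([[0.1, 0.2, 0.0],
                       [0.4, 0.5, 0.3],
                       [0.6, 0.3, 0.7],
                       [0.9, 0.8, 0.5]])

tck, _ = splprep(viewpoints.T, s=0, k=3)  # spline through all viewpoints
u = np.linspace(0.0, 1.0, 200)            # dense parameter samples
path = np.array(splev(u, tck)).T          # (200, 3) camera positions

print(path[:3])
```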
A Cost-Benefit Analysis Using Contingent Valuation Techniques: A Feasibility Study in Spinal Surgery
Abstract:
If change over time is compared across several groups, it is important to take baseline values into account so that the comparison is carried out under the same preconditions. Because the observed baseline measurements are distorted by measurement error, it may not be sufficient to include them as a covariate. A solution to this problem has recently been provided: by fitting a longitudinal mixed-effects model to all data, including the baseline observations, and then calculating the expected change conditional on the underlying baseline value, groups with the same baseline characteristics can be compared. In this article, we present an extended approach in which a broader set of models can be used. Specifically, it is possible to include any desired set of interactions between the time variable and the other covariates, and time-dependent covariates can be included as well. Additionally, we extend the method to adjust for baseline measurement error in other time-varying covariates. We apply the methodology to data from the Swiss HIV Cohort Study to address the question of whether coinfection with HIV-1 and hepatitis C virus leads to a slower increase of CD4 lymphocyte counts over time after the start of antiretroviral therapy.
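A minimal sketch of the kind of longitudinal mixed-effects model described above, fit on synthetic data with assumed column names; the baseline measurement-error adjustment that is the article's contribution goes beyond this simple specification:

```python
# Fit a random-intercept mixed-effects model of CD4 count over time with
# a time-by-coinfection interaction, so group slopes can be compared.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_pat, n_vis = 60, 5
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_pat), n_vis),
    "time": np.tile(np.arange(n_vis), n_pat),          # years on therapy
    "coinfected": np.repeat(rng.integers(0, 2, n_pat), n_vis),
})
# Synthetic CD4 trajectory: slower increase in the coinfected group.
df["cd4"] = (350 + 50 * df["time"] - 15 * df["time"] * df["coinfected"]
             + rng.normal(0, 30, len(df)))

model = smf.mixedlm("cd4 ~ time * coinfected", data=df,
                    groups=df["patient"])  # random intercept per patient
print(model.fit().summary())
```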