840 results for operational reliability
Abstract:
OBJECTIVE: To assess the intra-reader and inter-reader reliability of ultrasonography interpretation by several experts using video clips. METHOD: Ninety-nine video clips of healthy and rheumatic joints were recorded and delivered to 17 physician sonographers in two rounds. The intra-reader and inter-reader reliabilities of interpreting the ultrasound results were calculated using a dichotomous system (normal/abnormal) and a graded semiquantitative scoring system. RESULTS: The video reading method worked well: 70% of the readers classified at least 70% of the cases correctly as normal or abnormal, although the distribution of correct answers across readers was wide. The most difficult joints to assess were the elbow, wrist, metacarpophalangeal (MCP) and knee joints. The intra-reader and inter-reader agreements on interpreting dynamic ultrasound images as normal or abnormal, as well as on detecting and scoring a Doppler signal, were moderate to good (kappa = 0.52-0.82). CONCLUSIONS: Dynamic image assessment (video clips) can be used as an alternative method in ultrasonography reliability studies. The intra-reader and inter-reader reliability of dynamic image reading is acceptable, but more definitions and training are needed to improve sonographic reproducibility.
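The kappa values reported above measure chance-corrected agreement between readers. As a minimal sketch of how such a statistic is computed for a dichotomous (normal/abnormal) rating, the following implements Cohen's kappa for two readers; the rating vectors are hypothetical, not data from the study:

```python
from collections import Counter

def cohens_kappa(reader_a, reader_b):
    """Cohen's kappa for two readers rating the same cases."""
    assert len(reader_a) == len(reader_b)
    n = len(reader_a)
    # Observed agreement: fraction of cases with identical ratings
    observed = sum(a == b for a, b in zip(reader_a, reader_b)) / n
    # Expected chance agreement from each reader's marginal label frequencies
    ca, cb = Counter(reader_a), Counter(reader_b)
    expected = sum((ca[lab] / n) * (cb[lab] / n) for lab in set(ca) | set(cb))
    return (observed - expected) / (1 - expected)

# Hypothetical dichotomous ratings (1 = abnormal, 0 = normal)
a = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]
print(round(cohens_kappa(a, b), 2))  # → 0.58, "moderate" agreement
```

Values in the 0.52-0.82 range reported by the study span the conventional "moderate" to "good" bands of the kappa scale.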
Abstract:
This paper presents the German version of the Short Understanding of Substance Abuse Scale (SUSS) [Humphreys et al.: Psychol Addict Behav 1996;10:38-44], the Verständnis von Störungen durch Substanzkonsum (VSS), and evaluates its psychometric properties. The VSS assesses clinicians' beliefs about the nature and treatment of substance use disorders, particularly their endorsement of psychosocial and disease orientation. The VSS was administered to 160 treatment staff members at 12 substance use disorder treatment programs in the German-speaking part of Switzerland. Because the confirmatory factor analysis of the VSS did not completely replicate the factorial structure of the SUSS, an exploratory factor analysis was undertaken. This analysis identified two factors: the Psychosocial model factor and a slightly different Disease model factor. The VSS Disease and Psychosocial subscales showed convergent and discriminant validity, as well as sufficient reliability.
Abstract:
When patients enter our emergency room with suspected multiple injuries, Statscan provides a full-body anterior and lateral image for initial diagnosis and then zooms in on specific smaller areas for a more detailed evaluation. To examine the possible role of Statscan in the management of multiply injured patients, we implemented a modified ATLS(R) algorithm in which X-rays of the C-spine, chest and pelvis were replaced by a single total-body a.p./lat. radiograph. Between 15 October 2006 and 1 February 2007, 143 trauma patients (mean ISS 15±14, range 3-75) were included. We compared their time in the resuscitation room with that of 650 patients (mean ISS 14±14, range 3-75) treated between 1 January 2002 and 1 January 2004 according to the conventional ATLS protocol. The total-body scanning time was 3.5 min (3-6 min) compared with 25.7 min (8-48 min) for conventional X-rays. The total ER time was essentially unchanged: 28.7 min (13-58 min) compared with 29.1 min (15-65 min) using conventional plain radiography. In 116/143 patients additional CT scans were necessary; in 98/116 full-body trauma CT scans were performed, and in 18/116 patients selective CT scans were ordered based on Statscan findings. In 43/143 patients additional conventional X-rays had to be performed, mainly due to inadequate a.p. views of fractured bones. All radiographs were transmitted over the hospital network (Picture Archiving and Communication System, PACS) for immediate simultaneous viewing at different locations. The rapid availability of images for interpretation, owing to their digital nature, and the reduced need for repeat exposures because of faulty radiography are also felt to be strengths.
Abstract:
BACKGROUND: High intercoder reliability (ICR) is required in qualitative content analysis for assuring quality when more than one coder is involved in data analysis. The literature offers few standardized procedures for ICR assessment in qualitative content analysis. OBJECTIVE: To illustrate how ICR assessment can be used to improve codings in qualitative content analysis. METHODS: Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain. RESULTS: First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. A resulting kappa value of .67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme. Low agreement rates, for instance, indicated that respective codes were defined too broadly and would need clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results. DISCUSSION: The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of codings. This approach facilitates good practice in coding and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
Abstract:
This dissertation investigates high performance cooperative localization in wireless environments based on multi-node time-of-arrival (TOA) and direction-of-arrival (DOA) estimations in line-of-sight (LOS) and non-LOS (NLOS) scenarios. Here, two categories of nodes are assumed: base nodes (BNs) and target nodes (TNs). BNs are equipped with antenna arrays and capable of estimating TOA (range) and DOA (angle). TNs are equipped with omnidirectional antennas and communicate with BNs to allow BNs to localize them; thus, the proposed localization is maintained through BN-TN cooperation. First, a LOS localization method is proposed, based on semi-distributed multi-node TOA-DOA fusion. The proposed technique is applicable to mobile ad-hoc networks (MANETs). We assume LOS is available between BNs and TNs. One BN is selected as the reference BN, and other nodes are localized in the coordinates of the reference BN. Each BN can localize TNs located in its coverage area independently. In addition, a TN might be localized by multiple BNs. High performance localization is attainable via multi-node TOA-DOA fusion. The complexity of the semi-distributed multi-node TOA-DOA fusion is low because the total computational load is distributed across all BNs. To evaluate the localization accuracy of the proposed method, we compare it with global positioning system (GPS)-aided TOA (DOA) fusion, which is also applicable to MANETs. The comparison criterion is the localization circular error probability (CEP). The results confirm that the proposed method is suitable for moderate-scale MANETs, while GPS-aided TOA fusion is suitable for large-scale MANETs. Usually, TOA and DOA of TNs are periodically estimated by BNs. Thus, a Kalman filter (KF) is integrated with multi-node TOA-DOA fusion to further improve its performance.
The integration of KF and multi-node TOA-DOA fusion is compared with the extended KF (EKF) applied to multiple TOA-DOA estimations made by multiple BNs. The comparison shows that the proposed integration is stable (no divergence takes place) and that its accuracy is only slightly lower than that of the EKF when the EKF converges. However, the EKF may diverge while the integration of KF and multi-node TOA-DOA fusion does not; thus, the reliability of the proposed method is higher. In addition, the computational complexity of the integration of KF and multi-node TOA-DOA fusion is much lower than that of the EKF. In wireless environments, LOS might be obstructed, which degrades localization reliability. The antenna array installed at each BN is therefore used to allow each BN to identify NLOS scenarios independently. Here, a single BN measures the phase difference across two antenna elements using a synchronized bi-receiver system and maps it onto the wireless channel's K-factor. The larger K is, the more likely the channel is LOS. Next, the K-factor is used to identify NLOS scenarios. The performance of this system is characterized in terms of the probability of LOS and NLOS identification, and the latency of the method is small. Finally, a multi-node NLOS identification and localization method is proposed to improve localization reliability. In this case, multiple BNs engage in the process of NLOS identification, shared-reflector determination and localization, and NLOS TN localization. In NLOS scenarios, when there are three or more shared reflectors, those reflectors are localized via DOA fusion, and then a TN is localized via TOA fusion based on the localization of the shared reflectors.
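The core geometric idea behind TOA-DOA localization is that each BN's range and bearing measurement yields one position estimate for a TN, and multiple BN estimates can then be fused. The sketch below illustrates this with a simple averaging fusion rule; the BN/TN coordinates are hypothetical and the fusion weights are a placeholder, not the dissertation's exact fusion algorithm:

```python
import math

def toa_doa_estimate(bn_xy, r, theta):
    """Single-BN TN position: range r (from TOA) and bearing theta (from DOA)."""
    x, y = bn_xy
    return (x + r * math.cos(theta), y + r * math.sin(theta))

def fuse(estimates, weights=None):
    """Multi-node fusion as a (weighted) average of per-BN estimates."""
    if weights is None:
        weights = [1.0] * len(estimates)
    w = sum(weights)
    return (sum(wi * ex for wi, (ex, _) in zip(weights, estimates)) / w,
            sum(wi * ey for wi, (_, ey) in zip(weights, estimates)) / w)

# Hypothetical: three BNs measure a TN at (10, 5) without noise
bns = [(0.0, 0.0), (20.0, 0.0), (10.0, 20.0)]
tn = (10.0, 5.0)
ests = []
for bx, by in bns:
    r = math.hypot(tn[0] - bx, tn[1] - by)          # TOA range
    theta = math.atan2(tn[1] - by, tn[0] - bx)      # DOA bearing
    ests.append(toa_doa_estimate((bx, by), r, theta))
print(fuse(ests))  # ≈ (10.0, 5.0)
```

With noisy measurements, the fused estimate averages out per-BN errors, which is what makes the multi-node approach outperform a single-BN estimate; a KF can then smooth the fused track over time.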
Analysis of spring break-up and its effects on a biomass feedstock supply chain in northern Michigan
Abstract:
Demand for bio-fuels is expected to increase, due to rising prices of fossil fuels and concerns over greenhouse gas emissions and energy security. The overall cost of biomass energy generation is primarily related to biomass harvesting activity, transportation, and storage. With a commercial-scale cellulosic ethanol processing facility about to be built in Kinross Township of Chippewa County, Michigan, two models, a simulation model and an optimization model, have been developed to provide decision support for the facility. Both models track cost, emissions and energy consumption. While the optimization model provides guidance for a long-term strategic plan, the simulation model presents detailed output for specified operational scenarios over an annual period. Most importantly, the simulation model considers the uncertainty of spring break-up timing, i.e., seasonal road restrictions. Spring break-up timing is important because it affects the feasibility of harvesting activity and the duration of transportation restrictions, which significantly change the availability of feedstock for the processing facility. This thesis focuses on the statistical model of spring break-up used in the simulation model. Spring break-up timing depends on various factors, including temperature, road conditions and soil type, as well as individual decision-making processes at the county level. The spring break-up model, based on historical spring break-up data from 27 counties over the period 2002-2010, starts by specifying the probability distribution of a particular county's spring break-up start day and end day, and then relates the spring break-up timing of the other counties in the harvesting zone to the first county. To estimate the dependence relationship between counties, regression analyses, including standard linear regression and reduced major axis regression, are conducted.
Using realizations (scenarios) of spring break-up generated by the statistical spring break-up model, the simulation model is able to probabilistically evaluate different harvesting and transportation plans to help the bio-fuel facility select the most effective strategy. For early spring break-up, which usually indicates a longer than average break-up period, more log storage is required, total cost increases, and the probability of plant closure increases. The risk of plant closure may be partially offset through increased use of rail transportation, which is not subject to spring break-up restrictions. However, rail availability and rail yard storage may then become limiting factors in the supply chain. Rail use will impact total cost, energy consumption, system-wide CO2 emissions, and the reliability of providing feedstock to the bio-fuel processing facility.
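Reduced major axis (RMA) regression, one of the techniques named above for relating break-up timing between counties, fits a line whose slope is the ratio of the standard deviations, signed by the correlation, rather than the ordinary least-squares slope; it is appropriate when both variables carry measurement error. A minimal sketch, with hypothetical break-up start days (day of year) for two counties rather than the thesis data:

```python
import math

def rma_regression(x, y):
    """Reduced major axis regression: slope = sign(r) * (sy / sx)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((xi - mx) ** 2 for xi in x) / n)
    sy = math.sqrt(sum((yi - my) ** 2 for yi in y) / n)
    # Pearson correlation supplies only the sign of the RMA slope
    r = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n * sx * sy)
    slope = math.copysign(sy / sx, r)
    return slope, my - slope * mx  # line passes through the means

# Hypothetical spring break-up start days for a reference county (a)
# and a dependent county (b) over nine years
county_a = [75, 80, 78, 85, 90, 82, 79, 88, 76]
county_b = [77, 83, 80, 88, 94, 85, 81, 91, 78]
slope, intercept = rma_regression(county_a, county_b)
# Predict county b's start day from a forecast start day in county a
predicted_b = slope * 84 + intercept
```

Unlike ordinary least squares, the RMA slope is symmetric in the sense that regressing x on y gives the reciprocal slope, which suits the county-to-county dependence setting where neither county's timing is error-free.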
Abstract:
Clouds are one of the most influential elements of weather on the earth system, yet they are also one of the least understood. Understanding their composition and behavior at small scales is critical to understanding and predicting larger-scale feedbacks. Currently, the best method to study clouds on the microscale is through airborne in situ measurements using optical instruments capable of resolving clouds at the individual particle level. However, current instruments are unable to sufficiently resolve the scales important to cloud evolution and behavior. The Holodec is a new-generation optical cloud instrument that uses digital inline holography to overcome many of the limitations of conventional instruments. However, its performance and reliability were limited by several deficiencies in its original design. These deficiencies were addressed and corrected to advance the instrument from the prototype stage to an operational instrument. In addition, the processing software used to reconstruct and analyze digitally recorded holograms was improved to increase robustness and ease of use.