26 results for Task Performance and Analysis
in DigitalCommons@The Texas Medical Center
Abstract:
This paper introduces an extended hierarchical task analysis (HTA) methodology devised to evaluate and compare user interfaces on volumetric infusion pumps. The pumps were studied along the dimensions of overall usability and propensity for generating human error. With HTA as our framework, we analyzed six pumps on a variety of common tasks using Norman's action theory. The introduced method of evaluation divides the problem space between the external world of the device interface and the user's internal cognitive world, allowing for predictions of potential user errors at the human-device level. In this paper, one detailed analysis is provided as an example, comparing two different pumps on two separate tasks. The results demonstrate the inherent variation, often a cause of usage errors, found among infusion pumps in hospital use today. The reported methodology is a useful tool for evaluating human performance and predicting potential user errors with infusion pumps and other simple medical devices.
Abstract:
Objective Interruptions are known to have a negative impact on activity performance. Understanding how an interruption contributes to human error is limited because there is not a standard method for analyzing and classifying interruptions. Qualitative data are typically analyzed by either a deductive or an inductive method. Both methods have limitations. In this paper a hybrid method was developed that integrates deductive and inductive methods for the categorization of activities and interruptions recorded during an ethnographic study of physicians and registered nurses in a Level One Trauma Center. Understanding the effects of interruptions is important for designing and evaluating informatics tools in particular and for improving healthcare quality and patient safety in general. Method The hybrid method was developed using a deductive a priori classification framework with the provision of adding new categories discovered inductively in the data. The inductive process utilized line-by-line coding and constant comparison as stated in Grounded Theory. Results The categories of activities and interruptions were organized into a three-tiered hierarchy of activity. Validity and reliability of the categories were tested by categorizing a medical error case external to the study. No new categories of interruptions were identified during analysis of the medical error case. Conclusions Findings from this study provide evidence that the hybrid model of categorization is more complete than either a deductive or an inductive method alone. The hybrid method developed in this study provides the methodical support for understanding, analyzing, and managing interruptions and workflow.
Abstract:
An understanding of interruptions in healthcare is important for the design, implementation, and evaluation of health information systems and for the management of clinical workflow and medical errors. The purpose of this study is to identify and classify the types of interruptions experienced by Emergency Department (ED) nurses working in a Level One Trauma Center. This was an observational field study of Registered Nurses (RNs) employed in a Level One Trauma Center using the shadowing method. Results of the study indicate that nurses were both recipients and initiators of interruptions. Telephones, pagers, and face-to-face conversations were the most common sources of interruptions. Unlike other industries, the healthcare community has not systematically studied interruptions in clinical settings to determine and weigh the necessity of the interruption against its sometimes negative results, such as medical errors, decreased efficiency, and increased costs. The study presented here is an initial step toward understanding the nature, causes, and effects of interruptions, thereby improving both the quality of healthcare and patient safety. We developed an ethnographic data collection technique and a data coding method for the capture and analysis of interruptions. The interruption data we collected are systematic, comprehensive, and close to exhaustive. They confirmed the findings from earlier studies by other researchers that interruptions are frequent events in critical care and other healthcare settings. We are currently using these data to analyze the workflow dynamics of ED clinicians, to identify the bottlenecks of information flow, and to develop interventions to improve the efficiency of emergency care through the management of interruptions.
Abstract:
Dielectrophoresis—the tendency of a material of high dielectric permittivity to migrate in an electrical field gradient to a region of maximum field strength—provides an ideal motive force for manipulating small volumes of biological analytes in microfluidic microsystems. The work described in this thesis was based on the hypothesis that dielectrophoresis could be exploited to provide high-resolution cell separations in microsystems as well as a means for the electrically-controllable manipulation of solid supports for molecular analysis. To this end, a dielectrophoretic/gravitational field-flow-fractionation (DEP/G-FFF) system was developed and the separation performance evaluated using various types and sizes of polystyrene microspheres as model particles. It was shown that separation of the polystyrene beads was based on the differences in their effective dielectrophoretic properties. The ability of an improved DEP/G-FFF system to separate genetically identical, but phenotypically dissimilar cell types was demonstrated using mixtures of 6m2 mutant rat kidney cells grown under transforming and non-transforming culture conditions. Additionally, a panel of engineered dielectric microspheres was designed with specific, predetermined dielectrophoretic properties such that their dielectrophoretic behaviors would be controllable and predictable. The fabrication method involved the use of gold-coated polystyrene microsphere cores coated with a self-assembled monolayer of alkanethiol and, optionally, a self-assembled monolayer of phospholipid to form a thin-insulating-shell-over-conductive-interior structure. The successful development of the DEP/G-FFF separation system and the dielectrically engineered microspheres provides proof-of-principle demonstrations of enabling dielectrophoresis-based microsystem technology that should provide powerful new methods for the manipulation, separation and identification of analytes in many diverse fields.
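The time-averaged dielectrophoretic force on a spherical particle is commonly modeled through the Clausius-Mossotti factor. The sketch below illustrates that standard textbook model only; it is not code from the thesis, and the bead radius, permittivities, conductivities, and field gradient are illustrative values.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity (F/m)

def complex_permittivity(eps_rel, sigma, omega):
    """Complex permittivity eps* = eps - j*sigma/omega."""
    return eps_rel * EPS0 - 1j * sigma / omega

def clausius_mossotti(eps_p, eps_m):
    """Clausius-Mossotti factor K = (eps_p* - eps_m*) / (eps_p* + 2*eps_m*)."""
    return (eps_p - eps_m) / (eps_p + 2 * eps_m)

def dep_force(radius, eps_m_rel, re_K, grad_E2):
    """Time-averaged DEP force: F = 2*pi*r^3 * eps_m * Re(K) * grad|E|^2."""
    return 2 * np.pi * radius**3 * eps_m_rel * EPS0 * re_K * grad_E2

# Illustrative case: a 5 um polystyrene-like bead in aqueous medium at 1 MHz
omega = 2 * np.pi * 1e6
eps_p = complex_permittivity(2.55, 1e-3, omega)   # particle
eps_m = complex_permittivity(78.0, 1e-2, omega)   # medium
K = clausius_mossotti(eps_p, eps_m)
# Re(K) < 0 here: the bead is less polarizable than the medium, so it
# experiences negative DEP and is pushed away from field maxima.
F = dep_force(5e-6, 78.0, K.real, 1e13)
```

The sign of Re(K) is what a DEP separation exploits: particles whose effective polarizability differs (as for the engineered microspheres above) experience forces of different magnitude or direction in the same field gradient.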
Abstract:
Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a noninvasive technique for quantitative assessment of the integrity of blood-brain barrier and blood-spinal cord barrier (BSCB) in the presence of central nervous system pathologies. However, the results of DCE-MRI show substantial variability. The high variability can be caused by a number of factors including inaccurate T1 estimation, insufficient temporal resolution and poor contrast-to-noise ratio. My thesis work is to develop improved methods to reduce the variability of DCE-MRI results. To obtain fast and accurate T1 map, the Look-Locker acquisition technique was implemented with a novel and truly centric k-space segmentation scheme. In addition, an original multi-step curve fitting procedure was developed to increase the accuracy of T1 estimation. A view sharing acquisition method was implemented to increase temporal resolution, and a novel normalization method was introduced to reduce image artifacts. Finally, a new clustering algorithm was developed to reduce apparent noise in the DCE-MRI data. The performance of these proposed methods was verified by simulations and phantom studies. As part of this work, the proposed techniques were applied to an in vivo DCE-MRI study of experimental spinal cord injury (SCI). These methods have shown robust results and allow quantitative assessment of regions with very low vascular permeability. In conclusion, applications of the improved DCE-MRI acquisition and analysis methods developed in this thesis work can improve the accuracy of the DCE-MRI results.
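A Look-Locker readout is conventionally fit to a three-parameter recovery S(t) = A − B·exp(−t/T1*), with the apparent T1* corrected to T1 = T1*·(B/A − 1). The sketch below shows only that standard fit on synthetic noiseless data; it is not the multi-step curve-fitting procedure developed in the thesis, and all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def ll_recovery(t, A, B, T1_star):
    """Three-parameter Look-Locker signal model: S(t) = A - B*exp(-t/T1*)."""
    return A - B * np.exp(-t / T1_star)

# Synthetic noiseless signal: A = 1000, B = 1800, apparent T1* = 800 ms
t = np.linspace(50, 4000, 40)            # sampling times (ms)
signal = ll_recovery(t, 1000.0, 1800.0, 800.0)

(A, B, T1_star), _ = curve_fit(ll_recovery, t, signal, p0=[900, 1500, 500])
T1 = T1_star * (B / A - 1)               # Look-Locker correction of apparent T1*
```

With the values above the corrected T1 is 800·(1800/1000 − 1) = 640 ms, showing how strongly the correction depends on accurate estimates of A and B.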
Abstract:
CHARACTERIZATION OF THE COUNT RATE PERFORMANCE AND EVALUATION OF THE EFFECTS OF HIGH COUNT RATES ON MODERN GAMMA CAMERAS
Michael Stephen Silosky, B.S.
Supervisory Professor: S. Cheenu Kappadath, Ph.D.
Evaluation of count rate performance (CRP) is an integral component of gamma camera quality assurance and measurement of system dead time (τ) is important for quantitative SPECT. The CRP of three modern gamma cameras was characterized using established methods (Decay and Dual Source) under a variety of experimental conditions. For the Decay method, input count rate was plotted against observed count rate and fit to the paralyzable detector model (PDM) to estimate τ (Rates method). A novel expression for observed counts as a function of measurement time interval was derived and the observed counts were fit to this expression to estimate τ (Counts method). Correlation and Bland-Altman analysis were performed to assess agreement in estimates of τ between methods. The dependencies of τ on energy window definition and incident energy spectrum were characterized. The Dual Source method was also used to estimate τ and its agreement with the Decay method under identical conditions and the effects of total activity and the ratio of source activities were investigated. Additionally, the effects of count rate on several performance metrics were evaluated. The CRP curves for each system agreed with the PDM at low count rates but deviated substantially at high count rates. Estimates of τ for the paralyzable portion of the CRP curves using the Rates and Counts methods were highly correlated (r=0.999) but with a small (~6%) difference. No significant difference was observed between the highly correlated estimates of τ using the Decay or Dual Source methods under identical experimental conditions (r=0.996). Estimates of τ increased as a power-law function with decreasing ratio of counts in the photopeak to the total counts and linearly with decreasing spectral effective energy.
Dual Source method estimates of τ varied as a quadratic with the ratio of the single source to combined source activities and linearly with total activity used across a large range. Image uniformity, spatial resolution, and energy resolution degraded linearly with count rate and image distorting effects were observed. Guidelines for CRP testing and a possible method for the correction of count rate losses for clinical images have been proposed.
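For reference, the paralyzable detector model relates observed count rate to input count rate as R_obs = R_in·exp(−R_in·τ). A minimal fitting sketch in the spirit of the Rates method described above, on synthetic noiseless data with an illustrative dead time (not the dissertation's own analysis code):

```python
import numpy as np
from scipy.optimize import curve_fit

def paralyzable(r_in, tau):
    """Paralyzable detector model: R_obs = R_in * exp(-R_in * tau)."""
    return r_in * np.exp(-r_in * tau)

tau_true = 5e-6                          # illustrative dead time (5 us)
r_in = np.linspace(1e3, 2e5, 50)         # input count rates (cps)
r_obs = paralyzable(r_in, tau_true)      # ideal observed rates, no noise

# Fit observed vs. input rate to recover the dead time
(tau_fit,), _ = curve_fit(paralyzable, r_in, r_obs, p0=[1e-6])
```

The model's characteristic rollover (observed rate peaking at R_in = 1/τ and then falling) is what produces the deviation from linearity seen at high count rates.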
Abstract:
Purpose: To assess the relationship between student utilization of learning resources, including streaming video (SV), and their performance in the pre-clinical curriculum. [See PDF for complete abstract]
Abstract:
The use of smaller surgical incisions has become popularized for total hip arthroplasty (THR) because of the potential benefits of shorter recovery and improved cosmetic appearance. However, an increased incidence of serious complications has been reported. To minimize the risks of minimally invasive approaches to THR, we have developed an experimental approach which enables us to evaluate risk factors in these procedures through cadaveric simulations performed within the laboratory. During cadaveric hip replacement procedures performed via posterior and antero-lateral mini-incisions, pressures developed between the wound edges and the retractors were approximately double those recorded during conventional hip replacement using Charnley retractors (p < 0.01). In MIS procedures performed via the dual-incision approach, lack of direct visualisation of the proximal femur led to misalignment of broaches and implants with increased risk of cortical fracture during canal preparation and implant insertion. Cadaveric simulation of surgical procedures allows surgeons to measure variables affecting the technical success of surgery and to master new procedures without placing patients at risk.
Abstract:
People often use tools to search for information. To improve the quality of an information search, it is important to understand how internal information, stored in the user's mind, and external information, represented by the interface of the tools, interact with each other. How information is distributed between internal and external representations significantly affects information search performance. However, few studies have examined the relationship between types of interface and types of search task in the context of information search. For a distributed information search task, how data are distributed, represented, and formatted significantly affects user search performance in terms of response time and accuracy. Guided by UFuRT (User, Function, Representation, Task), a human-centered process, I propose a search model and a task taxonomy. The model defines its relationship with other existing information models. The taxonomy clarifies the legitimate operations for each type of search task over relational data. Based on the model and taxonomy, I have also developed interface prototypes for the search tasks of relational data. These prototypes were used for experiments. The experiments described in this study are of a within-subject design with a sample of 24 participants recruited from the graduate schools located in the Texas Medical Center. Participants performed one-dimensional nominal search tasks over nominal, ordinal, and ratio displays, and searched one-dimensional nominal, ordinal, interval, and ratio tasks over table and graph displays. Participants also performed the same task and display combinations for two-dimensional searches. Distributed cognition theory has been adopted as a theoretical framework for analyzing and predicting the search performance of relational data. It has been shown that the representation dimensions and data scales, as well as the search task types, are the main factors in determining search efficiency and effectiveness.
In particular, the more external representations used, the better the search task performance; the results also suggest that ideal search performance occurs when the question type and the corresponding data scale representation match. The implications of the study lie in contributing to the effective design of search interfaces for relational data, especially laboratory results, which are often used in healthcare activities.
Abstract:
Attention has recently been drawn to Enterococcus faecium because of an increasing number of nosocomial infections caused by this species and its resistance to multiple antibacterial agents. However, relatively little is known about the pathogenic determinants of this organism. We have previously identified a cell-wall-anchored collagen adhesin, Acm, produced by some isolates of E. faecium, and a secreted antigen, SagA, exhibiting broad-spectrum binding to extracellular matrix proteins. Here, we analysed the draft genome of strain TX0016 for potential microbial surface components recognizing adhesive matrix molecules (MSCRAMMs). Genome-based bioinformatics identified 22 predicted cell-wall-anchored E. faecium surface proteins (Fms), of which 15 (including Acm) had characteristics typical of MSCRAMMs, including predicted folding into a modular architecture with multiple immunoglobulin-like domains. Functional characterization of one [Fms10; redesignated second collagen adhesin of E. faecium (Scm)] revealed that recombinant Scm(65) (A- and B-domains) and Scm(36) (A-domain) bound to collagen type V efficiently in a concentration-dependent manner, bound considerably less to collagen type I and fibrinogen, and differed from Acm in their binding specificities to collagen types IV and V. Results from far-UV circular dichroism measurements of recombinant Scm(36) and of Acm(37) indicated that these proteins were rich in beta-sheets, supporting our folding predictions. Whole-cell ELISA and FACS analyses unambiguously demonstrated surface expression of Scm in most E. faecium isolates. Strikingly, 11 of the 15 predicted MSCRAMMs clustered in four loci, each with a class C sortase gene; nine of these showed similarity to Enterococcus faecalis Ebp pilus subunits and also contained motifs essential for pilus assembly. 
Antibodies against one of the predicted major pilus proteins, Fms9 (redesignated EbpC(fm)), detected a 'ladder' pattern of high-molecular-mass protein bands in a Western blot analysis of cell surface extracts from E. faecium, suggesting that EbpC(fm) is polymerized into a pilus structure. Further analysis of the transcripts of the corresponding gene cluster indicated that fms1 (ebpA(fm)), fms5 (ebpB(fm)) and ebpC(fm) are co-transcribed, a result consistent with those for pilus-encoding gene clusters of other Gram-positive bacteria. All 15 genes occurred frequently in 30 clinically derived diverse E. faecium isolates tested. The common occurrence of MSCRAMM- and pilus-encoding genes and the presence of a second collagen-binding protein may have important implications for our understanding of this emerging pathogen.
Abstract:
The recA gene is essential for homologous recombination and for inducible DNA repair in Escherichia coli. The level of recA expression is important for these functions. The growth defect of a lambda phage carrying a recA-lacZ fusion was used to select mutations that reduced recA expression. Nine of these mutations were single base changes in the recA promoter; each reduced both induced and basal (repressed) levels of expression, indicating that only one promoter is used under both circumstances. Deletion analysis of the promoter region and S1 mapping of transcripts confirmed that there is only one promoter responsible for both basal and induced expression. Some of the mutants, however, displayed a ratio of induced to repressed expression that was much lower than wild-type. For one of these mutants (recA1270) LexA binding studies showed that this was not due to a change in the affinity of LexA repressor for the operator site. The extent of binding of RNA polymerase to this mutant promoter, however, was much reduced, and the complexes formed were qualitatively different. Further binding experiments provided some evidence that LexA does not block RNA polymerase binding to the recA promoter, but inhibits a later step in initiation. Behavior of the mutants with altered induction ratios could be explained if LexA binding to the operator actually increases RNA polymerase binding to the promoter in a closed complex, compensating for defects in polymerase binding caused by the mutations.
In a study of mutations in the recA structural gene, site-directed mutagenesis was used to replace cysteine codons at positions 90, 116, and 129 with a number of different codons. In vivo analysis of the replacements showed that none of the cysteines is absolutely essential and that they do not have a direct role as catalysts in ATP hydrolysis. Some amino acid substitutions abolished all RecA functions, while a few resulted in partial or altered function. Amino acids at positions 90 and 129 tended to affect all functions equally, while the amino acid at position 116 appeared to have a particular effect on the protease activity of the protein.
Abstract:
In numerous intervention studies and education field trials, random assignment to treatment occurs in clusters rather than at the level of observation. This departure from randomization at the level of the individual unit may be due to logistics, political feasibility, or ecological validity. Data within the same cluster or grouping are often correlated. Application of traditional regression techniques, which assume independence between observations, to clustered data produces consistent parameter estimates. However, such estimators are often inefficient compared to methods that incorporate the clustered nature of the data into the estimation procedure (Neuhaus 1993). Multilevel models, also known as random-effects or random-components models, can be used to account for the clustering of data by estimating higher-level (group) as well as lower-level (individual) variation. Designing a study in which the unit of observation is nested within higher-level groupings requires the determination of sample sizes at each level. This study investigates the design and analysis of various sampling strategies for a 3-level repeated measures design on the parameter estimates when the outcome variable of interest follows a Poisson distribution.
Results of this study suggest that second-order PQL estimation produces the least biased estimates in the 3-level multilevel Poisson model, followed by first-order PQL and then second- and first-order MQL. The MQL estimates of both fixed and random parameters are generally satisfactory when the level 2 and level 3 variation is less than 0.10. However, as the higher-level error variance increases, the MQL estimates become increasingly biased. If convergence of the estimation algorithm is not obtained by the PQL procedure and the higher-level error variance is large, the estimates may be significantly biased. In this case, bias-correction techniques such as bootstrapping should be considered as an alternative procedure.
For larger sample sizes, those structures with 20 or more units sampled at levels with normally distributed random errors produced more stable estimates with less sampling variance than structures with an increased number of level 1 units. For small sample sizes, sampling fewer units at the level with Poisson variation produces less sampling variation; however, this criterion is no longer important when sample sizes are large.
Neuhaus J (1993). "Estimation Efficiency and Tests of Covariate Effects with Clustered Binary Data." Biometrics, 49, 989–996.
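A 3-level Poisson structure of the kind studied can be simulated directly: a log-linear model with normally distributed random effects at levels 2 and 3. A minimal sketch with illustrative sample sizes and variance components (numpy only; this generates data for such a design, it does not implement the PQL/MQL estimation itself):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_poisson_3level(n3=20, n2=10, n1=5, beta0=1.0,
                            var_l3=0.05, var_l2=0.05):
    """Draw counts y_ijk ~ Poisson(exp(beta0 + u_k + v_jk)), where
    u_k ~ N(0, var_l3) are level-3 (group) effects and v_jk ~ N(0, var_l2)
    are level-2 effects nested within them; n1 observations per level-2 unit."""
    u = rng.normal(0.0, np.sqrt(var_l3), size=n3)          # level-3 effects
    v = rng.normal(0.0, np.sqrt(var_l2), size=(n3, n2))    # nested level-2 effects
    log_mu = beta0 + u[:, None, None] + v[:, :, None]      # shape (n3, n2, 1)
    lam = np.broadcast_to(np.exp(log_mu), (n3, n2, n1))    # replicate over level 1
    return rng.poisson(lam)

y = simulate_poisson_3level()
# y.shape == (20, 10, 5): 20 level-3 groups, 10 level-2 units each, 5 obs each
```

Varying n3, n2, and n1 while holding the total sample fixed is exactly the kind of design trade-off the study evaluates: with small higher-level variances (here 0.05), most strategies behave well, and the differences emerge as var_l2 and var_l3 grow.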