15 results for Information Models
in DigitalCommons@The Texas Medical Center
Abstract:
People often use tools to search for information. To improve the quality of an information search, it is important to understand how internal information, stored in the user's mind, and external information, represented by the tool's interface, interact with each other. How information is distributed between internal and external representations significantly affects information search performance. However, few studies have examined the relationship between interface types and search task types in the context of information search. For a distributed information search task, how data are distributed, represented, and formatted significantly affects user search performance in terms of response time and accuracy. Guided by UFuRT (User, Function, Representation, Task), a human-centered process, I propose a search model and a task taxonomy. The model defines its relationship with other existing information models; the taxonomy clarifies the legitimate operations for each type of search task over relational data. Based on the model and taxonomy, I also developed interface prototypes for relational-data search tasks, which were used in the experiments. The experiments described in this study use a within-subject design with a sample of 24 participants recruited from the graduate schools located in the Texas Medical Center. Participants performed one-dimensional nominal search tasks over nominal, ordinal, and ratio displays, and performed one-dimensional nominal, ordinal, interval, and ratio search tasks over table and graph displays. Participants also performed the same task and display combinations for two-dimensional searches. Distributed cognition theory was adopted as the theoretical framework for analyzing and predicting search performance over relational data. The results show that representation dimensions and data scales, as well as search task types, are the main factors determining search efficiency and effectiveness. In particular, the more external representations are used, the better the search task performance, and the results suggest that search performance is best when the question type matches the corresponding data-scale representation. The implications of the study lie in contributing to the effective design of search interfaces for relational data, especially laboratory results, which are widely used in healthcare activities.
Abstract:
Objective: To determine how clinicians' background knowledge, their tasks, and displays of information interact to affect their mental models. Design: Repeated-measures nested experimental design. Population, Sample, Setting: The populations were gastrointestinal/internal medicine physicians and nurses in the greater Houston area; a purposeful sample of 24 physicians and 24 nurses was studied in 2003. Methods: Subjects were randomized to two different displays of two different mock medical records, one containing highlighted patient information and one containing non-highlighted patient information. They were asked to read the records and summarize their understanding of the patients aloud. Propositional analysis was used to assess their comprehension of the patients. Findings: Physicians and nurses formed different mental models when given the same display of information; the information they shared was small relative to the variance between their mental models. There was also more variance within the nurses' mental models than within the physicians' when given different displays of the same information. Statistically, there was no interaction effect between display of information and clinician type; only clinician type accounted for the differences in comprehension and thus in the clinicians' mental models of the cases. Conclusion: The factors that may explain the variance within and between the clinician models are clinician type and, in the nursing group only, the use of highlighting.
Abstract:
Despite current enthusiasm for the investigation of gene-gene and gene-environment interactions, the essential issue of how to define and detect gene-environment interactions remains unresolved. In this report, we define gene-environment interaction as stochastic dependence in the context of the effects of genetic and environmental risk factors on phenotypic variation among individuals. We use mutual information, widely used in communication and complex-system analysis, to measure gene-environment interaction. We investigate how gene-environment interactions generate a large difference in this information measure between the general population and a diseased population, which motivates us to develop mutual information-based statistics for testing gene-environment interactions. We validated the null distribution and calculated the type I error rates of the mutual information-based statistics using extensive simulation studies. We found that the new test statistics were more powerful than traditional logistic regression under several disease models. Finally, to further evaluate the performance of our new method, we applied the mutual information-based statistics to three real examples. Our results showed that the P-values for the mutual information-based statistics were much smaller than those obtained by other approaches, including logistic regression models.
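The test statistic described above is built on the empirical mutual information between genetic and environmental factors. A minimal sketch of that estimator, with function and variable names of my own choosing rather than the dissertation's:

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Empirical mutual information (in bits) between two discrete
    variables, given a list of (genotype, environment) observations."""
    n = len(pairs)
    pg = Counter(g for g, _ in pairs)    # marginal counts of genotype
    pe = Counter(e for _, e in pairs)    # marginal counts of environment
    pge = Counter(pairs)                 # joint counts
    mi = 0.0
    for (g, e), c in pge.items():
        p_joint = c / n
        p_indep = (pg[g] / n) * (pe[e] / n)
        mi += p_joint * log2(p_joint / p_indep)
    return mi

# Independent factors give MI = 0; a deterministic link gives 1 bit here.
indep = [(g, e) for g in (0, 1) for e in (0, 1)] * 25  # uniform, independent
dep = [(0, 0)] * 50 + [(1, 1)] * 50                    # fully dependent
```

The interaction signal in the abstract comes from contrasting this quantity between a diseased sample and the general population.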
Abstract:
Currently more than half of Electronic Health Record (EHR) projects fail. Most of these failures are due not to flawed technology but to the lack of systematic consideration of human issues. Among the barriers to EHR adoption, function mismatching among users, activities, and systems is a major area that has not been systematically addressed from a human-centered perspective. A theoretical framework called the Functional Framework was developed for identifying and reducing functional discrepancies among users, activities, and systems. The Functional Framework is composed of three models: the User Model, the Designer Model, and the Activity Model. The User Model was developed by conducting a survey (N = 32) that identified the functions needed and desired from the users' perspective. The Designer Model was developed by conducting a systematic review of an Electronic Dental Record (EDR) and its functions. The Activity Model was developed using an ethnographic method called shadowing, in which EDR users (5 dentists, 5 dental assistants, 5 administrative personnel) were followed quietly and observed during their activities. These three models were combined to form a unified model, from which the work domain ontology was developed by asking users to rate the functions in the unified model (190 in total) along the dimensions of frequency and criticality in a survey. The functional discrepancies, as indicated by the regions of the Venn diagram formed by the three models, were consistent with the survey results, especially with user satisfaction. The survey for the Functional Framework indicated a preference for one system over the other (R = 0.895). The results of this project showed that the Functional Framework provides a systematic method for identifying, evaluating, and reducing functional discrepancies among users, systems, and activities. Limitations and the generalizability of the Functional Framework are discussed.
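The discrepancy regions described above reduce to plain set operations over the three models' function inventories. A toy sketch with invented function names (the study catalogued 190 real EDR functions):

```python
# Hypothetical function inventories for the three models; the names are
# illustrative only, not drawn from the dissertation's survey.
user = {"enter note", "view chart", "order x-ray", "bill patient"}
designer = {"enter note", "view chart", "audit log"}
activity = {"enter note", "order x-ray", "schedule visit"}

# Venn-diagram regions that flag functional discrepancies:
supported_and_used = user & designer & activity  # well-matched functions
wanted_but_missing = user - designer             # user needs the system lacks
built_but_unwanted = designer - user - activity  # built, but neither wanted nor used
```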
Abstract:
The hippocampus receives input from upper levels of the association cortex and is implicated in many mnemonic processes, but the exact mechanisms by which it codes and stores information remain unresolved. This work examines the flow of information through the hippocampal formation while attempting to determine the computations that each of the hippocampal subfields performs in learning and memory. The formation, storage, and recall of hippocampal-dependent memories are theorized to rely on an autoassociative attractor network that implements two competing yet complementary processes. Pattern separation, hypothesized to occur in the dentate gyrus (DG), refers to the ability to decrease the similarity among incoming information by producing output patterns that overlap less than the inputs. In contrast, pattern completion, hypothesized to occur in the CA3 region, refers to the ability to reproduce a previously stored output pattern from a partial or degraded input pattern. Before addressing the functional roles of the DG and CA3 subfields, the spatial firing properties of neurons in the dentate gyrus were examined. The principal cell of the dentate gyrus, the granule cell, has spatially selective place fields; however, the behavioral correlates of another excitatory cell, the mossy cell of the dentate polymorphic layer, are unknown. This report shows that putative mossy cells have spatially selective firing consisting of multiple fields, similar to the previously reported properties of granule cells. Other cells recorded from the DG had single place fields. Compared to cells with multiple fields, cells with single fields fired at a lower rate during sleep, were less likely to burst, and were more likely to be recorded simultaneously with a large population of neurons that were active during sleep and silent during behavior. These data suggest that single-field and multiple-field cells constitute at least two distinct cell classes in the DG. Based on these characteristics, we propose that putative mossy cells tend to fire in multiple, distinct locations in an environment, whereas putative granule cells tend to fire in single locations, similar to the place fields of the CA1 and CA3 regions. Experimental evidence supporting the theories of pattern separation and pattern completion comes from both behavioral and electrophysiological tests. These studies focused on the function of each subregion and made implicit assumptions about how environmental manipulations changed the representations encoded by the hippocampal inputs; however, the cell populations that provided these inputs were in most cases not directly examined. We conducted a series of studies investigating neural activity in the entorhinal cortex, dentate gyrus, and CA3 under the same experimental conditions, allowing a direct comparison between the input and output representations. The results show that the dentate gyrus representation changes between familiar and cue-altered environments more than its input representations do, whereas the CA3 representation changes less than its input representations. These findings are consistent with longstanding computational models proposing that (1) CA3 is an associative memory system performing pattern completion in order to recall previous memories from partial inputs, and (2) the dentate gyrus performs pattern separation to help store different memories in ways that reduce interference when the memories are subsequently recalled.
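Pattern completion of the kind attributed to CA3 can be illustrated with a toy Hopfield-style autoassociative network. This is a textbook sketch of the attractor idea, not the dissertation's model:

```python
def sign(v):
    return 1 if v >= 0 else -1

def train(patterns):
    """Hebbian weights for an autoassociative network: each pattern is a
    list of +/-1 values; the diagonal is zeroed."""
    n = len(patterns[0])
    return [[0 if i == j else sum(p[i] * p[j] for p in patterns)
             for j in range(n)] for i in range(n)]

def recall(w, x, steps=5):
    """Synchronous updates drive a degraded cue toward the nearest stored
    pattern (pattern completion)."""
    for _ in range(steps):
        x = [sign(sum(w[i][j] * x[j] for j in range(len(x))))
             for i in range(len(x))]
    return x

p1 = [1, 1, 1, 1, -1, -1, -1, -1]
p2 = [1, -1, 1, -1, 1, -1, 1, -1]
w = train([p1, p2])
cue = [-1] + p1[1:]   # partial/degraded input: first bit flipped
```

Starting from the degraded cue, the update rule restores the stored pattern; pattern separation is the opposite tendency, making stored codes overlap less than their inputs.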
Abstract:
The prognosis for lung cancer patients remains poor; five-year survival rates have been reported to be 15%. Studies have shown that dose escalation to the tumor can lead to better local control and subsequently better overall survival. However, the dose delivered to a lung tumor is limited by normal tissue toxicity, the most prevalent thoracic toxicity being radiation pneumonitis. To determine a safe dose that can be delivered to the healthy lung, researchers have turned to mathematical models that predict the rate of radiation pneumonitis. These models, however, rely on simple metrics based on the dose-volume histogram and are not yet accurate enough to be used for dose escalation trials. The purpose of this work was to improve the fit of predictive risk models for radiation pneumonitis and to show the dosimetric benefit of using the models to guide patient treatment planning. The study was divided into three specific aims, the first two focused on improving the fit of the predictive model. In Specific Aim 1 we incorporated information about the spatial location of the lung dose distribution into a predictive model. In Specific Aim 2 we incorporated ventilation-based functional information into a predictive pneumonitis model. In the third specific aim, a proof-of-principle virtual simulation was performed in which a model-determined limit was used to scale the prescription dose. The data showed that for our patient cohort, the fit of the model was not improved by incorporating spatial information. Although we were not able to achieve a significant improvement in model fit using pre-treatment ventilation, we show some promising results indicating that ventilation imaging can provide useful information about lung function in lung cancer patients. The virtual simulation trial demonstrated that using a personalized lung dose limit derived from a predictive model results in a different prescription than the clinically used plan, demonstrating the utility of a normal tissue toxicity model in personalizing the prescription dose.
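The dose-volume-histogram metrics that the abstract says current pneumonitis models rely on are simple summaries such as mean lung dose (MLD) and Vx, the fraction of lung volume receiving at least x Gy. A minimal sketch with invented voxel doses:

```python
def dvh_metrics(voxel_doses_gy, threshold_gy=20.0):
    """Mean lung dose and Vx (fraction of lung volume receiving at least
    `threshold_gy`), assuming equal-volume voxels. Values are illustrative;
    clinical DVHs come from treatment planning systems."""
    n = len(voxel_doses_gy)
    mld = sum(voxel_doses_gy) / n
    vx = sum(d >= threshold_gy for d in voxel_doses_gy) / n
    return mld, vx

doses = [0, 5, 10, 18, 22, 30, 45, 2]   # made-up voxel doses in Gy
mld, v20 = dvh_metrics(doses)
```

The dissertation's point is that such scalar summaries discard spatial and functional information, which Specific Aims 1 and 2 tried to restore.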
Abstract:
Cultural models of the domains of healing and health shape how people understand health and their behavior regarding it. The biomedical model has been predominant in Western society. The recent popularity of holistic health and alternative healing modalities contrasts with the biomedical model and the assumptions under which that model has been practiced. The holistic health movement characterizes an effort by health care providers, such as nurses, to expand the biomedical model, and it has often incorporated alternative modalities. This research described and compared the cultural models of healing of professional nurses and alternative healers: a group of nursing faculty who promote a holistic model was compared with a group of healers using healing touch. Ethnographic methods of participant observation, free listing, and pile sorting were used. Theoretical sampling in the free listings reached saturation at 18 in the group of nurses and 21 in the group of healers. Categories consistent across both groups emerged from the data: physical, mental, attitude, relationships, spiritual, self-management, and health seeking, the last including biomedical and alternative resources. The healers made little differentiation between the concepts of health and healing. The nurses, however, had more elements in self-management for health and in health seeking for healing, reflecting the nurse's role in facilitating the shift in locus of responsibility between health and healing. The healers provided more specific information regarding alternative resources. The healers' conceptualization of health was embedded in a spiritual belief system and contrasted dramatically with that of biomedicine. The healers' models also contrasted with holistic health in the areas of holism, locus of responsibility, and dealing with uncertainty. The similarity between the groups, and their dissimilarity to biomedicine, suggests a larger cultural shift in beliefs regarding health care.
Abstract:
This paper reports a comparison of three modeling strategies for the analysis of hospital mortality in a sample of general medicine inpatients at a Department of Veterans Affairs medical center. Logistic regression, a Markov chain model, and longitudinal logistic regression were evaluated on predictive performance, as measured by the c-index, and on the accuracy of expected numbers of deaths compared to observed. The logistic regression used patient information collected at admission; the Markov model comprised two absorbing states, for discharge and death, and three transient states reflecting increasing severity of illness as measured by laboratory data collected during the hospital stay; the longitudinal regression employed generalized estimating equations (GEE) to model the covariance structure of the repeated binary outcome. Results showed that logistic regression predicted hospital mortality as well as the alternative methods but was limited in its scope of application. The Markov chain provides insight into how day-to-day changes in illness severity lead to discharge or death. The longitudinal logistic regression showed that an increasing illness trajectory is associated with hospital mortality. The conclusion reached is that for standard applications in modeling hospital mortality, logistic regression is adequate, but for the new challenges facing health services research today, the alternative methods are equally predictive and practical and can provide new insights.
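The Markov model described above has three transient severity states and two absorbing states. A toy sketch of how eventual death-versus-discharge probabilities fall out of such a chain; the transition probabilities here are invented for illustration, not fitted to VA data:

```python
# Daily transition probabilities over three transient severity states and
# two absorbing states (discharge, death). Each row sums to 1.
P = {
    "mild":     {"mild": 0.6, "moderate": 0.2, "severe": 0.0, "discharge": 0.20, "death": 0.00},
    "moderate": {"mild": 0.2, "moderate": 0.5, "severe": 0.2, "discharge": 0.08, "death": 0.02},
    "severe":   {"mild": 0.0, "moderate": 0.3, "severe": 0.5, "discharge": 0.05, "death": 0.15},
}

def absorption_prob(start, absorbing="death", days=500):
    """Probability of eventually landing in `absorbing`, computed by
    propagating the state distribution day by day until the transient
    mass has decayed away."""
    states = ("mild", "moderate", "severe", "discharge", "death")
    dist = {s: 0.0 for s in states}
    dist[start] = 1.0
    for _ in range(days):
        new = {s: 0.0 for s in states}
        new["discharge"], new["death"] = dist["discharge"], dist["death"]
        for s, row in P.items():          # only transient states move
            for t, p in row.items():
                new[t] += dist[s] * p
        dist = new
    return dist[absorbing]
```

With these made-up numbers, mortality risk rises with the starting severity state, which is the kind of day-to-day insight the abstract credits the Markov model with.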
Abstract:
Most statistical analysis, in theory and in practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time always brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised. When data are gathered sequentially, dynamic interim monitoring may be useful as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme to capture the evolution of both population and individual effects over time. While static models often describe aggregate information well, they often do not reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application to a small-sample, repeated-measures, normally distributed growth-curve dataset is presented.
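The Gibbs sampler mentioned above alternates draws from each parameter's full conditional distribution. A minimal sketch for a single normal mean and precision with vague conjugate priors; the data values and hyperparameters are invented, and the dissertation's hierarchical models are far richer:

```python
import random

random.seed(0)
data = [4.8, 5.1, 5.0, 5.3, 4.9, 5.2]   # made-up growth measurements
n, xbar = len(data), sum(data) / len(data)

def gibbs(n_iter=5000, burn=500):
    """Minimal Gibbs sampler for a normal mean `mu` and precision `tau`,
    alternating full-conditional draws; returns the posterior mean of mu."""
    mu, tau = 0.0, 1.0
    mus = []
    for t in range(n_iter):
        # mu | tau, data ~ Normal(xbar, 1 / (n * tau))  (flat prior on mu)
        mu = random.gauss(xbar, (1.0 / (n * tau)) ** 0.5)
        # tau | mu, data ~ Gamma(shape = n/2 + a0, rate = b0 + SS/2)
        ss = sum((x - mu) ** 2 for x in data)
        tau = random.gammavariate(0.5 * n + 0.001, 1.0 / (0.001 + 0.5 * ss))
        if t >= burn:
            mus.append(mu)
    return sum(mus) / len(mus)
```

In a dynamic hierarchical model the same alternation runs over subject-specific and population parameters, updating as each new observational unit arrives.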
Abstract:
Early Employee Assistance Programs (EAPs) had their origin in humanitarian motives, and there was little concern for their cost/benefit ratios. However, as some programs began accumulating data and analyzing them over time, even with single variables such as absenteeism, it became apparent that the humanitarian reasons for a program could be reinforced by cost savings, particularly when the existence of the program was subject to justification. Today there is general agreement that cost/benefit analyses of EAPs are desirable, but specific models for such analyses, particularly ones making use of sophisticated but simple computer-based data management systems, are few. The purpose of this research and development project was to develop a method, a design, and a prototype for gathering, managing, and presenting information about EAPs. The scheme provides information retrieval and analyses relevant to such aspects of EAP operations as: (1) EAP personnel activities, (2) supervisory training effectiveness, (3) client population demographics, (4) assessment and referral effectiveness, (5) treatment network efficacy, and (6) economic worth of the EAP. The scheme has been implemented and operational at The University of Texas Employee Assistance Programs for more than three years. Application of the scheme in the various programs has identified certain variables that remain necessary in all programs; depending on the degree of aggressiveness for data acquisition maintained by program personnel, other program-specific variables are also defined.
Abstract:
Microarray technology is a high-throughput method for genotyping and gene expression profiling, and limited sensitivity and specificity are among its essential problems. Most existing methods of microarray data analysis have an apparent limitation in that they deal only with the numerical part of microarray data and make little use of gene sequence information. Because it is the gene sequences that precisely define the physical objects being measured by a microarray, it is natural to make them an essential part of the data analysis. This dissertation focused on the development of free-energy models to integrate sequence information into microarray data analysis. The models were used to characterize the mechanism of hybridization on microarrays and to enhance the sensitivity and specificity of microarray measurements. Cross-hybridization is a major obstacle to the sensitivity and specificity of microarray measurements. In this dissertation, we evaluated the scope of the cross-hybridization problem on short-oligo microarrays. The results showed that cross-hybridization on arrays is mostly caused by oligo fragments with a run of 10 to 16 nucleotides complementary to the probes. Furthermore, a free-energy-based model was proposed to quantify the amount of cross-hybridization signal on each probe. This model treats cross-hybridization as an integral effect of the interactions between a probe and various off-target oligo fragments. Using public spike-in datasets, the model showed high accuracy in predicting the cross-hybridization signals on probes whose intended targets are absent from the sample. Several prospective models were proposed to improve the Positional Dependent Nearest-Neighbor (PDNN) model for better quantification of gene expression and cross-hybridization. The problem addressed in this dissertation is fundamental to microarray technology. We expect this study to help clarify the detailed mechanism that determines sensitivity and specificity on microarrays. Consequently, this research will have a wide impact on how microarrays are designed and how the data are interpreted.
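A free-energy model of the kind described scores a probe-target duplex by summing nearest-neighbor stacking contributions along the sequence. A sketch with placeholder values; the numbers below are illustrative, not the fitted PDNN parameters or published thermodynamic tables:

```python
# Illustrative nearest-neighbor stacking free energies (kcal/mol) for DNA
# dinucleotide steps. Placeholder values for demonstration only.
NN_DG = {
    "AA": -1.0, "AT": -0.9, "AC": -1.4, "AG": -1.3,
    "TA": -0.6, "TT": -1.0, "TC": -1.3, "TG": -1.5,
    "CA": -1.5, "CT": -1.3, "CC": -1.8, "CG": -2.2,
    "GA": -1.3, "GC": -2.2, "GG": -1.8, "GT": -1.4,
}

def duplex_free_energy(seq):
    """Sum stacking terms over adjacent bases of a perfectly matched
    duplex; more negative means a more stable probe-target pairing and
    hence a stronger hybridization signal."""
    return sum(NN_DG[seq[i:i + 2]] for i in range(len(seq) - 1))
```

Longer complementary runs accumulate more negative free energy, which is consistent with the finding that 10 to 16 nt complementary fragments dominate cross-hybridization.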
Abstract:
Background. Childhood immunization programs have dramatically reduced the morbidity and mortality associated with vaccine-preventable diseases. Proper documentation of administered immunizations is essential to prevent duplicate immunization of children. To help improve documentation, immunization information systems (IISs) have been developed. IISs are comprehensive repositories of immunization information for children residing within a geographic region. The two models for participation in an IIS are voluntary inclusion ("opt-in") and voluntary exclusion ("opt-out"). In an opt-in system, consent must be obtained for each participant; conversely, in an opt-out IIS, all children are included unless procedures to exclude the child are completed. Consent requirements for participation vary by state; the Texas IIS, ImmTrac, is an opt-in system. Objectives. The specific objectives are to: (1) evaluate the variance in the time and costs associated with collecting ImmTrac consent at public and private birthing hospitals in the Greater Houston area; (2) estimate the total costs associated with collecting ImmTrac consent at selected public and private birthing hospitals in the Greater Houston area; and (3) describe the alternative opt-out process for collecting ImmTrac consent at birth and discuss the associated cost savings relative to an opt-in system. Methods. Existing time-motion studies (n=281) conducted between October 2006 and August 2007 at 8 birthing hospitals in the Greater Houston area were used to assess the time and costs associated with obtaining ImmTrac consent at birth. All data analyzed are deidentified and contain no personal information. Variations in time and costs at each location were assessed, and total costs per child and costs per year were estimated. The cost of an alternative opt-out system was also calculated. Results. The median time required by birth registrars to complete consent procedures varied from 72 to 285 seconds per child. The annual costs of obtaining consent for 388,285 newborns under ImmTrac's opt-in consent process were estimated at $702,000; the corresponding costs of the proposed opt-out system were estimated to total $194,000 per year. Conclusions. Substantial variation in the time and costs associated with completing ImmTrac consent procedures was observed. Changing to an opt-out system for participation could represent significant cost savings.
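The totals reported above imply simple per-child costs. A back-of-the-envelope check using only the abstract's own figures:

```python
# Figures taken from the abstract; per-child costs are derived, not reported.
newborns = 388_285
opt_in_total, opt_out_total = 702_000, 194_000

per_child_opt_in = opt_in_total / newborns     # roughly $1.81 per newborn
per_child_opt_out = opt_out_total / newborns   # roughly $0.50 per newborn
annual_savings = opt_in_total - opt_out_total  # $508,000 per year
```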
Abstract:
The three articles that comprise this dissertation describe how small area estimation and geographic information systems (GIS) technologies can be integrated to provide useful information about the number of uninsured and where they are located. Comprehensive data about the numbers and characteristics of the uninsured are typically available only from surveys. Utilization and administrative data are poor proxies from which to develop this information: those who cannot access services are unlikely to be fully captured, either by health care provider utilization data or by state and local administrative data. In the absence of direct measures, a well-developed estimate of the local uninsured count or rate can prove valuable when assessing the unmet health service needs of this population. However, the fact that these are "estimates" increases the chances that results will be rejected or, at best, treated with suspicion. The visual impact and spatial analysis capabilities afforded by GIS technology can strengthen the likelihood that area estimates will be accepted by those most likely to benefit from the information, including health planners and policy makers. The first article describes how uninsured estimates are currently performed in the Houston metropolitan region. It details the synthetic model used to calculate the numbers and percentages of uninsured, and how the resulting estimates are integrated into a GIS. The second article compares the estimation method of the first article with one currently used by the Texas State Data Center to estimate the number of uninsured for all Texas counties. Estimates are developed for census tracts in Harris County using both models with the same data sets, and the results are statistically compared. The third article describes a new, revised synthetic method being tested to provide uninsured estimates at sub-county levels for eight counties in the Houston metropolitan area. It is designed to replicate the categorical results provided by a current U.S. Census Bureau estimation method, and the estimates calculated by this revised model are compared to the most recent U.S. Census Bureau estimates, using the same areas and population categories.
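Synthetic estimation of the kind described applies group-level rates from a large survey to local population counts. A toy sketch; the rates, groups, and tract populations are invented, not the dissertation's data:

```python
# Hypothetical uninsured rates by (age, income) group from a large survey,
# applied to one census tract's population counts. All values invented.
survey_rates = {("18-34", "low_income"): 0.38, ("18-34", "high_income"): 0.12,
                ("35-64", "low_income"): 0.29, ("35-64", "high_income"): 0.08}
tract_pop = {("18-34", "low_income"): 900, ("18-34", "high_income"): 400,
             ("35-64", "low_income"): 1100, ("35-64", "high_income"): 1600}

# Synthetic estimate: sum of rate x local count over demographic groups.
uninsured = sum(survey_rates[g] * tract_pop[g] for g in tract_pop)
rate = uninsured / sum(tract_pop.values())
```

Mapping such tract-level estimates in a GIS is what the dissertation argues makes them persuasive to planners and policy makers.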
Abstract:
Additive and multiplicative models of relative risk were used to measure the effect of cancer misclassification and DS86 random errors on lifetime risk projections in the Life Span Study (LSS) of Hiroshima and Nagasaki atomic bomb survivors. The true number of cancer deaths in each stratum of the cancer mortality cross-classification was estimated using sufficient statistics from the EM algorithm. Average survivor doses in the strata were corrected for DS86 random error (σ = 0.45) by use of reduction factors. Poisson regression was used to model the corrected and uncorrected mortality rates with covariates for age at time of bombing, age at time of death, and gender. Excess risks were in good agreement with the risks in RERF Report 11 (Part 2) and the BEIR-V report. Bias due to DS86 random error typically ranged from −15% to −30% for both sexes and all sites and models. The total bias, including diagnostic misclassification, of the excess risk of nonleukemia for exposure to 1 Sv from age 18 to 65 under the non-constant relative projection model was −37.1% for males and −23.3% for females. Total excess risks of leukemia under the relative projection model were biased −27.1% for males and −43.4% for females. Thus, nonleukemia risks for 1 Sv from ages 18 to 85 (DRREF = 2) increased from 1.91%/Sv to 2.68%/Sv among males and from 3.23%/Sv to 4.02%/Sv among females. Leukemia excess risks increased from 0.87%/Sv to 1.10%/Sv among males and from 0.73%/Sv to 1.04%/Sv among females. Bias depended on the gender, site, correction method, exposure profile, and projection model considered. Future studies that use LSS data for U.S. nuclear workers may be downwardly biased if lifetime risk projections are not adjusted for random and systematic errors. (Supported by U.S. NRC Grant NRC-04-091-02.)
Abstract:
In light of the new healthcare regulations, hospitals are increasingly reevaluating their IT integration strategies to meet expanded healthcare information exchange requirements. Nevertheless, hospital executives do not have all the information they need to differentiate between the available strategies and recognize which may better fit their organizational needs. To provide that information, this study explored the relationships between hospital financial performance, integration strategy selection, and strategy change. The integration strategies examined, applied as binary logistic regression dependent variables and ordered from most to least integrated, were Single-Vendor (SV), Best-of-Suite (BoS), and Best-of-Breed (BoB). The financial measurements adopted as independent variables for the models were two administrative labor efficiency measures and six industry-standard financial ratios, designed to provide a broad proxy of hospital financial performance. In addition, descriptive statistical analyses were carried out to evaluate recent trends in hospital integration strategy change. Overall, six research questions were proposed. The first asked whether financial performance was related to the selection of integration strategies. The next questions explored whether hospitals were more likely to change strategies or remain the same when there was no external stimulus to change, and whether, if they did change, they preferred strategies closer to their existing ones. These were followed by a question asking whether financial performance was also related to strategy change. Finally, the last two questions probed whether the new Health Information Technology for Economic and Clinical Health (HITECH) Act had any impact on the frequency and direction of strategy change. The results confirmed that financial performance is related to both IT integration strategy selection and strategy change, while concurring with prior studies suggesting that hospital and environmental characteristics are associated factors as well. Specifically, this study noted that the most integrated SV strategy is related to increased administrative labor efficiency, and the hybrid BoS strategy is associated with improved financial health (based on operating margin and equity financing ratios). On the other hand, no financial indicators were found to be related to the least integrated BoB strategy, except for short-term liquidity (current ratio) in the case of strategy change. Ultimately, this study concluded that when making IT integration strategy decisions, hospitals closely follow the resource dependence view of minimizing uncertainty. As each integration strategy may favor certain organizational characteristics, hospitals have traditionally preferred not to make strategy changes, and when they did, they selected strategies more closely related to their existing ones. However, as new regulations further heighten revenue uncertainty while requiring increased information integration, and as evidence already suggests a growing trend of organizations shifting toward more integrated strategies, hospitals may find their strategy selection choices increasingly limited.