918 results for Information Models
Abstract:
OBJECTIVE: Anaemia in rheumatoid arthritis (RA) is prototypical of the chronic disease type and is often neglected in clinical practice. We studied anaemia in relation to disease activity, medications and radiographic progression. METHODS: Data were collected between 1996 and 2007 over a mean follow-up of 2.2 years. Anaemia was defined according to WHO criteria (♀: haemoglobin <12 g/dl; ♂: haemoglobin <13 g/dl) or alternative criteria. Anaemia prevalence was studied in relation to disease parameters and pharmacological therapy. Radiographic progression was analysed in 9731 radiograph sets from 2681 patients in crude longitudinal regression models and after adjusting for potential confounding factors, including the clinical disease activity score with the 28-joint count for tender and swollen joints and erythrocyte sedimentation rate (DAS28ESR) or the clinical disease activity index (cDAI), synthetic antirheumatic drugs and anti-tumour necrosis factor (TNF) therapy. RESULTS: Anaemia prevalence decreased from more than 24% in the years before 2001 to 15% in 2007. Erosions progressed significantly faster in patients with anaemia (p<0.001). Adjusted models showed these effects to be independent of clinical disease activity and other indicators of disease severity. Radiographic damage progression rates increased with the severity of anaemia, suggesting a 'dose-response' effect. The effect of anaemia on damage progression was maintained in subgroups of patients treated with TNF blockade or corticosteroids, and in those not taking non-selective nonsteroidal anti-inflammatory drugs (NSAIDs). CONCLUSIONS: Anaemia in RA appears to capture disease processes that remain unmeasured by established disease activity measures in patients with or without TNF blockade, and may help to identify patients with more rapid erosive disease.
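The WHO definition quoted in the abstract can be expressed as a simple predicate. The sketch below assumes haemoglobin in g/dl and a binary sex code; neither the function name nor the code values come from the study itself.

```python
def is_anaemic(haemoglobin_g_dl: float, sex: str) -> bool:
    """WHO definition as quoted: women < 12 g/dl, men < 13 g/dl.

    `sex` is a hypothetical code ("F" or "M"), not a field from the study.
    """
    threshold = 12.0 if sex == "F" else 13.0
    return haemoglobin_g_dl < threshold
```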
Abstract:
With improving clinical CT scanning technology, the accuracy of CT-based finite element (FE) models of the human skeleton may be improved by an enhanced description of apparent-level bone mechanical properties. Micro-finite element (μFE) modeling can be used to study the apparent elastic behavior of human cancellous bone. In this study, samples from the femur, radius and vertebral body were investigated to evaluate the predictive power of morphology–elasticity relationships and to compare them across different anatomical regions. μFE models of 701 trabecular bone cubes with a side length of 5.3 mm were analyzed using kinematic boundary conditions. Based on the FE results, four morphology–elasticity models using bone volume fraction as well as full, limited or no fabric information were calibrated for each anatomical region. The five-parameter Zysset–Curnier model using full fabric information showed excellent predictive power, with adjusted coefficients of determination (r²adj) of 0.98, 0.95 and 0.94 for the femur, radius and vertebra data, respectively, and mean total norm errors between 14% and 20%. A constant orthotropy model and a constant transverse isotropy model, in which the elastic anisotropy is defined by the model parameters, yielded coefficients of determination between 0.90 and 0.98 with total norm errors between 16% and 25%. Neglecting fabric information and using an isotropic model led to r²adj between 0.73 and 0.92 with total norm errors between 38% and 49%. A comparison of the model regressions revealed minor but significant (p<0.01) differences between the fabric–elasticity model parameters calibrated for the different anatomical regions. The proposed models and identified parameters can be used in future studies to compute the apparent elastic properties of human cancellous bone for homogenized FE models.
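For reference, the five-parameter Zysset–Curnier model named in the abstract is commonly written as a power law in bone volume fraction ρ and the fabric eigenvalues m_i. The form below is the usual one from the literature, not the region-specific calibrations reported here:

```latex
% Zysset–Curnier orthotropic stiffness
% (five parameters: \lambda_0, \lambda_0', \mu_0, k, l)
\lambda_{ii} = (\lambda_0 + 2\mu_0)\,\rho^{k} m_i^{2l}, \qquad
\lambda_{ij} = \lambda_0'\,\rho^{k} m_i^{l} m_j^{l} \quad (i \neq j), \qquad
\mu_{ij} = \mu_0\,\rho^{k} m_i^{l} m_j^{l}
```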
Abstract:
Despite current enthusiasm for the investigation of gene-gene and gene-environment interactions, the essential issue of how to define and detect gene-environment interactions remains unresolved. In this report, we define gene-environment interaction as a stochastic dependence between the effects of genetic and environmental risk factors on the causes of phenotypic variation among individuals. We use mutual information, which is widely used in communication and complex-system analysis, to measure gene-environment interactions. We investigate how gene-environment interactions generate a large difference in the information measure of gene-environment interactions between the general population and a diseased population, which motivates us to develop mutual information-based statistics for testing gene-environment interactions. We validated the null distribution and calculated the type I error rates of the mutual information-based statistics for testing gene-environment interactions using extensive simulation studies. We found that the new test statistics were more powerful than traditional logistic regression under several disease models. Finally, to further evaluate the performance of our new method, we applied the mutual information-based statistics to three real examples. Our results showed that the P-values for the mutual information-based statistics were much smaller than those obtained by other approaches, including logistic regression models.
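The information measure itself is straightforward to compute from a genotype-by-exposure contingency table. This sketch computes only the mutual information, not the paper's test statistics, and the tables in the usage below are invented for illustration.

```python
import numpy as np

def mutual_information(counts: np.ndarray) -> float:
    """Mutual information I(G;E) in nats from a count table
    (rows: genotypes, columns: exposure levels)."""
    p = counts / counts.sum()
    pg = p.sum(axis=1, keepdims=True)  # genotype marginal
    pe = p.sum(axis=0, keepdims=True)  # exposure marginal
    nz = p > 0                         # skip empty cells (0 log 0 = 0)
    return float((p[nz] * np.log(p[nz] / (pg @ pe)[nz])).sum())
```

An independent table gives a value of zero; a perfectly dependent 2x2 table gives log 2.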
Abstract:
Currently, more than half of Electronic Health Record (EHR) projects fail. Most of these failures are due not to flawed technology but to the lack of systematic consideration of human issues. Among the barriers to EHR adoption, function mismatches among users, activities, and systems form a major area that has not been systematically addressed from a human-centered perspective. A theoretical framework called the Functional Framework was developed for identifying and reducing functional discrepancies among users, activities, and systems. The Functional Framework is composed of three models: the User Model, the Designer Model, and the Activity Model. The User Model was developed by conducting a survey (N = 32) that identified the functions needed and desired from the user's perspective. The Designer Model was developed by conducting a systematic review of an Electronic Dental Record (EDR) and its functions. The Activity Model was developed using an ethnographic method called shadowing, in which EDR users (5 dentists, 5 dental assistants, 5 administrative personnel) were followed quietly and observed during their activities. These three models were combined to form a unified model. From the unified model, the work domain ontology was developed by asking users to rate the functions in the unified model (190 in total) along the dimensions of frequency and criticality in a survey. The functional discrepancies, as indicated by the regions of the Venn diagram formed by the three models, were consistent with the survey results, especially with user satisfaction. The survey for the Functional Framework indicated a preference for one system over the other (R = 0.895). The results of this project showed that the Functional Framework provides a systematic method for identifying, evaluating, and reducing functional discrepancies among users, systems, and activities. Limitations and the generalizability of the Functional Framework are discussed.
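The functional discrepancies read off the Venn diagram regions are just set operations over the three models' function inventories. The function names below are invented placeholders, not items from the 190-function survey.

```python
# Hypothetical function inventories (illustrative names only).
user_model = {"charting", "billing", "imaging", "scheduling"}      # needed/desired
designer_model = {"charting", "billing", "reporting"}              # implemented
activity_model = {"charting", "scheduling", "imaging"}             # observed in use

# Venn-diagram regions of interest:
supported_and_used = user_model & designer_model & activity_model  # well matched
needed_but_missing = (user_model & activity_model) - designer_model
built_but_unused = designer_model - (user_model | activity_model)
```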
Abstract:
The hippocampus receives input from upper levels of the association cortex and is implicated in many mnemonic processes, but the exact mechanisms by which it codes and stores information are unresolved. This work examines the flow of information through the hippocampal formation while attempting to determine the computations that each of the hippocampal subfields performs in learning and memory. The formation, storage, and recall of hippocampal-dependent memories are theorized to utilize an autoassociative attractor network that functions by implementing two competitive, yet complementary, processes. Pattern separation, hypothesized to occur in the dentate gyrus (DG), refers to the ability to decrease the similarity among incoming information by producing output patterns that overlap less than the inputs. In contrast, pattern completion, hypothesized to occur in the CA3 region, refers to the ability to reproduce a previously stored output pattern from a partial or degraded input pattern. Before addressing the functional role of the DG and CA3 subfields, the spatial firing properties of neurons in the dentate gyrus were examined. The principal cell of the dentate gyrus, the granule cell, has spatially selective place fields; however, the behavioral correlates of another excitatory cell, the mossy cell of the dentate polymorphic layer, are unknown. This report shows that putative mossy cells have spatially selective firing that consists of multiple fields, similar to previously reported properties of granule cells. Other cells recorded from the DG had single place fields. Compared to cells with multiple fields, cells with single fields fired at a lower rate during sleep, were less likely to burst, and were more likely to be recorded simultaneously with a large population of neurons that were active during sleep and silent during behavior. These data suggest that single-field and multiple-field cells constitute at least two distinct cell classes in the DG.
Based on these characteristics, we propose that putative mossy cells tend to fire in multiple, distinct locations in an environment, whereas putative granule cells tend to fire in single locations, similar to place fields of the CA1 and CA3 regions. Experimental evidence supporting the theories of pattern separation and pattern completion comes from both behavioral and electrophysiological tests. These studies specifically focused on the function of each subregion and made implicit assumptions about how environmental manipulations changed the representations encoded by the hippocampal inputs. However, the cell populations that provided these inputs were in most cases not directly examined. We conducted a series of studies to investigate the neural activity in the entorhinal cortex, dentate gyrus, and CA3 under the same experimental conditions, which allowed a direct comparison between the input and output representations. The results show that the dentate gyrus representation changes between the familiar and cue-altered environments more than its input representations, whereas the CA3 representation changes less than its input representations. These findings are consistent with longstanding computational models proposing that (1) CA3 is an associative memory system performing pattern completion in order to recall previous memories from partial inputs, and (2) the dentate gyrus performs pattern separation to help store different memories in ways that reduce interference when the memories are subsequently recalled.
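The input/output comparison described above can be made concrete with a simple overlap measure: pattern separation means the output representations are less correlated than the inputs they derive from. This is a sketch only; the vectors in the usage are toy data, not recorded activity.

```python
import numpy as np

def pattern_overlap(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two population activity vectors."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def separates(in_a, in_b, out_a, out_b) -> bool:
    """True if the outputs overlap less than the corresponding inputs
    (the signature of pattern separation; the reverse suggests completion)."""
    return pattern_overlap(out_a, out_b) < pattern_overlap(in_a, in_b)
```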
Abstract:
The prognosis for lung cancer patients remains poor; five-year survival rates have been reported to be 15%. Studies have shown that dose escalation to the tumor can lead to better local control and subsequently better overall survival. However, the dose to a lung tumor is limited by normal tissue toxicity, the most prevalent thoracic toxicity being radiation pneumonitis. To determine a safe dose that can be delivered to the healthy lung, researchers have turned to mathematical models predicting the rate of radiation pneumonitis. However, these models rely on simple metrics based on the dose-volume histogram and are not yet accurate enough to be used for dose escalation trials. The purpose of this work was to improve the fit of predictive risk models for radiation pneumonitis and to show the dosimetric benefit of using the models to guide patient treatment planning. The study was divided into three specific aims, the first two of which focused on improving the fit of the predictive model. In Specific Aim 1 we incorporated information about the spatial location of the lung dose distribution into a predictive model. In Specific Aim 2 we incorporated ventilation-based functional information into a predictive pneumonitis model. In the third specific aim, a proof-of-principle virtual simulation was performed in which a model-determined limit was used to scale the prescription dose. The data showed that, for our patient cohort, the fit of the model to the data was not improved by incorporating spatial information. Although we were not able to achieve a significant improvement in model fit using pre-treatment ventilation, we show some promising results indicating that ventilation imaging can provide useful information about lung function in lung cancer patients.
The virtual simulation trial demonstrated that using a personalized lung dose limit derived from a predictive model results in a different prescription than that achieved with the clinically used plan, demonstrating the utility of a normal tissue toxicity model in personalizing the prescription dose.
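A minimal version of the model-guided prescription idea: compute the mean lung dose from a differential dose-volume histogram and feed it to a logistic risk model. The coefficients below are illustrative placeholders, not the fitted values from this work.

```python
import math

def mean_dose_from_dvh(dose_bins_gy, volume_fractions):
    """Mean lung dose from a differential DVH (fractions should sum to 1)."""
    return sum(d * v for d, v in zip(dose_bins_gy, volume_fractions))

def pneumonitis_risk(mean_lung_dose_gy, b0=-3.9, b1=0.13):
    """Logistic dose-response; b0 and b1 are hypothetical, not fitted here."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * mean_lung_dose_gy)))
```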
Abstract:
Cultural models of the domains of healing and health are important in how people understand health and their behavior regarding it. The biomedical model has been predominant in Western society. The recent popularity of holistic health and alternative healing modalities contrasts with the biomedical model and the assumptions upon which that model has been practiced. The holistic health movement characterizes an effort by health care providers, such as nurses, to expand the biomedical model, and has often incorporated alternative modalities. This research described and compared the cultural models of healing of professional nurses and alternative healers. A group of nursing faculty who promote a holistic model were compared to a group of healers using healing touch. Ethnographic methods of participant observation, free listing and pile sorting were used. Theoretical sampling in the free listings reached saturation at 18 in the group of nurses and 21 in the group of healers. Categories consistent across both groups emerged from the data: physical, mental, attitude, relationships, spiritual, self-management, and health seeking, including biomedical and alternative resources. The healers made little differentiation between the concepts of health and healing. The nurses, however, had more elements in self-management for health and in health seeking for healing. This reflects the nurse's role in facilitating the shift in the locus of responsibility between health and healing. The healers provided more specific information regarding alternative resources. The healers' conceptualization of health was embedded in a spiritual belief system and contrasted dramatically with that of biomedicine. The healers' models also contrasted with holistic health in the areas of holism, locus of responsibility, and dealing with uncertainty. The similarity between the groups, and their dissimilarity to biomedicine, suggests a larger cultural shift in beliefs regarding health care.
Abstract:
This paper reports a comparison of three modeling strategies for the analysis of hospital mortality in a sample of general medicine inpatients in a Department of Veterans Affairs medical center. Logistic regression, a Markov chain model, and longitudinal logistic regression were evaluated on predictive performance, as measured by the c-index, and on the accuracy of expected numbers of deaths compared to observed. The logistic regression used patient information collected at admission; the Markov model comprised two absorbing states, for discharge and death, and three transient states reflecting increasing severity of illness as measured by laboratory data collected during the hospital stay; the longitudinal regression employed Generalized Estimating Equations (GEE) to model the covariance structure of the repeated binary outcome. Results showed that logistic regression predicted hospital mortality as well as the alternative methods but was limited in scope of application. The Markov chain provides insights into how day-to-day changes in illness severity lead to discharge or death. The longitudinal logistic regression showed that an increasing illness trajectory is associated with hospital mortality. We conclude that for standard applications in modeling hospital mortality, logistic regression is adequate, but for the new challenges facing health services research today, the alternative methods are equally predictive and practical, and can provide new insights.
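The absorbing Markov chain described above can be summarized with the standard fundamental-matrix computation. The transition probabilities below are invented for illustration, not estimated from VA data.

```python
import numpy as np

# Daily transitions over [mild, moderate, severe, discharged, dead];
# the last two states are absorbing. Probabilities are hypothetical.
P = np.array([
    [0.70, 0.15, 0.02, 0.12, 0.01],
    [0.20, 0.55, 0.10, 0.13, 0.02],
    [0.05, 0.20, 0.55, 0.05, 0.15],
    [0.00, 0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
Q = P[:3, :3]                      # transient -> transient
R = P[:3, 3:]                      # transient -> absorbing
N = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix (expected visits)
B = N @ R                          # P(discharge or death | starting state)
```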
Abstract:
Most statistical analysis, in theory and practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time always brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised. When data are gathered sequentially, dynamic interim monitoring may be useful as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme to capture the evolution of both population and individual effects over time. While static models often describe aggregate information well, they often do not reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application to a small-sample, repeated-measures, normally distributed growth-curve dataset is presented.
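The Bayesian updating scheme the abstract invokes can be sketched, under a normal-normal assumption, as a conjugate update followed by a drift step that lets parameters evolve between observational units. Both functions are illustrative, not the dissertation's model.

```python
def update_normal_mean(prior_mean, prior_var, obs, obs_var):
    """Conjugate normal update of a mean after one observation."""
    gain = prior_var / (prior_var + obs_var)
    return prior_mean + gain * (obs - prior_mean), (1.0 - gain) * prior_var

def evolve(post_var, drift_var):
    """Dynamic step: uncertainty grows because parameters may drift."""
    return post_var + drift_var
```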
Abstract:
Digital technologies have profoundly changed not only the ways we create, distribute, access, use and re-use information but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with, and often failed to master, the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies as an influential factor impacting the regulability of the environment and as a potential regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and a lower prioritization of public objectives when strong private commercial interests are at play, most tellingly in the field of copyright. Less tangibly, private ordering has taken hold and captured, through contracts, spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure on conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses the bold effort by Brown and Marsden.
Abstract:
The maintenance of genetic variation in a spatially heterogeneous environment has been one of the main research themes in theoretical population genetics. Despite considerable progress in understanding the consequences of spatially structured environments for genetic variation, many problems remain unsolved. One of them concerns the relationship between the number of demes, the degree of dominance, and the maximum number of alleles that can be maintained by selection in a subdivided population. In this work, we study the potential for maintaining genetic variation in a two-deme model with a deme-independent degree of intermediate dominance, which includes the absence of G x E interaction as a special case. We present a thorough numerical analysis of a two-deme three-allele model, which allows us to identify dominance and selection patterns that harbor the potential for stable triallelic equilibria. The information gained by this approach is then used to construct an example in which the existence and asymptotic stability of a fully polymorphic equilibrium can be proved analytically. Notably, in this example the parameter range in which three alleles can coexist is maximized for intermediate migration rates. Our results can be interpreted in a specialist-generalist context and (among other things) show when two specialists can coexist with a generalist in two demes if the degree of dominance is deme-independent and intermediate. The dominance relation between the generalist allele and the specialist alleles plays a decisive role. We also discuss linear selection on a quantitative trait and show that G x E interaction is not necessary for the maintenance of more than two alleles in two demes.
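A one-generation recursion for the two-deme setting makes the ingredients concrete: diploid selection with deme-independent dominance h and deme-specific selection coefficients, followed by symmetric migration. This two-allele sketch is for illustration only; the paper's analysis concerns a three-allele model.

```python
def step(p1, p2, s1, s2, h, m):
    """Allele-A frequencies after selection, then symmetric migration at rate m.

    Genotype fitnesses in deme i: AA = 1 + s_i, Aa = 1 + h*s_i, aa = 1
    (dominance h is the same in both demes, as in the model above).
    """
    def after_selection(p, s):
        w_aa, w_ab, w_bb = 1.0 + s, 1.0 + h * s, 1.0
        wbar = p * p * w_aa + 2 * p * (1 - p) * w_ab + (1 - p) ** 2 * w_bb
        return (p * p * w_aa + p * (1 - p) * w_ab) / wbar

    q1, q2 = after_selection(p1, s1), after_selection(p2, s2)
    return (1 - m) * q1 + m * q2, (1 - m) * q2 + m * q1
```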
Abstract:
Satellite remote sensing provides a powerful instrument for mapping and monitoring traces of historical settlements and infrastructure, and not only in remote areas and crisis regions. It helps archaeologists to embed their findings from field surveys into the broader context of the landscape. With the start of the TanDEM-X mission, spatially explicit 3D information is available to researchers worldwide at an unprecedented resolution. We examined different experimental TanDEM-X digital elevation models (DEMs) that were processed from two different imaging modes (Stripmap/High Resolution Spotlight) using the operational alternating bistatic acquisition mode. The quality and accuracy of the experimental DEM products were compared to other available DEM products and to a high-precision archaeological field survey. The results indicate the potential of TanDEM-X Stripmap (SM) data for mapping surface elements at the regional scale. For the alluvial plain of Cilicia, a suspected palaeochannel could be reconstructed. At the local scale, DEM products from the TanDEM-X High Resolution Spotlight (HS) mode were processed at 2 m spatial resolution using a merge of two monostatic/bistatic interferograms. The absolute and relative vertical accuracy of the outcome meets the specifications for high resolution elevation (HRE) data in the National System for Geospatial Intelligence (NSG) standard at the HRE20 level.
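Vertical-accuracy comparison against a field survey typically reduces to simple error statistics over checkpoint differences. The sketch below computes RMSE and LE90 (the 90th-percentile absolute error); the thresholds that define the NSG HRE20 level live in the HRE specification and are not reproduced here.

```python
import math

def vertical_errors(dem_heights, survey_heights):
    """RMSE and LE90 (90th-percentile absolute error) of DEM minus survey."""
    diffs = [d - s for d, s in zip(dem_heights, survey_heights)]
    rmse = math.sqrt(sum(e * e for e in diffs) / len(diffs))
    abs_errors = sorted(abs(e) for e in diffs)
    le90 = abs_errors[max(0, math.ceil(0.9 * len(abs_errors)) - 1)]
    return rmse, le90
```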
Abstract:
We present an application- and sample-independent method for the automatic discrimination of noise and signal in optical coherence tomography B-scans. The proposed algorithm models the observed noise probabilistically and allows for a dynamic determination of image noise parameters and the choice of appropriate image rendering parameters. This overcomes observer variability and the need for a priori information about the content of sample images, both of which are challenging to estimate systematically with current systems. As such, our approach has the advantage of automatically determining crucial parameters for evaluating rendered image quality in a systematic and task-independent way. We tested our algorithm on data from four different biological and non-biological samples (index finger, lemon slices, sticky tape, and detector cards) acquired with three different experimental spectral domain optical coherence tomography (OCT) measurement systems, including a swept-source OCT system. The results are compared to parameters determined manually by four experienced OCT users. Overall, our algorithm works reliably regardless of which system and sample are used, and in all cases estimates noise parameters within the confidence interval of those found by the observers.
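One simple instance of separating noise from signal statistically: estimate the noise statistics from a signal-free background region and threshold at a chosen number of standard deviations. Both the background values in the test and the factor k are illustrative choices, not the paper's full algorithm.

```python
import statistics

def noise_threshold(background_pixels, k=3.0):
    """Threshold at mean + k standard deviations of the background noise
    (k is a hypothetical choice, not a value from the paper)."""
    mu = statistics.fmean(background_pixels)
    sigma = statistics.stdev(background_pixels)
    return mu + k * sigma

def is_signal(pixel_value, threshold):
    """Classify a pixel as signal if it exceeds the noise threshold."""
    return pixel_value > threshold
```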
Abstract:
This book attempts to synthesize research that contributes to a better understanding of how to reach sustainable business value through information systems (IS) outsourcing. Important topics in this realm are how IS outsourcing can contribute to innovation, how it can be dynamically governed, how to cope with its increasing complexity through multi-vendor arrangements, how service quality standards can be met, how corporate social responsibility can be upheld, and how to cope with increasing demands of internationalization and new sourcing models, such as crowdsourcing and platform-based cooperation. These issues are viewed from either the client or vendor perspective, or both. The book should be of interest to all academics and students in the fields of Information Systems, Management, and Organization as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.