800 results for information bottleneck method
Abstract:
Despite current enthusiasm for the investigation of gene-gene and gene-environment interactions, the essential issue of how to define and detect gene-environment interactions remains unresolved. In this report, we define gene-environment interactions as a stochastic dependence between the effects of genetic and environmental risk factors on phenotypic variation among individuals. We use mutual information, a measure widely used in communication and complex-system analysis, to quantify gene-environment interactions. We investigate how gene-environment interactions generate a large difference in this information measure between the general population and a diseased population, which motivates us to develop mutual information-based statistics for testing gene-environment interactions. Using extensive simulation studies, we validated the null distribution and calculated the type I error rates of the mutual information-based statistics. We found that the new test statistics were more powerful than traditional logistic regression under several disease models. Finally, to further evaluate the performance of our new method, we applied the mutual information-based statistics to three real examples. Our results showed that the P-values for the mutual information-based statistics were much smaller than those obtained by other approaches, including logistic regression models.
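The central quantity is straightforward to compute from a genotype-by-environment contingency table. Below is a minimal, illustrative Python sketch, not the authors' actual test statistics (which contrast the information measure between case and population samples): it computes the plug-in mutual information estimator and uses the standard identity that 2N·Î (in nats) equals the likelihood-ratio (G) statistic, which is asymptotically chi-squared under independence. The table counts are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

def mutual_information(table):
    """Plug-in mutual information (in nats) of a contingency table of counts."""
    p = table / table.sum()                 # joint distribution
    px = p.sum(axis=1, keepdims=True)       # genotype marginals
    py = p.sum(axis=0, keepdims=True)       # environment marginals
    nz = p > 0                              # avoid log(0) on empty cells
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

# Hypothetical 3x2 table: genotype (AA, Aa, aa) x exposure (no, yes) among cases
cases = np.array([[120, 40],
                  [ 90, 70],
                  [ 30, 50]])

I_hat = mutual_information(cases)
n = cases.sum()
G = 2 * n * I_hat                           # likelihood-ratio (G) statistic
df = (cases.shape[0] - 1) * (cases.shape[1] - 1)
print(f"I = {I_hat:.4f} nats, G = {G:.2f}, p = {chi2.sf(G, df):.3g}")
```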
Abstract:
Purpose: The purpose of the Camp For All Connection project is to facilitate access to electronic health information resources at the Camp For All facility. Setting/Participants/Resources: Camp For All is a barrier-free camp working in partnership with organizations to enrich the lives of children and adults with chronic illnesses and disabilities and their families by providing camping and retreat experiences. The camp facility is located on 206 acres in Burton, Texas. The project partners are Texas Woman's University, Houston Academy of Medicine-Texas Medical Center Library, and Camp For All. Brief Description: The Camp For All Connection project placed Internet-connected workstations at the camp's health center in the main lodge and provided training in the use of electronic health information resources. A train-the-trainer approach was used to train Camp For All staff. Results/Outcome: Project workstations are being used by health care providers and camp staff for communication purposes and to make better-informed health care decisions for Camp For All campers. Evaluation Method: A post-training evaluation was administered at the end of the train-the-trainer session. In addition, a series of site visits and interviews was conducted with camp staff members involved in the project, allowing for ongoing dialog between project staff and project participants.
Abstract:
A nonlinear viscoelastic image registration algorithm based on the demons paradigm and incorporating an inverse consistency constraint (ICC) is implemented. An inverse-consistent and symmetric cost function using mutual information (MI) as the similarity measure is employed. The cost function also includes regularization of the transformation and of the inverse consistency error (ICE). The uncertainty of balancing the various terms in the cost function is avoided by alternately minimizing the similarity measure, the regularization of the transformation, and the ICE term. Diffeomorphic registration, which prevents folding and/or tearing of the deformation, is achieved by a composition scheme. The quality of the registration is first demonstrated by constructing a brain atlas from 20 adult brains (age range 30-60 years). It is shown that with this registration technique (1) the Jacobian determinant is positive for all voxels and (2) the average ICE is around 0.004 voxels with a maximum value below 0.1 voxels. Further, deformation-based segmentation on the Internet Brain Segmentation Repository, a publicly available dataset, yielded a high Dice similarity index (DSI) of 94.7% for the cerebellum and 74.7% for the hippocampus, attesting to the quality of our registration method.
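For reference, the Dice similarity index quoted above is a simple overlap measure between a computed segmentation and reference labels. A minimal sketch with hypothetical masks (not the paper's evaluation code):

```python
import numpy as np

def dice_similarity(a, b):
    """Dice similarity index 2|A∩B| / (|A|+|B|) between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Hypothetical masks: deformation-based segmentation vs. reference labels
seg = np.zeros((64, 64, 64), dtype=bool); seg[20:40, 20:40, 20:40] = True
ref = np.zeros((64, 64, 64), dtype=bool); ref[22:42, 20:40, 20:40] = True
print(f"DSI = {dice_similarity(seg, ref):.3f}")
```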
Abstract:
We developed a novel combinatorial method, termed restriction endonuclease protection selection and amplification (REPSA), to identify consensus binding sites of DNA-binding ligands. REPSA uses a unique enzymatic selection based on the inhibition of cleavage by a type IIS restriction endonuclease, an enzyme that cleaves DNA at a site distal from its recognition sequence. Sequences bound by a ligand are protected from cleavage, while unprotected sequences are cleaved. This enzymatic selection occurs in solution under mild conditions and depends only on the DNA-binding ability of the ligand. Thus, REPSA is useful for a broad range of ligands, including all classes of DNA-binding ligands, weakly binding ligands, mixed populations of ligands, and unknown ligands. Here I describe REPSA and its application to selecting the consensus DNA-binding sequences of three representative DNA-binding ligands: a nucleic acid (triplex-forming single-stranded DNA), a protein (the TATA-binding protein), and a small molecule (Distamycin A). These studies established the DNA-binding sequences of these ligands and generated new information regarding their specificity.
Abstract:
News items reporting self-immolation by Tibetans have been on the increase in recent years. After examining the corpse of a Swiss man who had committed suicide by deliberate self-burning, we wondered how often this occurs in Switzerland. The Federal Statistics Office (FSO) does not register self-burning specifically, so no official national data on this form of suicide are available. However, we had access to the data from a Swiss National Science Foundation (SNSF) project, Suicides in Switzerland between 2000 and 2010, which collected information on all 4885 cases of suicide investigated by the various institutes of forensic medicine. From this data pool we extracted 50 cases (1.02%) of suicide by self-burning, in order to determine the details and to identify the possible reasons for choosing this method. To put our results in the context of studies from other countries, we searched the literature for studies that had also retrospectively examined suicide by self-immolation based on forensic records. Our results showed that, on the whole, the personal aspects of self-burning in Switzerland do not differ from those in other industrialised nations. Some data, including religious and sociocultural background, were unfortunately missing – not only from our study but also from the similar ones. In our opinion, the most important prevention strategy is to make healthcare professionals more aware of this rare method of suicide.
Abstract:
Perceived duration is assumed to be positively related to nontemporal stimulus magnitude. Recently, the finding that larger stimuli are perceived to last longer has been challenged as representing a mere decisional bias induced by the use of comparative duration judgments. Therefore, in the present study, the method of temporal reproduction was applied as a psychophysical procedure to quantify perceived duration. Another major goal was to investigate the influence of attention on the effect of visual stimulus size on perceived duration. For this purpose, an additional dual-task paradigm was employed. Our results not only converged with previous findings in demonstrating a functional positive relationship between nontemporal stimulus size and perceived duration, but also showed that the effect of stimulus size on perceived duration is not confined to comparative duration judgments. Furthermore, the effect of stimulus size proved to be independent of the attentional resources allocated to stimulus size; nontemporal visual stimulus information does not need to be processed intentionally to influence perceived duration. Finally, the effect of nontemporal stimulus size on perceived duration was effectively modulated by the duration of the target intervals, suggesting a hitherto largely unrecognized role of temporal context in the effect of nontemporal stimulus size.
Abstract:
We present an application- and sample-independent method for the automatic discrimination of noise and signal in optical coherence tomography (OCT) B-scans. The proposed algorithm models the observed noise probabilistically and allows for dynamic determination of image noise parameters and the choice of appropriate image rendering parameters. This overcomes observer variability and the need for a priori information about the content of sample images, both of which are challenging to estimate systematically with current systems. As such, our approach has the advantage of automatically determining crucial parameters for evaluating rendered image quality in a systematic and task-independent way. We tested our algorithm on data from four different biological and nonbiological samples (index finger, lemon slices, sticky tape, and detector cards) acquired with three different experimental spectral-domain OCT measurement systems, including a swept-source OCT. The results are compared to parameters determined manually by four experienced OCT users. Overall, our algorithm works reliably regardless of which system and sample are used, and in all cases estimates noise parameters within the confidence interval of those found by the observers.
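The abstract does not state the noise model, but OCT magnitude noise is often taken to be approximately Rayleigh-distributed; the sketch below illustrates the general idea of estimating noise parameters from data and deriving a display floor from the fitted distribution. It is a hedged sketch under that Rayleigh assumption, with synthetic data, not the authors' algorithm.

```python
import numpy as np

def fit_rayleigh(noise_samples):
    """Maximum-likelihood Rayleigh scale from magnitude noise samples."""
    return np.sqrt(np.mean(np.asarray(noise_samples, float) ** 2) / 2.0)

def noise_floor(sigma, quantile=0.99):
    """Intensity below which a pixel is noise with the given probability
    (inverse Rayleigh CDF: x = sigma * sqrt(-2 ln(1 - q)))."""
    return sigma * np.sqrt(-2.0 * np.log(1.0 - quantile))

rng = np.random.default_rng(0)
bscan = rng.rayleigh(scale=4.0, size=(512, 1024))  # synthetic noise-only B-scan
sigma = fit_rayleigh(bscan)
print(f"sigma = {sigma:.2f}, display floor (99%) = {noise_floor(sigma):.2f}")
```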
Abstract:
Many technological developments of the past two decades come with the promise of greater IT flexibility, i.e. greater capacity to adapt IT. These technologies are increasingly used to improve organizational routines that are not affected by large, hard-to-change IT such as ERP. Yet most findings on the interaction of routines and IT stem from contexts where IT is hard to change. Our research explores how routines and IT co-evolve when IT is flexible. We review the literature on routines to suggest that IT may act as a boundary object that mediates the learning process unfolding between the ostensive and the performative aspects of the routine. Although prior work has concluded from such conceptualizations that IT stabilizes routines, we qualify that flexible IT can also stimulate change because it enables learning in short feedback cycles. We suggest, however, that such change might not always materialize because it is contingent on governance choices and technical knowledge. We describe the case-study method used to explore how routines and flexible IT co-evolve and how governance and technical knowledge influence this process. We expect to contribute towards a stronger theory of routines and to develop recommendations for the effective implementation of flexible IT in loosely coupled routines.
Abstract:
BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. Reuse of individual health-related data faces several problems: either a unique personal identifier, such as a social security number, is not available, or non-unique person-identifiable information, such as names, is privacy protected and cannot be accessed. A solution that protects privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, encrypted hash codes of two names differ completely if the plain names differ by even a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS The P3RL method applies a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information (i.e. data structure) needed to create the templates without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enable the calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with encrypted person-identifiable information and plain non-sensitive variables. RESULTS In this paper we describe step by step how to link existing health-related data using encryption methods that preserve the privacy of the persons in the study. CONCLUSION Privacy Preserving Probabilistic Record Linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. This method is suitable not just for epidemiological research but for any setting with similar challenges.
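A minimal sketch of the Bloom-filter comparison the abstract describes: each name is split into bigrams, each bigram is hashed by several keyed hash functions into a fixed-length bit set, and two encodings are compared with a Dice similarity coefficient, so near-matches such as "meier"/"meyer" still score highly even though their exact hashes would differ completely. The parameters m and k and the shared key are illustrative, not P3RL's actual settings.

```python
import hashlib
import hmac

def bloom_encode(name, m=1000, k=20, secret=b"shared-site-key"):
    """Encode a name's bigrams into an m-bit Bloom filter via k keyed hashes."""
    padded = f"_{name.lower()}_"
    bigrams = {padded[i:i + 2] for i in range(len(padded) - 1)}
    bits = set()
    for gram in bigrams:
        for i in range(k):
            digest = hmac.new(secret, f"{i}:{gram}".encode(), hashlib.sha256)
            bits.add(int(digest.hexdigest(), 16) % m)  # position of a set bit
    return bits

def dice(a, b):
    """Dice similarity of two Bloom filters (sets of set-bit positions)."""
    return 2.0 * len(a & b) / (len(a) + len(b))

print(dice(bloom_encode("meier"), bloom_encode("meyer")))  # similar names: high
print(dice(bloom_encode("meier"), bloom_encode("smith")))  # different names: low
```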
Abstract:
Background: A prerequisite for high performance in motor tasks is the acquisition of egocentric sensory information that must be translated into motor actions. A phenomenon that supports this process is the Quiet Eye (QE), defined as the long final fixation before movement initiation. It is assumed that the QE facilitates information processing, particularly regarding movement parameterization. Aims: The question remains whether this facilitation also holds for the information-processing stage of response selection and for the stage of stimulus identification, which is crucial for perception. Method: In two experiments with sport science students, performance-enhancing effects of experimentally manipulated QE durations were tested as a function of target position predictability and target visibility, thereby selectively manipulating response selection and stimulus identification demands, respectively. Results: The results support the hypothesis of facilitated information processing through long QE durations, since in both experiments performance-enhancing effects of long QE durations were found only under increased processing demands. In Experiment 1, QE duration affected performance only if the target position was not predictable and positional information had to be processed over the QE period. In Experiment 2, in a comparison of full vs. no target visibility with saccades to the upcoming target position induced by flicker cues, the functionality of a long QE duration depended on the visual stimulus identification period as soon as that interval fell below a certain threshold. Conclusions: The results corroborate earlier findings that QE efficiency depends on the demands placed on the visuomotor system, thereby supporting the assumption that the phenomenon aids the processes of sensorimotor integration.
Abstract:
Indoor positioning has become an emerging research area because of huge commercial demand for location-based services in indoor environments. Channel State Information (CSI), a fine-grained physical-layer measurement, has recently been proposed to achieve high positioning accuracy using range-based methods, e.g., trilateration. In this work, we propose to fuse CSI-based ranges with velocity estimates from inertial sensors via an enhanced particle filter to achieve highly accurate tracking. The algorithm relies on enhanced ranging methods and further mitigates the remaining ranging errors through a weighting technique. Additionally, we provide an efficient method to estimate velocity from inertial sensors. The algorithms are designed in a network-based system, which uses rather cheap commercial devices as anchor nodes. We evaluate our system in a complex environment along three different moving paths. Our proposed tracking method achieves a mean accuracy of 1.3 m and a 90th-percentile accuracy of 2.2 m, which is more accurate and stable than pedestrian dead reckoning and range-based positioning.
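A generic bootstrap particle filter conveys the fusion idea: particles are propagated with the inertial velocity estimate and reweighted by the likelihood of the CSI-based ranges to known anchors. The paper's enhanced weighting scheme and ranging corrections are not reproduced; the anchor layout, noise levels, and measurements below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])  # anchor positions (m)
N = 1000
particles = rng.uniform([0, 0], [10, 8], size=(N, 2))      # initial position guesses

def step(particles, velocity, ranges, dt=0.5, vel_sigma=0.3, range_sigma=1.0):
    """One predict/update/resample cycle fusing inertial velocity and ranges."""
    # Predict: move particles by the velocity estimate plus process noise
    particles = particles + velocity * dt + rng.normal(0, vel_sigma, particles.shape)
    # Update: Gaussian likelihood of each measured anchor range
    d = np.linalg.norm(particles[:, None, :] - anchors[None, :, :], axis=2)
    w = np.exp(-0.5 * ((d - ranges) / range_sigma) ** 2).prod(axis=1) + 1e-300
    w /= w.sum()
    # Resample to concentrate particles on likely positions
    return particles[rng.choice(N, size=N, p=w)]

particles = step(particles, velocity=np.array([0.8, 0.2]),
                 ranges=np.array([4.2, 6.9, 5.1]))
print("position estimate:", particles.mean(axis=0))
```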
Abstract:
The concentration of 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THCCOOH) in whole blood is used as a parameter for assessing the consumption behavior of cannabis users. The blood level of THCCOOH-glucuronide might provide additional information about the frequency of cannabis use. To verify this assumption, a column-switching liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the rapid and direct quantification of free and glucuronidated THCCOOH in human whole blood was developed. The method comprised protein precipitation, followed by injection of the processed sample onto a trapping column and subsequent gradient elution to an analytical column for separation and detection. The total LC run time was 4.5 min. Detection of the analytes was accomplished by electrospray ionization in positive ion mode and selected reaction monitoring on a triple-stage quadrupole mass spectrometer. The method was fully validated by evaluating the following parameters: linearity, lower limit of quantification, accuracy and imprecision, selectivity, extraction efficiency, matrix effect, carry-over, dilution integrity, analyte stability, and re-injection reproducibility. All predefined acceptance criteria were met. Linearity ranged from 5.0 to 500 μg/L for both analytes. The method was successfully applied to whole blood samples from a large collective of cannabis consumers, demonstrating its applicability in the forensic field.
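Calibration linearity of the kind validated here is commonly checked by fitting a weighted line to nominal concentration versus instrument response and back-calculating each level; a hedged sketch with made-up numbers (the paper's calibration data are not given in the abstract):

```python
import numpy as np

# Hypothetical THCCOOH calibrators: nominal concentration (ug/L) vs. peak-area ratio
conc  = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 250.0, 500.0])
ratio = np.array([0.021, 0.043, 0.104, 0.212, 0.418, 1.050, 2.090])

# 1/x-weighted least squares, a common choice in bioanalytical calibration
slope, intercept = np.polyfit(conc, ratio, 1, w=np.sqrt(1.0 / conc))
back = (ratio - intercept) / slope            # back-calculated concentrations
print(np.round(100.0 * back / conc, 1))       # accuracy per level (%)
```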
Abstract:
Five test runs were performed to assess possible bias when using the loss-on-ignition (LOI) method to estimate the organic matter and carbonate content of lake sediments. An accurate and stable weight loss was achieved after 2 h of burning pure CaCO3 at 950 °C, whereas LOI of pure graphite at 530 °C showed a direct relation to sample size and exposure time, with only 40-70% of the possible weight loss reached after 2 h of exposure and smaller samples losing weight faster than larger ones. Experiments with a standardised lake sediment revealed a strong initial weight loss at 550 °C, but samples continued to lose weight at a slow rate at exposures of up to 64 h, likely the effect of loss of volatile salts, structural water of clay minerals or metal oxides, or of inorganic carbon after the initial burning of organic matter. A further test run revealed that at 550 °C samples in the centre of the furnace lost more weight than marginal samples. At 950 °C this pattern was still apparent, but the differences became negligible. Again, LOI was dependent on sample size. An analytical LOI quality-control experiment involving ten different laboratories was carried out using each laboratory's own LOI procedure as well as a standardised LOI procedure to analyse three different sediments. The range of LOI values between laboratories measured at 550 °C was generally larger when each laboratory used its own method than when using the standard method. This was similar for 950 °C, although the range of values tended to be smaller. The within-laboratory range of LOI measurements for a given sediment was generally small. Comparison of the results of the individual and the standardised methods suggests that there is a laboratory-specific pattern in the results, probably due to differences in laboratory equipment and/or handling that could not be eliminated by standardising the LOI procedure. Factors such as sample size, exposure time, position of samples in the furnace and the laboratory measuring affected the LOI results, with LOI at 550 °C being more susceptible to these factors than LOI at 950 °C. We therefore recommend that analysts be consistent in the LOI method used with respect to ignition temperatures, exposure times, and sample size, and that they include information on these three parameters when referring to the method.
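For reference, the loss-on-ignition quantities discussed above are conventionally computed from the dry weights before and after each ignition step (a standard formulation, not quoted from this abstract):

```latex
% Organic-matter proxy: weight lost between drying at 105 °C and ignition at 550 °C
\mathrm{LOI}_{550} = \frac{DW_{105} - DW_{550}}{DW_{105}} \times 100\,\%
% Carbonate proxy: CO2 evolved between 550 °C and 950 °C, relative to initial dry weight
\mathrm{LOI}_{950} = \frac{DW_{550} - DW_{950}}{DW_{105}} \times 100\,\%
```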
What's the best method? Comparison of different short forms of the Pathological Narcissism Inventory
Abstract:
Recent research emphasizes the various facets of narcissism. As a consequence, newly developed questionnaires for narcissism have a large number of subscales and items. However, for daily use in research and practice, short measures are crucial. In this study we compare different short forms of the Pathological Narcissism Inventory, a 54-item measure with seven subscales. In different samples (total N > 2000) we applied different theoretical models to construct short forms of approximately 20 items. In particular, we compared short forms based on IRT, item-total correlations, and factor loadings with versions based on content validity and random selection. In all versions the original subscale structure was preserved. Results show that the short forms all correlate highly with the original version. Furthermore, correlations with criterion validation measures were comparable. We conclude that the number of items can be reduced substantially without losing information. Pros and cons of the different reduction methods are discussed.
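One of the reduction strategies compared above, selection by item-total correlation, can be sketched in a few lines; the subscale structure is preserved by selecting the best items within each subscale. The data, subscale names, and per-subscale item budget are hypothetical, and the IRT- and factor-loading-based variants are not shown.

```python
import numpy as np

def short_form(responses, subscales, k=3):
    """Keep the k items per subscale with the highest corrected
    item-total correlation (item vs. sum of the remaining subscale items)."""
    keep = {}
    for name, items in subscales.items():
        scored = []
        for i in items:
            rest = [j for j in items if j != i]           # corrected total
            total = responses[:, rest].sum(axis=1)
            scored.append((np.corrcoef(responses[:, i], total)[0, 1], i))
        keep[name] = sorted(i for _, i in sorted(scored, reverse=True)[:k])
    return keep

# Hypothetical data: 500 respondents, 6 items over 2 subscales
rng = np.random.default_rng(2)
responses = rng.integers(0, 6, size=(500, 6)).astype(float)
print(short_form(responses, {"grandiosity": [0, 1, 2], "vulnerability": [3, 4, 5]}, k=2))
```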
Clinical Evaluation of a Fully-automatic Segmentation Method for Longitudinal Brain Tumor Volumetry.
Abstract:
Information about the size of a tumor and its temporal evolution is needed for the diagnosis and treatment of brain tumor patients. The aim of this study was to investigate the potential of a fully-automatic segmentation method, called BraTumIA, for longitudinal brain tumor volumetry by comparing the automatically estimated volumes with ground truth data acquired via manual segmentation. Longitudinal Magnetic Resonance (MR) imaging data of 14 patients with newly diagnosed glioblastoma, encompassing 64 MR acquisitions ranging from preoperative up to 12-month follow-up images, were analysed. Manual segmentation was performed by two human raters. Strong correlations (R = 0.83-0.96, p < 0.001) were observed between the volumetric estimates of BraTumIA and those of each human rater for the contrast-enhancing (CET) and non-enhancing T2-hyperintense (NCE-T2) tumor compartments. A quantitative analysis of the inter-rater disagreement showed that the disagreement between BraTumIA and each human rater was comparable to the disagreement between the two human raters. In summary, BraTumIA generated volumetric trend curves of the contrast-enhancing and non-enhancing T2-hyperintense tumor compartments comparable to the estimates of the human raters. These findings suggest the potential of automated longitudinal tumor segmentation to substitute for manual volumetric follow-up of these tumor compartments.
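The underlying volumetric estimate is simply the voxel count of a binary tumor mask scaled by the voxel volume. A minimal sketch with hypothetical masks and voxel size (not BraTumIA's implementation):

```python
import numpy as np

def tumor_volume_ml(mask, voxel_size_mm=(1.0, 1.0, 1.0)):
    """Volume of a binary tumor mask in millilitres (1 ml = 1000 mm^3)."""
    return mask.astype(bool).sum() * float(np.prod(voxel_size_mm)) / 1000.0

# Hypothetical masks from the automatic method and from a human rater
auto  = np.zeros((128, 128, 64), dtype=bool); auto[40:70, 40:70, 20:40] = True
rater = np.zeros((128, 128, 64), dtype=bool); rater[42:72, 40:70, 20:40] = True
print(tumor_volume_ml(auto), tumor_volume_ml(rater))  # ml per time point
```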