62 results for Task-based information access
in CentAUR: Central Archive University of Reading - UK
Abstract:
Accessing information that is spread across multiple sources in a structured and connected way is a general problem for enterprises. A unified structure for knowledge representation is urgently needed to enable the integration of heterogeneous information resources. Topic Maps appear to offer a solution to this problem: Topic Map technology enables information to be connected, through concepts, relationships and their occurrences, across multiple systems. In this paper, we address this problem by describing a framework built on topic maps to support current knowledge management needs. New approaches for information integration, intelligent search and topic map exploration are introduced within this framework.
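To make the Topic Map idea concrete, the sketch below (Python, purely illustrative) models topics, an association between them and occurrences that point into different enterprise systems; the class names, resource identifiers and systems are assumptions for illustration, not the framework described in the paper.

# Minimal sketch of the Topic Maps idea: topics (concepts), associations
# (relationships between concepts) and occurrences (pointers to resources in
# different enterprise systems). All names and identifiers are illustrative.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Occurrence:
    resource: str   # where the information lives (URL, file, record id)
    system: str     # which enterprise system holds it

@dataclass
class Topic:
    name: str
    occurrences: List[Occurrence] = field(default_factory=list)

@dataclass
class Association:
    kind: str
    members: Tuple[Topic, ...]   # topics playing a role in the relationship

# Two concepts held in different systems, connected by one association.
customer = Topic("Customer", [Occurrence("crm://records/4711", "CRM")])
invoice = Topic("Invoice", [Occurrence("erp://invoices/2023-017", "ERP")])
billed_to = Association("billed-to", (invoice, customer))

# Navigating the map answers: where is everything about this concept?
for topic in (customer, invoice):
    for occ in topic.occurrences:
        print(f"{topic.name}: {occ.resource} ({occ.system})")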
Abstract:
Healthcare organizations are known for their complex and intense information environment. Healthcare information is held in heterogeneous information systems or in paper-based sources, and access to the right information under increasing time pressure is extremely challenging. This paper proposes an information architecture for healthcare organizations that facilitates the provision of the right information to the right person in the right place and at the right time, tailored to their requirements. The study adopts an abductive reasoning research approach, with organizational semiotics as its theoretical underpinning, guiding data collection through direct observation in the ophthalmology outpatient clinics of a UK hospital. The result is the set of norms and information objects that form the information architecture, which is modelled in ArchiMate. The contribution of the information architecture can be seen from organizational, social and technical perspectives: it shows clearly how information is facilitated within a healthcare organization, reduces duplicated data entry, and guides future technological implementation.
Abstract:
Previous studies of the Stroop task propose two key mediators, the prefrontal and cingulate cortices, but there are hints of functional specialization within these regions. This study aimed to examine the effect of task modality upon the prefrontal and cingulate response by examining the response to colour, number and shape Stroop tasks whilst BOLD fMRI images were acquired on a Siemens 3 T MRI scanner. Behavioural analyses indicated facilitation and interference effects and a noticeable effect of task difficulty. Some modular effects of modality were observed in the prefrontal cortex that survived exclusion of task-difficulty-related activations. No effect of task-relevant information was observed in the anterior cingulate. Future comparisons of the mediation of selective attention need to consider the effects of task context and task difficulty. (c) 2005 Elsevier Inc. All rights reserved.
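For readers unfamiliar with how facilitation and interference effects are usually quantified, the short sketch below shows the standard arithmetic relative to a neutral condition; the reaction times are invented for illustration and are not data from this study.

# Hedged illustration: Stroop facilitation and interference are commonly
# defined relative to a neutral condition. The reaction times below are
# invented solely to show the arithmetic, not results from the paper.
mean_rt_ms = {"congruent": 620.0, "neutral": 655.0, "incongruent": 748.0}

facilitation = mean_rt_ms["neutral"] - mean_rt_ms["congruent"]    # faster when stimulus and response dimensions agree
interference = mean_rt_ms["incongruent"] - mean_rt_ms["neutral"]  # slower when they conflict

print(f"Facilitation effect: {facilitation:.0f} ms")
print(f"Interference effect: {interference:.0f} ms")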
Abstract:
Given that next- and current-generation networks will coexist for a considerable period of time, it is important to improve the performance of existing networks. One improvement recently proposed is to enhance the throughput of ad hoc networks by using dual-hop relay-based transmission schemes. Since throughput in ad hoc networks is normally related to energy consumption, it is important to examine the impact of relay-based transmissions on energy consumption. In this paper, we present an analytical energy consumption model for dual-hop relay-based medium access control (MAC) protocols. Based on the recently reported relay-enabled Distributed Coordination Function (rDCF), we show the efficacy of the proposed analytical model. This is a generalized model that can be used to predict energy consumption in saturated relay-based ad hoc networks, both in an ideal environment and in the presence of transmission errors. It is shown that using a relay results not only in better throughput but also in better energy efficiency. Copyright (C) 2009 Rizwan Ahmad et al.
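The abstract does not reproduce the analytical model itself, but the sketch below captures the intuition for why relaying can improve energy efficiency: a weak direct link forces a low data rate and long airtime, while two strong hops can transmit faster. The rates, powers and the omission of MAC overheads (backoff, control frames, collisions) are simplifying assumptions, not the rDCF model from the paper.

# Hedged sketch: energy to deliver one frame over a weak direct link versus
# two strong relay hops. Numbers are assumptions for illustration only.
def frame_energy(payload_bits, hops):
    """Energy (joules) to deliver one frame over the given hops.
    hops: list of (data_rate_bps, tx_power_watts), one entry per transmission."""
    return sum(power * payload_bits / rate for rate, power in hops)

payload = 12_000 * 8  # one 12 kB frame

direct = [(1e6, 1.0)]                   # weak direct link at 1 Mbit/s
relayed = [(11e6, 1.0), (11e6, 1.0)]    # two strong hops at 11 Mbit/s each

for name, hops in (("direct", direct), ("relayed", relayed)):
    e = frame_energy(payload, hops)
    print(f"{name:7s}: {e:.3f} J per frame, {e / payload:.2e} J per bit")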
Abstract:
This paper uses an entropy-based information approach to determine whether farmland values are more closely associated with urban pressure or with farm income. The basic question is: how much information about changes in farm real estate values is contained in changes in population versus changes in returns to production agriculture? Results suggest that population is informative, but changes in farmland values are more strongly associated with changes in the distribution of returns. However, this relationship does not hold for every region or over all time periods: for some regions and periods, changes in population are more informative. The results have policy implications for both equity and efficiency.
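As one way of reading "entropy-based information approach", the sketch below estimates the mutual information between changes in farmland values and, respectively, changes in returns and changes in population; the plug-in histogram estimator and the synthetic data are assumptions for illustration, and the estimator used in the paper may differ.

# Hedged sketch: how much information do changes in population versus changes
# in farm returns carry about changes in farmland values? Mutual information
# is used here as one such measure; the data are synthetic.
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in mutual information (nats) from a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return float((pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])).sum())

rng = np.random.default_rng(0)
d_returns = rng.normal(size=500)
d_population = rng.normal(size=500)
d_land_value = 0.8 * d_returns + 0.3 * d_population + rng.normal(scale=0.5, size=500)

print("MI(value, returns)   :", round(mutual_information(d_land_value, d_returns), 3))
print("MI(value, population):", round(mutual_information(d_land_value, d_population), 3))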
Abstract:
This paper presents an investigation into learners’ and teachers’ perceptions of and criteria for task difficulty. Ten second language learners performed four oral narrative tasks and were retrospectively interviewed about which tasks they perceived as difficult, what factors affected this difficulty and how they identified and defined this task difficulty. Ten EFL/ESOL teachers were given the same tasks and asked to consider the difficulty of the tasks for their learners, and were invited to discuss the factors they believed contributed to this difficulty. Qualitative analysis of the data revealed that, although there were some differences between the two groups’ perceptions of task difficulty, there was substantial similarity between them in terms of the criteria they considered in identifying and defining task difficulty. The findings of this study lend support to the tenets of a cognitive approach to task-based language learning, and demonstrate which aspects of two models of task difficulty reflect the teachers’ and learners’ perceptions and perspectives.
Abstract:
Introduction. Feature usage is a pre-requisite to realising the benefits of investments in feature rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. Having identified core features from the literature, they are further refined with the help of experts in a consensus seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context sensitive literature review followed by refinement through a consensus seeking process is a suitable methodology to validate the content of this dependent variable. This is the first step of instrument development prior to statistical confirmation with a larger sample.
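To illustrate what a formative 'level of use' variable looks like in practice, the sketch below aggregates feature-usage indicators into a single index; the feature names and weights are placeholders, since the ten task-based features chosen by the Delphi panel are not listed in the abstract.

# Hedged sketch of a formative 'level of use' measure: the construct is
# formed by its indicators (feature-usage items) rather than reflected by
# them. Feature names and weights below are placeholders only.
from dataclasses import dataclass

@dataclass
class FeatureUsage:
    feature: str
    used: bool            # did the physician use the feature in the period?
    weight: float = 1.0   # formative indicators may be weighted unequally

def level_of_use(indicators):
    """Formative index: weighted share of features actually used."""
    total = sum(i.weight for i in indicators)
    return sum(i.weight for i in indicators if i.used) / total

physician = [
    FeatureUsage("e-prescribing", True),
    FeatureUsage("problem list maintenance", True),
    FeatureUsage("clinical decision support alerts", False, weight=1.5),
    FeatureUsage("secure messaging with patients", False),
]
print(f"Level of use: {level_of_use(physician):.2f}")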
Abstract:
This paper describes a case study of an electronic data management system developed in-house by the Facilities Management Directorate (FMD) of an educational institution in the UK. The FMD Maintenance and Business Services department is responsible for the maintenance of the built estate owned by the university. The department needs a clear definition of the type of work undertaken and of the administration that enables any maintenance work to be carried out. These include the management of resources, budget, cash flow and the workflow of reactive, preventative and planned maintenance of the campus. In order to support the business process more efficiently, the FMD decided to move from a paper-based information system to an electronic system, WREN. Some of the main advantages of WREN are that it is tailor-made to fit the purpose of its users, it is cost effective when it comes to modifications of the system, and its database can also be used as a knowledge management tool. There is a trade-off: as WREN is tailored to the specific requirements of the FMD, it may not be easy to implement within a different institution without extensive modifications. However, WREN not only allows the FMD to carry out the tasks of maintaining and looking after the built estate of the university, but has also achieved its aim of minimising costs and maximising efficiency.
Abstract:
The field site network (FSN) plays a central role in conducting joint research within all Assessing Large-scale Risks for biodiversity with tested Methods (ALARM) modules and provides a mechanism for integrating research on different topics in ALARM at the same sites, allowing multiple impacts on biodiversity to be measured. The network covers most European climates and biogeographic regions, from Mediterranean through central European and boreal to subarctic. The project links databases with the Europe-wide field site network, including geographic information system (GIS)-based information that characterises the test locations for ALARM researchers for joint on-site research. Maps are provided in a standardised way and merged with other site-specific information. The application of GIS to these field sites and the associated information management promote the use of the FSN for research and for disseminating the results. We conclude that ALARM FSN sites, together with other research sites in Europe, could jointly be used as a future backbone for research proposals.
A hierarchical Bayesian model for predicting the functional consequences of amino-acid polymorphisms
Abstract:
Genetic polymorphisms in deoxyribonucleic acid coding regions may have a phenotypic effect on the carrier, e.g. by influencing susceptibility to disease. Detection of deleterious mutations via association studies is hampered by the large number of candidate sites; therefore methods are needed to narrow down the search to the most promising sites. For this, a possible approach is to use structural and sequence-based information of the encoded protein to predict whether a mutation at a particular site is likely to disrupt the functionality of the protein itself. We propose a hierarchical Bayesian multivariate adaptive regression spline (BMARS) model for supervised learning in this context and assess its predictive performance by using data from mutagenesis experiments on lac repressor and lysozyme proteins. In these experiments, about 12 amino-acid substitutions were performed at each native amino-acid position and the effect on protein functionality was assessed. The training data thus consist of repeated observations at each position, which the hierarchical framework is needed to account for. The model is trained on the lac repressor data and tested on the lysozyme mutations and vice versa. In particular, we show that the hierarchical BMARS model, by allowing for the clustered nature of the data, yields lower out-of-sample misclassification rates compared with both a BMARS and a frequentist MARS model, a support vector machine classifier and an optimally pruned classification tree.
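The BMARS model itself involves adaptive spline basis selection and is beyond a short sketch, but the key point of the hierarchy, accounting for repeated substitutions at the same amino-acid position, can be illustrated with a much simpler hierarchical logistic regression. The sketch below uses PyMC with synthetic data; the features, priors and sample sizes are assumptions for illustration, and this is not the model from the paper.

# Hedged sketch: a hierarchical logistic regression with position-level
# random intercepts, illustrating how repeated observations at each
# amino-acid position can be handled. Data and features are synthetic.
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_positions, subs_per_position = 40, 12
position = np.repeat(np.arange(n_positions), subs_per_position)
X = rng.normal(size=(position.size, 2))           # e.g. structural + sequence features
true_pos_effect = rng.normal(scale=1.0, size=n_positions)
logits = 0.8 * X[:, 0] - 1.2 * X[:, 1] + true_pos_effect[position]
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))    # 1 = substitution disrupts function

with pm.Model() as model:
    beta = pm.Normal("beta", 0.0, 1.0, shape=2)            # feature effects
    sigma_pos = pm.HalfNormal("sigma_pos", 1.0)            # spread of position effects
    alpha_pos = pm.Normal("alpha_pos", 0.0, sigma_pos, shape=n_positions)
    eta = pm.math.dot(X, beta) + alpha_pos[position]       # shared intercept per position
    pm.Bernoulli("y", logit_p=eta, observed=y)
    idata = pm.sample(500, tune=500, chains=2, progressbar=False)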
Abstract:
Determining rat preferences for, and behaviour towards, environmental enrichment objects allows us to provide evidence-based information about how the caged environment may be enriched. In recent years there have been many studies investigating the preferences of laboratory rodents for a wide variety of environmental enrichment objects and materials. While these have provided important information regarding the animals' perception of the items, very few studies have attempted to systematically investigate the precise attributes that constitute a preferred object and the behaviour that these objects afford. We have designed a research program to systematically study rats' motivation to interact with enrichment objects. Here we present the results from two experiments which examined the time rats spent with objects that differed only in size. Rats spent longer with large objects than with small ones, even though the objects were presented individually. We also investigated the rats' behaviour with the objects in an open field and found that rats spent longer climbing on top of the large object. This behaviour continued when the large objects were laid on their sides instead of being placed upright in the arena, suggesting that the rats were not simply climbing on the objects to investigate the top of the arena, and thus an escape route, but were genuinely motivated to climb. This suggests that rat welfare could be enhanced by adding objects that permit climbing to their cages. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
A new robust neurofuzzy model construction algorithm has been introduced for the modeling of a priori unknown dynamical systems from observed finite data sets in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspace to enhance model transparency. In order to achieve maximal model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method has been introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, so as to enhance model transparency with the capability of interpreting the derived rule-base energy levels. A locally regularized orthogonal least squares algorithm, combined with a D-optimality criterion for subspace-based rule selection, has been extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
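The sketch below shows, under simplifying assumptions, the general structure of such a forward selection: classical Gram-Schmidt orthogonalization of candidate regressors, a regularised error-reduction ratio, and an additive D-optimality term in the selection score. The exact cost function, weighting and fuzzy-rule regressors in the paper may differ; the data here are synthetic.

# Hedged sketch of forward orthogonal least squares with Gram-Schmidt
# orthogonalization, a regularised error-reduction ratio and a simple
# D-optimality reward. Illustrative only; not the algorithm from the paper.
import numpy as np

def forward_ols(P, y, n_select, lam=1e-3, beta=1e-2):
    """Select n_select columns of the candidate matrix P to model y."""
    n, m = P.shape
    selected = []
    residual_basis = P.astype(float).copy()
    yty = float(y @ y)
    for _ in range(n_select):
        best_score, best_i = -np.inf, None
        for i in range(m):
            if i in selected:
                continue
            w = residual_basis[:, i]
            wtw = float(w @ w)
            if wtw < 1e-12:
                continue                          # candidate already (almost) in the selected subspace
            g = float(w @ y) / (wtw + lam)        # locally regularised coefficient
            rerr = g * g * (wtw + lam) / yty      # regularised error-reduction ratio
            score = rerr + beta * np.log(wtw)     # D-optimality term rewards well-conditioned designs
            if score > best_score:
                best_score, best_i = score, i
        w = residual_basis[:, best_i].copy()
        selected.append(best_i)
        # Classical Gram-Schmidt step: remove the chosen direction from the
        # remaining candidates so later scores use only the orthogonal parts.
        wtw = float(w @ w)
        for j in range(m):
            if j not in selected:
                residual_basis[:, j] -= (w @ residual_basis[:, j]) / wtw * w
    return selected

rng = np.random.default_rng(0)
P = rng.normal(size=(200, 10))                     # candidate fuzzy-rule regressors
y = 2.0 * P[:, 3] - 1.5 * P[:, 7] + 0.1 * rng.normal(size=200)
print("selected regressors:", forward_ols(P, y, n_select=2))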