19 results for Context information
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
The need to merge multiple sources of uncertain information is an important issue in many application areas, especially when there is potential for contradictions between sources. Possibility theory offers a flexible framework to represent, and reason with, uncertain information, and there is a range of merging operators, such as the conjunctive and disjunctive operators, for combining information. However, with the proposals to date, the context of the information to be merged is largely ignored during the process of selecting which merging operators to use. To address this shortcoming, in this paper we propose an adaptive merging algorithm which selects largely partially maximal consistent subsets (LPMCSs) of sources, that can be merged through relaxation of the conjunctive operator, by assessing the coherence of the information in each subset. In this way, a fusion process can integrate both conjunctive and disjunctive operators in a more flexible manner and thereby be more context dependent. A comparison with related merging methods shows how our algorithm can produce a more consensual result.
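To make the operators concrete, the following is a minimal Python sketch, assuming possibility distributions are represented as dictionaries mapping possible worlds to degrees in [0, 1]; it shows the standard min-based conjunctive operator, the max-based disjunctive operator and the height-based degree of consistency. The grouping heuristic at the end is purely illustrative and is not the paper's LPMCS algorithm.

# Possibility distributions over a finite domain: dicts mapping each world
# to a possibility degree in [0, 1].

def conjunctive_merge(dists):
    """Standard min-based conjunctive combination."""
    worlds = set().union(*dists)
    return {w: min(d.get(w, 0.0) for d in dists) for w in worlds}

def disjunctive_merge(dists):
    """Standard max-based disjunctive combination."""
    worlds = set().union(*dists)
    return {w: max(d.get(w, 0.0) for d in dists) for w in worlds}

def consistency(dist):
    """Degree of consistency: the height of the distribution."""
    return max(dist.values(), default=0.0)

def group_sources(dists, threshold=0.8):
    """Illustrative grouping heuristic (not the paper's LPMCS algorithm):
    greedily grow a subset while its conjunctive merge stays coherent;
    the disjunctive operator would then be applied across groups."""
    groups, current = [], []
    for d in dists:
        candidate = current + [d]
        if not current or consistency(conjunctive_merge(candidate)) >= threshold:
            current = candidate
        else:
            groups.append(current)
            current = [d]
    if current:
        groups.append(current)
    return groups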
Abstract:
There has been much interest in the belief–desire–intention (BDI) agent-based model for developing scalable intelligent systems, e.g. using the AgentSpeak framework. However, reasoning from sensor information in these large-scale systems remains a significant challenge. For example, agents may be faced with information from heterogeneous sources which is uncertain and incomplete, while the sources themselves may be unreliable or conflicting. In order to derive meaningful conclusions, it is important that such information be correctly modelled and combined. In this paper, we choose to model uncertain sensor information in Dempster–Shafer (DS) theory. Unfortunately, as in other uncertainty theories, simple combination strategies in DS theory are often too restrictive (losing valuable information) or too permissive (resulting in ignorance). For this reason, we investigate how a context-dependent strategy originally defined for possibility theory can be adapted to DS theory. In particular, we use the notion of largely partially maximal consistent subsets (LPMCSes) to characterise the context for when to use Dempster’s original rule of combination and for when to resort to an alternative. To guide this process, we identify existing measures of similarity and conflict for finding LPMCSes along with quality of information heuristics to ensure that LPMCSes are formed around high-quality information. We then propose an intelligent sensor model for integrating this information into the AgentSpeak framework which is responsible for applying evidence propagation to construct compatible information, for performing context-dependent combination and for deriving beliefs for revising an agent’s belief base. Finally, we present a power grid scenario inspired by a real-world case study to demonstrate our work.
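For readers unfamiliar with DS theory, a minimal Python sketch of Dempster's original rule of combination is given below; mass functions are assumed to be dictionaries mapping focal elements (frozensets over the frame of discernment) to masses, and the sensor readings in the example are hypothetical rather than taken from the power grid scenario.

from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule; undefined under total conflict."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("Total conflict: Dempster's rule is undefined.")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Hypothetical example: two sensors reporting on a frame {'fault', 'ok'}.
m_sensor1 = {frozenset({'fault'}): 0.7, frozenset({'fault', 'ok'}): 0.3}
m_sensor2 = {frozenset({'fault'}): 0.6, frozenset({'ok'}): 0.2,
             frozenset({'fault', 'ok'}): 0.2}
print(dempster_combine(m_sensor1, m_sensor2))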
Abstract:
The realization of nonclassical states is an important task for many applications of quantum information processing. Usually, properly tailored interactions, different from goal to goal, are considered in order to accomplish specific tasks within the general framework of quantum state engineering. In this paper, we remark on the flexibility of a cross-Kerr nonlinear coupling in hybrid systems as an important ingredient in the engineering of nonclassical states. The general scenario we consider is the implementation of high cross-Kerr nonlinearity in cavity-quantum electrodynamics. In this context, we discuss the possibility of performing entanglement transfer and swapping between matter qubits and light fields initially prepared in separable coherent states. The recently introduced concept of entanglement reciprocation is also considered and shown to be possible with our scheme. We reinterpret some of our results in terms of applications of a generalized Ising interaction to systems of different nature.
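As a point of reference, the standard two-mode cross-Kerr interaction can be written as below (the notation is assumed here and is not necessarily the paper's): a Fock-state component of one mode imprints a photon-number-dependent phase on a coherent state of the other, which is the basic mechanism exploited for entanglement transfer and swapping between matter qubits and light fields.

H_{cK} = \hbar\chi\,\hat{a}^{\dagger}\hat{a}\,\hat{b}^{\dagger}\hat{b},
\qquad
e^{-iH_{cK}t/\hbar}\,|n\rangle_{a}\otimes|\alpha\rangle_{b}
  = |n\rangle_{a}\otimes\big|\alpha\,e^{-i\chi n t}\big\rangle_{b}.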
Abstract:
Goal-directed, coordinated movements in humans emerge from a variety of constraints that range from 'high-level' cognitive strategies based on perception of the task to 'low-level' neuromuscular-skeletal factors such as differential contributions to coordination from flexor and extensor muscles. There has been a tendency in the literature to dichotomize these sources of constraint, favouring one or the other rather than recognizing and understanding their mutual interplay. In this experiment, subjects were required to coordinate rhythmic flexion and extension movements with an auditory metronome, the rate of which was systematically increased. When subjects started in extension on the beat of the metronome, there was a small tendency to switch to flexion at higher rates, but not vice versa. When subjects were asked to contact a physical stop, the location of which was either coincident with or counterphase to the auditory stimulus, two effects occurred. When haptic contact was coincident with sound, coordination was stabilized for both flexion and extension. When haptic contact was counterphase to the metronome, coordination was actually destabilized, with transitions occurring from both extension to flexion on the beat and from flexion to extension on the beat. These results reveal the complementary nature of strategic and neuromuscular factors in sensorimotor coordination. They also suggest the presence of a multimodal neural integration process, parametrizable by rate and context, in which intentional movement, touch and sound are bound into a single, coherent unit.
Abstract:
The ‘Dublin Blaschka Congress’ was conceived as a gathering to bring together the diverse scholarly disciplines that are uniquely, if eccentrically, joined in the study of scientific glass models. Leopold and Rudolf Blaschka are best known for the ‘Glass Flowers’ of Harvard but in the nineteenth century they also invented techniques to sculpt anatomically accurate marine invertebrates in glass. In the course of preparing the Congress and a coordinated temporary exhibition, much new information was uncovered about the collections of Blaschka objects in Ireland, including a total of nearly 800 surviving models. The history of the artists shows a clever business model that was designed to tap a niche market in the contemporary fascination with natural history, and improved through the course of several decades with input from clients and their own passion for understanding their biological subjects. From a modern perspective, a single Blaschka glass model of a marine invertebrate can embody biology, the history of science, craftsmanship, glass chemistry, aesthetics and art. This ability to cross interdisciplinary bridges is a singular strength of the Blaschka works, and is evident in the published proceedings of the Congress.
Abstract:
This study concerns the spatial allocation of material flows, with emphasis on construction material in the Irish housing sector. It addresses some of the key issues concerning anthropogenic impact on the environment through spatial-temporal visualisation of the flow of materials, wastes and emissions at different spatial levels. This is presented in the form of a spatial model, Spatial Allocation of Material Flow Analysis (SAMFA), which enables the simulation of construction material flows and associated energy use. SAMFA parallels the Island Limits project (EPA funded under 2004-SD-MS-22-M2), which aimed to create a material flow analysis of the Irish economy classified by industrial sector. SAMFA further develops this by attempting to establish the material flows at the subnational geographical scale that could be used in the development of local authority (LA) sustainability strategies and spatial planning frameworks by highlighting the cumulative environmental impacts of the development of the built environment. By drawing on the idea of planning support systems, SAMFA also aims to provide a cross-disciplinary, integrative medium for involving stakeholders in strategies for a sustainable built environment and, as such, would help illustrate the sustainability consequences of alternative scenarios. The pilot run of the model in Kildare has shown that the model can be successfully calibrated and applied to develop alternative material flows and energy-use scenarios at the ED level. This has been demonstrated through the development of an integrated and a business-as-usual scenario, with the former integrating a range of potential material efficiency and energy-saving policy options and the latter replicating conditions that best describe the current trend. Their comparison shows that the former is better than the latter in terms of both material and energy use. This report also identifies a number of potential areas of future research and areas of broader application. This includes improving the accuracy of the SAMFA model (e.g. by establishing the actual life expectancy of buildings in the Irish context through field surveys) and the extension of the model to other Irish counties. This would establish SAMFA as a valuable predicting and monitoring tool that is capable of integrating national and local spatial planning objectives with actual environmental impacts. Furthermore, should the model prove successful at this level, it then has the potential to transfer the modelling approach to other areas of the built environment, such as commercial development and other key contributors of greenhouse gas emissions. The ultimate aim is to develop a meta-model for predicting the consequences of consumption patterns at the local scale. This therefore offers the possibility of creating critical links between socio-technical systems and the most important challenge of all: the limitations of the biophysical environment.
Abstract:
We discuss the limitations and rights which may affect the researcher’s access to and use of digital, court and administrative tribunal based information. We suggest that there is a need for a European-wide investigation of the legal framework which affects the researcher who might wish to utilise this form of information. A European-wide context is required because much of the relevant law is European rather than national, but many of the constraints are cultural. It is our thesis that research improves understanding and then improves practice as that understanding becomes part of public debate. If it is difficult to undertake research, then public debate about the court system – its effectiveness, its biases, its strengths – becomes constrained. Access to court records is currently determined on a discretionary basis or on the basis of interpretation of rules of the court where these are challenged in legal proceedings. Anecdotal evidence suggests that there are significant variations in the extent to which court documents such as pleadings, transcripts and affidavits are made generally accessible under court rules or as a result of litigation in different jurisdictions or, indeed, in different courts in the same jurisdiction. Such a lack of clarity can only encourage a chilling of what might otherwise be valuable research. Courts are not, of course, democratic bodies. However, they are part of a democratic system and should, we suggest – both for the public benefit and for their proper operation – be accessible and criticisable by the independent researcher. The extent to which the independent researcher is enabled access is the subject of this article. The rights of access for researchers and the public have been examined in other common law countries but not, to date, in the UK or Europe.
Abstract:
BACKGROUND: To date, there are no clinically reliable predictive markers of response to the current treatment regimens for advanced colorectal cancer. The aim of the current study was to compare and assess the power of transcriptional profiling using a generic microarray and a disease-specific transcriptome-based microarray. We also examined the biological and clinical relevance of the disease-specific transcriptome.
METHODS: DNA microarray profiling was carried out on isogenic sensitive and 5-FU-resistant HCT116 colorectal cancer cell lines using the Affymetrix HG-U133 Plus 2.0 array and the Almac Diagnostics Colorectal cancer disease-specific Research tool. In addition, DNA microarray profiling was also carried out on pre-treatment metastatic colorectal cancer biopsies using the colorectal cancer disease-specific Research tool. The two microarray platforms were compared based on detection of probesets and biological information.
RESULTS: The results demonstrated that the disease-specific transcriptome-based microarray was able to out-perform the generic genomic-based microarray on a number of levels including detection of transcripts and pathway analysis. In addition, the disease-specific microarray contains a high percentage of antisense transcripts and further analysis demonstrated that a number of these exist in sense:antisense pairs. Comparison between cell line models and metastatic CRC patient biopsies further demonstrated that a number of the identified sense:antisense pairs were also detected in CRC patient biopsies, suggesting potential clinical relevance.
CONCLUSIONS: Analysis from our in vitro and clinical experiments has demonstrated that many transcripts exist in sense:antisense pairs, including IGF2BP2, which may have a direct regulatory function in the context of colorectal cancer. While the functional relevance of antisense transcripts has been established in a number of studies, the functional role of many of those detected here is currently unclear; however, the numbers detected by the disease-specific microarray suggest that they may be important regulatory transcripts. This study has demonstrated the power of a disease-specific transcriptome-based approach and highlighted the potentially novel, biologically and clinically relevant information that is gained when using such a methodology.
Abstract:
ARK (‘Access Research Knowledge’) was set up with a single goal: to make social science information on Northern Ireland available to the widest possible audience. The best-known and most widely used part of the ARK resource is CAIN (Conflict Archive on the INternet), which is one of the largest on-line collections of source material and information about the Northern Ireland conflict. The compilation of CAIN's new Remembering: Victims, Survivors and Commemoration section raised issues related to the sensitivity of the material, as it feeds into the fundamental debate on the legacy of the Northern Ireland conflict. It also raises the fundamental question of the extent to which archiving is a neutral or political activity, and necessitates a discourse on responsibility and ethics among social researchers. Experiences from the establishment of the Northern Ireland Qualitative Archive (NIQA) shed light on future possibilities with regard to qualitative archives on the Northern Ireland conflict.
Abstract:
Genetic testing for gene mutations associated with specific cancers provides an opportunity for early detection, surveillance, and intervention (Smith, Cokkinides, & Brawley, 2008). Lifetime risk estimates provided by genetic testing refer to the risk of developing a specific disease within one's lifetime, and evidence suggests that this is important for the medical choices people make, as well as their future family and financial plans. The present studies tested whether adult men understand the lifetime risks of prostate cancer informed by genetic testing. In 2 experiments, adult men were asked to interpret the lifetime risk information provided in statements about risks of prostate cancer. Statement format was manipulated such that the most appropriate interpretation of risk statements referred to an absolute risk of cancer in experiment 1 and a relative risk in experiment 2. Experiment 1 revealed that few men correctly interpreted the lifetime risks of cancer when these referred to an absolute risk of cancer, and numeracy levels positively predicted correct responding. The proportion of correct responses was greatly improved in experiment 2, when the most appropriate interpretation of risk statements referred instead to a relative rather than an absolute risk, and numeracy levels were less involved. Understanding of lifetime risk information is often poor because individuals incorrectly believe that such estimates refer to relative rather than absolute risks of cancer.
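A hypothetical numerical illustration of the two readings tested (the figures are not from the study): suppose the baseline lifetime risk of prostate cancer is 12% and a genetic test result is associated with a relative risk of 2, so that

p_{\text{absolute}} = 2 \times 0.12 = 0.24.

Read as an absolute risk, '24% lifetime risk' means that 24 in 100 such men are expected to develop the disease in their lifetime; read as a relative risk, the same figure would instead be taken as a change of 24% relative to the baseline, a much smaller absolute quantity.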
Abstract:
We extend the concept that life is an informational phenomenon, at every level of organisation, from molecules to the global ecological system. According to this thesis: (a) living is information processing, in which memory is maintained by both molecular states and ecological states as well as the more obvious nucleic acid coding; (b) this information processing has one overall function: to perpetuate itself; and (c) the processing method is filtration (cognition) of, and synthesis of, information at lower levels to appear at higher levels in complex systems (emergence). We show how information patterns are united by the creation of mutual context, generating persistent consequences, to result in 'functional information'. This constructive process forms arbitrarily large complexes of information, the combined effects of which include the functions of life. Molecules and simple organisms have already been measured in terms of functional information content; we show how quantification may be extended to each level of organisation up to the ecological. In terms of a computer analogy, life is both the data and the program, and its biochemical structure is the way the information is embodied. This idea supports the seamless integration of life at all scales with the physical universe. The innovation reported here is essentially to integrate these ideas, basing information on the 'general definition' of information, rather than simply the statistics of information, thereby explaining how functional information operates throughout life. © 2013 Springer Science+Business Media Dordrecht.
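One widely used quantification consistent with this programme, stated here as background rather than as the authors' own formula, is the functional information measure of Hazen and co-workers: for a degree of function E_x,

I(E_x) = -\log_{2}\big[F(E_x)\big],

where F(E_x) is the fraction of all possible configurations of the system that achieve a degree of function of at least E_x.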
Abstract:
Context: Despite the fact that most deaths occur in hospital, problems remain with how patients and families experience care at the end of life when a death occurs in a hospital. Objectives: (1) to assess family member satisfaction with information sharing and communication, and (2) to examine how satisfaction with information sharing and communication is associated with patient factors. Methods: Using a cross-sectional survey, data were collected from family members of adult patients who died in an acute care organization. Correlation and factor analysis were conducted, and internal consistency was assessed using Cronbach's alpha. Linear regression was performed to determine the relationship among patient variables and satisfaction on the Information Sharing and Communication (ISC) scale. Results: There were 529 questionnaires available for analysis. Following correlation analysis and the dropping of redundant and conceptually irrelevant items, seven items remained for factor analysis. One factor was identified, described as information sharing and communication, which explained 76.3% of the variance. The questionnaire demonstrated good content validity and reliability (Cronbach's alpha 0.96). Overall, family members were satisfied with information sharing and communication (mean total satisfaction score 3.9, SD 1.1). The ISC total score was significantly associated with patient gender, the number of days in hospital before death, and the hospital program where the patient died. Conclusions: The ISC scale demonstrated good content validity and reliability. The ISC scale offers acute care organizations a means to assess the quality of information sharing and communication that transpires in care at the end of life. © Copyright 2013, Mary Ann Liebert, Inc.
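As a reference for the reliability statistic reported, the following is a minimal Python sketch of Cronbach's alpha computed on an (n respondents x k items) score matrix; the seven-item data generated here are hypothetical and are not the ISC responses.

import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents, n_items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 7-item responses on a 1-5 scale (not the study's data).
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(20, 1))
items = np.clip(base + rng.integers(-1, 2, size=(20, 7)), 1, 5)
print(round(cronbach_alpha(items), 2))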
Abstract:
Correctly modelling and reasoning with uncertain information from heterogeneous sources in large-scale systems is critical when the reliability of those sources is unknown and we still want to derive adequate conclusions. To this end, context-dependent merging strategies have been proposed in the literature. In this paper we investigate how one such context-dependent merging strategy (originally defined for possibility theory), called largely partially maximal consistent subsets (LPMCS), can be adapted to Dempster-Shafer (DS) theory. We identify those measures for the degree of uncertainty and internal conflict that are available in DS theory and show how they can be used for guiding LPMCS merging. A simplified real-world power distribution scenario illustrates our framework. We also briefly discuss how our approach can be incorporated into a multi-agent programming language, thus leading to better plan selection and decision making.
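A minimal Python sketch of the kind of context-dependent switch described, assuming two mass functions given as dictionaries of frozenset focal elements: Dempster's rule is applied when the conflict mass is low, and Yager's rule, which assigns the conflict mass to the whole frame, is shown as one standard alternative otherwise. The threshold and the choice of fallback rule are illustrative; the paper's LPMCS construction and quality measures are not reproduced here.

from itertools import product

def combine(m1, m2, conflict_threshold=0.5, frame=None):
    """Context-dependent combination: Dempster's rule under low conflict,
    Yager's rule (conflict mass assigned to the frame) otherwise."""
    pooled, k = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            pooled[inter] = pooled.get(inter, 0.0) + x * y
        else:
            k += x * y
    if k < conflict_threshold:                      # low conflict: normalise (Dempster)
        return {s: v / (1.0 - k) for s, v in pooled.items()}
    frame = frame or frozenset().union(*m1, *m2)    # high conflict: Yager's rule
    pooled[frame] = pooled.get(frame, 0.0) + k
    return pooled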
Abstract:
Automatically determining and assigning shared and meaningful text labels to data extracted from an e-Commerce web page is a challenging problem. An e-Commerce web page can display a list of data records, each of which can contain a combination of data items (e.g. product name and price) and explicit labels, which describe some of these data items. Recent advances in extraction techniques have made it much easier to precisely extract individual data items and labels from a web page; however, two open problems remain: 1. assigning an explicit label to a data item, and 2. determining labels for the remaining data items. Furthermore, improvements in the availability and coverage of vocabularies, especially in the context of e-Commerce web sites, mean that we now have access to a bank of relevant, meaningful and shared labels which can be assigned to extracted data items. However, there is a need for a technique which will take as input a set of extracted data items and automatically assign to them the most relevant and meaningful labels from a shared vocabulary. We observe that the Information Extraction (IE) community has developed a great number of techniques which solve problems similar to our own. In this work-in-progress paper we propose our intention to theoretically and experimentally evaluate different IE techniques to ascertain which is most suitable to solve this problem.