77 results for decentralised data fusion framework
Abstract:
This paper presents a framework for a telecommunications interface that allows data from sensors embedded in Smart Grid applications to be reliably archived in an appropriate time-series database. The challenge in doing so is two-fold: first, the various formats in which sensor data is represented; second, the problems of telecoms reliability. A prototype of the authors' framework is detailed, showcasing its main features in a case study featuring Phasor Measurement Units (PMUs) as the application. Useful analysis of PMU data is achieved whenever data from multiple locations can be compared on a common time axis. The prototype highlights the framework's reliability, extensibility and adoptability; features which are largely deferred by industry standards for data representation to proprietary database solutions. The open-source framework presented provides link reliability for any type of Smart Grid sensor and is interoperable with both existing proprietary and open database systems. These features allow researchers and developers to focus on the core of their real-time or historical analysis applications, rather than having to spend time interfacing with complex protocols.
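The buffer-and-retry idea behind such link reliability can be sketched as follows. This is an illustrative sketch, not the authors' implementation: `ReliableArchiver` and its `store` callback (standing in for any time-series database client) are invented names.

```python
import collections

class ReliableArchiver:
    """Illustrative sketch: buffer sensor samples locally and flush them
    to a time-series store, keeping any sample whose write fails so that
    no data is lost across link outages."""
    def __init__(self, store):
        self.store = store              # callable: store(sample) -> bool
        self.buffer = collections.deque()

    def ingest(self, sample):
        self.buffer.append(sample)      # always accept data locally

    def flush(self):
        while self.buffer:
            sample = self.buffer[0]
            if self.store(sample):
                self.buffer.popleft()   # archived: drop from buffer
            else:
                break                   # link down: retry on next flush

# simulate an outage followed by recovery
archived = []
link = {"up": False}
def store(sample):
    if not link["up"]:
        return False
    archived.append(sample)
    return True

a = ReliableArchiver(store)
a.ingest(1)
a.ingest(2)
a.flush()                               # link down: samples retained
link["up"] = True
a.flush()                               # link restored: samples archived in order
```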
Abstract:
The increasing adoption of cloud computing, social networking, mobile and big data technologies provides challenges and opportunities for both research and practice. Researchers face a deluge of data generated by social network platforms, which is further exacerbated by the co-mingling of social network platforms and the emerging Internet of Everything. While the topicality of big data and social media increases, the literature lacks conceptual tools to help researchers, many of whom are from nontechnical disciplines, approach, structure and codify knowledge from social media big data in diverse subject-matter domains. Researchers do not have a general-purpose scaffold to make sense of the data and the complex web of relationships between entities, social networks, social platforms and other third-party databases, systems and objects. This is further complicated when spatio-temporal data is introduced. Based on practical experience of working with social media datasets and the existing literature, we propose a general research framework for social media research using big data. Such a framework assists researchers in placing their contributions in an overall context, focusing their research efforts and building the body of knowledge in a given discipline area using social media data in a consistent and coherent manner.
Abstract:
Trends and foci of interest in atomic modelling and data are identified in connection with recent observations and experiments in fusion and astrophysics. In the fusion domain, spectral observations are included of core, beam-penetrated and divertor plasma. The helium beam experiments at JET and the studies with very heavy species at ASDEX and JET are noted. In the astrophysics domain, illustrations are given from the SOHO and CHANDRA spacecraft which span from the solar upper atmosphere, through soft x-rays from comets, to supernova remnants. It is shown that non-Maxwellian, dynamic and possibly optically thick regimes must be considered. The generalized collisional-radiative model properly describes the collisional regime of most astrophysical and laboratory fusion plasmas and yields self-consistent derived data for spectral emission, power balance and ionization state studies. The tuning of this method to routine analysis of the spectral observations is described. A forward look is taken as to how such atomic modelling, and the atomic data which underpin it, ought to evolve to deal with the extended conditions and novel environments of the illustrations. It is noted that atomic physics influences most aspects of fusion and astrophysical plasma behaviour, but the effectiveness of analysis depends on the quality of the bi-directional pathway from fundamental data production through atomic/plasma model development to the confrontation with experiment. The principal atomic data capability at JET, and other fusion and astrophysical laboratories, is supplied via the Atomic Data and Analysis Structure (ADAS) Project. The close ties between the various experiments and ADAS have helped in this path of communication.
Abstract:
Objective: To apply the UK Medical Research Council (MRC) framework for development and evaluation of trials of complex interventions to a primary healthcare intervention to promote secondary prevention of coronary heart disease. Study Design: Case report of intervention development. Methods: First, literature relating to secondary prevention and lifestyle change was reviewed. Second, a preliminary intervention was modeled, based on literature findings and focus group interviews with patients (n = 23) and staff (n = 29) from 4 general practices. Participants’ experiences of and attitudes toward key intervention components were explored. Third, the preliminary intervention was pilot-tested in 4 general practices. After delivery of the pilot intervention, practitioners evaluated the training sessions, and qualitative data relating to experiences of the intervention were collected using semistructured interviews with staff (n = 10) and patient focus groups (n = 17). Results: Literature review identified 3 intervention components: a structured recall system, practitioner training, and patient information. Initial qualitative data identified variations in recall system design, training requirements (medication prescribing, facilitating behavior change), and information appropriate to the prospective study participants. Identifying detailed structures within intervention components clarified how the intervention could be tailored to individual practice, practitioner, and patient needs while preserving the theoretical functions of the components. Findings from the pilot phase informed further modeling of the intervention, reducing administrative time, increasing practical content of training, and omitting unhelpful patient information. Conclusion: Application of the MRC framework helped to determine the feasibility and development of a complex intervention for primary care research.
Abstract:
This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only polynomial numbers of conditional independence (CI) tests in typical cases. We provide precise conditions that specify when these algorithms are guaranteed to be correct as well as empirical evidence (from real world applications and simulation tests) that demonstrates that these systems work efficiently and reliably in practice.
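The paper's three-phase algorithms are not reproduced here, but the conditional independence (CI) test they repeatedly invoke can be illustrated. For continuous data a common choice is a Fisher-z test on the partial correlation; the sketch below assumes that choice (it is not necessarily the authors' exact test), and the chain-structured data is invented for illustration.

```python
import numpy as np
from math import sqrt, erf

def ci_test(data, i, j, cond=(), alpha=0.05):
    """Fisher-z conditional independence test via partial correlation.
    Returns True if columns i and j of `data` are judged independent
    given the columns listed in `cond`."""
    sub = data[:, [i, j] + list(cond)]
    prec = np.linalg.inv(np.corrcoef(sub, rowvar=False))   # precision matrix
    r = -prec[0, 1] / sqrt(prec[0, 0] * prec[1, 1])        # partial correlation
    r = max(min(r, 0.999999), -0.999999)                   # guard atanh
    z = 0.5 * np.log((1 + r) / (1 - r)) * sqrt(data.shape[0] - len(cond) - 3)
    p = 1 - erf(abs(z) / sqrt(2))                          # two-sided p-value
    return p > alpha

# chain X -> Y -> Z: X and Z are dependent, but independent given Y
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
y = x + 0.5 * rng.standard_normal(5000)
z = y + 0.5 * rng.standard_normal(5000)
data = np.column_stack([x, y, z])
```

A structure learner would use such calls to decide which edges to keep: here the X-Z edge is removed because conditioning on Y renders them independent.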
Abstract:
In previous papers, we have presented a logic-based framework based on fusion rules for merging structured news reports. Structured news reports are XML documents in which the text entries are restricted to individual words or simple phrases, such as names and domain-specific terminology, and numbers and units. We assume structured news reports do not require natural language processing. Fusion rules are a form of scripting language that define how structured news reports should be merged. The antecedent of a fusion rule is a call to investigate the information in the structured news reports and the background knowledge, and the consequent of a fusion rule is a formula specifying an action to be undertaken to form a merged report. It is expected that a set of fusion rules is defined for any given application. In this paper we extend the approach to handling probability values, degrees of belief, or necessity measures associated with text entries in the news reports. We present the formal definition for each of these types of uncertainty and explain how they can be handled using fusion rules. We also discuss methods of detecting inconsistencies among sources.
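A hypothetical miniature of the idea can be sketched as a single merge policy: per field, keep the value carrying the highest uncertainty weight. The field names, values and max-confidence rule are invented for illustration; real fusion rules act on XML reports and support much richer actions.

```python
def merge_reports(reports):
    """Merge structured reports whose fields carry (value, weight) pairs:
    for each field, keep the value with the highest weight.
    (A hypothetical max-confidence rule for illustration only.)"""
    merged = {}
    for report in reports:
        for field, (value, weight) in report.items():
            if field not in merged or weight > merged[field][1]:
                merged[field] = (value, weight)
    return merged

# two hypothetical news reports with probability-weighted text entries
r1 = {"city": ("Athens", 0.9), "magnitude": ("6.1", 0.6)}
r2 = {"city": ("Athens", 0.8), "magnitude": ("6.3", 0.7)}
merged = merge_reports([r1, r2])   # keeps ("Athens", 0.9) and ("6.3", 0.7)
```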
Abstract:
We present a multimodal detection and tracking algorithm for sensors composed of a camera mounted between two microphones. Target localization is performed on color-based change detection in the video modality and on time difference of arrival (TDOA) estimation between the two microphones in the audio modality. The TDOA is computed by multiband generalized cross correlation (GCC) analysis. The estimated directions of arrival are then postprocessed using a Riccati Kalman filter. The visual and audio estimates are finally integrated, at the likelihood level, into a particle filter (PF) that uses a zero-order motion model, and a weighted probabilistic data association (WPDA) scheme. We demonstrate that the Kalman filtering (KF) improves the accuracy of the audio source localization and that the WPDA helps to enhance the tracking performance of sensor fusion in reverberant scenarios. The combination of multiband GCC, KF, and WPDA within the particle filtering framework improves the performance of the algorithm in noisy scenarios. We also show how the proposed audiovisual tracker summarizes the observed scene by generating metadata that can be transmitted to other network nodes instead of transmitting the raw images and can be used for very low bit rate communication. Moreover, the generated metadata can also be used to detect and monitor events of interest.
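The multiband GCC used for TDOA estimation builds on the classic PHAT-weighted generalized cross correlation, which can be sketched for a single band as follows. This is a minimal single-band sketch under the standard GCC-PHAT formulation, not the authors' multiband implementation; the test signals are synthetic.

```python
import numpy as np

def gcc_phat(x, y, fs=1.0):
    """Estimate the delay of y relative to x (in seconds) using
    PHAT-weighted generalized cross correlation."""
    n = len(x) + len(y)                    # zero-pad to avoid wrap-around
    X = np.fft.rfft(x, n=n)
    Y = np.fft.rfft(y, n=n)
    R = Y * np.conj(X)                     # cross-power spectrum
    R /= np.abs(R) + 1e-12                 # PHAT weighting: keep phase only
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    # rearrange so lags run from -max_shift .. +max_shift
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    lag = int(np.argmax(np.abs(cc))) - max_shift
    return lag / fs

# broadband source reaching the second "microphone" 5 samples later
rng = np.random.default_rng(0)
x = rng.standard_normal(256)
y = np.roll(x, 5)
tdoa = gcc_phat(x, y)                      # 5.0 at fs = 1.0
```

In the paper's pipeline such per-band delay estimates would then be smoothed by the Kalman filter before fusion with the visual likelihood.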
Abstract:
Motivation: Microarray experiments generate a high data volume. However, often due to financial or experimental considerations, e.g. lack of sample, there is little or no replication of the experiments or hybridizations. These factors, combined with the intrinsic variability associated with the measurement of gene expression, can result in an unsatisfactory detection rate of differential gene expression (DGE). Our motivation was to provide an easy-to-use measure of the success rate of DGE detection that could find routine use in the design of microarray experiments or in post-experiment assessment.
Abstract:
Multi-MeV proton beams generated by target normal sheath acceleration (TNSA) during the interaction of an ultra-intense laser beam (I ≥ 10^19 W/cm^2) with a thin metallic foil (thickness of the order of a few tens of microns) are particularly suited as a particle probe for laser plasma experiments. The proton imaging technique employs a laser-driven proton beam in a point-projection imaging scheme as a diagnostic tool for the detection of electric fields in such experiments. The proton probing technique has been applied in experiments of relevance to inertial confinement fusion (ICF), such as laser-heated gasbags and laser-hohlraum experiments. The data provide direct information on the onset of laser beam filamentation and on the plasma expansion in the hohlraum's interior, and confirm the suitability and usefulness of this technique as an ICF diagnostic.
Abstract:
Objective: The objective of this research was to examine differences in patterns of statin prescribing between Northern Ireland and England both before and after the introduction of the Quality and Outcomes Framework (QOF). Setting: Primary care practices in Northern Ireland and England. Method: Northern Ireland practices were matched with practices in England; statin prescribing data and QOF achievement scores (for the first year post-QOF) were obtained. Crude prescribing data from matched practices were manipulated to provide a data set of Defined Daily Doses (DDDs)/1,000 patients and cost/DDD/1,000 patients for each statin drug entity, covering 1 year before and after the introduction of QOF. QOF achievements were converted into percentage scores for matched practices. Main outcome measure: Cost per defined daily dose (DDD) per 1,000 patients. Results: Significantly fewer statins (DDD/1,000 patients) were dispensed in Northern Ireland compared with the matched region in England both before and after the introduction of QOF (P
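The volume and cost measures used above can be computed directly from dispensed quantities. A minimal sketch, assuming the standard WHO DDD methodology: the 30 mg DDD for simvastatin is the published WHO value, while the practice figures are invented for illustration.

```python
def ddd_per_1000_patients(total_mg, ddd_mg, list_size):
    """Dispensed volume expressed as WHO defined daily doses (DDDs)
    per 1,000 registered patients."""
    return (total_mg / ddd_mg) / (list_size / 1000.0)

def cost_per_ddd(total_cost, total_mg, ddd_mg):
    """Average cost of one defined daily dose."""
    return total_cost / (total_mg / ddd_mg)

# hypothetical practice: 900,000 mg simvastatin dispensed (WHO DDD = 30 mg)
# to a registered list of 6,000 patients, at a total cost of 3,000 currency units
volume = ddd_per_1000_patients(900_000, 30, 6_000)   # 5,000 DDD/1,000 patients
unit_cost = cost_per_ddd(3_000, 900_000, 30)         # 0.10 per DDD
```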
Abstract:
Data from a series of controlled suction triaxial tests on samples of compacted speswhite kaolin were used in the development of an elasto-plastic critical state framework for unsaturated soil. The framework is defined in terms of four state variables: mean net stress, deviator stress, suction and specific volume. Included within the proposed framework are an isotropic normal compression hyperline, a critical state hyperline and a state boundary hypersurface. For states that lie inside the state boundary hypersurface the soil behaviour is assumed to be elastic, with movement over the state boundary hypersurface corresponding to expansion of a yield surface in stress space. The pattern of swelling and collapse observed during wetting, the elastic-plastic compression behaviour during isotropic loading and the increase of shear strength with suction were all related to the shape of the yield surface and the hardening law defined by the form of the state boundary. By assuming that constant-suction cross-sections of the yield surface were elliptical it was possible to predict test paths for different types of triaxial shear test that showed good agreement with observed behaviour. The development of shear strain was also predicted with reasonable success, by assuming an associated flow rule.
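For illustration, an elliptical constant-suction cross-section of the yield surface can be written in the form used by the related Barcelona Basic Model (a related but not identical framework to the one in this paper); here $p$ is the mean net stress, $q$ the deviator stress, $s$ the suction, $M$ the critical-state slope, $p_0(s)$ the suction-dependent preconsolidation pressure and $p_s(s)$ the suction-induced strength intercept:

```latex
% Elliptical constant-suction yield curve (Barcelona Basic Model form)
q^2 = M^2 \bigl(p + p_s(s)\bigr)\bigl(p_0(s) - p\bigr)
```

Increasing suction expands the ellipse through $p_0(s)$ and $p_s(s)$, which is how such models capture the growth of shear strength with suction noted above.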
Abstract:
A recognised aim of science education is to promote critical engagement with science in the media. Evidence would suggest that this is challenging for both teachers and pupils and that science education does not yet adequately prepare young people for this task. Furthermore, in the absence of clear guidance as to what this means and how this may be achieved, it is difficult for teachers to develop approaches and resources that address the matter and that systematically promote such critical engagement within their teaching programmes. Twenty-six individuals with recognised expertise or interest in science in the media, drawn from a range of disciplines and areas of practice, constituted a specialist panel in this study. The question this research sought to answer was: what are the elements of knowledge, skill and attitude which underpin critical reading of science-based news reports? During in-depth individual interviews the panel were asked to explore what they considered to be essential elements of knowledge, skills and attitude which people need to enable them to respond critically to news reports with a science component. Analysis of the data revealed fourteen fundamental elements which together contribute to an individual’s capacity to engage critically with science-based news. These are classified in five categories: ‘knowledge of science’, ‘knowledge of writing and language’, ‘knowledge about news, newspapers and journalism’, ‘skills’ and ‘attitudes’. Illustrative profiles of each category along with indicators of critical engagement are presented. The implications for curriculum planning and pedagogy are considered.
Abstract:
Carboxyl-terminal modulator protein (CTMP) is a tumor suppressor-like binding partner of Protein kinase B (PKB/Akt) that negatively regulates this kinase. In the course of our recent work, we identified that CTMP is consistently associated with leucine zipper/EF-hand-containing transmembrane-1 (LETM1). Here, we report that adenovirus-LETM1 increased the sensitivity of HeLa cells to apoptosis induced by either staurosporine or actinomycin D. As shown previously, LETM1 localized to the inner mitochondrial membrane. Electron-microscopy analysis of adenovirus-LETM1 transduced cells revealed that mitochondrial cristae were swollen in these cells, a phenotype similar to that observed in optic atrophy type-1 (OPA1)-ablated cells. OPA1 cleavage was increased in LETM1-overexpressing cells, and this phenotype was reversed by overexpression of OPA1 variant-7, a cleavage-resistant form of OPA1. Taken together, these data suggest that LETM1 is a novel binding partner for CTMP that may play an important role in mitochondrial fragmentation via OPA1 cleavage.
Abstract:
Recent years have witnessed rapidly increasing interest in the topic of incremental learning. Unlike conventional machine learning settings, the data flow targeted by incremental learning becomes available continuously over time. Accordingly, it is desirable to abandon the traditional assumption that representative training data are available during the training period for developing decision boundaries. Under scenarios of continuous data flow, the challenge is how to transform the vast amounts of raw stream data into information and knowledge representation, and to accumulate experience over time to support future decision-making. In this paper, we propose a general adaptive incremental learning framework named ADAIN that is capable of learning from continuous raw data, accumulating experience over time, and using such knowledge to improve future learning and prediction performance. Detailed system-level architecture and design strategies are presented. Simulation results on several real-world data sets are used to validate the effectiveness of this method.
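ADAIN itself is not reproduced here, but the core incremental idea, updating model state chunk by chunk without retaining the raw stream, can be sketched with a toy nearest-class-mean learner (illustrative only; the class and data are invented):

```python
import numpy as np

class IncrementalMeanClassifier:
    """Toy incremental learner: maintains a running mean per class and
    predicts by nearest mean. State is updated chunk by chunk, so no
    representative training set is needed up front and raw chunks can
    be discarded after each update (the incremental idea, not ADAIN)."""
    def __init__(self):
        self.sums, self.counts = {}, {}

    def partial_fit(self, X, y):
        for xi, yi in zip(X, y):
            if yi not in self.sums:
                self.sums[yi] = np.zeros(len(xi))
                self.counts[yi] = 0
            self.sums[yi] += xi          # accumulate sufficient statistics
            self.counts[yi] += 1

    def predict(self, X):
        labels = list(self.sums)
        means = np.array([self.sums[c] / self.counts[c] for c in labels])
        dists = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        return [labels[i] for i in dists.argmin(1)]

clf = IncrementalMeanClassifier()
clf.partial_fit(np.array([[0.0], [0.2]]), [0, 0])    # first data chunk
clf.partial_fit(np.array([[10.0], [9.8]]), [1, 1])   # later chunk, new class
preds = clf.predict(np.array([[0.1], [9.9]]))
```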