807 results for decentralised data fusion framework
Abstract:
This paper presents a novel method of audio-visual feature-level fusion for person identification where both the speech and facial modalities may be corrupted, and there is a lack of prior knowledge about the corruption. Furthermore, we assume there is a limited amount of training data for each modality (e.g., a short training speech segment and a single training facial image for each person). A new multimodal feature representation and a modified cosine similarity are introduced to combine and compare bimodal features with limited training data, as well as vastly differing data rates and feature sizes. Optimal feature selection and multicondition training are used to reduce the mismatch between training and testing, thereby making the system robust to unknown bimodal corruption. Experiments have been carried out on a bimodal dataset created from the SPIDRE speaker recognition database and the AR face recognition database, with variable noise corruption of the speech and occlusion in the face images. The system's speaker identification performance on the SPIDRE database and facial identification performance on the AR database are comparable with the literature. Combining both modalities using the new method of multimodal fusion leads to significantly improved accuracy over the unimodal systems, even when both modalities have been corrupted. The new method also shows improved identification accuracy compared with bimodal systems based on multicondition model training or missing-feature decoding alone.
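The abstract does not detail the multimodal feature representation or the modification made to the cosine similarity, so the following is only a minimal sketch: it assumes simple concatenation of per-modality L2-normalised speech and face feature vectors and a plain cosine score. All function names, dimensions and the toy data are illustrative, not the published method.

```python
import numpy as np

def fuse_features(speech_vec, face_vec):
    """Concatenate speech and face features after per-modality L2 normalisation,
    so that neither modality dominates purely through vector magnitude."""
    s = speech_vec / (np.linalg.norm(speech_vec) + 1e-12)
    f = face_vec / (np.linalg.norm(face_vec) + 1e-12)
    return np.concatenate([s, f])

def cosine_score(probe, gallery):
    """Plain cosine similarity between a probe and an enrolled bimodal template."""
    denom = np.linalg.norm(probe) * np.linalg.norm(gallery) + 1e-12
    return float(probe @ gallery / denom)

# Toy usage: identify the enrolled person whose template scores highest.
rng = np.random.default_rng(0)
gallery = [fuse_features(rng.normal(size=39), rng.normal(size=128)) for _ in range(5)]
probe = fuse_features(rng.normal(size=39), rng.normal(size=128))
best_id = int(np.argmax([cosine_score(probe, g) for g in gallery]))
print("identified person:", best_id)
```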
Abstract:
The enzyme UDP-galactose 4'-epimerase (GALE) catalyses the reversible epimerisation of both UDP-galactose and UDP-N-acetyl-galactosamine. Deficiency of the human enzyme (hGALE) is associated with type III galactosemia. The majority of known mutations in hGALE are missense and private, making clinical guidance difficult. In this study, a bioinformatics approach was employed to analyse the structural effects of each mutation using both the UDP-glucose- and UDP-N-acetylglucosamine-bound structures of the wild-type protein. Changes to the enzyme's overall stability, substrate/cofactor binding and propensity to aggregate were also predicted. These predictions were found to be in good agreement with previous in vitro and in vivo studies where data were available, and allowed those mutants that severely impair the enzyme's activity against UDP-galactose to be differentiated. This combination of techniques was then applied to another twenty-six reported variants from the NCBI dbSNP database that have yet to be studied, in order to predict their effects. This identified p.I14T, p.R184H and p.G302R as likely severely impairing mutations. Although severely impaired mutants were predicted to decrease the protein's stability, overall predicted stability changes only weakly correlated with residual activity against UDP-galactose. This suggests that other protein functions, such as changes in cofactor and substrate binding, may also contribute to the mechanism of impairment. Finally, this investigation shows that this combination of different in silico approaches is useful in predicting the effects of mutations and could form the basis of an initial prediction of likely clinical severity when new hGALE mutants are discovered.
Abstract:
The advent of next generation sequencing (NGS) technologies has expanded the scope of genomic research, offering high coverage and increased sensitivity over older microarray platforms. Although the current cost of next generation sequencing still exceeds that of microarray approaches, the rapid advances in NGS will likely make it the platform of choice for future research in differential gene expression. Connectivity mapping is a procedure for examining the connections among diseases, genes and drugs through differential gene expression; it was initially based on microarray technology, with which a large collection of compound-induced reference gene expression profiles has been accumulated. In this work, we aim to test the feasibility of incorporating NGS RNA-Seq data into the current connectivity mapping framework by utilizing the microarray-based reference profiles and constructing a differentially expressed gene signature from an NGS dataset. This allows connections to be established between the NGS gene signature and those microarray reference profiles, avoiding the cost of re-creating drug profiles with NGS technology. We applied the connectivity mapping approach to a publicly available NGS dataset of androgen-stimulated LNCaP cells in order to extract candidate compounds that could inhibit the proliferative phenotype of LNCaP cells and to elucidate their potential in a laboratory setting. In addition, we analyzed an independent microarray dataset with similar experimental settings. We found a high level of concordance between the top compounds identified using the gene signatures from the two datasets. The nicotine derivative cotinine was returned as the top candidate among the overlapping compounds, with potential to suppress this proliferative phenotype. Subsequent lab experiments validated this connectivity mapping hit, showing that cotinine inhibits cell proliferation in an androgen-dependent manner. The results of this study thus suggest a promising prospect for integrating NGS data with connectivity mapping. © 2013 McArt et al.
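The abstract does not reproduce the connectivity scoring formula. As a rough sketch only, the snippet below uses a simplified signed-rank connection score between an up/down-regulated query gene signature and a ranked reference expression profile; the gene identifiers and the scoring rule are assumptions for illustration, not the published method.

```python
import numpy as np

def connection_score(reference_ranking, up_genes, down_genes):
    """Simplified connectivity score: up-regulated query genes near the top of the
    compound's reference ranking and down-regulated genes near the bottom give a
    high (positive) connection; the reverse gives a negative connection."""
    n = len(reference_ranking)
    pos = {g: i for i, g in enumerate(reference_ranking)}  # 0 = most up-regulated by the compound

    def centred(g):
        return 1.0 - 2.0 * pos[g] / (n - 1)                # +1 at top, -1 at bottom

    score = sum(centred(g) for g in up_genes if g in pos) \
          - sum(centred(g) for g in down_genes if g in pos)
    return score / (len(up_genes) + len(down_genes))

# Toy usage with made-up gene identifiers.
reference = ["KLK3", "TMPRSS2", "FKBP5", "NKX3-1", "AR", "GAPDH", "ACTB", "TP53"]
print(connection_score(reference, up_genes=["KLK3", "FKBP5"], down_genes=["TP53"]))
```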
Abstract:
Hydrodynamic models are a powerful tool that can be used by a wide range of end users to assist in predicting the effects of both physical and biological processes on local environmental conditions. This paper describes the development of a tidal model for Strangford Lough, Northern Ireland, a body of water renowned as the location of the first grid-connected tidal turbine, SeaGen, as well as the UK's third Marine Nature Reserve. Using MIKE 21 modelling software, the development, calibration and performance of the model are described in detail. Strangford Lough has a complex flow pattern, with high flows through the Narrows (~3.5 m/s) linking the main body of the Lough to the Irish Sea and intricate flow patterns around the numerous islands. With the aid of good-quality tidal and current data obtained throughout the Lough during model development, the modelled surface elevation and current magnitude were almost identical to the observations, with model skill >0.98 and >0.84 respectively. The model can therefore be used as an important tool for the prediction of key ecological processes as well as for engineering applications within Strangford Lough.
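The abstract quotes model skill values (>0.98 for elevation, >0.84 for current magnitude) without defining the statistic. A common choice in tidal-model calibration is the Willmott (1981) index of agreement, sketched below under that assumption with a synthetic tidal signal; the paper's actual skill measure may differ.

```python
import numpy as np

def willmott_skill(observed, modelled):
    """Willmott (1981) index of agreement: 1 = perfect match, 0 = no skill."""
    observed = np.asarray(observed, dtype=float)
    modelled = np.asarray(modelled, dtype=float)
    obar = observed.mean()
    num = np.sum((modelled - observed) ** 2)
    den = np.sum((np.abs(modelled - obar) + np.abs(observed - obar)) ** 2)
    return 1.0 - num / den

# Toy usage: an M2-like observed tide versus a slightly biased model output.
t = np.linspace(0.0, 24.0, 200)                  # hours
eta_obs = 1.5 * np.sin(2 * np.pi * t / 12.42)    # metres
eta_mod = 1.45 * np.sin(2 * np.pi * t / 12.42 + 0.05)
print(round(willmott_skill(eta_obs, eta_mod), 3))
```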
Abstract:
A reduction in the time required to locate and restore faults on a utility's distribution network improves the customer minutes lost (CML) measurement and hence brings direct cost savings to the operating company. The traditional approach to fault location involves determining the fault impedance from high-volume waveform files dispatched across a communications channel to a central location for processing and analysis. This paper examines an alternative scheme in which data processing is undertaken locally within a recording instrument, thus reducing the volume of data to be transmitted. Processed fault event reports may then be emailed to relevant operational staff for the timely repair and restoration of the line.
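The abstract does not specify the local processing algorithm. The following is a hypothetical sketch of the kind of impedance-based fault-distance estimate a recording instrument might compute locally from fundamental-frequency voltage and current phasors before emailing a compact report; the reactance-based method and all parameter values are illustrative assumptions, not the paper's scheme.

```python
import cmath

def fault_distance_km(v_phasor, i_phasor, line_reactance_ohm_per_km):
    """Simple reactance-based fault locator: distance is estimated from the
    imaginary part of the apparent impedance seen at the recorder."""
    z_apparent = v_phasor / i_phasor
    return z_apparent.imag / line_reactance_ohm_per_km

# Toy usage: depressed phase voltage and large lagging current during the fault.
v = cmath.rect(3500.0, 0.0)      # volts
i = cmath.rect(1200.0, -1.2)     # amperes, lagging the voltage
print(round(fault_distance_km(v, i, line_reactance_ohm_per_km=0.35), 2), "km")
```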
Abstract:
This paper presents a preliminary study on the development of a novel distributed adaptive real-time learning framework for wide-area monitoring of power systems integrated with distributed generation, using synchrophasor technology. The framework comprises distributed agents (synchrophasors) for autonomous local condition monitoring and fault detection, and a central unit for generating a global view for situation awareness and decision making. Key technologies that can be integrated into this hierarchical distributed learning scheme are discussed to enable real-time information extraction and knowledge discovery for decision making, without the central unit explicitly accumulating and storing all raw data. On this basis, the configuration of a wide-area monitoring system for power systems using synchrophasor technology, and the functionalities of the locally installed open phasor measurement units (OpenPMUs) and the central unit, are presented. Initial results on anti-islanding protection using the proposed approach are given to illustrate its effectiveness.
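The anti-islanding method itself is not described in this abstract. As a hedged illustration only, the sketch below implements a conventional rate-of-change-of-frequency (ROCOF) check over a stream of PMU frequency measurements, the kind of local test a synchrophasor agent could run before reporting to the central unit; the reporting rate, threshold and test signal are assumptions.

```python
import numpy as np

def rocof_islanding_flags(freq_hz, reporting_rate_hz=50.0, threshold_hz_per_s=0.5):
    """Flag reporting intervals where |df/dt| exceeds a ROCOF threshold, a common
    local indicator of islanding in distributed-generation protection schemes."""
    rocof = np.diff(freq_hz) * reporting_rate_hz      # Hz per second
    return np.abs(rocof) > threshold_hz_per_s

# Toy usage: nominal 50 Hz, then an islanded section drifting downwards.
f = np.full(200, 50.0)
f[100:] -= np.linspace(0.0, 2.0, 100)
flags = rocof_islanding_flags(f)
print("first flagged interval:", int(flags.argmax()))
```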
Abstract:
Model selection between competing models is a key consideration in the discovery of prognostic multigene signatures. The use of appropriate statistical performance measures, as well as verification of the biological significance of the signatures, is imperative to maximise the chance of external validation of the generated signatures. Current approaches in time-to-event studies often use only a single measure of performance in model selection, such as log-rank test p-values, or dichotomise the follow-up times at some phase of the study to facilitate signature discovery. In this study we improve the prognostic signature discovery process through the application of the multivariate partial Cox model combined with the concordance index, the hazard ratio of predictions, independence from available clinical covariates and biological enrichment as measures of signature performance. The proposed framework was applied to discover prognostic multigene signatures from early breast cancer data. The partial Cox model, combined with the multiple performance measures, was used both to guide the selection of the optimal panel of prognostic genes and to predict risk within cross-validation, without dichotomising the follow-up times at any stage. The signatures were successfully validated externally in independent breast cancer datasets, yielding a hazard ratio of 2.55 [1.44, 4.51] for the top-ranking signature.
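The abstract names the concordance index as one of the model-selection measures without restating its definition. Below is a minimal sketch of Harrell's c-index for right-censored survival data: it counts the fraction of usable patient pairs whose predicted risks are ordered consistently with their observed survival times. Variable names and the toy data are illustrative.

```python
import numpy as np

def concordance_index(time, event, risk):
    """Harrell's c-index: a pair (i, j) is usable if the shorter observed time ends
    in an event; it is concordant if that patient also has the higher predicted risk."""
    time, event, risk = map(np.asarray, (time, event, risk))
    concordant, usable = 0.0, 0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                      # censored times cannot anchor a pair
        for j in range(n):
            if time[j] > time[i]:         # j outlived i's event time
                usable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / usable

# Toy usage: higher risk scores correspond to earlier events, so c = 1.0 here.
print(concordance_index(time=[5, 8, 12, 20], event=[1, 1, 0, 0], risk=[0.9, 0.7, 0.4, 0.1]))
```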
Abstract:
Object tracking is an active research area due to its importance in human-computer interaction, teleconferencing and video surveillance. However, reliable tracking of objects in the presence of occlusions and pose and illumination changes remains challenging. In this paper, we introduce a novel tracking approach that fuses two cues, namely colour and spatio-temporal motion energy, within a particle filter based framework. We compute a measure of coherent motion over two image frames, which reveals the spatio-temporal dynamics of the target. At the same time, the importance of the colour and motion energy cues is determined in a reliability evaluation stage, which helps maintain the performance of the tracking system under abrupt appearance changes. Experimental results demonstrate that the proposed method outperforms other state-of-the-art techniques on the test datasets used.
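The reliability-weighted fusion rule is not given explicitly in the abstract. The snippet below is a sketch assuming log-linear (product-of-experts) fusion of colour and motion-energy likelihoods inside the particle weight update, with reliability exponents of the kind a reliability-evaluation stage might set; all names and values are illustrative.

```python
import numpy as np

def update_particle_weights(weights, colour_lik, motion_lik, alpha_colour, alpha_motion):
    """Log-linear fusion of two cue likelihoods per particle. The exponents act as
    reliability weights: a cue judged unreliable (small alpha) contributes less to
    the posterior particle weights."""
    fused = (colour_lik ** alpha_colour) * (motion_lik ** alpha_motion)
    w = weights * fused
    return w / w.sum()

# Toy usage with 5 particles; the motion cue is trusted more than colour here.
w0 = np.full(5, 0.2)
colour = np.array([0.9, 0.2, 0.5, 0.1, 0.3])
motion = np.array([0.8, 0.1, 0.7, 0.2, 0.4])
print(update_particle_weights(w0, colour, motion, alpha_colour=0.3, alpha_motion=0.7))
```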
Integrating Multiple Point Statistics with Aerial Geophysical Data to assist Groundwater Flow Models
Abstract:
Accounting for heterogeneity has advanced significantly in statistical research, primarily within the framework of stochastic analysis and the development of multiple-point statistics (MPS). Among MPS techniques, the direct sampling (DS) method is tested to determine its ability to delineate heterogeneity from aerial magnetics data in a regional sandstone aquifer intruded by low-permeability volcanic dykes in Northern Ireland, UK. The use of two two-dimensional bivariate training images aids in creating spatial probability distributions of heterogeneities of hydrogeological interest, despite relatively ‘noisy’ magnetics data (i.e. including hydrogeologically irrelevant urban noise and regional geological effects). These distributions are incorporated into a hierarchical system in which previously published density function and upscaling methods are applied to derive regional distributions of the equivalent hydraulic conductivity tensor K. Several K models, as determined by several stochastic realisations of MPS dyke locations, are computed within groundwater flow models and evaluated by comparing modelled heads with field observations. Results show a significant improvement in model calibration compared to a simplistic homogeneous and isotropic aquifer model that does not account for the dyke occurrence evidenced by airborne magnetic data. The best model is obtained when normal- and reverse-polarity dykes are computed separately within the MPS simulations and a probability threshold of 0.7 is applied. The presented stochastic approach also provides an improvement over a previously published deterministic anisotropic model based on the unprocessed (i.e. noisy) airborne magnetics. This demonstrates the potential of coupling MPS with airborne geophysical data for regional groundwater modelling.
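As a small illustration of the final thresholding step mentioned above, the sketch below turns an ensemble of binary dyke-indicator realisations into a per-cell occurrence probability map and applies the 0.7 cut-off. The realisations here are random placeholders, not direct-sampling output, and the grid size is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
# Placeholder for an ensemble of binary dyke-indicator realisations (1 = dyke cell).
realisations = rng.random((50, 40, 40)) < 0.3      # 50 realisations on a 40x40 grid

probability_map = realisations.mean(axis=0)        # per-cell dyke probability
dyke_mask = probability_map >= 0.7                 # cells retained at the 0.7 threshold

print(int(dyke_mask.sum()), "cells classified as dyke at p >= 0.7")
```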
Abstract:
Future digital signal processing (DSP) systems must provide robustness at the algorithm and application level to the reliability issues that come with implementations in modern semiconductor process technologies. In this paper, we address this issue by investigating the impact of unreliable memories on general DSP systems. In particular, we propose a novel framework to characterize the effects of unreliable memories, which enables us to devise new methods to mitigate the associated performance loss. We propose to deploy specifically designed data representations, which can substantially improve system reliability compared to conventional data representations used in digital integrated circuits, such as 2's-complement or sign-magnitude number formats. To demonstrate the efficacy of the proposed framework, we analyze the impact of unreliable memories on coded communication systems and show that the deployment of optimized data representations substantially improves their error-rate performance.
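The optimized data representations themselves are not described in this abstract. The short experiment below only illustrates the underlying effect that motivates them: in a conventional 2's-complement format, the numerical damage caused by a single flipped memory bit grows exponentially with the bit's significance, so which bits land in unreliable cells matters. This is a simplified fault model, not the paper's framework.

```python
def twos_complement_value(word):
    """Interpret an 8-bit pattern (0..255) as a signed 2's-complement integer."""
    return word - 256 if word >= 128 else word

def mean_abs_error_per_bit(bit, width=8):
    """Average absolute value change caused by flipping one bit position,
    taken over all 8-bit 2's-complement words."""
    total = 0
    for word in range(1 << width):
        flipped = word ^ (1 << bit)
        total += abs(twos_complement_value(flipped) - twos_complement_value(word))
    return total / (1 << width)

for bit in range(8):
    print(f"bit {bit}: mean |error| = {mean_abs_error_per_bit(bit):.1f}")
```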
Abstract:
Correctly modelling and reasoning with uncertain information from heterogeneous sources in large-scale systems is critical when the reliability is unknown and we still want to derive adequate conclusions. To this end, context-dependent merging strategies have been proposed in the literature. In this paper we investigate how one such context-dependent merging strategy (originally defined for possibility theory), called largely partially maximal consistent subsets (LPMCS), can be adapted to Dempster-Shafer (DS) theory. We identify those measures for the degree of uncertainty and internal conflict that are available in DS theory and show how they can be used for guiding LPMCS merging. A simplified real-world power distribution scenario illustrates our framework. We also briefly discuss how our approach can be incorporated into a multi-agent programming language, thus leading to better plan selection and decision making.
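The LPMCS adaptation itself is beyond an abstract-level sketch, but the basic Dempster-Shafer combination step it builds on can be illustrated. Below is Dempster's rule of combination for two mass functions over a small frame of discernment; the conflict mass K that appears in the normalisation is one of the measures of disagreement between sources that merging strategies can exploit. The sensor scenario and mass values are made up.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions (dicts keyed by frozensets).
    Returns the normalised combined mass function and the conflict mass K."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    norm = 1.0 - conflict
    return {s: w / norm for s, w in combined.items()}, conflict

# Toy usage: two sensors reporting on whether a distribution component is faulty.
F, OK = frozenset({"faulty"}), frozenset({"ok"})
BOTH = F | OK                                   # total ignorance
m_sensor1 = {F: 0.6, BOTH: 0.4}
m_sensor2 = {OK: 0.3, BOTH: 0.7}
fused, k = dempster_combine(m_sensor1, m_sensor2)
print(fused, "conflict:", k)
```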
Abstract:
To provide timely reactions to a large volume of surveillance data, uncertainty-enabled event reasoning frameworks for CCTV- and sensor-based intelligent surveillance systems have been integrated to model and infer events of interest. However, most existing works do not consider decision making under uncertainty, which is important for surveillance operators. In this paper, we extend an event reasoning framework for decision support, which enables it to predict, rank and raise alarms for threats from multiple heterogeneous sources.
Abstract:
Threat-prevention intelligent surveillance systems typically face three issues: first, the fusion and interpretation of large-scale, incomplete, heterogeneous information; second, the need to effectively predict suspects' intentions and rank the potential threats posed by each suspect; and third, strategies for allocating limited security resources (e.g., dispatching a security team) to prevent a suspect's further actions against critical assets. In the literature, however, these three issues are seldom considered together in a sensor-network-based intelligent surveillance framework. To address this problem, in this paper we propose a multi-level decision support framework for in-time reaction in intelligent surveillance. More specifically, based on a multi-criteria event modeling framework, we design a method to predict the most plausible intention of a suspect. Following this, a decision support model is proposed to rank suspects based on their threat severity and to determine resource allocation strategies. Finally, formal properties are discussed to justify our framework.
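The decision-support model is described here only at a high level. The snippet below is a hypothetical sketch of the ranking and allocation step: each suspect is scored by combining the plausibility of the inferred intention with the criticality of the threatened asset, and a limited number of security teams is then assigned greedily. All fields, weights and data are made up for illustration and are not the paper's model.

```python
from dataclasses import dataclass

@dataclass
class Suspect:
    ident: str
    intention_plausibility: float   # from the event-reasoning layer, in [0, 1]
    asset_criticality: float        # value of the threatened asset, in [0, 1]

def threat_score(s, w_plausibility=0.6, w_criticality=0.4):
    """Simple weighted threat severity used to rank suspects (illustrative weights)."""
    return w_plausibility * s.intention_plausibility + w_criticality * s.asset_criticality

def allocate_teams(suspects, n_teams):
    """Greedy allocation: dispatch the limited teams to the highest-scoring suspects."""
    ranked = sorted(suspects, key=threat_score, reverse=True)
    return [s.ident for s in ranked[:n_teams]]

suspects = [Suspect("S1", 0.9, 0.5), Suspect("S2", 0.4, 0.9), Suspect("S3", 0.7, 0.7)]
print(allocate_teams(suspects, n_teams=2))
```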
Abstract:
The predominant fear in capital markets is that of a price spike. Commodity markets differ in that there is a fear of both upward and downward jumps; this results in implied volatility curves displaying distinct shapes when compared to equity markets. A novel functional data analysis (FDA) approach provides a framework for producing and interpreting functional objects that characterise the underlying dynamics of oil futures options. We use the FDA framework to examine implied volatility, jump risk and pricing dynamics within crude oil markets. Examining a WTI crude oil sample for the 2007–2013 period, which includes the global financial crisis and the Arab Spring, we find strong evidence of converse jump dynamics during periods of demand- and supply-side weakness. This is used as the basis for an FDA-derived, Merton (1976) jump-diffusion-optimised delta hedging strategy, which exhibits superior portfolio management results over traditional methods.
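The abstract refers to a Merton (1976) jump-diffusion hedging strategy. As background only, the sketch below prices a European call under the standard Merton model as a Poisson-weighted mixture of Black-Scholes prices; the FDA-derived calibration used in the paper is not reproduced, and all input values are illustrative rather than fitted to WTI data.

```python
import math
from statistics import NormalDist

N = NormalDist().cdf

def black_scholes_call(S, K, r, sigma, T):
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def merton_jump_call(S, K, r, sigma, T, lam, mu_j, sigma_j, n_terms=60):
    """Merton (1976) call price: Poisson-weighted sum of Black-Scholes prices with
    jump-adjusted drift and variance. lam is the jump intensity; jump sizes are
    lognormal with parameters (mu_j, sigma_j)."""
    k = math.exp(mu_j + 0.5 * sigma_j ** 2) - 1.0        # expected relative jump size
    lam_p = lam * (1.0 + k)
    price = 0.0
    for n in range(n_terms):
        sigma_n = math.sqrt(sigma ** 2 + n * sigma_j ** 2 / T)
        r_n = r - lam * k + n * math.log(1.0 + k) / T
        weight = math.exp(-lam_p * T) * (lam_p * T) ** n / math.factorial(n)
        price += weight * black_scholes_call(S, K, r_n, sigma_n, T)
    return price

# Illustrative crude-oil-style inputs (not calibrated to the paper's sample).
print(round(merton_jump_call(S=90.0, K=95.0, r=0.02, sigma=0.30, T=0.5,
                             lam=0.8, mu_j=-0.05, sigma_j=0.15), 4))
```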
Abstract:
IMPORTANCE Systematic reviews and meta-analyses of individual participant data (IPD) aim to collect, check, and reanalyze individual-level data from all studies addressing a particular research question and are therefore considered a gold standard approach to evidence synthesis. They are likely to be used with increasing frequency as current initiatives to share clinical trial data gain momentum and may be particularly important in reviewing controversial therapeutic areas.
OBJECTIVE To develop PRISMA-IPD as a stand-alone extension to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) Statement, tailored to the specific requirements of reporting systematic reviews and meta-analyses of IPD. Although developed primarily for reviews of randomized trials, many items will apply in other contexts, including reviews of diagnosis and prognosis.
DESIGN Development of PRISMA-IPD followed the EQUATOR Network framework guidance and used the existing standard PRISMA Statement as a starting point to draft additional relevant material. A web-based survey informed discussion at an international workshop that included researchers, clinicians, methodologists experienced in conducting systematic reviews and meta-analyses of IPD, and journal editors. The statement was drafted and iterative refinements were made by the project, advisory, and development groups. The PRISMA-IPD Development Group reached agreement on the PRISMA-IPD checklist and flow diagram by consensus.
FINDINGS Compared with standard PRISMA, the PRISMA-IPD checklist includes 3 new items that address (1) methods of checking the integrity of the IPD (such as pattern of randomization, data consistency, baseline imbalance, and missing data), (2) reporting any important issues that emerge, and (3) exploring variation (such as whether certain types of individuals benefit more from the intervention than others). A further item was created by reorganizing standard PRISMA items relating to interpreting results. Wording was modified in 23 items to reflect the IPD approach.
CONCLUSIONS AND RELEVANCE PRISMA-IPD provides guidelines for reporting systematic reviews and meta-analyses of IPD.