20 results for Interval graphs
Abstract:
In developing neural network techniques for real-world applications, it is still very rare to see estimates of confidence placed on the neural network predictions. This is a major deficiency, especially in safety-critical systems. In this paper we explore three distinct methods of producing point-wise confidence intervals using neural networks: Bayesian, Gaussian process and predictive error bars, which we compare and contrast on real data. The problem domain is the calibration of a real automotive engine management system for both air-fuel ratio determination and on-line ignition timing. This problem requires real-time control and, given its safety-critical nature, is a good candidate for exploring the use of confidence predictions.
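The abstract does not detail the three methods, so as a rough illustration of the general idea of point-wise error bars, the sketch below derives a confidence band from the spread of a small ensemble of independently trained networks. The data, the ensemble size, and the mean +/- 2*std interval rule are illustrative assumptions, a simple stand-in rather than the paper's Bayesian or Gaussian-process machinery.

# Hedged sketch: point-wise confidence from an ensemble of neural nets.
# Everything here is illustrative, not the methods compared in the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

# Train a small ensemble; disagreement between members signals low confidence.
ensemble = [
    MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                 random_state=seed).fit(X, y)
    for seed in range(10)
]

X_test = np.linspace(-3, 3, 50).reshape(-1, 1)
preds = np.stack([m.predict(X_test) for m in ensemble])  # shape (10, 50)
mean, std = preds.mean(axis=0), preds.std(axis=0)
lower, upper = mean - 2 * std, mean + 2 * std  # approximate 95% band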
Abstract:
We study a variation of the graph coloring problem on random graphs of finite average connectivity. Given the number of colors, we aim to maximize the number of different colors at the neighboring vertices (i.e., at one edge distance) of any vertex. Two efficient algorithms, belief propagation and Walksat, are adapted to carry out this task. We present experimental results based on two types of random graphs for different system sizes and identify the critical value of the connectivity for the algorithms to find a perfect solution. The problem and the suggested algorithms have practical relevance, since various applications, such as distributed storage, can be mapped onto this problem.
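A minimal sketch of one plausible Walksat-style adaptation to this objective (every vertex should see as many distinct colors among its neighbors as its degree and the palette allow). The move rule and parameters below are assumptions for illustration, not the paper's exact algorithm.

# Hedged sketch: local search for the color-diversity problem above.
import random
import networkx as nx

def diverse_coloring(G, q, steps=100_000, p_random=0.3, seed=0):
    rng = random.Random(seed)
    color = {v: rng.randrange(q) for v in G}

    def unsatisfied():
        # Vertices that see fewer distinct neighbor colors than possible.
        return [v for v in G
                if len({color[u] for u in G[v]}) < min(q, G.degree(v))]

    for _ in range(steps):
        bad = unsatisfied()
        if not bad:
            return color  # perfect solution found
        v = rng.choice(bad)
        u = rng.choice(list(G[v]))
        if rng.random() < p_random:
            color[u] = rng.randrange(q)  # Walksat-style random move
        else:
            seen = {color[w] for w in G[v] if w != u}
            missing = [c for c in range(q) if c not in seen]
            color[u] = rng.choice(missing) if missing else rng.randrange(q)
    return color  # best effort after the step budget

G = nx.random_regular_graph(4, 100, seed=1)
coloring = diverse_coloring(G, q=3)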
Abstract:
Resource allocation in sparsely connected networks, a representative problem of systems with real variables, is studied using the replica and Bethe approximation methods. An efficient distributed algorithm is devised on the basis of insights gained from the analysis and is examined using numerical simulations, showing excellent performance and full agreement with the theoretical results. The physical properties of the resource allocation model are discussed.
Abstract:
The problem of resource allocation in sparse graphs with real variables is studied using methods of statistical physics. An efficient distributed algorithm is devised on the basis of insight gained from the analysis and is examined using numerical simulations, showing excellent performance and full agreement with the theoretical results.
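Neither this abstract nor the previous one states the cost function explicitly; in this line of work the problem is commonly written as follows, with \Lambda_i the (possibly negative) capacity of node i and y_{ij} the current drawn from node j to node i (a hedged reconstruction, not a quotation from the papers):

\min_{\{y_{ij}\}} \sum_{(ij)} \phi(y_{ij})
\quad \text{subject to} \quad
\Lambda_i + \sum_{j \in \mathcal{N}(i)} y_{ji} \ge 0 \;\; \forall i,
\qquad y_{ji} = -y_{ij},

with, for example, the quadratic transport cost \phi(y) = y^2/2.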
Abstract:
We propose a simple model that captures the salient properties of distribution networks, and study the possible occurrence of blackouts, i.e., sudden failures of large portions of such networks. The model is defined on a random graph of finite connectivity. The nodes of the graph represent hubs of the network, while the edges represent the links of the distribution network. Both the nodes and the edges carry dynamical two-state variables representing the functioning or dysfunctional state of the node or link in question. We describe a dynamical process in which the breakdown of a link or node is triggered when the level of maintenance it receives falls below a given threshold. If maintenance levels are themselves dependent on the functioning of the network, this form of dynamics can lead to catastrophic breakdown once maintenance levels locally fall below a critical threshold due to fluctuations. We formulate conditions under which such systems can be analyzed in terms of thermodynamic equilibrium techniques, and under these conditions derive a phase diagram characterizing the collective behavior of the system, given its model parameters. The phase diagram is confirmed qualitatively and quantitatively by simulations on explicit realizations of the graph, thus confirming the validity of our approach. © 2007 The American Physical Society.
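As a toy illustration of the threshold dynamics described above (the abstract leaves the precise update rule open), one can iterate failures on a sparse random graph until no further node loses sufficient maintenance. The maintenance rule below, a node's maintenance equals the fraction of functioning neighbors, is an illustrative assumption.

# Hedged toy simulation of threshold-driven cascading breakdown.
import random
import networkx as nx

def cascade(n=1000, c=4.0, threshold=0.5, p_init_fail=0.05, seed=0):
    rng = random.Random(seed)
    G = nx.gnp_random_graph(n, c / n, seed=seed)
    up = {v: rng.random() > p_init_fail for v in G}  # True = functioning

    changed = True
    while changed:
        changed = False
        for v in G:
            nbrs = list(G[v])
            if not up[v] or not nbrs:
                continue
            maintenance = sum(up[u] for u in nbrs) / len(nbrs)
            if maintenance < threshold:  # insufficient maintenance -> fail
                up[v] = False
                changed = True
    return sum(up.values()) / n  # surviving fraction of the network

print(cascade())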
Abstract:
Inference and optimization of real-valued edge variables in sparse graphs are studied using the Bethe approximation and replica method of statistical physics. Equilibrium states of general energy functions involving a large set of real edge variables that interact at the network nodes are obtained in various cases. When applied to the representative problem of network resource allocation, efficient distributed algorithms are also devised. Scaling properties with respect to the network connectivity and the resource availability are found, and links to probabilistic Bayesian approximation methods are established. Different cost measures are considered and algorithmic solutions in the various cases are devised and examined numerically. Simulation results are in full agreement with the theory. © 2007 The American Physical Society.
Abstract:
Non-uniform B-spline dictionaries on a compact interval are discussed in the context of sparse signal representation. For each given partition, dictionaries of B-spline functions for the corresponding spline space are built up by dividing the partition into subpartitions and joining together the bases for the concomitant subspaces. The resulting slightly redundant dictionaries are composed of B-spline functions of broader support than those of the B-spline basis for the same space. Such dictionaries are meant to assist in the construction of adaptive sparse signal representations through a combination of stepwise optimal greedy techniques.
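A minimal SciPy sketch of assembling B-spline atoms over a partition. The knot vector, degree, and windowing are illustrative choices; the paper's construction additionally merges bases from subpartitions to obtain atoms of broader support.

# Hedged sketch: cubic B-spline atoms on a uniform partition of [0, 1].
import numpy as np
from scipy.interpolate import BSpline

degree = 3
knots = np.linspace(0, 1, 16)  # the partition of [0, 1]

# One atom per window of degree+2 consecutive knots.
atoms = [BSpline.basis_element(knots[i:i + degree + 2], extrapolate=False)
         for i in range(len(knots) - degree - 1)]

x = np.linspace(0, 1, 400)
D = np.nan_to_num(np.stack([a(x) for a in atoms])).T  # columns = atoms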
Abstract:
Motor timing tasks have been employed in studies of neurodevelopmental disorders such as developmental dyslexia and ADHD, where they provide an index of temporal processing ability. Investigations of these disorders have used different stimulus parameters within the motor timing tasks, which are likely to affect performance measures. Here we assessed the effect of auditory and visual pacing stimuli on synchronised motor timing performance and its relationship with cognitive and behavioural predictors that are commonly used in the diagnosis of these highly prevalent developmental disorders. Twenty-one children (mean age 9.6 years) completed a finger tapping task in two stimulus conditions, together with additional psychometric measures. As anticipated, synchronisation to the beat (ISI 329 ms) was less accurate in the visually paced condition. Decomposition of timing variance indicated that this effect resulted from differences in the way that visually and auditorily paced tasks are processed by central timekeeping and associated peripheral implementation systems. The ability to utilise an efficient processing strategy on the visual task correlated with both reading and sustained attention skills. Dissociations between these patterns of relationship across task modality suggest that not all timing tasks are equivalent.
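The "decomposition of timing variance" here is presumably the classic Wing and Kristofferson two-level model, under which inter-tap interval variance splits into a central timekeeper component and a peripheral motor-implementation component: var(I) = clock_var + 2 * motor_var, and the lag-1 autocovariance equals -motor_var. Assuming that model, a minimal estimator is:

# Hedged sketch: Wing-Kristofferson decomposition of inter-tap intervals.
import numpy as np

def wing_kristofferson(itis):
    itis = np.asarray(itis, dtype=float)
    d = itis - itis.mean()
    lag1_acov = np.mean(d[:-1] * d[1:])
    motor_var = max(-lag1_acov, 0.0)                   # peripheral component
    clock_var = max(itis.var() - 2 * motor_var, 0.0)   # central timekeeper
    return clock_var, motor_var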
Abstract:
Optimal paths connecting randomly selected network nodes and fixed routers are studied analytically in the presence of a nonlinear overlap cost that penalizes congestion. Routing becomes more difficult as the number of selected nodes increases and exhibits ergodicity breaking in the case of multiple routers. The ground state of such systems reveals nonmonotonic complex behaviors in average path length and algorithmic convergence, depending on the network topology and the densities of communicating nodes and routers. A distributed linearly scalable routing algorithm is also devised. © 2012 American Physical Society.
Abstract:
Background - Several antipsychotic agents are known to prolong the QT interval in a dose-dependent manner. A corrected QT interval (QTc) exceeding a threshold value of 450 ms may be associated with an increased risk of life-threatening arrhythmias. Antipsychotic agents are often given in combination with other psychotropic drugs, such as antidepressants, that may also contribute to QT prolongation. This observational study compares the effects on the QT interval of antipsychotic monotherapy and psychoactive polytherapy, which included an additional antidepressant or lithium treatment. Method - We examined two groups of hospitalized women with schizophrenia, bipolar disorder or schizoaffective disorder in a naturalistic setting. Group 1 was composed of nineteen hospitalized women treated with antipsychotic monotherapy (haloperidol, olanzapine, risperidone or clozapine), and Group 2 was composed of nineteen hospitalized women treated with an antipsychotic (haloperidol, olanzapine, risperidone or quetiapine) plus an additional antidepressant (citalopram, escitalopram, sertraline, paroxetine, fluvoxamine, mirtazapine, venlafaxine or clomipramine) or lithium. An electrocardiogram (ECG) was carried out for both groups before the beginning of treatment and again after four days of therapy at full dosage, when blood was also drawn to determine serum levels of the antipsychotic. Statistical analysis included repeated-measures ANOVA, Fisher's exact test and the independent t-test. Results - Mean QTc intervals increased significantly in Group 2 (24 ± 21 ms) but not in Group 1 (-1 ± 30 ms) (repeated-measures ANOVA, p < 0.01). Furthermore, significantly more patients exceeded the borderline QTc threshold of 450 ms in Group 2 (seven patients, 38%) than in Group 1 (one patient, 7%) (Fisher's exact test, p < 0.05). Conclusions - No significant prolongation of the QT interval was found following monotherapy with an antipsychotic agent, while combination of these drugs with antidepressants caused a significant QT prolongation. Careful monitoring of the QT interval is suggested for patients taking a combined treatment of antipsychotic and antidepressant agents.
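The abstract does not state which heart-rate correction was applied; the most common choice is Bazett's formula, QTc = QT / sqrt(RR) with RR in seconds, sketched below purely for orientation and not necessarily the study's method.

# Hedged sketch: heart-rate correction of the QT interval (Bazett's formula).
def qtc_bazett(qt_ms, heart_rate_bpm):
    rr_s = 60.0 / heart_rate_bpm   # RR interval in seconds
    return qt_ms / rr_s ** 0.5     # QTc in milliseconds

print(qtc_bazett(400, 75))  # ~447 ms; above 450 ms would be borderline here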
Abstract:
The increasing intensity of global competition has led organizations to utilize various types of performance measurement tools for improving the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision-making units (DMUs) that use multiple inputs to produce multiple outputs. In conventional DEA with input and/or output ratios, all data are assumed to take the form of crisp numbers. However, the observed values of data in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems that are not suitable or are difficult to model with crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios; and (4) we present a case study involving 20 banks with three interval ratios, where the traditional indicators are mostly financial ratios, to demonstrate the applicability and efficacy of the proposed models. © 2011 Elsevier Inc.
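For orientation, the crisp input-oriented CCR multiplier model that these interval models generalize reduces to a small linear program per DMU: maximize u.y_k subject to v.x_k = 1 and u.y_j - v.x_j <= 0 for every DMU j. The sketch below uses made-up data, not the 20-bank case study.

# Hedged sketch: crisp CCR efficiency of DMU k via scipy's linprog.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])  # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [1.5]])                  # outputs

def ccr_efficiency(k):
    s, m = Y.shape[1], X.shape[1]        # weight vectors u (outputs), v (inputs)
    c = np.r_[-Y[k], np.zeros(m)]        # linprog minimizes, so negate u.y_k
    A_ub = np.c_[Y, -X]                  # u.y_j - v.x_j <= 0 for all j
    b_ub = np.zeros(len(X))
    A_eq = np.r_[np.zeros(s), X[k]].reshape(1, -1)  # normalization v.x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

print([round(ccr_efficiency(k), 3) for k in range(len(X))])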
Abstract:
Although crisp data are fundamentally indispensable for determining the profit Malmquist productivity index (MPI), the observed values in real-world problems are often imprecise or vague. These imprecise or vague data can be suitably characterized with fuzzy and interval methods. In this paper, we reformulate the conventional profit MPI problem as an imprecise data envelopment analysis (DEA) problem, and propose two novel methods for measuring the overall profit MPI when the inputs, outputs, and price vectors are fuzzy or vary in intervals. We develop a fuzzy version of the conventional MPI model by using a ranking method, and solve the model with a commercial off-the-shelf DEA software package. In addition, we define an interval for the overall profit MPI of each decision-making unit (DMU) and divide the DMUs into six groups according to the intervals obtained for their overall profit efficiency and MPIs. We also present two numerical examples to demonstrate the applicability of the two proposed models and exhibit the efficacy of the procedures and algorithms. © 2011 Elsevier Ltd.
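For reference, the conventional (crisp) Malmquist productivity index between periods t and t+1 is the geometric mean of two distance-function ratios; the paper's contribution is the fuzzy/interval extension of this quantity:

\mathrm{MPI} = \left[ \frac{D^{t}(x^{t+1}, y^{t+1})}{D^{t}(x^{t}, y^{t})}
\cdot \frac{D^{t+1}(x^{t+1}, y^{t+1})}{D^{t+1}(x^{t}, y^{t})} \right]^{1/2},

with MPI > 1 indicating productivity growth.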
Abstract:
Conventional DEA models assume deterministic, precise and non-negative data for input and output observations. However, real applications may be characterized by observations that are given in the form of intervals and include negative numbers. For instance, the consumption of electricity in decentralized energy resources may be either negative or positive, depending on the heat consumption. Likewise, the heat losses in distribution networks may lie within a certain range, depending on, e.g., external temperature and real-time outtake. Complementing earlier work that addressed the two problems of interval data and negative data separately, we propose a comprehensive evaluation process for measuring the relative efficiencies of a set of DMUs in DEA. In our general formulation, the intervals may contain upper or lower bounds with different signs. The proposed method determines upper and lower bounds for the technical efficiency through the limits of the intervals after decomposition. Based on the interval scores, DMUs are then classified into three classes: strictly efficient, weakly efficient and inefficient. An intuitive ranking approach is presented for the respective classes. The approach is demonstrated through an application to the evaluation of bank branches. © 2013.
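One common way to obtain such bounds (in the spirit of Despotis and Smirlis; the paper's decomposition additionally handles bounds of mixed sign) evaluates each DMU at the endpoints of its intervals: the upper score E_o^U solves the efficiency problem with DMU o at its most favorable data (x_o^L, y_o^U) while every other DMU j takes (x_j^U, y_j^L), and the lower score E_o^L reverses the roles, using (x_o^U, y_o^L) against (x_j^L, y_j^U). On this reading, DMUs with E_o^L = 1 are strictly efficient, those with E_o^U = 1 > E_o^L are weakly efficient, and those with E_o^U < 1 are inefficient.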