990 results for Consensus processes
Abstract:
Considering some predictive mechanisms, we show that ultrafast average consensus can be achieved in networks of interconnected agents. More specifically, by predicting the dynamics of the network several steps ahead and using this information in the design of each agent's consensus protocol, drastic improvements can be achieved in the speed of consensus convergence, without changing the topology of the network. Moreover, using these predictive mechanisms, the range of sampling periods leading to consensus convergence is greatly expanded compared with the routine consensus protocol. This study provides a mathematical basis for the idea that predictive mechanisms exist in widespread biological swarms, flocks, and networks. From the industrial engineering point of view, including an efficient predictive mechanism allows a significant increase in the speed of consensus convergence and a reduction of the communication energy required to achieve a predefined consensus performance.
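The speed-up described above can be illustrated with a small numerical sketch. The graph (a 5-node path), the Perron-weight update, and the convex mix of the one-step and h-step updates are illustrative assumptions, not the paper's exact protocol:

```python
import numpy as np

# Plain consensus: x(k+1) = W x(k) with W doubly stochastic, so agents
# converge to the average of their initial values. A "predictive" agent
# instead mixes the ordinary update with an h-step-ahead prediction
# W^h x(k), which damps the slow modes faster.
n = 5
A = np.zeros((n, n))
for i in range(n - 1):                     # undirected path graph
    A[i, i + 1] = A[i + 1, i] = 1.0
deg = A.sum(axis=1)
eps = 0.9 / deg.max()
W = np.eye(n) - eps * (np.diag(deg) - A)   # Perron (Laplacian-based) weights

x0 = np.arange(n, dtype=float)
target = x0.mean()

def run(x, steps, h=0, alpha=0.5):
    """Iterate consensus; if h > 0, blend in an h-step prediction."""
    for _ in range(steps):
        if h:
            x = (1 - alpha) * (W @ x) + alpha * (np.linalg.matrix_power(W, h) @ x)
        else:
            x = W @ x
    return x

err_plain = np.abs(run(x0.copy(), 30) - target).max()
err_pred  = np.abs(run(x0.copy(), 30, h=5) - target).max()
print(err_plain, err_pred)
```

With these weights the predictive update contracts the slowest eigenmode noticeably faster per iteration, so `err_pred` comes out well below `err_plain` after the same number of steps.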
Abstract:
Many data are naturally modeled by an unobserved hierarchical structure. In this paper we propose a flexible nonparametric prior over unknown data hierarchies. The approach uses nested stick-breaking processes to allow for trees of unbounded width and depth, where data can live at any node and are infinitely exchangeable. One can view our model as providing infinite mixtures where the components have a dependency structure corresponding to an evolutionary diffusion down a tree. By using a stick-breaking approach, we can apply Markov chain Monte Carlo methods based on slice sampling to perform Bayesian inference and simulate from the posterior distribution on trees. We apply our method to hierarchical clustering of images and topic modeling of text data.
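The building block of the nested prior is the stick-breaking construction; a minimal sketch of a single (truncated) stick-breaking draw is below. The concentration `alpha` and truncation level `K` are illustrative choices:

```python
import numpy as np

# Stick-breaking (GEM) construction: break a unit-length stick with
# proportions v_k ~ Beta(1, alpha); the k-th weight is the piece
# pi_k = v_k * prod_{j<k} (1 - v_j). The nested model in the abstract
# applies this recursively to build a tree; here we draw one level.
rng = np.random.default_rng(0)
alpha, K = 2.0, 50
v = rng.beta(1.0, alpha, size=K)
remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
pi = v * remaining                         # mixture weights
pi_total = pi.sum()                        # approaches 1 as K grows
print(pi[:5], pi_total)
```

By the telescoping product, the K weights sum to exactly `1 - prod(1 - v)`, so the truncated weights always form a sub-probability vector that fills the unit mass as `K` increases.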
Abstract:
We define a copula process which describes the dependencies between arbitrarily many random variables independently of their marginal distributions. As an example, we develop a stochastic volatility model, Gaussian Copula Process Volatility (GCPV), to predict the latent standard deviations of a sequence of random variables. To make predictions we use Bayesian inference, with the Laplace approximation, and with Markov chain Monte Carlo as an alternative. We find both methods comparable. We also find our model can outperform GARCH on simulated and financial data. And unlike GARCH, GCPV can easily handle missing data, incorporate covariates other than time, and model a rich class of covariance structures.
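The core copula idea, separating dependence from marginals, can be sketched for the Gaussian case: correlated Gaussian draws pushed through the standard normal CDF have uniform marginals while keeping their dependence. The correlation value and sample size below are arbitrary illustrations, not part of the GCPV model itself:

```python
import math
import numpy as np

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Correlated Gaussian pairs -> Gaussian copula sample on [0,1]^2.
rng = np.random.default_rng(1)
rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=20000)
u = np.vectorize(norm_cdf)(z)

# Marginals are (approximately) Uniform(0,1); dependence survives.
mean_u = u.mean(axis=0)
corr_u = np.corrcoef(u[:, 0], u[:, 1])[0, 1]
print(mean_u, corr_u)
```

This is the sense in which a copula "describes the dependencies independently of the marginals": any marginal distribution can then be imposed by applying its inverse CDF to `u`.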
Abstract:
We introduce a stochastic process with Wishart marginals: the generalised Wishart process (GWP). It is a collection of positive semi-definite random matrices indexed by an arbitrary dependent variable. We use it to model dynamic (e.g. time-varying) covariance matrices. Unlike existing models, it can capture a diverse class of covariance structures, it can easily handle missing data, the dependent variable can readily include covariates other than time, and it scales well with dimension; there is no need for free parameters, and optional parameters are easy to interpret. We describe how to construct the GWP, introduce general procedures for inference and predictions, and show that it outperforms its main competitor, multivariate GARCH, even on financial data that especially suits GARCH. We also show how to predict the mean of a multivariate process while accounting for dynamic correlations.
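A sketch of the construction, under simplifying assumptions: sum outer products of Gaussian-process sample paths, Sigma(t) = sum_i L u_i(t) u_i(t)^T L^T. The squared-exponential kernel, its lengthscale, and the scale matrix L are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
d, nu, T = 3, 5, 10                        # dimension, degrees of freedom, time points
t = np.linspace(0.0, 1.0, T)

# Squared-exponential GP kernel over the index variable (here, time).
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.2 ** 2) + 1e-8 * np.eye(T)
L = np.linalg.cholesky(np.array([[1.0, 0.3, 0.1],
                                 [0.3, 1.0, 0.2],
                                 [0.1, 0.2, 1.0]]))

# u[i, j, :] is one GP sample path; shape (nu, d, T).
u = rng.multivariate_normal(np.zeros(T), K, size=(nu, d))

# Sigma(t_k) = sum_i (L u_i(t_k)) (L u_i(t_k))^T has a Wishart marginal.
Sigma = np.zeros((T, d, d))
for k in range(T):
    for i in range(nu):
        v = L @ u[i, :, k]
        Sigma[k] += np.outer(v, v)

min_eig = min(np.linalg.eigvalsh(S).min() for S in Sigma)
print(min_eig)
```

Each matrix in the collection is positive semi-definite by construction, and because the underlying GPs are smooth in `t`, neighbouring covariance matrices vary smoothly, which is what makes the process usable as a dynamic covariance model.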
Abstract:
Given a spectral density matrix or, equivalently, a real autocovariance sequence, the author seeks to determine a finite-dimensional linear time-invariant system which, when driven by white noise, will produce an output whose spectral density is approximately Φ(ω), together with an approximate spectral factor of Φ(ω). The author employs the Anderson-Faurre theory in his analysis.
Abstract:
This paper considers the effect of the rotor tip on the casing heat load of a transonic axial-flow turbine. The aim of the research is to understand the dominant causes of casing heat transfer. Experimental measurements were conducted at engine-representative Mach number, Reynolds number, and stage-inlet-to-casing-wall temperature ratio. Time-resolved heat-transfer coefficient and gas recovery temperature on the casing were measured using an array of heat-transfer gauges. Time-resolved static pressure on the casing wall was measured using Kulite pressure transducers. Time-resolved numerical simulations were undertaken to aid understanding of the mechanism responsible for casing heat load. The results show that between 35% and 60% axial chord, the rotor tip-leakage flow is responsible for more than 50% of casing heat transfer. The effects of gas recovery temperature and heat-transfer coefficient were investigated separately, and it is shown that an increased stagnation temperature in the rotor tip gap dominates casing heat transfer. In the tip gap, the stagnation temperature is shown to rise above that found at stage inlet (combustor exit) by as much as 35% of the stage total temperature drop. The rise in stagnation temperature is caused by an isentropic work input to the tip-leakage fluid by the rotor. The size of this mechanism is investigated by computationally tracking fluid path lines through the rotor tip gap to understand the unsteady work processes that occur. Copyright © 2005 by ASME.
Abstract:
During laser welding, the keyhole is generated by the recoil pressure induced by the evaporation processes occurring mainly on the front keyhole wall (KW). In order to characterize the evaporation process, we have measured this recoil pressure using a plume-deflection technique, in which the plume generated under static conditions (i.e. with no sample displacement) is deflected by a transverse side gas jet. From the measured plume deflection angle, the recoil pressure can be determined as a function of incident intensity and sample material. From these data one can estimate the pressure generated on the front KW during laser welding. The corresponding dynamic pressure exerted by the vapor plume expansion on the rear KW, in contact with the melt pool, can then also be estimated. These pressures appear to be in close agreement with those generated by an additional side jet used in previous experiments to stabilize the observed melt-pool oscillations or fluctuations.
Abstract:
This working paper is part of a review of aquaculture technologies and gender in Bangladesh in the period 1990 to 2014. It assesses how gender has been integrated within past aquaculture technology interventions, before exploring the gender dimensions associated with current approaches to transferring knowledge about homestead aquaculture technology. It draws out existing knowledge, identifies research gaps, and selects practices to build upon--as well as practices to move away from. The review examines the research and practice of WorldFish and other development partners in Bangladesh through consultations, a review of gray and published literature, and fieldwork. It aims to contribute to the development of aquaculture technology dissemination methodologies that strengthen and underpin women’s participation in aquaculture.
Abstract:
It is shown in the paper how robustness can be guaranteed for consensus protocols with heterogeneous dynamics in a scalable and decentralized way, i.e., by each agent satisfying a test that does not require knowledge of the entire network. Random graph examples illustrate that the proposed certificates are not conservative for classes of large-scale networks, despite the heterogeneity of the dynamics, which is a distinctive feature of this work. The conditions hold for symmetric protocols, and more conservative stability conditions are given for general nonsymmetric interconnections. Nonlinear extensions in an IQC framework are finally discussed. Copyright © 2005 IFAC.
Abstract:
In this paper we present Poisson sum series representations for α-stable (αS) random variables and α-stable processes, in particular concentrating on continuous-time autoregressive (CAR) models driven by α-stable Lévy processes. Our representations aim to provide a conditionally Gaussian framework, which will allow parameter estimation using Rao-Blackwellised versions of state-of-the-art Bayesian computational methods such as particle filters and Markov chain Monte Carlo (MCMC). To overcome the issues due to truncation of the series, novel residual approximations are developed. Simulations demonstrate the potential of these Poisson sum representations for inference in otherwise intractable α-stable models. © 2011 IEEE.
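A minimal sketch of the series idea (illustrative of a symmetric α-stable draw, not the paper's CAR construction, and omitting the residual correction the paper develops): with Γ_i the arrival times of a unit-rate Poisson process and W_i standard Gaussian, X = Σ_i Γ_i^{-1/α} W_i is conditionally Gaussian given the Γ_i, which is exactly what enables Rao-Blackwellised inference:

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, M, N = 1.5, 200, 1000               # stability index, truncation, sample size

# Arrival times of a unit-rate Poisson process: cumulative exponentials.
gammas = np.cumsum(rng.exponential(size=(N, M)), axis=1)
W = rng.standard_normal((N, M))

# Truncated Poisson sum series for each of the N draws.
X = (gammas ** (-1.0 / alpha) * W).sum(axis=1)

# Conditionally on the arrival times, X | Gamma ~ N(0, sum_i Gamma_i^{-2/alpha}),
# so the Gaussian machinery (Kalman-type updates) applies given the Gammas.
cond_var = (gammas ** (-2.0 / alpha)).sum(axis=1)
print(X[:3], cond_var[:3])
```

The heavy tails of the α-stable law come from occasional very small first arrivals Γ_1, which inflate the conditional variance; the truncation at `M` terms is what the paper's residual approximations are designed to correct.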
Abstract:
This paper presents a method to manage Engineering Changes (EC) during the product development process, which is seen to be a complex system. The ability to manage engineering changes efficiently reflects the agility of an enterprise. Although there are unnecessary ECs that should be avoided, many of the ECs are actually beneficial. The proposed method explores the linkages between the product development process features and product specifications dependencies. It suggests ways of identifying and managing specification dependencies to support the Engineering Change Management process. Furthermore, the impacts of an EC on the product specifications as well as on the process organization are studied. © 2009 World Scientific Publishing Company.
Abstract:
The authors use simulation to analyse the resource-driven dependencies between concurrent processes used to create customised products in a company. Such processes are uncertain and unique according to the design changes required. However, they have similar structures. For simulation, a level of abstraction is chosen such that all possible processes are represented by the same activity network. Differences between processes are determined by the customisations that they implement. The approach is illustrated through application to a small business that creates customised fashion products. We suggest that similar techniques could be applied to study intertwined design processes in more complex domains. Copyright © 2011 Inderscience Enterprises Ltd.
Abstract:
Novel statistical models are proposed and developed in this paper for automated multiple-pitch estimation problems. Point estimates of the parameters of partial frequencies of a musical note are modeled as realizations from a non-homogeneous Poisson process defined on the frequency axis. When several notes are combined, the processes for the individual notes combine to give a new Poisson process whose likelihood is easy to compute. This model avoids the data-association step of linking the harmonics of each note with the corresponding partials and is ideal for efficient Bayesian inference of unknown multiple fundamental frequencies in a signal. © 2011 IEEE.
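The superposition property that makes the likelihood easy can be sketched directly: each note contributes a non-homogeneous Poisson intensity on the frequency axis, and the combined observation is again Poisson with the summed intensity, so no data-association step is needed. The Gaussian-bump intensity shape and all parameter values below are illustrative assumptions, not the paper's model:

```python
import math

def note_intensity(f, f0, n_partials=5, width=3.0, height=10.0):
    """Intensity with bumps at the harmonics h * f0 of one note."""
    return sum(height * math.exp(-0.5 * ((f - h * f0) / width) ** 2)
               for h in range(1, n_partials + 1))

def log_likelihood(freqs, f0s, f_max=2000.0, step=1.0):
    """Poisson-process log-likelihood of observed partial frequencies:
    sum of log-intensities at the observations minus the integrated
    intensity (integral via a simple Riemann sum)."""
    lam = lambda f: sum(note_intensity(f, f0) for f0 in f0s)
    integral = sum(lam(k * step) * step for k in range(int(f_max / step)))
    return sum(math.log(lam(f)) for f in freqs) - integral

observed = [220.0, 440.0, 330.0, 660.0, 880.0]  # partials of 220 Hz and 330 Hz notes
ll_true = log_likelihood(observed, [220.0, 330.0])
ll_wrong = log_likelihood(observed, [250.0, 300.0])
print(ll_true, ll_wrong)
```

Because the summed intensity scores each observed partial without assigning it to a note, hypotheses over sets of fundamentals can be compared directly; here the true pair of fundamentals attains the higher log-likelihood.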