878 results for Multi-component systems
Abstract:
In this paper, we evaluate the Probabilistic Occupancy Map (POM) pedestrian detection algorithm on the PETS 2009 benchmark dataset. POM is a multi-camera generative detection method, which estimates ground plane occupancy from multiple background subtraction views. Occupancy probabilities are iteratively estimated by fitting a synthetic model of the background subtraction to the binary foreground motion. Furthermore, we test the integration of this algorithm into a larger framework designed for understanding human activities in real environments. We demonstrate accurate detection and localization on the PETS dataset, despite suboptimal calibration and foreground motion segmentation input.
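For illustration, the sketch below is a heavily simplified 1-D analogue of this kind of generative fit, not the POM implementation evaluated in the paper: candidate ground positions project to intervals in a binary "image", a synthetic foreground is composed from the current occupancy probabilities, and each probability is refined by coordinate descent to reduce the mismatch with the observed foreground.

```python
# Toy 1-D analogue of generative occupancy estimation (illustrative only,
# not the POM implementation evaluated in the paper).
import numpy as np

W = 100                                  # width of the 1-D "image"
positions = np.arange(0, W, 5)           # candidate ground-plane positions
silhouettes = np.zeros((len(positions), W), dtype=bool)
for k, p in enumerate(positions):        # each position projects to an interval
    silhouettes[k, p:p + 8] = True

def synthetic_foreground(q):
    """Probability that each pixel is foreground given occupancies q."""
    # a pixel is background only if no occupied position covers it
    return 1.0 - np.prod(1.0 - q[:, None] * silhouettes, axis=0)

def fit_occupancy(observed, n_iter=50, grid=np.linspace(0, 1, 21)):
    """Coordinate-descent fit of occupancy probabilities to a binary foreground."""
    q = np.full(len(positions), 0.1)
    for _ in range(n_iter):
        for k in range(len(positions)):
            best, best_err = q[k], np.inf
            for cand in grid:            # try candidate values for q_k
                q[k] = cand
                err = np.sum((synthetic_foreground(q) - observed) ** 2)
                if err < best_err:
                    best, best_err = cand, err
            q[k] = best
    return q

# Observed binary foreground produced by "people" at positions 20 and 60
truth = np.zeros(W, dtype=float)
truth[20:28] = 1.0
truth[60:68] = 1.0
q_hat = fit_occupancy(truth)
print(np.round(q_hat, 2))   # high probabilities near the true positions
```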
Abstract:
The hypercube is one of the most popular topologies for connecting processors in multicomputer systems. In this paper we address the maximum order of a connected component in a hypercube with faulty vertices. The results established include several known conclusions as special cases. We conclude that the hypercube structure is resilient, as it retains a large connected component in the presence of a large number of faulty vertices.
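For small cubes, the quantity studied here, the order of the largest connected component once faulty vertices are removed, can be computed directly by search. A brute-force sketch follows, assuming the vertices of Q_n are labelled by n-bit integers and the faults are given as a set.

```python
# Largest connected component of an n-cube after removing faulty vertices
# (brute-force check for small n, to illustrate the quantity analysed above).
from collections import deque

def max_component_order(n, faulty):
    """Return the order of the largest connected component of Q_n minus faulty."""
    faulty = set(faulty)
    unvisited = {v for v in range(2 ** n) if v not in faulty}
    best = 0
    while unvisited:
        start = unvisited.pop()
        queue, size = deque([start]), 1
        while queue:
            v = queue.popleft()
            for i in range(n):                 # neighbours differ in one bit
                u = v ^ (1 << i)
                if u in unvisited:
                    unvisited.remove(u)
                    queue.append(u)
                    size += 1
        best = max(best, size)
    return best

# Example: Q_4 with the neighbourhood of vertex 0 removed
print(max_component_order(4, faulty={1, 2, 4, 8}))   # 11: vertex 0 is isolated
```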
Abstract:
In this paper, a new equalizer learning scheme is introduced based on directional evolutionary multi-objective optimization (EMOO). Whilst nonlinear channel equalizers such as radial basis function (RBF) equalizers have been widely studied to combat the linear and nonlinear distortions in modern communication systems, most of them do not take the equalizers' generalization capabilities into account. In this paper, equalizers are designed with the aim of improving their generalization capabilities. It is proposed that this objective can be achieved by treating the equalizer design problem as a multi-objective optimization (MOO) problem, with each objective based on one of several training sets, and then deriving equalizers that recover the signals well for all the training sets. Conventional EMOO, which is widely applied to MOO problems, suffers from disadvantages such as slow convergence. Directional EMOO improves the computational efficiency of conventional EMOO by explicitly making use of directional information. The new equalizer learning scheme based on directional EMOO is applied to RBF equalizer design. Computer simulations demonstrate that the new scheme can derive RBF equalizers with good generalization capabilities, i.e., good performance in predicting unseen samples.
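A minimal sketch of the multi-objective formulation described above is given below: an RBF equalizer evaluated on several training sets yields a vector of objectives, one mean-squared error per set, which an EMOO search would then minimise. The channel model, kernel width and data are hypothetical, and the directional EMOO search itself is not reproduced.

```python
# Sketch of the multi-objective view of RBF equalizer design: one MSE
# objective per training set (the EMOO search itself is not shown).
import numpy as np

rng = np.random.default_rng(0)

def rbf_output(x, centres, weights, sigma):
    """RBF equalizer output for input vectors x (rows)."""
    d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2)) @ weights

def make_training_set(n=200, channel=(1.0, 0.5), noise=0.1):
    """Hypothetical BPSK symbols passed through a short linear channel."""
    s = rng.choice([-1.0, 1.0], size=n + 1)
    r = channel[0] * s[1:] + channel[1] * s[:-1] + noise * rng.standard_normal(n)
    x = np.column_stack([r[1:], r[:-1]])     # two-tap received vector
    return x, s[1:-1]                        # detect the delayed symbol

def objectives(centres, weights, sigma, training_sets):
    """Vector of objectives: one mean-squared error per training set."""
    return np.array([np.mean((rbf_output(x, centres, weights, sigma) - d) ** 2)
                     for x, d in training_sets])

training_sets = [make_training_set(noise=nl) for nl in (0.05, 0.1, 0.2)]
centres = rng.standard_normal((8, 2))        # candidate equalizer parameters
weights = rng.standard_normal(8)
print(objectives(centres, weights, 1.0, training_sets))
```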
Abstract:
A multi-layered architecture of self-organizing neural networks is being developed as part of an intelligent alarm processor that analyses a stream of power grid fault messages and provides a suggested diagnosis of the fault location. Feedback concerning the accuracy of the diagnosis is provided by an object-oriented grid simulator, which acts as an external supervisor to the learning system. The use of artificial neural networks within this environment should result in a powerful generic alarm processor that does not require extensive training by a human expert to produce accurate results.
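As a rough illustration of the kind of self-organizing network involved, the sketch below trains a single small self-organizing map on hypothetical binary alarm-message feature vectors; the multi-layered architecture and the grid-simulator feedback loop described above are not reproduced.

```python
# Minimal self-organizing map (SOM) sketch: clustering hypothetical binary
# alarm-message feature vectors onto a small 2-D map.
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, rows=4, cols=4, epochs=20, lr0=0.5, radius0=2.0):
    """Train a rows x cols SOM on data (n_samples x n_features)."""
    weights = rng.random((rows, cols, data.shape[1]))
    coords = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols),
                                   indexing="ij"))        # unit coordinates
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                    # decaying rate
        radius = radius0 * (1 - epoch / epochs) + 0.5      # shrinking radius
        for x in rng.permutation(data):
            # best-matching unit (BMU)
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood centred on the BMU
            grid_d2 = ((coords - np.array(bmu)) ** 2).sum(axis=2)
            h = np.exp(-grid_d2 / (2 * radius ** 2))[:, :, None]
            weights += lr * h * (x - weights)
    return weights

# Hypothetical alarm messages encoded as 12-bit indicator vectors
alarms = rng.integers(0, 2, size=(100, 12)).astype(float)
som = train_som(alarms)
print(som.shape)   # (4, 4, 12): one prototype per map unit
```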
Abstract:
One of the most pervasive concepts underlying computational models of information processing in the brain is linear input integration of rate-coded univariate information by neurons. After a suitable learning process this results in neuronal structures that statically represent knowledge as a vector of real-valued synaptic weights. Although this general framework has contributed to the many successes of connectionism, in this paper we argue that for all but the most basic cognitive processes, a more complex, multivariate dynamic neural coding mechanism is required: knowledge should not be spatially bound to a particular neuron or group of neurons. We conclude the paper with a discussion of a simple experiment that illustrates dynamic knowledge representation in a spiking neuron connectionist system.
Abstract:
Multi-rate multicarrier DS-CDMA is a potentially attractive multiple-access method for future wireless networks that must support multimedia, and thus multi-rate, traffic. Since high-performance detection such as coherent demodulation requires explicit knowledge of the channel, this paper proposes a subspace-based blind adaptive algorithm for timing acquisition and channel estimation in asynchronous multi-rate multicarrier DS-CDMA systems, which is applicable to both multicode and variable-spreading-factor systems.
Abstract:
Where users interact in a distributed virtual environment, the actions of each user must be observed by peers with sufficient consistency and within a limited delay so as not to be detrimental to the interaction. The consistency control issue may be split into three parts: update control; consistent enactment and evolution of events; and causal consistency. The delay in the presentation of events, termed latency, depends primarily on the network propagation delay and on the consistency control algorithms. The latency induced by the consistency control algorithm, in particular causal ordering, is proportional to the number of participants. This paper describes how the effect of network delays may be reduced and introduces a scalable solution that provides sufficient consistency control while minimising its effect on latency. The principles described have been developed at Reading over the past five years. Similar principles are now emerging in the simulation community through the HLA standard. This paper attempts to validate the suggested principles within the schema of distributed simulation and virtual environments and to compare and contrast them with those described in the HLA definition documents.
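The scaling issue noted above, causal-ordering overhead growing with the number of participants, can be seen directly in the vector-clock bookkeeping commonly used for causal ordering. The sketch below is a generic illustration with hypothetical participants, not the Reading scheme or the HLA services.

```python
# Minimal vector-clock bookkeeping for causal ordering. The per-message
# metadata is one counter per participant, which is the O(N) cost noted
# above; this is a generic illustration, not the scheme proposed in the paper.

class Participant:
    """One peer in a causally ordered group of n participants."""

    def __init__(self, pid, n):
        self.pid = pid
        self.clock = [0] * n          # one counter per participant: O(N) state

    def send(self):
        """Stamp an outgoing event with a copy of the local vector clock."""
        self.clock[self.pid] += 1
        return list(self.clock)

    def deliverable(self, sender, stamp):
        """Causal delivery test for a message stamped by `sender`."""
        return (stamp[sender] == self.clock[sender] + 1 and
                all(stamp[k] <= self.clock[k]
                    for k in range(len(self.clock)) if k != sender))

    def deliver(self, sender, stamp):
        """Merge the stamp into the local clock after delivery."""
        self.clock = [max(a, b) for a, b in zip(self.clock, stamp)]

# Three participants: p1 must wait for p0's event before a later one from p2
p0, p1, p2 = Participant(0, 3), Participant(1, 3), Participant(2, 3)
s0 = p0.send()                 # event at p0
p2.deliver(0, s0)              # p2 sees it first...
s2 = p2.send()                 # ...and sends a causally later event
print(p1.deliverable(2, s2))   # False: p1 has not yet delivered p0's event
p1.deliver(0, s0)
print(p1.deliverable(2, s2))   # True once the causal predecessor has arrived
```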
Abstract:
Multi-factor approaches to the analysis of real estate returns have, since the pioneering work of Chan, Hendershott and Sanders (1990), emphasised a macro-variables approach in preference to the latent-factor approach that formed the original basis of the arbitrage pricing theory. With increasing use of high-frequency data and trading strategies, and with a growing emphasis on the risks of extreme events, the macro-variable procedure has some deficiencies. This paper explores a third way, using an alternative to the standard principal components approach: independent components analysis (ICA). ICA seeks higher-moment independence and maximises with respect to a chosen risk parameter. We apply an ICA based on kurtosis maximisation to weekly US REIT data. The results show that ICA is successful in capturing the kurtosis characteristics of REIT returns, offering possibilities for the development of risk management strategies that are sensitive to extreme events and tail distributions.
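The sketch below illustrates a kurtosis-based, fixed-point ICA on synthetic heavy-tailed "return" series; it is an assumption-laden stand-in for the estimator used in the paper, and the REIT data are not reproduced.

```python
# Sketch of kurtosis-based ICA (fixed-point, deflation style) on synthetic
# heavy-tailed "return" series. Illustrative only: the REIT data and the
# exact estimator used in the paper are not reproduced.
import numpy as np

rng = np.random.default_rng(2)

def whiten(x):
    """Centre and whiten the observations (rows = samples)."""
    xc = x - x.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(xc, rowvar=False))
    return xc @ vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

def ica_kurtosis(x, n_components, n_iter=200, tol=1e-9):
    """Extract components by maximising the kurtosis contrast (deflation)."""
    z = whiten(x)
    W = []
    for _ in range(n_components):
        w = rng.standard_normal(z.shape[1])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            # fixed-point update for the kurtosis contrast
            w_new = (z * (z @ w)[:, None] ** 3).mean(axis=0) - 3.0 * w
            for v in W:                          # deflation: stay orthogonal
                w_new -= (w_new @ v) * v
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < tol
            w = w_new
            if converged:
                break
        W.append(w)
    W = np.array(W)
    return z @ W.T, W                            # components, unmixing rows

def excess_kurtosis(y):
    yc = y - y.mean(axis=0)
    return (yc ** 4).mean(axis=0) / (yc ** 2).mean(axis=0) ** 2 - 3.0

# Two heavy-tailed sources mixed into correlated "asset return" series
t = 2000
sources = np.column_stack([rng.standard_t(3, t), rng.laplace(size=t)])
mixed = sources @ np.array([[1.0, 0.4], [0.3, 1.0]])
comps, _ = ica_kurtosis(mixed, n_components=2)
print(np.round(excess_kurtosis(comps), 2))       # components stay leptokurtic
```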
Abstract:
In this paper the use of neural networks for the control of dynamical systems is considered. Both identification and feedback control aspects are discussed, as well as the types of system for which neural networks can provide a useful technique. Multi-layer Perceptron and Radial Basis Function network types are examined, with an emphasis on the latter. It is shown that basis function centre selection is a critical part of the implementation process and that multivariate clustering algorithms can be an extremely useful tool for finding centres.
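A minimal sketch of the centre-selection step follows: cluster the inputs (here with plain k-means), place the basis function centres at the cluster centroids, and solve the output weights by least squares. The 1-D regression task is hypothetical.

```python
# Minimal RBF network with clustering-based centre selection: k-means picks
# the centres, least squares fits the output weights (illustrative sketch).
import numpy as np

rng = np.random.default_rng(3)

def kmeans(x, k, n_iter=50):
    """Plain k-means returning the cluster centroids."""
    centres = x[rng.choice(len(x), size=k, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(x[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = x[labels == j].mean(axis=0)
    return centres

def rbf_design(x, centres, sigma):
    """Design matrix of Gaussian basis function responses."""
    d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

# Hypothetical 1-D regression task
x = np.linspace(-3, 3, 200)[:, None]
y = np.sin(2 * x[:, 0]) + 0.1 * rng.standard_normal(len(x))

centres = kmeans(x, k=10)                       # centre selection by clustering
phi = rbf_design(x, centres, sigma=0.5)
w, *_ = np.linalg.lstsq(phi, y, rcond=None)     # output weights by least squares
print(np.mean((phi @ w - y) ** 2))              # training MSE of the RBF model
```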
Abstract:
The consistency of precipitation variability estimated from multiple satellite-based observing systems is assessed. There is generally good agreement between the TRMM TMI, SSM/I, GPCP and AMSRE datasets for the inter-annual variability of precipitation since 1997, but the HOAPS dataset appears to overestimate the magnitude of variability. Over the tropical ocean the TRMM 3B42 dataset produces unrealistic variability. Based upon deseasonalised GPCP data for the period 1998-2008, the sensitivity of global mean precipitation (P) to surface temperature (T) changes (dP/dT) is about 6%/K, although a smaller sensitivity of 3.6%/K is found using monthly GPCP data over the longer period 1989-2008. Over the tropical oceans dP/dT ranges from 10-30%/K depending upon time period and dataset, while over tropical land dP/dT is -8 to -11%/K for the 1998-2008 period. Analysing the response of the tropical ocean precipitation intensity distribution to changes in T, we find that P over the wetter regimes shows a strong positive response to T of around 20%/K. The response over the drier tropical regimes is less coherent and varies with dataset, but responses over tropical land show significant negative relationships on an interannual time-scale. The spatial and temporal resolutions of the datasets strongly influence the precipitation responses over the tropical oceans and help explain some of the discrepancy between different datasets. Consistency between datasets is found to increase on averaging from daily to 5-day time-scales and on considering a 1° (or coarser) spatial resolution. Defining the wet and dry tropical ocean regimes by the 60th percentile of P intensity, the 5-day average, 1° TMI data exhibit a coherent drying of the dry regime at a rate of -20%/K, and the wet regime becomes wetter at a similar rate with warming.
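A dP/dT figure of this kind can be computed as the regression slope of fractional precipitation anomalies against temperature anomalies; the sketch below shows such a calculation on synthetic deseasonalised anomaly series (the GPCP/TMI data themselves are not reproduced, and the paper's exact estimator may differ).

```python
# Sketch of a dP/dT calculation: regress fractional (percentage) precipitation
# anomalies on temperature anomalies. Synthetic anomaly series stand in for
# the deseasonalised satellite data.
import numpy as np

rng = np.random.default_rng(4)

def dp_dt(precip, temp):
    """Slope of 100*(P - Pbar)/Pbar against temperature anomalies, in %/K."""
    p_anom = 100.0 * (precip - precip.mean()) / precip.mean()
    t_anom = temp - temp.mean()
    return np.polyfit(t_anom, p_anom, 1)[0]

# Synthetic monthly series with a built-in sensitivity of about 6 %/K
months = 240
t_anom = 0.2 * rng.standard_normal(months)                               # K
precip = 2.7 * (1 + 0.06 * t_anom) + 0.02 * rng.standard_normal(months)  # mm/day
print(round(dp_dt(precip, t_anom + 15.0), 1))   # recovers roughly 6 %/K
```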
Abstract:
In order to harness the computational capacity of dissociated cultured neuronal networks, it is necessary to understand neuronal dynamics and connectivity on a mesoscopic scale. To this end, this paper uncovers dynamic spatiotemporal patterns emerging from electrically stimulated neuronal cultures by using hidden Markov models (HMMs) to characterize multi-channel spike trains as a progression of patterns of underlying states of neuronal activity. Experimentation aimed at the optimal choice of parameters for such models is essential, and the results are reported in detail. Results derived from ensemble neuronal data revealed highly repeatable patterns of state transitions on the order of milliseconds in response to probing stimuli.
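A minimal sketch of characterising multi-channel spike trains with an HMM is given below, using the third-party hmmlearn package on synthetic binned spike counts; the culture recordings and the parameter study reported above are not reproduced, and a Gaussian-emission model is assumed purely for illustration.

```python
# Sketch: fit a hidden Markov model to binned multi-channel spike counts
# (synthetic data; the cultured-network recordings and the parameter study
# reported in the paper are not reproduced). Requires the hmmlearn package.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(5)

# Synthetic recording: 3 underlying activity states, 8 electrode channels,
# spike counts per bin drawn from state-dependent Poisson rates.
n_states, n_channels, n_bins = 3, 8, 2000
rates = rng.uniform(0.1, 3.0, size=(n_states, n_channels))
states = np.zeros(n_bins, dtype=int)
for t in range(1, n_bins):                     # sticky random state sequence
    states[t] = states[t - 1] if rng.random() < 0.95 else rng.integers(n_states)
counts = rng.poisson(rates[states]).astype(float)

# Fit a Gaussian-emission HMM to the binned counts and decode the state path
model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                        n_iter=100, random_state=0)
model.fit(counts)
decoded = model.predict(counts)
print(len(np.unique(decoded)))   # number of distinct inferred states used
```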
Abstract:
This technical note investigates the controllability of the linearized dynamics of the multilink inverted pendulum as the number of links and the number and location of actuators change. It is demonstrated that, in some instances, there exist sets of parameter values that render the system uncontrollable, so the usual methods for assessing controllability are difficult to employ. To assess controllability, a theorem on strong structural controllability for single-input systems is extended to the multi-input case.
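The usual method referred to above is the Kalman rank test; the sketch below applies it to a hypothetical parameter-dependent linearised system (not the pendulum model of the note) and shows how controllability can be lost at particular parameter values, which is what motivates the structural approach.

```python
# Kalman rank test for controllability: the pair (A, B) is controllable iff
# [B, AB, ..., A^{n-1}B] has full row rank. Hypothetical example system,
# not the linearised pendulum model of the note.
import numpy as np

def controllability_matrix(A, B):
    """Stack [B, AB, ..., A^{n-1}B] column-wise."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

def is_controllable(A, B, tol=1e-9):
    C = controllability_matrix(A, B)
    return np.linalg.matrix_rank(C, tol=tol) == A.shape[0]

def example_system(w1, w2):
    """Two oscillators driven by one shared input (hypothetical example)."""
    A = np.array([[0.0,      1.0, 0.0,      0.0],
                  [-w1 ** 2, 0.0, 0.0,      0.0],
                  [0.0,      0.0, 0.0,      1.0],
                  [0.0,      0.0, -w2 ** 2, 0.0]])
    B = np.array([[0.0], [1.0], [0.0], [1.0]])
    return A, B

for w2 in (1.0, 2.0):                  # equal frequencies lose controllability
    A, B = example_system(1.0, w2)
    print(w2, is_controllable(A, B))   # False for w2 == 1.0, True for w2 == 2.0
```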
Abstract:
Polymer-stabilised liquid crystals are systems in which a small amount of monomer is dissolved within a liquid crystalline host, and then polymerised in situ to produce a network. The progress of the polymerisation, performed within electro-optic cells, was studied by establishing an analytical method novel to these systems. Samples were prepared by photopolymerisation of the monomer under well-defined reaction conditions; subsequent immersion in acetone caused the host and any unreacted monomer to dissolve. High performance liquid chromatography was used to separate and detect the various solutes in the resulting solutions, enabling the amount of unreacted monomer for a given set of conditions to be quantified. Longer irradiations cause a decrease in the proportion of unreacted monomer since more network is formed, while a more uniform LC director alignment (achieved by decreasing the sample thickness) or a higher level of order (achieved by decreasing the polymerisation temperature) promotes faster reactions.
Abstract:
An experimental method is described which enables the inelastically scattered X-ray component to be removed from diffractometer data prior to radial density function analysis. At each scattering angle an energy spectrum is generated using a Si(Li) detector combined with a multi-channel analyser, from which the coherently scattered component is separated. The data obtained from organic polymers have an improved signal-to-noise ratio at high values of scattering angle, and a commensurate enhancement of the resolution of the RDF at low r is demonstrated for PMMA (ICI 'Perspex'). The method obviates the need for the complicated correction for multiple scattering.