148 results for "Topological entropy"


Relevance: 10.00%

Abstract:

The problem of estimating pseudobearing rate information of an airborne target based on measurements from a vision sensor is considered. Novel image speed and heading angle estimators are presented that exploit image morphology, hidden Markov model (HMM) filtering, and relative entropy rate (RER) concepts to allow pseudobearing rate information to be determined before (or whilst) the target track is being estimated from vision information.
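
The HMM filtering component can be sketched generically with the standard normalized forward recursion; the two-state chain below is a hypothetical stand-in (the transition and emission matrices are illustrative, not the paper's vision-sensor model):

```python
import numpy as np

def hmm_forward(A, B, pi, obs):
    """Normalized HMM forward filter: returns P(state_t | obs_1..t) per step."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    out = [alpha]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # predict, then correct
        alpha /= alpha.sum()
        out.append(alpha)
    return np.array(out)

# Toy 2-state chain: sticky states, mostly-reliable observations.
A  = np.array([[0.9, 0.1], [0.1, 0.9]])   # transition probabilities
B  = np.array([[0.9, 0.1], [0.1, 0.9]])   # emission probabilities
pi = np.array([0.5, 0.5])                 # uniform initial belief

post = hmm_forward(A, B, pi, [0, 0, 0])
print(post[-1])  # belief concentrates on state 0
```

Repeated consistent observations drive the posterior toward one state, which is the mechanism the image speed and heading estimators exploit.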

Relevance: 10.00%

Abstract:

This thesis establishes performance properties for approximate filters and controllers that are designed on the basis of approximate dynamic system representations. These performance properties provide a theoretical justification for the widespread application of approximate filters and controllers in the common situation where system models are not known with complete certainty. This research also provides useful tools for approximate filter designs, which are applied to hybrid filtering of uncertain nonlinear systems. As a contribution towards applications, this thesis also investigates air traffic separation control in the presence of measurement uncertainties.

Relevance: 10.00%

Abstract:

Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future data set drawn from the prior predictive distribution. Many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature to rapidly obtain samples from the posterior is importance sampling, using the prior as the importance distribution. However, importance sampling will tend to break down if there is a reasonable number of experimental observations and/or the model parameter is high dimensional. In this paper we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution to obtain a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near optimal plasma sampling times which produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
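
The Laplace-as-importance-distribution idea can be sketched on a toy conjugate model of my choosing (normal likelihood, normal prior), where the exact posterior mean is available for comparison; the pharmacokinetic models in the paper are far richer:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
y = rng.normal(1.5, 1.0, size=20)     # simulated observations, known sigma = 1
tau2 = 100.0                          # prior variance: theta ~ N(0, tau2)

def neg_log_post(th):
    return 0.5 * np.sum((y - th) ** 2) + 0.5 * th ** 2 / tau2

# Laplace approximation: mode and curvature of the negative log posterior.
mode = optimize.minimize_scalar(neg_log_post).x
h = 1e-4
hess = (neg_log_post(mode + h) - 2 * neg_log_post(mode)
        + neg_log_post(mode - h)) / h ** 2
q = stats.norm(mode, np.sqrt(1.0 / hess))   # Gaussian importance distribution

# Importance sampling with the Laplace approximation as proposal.
theta = q.rvs(size=5000, random_state=rng)
logw = -np.array([neg_log_post(t) for t in theta]) - q.logpdf(theta)
w = np.exp(logw - logw.max())
w /= w.sum()
post_mean_is = np.sum(w * theta)

# Exact posterior mean for this conjugate model, for comparison.
post_mean_exact = y.sum() / (len(y) + 1.0 / tau2)
print(post_mean_is, post_mean_exact)
```

Because the proposal tracks the posterior rather than the prior, the weights stay nearly uniform even with 20 observations, which is exactly the failure mode of prior-based importance sampling that the paper addresses.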

Relevance: 10.00%

Abstract:

A major challenge for robot localization and mapping systems is maintaining reliable operation in a changing environment. Vision-based systems in particular are susceptible to changes in illumination and weather, and the same location at another time of day may appear radically different to a feature-based visual localization system. One approach for mapping changing environments is to create and maintain maps that contain multiple representations of each physical location in a topological framework or manifold. However, this requires the system to be able to correctly link two or more appearance representations to the same spatial location, even though the representations may appear quite dissimilar. This paper proposes a method of linking visual representations from the same location without requiring a visual match, thereby allowing vision-based localization systems to create multiple appearance representations of physical locations. The most likely position on the robot path is determined using particle filter methods based on dead reckoning data and recent visual loop closures. In order to avoid erroneous loop closures, the odometry-based inferences are only accepted when the inferred path's end point is confirmed as correct by the visual matching system. Algorithm performance is demonstrated using an indoor robot dataset and a large outdoor camera dataset.
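
The core inference step can be sketched as a minimal particle filter: propagate particles with noisy dead reckoning, then weight and resample against a visually confirmed loop closure. All numbers below (motion noise, likelihood width, the closure position) are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
particles = rng.normal([0.0, 0.0], 1.0, size=(n, 2))  # belief over 2D position

# Dead reckoning: propagate every particle by the odometry increment plus noise.
odometry = [np.array([1.0, 0.5])] * 4
for u in odometry:
    particles += u + rng.normal(0.0, 0.1, size=(n, 2))

# Visual loop closure: a place whose position is already known on the map.
closure = np.array([4.0, 2.0])
d2 = np.sum((particles - closure) ** 2, axis=1)
w = np.exp(-0.5 * d2 / 0.5 ** 2)      # Gaussian observation likelihood
w /= w.sum()

# Resample, then take the mean as the most likely position on the path.
particles = particles[rng.choice(n, size=n, p=w)]
estimate = particles.mean(axis=0)
print(estimate)
```

The visual confirmation gate described in the abstract would sit on top of this: the odometry-only estimate is committed only when the closure itself passes the visual match.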

Relevance: 10.00%

Abstract:

Organizational transformations reliant on successful ICT system developments (continue to) fail to deliver projected benefits even when contemporary governance models are applied rigorously. Modifications to traditional program, project and systems development management methods have produced little material improvement to successful transformation, as they are unable to routinely address the complexity and uncertainty of dynamic alignment of IS investments and innovation. Complexity theory provides insight into why this phenomenon occurs and is used to develop a conceptualization of complexity in IS-driven organizational transformations. This research-in-progress aims to identify complexity formulations relevant to organizational transformation. Political/power-based influences, interrelated business rules, socio-technical innovation, impacts on stakeholders and emergent behaviors are commonly considered to characterize complexity; the proposed conceptualization accommodates these as connectivity, irreducibility, entropy and/or information gain in hierarchical approximation and scaling, the number of states in a finite automaton and/or the dimension of an attractor, and information and/or variety.

Relevance: 10.00%

Abstract:

The diagnostics of mechanical components operating in transient conditions is still an open issue, in both research and industry. Indeed, the signal processing techniques developed to analyse stationary data are not applicable, or suffer a loss of effectiveness, when applied to signals acquired in transient conditions. In this paper, a suitable and original signal processing tool (named EEMED), which can be used for mechanical component diagnostics in any operating condition and at any noise level, is developed exploiting data-adaptive techniques such as Empirical Mode Decomposition (EMD), Minimum Entropy Deconvolution (MED) and the analytical approach of the Hilbert transform. The proposed tool is able to supply diagnostic information on the basis of experimental vibrations measured in transient conditions. The tool was originally developed to detect localized faults on bearings installed in high-speed train traction equipment, and it is more effective at detecting a fault in non-stationary conditions than signal processing tools based on spectral kurtosis or envelope analysis, which have until now been the benchmark for bearing diagnostics.
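
Of the three ingredients, the Hilbert-transform envelope step is the easiest to illustrate in isolation; the signal below is a synthetic amplitude-modulated resonance with a hypothetical 10 Hz fault frequency, not measured train data:

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000                      # sampling rate, Hz
t = np.arange(fs) / fs         # 1 s of signal
fault, carrier = 10.0, 100.0   # hypothetical fault and resonance frequencies

# Bearing-like signal: a resonance amplitude-modulated at the fault frequency.
x = (1.0 + 0.8 * np.cos(2 * np.pi * fault * t)) * np.sin(2 * np.pi * carrier * t)

# Envelope via the analytic signal, then its spectrum.
env = np.abs(hilbert(x))
spec = np.abs(np.fft.rfft(env - env.mean()))
freqs = np.fft.rfftfreq(len(env), 1 / fs)
peak = freqs[np.argmax(spec)]
print(peak)  # the fault frequency dominates the envelope spectrum
```

In stationary conditions this envelope spectrum exposes the fault directly; the point of EEMED is that in transient conditions EMD and MED must first isolate and sharpen the impulsive content before this step is informative.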

Relevance: 10.00%

Abstract:

Diagnostics of rolling element bearings is usually performed by means of vibration signals measured by accelerometers placed in the proximity of the bearing under investigation. The aim is to monitor the integrity of the bearing components, in order to avoid catastrophic failures, or to implement condition-based maintenance strategies. In particular, the trend in this field is to combine in a single algorithm different signal-enhancement and signal-analysis techniques. Among the former, Minimum Entropy Deconvolution (MED) has been pointed out as a key tool able to highlight the effect of possible damage in one of the bearing components within the vibration signal. This paper presents the application of this technique to signals collected on a simple test-rig able to test damaged industrial roller bearings in different working conditions. The effectiveness of the technique has been tested by comparing the results of one undamaged bearing with three bearings artificially damaged in different locations, namely on the inner race, outer race and rollers. Since MED performance depends on the filter length, the most suitable value of this parameter is selected on the basis of both the application and the measured signals. This represents an original contribution of the paper.
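
A minimal sketch of the Wiggins-style MED iteration, with the filter length L exposed as the tunable parameter the paper discusses (the test signal, blurring kernel and iteration count are my own illustrative choices, not the paper's test-rig setup):

```python
import numpy as np
from scipy.linalg import toeplitz, solve

def med(x, L=20, iters=30):
    """Minimum Entropy Deconvolution (Wiggins-style): find an FIR filter of
    length L whose output maximizes kurtosis, i.e. restores impulsiveness."""
    r = np.correlate(x, x, 'full')[len(x) - 1:len(x) - 1 + L]
    R = toeplitz(r)                     # input autocorrelation matrix
    f = np.zeros(L)
    f[L // 2] = 1.0                     # start from a delayed delta
    for _ in range(iters):
        y = np.convolve(x, f)[:len(x)]  # causal filter output
        b = np.array([np.dot(y[l:] ** 3, x[:len(x) - l]) for l in range(L)])
        f = solve(R, b) * (np.sum(y ** 2) / np.sum(y ** 4))
        f /= np.linalg.norm(f)
    return np.convolve(x, f)[:len(x)]

def kurtosis(s):
    s = s - s.mean()
    return np.mean(s ** 4) / np.mean(s ** 2) ** 2

# Impulsive source (simulated fault impacts) blurred by a transmission path.
rng = np.random.default_rng(0)
src = np.zeros(2000)
src[::100] = 1.0
x = np.convolve(src, [0.2, 0.5, 1.0, 0.5, 0.2], 'same') + rng.normal(0, 0.02, 2000)
y = med(x, L=20)
print(kurtosis(x), kurtosis(y))  # MED output is more impulsive
```

Re-running with different L values and comparing the output kurtosis is one simple way to see why the filter length must be matched to the application and the measured signals.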

Relevance: 10.00%

Abstract:

The signal processing techniques developed for the diagnostics of mechanical components operating in stationary conditions are often not applicable, or suffer a loss of effectiveness, when applied to signals measured in transient conditions. In this chapter, an original signal processing tool is developed exploiting data-adaptive techniques such as Empirical Mode Decomposition, Minimum Entropy Deconvolution and the analytical approach of the Hilbert transform. The tool has been developed to detect localized faults on bearings of traction systems of high-speed trains, and it is more effective at detecting a fault in non-stationary conditions than signal processing tools based on envelope analysis or spectral kurtosis, which have until now been the benchmark for bearing diagnostics.

Relevance: 10.00%

Abstract:

Consistency and invariance in movements are traditionally viewed as essential features of skill acquisition and elite sports performance. This emphasis on the stabilization of action has resulted in important processes of adaptation in movement coordination during performance being overlooked in investigations of elite sport performance. Here we investigate whether differences exist between the movement kinematics displayed by five elite springboard divers (age 17 ± 2.4 years) in the preparation phases of baulked and completed take-offs. The two-dimensional kinematic characteristics of the reverse somersault take-off phases (approach and hurdle) were recorded during normal training sessions and used for intra-individual analysis. All participants displayed observable differences in movement patterns at key events during the approach phase; however, the presence of similar global topological characteristics suggested that, overall, participants did not perform distinctly different movement patterns during completed and baulked dives. These findings provide a powerful rationale for coaches to consider assessing functional variability or adaptability of motor behaviour as a key criterion of successful performance in sports such as diving.

Relevance: 10.00%

Abstract:

This study investigated movement synchronization of players within and between teams during competitive association football performance. Cluster phase analysis was introduced as a method to assess synchronies between whole teams and between individual players with their team as a function of time, ball possession and field direction. Measures of dispersion (SD) and regularity (sample entropy – SampEn – and cross sample entropy – Cross-SampEn) were used to quantify the magnitude and structure of synchrony. Large synergistic relations within each professional team sport collective were observed, particularly in the longitudinal direction of the field (0.89 ± 0.12) compared to the lateral direction (0.73 ± 0.16, p < .01). The coupling between the group measures of the two teams also revealed that changes in the synchrony of each team were intimately related (Cross-SampEn values of 0.02 ± 0.01). Interestingly, ball possession did not influence team synchronization levels. In player–team synchronization, individuals tended to be coordinated under near in-phase modes with team behavior (mean ranges between −7 and 5° of relative phase). The magnitudes of variations were low, but more irregular in time, for the longitudinal (SD: 18 ± 3°; SampEn: 0.07 ± 0.01), compared to the lateral direction (SD: 28 ± 5°; SampEn: 0.06 ± 0.01, p < .05) on-field. Increases in regularity were also observed between the first (SampEn: 0.07 ± 0.01) and second half (SampEn: 0.06 ± 0.01, p < .05) of the observed competitive game. Findings suggest that the method of analysis introduced in the current study may offer a suitable tool for examining teams' synchronization behaviors and the mutual influence of each team's cohesiveness in competing social collectives.
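
The sample entropy measure used above can be sketched as follows (m = 2 and r = 0.2·SD are common defaults, not necessarily the study's settings, and the two test signals are synthetic):

```python
import numpy as np

def sampen(x, m=2, r=None):
    """Sample entropy: -ln(A/B), where B counts template pairs of length m
    within tolerance r (Chebyshev distance) and A the same for length m+1."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    def pairs(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - m)])  # N - m templates
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return (np.sum(d <= r) - len(t)) / 2          # exclude self-matches
    return -np.log(pairs(m + 1) / pairs(m))

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
regular = np.sin(2 * np.pi * t)       # predictable signal -> low SampEn
noise = rng.normal(size=500)          # irregular signal -> high SampEn
print(sampen(regular), sampen(noise))
```

Low SampEn thus indicates regular, predictable synchrony dynamics, which is how the within-half regularity values reported above are read.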

Relevance: 10.00%

Abstract:

This study investigated changes in the complexity (magnitude and structure of variability) of the collective behaviours of association football teams during competitive performance. Raw positional data from an entire competitive match between two professional teams were obtained with the ProZone® tracking system. Five compound positional variables were used to investigate the collective patterns of performance of each team including: surface area, stretch index, team length, team width, and geometrical centre. Analyses involve the coefficient of variation (%CV) and approximate entropy (ApEn), as well as the linear association between both parameters. Collective measures successfully captured the idiosyncratic behaviours of each team and their variations across the six time periods of the match. Key events such as goals scored and game breaks (such as half time and full time) seemed to influence the collective patterns of performance. While ApEn values significantly decreased during each half, the %CV increased. Teams seem to become more regular and predictable, but with increased magnitudes of variation in their organisational shape over the natural course of a match.
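
The two variability measures, %CV and ApEn, can be sketched on a synthetic "surface area" series (the signal below is illustrative, not ProZone® data, and m = 2 with r = 0.2·SD are common defaults rather than the study's settings):

```python
import numpy as np

def apen(x, m=2, r=None):
    """Approximate entropy: phi(m) - phi(m+1), with self-matches included."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    def phi(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        c = np.mean(d <= r, axis=1)       # fraction of matching templates
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

def cv_percent(x):
    """Coefficient of variation as a percentage of the mean."""
    return 100.0 * np.std(x) / np.mean(x)

rng = np.random.default_rng(0)
area = 900 + 50 * np.sin(np.linspace(0, 20, 500))   # regular "surface area"
area_noisy = area + rng.normal(0, 50, 500)          # irregular variant
print(apen(area), apen(area_noisy), cv_percent(area))
```

Falling ApEn with rising %CV, as reported above, means the collective shape oscillates more widely but in a more repetitive pattern.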

Relevance: 10.00%

Abstract:

Proton-bound dimers consisting of two glycerophospholipids with different headgroups were prepared using negative ion electrospray ionization and dissociated in a triple quadrupole mass spectrometer. Analysis of the tandem mass spectra of the dimers using the kinetic method provides, for the first time, an order of acidity for the phospholipid classes in the gas phase of PE < PA << PG < PS < PI. Hybrid density functional calculations on model phospholipids were used to predict the absolute deprotonation enthalpies of the phospholipid classes from isodesmic proton transfer reactions with phosphoric acid. The computational data largely support the experimental acidity trend, with the exception of the relative acidity ranking of the two most acidic phospholipid species. Possible causes of the discrepancy between experiment and theory are discussed and the experimental trend is recommended. The sequence of gas phase acidities for the phospholipid headgroups is found to (1) have little correlation with the relative ionization efficiencies of the phospholipid classes observed in the negative ion electrospray process, and (2) correlate well with fragmentation trends observed upon collisional activation of phospholipid [M − H]⁻ anions. © 2005 American Society for Mass Spectrometry.
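
The acidity ordering rests on the standard single-reference kinetic-method relation (generic form, not quoted from the paper; T_eff is the effective temperature of the activated dimer):

```latex
% Dissociation of the deprotonated proton-bound dimer [A1 . H . A2]^- :
% the branching ratio of the two fragment anions reflects relative acidity.
\ln\frac{[\mathrm{A_1^-}]}{[\mathrm{A_2^-}]}
  \;\approx\;
  \frac{\Delta H_{\mathrm{acid}}(\mathrm{A_2H})
        - \Delta H_{\mathrm{acid}}(\mathrm{A_1H})}{R\,T_{\mathrm{eff}}}
```

The more acidic phospholipid (lower deprotonation enthalpy) preferentially retains the charge, so the measured branching ratios rank the headgroup classes.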

Relevance: 10.00%

Abstract:

We introduce the notion of distributed password-based public-key cryptography, where a virtual high-entropy private key is implicitly defined as a concatenation of low-entropy passwords held in separate locations. The users can jointly perform private-key operations by exchanging messages over an arbitrary channel, based on their respective passwords, without ever sharing their passwords or reconstituting the key. Focusing on the case of ElGamal encryption as an example, we start by formally defining ideal functionalities for distributed public-key generation and virtual private-key computation in the UC model. We then construct efficient protocols that securely realize them in either the RO model (for efficiency) or the CRS model (for elegance). We conclude by showing that our distributed protocols generalize to a broad class of “discrete-log”-based public-key cryptosystems, which notably includes identity-based encryption. This opens the door to a powerful extension of IBE with a virtual PKG made of a group of people, each one memorizing a small portion of the master key.
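
The joint private-key operation can be illustrated with plain ElGamal and an additively shared secret key. This is a toy sketch only: the paper's protocols derive the shares from passwords and prove UC security, none of which is attempted here, and the parameters below are illustrative:

```python
import secrets

# Toy parameters: a large prime modulus and a fixed base (illustrative only).
p = 2 ** 127 - 1
g = 3

# Each party holds only its own share; the full key x = x1 + x2 is never
# reconstituted in one place.
x1 = secrets.randbelow(p - 1)
x2 = secrets.randbelow(p - 1)
h = (pow(g, x1, p) * pow(g, x2, p)) % p   # joint public key from g^x1, g^x2

# Standard ElGamal encryption under the joint public key.
m = 123456789
r = secrets.randbelow(p - 1)
c1, c2 = pow(g, r, p), (m * pow(h, r, p)) % p

# Distributed decryption: each party applies only its share to c1.
d1 = pow(c1, x1, p)
d2 = pow(c1, x2, p)
recovered = (c2 * pow(d1 * d2 % p, -1, p)) % p
print(recovered == m)
```

Since d1·d2 = c1^(x1+x2) = h^r, the partial decryptions combine to undo the encryption without either party ever seeing the other's share.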

Relevance: 10.00%

Abstract:

The ability to understand and predict how thermal, hydrological, mechanical and chemical (THMC) processes interact is fundamental to many research initiatives and industrial applications. We (1) present a new Thermal–Hydrological–Mechanical–Chemical (THMC) coupling formulation, based on non-equilibrium thermodynamics; (2) show how THMC feedback is incorporated in the thermodynamic approach; (3) suggest a unifying thermodynamic framework for multi-scaling; and (4) formulate a new rationale for assessing upper and lower bounds of dissipation for THMC processes. The technique is based on deducing time and length scales suitable for separating processes using a macroscopic finite-time thermodynamic approach. We show that if the time and length scales are suitably chosen, the calculation of entropic bounds can be used to describe three different types of material and process uncertainties: geometric uncertainties, stemming from the microstructure; process uncertainty, stemming from the correct derivation of the constitutive behavior; and uncertainties in time evolution, stemming from the path dependence of the time integration of the irreversible entropy production. Although the approach is specifically formulated here for THMC coupling, we suggest that it has a much broader applicability. In a general sense it consists of finding the entropic bounds of the dissipation, defined as the product of thermodynamic force and thermodynamic flux, which in materials science correspond to generalized stress and generalized strain rate, respectively.
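
In symbols, the dissipation whose bounds are sought is the sum of conjugate force–flux products (generic non-equilibrium-thermodynamics notation of my choosing, not the authors'):

```latex
% Total dissipation as a sum of conjugate thermodynamic force-flux pairs
% (thermal, hydrological, mechanical, chemical); the second law requires
\Phi \;=\; \sum_{k \in \{T,H,M,C\}} J_k \, X_k \;\ge\; 0 ,
% and in the mechanical case the conjugate pair is generalized stress
% and generalized strain rate:
\Phi_M \;=\; \boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}} .
```

The entropic upper and lower bounds discussed in the abstract bracket this quantity once the separating time and length scales are fixed.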

Relevance: 10.00%

Abstract:

In this paper, we propose a steganalysis method that can identify the locations of stego-bearing pixels in a binary image. The method first calculates the residual between a given stego image and its estimated cover image, then computes the local entropy difference between these two versions of the image. Finally, the mean residual and the mean local entropy difference are computed across multiple stego images, and the locations of stego-bearing pixels are identified from these two means. The empirical results presented demonstrate that the proposed method can identify the stego-bearing locations with near-perfect accuracy when sufficient stego images are supplied. Hence, it can be used to reveal which pixels in a binary image have been used to carry the secret message.
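
An idealized simulation of the residual-averaging step can be sketched as follows. Everything here is an assumption for illustration: the cover estimator is taken as perfect, the payload locations and image sizes are invented, and detection thresholds are mine; only the mean residual is used for the final decision, with the local entropy difference computed alongside it:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, n_images = 16, 16, 200
payload = [(2, 3), (7, 7), (12, 5)]       # hypothetical stego-bearing pixels

def local_entropy(img, k=3):
    """Binary entropy of the pixel distribution in each k x k neighbourhood."""
    pad = np.pad(img.astype(float), k // 2, mode='edge')
    out = np.zeros(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            p = pad[i:i + k, j:j + k].mean()      # fraction of 1-pixels
            if 0.0 < p < 1.0:
                out[i, j] = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return out

res_sum = np.zeros((H, W))
ent_sum = np.zeros((H, W))
for _ in range(n_images):
    cover = rng.integers(0, 2, (H, W))
    stego = cover.copy()
    for (i, j) in payload:                # embed one random bit per location
        stego[i, j] = rng.integers(0, 2)
    cover_est = cover                     # idealized: perfect cover estimator
    res_sum += stego ^ cover_est          # residual between stego and estimate
    ent_sum += np.abs(local_entropy(stego) - local_entropy(cover_est))

mean_res = res_sum / n_images
found = sorted((int(i), int(j)) for i, j in np.argwhere(mean_res > 0.25))
print(found)
```

Averaging over many stego images makes the embedding locations stand out (mean residual near 0.5 there, near 0 elsewhere), which mirrors the "sufficient stego images" condition in the abstract.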