153 results for wavelet entropy


Relevance: 10.00%

Abstract:

Circuit-breakers (CBs) are subject to electrical stresses when restrikes occur during capacitor bank operation. These stresses are caused by the overvoltages across the CB, the interrupting currents and the rate of rise of recovery voltage (RRRV), and they also depend on the type of system grounding and on the dielectric strength curve. The aim of this study is to demonstrate a restrike waveform predictive model for an SF6 CB that considers both grounded and ungrounded systems, and to compare computation accuracy when applying the cold withstand and the hot recovery dielectric strength curves, including point-on-wave (POW) recommendations, in order to assess the potential for extending the CB's remaining life. Stresses on an SF6 CB in a typical 400 kV system were simulated and the results of the applications are presented. The simulated restrike waveforms, with features identified using the wavelet transform, can be used to develop a wavelet-based restrike diagnostic algorithm that locates a substation with breaker restrikes. The study found that the hot withstand dielectric strength curve yields lower restrike magnitudes than the cold withstand curve in the simulation results. Computation accuracy improved with the hot withstand dielectric strength curve, and POW-controlled switching can extend the life of an SF6 CB.
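The diagnostic idea above rests on wavelet-transform features of the simulated restrike waveforms. As a hedged illustration only (the paper does not publish its feature set), a one-level Haar DWT and a detail-energy feature vector can be sketched as follows; the function names and the choice of the Haar wavelet are assumptions of this sketch:

```python
import math

def haar_dwt(signal):
    """One-level Haar DWT: returns (approximation, detail) coefficients."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def detail_energy_features(signal, levels=3):
    """Energy of the detail coefficients at each level -- a simple feature
    vector that spikes when a waveform contains sharp transients such as
    restrikes, while a smooth sinusoid keeps its energy at coarse levels."""
    features = []
    current = list(signal)
    for _ in range(levels):
        current, detail = haar_dwt(current)
        features.append(sum(d * d for d in detail))
    return features
```

A smooth waveform produces near-zero detail energies; an impulse-like restrike concentrates energy in the finest detail level.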

Relevance: 10.00%

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
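The maximal discrepancy penalty can be made concrete with a toy hypothesis class. The sketch below, assuming 1-D threshold classifiers (a class chosen for illustration, not one discussed in the text), computes the maximal difference between the errors on the two halves of the data by direct enumeration; as the abstract notes, the same quantity can be obtained by empirical risk minimization after flipping the second half's labels.

```python
def threshold_classifiers(xs):
    """All 1-D threshold rules h(x) = 1[x >= t], plus the two constant rules."""
    cuts = sorted(set(xs))
    return [min(cuts) - 1.0] + cuts + [max(cuts) + 1.0]

def error(t, data):
    """Empirical 0-1 error of the rule h(x) = 1[x >= t] on labelled pairs."""
    return sum(1 for x, y in data if (1 if x >= t else 0) != y) / len(data)

def maximal_discrepancy(data):
    """max over h of (error on second half) - (error on first half).
    Minimizing total error on the data with second-half labels flipped
    yields the same maximizer, up to an affine transform of the objective."""
    half = len(data) // 2
    first, second = data[:half], data[half:]
    xs = [x for x, _ in data]
    return max(error(t, second) - error(t, first)
               for t in threshold_classifiers(xs))
```

When the two halves are identically labelled the discrepancy is zero; when they disagree maximally, no penalty-free fit exists and the discrepancy reaches one.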

Relevance: 10.00%

Abstract:

Epilepsy is characterized by the spontaneous and seemingly unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. An automatic system that detects seizure onsets would allow patients or the people near them to take appropriate precautions, and could provide more insight into this phenomenon. Various methods have been proposed to predict the onset of seizures based on EEG recordings. The use of nonlinear features motivated by the higher order spectra (HOS) has been reported to be a promising approach to differentiate between normal, background (pre-ictal) and epileptic EEG signals. In this work, we made a comparative study of the performance of Gaussian mixture model (GMM) and Support Vector Machine (SVM) classifiers using the features derived from HOS and from the power spectrum. Results show that the selected HOS based features achieve 93.11% classification accuracy compared to 88.78% with features derived from the power spectrum for a GMM classifier. The SVM classifier achieves an improvement from 86.89% with features based on the power spectrum to 92.56% with features based on the bispectrum.

Relevance: 10.00%

Abstract:

Texture analysis and textural cues have been applied to image classification, segmentation and pattern recognition. Dominant texture descriptors include directionality, coarseness and line-likeness. In this dissertation a class of textures known as particulate textures is defined, which are predominantly coarse or blob-like. The set of features that characterise particulate textures differs from that of classical textures; these features are micro-texture, macro-texture, size, shape and compaction. Classical texture analysis techniques do not adequately capture particulate texture features. This gap is identified and new methods for analysing particulate textures are proposed. The levels of complexity in particulate textures are also presented, ranging from the simplest images, where blob-like particles are easily isolated from their background, to more complex images, where the particles and the background are not easily separable or the particles are occluded. Simple particulate images can be analysed for particle shapes and sizes. Complex particulate texture images, on the other hand, often permit only the estimation of particle dimensions. Real-life applications of particulate textures are reviewed, including applications to sedimentology, granulometry and road surface texture analysis. A new framework for the computation of particulate shape is proposed. A granulometric approach for particle size estimation based on edge detection is developed, which can be adapted to the gray level of the images by varying its parameters. This study binds visual texture analysis and road surface macrotexture in a theoretical framework, thus making it possible to apply monocular imaging techniques to road surface texture analysis.
Results from the application of the developed algorithm to road surface macrotexture are compared with results based on Fourier spectra, the autocorrelation function and wavelet decomposition, indicating the superior performance of the proposed technique. The influence of image acquisition conditions such as illumination and camera angle on the results was systematically analysed. Experimental data were collected from over 5 km of road in Brisbane, and the estimated coarseness along the road was compared with laser profilometer measurements. A coefficient of determination (R2) exceeding 0.9 was obtained when correlating the proposed imaging technique with the state-of-the-art Sensor Measured Texture Depth (SMTD) obtained using laser profilometers.
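The edge-detection-based granulometric idea can be illustrated on a 1-D intensity profile. This is a loose sketch, not the dissertation's algorithm: the gradient threshold stands in for the adjustable gray-level parameter, and the mean spacing between detected edges stands in for particle size.

```python
def edge_positions(profile, threshold):
    """Indices where the absolute intensity gradient exceeds `threshold`.
    Raising the threshold ignores faint edges, mimicking adaptation to
    the gray level of the image."""
    return [i for i in range(1, len(profile))
            if abs(profile[i] - profile[i - 1]) > threshold]

def mean_particle_size(profile, threshold):
    """Average spacing between consecutive detected edges -- a crude
    granulometric size estimate for a 1-D slice through a particulate image."""
    edges = edge_positions(profile, threshold)
    if len(edges) < 2:
        return None
    gaps = [b - a for a, b in zip(edges, edges[1:])]
    return sum(gaps) / len(gaps)
```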

Relevance: 10.00%

Abstract:

Phospholipid (PL) molecules form the main structure of the membrane that prevents direct contact between opposing articular cartilage layers. In this paper we conceptualise articular cartilage as a giant reverse micelle (GRM) in which the highly hydrated three-dimensional network of phospholipids is electrically charged and able to resist compressive forces during joint movement, and hence loading. Using this hypothetical base, we describe a hydrophilic-hydrophilic (HL-HL) biopair model of joint lubrication by contacting cartilages, whose mechanism relies on lamellar cushioning. To demonstrate the viability of our concept, the electrokinetic properties of the membranous layer on the articular surface were determined by measuring, via microelectrophoresis, the adsorption of H+, OH-, Na+ and Cl- ions on the phospholipid membrane of liposomes, leading to the calculation of the effective surface charge density. The surface charge density was found to be -0.08 ± 0.002 cm-2 (mean ± S.D.) for phospholipid membranes in 0.155 M NaCl solution at physiological pH. This value was approximately five times less than that measured in 0.01 M NaCl. The addition of synovial fluid (SF) to the 0.155 M NaCl solution reduced the surface charge density by 30%, which was attributed to the binding of synovial fluid macromolecules to the phospholipid membrane. Our experiments show that particles become charged and interact strongly with the polar core of a reverse micelle (RM). We demonstrate that particles can have strong electrostatic interactions when ions and macromolecules are solubilized by the RM. Since ions are solubilized by the reverse micelle, the surface entropy influences the change in the charge density of the phospholipid membrane on cartilage surfaces. Reverse micelles stabilize ions, maintaining equilibrium; their surface charges contribute to the stability of particles while providing additional screening for electrostatic processes. © 2008 Elsevier Ireland Ltd. All rights reserved.

Relevance: 10.00%

Abstract:

Usability in HCI (Human-Computer Interaction) is normally understood as the simplicity and clarity with which interaction with a computer program or a web site is designed. Identity management systems need to provide adequate usability and should have a simple and intuitive interface. The system should be designed to satisfy not only service provider requirements but also user requirements; otherwise it will lead to inconvenience and poor usability for users when managing their identities. A system with poor usability and a poorly designed security interface is, in turn, likely to be insecure in practice. The rapid growth in the number of online services leads to an increasing number of different digital identities each user needs to manage. As a result, many people feel overloaded with credentials, which in turn negatively impacts their ability to manage them securely. Passwords are perhaps the most common type of credential used today. To avoid the tedious task of remembering difficult passwords, users often behave less securely by choosing weak, low-entropy passwords. Weak passwords and bad password habits represent security threats to online services. Some solutions have been developed to eliminate the need for users to create and manage passwords. A typical solution is based on generating one-time passwords (OTPs), i.e. passwords valid for a single session or transaction. Unfortunately, most of these solutions do not satisfy scalability and/or usability requirements, or they are simply insecure. In this thesis, the security and usability aspects of contemporary OTP-based authentication methods are examined and analyzed. In addition, more scalable solutions that provide a good user experience while preserving strong security are proposed.
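For context on how OTP schemes of this kind work, the standard HMAC-based one-time password algorithm (HOTP, RFC 4226) can be sketched in a few lines of standard-library code; this is the generic published algorithm, not a scheme proposed in the thesis:

```python
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226): HMAC-SHA1 over the
    big-endian 8-byte counter, dynamic truncation to 31 bits, then
    reduction modulo 10**digits."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                 # low nibble selects the window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 4226 test secret `12345678901234567890`, counters 0 and 1 yield 755224 and 287082, matching the specification's test vectors.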

Relevance: 10.00%

Abstract:

This paper proposes an innovative instance-similarity-based evaluation metric that reduces the search space over which clustering is performed. An aggregate global score is calculated for each instance using the novel idea of the Fibonacci series. The use of Fibonacci numbers separates the instances effectively; hence, intra-cluster similarity is increased and inter-cluster similarity is decreased during clustering. The proposed FIBCLUS algorithm is able to handle datasets with numerical attributes, categorical attributes, or a mix of both. Results obtained with FIBCLUS are compared with those of existing algorithms such as k-means, x-means, expectation maximization and hierarchical algorithms, which are widely used to cluster numerical, categorical and mixed data types. Empirical analysis shows that FIBCLUS produces better clustering solutions in terms of entropy, purity and F-score in comparison with these existing algorithms.
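The abstract does not spell out how the Fibonacci-based aggregate score is computed, so the following is a hypothetical reconstruction for illustration only: each distinct value of each attribute is mapped to a Fibonacci number, and an instance's score is the sum of its mapped values. Because Fibonacci numbers grow quickly, differing value combinations tend to receive well-separated scores.

```python
def fib_weights(n):
    """First n Fibonacci numbers (1, 2, 3, 5, ...) used as value weights."""
    weights, a, b = [], 1, 2
    for _ in range(n):
        weights.append(a)
        a, b = b, a + b
    return weights

def global_scores(instances):
    """Hypothetical aggregate global score per instance: map each distinct
    value of each attribute (numeric or categorical alike) to a Fibonacci
    number, then sum an instance's mapped values."""
    columns = list(zip(*instances))
    maps = []
    for col in columns:
        values = sorted(set(col), key=str)
        maps.append(dict(zip(values, fib_weights(len(values)))))
    return [sum(m[v] for m, v in zip(maps, row)) for row in instances]
```

Identical instances receive identical scores, so clustering can operate on the one-dimensional score instead of the full mixed-type attribute space.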

Relevance: 10.00%

Abstract:

This paper presents a general, global approach to the problem of robot exploration, utilizing a topological data structure to guide an underlying Simultaneous Localization and Mapping (SLAM) process. A Gap Navigation Tree (GNT) is used to motivate global target selection and occluded regions of the environment (called “gaps”) are tracked probabilistically. The process of map construction and the motion of the vehicle alters both the shape and location of these regions. The use of online mapping is shown to reduce the difficulties in implementing the GNT.

Relevance: 10.00%

Abstract:

Here we present a sequential Monte Carlo (SMC) algorithm that can be used for any one-at-a-time Bayesian sequential design problem in the presence of model uncertainty where discrete data are encountered. Our focus is on adaptive design for model discrimination but the methodology is applicable if one has a different design objective such as parameter estimation or prediction. An SMC algorithm is run in parallel for each model and the algorithm relies on a convenient estimator of the evidence of each model which is essentially a function of importance sampling weights. Other methods for this task such as quadrature, often used in design, suffer from the curse of dimensionality. Approximating posterior model probabilities in this way allows us to use model discrimination utility functions derived from information theory that were previously difficult to compute except for conjugate models. A major benefit of the algorithm is that it requires very little problem specific tuning. We demonstrate the methodology on three applications, including discriminating between models for decline in motor neuron numbers in patients suffering from neurological diseases such as Motor Neuron disease.
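The evidence estimator built from importance sampling weights can be sketched for a toy model with a discrete prior. Here the "particles" are an exhaustive grid carrying prior weights, so the incremental-evidence product is exact and deterministic; a real SMC run would use samples refreshed by resampling. Function names and the Bernoulli example are illustrative assumptions.

```python
import math

def smc_evidence(prior_particles, likelihood, data):
    """Evidence (marginal likelihood) estimate from importance weights:
    each step's incremental evidence is the weighted mean of the new
    datum's likelihood over the current particle set, and the total
    evidence is the running product of these increments."""
    thetas = [t for t, _ in prior_particles]
    weights = [w for _, w in prior_particles]
    log_evidence = 0.0
    for y in data:
        incr = [likelihood(y, t) for t in thetas]
        step = sum(w * l for w, l in zip(weights, incr))
        log_evidence += math.log(step)
        # Reweight the particles by the new datum's likelihood.
        weights = [w * l / step for w, l in zip(weights, incr)]
    return math.exp(log_evidence)
```

Running this in parallel for each candidate model gives the evidences needed for posterior model probabilities and hence for information-theoretic discrimination utilities. For a uniform prior over Bernoulli rates {0.2, 0.8} and data [1, 1], the analytic evidence is 0.5(0.04) + 0.5(0.64) = 0.34.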

Relevance: 10.00%

Abstract:

This work presents two UAS See and Avoid approaches using fuzzy control. We compare the performance of each controller when a Cross-Entropy method is applied to optimise the parameters of one of the controllers. Each controller receives information from an image-processing front-end that detects and tracks targets in the environment. Visual information is then used under a visual servoing approach to perform autonomous avoidance. Experimental flight trials using a small quadrotor were performed to validate and compare the behaviour of both controllers.
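The Cross-Entropy method used for controller tuning is a generic stochastic optimizer and can be sketched independently of the fuzzy controller it tunes. The population sizes, the Gaussian parameterization, and the function names below are assumptions of this sketch, not details from the paper:

```python
import random

def cross_entropy_minimize(cost, dim, iters=50, samples=60, elite=10, seed=0):
    """Cross-Entropy method: repeatedly sample candidate parameter vectors
    from a Gaussian, keep the lowest-cost elite fraction, and refit the
    Gaussian's mean and spread to the elites."""
    rng = random.Random(seed)
    mean = [0.0] * dim
    std = [2.0] * dim
    for _ in range(iters):
        pop = [[rng.gauss(m, s) for m, s in zip(mean, std)]
               for _ in range(samples)]
        pop.sort(key=cost)
        best = pop[:elite]
        mean = [sum(x[i] for x in best) / elite for i in range(dim)]
        std = [max(1e-6, (sum((x[i] - mean[i]) ** 2 for x in best)
                          / elite) ** 0.5) for i in range(dim)]
    return mean
```

In the paper's setting the cost function would score simulated or recorded avoidance runs of the fuzzy controller under a given gain vector.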

Relevance: 10.00%

Abstract:

This paper describes a novel method for determining the extrinsic calibration parameters between 2D and 3D LIDAR sensors with respect to a vehicle base frame. To recover the calibration parameters we attempt to optimize the quality of a 3D point cloud produced by the vehicle as it traverses an unknown, unmodified environment. The point cloud quality metric is derived from Rényi Quadratic Entropy and quantifies the compactness of the point distribution using only a single tuning parameter. We also present a fast approximate method to reduce the computational requirements of the entropy evaluation, allowing unsupervised calibration in vast environments with millions of points. The algorithm is analyzed using real world data gathered in many locations, showing robust calibration performance and substantial speed improvements from the approximations.
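The point cloud quality metric can be sketched directly from its definition. The following is a naive O(N²) evaluation of the Rényi quadratic entropy under a Gaussian kernel density estimate (the paper's fast approximation is not reproduced here); lower values indicate a more compact cloud, which is the quantity the calibration search drives down.

```python
import math

def renyi_quadratic_entropy(points, sigma):
    """Renyi quadratic entropy of a point cloud:
    H2 = -log( (1/N^2) * sum_ij G(x_i - x_j; sqrt(2)*sigma) ),
    where G is a Gaussian kernel; the sqrt(2) factor comes from the
    convolution of two sigma-width kernels."""
    n = len(points)
    d = len(points[0])
    norm = (2.0 * math.pi * (2.0 * sigma ** 2)) ** (d / 2.0)
    total = 0.0
    for p in points:                      # O(N^2) pairwise interaction sum
        for q in points:
            sq = sum((a - b) ** 2 for a, b in zip(p, q))
            total += math.exp(-sq / (4.0 * sigma ** 2)) / norm
    return -math.log(total / (n * n))
```

A well-calibrated sensor rig produces a crisper map, so its entropy is lower than that of a miscalibrated, smeared-out cloud; sigma is the metric's single tuning parameter.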

Relevance: 10.00%

Abstract:

A characteristic of Parkinson's disease (PD) is the development of tremor within the 4–6 Hz range. One method used to better understand pathological tremor is to compare it with tremor-type actions generated intentionally by healthy adults. This study was designed to investigate the similarities and differences between voluntarily generated 4–6 Hz tremor and PD tremor with regard to their amplitude, frequency and coupling characteristics. Tremor responses for 8 PD individuals (on- and off-medication) and 12 healthy adults were assessed under postural and resting conditions. Results showed that the voluntary and PD tremor were essentially identical with regard to amplitude and peak frequency. However, differences between the groups were found for the variability (SD of peak frequency, proportional power) and regularity (approximate entropy, ApEn) of the tremor signal. Additionally, coherence analysis revealed strong inter-limb coupling during voluntary conditions, while no bilateral coupling was seen for the PD patients. Overall, healthy participants were able to produce a 5 Hz tremulous motion indistinguishable from that of PD patients in terms of peak frequency and amplitude. However, differences in the structure of variability and the level of inter-limb coupling were found between the tremor responses of the PD and healthy adults. These differences were preserved irrespective of the medication state of the PD patients. The results illustrate the importance of assessing the pattern of signal structure/variability to discriminate between different tremor forms, especially where no differences emerge in standard measures of mean amplitude as traditionally defined.
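The regularity measure used above, approximate entropy, follows Pincus's standard formulation and can be sketched as follows (the tolerance r and embedding dimension m are the usual defaults, not values taken from this study):

```python
import math

def apen(series, m=2, r=0.2):
    """Approximate Entropy: lower values indicate a more regular signal.
    Compares how often length-m templates that match (within Chebyshev
    distance r) continue to match at length m+1."""
    n = len(series)

    def phi(mm):
        templates = [series[i:i + mm] for i in range(n - mm + 1)]
        fractions = []
        for a in templates:
            c = sum(1 for b in templates
                    if max(abs(x - y) for x, y in zip(a, b)) <= r)
            fractions.append(c / len(templates))
        return sum(math.log(f) for f in fractions) / len(fractions)

    return phi(m) - phi(m + 1)
```

A constant signal scores zero and a strictly periodic one scores near zero, while pathological tremor with more irregular structure scores higher.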

Relevance: 10.00%

Abstract:

Certain statistical and scientometric features of articles published in the journal “International Research in Geographical and Environmental Education” (IRGEE) are examined in this paper for the period 1992-2009, by applying nonparametric statistics and Shannon’s entropy (diversity) formula. The main findings of this analysis are: a) after 2004 the research priorities of researchers in geographical and environmental education seem to have changed; b) “teacher education” has been the most recurrent theme throughout these 18 years, followed by “values & attitudes” and “inquiry & problem solving”; c) the themes “GIS” and “sustainability” were the most “stable” throughout the 18 years, meaning that they maintained their ranks as publication priorities more than other themes; d) citations of IRGEE increase annually; e) the average thematic diversity of articles published during the period 1992-2009 is 82.7% of the maximum thematic diversity (very high), meaning that the journal has the capacity to attract a wide readership for the 10 themes it has successfully covered throughout the 18 years of its publication.
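The normalized Shannon diversity underlying the 82.7% figure can be sketched as follows; the assumption (not stated in the abstract) is that the normalization divides by the maximum entropy log k over the k themes:

```python
import math

def shannon_diversity(counts):
    """Shannon entropy of a theme-frequency distribution, normalized by
    the maximum log(k): 1.0 means themes are covered perfectly evenly,
    0.0 means a single theme dominates completely."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    if len(probs) <= 1:
        return 0.0
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(counts))
```

On this scale, an average article diversity of 0.827 corresponds to the paper's 82.7% of maximum thematic diversity.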

Relevance: 10.00%

Abstract:

Real-world AI systems have recently been deployed that can automatically analyze the plans and tactics of tennis players. As the game state is updated regularly at short intervals (i.e. point level), a library of a player’s successful and unsuccessful plans can be learnt over time. Given the relative strengths and weaknesses of a player’s plans, a set of proven plans or tactics that characterize the player can be identified. For low-scoring, continuous team sports like soccer, such analysis for multi-agent teams does not exist, as the game is not segmented into “discretized” plays (i.e. plans), making it difficult to obtain a library that characterizes a team’s behavior. Additionally, as player tracking data is costly and difficult to obtain, we only have partial team tracings in the form of ball actions, which makes this problem even more difficult. In this paper, we propose a method to overcome these issues by representing team behavior via play-segments, which are spatio-temporal descriptions of ball movement over fixed windows of time. Using these representations we can characterize team behavior from entropy maps, which give a measure of predictability of team behaviors across the field. We show the efficacy and applicability of our method on 2010-2011 English Premier League soccer data.
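An entropy map of the kind described can be sketched as per-zone Shannon entropy over a team's play-segments. The zone/segment encoding below is an illustrative stand-in for the paper's spatio-temporal descriptors, not its actual representation:

```python
import math
from collections import Counter, defaultdict

def entropy_map(events):
    """Per-zone Shannon entropy (in bits) of the play-segment distribution.
    `events` is a list of (zone, play_segment) pairs.  A zone where the
    team always does the same thing scores 0 (fully predictable), while
    varied behaviour in a zone scores high."""
    by_zone = defaultdict(list)
    for zone, segment in events:
        by_zone[zone].append(segment)
    result = {}
    for zone, segments in by_zone.items():
        counts = Counter(segments)
        total = len(segments)
        result[zone] = -sum((c / total) * math.log(c / total, 2)
                            for c in counts.values())
    return result
```

Two equally likely play-segments in a zone give exactly one bit of entropy; a zone with a single repeated segment gives zero.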