908 results for Dynamic Threshold Algorithm
Abstract:
A simple but efficient voice activity detector based on the Hilbert transform and a dynamic threshold is presented for use in the pre-processing of audio signals. The algorithm that defines the dynamic threshold is a modification of a convex combination found in the literature. This scheme allows the detection of prosodic and silence segments in speech under non-ideal conditions such as spectrally overlapped noise. The present work shows preliminary results on a database built from political speeches. The tests were performed by adding artificial noise, on top of the natural noise, to the audio signals, and several algorithms are compared. The results will be extrapolated to adaptive filtering of monophonic signals and the analysis of speech pathologies in future work.
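As a concrete illustration of this kind of detector, the sketch below computes the Hilbert envelope of the signal and compares per-frame energy against a threshold updated as a convex combination with a running noise estimate. It is a minimal sketch in Python/NumPy/SciPy; the update rule, frame length and smoothing factor `alpha` are illustrative assumptions, not the authors' exact formulation.

```python
# Hedged sketch of a Hilbert-envelope VAD with a dynamic threshold.
# The convex-combination update below is an illustrative guess at such a
# scheme, not the paper's exact formulation.
import numpy as np
from scipy.signal import hilbert

def vad_hilbert(x, fs, frame_ms=20, alpha=0.95):
    """Return a boolean voice-activity flag per frame."""
    frame = int(fs * frame_ms / 1000)
    envelope = np.abs(hilbert(x))                 # analytic-signal magnitude
    n_frames = len(x) // frame
    energies = envelope[:n_frames * frame].reshape(n_frames, frame).mean(axis=1)

    theta = energies[:5].mean()                   # seed threshold from the first frames
    active = np.zeros(n_frames, dtype=bool)
    for i, e in enumerate(energies):
        active[i] = e > theta
        # convex combination: track the noise floor during silence frames
        if not active[i]:
            theta = alpha * theta + (1 - alpha) * e
    return active
```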
Abstract:
Division of labor is a widely studied aspect of colony behavior in social insects. Division of labor models describe how individuals distribute themselves in order to perform different tasks simultaneously. However, models that study division of labor from a dynamical-systems point of view cannot be found in the literature. In this paper, we define a division of labor model as a discrete-time dynamical system, in order to study its equilibrium points and their convergence and stability properties. Using this analytical model, an adaptive algorithm based on division of labor can be designed to satisfy dynamic criteria. In this way, we have designed and tested an algorithm that varies the response thresholds in order to modify the dynamic behavior of the system. This behavior modification allows the system to adapt to specific environmental and collective situations, making the algorithm a good candidate for distributed control applications. The variable threshold algorithm is based on specialization mechanisms and achieves asymptotically stable behavior of the system in different environments, independently of the number of individuals. The algorithm has been successfully tested under a range of initial conditions and numbers of individuals.
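A minimal sketch of the kind of model described: a discrete-time response-threshold simulation in which thresholds are lowered for tasks an individual performs and raised otherwise (a simple specialization rule). All parameter values and the stimulus dynamics below are illustrative assumptions, not the paper's actual system.

```python
# Illustrative discrete-time response-threshold model with adaptive
# (variable) thresholds; parameters and the specialization rule are
# assumptions in the spirit of standard division-of-labor models.
import numpy as np

def simulate(n_agents=50, n_tasks=2, steps=200, xi=0.1, phi=0.05, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.1, 0.9, size=(n_agents, n_tasks))  # response thresholds
    stim = np.full(n_tasks, 0.5)                              # task stimuli
    for _ in range(steps):
        # probability of engaging in a task rises with stimulus vs. threshold
        prob = stim**2 / (stim**2 + theta**2)
        engaged = rng.random((n_agents, n_tasks)) < prob
        # specialization: lower thresholds of tasks performed, raise the rest
        theta = np.clip(theta - xi * engaged + phi * (~engaged), 0.01, 1.0)
        # stimuli grow at a fixed rate and are reduced by active workers
        stim = np.clip(stim + 0.1 - engaged.sum(axis=0) / n_agents, 0.0, 1.0)
    return theta, stim
```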
Abstract:
A dynamic programming algorithm for joint data detection and carrier phase estimation of continuous-phase-modulated signals is presented. The intent is to combine the robustness of noncoherent detectors with the superior performance of coherent ones. The algorithm differs from the Viterbi algorithm only in the metric that it maximizes over the possible transmitted data sequences. This metric is influenced both by the correlation with the received signal and by the current estimate of the carrier phase. Carrier-phase estimation is based on decision guiding, but there is no external phase-locked loop. Instead, the phase of the best complex correlation with the received signal over the last few signaling intervals is used. The algorithm is slightly more complex than the coherent Viterbi algorithm but does not require narrowband filtering of the recovered carrier, as earlier approaches did, to achieve the same level of performance.
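The detector shares the dynamic-programming structure of the Viterbi algorithm, differing only in the branch metric. The sketch below shows that generic structure with a pluggable `branch_metric` callback; the paper's phase-aware metric (correlation with the received signal combined with the decision-directed phase estimate over the last few intervals) is only indicated in a comment, not implemented.

```python
# Generic Viterbi-style dynamic-programming search with a pluggable branch
# metric, as a sketch of the detector structure; the phase-aware metric of
# the paper is only indicated by the placeholder comment below.
import numpy as np

def viterbi(n_states, n_steps, branch_metric, transitions):
    """transitions[s] -> iterable of predecessor states of state s."""
    score = np.full((n_steps + 1, n_states), -np.inf)
    back = np.zeros((n_steps + 1, n_states), dtype=int)
    score[0, 0] = 0.0                            # assume a known start state
    for k in range(1, n_steps + 1):
        for s in range(n_states):
            for p in transitions[s]:
                # the paper's metric would combine correlation with the
                # received signal and a decision-directed phase estimate
                cand = score[k - 1, p] + branch_metric(k, p, s)
                if cand > score[k, s]:
                    score[k, s], back[k, s] = cand, p
    # trace back the best path of states
    path = [int(np.argmax(score[n_steps]))]
    for k in range(n_steps, 0, -1):
        path.append(back[k, path[-1]])
    return path[::-1]
```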
Abstract:
PURPOSE: To evaluate the sensitivity and specificity of the screening mode of the Humphrey-Welch Allyn frequency-doubling technology (FDT), Octopus tendency-oriented perimetry (TOP), and the Humphrey Swedish Interactive Threshold Algorithm (SITA)-fast (HSF) in patients with glaucoma. DESIGN: A comparative consecutive case series. METHODS: This was a prospective study which took place in the glaucoma unit of an academic department of ophthalmology. One eye of 70 consecutive glaucoma patients and 28 age-matched normal subjects was studied. Eyes were examined with the program C-20 of FDT, G1-TOP, and 24-2 HSF in one visit and in random order. The gold standard for glaucoma was the presence of a typical glaucomatous optic disk appearance on stereoscopic examination, as judged by a glaucoma expert. The sensitivity and specificity, positive and negative predictive values, and receiver operating characteristic (ROC) curves of two algorithms for the FDT screening test, two algorithms for TOP, and three algorithms for HSF, as defined before the start of this study, were evaluated. The time required for each test was also analyzed. RESULTS: Values for the area under the ROC curve ranged from 82.5%-93.9%. The largest area (93.9%) under the ROC curve was obtained with the FDT criterion defining abnormality as the presence of at least one abnormal location. Mean test time was 1.08 ± 0.28 minutes, 2.31 ± 0.28 minutes, and 4.14 ± 0.57 minutes for the FDT, TOP, and HSF, respectively. The difference in testing time was statistically significant (P <.0001). CONCLUSIONS: The C-20 FDT, G1-TOP, and 24-2 HSF appear to be useful tools to diagnose glaucoma. The C-20 FDT and G1-TOP tests take approximately one quarter and one half, respectively, of the time taken by 24-2 HSF. © 2002 by Elsevier Science Inc. All rights reserved.
Abstract:
Background: Vigabatrin (VGB) is an anti-epileptic medication which has been linked to peripheral constriction of the visual field. Documenting the natural history associated with continued VGB exposure is important when making decisions about the risks and benefits associated with the treatment. Due to its speed, the Swedish Interactive Threshold Algorithm (SITA) has become the algorithm of choice when carrying out Full Threshold automated static perimetry. SITA uses prior distributions of normal and glaucomatous visual field behaviour to estimate threshold sensitivity. As the abnormal model is based on glaucomatous behaviour, this algorithm has not been validated for VGB recipients. We aim to assess the clinical utility of the SITA algorithm for accurately mapping VGB-attributed field loss. Methods: The sample comprised one randomly selected eye of 16 patients diagnosed with epilepsy and exposed to VGB therapy. A clinical diagnosis of VGB-attributed visual field loss was documented in 44% of the group. The mean age was 39.3 ± 14.5 years and the mean deviation was -4.76 ± 4.34 dB. Each patient was examined with the Full Threshold, SITA Standard and SITA Fast algorithms. Results: SITA Standard was on average approximately twice as fast (7.6 minutes) and SITA Fast approximately three times as fast (4.7 minutes) as examinations completed using the Full Threshold algorithm (15.8 minutes). In the clinical environment, the visual field outcome with both SITA algorithms was equivalent to the visual field examination using the Full Threshold algorithm in terms of visual inspection of the grey scale plots, defect area and defect severity. Conclusions: Our research shows that both SITA algorithms are able to accurately map visual field loss attributed to VGB. As patients diagnosed with epilepsy are often vulnerable to fatigue, the time saving offered by SITA Fast means that this algorithm has a significant advantage for use with VGB recipients.
Abstract:
Colour segmentation is the most commonly used method in road sign detection. Road signs contain several basic colours, such as red, yellow, blue and white, which depend on the country. The objective of this thesis is to evaluate four colour segmentation algorithms: the Dynamic Threshold Algorithm, a modification of de la Escalera's algorithm, the Fuzzy Colour Segmentation Algorithm and the Shadow and Highlight Invariant Algorithm. Processing time and segmentation success rate are used as the criteria to compare the performance of the four algorithms, and red is selected as the target colour for the comparison. All the test images are selected randomly, according to category, from the Traffic Signs Database of Dalarna University [1]. These road sign images were taken with a digital camera mounted in a moving car in Sweden. Experiments show that the Fuzzy Colour Segmentation Algorithm and the Shadow and Highlight Invariant Algorithm are more accurate and stable in detecting the red colour of road signs, and the method could also be used in research on other colours. For an evaluation of the four algorithms on the yellow colour, see the Master's thesis of Yumei Liu.
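For orientation, the sketch below shows one plausible form of a dynamic-threshold red segmentation in HSV space using OpenCV, where the saturation threshold adapts to overall image brightness. The constants and the adaptation rule are illustrative assumptions and are not taken from the thesis.

```python
# One plausible "dynamic threshold" red-colour segmentation in HSV,
# sketched with OpenCV; the threshold rule and constants are illustrative,
# not the values evaluated in the thesis.
import cv2
import numpy as np

def segment_red(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    # adapt the saturation threshold to overall image brightness so that
    # dark or over-exposed scenes still yield a usable mask
    s_thresh = max(40, int(0.25 * v.mean()))
    red_hue = (h < 10) | (h > 170)          # red wraps around the hue circle
    mask = red_hue & (s > s_thresh) & (v > 30)
    return mask.astype(np.uint8) * 255
```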
Abstract:
This paper presents a new method to determine a feeder reconfiguration scheme considering a variable load profile. The objective function consists of system losses, reliability costs and switching costs. In order to achieve an optimal solution, the proposed method compares these costs dynamically and determines when and how it is reasonable to perform a switching operation. The proposed method divides a year into several equal time periods; then, using particle swarm optimization (PSO), optimal candidate configurations for each period are obtained. The system losses and customer interruption cost of each configuration during each period are also calculated. Then, considering the cost of switching from one configuration to another, a dynamic programming algorithm (DPA) is used to determine the annual reconfiguration scheme. Several test systems were used to validate the proposed method. The results show that, to obtain an optimal solution, it is necessary to compare operating costs dynamically.
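The dynamic-programming stage can be sketched as follows: given, for each period, the operating cost (losses plus interruption cost) of each candidate configuration produced by PSO, plus a configuration-to-configuration switching cost, choose one configuration per period so that the total annual cost is minimized. This is a minimal sketch with placeholder inputs; the PSO stage and the actual cost models are not reproduced.

```python
# Sketch of the annual-schedule stage: pick one configuration per period by
# dynamic programming, trading operating costs against switching costs.
# Inputs are illustrative placeholders (NumPy arrays).
import numpy as np

def annual_schedule(op_cost, switch_cost):
    """op_cost[t, c]: operating cost of configuration c in period t;
    switch_cost[a, b]: cost of switching from configuration a to b."""
    T, C = op_cost.shape
    best = op_cost[0].copy()                 # cost of ending period 0 in each config
    choice = np.zeros((T, C), dtype=int)
    for t in range(1, T):
        trans = best[:, None] + switch_cost  # trans[a, b]: best cost reaching b via a
        choice[t] = trans.argmin(axis=0)
        best = trans.min(axis=0) + op_cost[t]
    # recover the minimum-cost sequence of configurations
    seq = [int(best.argmin())]
    for t in range(T - 1, 0, -1):
        seq.append(int(choice[t, seq[-1]]))
    return seq[::-1], float(best.min())
```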
Abstract:
Reduction of the execution time of a job through equitable distribution of the workload among the processors in a distributed system is the goal of load balancing. The performance of static and dynamic load balancing algorithms for the extended hypercube is discussed. Threshold algorithms are well-known algorithms for dynamic load balancing in distributed systems. An extension of the threshold algorithm, called the multilevel threshold algorithm, is proposed. The hierarchical interconnection network of the extended hypercube is suitable for implementing the proposed algorithm. The new algorithm has been implemented on a transputer-based system, and its performance on an extended hypercube is compared with that on mesh and binary hypercube networks.
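A minimal sketch of the underlying threshold policy: a node that exceeds a high-load threshold tries to transfer a job to a neighbour whose load is below a low threshold. The multilevel extension would repeat the same test across the levels of the extended hypercube's hierarchy; that mapping, and the threshold values themselves, are illustrative assumptions here.

```python
# Minimal sketch of a threshold policy for dynamic load balancing; the
# thresholds and the single-level neighbour set are illustrative, and the
# extended-hypercube hierarchy is only hinted at in the comment.
def place_job(local_queue, neighbour_queues, t_high=6, t_low=2):
    """Return the index of the neighbour to receive a new job, or None
    to keep it locally."""
    if local_queue < t_high:
        return None                                  # below threshold: keep the job
    # first look within the local cluster (one level); a multilevel scheme
    # would repeat the same test at the next level of the hierarchy
    candidates = [i for i, q in enumerate(neighbour_queues) if q < t_low]
    if candidates:
        return min(candidates, key=lambda i: neighbour_queues[i])
    return None
```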
Abstract:
This paper considers a group of agents that aim to reach agreement on individually received time-varying signals by local communication. In contrast to the static network averaging problem, the consensus considered in this paper is reached in a dynamic sense. A discrete-time dynamic average consensus protocol can be designed to allow all the agents to track the average of their reference inputs asymptotically. We propose a minimal-time dynamic consensus algorithm, which utilises only a minimal number of local observations of a randomly picked node in a network to compute the final consensus signal. Our results illustrate that, with memory and computational ability, the running time of distributed averaging algorithms can indeed be improved dramatically, as suggested by Olshevsky and Tsitsiklis. © 2012 AACC (American Automatic Control Council).
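For context, a standard discrete-time dynamic average consensus update that such work builds on is sketched below: each agent mixes its neighbours' states through the graph Laplacian and adds the increment of its own reference signal, so the states track the average of the time-varying inputs. The minimal-time, observation-based algorithm proposed in the paper is not reproduced, and the step size `eps` is an illustrative choice that must be small enough for the given graph.

```python
# Sketch of a standard discrete-time dynamic average consensus protocol;
# this is the baseline update, not the paper's minimal-time algorithm.
import numpy as np

def dynamic_average_consensus(A, refs, eps=0.2):
    """A: (n, n) symmetric adjacency matrix; refs: (T, n) reference signals."""
    T, n = refs.shape
    L = np.diag(A.sum(axis=1)) - A            # graph Laplacian
    x = refs[0].copy()                         # initialise at the first inputs
    history = [x.copy()]
    for k in range(1, T):
        # mix neighbours' states and add the increment of the local reference
        x = x - eps * (L @ x) + (refs[k] - refs[k - 1])
        history.append(x.copy())
    return np.array(history)                   # rows should approach refs[k].mean()
```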
Abstract:
We present five new cloud detection algorithms over land based on dynamic-threshold or Bayesian techniques, applicable to the Advanced Along Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8-9% over SADIST (maximum score of 77.93% compared to 69.27%). We present an assessment of the impact of imperfect cloud masking, relative to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3x3 pixel domain. We find an increase of 5-7% in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02% compared to 85.69%). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, and is applicable to the cloud-test data to be provided by the Sea and Land Surface Temperature Radiometer (SLSTR) due to be launched on the Sentinel-3 mission (estimated 2014).
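The skill measure referred to above, the true skill score, can be computed from a candidate cloud mask and the manually classified reference mask as the hit rate minus the false-alarm rate. The sketch below assumes boolean masks; array names are illustrative.

```python
# Sketch of the true skill score (hit rate minus false-alarm rate) of a
# candidate cloud mask against a reference mask; names are illustrative.
import numpy as np

def true_skill_score(predicted_cloud, reference_cloud):
    """Both inputs are boolean arrays of the same shape (True = cloudy)."""
    p = predicted_cloud.ravel()
    r = reference_cloud.ravel()
    hits = np.sum(p & r)
    misses = np.sum(~p & r)
    false_alarms = np.sum(p & ~r)
    correct_negatives = np.sum(~p & ~r)
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate
```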
Abstract:
This thesis presents a system to recognise and classify road and traffic signs for the purpose of developing an inventory of them which could assist the highway engineers' tasks of updating and maintaining them. It uses images taken by a camera from a moving vehicle. The system is based on three major stages: colour segmentation, recognition, and classification. Four colour segmentation algorithms are developed and tested: a shadow and highlight invariant algorithm, a dynamic threshold algorithm, a modification of de la Escalera's algorithm and a fuzzy colour segmentation algorithm. All algorithms are tested using hundreds of images, and the shadow-highlight invariant algorithm is eventually chosen as the best performer because it is immune to shadows and highlights. It is also robust, as it was tested in different lighting conditions, weather conditions, and times of the day. A segmentation success rate of approximately 97% was achieved using this algorithm.

Recognition of traffic signs is carried out using a fuzzy shape recogniser. Based on four shape measures - rectangularity, triangularity, ellipticity, and octagonality - fuzzy rules were developed to determine the shape of the sign. Among these shape measures, octagonality has been introduced in this research. The final decision of the recogniser is based on the combination of both the colour and the shape of the sign. The recogniser was tested in a variety of conditions, giving an overall performance of approximately 88%.

Classification is undertaken using a Support Vector Machine (SVM) classifier. The classification is carried out in two stages: classification of the rim's shape followed by classification of the interior of the sign. The classifier was trained and tested using binary images in addition to five different types of moments: Geometric moments, Zernike moments, Legendre moments, Orthogonal Fourier-Mellin moments, and Binary Haar features. The performance of the SVM was tested using different features, kernels, SVM types, SVM parameters, and moment orders. The average classification rate achieved is about 97%. Binary images show the best testing results, followed by Legendre moments. The linear kernel gives the best testing results, followed by RBF. C-SVM shows very good performance, but ν-SVM gives better results in some cases.
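Two of the named shape measures can be sketched for a binary sign mask as follows: rectangularity as object area over bounding-box area, and ellipticity via the first affine moment invariant (Rosin's measure). These follow common definitions and may differ in detail from the measures used in the thesis; the octagonality measure introduced there is not reproduced.

```python
# Sketch of two common shape measures for a binary mask; definitions follow
# the usual formulations and are not taken from the thesis.
import numpy as np

def rectangularity(mask):
    """Object area divided by the area of its axis-aligned bounding box."""
    ys, xs = np.nonzero(mask)
    box_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    return float(mask.sum()) / box_area

def ellipticity(mask):
    """Rosin's ellipticity: 1.0 for a perfect ellipse, smaller otherwise."""
    ys, xs = np.nonzero(mask)
    m00 = float(len(xs))                       # object area in pixels
    xc, yc = xs.mean(), ys.mean()
    mu20 = np.sum((xs - xc) ** 2)
    mu02 = np.sum((ys - yc) ** 2)
    mu11 = np.sum((xs - xc) * (ys - yc))
    i1 = (mu20 * mu02 - mu11 ** 2) / m00 ** 4  # first affine moment invariant
    target = 1.0 / (16.0 * np.pi ** 2)         # value of i1 for an ideal ellipse
    return i1 / target if i1 <= target else target / i1
```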
Abstract:
Neurons generate spikes reliably with millisecond precision if driven by a fluctuating current--is it then possible to predict the spike timing knowing the input? We determined parameters of an adapting threshold model using data recorded in vitro from 24 layer 5 pyramidal neurons from rat somatosensory cortex, stimulated intracellularly by a fluctuating current simulating synaptic bombardment in vivo. The model generates output spikes whenever the membrane voltage (a filtered version of the input current) reaches a dynamic threshold. We find that for input currents with large fluctuation amplitude, up to 75% of the spike times can be predicted with a precision of +/-2 ms. Some of the intrinsic neuronal unreliability can be accounted for by a noisy threshold mechanism. Our results suggest that, under random current injection into the soma, (i) neuronal behavior in the subthreshold regime can be well approximated by a simple linear filter; and (ii) most of the nonlinearities are captured by a simple threshold process.
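A minimal sketch of such a model: the membrane voltage is a leaky (linear) filter of the injected current, and a spike is emitted whenever the voltage crosses a threshold that jumps after each spike and relaxes back to baseline. The time constants, reset and jump size below are illustrative, not the parameters fitted to the recordings.

```python
# Minimal sketch of a linear-filter + dynamic-threshold spiking model in the
# spirit of the abstract; all constants are illustrative assumptions.
import numpy as np

def adaptive_threshold_neuron(current, dt=0.1, tau_m=10.0, tau_th=30.0,
                              theta0=1.0, theta_jump=0.5, r_m=1.0):
    """current: injected current samples; returns spike times (same units as dt)."""
    v, theta = 0.0, theta0
    spikes = []
    for k, i_in in enumerate(current):
        v += dt / tau_m * (-v + r_m * i_in)          # linear subthreshold filter
        theta += dt / tau_th * (theta0 - theta)       # threshold relaxes to baseline
        if v >= theta:
            spikes.append(k * dt)                     # record the spike time
            v = 0.0                                   # reset the membrane
            theta += theta_jump                       # adapt the threshold upward
    return np.array(spikes)
```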
Abstract:
The application of energy minimisation methods to stereo matching has been demonstrated to produce high-quality disparity maps. However, the majority of these methods are computationally expensive, requiring minutes or even hours of computation. We propose a fast minimisation scheme that produces strongly competitive results at significantly reduced cost, requiring only a few seconds of computation. In this paper, we present our iterated dynamic programming algorithm along with a quadtree subregioning process for fast stereo matching.
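The core building block, dynamic-programming matching of a single scanline pair with an occlusion penalty, can be sketched as below; the iteration scheme and the quadtree subregioning described in the paper are not reproduced, and the occlusion cost is an illustrative constant.

```python
# Sketch of single-scanline dynamic-programming stereo matching with an
# occlusion penalty; the paper's iterated/quadtree scheme is not shown.
import numpy as np

def scanline_dp(left_row, right_row, occlusion=10.0):
    """Return a disparity estimate per left pixel (-1 where occluded)."""
    n, m = len(left_row), len(right_row)
    cost = np.full((n + 1, m + 1), np.inf)
    move = np.zeros((n + 1, m + 1), dtype=int)   # 0 = match, 1 = skip left, 2 = skip right
    cost[0, :] = occlusion * np.arange(m + 1)
    cost[:, 0] = occlusion * np.arange(n + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = cost[i - 1, j - 1] + abs(float(left_row[i - 1]) - float(right_row[j - 1]))
            skip_l = cost[i - 1, j] + occlusion
            skip_r = cost[i, j - 1] + occlusion
            options = [match, skip_l, skip_r]
            move[i, j] = int(np.argmin(options))
            cost[i, j] = options[move[i, j]]
    disparity = np.full(n, -1)
    i, j = n, m
    while i > 0 and j > 0:                        # trace back the optimal alignment
        if move[i, j] == 0:
            disparity[i - 1] = (i - 1) - (j - 1)
            i, j = i - 1, j - 1
        elif move[i, j] == 1:
            i -= 1
        else:
            j -= 1
    return disparity
```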
Abstract:
A novel approach to normal ECG recognition based on scale-space signal representation is proposed. The approach utilizes the curvature scale-space (CSS) representation, previously used to match the shapes of visual objects, together with a dynamic programming algorithm for matching the CSS representations of ECG signals. The extraction and matching processes are fast, and experimental results show that the approach is quite robust for preliminary normal ECG recognition.
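As a stand-in for the matching step, the sketch below aligns two one-dimensional feature sequences with a plain dynamic-programming (DTW-style) recursion; the curvature scale-space feature extraction itself, and the paper's specific matching cost, are not reproduced.

```python
# Generic dynamic-programming (DTW-style) alignment of two feature
# sequences, as a sketch of the matching stage only.
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Return the cumulative alignment cost between two 1-D sequences."""
    n, m = len(seq_a), len(seq_b)
    d = np.full((n + 1, m + 1), np.inf)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(float(seq_a[i - 1]) - float(seq_b[j - 1]))
            # extend the cheapest of the three admissible predecessor cells
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return d[n, m]
```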