996 results for verification algorithm
Abstract:
This paper presents a study on wavelets and their characteristics for the specific purpose of serving as a feature extraction tool for speaker verification (SV), considering a Radial Basis Function (RBF) classifier, which is a particular type of Artificial Neural Network (ANN). Examining characteristics such as support-size, frequency and phase responses, amongst others, we show how Discrete Wavelet Transforms (DWTs), particularly the ones which derive from Finite Impulse Response (FIR) filters, can be used to extract important features from a speech signal which are useful for SV. Lastly, an SV algorithm based on the concepts presented is described.
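As a rough illustration of the kind of DWT-based feature extraction described above (not the paper's exact front end), a one-level Haar DWT, the simplest FIR-derived wavelet, can split a speech frame into approximation and detail bands whose energies serve as crude per-band features:

```python
# Illustrative sketch: Haar DWT band energies as simple speech features.
import math

def haar_dwt(frame):
    """One-level Haar DWT: returns (approximation, detail) coefficients."""
    s = 1 / math.sqrt(2)
    approx = [(frame[i] + frame[i + 1]) * s for i in range(0, len(frame) - 1, 2)]
    detail = [(frame[i] - frame[i + 1]) * s for i in range(0, len(frame) - 1, 2)]
    return approx, detail

def band_energies(frame, levels=3):
    """Recursively apply the DWT and collect per-band energy features."""
    feats = []
    for _ in range(levels):
        frame, detail = haar_dwt(frame)
        feats.append(sum(d * d for d in detail))  # detail-band energy
    feats.append(sum(a * a for a in frame))       # final approximation energy
    return feats
```

Because the Haar transform is orthonormal, the band energies sum to the energy of the input frame; a real SV front end would feed such features (per frame) into the RBF classifier.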
Abstract:
Hardware/software systems are becoming indispensable in every aspect of daily life. The growing presence of these systems in products and services creates a need for methods to develop them efficiently. However, efficient design of these systems is limited by several factors, among them: the growing complexity of applications, increasing integration density, the heterogeneous nature of products and services, and shrinking time-to-market. Transaction-level modeling (TLM) is regarded as a promising paradigm for managing design complexity and for exploring and validating design alternatives at high levels of abstraction. This research proposes a methodology for expressing time in TLM based on an analysis of timing constraints. We propose combining two development paradigms to accelerate design: TLM on the one hand, and a methodology for expressing time between different transactions on the other. This synergy lets us combine, in a single environment, high-performance simulation methods and formal analytical methods. We propose a new timing verification algorithm based on a linearization procedure for min/max constraints, together with an optimization technique that improves the algorithm's efficiency. We complete the mathematical description of all the constraint types presented in the literature. We also develop methods for exploring and refining the communication system, which allow the timing verification algorithms to be used at different TLM levels.
Since several definitions of TLM exist, within the scope of this research we define a specification and simulation methodology for hardware/software systems based on the TLM paradigm, in which several modeling concepts can be considered separately. Built on modern software engineering technologies such as XML, XSLT, XSD, object-oriented programming, and several others provided by the .NET environment, the proposed methodology makes it possible to reuse intermediate models in order to cope with time-to-market constraints. It provides a general approach to system modeling that separates design aspects such as the models of computation used to describe the system at multiple levels of abstraction. As a result, the system's functionality can be clearly identified in the model without details tied to particular development platforms, which improves the "portability" of the application model.
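The flavor of timing verification over max-type constraints can be loosely illustrated (this is a generic fixed-point relaxation, not the thesis's linearization algorithm): given constraints t[j] >= t[i] + delay between transaction events, earliest occurrence times follow from repeated relaxation, and an inconsistent (cyclic) system is detected when no fixed point is reached:

```python
# Illustrative sketch: earliest event times under max-type timing constraints.
def earliest_times(n_events, constraints, source=0):
    """constraints: list of (i, j, delay) meaning t[j] >= t[i] + delay."""
    t = [float("-inf")] * n_events
    t[source] = 0.0
    for _ in range(n_events):  # Bellman-Ford-style relaxation passes
        changed = False
        for i, j, d in constraints:
            if t[i] + d > t[j]:
                t[j] = t[i] + d
                changed = True
        if not changed:
            return t  # fixed point reached: consistent system
    raise ValueError("inconsistent (cyclic) constraint system")
```

Min-type constraints, interval constraints, and the optimization discussed in the thesis would layer on top of this basic relaxation picture.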
Abstract:
This paper presents a new face verification algorithm based on Gabor wavelets and AdaBoost. In the algorithm, faces are represented by Gabor wavelet features generated by Gabor wavelet transform. Gabor wavelets with 5 scales and 8 orientations are chosen to form a family of Gabor wavelets. By convolving face images with these 40 Gabor wavelets, the original images are transformed into magnitude response images of Gabor wavelet features. The AdaBoost algorithm selects a small set of significant features from the pool of the Gabor wavelet features. Each feature is the basis for a weak classifier which is trained with face images taken from the XM2VTS database. The feature with the lowest classification error is selected in each iteration of the AdaBoost operation. We also address issues regarding computational costs in feature selection with AdaBoost. A support vector machine (SVM) is trained with examples of 20 features, and the results have shown a low false positive rate and a low classification error rate in face verification.
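A single real-valued Gabor wavelet of the family described above can be sampled on a small grid as follows; the parameterization (sigma, theta, lambd) is conventional rather than taken from the paper:

```python
# Sketch: real part of a Gabor wavelet sampled on a size x size grid.
import math

def gabor_kernel(size, sigma, theta, lambd):
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # rotate coordinates by the orientation theta
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            carrier = math.cos(2 * math.pi * xr / lambd)
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel
```

A 5-scale by 8-orientation family as in the paper would vary sigma/lambd over five values and theta over k*pi/8 for k = 0..7, convolve each kernel with the face image, and keep the magnitude responses as the feature pool for AdaBoost.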
Abstract:
To simplify model construction in bounded model checking, a modeling method based on a first-order transition-system language is presented, and channel functionality is added to the language to strengthen its descriptive power. On this basis, a model-checking tool (BMCF) is implemented, centered on a bounded verification algorithm based on interpolation and k-step induction. Finally, the tool is used to analyze and verify properties of common mutual-exclusion protocols and a simple data-transmission protocol. The results show that modeling a system with this tool is convenient and intuitive, and that the implemented verification algorithm checks the correctness of properties efficiently; if a property does not hold, the tool also produces a counterexample.
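For intuition, a toy bounded model checker can be sketched in a few lines. Unlike BMCF it is explicit-state rather than SAT-based with interpolation and k-induction, but it shows the bounded unrolling and the counterexample trace returned when a property fails:

```python
# Toy explicit-state bounded model checker (illustrative only): unrolls the
# transition relation up to `bound` steps, returning a counterexample trace
# if the invariant can be violated, or None if it holds within the bound.
def bounded_check(initial_states, successors, invariant, bound):
    frontier = [(s,) for s in initial_states]
    for _ in range(bound + 1):
        next_frontier = []
        for trace in frontier:
            if not invariant(trace[-1]):
                return trace  # counterexample: path from an initial state
            next_frontier.extend(trace + (s,) for s in successors(trace[-1]))
        frontier = next_frontier
    return None  # invariant holds up to the bound
```

For example, a counter that increments from 0 violates the invariant "value < 3" with the counterexample trace (0, 1, 2, 3).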
Abstract:
A computational algorithm (based on Smullyan's analytic tableau method) that verifies whether a given well-formed formula in propositional calculus is a tautology or not has been implemented on a DEC System 10. The stepwise refinement approach of program development used for this implementation forms the subject matter of this paper. The top-down design has resulted in a modular and reliable program package. This computational algorithm compares favourably with the algorithm based on the well-known resolution principle used in theorem provers.
Abstract:
This thesis evaluates and verifies the Call Sequence Analysing Algorithm (CSA algorithm), designed for classifying calls. The goal of the algorithm is to group sufficiently similar calls together for more detailed fault analysis. The thesis introduces the main classes of machine-learning algorithms and their typical differences, the data types characteristic of different classification processes, and the operating environments in which this implementation is designed to work. The CSA algorithm is fed message sequences composed of network maintenance messages, and sequences with similar content are grouped into clusters. The algorithm's performance is evaluated against 94 hand-classified reference sequences collected from a live 3G network controller. Comparing two sequences yields a pairwise measure: a distance describing how similar the sequences are. This thesis focuses in particular on the Hamming distance. Based on this distance, the sequences are assembled into groups. By varying the maximum acceptable distance at which two sequences are counted as belonging to the same group, subgroups containing only similar sequences are obtained. As the acceptable distance grows, the number of misclassifications also grows. The hand-classified grouping serves as the reference for correct sorting results. The accuracy of the CSA algorithm's classification is expressed as a percentage of the target grouping, as a function of the maximum distance. The thesis shows why the Hamming distance chosen as the distance attribute is unsuitable for classifying this data. Finally, a method and a tool are proposed with which several different classifier algorithms can be tested in a rapid development cycle.
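The distance-threshold grouping step described above can be sketched as follows (a minimal illustration, not the CSA implementation): Hamming distance between equal-length sequences, plus greedy grouping under a maximum acceptable distance:

```python
# Sketch of Hamming-distance grouping under a maximum acceptable distance.
def hamming(a, b):
    if len(a) != len(b):
        raise ValueError("Hamming distance requires equal-length sequences")
    return sum(x != y for x, y in zip(a, b))

def group_sequences(sequences, max_distance):
    groups = []
    for seq in sequences:
        for group in groups:
            # compare against the group's first member as its representative
            if hamming(seq, group[0]) <= max_distance:
                group.append(seq)
                break
        else:
            groups.append([seq])
    return groups
```

The equal-length requirement already hints at one of the limitations discussed in the thesis: real maintenance-message sequences vary in length, which an edit-distance-style measure would handle more gracefully.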
Abstract:
A considerable amount of research has proposed optimization-based approaches that employ various vibration parameters for structural damage diagnosis. Damage detection by these methods is in effect the result of updating the analytical structural model to match the current physical model. The feasibility of these approaches has been proven, but most verification has been done on simple structures such as beams or plates. Applied to a complex structure, such as a steel truss bridge, a traditional optimization process costs massive computational resources and converges slowly. This study presents a multi-layer genetic algorithm (ML-GA) to overcome the problem. Unlike the tedious convergence of a conventional damage optimization process, in each layer the proposed algorithm divides the GA's population into groups with a smaller number of damage candidates; the converged population of each group then evolves as the initial population of the next layer, where the groups merge into larger groups. Because parallel computation can be employed in a damage detection process featuring ML-GA, optimization performance and computational efficiency are enhanced. To assess the proposed algorithm, the modal strain energy correlation (MSEC) is used as the objective function. Several damage scenarios of a complex steel truss bridge's finite element model are employed to evaluate the effectiveness and performance of ML-GA against a conventional GA. In both single- and multiple-damage scenarios, the analytical and experimental study shows that the MSEC index achieves excellent damage indication and efficiency with the proposed ML-GA, whereas the conventional GA converges only to a local solution.
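The layering idea can be caricatured in a few lines. This is a deliberately simplified stand-in, not the paper's ML-GA: the toy objective replaces the MSEC index, and the per-group GA is elementary:

```python
# Deliberately simplified two-layer GA: small groups converge independently,
# then their populations merge to seed a larger second-layer search.
import random

def fitness(candidate, true_damage):
    # toy objective standing in for the MSEC index: closer is better
    return -sum((c - t) ** 2 for c, t in zip(candidate, true_damage))

def evolve(population, true_damage, generations=30):
    for _ in range(generations):
        population.sort(key=lambda c: fitness(c, true_damage), reverse=True)
        parents = population[: len(population) // 2]   # keep the fitter half
        children = [
            [g + random.gauss(0, 0.05) for g in random.choice(parents)]
            for _ in range(len(population) - len(parents))
        ]
        population = parents + children
    return population

def multi_layer_ga(true_damage, n_groups=4, group_size=10):
    # layer 1: independent small groups of damage candidates
    groups = [
        [[random.random() for _ in true_damage] for _ in range(group_size)]
        for _ in range(n_groups)
    ]
    converged = [evolve(g, true_damage) for g in groups]
    # layer 2: merge the converged populations and evolve the larger pool
    merged = [c for g in converged for c in g]
    return max(evolve(merged, true_damage), key=lambda c: fitness(c, true_damage))
```

Because the layer-1 groups are independent, they are the natural unit for the parallel computation the paper exploits.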
Abstract:
Two dimensional Optical Orthogonal Codes (OOCs) named Wavelength/Time Multiple-Pulses-per-Row (W/T MPR) codes suitable for use in incoherent fiber-optic code division multiple access (FO-CDMA) networks are reported in [6]. In this paper, we report the construction of W/T MPR codes, using Greedy Algorithm (GA), with distinct 1-D OOCs [1] as the row vectors. We present the W/T MPR codes obtained using the GA. Further, we verify the correlation properties of the generated W/T MPR codes using Matlab.
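The paper verifies the correlation properties in Matlab; an equivalent check for 1-D binary codewords can be sketched in Python (cyclic correlation against autocorrelation and cross-correlation bounds, here called lambda_a and lambda_c, names ours):

```python
# Sketch: verify OOC correlation bounds for lists of 0/1 codewords.
def cyclic_correlation(a, b, shift):
    n = len(a)
    return sum(a[i] & b[(i + shift) % n] for i in range(n))

def check_ooc(codewords, lambda_a, lambda_c):
    n = len(codewords[0])
    # autocorrelation bound: every nonzero cyclic shift of each codeword
    for x in codewords:
        if any(cyclic_correlation(x, x, s) > lambda_a for s in range(1, n)):
            return False
    # cross-correlation bound: every cyclic shift of every distinct pair
    for i, x in enumerate(codewords):
        for y in codewords[i + 1:]:
            if any(cyclic_correlation(x, y, s) > lambda_c for s in range(n)):
                return False
    return True
```

For example, the classic (7, 3, 1) codeword with pulses at positions {0, 1, 3} passes with lambda_a = 1. A 2-D W/T MPR check would apply the same test row-wise plus the 2-D correlation over wavelength/time shifts.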
Abstract:
Formal specification is vital to the development of distributed real-time systems, as these systems are inherently complex and safety-critical. It is widely acknowledged that formal specification and automatic analysis of specifications can significantly increase system reliability. Although a number of specification techniques for real-time systems have been reported in the literature, most of these formalisms do not adequately address the constraints that the aspects of 'distribution' and 'real-time' impose on specifications. Furthermore, an automatic verification tool is necessary to reduce human error in the reasoning process. This paper is therefore an attempt towards the development of a novel executable specification language, DL, for distributed real-time systems. First, we give a precise characterization of the syntax and semantics of DL. Subsequently, we discuss the problems of model checking, automatically verifying the satisfiability of DL specifications, and testing the conformance of event traces with DL specifications. Effective solutions to these problems are presented as extensions of the classical first-order tableau algorithm. The use of the proposed framework is illustrated by specifying a sample problem.
Abstract:
A polynomial time algorithm (pruned correspondence search, PCS) with good average case performance for solving a wide class of geometric maximal matching problems, including the problem of recognizing 3D objects from a single 2D image, is presented. Efficient verification algorithms, based on a linear representation of location constraints, are given for the case of affine transformations among vector spaces and for the case of rigid 2D and 3D transformations with scale. Some preliminary experiments suggest that PCS is a practical algorithm. Its similarity to existing correspondence based algorithms means that a number of existing techniques for speedup can be incorporated into PCS to improve its performance.
Abstract:
Coccolithophores are the largest source of calcium carbonate in the oceans and are considered to play an important role in oceanic carbon cycles. Current methods to detect the presence of coccolithophore blooms from Earth observation data often produce high numbers of false positives in shelf seas and coastal zones due to the spectral similarity between coccolithophores and other suspended particulates. Current methods are therefore unable to characterise the bloom events in shelf seas and coastal zones, despite the importance of these phytoplankton in the global carbon cycle. A novel approach to detect the presence of coccolithophore blooms from Earth observation data is presented. The method builds upon previous optical work and uses a statistical framework to combine spectral, spatial and temporal information to produce maps of coccolithophore bloom extent. Validation and verification results for an area of the north east Atlantic are presented using an in situ database (N = 432) and all available SeaWiFS data for 2003 and 2004. Verification results show that the approach produces a temporal seasonal signal consistent with biological studies of these phytoplankton. Validation using the in situ coccolithophore cell count database shows a high correct recognition rate of 80% and a low false-positive rate of 0.14 (in comparison to 63% and 0.34 respectively for the established, purely spectral approach). To guide its broader use, a full sensitivity analysis for the algorithm parameters is presented.
Abstract:
PURPOSE: MicroRNAs (miRNAs) play a global role in regulating gene expression and have important tissue-specific functions. Little is known about their role in the retina. The purpose of this study was to establish the retinal expression of those miRNAs predicted to target genes involved in vision. METHODS: miRNAs potentially targeting important "retinal" genes, as defined by expression pattern and implication in disease, were predicted using a published algorithm (TargetScan; Envisioneering Medical Technologies, St. Louis, MO). The presence of candidate miRNAs in human and rat retinal RNA was assessed by RT-PCR. cDNA levels for each miRNA were determined by quantitative PCR. The ability to discriminate between miRNAs varying by a single nucleotide was assessed. The activity of miR-124 and miR-29 against predicted target sites in Rdh10 and Impdh1 was tested by cotransfection of miRNA mimics and luciferase reporter plasmids. RESULTS: Sixty-seven miRNAs were predicted to target one or more of the 320 retinal genes listed herein. All 11 candidate miRNAs tested were expressed in the retina, including miR-7, miR-124, miR-135a, and miR-135b. Relative levels of individual miRNAs were similar between rats and humans. The Rdh10 3'UTR, which contains a predicted miR-124 target site, mediated the inhibition of luciferase activity by miR-124 mimics in cell culture. CONCLUSIONS: Many miRNAs likely to regulate genes important for retinal function are present in the retina. Conservation of miRNA retinal expression patterns from rats to humans supports evidence from other tissues that disruption of miRNAs is a likely cause of a range of visual abnormalities.
Abstract:
We propose a dynamic verification approach for large-scale message passing programs to locate correctness bugs caused by unforeseen nondeterministic interactions. This approach hinges on an efficient protocol for tracking the causality between nondeterministic message receive operations and potentially matching send operations. We show that causality tracking protocols relying solely on logical clocks fail to capture all nuances of MPI program behavior, including the variety of ways in which nonblocking calls can complete. Our approach formally defines the matches-before relation underlying the MPI standard and devises lazy update logical clock based algorithms that can correctly discover all potential outcomes of nondeterministic receives in practice. This lazy update protocol, LLCP, achieves the same coverage as a vector clock based algorithm while maintaining good scalability. LLCP allows us to analyze realistic MPI programs involving a thousand MPI processes, incurring only modest overheads in communication bandwidth, latency, and memory consumption.
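As background for the causality-tracking discussion (this sketch shows plain vector clocks, not the paper's LLCP protocol), the happens-before bookkeeping that a lazy update protocol approximates at lower cost looks like:

```python
# Classic vector clocks: each of n processes keeps a vector; messages carry
# the sender's clock, and the receiver takes the componentwise maximum.
def new_clock(n):
    return [0] * n

def local_event(clock, pid):
    clock[pid] += 1

def send_event(clock, pid):
    local_event(clock, pid)
    return list(clock)  # timestamp piggybacked on the message

def recv_event(clock, pid, msg_clock):
    clock[:] = [max(a, b) for a, b in zip(clock, msg_clock)]
    local_event(clock, pid)

def happens_before(c1, c2):
    return all(a <= b for a, b in zip(c1, c2)) and c1 != c2
```

Vector clocks characterize happens-before exactly but cost O(n) per message; the paper's contribution is achieving the same coverage for nondeterministic receive matching without paying that full price.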