970 results for QUT Speaker Identity Verification System
Abstract:
Report on a review of selected general and application controls over the Iowa Department of Human Services’ Issuance Verification System for the period March 19, 2009 through April 17, 2009
Abstract:
The usage of intensity modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research has investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verifications, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, length of time between exposure and processing, and phantom material. Precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2mm accuracy for single-beam fluence map verifications and to 5%/2mm for transverse plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and percent of pixels failing the gamma index were exponentially distributed and dependent upon the measurement phantom but not the treatment site. Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
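For context, the tolerances quoted above (e.g., 2%/2mm) combine a dose-difference criterion with a distance-to-agreement criterion, as in the gamma index the abstract mentions alongside the NAT index. The following minimal 1D gamma computation is for illustration only; it is not the authors' software, and the function name and test values are invented:

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, spacing, dose_tol=0.05, dta_tol=2.0):
    """Gamma index along a 1D dose profile.

    dose_ref, dose_eval : reference (measured) and evaluated (calculated)
                          dose arrays on the same grid
    spacing  : grid spacing in mm
    dose_tol : dose-difference criterion as a fraction of the maximum dose
    dta_tol  : distance-to-agreement criterion in mm
    """
    x = np.arange(len(dose_ref)) * spacing
    dd = dose_tol * dose_ref.max()
    gamma = np.empty(len(dose_ref))
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        # Generalized distance from this reference point to every
        # evaluated point; gamma is the minimum over the profile.
        dist2 = ((x - xi) / dta_tol) ** 2 + ((dose_eval - di) / dd) ** 2
        gamma[i] = np.sqrt(dist2.min())
    return gamma

ref = np.array([0.1, 0.5, 1.0, 0.9, 0.3])
calc = np.array([0.12, 0.48, 0.97, 0.93, 0.28])
g = gamma_1d(ref, calc, spacing=1.0)
print(f"{(g <= 1).mean():.0%} of points pass")  # fraction of pixels passing
```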
Abstract:
This article evaluates a gesture-based authentication technique for mobile devices. Users create a memorable identifying gesture to serve as their in-air signature. The work analyzes a database of 120 gestures of differing vulnerability, obtaining an Equal Error Rate (EER) of 9.19% when the robustness of gestures is not verified. Most of the errors behind this EER come from very simple, easily forgeable gestures that should be discarded at the enrollment phase. An in-air signature robustness verification system using Linear Discriminant Analysis is therefore proposed to infer automatically whether a gesture is secure or not. Different configurations were tested, obtaining a lowest EER of 4.01% when 45.02% of gestures were discarded, and an optimal compromise of an EER of 4.82% when 19.19% of gestures were automatically rejected.
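For context, the Equal Error Rate reported above is the operating point where the false-acceptance and false-rejection rates coincide. A small sketch of estimating an EER from genuine and impostor score distributions follows; it is illustrative, not the paper's evaluation code, and all names and values are invented:

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Sweep a decision threshold over similarity scores and return the
    rate where false rejection of genuine gestures equals false
    acceptance of forgeries."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best = (2.0, 0.0)                  # (|FAR - FRR|, EER estimate)
    for t in thresholds:
        frr = np.mean(genuine < t)     # genuine attempts scored below t
        far = np.mean(impostor >= t)   # forgeries scored at or above t
        if abs(far - frr) < best[0]:
            best = (abs(far - frr), (far + frr) / 2)
    return best[1]

rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.10, 500)   # toy scores for true users
impostor = rng.normal(0.5, 0.15, 500)  # toy scores for forgery attempts
print(f"EER = {equal_error_rate(genuine, impostor):.2%}")
```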
Abstract:
European public administrations must manage citizens' digital identities, particularly considering interoperability among different countries. Owing to the diversity of electronic identity management (eIDM) systems, when users of one such system seek to communicate with governments using a different system, both systems must be linked and understand each other. To achieve this, the European Union is working on an interoperability framework. This article provides an overview of eIDM systems' current state at a pan-European level. It identifies and analyzes issues on which agreement exists, as well as those that aren't yet resolved and are preventing the adoption of a large-scale model.
Abstract:
This paper presents an innovative approach to signature verification and forgery detection based on fuzzy modeling. The signature image is binarized, resized to a fixed-size window, and then thinned. The thinned image is partitioned into eight sub-images, called boxes, using the horizontal-density approximation approach. Each sub-image is then further resized and partitioned into twelve smaller sub-images using uniform partitioning. The feature considered is the normalized vector angle (α) from each box. Each feature extracted from the sample signatures gives rise to a fuzzy set. Since the choice of a proper fuzzification function is crucial for verification, we have devised a new fuzzification function with structural parameters, which is able to adapt to variations in the fuzzy sets. This function is employed to develop a complete forgery detection and verification system.
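As an illustration of the box-partition feature described above, the sketch below splits a thinned binary signature into a grid of boxes and computes a normalized angle per box. It is a simplification: the paper's horizontal-density partitioning is approximated by an even grid, and the exact definition of α may differ from this one.

```python
import numpy as np

def box_angle_features(thinned, rows=2, cols=4):
    """Split a thinned binary signature image into rows*cols boxes and
    return the normalized angle of the mean foreground-pixel vector in
    each box, measured from the box's bottom-left corner."""
    h, w = thinned.shape
    feats = []
    for r in range(rows):
        for c in range(cols):
            box = thinned[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            ys, xs = np.nonzero(box)
            if len(xs) == 0:
                feats.append(0.0)      # empty box contributes no angle
                continue
            # Angle of the mean pixel vector, normalized to [0, 1].
            angle = np.arctan2(box.shape[0] - ys.mean(), xs.mean() + 1e-9)
            feats.append(angle / (np.pi / 2))
    return np.array(feats)

sig = np.zeros((40, 120), dtype=np.uint8)
sig[20, 10:110] = 1                    # toy "signature" stroke
print(box_angle_features(sig))
```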
Abstract:
Goan identity: one, several, or none? The mother tongue and the historical experiences of the ancestral community are essential components of this legacy. For Goans who live in Goa or who had Goan ancestors, the Indian cultural matrix is always present, even if in a more or less diluted form. The Portuguese colonial presence left its mark on Catholic Goans as well as on Hindu Goans. Without this component, Goa could hardly justify its claim to the status of an autonomous state within the Indian union. And what of the Goans in Portugal? The imported consciousness of caste continues to play its role in the various associations in which one caste or another predominates. But none of this prevents Goans in Portugal from appreciating the carilada or the traditional songs of Goa. Nor do they miss any opportunity to visit their homeland.
Abstract:
Material tracking is becoming ever more important in the metal industry: a metal strip must pass through a defined program during the production process; only then is the quality of the end product guaranteed. Current practice is to assign each metal strip a number with which the strip is labeled. When strips are stored for days between two production steps, this method proves error-prone: labels can be lost, mixed up, misread, or become illegible. In 2007, iba AG filed a patent for identifying metal strips by their thickness profile (Anhaus [3]); with it, the identity of a metal strip can be established beyond doubt, making reliable material tracking possible. It turned out, however, that the measurement-noise-afflicted thickness profiles, which can be viewed as long time series, cannot be compared successfully with existing methods (e.g., L2 distance minimization or Dynamic Time Warping). This thesis presents an efficient feature-based algorithm for comparing two time series. It is robust against noise and measurement dropouts, and invariant to coordinate transformations of the time series such as scaling and translation. Comparisons against partial time series are also possible. Our framework is distinguished by both high accuracy and high speed: more than 99.5% of queries against our test database of real profiles are answered correctly. At several hundred time-series comparisons per second, it is roughly a factor of 10 faster than the established methods of time-series analysis, which, moreover, cannot process more than 90% of the queries correctly. The algorithm has proven fit for industrial use: iba AG employs it in a globally unique thickness-profile-based monitoring system for material tracking, already operating successfully in the first steel and aluminum rolling mills.
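The abstract does not disclose the algorithm's features, but the core requirement, matching a noisy query profile against stored profiles invariantly to scaling and translation while allowing subsequence matches, can be sketched with z-normalized correlation. Function names and the threshold below are illustrative, not from the thesis:

```python
import numpy as np

def znorm(x):
    """Remove offset and scale so matching is invariant to amplitude
    translation and scaling of the thickness profile."""
    s = x.std()
    return (x - x.mean()) / s if s > 0 else x - x.mean()

def best_match(query, profile):
    """Slide the z-normalized query over the stored profile and return
    the offset with the highest correlation (subsequence matching)."""
    q = znorm(query)
    n, m = len(profile), len(query)
    best_off, best_corr = -1, -np.inf
    for off in range(n - m + 1):
        c = np.dot(q, znorm(profile[off:off + m])) / m
        if c > best_corr:
            best_off, best_corr = off, c
    return best_off, best_corr

def identify(query, database, threshold=0.9):
    """Return the id of the stored profile that matches the query best,
    or None if no correlation exceeds the acceptance threshold."""
    scores = {pid: best_match(query, p)[1] for pid, p in database.items()}
    pid = max(scores, key=scores.get)
    return pid if scores[pid] >= threshold else None
```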
Abstract:
The Accessible Digital Home (HDA) of the ETSIST was created with the aim of bringing the latest information and communication technologies closer to people with specific accessibility and usability needs, providing them with tools that increase their quality of life, comfort, security, and autonomy. The HDA environment comprises control elements for doors, blinds, lighting, water, and gas; temperature, fire, and gas sensors; air-conditioning systems; entertainment systems; and security systems such as presence detectors and alarms. All of this is supported by a network architecture that provides a residential gateway and broadband access. The main goal of this final-year project (PFG) was the development of a low-cost authentication system for the Accessible Digital Home. The idea of integrating an authentication system into the HDA stems from the need to protect certain services available within a private environment from unauthorized access. Such services include reading the messages stored on the answering machine, using multimedia equipment, deactivating security alarms, or simply configuring the environment according to the authenticated user (light intensity, room temperature, etc.). The development prioritized the principles of accessibility, usability, and security needed to create a non-invasive environment in which users can prove their identity to the HDA system. The proposed solution is a system based on recognizing a stroke drawn by the user. This stroke is used as a key to validate users: the user must repeat the stroke registered in the system in order to authenticate. The project justifies the choice of this authentication mechanism over the other alternatives available on the market. Two peripherals of different ranges were available for testing the application: the uDraw, created for the PS3, which consists of a digitizing tablet and a pen that wirelessly captures the strokes drawn by the user, and the Wacom Bamboo digitizing tablet, which supports the same functionality with better accuracy. The developed tool can also be used with other kinds of devices, such as the Texas Instruments Chronos eZ430 watch with its 3-axis accelerometer, capable of translating the user's movements into mouse-pointer motion. The PFG is divided into three major blocks of work. The first focuses on the analysis of the system and its constituent technologies, including the image-based pattern-recognition algorithms for authentication that best fit the user's needs. The second block covers a trial version built on the preceding analysis and UML design, on which proof-of-concept tests were run and the feasibility of the project was verified. The last block comprises the verification and validation of the system through tests certifying that the quality levels needed to meet the stated objectives were reached, followed by production of the final documentation. As a result of this work, a system with an easily extensible architecture was obtained, achieved through techniques such as introspection, which separates the business-logic layer from the code that implements it, so that code can be replaced simply and intuitively via configuration files; this makes the system flexible and scalable. In conclusion, the final product responded satisfactorily and reached the required quality levels, providing an alternative to conventional authentication systems that maintains high security while making accessibility and low cost its most notable features.
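The abstract does not specify the stroke-matching algorithm, so the sketch below shows one plausible template-matching approach: resample the drawn stroke, normalize it for position and size, and compare it to the enrolled template by mean point distance. All names and the threshold are illustrative, not from the project:

```python
import numpy as np

def resample(points, n=64):
    """Resample a drawn stroke to n equally spaced points so strokes
    drawn at different speeds become comparable."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    d = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0, d[-1], n)
    return np.column_stack([np.interp(t, d, pts[:, 0]),
                            np.interp(t, d, pts[:, 1])])

def normalize(pts):
    """Translate to the centroid and scale to unit size so the match is
    position- and size-invariant."""
    pts = pts - pts.mean(axis=0)
    return pts / np.abs(pts).max()

def stroke_distance(a, b):
    """Mean point-to-point distance between two normalized strokes."""
    return np.linalg.norm(a - b, axis=1).mean()

def authenticate(candidate, template, threshold=0.15):
    """Accept if the candidate stroke is close enough to the enrolled one."""
    c = normalize(resample(candidate))
    t = normalize(resample(template))
    return stroke_distance(c, t) <= threshold
```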
Abstract:
This paper proposes an emotion transplantation method capable of modifying a synthetic speech model through the use of CSMAPLR adaptation in order to incorporate emotional information learned from a different speaker model while maintaining the identity of the original speaker as much as possible. The proposed method relies on learning both emotional and speaker identity information by means of their adaptation functions from an average voice model, and combining them into a single cascade transform capable of imbuing the desired emotion into the target speaker. The method is applied to the task of transplanting four emotions (anger, happiness, sadness, and surprise) into three male and three female speakers, and it is evaluated in a number of perceptual tests. The evaluations show that perceived naturalness for emotional text significantly favors the proposed transplanted emotional speech synthesis over traditional neutral speech synthesis, evidenced by a large increase in the perceived emotional strength of the synthesized utterances at a slight cost in speech quality. A final evaluation with a robotic laboratory assistant application shows that using emotional speech significantly increases students' satisfaction with the dialog system, demonstrating that the proposed emotion transplantation system provides benefits in real applications.
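CSMAPLR adaptation estimates affine transforms of the model's mean parameters, and the "single cascade transform" above corresponds to composing the speaker and emotion transforms into one. The sketch below shows that composition in the abstract; the matrices are toy values, not learned transforms, and the function name is invented:

```python
import numpy as np

def compose(A_outer, b_outer, A_inner, b_inner):
    """Compose two affine transforms mu -> A @ mu + b into one.
    Applying the inner (speaker) transform first and the outer
    (emotion) transform second gives:
        A_outer @ (A_inner @ mu + b_inner) + b_outer
      = (A_outer @ A_inner) @ mu + (A_outer @ b_inner + b_outer)
    """
    return A_outer @ A_inner, A_outer @ b_inner + b_outer

# Toy example: a 3-dimensional mean vector from an average voice model.
mu = np.array([0.5, -1.0, 2.0])
A_spk, b_spk = np.eye(3) * 1.1, np.array([0.1, 0.0, -0.2])  # speaker transform
A_emo, b_emo = np.eye(3) * 0.9, np.array([0.0, 0.3, 0.1])   # emotion transform

A_cas, b_cas = compose(A_emo, b_emo, A_spk, b_spk)
assert np.allclose(A_cas @ mu + b_cas, A_emo @ (A_spk @ mu + b_spk) + b_emo)
```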
Abstract:
The goal of this work was to implement a simulation model for studying the effects of the torque ripple caused by a permanent magnet synchronous machine on the mechanics connected to the electric motor. A further aim was to determine how such a simulation model can be implemented using modern simulation software. The correctness of the simulation results was confirmed with a verification rig built for this work. The structure under study consisted of a shaft to which an eccentric rod was attached. A mass whose position could be varied was attached to the eccentric rod; by changing the position of the mass, different natural frequencies were obtained for the structure. The eccentric rod was modeled as flexible using the finite element method. The mechanics were modeled in the ADAMS dynamics simulation software, into which the flexible eccentric rod was imported from the ANSYS finite element program. The mechanics model was then transferred to SIMULINK, where the electric drive was also modeled. The electric drive model in SIMULINK represents a permanent magnet synchronous machine; its equations are based on linear differential equations to which the effect of the cogging torque has been added as a disturbance signal. The electric drive model produces a torque that is fed into the mechanics modeled in ADAMS, and the rotor's angular acceleration is taken from the mechanics model as feedback to the electric motor model. The result is a combined simulation consisting of the electric drive and the mechanics. Based on the results, it can be concluded that combined simulation of electric drives and mechanics is feasible with the chosen methods, and the simulated results correspond well to the measured ones.
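The abstract describes a closed loop between an electric-drive model and a mechanics model. A heavily simplified, self-contained sketch of such a loop follows: the flexible ADAMS/ANSYS mechanics is replaced by rigid rotor dynamics, the drive by a constant-current PMSM torque with an angle-dependent cogging term, and all parameter values are illustrative, not from the thesis:

```python
import numpy as np

# Illustrative parameters (not from the thesis).
J, b = 0.01, 0.002          # rotor+load inertia [kg m^2], viscous damping
K_t, N_cog = 1.2, 12        # torque constant [Nm/A], cogging harmonic order
T_cog, T_load = 0.05, 0.5   # cogging amplitude [Nm], load torque [Nm]
dt, steps = 1e-4, 50000

theta, omega = 0.0, 0.0
i_q = 0.6                   # constant q-axis current command [A]
log = []
for _ in range(steps):
    # Electric drive model: smooth torque plus a cogging ripple that
    # depends on the rotor angle (the disturbance signal).
    T_e = K_t * i_q + T_cog * np.sin(N_cog * theta)
    # Mechanics model: rigid rotor dynamics, integrated explicitly.
    alpha = (T_e - T_load - b * omega) / J   # angular acceleration
    omega += alpha * dt                      # fed back to the drive side
    theta += omega * dt
    log.append(omega)

print(f"mean speed {np.mean(log[-1000:]):.2f} rad/s, "
      f"speed ripple {np.ptp(log[-1000:]):.4f} rad/s")
```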
Abstract:
The EU is launching an environmental technology verification scheme that can provide users and investors with independent information on the functioning and performance of innovative technologies and improve their market position. Verification means a process or mechanism carried out by a third party through which the operation and performance of a product can be substantiated. National environmental technology verification schemes are in use in the United States and Canada, among others; in Europe, the scheme is expected to be introduced in 2011–2012. In Finland, roughly 300 permit and notification decisions concerning contaminated-soil remediation projects are currently issued each year. In about 85 percent of the sites, mass exchange (excavation and replacement) is used as the remediation method. Mass exchange will remain the most common remediation method at least for the time being, but the use of in situ methods, among others, is expected to grow. The goal of this thesis was to assess whether verification can be used to promote high-quality treatment of contaminated soils and whether verification can speed up the market entry of innovative remediation methods. The topic was examined through, among other things, two different types of remediation methods for contaminated soil and groundwater: permeable reactive barriers (in situ) and bitumen stabilization (ex situ). The performance of contaminated-land remediation methods depends on many factors, some of which cannot be controlled or modeled reliably. Verification is therefore best suited to substantiating the performance of equipment, or of treatment methods simpler than contaminated-land remediation methods. Verification may nevertheless work well in contaminated-land remediation as an informational steering instrument, for example. The verification steps for reactive barriers and bitumen stabilization are very similar; the main difference is that for barriers the site where the barrier is installed must also be described. The operation of reactive barriers depends on many environmental factors, unlike bitumen stabilization, which is performed with dedicated equipment. Based on the results, it can be generalized that verification is better suited to ex situ than to in situ remediation methods.
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not been evaluated in large-scale studies yet. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula.
Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses. Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first and second year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
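Socos itself works on invariant diagrams, but the invariant-first workflow can be mimicked in plain code. The sketch below develops an insertion sort in that style, stating the loop invariant before the code and checking it after every extension step; this is an illustrative analogue in Python, not the tool's notation:

```python
def sorted_insert_all(xs):
    """Insertion sort developed invariant-first: the invariant that the
    built prefix is sorted and is a permutation of the consumed input
    is stated before the code and checked at every extension step."""
    out = []
    for i, x in enumerate(xs):
        # Invariant (before consuming x): out is sorted, len(out) == i,
        # and out is a permutation of xs[:i].
        assert all(out[j] <= out[j + 1] for j in range(len(out) - 1))
        assert sorted(out) == sorted(xs[:i])
        # Extension step: insert x at its position, preserving the invariant.
        k = 0
        while k < len(out) and out[k] <= x:
            k += 1
        out.insert(k, x)
    # Postcondition follows from the invariant at loop exit.
    assert sorted(out) == sorted(xs) and all(
        out[j] <= out[j + 1] for j in range(len(out) - 1))
    return out

print(sorted_insert_all([3, 1, 4, 1, 5, 9, 2, 6]))
```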
Abstract:
In safety-critical software, failure can have a high price; such software should be free of errors before it is put into operation. Applying formal methods in the software development life cycle helps ensure that software for safety-critical missions is ultra-reliable. The PVS theorem prover, a formal-methods tool, can be used for the formal verification of software written in the ADA Language for Flight Software Application (ALFA). This paper describes the modeling of ALFA programs for the PVS theorem prover. An ALFA2PVS translator was developed that automatically converts ALFA software into a PVS specification. With this approach the software can be verified formally with respect to underflow/overflow errors and divide-by-zero conditions without actually executing the code.
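The translator itself targets PVS, but the kind of proof obligation involved can be illustrated with a toy verification-condition generator: walk an expression tree and emit one condition per division requiring a nonzero divisor. The types and the output format below are invented for illustration and do not reflect the actual ALFA2PVS output:

```python
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class Const:
    value: int

@dataclass
class BinOp:
    op: str            # one of '+', '-', '*', '/'
    left: object
    right: object

def show(e):
    """Pretty-print an expression tree."""
    if isinstance(e, Var):
        return e.name
    if isinstance(e, Const):
        return str(e.value)
    return f"({show(e.left)} {e.op} {show(e.right)})"

def divide_by_zero_vcs(e):
    """Collect one verification condition per division: the divisor
    must be provably nonzero."""
    if isinstance(e, (Var, Const)):
        return []
    vcs = divide_by_zero_vcs(e.left) + divide_by_zero_vcs(e.right)
    if e.op == '/':
        vcs.append(f"{show(e.right)} /= 0")
    return vcs

# (a + b) / (c - 1) yields the obligation "(c - 1) /= 0".
expr = BinOp('/', BinOp('+', Var('a'), Var('b')),
                  BinOp('-', Var('c'), Const(1)))
print(divide_by_zero_vcs(expr))
```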