970 results for QUT Speaker Identity Verification System


Relevance:

30.00%

Publisher:

Abstract:

The aim of this work was to track and verify the delivery of respiratory-gated irradiations, performed with three versions of the TrueBeam linac, using a novel phantom arrangement that combined the OCTAVIUS® SRS 1000 array with a moving platform. The platform was programmed to generate sinusoidal motion of the array. This motion was tracked using the real-time position management (RPM) system, and four amplitude gating options were employed to interrupt MV beam delivery when the platform was not located within set limits. Time-resolved spatial information extracted from analysis of x-ray fluences measured by the array was compared to the programmed motion of the platform and to the trace recorded by the RPM system during delivery of the x-ray field. Temporal data recorded by the phantom and the RPM system were validated against trajectory log files, recorded by the linac during the irradiation, as well as oscilloscope waveforms recorded from the linac target signal. Gamma analysis was employed to compare time-integrated 2D x-ray dose fluences with theoretical fluences derived from the probability density function for each of the gating settings applied, where gamma criteria of 2%/2 mm, 1%/1 mm and 0.5%/0.5 mm were used to evaluate the limitations of the RPM system. Excellent agreement was observed in the analysis of spatial information extracted from the SRS 1000 array measurements. Comparisons of the average platform position with the expected position indicated absolute deviations of <0.5 mm for all four gating settings. Differences were observed when comparing time-resolved beam-on data stored in the RPM files and trajectory logs to the true target signal waveforms. Trajectory log files underestimated the cycle time between consecutive beam-on windows by 10.0 ± 0.8 ms. All measured fluences achieved 100% pass-rates using gamma criteria of 2%/2 mm, and 50% of the fluences achieved pass-rates >90% when criteria of 0.5%/0.5 mm were used. Results using this novel phantom arrangement indicate that the RPM system is capable of accurately gating x-ray exposure during the delivery of a fixed-field treatment beam.
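
For readers unfamiliar with the gamma criteria cited above, the sketch below illustrates a brute-force global gamma pass-rate calculation for two 2D fluence maps. It is a minimal illustration of the general technique, not the analysis software used in the study; the function name and the restriction of the search window to three DTA radii are my own choices.

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dd_frac=0.02, dta_mm=2.0):
    """Brute-force global gamma analysis of two 2D dose/fluence maps.

    ref, ev     : 2D numpy arrays on the same grid
    spacing_mm  : grid spacing in mm
    dd_frac     : dose-difference criterion as a fraction of the global max
    dta_mm      : distance-to-agreement criterion in mm
    Returns the percentage of reference points with gamma <= 1.
    """
    dd = dd_frac * ref.max()                      # global dose criterion
    ys, xs = np.indices(ref.shape)
    pos = np.stack([ys, xs], axis=-1) * spacing_mm
    # Only search evaluated points within a few DTA radii of each ref point
    search = int(np.ceil(3 * dta_mm / spacing_mm))
    passed = 0
    for iy, ix in np.ndindex(ref.shape):
        y0, y1 = max(iy - search, 0), min(iy + search + 1, ref.shape[0])
        x0, x1 = max(ix - search, 0), min(ix + search + 1, ref.shape[1])
        dist2 = np.sum((pos[y0:y1, x0:x1] - pos[iy, ix]) ** 2, axis=-1)
        dose2 = (ev[y0:y1, x0:x1] - ref[iy, ix]) ** 2
        gamma2 = dist2 / dta_mm**2 + dose2 / dd**2
        if gamma2.min() <= 1.0:                   # best agreement point passes
            passed += 1
    return 100.0 * passed / ref.size
```

Calling this with `dd_frac=0.02, dta_mm=2.0` corresponds to the 2%/2 mm criterion; `dd_frac=0.005, dta_mm=0.5` corresponds to the 0.5%/0.5 mm criterion used to probe the RPM system's limits.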

Relevance:

30.00%

Publisher:

Abstract:

Most automatic recognition systems are focused on recognizing, given a single mug-shot of an individual, any new image of that individual. Most verification systems are designed to authenticate an identity provided by the user. However, previous work rarely focuses on the problem of detecting when a new, i.e. unknown, individual is present. The work presented in this paper deals with providing the system with basic tools to detect when a new individual starts an interactive session, so that the system can add or improve an identity model in the database. Experiments carried out with a set of 36 different individuals show promising results.
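
As a concrete illustration of the idea, the hedged sketch below shows one simple way such a detector could work: the best match score against the enrolled identity models is compared with a threshold, and a sub-threshold score triggers enrollment of a new identity. The feature representation, cosine scoring and all names here are assumptions for illustration; the paper's actual method is not described in this abstract.

```python
import numpy as np

def classify_or_enroll(probe, gallery, threshold):
    """Decide whether a probe feature vector belongs to a known identity.

    probe     : 1D feature vector of the current user
    gallery   : dict mapping identity -> list of enrolled feature vectors
    threshold : minimum cosine similarity to accept a known identity
    Returns (identity, is_new): is_new is True when a new model was added.
    """
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    best_id, best_score = None, -1.0
    for identity, templates in gallery.items():
        score = max(cosine(probe, t) for t in templates)
        if score > best_score:
            best_id, best_score = identity, score

    if best_score >= threshold:
        gallery[best_id].append(probe)     # refine the existing identity model
        return best_id, False
    # Below threshold: treat as a new individual and enroll it
    new_id = f"unknown_{len(gallery)}"
    gallery[new_id] = [probe]
    return new_id, True
```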

Relevance:

30.00%

Publisher:

Abstract:

This article summarises the explorations of two Initial Teacher Education (ITE) lecturers, looking particularly at Muslim families' sense of belonging as they encounter the British education system. The study draws on Garcia's (2009; Alstad, 2013) view of monoglossic and heteroglossic settings, and on Cremin's (2015) proposition of the super-diversity of inner-city experiences. Case studies of individual families are used to create a picture that reflects the complexity and shifting nature of cultures, languages and identities in present-day Britain. Video and taped interviews are used, and the data are coded and analysed to identify prevailing themes. The families and schools taking part are active participants in the research process, giving informed and ongoing consent and having control over the resulting findings. Parents' and children's perceptions and experiences have evolved in complex ways across the generations, and in ways that challenge the stereotypes that dominate media portrayals. Early findings suggest that existing paradigms for discussing identity fail to capture these increasingly complex and super-diverse realities. In a world where xenophobia currently fuels rigid and stereotypical views of cultures in general and Muslim cultures in particular, it is important that the complexity of families' identities and relationships to the existing systems is seen, heard and appreciated.

Relevance:

30.00%

Publisher:

Abstract:

The air-sea flux of greenhouse gases (e.g. carbon dioxide, CO2) is a critical part of the climate system and a major factor in the biogeochemical development of the oceans. More accurate and higher resolution calculations of these gas fluxes are required if we are to fully understand and predict our future climate. Satellite Earth observation is able to provide large spatial scale datasets that can be used to study gas fluxes. However, the large storage requirements needed to host such data can restrict their use by the scientific community. Fortunately, the development of cloud computing can provide a solution. Here we describe an open source air-sea CO2 flux processing toolbox called the ‘FluxEngine’, designed for use on a cloud-computing infrastructure. The toolbox allows users to easily generate global and regional air-sea CO2 flux data from model, in situ and Earth observation data, and its air-sea gas flux calculation is user configurable. Its current installation on the Nephalae cloud allows users to easily exploit more than 8 terabytes of climate-quality Earth observation data for the derivation of gas fluxes. The resultant NetCDF output files contain >20 data layers covering the various stages of the flux calculation, along with process indicator layers to aid interpretation of the data. This paper describes the toolbox design, verifies the air-sea CO2 flux calculations, demonstrates the use of the tools for studying global and shelf-sea air-sea fluxes, and outlines future developments.
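
The user-configurable calculation described above follows the standard bulk formulation F = k · K0 · (pCO2,water − pCO2,air). The sketch below implements one common choice of parameterizations (Wanninkhof 1992 gas transfer velocity and Schmidt number, Weiss 1974 solubility) purely as an illustration of the formula; it is not FluxEngine code, and the toolbox supports other parameterizations.

```python
import math

def co2_flux(wind_ms, sst_c, salinity, pco2_water_uatm, pco2_air_uatm):
    """Bulk air-sea CO2 flux, F = k * K0 * (pCO2_water - pCO2_air).

    Returns flux in mol C m-2 yr-1 (positive = outgassing to atmosphere).
    """
    t = sst_c
    # Schmidt number for CO2 in seawater (Wanninkhof 1992 polynomial)
    sc = 2073.1 - 125.62 * t + 3.6276 * t**2 - 0.043219 * t**3
    # Gas transfer velocity, cm/h (Wanninkhof 1992 quadratic wind dependence)
    k_cmh = 0.39 * wind_ms**2 * (sc / 660.0) ** -0.5
    k_my = k_cmh * 0.01 * 24 * 365            # convert cm/h -> m/yr
    # CO2 solubility K0 in mol L-1 atm-1 (Weiss 1974)
    tk = t + 273.15
    ln_k0 = (-58.0931 + 90.5069 * (100.0 / tk) + 22.2940 * math.log(tk / 100.0)
             + salinity * (0.027766 - 0.025888 * (tk / 100.0)
                           + 0.0050578 * (tk / 100.0) ** 2))
    k0_m3 = math.exp(ln_k0) * 1000.0          # mol m-3 atm-1
    dpco2_atm = (pco2_water_uatm - pco2_air_uatm) * 1e-6
    return k_my * k0_m3 * dpco2_atm

# Example: moderate wind, warm shelf sea, supersaturated surface water
print(co2_flux(wind_ms=7.0, sst_c=15.0, salinity=35.0,
               pco2_water_uatm=400.0, pco2_air_uatm=380.0))
```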

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, we present a quantitative approach using probabilistic verification techniques for the analysis of reliability, availability, maintainability, and safety (RAMS) properties of satellite systems. The subject of our research is satellites used in mission-critical industrial applications. A strong case for using probabilistic model checking to support RAMS analysis of satellite systems is made by our verification results. This study is intended to build a foundation to help reliability engineers with a basic background in model checking apply probabilistic model checking to small satellite systems. We make two major contributions. The first is the application of RAMS analysis to satellite systems. In the past, RAMS analysis has been extensively applied in the field of electrical and electronics engineering. It allows system designers and reliability engineers to predict the likelihood of failures from historical or current operational data. There is high potential for the application of RAMS analysis in the field of space science and engineering. However, there is a lack of standardisation and suitable procedures for the correct study of RAMS characteristics for satellite systems. This thesis considers the promising application of RAMS analysis to the case of satellite design, use, and maintenance, focusing on its system segments. Data collection and verification procedures are discussed, and a number of considerations are also presented on how to predict the probability of failure. Our second contribution is leveraging the power of probabilistic model checking to analyse satellite systems. We present techniques for analysing satellite systems that differ from the more common quantitative approaches based on traditional simulation and testing. These techniques have not been applied in this context before. We present the use of probabilistic techniques via a suite of detailed examples, together with their analysis. Our presentation is incremental, in terms of the complexity of application domains and system models, with a detailed PRISM model of each scenario. We also provide results from practical work together with a discussion of future improvements.
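
As a minimal illustration of the kind of quantity probabilistic model checking computes, the sketch below evaluates bounded reachability of a failure state in a small discrete-time Markov chain, the analogue of a PRISM property such as P=? [ F<=T "failed" ]. The three-state subsystem and its transition probabilities are invented for illustration, not taken from the thesis.

```python
import numpy as np

def failure_probability(P, init, failed, horizon):
    """Transient analysis of a discrete-time Markov chain.

    P       : (n, n) row-stochastic transition matrix
    init    : index of the initial state
    failed  : set of absorbing failure-state indices
    horizon : number of steps
    Returns P(system has failed within `horizon` steps).
    """
    dist = np.zeros(P.shape[0])
    dist[init] = 1.0
    for _ in range(horizon):
        dist = dist @ P              # propagate the state distribution
    return dist[list(failed)].sum()  # mass in (absorbing) failure states

# Toy 3-state satellite subsystem: OK -> degraded -> failed (absorbing)
P = np.array([[0.995, 0.004, 0.001],
              [0.000, 0.990, 0.010],
              [0.000, 0.000, 1.000]])
print(failure_probability(P, init=0, failed={2}, horizon=8760))
```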

Relevance:

30.00%

Publisher:

Abstract:

Part 6: Engineering and Implementation of Collaborative Networks

Relevance:

30.00%

Publisher:

Abstract:

Today, heritage worldwide is at risk not only from natural processes of decay and destruction but also from social changes such as urbanization, globalization and the homogenization of cultures. With these emerging problems, the heritage conservation discourse has reached a new dimension, embracing a broader range of concepts such as tangible heritage, intangible heritage, community participation and indigenous knowledge. Even with this changing international context, Nepal's heritage conservation still focuses on monuments, sites and buildings. In addition, conservation practice still follows a top-down approach, and community involvement is limited to plans. Meanwhile, numerous intangible heritages such as masked dances, chariot processions, festivals and rituals, which form an integral part of people's daily social life, continue to be practised and managed by the community, without serious attention from the government. In the Kathmandu Valley, these heritages have been maintained by a traditional social association of people known as the "Guthi", which has existed since the 5th century. Most of the tangible and intangible heritages have survived for centuries because of this unique association of people. Among the numerous festivals of the Kathmandu Valley, Yenya Punhi, a major festival of Kathmandu, was chosen as the case for this study. It is well suited to the study, being celebrated in the most urbanized city of Nepal, which faces the challenges of every modern city, such as social change and urbanization. Despite these modern challenges, the Guthi still plays a major role in heritage conservation in the Kathmandu Valley, and there are now also interventions by various formal institutions. This study therefore focuses on the management, continuity and problems of the festival, along with Nepal's position on intangible heritage conservation. The problems of Kathmandu and the Yenya Punhi festival are shared by every country in a similar situation, so this case study can serve as an example for finding solutions, not only for other festivals within Nepal but also elsewhere in the world.

Relevance:

30.00%

Publisher:

Abstract:

This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Relevance:

30.00%

Publisher:

Abstract:

Phonation distortion leaves relevant marks in a speaker's biometric profile, so dysphonic voice production may be used for biometric speaker characterization. In the present paper, phonation features derived from glottal source (GS) parameterization, after vocal tract inversion, are proposed for dysphonic voice characterization in Speaker Verification tasks. The GS-derived parameters are matched in a forensic evaluation framework that defines a distance-based metric specification. The phonation segments used in the study are derived from fillers, long vowels and other phonation segments produced in spontaneous telephone conversations. Phonated segments from a telephone database of 100 male native Spanish speakers are combined in a 10-fold cross-validation task to produce the set of quality measurements outlined in the paper. Shimmer, the mucosal wave correlate, vocal fold cover biomechanical parameter unbalance and a subset of the GS cepstral profile produce accuracy rates as high as 99.57% over a wide threshold interval (62.08-75.04%). An Equal Error Rate of 0.64% can be achieved. The proposed metric framework is shown to behave more fairly than classical likelihood ratios in supporting the hypothesis of the defense vs. that of the prosecution, thus offering more reliable evaluation scores. Possible applications are Speaker Verification and Dysphonic Voice Grading.
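
The Equal Error Rate quoted above is the operating point at which false acceptances and false rejections are equally likely. A minimal sketch of estimating it empirically from genuine and impostor score sets is shown below; the score convention (higher = more similar) is an assumption, and this is not the paper's evaluation code.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Estimate the Equal Error Rate from two score distributions.

    genuine  : similarity scores for same-speaker trials
    impostor : similarity scores for different-speaker trials
    Returns (eer, threshold) at the point where FAR ~= FRR.
    """
    genuine = np.asarray(genuine, dtype=float)
    impostor = np.asarray(impostor, dtype=float)
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best_gap, eer, best_th = np.inf, 1.0, thresholds[0]
    for th in thresholds:
        far = np.mean(impostor >= th)   # impostors wrongly accepted
        frr = np.mean(genuine < th)     # genuine speakers wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, eer, best_th = abs(far - frr), (far + frr) / 2, th
    return eer, best_th
```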

Relevance:

30.00%

Publisher:

Abstract:

Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
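
A minimal hand-written analogue of what the FOLCSL-synthesized verification program does, checking invariants while replaying an event trace, is sketched below. The trace format, state representation and example invariant are invented for illustration and are far simpler than FOLCSL's first-order logic specifications.

```python
def check_invariants(trace, invariants):
    """Replay an event trace and flag invariant violations.

    trace      : iterable of (event_name, payload) tuples from the simulator
    invariants : dict mapping a name to a predicate over the running state
    Returns a list of (step, invariant_name) violations.
    """
    state = {}
    violations = []
    for step, (name, payload) in enumerate(trace):
        state[name] = payload                 # latest payload per event type
        for inv_name, predicate in invariants.items():
            if not predicate(state):
                violations.append((step, inv_name))
    return violations

# Example: in a pipeline simulator, "commit" count must never outrun "fetch"
invariants = {
    "commit<=fetch": lambda s: s.get("commit", 0) <= s.get("fetch", 0),
}
trace = [("fetch", 1), ("commit", 1), ("commit", 2)]   # violates at step 2
print(check_invariants(trace, invariants))             # [(2, 'commit<=fetch')]
```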

Relevance:

30.00%

Publisher:

Abstract:

Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preference and system technical issues. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability and security. This dissertation contributed to the development of a unique and novel smart grid test-bed laboratory with integrated monitoring, protection and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes and abnormal conditions, along with stability and security information. These developments provide valuable measurements for technical power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system was developed and implemented in the smart grid laboratory with integrated functionality for wide-area applications. Experiments and procedures were developed to detect abnormal system conditions and apply proper remedies to heal the system. A DC microgrid was designed and integrated into the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side, enabling such architectures to be studied in system operation and used to help remedy abnormal conditions. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features to monitor system security and stability measures. These indices were measured experimentally during operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage power sharing between AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating heavy load impacts on system stability and operational security.
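
As an illustration of the phasor measurement principle mentioned above, the sketch below estimates the fundamental phasor of a sampled waveform with a one-cycle DFT correlation, one standard approach. The dissertation's DAQ-based implementation is not described in this abstract, so all names and parameters here are assumptions.

```python
import numpy as np

def estimate_phasor(samples, samples_per_cycle):
    """One-cycle DFT phasor estimate from evenly spaced waveform samples.

    samples           : one fundamental cycle of instantaneous values
    samples_per_cycle : number of samples per 50/60 Hz cycle
    Returns (rms_magnitude, phase_radians) of the fundamental component.
    """
    n = np.arange(samples_per_cycle)
    # Correlate with the fundamental; factor 2/N recovers the peak amplitude
    phasor = (2.0 / samples_per_cycle) * np.sum(
        samples * np.exp(-1j * 2 * np.pi * n / samples_per_cycle))
    return abs(phasor) / np.sqrt(2), np.angle(phasor)

# 60 Hz test wave: 120 V RMS at +30 degrees, sampled 64 times per cycle
fs_per_cycle = 64
t = np.arange(fs_per_cycle) / fs_per_cycle
wave = 120 * np.sqrt(2) * np.cos(2 * np.pi * t + np.radians(30))
print(estimate_phasor(wave, fs_per_cycle))   # ~ (120.0, 0.5236)
```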

Relevance:

30.00%

Publisher:

Abstract:

Two key solutions to reduce greenhouse gas emissions and increase overall energy efficiency are to maximize the utilization of renewable energy resources (RERs) to generate energy for load consumption and to shift to low- or zero-emission plug-in electric vehicles (PEVs) for transportation. The aging and overburdened U.S. power grid infrastructure is under tremendous pressure to handle the issues involved in the penetration of RERs and PEVs. The future power grid should be designed for the effective utilization of distributed RERs and distributed generation, intelligently responding to varying customer demand, including PEVs, with a high level of security, stability and reliability. This dissertation develops and verifies such a hybrid AC-DC power system. The system operates in a distributed manner, incorporating multiple components on both the AC and DC sides, and works in both grid-connected and islanding modes. The verification was performed on a laboratory-based hybrid AC-DC power system testbed serving as a hardware/software platform. In this system, RER emulators, together with their maximum power point tracking technology and power electronics converters, were designed to test different energy harvesting algorithms. Energy storage devices, including lithium-ion batteries and ultra-capacitors, were used to optimize the performance of the hybrid power system. A lithium-ion battery smart energy management system with thermal and state-of-charge self-balancing was proposed to protect the energy storage system. A grid-connected DC PEV parking-garage emulator with five lithium-ion batteries was also designed, with smart charging functions that can emulate future vehicle-to-grid (V2G), vehicle-to-vehicle (V2V) and vehicle-to-house (V2H) services. These include grid voltage and frequency regulation, spinning reserves, microgrid islanding detection and energy resource support. The results show successful integration of the developed techniques for control and energy management of future hybrid AC-DC power systems with high penetration of RERs and PEVs.
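
As a toy illustration of state-of-charge self-balancing, the sketch below splits an available charging current across cells in proportion to their SoC headroom, so lower-SoC cells charge faster. This is an invented minimal policy for illustration only, not the dissertation's management algorithm, which also handles thermal balancing.

```python
def balance_charging_currents(socs, total_current_a, weight=2.0):
    """Split a charging current so lower-SoC cells charge faster.

    socs            : list of state-of-charge values in [0, 1]
    total_current_a : total available charging current in amps
    weight          : how strongly allocation favours low-SoC cells
    Returns per-cell currents summing to total_current_a.
    """
    headroom = [(1.0 - s) ** weight for s in socs]
    total = sum(headroom)
    if total == 0:                      # all cells full: nothing to allocate
        return [0.0] * len(socs)
    return [total_current_a * h / total for h in headroom]

# Five PEV batteries, as in the parking-garage emulator scenario
print(balance_charging_currents([0.20, 0.55, 0.80, 0.95, 0.40], 100.0))
```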

Relevance:

30.00%

Publisher:

Abstract:

Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining hardware circuit size. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved within constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools. Hardware acceleration can deliver significant performance improvements for highly mathematical calculations or repeated functions. The performance of SoC systems can then be improved if hardware acceleration is applied to the element that incurs the performance overhead. The concepts in this study can be applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, central-bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption and resource costs, are measured and analyzed, and the trade-off among these three factors is compared and balanced. Different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, with hardware optimization techniques used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the central-bus (Bus-IP) design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
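
Step (1), software profiling to locate hotspot functions, can be illustrated with a short sketch. The code below uses Python's cProfile on an invented stand-in workload; the actual study profiled an H.264 CODEC with attributes such as cycles per loop, which this sketch only approximates with cumulative time per function.

```python
import cProfile
import math
import pstats

def profile_hotspots(func, *args, top=5):
    """Run `func` under cProfile and report its costliest call sites."""
    profiler = cProfile.Profile()
    profiler.enable()
    func(*args)
    profiler.disable()
    stats = pstats.Stats(profiler).sort_stats("cumulative")
    stats.print_stats(top)          # the top entries are candidate accelerators

def toy_dct_block(n=200):
    """Stand-in for a CODEC hotspot: repeated dense math in a loop."""
    total = 0.0
    for _ in range(n):
        total += sum(math.cos(i) * math.sin(i) for i in range(10_000))
    return total

profile_hotspots(toy_dct_block)
```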

Relevance:

30.00%

Publisher:

Abstract:

Bilinear pairings can be used to construct cryptographic systems with very desirable properties. A pairing maps members of groups on elliptic and genus 2 hyperelliptic curves to an extension of the finite field over which the curves are defined. The finite fields must, however, be large to ensure adequate security. The complicated group structure of the curves and the expensive field operations result in time-consuming computations that are an impediment to the practicality of pairing-based systems. The Tate pairing can be computed efficiently using the ηT method. Hardware architectures can be used to accelerate the required operations by exploiting the parallelism inherent to the algorithmic and finite field calculations. The Tate pairing can be performed on elliptic curves of characteristic 2 and 3 and on genus 2 hyperelliptic curves of characteristic 2. Curve selection depends on several factors, including desired computational speed, the area constraints of the target device and the required security level. In this thesis, custom hardware processors for the acceleration of the Tate pairing are presented and implemented on an FPGA. The underlying hardware architectures are designed with care to exploit available parallelism while ensuring resource efficiency. The characteristic 2 elliptic curve processor contains novel units that return a pairing result in a very low number of clock cycles. Despite the more complicated computational algorithm, the speed of the genus 2 processor is comparable. Pairing computation on each of these curves can be appealing in applications with various attributes. A flexible processor that can perform pairing computation on elliptic curves of characteristic 2 and 3 has also been designed. An integrated hardware/software design and verification environment has been developed. This system automates the procedures required for robust processor creation and enables the rapid provision of solutions for a wide range of cryptographic applications.
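
For reference, the property that makes pairings cryptographically useful is bilinearity, sketched below in standard notation; the generic groups and embedding degree k are textbook conventions, not specifics of the curves used in the thesis.

```latex
% A pairing maps pairs of curve points into a subgroup of an extension
% field \mathbb{F}_{q^k}, where k is the embedding degree:
\[
  e : \mathbb{G}_1 \times \mathbb{G}_2 \to \mathbb{G}_T, \qquad
  e(aP,\, bQ) = e(P, Q)^{ab} \quad \text{for all integers } a, b,
\]
% with non-degeneracy: e(P, Q) \neq 1 for suitable generators P and Q.
```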

Relevance:

30.00%

Publisher:

Abstract:

This thesis project studies the agent identity privacy problem in the scalar linear quadratic Gaussian (LQG) control system. For the agent identity privacy problem in LQG control, privacy models and privacy measures must be established first. The problem depends on a trajectory of correlated data rather than a single observation. I propose privacy models and corresponding privacy measures that take these two characteristics into account. The agent identity is a binary hypothesis: Agent A or Agent B. An eavesdropper is assumed to perform hypothesis testing on the agent identity based on the intercepted environment state sequence. The privacy risk is measured by the Kullback-Leibler divergence between the probability distributions of state sequences under the two hypotheses. By taking into account both the accumulative control reward and the privacy risk, an optimization problem for the policy of Agent B is formulated. The optimal deterministic privacy-preserving LQG policy of Agent B is a linear mapping. A sufficient condition is given to guarantee that the optimal deterministic privacy-preserving policy is time-invariant in the asymptotic regime. An independent Gaussian random variable cannot improve the performance of Agent B. The numerical experiments justify the theoretical results and illustrate the reward-privacy trade-off. Based on the privacy model and the LQG control model, I have formulated the mathematical problems for the agent identity privacy problem in LQG. The formulated problems address two design objectives: maximizing the control reward and minimizing the privacy risk. I have conducted theoretical analysis of the LQG control policy in the agent identity privacy problem and of the trade-off between the control reward and the privacy risk. Finally, the theoretical results are justified by numerical experiments, from which interesting observations and insights are drawn and explained in the last chapter.
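
For concreteness, the privacy measure described above can be written out. Under both hypotheses the LQG state sequence is jointly Gaussian, so the Kullback-Leibler divergence has a closed form; the notation below (stacked mean vectors and covariances) is mine for illustration and need not match the thesis exactly.

```latex
% Privacy risk: KL divergence between the state-sequence distributions
% under the two identity hypotheses, for an intercepted sequence x_{1:T}:
\[
  D\!\left(p_B \,\middle\|\, p_A\right)
  = \int p_B(x_{1:T}) \log \frac{p_B(x_{1:T})}{p_A(x_{1:T})}\, dx_{1:T} .
\]
% With jointly Gaussian sequences, x_{1:T} ~ N(\mu_i, \Sigma_i) under
% hypothesis i and stacked dimension d = Tn, this reduces to
\[
  D = \tfrac{1}{2}\Bigl(
        \operatorname{tr}\!\bigl(\Sigma_A^{-1}\Sigma_B\bigr)
        + (\mu_A - \mu_B)^{\top}\Sigma_A^{-1}(\mu_A - \mu_B)
        - d + \log\frac{\det\Sigma_A}{\det\Sigma_B}
      \Bigr).
\]
```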