34 results for noisy speaker verification
Abstract:
On the basis of the standard model for the photorefractive nonlinearity, we investigate whether a systematic description of the dependence of two-beam energy exchange on beam polarization and grating vector K is possible. We find good agreement between theory and experiment with respect to the polarization properties, and semi-quantitative agreement with respect to the K-dependence of the energy exchange.
Abstract:
Distributed Brillouin sensing of strain and temperature works by making spatially resolved measurements of the position of the measurand-dependent extremum of the resonance curve associated with the scattering process in the weakly nonlinear regime. Typically, measurements of backscattered Stokes intensity (the dependent variable) are made at a number of predetermined fixed frequencies covering the design measurand range of the apparatus and combined to yield an estimate of the position of the extremum. The measurand can then be found because its relationship to the position of the extremum is assumed known. We present analytical expressions relating the relative error in the extremum position to experimental errors in the dependent variable. This is done for two cases: (i) a simple non-parametric estimate of the mean based on moments and (ii) the case in which a least squares technique is used to fit a Lorentzian to the data. The question of statistical bias in the estimates is discussed and in the second case we go further and present for the first time a general method by which the probability density function (PDF) of errors in the fitted parameters can be obtained in closed form in terms of the PDFs of the errors in the noisy data.
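As a concrete illustration of case (ii), the sketch below fits a Lorentzian to simulated noisy Stokes-intensity samples taken on a fixed frequency grid and reads off the fitted extremum position. The grid, linewidth and noise level are assumptions chosen for demonstration, not values from the paper.

```python
# A minimal sketch of case (ii): least-squares fit of a Lorentzian to noisy
# Stokes-intensity samples on a fixed frequency grid, then reading off the
# fitted extremum position. All numerical values are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, peak, f0, width):
    """Lorentzian resonance curve; f0 is the measurand-dependent extremum."""
    return peak / (1.0 + ((f - f0) / (width / 2.0)) ** 2)

rng = np.random.default_rng(0)
freqs = np.linspace(10.60, 11.00, 81)              # fixed measurement grid (GHz)
clean = lorentzian(freqs, 1.0, 10.82, 0.030)       # 30 MHz linewidth, assumed
data = clean + rng.normal(0.0, 0.02, freqs.size)   # additive intensity noise

popt, pcov = curve_fit(lorentzian, freqs, data, p0=[1.0, 10.80, 0.040])
f0_hat, f0_err = popt[1], np.sqrt(pcov[1, 1])
print(f"fitted extremum: {f0_hat:.4f} GHz (1-sigma ≈ {f0_err*1e3:.2f} MHz)")
```

Note that the covariance returned by the fit gives only a Gaussian approximation to the parameter errors; the closed-form error PDFs derived in the paper go beyond this approximation.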
Abstract:
We introduce two techniques to measure the efficiency of inter-mode FWM relative to intra-mode FWM. The first technique allows the additional FWM penalty to be estimated for any given system; the second isolates the contribution of each mode. Measurements are compared to an analytical model, showing that the FWM signal increases by ∼2 dB with inter-mode phase matching.
Abstract:
A self-starting, all-fiber, passively mode-locked Tm-doped fiber laser based on a nonlinear optical loop mirror (NOLM) is demonstrated. Stable soliton pulses centered at 2017.33 nm with 1.56 nm FWHM were produced at a repetition rate of 1.514 MHz, with a pulse duration of 2.8 ps and a pulse energy of 83.8 pJ. As the pump power was increased, the oscillator could also operate in the noise-like (NL) regime. Stable NL pulses with a coherence spike width of 341 fs and a pulse energy of up to 249.32 nJ were achieved at a center wavelength of 2017.24 nm with 21.33 nm FWHM. To the best of our knowledge, this is the first NOLM-based mode-locked fiber laser in the 2 μm region operating in two regimes, with the highest single-pulse energy reported for NL pulses. © 2014 Optical Society of America.
Abstract:
The focus of our work is the verification of tight functional properties of numerical programs, such as showing that a floating-point implementation of Riemann integration computes a close approximation of the exact integral. Programmers and engineers writing such programs will benefit from verification tools that support an expressive specification language and that are highly automated. Our work provides a new method for verification of numerical software, supporting a substantially more expressive language for specifications than other publicly available automated tools. The additional expressivity in the specification language is provided by two constructs. First, the specification can feature inclusions between interval arithmetic expressions. Second, the integral operator from classical analysis can be used in the specifications, where the integration bounds can be arbitrary expressions over real variables. To support our claim of expressivity, we outline the verification of four example programs, including the integration example mentioned earlier. A key component of our method is an algorithm for proving numerical theorems. This algorithm is based on automatic polynomial approximation of non-linear real and real-interval functions defined by expressions. The PolyPaver tool is our implementation of the algorithm and its source code is publicly available. In this paper we report on experiments using PolyPaver that indicate that the additional expressivity does not come at a performance cost when comparing with other publicly available state-of-the-art provers. We also include a scalability study that explores the limits of PolyPaver in proving tight functional specifications of progressively larger randomly generated programs. © 2014 Springer International Publishing Switzerland.
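To make the first specification construct concrete, here is a minimal sketch of checking an inclusion between two interval arithmetic expressions. This is in no way PolyPaver's actual API; all names below are illustrative.

```python
# Illustrative sketch of an inclusion between interval arithmetic expressions
# (not PolyPaver's API). Real interval arithmetic would also round endpoints
# outward to stay sound under floating point; omitted here for brevity.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def contains(self, other):
        """Inclusion [a,b] ⊇ [c,d], i.e. a <= c and d <= b."""
        return self.lo <= other.lo and other.hi <= self.hi

# Specification-style check: x*x + x must stay inside a stated enclosure.
x = Interval(0.0, 0.5)
result = x * x + x                        # interval evaluation of the expression
spec = Interval(-0.01, 0.76)              # claimed enclosure in the specification
print(result, "included in spec:", spec.contains(result))
```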
Abstract:
A combination of the two-fluid and drift-flux models has been used to model the transport of fibrous debris. This debris is generated during loss-of-coolant accidents in the primary circuit of pressurized or boiling water nuclear reactors, as high-pressure steam or water jets can damage adjacent insulation materials, including mineral wool blankets. Fibre agglomerates released from the mineral wools may reach the containment sump strainers, where they can accumulate and compromise the long-term operation of the emergency core cooling system. Single-effect experiments of sedimentation in a quiescent rectangular column and sedimentation in a horizontal flow are used to verify and validate this particular application of the multiphase numerical models. Using both modeling approaches allows a number of pseudo-continuous dispersed phases of spherical wetted agglomerates to be modeled simultaneously. Key influences on the transport of the fibre agglomerates are particle size, density and turbulent dispersion, as well as the relative viscosity of the fluid-fibre mixture.
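As a hedged illustration of one ingredient of such a drift-flux closure, the sketch below computes a Stokes-regime settling velocity for a spherical wetted agglomerate with a simple mixture-viscosity correction. All property values are assumptions for demonstration, not data from the study.

```python
# Stokes settling of a spherical wetted fibre agglomerate with a dilute
# mixture-viscosity correction. Property values are illustrative assumptions.
def stokes_terminal_velocity(d, rho_p, rho_f, mu):
    """Terminal velocity (m/s) of a sphere of diameter d in the Stokes regime."""
    g = 9.81
    return (rho_p - rho_f) * g * d**2 / (18.0 * mu)

def mixture_viscosity(mu_f, alpha):
    """Einstein's dilute-suspension correction for dispersed-phase fraction alpha."""
    return mu_f * (1.0 + 2.5 * alpha)

d = 1.0e-4                         # agglomerate diameter (m), assumed
rho_p, rho_f = 1100.0, 998.0       # wetted agglomerate and water densities (kg/m^3)
mu = mixture_viscosity(1.0e-3, alpha=0.05)
v = stokes_terminal_velocity(d, rho_p, rho_f, mu)
print(f"settling velocity ≈ {v*1000:.2f} mm/s")
```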
Abstract:
We study noisy computation in randomly generated k-ary Boolean formulas. We establish a bound on the noise level above which the results of computation by random formulas are unreliable. This bound is saturated by formulas constructed from a single majority-like gate, and we show that such gates can be used to compute any Boolean function reliably below the noise bound.
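The classical threshold for the 3-input majority gate (ε = 1/6) gives a feel for such noise bounds. The sketch below iterates the standard error-propagation map for a tree of ε-noisy majority gates; it illustrates the phenomenon but does not reproduce the paper's bound for random k-ary formulas.

```python
# Iterate the error-propagation map for a tree of epsilon-noisy 3-majority
# gates and watch whether the input error is suppressed or washed out to 1/2.
# The 1/6 threshold for 3-majority is classical; the paper's bound for random
# k-ary formulas is analogous but not reproduced here.

def majority_error(p):
    """Probability that a majority-of-3 errs when each input errs with prob p."""
    return 3 * p**2 - 2 * p**3

def noisy_level(p, eps):
    """One level of noisy gates: gate flip XOR error of the input majority."""
    m = majority_error(p)
    return eps * (1 - m) + (1 - eps) * m

for eps in (0.10, 1 / 6, 0.20):
    p = 0.4                       # initial per-input error probability
    for _ in range(50):           # 50 levels of the formula tree
        p = noisy_level(p, eps)
    print(f"eps = {eps:.3f}: error after 50 levels ≈ {p:.3f}")
```

Below the threshold the error settles at a fixed point strictly below 1/2, so the formula's output remains correlated with the true value; at or above it the error drifts to 1/2 and the computation carries no information.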
Abstract:
Master of Arts dissertation
Abstract:
Reliability modelling and verification is indispensable in modern manufacturing, especially for reducing product development risk. Based on a discussion of the deficiencies of traditional reliability modelling methods for process reliability, a novel modelling method is presented that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework for manufacturing process reliability and product quality is presented, together with a product development and reliability verification process. According to their roles in manufacturing processes, key characteristics (KCs) are organised into four clusters, namely product KCs, material KCs, operation KCs and equipment KCs, which together represent the process knowledge network of manufacturing processes. A mathematical model and algorithm are developed for calculating the reliability requirements of KCs under different manufacturing process scenarios. A case study on valve-sleeve component manufacturing is provided as an application example of the new reliability modelling and verification procedure, in which the methodology is applied to manage and deploy production resources.
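A hedged sketch of the ANP step implied here: limiting priorities for the four KC clusters can be obtained by raising a column-stochastic supermatrix to a high power. The influence matrix below is invented for illustration; a real study would derive it from pairwise comparisons.

```python
# Limiting ANP priorities for the four KC clusters from a column-stochastic
# supermatrix. The 4x4 influence matrix is invented purely for illustration.
import numpy as np

# Columns: influence exerted by (product, material, operation, equipment) KCs;
# each column is normalised so the matrix is column-stochastic.
W = np.array([
    [0.00, 0.40, 0.30, 0.20],
    [0.35, 0.00, 0.30, 0.30],
    [0.40, 0.35, 0.00, 0.50],
    [0.25, 0.25, 0.40, 0.00],
])

limit = np.linalg.matrix_power(W, 100)     # limit supermatrix W^k for large k
priorities = limit[:, 0]                   # any column gives the steady state
for name, p in zip(["product", "material", "operation", "equipment"], priorities):
    print(f"{name} KCs: {p:.3f}")
```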
Abstract:
This paper describes work carried out to develop methods of verifying that machine tools are capable of machining parts to within specification, immediately before critical material removal operations and with negligible impact on process times. A review of machine tool calibration and verification technologies identified that current techniques are unsuitable because they demand significant time and skilled human intervention. A 'solution toolkit' is presented, consisting of a selection of circular tests and artefact-probing routines that can rapidly verify the kinematic errors, and in some cases the dynamic errors, of different types of machine tool, together with supplementary methods for tool and spindle error detection. A novel artefact-probing process is introduced that simplifies data processing so the process can be readily automated using only the native machine tool controller. Laboratory testing and industrial case studies demonstrate the effectiveness of this approach.
Abstract:
In today’s manufacturing industry there is an increasing need to improve internal processes to meet diverse client needs. Process re-engineering is an important activity that is well understood by industry, but its rate of application within small to medium-sized enterprises (SMEs) is less developed. Business pressures shift the focus of SMEs toward winning new projects and contracts rather than developing long-term, sustainable manufacturing processes. Variation in manufacturing processes is inevitable, but the amount of non-conformity often exceeds acceptable levels. This paper focuses on re-engineering the manufacturing and verification procedure for discrete parts production, with the aim of enhancing process control and product verification. The ideologies of the 'Push' and 'Pull' approaches to manufacturing are useful in the context of process re-engineering for data improvement. Currently, information is pulled from the market and prominent customers, and manufacturing companies try to make the right product by following customer procedures that attempt to verify against specifications; this approach can create significant quality-control challenges. The aim of this paper is to highlight the importance of process re-engineering in product verification in SMEs. Leadership, culture, ownership and process management are among the main attributes required for successful deployment of process re-engineering. The paper presents findings from a case study showcasing the application of a modified re-engineering method for the manufacturing and verification process; the findings indicate several advantages to implementing the method outlined here.
Abstract:
Five-axis machine tools are becoming increasingly popular as customers demand more complex machined parts. In high-value manufacturing, machine tools are essential to producing high-accuracy products: parts must be produced repeatably and in compliance with the defined design specifications. Machine tool performance is often degraded by geometric errors arising from a variety of causes, including incorrect tool offsets, errors in the centres of rotation and thermal growth; as a consequence, it can be difficult to produce highly accurate parts consistently. It is therefore essential to verify machine tools in terms of their geometric and positioning accuracy. Once a machine tool is verified, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no post-process measurement. In this paper the benefits of machine tool verification are listed, and a case study demonstrates the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
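As an illustration of the kind of ballbar diagnostic referred to, the sketch below evaluates a simulated bar-length deviation trace around a programmed circle and reports the circularity error together with a 2-per-revolution (squareness-type) component. The simulated error pattern is an assumption, not data from the case study.

```python
# Simulated ballbar trace: radial deviation around a programmed circle, with a
# 2-per-rev (squareness-like) component plus noise. Values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
nominal_radius = 150.0                                  # mm, programmed radius
deviation_um = 4.0 * np.sin(2 * theta) + rng.normal(0, 0.5, theta.size)

circularity = deviation_um.max() - deviation_um.min()   # µm, peak-to-valley
squareness_amp = 2.0 * abs(np.fft.rfft(deviation_um)[2]) / theta.size
print(f"circularity error ≈ {circularity:.1f} µm")
print(f"2-per-rev (squareness-type) amplitude ≈ {squareness_amp:.1f} µm")
```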
Abstract:
This paper details work carried out to verify the dimensional measurement performance of the Indoor GPS (iGPS) system, a network of Rotary-Laser Automatic Theodolites (R-LATs). Initially, tests were carried out to determine the angular uncertainties of an individual R-LAT transmitter-receiver pair. A method of determining the uncertainty of dimensional measurement for a three-dimensional coordinate measuring machine is then presented. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points; the reference standard used to calibrate these points was a fringe-counting interferometer, with the multilateration technique employed to establish three-dimensional coordinates. This extends the established technique of comparing measured lengths with calibrated lengths. The method was found to be practical and established that the expanded uncertainty of the basic iGPS system was approximately 1 mm at a 95% confidence level. Further tests on a highly optimized version of the iGPS system showed that the coordinate uncertainty can be reduced to 0.25 mm at a 95% confidence level.
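For orientation, the sketch below shows the multilateration step in its simplest form: recovering a 3D coordinate from distances to known reference stations by nonlinear least squares. The station layout and test point are invented; the paper's interferometer-based procedure is not reproduced.

```python
# Minimal multilateration: recover a 3D point from ideal range measurements to
# four known stations via nonlinear least squares. Geometry is illustrative.
import numpy as np
from scipy.optimize import least_squares

stations = np.array([[0.0, 0.0, 0.0],
                     [5.0, 0.0, 0.0],
                     [0.0, 5.0, 0.0],
                     [2.5, 2.5, 4.0]])
true_point = np.array([1.2, 3.4, 0.8])
ranges = np.linalg.norm(stations - true_point, axis=1)   # measured lengths

def residuals(p):
    """Difference between modelled and measured station-to-point distances."""
    return np.linalg.norm(stations - p, axis=1) - ranges

fit = least_squares(residuals, x0=np.zeros(3))
print("recovered point:", np.round(fit.x, 6))
```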
Abstract:
Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
Abstract:
The verification and validation of engineering designs are of primary importance as they directly influence production performance and ultimately define product functionality and customer perception. Research in aspects of verification and validation is widely spread ranging from tools employed during the digital design phase, to methods deployed for prototype verification and validation. This paper reviews the standard definitions of verification and validation in the context of engineering design and progresses to provide a coherent analysis and classification of these activities from preliminary design, to design in the digital domain and the physical verification and validation of products and processes. The scope of the paper includes aspects of system design and demonstrates how complex products are validated in the context of their lifecycle. Industrial requirements are highlighted and research trends and priorities identified. © 2010 CIRP.