965 results for Speaker verification
Abstract:
A combination of the two-fluid and drift flux models has been used to model the transport of fibrous debris. This debris is generated during loss-of-coolant accidents in the primary circuit of pressurized or boiling water nuclear reactors, as high-pressure steam or water jets can damage adjacent insulation materials, including mineral wool blankets. Fibre agglomerates released from the mineral wools may reach the containment sump strainers, where they can accumulate and compromise the long-term operation of the emergency core cooling system. Single-effect experiments of sedimentation in a quiescent rectangular column and sedimentation in a horizontal flow are used to verify and validate this particular application of the multiphase numerical models. The combined use of both modeling approaches allows a number of pseudo-continuous dispersed phases of spherical wetted agglomerates to be modeled simultaneously. Key effects on the transport of the fibre agglomerates are particle size, density and turbulent dispersion, as well as the relative viscosity of the fluid-fibre mixture.
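The abstract does not give the governing equations; as a rough illustration of how particle size, density and fluid viscosity enter the sedimentation behaviour, the sketch below evaluates the classical Stokes terminal velocity for a wetted, spherical agglomerate. The values and the Stokes-regime assumption are illustrative only, not taken from the study.

```python
# Minimal sketch: terminal settling velocity of a wetted, spherical fibre
# agglomerate in the Stokes regime, illustrating how particle size, density
# and (mixture) viscosity influence sedimentation.  All numbers are
# illustrative assumptions, not values from the study.

def stokes_settling_velocity(d_p, rho_p, rho_f=998.0, mu=1.0e-3):
    """Terminal velocity [m/s] of a sphere of diameter d_p [m] and density
    rho_p [kg/m^3] in a fluid of density rho_f [kg/m^3] and viscosity mu [Pa s]."""
    g = 9.81
    return g * (rho_p - rho_f) * d_p**2 / (18.0 * mu)

# Example: a 1 mm agglomerate slightly denser than water
print(stokes_settling_velocity(d_p=1.0e-3, rho_p=1050.0))  # ~0.028 m/s
```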
Abstract:
A verification task of proving the equivalence of two descriptions of the same device is examined for the case when one of the descriptions is partially defined. In this case, the verification task reduces to checking whether the logical descriptions are equivalent on the domain of the incompletely defined one. A simulation-based approach to solving this task for different vector forms of description representation is proposed. Fast Boolean computations over large Boolean and ternary vectors underlie the proposed methods.
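As an illustration of equivalence on the domain of a partially defined description, the sketch below exhaustively simulates two single-output functions and ignores input vectors where the incomplete one is unspecified. The function names and the small example are assumptions for illustration; the paper's actual methods operate on large Boolean and ternary vectors.

```python
# Minimal sketch of the idea: two single-output descriptions are considered
# equivalent if they agree on every input where the partially defined one
# is specified ('X' marks unspecified outputs).  Names and data are
# illustrative assumptions, not from the paper.
from itertools import product

def equivalent_on_domain(full_spec, partial_spec, n_inputs):
    """full_spec, partial_spec: callables mapping an input tuple to 0/1;
    partial_spec may return 'X' for unspecified inputs."""
    for vec in product((0, 1), repeat=n_inputs):
        expected = partial_spec(vec)
        if expected == 'X':          # outside the defined domain: ignore
            continue
        if full_spec(vec) != expected:
            return False
    return True

# Example: the partial spec only defines the output when the first input is 1
full = lambda v: v[0] & v[1]
partial = lambda v: (v[0] & v[1]) if v[0] == 1 else 'X'
print(equivalent_on_domain(full, partial, n_inputs=2))  # True
```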
Abstract:
The paper presents a verification of a previously developed conceptual model of security-related processes in DRM implementation. The applicability of the established security requirements in practice is also checked by comparing these requirements to four real DRM implementations (Microsoft Media DRM, Apple's iTunes, SunnComm Technologies' MediaMax DRM and First4Internet's XCP DRM). The exploited weaknesses of these systems resulting from the violation of specific security requirements are explained, and the possibilities to avoid the attacks by implementing the requirements at the design step are discussed.
Abstract:
Master of Arts dissertation
Abstract:
Магдалина Василева Тодорова - The article describes an approach to the verification of procedural programs by constructing models of them defined by generalized nets. The approach integrates the "design by contract" concept with verification approaches of the theorem-proving and model consistency-checking types. To this end, the functions that make up the program are verified separately against specifications corresponding to their purpose. A generalized net model is built that specifies the relationships between the functions in the form of correct sequences of calls. For the main function of the program, a generalized net model is constructed and it is checked whether it conforms to the net model of the relationships between the program's functions. Each function of the program that uses other functions is also verified against the specification given by the net model of the relationships between the program's functions.
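The sketch below illustrates only the "design by contract" ingredient of the approach: each function is checked separately against a specification expressed as pre- and postconditions. The decorator, the int_sqrt example and its contract are illustrative assumptions; the generalized net models of call sequences described in the article are not reproduced here.

```python
# Minimal sketch of the "design by contract" ingredient: a function is
# verified separately against a specification given as pre- and
# postconditions.  The decorator and example are illustrative only.
def contract(pre, post):
    def wrap(f):
        def checked(*args):
            assert pre(*args), "precondition violated"
            result = f(*args)
            assert post(result, *args), "postcondition violated"
            return result
        return checked
    return wrap

@contract(pre=lambda x: x >= 0, post=lambda r, x: r * r <= x < (r + 1) ** 2)
def int_sqrt(x):
    """Integer square root by linear search (kept simple for illustration)."""
    r = 0
    while (r + 1) ** 2 <= x:
        r += 1
    return r

print(int_sqrt(10))  # 3
```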
Abstract:
Reliability modelling and verification are indispensable in modern manufacturing, especially for product development risk reduction. Based on a discussion of the deficiencies of traditional reliability modelling methods for process reliability, a novel modelling method is presented herein that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework of manufacturing process reliability and product quality is presented together with a product development and reliability verification process. According to the roles of key characteristics (KCs) in manufacturing processes, KCs are organised into four clusters, that is, product KCs, material KCs, operation KCs and equipment KCs, which represent the process knowledge network of manufacturing processes. A mathematical model and algorithm are developed for calculating the reliability requirements of KCs with respect to different manufacturing process scenarios. A case study on valve-sleeve component manufacturing is provided as an application example of the new reliability modelling and verification procedure, in which the methodology is used to manage and deploy production resources.
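As one standard building block of ANP-based weighting, the sketch below derives priority weights for a small set of key characteristics from a pairwise comparison matrix via its principal eigenvector. The matrix values and the three-KC example are illustrative assumptions, not data from the valve-sleeve case study.

```python
# Minimal sketch of a standard ANP/AHP step: deriving priority weights for
# key characteristics (KCs) from a pairwise comparison matrix via its
# principal eigenvector (power iteration).  Matrix values are illustrative.
import numpy as np

def anp_priorities(pairwise, iterations=100):
    """Approximate the principal eigenvector of a positive reciprocal
    matrix by power iteration and normalise it to sum to 1."""
    w = np.ones(pairwise.shape[0])
    for _ in range(iterations):
        w = pairwise @ w
        w = w / w.sum()
    return w

# Example: three KCs compared pairwise (reciprocal judgement matrix)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(anp_priorities(A))  # roughly [0.65, 0.23, 0.12]
```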
Abstract:
This paper describes work carried out to develop methods of verifying that machine tools are capable of machining parts to within specification, immediately before carrying out critical material removal operations, and with negligible impact on process times. A review of machine tool calibration and verification technologies identified that current techniques were not suitable because they require significant time and skilled human intervention. A 'solution toolkit' is presented consisting of a selection of circular tests and artefact probing, which are able to rapidly verify the kinematic errors, and in some cases also dynamic errors, for different types of machine tool, as well as supplementary methods for tool and spindle error detection. A novel artefact probing process is introduced which simplifies data processing so that the process can be readily automated using only the native machine tool controller. Laboratory testing and industrial case studies are described which demonstrate the effectiveness of this approach.
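As a hedged illustration of the kind of data processing behind a probed circular test, the sketch below fits a least-squares circle to probed XY points and reports the radial deviation band. The point set and noise level are assumptions; the paper's actual toolkit and its automation on the machine controller are not reproduced here.

```python
# Minimal sketch of probed circular-test data processing: fit a circle to
# probed XY points (algebraic least-squares / Kasa fit) and report the
# radial deviation band as a simple indicator of kinematic error.
# The points below are simulated, not machine data.
import numpy as np

def circle_fit(x, y):
    """Return centre (a, b) and radius r of the least-squares circle."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (a, bb, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return a, bb, np.sqrt(c + a**2 + bb**2)

theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)
x = 50.0 + 10.0 * np.cos(theta) + np.random.normal(0, 0.002, theta.size)
y = 20.0 + 10.0 * np.sin(theta) + np.random.normal(0, 0.002, theta.size)

a, b, r = circle_fit(x, y)
radial_error = np.hypot(x - a, y - b) - r
print("radial deviation band (mm):", radial_error.max() - radial_error.min())
```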
Abstract:
In today’s modern manufacturing industry there is an increasing need to improve internal processes to meet diverse client needs. Process re-engineering is an important activity that is well understood by industry, but its rate of application within small to medium-sized enterprises (SMEs) is less developed. Business pressures shift the focus of SMEs toward winning new projects and contracts rather than developing long-term, sustainable manufacturing processes. Variations in manufacturing processes are inevitable, but the amount of non-conformity often exceeds acceptable levels. This paper is focused on the re-engineering of the manufacturing and verification procedure for discrete parts production with the aim of enhancing process control and product verification. The ideologies of the ‘Push’ and ‘Pull’ approaches to manufacturing are useful in the context of process re-engineering for data improvement. Currently, information is pulled from the market and prominent customers, and manufacturing companies try to make the right product by following customer procedures that attempt to verify against specifications. This approach can result in significant quality control challenges. The aim of this paper is to highlight the importance of process re-engineering in product verification in SMEs. Leadership, culture, ownership and process management are among the main attributes required for the successful deployment of process re-engineering. This paper presents the findings from a case study showcasing the application of a modified re-engineering method for the manufacturing and verification process. The findings from the case study indicate there are several advantages to implementing the re-engineering method outlined in this paper.
Abstract:
Five-axis machine tools are becoming increasingly popular as customers demand more complex machined parts. In high-value manufacturing, machine tools are essential for producing high-accuracy products. High-accuracy manufacturing requires producing parts repeatably and precisely, in compliance with the defined design specifications. The performance of machine tools is often affected by geometrical errors due to a variety of causes, including incorrect tool offsets, errors in the centres of rotation and thermal growth. As a consequence, it can be difficult to produce highly accurate parts consistently. It is, therefore, essential to ensure that machine tools are verified in terms of their geometric and positioning accuracy. When machine tools are verified in this way, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no post-process measurement. In this paper the benefits of machine tool verification are listed, and a case study is used to demonstrate the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
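As an illustration of how verified positional accuracy can feed design-for-verification rules, the sketch below computes process capability indices (Cp, Cpk) for a set of measured positional errors against an assumed tolerance. The error values and tolerance limits are illustrative, not case-study data.

```python
# Minimal sketch: process capability (Cp, Cpk) of a measured positional
# error against a feature tolerance.  Tolerance and measurements are
# illustrative assumptions, not data from the paper.
import statistics

def capability(measurements, lsl, usl):
    mu = statistics.mean(measurements)
    sigma = statistics.stdev(measurements)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Positional errors (mm) from repeated position checks
errors = [0.003, -0.002, 0.001, 0.004, -0.001, 0.002, 0.000, 0.003]
print(capability(errors, lsl=-0.010, usl=0.010))  # Cpk > 1.33 is commonly treated as capable
```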
Abstract:
This paper details work carried out to verify the dimensional measurement performance of the Indoor GPS (iGPS) system, a network of Rotary-Laser Automatic Theodolites (R-LATs). Initially, tests were carried out to determine the angular uncertainties of an individual R-LAT transmitter-receiver pair. A method is presented for determining the uncertainty of dimensional measurement for a three-dimensional coordinate measurement machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe-counting interferometer, with the multilateration technique employed to establish three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. The method was found to be practical and established that the expanded uncertainty of the basic iGPS system was approximately 1 mm at a 95% confidence level. Further tests carried out on a highly optimized version of the iGPS system have shown that the coordinate uncertainty can be reduced to 0.25 mm at a 95% confidence level.
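The comparison principle can be illustrated with a short sketch: differences between measured and calibrated reference lengths form an error sample from which an expanded uncertainty (coverage factor k = 2, roughly 95% confidence) is estimated. The numbers are invented for illustration and the estimate ignores systematic offsets, so it is a simplification of the procedure described above.

```python
# Minimal sketch of the comparison principle: measured lengths minus
# calibrated reference lengths give an error sample, from which an expanded
# uncertainty (k = 2) is estimated.  Values are illustrative, not iGPS data.
import statistics

def expanded_uncertainty(measured, calibrated, k=2.0):
    errors = [m - c for m, c in zip(measured, calibrated)]
    return k * statistics.stdev(errors)

calibrated = [500.000, 1000.000, 1500.000, 2000.000]   # mm, from interferometer
measured   = [500.210, 1000.390, 1499.640, 2000.280]   # mm, from the system under test
print(expanded_uncertainty(measured, calibrated))       # ~0.7 mm for these illustrative values
```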
Abstract:
Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
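One way to picture the hybrid idea is a Monte Carlo variation model in which one contributor's assumed distribution is replaced by measured data, narrowing the predicted spread of an assembly gap. The dimensions, tolerances and gap function below are illustrative assumptions, not the paper's methodology.

```python
# Minimal sketch: a Monte Carlo variation model of an assembly gap is
# refined by substituting measured data for one contributor, reducing the
# predicted spread.  All dimensions and tolerances are illustrative.
import random
import statistics

def simulate_gap(part_a, part_b, housing, n=10_000):
    """Gap = housing - (part A + part B); each argument is a sampler."""
    return [housing() - (part_a() + part_b()) for _ in range(n)]

# Design-stage model: all contributors from nominal +/- tolerance assumptions
nominal = lambda mu, sigma: (lambda: random.gauss(mu, sigma))
design = simulate_gap(nominal(10.0, 0.05), nominal(10.0, 0.05), nominal(20.3, 0.05))

# Hybrid model: part A replaced by actual measurements (tighter, slightly shifted)
measured_a = [random.gauss(10.01, 0.01) for _ in range(200)]
hybrid = simulate_gap(lambda: random.choice(measured_a),
                      nominal(10.0, 0.05), nominal(20.3, 0.05))

print(statistics.stdev(design), statistics.stdev(hybrid))  # hybrid spread is smaller
```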
Abstract:
The verification and validation of engineering designs are of primary importance as they directly influence production performance and ultimately define product functionality and customer perception. Research in aspects of verification and validation is widespread, ranging from tools employed during the digital design phase to methods deployed for prototype verification and validation. This paper reviews the standard definitions of verification and validation in the context of engineering design and progresses to provide a coherent analysis and classification of these activities from preliminary design, to design in the digital domain and the physical verification and validation of products and processes. The scope of the paper includes aspects of system design and demonstrates how complex products are validated in the context of their lifecycle. Industrial requirements are highlighted and research trends and priorities identified. © 2010 CIRP.
Abstract:
Our modular approach to data hiding is an innovative concept in the data hiding research field. It enables the creation of modular digital watermarking methods that have extendable features and are designed for use in web applications. The methods consist of two types of modules – a basic module and an application-specific module. The basic module mainly provides features which are connected with the specific image format. As JPEG is a preferred image format on the Internet, we have put a focus on the achievement of a robust and error-free embedding and retrieval of the embedded data in JPEG images. The application-specific modules are adaptable to user requirements in the concrete web application. The experimental results of the modular data watermarking are very promising. They indicate excellent image quality, satisfactory size of the embedded data and perfect robustness against JPEG transformations with prespecified compression ratios. ACM Computing Classification System (1998): C.2.0.
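A very rough sketch of the described two-module structure is given below: a format-oriented basic module handles embedding and retrieval, while an application-specific module supplies the payload for a particular web use case. The classes and the trivial byte-level "embedding" are placeholders chosen for illustration; the real methods operate on JPEG data and are robust to recompression.

```python
# Minimal sketch of the two-module structure only.  The byte-level
# "embedding" is a placeholder, not the actual JPEG watermarking method.
class BasicModule:
    """Format-specific embedding/retrieval (placeholder: appends bytes)."""
    def embed(self, carrier: bytes, payload: bytes) -> bytes:
        return carrier + b"--WM--" + payload

    def retrieve(self, stego: bytes) -> bytes:
        return stego.split(b"--WM--", 1)[1]

class CopyrightModule:
    """Application-specific module: supplies the payload for a web use case."""
    def __init__(self, owner: str):
        self.payload = f"(c) {owner}".encode()

basic, app = BasicModule(), CopyrightModule("example owner")
stego = basic.embed(b"<image bytes>", app.payload)
print(basic.retrieve(stego))  # b'(c) example owner'
```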
Abstract:
Software product line modeling aims at capturing a set of software products in an economic yet meaningful way. We introduce a class of variability models that capture the sharing between the software artifacts forming the products of a software product line (SPL) in a hierarchical fashion, in terms of commonalities and orthogonalities. Such models are useful when analyzing and verifying all products of an SPL, since they provide a scheme for divide-and-conquer-style decomposition of the analysis or verification problem at hand. We define an abstract class of SPLs for which variability models can be constructed that are optimal w.r.t. the chosen representation of sharing. We show how the constructed models can be fed into a previously developed algorithmic technique for compositional verification of control-flow temporal safety properties, so that the properties to be verified are iteratively decomposed into simpler ones over orthogonal parts of the SPL, and are not re-verified over the shared parts. We provide tool support for our technique, and evaluate our tool on a small but realistic SPL of cash desks.
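The divide-and-conquer idea can be sketched as follows: artifacts shared across products are verified once and the results cached, while orthogonal variant parts are checked per product. The property check is a stand-in and the cash-desk artifact names are assumptions; the actual technique verifies control-flow temporal safety properties compositionally.

```python
# Minimal sketch of divide-and-conquer verification over an SPL: shared
# artifacts are verified once (cached), variant-specific artifacts per
# product.  The local property check is a stand-in.
verified_cache = {}

def verify_artifact(artifact):
    """Verify one artifact against its local property (stand-in check)."""
    if artifact not in verified_cache:
        verified_cache[artifact] = not artifact.startswith("bad_")
    return verified_cache[artifact]

def verify_product(common, variant):
    """A product satisfies the property if all of its parts do."""
    return all(verify_artifact(a) for a in common) and \
           all(verify_artifact(a) for a in variant)

common_core = ["sale_process", "payment"]            # shared by all cash desks
products = {"basic":   ["cash_payment"],
            "premium": ["card_payment", "loyalty"]}
print({name: verify_product(common_core, parts) for name, parts in products.items()})
```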
Abstract:
This article investigates potential effects which (the recontextualisation of) interpreted discourse can have on the positioning of participants. The discursive events which form the basis of the analysis are international press conferences that bring politicians and journalists together. The dominant question addressed is: (how) do interpreter-mediated encounters influence the positioning of participants and thus the construction of interactional and social roles? The article illustrates that methods of (critical) discourse analysis can be used to identify positioning strategies which are employed by participants in such triadic exchanges. The data come from press conferences which involve English, German, and French as source and target languages.