29 results for Speaker verification


Relevance: 10.00%

Abstract:

The importance of a careful selection of rocks used in building facade cladding is highlighted. A simple and viable methodology for the structural detailing of dimension stones and the verification of their global performance is presented, based on simulations in the Strap structural analysis software. The results proved the applicability of the proposed structural dimensioning methodology, which represents an excellent, simple tool for dimensioning rock slabs used for building facade cladding. The Strap software satisfactorily simulated the structural behaviour of the stone slabs under the studied conditions, allowing the determination of alternative slab dimensions and the verification of the cladding strength at the supports.
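As a hedged illustration of the kind of check such a dimensioning procedure performs (a generic bending verification, not the paper's Strap model; all dimensions, loads and strengths below are hypothetical):

```python
# Minimal sketch: bending check of a facade slab treated as a simply
# supported strip under uniform wind pressure (hypothetical values).

def slab_bending_ok(span_m, thickness_m, wind_kpa, flex_strength_mpa, safety=3.0):
    """Return True if the slab strip passes a simple bending check."""
    q = wind_kpa * 1e3                   # load per unit area, N/m^2
    m_max = q * span_m**2 / 8.0          # max moment per metre width, N*m/m
    w = thickness_m**2 / 6.0             # section modulus per metre width, m^3/m
    sigma = m_max / w                    # max bending stress, Pa
    return sigma * safety <= flex_strength_mpa * 1e6

# Example: a 3 cm slab spanning 0.6 m between anchors under 1.5 kPa wind
print(slab_bending_ok(span_m=0.6, thickness_m=0.03, wind_kpa=1.5,
                      flex_strength_mpa=10.0))   # True
```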

Relevance: 10.00%

Abstract:

Distributed control systems consist of sensors, actuators and controllers interconnected by communication networks, and are characterized by a high number of concurrent processes. This work proposes a procedure to model and analyze communication networks for distributed control systems in intelligent buildings. The approach characterizes the control system as a discrete event system and applies coloured Petri nets as a formal method for the specification, analysis and verification of control solutions. With this approach, we develop the models that compose the communication networks for the control systems of an intelligent building, taking into account the relationships between the various building systems. The procedure provides a structured development of models, facilitating the specification of the control algorithm. An application example illustrates the main features of the approach.
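To make the token-game semantics underlying such models concrete, here is a minimal sketch of an ordinary Petri net interpreter for a hypothetical two-state device controller; it illustrates the general firing rule, not one of the paper's coloured Petri net models:

```python
# Toy Petri net interpreter: places hold token counts, and a transition
# fires when all of its input places are marked (hypothetical net).

marking = {"idle": 1, "running": 0}
transitions = {
    "start": {"in": ["idle"],    "out": ["running"]},
    "stop":  {"in": ["running"], "out": ["idle"]},
}

def enabled(t):
    return all(marking[p] > 0 for p in transitions[t]["in"])

def fire(t):
    assert enabled(t), f"transition {t} is not enabled"
    for p in transitions[t]["in"]:
        marking[p] -= 1   # consume input tokens
    for p in transitions[t]["out"]:
        marking[p] += 1   # produce output tokens

fire("start")
print(marking)   # {'idle': 0, 'running': 1}
```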

Relevance: 10.00%

Abstract:

Since the 1990s, several large companies have been publishing nonfinancial performance reports. Focusing initially on the physical environment, these reports evolved to cover social relations as well as data on the firm's economic performance. A few mining companies pioneered this trend, and in recent years some of them have incorporated the three dimensions of sustainable development, publishing so-called sustainability reports. This article reviews 31 reports published between 2001 and 2006 by four major mining companies. A set of 62 assessment items organized in six categories (namely context and commitment, management, environmental, social and economic performance, and accessibility and assurance) was selected to guide the review. The items were derived from the international literature and recommended best practices, including the Global Reporting Initiative G3 framework. A content analysis was performed using the report as the sampling unit, and phrases, graphics or tables containing certain information as data collection units. A binary rating scale (0 or 1) recorded the presence or absence of each item, and a final percentage score was obtained for each report. Results show a clear evolution in the reports' comprehensiveness and depth. The categories "accessibility and assurance" and "economic performance" had the lowest scores and showed no clear trend over the period, whereas "context and commitment" and "social performance" presented the best results and steady improvement; "environmental performance", although it did not reach the highest scores, also improved constantly. The description of data measurement techniques and more comprehensive third-party verification are the items most in need of improvement.
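A small worked example of the scoring arithmetic described above (binary 0/1 ratings over the 62 items; the per-category counts below are hypothetical):

```python
# Hypothetical example of the binary (0/1) scoring described above:
# items found present per category, out of the per-category totals (sum = 62).
present = {"context": 9, "management": 8, "environmental": 10,
           "social": 9, "economic": 4, "assurance": 3}
totals  = {"context": 12, "management": 10, "environmental": 14,
           "social": 12, "economic": 7, "assurance": 7}

score = 100.0 * sum(present.values()) / sum(totals.values())
print(f"report score: {score:.1f}%")   # 43 of 62 items -> 69.4%
```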

Relevance: 10.00%

Abstract:

The computational design of a composite whose constituents' properties change gradually within a unit cell can be successfully achieved by means of a material design method that combines topology optimization with homogenization. This is an iterative numerical method, which changes the composite material unit cell until the desired properties (or performance) are obtained. The method has been applied to several types of materials in the last few years. In this work, the objective is to extend the material design method to obtain functionally graded material architectures, i.e. materials that are graded at the local (e.g. microstructural) level. Consistent with this goal, a continuum distribution of the design variable inside the finite element domain is considered to represent a fully continuous material variation during the design process, so the topology optimization naturally leads to a smoothly graded material system. Numerical examples illustrate the theoretical and numerical approaches. The homogenization method is verified against one-dimensional material gradation profiles for which analytical solutions for the effective elastic properties are available. The verification is extended to two dimensions considering a trigonometric material gradation and a material variation with discontinuous derivatives. These cases are also used as benchmark examples to verify the optimization method for functionally graded material cell design. Finally, the influence of material gradation on extreme materials is investigated, including materials with near-zero shear modulus and materials with negative Poisson's ratio.
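For reference, the usual closed-form effective moduli for one-dimensional gradation over a unit cell of length L are the uniform-strain (Voigt) and uniform-stress (Reuss) averages; whether these are the exact analytical solutions used for the verification in the paper is an assumption:

```latex
E_{\mathrm{Voigt}} = \frac{1}{L}\int_{0}^{L} E(x)\,\mathrm{d}x,
\qquad
E_{\mathrm{Reuss}} = \left(\frac{1}{L}\int_{0}^{L}\frac{\mathrm{d}x}{E(x)}\right)^{-1}
```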

Relevance: 10.00%

Abstract:

Petri net (PN) modeling is one of the most widely used formal methods in the field of automation applications, together with programmable logic controllers (PLCs). The creation of a PN modeling methodology compatible with the IEC 61131 standard is therefore a necessity for automation specialists. Different works dealing with this subject, reviewed in the first part of this paper [Frey (2000a, 2000b); Peng and Zhou (IEEE Trans Syst Man Cybern, Part C Appl Rev 34(4):523-531, 2004); Uzam and Jones (Int J Adv Manuf Technol 14(10):716-728, 1998)], do not present a methodology completely compatible with this standard. At the same time, they neither maintain the simplicity required for such applications nor exploit the all-graphical and all-mathematical ordinary Petri net (OPN) tools that facilitate model verification and validation. The proposal presented here fulfills these requirements. Educational applications at USP and UEA (Brazil) and UO (Cuba), as well as industrial applications in Brazil and Cuba, have already been carried out with good results.
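The "all-mathematical" OPN tools mentioned above rest on the standard Petri net state equation: a firing sequence with firing-count vector sigma takes the initial marking to

```latex
M' = M_0 + C\,\sigma, \qquad C = \mathrm{Post} - \mathrm{Pre},
```

where Pre and Post are the input and output incidence matrices, and a transition t is enabled at marking M whenever M >= Pre[., t]. This linear-algebraic form is what makes invariant analysis and model verification tractable.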

Relevance: 10.00%

Abstract:

This study examines the applicability of a micromechanics approach, based upon the computational cell methodology incorporating the Gurson-Tvergaard (GT) model, and of the CTOA criterion to describe ductile crack extension of longitudinal crack-like defects in high-pressure pipeline steels. A central focus is to gain additional insight into the effectiveness and limitations of both approaches in describing the crack growth response and predicting the burst pressure of the tested cracked pipes. A verification study conducted on burst testing of large-diameter, precracked pipe specimens with varying crack depth to thickness ratio (a/t) shows the potential predictive capability of the cell approach, even though both the GT model and the CTOA criterion appear to depend on defect geometry. Overall, the results presented here lend additional support to further developments of the cell methodology as a valid engineering tool for integrity assessments of pipelines with axial defects.
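For reference, the Gurson-Tvergaard yield surface that underlies the computational cell model is the standard form

```latex
\Phi = \left(\frac{\sigma_e}{\bar{\sigma}}\right)^{2}
     + 2\,q_1 f \cosh\!\left(\frac{3\,q_2\,\sigma_m}{2\,\bar{\sigma}}\right)
     - \left(1 + q_3 f^{2}\right) = 0, \qquad q_3 = q_1^{2},
```

where sigma_e is the macroscopic Mises stress, sigma_m the mean stress, sigma-bar the matrix flow stress, f the void volume fraction, and q1, q2 Tvergaard's calibration parameters.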

Relevance: 10.00%

Abstract:

This paper presents a novel algorithm to achieve viable integrity and authenticity addition and verification for n-frame DICOM medical images using cryptographic mechanisms. The aim of this work is to enhance DICOM security measures, especially for multiframe images, since current approaches have limitations that should be properly addressed. The proposed algorithm uses data encryption together with a digital signature to provide integrity and authenticity. Relevant header data and the digital signature are used as inputs to cipher the image, so the original data can be retrieved if and only if the images and the inputs are correct. The encryption process itself is a cascading scheme, where each frame is ciphered with data related to the previous frames, also generating additional data on image integrity and authenticity. Decryption is similar to encryption and also features the standard security verification of the image. The implementation was done in Java, and a performance evaluation compared the speed of the algorithm with other existing approaches. The evaluation showed good performance, an encouraging result for using the algorithm in a real environment.
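The abstract does not spell out the cipher construction; below is a minimal sketch of one possible cascading scheme with the stated properties (each frame keyed by header data, the digital signature and all previous ciphertext), using AES-GCM from the Python `cryptography` package. It is an assumption-laden illustration, not the authors' algorithm:

```python
# Minimal sketch of a per-frame cascading cipher (NOT the authors' exact
# algorithm): each frame key depends on the header digest, the digital
# signature, and every previous ciphertext, so decryption fails unless
# all inputs and all earlier frames are intact.
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_frames(frames, header_digest, signature):
    state = hashlib.sha256(header_digest + signature).digest()
    out = []
    for frame in frames:
        key = hashlib.sha256(b"key" + state).digest()   # 32-byte AES-256 key
        nonce = os.urandom(12)
        ct = AESGCM(key).encrypt(nonce, frame, header_digest)
        out.append((nonce, ct))
        state = hashlib.sha256(state + ct).digest()     # cascade into next frame
    return out
```

Decryption mirrors the loop; the GCM tag gives the per-frame integrity check, and a tampered frame or wrong header/signature breaks the chain from that point on.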

Relevance: 10.00%

Abstract:

Sound source localization (SSL) is an essential task in many applications involving speech capture and enhancement. As such, speaker localization with microphone arrays has received significant research attention. Nevertheless, existing SSL algorithms for small arrays still have two significant limitations: lack of range resolution, and accuracy degradation with increasing reverberation. The latter is natural and expected, given that strong reflections can have amplitudes similar to that of the direct signal, but different directions of arrival. Therefore, correctly modeling the room and compensating for the reflections should reduce the degradation due to reverberation. In this paper, we show a stronger result. If modeled correctly, early reflections can be used to provide more information about the source location than would have been available in an anechoic scenario. The modeling not only compensates for the reverberation, but also significantly increases resolution for range and elevation. Thus, we show that under certain conditions and limitations, reverberation can be used to improve SSL performance. Prior attempts to compensate for reverberation tried to model the room impulse response (RIR). However, RIRs change quickly with speaker position, and are nearly impossible to track accurately. Instead, we build a 3-D model of the room, which we use to predict early reflections, which are then incorporated into the SSL estimation. Simulation results with real and synthetic data show that even a simplistic room model is sufficient to produce significant improvements in range and elevation estimation, tasks which would be very difficult when relying only on direct path signal components.
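As a sketch of the geometry involved (not the authors' estimator), first-order early reflections in a rectangular "shoebox" room can be predicted by mirroring a candidate source across each wall; the extra path lengths of these image sources are what carry the additional range and elevation information:

```python
import numpy as np

def first_order_images(src, room_dims):
    """Mirror a source across the 6 walls of a shoebox room.

    src: (3,) candidate source position; room_dims: (Lx, Ly, Lz) with
    walls at 0 and L along each axis. Hypothetical helper, not the
    paper's implementation.
    """
    src = np.asarray(src, dtype=float)
    images = []
    for axis, L in enumerate(room_dims):
        for wall in (0.0, L):
            img = src.copy()
            img[axis] = 2.0 * wall - src[axis]   # reflect across the wall
            images.append(img)
    return images

mic = np.array([2.0, 1.5, 1.2])
for img in first_order_images([1.0, 2.0, 1.5], (5.0, 4.0, 3.0)):
    delay_ms = np.linalg.norm(img - mic) / 343.0 * 1e3  # reflection delay
    print(img, f"{delay_ms:.2f} ms")
```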

Relevance: 10.00%

Abstract:

When building genetic maps, it is necessary to choose from several marker ordering algorithms and criteria, and the choice is not always simple. In this study, we evaluate the efficiency of the algorithms try (TRY), seriation (SER), rapid chain delineation (RCD), recombination counting and ordering (RECORD) and unidirectional growth (UG), as well as the criteria PARF (product of adjacent recombination fractions), SARF (sum of adjacent recombination fractions), SALOD (sum of adjacent LOD scores) and LHMC (likelihood through hidden Markov chains), used with the RIPPLE algorithm for error verification, in the construction of genetic linkage maps. A linkage map of a hypothetical diploid and monoecious plant species was simulated, containing one linkage group and 21 markers at a fixed distance of 3 cM between them. In all, 700 F(2) populations were randomly simulated with 100 and 400 individuals, with different combinations of dominant and co-dominant markers, as well as 10 and 20% of missing data. The simulations showed that, in the presence of co-dominant markers only, any combination of algorithm and criterion may be used, even for a reduced population size. With a smaller proportion of dominant markers, any of the algorithms and criteria investigated (except SALOD) may be used. In the presence of high proportions of dominant markers and smaller samples (around 100), the probability of linkage in repulsion between markers increases and, in this case, the algorithms TRY and SER associated with RIPPLE under the LHMC criterion provide better results. Heredity (2009) 103, 494-502; doi:10.1038/hdy.2009.96; published online 29 July 2009
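To make one criterion concrete: SARF scores a candidate marker order by summing the recombination fractions between adjacent markers, and the best order is the one minimizing this sum. A sketch with a hypothetical pairwise matrix:

```python
import numpy as np

def sarf(order, rf):
    """Sum of adjacent recombination fractions for a marker order."""
    return sum(rf[a, b] for a, b in zip(order, order[1:]))

# Hypothetical pairwise recombination-fraction matrix for 4 markers
rf = np.array([[0.00, 0.03, 0.06, 0.09],
               [0.03, 0.00, 0.03, 0.06],
               [0.06, 0.03, 0.00, 0.03],
               [0.09, 0.06, 0.03, 0.00]])

print(sarf([0, 1, 2, 3], rf))   # 0.09 -- true order, minimal SARF
print(sarf([0, 2, 1, 3], rf))   # 0.15 -- swapped order scores worse
```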

Relevance: 10.00%

Abstract:

In this work, chemometric methods are reported as potential tools for monitoring the authenticity of Brazilian ultra-high temperature (UHT) milk processed in industrial plants located in different regions of the country. A total of 100 samples were submitted to qualitative analysis for adulterants such as starch, chlorine, formaldehyde, hydrogen peroxide and urine. With the exception of starch, every sample showed the presence of at least one of these adulterants. Chemometric methodologies such as Principal Component Analysis (PCA) and Hierarchical Cluster Analysis (HCA) enabled the verification that certain adulterations occur in specific regions. The proposed multivariate approaches may allow sanitary agency authorities to optimise material, human and financial resources, as they associate the occurrence of adulterations with the geographical location of the industrial plants.
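As an illustration of the multivariate step (with hypothetical data, not the paper's dataset), PCA on a binary samples-by-adulterants matrix reduces to a centered singular value decomposition:

```python
import numpy as np

# Hypothetical binary matrix: rows = milk samples, columns = adulterants
# (chlorine, formaldehyde, hydrogen peroxide, urine); 1 = detected.
X = np.array([[1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0],
              [0, 1, 1, 0],
              [0, 1, 0, 0]], dtype=float)

Xc = X - X.mean(axis=0)               # mean-center each variable
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                        # principal-component scores per sample
explained = S**2 / np.sum(S**2)       # variance explained per component
print(scores[:, :2])                  # similar adulteration patterns cluster
print(explained)
```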

Relevance: 10.00%

Abstract:

This paper is part of a larger study assessing the adequacy of the use of multivariate statistical techniques in theses and dissertations on consumer behavior produced in the marketing programs of some higher education institutions from 1997 to 2006. It focuses on regression and conjoint analysis, two techniques with great potential for use in marketing studies. The objective was to analyze whether the employment of these techniques suits the needs of the research problems presented, as well as to evaluate how well their premises were met. Overall, the results suggest the need for greater involvement of researchers in verifying all the theoretical precepts for applying techniques in the category of investigation of dependence among variables.
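Among the premises most often left unverified in such studies are the regression assumptions on residuals; a minimal sketch of two such checks on hypothetical data (using `scipy`):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 1, 100)     # hypothetical data

slope, intercept, r, p, se = stats.linregress(x, y)
residuals = y - (slope * x + intercept)

# Premise check 1: residual normality (Shapiro-Wilk)
print("normality p-value:", stats.shapiro(residuals).pvalue)
# Premise check 2: no systematic residual trend against the predictor
print("residual-vs-x correlation:", stats.pearsonr(x, residuals)[0])
```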

Relevance: 10.00%

Abstract:

The ultimate check of the actual dose delivered to a patient in radiotherapy can only be achieved by using in vivo dosimetry. This work reports a pilot study to test the applicability of a thermoluminescent dosimetric system for performing in vivo entrance dose measurements in external photon beam radiotherapy. The measurements demonstrated the value of thermoluminescent dosimetry as a treatment verification method and its applicability as part of a quality assurance program in radiotherapy.
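For context, entrance dose in TLD-based in vivo dosimetry is conventionally derived from the averaged reading through a calibration factor and correction factors; whether the pilot study used exactly this chain is an assumption:

```latex
D_{\mathrm{entrance}} = \bar{M}\, N_{\mathrm{cal}} \prod_i k_i,
```

where M-bar is the mean TLD reading, N_cal the dose calibration factor, and the k_i are correction factors (e.g. for energy dependence, fading and non-linearity).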

Relevance: 10.00%

Abstract:

Objective: The aim of this study was to evaluate the performance of observers in diagnosing proximal caries in digital images obtained from bitewing radiographs digitized with two scanners and four digital cameras, saved as Joint Photographic Experts Group (JPEG) and tagged image file format (TIFF) files, comparing them with the original conventional radiographs. Method: In total, 56 extracted teeth were radiographed with Kodak Insight film (Eastman Kodak, Rochester, NY) in a Kaycor Yoshida X-ray device (Kaycor X-707; Yoshida Dental Manufacturing Co., Tokyo, Japan) operating at 70 kV and 7 mA with an exposure time of 0.40 s. The radiographs were scanned with CanonScan D646U (Canon USA Inc., Newport News, VA) and Genius ColorPage HR7X (KYE Systems Corp. America, Doral, FL) scanners, and photographed with Canon Powershot G2 (Canon USA Inc.), Canon RebelXT (Canon USA Inc.), Nikon Coolpix 8700 (Nikon Inc., Melville, NY) and Nikon D70s (Nikon Inc.) digital cameras, in JPEG and TIFF formats. Three observers evaluated the images. The teeth were then examined under a polarized-light microscope for verification of the presence and depth of carious lesions. Results: The probability of no diagnosis ranged from 1.34% (Insight film) to 52.83% (CanonScan/JPEG). Sensitivity ranged from 0.24 (Canon RebelXT/JPEG) to 0.53 (Insight film), specificity ranged from 0.93 (Nikon Coolpix/JPEG, Canon Powershot/TIFF, Canon RebelXT/JPEG and TIFF) to 0.97 (CanonScan/TIFF and JPEG), and accuracy ranged from 0.82 (Canon RebelXT/JPEG) to 0.91 (CanonScan/JPEG). Conclusion: The caries diagnosis did not change with the file format (JPEG or TIFF) in which the images were saved, for any of the equipment used. Only the CanonScan scanner did not perform adequately in radiograph digitization for caries diagnosis, and it is not recommended for this purpose. Dentomaxillofacial Radiology (2011) 40, 338-343. doi: 10.1259/dmfr/67185962
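The figures quoted in the Results follow the usual definitions against the histological gold standard:

```latex
\mathrm{Se} = \frac{TP}{TP+FN}, \qquad
\mathrm{Sp} = \frac{TN}{TN+FP}, \qquad
\mathrm{Acc} = \frac{TP+TN}{TP+TN+FP+FN}
```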

Relevance: 10.00%

Abstract:

This four-experiment series sought to evaluate the potential of children with neurosensory deafness and cochlear implants to exhibit auditory-visual and visual-visual stimulus equivalence relations within a matching-to-sample format. Twelve children who became deaf prior to acquiring language (prelingual) and four who became deaf afterwards (postlingual) were studied. All children learned auditory-visual conditional discriminations and nearly all showed emergent equivalence relations. Naming tests, conducted with a subset of the children, showed no consistent relationship to the equivalence-test outcomes. This study makes several contributions to the literature on stimulus equivalence. First, it demonstrates that both pre- and postlingually deaf children can acquire auditory-visual equivalence relations after cochlear implantation, thus demonstrating symbolic functioning. Second, it directs attention to a population that may be especially interesting for researchers seeking to analyze the relationship between speaker and listener repertoires. Third, it demonstrates the feasibility of conducting experimental studies of stimulus control processes within the limitations of a hospital, which these children must visit routinely for the maintenance of their cochlear implants.