953 results for noisy speaker verification


Relevance: 10.00%

Abstract:

Since the 1990s several large companies have been publishing nonfinancial performance reports. Focusing initially on the physical environment, these reports evolved to consider social relations, as well as data on the firm's economic performance. A few mining companies pioneered this trend, and in recent years some of them incorporated the three dimensions of sustainable development, publishing so-called sustainability reports. This article reviews 31 reports published between 2001 and 2006 by four major mining companies. A set of 62 assessment items organized in six categories (namely context and commitment, management, environmental, social and economic performance, and accessibility and assurance) was selected to guide the review. The items were derived from the international literature and recommended best practices, including the Global Reporting Initiative G3 framework. A content analysis was performed using the report as the sampling unit, and using phrases, graphics, or tables containing certain information as data collection units. A basic rating scale (0 or 1) was used to note the presence or absence of information, and a final percentage score was obtained for each report. Results show a clear evolution in the reports' comprehensiveness and depth. The categories "accessibility and assurance" and "economic performance" featured the lowest scores and do not present a clear evolution trend in the period, whereas the categories "context and commitment" and "social performance" presented the best results and regular improvement; the category "environmental performance," despite not reaching the highest scores, also featured constant evolution. Description of data measurement techniques and more comprehensive third-party verification are the items most in need of improvement.
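
The scoring scheme described above (binary items grouped into categories, reduced to a percentage per report) can be illustrated with a minimal sketch; the category names and item counts below are placeholders, not the study's actual instrument.

```python
# Illustrative sketch of the 0/1 item scoring: each report is rated on items grouped
# into categories, and the final score is the percentage of items present.
# Category names and item lists below are made up for illustration.
from typing import Dict, List

def report_score(item_scores: Dict[str, List[int]]) -> float:
    """item_scores maps a category name to its list of 0/1 item ratings."""
    all_items = [s for items in item_scores.values() for s in items]
    return 100.0 * sum(all_items) / len(all_items)

def category_scores(item_scores: Dict[str, List[int]]) -> Dict[str, float]:
    return {cat: 100.0 * sum(items) / len(items) for cat, items in item_scores.items()}

example = {
    "context and commitment": [1, 1, 0, 1],
    "environmental performance": [1, 0, 1],
    "accessibility and assurance": [0, 0, 1],
}
print(report_score(example))
print(category_scores(example))
```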

Relevance: 10.00%

Abstract:

The computational design of a composite whose constituent properties change gradually within a unit cell can be successfully achieved by means of a material design method that combines topology optimization with homogenization. This is an iterative numerical method, which changes the composite material unit cell until the desired properties (or performance) are obtained. The method has been applied to several types of materials in recent years. In this work, the objective is to extend the material design method to obtain functionally graded material architectures, i.e. materials that are graded at the local level (e.g. the microstructural level). Consistent with this goal, a continuum distribution of the design variable inside the finite element domain is considered to represent a fully continuous material variation during the design process. Thus, the topology optimization naturally leads to a smoothly graded material system. To illustrate the theoretical and numerical approaches, numerical examples are provided. The homogenization method is verified by considering one-dimensional material gradation profiles for which analytical solutions for the effective elastic properties are available. The verification of the homogenization method is extended to two dimensions considering a trigonometric material gradation and a material variation with discontinuous derivatives. These are also used as benchmark examples to verify the optimization method for functionally graded material cell design. Finally, the influence of material gradation on extreme materials is investigated, including materials with near-zero shear modulus and materials with negative Poisson's ratio.
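
The one-dimensional verification mentioned above can be sketched as follows: for uniaxial loading along the gradation direction, the effective modulus has the closed-form harmonic-mean (series) solution, which a numerical homogenization can be checked against. The exponential gradation profile below is only an illustrative choice, not taken from the paper.

```python
# Minimal verification sketch: compare a numerically homogenized 1-D effective
# modulus against the analytical result E_eff = L / \int_0^L dx / E(x),
# valid for uniaxial loading along the gradation direction.
import numpy as np
from scipy.integrate import quad

E0, E1, L = 1.0, 5.0, 1.0
E = lambda x: E0 * (E1 / E0) ** (x / L)          # smooth material gradation (assumed profile)

# Analytical effective modulus (series/harmonic-mean result, exact in 1-D).
E_eff_exact = L / quad(lambda x: 1.0 / E(x), 0.0, L)[0]

# "Numerical homogenization": a fine piecewise-constant discretization acting in series.
x = np.linspace(0.0, L, 2001)
Ec = E(0.5 * (x[:-1] + x[1:]))                   # element-wise modulus at midpoints
E_eff_num = L / np.sum(np.diff(x) / Ec)          # springs in series

print(E_eff_exact, E_eff_num)                    # the two values should agree closely
```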

Relevance: 10.00%

Abstract:

Petri net (PN) modeling is one of the most widely used formal methods in the automation applications field, together with programmable logic controllers (PLCs). The creation of a PN modeling methodology compatible with the IEC 61131 standard is therefore a necessity for automation specialists. Different works dealing with this subject have been carried out; they are presented in the first part of this paper [Frey (2000a, 2000b); Peng and Zhou (IEEE Trans Syst Man Cybern, Part C Appl Rev 34(4):523-531, 2004); Uzam and Jones (Int J Adv Manuf Technol 14(10):716-728, 1998)], but they do not present a methodology that is completely compatible with this standard. At the same time, they do not maintain the simplicity required for such applications, nor the use of all-graphical and all-mathematical ordinary Petri net (OPN) tools to facilitate model verification and validation. The proposal presented here fulfills these requirements. Educational applications at the USP and UEA (Brazil) and the UO (Cuba), as well as industrial applications in Brazil and Cuba, have already been carried out with good results.
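
As a rough sketch of the kind of model involved (not the paper's methodology), an ordinary Petri net reduces to places with token counts and transitions with enabling and firing rules; a PLC translation would express each transition's enabling condition and marking update as Boolean logic in an IEC 61131-3 language. All names below are illustrative.

```python
# Minimal ordinary Petri net sketch: places, transitions, unit-weight arcs.
class PetriNet:
    def __init__(self, marking, transitions):
        # marking: dict place -> token count
        # transitions: dict name -> (list of input places, list of output places)
        self.marking = dict(marking)
        self.transitions = transitions

    def enabled(self, t):
        ins, _ = self.transitions[t]
        return all(self.marking[p] >= 1 for p in ins)

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t} is not enabled")
        ins, outs = self.transitions[t]
        for p in ins:
            self.marking[p] -= 1
        for p in outs:
            self.marking[p] += 1

# A start/stop cycle: p_idle --t_start--> p_running --t_stop--> p_idle
net = PetriNet({"p_idle": 1, "p_running": 0},
               {"t_start": (["p_idle"], ["p_running"]),
                "t_stop": (["p_running"], ["p_idle"])})
net.fire("t_start")
print(net.marking)   # {'p_idle': 0, 'p_running': 1}
```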

Relevance: 10.00%

Abstract:

This study examines the applicability of a micromechanics approach based upon the computational cell methodology incorporating the Gurson-Tvergaard (GT) model and the CTOA criterion to describe ductile crack extension of longitudinal crack-like defects in high pressure pipeline steels. A central focus is to gain additional insight into the effectiveness and limitations of both approaches to describe the crack growth response and to predict the burst pressure of the tested cracked pipes. A verification study conducted on burst testing of large-diameter, precracked pipe specimens with varying crack depth to thickness ratio (a/t) shows the potential predictive capability of the cell approach, even though both the GT model and the CTOA criterion appear to depend on defect geometry. Overall, the results presented here lend additional support for further developments in the cell methodology as a valid engineering tool for integrity assessments of pipelines with axial defects. (C) 2011 Elsevier Ltd. All rights reserved.
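
For reference, the Gurson-Tvergaard yield surface used in computational cell models is commonly written in the standard form below (a general statement of the model, not reproduced from this paper), where σ_e is the macroscopic Mises effective stress, σ_m the mean stress, σ̄ the matrix flow stress, f the void volume fraction, and q1, q2, q3 the Tvergaard parameters (q3 = q1² is a common choice):

```latex
\Phi = \left(\frac{\sigma_e}{\bar{\sigma}}\right)^{2}
     + 2\, q_1 f \cosh\!\left(\frac{3\, q_2\, \sigma_m}{2\, \bar{\sigma}}\right)
     - \left(1 + q_3 f^{2}\right) = 0
```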

Relevance: 10.00%

Abstract:

Light touch of a fingertip on an external stable surface greatly improves the postural stability of standing subjects. The hypothesis of the present work was that a vibrating surface could increase the effectiveness of fingertip signaling to the central nervous system (e.g., by a stochastic resonance mechanism) and hence improve postural stability beyond that achieved by light touch. Subjects stood quietly over a force plate while touching with their right index fingertip a surface that could be either quiescent or randomly vibrated at two low-level noise intensities. The vibratory noise of the contact surface caused a significant decrease in postural sway, as assessed by center of pressure measures in both time and frequency domains. Complementary experiments were designed to test whether postural control improvements were associated with a stochastic resonance mechanism or whether attentional mechanisms could be contributing. A full curve relating body sway parameters and different levels of vibratory noise resulted in a U-like function, suggesting that the improvement in sway relied on a stochastic resonance mechanism. Additionally, no decrease in postural sway was observed when the vibrating contact surface was attached to the subject's body, suggesting that no attentional mechanisms were involved. These results indicate that sensory cues obtained from the fingertip need not necessarily be associated with static contact surfaces to cause improvement in postural stability. A low-level noisy vibration applied to the contact surface could lead to a better performance of the postural control system.
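
As a rough illustration of the time-domain center-of-pressure (CoP) measures referred to above, the sketch below computes RMS sway amplitude and mean sway velocity from CoP traces; the sampling rate and the toy signals are placeholders, not the study's data.

```python
# Illustrative time-domain CoP sway descriptors (RMS amplitude, mean sway velocity).
import numpy as np

def sway_measures(cop_ap, cop_ml, fs=100.0):
    """cop_ap / cop_ml: anteroposterior and mediolateral CoP series (cm); fs in Hz."""
    ap = cop_ap - np.mean(cop_ap)
    ml = cop_ml - np.mean(cop_ml)
    rms_ap = np.sqrt(np.mean(ap ** 2))
    rms_ml = np.sqrt(np.mean(ml ** 2))
    # Mean sway velocity: total path length divided by trial duration.
    path = np.sum(np.hypot(np.diff(ap), np.diff(ml)))
    mean_velocity = path * fs / len(ap)
    return {"rms_ap": rms_ap, "rms_ml": rms_ml, "mean_velocity": mean_velocity}

rng = np.random.default_rng(0)
t = np.arange(0, 30, 0.01)                       # 30 s trial at 100 Hz (assumed)
ap = np.cumsum(rng.normal(0, 0.01, t.size))      # toy random-walk sway signals
ml = np.cumsum(rng.normal(0, 0.01, t.size))
print(sway_measures(ap, ml))
```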

Relevance: 10.00%

Abstract:

This paper presents a novel algorithm to add and verify the integrity and authenticity of n-frame DICOM medical images using cryptographic mechanisms. The aim of this work is the enhancement of DICOM security measures, especially for multiframe images. Current approaches have limitations that should be properly addressed for improved security. The algorithm proposed in this work uses data encryption to provide integrity and authenticity, along with a digital signature. Relevant header data and the digital signature are used as inputs to cipher the image, so the original data can be retrieved only if the images and the inputs are correct. The encryption process itself is a cascading scheme, in which a frame is ciphered with data related to the previous frames, also generating additional data on image integrity and authenticity. Decryption is similar to encryption and also features the standard security verification of the image. The implementation was done in Java, and a performance evaluation was carried out comparing the speed of the algorithm with other existing approaches. The evaluation showed good performance of the algorithm, which is an encouraging result for its use in a real environment.
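
The chaining idea can be sketched in a toy form: each frame's keystream is derived from the header data, the signature, and a hash of the previous ciphertext, so a frame can only be recovered with the correct inputs and predecessors. This is an illustration of cascading only, not the paper's algorithm and not production-grade cryptography (a vetted AEAD cipher should be used in practice).

```python
# Toy cascading per-frame cipher: NOT the paper's algorithm, NOT secure crypto.
import hashlib

def _keystream(seed: bytes, length: int) -> bytes:
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def cascade_encrypt(frames, header: bytes, signature: bytes):
    ciphered, prev_digest = [], b""
    for frame in frames:
        seed = header + signature + prev_digest
        ct = bytes(a ^ b for a, b in zip(frame, _keystream(seed, len(frame))))
        ciphered.append(ct)
        prev_digest = hashlib.sha256(ct).digest()   # chains the next frame to this one
    return ciphered

def cascade_decrypt(ciphered, header: bytes, signature: bytes):
    frames, prev_digest = [], b""
    for ct in ciphered:
        seed = header + signature + prev_digest
        frames.append(bytes(a ^ b for a, b in zip(ct, _keystream(seed, len(ct)))))
        prev_digest = hashlib.sha256(ct).digest()
    return frames

frames = [b"frame-0-pixels", b"frame-1-pixels"]
ct = cascade_encrypt(frames, header=b"patient-id|study-uid", signature=b"sig-bytes")
assert cascade_decrypt(ct, b"patient-id|study-uid", b"sig-bytes") == frames
```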

Relevance: 10.00%

Abstract:

Sound source localization (SSL) is an essential task in many applications involving speech capture and enhancement. As such, speaker localization with microphone arrays has received significant research attention. Nevertheless, existing SSL algorithms for small arrays still have two significant limitations: lack of range resolution, and accuracy degradation with increasing reverberation. The latter is natural and expected, given that strong reflections can have amplitudes similar to that of the direct signal, but different directions of arrival. Therefore, correctly modeling the room and compensating for the reflections should reduce the degradation due to reverberation. In this paper, we show a stronger result. If modeled correctly, early reflections can be used to provide more information about the source location than would have been available in an anechoic scenario. The modeling not only compensates for the reverberation, but also significantly increases resolution for range and elevation. Thus, we show that under certain conditions and limitations, reverberation can be used to improve SSL performance. Prior attempts to compensate for reverberation tried to model the room impulse response (RIR). However, RIRs change quickly with speaker position, and are nearly impossible to track accurately. Instead, we build a 3-D model of the room, which we use to predict early reflections, which are then incorporated into the SSL estimation. Simulation results with real and synthetic data show that even a simplistic room model is sufficient to produce significant improvements in range and elevation estimation, tasks which would be very difficult when relying only on direct path signal components.
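
The geometric prediction that a simple room model provides can be sketched with the image-source idea: in an axis-aligned ("shoebox") room, each wall contributes one first-order image source, whose extra path length gives the delay of an early reflection at a microphone. The room dimensions, positions, and speed of sound below are placeholder values, not the paper's setup or code.

```python
# First-order image sources in a shoebox room and the resulting reflection delays.
import numpy as np

def first_order_images(src, room):
    """src: (x, y, z) source position; room: (Lx, Ly, Lz). Returns the 6 image sources."""
    src, room = np.asarray(src, float), np.asarray(room, float)
    images = []
    for axis in range(3):
        for wall in (0.0, room[axis]):
            img = src.copy()
            img[axis] = 2.0 * wall - src[axis]    # mirror the source across the wall plane
            images.append(img)
    return np.array(images)

def reflection_delays(src, mic, room, c=343.0):
    mic = np.asarray(mic, float)
    direct = np.linalg.norm(np.asarray(src, float) - mic) / c
    reflected = np.linalg.norm(first_order_images(src, room) - mic, axis=1) / c
    return direct, reflected                      # delays in seconds

direct, early = reflection_delays(src=(1.0, 2.0, 1.5), mic=(3.0, 2.5, 1.2),
                                  room=(5.0, 4.0, 3.0))
print(direct, early)
```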

Relevance: 10.00%

Abstract:

When building genetic maps, it is necessary to choose from several marker ordering algorithms and criteria, and the choice is not always simple. In this study, we evaluate the efficiency of the algorithms try (TRY), seriation (SER), rapid chain delineation (RCD), recombination counting and ordering (RECORD) and unidirectional growth (UG), as well as the criteria PARF (product of adjacent recombination fractions), SARF (sum of adjacent recombination fractions), SALOD (sum of adjacent LOD scores) and LHMC (likelihood through hidden Markov chains), used with the RIPPLE algorithm for error verification, in the construction of genetic linkage maps. A linkage map of a hypothetical diploid and monoecious plant species was simulated, containing one linkage group and 21 markers with a fixed distance of 3 cM between them. In all, 700 F(2) populations were randomly simulated with 100 and 400 individuals and with different combinations of dominant and co-dominant markers, as well as 10 and 20% of missing data. The simulations showed that, in the presence of co-dominant markers only, any combination of algorithm and criteria may be used, even for a reduced population size. In the case of a smaller proportion of dominant markers, any of the algorithms and criteria investigated (except SALOD) may be used. In the presence of high proportions of dominant markers and smaller samples (around 100), the probability of linkage in repulsion between them increases and, in this case, use of the algorithms TRY and SER associated with RIPPLE and the criterion LHMC would provide better results. Heredity (2009) 103, 494-502; doi:10.1038/hdy.2009.96; published online 29 July 2009
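
Two of the ordering criteria named above are simple to state: given a matrix of pairwise recombination fractions, SARF sums and PARF multiplies the fractions between adjacent markers in a candidate order, and orders with smaller adjacent recombination are preferred. The sketch below uses a made-up matrix purely for illustration.

```python
# Illustrative SARF and PARF computation for a candidate marker order.
import math
import numpy as np

def sarf(order, rf):
    """Sum of adjacent recombination fractions for the given marker order."""
    return sum(rf[a, b] for a, b in zip(order[:-1], order[1:]))

def parf(order, rf):
    """Product of adjacent recombination fractions for the given marker order."""
    return math.prod(rf[a, b] for a, b in zip(order[:-1], order[1:]))

rf = np.array([[0.00, 0.03, 0.10, 0.18],     # toy pairwise recombination fractions
               [0.03, 0.00, 0.04, 0.12],
               [0.10, 0.04, 0.00, 0.05],
               [0.18, 0.12, 0.05, 0.00]])

good, scrambled = [0, 1, 2, 3], [0, 2, 1, 3]
print(sarf(good, rf), sarf(scrambled, rf))   # the true order scores lower
print(parf(good, rf), parf(scrambled, rf))
```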

Relevance: 10.00%

Abstract:

In this work, chemometric methods are reported as potential tools for monitoring the authenticity of Brazilian ultra-high temperature (UHT) milk processed in industrial plants located in different regions of the country. A total of 100 samples were submitted to qualitative analysis for adulterants such as starch, chlorine, formaldehyde, hydrogen peroxide and urine. Except for starch, all the samples showed the presence of at least one adulterant. The use of chemometric methodologies such as Principal Component Analysis (PCA) and Hierarchical Cluster Analysis (HCA) made it possible to verify the occurrence of certain adulterations in specific regions. The proposed multivariate approaches may allow the sanitary agency authorities to optimise material, human and financial resources, as they associate the occurrence of adulterations with the geographical location of the industrial plants. (c) 2010 Elsevier Ltd. All rights reserved.
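
The PCA/HCA workflow can be sketched on a binary sample-by-adulterant matrix with region labels; the tiny data set below is invented for illustration, and the sketch assumes scikit-learn and SciPy are available.

```python
# Sketch of the chemometric workflow: PCA and hierarchical clustering (HCA, Ward
# linkage) on a binary presence/absence matrix of adulterants, labelled by region.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# rows = samples, columns = adulterant present (1) / absent (0)
X = np.array([[1, 0, 1, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 1],
              [1, 1, 1, 0]])
regions = ["South", "South", "Southeast", "Southeast", "South"]

scores = PCA(n_components=2).fit_transform(X)    # coordinates for a PCA score plot
Z = linkage(X, method="ward")                    # HCA dendrogram structure
clusters = fcluster(Z, t=2, criterion="maxclust")

for region, pc, c in zip(regions, scores, clusters):
    print(region, np.round(pc, 2), "cluster", c)
```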

Relevance: 10.00%

Abstract:

The paper disputes two influential claims in the Romance Linguistics literature. The first is that the synthetic future tenses in spoken Western Romance are now rivalled, if not supplanted, as temporal functors by the more recently developed GO futures. The second is that these synthetic futures now have modal rather than temporal meanings in spoken Romance. These claims are seen as reflecting a universal cycle of diachronic change, in which verb forms originally expressing modal (or aspectual) values take on future temporal reference, becoming tenses. The new modal meanings supplant the temporal, which are then taken up by new forms. Challenges to this theory for French are raised on the basis of empirical evidence of two sorts. Positively, future tenses in spoken Romance continue to be used with temporal meaning. Negatively, evidence of modal meaning for these forms is lacking. The evidence comes from corpora of spoken French, native speaker judgements and verb data from a daily broadsheet. Cumulatively, it points to the reverse of the claims noted above: the synthetic future in spoken French has temporal but little modal meaning.

Relevance: 10.00%

Abstract:

The one-way quantum computing model introduced by Raussendorf and Briegel [Phys. Rev. Lett. 86, 5188 (2001)] shows that it is possible to quantum compute using only a fixed entangled resource known as a cluster state, and adaptive single-qubit measurements. This model is the basis for several practical proposals for quantum computation, including a promising proposal for optical quantum computation based on cluster states [M. A. Nielsen, Phys. Rev. Lett. (to be published), quant-ph/0402005]. A significant open question is whether such proposals are scalable in the presence of physically realistic noise. In this paper we prove two threshold theorems which show that scalable fault-tolerant quantum computation may be achieved in implementations based on cluster states, provided the noise in the implementations is below some constant threshold value. Our first threshold theorem applies to a class of implementations in which entangling gates are applied deterministically, but with a small amount of noise. We expect this threshold to be applicable in a wide variety of physical systems. Our second threshold theorem is specifically adapted to proposals such as the optical cluster-state proposal, in which nondeterministic entangling gates are used. A critical technical component of our proofs is two powerful theorems which relate the properties of noisy unitary operations restricted to act on a subspace of state space to extensions of those operations acting on the entire state space. We expect these theorems to have a variety of applications in other areas of quantum-information science.
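
As a small illustration of the entangled resource in the one-way model (not of the threshold results themselves), a linear cluster state is prepared by putting every qubit in |+> and applying controlled-Z between nearest neighbours; the adaptive single-qubit measurements that drive the computation are not modelled in this sketch.

```python
# Minimal numpy sketch of a linear cluster state: |+>^{n} followed by CZ on neighbours.
import numpy as np

def linear_cluster_state(n):
    plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
    state = plus
    for _ in range(n - 1):
        state = np.kron(state, plus)             # build |+>^{n}
    # CZ is diagonal in the computational basis: flip the sign of amplitudes
    # whose bit string has 1s on both qubits of a neighbouring pair.
    for q in range(n - 1):
        for idx in range(state.size):
            bits = format(idx, f"0{n}b")         # qubit 0 is the most significant bit
            if bits[q] == "1" and bits[q + 1] == "1":
                state[idx] *= -1.0
    return state

psi = linear_cluster_state(3)
print(np.round(psi, 3), np.isclose(np.linalg.norm(psi), 1.0))
```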

Relevance: 10.00%

Abstract:

Contrary to the common pattern of spatial terms being metaphorically extended to location in time, the Australian language Jingulu shows an unusual extension of temporal markers to indicate location in space. Light verbs, which typically encode tense, aspect, mood and associated motion, are occasionally found on nouns to indicate the relative location of the referent with respect to the speaker. It is hypothesised that this pattern resulted from the reduction of verbal clauses used as relative modifiers to the nouns in question.

Relevance: 10.00%

Abstract:

Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information from large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured, and can consist of text, categorical or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied in decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for data mining applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. A number of classification algorithms are in practical use. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule-based approaches such as C4.5 (Quinlan, 1996); probability methods such as the Bayesian classifier (Lewis, 1998); on-line methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); example-based methods such as k-nearest neighbors (Duda & Hart, 1973); and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include Associative Classification (Liu et al., 1998) and Ensemble Classification (Tumer, 1996).
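
One of the example-based classifiers mentioned above, k-nearest neighbours, can be demonstrated in a few lines with scikit-learn on a small synthetic data set; the data and parameters are arbitrary and only illustrate the classification task.

```python
# Quick sketch of an example-based classifier (k-nearest neighbours) on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=300, n_features=10, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```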

Relevance: 10.00%

Abstract:

Using the framework of communication accommodation theory, the authors examined convergence and maintenance in the evaluations of Chinese and Australian students. In Study 1, Australian students judged interactions between an Anglo-Australian and another interactant who either maintained his speech style or converged. Results indicated that participants were aware of convergence, but that speaker ethnicity (Anglo-Australian, Chinese Australian or Chinese national) was a stronger influence on evaluations and future intentions to interact with the speaker. In Study 2, Australian students judged Chinese speakers who maintained their communication style or converged on interpersonal speech markers, intergroup markers, or both types of markers. Results indicated that the more participants defined themselves in intergroup terms, the more positively they judged intergroup convergence relative to interpersonal convergence and maintenance. This points to the importance of distinguishing between convergence on interpersonal and intergroup speech markers, and underlines the role of individual differences in the evaluation of convergence.

Relevance: 10.00%

Abstract:

This study forms part of a larger anthropological investigation of the Ngaraangbal Aboriginal Tribe's ancestral burial ground at Broadbeach, Australia. It examines the dentition, records the associated pathology in a noninvasive manner, and relates this to the likely subsistence diet of the tribe. The Broadbeach osteological collection was returned for reburial in 1985; however, radiographic and photographic records of 36 adult males were available. These form the basis of our study. The pathology noted in the study sample was compared with a representative sample (n = 38) of pre-European Aboriginal remains from throughout Queensland for verification purposes only. Rates of dental pathology and injury were calculated from the radiographic and photographic records. There was a significant rate of tooth-wear related intra-bony pathology (4.0%), moderate to severe alveolar bone loss, and heavy dental attrition, of which the mandibular posterior teeth were the most severely affected. Caries prevalence (0.8%) was low for hunter-gatherer populations. A large number of molar pulp chambers had a distinctive cruciate morphology resulting from the formation of secondary dentine and pulp stones. Injuries and abnormalities included upper central incisor avulsion (58.3%) and taurodontism. These results support the proposal that the Ngaraangbal tribe was a hunter-gatherer population subsisting on an abrasive diet that included marine foods. (C) 1998 Wiley-Liss, Inc.