78 results for Speaker verification
Abstract:
In the design of lattice domes, design engineers need expertise in areas such as configuration processing, nonlinear analysis, and optimization. These are extensive numerical, iterative, and time-consuming processes that are prone to error without an integrated design tool. This article presents the application of a knowledge-based system to solving lattice-dome design problems. An operational prototype knowledge-based system, LADOME, has been developed by employing a combined knowledge representation approach, which uses rules, procedural methods, and an object-oriented blackboard concept. The system's objective is to assist engineers in lattice-dome design by integrating all design tasks into a single computer-aided environment through implementation of the knowledge-based system approach. For system verification, results from design examples are presented.
Abstract:
Teachers who are new to the country often find themselves as 'the stranger' in their own classroom. Languages education is one area where such overseas-educated teachers are common. The study reported here investigated what cultural factors might influence the classroom performance of such teachers. The early classroom experience of beginning Japanese native speaker teachers and trainees was examined to this end.
Abstract:
The phase II glutathione S-transferases (GSTs) GSTT1, GSTM1 and GSTP1 catalyse glutathione-mediated reduction of exogenous and endogenous electrophiles. These GSTs have broad and overlapping substrate specificities, and it has been hypothesized that allelic variants associated with less effective detoxification of potential carcinogens may confer an increased susceptibility to cancer. To assess the role of GST gene variants in ovarian cancer development, we screened 285 epithelial ovarian cancer cases and 299 unaffected controls for the GSTT1 deletion (null) variant, the GSTM1 deletion (null) variant and the GSTP1 codon 104 A→G Ile→Val amino acid substitution variant. The frequencies of the GSTT1, GSTM1 and GSTP1 polymorphic variants did not vary with tumour behaviour (low malignant potential or invasive) or p53 immunohistochemical status. There was a suggestion that ovarian cancers of the endometrioid or clear cell histological subtype had a higher frequency of the GSTT1 and GSTM1 deletion genotypes than other histological subgroups. The GSTT1, GSTM1 and GSTP1 genotype distributions did not differ significantly between unaffected controls and ovarian cancer cases (overall or invasive cancers only). However, the GSTM1 null genotype was associated with increased risk of endometrioid/clear cell invasive cancer [age-adjusted OR (95% CI) = 2.04 (1.01-4.09), P = 0.05], suggesting that deletion of GSTM1 may specifically increase the risk of ovarian cancer of these histological subtypes. This marginally significant finding will require verification by independent studies.
Abstract:
Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, many people need to code models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, comparing simulation data with that of commercial models leads only to the detection, not the isolation, of errors; identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: firstly, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Secondly, an observer is designed to generate residuals, such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates the coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM 1 activated sludge model. In this paper a newly coded model was verified against a known implementation. The method is also applicable to simultaneous verification of any two independent implementations, and hence is useful in commercial model development.
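The back-to-back idea can be sketched in a few lines: drive two implementations with identical inputs and inspect which equation the first nonzero residual excites. Everything below (the toy 2-state linear model and the injected sign error) is an illustrative assumption, not the paper's ASM 1 model or its observer design.

```python
import numpy as np

# Two implementations of the same 2-state linear model
# x[k+1] = A x[k] + B u[k]; one carries an injected coding error.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([1.0, 0.5])

def reference_step(x, u):
    return A @ x + B * u

def faulty_step(x, u):
    A_bad = A.copy()
    A_bad[0, 1] = -A[0, 1]   # injected error: sign flip in equation 0
    return A_bad @ x + B * u

# Back-to-back test: identical inputs to both, residual = output difference.
x_ref = np.zeros(2)
x_new = np.zeros(2)
residuals = []
for k in range(20):
    u = np.cos(0.3 * k)
    x_ref = reference_step(x_ref, u)
    x_new = faulty_step(x_new, u)
    residuals.append(x_new - x_ref)

# The first nonzero residual lies entirely in the first state component,
# localising the error to equation 0 before it propagates through the model.
print(residuals[1])
```

In the paper's full scheme an observer keeps each error class confined to a known subspace even after propagation; this sketch only shows the residual-generation step.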
Abstract:
GLUT4 is a mammalian facilitative glucose transporter that is highly expressed in adipose tissue and striated muscle. In response to insulin, GLUT4 moves from intracellular storage areas to the plasma membrane, thus increasing cellular glucose uptake. While the verification of this 'translocation hypothesis' (Cushman SW, Wardzala LJ. J Biol Chem 1980;255:4758-4762 and Suzuki K, Kono T. Proc Natl Acad Sci 1980;77:2542-2545) has increased our understanding of insulin-regulated glucose transport, a number of fundamental questions remain unanswered. Where is GLUT4 stored within the basal cell? How does GLUT4 move to the cell surface, and what mechanism does insulin employ to accelerate this process? Ultimately we require a convergence of trafficking studies with research in signal transduction. However, despite more than 30 years of intensive research we have still not reached this point. The problem is complex, involving at least two separate signal transduction pathways which feed into what appears to be a very dynamic sorting process. Below we discuss some of these complexities and highlight new data that are bringing us closer to the resolution of these questions.
Abstract:
Read-only-memory-based (ROM-based) quantum computation (QC) is an alternative to oracle-based QC. It has the advantages of being less magical, and being more suited to implementing space-efficient computation (i.e., computation using the minimum number of writable qubits). Here we consider a number of small (one- and two-qubit) quantum algorithms illustrating different aspects of ROM-based QC. They are: (a) a one-qubit algorithm to solve the Deutsch problem; (b) a one-qubit binary multiplication algorithm; (c) a two-qubit controlled binary multiplication algorithm; and (d) a two-qubit ROM-based version of the Deutsch-Jozsa algorithm. For each algorithm we present experimental verification using nuclear magnetic resonance ensemble QC. The average fidelities for the implementation were in the ranges 0.9-0.97 for the one-qubit algorithms, and 0.84-0.94 for the two-qubit algorithms. We conclude with a discussion of future prospects for ROM-based quantum computation. We propose a four-qubit algorithm, using Grover's iterate, for solving a miniature real-world problem relating to the lengths of paths in a network.
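For readers unfamiliar with the Deutsch problem mentioned above, a minimal numpy simulation of the standard oracle-based version (not the ROM-based circuits implemented in the paper) shows how a single oracle query distinguishes constant from balanced f:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I2 = np.eye(2)

def oracle(f):
    # U_f |x>|y> = |x>|y XOR f(x)>, with basis index = 2*x + y
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.zeros(4)
    state[1] = 1.0                      # prepare |0>|1>
    state = np.kron(H, H) @ state       # Hadamards on both qubits
    state = oracle(f) @ state           # a single oracle query
    state = np.kron(H, I2) @ state      # Hadamard on the query qubit
    p1 = state[2] ** 2 + state[3] ** 2  # P(query qubit measures 1)
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))   # constant
print(deutsch(lambda x: x))   # balanced
```

The ROM-based formulation trades this writable oracle qubit for function values stored in read-only memory; the simulation above is only the conventional baseline for comparison.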
Abstract:
The anaerobic ammonium oxidation process is a new process for ammonia removal from wastewater. It is also a new microbial physiology that was previously believed to be impossible. The identification of Candidatus Brocadia anammoxidans and its relatives as the responsible bacteria was only possible with the development of a new experimental approach. That approach is the focus of this paper. The approach is a modernisation of the Winogradsky/Beyerinck strategy of selective enrichment and is based on the introduction of the molecular toolbox and modern bioreactor engineering to microbial ecology. It consists of six steps: (1) postulation of an ecological niche based on thermodynamic considerations and macro-ecological field data; (2) engineering of this niche into a laboratory bioreactor for enrichment culture; (3) black-box physiological characterisation of the enrichment culture as a whole; (4) phylogenetic characterisation of the enriched community using molecular tools; (5) physical separation of the dominant members of the enrichment culture using gradient centrifugation and the identification of the species of interest in accordance with Koch's postulates; (6) verification of the in situ importance of these species in the actual ecosystems. The power of this approach is illustrated with a case study: the identification of the planctomycetes responsible for anaerobic ammonium oxidation. We argue that this was impossible using molecular ecology or conventional 'cultivation-based techniques' alone. We suggest that the approach might also be used for the microbiological study of many interesting microbes such as anaerobic methane oxidisers.
Abstract:
Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
Three experiments were conducted examining group members' responses to criticism from ingroup and outgroup members. In Experiment 1a, Australians read scripts of a person making either negative or positive comments about Australia. The speaker was identified as coming from either Australia (ingroup member) or another country (outgroup member). Responses indicated an intergroup sensitivity effect; that is, while ingroup criticisms were tolerated surprisingly well, outgroup criticisms were met with sensitivity and defensiveness. This pattern was replicated using the identity of 'university student' (Experiment 1b). Experiment 2 demonstrated that the intergroup sensitivity effect is driven by perceptions that ingroup criticisms are more legitimate and more constructive than outgroup criticisms. The results are discussed in terms of their implications for intragroup and intergroup relations. Copyright (C) 2002 John Wiley & Sons, Ltd.
Abstract:
Purpose: The range of variability between individuals of the same chronological age (CA) in somatic and biological maturity is large and especially accentuated around the adolescent growth spurt. Maturity assessment is an important consideration when dealing with adolescents, from both a research perspective and youth sports stratification. A noninvasive, practical method predicting years from peak height velocity (a maturity offset value) by using anthropometric variables is developed in one sample and cross-validated in two different samples. Methods: Gender-specific multiple regression equations were calculated on a sample of 152 Canadian children aged 8-16 yr (79 boys; 73 girls) who were followed through adolescence from 1991 to 1997. The equations included three somatic dimensions (height, sitting height, and leg length), CA, and their interactions. The equations were cross-validated on a combined sample of Canadian (71 boys, 40 girls measured from 1964 through 1973) and Flemish children (50 boys, 48 girls measured from 1985 through 1999). Results: The coefficient of determination (R2) for the boys' model was 0.92 and for the girls' model 0.91; the SEEs were 0.49 and 0.50, respectively. The mean difference between actual and predicted maturity offset for the verification samples was 0.24 (SD 0.65) yr in boys and 0.001 (SD 0.68) yr in girls. Conclusion: Although the cross-validation meets statistical standards of acceptance, caution is warranted with regard to implementation. It is recommended that maturity offset be considered a categorical rather than a continuous assessment. Nevertheless, the equations presented are a reliable, noninvasive, and practical solution for the measurement of biological maturity for matching adolescent athletes.
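The shape of such a prediction equation can be illustrated in a few lines. The coefficients below are invented placeholders chosen only to show the structure (an intercept plus interaction terms on the measured dimensions); they are not the published boys' or girls' equations.

```python
# Sketch of a maturity-offset predictor with the ingredients the abstract
# describes: chronological age, height, sitting height, derived leg length,
# and interaction terms. Coefficients are hypothetical placeholders, NOT
# the published gender-specific equations.
def maturity_offset(age_yr, height_cm, sitting_height_cm, coeffs):
    leg_cm = height_cm - sitting_height_cm
    b0, b1, b2, b3 = coeffs
    return (b0
            + b1 * (leg_cm * sitting_height_cm)      # leg x sitting height
            + b2 * (age_yr * leg_cm)                 # age x leg length
            + b3 * (sitting_height_cm / height_cm))  # sitting-height ratio

DEMO_COEFFS = (-9.0, 0.0003, 0.002, 7.0)  # hypothetical values only

# Negative offset = years before peak height velocity, positive = after.
offset = maturity_offset(13.0, 160.0, 82.0, DEMO_COEFFS)
print(round(offset, 2))
```

The real equations would be fitted per gender on longitudinal data, as in the study; the point here is only how a maturity offset falls out of CA, the three somatic dimensions, and their interactions.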
Abstract:
The notion of implicature was first introduced by Paul Grice (1967, 1989), who defined it essentially as what is communicated less what is said. This definition contributed in part to the proliferation of a large number of different species of implicature by neo-Griceans. Relevance theorists have responded to this by proposing a shift back to the distinction between "explicit" and "implicit" meaning (corresponding to "explicature" and "implicature", respectively). However, they appear to have pared down the concept of implicature too much, ignoring phenomena that may be better treated as implicatures in their overgeneralization of the concept of explicature. These problems have their roots in the fact that explicit and implicit meaning intuitively overlap and thus do not provide a suitable basis for distinguishing implicature from other types of pragmatic phenomena. An alternative conceptualization of implicature, based on the concept of "implying" with which Grice originally associated his notion of implicature, is thus proposed. On this definition, an implicature is something additional inferred by the addressee that is not literally said by the speaker; it is meant in addition to what the speaker literally says, and is consequently defeasible like all other types of pragmatic phenomena.
Abstract:
We present an abstract model of the leader election protocol used in the IEEE 1394 High Performance Serial Bus standard. The model is expressed in the probabilistic Guarded Command Language. By formal reasoning based on this description, we establish the probability of the root contention part of the protocol successfully terminating in terms of the number of attempts to do so. Some simple calculations then allow us to establish an upper bound on the time taken for those attempts.
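The flavour of such a bound can be sketched directly: if each root-contention attempt resolves independently with probability p, the probability of termination within n attempts is 1 - (1 - p)**n. The fair-coin value p = 0.5 used below is an idealising assumption for illustration, not a figure taken from the paper's analysis.

```python
# Probability that root contention terminates within n attempts, assuming
# each attempt independently succeeds with probability p. p = 0.5 is the
# idealised fair-coin value, used here purely for illustration.
def p_terminated_within(n, p=0.5):
    return 1 - (1 - p) ** n

for n in (1, 5, 10):
    print(n, p_terminated_within(n))
# ten attempts already succeed with probability 1 - 2**-10 > 0.999
```

Multiplying the per-attempt timing by the expected number of attempts (1/p under this model) then yields the kind of upper bound on total time the abstract refers to.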
Abstract:
For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the output of two independent implementations, in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.