995 results for Mixture Experiments
Abstract:
This paper examines the issue of face, speaker and bi-modal authentication in mobile environments when there is significant condition mismatch. We introduce this mismatch by enrolling client models on high quality biometric samples obtained on a laptop computer and authenticating them on lower quality biometric samples acquired with a mobile phone. To perform these experiments we develop three novel authentication protocols for the large publicly available MOBIO database. We evaluate state-of-the-art face, speaker and bi-modal authentication techniques and show that inter-session variability modelling using Gaussian mixture models provides a consistently robust system for face, speaker and bi-modal authentication. It is also shown that multi-algorithm fusion provides a consistent performance improvement for face, speaker and bi-modal authentication. Using this bi-modal multi-algorithm system we derive a state-of-the-art authentication system that obtains a half total error rate of 6.3% and 1.9% for Female and Male trials, respectively.
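A note on the reported metric: the half total error rate (HTER) is the mean of the false acceptance and false rejection rates at a fixed threshold, and a bi-modal system combines face and speaker scores. The sketch below is a minimal, hypothetical illustration of both ideas; the scores, weights and function names are placeholders, not the authors' implementation.

```python
import numpy as np

def hter(false_accept_rate, false_reject_rate):
    """Half total error rate: mean of the two error rates at a chosen threshold."""
    return 0.5 * (false_accept_rate + false_reject_rate)

def fuse_scores(face_scores, speaker_scores, w_face=0.5):
    """Sum-rule fusion of two normalised score streams (weights are illustrative)."""
    face = np.asarray(face_scores, dtype=float)
    speaker = np.asarray(speaker_scores, dtype=float)
    return w_face * face + (1.0 - w_face) * speaker

# Hypothetical scores for a handful of trials
fused = fuse_scores([0.8, 0.2, 0.6], [0.7, 0.1, 0.9])
print(fused, hter(0.05, 0.08))  # fused scores and HTER = 0.065
```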
Abstract:
In this paper we propose a novel approach to multi-action recognition that performs joint segmentation and classification. This approach models each action with a Gaussian mixture built from robust low-dimensional action features. Segmentation is achieved by performing classification on overlapping temporal windows, which are then merged to produce the final result. This approach is considerably less complicated than previous methods which use dynamic programming or computationally expensive hidden Markov models (HMMs). Initial experiments on a stitched version of the KTH dataset show that the proposed approach achieves an accuracy of 78.3%, outperforming a recent HMM-based approach which obtained 71.2%.
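As a rough sketch of the windowed classification idea (the window length, step, features and per-action mixtures below are assumptions, not the authors' settings), each overlapping temporal window of frame features is scored against one Gaussian mixture per action and labelled with the highest-likelihood action; adjacent windows with the same label can then be merged.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_action_models(features_per_action, n_components=4):
    """Fit one GMM per action from stacked per-frame feature vectors."""
    models = {}
    for action, feats in features_per_action.items():
        models[action] = GaussianMixture(n_components=n_components).fit(feats)
    return models

def classify_windows(frames, models, win=30, step=10):
    """Score overlapping temporal windows and label each with the best action."""
    labels = []
    for start in range(0, len(frames) - win + 1, step):
        window = frames[start:start + win]
        scores = {a: m.score(window) for a, m in models.items()}  # mean log-likelihood
        labels.append((start, max(scores, key=scores.get)))
    return labels  # adjacent windows with the same label can then be merged
```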
Abstract:
This paper presents the results of an experimental program for evaluating sensors and sensing technologies in underground mining applications. The objective of the experiments is to infer what combinations of sensors will provide reliable navigation systems for autonomous vehicles operating in a harsh underground environment. Results from a wide range of sensors are presented and analysed. Conclusions as to the best combination of sensors are drawn.
Abstract:
This paper describes the experiences gained performing multiple experiments while developing a large autonomous industrial vehicle. Hot Metal Carriers (HMCs) are large forklift-type vehicles used in the light metals industry to move molten or hot metal around a smelter. Autonomous vehicles of this type must be dependable as they are large and potentially hazardous to infrastructure and people. This paper discusses four aspects of dependability (safety, reliability, availability and security) and how they have been addressed on our experimental autonomous HMC.
Abstract:
A software-based environment was developed to provide practical training in medical radiation principles and safety. The Virtual Radiation Laboratory application allowed students to conduct virtual experiments using simulated diagnostic and radiotherapy X-ray generators. The experiments were designed to teach students about the inverse square law, half value layer and radiation protection measures and utilised genuine clinical and experimental data. Evaluation of the application was conducted in order to ascertain the impact of the software on students' understanding, satisfaction and collaborative learning skills and also to determine potential further improvements to the software and guidelines for its continued use. Feedback was gathered via an anonymous online survey consisting of a mixture of Likert-style questions and short-answer open questions. Student feedback was highly positive, with 80 % of students reporting increased understanding of radiation protection principles. Furthermore, 72 % enjoyed using the software and 87 % of students felt that the project facilitated collaboration within small groups. The main themes arising in the qualitative feedback comments related to efficiency and effectiveness of teaching, safety of environment, collaboration and realism. Staff and students both report gains in efficiency and effectiveness associated with the virtual experiments. In addition, students particularly value the visualisation of "invisible" physical principles and increased opportunity for experimentation and collaborative problem-based learning. Similar ventures will benefit from adopting an approach that allows for individual experimentation while visualizing challenging concepts.
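For reference, the two physical principles targeted by the virtual experiments can be stated in their standard textbook forms; these equations are general background, not taken from the application itself.

```latex
% Inverse square law: intensity falls with the square of the distance from the source
I(d) = I_0 \left(\frac{d_0}{d}\right)^{2}

% Half value layer: each HVL of attenuator halves the transmitted intensity
I(x) = I_0 \left(\tfrac{1}{2}\right)^{x/\mathrm{HVL}} = I_0\, e^{-\mu x},
\qquad \mathrm{HVL} = \frac{\ln 2}{\mu}
```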
Abstract:
While numerous full scale experimental programs have been conducted around the world over the past 50 years to investigate the behaviour of steel portal frame buildings, none have comprehensively investigated the behaviour of such buildings under wind uplift. Wind uplift loads often govern designs in the Australian environment and this became the subject of a recent research project at Queensland University of Technology (QUT). This paper describes the full scale experiments on a steel portal frame building subject to wind uplift, racking and gravity loads. The portal rafter and column members utilised hollow flange beam (HFB) sections [5-8], though the paper's findings on the theoretical and experimental building responses relate to conventional types of steel portal frame buildings.
Abstract:
Field emission (FE) electron gun sources provide new capabilities for high lateral resolution EPMA. The determination of analytical lateral resolution is not as straightforward as that for electron microscopy imaging. Results from two sets of experiments to determine the actual lateral resolution for accurate EPMA are presented for Kα X-ray lines of Si and Al and Lα of Fe at 5 and 7 keV in a silicate glass. These results are compared to theoretical predictions and Monte Carlo simulations of analytical lateral resolution. The experiments suggest little is gained in lateral resolution by dropping from 7 to 5 keV in EPMA of this silicate glass.
Abstract:
Molecular phylogenetic studies of homologous sequences of nucleotides often assume that the underlying evolutionary process was globally stationary, reversible, and homogeneous (SRH), and that a model of evolution with one or more site-specific and time-reversible rate matrices (e.g., the GTR rate matrix) is enough to accurately model the evolution of data over the whole tree. However, an increasing body of data suggests that evolution under these conditions is an exception, rather than the norm. To address this issue, several non-SRH models of molecular evolution have been proposed, but they either ignore heterogeneity in the substitution process across sites (HAS) or assume it can be modeled accurately using the Γ distribution. As an alternative to these models of evolution, we introduce a family of mixture models that approximate HAS without the assumption of an underlying predefined statistical distribution. This family of mixture models is combined with non-SRH models of evolution that account for heterogeneity in the substitution process across lineages (HAL). We also present two algorithms for searching model space and identifying an optimal model of evolution that is less likely to over- or underparameterize the data. The performance of the two new algorithms was evaluated using alignments of nucleotides with 10 000 sites simulated under complex non-SRH conditions on a 25-tipped tree. The algorithms were found to be very successful, identifying the correct HAL model with a 75% success rate (the average success rate for assigning rate matrices to the tree's 48 edges was 99.25%) and, for the correct HAL model, identifying the correct HAS model with a 98% success rate. Finally, parameter estimates obtained under the correct HAL-HAS model were found to be accurate and precise. The merits of our new algorithms were illustrated with an analysis of 42 337 second codon sites extracted from a concatenation of 106 alignments of orthologous genes encoded by the nuclear genomes of Saccharomyces cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, S. castellii, S. kluyveri, S. bayanus, and Candida albicans. Our results show that second codon sites in the ancestral genome of these species contained 49.1% invariable sites, 39.6% variable sites belonging to one rate category (V1), and 11.3% variable sites belonging to a second rate category (V2). The ancestral nucleotide content was found to differ markedly across these three sets of sites, and the evolutionary processes operating at the variable sites were found to be non-SRH and best modeled by a combination of eight edge-specific rate matrices (four for V1 and four for V2). The number of substitutions per site at the variable sites also differed markedly, with sites belonging to V1 evolving more slowly than those belonging to V2 along the lineages separating the seven species of Saccharomyces. Finally, sites belonging to V1 appeared to have ceased evolving along the lineages separating S. cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, and S. bayanus, implying that they might have become so selectively constrained that they could be considered invariable sites in these species.
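As a generic illustration of the mixture idea (a standard form, not the authors' exact parameterisation), the likelihood of site i under a K-class mixture is a weighted sum over rate classes, each with its own weight and assigned rate matrix (or set of edge-specific matrices):

```latex
L_i \;=\; \sum_{k=1}^{K} w_k \, P\!\left(x_i \mid Q_k, \tau\right),
\qquad \sum_{k=1}^{K} w_k = 1
```

Here τ denotes the tree and Q_k the rate matrix (or matrices) associated with class k; dropping a predefined distribution over classes means the weights w_k are estimated as free parameters.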
Abstract:
Take-it-or-leave-it offers are probably as old as mankind. Our objective here is, first, to provide a recollection, probably subjectively colored, of the initial ultimatum game experiment, its motivation and the immediate responses. Second, we discuss extensions of the standard ultimatum bargaining game in a unified framework, and, third, we offer a survey of the experimental ultimatum bargaining literature containing papers published since the turn of the century. The paper argues that the ultimatum game is a versatile tool for research in bargaining and on social preferences. Finally, we provide examples of open research questions and directions for future studies.
Abstract:
A novel electrochemical biosensor, DNA/hemin/nafion–graphene/GCE, was constructed for the analysis of the PAH benzo(a)pyrene (BaP), which can produce DNA damage induced by a BaP enzyme-catalytic product. This biosensor was assembled layer-by-layer, and was characterized with the use of cyclic voltammetry, electrochemical impedance spectroscopy (EIS) and atomic force microscopy. Ultimately, it was demonstrated that the hemin/nafion–graphene/GCE was a viable platform for the immobilization of DNA. This DNA biosensor was treated separately with benzo(a)pyrene, with hydrogen peroxide (H2O2) and with their mixture, and differential pulse voltammetry (DPV) analysis showed that an oxidation peak was apparent after the electrode was immersed in H2O2. Such experiments indicated that in the presence of H2O2, hemin could mimic cytochrome P450 to metabolize benzo(a)pyrene, and a voltammogram of its metabolite was recorded. The DNA damage induced by this metabolite was also detected by electrochemical impedance and ultraviolet spectroscopy. Finally, a novel, indirect DPV analytical method for BaP in aqueous solution was developed based on the linear metabolite versus BaP concentration plot; this method provided a new, indirect, quantitative estimate of DNA damage.
Abstract:
In the field of face recognition, sparse representation (SR) has received considerable attention during the past few years, with a focus on holistic descriptors in closed-set identification applications. The underlying assumption in such SR-based methods is that each class in the gallery has sufficient samples and the query lies on the subspace spanned by the gallery of the same class. Unfortunately, such an assumption is easily violated in the face verification scenario, where the task is to determine if two faces (where one or both have not been seen before) belong to the same person. In this study, the authors propose an alternative approach to SR-based face verification, where SR encoding is performed on local image patches rather than the entire face. The obtained sparse signals are pooled via averaging to form multiple region descriptors, which then form an overall face descriptor. Owing to the deliberate loss of spatial relations within each region (caused by averaging), the resulting descriptor is robust to misalignment and various image deformations. Within the proposed framework, they evaluate several SR encoding techniques: l1-minimisation, Sparse Autoencoder Neural Network (SANN) and an implicit probabilistic technique based on Gaussian mixture models. Thorough experiments on AR, FERET, exYaleB, BANCA and ChokePoint datasets show that the local SR approach obtains considerably better and more robust performance than several previous state-of-the-art holistic SR methods, on both the traditional closed-set identification task and the more applicable face verification task. The experiments also show that l1-minimisation-based encoding has a considerably higher computational cost when compared with SANN-based and probabilistic encoding, but leads to higher recognition rates.
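A minimal sketch of the local encoding-and-pooling step, assuming a pre-trained Gaussian mixture model as the implicit probabilistic encoder; the patch preparation, model size and function names are placeholders, not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def region_descriptor(patches, gmm):
    """Encode each patch as GMM posterior probabilities, then average-pool.

    patches: (n_patches, patch_dim) array of vectorised, normalised patches.
    The averaging deliberately discards spatial layout within the region.
    """
    posteriors = gmm.predict_proba(patches)   # (n_patches, n_components)
    return posteriors.mean(axis=0)            # pooled region descriptor

def face_descriptor(regions, gmm):
    """Concatenate pooled descriptors of all regions into one face descriptor."""
    return np.concatenate([region_descriptor(r, gmm) for r in regions])

# Hypothetical usage: gmm fitted beforehand on training patches
# gmm = GaussianMixture(n_components=64).fit(training_patches)
```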
Abstract:
Experiments were conducted to determine the fate of bensulfuron-methyl (BSM) and imazosulfuron (IMS) under paddy conditions. Initially, laboratory experiments were conducted and the photolysis half-lives of the two herbicides were found to be much shorter than their hydrolysis half-lives in aqueous solutions. In the aerobic water–soil system, dissipation followed first-order kinetics with water half-lives of 9.1 and 11.0 days and soil half-lives of 12.4 and 18.5 days (first phase) and 35.0 and 44.1 days (second phase) for bensulfuron-methyl and imazosulfuron, respectively. However, the anaerobic soil half-lives were only 12.7 and 9.8 days for BSM and IMS, respectively. The values of Kd were determined to be 16.0 and 13.8 for BSM and IMS, respectively. Subsequent field measurements for the two herbicides revealed that dissipation of both herbicides in paddy water involved biphasic first-order kinetics, with the dissipation rates in the first phase being much faster than those in the second phase. The dissipation of bensulfuron-methyl and imazosulfuron in the paddy surface soil also followed biphasic first-order kinetics. These results were then used as input parameters for the PCPF-1 model to simulate the fate and transport of BSM and IMS in the paddy environment (water and 1-cm surface soil layer). The measured and simulated values agreed well and the mass balance error during the simulation period was −1.2 and 2.8% of applied pesticide for BSM and IMS, respectively.
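For context, the first-order and biphasic first-order dissipation kinetics referred to above take the standard forms below (generic notation, not the paper's own symbols):

```latex
% Single first-order decay and its half-life
C(t) = C_0\, e^{-k t}, \qquad t_{1/2} = \frac{\ln 2}{k}

% Biphasic first-order: fast initial phase followed by a slower phase
C(t) = C_0 \left[f\, e^{-k_1 t} + (1 - f)\, e^{-k_2 t}\right], \qquad k_1 > k_2
```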
Abstract:
The main objective of statistical analysis of experimental investigations is to make predictions on the basis of mathematical equations so as to minimise the number of experiments required. Abrasive jet machining (AJM) is an unconventional and novel machining process wherein microabrasive particles are propelled at high velocities on to a workpiece. The resulting erosion can be used for cutting, etching, cleaning, deburring, drilling and polishing. In the study completed by the authors, statistical design of experiments was successfully employed to predict the rate of material removal by AJM. This paper discusses the details of such an approach and the findings.
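As a rough, hypothetical illustration of how a designed experiment yields a predictive equation (the factors, coded levels and response values below are invented for the sketch and are not the authors' data), a least-squares fit of material removal rate against coded factors gives a first-order prediction model:

```python
import numpy as np

# Hypothetical two-factor design: coded jet pressure and stand-off distance
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0]], dtype=float)
mrr = np.array([2.1, 3.4, 2.6, 4.2, 3.0])  # illustrative material removal rates

# First-order model: mrr ~ b0 + b1*x1 + b2*x2, fitted by least squares
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, mrr, rcond=None)
predict = lambda x1, x2: coeffs @ np.array([1.0, x1, x2])
print(coeffs, predict(0.5, -0.5))  # fitted coefficients and one prediction
```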
Abstract:
We report here on a series of laboratory experiments on plumes, undertaken with the object of simulating the effect of the heat release that occurs in clouds on condensation of water vapor. The experimental technique used for this purpose relies on ohmic heating generated in an electrically conducting plume fluid subjected to a suitable alternating voltage across specified axial stations in the plume flow [Bhat et al., 1989]. The present series of experiments achieves a value of the Richardson number that is toward the lower end of the range that characterizes cumulus clouds. It is found that the buoyancy enhancement due to heating disrupts the eddy structures in the flow and reduces the dilution owing to entrainment of ambient fluid that would otherwise have occurred in the central region of the plume. Heating also reduces the spread rate of the plume, but as it accelerates the flow as well, the overall specific mass flux in the plume does not show a very significant change at the heat input employed in the experiment. However, there is some indication that the entrainment rate (proportional to the streamwise derivative of the mass flux) is slightly higher immediately after heat injection and slightly lower farther downstream. The measurements support a previous proposal for a cloud scenario [Bhat and Narasimha, 1996] and demonstrate how fresh insights into certain aspects of the fluid dynamics of clouds may be derived from the experimental techniques employed here.
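The parenthetical definition of the entrainment rate can be written out in the usual plume-theory form (generic notation, not necessarily the authors' exact expression):

```latex
E(z) \;\propto\; \frac{\mathrm{d}\dot{m}}{\mathrm{d}z},
\qquad \dot{m}(z) = \int_{A(z)} \rho\, w \,\mathrm{d}A
```

Here ṁ(z) is the mass flux through the plume cross-section A(z) at streamwise station z and w is the streamwise velocity.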