1000 results for Techniques : spectroscopique
Abstract:
A significant amount of speech is typically required for speaker verification system development and evaluation, especially in the presence of large intersession variability. This paper introduces source- and utterance-duration-normalized linear discriminant analysis (SUN-LDA) approaches to compensate for session variability in short-utterance i-vector speaker verification systems. Two variations of SUN-LDA are proposed in which normalization techniques are used to capture source variation from both short and full-length development i-vectors, one based upon pooling (SUN-LDA-pooled) and the other on concatenation (SUN-LDA-concat) across the duration- and source-dependent session variation. Both the SUN-LDA-pooled and SUN-LDA-concat techniques are shown to provide improvement over traditional LDA on the NIST 08 truncated 10sec-10sec evaluation conditions, with the SUN-LDA-concat technique achieving the largest gains: a relative improvement in EER of 8% for mismatched conditions and over 3% for matched conditions over traditional LDA approaches.
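A minimal sketch of the pooling idea behind SUN-LDA-pooled, using synthetic data; the dimensions, speaker labels, and the plain scatter-matrix LDA below are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def lda_projection(ivectors, speaker_ids, n_dims):
    """Estimate an LDA projection from labelled development i-vectors."""
    mean_global = ivectors.mean(axis=0)
    d = ivectors.shape[1]
    Sb = np.zeros((d, d))   # between-speaker scatter
    Sw = np.zeros((d, d))   # within-speaker scatter
    for spk in np.unique(speaker_ids):
        X = ivectors[speaker_ids == spk]
        mu = X.mean(axis=0)
        Sb += len(X) * np.outer(mu - mean_global, mu - mean_global)
        Sw += (X - mu).T @ (X - mu)
    # Solve the generalised eigenvalue problem Sb v = lambda Sw v
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:n_dims]].real

# Pooled variant: short and full-length development i-vectors are simply
# stacked so a single LDA sees both duration/source conditions.
rng = np.random.default_rng(0)
full_iv = rng.normal(size=(200, 100))    # hypothetical full-length i-vectors
short_iv = rng.normal(size=(200, 100))   # hypothetical truncated-utterance i-vectors
labels = np.tile(np.arange(20), 10)      # 20 hypothetical speakers
pooled = np.vstack([full_iv, short_iv])
pooled_labels = np.concatenate([labels, labels])
W = lda_projection(pooled, pooled_labels, n_dims=19)
projected = pooled @ W                   # session-compensated i-vectors
```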
Abstract:
A people-to-people matching system (or match-making system) refers to a system in which users join with the objective of meeting other users with a common need. Some real-world examples of these systems are employer-employee (in job search networks), mentor-student (in university social networks), consumer-to-consumer (in marketplaces) and male-female (in an online dating network). The network underlying these systems consists of two groups of users, and the relationships between users need to be captured for developing an efficient match-making system. Most existing studies utilize information either about each user in isolation or about their interactions separately, and develop recommender systems using only one form of information. It is imperative to understand the linkages among the users in the network and use them in developing a match-making system. This study utilizes several social network analysis methods, such as graph theory, the small-world phenomenon, centrality analysis and density analysis, to gain insight into the entities and relationships present in this network. This paper also proposes a new type of graph called an "attributed bipartite graph". By using these analyses and the proposed type of graph, an efficient hybrid recommender system is developed which generates recommendations for new users and shows improved accuracy over the baseline methods.
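As a rough illustration of how an attributed bipartite graph of this kind might be represented (the paper gives no implementation; the node attributes, edge semantics, and use of networkx here are assumptions):

```python
import networkx as nx

# Two user groups (e.g., employers and job seekers) form the bipartite node sets;
# each node carries profile attributes and each edge records an observed interaction.
G = nx.Graph()
G.add_node("employer_1", bipartite=0, industry="IT", location="Sydney")
G.add_node("employer_2", bipartite=0, industry="Finance", location="Melbourne")
G.add_node("candidate_1", bipartite=1, skills=["python", "sql"], location="Sydney")
G.add_node("candidate_2", bipartite=1, skills=["excel"], location="Brisbane")
G.add_edge("employer_1", "candidate_1", interaction="message", reciprocated=True)
G.add_edge("employer_2", "candidate_1", interaction="view", reciprocated=False)

# Simple network statistics of the kind used in the analysis (density, centrality).
print(nx.density(G))
print(nx.degree_centrality(G))
```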
Abstract:
Using cooperative learning in classrooms promotes academic achievement, communication skills, problem-solving, social skills and student motivation. Yet it has been reported that cooperative learning, as a Western educational concept, may be ineffective in Asian cultural contexts. This study aims to investigate the use of scaffolding techniques for cooperative learning in Thai primary mathematics classes. A teacher training program was designed to foster Thai primary school teachers' implementation of cooperative learning. Two teachers participated in this experimental program for one and a half weeks and then implemented cooperative learning strategies in their mathematics classes for six weeks. The data collected from teacher interviews and classroom observations indicate that the difficulty or failure of implementing cooperative learning in Thai education may not derive directly from cultural differences. Instead, they indicate that Thai culture can be constructively merged with cooperative learning through a teacher training program and the practice of scaffolding techniques.
Abstract:
Airport efficiency is important because it has a direct impact on customer safety and satisfaction, and therefore on the financial performance and sustainability of airports, airlines, and affiliated service providers. This is especially so in a world characterized by an increasing volume of both domestic and international air travel; price and other forms of competition between rival airports, airport hubs and airlines; and rapid and sometimes unexpected changes in airline routes and carriers. It also reflects expansion in the number of airports handling regional, national, and international traffic and the growth of complementary airport facilities including industrial, commercial, and retail premises. This has fostered a steadily increasing volume of research aimed at modeling and providing best-practice measures and estimates of airport efficiency using mathematical and econometric frontiers. The purpose of this chapter is to review these various methods as they apply to airports throughout the world. Apart from discussing the strengths and weaknesses of the different approaches and their key findings, the chapter also examines the steps faced by researchers as they move through the modeling process in defining airport inputs and outputs and the purported efficiency drivers. Accordingly, the chapter provides guidance to those conducting empirical research on airport efficiency and serves as an aid for aviation regulators, airport operators and others interpreting airport efficiency research outcomes.
Abstract:
This paper presents a comparative study of the response of a buried tunnel to surface blast using the arbitrary Lagrangian-Eulerian (ALE) and smoothed particle hydrodynamics (SPH) techniques. Since explosive tests with real physical models are extremely risky and expensive, the results of a centrifuge test were used to validate the numerical techniques. The numerical study shows that the ALE predictions were faster and closer to the experimental results than those from the SPH simulations, which over-predicted the strains. The findings of this research demonstrate the superiority of the ALE modelling techniques for the present study. They also provide a comprehensive understanding of the preferred ALE modelling techniques, which can be used to investigate the surface blast response of underground tunnels.
Abstract:
This paper proposes techniques to improve the performance of i-vector based speaker verification systems when only short utterances are available. Short-utterance i-vectors vary with speaker, session variation, and the phonetic content of the utterance. Well-established methods such as linear discriminant analysis (LDA), source-normalized LDA (SN-LDA) and within-class covariance normalization (WCCN) exist for compensating the session variation, but we have identified the variability introduced by phonetic content due to utterance variation as an additional source of degradation when short-duration utterances are used. To compensate for utterance variations in short-utterance i-vector speaker verification systems using cosine similarity scoring (CSS), we introduce a short utterance variance normalization (SUVN) technique and a short utterance variance (SUV) modelling approach at the i-vector feature level. A combination of SUVN with LDA and SN-LDA is proposed to compensate for the session and utterance variations and is shown to provide improvement in performance over the traditional approach of using LDA and/or SN-LDA followed by WCCN. An alternative approach is also introduced, using probabilistic linear discriminant analysis (PLDA) to directly model the SUV. The combination of SUVN, LDA and SN-LDA followed by SUV PLDA modelling provides an improvement over the baseline PLDA approach. We also show that for this combination of techniques, the utterance variation information needs to be artificially added to full-length i-vectors for PLDA modelling.
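A brief sketch of cosine similarity scoring between two i-vectors after a linear session-compensation projection; the projection matrix and the i-vectors below are random placeholders, and SUVN/SN-LDA themselves are not reproduced here.

```python
import numpy as np

def cosine_score(ivec_enrol, ivec_test, projection):
    """Cosine similarity scoring between two i-vectors after a linear
    session/utterance-compensation projection (e.g. LDA or SN-LDA)."""
    a = projection.T @ ivec_enrol
    b = projection.T @ ivec_test
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
P = rng.normal(size=(100, 19))        # placeholder compensation projection
enrol = rng.normal(size=100)          # placeholder enrolment i-vector
test = rng.normal(size=100)           # placeholder short-utterance test i-vector
score = cosine_score(enrol, test, P)  # accept/reject by thresholding this score
```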
Abstract:
Pile foundations transfer loads from superstructures to stronger subsoil; their strength and stability can hence affect structural safety. This paper examines the response of a reinforced concrete pile in saturated sand to a buried explosion. Fully coupled computer simulation techniques are used together with five different material models. The influence of reinforcement on pile response is investigated, and the important safety parameters of horizontal deformation and tensile stress in the pile are evaluated. Results indicate that adequate longitudinal reinforcement and proper detailing of transverse reinforcement can reduce pile damage. The present findings can serve as a benchmark reference for future analysis and design.
Abstract:
Visual localization in outdoor environments is often hampered by the natural variation in appearance caused by such things as weather phenomena, diurnal fluctuations in lighting, and seasonal changes. Such changes are global across an environment and, in the case of global light changes and seasonal variation, the change in appearance occurs in a regular, cyclic manner. Visual localization could be greatly improved if it were possible to predict the appearance of a particular location at a particular time, based on the appearance of the location in the past and knowledge of the nature of appearance change over time. In this paper, we investigate whether global appearance changes in an environment can be learned sufficiently well to improve visual localization performance. We use time of day as a test case, and generate transformations between morning and afternoon using sample images from a training set. We demonstrate that the learned transformation can be generalized from the training data and show that the resulting visual localization on a test set is improved relative to raw image comparison. The improvement in localization remains when the area is revisited several weeks later.
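A loose sketch of one way such a global morning-to-afternoon appearance transformation could be learned from paired training images, here by least squares over image patches; the patch size, the linear form of the transform, and the synthetic images are assumptions, not the paper's method.

```python
import numpy as np

PATCH = 8  # hypothetical patch side length

def to_patches(img, patch=PATCH):
    """Split a grayscale image into flattened non-overlapping patches."""
    h, w = img.shape
    patches = [img[r:r + patch, c:c + patch].ravel()
               for r in range(0, h - patch + 1, patch)
               for c in range(0, w - patch + 1, patch)]
    return np.array(patches)

# Paired training images of the same places at two times of day (synthetic here).
rng = np.random.default_rng(2)
morning = rng.random((64, 64))
afternoon = np.clip(0.6 * morning + 0.2, 0, 1)   # stand-in global light change

X = to_patches(morning)
Y = to_patches(afternoon)
T, *_ = np.linalg.lstsq(X, Y, rcond=None)        # patch-level linear transform

# Predict the afternoon appearance of a morning query image before matching it
# against an afternoon map, instead of comparing raw images directly.
predicted_afternoon_patches = to_patches(morning) @ T
```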
Abstract:
Polarisation diversity is a technique to improve the quality of mobile communications, but its reliability is suboptimal because it depends on the mobile channel and the antenna orientations at both ends of the mobile link. A method to optimise the reliability is established by minimising the dependency on antenna orientations. While the mobile base station can have fixed antenna orientation, the mobile terminal is typically a handheld device with random orientations. This means orientation invariance needs to be established at the receiver in the downlink, and at the transmitter in the uplink. This research presents separate solutions for both cases, and is based on the transmission of an elliptically polarised signal synthesised from the channel statistics. Complete receiver orientation invariance is achieved in the downlink. Effects of the transmitter orientation are minimised in the uplink.
Abstract:
Previous studies have shown that the human lens contains glycerophospholipids with ether linkages. These lipids differ from conventional glycerophospholipids in that the sn-1 substituent is attached to the glycerol backbone via a 1-O-alkyl or a 1-O-alk-1'-enyl ether rather than an ester bond. The present investigation employed a combination of collision-induced dissociation (CID) and ozone-induced dissociation (OzID) to unambiguously distinguish such 1-O-alkyl and 1-O-alk-1'-enyl ethers. Using these methodologies, the human lens was found to contain several abundant 1-O-alkyl glycerophosphoethanolamines, including GPEtn(16:0e/9Z-18:1), GPEtn(11Z-18:1e/9Z-18:1), and GPEtn(18:0e/9Z-18:1), as well as a related series of unusual 1-O-alkyl glycerophosphoserines, including GPSer(16:0e/9Z-18:1), GPSer(11Z-18:1e/9Z-18:1), and GPSer(18:0e/9Z-18:1), that to our knowledge have not previously been observed in human tissue. Isomeric 1-O-alk-1'-enyl ethers were absent or present only in low abundance. Examination of the double bond position within the phospholipids using OzID revealed that several positional isomers were present, including sites of unsaturation at the n-9, n-7, and even n-5 positions. Tandem CID/OzID experiments revealed a preference for double bonds in the n-7 position of 1-O-ether linked chains, while n-9 double bonds predominated in the ester-linked fatty acids [e.g., GPEtn(11Z-18:1e/9Z-18:1) and GPSer(11Z-18:1e/9Z-18:1)]. Different combinations of these double bond positional isomers within chains at the sn-1 and sn-2 positions point to a remarkable molecular diversity of ether lipids within the human lens.
Abstract:
E-mail spam remains a scourge and menacing nuisance for users and for internet and network service operators and providers, in spite of the anti-spam techniques available; spammers relentlessly circumvent, to their advantage, the anti-spam techniques embedded or installed as software products on both the client and server sides of fixed and mobile devices. This continuous evasion degrades the capabilities of these anti-spam techniques, as none of them provides a comprehensive, reliable solution to the problem posed by spam and spammers. A major problem arises, for instance, when these anti-spam techniques misclassify legitimate emails as spam (false positives), or fail to block spam on the originating SMTP server (false negatives). The spam then passes on to the receiver, yet the server from which it originates neither notices this nor has an auto-alert service to indicate that the spam it was designed to prevent has slipped through to the receiver's SMTP server; the receiver's SMTP server in turn fails to stop the spam from reaching the user's device, again with no auto-alert mechanism to report this inability, causing a staggering cost in lost time, effort and money. This paper presents a comparative literature overview of some of these anti-spam techniques, especially the filtering technologies designed to prevent spam, examining their merits and demerits with a view to enhancing their capabilities, and offers evaluative, analytical recommendations that will be the subject of further research.
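As a small numerical illustration of the false-positive/false-negative distinction above, computed from hypothetical filter decisions (the labels and predictions are invented for the example):

```python
# Hypothetical ground truth (1 = spam, 0 = legitimate) and filter decisions.
actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 1, 0, 1, 0, 0, 1, 0]

false_positives = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
false_negatives = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)

# Rates relative to the legitimate and spam populations respectively.
fp_rate = false_positives / actual.count(0)   # legitimate mail wrongly blocked
fn_rate = false_negatives / actual.count(1)   # spam that slips through
print(fp_rate, fn_rate)
```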
Abstract:
In this paper, we propose a new multi-class steganalysis technique for binary images. The proposed method can identify the type of steganographic technique used by examining the given binary image. In addition, the proposed method is also capable of differentiating an image containing a hidden message from one without. To do so, we extract a number of features from the binary image. The feature extraction method used combines an extension of our previous work with new methods proposed in this paper. Based on the extracted feature sets, we construct our multi-class steganalyser using an SVM classifier. We also present empirical results demonstrating that the proposed method can effectively identify five different types of steganography.
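A condensed sketch of the classification stage only, assuming feature vectors have already been extracted from each binary image; the feature values, class labels, and the use of scikit-learn's SVC are illustrative placeholders (the paper's feature extraction is not reproduced).

```python
import numpy as np
from sklearn.svm import SVC

# Each row is a feature vector extracted from one binary image; labels name the
# embedding method (0 = clean / no hidden message, 1..5 = steganographic techniques).
rng = np.random.default_rng(3)
features = rng.random((120, 16))        # placeholder feature vectors
labels = rng.integers(0, 6, size=120)   # placeholder class labels

clf = SVC(kernel="rbf", decision_function_shape="ovr")
clf.fit(features[:100], labels[:100])   # train the multi-class SVM
predicted = clf.predict(features[100:]) # identify the stego method for new images
```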
Abstract:
Designing systems for multiple stakeholders requires frequent collaboration with multiple stakeholders from the start. In many cases at least some stakeholders lack a professional habit of formal modeling. We report observations from student design teams as well as two case studies: one of a prototype for supporting creative communication to design objects, and one of stakeholder involvement in early design. In all observations and case studies we found that non-formal techniques supported strong collaboration, resulting in a deep understanding of early design ideas, of their value, and of the feasibility of solutions.
Abstract:
Rolling Element Bearings (REBs) are vital components in rotating machinery, providing rotational motion. In slow-speed rotating machines, bearings are normally subjected to heavy static loads, and a catastrophic failure can cause enormous disruption to production and endanger human safety. Due to the low operating speed, the impact energy generated by the rolling elements on the defective components is not sufficient to produce a detectable vibration response. This is further aggravated by the inability of general measuring instruments to accurately detect and process the weak signals at the initiation of the defect. Furthermore, the weak signals are often corrupted by background noise. This is a serious problem faced by maintenance engineers today, and the inability to detect an incipient failure of the machine can significantly increase the risk of functional failure and costly downtime. This paper presents the application of noise removal techniques for enhancing detection capability in slow-speed REB condition monitoring. Blind deconvolution (BD) and the adaptive line enhancer (ALE) are compared to evaluate their performance in enhancing the source signal with the consequential removal of background noise. In the experimental study, incipient defects were seeded on a number of roller bearings and the signals were acquired using an acoustic emission (AE) sensor. Kurtosis and the modified peak ratio (mPR) were used to determine the detectability of the signal corrupted by noise.
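A rough sketch of an adaptive line enhancer built from a basic LMS filter, with kurtosis as the detectability indicator; the filter length, step size, delay and the synthetic signal are assumptions, and blind deconvolution and the modified peak ratio are not shown.

```python
import numpy as np
from scipy.stats import kurtosis

def adaptive_line_enhancer(x, n_taps=32, delay=1, mu=0.01):
    """LMS-based ALE: predict x[n] from delayed samples so that the correlated
    (periodic) part is enhanced and uncorrelated broadband noise is suppressed."""
    w = np.zeros(n_taps)
    enhanced = np.zeros_like(x)
    for n in range(n_taps + delay, len(x)):
        ref = x[n - delay - n_taps:n - delay][::-1]   # delayed reference window
        y = w @ ref                                   # filter output (enhanced signal)
        e = x[n] - y                                  # prediction error
        w += 2 * mu * e * ref                         # LMS weight update
        enhanced[n] = y
    return enhanced

# Synthetic stand-in for a slow-speed bearing AE signal: weak periodic impacts in noise.
rng = np.random.default_rng(4)
t = np.arange(20000)
impacts = 0.3 * (np.sin(2 * np.pi * t / 400) > 0.995)
signal = impacts + rng.normal(scale=0.5, size=t.size)

enhanced = adaptive_line_enhancer(signal)
# Higher kurtosis indicates a more impulsive (and hence more detectable) defect signature.
print(kurtosis(signal), kurtosis(enhanced))
```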
Abstract:
This qualitative study explores the methods that chefs use to create innovative, marketable products and compares these findings to other design tools. The study is based on a series of interviews with locally recognised chefs in Minnesota and observations of them in their kitchens, undertaken to understand the details of how they conceive and develop dishes from preliminary concept to final plating and consumption. This paper focuses on idea generation and discusses two key findings: first, the variety of idea generation techniques described by the chefs can be classified using the creativity tool SCAMPER (substitute, combine, adapt, modify/magnify, put to other use, eliminate, reverse/rearrange); second, chefs invoke the principle of MAYA, or Most Advanced Yet Acceptable, when innovating new dishes, which implies making novel changes while remaining relatable to the consumer. Other recurring topics in the interview discussions of food innovation include play, surprise, and humour.