893 results for noisy speaker verification
Abstract:
We presented 28 sentences uttered by 28 unfamiliar speakers to sleeping participants to investigate whether humans can encode new verbal messages, learn the voices of unfamiliar speakers, and form associations between speakers and messages during EEG-defined deep sleep. After waking, participants performed three tests that assessed the unconscious recognition of sleep-played speakers, messages, and speaker-message associations. Recognition performance in all tests was at chance level. However, response latencies revealed implicit memory for sleep-played messages, but neither for speakers nor for speaker-message combinations. Only participants with excellent implicit memory for sleep-played messages also displayed implicit memory for speakers, but not for speaker-message associations. Hence, deep sleep allows for the semantic encoding of novel verbal messages.
VERIFICATION OF DNA PREDICTED PROTEIN SEQUENCES BY ENZYME HYDROLYSIS AND MASS SPECTROMETRIC ANALYSIS
Abstract:
The focus of this thesis lies in the development of a sensitive method for the analysis of protein primary structure that can easily be used to confirm the DNA sequence of a protein's gene and to determine the modifications made after translation. The technique uses dipeptidyl aminopeptidase (DAP) and dipeptidyl carboxypeptidase (DCP) to hydrolyze the protein, followed by mass spectrometric analysis of the dipeptide products. Dipeptidyl carboxypeptidase was purified from human lung tissue and characterized with respect to its proteolytic activity. The results showed that the enzyme has a relatively unrestricted specificity, making it useful for the analysis of the C-terminus of proteins. Most of the dipeptide products were identified using gas chromatography/mass spectrometry (GC/MS). To analyze the peptides not hydrolyzed by DCP and DAP, as well as the dipeptides not identified by GC/MS, a FAB ion source was installed on a quadrupole mass spectrometer and its performance evaluated with a variety of compounds. Using these techniques, the sequences of the N-terminal and C-terminal regions and seven fragments of bacteriophage P22 tail protein were verified. All of the dipeptides identified in these analyses were in the same DNA reading frame, ruling out the possibility of a single base having been inserted into or deleted from the DNA sequence. The verification of small sequences throughout the protein also indicates that no large portions of the protein were removed after translation.
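The complementary digestion scheme can be sketched in a few lines: DAP releases dipeptides stepwise from the N-terminus while DCP works inward from the C-terminus, so the two digests cover the sequence from opposite ends. A minimal illustration (the sequence is hypothetical, and complete, unhindered hydrolysis is assumed):

```python
def dap_dipeptides(seq):
    """Dipeptides released by dipeptidyl aminopeptidase (N-terminus inward)."""
    return [seq[i:i + 2] for i in range(0, len(seq) - 1, 2)]

def dcp_dipeptides(seq):
    """Dipeptides released by dipeptidyl carboxypeptidase (C-terminus inward)."""
    out = []
    i = len(seq)
    while i >= 2:
        out.append(seq[i - 2:i])
        i -= 2
    return out

seq = "MKTAYIAKQR"  # hypothetical single-letter protein sequence
print(dap_dipeptides(seq))  # ['MK', 'TA', 'YI', 'AK', 'QR']
print(dcp_dipeptides(seq))  # ['QR', 'AK', 'YI', 'TA', 'MK']
```

For an even-length sequence the two digests yield the same dipeptides, so every identified dipeptide can be checked against the same DNA reading frame, as the abstract describes.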
Abstract:
The purpose of this work was to develop a comprehensive IMSRT QA procedure that examined, using EPID dosimetry and Monte Carlo (MC) calculations, each step in the treatment planning and delivery process. These steps included verification of the field shaping, the radiotherapy treatment planning system (RTPS) dose calculations, and patient dose delivery. Verifying each step in the treatment process is assumed to result in correct dose delivery to the patient. The accelerator MC model was verified against commissioning data for field sizes from 0.8 × 0.8 cm² to 10 × 10 cm². Depth doses and lateral profiles were within 2% local percent difference (LPD) in low-gradient regions and 1 mm distance to agreement (DTA) in high-gradient regions. Calculated output factors were within 1% of measurement for field sizes ≥1 × 1 cm². The measured and calculated pretreatment EPID dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central axis pixel value, with ≥95% of compared points required to pass for successful verification; 97% of the points passed. The RTPS and Monte Carlo phantom dose calculations were compared using 5% LPD, 2 mm DTA, or 2% of the maximum dose, again with ≥95% of compared points required to pass; 97% of the points passed. The measured and calculated EPID exit dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central axis pixel value, with ≥95% of compared points required to pass; 97% of the points passed. Each of the processes above verified an individual step in the treatment planning and delivery process, and the combination of these verification steps ensures accurate treatment delivery to the patient.
This work shows that Monte Carlo calculations and EPID dosimetry can be used to quantitatively verify IMSRT treatments, resulting in improved patient care and, potentially, improved clinical outcomes.
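The composite pass criterion used throughout (percent difference in low-gradient regions, or distance-to-agreement in high-gradient regions) can be sketched in one dimension. This is an illustrative simplification with hypothetical profiles, not the evaluation code used in the work; the 5% / 1 mm thresholds follow the EPID comparisons above:

```python
def point_passes(i, xs, meas, ref, pct=0.05, dta_mm=1.0):
    """A point passes if the local percent difference is within pct,
    or if the reference reaches the measured value within dta_mm (DTA)."""
    if ref[i] > 0 and abs(meas[i] - ref[i]) / ref[i] <= pct:
        return True
    near = [ref[j] for j in range(len(xs)) if abs(xs[j] - xs[i]) <= dta_mm]
    return min(near) <= meas[i] <= max(near)

def pass_rate(xs, meas, ref):
    """Fraction of points passing; verification succeeds if this is >= 0.95."""
    return sum(point_passes(i, xs, meas, ref) for i in range(len(xs))) / len(xs)

# a steep ramp shifted by 1 mm fails the dose test but passes via DTA
xs = [float(i) for i in range(10)]               # positions in mm
ref = [10.0 * x for x in xs]                     # reference dose
meas = [max(0.0, 10.0 * (x - 1)) for x in xs]    # measured, shifted 1 mm
print(pass_rate(xs, meas, ref))  # 1.0
```

The DTA branch here simply asks whether the reference profile spans the measured value within the distance window, which is the usual reason high-gradient points are exempted from the pointwise dose test.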
Abstract:
The use of intensity-modulated radiotherapy (IMRT) necessitates a significant amount of patient-specific quality assurance (QA). This research investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verification, the use of comparisons between 2D dose calculations and measurements to improve treatment-plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters: the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, the interval between exposure and processing, and phantom material. The precision of EDR2 film measurements was better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2 mm accuracy for single-beam fluence map verifications and to 5%/2 mm for transverse-plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam-modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data, and the dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and the percentage of pixels failing the gamma index were exponentially distributed and depended on the measurement phantom but not the treatment site.
Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
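The gamma index mentioned above is a standard composite metric (Low et al.): each measured point is scored by the smallest combined dose/distance deviation over all reference points and passes if the score is at most 1. A 1D sketch with a global 3%/3 mm criterion follows; the thesis's own 2D implementation and criteria may differ:

```python
import math

def gamma_1d(xs, meas, ref, dd=0.03, dta=3.0):
    """1D gamma index: minimum combined dose/distance deviation of each
    measured point from the reference profile (global dose normalization)."""
    dmax = max(ref)
    return [min(math.hypot((xj - xi) / dta, (rj - mi) / (dd * dmax))
                for xj, rj in zip(xs, ref))
            for xi, mi in zip(xs, meas)]

def percent_failing(gammas):
    """Percentage of points with gamma > 1, the quantity tracked clinically."""
    return 100.0 * sum(g > 1.0 for g in gammas) / len(gammas)

xs = [0.0, 1.0, 2.0, 3.0]
ref = [0.0, 30.0, 60.0, 90.0]
print(percent_failing(gamma_1d(xs, ref, ref)))  # 0.0 – identical profiles
```

Because the minimum runs over all reference positions, a profile that is merely shifted within the DTA window still scores gamma ≤ 1, mirroring the DTA exemption in simpler composite tests.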
Abstract:
The clinical advantage of protons over conventional high-energy x-rays stems from their unique depth-dose distribution, which delivers essentially no dose beyond the end of range. Achieving this advantage requires accurate localization of the tumor volume relative to the proton beam. For tumors that move with respiration, the resulting dose distribution is sensitive to that motion. One way to reduce the uncertainty caused by respiratory motion is to use gated beam delivery. The main goal of this dissertation is to evaluate the respiratory gating technique in both passive scattering and scanning delivery modes. Our hypothesis was that optimizing the parameters of synchrotron operation and respiratory gating can lead to greater efficiency and accuracy of respiratory gating for all modes of synchrotron-based proton treatment delivery. The hypothesis was tested in two specific aims. Specific aim 1 was to assess the efficiency of respiratory-gated proton beam delivery and to optimize synchrotron operation for gated proton therapy. A simulation study introduced an efficient synchrotron operation pattern, called variable Tcyc, and also estimated the efficiency of the respiratory-gated scanning beam delivery mode. Specific aim 2 was to assess the accuracy of beam delivery in respiratory-gated proton therapy. The simulation study was extended to the passive scattering mode to estimate the quality of pulsed beam delivery to the residual motion for several synchrotron operation patterns with gating. The results showed that variable Tcyc operation can offer reproducible beam delivery to the residual motion at a given phase of the motion. For respiratory-gated scanning beam delivery, the impact of motion on the dose distributions delivered by scanned beams was investigated by measurement.
The results established motion thresholds for a variety of scan patterns and the appropriate number of paintings for normal and respiratory-gated beam deliveries. Together, the results of specific aims 1 and 2 provide supporting data for implementing respiratory-gated beam delivery in both passive and scanning modes and for validating the hypothesis.
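The efficiency question in specific aim 1 can be illustrated with a toy duty-cycle simulation: the beam is deliverable only when the synchrotron spill overlaps the respiratory gate. All numbers below are hypothetical, and the fixed-Tcyc pattern shown is the baseline that a variable-Tcyc operation would improve on by resynchronizing the spill to the gate:

```python
def gated_efficiency(t_total, resp_period, gate_frac, t_cyc, spill_frac):
    """Fraction of time the spill (flat-top) overlaps the respiratory gate.
    The gate opens at the start of each breathing cycle for gate_frac of it;
    the spill occupies the first spill_frac of each fixed synchrotron
    cycle t_cyc. All parameters are illustrative, in seconds."""
    dt = 0.001
    n = int(t_total / dt)
    on = 0
    for k in range(n):
        t = k * dt
        in_gate = (t % resp_period) < gate_frac * resp_period
        in_spill = (t % t_cyc) < spill_frac * t_cyc
        on += in_gate and in_spill
    return on / n

# when spill and gate happen to be aligned, the full 30% gate is usable
print(gated_efficiency(100.0, 4.0, 0.3, 4.0, 0.3))  # ≈ 0.3
```

With mismatched periods the overlap drifts and the duty cycle drops, which is the inefficiency that synchronizing the synchrotron cycle to the breathing pattern addresses.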
Abstract:
In this paper, the scales of Raven's Progressive Matrices Test, General Scale and Advanced Scale, Series II, for the student population (third cycle of EGB and Polimodal) in the city of La Plata are presented. Considerations are made regarding both the increase in scores (the Flynn effect) observed relative to the previous scale (1964) and the different mean scores according to two age groups (13-16 and 17-18 years of age) and education mode. The findings enabled inferences about the significance of the increase, particularly for the higher scores in the population attending a special kind of educational institution.
Abstract:
The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, nonfailure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
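The flavor of abstract interpretation can be conveyed with a toy sign domain: program values are abstracted to their sign, and operations are evaluated over these abstractions, soundly over-approximating all concrete behaviors. This is an illustrative sketch only; CiaoPP's actual domains (data structure shapes, sizes, determinacy, cost) are far richer:

```python
# A toy abstract domain of signs: NEG, ZERO, POS, plus TOP ("unknown").
NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def abs_mul(a, b):
    """Abstract multiplication: sound for every concrete pair of values."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

def abs_add(a, b):
    """Abstract addition: pos + neg could be anything, hence TOP."""
    if a == ZERO:
        return b
    if b == ZERO:
        return a
    return a if a == b else TOP

# the analyzer proves, without running the program, that the product
# of two negative numbers is positive:
print(abs_mul(NEG, NEG))   # pos
# and it must give up (TOP) on pos + neg, losing precision but not soundness:
print(abs_add(POS, NEG))   # top
```

An assertion such as "this variable is positive here" can then be validated statically whenever the abstract value at that program point is POS, which is the validation/debugging use the abstract describes.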
Abstract:
The time difference of arrival (TDOA) between multiple microphones has been used since 2006 as a source of localization information to complement spectral features for speaker diarization. In this paper, we propose a new localization feature, the intensity channel contribution (ICC), based on the energy of the signal arriving at each channel relative to the summed energy of all channels. We demonstrate that joining the ICC and TDOA features improves the robustness of the localization features and reduces the diarization error rate (DER) of the complete system (using localization and spectral features). With this new localization feature, we achieved relative DER improvements of 5.2% on our development data, 3.6% on the RT07 evaluation data, and 7.9% on last year's RT09 evaluation data.
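The ICC feature admits a straightforward sketch: per analysis frame, compute each channel's short-time energy and normalize by the sum over channels. This is a minimal reading of the definition in the abstract; framing, windowing, and any compression applied in the actual system are not reproduced:

```python
def icc_frame(channels):
    """Intensity channel contribution for one analysis frame:
    each channel's energy relative to the summed energy of all channels."""
    energies = [sum(s * s for s in ch) for ch in channels]
    total = sum(energies)
    if total == 0:  # silent frame: spread the contribution evenly
        return [1.0 / len(channels)] * len(channels)
    return [e / total for e in energies]

# two distant microphones; the source is closer to the first one
near = [0.8, -0.6, 0.7]   # hypothetical samples, higher energy
far = [0.2, -0.1, 0.1]
print(icc_frame([near, far]))  # first value near 1, second near 0
```

Because the values are normalized, the ICC vector sums to 1 and shifts toward whichever microphone the active speaker is closest to, which is what makes it usable as a localization cue alongside TDOA.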
Abstract:
The theoretical formulation of the smoothed particle hydrodynamics (SPH) method deserves great care because of inconsistencies that arise when considering free-surface inviscid flows. In SPH formulations one usually assumes that (i) surface integral terms on the boundary of the interpolation kernel support can be neglected, and (ii) free-surface conditions are implicitly verified. These assumptions are studied in detail in the present work for free-surface Newtonian viscous flow, and the consistency of classical weakly compressible viscous SPH formulations is investigated. In particular, the principle of virtual work is used to study the verification of the free-surface boundary conditions in a weak sense. The latter can be related to the global energy dissipation induced by the viscous term formulations and their consistency. Numerical verification of this theoretical analysis is provided on three free-surface test cases, including a standing wave, for the three viscous term formulations investigated.
Abstract:
Two new features proposed and used by the Universidad Politécnica de Madrid in the Rich Transcription 2009 evaluation outperform the baseline system. The first is the intensity channel contribution, a feature related to the location of the speaker; the second is the logarithm of the interpolated fundamental frequency. This is the first time both features have been applied to the clustering stage of diarization for meetings recorded with multiple distant microphones. Including both features improves the baseline results by 15.36% and 16.71% relative on the development set and the RT09 set, respectively. Considering speaker errors only, the relative improvement is 23% and 32.83% on the development set and the RT09 set, respectively.
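The second feature can be sketched as follows: unvoiced frames (where no F0 estimate exists, conventionally coded as 0) are filled by linear interpolation between neighbouring voiced frames before taking the logarithm. The interpolation scheme here is an assumption for illustration; the actual system may use a different contour model:

```python
import math

def log_interp_f0(f0):
    """Log fundamental frequency with unvoiced frames (f0 == 0) filled in
    by linear interpolation between the neighbouring voiced frames."""
    voiced = [i for i, v in enumerate(f0) if v > 0]
    if not voiced:
        return [0.0] * len(f0)  # no voicing at all: return a flat track
    out = list(f0)
    for i in range(len(out)):
        if out[i] <= 0:
            prev = max((j for j in voiced if j < i), default=None)
            nxt = min((j for j in voiced if j > i), default=None)
            if prev is None:          # before the first voiced frame
                out[i] = f0[nxt]
            elif nxt is None:         # after the last voiced frame
                out[i] = f0[prev]
            else:                     # linear interpolation between the two
                w = (i - prev) / (nxt - prev)
                out[i] = (1 - w) * f0[prev] + w * f0[nxt]
    return [math.log(v) for v in out]

print(log_interp_f0([100.0, 0.0, 200.0])[1])  # log of 150.0
```

Interpolating before the logarithm yields a continuous feature stream, which is what the clustering stage of a diarization system requires.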
Abstract:
To satisfy safety-critical requirements, train control systems (TCS) often employ a layered safety communication protocol to provide reliable services. However, both the description and the verification of such safety protocols can be formidable because of system complexity. In this paper, interface automata (IA) are used to describe the safety service interface behaviors of a safety communication protocol. A formal verification method is proposed that describes the safety communication protocol using IA and translates the IA model into a PROMELA model, so that the protocol can be verified with the model checker SPIN. A case study of using this method to describe and verify a safety communication protocol is included. The verification results illustrate that the proposed method is effective for describing safety protocols and for checking deadlocks, livelocks, and several mandatory consistency properties. A prototype of the safety protocols was also developed based on the presented formal verification method.
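SPIN's core job can be conveyed in miniature: enumerate every reachable state of the model and flag states with no enabled transition (deadlocks). The handshake below is a toy stand-in, not the safety protocol or the IA-to-PROMELA translation itself:

```python
from collections import deque

def find_deadlocks(init, transitions):
    """Explicit-state exploration in the spirit of SPIN: breadth-first
    enumeration of all reachable states, reporting those with no successor."""
    seen, frontier, deadlocks = {init}, deque([init]), []
    while frontier:
        s = frontier.popleft()
        succs = transitions(s)
        if not succs:
            deadlocks.append(s)
        for t in succs:
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return deadlocks

# toy two-party handshake: state is (sender_state, receiver_state)
def step(state):
    s, r = state
    succs = []
    if s == "idle":
        succs.append(("sent", r))          # sender transmits
    if s == "sent" and r == "idle":
        succs.append((s, "acked"))         # receiver acknowledges
    if s == "sent" and r == "acked":
        succs.append(("idle", "idle"))     # sender sees the ack, resets
    return succs

print(find_deadlocks(("idle", "idle"), step))  # [] – no deadlock
```

A real SPIN run additionally checks liveness properties (for livelocks) and user-stated consistency assertions over the same reachable state space, but reachability plus successor enumeration is the foundation.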