846 results for robust speaker verification


Relevance: 20.00%

Abstract:

robreg provides a number of robust estimators for linear regression models. Among them are the high breakdown-point and high efficiency MM-estimator, the Huber and bisquare M-estimators, and the S-estimator, each supporting classic or robust standard errors. Furthermore, basic versions of the LMS/LQS (least median of squares) and LTS (least trimmed squares) estimators are provided. Note that the moremata package, also available from SSC, is required.
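M-estimators of the kind listed here are typically computed by iteratively reweighted least squares (IRLS). Below is a minimal numpy sketch of a Huber M-estimator with a normalized-MAD scale, purely for illustration; robreg's Stata implementation differs in details such as the scale update, tuning defaults, and standard errors.

```python
import numpy as np

def huber_m_estimate(X, y, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimator for linear regression via iteratively
    reweighted least squares, with a normalized-MAD residual scale."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting values
    for _ in range(max_iter):
        r = y - X @ beta
        s = 1.4826 * np.median(np.abs(r - np.median(r)))  # robust scale
        u = np.abs(r) / max(s, 1e-12)
        w = np.minimum(1.0, k / np.maximum(u, 1e-12))     # Huber weights
        sw = np.sqrt(w)
        beta_new = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# a clean line y = 1 + 2x contaminated by one gross outlier
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, 50)
y[5] = 100.0
X = np.column_stack([np.ones_like(x), x])
beta = huber_m_estimate(X, y)
```

Because the Huber weight caps an outlier's influence at k times the robust scale, the single gross outlier barely moves the fitted intercept and slope, whereas it would dominate an OLS fit.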

Relevance: 20.00%

Abstract:

Many location-based services target users in indoor environments. As in dense urban areas where many obstacles exist, indoor localization techniques suffer from outlying measurements caused by severe multipath propagation and non-line-of-sight (NLOS) reception. Obstructions in the signal path caused by static or mobile objects degrade localization accuracy. We use robust multipath mitigation techniques to detect and filter out outlying measurements in indoor environments. We validate our approach using a power-based localization system with GSM. We conducted experiments without any prior knowledge of the tracked device's radio settings or of the indoor radio environment, and obtained localization errors on the order of 3 m even when the sensors had NLOS links to the target device.
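The abstract does not spell out the filtering rule, but a common robust scheme for rejecting outlying power measurements of this kind is a median/MAD gate; the following is a hypothetical sketch, not the authors' method.

```python
import numpy as np

def mad_filter(rss_samples, k=3.0):
    """Reject received-signal-strength samples whose deviation from the
    median exceeds k times the normalized MAD (a robust sigma)."""
    rss = np.asarray(rss_samples, dtype=float)
    med = np.median(rss)
    mad = 1.4826 * np.median(np.abs(rss - med))
    if mad == 0.0:
        return rss
    return rss[np.abs(rss - med) <= k * mad]

# mostly line-of-sight readings near -70 dBm plus two NLOS-attenuated outliers
samples = [-70.2, -69.8, -70.5, -71.0, -69.9, -95.0, -70.3, -88.0]
kept = mad_filter(samples)
```

Because median and MAD are themselves robust, the two heavily attenuated NLOS readings do not inflate the rejection threshold the way a mean/standard-deviation gate would allow.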

Relevance: 20.00%

Abstract:

We presented 28 sentences uttered by 28 unfamiliar speakers to sleeping participants to investigate whether humans can encode new verbal messages, learn the voices of unfamiliar speakers, and form associations between speakers and messages during EEG-defined deep sleep. After waking, participants performed three tests that assessed unconscious recognition of the sleep-played speakers, messages, and speaker-message associations. Recognition performance in all tests was at chance level. However, response latencies revealed implicit memory for the sleep-played messages, but not for speakers or for speaker-message combinations. Only participants with excellent implicit memory for the sleep-played messages also displayed implicit memory for speakers, though not for speaker-message associations. Hence, deep sleep allows for the semantic encoding of novel verbal messages.

Relevance: 20.00%

Abstract:

Global change drivers are rapidly altering resource availability and biodiversity. While there is consensus that greater biodiversity increases the functioning of ecosystems, the extent to which biodiversity buffers ecosystem productivity in response to changes in resource availability remains unclear. We use data from 16 grassland experiments across North America and Europe that manipulated plant species richness and one of two essential resources—soil nutrients or water—to assess the direction and strength of the interaction between plant diversity and resource alteration on above-ground productivity and net biodiversity, complementarity, and selection effects. Despite strong increases in productivity with nutrient addition and decreases in productivity with drought, we found that resource alterations did not alter biodiversity–ecosystem functioning relationships. Our results suggest that these relationships are largely determined by increases in complementarity effects along plant species richness gradients. Although nutrient addition reduced complementarity effects at high diversity, this appears to be due to high biomass in monocultures under nutrient enrichment. Our results indicate that diversity and the complementarity of species are important regulators of grassland ecosystem productivity, regardless of changes in other drivers of ecosystem function.
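The complementarity and selection effects referred to above are conventionally obtained from the Loreau-Hector additive partition of the net biodiversity effect; the following sketch uses that standard partition with made-up yields, not data from the study.

```python
import numpy as np

def additive_partition(observed_yield, monoculture_yield, planted_fraction):
    """Loreau-Hector additive partition of the net biodiversity effect
    into complementarity and selection components.

    observed_yield[i]   : yield of species i in the mixture
    monoculture_yield[i]: yield of species i grown alone (M_i)
    planted_fraction[i] : expected relative yield, e.g. 1/N for even mixtures
    """
    Yo = np.asarray(observed_yield, float)
    M = np.asarray(monoculture_yield, float)
    RYe = np.asarray(planted_fraction, float)
    N = len(Yo)
    dRY = Yo / M - RYe                       # deviation in relative yield
    complementarity = N * dRY.mean() * M.mean()
    # selection = N * population covariance between dRY and M
    selection = N * np.mean((dRY - dRY.mean()) * (M - M.mean()))
    net_effect = Yo.sum() - (RYe * M).sum()
    return net_effect, complementarity, selection

# hypothetical two-species mixture
net, comp, sel = additive_partition([60, 50], [100, 80], [0.5, 0.5])
```

The partition is exact by construction: the net effect always equals complementarity plus selection, which is what makes it useful for attributing diversity effects.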

Relevance: 20.00%

Abstract:

The focus of this thesis lies in the development of a sensitive method for the analysis of protein primary structure that can be used to confirm the DNA sequence of a protein's gene and to determine the modifications made after translation. The technique uses dipeptidyl aminopeptidase (DAP) and dipeptidyl carboxypeptidase (DCP) to hydrolyze the protein, followed by mass spectrometric analysis of the dipeptide products.

Dipeptidyl carboxypeptidase was purified from human lung tissue and characterized with respect to its proteolytic activity. The results showed that the enzyme has a relatively unrestricted specificity, making it useful for the analysis of the C-terminus of proteins. Most of the dipeptide products were identified using gas chromatography/mass spectrometry (GC/MS). To analyze the peptides not hydrolyzed by DCP and DAP, as well as the dipeptides not identified by GC/MS, a FAB ion source was installed on a quadrupole mass spectrometer and its performance evaluated with a variety of compounds.

Using these techniques, the sequences of the N-terminal and C-terminal regions and seven fragments of bacteriophage P22 tail protein were verified. All of the dipeptides identified in these analyses were in the same DNA reading frame, ruling out the possibility of a single base having been inserted into or deleted from the DNA sequence. The verification of short sequences throughout the protein also indicates that no large portions of the protein were removed after translation.
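To illustrate the dipeptide logic: DAP releases dipeptides sequentially from the N-terminus, so a single inserted or deleted base would shift every downstream dipeptide out of the expected set. The sketch below uses standard monoisotopic residue masses; the peptide itself is invented and not from the thesis.

```python
# standard monoisotopic residue masses (Da) for the residues used below
RESIDUE_MASS = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "L": 113.08406,
                "K": 128.09496, "E": 129.04259, "V": 99.06841, "T": 101.04768}
WATER = 18.01056  # mass of water added on hydrolysis

def dap_products(seq):
    """Dipeptidyl aminopeptidase releases dipeptides sequentially from
    the N-terminus: residues 1-2, 3-4, 5-6, ..."""
    return [seq[i:i + 2] for i in range(0, len(seq) - 1, 2)]

def dipeptide_mass(dp):
    """Monoisotopic mass of a free dipeptide."""
    return sum(RESIDUE_MASS[r] for r in dp) + WATER

peptide = "GASLKEVT"           # hypothetical peptide
products = dap_products(peptide)
masses = [dipeptide_mass(dp) for dp in products]
```

Comparing the observed dipeptide masses against those predicted from each DNA reading frame is what allows frame shifts to be ruled out, as described in the abstract.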

Relevance: 20.00%

Abstract:

The purpose of this work was to develop a comprehensive IMSRT QA procedure that examined, using EPID dosimetry and Monte Carlo (MC) calculations, each step in the treatment planning and delivery process. These steps included verification of the field shaping, of the treatment planning system (RTPS) dose calculations, and of patient dose delivery. Verification of each step in the treatment process is assumed to result in correct dose delivery to the patient.

The accelerator MC model was verified against commissioning data for field sizes from 0.8 × 0.8 cm² to 10 × 10 cm². Depth doses and lateral profiles were within 2% local percent difference (LPD) in low-gradient regions and 1 mm distance to agreement (DTA) in high-gradient regions. Calculated output factors were within 1% of measurement for field sizes ≥1 × 1 cm².

The measured and calculated pretreatment EPID dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central-axis pixel value, with ≥95% of compared points required to pass for successful verification. Pretreatment field verification resulted in 97% of the points passing.

The RTPS and Monte Carlo phantom dose calculations were compared using 5% LPD, 2 mm DTA, or 2% of the maximum dose, with ≥95% of compared points required to pass. RTPS calculation verification resulted in 97% of the points passing.

The measured and calculated EPID exit dose patterns were compared using criteria of 5% LPD, 1 mm DTA, or 2% of the central-axis pixel value, with ≥95% of compared points required to pass. Exit dose verification resulted in 97% of the points passing.

Each of these processes verified an individual step in the treatment planning and delivery process; their combination ensures accurate treatment delivery to the patient. This work shows that Monte Carlo calculations and EPID dosimetry can be used to quantitatively verify IMSRT treatments, resulting in improved patient care and, potentially, improved clinical outcomes.
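As an illustration of the pass/fail logic described above, a point can be scored as passing if it meets either the local-percent-difference criterion or the distance-to-agreement criterion. The simplified 1-D sketch below uses invented tolerances and profiles and is not the dissertation's implementation.

```python
import numpy as np

def dta(value, xi, profile, x):
    """Distance from position xi to the nearest point where the linearly
    interpolated profile equals the given dose value."""
    d = np.asarray(profile, float) - value
    best = np.inf
    for j in range(len(x) - 1):
        if d[j] == 0.0:
            best = min(best, abs(x[j] - xi))
        elif d[j] * d[j + 1] < 0.0:        # sign change: crossing in between
            xc = x[j] + (x[j + 1] - x[j]) * d[j] / (d[j] - d[j + 1])
            best = min(best, abs(xc - xi))
    if d[-1] == 0.0:
        best = min(best, abs(x[-1] - xi))
    return best

def composite_pass_rate(measured, calculated, x, pct_tol=0.05, dta_tol=1.0):
    """Fraction of points passing either the local-percent-difference
    criterion or the distance-to-agreement criterion."""
    passed = 0
    for i in range(len(x)):
        lpd_ok = (calculated[i] != 0.0 and
                  abs(measured[i] - calculated[i]) / abs(calculated[i]) <= pct_tol)
        if lpd_ok or dta(measured[i], x[i], calculated, x) <= dta_tol:
            passed += 1
    return passed / len(x)

x = np.arange(21.0)              # positions in mm
calc = x.copy()                  # a linear dose ramp
shifted = x + 0.3                # measurement shifted by 0.3 mm
rate_shifted = composite_pass_rate(shifted, calc, x)
bad = x.copy()
bad[10] = 12.0                   # one grossly wrong point in the ramp
rate_bad = composite_pass_rate(bad, calc, x)
```

The composite criterion is forgiving of small spatial shifts in high-gradient regions (the DTA branch) while still catching a point that is wrong in dose without a nearby agreeing position.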

Relevance: 20.00%

Abstract:

The use of intensity modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verification, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, the interval between exposure and processing, and phantom material. The precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2 mm accuracy for single-beam fluence map verifications and to 5%/2 mm for transverse-plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam-modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data, and the dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and the percentage of pixels failing the gamma index were exponentially distributed and depended on the measurement phantom but not the treatment site. Delivery errors affecting all beams in a treatment plan were flagged by the NAT index, although errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
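The gamma index mentioned above combines dose difference and distance to agreement into a single metric (Low et al., 1998); a point passes when gamma ≤ 1. A simplified 1-D global-gamma sketch with synthetic profiles (tolerances 3%/3 mm are conventional defaults, not values from this research):

```python
import numpy as np

def gamma_1d(measured, calculated, x, dose_tol=0.03, dist_tol=3.0):
    """1-D global gamma index: for each measured point, the minimum over
    calculated points of sqrt((dose diff / dose tol)^2 + (dist / dist tol)^2),
    with the dose difference normalized to the calculated maximum."""
    measured = np.asarray(measured, float)
    calculated = np.asarray(calculated, float)
    x = np.asarray(x, float)
    d_norm = dose_tol * calculated.max()
    g = np.empty_like(measured)
    for i in range(len(measured)):
        dd = (calculated - measured[i]) / d_norm
        dx = (x - x[i]) / dist_tol
        g[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return g

x = np.linspace(0.0, 60.0, 121)            # 0.5 mm sampling
calc = np.exp(-((x - 30.0) / 10.0) ** 2)   # synthetic dose profile
meas = np.exp(-((x - 30.5) / 10.0) ** 2)   # same profile shifted 0.5 mm
g = gamma_1d(meas, calc, x)
pass_rate = float(np.mean(g <= 1.0))
```

A small spatial shift that would fail a pure dose-difference test passes gamma easily, which is why pass-rate statistics on gamma (as used for the clinical verifications above) tolerate benign setup shifts while flagging genuine dose errors.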

Relevance: 20.00%

Abstract:

The clinical advantage of protons over conventional high-energy x-rays stems from their unique depth-dose distribution, which delivers essentially no dose beyond the end of range. To realize this advantage, accurate localization of the tumor volume relative to the proton beam is necessary. For tumors that move with respiration, the resulting dose distribution is sensitive to that motion, and one way to reduce the associated uncertainty is gated beam delivery. The main goal of this dissertation was to evaluate the respiratory gating technique in both passive scattering and scanning delivery modes. Our hypothesis was that optimizing the parameters of synchrotron operation and respiratory gating can improve both the efficiency and the accuracy of respiratory gating for all modes of synchrotron-based proton treatment delivery. The hypothesis was tested in two specific aims. Specific aim 1 was to assess the efficiency of respiratory-gated proton beam delivery and to optimize synchrotron operation for gated proton therapy. A simulation study introduced an efficient synchrotron operation pattern, called variable Tcyc, and also estimated the efficiency of the respiratory-gated scanning beam delivery mode. Specific aim 2 was to assess the accuracy of beam delivery in respiratory-gated proton therapy. The simulation study was extended to the passive scattering mode to estimate the quality of pulsed beam delivery to the residual motion for several synchrotron operation patterns used with gating. The results showed that variable Tcyc operation offers reproducible beam delivery to the residual motion at a given phase of the motion. For respiratory-gated scanning delivery, the impact of motion on the dose distributions delivered by scanned beams was investigated by measurement. The results established motion thresholds for a variety of scan patterns and the appropriate number of paintings for normal and respiratory-gated beam deliveries. Together, the results of specific aims 1 and 2 provide supporting data for implementing respiratory-gated beam delivery in both passive and scanning modes and validate the hypothesis.
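The efficiency question in specific aim 1 can be illustrated with a toy duty-cycle model in which beam is deliverable only when the synchrotron flat-top and the respiratory gate overlap. All parameters below are hypothetical; a fixed cycle like this is exactly what the dissertation's variable-Tcyc pattern improves on by adapting the cycle to the gate.

```python
import numpy as np

def gated_duty_cycle(t_cycle, t_flat, resp_period, gate_width,
                     t_end=600.0, dt=0.001):
    """Fraction of time beam can be delivered when extraction is possible
    only during the synchrotron flat-top AND the respiratory gate is open.
    Toy model: the flat-top occupies the last t_flat seconds of each fixed
    t_cycle; the gate opens for gate_width seconds around exhale (phase 0)."""
    t = np.arange(0.0, t_end, dt)
    flat_top = (t % t_cycle) >= (t_cycle - t_flat)
    phase = t % resp_period
    gate = np.minimum(phase, resp_period - phase) <= gate_width / 2.0
    return float(np.mean(flat_top & gate))

# fixed 4 s synchrotron cycle vs. a 3.7 s breathing period with a 1.2 s gate
eff = gated_duty_cycle(t_cycle=4.0, t_flat=2.0, resp_period=3.7, gate_width=1.2)
```

With an unsynchronized fixed cycle the usable fraction is roughly the product of the flat-top fraction and the gate fraction (here about 16%), which is why synchronizing extraction to the gate can substantially raise delivery efficiency.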

Relevance: 20.00%

Abstract:

Proton therapy is growing increasingly popular due to its superior dose characteristics compared to conventional photon therapy. Protons travel a finite range in the patient body and then stop, delivering no dose beyond their range. However, because the range of a proton beam depends heavily on the tissue density along its path, uncertainties in patient setup position and in the range calculation itself can degrade the dose distribution significantly. Despite these challenges, which are unique to proton therapy, current management of uncertainties during proton treatment planning has been similar to that of conventional photon therapy. The goal of this dissertation research was to develop a treatment planning method and a plan evaluation method that address proton-specific setup and range uncertainties.

Treatment plan design method adapted to proton therapy: currently, for proton therapy using a scanning beam delivery system, setup uncertainties are largely accounted for by geometrically expanding a clinical target volume (CTV) to a planning target volume (PTV). However, a PTV alone cannot adequately account for range uncertainties coupled to misaligned patient anatomy in the beam path, since it does not account for the change in tissue density. To remedy this problem, we proposed a beam-specific PTV (bsPTV) that accounts for the change in tissue density along the beam path due to these uncertainties. The proposed method was successfully implemented, and its superiority over the conventional PTV was shown in a controlled experiment. Furthermore, we showed that the bsPTV concept can be incorporated into beam angle optimization for better target coverage and normal tissue sparing for a selected lung cancer patient.

Treatment plan evaluation method adapted to proton therapy: the dose-volume histogram of the clinical target volume (CTV), or of any other volume of interest, at the time of planning does not represent the most probable dosimetric outcome of a given plan, as it does not include the uncertainties mentioned above. Currently, the PTV is used as a surrogate for the CTV's worst-case scenario for target dose estimation; however, because proton dose distributions change under these uncertainties, the validity of the PTV analysis method is questionable. To remedy this problem, we proposed the use of statistical parameters to quantify uncertainties directly on both the dose-volume histogram and the dose distribution. The robust plan analysis tool was successfully implemented to compute both the expectation value and the standard deviation of dosimetric parameters of a treatment plan under the uncertainties. For 15 lung cancer patients, the proposed method was used to quantify the dosimetric difference between the nominal situation and its expected value under the uncertainties.
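The proposed statistical summary (the expectation and standard deviation of a dosimetric parameter over uncertainty scenarios) can be sketched as follows; the scenario values below are invented, not from the 15-patient study.

```python
import numpy as np

def robust_plan_stats(scenario_values, weights=None):
    """Expectation and standard deviation, over uncertainty scenarios, of a
    per-scenario dosimetric parameter (e.g. target D95 or a mean dose)."""
    v = np.asarray(scenario_values, float)
    if weights is None:
        w = np.full(v.size, 1.0 / v.size)      # equally likely scenarios
    else:
        w = np.asarray(weights, float)
        w = w / w.sum()
    mean = float(np.sum(w * v))
    std = float(np.sqrt(np.sum(w * (v - mean) ** 2)))
    return mean, std

# hypothetical: target D95 (Gy) recomputed under 9 setup/range scenarios
d95 = [59.8, 59.5, 58.9, 60.1, 59.2, 57.4, 59.9, 58.6, 59.0]
mean_d95, sd_d95 = robust_plan_stats(d95)
```

Reporting the expected D95 with its scenario standard deviation, rather than a single PTV-based surrogate, conveys both the most probable outcome and its sensitivity to the uncertainties.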

Relevance: 20.00%

Abstract:

The hierarchical linear growth model (HLGM), a flexible and powerful analytic method, has played an increasingly important role in psychology, public health, and the medical sciences in recent decades. Researchers who use HLGM are mostly interested in the treatment effect on individual trajectories, which is indicated by the cross-level interaction effects. However, the statistical hypothesis test for a cross-level interaction in HLGM only shows whether there is a significant group difference in the average rate of change, rate of acceleration, or a higher polynomial effect; it conveys no information about the magnitude of the difference between the group trajectories at a specific time point. Reporting and interpreting effect sizes has therefore received increasing emphasis in HLGM in recent years, owing to the limitations of, and growing criticism of, statistical hypothesis testing. Nevertheless, most researchers fail to report these model-implied effect sizes for comparing group trajectories, along with their confidence intervals, because appropriate standard functions for estimating effect sizes associated with the model-implied difference between group trajectories are lacking in HLGM, as are routines in popular statistical software to calculate them automatically.

The present project is the first to establish appropriate computing functions to assess the standardized difference between group trajectories in HLGM. We proposed two functions to estimate effect sizes for the model-based difference between group trajectories at a specific time, and we also proposed robust effect sizes to reduce the bias of the estimates. We then applied the proposed functions to estimate the population effect sizes (d) and robust effect sizes (du) for the cross-level interaction in HLGM using three simulated datasets, compared three methods of constructing confidence intervals around d and du, and recommended the best one for application. Finally, we constructed 95% confidence intervals with the suitable method for the effect sizes obtained from the three simulated datasets.

The effect sizes between group trajectories for the three simulated longitudinal datasets indicated that, even when the statistical hypothesis test shows no significant difference between group trajectories, the effect sizes can still be large at some time points. Effect sizes between group trajectories in HLGM analysis therefore provide additional, meaningful information for assessing the group effect on individual trajectories. We also compared three methods of constructing 95% confidence intervals around the corresponding effect sizes, which address the uncertainty of the effect size estimates relative to the population parameter. We suggest the noncentral t-distribution-based method when its assumptions hold, and the bootstrap bias-corrected and accelerated (BCa) method when they do not.
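In the usual HLGM notation (not spelled out in the abstract), the model-implied group difference at time t for a linear growth model is gamma01 + gamma11·t, the group effect on the intercept plus the cross-level interaction times t, and a standardized effect size divides this by an outcome standard deviation. A sketch with hypothetical fixed effects:

```python
import numpy as np

def trajectory_effect_size(gamma01, gamma11, t, sd_outcome):
    """Standardized model-implied group difference at time t for a linear
    hierarchical growth model: d(t) = (gamma01 + gamma11 * t) / SD, where
    gamma01 is the group effect on the intercept and gamma11 the
    cross-level interaction (group effect on the slope)."""
    return (gamma01 + gamma11 * np.asarray(t, float)) / sd_outcome

# hypothetical fixed effects: equal baselines, slope difference 0.8 per wave,
# outcome standard deviation 4.0
waves = np.arange(5)
d_t = trajectory_effect_size(0.0, 0.8, waves, 4.0)
```

This makes the abstract's point concrete: a slope difference that a single omnibus test might miss still produces a standardized difference that grows across waves, reaching 0.8 SD by wave 4 in this example.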

Relevance: 20.00%

Abstract:

Arctic permafrost landscapes are among the most vulnerable and dynamic landscapes globally, but due to their extent and remoteness most landscape changes go unnoticed. To detect disturbances in these areas we developed an automated processing chain for the calculation and analysis of robust trends of key land surface indicators based on the full record of available Landsat TM, ETM+, and OLI data. The methodology was applied to the ~29,000 km² Lena Delta in Northeast Siberia, where robust trend parameters (slope, confidence intervals of the slope, and intercept) were calculated for Tasseled Cap Greenness, Wetness, and Brightness, as well as NDVI, NDWI, and NDMI, based on 204 Landsat scenes for the observation period 1999-2014. The resulting datasets revealed regional greening trends within the Lena Delta with several localized hot-spots of change, particularly in the vicinity of the main river channels. At 30 m spatial resolution, various permafrost-thaw-related processes and disturbances, such as thermokarst lake expansion and drainage, fluvial erosion, and coastal changes, were detected within the Lena Delta region, many of which had not been noticed or described before. Such hotspots of permafrost change exhibit significantly different trend parameters compared to undisturbed areas. The processed dataset, made freely available through the data archive PANGAEA, will be a useful resource for further process-specific analysis by researchers and land managers. With the high level of automation and the use of the freely available Landsat archive, the workflow is scalable and transferable to other regions, which should enable comparison of land surface changes in different permafrost-affected regions and help to understand and quantify permafrost landscape dynamics.
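The abstract does not name the trend estimator; Theil-Sen regression is a standard choice for robust per-pixel Landsat trends and serves here as an illustrative sketch with a synthetic NDVI series, not the paper's actual processing chain.

```python
import numpy as np

def theil_sen(t, y):
    """Theil-Sen robust trend: slope = median of all pairwise slopes,
    intercept = median of y - slope * t. Resistant to outliers such as
    cloud- or snow-contaminated observations."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    slopes = [(y[j] - y[i]) / (t[j] - t[i])
              for i in range(len(t)) for j in range(i + 1, len(t))]
    slope = float(np.median(slopes))
    intercept = float(np.median(y - slope * t))
    return slope, intercept

# synthetic annual NDVI series: a greening trend plus two contaminated scenes
years = np.arange(1999, 2015)
ndvi = 0.30 + 0.005 * (years - 1999)
ndvi[3] = 0.05   # cloud shadow
ndvi[9] = 0.02   # snow
slope, intercept = theil_sen(years, ndvi)
```

Because the slope is a median over all pairwise slopes, the two contaminated observations leave the recovered greening rate essentially untouched, which is exactly the robustness property a fully automated archive-wide workflow needs.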

Relevance: 20.00%

Abstract:

In this paper, scales of Raven's Progressive Matrices Test (General Scale and Advanced Scale, Series II) for the student population (third cycle of EGB and Polimodal) in the city of La Plata are presented. Considerations are made regarding both the increase in scores (Flynn effect) observed relative to the previous (1964) scale and the different mean scores according to two age groups (13-16 and 17-18 years of age) and education mode. The findings enabled inferences about the significance of the increase, particularly the higher scores in the population attending a special kind of educational institution.
