67 results for noisy speaker verification


Relevance:

20.00%

Publisher:

Abstract:

This paper considers the separation and recognition of overlapped speech sentences assuming single-channel observation. A system based on a combination of several different techniques is proposed. The system uses a missing-feature approach for improving crosstalk/noise robustness, a Wiener filter for speech enhancement, hidden Markov models for speech reconstruction, and speaker-dependent/-independent modeling for speaker and speech recognition. We develop the system on the Speech Separation Challenge database, which involves separating and recognizing two mixed sentences without assuming prior knowledge of the speakers' identities or of the signal-to-noise ratio. The paper is an extended version of a previous conference paper submitted for the challenge.
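
The abstract above mentions a Wiener filter for speech enhancement. As a minimal illustrative sketch only (not the authors' system), a frequency-domain Wiener gain can be applied frame by frame given an estimate of the noise power spectrum; the frame length, hop size and the simple spectral-subtraction estimate of the speech power are assumptions.

```python
import numpy as np

def wiener_enhance(noisy, noise_psd, frame_len=512, hop=256):
    """Apply a per-frame Wiener gain H = SNR/(1+SNR) in the STFT domain.

    noisy     : 1-D array of noisy speech samples
    noise_psd : estimated noise power spectrum (frame_len//2 + 1 bins)
    """
    window = np.hanning(frame_len)
    out = np.zeros(len(noisy))
    for start in range(0, len(noisy) - frame_len, hop):
        frame = noisy[start:start + frame_len] * window
        spec = np.fft.rfft(frame)
        # crude speech-power estimate by spectral subtraction (assumption)
        speech_psd = np.maximum(np.abs(spec) ** 2 - noise_psd, 1e-10)
        gain = speech_psd / (speech_psd + noise_psd)   # Wiener gain
        out[start:start + frame_len] += np.fft.irfft(gain * spec) * window
    return out
```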

Relevance:

20.00%

Publisher:

Abstract:

Elucidation of the transcriptome and proteome of the normal retina will be difficult since it is composed of at least 55 different cell types. However, the characteristic layered cellular anatomy of the retina makes it amenable to planar sectioning, enabling the generation of enriched retinal cell populations. The aim of this study was to validate a reproducible method for preparing enriched retinal layers from porcine retina.

Relevance:

20.00%

Publisher:

Abstract:

This study aims to evaluate the use of Varian radiotherapy dynamic treatment log (DynaLog) files to verify IMRT plan delivery as part of a routine quality assurance procedure. Delivery accuracy in terms of machine performance was quantified by multileaf collimator (MLC) position errors and fluence delivery accuracy for patients receiving intensity modulated radiation therapy (IMRT) treatment. The relationship between machine performance and plan complexity, quantified by the modulation complexity score (MCS), was also investigated. Actual MLC positions and the delivered fraction of monitor units (MU), recorded every 50 ms during IMRT delivery, were extracted from the DynaLog files. The planned MLC positions and fractional MU were taken from the record-and-verify system MLC control file. Planned and delivered beam data were compared to determine leaf position errors with and without the overshoot effect. Analysis was also performed on planned and actual fluence maps reconstructed from the MLC control file and the delivered treatment log files, respectively. This analysis was performed for all treatment fractions of 5 prostate, 5 prostate and pelvic node (PPN) and 5 head and neck (H&N) IMRT plans, totalling 82 IMRT fields in ∼5500 DynaLog files. The root mean square (RMS) leaf position errors without the overshoot effect were 0.09, 0.26 and 0.19 mm for the prostate, PPN and H&N plans respectively, increasing to 0.30, 0.39 and 0.30 mm when the overshoot effect was considered. Average errors were not affected by the overshoot effect and were 0.05, 0.13 and 0.17 mm for the prostate, PPN and H&N plans respectively. The percentage of pixels passing fluence map gamma analysis at 3%/3 mm was 99.94 ± 0.25%, which reduced to 91.62 ± 11.39% at the 1%/1 mm criterion. Leaf position errors, but not gamma passing rate, were directly related to plan complexity as determined by the MCS. Site-specific confidence intervals for average leaf position errors were set at -0.03 to 0.12 mm for prostate plans and -0.02 to 0.28 mm for the more complex PPN and H&N plans. For all treatment sites, the confidence interval for RMS errors with the overshoot effect was set at 0-0.50 mm, and a confidence interval of 68.83% was set for the percentage of pixels passing gamma analysis at 1%/1 mm. This work demonstrates the successful implementation of treatment log files to validate IMRT deliveries and shows how dynamic log files can diagnose delivery errors that phantom-based QC cannot detect. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.
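
For context on the leaf-position statistics reported above, the sketch below shows how mean and RMS MLC position errors could be computed from paired planned and delivered leaf positions; the array layout is assumed and this is not the DynaLog parsing code used in the study.

```python
import numpy as np

def mlc_error_stats(planned, actual):
    """Return (mean, RMS) leaf position error in mm.

    planned, actual : arrays of shape (n_samples, n_leaves) holding MLC
    leaf positions recorded at each 50 ms sample point (assumed layout).
    """
    errors = np.asarray(actual, dtype=float) - np.asarray(planned, dtype=float)
    mean_error = errors.mean()
    rms_error = np.sqrt((errors ** 2).mean())
    return mean_error, rms_error
```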

Relevance:

20.00%

Publisher:

Abstract:

In this article, the multibody simulation software package MADYMO for analysing and optimising occupant safety design was used to model crash tests for Normal Containment barriers in accordance with EN 1317. The verification process was carried out by simulating a TB31 and a TB32 crash test performed on vertical portable concrete barriers and by comparing the numerical results to those obtained experimentally. The same modelling approach was applied to both tests to evaluate the predictive capacity of the modelling at two different impact speeds. A sensitivity analysis of the vehicle stiffness was also carried out. The capacity to predict all of the principal EN 1317 criteria was assessed for the first time: the acceleration severity index, the theoretical head impact velocity, the barrier working width and the vehicle exit box. Results showed a maximum error of 6% for the acceleration severity index and 21% for the theoretical head impact velocity in the numerical simulations compared with the recorded data. The exit box position was predicted with a maximum error of 4°. For the working width, a large percentage difference was observed for test TB31 owing to the small absolute value of the barrier deflection, but the results were well within the limit value of the standard for both tests. The sensitivity analysis showed the robustness of the modelling with respect to contact stiffness variations of ±20% and ±40%. This is the first multibody model of portable concrete barriers that can reproduce not only the acceleration severity index but all the test criteria of EN 1317, and it is therefore a valuable tool for new product development and for injury biomechanics research.
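
The acceleration severity index referred to above is defined in EN 1317 from 50 ms moving averages of the vehicle centre-of-gravity accelerations, normalised by limit values of 12 g, 9 g and 10 g for the x, y and z axes. A minimal sketch of that calculation is given below; the sampling rate and array names are assumptions, not details from the paper.

```python
import numpy as np

def asi(ax, ay, az, fs=1000.0):
    """Acceleration Severity Index (EN 1317 style).

    ax, ay, az : vehicle CoG acceleration components in units of g
    fs         : sampling frequency in Hz (assumed)
    """
    window = max(1, int(round(0.05 * fs)))          # 50 ms moving average
    kernel = np.ones(window) / window
    ax_bar = np.convolve(ax, kernel, mode="valid")
    ay_bar = np.convolve(ay, kernel, mode="valid")
    az_bar = np.convolve(az, kernel, mode="valid")
    asi_t = np.sqrt((ax_bar / 12.0) ** 2
                    + (ay_bar / 9.0) ** 2
                    + (az_bar / 10.0) ** 2)
    return asi_t.max()                               # ASI is the peak value
```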

Relevance:

20.00%

Publisher:

Abstract:

Objective: The aim of this study was to investigate the effect of pre-treatment verification imaging with megavoltage (MV) X-rays on cancer and normal cell survival in vitro and to compare the findings with theoretically modelled data. Since the dose received from pre-treatment imaging can be significant, incorporation of this dose at the planning stage of treatment has been suggested.

Methods: The impact of imaging dose incorporation on cell survival was investigated by clonogenic assay, irradiating DU-145 prostate cancer, H460 non-small cell lung cancer and AGO-1522b normal tissue fibroblast cells. Clinically relevant imaging-to-treatment times of 7.5 minutes and 15 minutes were chosen for this study. The theoretical magnitude of the loss of radiobiological efficacy due to sublethal damage repair was investigated using the Lea-Catcheside dose protraction factor model.

Results: For the cell lines investigated, the experimental data showed that imaging dose incorporation had no significant impact upon cell survival. These findings were in close agreement with the theoretical results.

Conclusions: For the conditions investigated, the results suggest that allowance for the imaging dose at the planning stage of treatment should not adversely affect treatment efficacy.

Advances in Knowledge: There is a paucity of data in the literature on imaging effects in radiotherapy. This paper presents a systematic study of imaging dose effects on cancer and normal cell survival, providing both theoretical and experimental evidence for clinically relevant imaging doses and imaging-to-treatment times. The data provide a firm foundation for further study into this highly relevant area of research.
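
The Methods above refer to the Lea-Catcheside dose protraction factor. For reference, the standard form of this factor is sketched below; the symbols d_i (imaging dose), d_t (treatment dose), T (imaging-to-treatment interval) and mu (sublethal-damage repair rate) are illustrative and not taken from the paper.

```latex
% General form of the Lea-Catcheside dose protraction factor for a total
% dose D delivered at rate \dot{D}(t), with sublethal-damage repair rate \mu:
G = \frac{2}{D^{2}} \int_{-\infty}^{\infty} \dot{D}(t)
    \left[ \int_{-\infty}^{t} \dot{D}(t')\, e^{-\mu (t - t')}\, dt' \right] dt
% For an acute imaging dose d_i delivered a time T before an acute
% treatment dose d_t, this reduces to
G = \frac{d_i^{2} + d_t^{2} + 2\, d_i\, d_t\, e^{-\mu T}}{(d_i + d_t)^{2}}
```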

Relevance:

20.00%

Publisher:

Abstract:

This paper reports image analysis methods that have been developed to study the microstructural changes of non-wovens made by the hydroentanglement process. The validity of the image processing techniques has been ascertained by applying them to test images with known properties. The parameters used in preprocessing the scanning electron microscope (SEM) images for image processing have been tested and optimized. The fibre orientation distribution is estimated using fast Fourier transform (FFT) and Hough transform (HT) methods. The results obtained using these two methods are in good agreement. The HT method is more demanding in computational time than the FFT method. However, the advantage of the HT method is that the actual orientation of the lines can be obtained directly from the result of the transform without the need for any further computation. The distribution of the length of the straight fibre segments of the fabrics is evaluated by the HT method, and the effect of fibre curl on the result of this evaluation is shown.
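
As an illustration of the FFT route described above, the sketch below estimates a fibre orientation distribution from the 2-D power spectrum of an image, using the fact that spectral energy at a given angle corresponds to image structures oriented 90° away; it is a generic example, not the authors' implementation, and the binning choices are assumptions.

```python
import numpy as np

def orientation_distribution(image, n_bins=180):
    """Estimate a fibre orientation distribution from the 2-D power spectrum."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = spectrum.shape
    spectrum[h // 2, w // 2] = 0.0                 # drop the DC component
    yy, xx = np.mgrid[0:h, 0:w]
    angles = np.degrees(np.arctan2(yy - h / 2, xx - w / 2)) % 180.0
    hist, _ = np.histogram(angles, bins=n_bins, range=(0, 180), weights=spectrum)
    hist = np.roll(hist, n_bins // 2)              # spectrum angle -> fibre angle (90 deg shift)
    return hist / hist.sum()
```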

Relevance:

20.00%

Publisher:

Abstract:

We address the distribution of quantum information among many parties in the presence of noise. In particular, we consider how to optimally send the information encoded in an unknown coherent state to m receivers. On one hand, a local strategy is considered, consisting of a local cloning process followed by direct transmission. On the other hand, a telecloning protocol based on nonlocal quantum correlations is analysed. Both strategies are optimized to minimize the detrimental effects due to losses and thermal noise during propagation. The comparison between the local and the nonlocal protocol shows that telecloning is more effective than local cloning for a wide range of noise parameters. Our results indicate that nonlocal strategies can be more robust against noise than local ones, and are thus suitable candidates for playing a major role in quantum information networks.
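
For background only (a standard benchmark, not a number quoted in this abstract), the optimal symmetric Gaussian 1 -> m cloning of an unknown coherent state is limited to the following fidelity, against which both local and nonlocal strategies are usually compared.

```latex
% Standard benchmark fidelity for optimal symmetric Gaussian 1 -> m cloning
% of an unknown coherent state (classical measure-and-prepare: F = 1/2):
F_{\mathrm{clon}}(m) = \frac{m}{2m - 1}, \qquad F_{\mathrm{clon}}(2) = \tfrac{2}{3}
```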

Relevance:

20.00%

Publisher:

Abstract:

We address the generation, propagation, and application of multipartite continuous-variable entanglement in a noisy environment. In particular, we focus our attention on the multimode entangled states achievable with second-order nonlinear crystals, i.e., coherent states of the SU(m,1) group, which provide a generalization of the twin-beam state of a bipartite system. Full inseparability in the ideal case is shown, whereas thresholds for separability are given for the tripartite case in the presence of noise. We find that the entanglement of tripartite states is robust against thermal noise, both in the generation process and during propagation. We then consider coherent states of SU(m,1) as a resource for the multipartite distribution of quantum information and analyse a specific protocol for telecloning, proving its optimality in the case of symmetric cloning of pure Gaussian states. We show that the proposed protocol also provides the first example of completely asymmetric 1 -> m telecloning, and we derive explicitly the optimal relation among the different fidelities of the m clones. The effect of noise in the various stages of the protocol is taken into account, and the fidelities of the clones are obtained analytically as functions of the noise parameters. In turn, this permits the optimization of the telecloning protocol, including its adaptive modification to the noisy environment. In the optimized scheme the clones' fidelity remains maximal even in the presence of losses (in the absence of thermal noise), for propagation times that diverge as the number of modes increases. In the optimization procedure, the prominent role played by the location of the entanglement source is analysed in detail. Our results indicate that, when only losses are present, telecloning is a more effective way to distribute quantum information than direct transmission followed by local cloning.
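
For reference, the bipartite special case mentioned above is the familiar twin-beam (two-mode squeezed vacuum) state; its standard number-basis form is recalled below, with lambda = tanh r an illustrative parametrisation not taken from the abstract.

```latex
% Twin-beam (two-mode squeezed vacuum) state, the bipartite special case,
% with \lambda = \tanh r:
|\mathrm{TWB}\rangle = \sqrt{1 - |\lambda|^{2}}
  \sum_{n=0}^{\infty} \lambda^{n}\, |n\rangle_{1} |n\rangle_{2}
```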

Relevance:

20.00%

Publisher:

Abstract:

An intralaminar damage model (IDM), based on continuum damage mechanics, was developed for the simulation of composite structures subjected to damaging loads. The model can capture the complex intralaminar damage mechanisms, accounting for mode interactions, as well as delaminations. Its development is driven by the need for reliable crush simulations to design composite structures with a high specific energy absorption. The IDM was implemented as a user subroutine within the commercial finite element package Abaqus/Explicit [1]. In this paper, the validation of the IDM is presented using two test cases. Firstly, the IDM is benchmarked against published data for a blunt-notched specimen under uniaxial tensile loading, comparing the failure strength and the predicted damage. Secondly, the crush response of a set of tulip-triggered composite cylinders was obtained experimentally, and the crush loading and associated energy of the specimens are compared with the FE model predictions. These test cases show that the developed IDM is able to capture the structural response with satisfactory accuracy.
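
As a generic illustration of the continuum-damage framework named above (not the authors' intralaminar model or their Abaqus user subroutine), the sketch below applies a one-dimensional damage law with linear softening; all material values are placeholders.

```python
import numpy as np

def damage_update(strain, d_prev, E=60e9, eps0=0.015, eps_f=0.03):
    """Generic 1-D continuum damage update with linear softening.

    strain : current (tensile) strain
    d_prev : damage variable from the previous increment (irreversible)
    E      : undamaged modulus [Pa]; eps0/eps_f are onset/failure strains
             (placeholder values, not the paper's material data)
    """
    if strain <= eps0:
        d_new = 0.0
    else:
        d_new = eps_f * (strain - eps0) / (strain * (eps_f - eps0))
    d = float(np.clip(max(d_prev, d_new), 0.0, 1.0))   # damage never heals
    stress = (1.0 - d) * E * strain                     # degraded stiffness
    return stress, d
```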

Relevance:

20.00%

Publisher:

Abstract:

Background

In modern radiotherapy, it is crucial to monitor the performance of all linac components, including the gantry, collimation system and electronic portal imaging device (EPID), during arc deliveries. In this study, a simple EPID-based measurement method has been introduced, in conjunction with an algorithm, to investigate the stability of these systems during arc treatments with the aim of ensuring the accuracy of linac mechanical performance.


Methods

The Varian EPID sag, gantry sag, changes in source-to-detector distance (SDD), EPID and collimator skewness, EPID tilt, and the sag in MLC carriages resulting from linac rotation were investigated separately by acquiring EPID images of a simple phantom comprising five ball bearings during arc delivery. A fast and robust software package was developed for automated analysis of the image data. Twelve Varian linacs of different models were investigated.


Results

The average EPID sag was within 1 mm for all tested linacs. All machines showed less than 1 mm gantry sag. Changes in SDD values were within 1.7 mm, except for three linacs at one centre, which were within 9 mm. Values of EPID skewness and tilt were negligible in all tested linacs. The maximum sag in MLC leaf bank assemblies was around 1 mm. The EPID sag was considerably improved in TrueBeam linacs.


Conclusion

The methodology and software developed in this study provide a simple tool for effective investigation of the behaviour of linac components with gantry rotation. It is reproducible and accurate and can be easily performed as a routine test in clinics.
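
As an illustration of the kind of analysis described above (not the software developed in the study), the sketch below quantifies the sag of a component from the centroid positions of a reference ball bearing tracked across the EPID images of an arc; the data layout is assumed.

```python
import numpy as np

def component_sag(centroids_mm):
    """Quantify sag from marker centroids tracked over a gantry arc.

    centroids_mm : array of shape (n_images, 2) with the (x, y) position
    in mm of a reference ball bearing on each EPID image (assumed layout).
    Returns the maximum displacement from the mean position.
    """
    c = np.asarray(centroids_mm, dtype=float)
    offsets = c - c.mean(axis=0)                    # deviation per image
    return np.linalg.norm(offsets, axis=1).max()    # worst-case sag in mm
```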




Relevance:

20.00%

Publisher:

Abstract:

While current speech recognisers give acceptable performance in carefully controlled environments, their performance degrades rapidly when they are applied in more realistic situations. Generally, environmental noise may be classified into two classes: wide-band noise and narrow-band noise. While the multi-band model has been shown to be capable of dealing with speech corrupted by narrow-band noise, it is ineffective against wide-band noise. In this paper, we suggest combining the frequency-filtering technique with the probabilistic union model in the multi-band approach. The new system has been tested on the TIDIGITS database corrupted by white noise, noise collected from a railway station, and narrow-band noise, respectively. The results show that this approach is capable of dealing with noise of either narrow-band or wide-band characteristics, without assuming knowledge of the noise environment.
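
As a rough sketch of the union-style combination mentioned above (the exact order and normalisation follow the probabilistic union model literature and are not reproduced here), per-band likelihoods can be combined as a sum over subsets of bands assumed reliable, rather than a single product, so that a few corrupted bands do not zero out the overall score.

```python
from itertools import combinations
import numpy as np

def union_style_score(band_likelihoods, n_reliable):
    """Combine per-sub-band likelihoods in a union (OR-like) fashion.

    band_likelihoods : per-sub-band likelihoods p(x_b | model)
    n_reliable       : number of bands assumed uncorrupted (assumption)
    """
    bands = list(band_likelihoods)
    subset_products = [np.prod([bands[i] for i in subset])
                       for subset in combinations(range(len(bands)), n_reliable)]
    return float(np.mean(subset_products))   # average over all band subsets
```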

Relevance:

20.00%

Publisher:

Abstract:

A high-fidelity composite damage model is presented and applied to predict low-velocity impact damage, compression after impact (CAI) strength and crushing of thin-walled composite structures. The simulated results correlated well with experimental testing in terms of overall force-displacement response, damage morphologies and energy dissipation. The predictive power of this model makes it suitable for use as part of a virtual testing methodology, reducing the reliance on physical testing.
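
As a small worked example related to the specific energy absorption targeted by such crush simulations (generic, not taken from the paper), the absorbed energy can be obtained by integrating the measured force-displacement curve and normalising by the crushed mass.

```python
import numpy as np

def specific_energy_absorption(force_N, displacement_m, crushed_mass_kg):
    """Return (absorbed energy in J, specific energy absorption in J/kg).

    Integrates the force-displacement curve with the trapezoidal rule and
    divides by the mass of crushed material; argument names are placeholders.
    """
    energy_J = np.trapz(force_N, displacement_m)
    return energy_J, energy_J / crushed_mass_kg
```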