883 results for Traffic emissions
Abstract:
This paper reviews spontaneous otoacoustic emissions.
Abstract:
This paper discusses the effect of noise exposure on the hearing levels of high-school-aged boys and how those effects can be measured.
Abstract:
This paper studies the relationship between hearing sensitivity and the presence of otoacoustic emissions by examining the variability of same-ear emissions in a group of normal-hearing subjects.
Abstract:
This paper evaluates the routine of one pediatric facility interested in incorporating a hearing screening protocol into its practice and suggests such a protocol using distortion product otoacoustic emission (DPOAE) tests.
Abstract:
This paper examines distortion product otoacoustic emissions (DPOAEs) used to test peripheral auditory function and how noise level in the ear affects the detectability of DPOAEs. The study assesses the clinical feasibility of different averaging times at different frequencies in terms of their effect on the measurement noise floor.
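The trade-off the study examines can be illustrated numerically: averaging N independent response frames leaves a coherent distortion product unchanged while the incoherent noise power drops by a factor of N, lowering the noise floor by roughly 10*log10(N) dB at the cost of longer test time. The sketch below is not from the paper; the sample rate, frame length, DP frequency, and noise level are assumed purely for illustration.

# Minimal sketch (not from the paper): synchronous time-domain averaging of a
# simulated DPOAE-style recording. The coherent tone at the DP frequency is
# unchanged by averaging, while the incoherent noise floor drops by about
# 10*log10(N) dB for N averaged frames.
import numpy as np

fs = 48000          # sample rate in Hz (assumed)
frame_len = 4800    # samples per frame, so each FFT bin is 10 Hz wide
f_dp = 2000.0       # hypothetical distortion-product frequency (Hz)
t = np.arange(frame_len) / fs

def averaged_spectrum(n_frames, rng):
    """Average n_frames of tone-plus-noise and return the magnitude spectrum in dB."""
    frames = []
    for _ in range(n_frames):
        tone = 1e-3 * np.sin(2 * np.pi * f_dp * t)        # coherent DP component
        noise = rng.normal(scale=5e-3, size=frame_len)    # incoherent ear-canal noise
        frames.append(tone + noise)
    avg = np.mean(frames, axis=0)                         # synchronous average
    return 20 * np.log10(np.abs(np.fft.rfft(avg)) + 1e-12)

rng = np.random.default_rng(0)
bin_dp = int(round(f_dp * frame_len / fs))                # FFT bin of the DP tone
for n in (1, 4, 16, 64):
    spec = averaged_spectrum(n, rng)
    floor = np.median(np.delete(spec, bin_dp))            # rough noise-floor estimate
    print(f"{n:3d} frames: DP level {spec[bin_dp]:6.1f} dB, noise floor {floor:6.1f} dB")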
Abstract:
This paper reviews a study on distortion product emissions in normal-hearing chinchillas.
Abstract:
This paper reviews a study to evaluate the audiogram microstructure of a chinchilla with a documented spontaneous otoacoustic emission.
Abstract:
This paper discusses the distortion-product otoacoustic emissions (DPOAEs) of chinchillas exposed to noise.
Abstract:
Recovery of distortion product otoacoustic emissions in the bullfrog after noise exposure does not correlate with hair cell damage noted on the amphibian papilla.
Abstract:
Model-based vision allows prior knowledge of the shape and appearance of specific objects to be used in the interpretation of a visual scene; it provides a powerful and natural way to enforce the view consistency constraint. A model-based vision system has been developed within ESPRIT VIEWS: P2152 which is able to classify and track moving objects (cars and other vehicles) in complex, cluttered traffic scenes. The fundamental basis of the method has been previously reported. This paper presents recent developments which have extended the scope of the system to include (i) multiple cameras, (ii) variable camera geometry, and (iii) articulated objects. All three enhancements have easily been accommodated within the original model-based approach.
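For orientation only, here is a minimal sketch of the core operation such a model-based tracker performs: a parameterised 3D object model is projected into the image for a hypothesised pose under an explicit camera model, and the projection is scored against image evidence. This is not the ESPRIT VIEWS implementation; the box-shaped vehicle model, ground-plane pose parameterisation, and toy edge-count score are assumptions made for illustration. Because the camera matrix is an explicit input, multiple or repositioned cameras fit the same scheme.

# Minimal sketch (not the ESPRIT VIEWS implementation): project a parameterised
# 3D vehicle model into the image for a hypothesised ground-plane pose and score
# the projection against an edge map.
import numpy as np

# Crude box-shaped "vehicle" model: eight corners in metres, object frame (assumed shape).
VEHICLE_CORNERS = np.array([[x, y, z]
                            for x in (-2.0, 2.0)      # half-length
                            for y in (-0.9, 0.9)      # half-width
                            for z in (0.0, 1.4)])     # ground to roof

def pose_matrix(x, y, theta):
    """Ground-plane pose (position x, y and heading theta) as a 4x4 rigid transform."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = [x, y, 0.0]
    return T

def project(points_3d, camera_P):
    """Project Nx3 world points through a 3x4 camera matrix; returns Nx2 pixel coordinates."""
    homog = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    uvw = homog @ camera_P.T
    return uvw[:, :2] / uvw[:, 2:3]   # assumes points lie in front of the camera

def score_pose(pose, camera_P, edge_image):
    """Toy fit score: sum edge strength at the projected model corners."""
    model_h = np.hstack([VEHICLE_CORNERS, np.ones((len(VEHICLE_CORNERS), 1))])
    world = (pose_matrix(*pose) @ model_h.T).T[:, :3]
    px = np.round(project(world, camera_P)).astype(int)
    h, w = edge_image.shape
    inside = (px[:, 0] >= 0) & (px[:, 0] < w) & (px[:, 1] >= 0) & (px[:, 1] < h)
    return edge_image[px[inside, 1], px[inside, 0]].sum()

A tracker built this way would search over candidate poses per frame and keep the best-scoring one, which is why prior knowledge of object shape and camera geometry is central to the approach.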
Abstract:
The paper describes a novel integrated vision system in which two autonomous visual modules are combined to interpret a dynamic scene. The first module employs a 3D model-based scheme to track rigid objects such as vehicles. The second module uses a 2D deformable model to track non-rigid objects such as people. The principal contribution is a novel method for handling occlusion between objects within the context of this hybrid tracking system. The practical aim of the work is to derive a scene description that is sufficiently rich to be used in a range of surveillance tasks. The paper describes each of the modules in outline before detailing the method of integration and the handling of occlusion in particular. Experimental results are presented to illustrate the performance of the system in a dynamic outdoor scene involving cars and people.
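As a rough illustration of how occlusion between independently tracked objects can be handled, the sketch below orders tracks by distance from the camera and masks out image regions already claimed by nearer objects before a farther object's appearance is updated. This is a common depth-ordering strategy, not necessarily the paper's method; the Track structure and its fields are hypothetical.

# Minimal sketch (an illustrative assumption, not the paper's algorithm): order
# tracked objects by distance from the camera and hide pixels already claimed by
# nearer objects when computing each object's visible silhouette.
from dataclasses import dataclass
import numpy as np

@dataclass
class Track:
    label: str            # e.g. "car" (3D rigid module) or "person" (2D deformable module)
    depth: float          # distance from the camera along the optical axis
    mask: np.ndarray      # boolean silhouette mask of the object in the current frame

def visible_masks(tracks, image_shape):
    """Return, per track, the part of its silhouette not occluded by nearer tracks."""
    claimed = np.zeros(image_shape, dtype=bool)
    visible = {}
    for trk in sorted(tracks, key=lambda t: t.depth):     # nearest object first
        visible[trk.label] = trk.mask & ~claimed          # hide pixels already claimed
        claimed |= trk.mask
    return visible

Each tracking module would then update its object's appearance only over the visible portion returned here, which is one way to keep the rigid and deformable trackers consistent when cars and people overlap in the image.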