2 results for side illumination fluorescence

in Deakin Research Online - Australia


Relevance:

30.00%

Publisher:

Abstract:

As in many parrots, the plumage of the budgerigar Melopsittacus undulatus reflects near-ultraviolet (UVA) wavelengths (300-400 nm) and exhibits UVA-induced fluorescence. However, there have, to our knowledge, been no tests of whether the yellow fluorescence observed under intense UVA illumination has any role in signalling. Four experiments were carried out on wild-type budgerigars, where the presence and absence of UV reflectance and fluorescence were manipulated using filters. Few studies have attempted to separate the contribution of UV reflectance to plumage hue as opposed to brightness, or to distinguish between a role in sexual as opposed to social preferences. However, our first experiments show that not only do females consistently prefer UV-reflecting males, but also that the observed preferences are due to removal of UV affecting the perceived hue rather than brightness. Furthermore, we found no effect of the light environment on male response to females, suggesting that the female preferences relate to plumage colour per se. Whilst UV reflectance appears important in heterosexual choice by females, it has no detectable influence on same-sex association preferences. The results from the second series of experiments suggest that enhancement of the budgerigar's yellow coloration through fluorescence has no effect on male attractiveness. However, the fluorescent plumage may play a role in signalling by virtue of the fact that it absorbs UVA and so increases contrast with nearby UV-reflecting plumage. Our study provides convincing evidence that UV reflectance can play a role in mate choice in non-passerines, but no evidence that the yellow fluorescence observed under UVA illumination is itself important as a signal.

Relevance:

30.00%

Publisher:

Abstract:

Our aim in this paper is to robustly match frontal faces in the presence of extreme illumination changes, using only a single training image per person and a single probe image. In the illumination conditions we consider, which include those with the dominant light source placed behind and to the side of the user, directly above and pointing downwards, or indeed below and pointing upwards, this is a most challenging problem. The presence of sharp cast shadows, large poorly illuminated regions of the face, quantum and quantization noise, and other nuisance effects makes it difficult to extract a sufficiently discriminative yet robust representation. We introduce a representation based on image gradient directions near robust edges which correspond to characteristic facial features. Robust edges are extracted using a cascade of processing steps, each of which seeks to harness further discriminative information or normalize for a particular source of extra-personal appearance variability. The proposed representation was evaluated on the extremely difficult YaleB data set. Unlike most of the previous work, we include all available illuminations, perform training using a single image per person, and match these also to a single probe image. In this challenging evaluation setup, the proposed gradient edge map achieved a 0.8% error rate, demonstrating nearly perfect receiver operating characteristic curve behaviour. This is by far the best performance achieved in this setup reported in the literature, the best performing methods previously proposed attaining error rates of approximately 6-7%.
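The core idea of the representation described above (gradient directions retained only near strong edges, since direction is far more stable under illumination change than magnitude) can be illustrated with a minimal sketch. This is not the paper's actual cascade: the percentile-threshold edge mask below is a hypothetical stand-in for the robust-edge extraction steps, which are not specified in the abstract.

```python
import numpy as np

def gradient_direction_map(image, edge_percentile=80.0):
    """Return per-pixel gradient directions kept only at strong-gradient
    ("edge") pixels. The percentile threshold is an illustrative choice,
    not the cascade used in the paper."""
    img = np.asarray(image, dtype=float)
    # Central-difference gradients: axis 0 is rows (y), axis 1 is columns (x).
    gy, gx = np.gradient(img)
    magnitude = np.hypot(gx, gy)
    direction = np.arctan2(gy, gx)  # radians in (-pi, pi]
    # Keep directions only where the gradient magnitude is large;
    # illumination mostly rescales magnitude but preserves direction.
    threshold = np.percentile(magnitude, edge_percentile)
    mask = magnitude >= threshold
    features = np.where(mask, direction, 0.0)
    return features, mask

# Synthetic example: a vertical step edge between dark and bright halves.
face = np.zeros((10, 10))
face[:, 5:] = 1.0
features, mask = gradient_direction_map(face)
```

On this synthetic image the mask fires only on the two columns straddling the step, and the retained directions are 0 radians (gradient pointing along +x), regardless of how brightly the "bright" half is lit.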