266 results for Gender category



Relevance: 20.00%

Abstract:

Two field studies demonstrated that majority and minority size moderate perceived group variability. In Study 1 we found an outgroup homogeneity (OH) effect for female nurses in the majority, but an ingroup homogeneity (IH) effect for a token minority of male nurses. In Study 2 we found similar effects in a different setting: an OH effect for policemen in the majority and an IH effect for policewomen in the minority. Although measures of visibility, status, and, especially, familiarity tended to show the same pattern as perceived variability, there was no evidence that they mediated perceived dispersion. Results are discussed in terms of group size, rather than gender, being the moderator of perceived variability, and with reference to Kanter's (1977a, 1977b) theory of group proportions.

Relevance: 20.00%

Abstract:

We tested the hypothesis that evaluative bias in common ingroup contexts versus crossed categorization contexts can be associated with two distinct underlying processes. We reasoned that in common ingroup contexts, self-categorization, but not perceived complexity, would be positively related to intergroup bias. In contrast, in crossed categorization contexts, perceived complexity, but not self-categorization, would be negatively related to intergroup bias. In two studies, and in line with predictions, we found that while self-categorization and intergroup bias were related in common ingroup contexts, this was not the case in crossed categorization contexts. Moreover, we found that perceived category complexity, and not self-categorization, predicted bias in crossed categorization contexts. We discuss the implications of these findings for models of social categorization and intergroup bias.


Relevance: 20.00%

Abstract:

Automatic gender classification has many security and commercial applications. Various modalities have been investigated for gender classification, with face-based classification being the most popular. In some real-world scenarios the face may be partially occluded; in these circumstances a classification based on individual parts of the face, known as local features, must be adopted. We investigate gender classification using lip movements. We show for the first time that important gender-specific information can be obtained from the way in which a person moves their lips during speech. Furthermore, our study indicates that lip dynamics during speech provide greater gender-discriminative information than lip appearance alone. We also show that lip dynamics and appearance contain complementary gender information, such that a model which captures both traits gives the highest overall classification result. We use Discrete Cosine Transform (DCT)-based features and Gaussian Mixture Modelling (GMM) to model lip appearance and dynamics, and employ the XM2VTS database for our experiments. Our experiments show that a model which captures lip dynamics along with appearance can improve gender classification rates by 16-21% compared to models of lip appearance alone.
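
As a rough illustration of the pipeline this abstract describes, the sketch below combines DCT-based lip appearance features with simple temporal deltas standing in for lip dynamics, and classifies by comparing log-likelihoods under per-gender Gaussian Mixture Models. This is a minimal sketch, not the authors' implementation: the feature dimensionality, delta-based dynamics, GMM settings, and the synthetic stand-in data are all assumptions made for illustration (the study itself uses lip regions extracted from the XM2VTS database).

```python
# Minimal sketch (not the authors' code): gender classification from lip
# regions using DCT-based appearance features, delta-based "dynamics",
# and one Gaussian Mixture Model per gender class.
import numpy as np
from scipy.fft import dct
from sklearn.mixture import GaussianMixture

def dct_features(lip_frame, n_coeffs=20):
    """2-D DCT of a lip-region image, keeping the first n_coeffs coefficients
    (row-major) as a crude low-order appearance descriptor."""
    coeffs = dct(dct(lip_frame, axis=0, norm="ortho"), axis=1, norm="ortho")
    return coeffs.flatten()[:n_coeffs]

def dynamic_features(frames, n_coeffs=20):
    """Per-frame appearance features plus first-order temporal deltas,
    a generic proxy for the lip-dynamics cues described in the abstract."""
    static = np.array([dct_features(f, n_coeffs) for f in frames])
    deltas = np.diff(static, axis=0, prepend=static[:1])
    return np.hstack([static, deltas])

# Synthetic demo data: sequences of 32x32 "lip frames" for each class.
rng = np.random.default_rng(0)
male_frames = rng.normal(0.0, 1.0, size=(200, 32, 32))
female_frames = rng.normal(0.3, 1.0, size=(200, 32, 32))

X_male = dynamic_features(male_frames)
X_female = dynamic_features(female_frames)

# One GMM per gender; classification picks the higher average log-likelihood.
gmm_male = GaussianMixture(n_components=8, covariance_type="diag",
                           random_state=0).fit(X_male)
gmm_female = GaussianMixture(n_components=8, covariance_type="diag",
                             random_state=0).fit(X_female)

def classify(frames):
    feats = dynamic_features(frames)
    ll_male = gmm_male.score(feats)      # mean log-likelihood per frame
    ll_female = gmm_female.score(feats)
    return "male" if ll_male > ll_female else "female"

print(classify(rng.normal(0.0, 1.0, size=(50, 32, 32))))
```

The reported 16-21% improvement depends on how the dynamics are actually modelled; the delta features above are only a generic stand-in for the temporal modelling the paper evaluates.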