130 results for BEHAVIORAL COMPONENTS
Abstract:
The present study examined the consistency over time of individual differences in the behavioral and physiological responsiveness of calves to potentially alarming test situations, as well as the relationships between behavioral and physiological measures. Twenty Holstein Friesian heifer calves were individually subjected to the same series of two behavioral and two hypothalamo-pituitary-adrenocortical (HPA) axis reactivity tests at 3, 13 and 26 weeks of age. Novel environment (open field, OF) and novel object (NO) tests involved measurement of behavioral, plasma cortisol and heart rate responses. Plasma ACTH and/or cortisol response profiles were determined after administration of exogenous CRH and ACTH, respectively, in the HPA axis reactivity tests. Principal component analysis (PCA) was used to condense correlated measures within ages into principal components reflecting independent dimensions underlying the calves' reactivity. Cortisol responses to the OF and NO tests were positively associated with the latency to contact, and negatively related to the time spent in contact with, the NO. Individual differences in scores on a principal component summarizing this pattern of inter-correlations, as well as differences in separate measures of adrenocortical and behavioral reactivity in the OF and NO tests, proved highly consistent over time. The cardiac response to confinement in a start box prior to the OF test was positively associated with the cortisol responses to the OF and NO tests at 26 weeks of age. HPA axis reactivity to ACTH or CRH was unrelated to adrenocortical and behavioral responses to novelty. These findings strongly suggest that the responsiveness of calves was mediated by stable individual characteristics. Correlated adrenocortical and behavioral responses to novelty may reflect underlying fearfulness, defining the individual's susceptibility to the elicitation of fear. Other independent characteristics mediating reactivity may include activity or coping style (related to locomotion) and underlying sociality (associated with vocalization). © 2005 Elsevier Inc. All rights reserved.
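As a rough illustration of the PCA step described in this abstract, the sketch below condenses a set of correlated per-calf reactivity measures into a few principal-component scores. The column names and simulated values are hypothetical placeholders standing in for the study's measures (cortisol responses, latency to and time in contact with the NO, locomotion, vocalization), not the authors' dataset.

```python
# Minimal PCA sketch, assuming one row of reactivity measures per calf.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_calves = 20
measures = pd.DataFrame({
    "cortisol_of":        rng.normal(10, 3, n_calves),    # cortisol response, open field (hypothetical)
    "cortisol_no":        rng.normal(12, 3, n_calves),    # cortisol response, novel object
    "latency_contact_no": rng.normal(120, 40, n_calves),  # s until first contact with the NO
    "time_in_contact_no": rng.normal(60, 20, n_calves),   # s spent in contact with the NO
    "locomotion":         rng.normal(30, 10, n_calves),   # activity measure in the OF
    "vocalizations":      rng.poisson(5, n_calves),       # calls during the OF test
})

# Standardize the correlated measures, then extract independent components;
# each calf receives one score per component, which can be tracked across ages.
pca = PCA(n_components=3)
scores = pca.fit_transform(StandardScaler().fit_transform(measures))
print(scores.shape)                      # (20, 3): component scores per calf
print(pca.explained_variance_ratio_)     # share of variance captured by each component
```

In the study's design, component scores computed separately at 3, 13 and 26 weeks could then be correlated across ages to assess consistency of individual differences.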
Abstract:
Temperament tests are widely accepted as instruments for profiling behavioral variability in dogs, and they are applied in numerous areas of investigation (e.g. suitability for adoption or for breeding). During testing, model devices such as a child-like doll or a fake dog are often employed to elicit a dog's reaction toward novel stimuli and to predict its behavior in everyday life. However, how reliably these devices reproduce dogs' reactions to real children or dogs is unknown and perhaps overestimated. This may be a particular concern in the case of aggressive behavior toward humans, a significant public health issue. The aims of this study were: (1) to evaluate the correlation between dogs' reactions to these devices and owners' reports of their dog's aggression history (using the C-BARQ); (2) to compare the reactions toward the devices of dogs with and without histories of aggression. Subjects were selected among dogs presented for behavioral consultation at the Veterinary Hospital of the University of Pennsylvania and previously categorized as aggressive toward unfamiliar children, aggressive toward conspecifics, or non-aggressive (control). The test consisted of different components: an unfamiliar female tester approaching the dog, and the presentation of a child-like doll, an ambiguous object, and a fake plastic dog. All tests were videotaped and durations of behaviors were later analyzed on the basis of a specified ethogram. Dogs' reactions were compared to C-BARQ scores, and significant correlations emerged for the 'dog-directed aggression/fear' (R = 0.48, P = 0.004) and 'stranger-directed aggression' (R = 0.58, P < 0.001) factors. Dogs differed in their reactions toward the devices: the child-like doll and the fake dog elicited more social behaviors than the ambiguous object used as a control stimulus. Issues concerning the reliability of these tools for assessing canine temperament are discussed. © 2012 Elsevier B.V. All rights reserved.
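The core analysis reported here is a correlation between test-derived behavior durations and owner-reported C-BARQ factor scores. The sketch below shows one way such a comparison could be computed with a rank correlation; the variable names, the choice of Spearman's coefficient, and the simulated data are assumptions for illustration, not the authors' actual procedure or data.

```python
# Hedged sketch: correlate a test-derived behavior duration with a C-BARQ
# subscale score using Spearman's rank correlation (assumed method).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_dogs = 35

# Hypothetical per-dog measures: seconds of aggressive display toward the fake
# dog during the test, and the owner-reported 'dog-directed aggression/fear' score.
aggression_toward_fake_dog_s = rng.gamma(2.0, 5.0, n_dogs)
cbarq_dog_directed_score = 0.05 * aggression_toward_fake_dog_s + rng.normal(0.0, 0.5, n_dogs)

rho, p_value = spearmanr(aggression_toward_fake_dog_s, cbarq_dog_directed_score)
print(f"rho = {rho:.2f}, p = {p_value:.4f}")
```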
Abstract:
The relative resistance of 15 winter barley, three winter wheat and three winter oat cultivars on the 2003 UK recommended list, and of two spring wheat cultivars on the 2003 Irish recommended list, was evaluated using Microdochium nivale in detached leaf assays to further understand components of partial disease resistance (PDR) and Fusarium head blight (FHB) resistance across cereal species. Barley cultivars showed incubation periods comparable to, and latent periods longer than, those of the most FHB-resistant Irish and UK wheat cultivars evaluated. In addition, lesions on barley differed from those on wheat: they were not visibly chlorotic when placed over a light box until sporulation occurred, whereas on wheat cultivars chlorosis of the infected area was apparent as soon as lesions first developed. The pattern of delayed chlorosis of the infected leaf tissue and the longer latent periods indicate that resistances are expressed in barley after the incubation period is observed, and that these temporarily arrest the development of mycelium and sporulation. Incubation periods were longer for oats than for barley or wheat cultivars. However, oat cultivars differed from both wheat and barley in that mycelial growth was observed before obvious tissue damage was detected under macroscopic examination, indicating tolerance of infection rather than inhibition of pathogen development; the morphology of sporodochia also differed, appearing less well developed and being much less abundant. Longer latent periods have previously been related to greater FHB resistance in wheat. The present results suggest that the longer latent periods of barley and oat cultivars, relative to wheat, are likely to play a role in overall FHB resistance if they are under the same genetic control as PDR components expressed in the head. However, the limited range of incubation and latent periods observed within the barley and oat cultivars evaluated contrasted with wheat, where incubation and latent periods were shorter and more variable among genotypes. The significance of the various combinations of PDR components detected in the detached leaf assay as components of FHB resistance in each crop requires further investigation, particularly with regard to the apparent tolerance of infection in oats and the necrosis in barley that follows the incubation period and is associated with retardation of mycelial growth and sporulation.
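The two timing components compared throughout this abstract, incubation period (time from inoculation to first visible symptoms) and latent period (time from inoculation to sporulation), can be derived from daily observations of each detached leaf. The sketch below illustrates that bookkeeping with hypothetical data; the data structure and function names are assumptions, not part of the study's protocol.

```python
# Illustrative calculation of incubation and latent periods from daily
# detached-leaf observations (hypothetical data structure).
from typing import Optional

def first_day(flags: list[bool]) -> Optional[int]:
    """Return the 1-based day index of the first True flag, or None if never observed."""
    for day, seen in enumerate(flags, start=1):
        if seen:
            return day
    return None

# One hypothetical leaf: daily presence/absence of visible symptoms and of sporulation.
symptoms_by_day    = [False, False, True, True, True, True, True]
sporulation_by_day = [False, False, False, False, False, True, True]

incubation_period = first_day(symptoms_by_day)     # days to first visible symptoms
latent_period     = first_day(sporulation_by_day)  # days to first sporulation
print(incubation_period, latent_period)            # 3 6 for this example
```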
Abstract:
Components of partial disease resistance (PDR) to fusarium head blight (FHB), detected in a seed-germination assay, were compared with whole-plant FHB resistance of 30 USA soft red winter wheat entries in the 2002 Uniform Southern FHB Nursery. Highly significant (P < 0.001) differences between cultivars in the in vitro seed-germination assay inoculated with Microdochium majus were correlated with FHB disease incidence (r = -0.41; P < 0.05), severity (r = -0.47; P < 0.01), FHB index (r = -0.46; P < 0.01), damaged kernels (r = -0.52; P < 0.01), grain deoxynivalenol (DON) concentration (r = -0.40; P < 0.05) and the incidence/severity/kernel-damage index (ISK) (r = -0.45; P < 0.01) caused by Fusarium graminearum. Multiple linear regression analysis explained a greater percentage of the variation in FHB resistance when the seed-germination assay and the previously reported detached-leaf assay PDR components were used together as explanatory factors. Shorter incubation periods, longer latent periods and shorter lesion lengths in the detached-leaf assay, and higher germination rates in the seed-germination assay, were related to greater FHB resistance across all disease variables, collectively explaining 62% of the variation for incidence, 49% for severity, 56% for F. graminearum-damaged kernels (FDK), 39% for DON and 59% for the ISK index. Incubation period was most strongly related to disease incidence and the early stages of infection, while resistance detected in the seed-germination assay and latent period were more strongly related to FHB disease severity. Resistance detected using the seed-germination assay was notable in that it was related to a greater decline in the level of FDK and a smaller reduction in DON than would have been expected from the reduction in FHB disease assessed by visual symptoms.
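The multiple linear regression described here uses the assay-derived PDR components as explanatory variables for a whole-plant FHB disease variable. The sketch below shows the general shape of such a model fit; all variable names and values are simulated placeholders, and only the modelling step (ordinary least squares with PDR components as predictors) mirrors the abstract.

```python
# Rough sketch of a multiple linear regression of an FHB disease variable on
# PDR components from the detached-leaf and seed-germination assays.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_entries = 30
pdr = pd.DataFrame({
    "incubation_period": rng.normal(3.0, 0.6, n_entries),   # days, detached-leaf assay (hypothetical)
    "latent_period":     rng.normal(6.0, 1.0, n_entries),   # days, detached-leaf assay
    "lesion_length":     rng.normal(25.0, 6.0, n_entries),  # mm, detached-leaf assay
    "germination_rate":  rng.uniform(0.4, 0.95, n_entries), # seed-germination assay
})
# Simulated disease response so the example is self-contained and runnable.
fhb_incidence = (60 - 5 * pdr["incubation_period"] - 3 * pdr["latent_period"]
                 + 0.5 * pdr["lesion_length"] - 20 * pdr["germination_rate"]
                 + rng.normal(0, 3, n_entries))

model = sm.OLS(fhb_incidence, sm.add_constant(pdr)).fit()
print(model.rsquared)   # proportion of variation explained (cf. ~62% for incidence in the study)
print(model.params)     # fitted coefficient for each PDR component
```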