49 results for Effective coping, on the stressor-coping match
in CentAUR: Central Archive University of Reading - UK
Abstract:
Recent evidence suggests a crucial role for people's current goals in attention to emotional information, which calls for research into how, and what kinds of, goals shape emotional attention. The present study investigated how the goal of suppressing a negative emotional state influences attention to emotion-congruent events. After inducing disgust, we instructed participants to suppress all feelings of disgust during a subsequent dot-probe task. Attention to disgusting images was modulated by the sort of distracter presented alongside them. When disgusting images were paired with neutral images, emotion suppression was accompanied by a tendency to attend to the disgusting images. However, when disgusting images were paired with positive images that afford coping with disgust (i.e., images representing cleanliness), attention tended away from the disgusting images and toward the cleanliness images. These findings show that emotion suppression influences the allocation of attention, but that successful avoidance of emotion-congruent events depends on the availability of effective distracters.
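The abstract does not give the scoring rule, but attention in a dot-probe task is conventionally indexed by the difference in reaction times between incongruent and congruent trials. A minimal sketch of that standard bias index (the function name and RT values are hypothetical):

```python
import statistics

def attention_bias_ms(congruent_rts, incongruent_rts):
    """Standard dot-probe attention bias index, in milliseconds.

    congruent_rts:   RTs when the probe replaces the disgusting image.
    incongruent_rts: RTs when the probe replaces the paired distracter.
    Positive values indicate attention toward the disgusting image;
    negative values indicate avoidance.
    """
    return statistics.mean(incongruent_rts) - statistics.mean(congruent_rts)

# Hypothetical RTs (ms) for one participant:
print(attention_bias_ms([512, 498, 530], [545, 560, 538]))  # > 0 -> vigilance
```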
Abstract:
Diffusive isotopic fractionation factors are important for understanding natural processes and have practical applications in radioactive waste storage and carbon dioxide sequestration. We determined the isotope fractionation factors and the effective diffusion coefficients of chloride and bromide ions during aqueous diffusion in polyacrylamide gel, as functions of temperature, time and concentration. Temperature has a relatively large effect on the diffusion coefficient (D) but only a small effect on isotope fractionation. For chlorine, the ratio D(35Cl)/D(37Cl) varied from 1.00128 ± 0.00017 (1σ) at 2 °C to 1.00192 ± 0.00015 at 80 °C. For bromine, D(79Br)/D(81Br) varied from 1.00098 ± 0.00009 at 2 °C to 1.0064 ± 0.00013 at 21 °C and 1.00078 ± 0.00018 (1σ) at 80 °C. Concentration had no significant effect on isotope fractionation. The insensitivity of diffusive isotope fractionation to these variables over the most common temperature range (0 to 30 °C) makes it particularly valuable for understanding processes in geological environments and an important natural tracer of fluid transport processes. (C) 2009 Elsevier Ltd. All rights reserved.
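For readers unused to the ratio notation, the departure of each reported ratio from unity, multiplied by 1000, gives the fractionation in per mil. A trivial sketch using the chloride values quoted above:

```python
def diffusive_fractionation_permil(d_light_over_d_heavy):
    """Convert a diffusion-coefficient isotope ratio into a per-mil fractionation."""
    return (d_light_over_d_heavy - 1.0) * 1000.0

# Chloride ratios reported in the abstract:
print(diffusive_fractionation_permil(1.00128))  # ~1.28 per mil at 2 degrees C
print(diffusive_fractionation_permil(1.00192))  # ~1.92 per mil at 80 degrees C
```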
Abstract:
Matheron's usual variogram estimator can result in unreliable variograms when data are strongly asymmetric or skewed. Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers belonging to another population that contaminate the primary process. This paper, the first of two, examines the effects of underlying asymmetry on the variogram and on the accuracy of prediction; the second examines the effects arising from outliers. Standard geostatistical texts suggest ways of dealing with underlying asymmetry, but these are based on informed intuition rather than detailed investigation. To determine whether the methods generally used to deal with underlying asymmetry are appropriate, we investigated the effects of different coefficients of skewness on the shape of the experimental variogram and on the model parameters. Simulated annealing was used to create normally distributed random fields of different sizes from variograms with different nugget:sill ratios. These data were then modified to give different degrees of asymmetry, and the experimental variogram was computed in each case. The effects of standard data transformations on the form of the variogram were also investigated. Cross-validation was used to assess quantitatively the performance of the different variogram models for kriging. The results showed that the shape of the variogram was affected by the degree of asymmetry, and that the effect increased as the size of the data set decreased. Transformations of the data were more effective in reducing the skewness coefficient in the larger data sets. Cross-validation confirmed that variogram models from transformed data were more suitable for kriging than those from the raw asymmetric data. The results of this study have implications for 'standard best practice' in dealing with asymmetry in data for geostatistical analyses. (C) 2007 Elsevier Ltd. All rights reserved.
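Matheron's method-of-moments estimator, the starting point of this study, is gamma(h) = (1/2N(h)) * sum over pairs of [z(x_i) - z(x_i + h)]^2. A minimal sketch for a regular one-dimensional transect (the toy field is hypothetical):

```python
import numpy as np

def matheron_variogram(z, max_lag):
    """Matheron's method-of-moments estimator on a regular 1-D transect:
    gamma(h) = 1 / (2 N(h)) * sum_i (z_i - z_{i+h})^2."""
    z = np.asarray(z, dtype=float)
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    return lags, gamma

rng = np.random.default_rng(0)
z = np.cumsum(rng.normal(size=200)) * 0.1 + rng.normal(size=200)  # toy field
print(matheron_variogram(z, 5))
```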
Abstract:
Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers belonging to another population that contaminate the primary process. The first paper of this series examined the effects of the former on the variogram; this paper examines the effects of asymmetry arising from outliers. Simulated annealing was used to create normally distributed random fields of different sizes that are realizations of known processes described by variograms with different nugget:sill ratios. These primary data sets were then contaminated with randomly located and spatially aggregated outliers from a secondary process to produce different degrees of asymmetry. Experimental variograms were computed from these data by Matheron's estimator and by three robust estimators. The effects of standard data transformations on the coefficient of skewness and on the variogram were also investigated. Cross-validation was used to assess the performance, for kriging, of models fitted to experimental variograms computed from a range of data contaminated by outliers. The results showed that where skewness was caused by outliers the variograms retained their general shape but showed an increase in the nugget and sill variances and in the nugget:sill ratios. This effect was only slightly greater for the smallest data set than for the two larger data sets, and there was little difference between the results for the latter; overall, the effect of data-set size was small for all analyses. The nugget:sill ratio showed a consistent decrease after transformation to both square roots and logarithms, although the decrease was generally larger for the latter. Aggregated outliers had different effects on variogram shape from randomly located ones, and this also depended on whether they were aggregated near the edge or the centre of the field. Cross-validation showed that the robust estimators and the removal of outliers were the most effective ways of dealing with outliers for variogram estimation and kriging. (C) 2007 Elsevier Ltd. All rights reserved.
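The abstract does not name the three robust estimators; one widely used example is the Cressie-Hawkins estimator, which damps the influence of outliers by averaging square roots of absolute differences rather than squared differences. A sketch under that assumption:

```python
import numpy as np

def cressie_hawkins_variogram(z, max_lag):
    """Cressie-Hawkins robust estimator on a regular 1-D transect:
    gamma(h) = (mean |z_i - z_{i+h}|^(1/2))^4 / (2 * (0.457 + 0.494 / N(h)))."""
    z = np.asarray(z, dtype=float)
    gamma = []
    for h in range(1, max_lag + 1):
        d = np.abs(z[h:] - z[:-h])
        n = d.size
        gamma.append(np.mean(np.sqrt(d)) ** 4 / (2.0 * (0.457 + 0.494 / n)))
    return np.array(gamma)
```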
Abstract:
Titration curves were determined for soil from horizon samples of a clayey and a sandy loam Oxisol by (a) adding NaOH to soil suspensions and (b) incubating moist soils with Ca(OH)₂. The organic fraction was primarily responsible for buffering in both soils. Humic acids were more important than fulvic acids in buffering against NaOH additions. With Ca(OH)₂, greater buffer capacities were found because carboxyl sites, primarily on fulvic acids, became complexed with Ca²⁺, so that in the clay soil humic and fulvic acids were equally important as buffering components, while fulvic acids were more important in the sandy loam soil. The buffer capacity of organic matter against Ca(OH)₂ additions was 1.1 mol_c kg⁻¹ per pH unit. In the incubated soils, exchangeable cations were also determined, and changes in the amounts of exchangeable and non-exchangeable Ca²⁺, acidity and effective cation exchange capacity were calculated. Up to half the added Ca²⁺ became complexed and was non-exchangeable. Aluminium complexed by organic matter appears to be an important buffering component, together with non-exchangeable H⁺. As pH increased, the newly dissociated carboxyl sites could complex Ca²⁺. (c) 2005 Elsevier B.V. All rights reserved.
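As a worked example of what a buffer capacity of 1.1 mol_c kg⁻¹ per pH unit implies (the soil composition below is hypothetical):

```python
def base_required_molc(om_kg_per_kg_soil, delta_ph, buffer_capacity=1.1):
    """Moles of charge of base needed per kg of soil to raise pH by delta_ph,
    given the organic-matter buffer capacity reported in the abstract
    (1.1 mol_c per kg of organic matter per pH unit)."""
    return om_kg_per_kg_soil * buffer_capacity * delta_ph

# A soil with 5% organic matter, limed to raise pH by 0.5 units:
print(base_required_molc(0.05, 0.5))  # 0.0275 mol_c per kg soil
```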
Abstract:
We report evidence for a major ice stream that operated over the northwestern Canadian Shield in the Keewatin Sector of the Laurentide Ice Sheet during the last deglaciation, 9000-8200 (uncalibrated) yr BP. It is reconstructed as 450 km in length and 140 km in width, with an estimated catchment area of 190,000 km². Mapping from satellite imagery reveals a suite of bedforms (a 'flow-set') characterized by a highly convergent onset zone, abrupt lateral margins and, where flow is presumed to have been fastest, a remarkably coherent pattern of mega-scale glacial lineations with lengths approaching 13 km and elongation ratios in excess of 40:1. Spatial variations in bedform elongation within the flow-set match the expected velocity field of a terrestrial ice stream. The flow pattern does not appear to be steered by topography, and its location on the hard bedrock of the Canadian Shield is surprising. A soft sedimentary basin may have influenced ice-stream activity by lubricating the bed over the downstream crystalline bedrock, but it is unlikely that the ice stream operated over a pervasively deforming till layer. Its location challenges the view that ice streams arise only in deep bedrock troughs or over thick deposits of 'soft' fine-grained sediments. We speculate that fast ice flow may have been triggered when a steep ice-sheet surface gradient with high driving stresses contacted a proglacial lake. An increase in velocity through calving could have propagated fast ice flow upstream (in the vicinity of the Keewatin Ice Divide) through a series of thermomechanical feedback mechanisms. The ice stream exerted a considerable impact on the Laurentide Ice Sheet, forcing the demise of one of its last major ice centres.
Abstract:
Elucidating the controls on the location and vigor of ice streams is crucial to understanding the processes that lead to fast flow and disintegration of ice sheets. In the former North American Laurentide ice sheet, ice-stream occurrence appears to have been governed by topographic troughs or areas of soft-sediment geology. This paper reports robust evidence of a major paleo-ice stream over the northwestern Canadian Shield, an area previously assumed to be incompatible with fast ice flow because of its low relief and relatively hard bedrock. A coherent pattern of subglacial bedforms (drumlins and mega-scale glacial lineations) demarcates the ice-stream flow set, which exhibits a convergent onset zone, a narrow main trunk with abrupt lateral margins, and a lobate terminus. Variations in bedform elongation ratio within the flow set match theoretical expectations of ice velocity. In the center of the ice stream, remarkably parallel mega-scale glacial lineations tens of kilometers long, with elongation ratios in excess of 40:1, attest to a single episode of rapid ice flow. We conclude that while bed properties are likely to be influential in determining the occurrence and vigor of ice streams, contrary to established views, widespread soft-bed geology is not an essential requirement for ice streams without topographic control. We speculate that the ice stream acted as a release valve on ice-sheet mass balance and was initiated by the presence of a proglacial lake that destabilized the ice-sheet margin and propagated fast ice flow through a series of thermomechanical feedbacks involving ice flow and temperature.
Abstract:
During deglaciation of the North American Laurentide Ice Sheet, large proglacial lakes developed where proglacial drainage was impeded by the ice margin. For some of these lakes it is known that subsequent drainage had an abrupt and widespread impact on North Atlantic Ocean circulation and climate, but less is known about the impact the lakes exerted on ice-sheet dynamics. This paper reports palaeogeographic reconstructions of the evolution of proglacial lakes during deglaciation across the northwestern Canadian Shield, covering an area in excess of 1,000,000 km² as the ice sheet retreated some 600 km. The interactions between proglacial lakes and ice-sheet flow are explored, with particular emphasis on whether the disposition of lakes may have influenced the location of the Dubawnt Lake ice stream. This ice stream falls outside the existing paradigm for ice streams in the Laurentide Ice Sheet because it did not operate over fine-grained till or lie in a topographic trough. Ice-margin positions and a digital elevation model are used to predict the geometry and depth of proglacial lakes impounded at the margin at 30-km increments during deglaciation. The palaeogeographic reconstructions match well with previous independent estimates of lake coverage inferred from field evidence, and the results suggest that the development of a deep lake in the Thelon drainage basin may have been influential in initiating the ice stream by inducing calving, drawing down ice and triggering fast ice flow. This is the only location along this sector of the ice sheet where large (>3,000 km²), deep (~120 m) lakes were impounded for a significant length of time, and it exactly matches the location of the ice stream. It is speculated that the commencement of calving at the ice-sheet margin may have taken the system beyond a threshold and was sufficient to trigger rapid motion, but that once fast flow was initiated, calving processes and losses were insignificant to the functioning of the ice stream. It is thus concluded that proglacial lakes are likely to have been an important control on ice-sheet dynamics during deglaciation of the Laurentide Ice Sheet. (C) 2004 Elsevier B.V. All rights reserved.
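The reconstruction method is only outlined in the abstract; in essence, for each ice-margin position, ice-free DEM cells lying below the local spill level are flooded. A deliberately crude sketch of that idea (the array values and fixed lake level are hypothetical, and a real reconstruction must also locate the spill point and correct for isostatic depression):

```python
import numpy as np

def lake_depths(dem, lake_level, land_mask):
    """Water depth where the exposed land surface lies below the lake level.

    dem:        2-D array of surface elevations (m).
    lake_level: spill-point elevation (m) for this ice-margin position.
    land_mask:  True where the cell is ice-free and can hold water.
    """
    return np.where(land_mask, np.maximum(lake_level - dem, 0.0), 0.0)

dem = np.array([[120.0, 95.0, 90.0],
                [110.0, 85.0, 80.0],
                [130.0, 100.0, 140.0]])
mask = np.ones_like(dem, dtype=bool)   # all cells ice-free in this toy case
print(lake_depths(dem, 100.0, mask))   # depths in metres
```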
Abstract:
The Kasparov-World match was initiated by Microsoft, with sponsorship from the bank First USA. The concept was that Garry Kasparov, as White, would play the rest of the world on the Web: one ply would be played per day, and the World Team was to vote for its move. The Kasparov-World game was a success from many points of view. It certainly gave thousands the feeling of facing the world's best player across the board, and it did much for the future of the game. Described by Kasparov as “phenomenal ... the most complex in chess history”, it is probably a worthy ‘Greatest Game’ candidate. Computer technology has given chess a new mode of play and taken it to new heights: the experiment deserves to be repeated. We look forward to another game and experience of this quality, although it will be difficult to surpass the event we have just enjoyed. We salute and thank all those who contributed: sponsors, moderator, coaches, unofficial analysts, organisers, technologists, voters and our new friends.
Abstract:
Architects and engineers depend on copyright law to protect their original works. Under the Copyright, Designs and Patents Act 1988, copyright protection arises automatically once an original work is fixed in a tangible medium of expression. Architectural works are protected both as literary works (design drawings and plans) and as artistic works (the building or a model of the building). The case law on the concept of “originality”, however, discloses that it may be difficult for certain artistic works of architecture to achieve copyright protection. Although copyright law provides automatic protection to all original architectural plans, its limitation is that it protects only the expression of ideas, not the ideas themselves. The purpose of this research is to explore how effective the UK's copyright law regime is at protecting the rights and interests of architects in their works. In addition, the United States system of copyright law is analysed to determine whether it provides more effective protection for architects and engineers with regard to architectural works. The key objective of this comparison is to compare and contrast the extent to which the two systems protect the rights and interests of architects against copyright infringement. The comparative analysis concludes by considering the possibility of copyright law reform in the UK.
Abstract:
A simple model for the effective vibrational Hamiltonian of the XH stretching vibrations in H2O, NH3 and CH4 is considered, based on a Morse potential function for the bond stretches plus potential and kinetic energy coupling between pairs of bond oscillators. It is shown that this model can be set up as a matrix in local-mode basis functions, or as a matrix in normal-mode basis functions, leading to identical results. The energy levels obtained exhibit normal-mode patterns at low vibrational excitation and local-mode patterns at high excitation. When the Hamiltonian is set up in the normal-mode basis, it is shown that Darling-Dennison resonances must be included, and simple relations are found to exist between the x_rs, g_tt and K_rrss anharmonic constants (where the Darling-Dennison coefficients are denoted K), owing to their common origin in the Morse anharmonicity of the bond stretches. The importance of the Darling-Dennison resonances is stressed. The relationship between the two alternative representations of this local-mode/normal-mode model is investigated, and the potential uses and limitations of the model are discussed.
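For orientation, a minimal form of such a model (an illustrative parameterisation, not necessarily the paper's exact one): each bond is a Morse oscillator, and pairs of equivalent bonds are mixed by a single effective coupling constant λ:

```latex
% Term values of one Morse bond oscillator:
E(v)/hc = \omega_e\left(v + \tfrac{1}{2}\right)
        - \omega_e x_e\left(v + \tfrac{1}{2}\right)^{2}

% Coupled local-mode Hamiltonian for n equivalent XH bonds, with a
% single effective inter-bond (kinetic + potential) coupling \lambda:
\hat{H}/hc = \sum_{i=1}^{n}\left[\omega\left(\hat{v}_i + \tfrac{1}{2}\right)
           + x\left(\hat{v}_i + \tfrac{1}{2}\right)^{2}\right]
           + \lambda \sum_{i<j}\left(\hat{a}_i^{\dagger}\hat{a}_j
           + \hat{a}_i\hat{a}_j^{\dagger}\right)
```

Diagonalising this Hamiltonian as a matrix in either the local-mode or the normal-mode basis yields the same eigenvalues, which is the equivalence the paper establishes.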
Abstract:
A study was conducted to assess the effect of condensed tannins on the activity of fibrolytic enzymes from the anaerobic rumen fungus Neocallimastix hurleyensis and of a recombinant ferulic acid esterase (FAE) from the aerobic fungus Aspergillus niger. Condensed tannins were extracted from the tropical legumes Desmodium ovalifolium, Flemingia macrophylla, Leucaena leucocephala, Leucaena pallida, Calliandra calothyrsus and Clitoria fairchildiana and incubated in fungal enzyme mixtures or with the recombinant FAE. In most cases, the greatest reductions in enzyme activities were observed with tannins purified from D. ovalifolium and F. macrophylla and the least with tannins from L. leucocephala. Thus, whereas 40 µg ml⁻¹ of condensed tannins from C. calothyrsus and L. leucocephala was needed to halve the activity of N. hurleyensis carboxymethylcellulase (CMCase), just 5.5 µg ml⁻¹ of the same tannins was required to inhibit 50% of xylanase activity. The β-D-glucosidase and β-D-xylosidase enzymes were less sensitive to tannin inhibition, and concentrations greater than 100 µg ml⁻¹ were required to reduce their activity by 50%. In other assays, the inhibitory effect of condensed tannins added to incubation mixtures containing particulate substrates (the primary cell walls of F. arundinacea) was compared with that of tannins bound to these substrates. Substrate-associated tannins were more effective in preventing fibrolytic activities than tannins added directly to incubation solutions. It was concluded that condensed tannins from tropical legumes can inhibit fibrolytic enzyme activities, although the extent of the effect depended on the tannin, the nature of its association with the substrate and the enzyme involved. (c) 2005 Elsevier Inc. All rights reserved.
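The 'concentration needed to halve activity' figures quoted above are IC50-type values, typically obtained by fitting a sigmoidal dose-response curve. A sketch with hypothetical data (the Hill-equation fit is a common choice, not necessarily the authors'):

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(c, ic50, slope):
    """Percent inhibition of enzyme activity at tannin concentration c."""
    return 100.0 * c**slope / (ic50**slope + c**slope)

# Hypothetical data: tannin concentration (ug/ml) vs. % inhibition.
conc = np.array([1.0, 2.5, 5.0, 10.0, 20.0, 40.0])
inhibition = np.array([8.0, 22.0, 47.0, 70.0, 86.0, 94.0])

(ic50, slope), _ = curve_fit(hill, conc, inhibition, p0=(5.0, 1.0))
print(f"estimated IC50 ~ {ic50:.1f} ug/ml, Hill slope ~ {slope:.2f}")
```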
Abstract:
We developed three different knowledge-dissemination methods for educating Tanzanian smallholder farmers about mastitis in their dairy cattle. The effectiveness of these methods (and their combinations) was evaluated and quantified using a randomised controlled trial and multilevel statistical modelling. To our knowledge, this is the first study to use such techniques to evaluate the effectiveness of different knowledge-dissemination interventions for adult learning in developing countries. Five different combinations of knowledge-dissemination method were compared: 'diagrammatic handout' ('HO'), 'village meeting' ('VM'), 'village meeting and video' ('VM + V'), 'village meeting and diagrammatic handout' ('VM + HO') and 'village meeting, video and diagrammatic handout' ('VM + V + HO'). Smallholder dairy farmers were exposed to only one of these interventions, and the effectiveness of each was compared with that of a control ('C') group, which received no intervention. The mastitis knowledge of each farmer (n = 256) was evaluated by questionnaire both pre- and post-dissemination. Generalised linear mixed models were used to evaluate the effectiveness of the different interventions. The outcome variable was the probability of volunteering correct responses to mastitis questions post-dissemination, with 'village' and 'farmer' as random effects in the model. Results showed that all five interventions, 'HO' (odds ratio (OR) = 3.50, 95% confidence interval (CI) = 3.10, 3.96), 'VM + V + HO' (OR = 3.34, 95% CI = 2.94, 3.78), 'VM + HO' (OR = 3.28, 95% CI = 2.90, 3.71), 'VM + V' (OR = 3.22, 95% CI = 2.84, 3.64) and 'VM' (OR = 2.61, 95% CI = 2.31, 2.95), were significantly (p < 0.0001) more effective at disseminating mastitis knowledge than no intervention. The 'VM' method alone, however, was less effective than the other interventions, and combinations of methods showed no advantage over the diagrammatic handout alone. Other explanatory variables with significant positive associations with mastitis knowledge included education to secondary-school level or higher, and having previously learned about mastitis by reading pamphlets or attending an animal-health course. (c) 2005 Elsevier B.V. All rights reserved.
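The reported odds ratios and confidence intervals follow from the fitted model coefficients in the usual way: OR = exp(β) and 95% CI = exp(β ± 1.96 SE). A back-of-envelope check against the 'HO' result (the standard error below is inferred from the quoted interval, not reported in the abstract):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression coefficient."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# 'HO': OR = 3.50, 95% CI = (3.10, 3.96). Recover beta and SE on the log scale:
beta = math.log(3.50)
se = (math.log(3.96) - math.log(3.10)) / (2 * 1.96)
print(odds_ratio_ci(beta, se))  # ~ (3.50, 3.10, 3.96)
```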
Abstract:
The development of genetically modified (GM) crops has led the European Union (EU) to put forward the concept of 'coexistence' to give farmers the freedom to plant both conventional and GM varieties. Should a premium for non-GM varieties emerge in the market, 'contamination' by GM pollen would generate a negative externality for conventional growers. It is therefore important to assess the effect of different 'policy variables' on the magnitude of the externality, in order to identify suitable policies for managing coexistence. In this paper, taking GM herbicide-tolerant oilseed rape as a model crop, we start from the model developed in Ceddia et al. [Ceddia, M.G., Bartlett, M., Perrings, C., 2007. Landscape gene flow, coexistence and threshold effect: the case of genetically modified herbicide tolerant oilseed rape (Brassica napus). Ecol. Modell. 205, pp. 169-180], use a Monte Carlo experiment to generate data, and then estimate the effect of the number of GM and conventional fields, the width of buffer areas and the degree of spatial aggregation (the 'policy variables') on the magnitude of the externality at the landscape level. To represent realistic conditions in agricultural production, we assume that detection of GM material in conventional produce may occur at the field level (no grain mixing occurs) or at the silo level (where grain from different fields in the landscape is mixed). In the former case, the magnitude of the externality depends on the number of conventional fields with average transgenic presence above a certain threshold; in the latter, it depends on whether the average transgenic presence across all conventional fields exceeds the threshold. To quantify the effect of the relevant 'policy variables', we compute marginal effects and elasticities. Our results show that when marginal effects are used to assess the impact of the different 'policy variables', spatial aggregation is far more important when transgenic material is detected at the field level, corroborating previous research. However, when elasticity is used, the effectiveness of spatial aggregation in reducing the externality is almost identical whether detection occurs at the field level or at the silo level. Our results also show that the area planted with GM is the most important 'policy variable' affecting the externality to conventional growers, and that buffer areas on conventional fields are more effective than those on GM fields. The implications of the results for coexistence policies in the EU are discussed. (C) 2008 Elsevier B.V. All rights reserved.
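The distinction the authors draw between marginal effects and elasticities is a rescaling: the marginal effect is dy/dx, while the elasticity (dy/dx)(x/y) measures the percentage change in the externality per one-percent change in a policy variable. A sketch with a hypothetical response function standing in for the Monte Carlo output:

```python
def marginal_effect(f, x, eps=1e-6):
    """Numerical derivative dy/dx of a response function f at x."""
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

def elasticity(f, x):
    """Elasticity (dy/dx) * (x / y): % change in y per 1% change in x."""
    return marginal_effect(f, x) * x / f(x)

# Hypothetical externality (arbitrary units) as a function of buffer width (m):
externality = lambda width: 100.0 / (1.0 + 0.2 * width)

print(marginal_effect(externality, 10.0))  # absolute change per extra metre
print(elasticity(externality, 10.0))       # % change per 1% wider buffer
```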
Abstract:
The entomopathogenic nematodes Steinernema carpocapsae and S. feltiae (Steinernematids), and Heterorhabditis indica and H. bacteriophora (Heterorhabditids), were evaluated for control of nymphs of the desert locust Schistocerca gregaria. All experiments showed a significant difference in percentage mortality among the isolates. All nematodes became more effective as exposure time increased, up to 10 days. Both Heterorhabditids caused greater mortality than the Steinernematids at 30 °C. When different moisture levels were tested in the sand arena, a medium moisture level (1%) caused the greatest insect mortality for all isolates, and the highest concentration of each isolate (200 IJs per ml) proved most effective in killing insects. Similarly, both Heterorhabditis nematodes, when applied orally, killed more nymphs than the two Steinernematids. A similar pattern was observed in the infectivity test, in which the highest percentages of infective juveniles (IJs) of the two Heterorhabditis isolates successfully penetrated the bodies of locust nymphs. This research provides useful baseline findings for developing biocontrol agents based on suitably virulent entomopathogenic nematodes for controlling desert locust nymphs.