45 results for Incommensurability of values

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

We present a model of the evolution of identity via dynamic interaction between the choice of education and the transmission of values in a community from parents to children, when parents care about the preservation of their traditional community values, which differ from the values of the host society. We compare the educational and socioeconomic outcomes in different scenarios (melting pot versus multiculturalism). If schooling shifts children’s identity away from their parents’ values, parents may choose lower levels of education for their children, at the cost of reducing their future earnings. We show how this effect can be attenuated and reversed when the school or, indeed, the host society is willing to accommodate the values of the community and/or to adjust to these values; otherwise, the community gradually becomes alienated. This approach may be applied to the analysis of temporal changes in values and attitudes in a community of immigrants, as well as in ethnic, religious, or other minority groups.

Relevance:

100.00%

Publisher:

Abstract:

The concept of values “fit” has been a significant theme in the management literature for many years. It is argued that where staff and organizational values are aligned, a range of positive outcomes is encountered. What is unclear is how this translates to the charity sector. This study explores the phenomenon of values alignment in two UK charities. Questionnaires were used to measure staff values, perceptions of organization values and staff commitment. Drawing on the work of Finegan (2000), an interaction term is used as a proxy for fit. Analyses of data from 286 participants indicated that it was the perceptions of organization values that had the greatest impact on staff commitment. The alignment of staff values and perceptions of organization values had some effect only within one of the charities. This challenges the dominant view on such alignment, and the implications of this are discussed. Keywords: staff, values fit, commitment, organizational identification
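
A minimal sketch of the kind of analysis described above, assuming hypothetical data and variable names: staff commitment is regressed on staff values, perceived organisation values, and their product, with the interaction term standing in as the proxy for "fit". The study's actual measures, scales, and coding are not reproduced here.

# Illustrative sketch (not the study's actual analysis): commitment regressed on
# staff values, perceived organisational values, and their product as a "fit" proxy.
import numpy as np

rng = np.random.default_rng(0)
n = 286                                   # sample size mentioned in the abstract
staff_vals = rng.normal(size=n)           # hypothetical standardised staff-value scores
org_vals = rng.normal(size=n)             # hypothetical perceived organisation-value scores
fit = staff_vals * org_vals               # interaction term used as a proxy for fit
commitment = 0.1 * staff_vals + 0.6 * org_vals + 0.05 * fit + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), staff_vals, org_vals, fit])
beta, *_ = np.linalg.lstsq(X, commitment, rcond=None)
print(dict(zip(["intercept", "staff", "org_perception", "interaction"], beta.round(3))))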

Relevance:

90.00%

Publisher:

Abstract:

Variable rate applications of nitrogen (N) are of environmental and economic interest. Regular measurements of soil N supply are difficult to achieve practically. Therefore accurate model simulations of soil N supply might provide a practical solution for site-specific management of N. Mineral N, an estimate of N supply, was simulated by the model SUNDIAL (Simulation of Nitrogen Dynamics In Arable Land) at more than 100 locations within three arable fields in Bedfordshire, UK. The results were compared with actual measurements. The outcomes showed that the spatial patterns of the simulations of mineral N corresponded to the measurements but the range of values was underestimated.
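
A toy illustration, with made-up numbers rather than SUNDIAL output, of the kind of comparison described above: checking whether simulated mineral N tracks the spatial pattern of the measurements while compressing the range of values.

# Synthetic sketch: pattern agreement (correlation) versus range of values.
import numpy as np

rng = np.random.default_rng(1)
measured = rng.gamma(shape=4.0, scale=10.0, size=100)          # hypothetical measured mineral N
simulated = 0.5 * (measured - measured.mean()) + measured.mean() + rng.normal(scale=3, size=100)

r = np.corrcoef(measured, simulated)[0, 1]
print(f"pattern agreement (Pearson r): {r:.2f}")
print(f"measured range:  {measured.min():.1f} to {measured.max():.1f}")
print(f"simulated range: {simulated.min():.1f} to {simulated.max():.1f}  (narrower -> range underestimated)")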

Relevance:

90.00%

Publisher:

Abstract:

1. We compared the baseline phosphorus (P) concentrations inferred by diatom-P transfer functions and export coefficient models at 62 lakes in Great Britain to assess whether the techniques produce similar estimates of historical nutrient status.
2. There was a strong linear relationship between the two sets of values over the whole total P (TP) gradient (2–200 µg TP L⁻¹). However, a systematic bias was observed, with the diatom model producing the higher values in 46 lakes (of which the values differed by more than 10 µg TP L⁻¹ in 21). The export coefficient model gave the higher values in 10 lakes (of which the values differed by more than 10 µg TP L⁻¹ in only 4).
3. The difference between baseline and present-day TP concentrations was calculated to compare the extent of eutrophication inferred by the two sets of model output. There was generally poor agreement between the amounts of change estimated by the two approaches. The discrepancy in both the baseline values and the degree of change inferred by the models was greatest in the shallow and more productive sites.
4. Both approaches were applied to two lakes in the English Lake District where long-term P data exist, to assess how well the models track measured P concentrations since approximately 1850. There was good agreement between the pre-enrichment TP concentrations generated by the models. The diatom model paralleled the steeper rise in maximum soluble reactive P (SRP) more closely than the gradual increase in annual mean TP in both lakes. The export coefficient model produced a closer fit to observed annual mean TP concentrations for both sites, tracking the changes in total external nutrient loading.
5. A combined approach is recommended, with the diatom model employed to reflect the nature and timing of the in-lake response to changes in nutrient loading, and the export coefficient model used to establish the origins and extent of changes in the external load and to assess potential reductions in loading under different management scenarios.
6. However, caution must be exercised when applying these models to shallow lakes, where the export coefficient model TP estimate will not include internal P loading from lake sediments, and where the diatom TP inferences may over-estimate TP concentrations because of the high abundance of benthic taxa, many of which are poor indicators of trophic state.
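
A hypothetical sketch of the baseline comparison in points 1-2, assuming two vectors of baseline TP estimates for the same set of lakes; the numbers below are synthetic, not the study's data.

# Quantifying the systematic bias between two sets of baseline TP estimates.
import numpy as np

rng = np.random.default_rng(2)
n_lakes = 62
export_tp = rng.uniform(2, 200, n_lakes)                           # export coefficient baseline, µg TP/L
diatom_tp = export_tp + rng.normal(loc=8, scale=15, size=n_lakes)  # diatom-inferred baseline, µg TP/L

diff = diatom_tp - export_tp
print("lakes where the diatom model is higher:", int((diff > 0).sum()))
print("of which differ by more than 10 µg TP/L:", int((diff > 10).sum()))
print("lakes where the export model is higher:", int((diff < 0).sum()))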

Relevance:

90.00%

Publisher:

Abstract:

Matheron's usual variogram estimator can result in unreliable variograms when data are strongly asymmetric or skewed. Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers that belong to another population that contaminate the primary process. This paper, the first of two, examines the effects of underlying asymmetry on the variogram and on the accuracy of prediction; the second paper examines the effects arising from outliers. Standard geostatistical texts suggest ways of dealing with underlying asymmetry; however, this advice is based on informed intuition rather than detailed investigation. To determine whether the methods generally used to deal with underlying asymmetry are appropriate, the effects of different coefficients of skewness on the shape of the experimental variogram and on the model parameters were investigated. Simulated annealing was used to create normally distributed random fields of different size from variograms with different nugget:sill ratios. These data were then modified to give different degrees of asymmetry and the experimental variogram was computed in each case. The effects of standard data transformations on the form of the variogram were also investigated. Cross-validation was used to assess quantitatively the performance of the different variogram models for kriging. The results showed that the shape of the variogram was affected by the degree of asymmetry, and that the effect increased as the size of the data set decreased. Transformations of the data were more effective in reducing the skewness coefficient in the larger sets of data. Cross-validation confirmed that variogram models from transformed data were more suitable for kriging than were those from the raw asymmetric data. The results of this study have implications for the 'standard best practice' in dealing with asymmetry in data for geostatistical analyses.
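
For reference, Matheron's method-of-moments estimator referred to above is gamma(h) = (1 / (2|N(h)|)) * sum over the pairs separated by lag h of (z_i - z_j)^2. A minimal implementation over distance bins might look like the sketch below; the data are a toy transect, not the simulated fields of the study.

# Sketch of Matheron's (method-of-moments) experimental variogram estimator.
import numpy as np

def matheron_variogram(coords, values, lag_edges):
    """Experimental variogram from point data; coords is (n, d), values is (n,)."""
    coords, values = np.asarray(coords, float), np.asarray(values, float)
    i, j = np.triu_indices(len(values), k=1)
    dists = np.linalg.norm(coords[i] - coords[j], axis=1)
    sq_diffs = (values[i] - values[j]) ** 2
    gamma = np.full(len(lag_edges) - 1, np.nan)
    for k in range(len(lag_edges) - 1):
        in_bin = (dists >= lag_edges[k]) & (dists < lag_edges[k + 1])
        if in_bin.any():
            gamma[k] = 0.5 * sq_diffs[in_bin].mean()   # semivariance for this lag bin
    return gamma

# toy usage on a 1-D transect
x = np.arange(50.0).reshape(-1, 1)
z = np.sin(x[:, 0] / 5.0) + np.random.default_rng(3).normal(scale=0.2, size=50)
print(matheron_variogram(x, z, lag_edges=np.arange(0, 25, 5)))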

Relevance:

90.00%

Publisher:

Abstract:

Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers that belong to another population that contaminate the primary process. The first paper of this series examined the effects of the former on the variogram, and this paper examines the effects of asymmetry arising from outliers. Simulated annealing was used to create normally distributed random fields of different size that are realizations of known processes described by variograms with different nugget:sill ratios. These primary data sets were then contaminated with randomly located and spatially aggregated outliers from a secondary process to produce different degrees of asymmetry. Experimental variograms were computed from these data by Matheron's estimator and by three robust estimators. The effects of standard data transformations on the coefficient of skewness and on the variogram were also investigated. Cross-validation was used to assess, for kriging, the performance of models fitted to experimental variograms computed from a range of data contaminated by outliers. The results showed that where skewness was caused by outliers the variograms retained their general shape, but showed an increase in the nugget and sill variances and in the nugget:sill ratios. This effect was only slightly greater for the smallest data set than for the two larger data sets, and there was little difference between the results for the latter. Overall, the effect of the size of the data set was small for all analyses. The nugget:sill ratio showed a consistent decrease after transformation to both square roots and logarithms; the decrease was generally larger for the latter, however. Aggregated outliers had different effects on the variogram shape from those that were randomly located, and this also depended on whether they were aggregated near to the edge or the centre of the field. The results of cross-validation showed that the robust estimators and the removal of outliers were the most effective ways of dealing with outliers for variogram estimation and kriging.
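
The abstract does not name the three robust estimators used. One widely used robust alternative to Matheron's estimator is that of Cressie and Hawkins (1980), sketched here purely for illustration under that assumption.

# Sketch of the Cressie-Hawkins robust variogram estimator, based on fourth
# powers of the mean of |z_i - z_j|^(1/2) with a small bias-correction term.
import numpy as np

def cressie_hawkins_variogram(coords, values, lag_edges):
    coords, values = np.asarray(coords, float), np.asarray(values, float)
    i, j = np.triu_indices(len(values), k=1)
    dists = np.linalg.norm(coords[i] - coords[j], axis=1)
    root_abs = np.sqrt(np.abs(values[i] - values[j]))
    gamma = np.full(len(lag_edges) - 1, np.nan)
    for k in range(len(lag_edges) - 1):
        in_bin = (dists >= lag_edges[k]) & (dists < lag_edges[k + 1])
        m = int(in_bin.sum())
        if m:
            gamma[k] = 0.5 * root_abs[in_bin].mean() ** 4 / (0.457 + 0.494 / m)
    return gamma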

Relevance:

90.00%

Publisher:

Abstract:

This paper reports on research undertaken by the author into what secondary school drama teachers think they need to possess in terms of subject knowledge in order to operate effectively as subject specialists. ‘Subject knowledge’ is regarded as being multi-faceted, and the paper reports on how drama teachers prioritise its different aspects. A discussion of what ‘subject knowledge’ may be seen to encompass reveals interesting tensions between aspects of professional knowledge that are prescribed by statutory dictate and local context, and those that are valued by individual teachers and are manifest in their construction of a professional identity. The paper proposes that making judgements that associate propositional and substantive knowledge with traditionally held academic values and dismiss them as ‘bad’ or ‘irrelevant’ to drama education, while treating what Foucault termed ‘subjugated knowledge’ (i.e. local, vernacular, enactive knowledge that eludes inscription) as ‘good’ and more apposite to the work of all those involved in drama education, fails to reflect the complex matrices of values that specialists appear to hold. While the reported research focused on secondary school drama teachers in England, Bourdieu’s conception of field and habitus is invoked to suggest a model which recognises how drama educators more generally may construct a professional identity that necessarily balances personal interests and beliefs with externally imposed demands.

Relevance:

90.00%

Publisher:

Abstract:

The concept of an organism's niche is central to ecological theory, but an operational definition is needed that allows both its experimental delineation and the interpretation of field distributions of the species. Here we use population growth rate (hereafter, pgr) to define the niche as the set of points in niche space where pgr > 0. If there are just two axes to the niche space, their relationship to pgr can be pictured as a contour map in which pgr varies along the axes in the same way that the height of land above sea level varies with latitude and longitude. In laboratory experiments we measured the pgr of Daphnia magna over a grid of values of pH and Ca2+, and so defined its "laboratory niche" in pH-Ca2+ space. The position of the laboratory niche boundary suggests that population persistence is only possible above 0.5 mg Ca2+/L and between pH 5.75 and pH 9, though more Ca2+ is needed at lower pH values. To see how well the measured niche predicts the field distribution of D. magna, we examined relevant field data from 422 sites in England and Wales. Of the 58 colonized water bodies, 56 lay within the laboratory niche. Very few of the sites near the niche boundary were colonized, probably because pgr there is so low that populations are vulnerable to extinction by other factors. Our study shows how the niche can be quantified and used to predict field distributions successfully.
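
A minimal sketch of the operational niche definition described above: evaluate pgr over a grid of pH and Ca2+ values and keep the region where pgr > 0. The response surface below is invented for illustration and is not the fitted Daphnia magna model.

# Delineating a "niche" as the region of a two-axis grid where pgr > 0.
import numpy as np

def pgr(ph, ca):
    # toy response surface: positive growth only at moderate pH and sufficient calcium
    return 0.5 - ((ph - 7.5) / 1.8) ** 2 - 0.3 / np.maximum(ca, 1e-6)

ph_axis = np.linspace(5.0, 10.0, 51)
ca_axis = np.linspace(0.1, 10.0, 51)           # mg Ca2+/L
PH, CA = np.meshgrid(ph_axis, ca_axis)
niche = pgr(PH, CA) > 0                         # boolean map of the niche in pH-Ca2+ space
print(f"fraction of grid inside the niche: {niche.mean():.2f}")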

Relevance:

90.00%

Publisher:

Abstract:

Individual identification via DNA profiling is important in molecular ecology, particularly in the case of noninvasive sampling. A key quantity in determining the number of loci required is the probability of identity (PI_ave), the probability of observing two copies of any profile in the population. Previously this has been calculated assuming no inbreeding or population structure. Here we introduce formulae that account for these factors, whilst also accounting for relatedness structure in the population. These formulae are implemented in API-CALC 1.0, which calculates PI_ave for either a specified value or a range of values of F_IS and F_ST.
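
For context, the baseline probability of identity that these formulae generalise (no inbreeding, no population structure) is, for a single locus with allele frequencies p_i, PI = sum_i p_i^4 + sum_{i<j} (2 p_i p_j)^2, with the multi-locus value obtained as the product over loci. The sketch below implements only that baseline; the paper's F_IS/F_ST corrections are not reproduced.

# Baseline probability of identity (no inbreeding, no structure) per locus and overall.
import numpy as np

def pi_single_locus(p):
    p = np.asarray(p, float)
    homozygote_term = np.sum(p ** 4)                      # both individuals homozygous and identical
    i, j = np.triu_indices(len(p), k=1)
    heterozygote_term = np.sum((2 * p[i] * p[j]) ** 2)    # both share the same heterozygous genotype
    return homozygote_term + heterozygote_term

def pi_multilocus(loci):
    return float(np.prod([pi_single_locus(p) for p in loci]))

print(pi_multilocus([[0.5, 0.5], [0.25, 0.25, 0.5]]))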

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a study investigating how the performance of motion-impaired computer users in point and click tasks varies with target distance (A), target width (W), and force-feedback gravity well width (GWW). Six motion-impaired users performed point and click tasks across a range of values for A, W, and GWW. Times were observed to increase with A and to decrease with W. Times also improved with GWW, and, with the addition of a gravity well, a greater improvement was observed for smaller targets than for bigger ones. It was found that Fitts' Law gave a good description of behaviour for each value of GWW, and that gravity wells reduced the effect of task difficulty on performance. A model based on Fitts' Law is proposed, which incorporates the effect of GWW on movement time. The model accounts for 88.8% of the variance in the observed data.
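
As a sketch of the underlying relationship, the Shannon form of Fitts' Law, MT = a + b*log2(A/W + 1), can be fitted by least squares as below; the paper's extended model additionally incorporates GWW, which is not reproduced here, and the data points are invented.

# Fitting Fitts' Law (Shannon formulation) to made-up point-and-click data.
import numpy as np

A = np.array([128, 256, 512, 512, 256, 128], float)    # target distances (pixels)
W = np.array([16, 16, 16, 32, 32, 32], float)           # target widths (pixels)
MT = np.array([1.9, 2.3, 2.8, 2.4, 2.0, 1.6])           # hypothetical movement times (s)

ID = np.log2(A / W + 1.0)                                # index of difficulty (bits)
X = np.column_stack([np.ones_like(ID), ID])
(a, b), *_ = np.linalg.lstsq(X, MT, rcond=None)
pred = X @ np.array([a, b])
r2 = 1 - np.sum((MT - pred) ** 2) / np.sum((MT - MT.mean()) ** 2)
print(f"MT = {a:.2f} + {b:.2f} * ID   (R^2 = {r2:.2f})")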

Relevance:

90.00%

Publisher:

Abstract:

A combination of photoelectron spectroscopy, temperature-programmed desorption and low-energy electron diffraction structure determinations has been applied to study the p(2 x 2) structures of pure hydrogen and of co-adsorbed hydrogen and CO on Ni{111}. In agreement with earlier work, atomic hydrogen is found to adsorb on fcc and hcp sites in the pure layer, with H-Ni bond lengths of 1.74 Å. The substrate interlayer distances, d(12) = 2.05 Å and d(23) = 2.06 Å, are expanded with respect to clean Ni{111}, with buckling of 0.04 Å in the first layer. In the co-adsorbed phase, CO occupies hcp sites and only the hydrogen atoms on fcc sites remain on the surface. d(12) is expanded even further, to 2.08 Å, with buckling in the first and second layers of 0.06 and 0.02 Å, respectively. The C-O, C-Ni, and H-Ni bond lengths are within the range of values also found for the pure adsorbates.

Relevance:

90.00%

Publisher:

Abstract:

We present an extensive thermodynamic analysis of a hysteresis experiment performed on a simplified yet Earth-like climate model. We slowly vary the solar constant by 20% around the present value and detect that for a large range of values of the solar constant the realization of snowball or of regular climate conditions depends on the history of the system. Using recent results on the global climate thermodynamics, we show that the two regimes feature radically different properties. The efficiency of the climate machine monotonically increases with decreasing solar constant in present climate conditions, whereas the opposite takes place in snowball conditions. Instead, entropy production is monotonically increasing with the solar constant in both branches of climate conditions, and its value is about four times larger in the warm branch than in the corresponding cold state. Finally, the degree of irreversibility of the system, measured as the fraction of excess entropy production due to irreversible heat transport processes, is much higher in the warm climate conditions, with an explosive growth in the upper range of the considered values of solar constants. Whereas in the cold climate regime a dominating role is played by changes in the meridional albedo contrast, in the warm climate regime changes in the intensity of latent heat fluxes are crucial for determining the observed properties. This substantiates the importance of addressing correctly the variations of the hydrological cycle in a changing climate. An interpretation of the climate transitions at the tipping points based upon macro-scale thermodynamic properties is also proposed. Our results support the adoption of a new generation of diagnostic tools based on the second law of thermodynamics for auditing climate models and outline a set of parametrizations to be used in conceptual and intermediate-complexity models or for the reconstruction of the past climate conditions.

Relevance:

90.00%

Publisher:

Abstract:

The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method, where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided, and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of the models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
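
A sketch of the two published reference MCP models mentioned above (simple linear regression and the variance ratio method) applied to synthetic concurrent wind speeds; the paper's three new models are not reproduced here, and all numbers below are illustrative.

# Simple linear regression MCP versus the variance ratio method on synthetic data.
import numpy as np

rng = np.random.default_rng(4)
ref = rng.weibull(2.0, size=2000) * 8.0                     # concurrent reference-site speeds (m/s)
site = 0.9 * ref + rng.normal(scale=1.0, size=ref.size)     # concurrent target-site speeds (m/s)

# simple linear regression: site = a + b * ref
b, a = np.polyfit(ref, site, 1)

# variance ratio method: match the mean and standard deviation of the target site
vr_slope = site.std() / ref.std()
vr_intercept = site.mean() - vr_slope * ref.mean()

long_term_ref = rng.weibull(2.0, size=10000) * 8.0          # "historic" reference record
print("regression prediction mean:    ", (a + b * long_term_ref).mean().round(2))
print("variance-ratio prediction mean:", (vr_intercept + vr_slope * long_term_ref).mean().round(2))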

Relevance:

90.00%

Publisher:

Abstract:

This paper is concerned with the problem of propagation from a monofrequency coherent line source above a plane of homogeneous surface impedance. The solution of this problem occurs in the kernel of certain boundary integral equation formulations of acoustic propagation above an impedance boundary, and the discussion of the paper is motivated by this application. The paper starts by deriving representations, as Laplace-type integrals, of the solution and its first partial derivatives. The evaluation of these integral representations by Gauss-Laguerre quadrature is discussed, and theoretical bounds on the truncation error are obtained. Specific approximations are proposed which are shown to be accurate except in the very near field, for all angles of incidence and a wide range of values of surface impedance. The paper finishes with derivations of partial results and analogous Laplace-type integral representations for the case of a point source.
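
As a concrete reminder of the quadrature discussed above, Gauss-Laguerre quadrature approximates int_0^inf e^(-t) f(t) dt by sum_i w_i f(t_i). The sketch below uses a toy integrand, not the propagation kernel of the paper.

# Gauss-Laguerre quadrature of a Laplace-type integral with a toy integrand.
import numpy as np
from numpy.polynomial.laguerre import laggauss

def gauss_laguerre(f, n):
    nodes, weights = laggauss(n)            # abscissae and weights for the weight function e^(-t)
    return np.sum(weights * f(nodes))

f = lambda t: 1.0 / (1.0 + t)               # int_0^inf e^(-t)/(1+t) dt = e*E_1(1) ~ 0.5963
for n in (4, 8, 16, 32):
    print(n, gauss_laguerre(f, n))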

Relevance:

90.00%

Publisher:

Abstract:

Sea ice contains flaws including frictional contacts. We aim to describe quantitatively the mechanics of those contacts, providing local physics for geophysical models. With a focus on the internal friction of ice, we review standard micro-mechanical models of friction. The solid's deformation under normal load may be ductile or elastic. The shear failure of the contact may be by ductile flow, brittle fracture, or melting and hydrodynamic lubrication. Combinations of these give a total of six rheological models. When the material under study is ice, several of the rheological parameters in the standard models are not constant, but depend on the temperature of the bulk, on the normal stress under which samples are pressed together, or on the sliding velocity and acceleration. This has the effect of making the shear stress required for sliding dependent on sliding velocity, acceleration, and temperature. In some cases, it also perturbs the exponent in the normal-stress dependence of that shear stress away from the value that applies to most materials. We unify the models by a principle of maximum displacement for normal deformation, and of minimum stress for shear failure, reducing the controversy over the mechanism of internal friction in ice to the choice of values of four parameters in a single model. The four parameters represent, for a typical asperity contact, the sliding distance required to expel melt-water, the sliding distance required to break contact, the normal strain in the asperity, and the thickness of any ductile shear zone.