31 results for Discriminating limits

in Aston University Research Archive


Relevance:

20.00%

Publisher:

Abstract:

Purpose - The idea that knowledge needs to be codified is central to many claims that knowledge can be managed. However, there appear to be no empirical studies in the knowledge management context that examine the process of knowledge codification. This paper therefore seeks to explore codification as a knowledge management process. Design/methodology/approach - The paper draws on findings from research conducted around a knowledge management project in a section of the UK Post Office, using a methodology of participant-observation. Data were collected through observations of project meetings, correspondence between project participants, and individual interviews. Findings - The principal findings about the nature of knowledge codification are, first, that the process of knowledge codification also involves the process of defining the codes needed to codify knowledge, and second, that people who participate in the construction of these codes are able to interpret and use them in more similar ways. From this it can be seen that the ability of people to decodify codes in similar ways places restrictions on the transferability of knowledge between them. Research limitations/implications - The paper therefore argues that a new conceptual approach is needed for the role of knowledge codification in knowledge management, one that emphasizes the importance of knowledge decodification. Such an approach would start with one's ability to decodify, rather than codify, knowledge as a prerequisite for knowledge management. Originality/value - The paper provides a conceptual basis for explaining limitations to the management and transferability of knowledge. © Emerald Group Publishing Limited.

Relevance:

20.00%

Publisher:

Abstract:

With luminance gratings, psychophysical thresholds for detecting a small increase in the contrast of a weak ‘pedestal’ grating are 2–3 times lower than for detection of a grating when the pedestal is absent. This is the ‘dipper effect’ – a reliable improvement whose interpretation remains controversial. Analogies between luminance and depth (disparity) processing have attracted interest in the existence of a ‘disparity dipper’. Are thresholds for disparity modulation (corrugated surfaces) facilitated by the presence of a weak disparity-modulated pedestal? We used a 14-bit greyscale to render small disparities accurately, and measured 2AFC discrimination thresholds for disparity modulation (0.3 or 0.6 c/deg) of a random texture at various pedestal levels. In the first experiment, a clear dipper was found. Thresholds were about 2× lower with weak pedestals than without. But here the phase of modulation (0 or 180 deg) was varied from trial to trial. In a noisy signal-detection framework, this creates uncertainty that is reduced by the pedestal, which thus improves performance. When the uncertainty was eliminated by keeping phase constant within sessions, the dipper effect was weak or absent. Monte Carlo simulations showed that the influence of uncertainty could account well for the results of both experiments. A corollary is that the visual depth response to small disparities is probably linear, with no threshold-like nonlinearity.
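
The uncertainty account invoked in this abstract lends itself to a compact simulation. The sketch below is purely illustrative, not the authors' code: it assumes a linear transducer, unit-variance Gaussian channel noise, a max-rule observer, and an arbitrary channel count standing in for phase uncertainty. With several monitored channels, the zero-pedestal threshold is inflated and a weak pedestal lowers it (a dipper); with a single channel (no uncertainty), the threshold stays flat, matching the pattern described above.

```python
import numpy as np

rng = np.random.default_rng(1)

def prop_correct_2afc(pedestal, delta, n_channels, n_trials=20000, sigma=1.0):
    """Monte Carlo proportion correct for 2AFC increment detection, max rule.

    One monitored channel carries the stimulus (pedestal + delta in the
    signal interval, pedestal alone in the null interval); the remaining
    channels carry only noise and model uncertainty about modulation phase.
    """
    sig = np.zeros((n_trials, n_channels))
    nul = np.zeros((n_trials, n_channels))
    sig[:, 0] = pedestal + delta
    nul[:, 0] = pedestal
    sig += rng.normal(0.0, sigma, sig.shape)
    nul += rng.normal(0.0, sigma, nul.shape)
    return np.mean(sig.max(axis=1) > nul.max(axis=1))

def threshold(pedestal, n_channels, target=0.75):
    """Smallest increment delta reaching the target level (coarse grid search)."""
    for delta in np.linspace(0.05, 5.0, 100):
        if prop_correct_2afc(pedestal, delta, n_channels) >= target:
            return delta
    return float("nan")

for ped in [0.0, 0.5, 1.0, 2.0]:
    print(f"pedestal {ped}: uncertain {threshold(ped, 8):.2f}, "
          f"certain {threshold(ped, 1):.2f}")
```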

Relevance:

20.00%

Publisher:

Abstract:

Measurement of detection and discrimination thresholds yields information about visual signal processing. For luminance contrast, we are 2-3 times more sensitive to a small increase in the contrast of a weak 'pedestal' grating than when the pedestal is absent. This is the 'dipper effect' - a reliable improvement whose interpretation remains controversial. Analogies between luminance and depth (disparity) processing have attracted interest in the existence of a 'disparity dipper' - are thresholds for disparity, or disparity modulation (corrugated surfaces), facilitated by the presence of a weak pedestal? Lunn and Morgan (1997, Journal of the Optical Society of America A 14, 360-371) found no dipper for disparity-modulated gratings, but technical limitations (8-bit greyscale) might have prevented the necessary measurement of very small disparity thresholds. We used a true 14-bit greyscale to render small disparities accurately, and measured 2AFC discrimination thresholds for disparity modulation (0.6 cycles deg⁻¹) of a random texture at various pedestal levels. Which interval contained greater modulation of depth? In the first experiment, a clear dipper was found. Thresholds were about 2× lower with weak pedestals than without. But here the phase of modulation (0° or 180°) was randomised from trial to trial. In a noisy signal-detection framework, this creates uncertainty that is reduced by the pedestal, thus improving performance. When the uncertainty was eliminated by keeping phase constant within sessions, the dipper effect disappeared, confirming Lunn and Morgan's result. The absence of a dipper, coupled with shallow psychometric slopes, suggests that the visual response to small disparities is essentially linear, with no threshold-like nonlinearity.
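
The link drawn here between shallow psychometric slopes and a linear transducer can be made concrete by fitting a Weibull function to 2AFC data. The numbers below are invented for illustration only; the fitted slope parameter beta is the diagnostic, with values near 1 indicating shallow, near-linear behaviour and values around 3-4 indicating a threshold-like nonlinearity.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_2afc(x, alpha, beta):
    """2AFC Weibull psychometric function: 50% guessing floor, no lapse rate.
    alpha is the amplitude giving ~81.6% correct; beta sets the slope."""
    return 0.5 + 0.5 * (1.0 - np.exp(-(x / alpha) ** beta))

# invented example data: modulation amplitudes vs proportion correct
amps = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])
pcorr = np.array([0.52, 0.60, 0.68, 0.79, 0.91, 0.98])

(alpha, beta), _ = curve_fit(weibull_2afc, amps, pcorr, p0=[0.4, 1.5])
print(f"threshold alpha = {alpha:.3f}, slope beta = {beta:.2f}")
```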

Relevance:

20.00%

Publisher:

Abstract:

Recent studies have stressed the importance of ‘open innovation’ as a means of enhancing innovation performance. The essence of the open innovation model is to take advantage of external as well as internal knowledge sources in developing and commercialising innovation, so avoiding an excessively narrow internal focus in a key area of corporate activity. Although the external aspect of open innovation is often stressed, another key aspect involves maximising the flow of ideas and knowledge from different sources within the firm, for example through knowledge sharing via the use of cross-functional teams. A fully open innovation approach would therefore combine both aspects, i.e. cross-functional teams with boundary-spanning knowledge linkages. This suggests that there should be complementarities between the use of cross-functional teams and boundary-spanning knowledge linkages, i.e. the returns to implementing open innovation in one innovation activity should be greater if open innovation is already in place in another innovation activity. However, our findings – based on a large sample of UK and German manufacturing plants – do not support this view. Our results suggest that in practice the benefits envisaged in the open innovation model are not generally achievable by the majority of plants, and that instead the adoption of open innovation across the whole innovation process is likely to reduce innovation outputs. Our results provide some guidance on the type of activities where the adoption of a market-based governance structure such as open innovation may be most valuable. This is likely to be in innovation activities where search is deterministic, activities are separable, and the required level of knowledge sharing is correspondingly moderate – in other words, those activities which are more routinized. For this type of activity, market-based governance mechanisms (i.e. open innovation) may well be more efficient than hierarchical governance structures. For other innovation activities, where outcomes are more uncertain and unpredictable and the risks of knowledge exchange hazards are greater, quasi-market-based governance structures such as open innovation are likely to be subject to rapidly diminishing returns in terms of innovation outputs.
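
The complementarity claim above can be stated formally; this is the standard supermodularity condition, offered here as a gloss rather than the paper's exact econometric specification. Writing innovation output as π(a, b), where a and b are binary indicators for adopting cross-functional teams and boundary-spanning knowledge linkages respectively, complementarity requires

π(1,1) − π(0,1) ≥ π(1,0) − π(0,0),

i.e. the return to adopting one practice is weakly greater when the other is already in place. The findings reported above amount to a failure to find support for this inequality in the sampled plants.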

Relevance:

20.00%

Publisher:

Abstract:

In this article we compare the current debate about global warming with the earlier discourse of Limits to Growth (LtG) of the 1970s. We are especially interested in the similarities and differences between the two cases, and therefore compare the policy challenges and the lessons to be drawn. While the two debates differ on important issues, they share a technocratic orientation to public policy and a susceptibility to similar pitfalls. In both debates, alarming scenarios about future catastrophes play an important role. We suggest that climate change policy discourse needs to focus more closely on the social, economic, and political dimensions of climate change, as opposed to its excessive emphasis on emission reduction targets. We also argue that an excessive faith in market mechanisms to supply global warming mitigation technologies is problematic. In this respect, we provide a reality check regarding the political implications of emission targets and timetables, and suggest how policy issues can be moved forward.

Relevance:

20.00%

Publisher:

Abstract:

The human visual system is sensitive to second-order modulations of the local contrast (CM) or amplitude (AM) of a carrier signal. Second-order cues are detected independently of first-order luminance (LM) signals; however, it is not clear why vision should benefit from second-order sensitivity. Analysis of the first- and second-order contents of natural images suggests that these cues tend to occur together, but their phase relationship varies. We have shown that in-phase combinations of LM and AM are perceived as a shaded corrugated surface, whereas the anti-phase combination can be seen as corrugated when presented alone, or as a flat material change when presented in a plaid containing the in-phase cue. We now extend these findings using new stimulus types and a novel haptic matching task. We also introduce a computational model based on initially separate first- and second-order channels that are combined within orientation and subsequently across orientation to produce a shading signal. Contrast gain control allows the LM + AM cue to suppress responses to LM − AM when presented in a plaid. Thus, the model sees LM − AM as flat in these circumstances. We conclude that second-order vision plays a key role in disambiguating the origin of luminance changes within an image. © ARVO.
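
The channel architecture sketched in this abstract follows the general filter-rectify-filter (FRF) scheme for second-order vision. The code below is a generic single-channel FRF sketch with divisive gain control, not the authors' model: the isotropic difference-of-Gaussians filter, the filter scales, and the pooling rule are all simplifying assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def second_order_response(image, carrier_sigma=2.0, envelope_sigma=8.0):
    """Minimal filter-rectify-filter (FRF) channel: band-pass at the carrier
    scale (difference of Gaussians as a stand-in for an oriented filter),
    full-wave rectify to expose the amplitude envelope, then low-pass at
    the envelope scale to recover the second-order modulation."""
    band = gaussian_filter(image, carrier_sigma) - gaussian_filter(image, 2 * carrier_sigma)
    return gaussian_filter(np.abs(band), envelope_sigma)

def gain_control(response, pooled, eps=1e-6):
    """Divisive normalisation: a strong in-phase (LM + AM) channel in the
    pool suppresses the response of a weaker anti-phase (LM - AM) channel."""
    return response / (eps + pooled)

# hypothetical stimulus: random-noise carrier with a sinusoidal AM envelope
rng = np.random.default_rng(0)
carrier = rng.standard_normal((256, 256))
envelope = 1.0 + 0.5 * np.sin(np.linspace(0, 4 * np.pi, 256))
stimulus = envelope[None, :] * carrier
am_map = second_order_response(stimulus)
am_map = gain_control(am_map, pooled=am_map.mean())
```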

Relevance:

20.00%

Publisher:

Abstract:

The thesis is concerned with the electron-optical properties of single-polepiece magnetic electron lenses, especially under conditions of extreme polepiece saturation. The electron optical properties are first analysed under conditions of high polepiece permeability. From this analysis, a general idea can be obtained of the important parameters that affect ultimate lens performance. In addition, useful information is obtained concerning the design of improved lenses operating under conditions of extreme polepiece saturation, for example at flux densities of the order of 10 Tesla. It is shown that in a single-polepiece lens, the position and shape of the lens exciting coil play an important role. In particular, the maximum permissible current density in the windings, rather than the properties of the iron, can set a limit to lens performance. This factor was therefore investigated in some detail. The axial field distribution of a single-polepiece lens, unlike that of a conventional lens, is highly asymmetrical. There are therefore two possible physical arrangements of the lens with respect to the incoming electron beam; in general, these two orientations will result in different aberration coefficients. This feature has also been investigated in some detail. Single-polepiece lenses are thus considerably more complicated electron-optically than conventional double-polepiece lenses. In particular, the absence of the usual second polepiece causes most of the axial magnetic flux density distribution to lie outside the body of the lens. This can have many advantages in electron microscopy, but it creates problems in calculating the magnetic field distribution; in particular, presently available computer programs are liable to be considerably in error when applied to such structures. It was therefore necessary to find independent ways of checking the field calculations. Furthermore, if the polepiece is allowed to saturate, much more calculation is involved, since the field distribution becomes a non-linear function of the lens excitation. In searching for optimum lens designs, care was therefore taken to ensure that the coil was placed in the optimum position. If this condition is satisfied, there seems to be no theoretical limit to the maximum flux density that can be attained at the polepiece tip. However, under iron-saturation conditions, some broadening of the axial field distribution will take place, thereby changing the lens aberrations. Extensive calculations were therefore made to find the minimum spherical and chromatic aberration coefficients. The focal properties of such lens designs are presented and compared with those of the best conventional double-polepiece lenses presently available.
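
The focal properties discussed here are derived from the axial field distribution B(z) via the paraxial ray equation. As a minimal illustration (not the thesis's method, which computed real saturated-polepiece fields), the sketch below traces a parallel ray through an analytic Glaser bell-shaped field and locates the focal crossing; the field amplitude, half-width, and the non-relativistic accelerating voltage are all assumed values.

```python
import numpy as np
from scipy.integrate import solve_ivp

E_OVER_2M = 8.794e10   # e / (2 m_e)  [C/kg]
V_ACC = 100e3          # accelerating voltage [V], non-relativistic sketch

def glaser_field(z, b0=1.0, a=2e-3):
    """Glaser bell-shaped axial field B(z) = B0 / (1 + (z/a)^2), a common
    analytic stand-in for a measured single-polepiece field distribution."""
    return b0 / (1.0 + (z / a) ** 2)

def paraxial(z, y):
    """Paraxial ray equation in the rotating frame:
       r'' + (eta^2 B(z)^2 / (4 V)) r = 0,  with eta^2 = e / (2 m)."""
    r, rp = y
    k = E_OVER_2M * glaser_field(z) ** 2 / (4.0 * V_ACC)
    return [rp, -k * r]

# trace a ray entering parallel to the axis and locate the focal crossing
sol = solve_ivp(paraxial, [-20e-3, 20e-3], [1.0, 0.0],
                max_step=1e-5, dense_output=True)
z = np.linspace(-20e-3, 20e-3, 4001)
r = sol.sol(z)[0]
crossings = z[np.where(np.diff(np.sign(r)) < 0)[0]]
print("focal crossing(s) near z =", crossings)
```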

Relevance:

20.00%

Publisher:

Abstract:

The thesis examines and explains the development of occupational exposure limits (OELs) as a means of preventing work-related disease and ill health. The research focuses on the USA and UK and sets the work within a certain historical and social context. A subsidiary aim of the thesis is to identify any shortcomings in OELs and the methods by which they are set, and to suggest alternatives. The research framework uses Thomas Kuhn's idea of science progressing by means of paradigms, which he describes at one point as '... universally recognised scientific achievements that for a time provide model problems and solutions to a community of practitioners' (Kuhn, 1970). Once these are learned, individuals in the community '... are committed to the same rules and standards for scientific practice' (ibid.). Kuhn's ideas are adapted by combining them with a view of industrial hygiene as an applied science-based profession having many of the qualities of non-scientific professions. The great advantage of this approach to OELs is that it keeps the analysis grounded in the behaviour and priorities of the groups which have forged, propounded, used, benefited from, and defended them. The development and use of OELs on a larger scale is shown to be connected to the growth of a new profession in the USA, industrial hygiene, with the assistance of another new profession, industrial toxicology. The origins of these professions, particularly industrial hygiene, are traced. By examining the growth of the professions and the writings of key individuals, it is possible to show how technical, economic and social factors became embedded in the OEL paradigm which industrial hygienists and toxicologists forged. The origin, mission and needs of these professions and their clients made such influences almost inevitable. The use of the OEL paradigm in practice is examined through an analysis of the proceedings of the American Conference of Governmental Industrial Hygienists (ACGIH) Threshold Limit Value (TLV) Committee via its minutes from 1962 to 1984. A similar approach is taken with the development of OELs in the UK. Although the form and definition of TLVs has encouraged the belief that they are health-based OELs, the conclusion is that they, and most other OELs, are, and always have been, reasonably practicable limits: the degree of risk posed by a substance is weighed against the feasibility and cost of controlling exposure to that substance. The confusion over the status of TLVs and other OELs is seen to be a confusion at the heart of the OEL paradigm, and the historical perspective explains why this should be. The paradigm has prevented the creation of truly health-based and, conversely, truly reasonably practicable OELs. In the final part of the thesis, the analysis of the development of OELs is set in a contemporary context, and a proposal for a two-stage, two-committee procedure for producing sets of OELs is put forward. This approach is set within an alternative OEL paradigm. The advantages, benefits and likely obstacles to these proposals are discussed.

Relevance:

20.00%

Publisher:

Abstract:

This thesis is about the relationship between schooling and the economic and political structures which constrain its institutional framework. It focusses on teachers, as mediators of structural constraints, and on middle schools, as institutions which occupy a functionally transitional place between the primary and secondary traditions. In approaching the problem of linking the different perspectives of macro and micro sociologies, I argue the view that the individual mediates the contradictions between the socially cooperative processes of production and the competitive individualism which legitimates the private appropriation of wealth and income. The link is observable in the schooling process as a pattern of contradictions and tensions mediated by the rhetoric of equality of opportunity. In order to elucidate the link, the processes within the boundaried institutions must be viewed in the context of those changing tensions within the state administrative systems which reverberate into schools as economic and political constraints. Framed within the ideology of the Plowden Report (1967), middle schooling was set within a discourse which stressed cooperative relationships rather than competitive standards. Since the mid-1970s, administrative policies have heightened the competitive battle for declining resources and attacked the Plowden ideology. Focussing the fieldwork on six middle schools in one local authority, I use an eclectic methodology to relate economic and political policies, generated in the state administrative system, to the situation in the schools between 1979 and 1981. The methodology incorporates a time dimension in order to highlight the tensions as they play upon teachers' changing definitions of the changing situation. I conclude that the intersubjective socio-cultural relations of schooling cannot be properly explained without making explicit the changing tensions in the rule/resource relationships which teachers mediate through their particular institutions.

Relevance:

20.00%

Publisher:

Abstract:

It has been postulated that immunogenicity results from the overall dissimilarity of pathogenic proteins versus the host proteome. We have sought to use this concept to discriminate between antigens and non-antigens of bacterial origin. Sets of 100 known antigenic and non-antigenic peptide sequences from bacteria were compared to the human and mouse proteomes. Both antigenic and non-antigenic sequences lacked human or mouse homologues. The observed distributions were compared using the non-parametric Mann-Whitney test. The statistical null hypothesis was accepted, indicating that antigens and non-antigens did not differ significantly. Likewise, we were unable to determine a threshold able to meaningfully separate antigens from non-antigens. Thus, antigens cannot be predicted from pathogen genomes based solely on their dissimilarity to the human genome.
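
The statistical comparison described here follows a standard pattern. Below is a minimal sketch of that test, with randomly generated stand-in scores; the real inputs would be similarity-to-host scores from a sequence-search tool, not random draws.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# hypothetical stand-ins for similarity-to-host scores (e.g. the best
# alignment score of each peptide against the human proteome)
antigen_scores = rng.normal(30.0, 8.0, 100)
non_antigen_scores = rng.normal(31.0, 8.0, 100)

# two-sided Mann-Whitney U test on the two observed distributions
stat, p = mannwhitneyu(antigen_scores, non_antigen_scores,
                       alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")  # p > 0.05: retain the null hypothesis
```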

Relevance:

20.00%

Publisher:

Abstract:

Computational genome analysis enables systematic identification of potential immunogenic proteins within a pathogen. Immunogenicity is a system property that arises through the interaction of host and pathogen, as mediated through the medium of an immunogenic protein. The overt dissimilarity of pathogenic proteins when compared to the host proteome is conjectured by some to be the determining principle of immunogenicity. Previously, we explored this idea in the context of bacterial, viral, and fungal antigens. In this paper, we broaden and extend our analysis to include complex antigens of eukaryotic origin, arising from tumours and from parasite pathogens. For both types of antigen, known antigenic and non-antigenic protein sequences were compared to the human and mouse proteomes. In contrast to our previous results, both visual inspection and statistical evaluation indicate a much wider range of homologues and a significant level of discrimination; but, as before, we could not determine a viable threshold capable of properly separating non-antigens from antigens. In concert with our previous work, we conclude that global proteome dissimilarity is not a useful metric for immunogenicity for presently available antigens arising from bacteria, viruses, fungi, parasites, and tumours. While we see some signal for certain antigen types, dissimilarity is not a useful approach to identifying antigenic molecules within pathogen genomes.
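
The threshold search described here can be illustrated with a standard ROC analysis. The sketch below uses invented, heavily overlapping score distributions to show how an AUC near 0.5 and a weak best cut-point manifest; it is not the paper's analysis pipeline.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)

# hypothetical, heavily overlapping host-dissimilarity scores:
# label 1 = antigen, label 0 = non-antigen
scores = np.concatenate([rng.normal(0.52, 0.15, 100),
                         rng.normal(0.50, 0.15, 100)])
labels = np.concatenate([np.ones(100), np.zeros(100)])

auc = roc_auc_score(labels, scores)
fpr, tpr, cuts = roc_curve(labels, scores)
best = int(np.argmax(tpr - fpr))   # Youden's J: single best cut-point
print(f"AUC = {auc:.2f}; best cut = {cuts[best]:.2f} "
      f"(TPR = {tpr[best]:.2f}, FPR = {fpr[best]:.2f})")
# an AUC near 0.5 and near-chance TPR/FPR at the best cut illustrate why
# no viable separating threshold could be found
```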

Relevance:

20.00%

Publisher:

Abstract:

Immunogenicity arises via many synergistic mechanisms, yet the overall dissimilarity of pathogenic proteins versus the host proteome has been proposed as a key arbiter. We have previously explored this concept in relation to bacterial antigens; here we extend our analysis to antigens of viral and fungal origin. Sets of known viral and fungal antigenic and non-antigenic protein sequences were compared to the human and mouse proteomes. Both antigenic and non-antigenic sequences lacked human or mouse homologues. The observed distributions were compared using the non-parametric Mann-Whitney test. The statistical null hypothesis was accepted, indicating that antigens and non-antigens did not differ significantly. Likewise, we could not determine a threshold able to meaningfully separate non-antigens from antigens. We conclude that viral and fungal antigens cannot be predicted from pathogen genomes based solely on their dissimilarity to mammalian genomes.
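
For intuition about what a dissimilarity-to-host score might look like in its simplest form, here is a crude, hypothetical scoring function: the best exact-identity fraction between a peptide and any equal-length window of a host protein. Real studies would use a proper alignment tool (e.g. BLAST) rather than this toy.

```python
def best_window_identity(peptide, host_proteins):
    """Highest fraction of identical residues between the peptide and any
    equal-length window of any host protein. A crude dissimilarity score:
    real pipelines would use BLAST or a similar alignment tool instead."""
    w = len(peptide)
    best = 0.0
    for seq in host_proteins:
        for i in range(len(seq) - w + 1):
            matches = sum(a == b for a, b in zip(peptide, seq[i:i + w]))
            best = max(best, matches / w)
    return best

# hypothetical host sequences and query peptide
host = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", "MSLNFLDFEQPIAELEAKIDSL"]
print(best_window_identity("KSHFSRQ", host))   # 1.0: an exact host match
```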

Relevance:

20.00%

Publisher:

Abstract:

Despite much interest in e-grocery, little has changed over the years in an offering that is often geared only towards low-value staple products. Yet, from an e-supermarket perspective, the number of sourcing stores is increasing regularly, providing an illusion of service improvement. This situation, we argue, is leading e-grocery providers to forego profits, as consumers need to look at the competition both online and offline to satisfy their overall regular grocery needs. Expansion of e-grocery operations could be better achieved, we argue, by serving diverse and premium-priced products (e.g. organic, limited-production and regional items; special-occasion items; and health-related products, e.g. for allergies or diabetes) and by utilizing modern logistic techniques more efficiently. A framework is offered presenting a model that includes the delivery of premium products from various suppliers, providing an integrated service solution to e-grocery customers that complements traditional supermarket ranges, creating potential niches for high-value-added products. In this context, the objective was to understand the consumer discrimination factors (i.e. range of products, delivery timing, location, service quality) leading to intentions to purchase more items from e-grocery retailers. Data are derived from a survey of 356 respondents in Turkey’s three biggest metropolitan areas. The relationship between consumer attitudes and demographic characteristics is also analyzed. Factor and SEM analyses are used to discriminate within the sample (n = 356, number of items = 150). Results, future research and policy implications are discussed.
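
As a rough illustration of the factor-analysis step (the SEM stage would typically be carried out in dedicated software and is not sketched here), the snippet below extracts a small number of latent factors from a survey-shaped matrix. The data are random stand-ins; the item count, respondent count, and the choice of four factors mirror the abstract's description but are otherwise assumptions.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# random stand-in for the survey matrix: 356 respondents x 150 Likert
# items (1-5); the real data would come from questionnaire responses
X = rng.integers(1, 6, size=(356, 150)).astype(float)

# standardise the items, then extract four latent factors, one per
# candidate discrimination factor (product range, delivery timing,
# location, service quality)
Z = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=4, random_state=0)
fa.fit(Z)
loadings = fa.components_.T   # 150 x 4 item-by-factor loading matrix
print(loadings.shape)
```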