34 results for methods and measurement

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

Background: There is a paucity of data describing the prevalence of childhood refractive error in the United Kingdom. The Northern Ireland Childhood Errors of Refraction study and its sister study, the Aston Eye Study, are the first population-based surveys of children in the United Kingdom to use both random cluster sampling and cycloplegic autorefraction to quantify levels of refractive error.
Methods: Children aged 6–7 years and 12–13 years were recruited from a stratified random sample of primary and post-primary schools, representative of the population of Northern Ireland as a whole. Measurements included assessment of visual acuity, oculomotor balance, ocular biometry and cycloplegic binocular open-field autorefraction. Questionnaires were used to identify putative risk factors for refractive error.
Results: Participation rates were 57% (399 children) among 6–7-year-olds and 60% (669 children) among 12–13-year-olds. School participation rates showed no statistically significant variation with school size, urban or rural location, or whether the school was in a deprived or non-deprived area. The gender balance, ethnicity and type of schooling of participants reflect those of the Northern Ireland population.
Conclusions: The study design, sample size and methodology will ensure accurate measures of the prevalence of refractive errors in the target population and will facilitate comparisons with other population-based refractive data.
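
For illustration, a minimal sketch of the kind of stratified random cluster sampling described above, in which whole schools (clusters) are drawn at random within strata; the school names and strata below are hypothetical, not the study's sampling frame.

```python
import random

# Hypothetical illustration of stratified random cluster sampling: schools
# (clusters) are grouped into strata, and whole schools are drawn at random
# within each stratum. Names and strata are invented for the example.
schools = [
    {"name": "School A", "stratum": "urban-deprived"},
    {"name": "School B", "stratum": "urban-deprived"},
    {"name": "School C", "stratum": "urban-nondeprived"},
    {"name": "School D", "stratum": "rural-deprived"},
    {"name": "School E", "stratum": "rural-nondeprived"},
    {"name": "School F", "stratum": "rural-nondeprived"},
]

def stratified_cluster_sample(schools, per_stratum, seed=0):
    """Draw `per_stratum` whole schools at random from each stratum."""
    rng = random.Random(seed)
    by_stratum = {}
    for school in schools:
        by_stratum.setdefault(school["stratum"], []).append(school)
    sample = []
    for members in by_stratum.values():
        k = min(per_stratum, len(members))
        sample.extend(rng.sample(members, k))
    return sample

print(stratified_cluster_sample(schools, per_stratum=1))
```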

Relevance:

100.00%

Publisher:

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised model have been developed to cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two deterministic models, called sensitivity analysis and deterministic appraisal, and a stochastic model, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy and the presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to British motor vehicle manufacturing companies are presented and discussed.
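
As an illustration of the kind of stochastic risk-simulation model the abstract describes, here is a minimal sketch in which independent, normally distributed component variables are sampled and an added-value productivity ratio is computed per trial; the variable names and figures are invented, not taken from the thesis or the British Leyland data.

```python
import random
import statistics

# Illustrative risk simulation (not the thesis model): component variables are
# assumed independent and normally distributed, and an added-value productivity
# ratio is computed for each trial. All figures below are invented.
def simulate_productivity(n_trials=10_000, seed=1):
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_trials):
        sales = rng.gauss(100.0, 8.0)           # sales revenue
        bought_in = rng.gauss(55.0, 6.0)        # materials, bought-in parts and services
        conversion_cost = rng.gauss(30.0, 3.0)  # labour and other internal costs
        added_value = sales - bought_in
        ratios.append(added_value / conversion_cost)
    return ratios

ratios = simulate_productivity()
print(f"mean productivity ratio: {statistics.mean(ratios):.3f}")
print(f"standard deviation:      {statistics.stdev(ratios):.3f}")
```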

Relevance:

100.00%

Publisher:

Abstract:

This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author concludes that simulation is the only tool powerful enough to develop a model that would be of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends. This approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model: the first part tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
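
A minimal sketch of the modular idea described above: a skeleton model driving interchangeable segments that can be swapped without altering the rest of the model. The segments and their behaviour here are hypothetical, not components of the actual package.

```python
# Illustrative skeleton-plus-segments structure: each segment is a swappable
# unit stepped by the skeleton model on every simulated clock tick.
class Segment:
    """Base class for a swappable model segment."""
    def step(self, clock, state):
        raise NotImplementedError

class RoundRobinScheduler(Segment):
    def step(self, clock, state):
        if state["ready"]:
            state["ready"].append(state["ready"].pop(0))  # rotate the ready queue

class SimpleMemoryManager(Segment):
    def step(self, clock, state):
        state["free_pages"] = max(0, state["free_pages"] - 1)  # consume one page per tick

class SkeletonModel:
    def __init__(self, segments):
        self.segments = segments  # library segments plugged into the skeleton

    def run(self, ticks):
        state = {"ready": ["p1", "p2", "p3"], "free_pages": 64}
        for clock in range(ticks):
            for segment in self.segments:
                segment.step(clock, state)
        return state

print(SkeletonModel([RoundRobinScheduler(), SimpleMemoryManager()]).run(ticks=10))
```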

Relevance:

100.00%

Publisher:

Abstract:

Whereas the competitive advantage of firms can arise from size and position within their industry as well as physical assets, the pattern of competition in advanced economies has increasingly come to favour those firms that can mobilise knowledge and technological skills to create novelty in their products. At the same time, regions are attracting growing attention as an economic unit of analysis, with firms increasingly locating their functions in select regions within the global space. This article introduces the concept of knowledge competitiveness, defined as an economy’s knowledge capacity, capability and sustainability, and the extent to which this knowledge is translated into economic value and transferred into the wealth of the citizens. The article discusses the way in which the knowledge competitiveness of regions is measured and further introduces the World Knowledge Competitiveness Index, which is the first composite and relative measure of the knowledge competitiveness of the globe’s best performing regions.
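
By way of illustration, a minimal sketch of how a composite, relative index can be built from weighted, normalised regional indicators; the regions, indicators and weights are invented and do not reproduce the published World Knowledge Competitiveness Index methodology.

```python
# Illustrative composite index: each indicator is expressed relative to the
# all-region mean (mean = 100) and combined with assumed weights.
regions = {
    "Region A": {"rd_spend_per_capita": 900.0, "patents_per_million": 250.0, "graduates_share": 0.38},
    "Region B": {"rd_spend_per_capita": 600.0, "patents_per_million": 310.0, "graduates_share": 0.29},
    "Region C": {"rd_spend_per_capita": 400.0, "patents_per_million": 120.0, "graduates_share": 0.41},
}
weights = {"rd_spend_per_capita": 0.4, "patents_per_million": 0.4, "graduates_share": 0.2}

def composite_index(regions, weights):
    """Score each region relative to the all-region mean of each indicator."""
    indicators = weights.keys()
    means = {k: sum(r[k] for r in regions.values()) / len(regions) for k in indicators}
    return {
        name: sum(weights[k] * 100.0 * values[k] / means[k] for k in indicators)
        for name, values in regions.items()
    }

for name, score in sorted(composite_index(regions, weights).items(), key=lambda x: -x[1]):
    print(f"{name}: {score:.1f}")
```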

Relevance:

100.00%

Publisher:

Abstract:

Regulation is subject to information asymmetries that can lead to allocative and productive inefficiencies. One solution, suggested by Shleifer in 1985 and now adopted by many regulatory bodies round the world, is 'benchmarking', which is sometimes called 'yardstick competition'. In this paper we consider Shleifer's original approach to benchmarking and contrast this with the actual use of benchmarking by UK regulatory bodies in telecommunications, water and the energy sector since the privatizations of the 1980s and early 1990s. We find that benchmarking plays only one part and sometimes a small part in the setting of regulatory price caps in the UK. We also find that in practice benchmarking has been subject to a number of difficulties, which mean that it is never likely to be more than one tool in the regulator's armoury. The UK's experience provides lessons for regulation internationally. © 2006 Elsevier Ltd. All rights reserved.
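
A minimal sketch of the yardstick idea attributed to Shleifer above, in which each firm's price cap is derived from the costs of the other firms rather than its own; the firms and cost figures are illustrative.

```python
# Illustrative yardstick competition: each regulated firm's cap is the average
# unit cost of the *other* firms, so a firm cannot raise its own cap by being
# inefficient. Firms and costs are invented for the example.
unit_costs = {"Firm A": 10.0, "Firm B": 12.0, "Firm C": 9.0, "Firm D": 11.0}

def yardstick_caps(unit_costs):
    caps = {}
    for firm in unit_costs:
        others = [cost for name, cost in unit_costs.items() if name != firm]
        caps[firm] = sum(others) / len(others)  # benchmark excludes the firm itself
    return caps

for firm, cap in yardstick_caps(unit_costs).items():
    print(f"{firm}: own cost {unit_costs[firm]:.2f}, yardstick cap {cap:.2f}")
```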

Relevance:

100.00%

Publisher:

Abstract:

In recent years, technologically advanced methodologies such as Translog have gained a lot of ground in translation process research. However, in this paper it will be argued that quantitative research methods can be supplemented by ethnographic qualitative ones so as to enhance our understanding of what underlies the translation process. Although translation studies scholars have sometimes applied an ethnographic approach to the study of translation, this paper offers a different perspective and considers the potential of ethnographic research methods for tapping cognitive and behavioural aspects of the translation process. A number of ethnographic principles are discussed and it is argued that process researchers aiming to understand translators’ perspectives and intentions, how these shape their behaviours, as well as how translators reflect on the situations they face and how they see themselves, would undoubtedly benefit from adopting an ethnographic framework for their studies on translation processes.

Relevance:

100.00%

Publisher:

Abstract:

Although there is a large body of research on brand equity, little in the way of a literature review has been published since Feldwick’s (1996) paper. To address this gap, this paper brings together the scattered literature on the conceptualisation and measurement of consumer-based brand equity. Measures of consumer-based brand equity are classified as either direct or indirect. Indirect measures assess consumer-based brand equity through its demonstrable dimensions and are superior at a diagnostic level. The paper concludes with directions for future research and managerial pointers for setting up a brand equity measurement system.

Relevance:

100.00%

Publisher:

Abstract:

Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional that, when minimized, achieves PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and present comparisons between these methods on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
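
As a concrete illustration of one of the methods listed above, here is a minimal sketch of iterated running medians applied to a noisy piecewise constant signal; the window size and iteration limit are illustrative choices, not recommendations from the paper.

```python
import random
import statistics

def running_median(x, half_window):
    """One pass of a running median with a symmetric window, clipped at the ends."""
    n = len(x)
    return [statistics.median(x[max(0, i - half_window): min(n, i + half_window + 1)])
            for i in range(n)]

def iterated_running_median(x, half_window=5, max_iters=20):
    """Repeat the running median until the output stops changing (a fixed point)."""
    current = list(x)
    for _ in range(max_iters):
        smoothed = running_median(current, half_window)
        if smoothed == current:
            break
        current = smoothed
    return current

rng = random.Random(0)
clean = [0.0] * 50 + [2.0] * 50                      # two-level piecewise constant signal
noisy = [v + rng.gauss(0.0, 0.3) for v in clean]     # additive Gaussian noise
denoised = iterated_running_median(noisy)
print(f"levels either side of the jump: {denoised[45]:.2f} -> {denoised[55]:.2f}")
```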

Relevance:

100.00%

Publisher:

Abstract:

Removing noise from signals which are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. In the first paper (part I) of this series of two, we presented background theory, building on results from the image processing community, to show that the majority of existing PWC denoising algorithms, and more proposed in the wider literature, are each associated with a special case of a generalized functional that, when minimized, solves the PWC denoising problem; part I also showed how the minimizer can be obtained by a range of computational solver algorithms. In this second paper (part II), using the understanding developed in part I, we introduce several novel PWC denoising methods which, for example, combine the global behaviour of mean shift clustering with the local smoothing of total variation diffusion, and show example solver algorithms for these new methods. Comparisons between these methods are performed on synthetic and real signals, revealing that our new methods have a useful role to play. Finally, overlaps between the generalized methods of these two papers and others, such as wavelet shrinkage, hidden Markov models, and piecewise smooth filtering, are touched on.
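
To illustrate one of the ingredients mentioned above, here is a minimal sketch of mean shift clustering applied to the sample values of a signal, which pulls values towards a small number of cluster centres (the constant levels); the bandwidth and iteration count are illustrative, and this is not the combined method introduced in the paper.

```python
import math
import random

def mean_shift_values(x, bandwidth=0.3, iters=30):
    """Move every sample towards the Gaussian-kernel-weighted mean of all values."""
    values = list(x)
    for _ in range(iters):
        new_values = []
        for v in values:
            weights = [math.exp(-0.5 * ((v - u) / bandwidth) ** 2) for u in values]
            total = sum(weights)
            new_values.append(sum(w * u for w, u in zip(weights, values)) / total)
        values = new_values
    return values

rng = random.Random(1)
clean = [0.0] * 40 + [1.5] * 40                      # two-level piecewise constant signal
noisy = [v + rng.gauss(0.0, 0.2) for v in clean]
levels = mean_shift_values(noisy)
print(f"recovered levels: {levels[10]:.2f}, {levels[60]:.2f}")
```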

Relevance:

100.00%

Publisher:

Abstract:

This thesis begins with a review of the literature on team-based working in organisations, highlighting the variations in research findings, and the need for greater precision in our measurement of teams. It continues with an illustration of the nature and prevalence of real and pseudo team-based working, by presenting results from a large sample of secondary data from the UK National Health Service. Results demonstrate that ‘real teams’ have an important and significant impact on the reduction of many work-related safety outcomes. Based on both theoretical and methodological limitations of existing approaches, the thesis moves on to provide a clarification and extension of the ‘real team’ construct, demarcating this from other (pseudo-like) team typologies on a sliding scale, rather than a simple dichotomy. A conceptual model for defining real teams is presented, providing a theoretical basis for the development of a scale on which teams can be measured for varying extents of ‘realness’. A new twelve-item scale is developed and tested with three samples of data comprising 53 undergraduate teams, 52 postgraduate teams, and 63 public sector teams from a large UK organisation. Evidence for the content, construct and criterion-related validity of the real team scale is examined over seven separate validation studies. Theoretical, methodological and practical implications of the real team scale are then discussed.
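
For illustration, a minimal sketch of one routine internal-consistency check (Cronbach's alpha) that might accompany the testing of a twelve-item scale; the response data are invented, and the thesis's validation studies are considerably broader than this single statistic.

```python
import random
import statistics

def cronbach_alpha(responses):
    """responses: list of respondents, each a list of item scores."""
    k = len(responses[0])                          # number of items
    items = list(zip(*responses))                  # transpose to per-item columns
    item_variances = [statistics.variance(col) for col in items]
    total_variance = statistics.variance([sum(person) for person in responses])
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

rng = random.Random(2)
# 60 hypothetical respondents rating 12 items on a 1-5 scale, with a common
# per-respondent tendency so that the items correlate.
data = []
for _ in range(60):
    tendency = rng.uniform(1.5, 4.5)
    data.append([min(5, max(1, round(tendency + rng.gauss(0, 0.7)))) for _ in range(12)])

print(f"Cronbach's alpha: {cronbach_alpha(data):.2f}")
```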