51 results for Error of measurement
in Aston University Research Archive
Abstract:
Three dimensions of subordinate-supervisor relations (affective attachment, deference to supervisor, and personal-life inclusion) that had been found by Y. Chen, Friedman, Yu, Fang, and Lu to be characteristic of a guanxi relationship between subordinates and their supervisors in China were surveyed in Taiwan, Singapore, and six non-Chinese cultural contexts. The Affective Attachment and Deference subscales demonstrated full metric invariance whereas the Personal-Life Inclusion subscale was found to have partial metric invariance across all eight samples. Structural equation modeling revealed that the affective attachment dimension had a cross-nationally invariant positive relationship to affective organizational commitment and a negative relationship to turnover intention. The deference to the supervisor dimension had invariant positive relationships with both affective and normative organizational commitment. The personal-life inclusion dimension was unrelated to all outcomes. These results indicate the relevance of aspects of guanxi to superior-subordinate relations in non-Chinese cultures. Studies of indigenous concepts can contribute to a broader understanding of organizational behavior. © The Author(s) 2014.
Abstract:
The uncertainty of measurements must be quantified and considered in order to prove conformance with specifications and to make other meaningful comparisons based on measurements. While there is a consistent methodology for the evaluation and expression of uncertainty within the metrology community, industry frequently uses the alternative Measurement Systems Analysis (MSA) methodology. This paper sets out to clarify the differences between uncertainty evaluation and MSA, and presents a novel hybrid methodology for industrial measurement which enables a correct evaluation of measurement uncertainty while utilising the practical tools of MSA. In particular, the use of Gage R&R ANOVA and Attribute Gage studies within a wider uncertainty evaluation framework is described. This enables in-line measurement data to be used to establish repeatability and reproducibility, without time-consuming repeatability studies being carried out, while maintaining a complete consideration of all sources of uncertainty, therefore enabling conformance to be proven with a stated level of confidence. Such a rigorous approach to product verification will become increasingly important in the era of the Light Controlled Factory, with metrology acting as the driving force to achieve right-first-time and highly automated manufacture of high-value, large-scale products such as aircraft, spacecraft and renewable power generation structures.
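As a rough illustration of one of the MSA tools the paper places inside an uncertainty framework, the sketch below computes ANOVA Gage R&R variance components from a balanced crossed study (parts × operators × repeats). It is a minimal textbook implementation, not the authors' code; the array layout and function name are assumptions.

```python
import numpy as np

def gage_rr(data):
    """ANOVA Gage R&R variance components for a balanced crossed study.

    data: array of shape (parts, operators, repeats) of measured values.
    Returns (repeatability, reproducibility, total Gage R&R) variances.
    """
    p, o, r = data.shape
    grand = data.mean()
    part_means = data.mean(axis=(1, 2))
    op_means = data.mean(axis=(0, 2))
    cell_means = data.mean(axis=2)

    ss_part = o * r * ((part_means - grand) ** 2).sum()
    ss_op = p * r * ((op_means - grand) ** 2).sum()
    ss_cell = r * ((cell_means - grand) ** 2).sum()
    ss_inter = ss_cell - ss_part - ss_op
    ss_rep = ((data - grand) ** 2).sum() - ss_cell

    ms_op = ss_op / (o - 1)
    ms_inter = ss_inter / ((p - 1) * (o - 1))
    ms_rep = ss_rep / (p * o * (r - 1))

    var_rep = ms_rep                                 # repeatability (equipment variation)
    var_inter = max((ms_inter - ms_rep) / r, 0.0)    # part-operator interaction
    var_op = max((ms_op - ms_inter) / (p * r), 0.0)  # operator variation
    return var_rep, var_op + var_inter, var_rep + var_op + var_inter
```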
Abstract:
The concept of measurement-enabled production is based on integrating metrology systems into production processes, and it has generated significant interest in industry due to its potential to increase process capability and accuracy, which in turn reduces production times and eliminates defective parts. One of the most promising methods of integrating metrology into production is the use of external metrology systems to compensate machine tool errors in real time. This paper describes the development and experimental performance evaluation of a low-cost, laser-tracker-assisted prototype three-axis machine tool. Real-time correction of the machine tool's absolute volumetric error has been achieved. As a result, significant increases in static repeatability and accuracy have been demonstrated, allowing the low-cost three-axis machine tool to reliably reach static positioning accuracies below 35 μm throughout its working volume without any prior calibration or error mapping. This is a significant technical development that demonstrates the feasibility of the proposed methods and can have wide-scale industrial applications by enabling low-cost machine tools of modest structural rigidity, deployed flexibly as end-effectors of robotic automation, to achieve positional accuracies that were previously the preserve of large, high-precision machine tools.
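A minimal sketch of the kind of real-time compensation loop the paper evaluates, under the assumption of a simple proportional correction; the function name, gain and clipping limit are illustrative placeholders, not the authors' control law.

```python
import numpy as np

def compensate_step(commanded, measured, gain=0.8, step_limit=0.05):
    """One cycle of externally metered error compensation (illustrative).

    commanded, measured: XYZ positions in mm (target vs. laser tracker reading).
    Returns an offset, in mm, to add to the machine's commanded position.
    """
    error = np.asarray(commanded) - np.asarray(measured)  # volumetric error
    correction = gain * error                             # damped to avoid oscillation
    return np.clip(correction, -step_limit, step_limit)   # bound each axis per cycle
```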
Abstract:
The Intensive Care Unit (ICU) is one of the vital areas of a hospital providing clinical care, so the quality of the service rendered must be monitored and measured quantitatively. It is therefore essential to know the performance of an ICU in order to identify any deficits and enable the service providers to improve the quality of service. Although there have been many attempts to do this with the help of illness severity scoring systems, the relative lack of success using these methods has led to the search for a form of measurement which would encompass all the different aspects of an ICU in a holistic manner. The Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, is utilised in this study to develop a system for reliably measuring the performance of ICU services. This tool has been applied to a surgical ICU in Barbados; we recommend AHP as a valuable tool to quantify the performance of an ICU. Copyright © 2004 Inderscience Enterprises Ltd.
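For readers unfamiliar with AHP, the sketch below shows its core computation: priority weights from the principal eigenvector of a pairwise comparison matrix, plus Saaty's consistency ratio. The criteria and judgements are invented for illustration and are not those of the Barbados study.

```python
import numpy as np

# Hypothetical pairwise comparisons of three ICU performance criteria
# (e.g. clinical outcome vs. resource use vs. patient satisfaction).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                # priority weight of each criterion

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty's random index for size n
cr = ci / ri                            # consistency ratio; < 0.1 is acceptable
print(weights, cr)
```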
Abstract:
The accuracy of altimetrically derived oceanographic and geophysical information is limited by the precision of the radial component of the satellite ephemeris. A non-dynamic technique is proposed as a method of reducing the global radial orbit error of altimetric satellites. This involves the recovery of each coefficient of an analytically derived radial error correction through a refinement of crossover difference residuals. The crossover data are supplemented by absolute height measurements to permit the retrieval of otherwise unobservable geographically correlated and linearly combined parameters. The feasibility of the radial reduction procedure is established upon application to the three-day repeat orbit of SEASAT. The concept of arc aggregates is devised as a means of extending the method to longer durations, such as the 35-day repeat period of ERS-1. A continuous orbit is effectively created by including the radial misclosure between consecutive long arcs as an infallible observation. The arc aggregate procedure is validated using a combination of three successive SEASAT ephemerides. A complete simulation of the 501-revolution, 35-day repeat orbit of ERS-1 is derived and the recovery of the global radial orbit error over the full repeat period is successfully accomplished. The radial reduction is dependent upon the geographical locations of the supplementary direct height data. Investigations into the respective influences of various sites proposed for the tracking of ERS-1 by ground-based transponders are carried out. The potential effectiveness on radial orbital accuracy of locating future tracking sites at high latitudes is demonstrated.
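The thesis's analytic radial error correction is richer than this, but as a hedged illustration of the underlying estimation step, the sketch below recovers a once-per-revolution radial error model by least squares from crossover differences supplemented with absolute height residuals, which is what makes the geographically correlated part observable. All names and the model form are assumptions for illustration.

```python
import numpy as np

def recover_radial_error(t_asc, t_desc, xover_diff, t_abs, h_abs, omega):
    """Least-squares recovery of a 1-cycle/rev radial error model
    dr(t) = a0 + a1*cos(omega*t) + b1*sin(omega*t).

    Crossover differences constrain dr(t_asc) - dr(t_desc); absolute
    height residuals h_abs constrain dr(t_abs) directly.
    """
    def basis(t):
        t = np.asarray(t, float)
        return np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])

    A = np.vstack([basis(t_asc) - basis(t_desc),  # crossover difference rows
                   basis(t_abs)])                 # direct height rows
    y = np.concatenate([xover_diff, h_abs])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs  # [a0, a1, b1]
```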
Abstract:
This thesis first considers the calibration and signal processing requirements of a neuromagnetometer for the measurement of human visual function. Gradiometer calibration using straight wire grids is examined and optimal grid configurations determined, given realistic constructional tolerances. Simulations show that for a gradiometer balance of 1:10⁴ and a wire spacing error of 0.25 mm, the achievable calibration accuracy is 0.3% for gain, 0.3 mm for position and 0.6° for orientation. Practical results with a 19-channel second-order gradiometer-based system exceed this performance. The real-time application of adaptive reference noise cancellation filtering to running-average evoked response data is examined. In the steady state, the filter can be assumed to be driven by a non-stationary step input arising at epoch boundaries. Based on empirical measures of this driving step, an optimal progression for the filter time constant is proposed which improves upon fixed-time-constant filter performance. The incorporation of the time-derivatives of the reference channels was found to improve the performance of the adaptive filtering algorithm by 15-20% for unaveraged data, falling to 5% with averaging. The thesis concludes with a neuromagnetic investigation of evoked cortical responses to chromatic and luminance grating stimuli. The global magnetic field power of evoked responses to the onset of sinusoidal gratings was shown to have distinct chromatic and luminance sensitive components. Analysis of the results, using a single equivalent current dipole model, shows that these components arise from activity within two distinct cortical locations. Co-registration of the resulting current source localisations with MRI shows a chromatically responsive area lying along the midline within the calcarine fissure, possibly extending onto the lingual and cuneal gyri. It is postulated that this area is the human homologue of the primate cortical area V4.
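A minimal sketch of adaptive reference noise cancellation of the LMS type, including the thesis's device of appending the time-derivatives of the reference channels; the step size and function signature are assumptions, and the thesis's time-constant scheduling at epoch boundaries is not reproduced here.

```python
import numpy as np

def lms_cancel(primary, refs, mu=1e-3, use_derivatives=True):
    """Adaptive reference noise cancellation (LMS), illustrative sketch.

    primary: (N,) signal channel contaminated by environmental noise.
    refs:    (N, R) reference magnetometer channels.
    Appending the time-derivatives of the references lets the filter
    compensate small phase/timing mismatches between channels.
    """
    if use_derivatives:
        refs = np.hstack([refs, np.gradient(refs, axis=0)])
    w = np.zeros(refs.shape[1])
    out = np.empty_like(primary)
    for n in range(len(primary)):
        x = refs[n]
        e = primary[n] - w @ x   # estimated noise subtracted from primary
        w += 2 * mu * e * x      # LMS weight update
        out[n] = e
    return out
```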
Abstract:
Visual perception is dependent on both light transmission through the eye and neuronal conduction through the visual pathway. Advances in clinical diagnostics and treatment modalities over recent years have increased the opportunities to improve the optical path and retinal image quality. Higher order aberrations and retinal straylight are two major factors that influence light transmission through the eye and, ultimately, visual outcome. Recent technological advancements have brought these important factors into the clinical domain; however, the potential applications of these tools and considerations regarding the interpretation of data are much underestimated. The purpose of this thesis was to validate and optimise wavefront analysers and a new clinical tool for the objective evaluation of intraocular scatter. The application of these methods in a clinical setting involving a range of conditions was also explored. The work was divided into two principal sections:

1. Wavefront aberrometry: optimisation, validation and clinical application. The main findings of this work were:
• Observer manipulation of the aberrometer increases variability by a factor of 3.
• Ocular misalignment can profoundly affect reliability, notably for off-axis aberrations.
• Aberrations measured with wavefront analysers using different principles are not interchangeable, with poor relationships and significant differences between values.
• Instrument myopia of around 0.30 D is induced when performing wavefront analysis in non-cyclopleged eyes; values can be as high as 3 D, being higher as the baseline level of myopia decreases. Associated accommodation changes may result in relevant changes to the aberration profile, particularly with respect to spherical aberration.
• Young adult healthy Caucasian eyes have significantly more spherical aberration than Asian eyes when matched for age, gender, axial length and refractive error. Axial length is significantly correlated with most components of the aberration profile.

2. Intraocular light scatter: evaluation of subjective measures, and validation and application of a new objective method utilising clinically derived wavefront patterns. The main findings of this work were:
• Subjective measures of clinical straylight are highly repeatable. Three measurements are suggested as the optimum number for increased reliability.
• Significant differences in straylight values were found for contact lenses designed for contrast enhancement compared to clear lenses of the same design and material specifications. Specifically, grey/green tints induced significantly higher values of retinal straylight.
• Wavefront patterns from a commercial Hartmann-Shack device can be used to obtain objective measures of scatter and are well correlated with subjective straylight values.
• Perceived retinal straylight was similar in groups of patients implanted with monofocal and multifocal intraocular lenses. Correlation between objective and subjective measurements of scatter is poor, possibly due to different illumination conditions between the testing procedures, or a neural component which may alter with age.

Careful acquisition results in highly reproducible in vivo measures of higher order aberrations; however, data from different devices are not interchangeable, which brings the accuracy of measurement into question. Objective measures of intraocular straylight can be derived from clinical aberrometry and may be of great diagnostic and management importance in the future.
Abstract:
An experimental method for characterizing the time-resolved phase noise of a fast switching tunable laser is discussed. The method experimentally determines a complementary cumulative distribution function (CCDF) of the laser's differential phase as a function of time after a switching event. A time-resolved bit error rate for differential quadrature phase shift keying (DQPSK) formatted data, calculated using the phase noise measurements, was fitted to an experimental time-resolved bit error rate measured using a field programmable gate array, and good agreement was found between the two time-resolved bit error rates.
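A hedged sketch of how a CCDF of the differential phase might be built from repeated phase traces aligned to the switching event; the array layout and the drift-removal step are assumptions, not the authors' exact procedure.

```python
import numpy as np

def differential_phase_ccdf(phase, lag, thresholds):
    """CCDF of the differential phase magnitude, illustrative sketch.

    phase: (events, samples) unwrapped phase traces, one row per switch event.
    lag:   differential delay in samples (e.g. one symbol period).
    Returns P(|dphi| > threshold) for each threshold, per time sample.
    """
    dphi = phase[:, lag:] - phase[:, :-lag]     # differential phase vs. time
    dphi = np.abs(dphi - dphi.mean(axis=0))     # remove the deterministic transient
    return np.array([(dphi > th).mean(axis=0) for th in thresholds])
```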
Abstract:
In the present paper we numerically study the instrumental impact on the statistical properties of a quasi-CW Raman fiber laser using a simple model of multimode laser radiation. The effects with the most influence are the limited electrical bandwidth of the measurement equipment and noise. To check this influence, we developed a simple model of multimode quasi-CW generation with exponential statistics (i.e. uncorrelated modes). We found that the area near zero intensity in the probability density function (PDF) is strongly affected by both factors; for example, both lead to the formation of a negative wing of the intensity distribution. However, the far-wing slope of the PDF is not affected by noise and, for a moderate mismatch between the optical and electrical bandwidths, is only slightly affected by the bandwidth limitation. The generation spectrum often becomes broader at higher power in experiments, so the spectral/electrical bandwidth mismatch factor increases with power, which can lead to an artificial dependence of the PDF slope on power. It was also found that both effects influence the background level of the autocorrelation function (ACF): noise decreases it, while the limited bandwidth increases it. © (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
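A minimal sketch of the kind of model described: many uncorrelated random-phase modes summed to give exponential intensity statistics, then distorted by a finite electrical bandwidth and additive detection noise. All parameters (mode count, cut-off frequency, noise level) are arbitrary illustrations, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_modes, n_samples = 100, 2**16

# Uncorrelated modes with random phases -> exponential intensity statistics
freqs = np.linspace(-0.05, 0.05, n_modes)          # cycles/sample
t = np.arange(n_samples)
phases = rng.uniform(0, 2 * np.pi, n_modes)
field = np.exp(2j * np.pi * np.outer(t, freqs) + 1j * phases).sum(axis=1)
intensity = np.abs(field) ** 2

# Finite electrical bandwidth: low-pass filter in the Fourier domain
spec = np.fft.rfft(intensity)
f = np.fft.rfftfreq(n_samples)
spec[f > 0.01] = 0.0                               # detector/oscilloscope cut-off
measured = np.fft.irfft(spec, n_samples)

# Additive noise can push samples below zero, producing the
# artificial negative wing of the intensity PDF described above
measured += rng.normal(0, 0.05 * measured.std(), n_samples)
pdf, edges = np.histogram(measured, bins=200, density=True)
```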
Abstract:
As the largest source of dimensional measurement uncertainty, addressing the challenges of thermal variation is vital to ensure product and equipment integrity in the factories of the future. While it is possible to closely control room temperature, this is often not practical or economical to realise in all cases where inspection is required. This article reviews recent progress and trends in seven key commercially available industrial temperature measurement sensor technologies primarily in the range of 0 °C–50 °C for invasive, semi-invasive and non-invasive measurement. These sensors will ultimately be used to measure and model thermal variation in the assembly, test and integration environment. The intended applications for these technologies are presented alongside some consideration of measurement uncertainty requirements with regard to the thermal expansion of common materials. Research priorities are identified and discussed for each of the technologies as well as temperature measurement at large. Future developments are briefly discussed to provide some insight into which direction the development and application of temperature measurement technologies are likely to head.
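To give a sense of the uncertainty requirements the review alludes to, the snippet below propagates a temperature standard uncertainty into a length uncertainty via the first-order relation u_L = L · α · u_T; the coefficients of thermal expansion are nominal handbook figures used purely for illustration.

```python
# Length-measurement uncertainty contributed by temperature uncertainty,
# u_L = L * alpha * u_T (first order), for some common materials.
alpha = {"aluminium": 23e-6, "steel": 11.5e-6, "invar": 1.2e-6}  # 1/K (nominal)
L, u_T = 1.0, 0.5  # 1 m feature, 0.5 degC standard uncertainty in temperature
for material, a in alpha.items():
    print(f"{material}: u_L = {L * a * u_T * 1e6:.1f} um")
```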
Abstract:
In dimensional metrology, the largest source of measurement uncertainty is often thermal variation. Dimensional measurements are currently scaled linearly, using ambient temperature measurements and coefficients of thermal expansion, to the ideal metrology temperature of 20 °C. This scaling is particularly difficult to implement with confidence in large volumes, as the temperature is unlikely to be uniform, resulting in thermal gradients. A number of well-established computational methods are used in the design phase of product development for the prediction of thermal and gravitational effects, and these could be used to a greater extent in metrology. This paper outlines the theory of how physical measurements of dimension and temperature can be combined more comprehensively throughout the product lifecycle, from design through to the manufacturing phase. The Hybrid Metrology concept is also introduced: an approach to metrology which promises to improve product and equipment integrity in future manufacturing environments. The Hybrid Metrology System combines various state-of-the-art physical dimensional and temperature measurement techniques with established computational methods to better predict thermal and gravitational effects.
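The linear scaling the paper refers to can be written as L20 = L_measured / (1 + alpha * (T - 20)); a minimal sketch, assuming a uniform part temperature, is given below. The paper's argument is precisely that this uniformity assumption breaks down in large volumes, where thermal gradients must be modelled instead.

```python
def scale_to_20C(length_measured, temp_c, alpha):
    """Scale a dimensional measurement to the 20 degC reference temperature.

    Linear CTE correction assuming a uniform part temperature:
    L20 = L_measured / (1 + alpha * (T - 20)).
    """
    return length_measured / (1 + alpha * (temp_c - 20.0))

# e.g. a nominally 2 m steel part measured at 23 degC
print(scale_to_20C(2.000150, 23.0, 11.5e-6))
```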
Abstract:
It has never been easy for manufacturing companies to understand their confidence level in terms of how accurately and with what degree of flexibility parts can be made. This brings uncertainty in finding the most suitable manufacturing method as well as in controlling their product and process verification systems. The aim of this research is to develop a system for capturing the company's knowledge and expertise and then reflecting it in an MRP (Manufacturing Resource Planning) system. A key activity here is measuring manufacturing and machining capabilities to a reasonable confidence level. For this purpose an in-line control measurement system is introduced to the company. Using SPC (Statistical Process Control) not only helps to predict the trend in the manufacturing of parts but also minimises human error in measurement. A Gauge R&R (Repeatability and Reproducibility) study identifies problems in measurement systems. Measurement is like any other process in terms of variability, and reducing this variation via an automated machine probing system helps to avoid defects in future products.

Developments in the aerospace, nuclear, and oil and gas industries demand materials with high performance and high temperature resistance under corrosive and oxidising environments. Superalloys were developed in the latter half of the 20th century as high strength materials for such purposes. For the same characteristics, superalloys are considered difficult-to-cut alloys when it comes to forming and machining. Furthermore, due to the sensitivity of superalloy applications, in many cases they must be manufactured to tight tolerances. In addition, superalloys, specifically nickel-based ones, have unique features such as low thermal conductivity due to the high amount of nickel in their composition. This causes a high surface temperature on the workpiece at the machining stage, which leads to deformation in the final product.

As with every process, material variations have a significant impact on machining quality. The main causes of variation are chemical composition and mechanical hardness. The non-uniform distribution of metal elements is a major source of variation in metallurgical structures. Different heat treatment standards are designed for processing the material to the desired hardness levels based on application. In order to take corrective actions, a study on the material aspects of superalloys has been conducted. In this study, samples from different batches of material were analysed. This involved material preparation for microscopy analysis, and study of the effect of chemical composition on hardness (before and after heat treatment). Some of the results are discussed and presented in this paper.
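As a hedged illustration of the SPC element described above, the sketch below computes X-bar and R control chart limits from subgroups of in-line measurements; the constants are the standard tabulated values for subgroups of five, and the function name and data layout are assumptions.

```python
import numpy as np

def xbar_r_limits(samples):
    """Control limits for X-bar and R charts (illustrative SPC sketch).

    samples: (k, n) array of k subgroups of n in-line measurements.
    Constants A2, D3, D4 are the tabulated values for subgroup size n = 5.
    """
    A2, D3, D4 = 0.577, 0.0, 2.114
    xbar = samples.mean(axis=1)
    r = samples.max(axis=1) - samples.min(axis=1)
    xbar_bar, r_bar = xbar.mean(), r.mean()
    return {"xbar": (xbar_bar - A2 * r_bar, xbar_bar + A2 * r_bar),
            "range": (D3 * r_bar, D4 * r_bar)}
```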
Abstract:
This paper details a method of estimating the uncertainty of dimensional measurement for a three-dimensional coordinate measuring machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe counting interferometer, with a multilateration-like technique employed to establish the three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. Specifically, a distributed coordinate measurement device consisting of a network of Rotary-Laser Automatic Theodolites (R-LATs) was tested; this system is known commercially as indoor GPS (iGPS). The method was found to be practical and was used to estimate that the uncertainty of measurement for the basic iGPS system is approximately 1 mm at a 95% confidence level throughout a measurement volume of approximately 10 m × 10 m × 1.5 m. © 2010 IOP Publishing Ltd.
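A minimal sketch of a multilateration-like solve of the kind used to establish the calibrated reference coordinates: a Gauss-Newton fit of a 3D point to interferometric distances from known station positions. This is an illustration under those assumptions, not the authors' implementation.

```python
import numpy as np

def multilaterate(stations, distances, x0=None, iters=20):
    """Solve for a 3D point from distances to known stations (Gauss-Newton).

    stations:  (m, 3) known station coordinates.
    distances: (m,) measured distances to the unknown point.
    """
    x = np.mean(stations, axis=0) if x0 is None else np.asarray(x0, float)
    for _ in range(iters):
        diff = x - stations                    # (m, 3)
        ranges = np.linalg.norm(diff, axis=1)  # predicted ranges
        J = diff / ranges[:, None]             # Jacobian of range w.r.t. x
        dx, *_ = np.linalg.lstsq(J, distances - ranges, rcond=None)
        x += dx
    return x
```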
Abstract:
Long period gratings (LPGs) were written into a D-shaped optical fibre that has an elliptical core with a W-shaped refractive index profile, and the first detailed investigation of such LPGs is presented. The LPGs' attenuation bands were found to be sensitive to the polarisation of the interrogating light, with a spectral separation of about 15 nm between the two orthogonal polarisation states. A finite element method was successfully used to model many of the behavioural features of the LPGs. In addition, two spectrally overlapping attenuation bands corresponding to orthogonal polarisation states were observed; modelling successfully reproduced this spectral feature. The spectral sensitivity of both orthogonal states was experimentally measured with respect to temperature and bending. These LPG devices produced blue and red wavelength shifts depending upon the orientation of the bend, with measured maximum sensitivities of -3.56 and +6.51 nm m, suggesting that this type of fibre LPG may be useful as a shape/bend orientation sensor with reduced errors associated with polarisation dependence. The use of neighbouring bands to discriminate between temperature and bending was also demonstrated, leading to an overall curvature error of ±0.14 m⁻¹ and an overall temperature error of ±0.3 °C, with a maximum polarisation dependence error of ±8 × 10⁻² m⁻¹ for curvature and ±5 × 10⁻² °C for temperature.
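The dual-band discrimination reported here reduces to inverting a 2×2 sensitivity matrix. The sketch below uses the quoted bend sensitivities of -3.56 and +6.51 nm m; the temperature sensitivities are hypothetical placeholders, since the abstract does not quote them.

```python
import numpy as np

# Sensitivity matrix for two neighbouring LPG attenuation bands:
# band shift (nm) = K @ [temperature change (degC), curvature change (1/m)]
K = np.array([[0.05, -3.56],   # band 1: nm/degC (hypothetical), nm m (quoted)
              [0.04, +6.51]])  # band 2: nm/degC (hypothetical), nm m (quoted)

def discriminate(shift_nm):
    """Recover (dT, d_curvature) from the two measured band shifts."""
    return np.linalg.solve(K, shift_nm)

print(discriminate(np.array([0.10, 0.20])))
```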
Abstract:
Growth in the availability and capability of modern statistical software has resulted in greater numbers of research techniques being applied across the marketing discipline. However, with such advances come concerns that techniques may be misinterpreted by researchers. This issue is critical, since misinterpretation could produce erroneous findings. This paper investigates some assumptions regarding: 1) the assessment of discriminant validity; and 2) what confirmatory factor analysis accomplishes. Examples that address these points are presented, and some procedural remedies are suggested based upon the literature. This paper is therefore primarily concerned with the development of measurement theory and practice. If advances in theory development are not based upon sound methodological practice, we as researchers could be basing our work upon shaky foundations.
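One widely used assessment of discriminant validity of the kind at issue here is the Fornell-Larcker criterion, which compares each construct's average variance extracted (AVE) with its squared inter-construct correlations; a minimal sketch follows, with data structures that are assumptions for illustration rather than the paper's procedure.

```python
import numpy as np

def fornell_larcker(loadings, construct_corr):
    """Fornell-Larcker check of discriminant validity (illustrative).

    loadings: dict of construct name -> array of standardised item loadings.
    construct_corr: (c, c) matrix of inter-construct correlations,
                    ordered as the dict keys.
    Discriminant validity is supported when each construct's AVE exceeds
    its squared correlation with every other construct.
    """
    names = list(loadings)
    ave = np.array([np.mean(np.asarray(l) ** 2) for l in loadings.values()])
    ok = True
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            shared = construct_corr[i, j] ** 2
            if shared >= min(ave[i], ave[j]):
                print(f"{names[i]} vs {names[j]}: r^2={shared:.2f} >= AVE -> problem")
                ok = False
    return ok
```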