969 results for Dimensional measurement accuracy
Abstract:
The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. An extension is made in this thesis to the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised model have been developed to cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, expected accuracy and presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to the British motor vehicle manufacturing companies are presented and discussed.
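As a rough illustration of the risk-simulation idea described in this abstract, the sketch below draws independent, normally distributed component variables and builds the resulting added-value productivity distribution as class intervals. All variable names and figures are invented for illustration and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo trials

# Illustrative component variables (means/SDs are made-up figures),
# sampled independently from normal distributions, as the stochastic
# model assumes statistical independence and normality.
sales         = rng.normal(120.0, 8.0, N)  # sales revenue
bought_in     = rng.normal(70.0, 6.0, N)   # materials, bought-in parts and services
employee_cost = rng.normal(30.0, 2.0, N)   # employee costs

added_value = sales - bought_in
productivity = added_value / employee_cost  # added-value productivity index

# Summarise the simulated productivity distribution as class intervals.
counts, edges = np.histogram(productivity, bins=10)
for lo, hi, c in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:5.2f} .. {hi:5.2f} : {c / N:6.2%}")
print(f"mean = {productivity.mean():.3f}, sd = {productivity.std():.3f}")
```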
Abstract:
Aim: The aim of this study was to evaluate the practicality and accuracy of tonometers used in routine clinical practice for established keratoconus (KC). Methods: This was a prospective study of 118 normal and 76 keratoconic eyes in which intraocular pressure (IOP) was measured in random order using the Goldmann applanation tonometer (GAT), Pascal dynamic contour tonometer (DCT), Reichert ocular response analyser (ORA) and TonoPen XL tonometer. Corneal hysteresis (CH) and corneal resistance factor (CRF), as calculated by the ORA, were recorded. Central corneal thickness (CCT) was measured using an ultrasound pachymeter. Results: The difference in IOP values between instruments was highly significant in both study groups (p<0.001). All other IOP measures were significantly higher than those for GAT, except for the Goldmann-correlated IOP (IOPg; the average of the two applanation pressure points) as measured by the ORA in the control group and the corneal-compensated (CH-corrected) IOP (IOPcc) measures in the KC group. CCT, CH and CRF were significantly lower in the KC group (p<0.001). Apart from the DCT, all techniques tended to record higher IOP in eyes with thicker corneas. Conclusion: The DCT and the ORA are currently the most appropriate tonometers to use in KC for the measurement of IOPcc. Corneal factors such as CH and CRF may be of more importance than CCT in causing inaccuracies in applanation tonometry techniques.
Abstract:
An array of in-line curvature sensors on a garment is used to monitor the thoracic and abdominal movements of a human during respiration. The results are used to obtain volumetric changes of the human torso in agreement with a spirometer used simultaneously at the mouth. The array of 40 in-line fiber Bragg gratings is used to produce 20 curvature sensors at different locations, each sensor consisting of two fiber Bragg gratings. The 20 curvature sensors and adjoining fiber are encapsulated in a low-temperature-cured synthetic silicone. The sensors are wavelength-interrogated by a commercially available system from Moog Insensys, and the wavelength changes are calibrated to recover curvature. A three-dimensional algorithm is used to generate shape changes during respiration, allowing the measurement of absolute volume changes at various sections of the torso. The sensing scheme is shown to yield a volumetric error of 6%. Comparing the volume data obtained from the spirometer with the volume estimated from the synchronous data of the shape-sensing array yielded a Pearson correlation coefficient of 0.86 (p < 0.01).
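The abstract does not give the calibration itself; the sketch below shows one common way a grating pair is turned into a curvature reading, using a linear differential calibration. The calibration constant is a made-up placeholder, not the paper's instrument-specific value.

```python
import numpy as np

# Illustrative linear calibration: curvature (1/m) per nm of differential
# Bragg wavelength shift between the two gratings of a sensor pair.
# This constant is a placeholder, not the paper's calibration.
K_CAL = 2.5  # (1/m) per nm

def curvature(shift_upper_nm: np.ndarray, shift_lower_nm: np.ndarray) -> np.ndarray:
    """Recover curvature from a pair of FBG wavelength shifts.

    Differencing the two gratings of a pair rejects strain and temperature
    components common to both, leaving the bend-induced shift.
    """
    return K_CAL * (shift_upper_nm - shift_lower_nm)

# Example: 20 sensor pairs sampled at one instant during breathing.
rng = np.random.default_rng(0)
upper = rng.normal(0.04, 0.01, 20)   # nm
lower = rng.normal(-0.04, 0.01, 20)  # nm
print(curvature(upper, lower))       # 1/m, one value per torso location
```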
Abstract:
We investigate two numerical procedures for the Cauchy problem in linear elasticity, involving the relaxation of either the given boundary displacements (Dirichlet data) or the prescribed boundary tractions (Neumann data) on the over-specified boundary, in the alternating iterative algorithm of Kozlov et al. (1991). The two mixed direct (well-posed) problems associated with each iteration are solved using the method of fundamental solutions (MFS), in conjunction with the Tikhonov regularization method, while the optimal value of the regularization parameter is chosen via the generalized cross-validation (GCV) criterion. An efficient regularizing stopping criterion, which ceases the iterative procedure at the point where the accumulation of noise becomes dominant and the errors in predicting the exact solutions increase, is also presented. The MFS-based iterative algorithms with relaxation are tested on Cauchy problems for isotropic linear elastic materials in various geometries to confirm the numerical convergence, stability, accuracy and computational efficiency of the proposed method.
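For readers unfamiliar with the regularisation machinery named above, here is a minimal generic sketch of Tikhonov regularisation with the parameter chosen by GCV, computed via the SVD. It is a sketch of the general technique on a toy problem, not the paper's MFS implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def tikhonov_gcv(A: np.ndarray, b: np.ndarray):
    """Tikhonov-regularised least squares with the regularisation
    parameter chosen by generalised cross-validation (GCV)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b
    b_perp = b @ b - beta @ beta  # residual component outside range(A)

    def gcv(log_lam):
        lam = 10.0 ** log_lam
        f = s**2 / (s**2 + lam**2)          # Tikhonov filter factors
        resid = np.sum(((1 - f) * beta) ** 2) + b_perp
        denom = (A.shape[0] - np.sum(f)) ** 2
        return resid / denom

    res = minimize_scalar(gcv, bounds=(-10, 2), method="bounded")
    lam = 10.0 ** res.x
    f = s**2 / (s**2 + lam**2)
    x = Vt.T @ (f * beta / s)               # regularised solution
    return x, lam

# Demo on a mildly ill-conditioned system with noisy data.
rng = np.random.default_rng(1)
A = np.vander(np.linspace(0, 1, 50), 12, increasing=True)
x_true = rng.normal(size=12)
b = A @ x_true + rng.normal(0, 1e-3, 50)
x_hat, lam = tikhonov_gcv(A, b)
print(f"lambda = {lam:.2e}, error = {np.linalg.norm(x_hat - x_true):.3f}")
```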
Abstract:
We propose a self-referenced multiplexed fibre interferometer (MFI) using a tunable laser and a fibre Bragg grating (FBG). The optical measurement system multiplexes two Michelson fibre interferometers that share the optical path in the main part of the optical system. One fibre optic interferometer is used as a reference interferometer to monitor and maintain the high accuracy of the measurement system under environmental perturbations. The other is used as a measurement interferometer to obtain information from the target. An active phase tracking homodyne (APTH) technique is applied for signal processing to achieve high resolution. The MFI can be used for high-precision absolute displacement measurement with different combinations of wavelengths from the tunable laser. By means of the wavelength-division multiplexing (WDM) technique, the MFI is also capable of realising on-line surface measurement, in which traditional stylus scanning is replaced by spatial light-wave scanning so as to greatly improve measurement speed and robustness.
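The abstract does not spell out the multi-wavelength arithmetic, so below is a minimal sketch of the standard two-wavelength (synthetic-wavelength) principle for absolute displacement measurement, with illustrative wavelengths; the APTH signal processing itself is not reproduced here.

```python
import numpy as np

def synthetic_wavelength(lam1_nm: float, lam2_nm: float) -> float:
    """Synthetic (beat) wavelength of a two-wavelength interferometer."""
    return lam1_nm * lam2_nm / abs(lam1_nm - lam2_nm)

def absolute_displacement(phase1: float, phase2: float,
                          lam1_nm: float, lam2_nm: float) -> float:
    """Coarse absolute displacement (nm) from the phase difference of two
    interferograms; unambiguous within half a synthetic wavelength
    (Michelson geometry, so the optical path is twice the displacement)."""
    lam_s = synthetic_wavelength(lam1_nm, lam2_nm)
    dphi = (phase1 - phase2) % (2 * np.pi)
    return lam_s * dphi / (4 * np.pi)

# Example with two illustrative tunable-laser wavelengths.
lam1, lam2 = 1550.0, 1551.0  # nm -> synthetic wavelength ~2.4 mm
L = 150_000.0                # true displacement, nm
p1 = (4 * np.pi * L / lam1) % (2 * np.pi)
p2 = (4 * np.pi * L / lam2) % (2 * np.pi)
print(f"synthetic wavelength = {synthetic_wavelength(lam1, lam2)/1e6:.3f} mm")
print(f"recovered displacement = {absolute_displacement(p1, p2, lam1, lam2):.0f} nm")
```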
Abstract:
Computational Fluid Dynamics (CFD) has found great acceptance among the engineering community as a tool for research and design of processes that are practically difficult or expensive to study experimentally. One of these processes is biomass gasification in a Circulating Fluidized Bed (CFB). Biomass gasification is the thermo-chemical conversion of biomass at high temperature and with a controlled oxygen amount into fuel gas, also sometimes referred to as syngas. A circulating fluidized bed is a type of reactor in which it is possible to maintain a stable and continuous circulation of solids in a gas-solid system. The main objectives of this thesis are fourfold: (i) to develop a three-dimensional predictive model of biomass gasification in a CFB riser using advanced CFD; (ii) to experimentally validate the developed hydrodynamic model using conventional and advanced measuring techniques; (iii) to study the complex hydrodynamics, heat transfer and reaction kinetics through modelling and simulation; and (iv) to study the CFB gasifier performance through parametric analysis and identify the optimum operating condition to maximize the product gas quality. Two different and complementary experimental techniques were used to validate the hydrodynamic model, namely pressure measurement and particle tracking. Pressure measurement is a very common and widely used technique in fluidized bed studies, while particle tracking using positron emission particle tracking (PEPT), originally developed for medical imaging, is a relatively new technique in the engineering field; it is relatively expensive and available at only a few research centres around the world. This study started with a simple poly-dispersed single solid phase and then moved to binary solid phases. The single solid phase was used for primary validations and for eliminating unnecessary options and steps in building the hydrodynamic model. The outcomes from the primary validations were then applied to the secondary validations of the binary mixture to avoid time-consuming computations. Studies on binary solid mixture hydrodynamics are rarely reported in the literature. In this study the binary solid mixture was modelled and validated using experimental data from both techniques mentioned above, and good agreement was achieved with both. Following the general gasification steps, the developed model is separated into three main gasification stages: drying; devolatilization and tar cracking; and partial combustion and gasification. Drying was modelled as mass transfer from the solid phase to the gas phase. The devolatilization and tar cracking model consists of two steps: devolatilization of the biomass, treated as a single reaction generating the biomass gases from the volatile materials, and tar cracking, also modelled as a single reaction generating gases with fixed mass fractions. The first reaction was classified as heterogeneous, while the second was classified as homogeneous. The partial combustion and gasification model consisted of carbon combustion reactions and carbon and gas phase reactions. Partial combustion was considered for C, CO, H2 and CH4. The carbon gasification reactions used in this study are the Boudouard reaction with CO2, the reaction with H2O, and the methanation reaction to generate methane.
The other gas phase reactions considered in this study are the water gas shift reaction, which is modelled as a reversible reaction, and the methane steam reforming reaction. The developed gasification model was validated using different experimental data from the literature over a wide range of operating conditions. Good agreement was observed, confirming the capability of the model to predict biomass gasification in a CFB with good accuracy. The developed model has been successfully used to carry out sensitivity and parametric analyses. The sensitivity analysis covered the effect of including various combustion reactions and the effect of radiation on the gasification reactions. The model was also used for parametric analysis by varying the following gasifier operating conditions: fuel/air ratio; biomass flow rate; sand (heat carrier) temperature; sand flow rate; sand and biomass particle sizes; gasifying agent (pure air or pure steam); pyrolysis model used; and steam/biomass ratio. Finally, based on these parametric and sensitivity analyses, a final model was recommended for the simulation of biomass gasification in a CFB riser.
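As a small worked illustration of one of the reactions named above, the sketch below evaluates the equilibrium constant of the reversible water-gas shift reaction using the widely quoted Moe correlation, and shows the reversible net-rate form commonly used in gasifier models. The forward rate constant is an illustrative placeholder, not the thesis's kinetic parameter.

```python
import numpy as np

def k_eq_wgs(T_kelvin: float) -> float:
    """Equilibrium constant of the water-gas shift reaction
    CO + H2O <=> CO2 + H2 (Moe correlation, T in kelvin)."""
    return np.exp(4577.8 / T_kelvin - 4.33)

# The shift is exothermic, so Keq falls as the gasifier gets hotter,
# which is why the reaction must be modelled as reversible:
for T in (900.0, 1000.0, 1100.0, 1200.0):
    print(f"T = {T:6.0f} K  Keq = {k_eq_wgs(T):6.3f}")

def wgs_net_rate(c_co, c_h2o, c_co2, c_h2, T, k_fwd=2.78):
    """Reversible net rate of the kind used in gasifier models;
    k_fwd is an illustrative placeholder, not the thesis's value."""
    return k_fwd * (c_co * c_h2o - c_co2 * c_h2 / k_eq_wgs(T))
```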
Abstract:
Distributed Brillouin sensing of strain and temperature works by making spatially resolved measurements of the position of the measurand-dependent extremum of the resonance curve associated with the scattering process in the weakly nonlinear regime. Typically, measurements of backscattered Stokes intensity (the dependent variable) are made at a number of predetermined fixed frequencies covering the design measurand range of the apparatus and combined to yield an estimate of the position of the extremum. The measurand can then be found because its relationship to the position of the extremum is assumed known. We present analytical expressions relating the relative error in the extremum position to experimental errors in the dependent variable. This is done for two cases: (i) a simple non-parametric estimate of the mean based on moments and (ii) the case in which a least squares technique is used to fit a Lorentzian to the data. The question of statistical bias in the estimates is discussed and in the second case we go further and present for the first time a general method by which the probability density function (PDF) of errors in the fitted parameters can be obtained in closed form in terms of the PDFs of the errors in the noisy data.
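A minimal sketch of case (ii) above: fitting a Lorentzian to noisy Stokes-intensity samples by least squares in order to locate the measurand-dependent extremum. The scan frequencies, linewidth and noise level are illustrative values, not figures from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, peak, f_b, width):
    """Brillouin resonance curve: peak intensity, centre frequency f_b
    (the measurand-dependent extremum) and full width at half maximum."""
    return peak / (1.0 + ((f - f_b) / (width / 2.0)) ** 2)

# Simulated Stokes-intensity measurements at predetermined fixed
# scan frequencies, with additive noise (all numbers illustrative).
rng = np.random.default_rng(7)
freqs = np.linspace(10.6e9, 11.2e9, 49)                 # Hz
truth = lorentzian(freqs, 1.0, 10.85e9, 40e6)
data = truth + rng.normal(0.0, 0.02, freqs.size)

p0 = [data.max(), freqs[np.argmax(data)], 50e6]         # rough start
popt, pcov = curve_fit(lorentzian, freqs, data, p0=p0)
print(f"fitted extremum = {popt[1]/1e9:.4f} GHz, "
      f"sigma = {np.sqrt(pcov[1, 1])/1e6:.2f} MHz")
```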
Abstract:
Purpose – This paper aims to consider how climate change performance is measured and accounted for within the performance framework for local authority areas in England adopted in 2008. It critically evaluates the design of two mitigation and one adaptation indicators that are most relevant to climate change. Further, the potential for these performance indicators to contribute to climate change mitigation and adaptation is discussed. Design/methodology/approach – The authors begin by examining the importance of the performance framework and the related Local Area Agreements (LAAs), which were negotiated for all local areas in England between central government and Local Strategic Partnerships (LSPs). This development is located within the broader literature relating to new public management. The potential for this framework to assist in delivering the UK's climate change policy objectives is researched in a two-stage process. First, government publications and all 150 LAAs were analysed to identify the level of priority given to the climate change indicators. Second, interviews were conducted in spring 2009 with civil servants and local authority officials from the English West Midlands who were engaged in negotiating the climate change content of the LAAs. Findings – Nationally, the authors find that 97 per cent of LAAs included at least one climate change indicator as a priority. The indicators themselves, however, are perceived to be problematic – in terms of appropriateness, accuracy and timeliness. In addition, concerns were identified about the level of local control over the drivers of climate change performance and, therefore, a question is raised as to how LSPs can be held accountable for this. On a more positive note, for those concerned about climate change, the authors do find evidence that the inclusion of these indicators within the performance framework has helped to move climate change up the agenda for local authorities and their partners. However, actions by the UK's new coalition government to abolish the national performance framework and substantially reduce public expenditure potentially threaten this advance. Originality/value – This paper offers an insight into a new development for measuring climate change performance at a local level, which is relatively under-researched. It also contributes to knowledge of accountability within a local government setting and provides a reference point for further research into the potential role of local actions to address the issue of climate change.
Abstract:
Background/aim: The technique of photoretinoscopy is unique in being able to measure the dynamics of the oculomotor system (ocular accommodation, vergence, and pupil size) remotely (working distance typically 1 metre) and objectively in both eyes simultaneously. The aim of this study was to evaluate clinically the measurement of refractive error by a recent commercial photoretinoscopic device, the PowerRefractor (PlusOptiX, Germany). Method: The validity and repeatability of the PowerRefractor were compared to subjective (non-cycloplegic) refraction on 100 adult subjects (mean age 23.8 (SD 5.7) years) and objective autorefraction (Shin-Nippon SRW-5000, Japan) on 150 subjects (20.1 (4.2) years). Repeatability was assessed by examining the differences between autorefractor readings taken from each eye and by re-measuring the objective prescription of 100 eyes at a subsequent session. Results: On average the PowerRefractor prescription was not significantly different from the subjective refraction, although quite variable (difference -0.05 (0.63) D, p = 0.41), and was more negative than the SRW-5000 prescription (by -0.20 (0.72) D, p<0.001). There was no significant bias in the accuracy of the instrument with regard to the type or magnitude of refractive error. The PowerRefractor was found to be repeatable over the prescription range of -8.75 D to +4.00 D (mean spherical equivalent) examined. Conclusion: The PowerRefractor is a useful objective screening instrument and, because of its remote and rapid measurement of both eyes simultaneously, is able to assess the oculomotor response in a variety of unrestricted viewing conditions and patient types.
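A minimal sketch of how validity figures of the kind quoted above (mean difference, its spread, and a paired significance test, plus Bland-Altman limits of agreement) are commonly computed in method-comparison studies; the simulated prescriptions are illustrative only and do not reproduce the study data.

```python
import numpy as np
from scipy import stats

def agreement(instrument_a: np.ndarray, instrument_b: np.ndarray):
    """Mean difference, its SD, 95% limits of agreement and a paired
    t-test between two refraction methods on the same eyes."""
    d = instrument_a - instrument_b
    mean_d, sd_d = d.mean(), d.std(ddof=1)
    loa = (mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d)
    t, p = stats.ttest_rel(instrument_a, instrument_b)
    return mean_d, sd_d, loa, p

# Simulated mean spherical equivalent prescriptions (illustrative values).
rng = np.random.default_rng(3)
subjective = rng.normal(-1.5, 2.5, 100)                    # dioptres
powerrefractor = subjective + rng.normal(-0.05, 0.63, 100)
mean_d, sd_d, loa, p = agreement(powerrefractor, subjective)
print(f"difference {mean_d:+.2f} ({sd_d:.2f}) D, "
      f"LoA {loa[0]:+.2f} to {loa[1]:+.2f} D, p = {p:.2f}")
```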
Abstract:
This paper reviews the state of the art in measuring, modeling, and managing clogging in subsurface-flow treatment wetlands. Methods for measuring in situ hydraulic conductivity in treatment wetlands are now available, which provide valuable insight into assessing the extent of clogging. These results, paired with information from more traditional approaches (e.g., tracer testing and composition of the clog matter), are being incorporated into the latest treatment wetland models. Recent finite element analysis models can now simulate clogging development in subsurface-flow treatment wetlands with reasonable accuracy. Various management strategies have been developed to extend the life of clogged treatment wetlands, including gravel excavation and/or washing, chemical treatment, and the application of earthworms. These strategies are compared and available cost information is reported. © 2012 Elsevier Ltd.
Abstract:
This paper presents for the first time the concept of measurement assisted assembly (MAA) and outlines the research priorities for the realisation of this concept in industry. MAA denotes a paradigm shift in assembly for high value and complex products and encompasses the development and use of novel metrology processes for the holistic integration and capability enhancement of key assembly and ancillary processes. A complete framework for MAA is detailed, showing how it can facilitate a step change in assembly process capability and efficiency for large and complex products, such as airframes, where traditional assembly processes exhibit the requirement for rectification and rework, use inflexible tooling and are largely manual, resulting in cost and cycle time pressures. The concept of MAA encompasses a range of innovative measurement-assisted processes which enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved levels of precision across the dimensional scales. A full-scale industrial trial of MAA technologies has been carried out on an experimental aircraft wing, demonstrating the viability of the approach, while studies within 140 smaller companies have highlighted the need for better adoption of existing process capability and quality control standards. The identified research priorities for MAA include the development of both frameless and tooling-embedded automated metrology networks. Other research priorities relate to the development of integrated dimensional variation management, thermal compensation algorithms, and measurement planning and inspection algorithms linking design to measurement and process planning. © Springer-Verlag London 2013.
Abstract:
In the Light Controlled Factory, part-to-part assembly and reduced weight will be enabled through the use of predictive fitting processes; low-cost, high-accuracy reconfigurable tooling will be made possible by active compensation; improved control will allow accurate robotic machining; and quality will be improved through the use of traceable uncertainty-based quality control throughout the production system. A number of challenges must be overcome before this vision can be realized: 1) controlling industrial robots for accurate machining; 2) compensation of measurements for thermal expansion; 3) compensation of measurements for refractive index changes; 4) development of embedded metrology tooling for in-tooling measurement and active tooling compensation; and 5) development of software for the planning and control of integrated metrology networks, based on quality control with uncertainty evaluation, together with control systems for predictive processes. This paper describes how these challenges are being addressed, in particular the central challenge of developing large volume measurement process models within an integrated dimensional variation management (IDVM) system.
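As a small concrete instance of challenge 2), the sketch below scales a length measured on a warm shop floor back to the standard 20 °C reference temperature using a linear expansion coefficient. The coefficients are typical handbook values, not project data.

```python
def compensate_to_20c(measured_mm: float, part_temp_c: float,
                      alpha_per_k: float) -> float:
    """Scale a length measured at part_temp_c back to the 20 degC
    reference temperature, assuming a uniform part temperature and a
    linear coefficient of thermal expansion alpha_per_k."""
    return measured_mm / (1.0 + alpha_per_k * (part_temp_c - 20.0))

# Typical handbook expansion coefficients (approximate), 1/K.
ALPHA = {"steel": 11.5e-6, "aluminium": 23.0e-6}

# A 5 m aluminium component measured on a 26 degC shop floor reads
# about 0.69 mm long; compensation recovers the 20 degC length.
reading = 5000.69  # mm
print(f"{compensate_to_20c(reading, 26.0, ALPHA['aluminium']):.3f} mm at 20 degC")
```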
Abstract:
Aircraft assembly is the most important part of aircraft manufacturing. A large number of assembly fixtures must be used to ensure assembly accuracy in the aircraft assembly process. Traditional fixed assembly fixtures cannot accommodate changes in aircraft type, so digital flexible assembly fixtures have been developed and are gradually being applied in aircraft assembly. Digital flexible assembly technology has also become one of the research directions in the field of aircraft manufacturing. Aircraft flexible assembly can be divided into three stages: component-level flexible assembly, large-component-level flexible assembly, and large-component alignment and joining. This article introduces the architecture of flexible assembly systems and the principles of the three types of flexible assembly fixtures. The key technologies of digital flexible assembly are also discussed. The digital metrology system provides the basis for accurate digital flexible assembly; aircraft flexible assembly systems mainly use laser tracking metrology systems and indoor Global Positioning System metrology systems. With the development of flexible assembly technology, digital flexible assembly systems will be widely used in aircraft manufacturing.