43 results for "Dimensional measurement accuracy"


Relevance: 30.00%

Abstract:

The measurement of 8-oxo-7,8-dihydro-2'-deoxyguanosine (8-oxo-dG) is an increasingly popular marker of in vivo oxidative damage to DNA. A random-sequence 21-mer oligonucleotide, 5'-TCA GXC GTA CGT GAT CTC AGT-3', in which X was 8-oxo-guanine (8-oxo-G), was purified, and the content of the oxidised base was confirmed by a 32P-end-labelling strategy. The lyophilised material was analysed for its absolute content of 8-oxo-dG by several major laboratories in Europe and one in Japan. Most laboratories using HPLC-ECD underestimated the level of the lesion, while those using GC-MS-SIM overestimated it; HPLC-ECD measured the target value with the greatest accuracy. The results also suggest that none of the procedures can accurately quantify levels of 1 in 10^6 8-oxo-(d)G in DNA.

Relevance: 30.00%

Abstract:

This paper begins by suggesting that when considering Corporate Social Responsibility (CSR), even CSR as justified in terms of the business case, stakeholders are of great importance to corporations. In the UK the Company Law Review (DTI, 2002) has suggested that it is appropriate for UK companies to be managed upon the basis of an enlightened shareholder approach. Within this approach the importance of stakeholders, other than shareholders, is recognised as being instrumental in succeeding in providing shareholder value. Given the importance of these other stakeholders it is then important that corporate management measure and manage stakeholder performance. In order to do this there are two general approaches that could be adopted and these are the use of monetary values to reflect stakeholder value or cost and non-monetary values. In order to consider these approaches further this paper considered the possible use of these approaches for two stakeholder groups: namely employees and the environment. It concludes that there are ethical and practical difficulties with calculating economic values for stakeholder resources and so prefers a multi-dimensional approach to stakeholder performance measurement that does not use economic valuation.

Relevance: 30.00%

Abstract:

We assess the accuracy of the Visante anterior segment optical coherence tomographer (AS-OCT) and present improved formulas for measurement of surface curvature and axial separation. Measurements are made in physical model eyes. Accuracy is compared for measurements of corneal thickness (d1) and anterior chamber depth (d2) using the built-in AS-OCT software versus the improved scheme. The improved scheme enables measurements of lens thickness (d3) and surface curvature, in the form of conic sections specified by vertex radii and conic constants. These parameters are converted to surface coordinates for error analysis. The built-in AS-OCT software typically overestimates (mean ± standard deviation, SD) d1 by +62 ± 4 μm and d2 by +4 ± 88 μm. The improved scheme reduces d1 (-0.4 ± 4 μm) and d2 (0 ± 49 μm) errors while also reducing d3 errors from +218 ± 90 μm (uncorrected) to +14 ± 123 μm (corrected). Surface x-coordinate errors gradually increase toward the periphery. Considering the central 6-mm zone of each surface, the x-coordinate errors for the anterior and posterior corneal surfaces reached +3 ± 10 and 0 ± 23 μm, respectively, with the improved scheme; those of the anterior and posterior lens surfaces reached +2 ± 22 and +11 ± 71 μm, respectively. Our improved scheme reduced AS-OCT errors and could therefore enhance pre- and postoperative assessments of keratorefractive or cataract surgery, including measurement of accommodating intraocular lenses. © 2007 Society of Photo-Optical Instrumentation Engineers.
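The conic surfaces above, specified by a vertex radius R and conic constant Q, convert to surface coordinates through the standard conic sag equation z(y) = y²/(R(1 + √(1 − (1+Q)y²/R²))). A minimal sketch, with illustrative cornea-like values rather than the paper's fitted parameters:

```python
import math

def conic_sag(y, R, Q):
    """Axial coordinate (sag, mm) of a conic surface with vertex radius R (mm)
    and conic constant Q, at radial height y (mm)."""
    return y ** 2 / (R * (1.0 + math.sqrt(1.0 - (1.0 + Q) * y ** 2 / R ** 2)))

# Hypothetical anterior-cornea-like surface: R = 7.8 mm, Q = -0.2,
# evaluated at the edge of the central 6-mm zone (y = 3 mm).
sag_3mm = conic_sag(3.0, 7.8, -0.2)
```

Errors in the recovered R and Q propagate into coordinate errors of the kind tabulated in the abstract.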

Relevance: 30.00%

Abstract:

PURPOSE. A methodology for noninvasively characterizing the three-dimensional (3-D) shape of the complete human eye is not currently available for research into ocular diseases that have a structural substrate, such as myopia. A novel application of a magnetic resonance imaging (MRI) acquisition and analysis technique is presented that, for the first time, allows the 3-D shape of the eye to be investigated fully. METHODS. The technique involves the acquisition of a T2-weighted MRI, which is optimized to reveal the fluid-filled chambers of the eye. Automatic segmentation and meshing algorithms generate a 3-D surface model, which can be shaded with morphologic parameters such as distance from the posterior corneal pole and deviation from sphericity. Full details of the method are illustrated with data from 14 eyes of seven individuals. The spatial accuracy of the calculated models is demonstrated by comparing the MRI-derived axial lengths with values measured in the same eyes using interferometry. RESULTS. The color-coded eye models showed substantial variation in the absolute size of the 14 eyes. Variations in the sphericity of the eyes were also evident, with some appearing approximately spherical whereas others were clearly oblate and one was slightly prolate. Nasal-temporal asymmetries were noted in some subjects. CONCLUSIONS. The MRI acquisition and analysis technique allows a novel way of examining 3-D ocular shape. The ability to stratify and analyze eye shape, ocular volume, and sphericity will further extend the understanding of which specific biometric parameters predispose emmetropic children subsequently to develop myopia. Copyright © Association for Research in Vision and Ophthalmology.
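The "deviation from sphericity" shading described above can be sketched as a signed distance from a reference sphere; the fitting of the sphere itself is assumed already done, and all coordinates below are illustrative:

```python
import math

def sphericity_deviation(points, center, radius):
    """Signed distance of each surface point from a reference sphere:
    positive outside the sphere (local bulge), negative inside (local flattening)."""
    return [math.dist(p, center) - radius for p in points]

# Illustrative: three retinal-surface points against a 12 mm reference
# sphere centred at the origin (units mm).
devs = sphericity_deviation([(12.0, 0.0, 0.0), (0.0, 13.0, 0.0), (0.0, 0.0, 11.5)],
                            (0.0, 0.0, 0.0), 12.0)
```

Mapping these signed distances to a colour scale gives the oblate/prolate shading the abstract describes.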

Relevance: 30.00%

Abstract:

A consequence of a loss-of-coolant accident is damage to adjacent insulation materials (IM). IM may then be transported to the containment sump strainers, through which water is drawn into the emergency core cooling system (ECCS). Blockage of the strainers by IM leads to an increased pressure drop acting on the operating ECCS pumps. IM can also penetrate the strainers, enter the reactor coolant system and accumulate in the reactor pressure vessel. An experimental and theoretical study concentrating on mineral wool fiber transport in the containment sump and the ECCS is being performed. The study entails fiber generation and the assessment of fiber transport in single-effect and multi-effect experiments. The experiments include measurement of the terminal settling velocity, the strainer pressure drop, fiber sedimentation and resuspension in a channel flow, and jet flow in a rectangular tank. An integrated test facility is also operated to assess the compounded effects. Each experimental facility is used to provide data for the validation of equivalent computational fluid dynamics models. The channel flow facility allows the determination of the steady-state distribution of the fibers at different flow velocities. The fibers are modeled in the Eulerian-Eulerian reference frame as spherical wetted agglomerates. The fiber agglomerate size, density, the relative viscosity of the fluid-fiber mixture and the turbulent dispersion of the fibers all affect the steady-state accumulation of fibers at the channel base. In the current simulations, two fiber phases are considered separately. The particle size is kept constant while the density is modified, which affects both the terminal velocity and the volume fraction. The relative viscosity is significant only at higher concentrations.
The numerical model finds that the fibers accumulate at the channel base even at high velocities; therefore, modifications to the drag and turbulent dispersion forces can be made to reduce fiber accumulation.
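Since the fibers are modeled as spherical wetted agglomerates whose terminal velocity depends on size and density, the settling behaviour can be sketched with Stokes' law (valid for particle Reynolds numbers below about 1; the parameter values here are illustrative, not from the study):

```python
def stokes_terminal_velocity(d, rho_p, rho_f=998.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere in the Stokes regime
    (particle Reynolds number < ~1). d in m, densities in kg/m^3,
    dynamic viscosity mu in Pa.s."""
    return g * d ** 2 * (rho_p - rho_f) / (18.0 * mu)

# Hypothetical wetted agglomerate: 100 um diameter, 1100 kg/m^3, in water.
v_t = stokes_terminal_velocity(1.0e-4, 1100.0)
```

This is the sensitivity the simulations exploit: changing the agglomerate density shifts both the terminal velocity and the phase volume fraction.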

Relevance: 30.00%

Abstract:

Distributed Brillouin sensing of strain and temperature works by making spatially resolved measurements of the position of the measurand-dependent extremum of the resonance curve associated with the scattering process in the weakly nonlinear regime. Typically, measurements of backscattered Stokes intensity (the dependent variable) are made at a number of predetermined fixed frequencies covering the design measurand range of the apparatus and combined to yield an estimate of the position of the extremum. The measurand can then be found because its relationship to the position of the extremum is assumed known. We present analytical expressions relating the relative error in the extremum position to experimental errors in the dependent variable. This is done for two cases: (i) a simple non-parametric estimate of the mean based on moments and (ii) the case in which a least squares technique is used to fit a Lorentzian to the data. The question of statistical bias in the estimates is discussed and in the second case we go further and present for the first time a general method by which the probability density function (PDF) of errors in the fitted parameters can be obtained in closed form in terms of the PDFs of the errors in the noisy data.
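Case (i), the non-parametric moment estimate, reduces to the intensity-weighted mean of the sampled frequencies; a minimal sketch on a noiseless, symmetrically sampled Lorentzian (centre frequency and linewidth are illustrative, not the paper's values):

```python
def moment_peak_estimate(freqs, intensities):
    """First-moment (intensity-weighted mean frequency) estimate of the
    resonance-peak position; unbiased only when a symmetric line shape is
    sampled symmetrically about the peak."""
    return sum(f * i for f, i in zip(freqs, intensities)) / sum(intensities)

# Noiseless Lorentzian centred at 10.85 GHz with 35 MHz half-width,
# sampled at 21 evenly spaced fixed frequencies.
centre, hw = 10.85, 0.035
freqs = [centre + (k - 10) * 0.005 for k in range(21)]
stokes = [1.0 / (1.0 + ((f - centre) / hw) ** 2) for f in freqs]
est = moment_peak_estimate(freqs, stokes)
```

With noisy intensities, errors in `stokes` propagate into the estimate, which is the relationship the analytical expressions in the abstract quantify.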

Relevance: 30.00%

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods of little practical relevance and sophisticated mathematical theory that is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added-value method in particular. Based on this concept and method, three kinds of computerised models have been developed to cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy and the presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required.
The results of applying these measurements and planning models to the British motor vehicle manufacturing companies are presented and discussed.
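The risk-simulation model described above (independent, normally distributed component variables) can be sketched as a simple Monte Carlo draw of an added-value productivity ratio; all distributions and figures here are illustrative, not British Leyland data:

```python
import random
import statistics

random.seed(42)

def simulate_productivity(n=10_000):
    """Monte Carlo risk simulation of an added-value productivity ratio
    (added value / employment costs), assuming statistically independent,
    normally distributed component variables, as in the stochastic model."""
    ratios = []
    for _ in range(n):
        sales = random.gauss(120.0, 8.0)      # sales revenue, £m
        bought_in = random.gauss(70.0, 6.0)   # materials, bought-in parts, services
        labour = random.gauss(30.0, 2.0)      # employment costs
        ratios.append((sales - bought_in) / labour)
    return ratios

ratios = simulate_productivity()
mean_ratio = statistics.mean(ratios)
```

The empirical distribution of `ratios` plays the role of the productivity distribution from which class intervals are read off.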

Relevance: 30.00%

Abstract:

Aim: The aim of this study was to evaluate the practicality and accuracy of tonometers used in routine clinical practice for established keratoconus (KC). Methods: This was a prospective study of 118 normal and 76 keratoconic eyes in which intraocular pressure (IOP) was measured in random order using the Goldmann applanation tonometer (GAT), Pascal dynamic contour tonometer (DCT), Reichert ocular response analyser (ORA) and TonoPen XL tonometer. Corneal hysteresis (CH) and corneal resistance factor (CRF), as calculated by the ORA, were recorded. Central corneal thickness (CCT) was measured using an ultrasound pachymeter. Results: The difference in IOP values between instruments was highly significant in both study groups (p < 0.001). All other IOP measures were significantly higher than those for GAT, except for the Goldmann-correlated IOP (IOPg, the average of the two applanation pressure points) as measured by the ORA in the control group and the CH-corrected, corneal-compensated IOP (IOPcc) measures in the KC group. CCT, CH and CRF were significantly lower in the KC group (p < 0.001). Apart from the DCT, all techniques tended to measure IOP higher in eyes with thicker corneas. Conclusion: The DCT and the ORA are currently the most appropriate tonometers to use in KC for the measurement of IOPcc. Corneal factors such as CH and CRF may be of more importance than CCT in causing inaccuracies in applanation tonometry techniques.
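Agreement between tonometers of the kind compared above is commonly summarised by the Bland-Altman bias and 95% limits of agreement; this is a standard method-comparison sketch, not necessarily the paper's analysis, and the paired readings are illustrative:

```python
import statistics

def limits_of_agreement(a, b):
    """Bland-Altman bias (mean paired difference) and 95% limits of agreement
    (bias +/- 1.96 SD of the differences) between two paired measurement series."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative paired IOP readings (mmHg); not data from the study.
dct = [16.2, 18.1, 15.4, 17.8, 19.0, 16.6]
gat = [14.9, 16.8, 14.6, 16.5, 18.2, 15.1]
bias, loa_lo, loa_hi = limits_of_agreement(dct, gat)
```

A positive `bias` corresponds to the pattern reported in the abstract, where most instruments read higher than the GAT.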

Relevance: 30.00%

Abstract:

An array of in-line curvature sensors on a garment is used to monitor the thoracic and abdominal movements of a human during respiration. The results are used to obtain volumetric changes of the human torso in agreement with a spirometer used simultaneously at the mouth. The array of 40 in-line fiber Bragg gratings forms 20 curvature sensors at different locations, each sensor consisting of two fiber Bragg gratings. The 20 curvature sensors and the adjoining fiber are encapsulated in a low-temperature-cured synthetic silicone. The sensors are wavelength-interrogated by a commercially available system from Moog Insensys, and the wavelength changes are calibrated to recover curvature. A three-dimensional algorithm is used to generate shape changes during respiration, allowing the measurement of absolute volume changes at various sections of the torso. The sensing scheme yields a volumetric error of 6%. Comparing the volume data obtained from the spirometer with the volume estimated from the synchronous data of the shape-sensing array yielded a Pearson correlation coefficient of 0.86 (p < 0.01).
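The reported agreement statistic is a Pearson correlation coefficient, which can be computed directly from the two synchronous volume series; the samples below are illustrative, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative synchronous tidal-volume samples (litres).
spirometer = [0.1, 0.4, 0.8, 1.1, 0.9, 0.5]
shape_array = [0.15, 0.45, 0.75, 1.05, 0.95, 0.55]
r = pearson_r(spirometer, shape_array)
```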

Relevance: 30.00%

Abstract:

We investigate two numerical procedures for the Cauchy problem in linear elasticity, involving the relaxation of either the given boundary displacements (Dirichlet data) or the prescribed boundary tractions (Neumann data) on the over-specified boundary, in the alternating iterative algorithm of Kozlov et al. (1991). The two mixed direct (well-posed) problems associated with each iteration are solved using the method of fundamental solutions (MFS), in conjunction with the Tikhonov regularization method, while the optimal value of the regularization parameter is chosen via the generalized cross-validation (GCV) criterion. An efficient regularizing stopping criterion which ceases the iterative procedure at the point where the accumulation of noise becomes dominant and the errors in predicting the exact solutions increase, is also presented. The MFS-based iterative algorithms with relaxation are tested for Cauchy problems for isotropic linear elastic materials in various geometries to confirm the numerical convergence, stability, accuracy and computational efficiency of the proposed method.
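The Tikhonov-plus-GCV step described above can be sketched for a generic discretised linear system; the matrix and noise level below are illustrative stand-ins for the MFS system, not the paper's setup:

```python
import numpy as np

def tikhonov_gcv(A, b, lambdas):
    """Tikhonov solution x = (A'A + lam I)^(-1) A'b, with lam chosen by
    minimising the GCV score ||A x - b||^2 / (m - trace(H))^2, where
    H = A (A'A + lam I)^(-1) A' is the influence matrix."""
    m, n = A.shape
    best = (None, None, np.inf)
    for lam in lambdas:
        M = np.linalg.inv(A.T @ A + lam * np.eye(n))
        x = M @ A.T @ b
        score = np.sum((A @ x - b) ** 2) / (m - np.trace(A @ M @ A.T)) ** 2
        if score < best[2]:
            best = (lam, x, score)
    return best[0], best[1]

# Illustrative ill-conditioned system with small data noise.
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0.0, 1.0, 20), 6)   # nearly collinear columns
x_true = np.ones(6)
b = A @ x_true + 1e-3 * rng.standard_normal(20)
lam, x = tikhonov_gcv(A, b, [10.0 ** k for k in range(-12, 1)])
```

In the paper's algorithm this regularised solve is repeated inside each alternating iteration, with the stopping criterion ending the outer loop once noise accumulation dominates.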

Relevance: 30.00%

Abstract:

We propose a self-referenced multiplexed fibre interferometer (MFI) using a tunable laser and fibre Bragg gratings (FBGs). The optical measurement system multiplexes two Michelson fibre interferometers that share the optical path in the main part of the optical system. One fibre-optic interferometer is used as a reference interferometer to monitor and control the high accuracy of the measurement system under environmental perturbations; the other is used as a measurement interferometer to obtain information from the target. An active phase-tracking homodyne (APTH) technique is applied for signal processing to achieve high resolution. The MFI can be used for high-precision absolute displacement measurement with different combinations of wavelengths from the tunable laser. By means of wavelength-division multiplexing (WDM), the MFI is also capable of on-line surface measurement, in which traditional stylus scanning is replaced by spatial light-wave scanning so as to greatly improve measurement speed and robustness.
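Absolute displacement measurement with different wavelength combinations typically rests on the synthetic (beat) wavelength of a wavelength pair, which sets the unambiguous range; a minimal sketch with illustrative wavelengths (the paper's actual wavelengths are not given):

```python
def synthetic_wavelength(lam1, lam2):
    """Synthetic (beat) wavelength of a two-wavelength interferometer;
    half of it sets the unambiguous range for absolute distance measurement."""
    return lam1 * lam2 / abs(lam1 - lam2)

# Illustrative tunable-laser pair around 1550 nm (values in metres).
big_lambda = synthetic_wavelength(1550e-9, 1552e-9)   # ~1.2 mm
```

Closely spaced wavelengths give a long synthetic wavelength (large unambiguous range) at the cost of magnified phase noise, which is why the reference interferometer's stabilisation matters.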

Relevance: 30.00%

Abstract:

Computational fluid dynamics (CFD) has found great acceptance in the engineering community as a tool for research and for the design of processes that are difficult or expensive to study experimentally. One of these processes is biomass gasification in a circulating fluidized bed (CFB). Biomass gasification is the thermo-chemical conversion of biomass at high temperature and with a controlled oxygen amount into fuel gas, also sometimes referred to as syngas. A circulating fluidized bed is a type of reactor in which it is possible to maintain a stable and continuous circulation of solids in a gas-solid system. The main objectives of this thesis are fourfold: (i) develop a three-dimensional predictive model of biomass gasification in a CFB riser using advanced CFD; (ii) experimentally validate the developed hydrodynamic model using conventional and advanced measuring techniques; (iii) study the complex hydrodynamics, heat transfer and reaction kinetics through modelling and simulation; (iv) study the CFB gasifier performance through parametric analysis and identify the optimum operating conditions to maximize product gas quality. Two different and complementary experimental techniques were used to validate the hydrodynamic model, namely pressure measurement and particle tracking. Pressure measurement is a very common and widely used technique in fluidized bed studies, while particle tracking using positron emission particle tracking (PEPT), originally developed for medical imaging, is a relatively new technique in the engineering field; it is relatively expensive and available at only a few research centres around the world. This study started with a simple poly-dispersed single solid phase and then moved to binary solid phases. The single solid phase was used for primary validations and for eliminating unnecessary options and steps in building the hydrodynamic model.
The outcomes from the primary validations were then applied to the secondary validations of the binary mixture to avoid time-consuming computations. Studies on binary solid mixture hydrodynamics are rarely reported in the literature. In this study the binary solid mixture was modelled and validated using experimental data from both techniques mentioned above, and good agreement was achieved with both. Following the general gasification steps, the developed model is separated into three main stages: drying; devolatilization and tar cracking; and partial combustion and gasification. Drying was modelled as mass transfer from the solid phase to the gas phase. The devolatilization and tar cracking model consists of two steps: devolatilization of the biomass, modelled as a single reaction that generates the biomass gases from the volatile materials, and tar cracking, also modelled as one reaction that generates gases with fixed mass fractions. The first reaction is classified as heterogeneous and the second as homogeneous. The partial combustion and gasification model consists of carbon combustion reactions and carbon and gas-phase reactions. Partial combustion was considered for C, CO, H2 and CH4. The carbon gasification reactions used in this study are the Boudouard reaction with CO2, the reaction with H2O, and the methanation reaction to generate methane. The other gas-phase reactions considered are the water-gas shift reaction, modelled as a reversible reaction, and the methane steam reforming reaction. The developed gasification model was validated using different experimental data from the literature over a wide range of operating conditions. Good agreement was observed, confirming the capability of the model to predict biomass gasification in a CFB with great accuracy.
The developed model was successfully used to carry out sensitivity and parametric analyses. The sensitivity analysis included the effect of including various combustion reactions and the effect of radiation on the gasification reactions. The model was also used for parametric analysis by changing the following gasifier operating conditions: fuel/air ratio; biomass flow rate; sand (heat carrier) temperature; sand flow rate; sand and biomass particle sizes; gasifying agent (pure air or pure steam); pyrolysis model used; and steam/biomass ratio. Finally, based on these parametric and sensitivity analyses, a final model was recommended for the simulation of biomass gasification in a CFB riser.
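The reversible water-gas shift reaction mentioned above is often closed with an equilibrium-constant correlation; the sketch below uses the Moe correlation Kp = exp(4577.8/T − 4.33), which is an assumption for illustration rather than the thesis's kinetic model:

```python
import math

def wgs_equilibrium_constant(T):
    """Equilibrium constant of the reversible water-gas shift reaction
    CO + H2O <-> CO2 + H2, via the Moe correlation Kp = exp(4577.8/T - 4.33),
    T in kelvin. Correlation assumed here, not taken from the thesis."""
    return math.exp(4577.8 / T - 4.33)

# The shift towards CO2 + H2 is favoured at lower temperature (Kp > 1)
# and reverses as the gasifier gets hotter.
kp_800K = wgs_equilibrium_constant(800.0)
kp_1200K = wgs_equilibrium_constant(1200.0)
```

This temperature dependence is one reason the sand (heat carrier) temperature appears among the parametric variables: it shifts the H2/CO split of the product gas.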


Relevance: 30.00%

Abstract:

Purpose – This paper aims to consider how climate change performance is measured and accounted for within the performance framework for local authority areas in England adopted in 2008. It critically evaluates the design of two mitigation and one adaptation indicators that are most relevant to climate change. Further, the potential for these performance indicators to contribute to climate change mitigation and adaptation is discussed. Design/methodology/approach – The authors begin by examining the importance of the performance framework and the related Local Area Agreements (LAAs), which were negotiated for all local areas in England between central government and Local Strategic Partnerships (LSPs). This development is located within the broader literature relating to new public management. The potential for this framework to assist in delivering the UK's climate change policy objectives is researched in a two-stage process. First, government publications and all 150 LAAs were analysed to identify the level of priority given to the climate change indicators. Second, interviews were conducted in spring 2009 with civil servants and local authority officials from the English West Midlands who were engaged in negotiating the climate change content of the LAAs. Findings – Nationally, the authors find that 97 per cent of LAAs included at least one climate change indicator as a priority. The indicators themselves, however, are perceived to be problematic – in terms of appropriateness, accuracy and timeliness. In addition, concerns were identified about the level of local control over the drivers of climate change performance and, therefore, a question is raised as to how LSPs can be held accountable for this. On a more positive note, for those concerned about climate change, the authors do find evidence that the inclusion of these indicators within the performance framework has helped to move climate change up the agenda for local authorities and their partners. 
However, actions by the UK's new coalition government to abolish the national performance framework and substantially reduce public expenditure potentially threaten this advance. Originality/value – This paper offers an insight into a new development for measuring climate change performance at a local level, which is relatively under-researched. It also contributes to knowledge of accountability within a local government setting and provides a reference point for further research into the potential role of local actions to address the issue of climate change.