24 results for "technical error of measurement"
Abstract:
This theoretical study shows the technical feasibility of self-powered geothermal desalination of groundwater sources at <100 °C. A general method and framework are developed and then applied to specific case studies. First, the analysis considers an ideal limit to performance based on exergy analysis using generalised idealised assumptions. This thermodynamic limit applies to any type of process technology. Then, the analysis focuses specifically on the Organic Rankine Cycle (ORC) driving Reverse Osmosis (RO), as these are among the most mature and efficient applicable technologies. Important dimensionless parameters are calculated for the ideal case of the self-powered arrangement and semi-ideal case where only essential losses dependent on the RO system configuration are considered. These parameters are used to compare the performance of desalination systems using ORC-RO under ideal, semi-ideal and real assumptions for four case studies relating to geothermal sources located in India, Saudi Arabia, Tunisia and Turkey. The overall system recovery ratio (the key performance measure for the self-powered process) depends strongly on the geothermal source temperature. It can be as high as 91.5% for a hot spring emerging at 96 °C with a salinity of 1830 mg/kg.
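As a rough illustration of the exergy-based bound described above (our own back-of-envelope sketch, not the paper's actual framework; the ambient temperature, specific heat, NaCl assumption and specific volume are textbook figures, while 96 °C, 1830 mg/kg and 91.5% come from the abstract):

```python
import math

# Compare the exergy released by cooling the geothermal feed to ambient
# with the reversible (minimum) work of separating its salt by RO.
T0 = 298.15          # assumed ambient / dead-state temperature, K (25 degC)
Ts = 369.15          # hot-spring temperature, K (96 degC, from the abstract)
cp = 4.18            # specific heat of water, kJ/(kg K)

# Specific flow exergy of the hot water relative to the dead state, kJ/kg
ex = cp * ((Ts - T0) - T0 * math.log(Ts / T0))

# Feed osmotic pressure via van 't Hoff, assuming the 1830 mg/kg salinity
# behaves like NaCl (i = 2 ions, M = 58.44 g/mol)
R = 8.314                      # J/(mol K)
c = 1830e-3 / 58.44 * 1000     # mol of salt per m^3 of feed (~31 mol/m^3)
pi0 = 2 * c * R * T0           # osmotic pressure, Pa

def w_min(r, v=1e-3):
    """Reversible RO work per kg of feed at recovery r (ideal dilute
    solution), w = pi0 * v * ln(1/(1-r)), in kJ/kg."""
    return pi0 * v * math.log(1.0 / (1.0 - r)) / 1000.0

print(f"exergy of feed:        {ex:.1f} kJ/kg")
print(f"min work at r = 0.915: {w_min(0.915):.3f} kJ/kg")
```

The available exergy (tens of kJ/kg) dwarfs the reversible separation work even at 91.5% recovery, which is consistent with the abstract's conclusion that self-powered operation is thermodynamically feasible; the real limit is set by component losses rather than the separation work itself.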
Abstract:
In the present paper we numerically study the instrumental impact on the statistical properties of a quasi-CW Raman fiber laser using a simple model of multimode laser radiation. The effects with the most influence are the limited electrical bandwidth of the measurement equipment and noise. To examine this influence, we developed a simple model of multimode quasi-CW generation with exponential statistics (i.e. uncorrelated modes). We found that the area near zero intensity in the probability density function (PDF) is strongly affected by both factors; for example, both lead to the formation of a negative wing in the intensity distribution. However, the far-wing slope of the PDF is not affected by noise and, for a moderate mismatch between the optical and electrical bandwidths, is only slightly affected by the bandwidth limitation. The generation spectrum often becomes broader at higher power in experiments, so the spectral/electrical bandwidth mismatch factor increases with power, which can lead to an artificial dependence of the PDF slope on power. It was also found that both effects influence the ACF background level: noise decreases it, while the limited bandwidth increases it. © 2014 SPIE.
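The two instrumental effects the abstract describes can be reproduced with a toy simulation (our own sketch, not the authors' model; the mode count, noise level and bandwidth-mismatch factor are invented). Many uncorrelated modes with random phases yield a complex-Gaussian field whose intensity follows exponential statistics:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sum of many unit phasors with random phases -> complex Gaussian field
# -> intensity with an exponential PDF (mean ~ 1).
n_modes, n_samples = 100, 50_000
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_samples, n_modes))
field = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_modes)
intensity = np.abs(field) ** 2

# Effect 1: additive detector noise produces a negative wing near zero
# intensity that the true exponential PDF cannot have.
noisy = intensity + rng.normal(0.0, 0.1, size=n_samples)
print("fraction of negative samples:", (noisy < 0).mean())

# Effect 2: limited electrical bandwidth (crudely modelled here as a
# moving average) smooths out the deep near-zero dips, distorting the
# PDF in that region.
bw_mismatch = 8                               # assumed mismatch factor
kernel = np.ones(bw_mismatch) / bw_mismatch
filtered = np.convolve(intensity, kernel, mode="valid")
print("smallest ideal intensity:   ", intensity.min())
print("smallest filtered intensity:", filtered.min())
```

The noisy trace acquires negative samples (impossible for a true intensity), while the bandwidth-limited trace never approaches zero, matching the near-zero PDF distortions the paper reports.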
Abstract:
Context: Many large organizations juggle an application portfolio that contains different applications fulfilling similar tasks. In an effort to reduce operating costs, they attempt to consolidate such applications. Before consolidating applications, the work that is done with them must be harmonized; this is also known as process harmonization.
Objective: The increased interest in process harmonization calls for measures to quantify the extent to which processes have been harmonized. These measures should also uncover the factors that are of interest when harmonizing processes. Currently, such measures do not exist. Therefore, this study develops and validates a measurement model to quantify the level of process harmonization in an organization.
Method: The measurement model was developed by means of a literature study and structured interviews. Subsequently, it was validated through a survey, using factor analysis and correlations with known related constructs.
Results: A valid and reliable measurement model was developed. The factors found to constitute process harmonization are: the technical design of the business process and its data, the resources that execute the process, and the information systems used in the process. In addition, strong correlations were found between process harmonization and process standardization, and between process complexity and process harmonization.
Conclusion: The measurement model can be used by practitioners, because it shows them the factors that must be taken into account when harmonizing processes and provides a means to quantify the extent to which they have succeeded in harmonizing their processes. It can also be used by researchers to conduct further empirical research in the area of process harmonization.
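A common reliability check for a survey-based measurement model of this kind is Cronbach's alpha (our illustration of the general validation idea; the abstract names factor analysis and correlations, not alpha specifically, and the data below are synthetic):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical survey: 120 respondents answer 4 items that are all driven
# by one underlying "harmonization" factor plus item-specific noise.
rng = np.random.default_rng(42)
factor = rng.normal(0.0, 1.0, size=(120, 1))
items = factor + rng.normal(0.0, 0.6, size=(120, 4))

alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")   # high alpha -> internally consistent scale
```

An alpha well above the conventional 0.7 threshold would support calling the item set a reliable scale for the construct.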
Abstract:
Fluorescence spectroscopy has recently become more common in clinical medicine. However, there are still many unresolved issues related to the methodology and instrumentation of this technology. In this study, we aimed to assess the individual variability of fluorescence parameters of endogenous markers (NADH, FAD, etc.) measured by fluorescence spectroscopy (FS) in situ, and to analyse the factors that lead to a significant scatter of results. Most of the studied fluorophores show a scatter of values acceptable for diagnostic purposes (mostly up to 30%). We provide evidence that the blood volume in tissue impacts FS data, with a significant inverse correlation. The fluorescence intensity and fluorescent contrast coefficient values follow a normal distribution for most of the studied fluorophores and for the redox ratio. The effects of various physiological factors (different skin melanin content) and technical factors (characteristics of the optical filters) on the measurement results were also studied. The variability of FS measurement results should be considered when interpreting diagnostic parameters, as well as when developing new data-processing algorithms and FS devices.
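The scatter assessment described above can be sketched as follows (synthetic intensities, not the study's data; the FAD/(FAD+NADH) form is one common definition of the optical redox ratio, and the ~30% threshold comes from the abstract):

```python
import numpy as np

# Simulated repeat measurements of NADH and FAD fluorescence, arbitrary
# units; the spreads are invented for illustration.
rng = np.random.default_rng(7)
i_nadh = rng.normal(100.0, 12.0, size=50)
i_fad = rng.normal(60.0, 8.0, size=50)

# One common definition of the optical redox ratio.
redox = i_fad / (i_fad + i_nadh)

# Coefficient of variation: the "scatter of values" the abstract grades
# against its ~30% acceptability threshold.
cv = redox.std(ddof=1) / redox.mean()
print(f"mean redox ratio: {redox.mean():.3f}")
print(f"scatter (CV):     {100 * cv:.1f} %")
```

Because the ratio partially cancels common-mode intensity fluctuations, its CV can sit well below the scatter of the raw channels, which is one reason ratiometric parameters are favoured diagnostically.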
Abstract:
As the largest source of dimensional measurement uncertainty, addressing the challenges of thermal variation is vital to ensure product and equipment integrity in the factories of the future. While it is possible to closely control room temperature, this is often not practical or economical to realise in all cases where inspection is required. This article reviews recent progress and trends in seven key commercially available industrial temperature measurement sensor technologies primarily in the range of 0 °C–50 °C for invasive, semi-invasive and non-invasive measurement. These sensors will ultimately be used to measure and model thermal variation in the assembly, test and integration environment. The intended applications for these technologies are presented alongside some consideration of measurement uncertainty requirements with regard to the thermal expansion of common materials. Research priorities are identified and discussed for each of the technologies as well as temperature measurement at large. Future developments are briefly discussed to provide some insight into which direction the development and application of temperature measurement technologies are likely to head.
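The link between temperature-measurement uncertainty and thermal expansion mentioned above can be made concrete with a small budget calculation (our sketch; the CTE values are textbook figures and the 1 m part with a 5 µm allowance is an invented example):

```python
# How well must temperature be known so that thermal expansion does not
# dominate the dimensional uncertainty budget?
materials_alpha = {          # linear CTE, 1/K (textbook values)
    "aluminium": 23e-6,
    "steel": 12e-6,
    "invar": 1.2e-6,
}

L = 1.0          # assumed part length, m
u_L = 5e-6       # allowed thermal contribution to length uncertainty, m

for name, alpha in materials_alpha.items():
    u_T = u_L / (alpha * L)      # required temperature uncertainty, K
    print(f"{name:10s}: temperature must be known to +/- {u_T:.2f} K")
```

For aluminium the requirement is roughly ±0.2 K, well inside the 0 °C–50 °C range the review covers, which illustrates why sensor accuracy at the sub-kelvin level matters for large-volume inspection.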
Abstract:
In dimensional metrology, the largest source of measurement uncertainty is often thermal variation. Dimensional measurements are currently scaled linearly, using ambient temperature measurements and coefficients of thermal expansion, to the ideal metrology condition of 20 °C. This scaling is particularly difficult to implement with confidence in large volumes, as the temperature is unlikely to be uniform, resulting in thermal gradients. A number of well-established computational methods are used in the design phase of product development for the prediction of thermal and gravitational effects, and these could be used to a greater extent in metrology. This paper outlines the theory of how physical measurements of dimension and temperature can be combined more comprehensively throughout the product lifecycle, from design through to the manufacturing phase. The Hybrid Metrology concept is also introduced: an approach to metrology that promises to improve product and equipment integrity in future manufacturing environments. The Hybrid Metrology System combines various state-of-the-art physical dimensional and temperature measurement techniques with established computational methods to better predict thermal and gravitational effects.
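The linear scaling to 20 °C described above is, in its simplest form, a one-line correction (a minimal sketch; the steel CTE is a textbook value and the gauge length and shop temperature are invented):

```python
ALPHA_STEEL = 12e-6       # linear CTE of steel, 1/K (textbook value)

def length_at_20c(measured_length_m: float, temp_c: float,
                  alpha: float = ALPHA_STEEL) -> float:
    """Scale a length measured at temp_c back to the 20 degC reference."""
    return measured_length_m / (1.0 + alpha * (temp_c - 20.0))

# A 500 mm steel gauge measured in a 26 degC shop reads ~36 um long;
# here we synthesise that reading and then correct it.
measured = 0.500 * (1.0 + ALPHA_STEEL * 6.0)
print(f"corrected length: {length_at_20c(measured, 26.0):.9f} m")
```

The paper's point is precisely that this single-temperature correction breaks down under thermal gradients, where one ambient reading no longer represents the part, motivating the combined measurement-plus-simulation Hybrid Metrology approach.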
Abstract:
It has never been easy for manufacturing companies to understand their confidence level in terms of how accurately, and with what degree of flexibility, parts can be made. This brings uncertainty in finding the most suitable manufacturing method, as well as in controlling product and process verification systems. The aim of this research is to develop a system for capturing a company's knowledge and expertise and then reflecting it in an MRP (Manufacturing Resource Planning) system. A key activity here is measuring manufacturing and machining capabilities to a reasonable confidence level. For this purpose, an in-line control measurement system was introduced to the company. Using SPC (Statistical Process Control) not only helps to predict trends in the manufacturing of parts but also minimises human error in measurement. A Gauge R&R (Repeatability and Reproducibility) study identifies problems in measurement systems. Measurement is like any other process in terms of variability; reducing this variation via an automated machine probing system helps to avoid defects in future products.
Developments in the aerospace, nuclear, and oil and gas industries demand materials with high performance and high temperature resistance under corrosive and oxidising environments. Superalloys were developed in the latter half of the 20th century as high-strength materials for such purposes. For the same characteristics, superalloys are considered difficult-to-cut alloys when it comes to forming and machining. Furthermore, due to the sensitivity of superalloy applications, in many cases they must be manufactured to tight tolerances. In addition, superalloys, specifically nickel-based ones, have unique features such as low thermal conductivity due to the high amount of nickel in their composition. This causes a high surface temperature on the workpiece at the machining stage, which leads to deformation in the final product. As in every process, material variations have a significant impact on machining quality. The main causes of variation originate from chemical composition and mechanical hardness. The non-uniform distribution of metal elements is a major source of variation in metallurgical structures. Different heat-treatment standards are designed for processing the material to the desired hardness levels based on the application. In order to take corrective actions, a study on the material aspects of superalloys has been conducted. In this study, samples from different batches of material were analysed. This involved material preparation for microscopy analysis and examination of the effect of chemical composition on hardness (before and after heat treatment). Some of the results are discussed and presented in this paper.
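The SPC trend prediction mentioned above typically rests on X-bar/R control charts; a minimal sketch follows (the chart constants are the standard tabulated values for subgroups of five, but the measurement data are synthetic, not from this study):

```python
import numpy as np

# Standard X-bar / R chart constants for subgroup size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

# Synthetic in-line measurements: 20 subgroups of 5 parts, nominal 25 mm.
rng = np.random.default_rng(3)
subgroups = rng.normal(25.000, 0.010, size=(20, 5))

xbar = subgroups.mean(axis=1)                          # subgroup means
ranges = subgroups.max(axis=1) - subgroups.min(axis=1) # subgroup ranges
xbarbar, rbar = xbar.mean(), ranges.mean()

ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar
ucl_r, lcl_r = D4 * rbar, D3 * rbar

print(f"X-bar limits: {lcl_x:.4f} .. {ucl_x:.4f}")
print(f"R limits:     {lcl_r:.4f} .. {ucl_r:.4f}")
print("out-of-control subgroups:",
      int(((xbar > ucl_x) | (xbar < lcl_x)).sum()))
```

Points falling outside these limits flag special-cause variation (for instance a drifting tool or an anomalous material batch) before defective parts accumulate, which is the trend-prediction role the abstract assigns to SPC.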
Abstract:
This paper details a method of estimating the uncertainty of dimensional measurement for a three-dimensional coordinate measurement machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these points was a fringe-counting interferometer, with a multilateration-like technique employed to establish the three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. Specifically, a distributed coordinate measurement device was tested, consisting of a network of Rotary-Laser Automatic Theodolites (R-LATs); this system is known commercially as indoor GPS (iGPS). The method was found to be practical and was used to estimate that the uncertainty of measurement for the basic iGPS system is approximately 1 mm at a 95% confidence level throughout a measurement volume of approximately 10 m × 10 m × 1.5 m. © 2010 IOP Publishing Ltd.
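The multilateration-like idea of fixing a 3D point from interferometric distances can be sketched with a standard linearized least-squares solve (our illustration of the general technique, not the paper's algorithm; the station layout and distances are synthetic, though the 10 m × 10 m × 1.5 m volume echoes the abstract):

```python
import numpy as np

# Known station positions spanning roughly a 10 m x 10 m x 1.5 m volume.
stations = np.array([[0.0, 0.0, 0.0],
                     [10.0, 0.0, 0.0],
                     [0.0, 10.0, 0.0],
                     [10.0, 10.0, 1.5],
                     [5.0, 5.0, 1.5]])
true_point = np.array([3.0, 4.0, 0.8])
d = np.linalg.norm(stations - true_point, axis=1)   # "measured" distances

# Linearize by subtracting the first range equation:
# 2 (p_i - p_0) . x = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2
A = 2.0 * (stations[1:] - stations[0])
b = (d[0] ** 2 - d[1:] ** 2
     + (stations[1:] ** 2).sum(axis=1) - (stations[0] ** 2).sum())
x, *_ = np.linalg.lstsq(A, b, rcond=None)

print("recovered point:", np.round(x, 6))
```

With redundant stations, the same least-squares residuals also provide a handle on measurement uncertainty, which is the role the calibrated reference-point network plays in the paper.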
Abstract:
A method of precise characterization of surface nanoscale axial photonics (SNAP) structures with a reference fiber is proposed, analyzed, and demonstrated experimentally. The method is based on simultaneous coupling of a microfiber to a SNAP structure under test and to a reference optical fiber. Significant reduction of measurement errors associated with the environmental temperature variations and technical noise of the spectrum analyzer is demonstrated. The achieved measurement precision of the effective radius variation of the SNAP structure is 0.2 Å.
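The principle behind the reference-fiber scheme can be illustrated numerically (our sketch: a whispering-gallery resonance wavelength tracks the effective radius via dλ/λ = dr/r, and subtracting the reference fiber's simultaneously measured shift removes common-mode drift; the radius and shift values below are invented):

```python
# Typical telecom wavelength and an assumed SNAP fiber radius.
wavelength_nm = 1550.0
radius_um = 19.0

# Resonance shifts measured simultaneously on the structure under test
# and on the reference fiber (picometres; invented numbers).
shift_test_pm = 4.35
shift_ref_pm = 2.75

# Temperature drift and analyzer offsets shift both spectra together,
# so the difference isolates the structure's own radius variation.
common_mode_free_pm = shift_test_pm - shift_ref_pm
dr_angstrom = (common_mode_free_pm * 1e-3 / wavelength_nm) * radius_um * 1e4

print(f"effective radius variation: {dr_angstrom:.2f} angstrom")
```

A net 1.6 pm shift on a 19 µm fiber corresponds to about 0.2 Å of radius variation, the precision scale the abstract reports, which shows why rejecting picometre-level common-mode drift is essential.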