Abstract:
The ERS-1 satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of backscatter microwave radiation reflected by small ripples on the ocean surface induced by sea-surface winds, and so provides instantaneous snapshots of wind flow over large areas of the ocean surface, known as wind fields. Inherent in the physics of the observation process is an ambiguity in wind direction; the scatterometer cannot distinguish whether the wind is blowing toward or away from the sensor device. This ambiguity implies that there is a one-to-many mapping between scatterometer data and wind direction. Current operational methods for wind field retrieval are based on the retrieval of wind vectors from satellite scatterometer data, followed by a disambiguation and filtering process that is reliant on numerical weather prediction models. The wind vectors are retrieved by the local inversion of a forward model, mapping scatterometer observations to wind vectors, and minimising a cost function in scatterometer measurement space. This thesis applies a pragmatic Bayesian solution to the problem. The likelihood is a combination of conditional probability distributions for the local wind vectors given the scatterometer data. The prior distribution is a vector Gaussian process that provides the geophysical consistency for the wind field. The wind vectors are retrieved directly from the scatterometer data by using mixture density networks, a principled method for modelling multi-modal conditional probability density functions. The complexity of the mapping and the structure of the conditional probability density function are investigated. A hybrid mixture density network, which incorporates the knowledge that the conditional probability distribution of the observation process is predominantly bi-modal, is developed.
The optimal model, which generalises across a swathe of scatterometer readings, is better on key performance measures than the current operational model. Wind field retrieval is approached from three perspectives. The first is a non-autonomous method that confirms the validity of the model by retrieving the correct wind field 99% of the time from a test set of 575 wind fields. The second technique takes the maximum a posteriori probability wind field retrieved from the posterior distribution as the prediction. For the third technique, Markov Chain Monte Carlo (MCMC) techniques were employed to estimate the mass associated with significant modes of the posterior distribution, and make predictions based on the mode with the greatest mass associated with it. General methods for sampling from multi-modal distributions were benchmarked against a specific MCMC transition kernel designed for this problem. It was shown that the general methods were unsuitable for this application due to computational expense. On a test set of 100 wind fields the MAP estimate correctly retrieved 72 wind fields, whilst the sampling method correctly retrieved 73 wind fields.
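The bi-modal directional ambiguity described above can be illustrated with a toy two-component Gaussian mixture over wind direction. This is only a sketch: in an actual mixture density network the mixing coefficient, means and width would be produced by a neural network conditioned on the scatterometer measurements, whereas here they are fixed, invented values.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Univariate Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bimodal_mixture_pdf(theta, mu1, mu2, sigma, pi1=0.5):
    """Two-component Gaussian mixture over wind direction theta (radians).
    In a mixture density network, pi1, mu1, mu2 and sigma would be
    outputs of a network conditioned on the scatterometer data; here
    they are fixed for illustration."""
    return (pi1 * gaussian_pdf(theta, mu1, sigma)
            + (1 - pi1) * gaussian_pdf(theta, mu2, sigma))

# The directional ambiguity places the two modes 180 degrees apart, so
# with equal mixing weights the density is the same for the "toward"
# and "away" directions:
p_toward = bimodal_mixture_pdf(0.0, 0.0, math.pi, 0.3)
p_away = bimodal_mixture_pdf(math.pi, 0.0, math.pi, 0.3)
```

With equal weights the two modes are indistinguishable, which is exactly why a prior over the whole wind field is needed to resolve the ambiguity.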
Abstract:
The study investigated the potential applications and the limitations of non-standard techniques of visual field investigation utilizing automated perimetry. Normal subjects exhibited a greater sensitivity to kinetic stimuli than to static stimuli of identical size. The magnitude of physiological statokinetic dissociation (SKD) was found to be largely independent of age, stimulus size, meridian and eccentricity. The absence of a dependency on stimulus size indicated that successive lateral spatial summation could not totally account for the underlying mechanism of physiological SKD. The visual field indices MD and LV exhibited a progressive deterioration during the time course of a conventional central visual field examination, both for normal subjects and for ocular hypertensive patients. The fatigue effect was more pronounced in the latter stages of the examination and for the second eye tested. The confidence limits for the definition of abnormality should reflect the greater effect of fatigue on the second eye. A 330 cd m-2 yellow background was employed for blue-on-yellow perimetry. Instrument measurement range was preserved by positioning a concave mirror behind the stimulus bulb to increase the light output by 60%. The mean magnitude of short-wavelength-sensitive (SWS) pathway isolation was approximately 1.4 log units relative to a 460 nm stimulus filter. The absorption spectra of the ocular media exhibited an exponential increase with increasing age, whilst that of the macular pigment showed no systematic trend. The magnitude of ocular media absorption was demonstrated to reduce with increasing wavelength. Ocular media absorption was significantly greater in diabetic patients than in normal subjects. Five diabetic patients with either normal or borderline achromatic sensitivity exhibited an abnormal blue-on-yellow sensitivity; two of these patients showed no signs of retinopathy. A greater vulnerability of the SWS pathway to the diabetic disease process was hypothesized.
Abstract:
Off-highway motive plant equipment is costly in capital outlay and maintenance. To reduce these overheads and increase site safety and work rate, a technique for assessing and limiting the velocity of such equipment is required. Due to the extreme environmental conditions met on such sites, conventional velocity measurement techniques are inappropriate. Ogden Electronics Limited was formed specifically to manufacture a motive plant safety system incorporating a speed sensor and sanction unit; to date, the only such commercial unit available. However, problems plague the reliability, accuracy and mass production of this unit. This project assesses the company's existing product and, in conjunction with an appreciation of the company's history and structure, concludes that this unit is unsuited to its intended application. Means of improving the measurement accuracy and longevity of this unit, commensurate with the company's limited resources and experience, are proposed, both for immediate retrofit and for longer-term use. This information is presented in the form of a number of internal reports for the company. The off-highway environment is examined; and in conjunction with an evaluation of means of obtaining a returned signal, comparisons of processing techniques, and on-site gathering of previously unavailable data, preliminary designs for an alternative product are drafted. Theoretical aspects are covered by a literature review of ground-pointing radar, vehicular radar, and velocity measuring systems. This review establishes and collates the body of knowledge in areas previously considered unrelated. Based upon this work, a new design is proposed which is suitable for incorporation into the existing company product range. Following production engineering of the design, five units were constructed, tested and evaluated on-site.
After extended field trials, this design has shown itself to possess greater accuracy, reliability and versatility than the existing sensor, at a lower unit cost.
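The velocity measurement at the heart of such a sensor typically rests on the standard Doppler relation for a ground-pointing vehicle radar. The sketch below is not the thesis design; the carrier frequency and beam angle are hypothetical values chosen for illustration.

```python
import math

C = 3.0e8  # speed of light, m/s

def ground_speed(doppler_shift_hz, carrier_hz, beam_angle_rad):
    """Vehicle speed from the Doppler shift of the ground return:
    v = f_d * c / (2 * f_c * cos(angle)), where angle is measured
    between the radar beam and the direction of travel."""
    return doppler_shift_hz * C / (2.0 * carrier_hz * math.cos(beam_angle_rad))

# Round trip at an assumed 24 GHz carrier and 45-degree beam angle:
fd = 2 * 10.0 * 24e9 * math.cos(math.pi / 4) / C  # shift produced by 10 m/s
v = ground_speed(fd, 24e9, math.pi / 4)           # recovers 10.0 m/s
```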
Abstract:
The principal theme of this thesis is the advancement and expansion of ophthalmic research via collaboration between professional engineers and professional optometrists. The aim has been to develop new and novel approaches and solutions to contemporary problems in the field. The work is subdivided into three areas of investigation: 1) high-technology systems, 2) modification of current systems to increase functionality, and 3) development of smaller, more portable and cost-effective systems. High-Technology Systems: A novel high-speed Optical Coherence Tomography (OCT) system with integrated simultaneous high-speed photography was developed, achieving better operational speed than is currently available commercially. The mechanical design of the system featured a novel 8-axis alignment system. A full set of capture, analysis, and post-processing software was developed, providing custom analysis systems for ophthalmic OCT imaging and expanding the current capabilities of the technology. A large clinical trial was undertaken to test the dynamics of contact lens edge interaction with the cornea in vivo. The interaction between lens edge design, lens base curvature, post-insertion times and edge positions was investigated. A novel method for correction of optical distortion when assessing lens indentation was also demonstrated. Modification of Current Systems: A commercial autorefractor, the WAM-5500, was modified with the addition of extra hardware and a custom software and firmware solution to produce a system capable of measuring dynamic accommodative response to various stimuli in real time. A novel software package to control the data capture process was developed, allowing real-time monitoring of data by the practitioner and adding considerable functionality to the instrument beyond the standard system.
The device was used to assess the accommodative response differences between subjects who had worn UV-blocking contact lenses for 5 years versus a control group that had not worn UV-blocking lenses. While the standard static measurement of accommodation showed no differences between the two groups, the UV-blocking group did show faster accommodative rise and fall times, thus demonstrating the benefits of the modification of this commercially available instrumentation. Portable and Cost-Effective Systems: A new instrument was developed to expand the capability of the now defunct Keeler Tearscope. The device provided a similar capability in allowing observation of the reflected mires from the tear film surface, but with the added advantage of being able to record the observations. The device was tested comparatively against the Tearscope and other tear film break-up techniques, demonstrating its potential. In conclusion: this work has successfully demonstrated that interdisciplinary research between engineering and ophthalmic research can provide new and novel instrumented solutions, as well as adding to the sum of scientific understanding in the ophthalmic field.
Abstract:
Guest editorial. Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, as well as data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Management Sector. He is on the editorial board of several international journals and a co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in various international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.
Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector. This special issue focuses on holistic, applied research on performance measurement in energy sector management, aiming to bridge the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation of 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates, and to provide bias-corrected estimations and confidence intervals for the point estimates. The author reveals that, in the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare the relative performance of three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors discover that refineries in the Rocky Mountain region performed the best, and that about 60 percent of oil refineries in the sample could improve their efficiencies further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA) methods, to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model for analyzing cost-only efficiency. The second model includes (inverse) quality by modelling total customer minutes lost as an input. The third model is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as the outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple criteria analysis weighting approach to evaluate energy and climate policy.
The proposed approach is akin to the analytic hierarchy process, which consists of pairwise comparisons, consistency verification, and criteria prioritization. In the approach, stakeholders and experts in the energy policy field are incorporated in the evaluation process through an interactive means of expressing their preferences with verbal, numerical, and visual representations. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic impact, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm's efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in a firm's efficiency following an acquisition, and only weak evidence for efficiency improvements caused by the new shareholder. The author also discovers that parent companies appear not to influence a subsidiary's efficiency positively. In addition, the analysis shows a negative impact of a joint venture on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
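As a minimal illustration of the frontier-efficiency idea behind the DEA papers above: in the single-input, single-output special case, efficiency reduces to each unit's output/input ratio relative to the best performer. Full DEA models handle multiple inputs and outputs via linear programming; the plant data here are invented.

```python
def ratio_efficiency(units):
    """Single-input, single-output frontier efficiency: each unit's
    output/input ratio divided by the best ratio in the sample, so
    the frontier unit scores 1.0 and all others score below it."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Invented generation units: (fuel input, electricity output)
plants = {"A": (100.0, 80.0), "B": (120.0, 120.0), "C": (90.0, 45.0)}
scores = ratio_efficiency(plants)  # B lies on the frontier
```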
Abstract:
Some critical aspects of a new kind of on-line measurement technique for micro and nanoscale surface measurements are described. This attempts to use spatial light-wave scanning to replace mechanical stylus scanning, and an optical fibre interferometer to replace optically bulky interferometers for measuring the surfaces. The basic principle is based on measuring the phase shift of a reflected optical signal. Wavelength-division-multiplexing and fibre Bragg grating techniques are used to carry out wavelength-to-field transformation and phase-to-depth detection, allowing a large dynamic measurement ratio (range/resolution) and high signal-to-noise ratio with remote access. In effect the paper consists of two parts: multiplexed fibre interferometry and remote on-machine surface detection sensor (an optical dispersive probe). This paper aims to investigate the metrology properties of a multiplexed fibre interferometer and to verify its feasibility by both theoretical and experimental studies. Two types of optical probes, using a dispersive prism and a blazed grating, respectively, are introduced to realize wavelength-to-spatial scanning.
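The phase-to-depth detection step rests on the standard interferometric relation for a reflective surface: a height change h alters the round-trip optical path by 2h, producing a phase shift of 4πh/λ. A minimal sketch follows; the 1550 nm wavelength is an assumed, typical fibre value, not necessarily the one used in the paper.

```python
import math

def phase_to_depth(phase_shift_rad, wavelength_m):
    """Surface height from interferometric phase: a round-trip path
    change of 2h gives a phase shift of 4*pi*h/lambda, so
    h = phase * lambda / (4*pi)."""
    return phase_shift_rad * wavelength_m / (4.0 * math.pi)

# A pi/2 phase shift at an assumed 1550 nm wavelength corresponds to
# a surface height change of lambda/8:
h = phase_to_depth(math.pi / 2, 1550e-9)
```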
Abstract:
A vision system is applied to full-field displacement and deformation measurements in solid mechanics. A speckle-like pattern is first formed on the surface under investigation. To determine the displacement field of one speckle image with respect to a reference speckle image, sub-images, referred to as Zones Of Interest (ZOI), are considered. The field is obtained by matching a ZOI in the reference image with the corresponding ZOI in the deformed image. Two image processing techniques are used to implement the matching procedure: the cross-correlation function and the minimum mean square error (MMSE) of the ZOI intensity distribution. The two algorithms are compared and the influence of the ZOI size on the accuracy of the measurements is studied.
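The MMSE variant of the matching step can be sketched in one dimension as an exhaustive search over candidate offsets. Real digital image correlation codes work on 2-D ZOIs and refine the match to sub-pixel accuracy; the intensity values below are invented.

```python
def mmse_match(reference_zoi, search_image, zoi_size):
    """Locate a Zone Of Interest (ZOI) in a larger signal by minimising
    the mean square error of intensities over every candidate offset."""
    best_offset, best_err = None, float("inf")
    for off in range(len(search_image) - zoi_size + 1):
        window = search_image[off:off + zoi_size]
        err = sum((a - b) ** 2 for a, b in zip(reference_zoi, window)) / zoi_size
        if err < best_err:
            best_offset, best_err = off, err
    return best_offset

ref = [0, 5, 9, 5, 0]             # intensity pattern of the reference ZOI
moved = [1, 0, 0, 5, 9, 5, 0, 1]  # same pattern displaced by 2 pixels
displacement = mmse_match(ref, moved, len(ref))  # -> 2
```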
Abstract:
Aircraft assembly is the most important part of aircraft manufacturing. A large number of assembly fixtures must be used to ensure assembly accuracy in the aircraft assembly process. Traditional fixed assembly fixtures cannot accommodate changes in aircraft type, so digital flexible assembly fixtures were developed and gradually applied in aircraft assembly. Digital flexible assembly technology has also become one of the research directions in the field of aircraft manufacturing. Aircraft flexible assembly can be divided into three stages: component-level flexible assembly, large-component-level flexible assembly, and large-component alignment and joining. This article introduces the architecture of flexible assembly systems and the principles of three types of flexible assembly fixtures. The key technologies of digital flexible assembly are also discussed. The digital metrology system provides the basis for accurate digital flexible assembly; aircraft flexible assembly systems mainly use laser tracking metrology systems and indoor Global Positioning System metrology systems. With the development of flexible assembly technology, digital flexible assembly systems will be widely used in aircraft manufacturing.
Abstract:
This study explores institutional complexity in Thai State-Owned Enterprises (SOEs). In doing so, a qualitative approach has been employed in order to identify institutional logics in the field of Thai SOEs and to understand organisational and individual perceptions of institutional complexity in the implementation of performance measurement systems (PMS), and how organisations and individuals respond to that complexity. To achieve this goal, two Thai SOEs were studied, both of which faced challenges in the implementation of Economic Value Management (EVM) and the Balanced Scorecard (BSC), as well as difficulties in linking their individual BSCs and incentive systems. The qualitative data were collected from semi-structured interviews and document reviews. The empirical findings reveal that the institutional logics in the field of Thai SOEs are the logics of bureaucracy, commercial operations, social activities, seniority and unity. Given these multiple embedded institutional logics, the SOEs experienced institutional complexity in the implementation of PMS. The results suggest that the organisations decoupled the EVM and loosely coupled the BSC from organisational practices to cope with institutional complexity and conflicting institutional demands. The evidence also shows that the institutional logics influence SOEs' actions towards resisting changes to incentive systems and to the relationship between individual BSCs and incentives.
Abstract:
In today's world we consume ever more resources whose effects our home, the Earth, is simply no longer able to restore. Alongside many other phenomena, the globalization of the economy, intensifying competition, the continued spread of the consumer society and, consequently, the growing intensity of logistics processes play a key role in this. The criticism directed at logistics should motivate company professionals to change this, and measuring the carbon footprint of current operations is an essential first step: only an assessment of the present state can serve as a basis for improvement. The aim of the authors' study is to present a practical application of carbon footprint calculation. In the form of a case study, they describe the methodology applied in the carbon footprint calculation of the Hungarian subsidiary of a large international company. The calculations focus on the company's distribution logistics processes, examining in particular the carbon dioxide emissions of road transport and warehousing. The authors strove for accuracy, using the most recent conversion factors computed for the Hungarian energy mix. They are convinced that such case studies are useful, since the methodology presented can serve as a model and a guide for other companies, and they hope it will help more Hungarian companies begin the systematic, scientifically grounded measurement of their carbon dioxide emissions. ____ Due to globalization, intense competition and the consumer society, logistics processes have been intensified during the last decades. This has led to increased environmental strain, generating intense criticism of the logistics profession. In order to decrease the environmental burden of logistics, several professionals and companies have tried to make progress in this field and introduced techniques capable of measuring the carbon footprint of logistics. Still, public case studies are very limited.
The paper presents the case of the Hungarian subsidiary of a big multinational FMCG firm. Calculations are built on the actual conversion factors developed for the Hungarian energy mix. A complex set of key performance indicators (KPIs) usable to capture key characteristics of the present situation is presented. Not only are the constructs of these KPIs described in the paper, but a detailed description of the methodology used to calculate them is also given. The authors hope such a detailed case study description will help other companies to initiate sustainable logistics programs.
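The activity-data times conversion-factor approach underlying such calculations can be sketched as follows. The factors below are made-up placeholders, not the actual Hungarian energy-mix values used in the case study.

```python
# Hypothetical emission factors, kg CO2 per unit of activity; real
# calculations would use factors derived for the national energy mix.
CONVERSION_FACTORS = {
    "road_transport_tkm": 0.11,  # per tonne-kilometre hauled
    "warehouse_kwh": 0.30,       # per kWh of warehouse electricity
}

def carbon_footprint(activities):
    """Total CO2 emissions for activities given as {name: amount}."""
    return sum(amount * CONVERSION_FACTORS[name]
               for name, amount in activities.items())

total = carbon_footprint({"road_transport_tkm": 10000, "warehouse_kwh": 5000})
```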
Abstract:
Clusters are aggregations of atoms or molecules, generally intermediate in size between individual atoms and aggregates that are large enough to be called bulk matter. Clusters can also be called nanoparticles, because their size is on the order of nanometers or tens of nanometers. A new field has begun to take shape called nanostructured materials, which takes advantage of these atom clusters. The ultra-small size of the building blocks leads to dramatically different properties, and it is anticipated that such atomically engineered materials will be able to be tailored to perform as no previous material could. The idea of the ionized cluster beam (ICB) thin film deposition technique was first proposed by Takagi in 1972. It was based upon using a supersonic jet source to produce, ionize and accelerate beams of atomic clusters onto substrates in a vacuum environment. Conditions for formation of cluster beams suitable for thin film deposition have only recently been established, following twenty years of effort. Zinc clusters over 1,000 atoms in average size have been synthesized both in our lab and in that of Gspann. More recently, other methods of synthesizing clusters and nanoparticles, using different types of cluster sources, have come under development. In this work, we studied different aspects of nanoparticle beams. The work includes refinement of a model of the cluster formation mechanism, development of a new real-time, in situ cluster size measurement method, and study of the use of ICB in the fabrication of semiconductor devices. The formation process of the vaporized-metal cluster beam was simulated and investigated using classical nucleation theory and one-dimensional gas flow equations. Zinc cluster sizes predicted at the nozzle exit are in good quantitative agreement with experimental results in our laboratory. A novel in situ real-time mass, energy and velocity measurement apparatus has been designed, built and tested.
This small time-of-flight mass spectrometer is suitable for use in our cluster deposition systems and does not suffer from problems that affect other methods of cluster size measurement, such as: requirement for specialized ionizing lasers, inductive electrical or electromagnetic coupling, dependency on the assumption of homogeneous nucleation, limits on the measurable size, and lack of real-time capability. Measured ion energies using the electrostatic energy analyzer are in good accordance with values obtained from computer simulation. The velocity v is measured by pulsing the cluster beam and measuring the delay between the pulse and the analyzer output current. The mass of a particle is then calculated from m = 2E/v^2. The error in the measured value of the background gas mass is on the order of 28% of the mass of one N2 molecule, which is negligible for the measurement of large clusters. This resolution in cluster size measurement is very acceptable for our purposes. Selective area deposition onto conducting patterns overlying insulating substrates was demonstrated using intense, fully-ionized cluster beams. Parameters influencing the selectivity are ion energy, repelling voltage, the ratio of the conductor to insulator dimension, and substrate thickness.
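The mass calculation from the measured energy and the pulsed time-of-flight delay can be sketched as follows; the flight path length and numbers are illustrative, not taken from the apparatus.

```python
def cluster_mass(energy_j, flight_path_m, delay_s):
    """Cluster mass from kinetic energy and time of flight:
    v = L / t, then m = 2 * E / v**2 as in the relation above."""
    v = flight_path_m / delay_s
    return 2.0 * energy_j / v ** 2

# Round trip: a 1e-21 kg particle at 1000 m/s has E = 0.5*m*v**2 = 5e-16 J;
# over an assumed 1 m flight path the measured delay is 1 ms.
m = cluster_mass(5e-16, 1.0, 1e-3)  # -> 1e-21 kg
```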
Abstract:
We measure the energy emitted by extensive air showers in the form of radio emission in the frequency range from 30 to 80 MHz. Exploiting the accurate energy scale of the Pierre Auger Observatory, we obtain a radiation energy of 15.8 +/- 0.7 (stat) +/- 6.7 (syst) MeV for cosmic rays with an energy of 1 EeV arriving perpendicularly to a geomagnetic field of 0.24 G, scaling quadratically with the cosmic-ray energy. A comparison with predictions from state-of-the-art first-principles calculations shows agreement with our measurement. The radiation energy provides direct access to the calorimetric energy in the electromagnetic cascade of extensive air showers. Comparison with our result thus allows the direct calibration of any cosmic-ray radio detector against the well-established energy scale of the Pierre Auger Observatory.
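The quoted quadratic scaling implies, at the reference geometry (perpendicular incidence in a 0.24 G field), the following simple relation; the dependence on geomagnetic angle and field strength is deliberately omitted in this sketch.

```python
# Reference radiation energy from the measurement above: 15.8 MeV at
# 1 EeV, perpendicular incidence, 0.24 G geomagnetic field.
E_RAD_REF_MEV = 15.8

def radiation_energy_mev(cosmic_ray_energy_eev):
    """Expected radiation energy (MeV) for a shower of the given
    primary energy (EeV), scaling quadratically with energy."""
    return E_RAD_REF_MEV * cosmic_ray_energy_eev ** 2

e10 = radiation_energy_mev(10.0)  # 100x the 1 EeV reference value
```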
Abstract:
The gasotransmitter hydrogen sulfide (H2S) is known as an important regulator in several physiological and pathological responses. Among the challenges facing the field is the accurate and reliable measurement of hydrogen sulfide bioavailability. We have reported an approach to discretely measure sulfide and sulfide pools using the monobromobimane (MBB) method coupled with reversed-phase high-performance liquid chromatography (RP-HPLC). The method involves the derivatization of sulfide with excess MBB under precise reaction conditions at room temperature to form sulfide dibimane (SDB). The resultant fluorescent SDB is analyzed by RP-HPLC using fluorescence detection, with a limit of detection for SDB of 2 nM. Care must be taken to avoid conditions that may confound H2S measurement with this method. Overall, RP-HPLC with fluorescence detection of SDB is a useful and powerful tool to measure biological sulfide levels.
Abstract:
Optical nanofibres are ultrathin optical fibres with a waist diameter typically less than the wavelength of the light being guided through them. Cold atoms can couple to the evanescent field of the nanofibre-guided modes, and such systems are emerging as promising technologies for the development of atom-photon hybrid quantum devices. Atoms within the evanescent field region of an optical nanofibre can be probed by sending near- or on-resonant light through the fibre; however, the probe light can detrimentally affect the properties of the atoms. In this paper, we report on the modification of the local temperature of laser-cooled 87Rb atoms in a magneto-optical trap centred around an optical nanofibre when near-resonant probe light propagates through it. A transient absorption technique has been used to measure the temperature of the affected atoms, and temperature variations from 160 μK to 850 μK, for a probe power ranging from 0 to 50 nW, have been observed. This effect could have implications for the use of optical nanofibres to probe and manipulate cold or ultracold atoms.
Abstract:
The data files give the basic field and laboratory data on five ponds in the northeast Siberian Arctic tundra on Samoylov. The files contain water and soil temperature data of the ponds, methane fluxes, measured with closed chambers in the centres without vascular plants and the margins with vascular plants, the contribution of plant mediated fluxes on total methane fluxes, the gas concentrations (methane and dissolved inorganic carbon, oxygen) in the soil and the water column of the ponds, microbial activities (methane production, methane oxidation, aerobic and anaerobic carbon dioxide production), total carbon pools in the different horizons of the bottom soils, soil bulk density, soil substance density, and soil porosity.