920 results for "measurement techniques of aerosols"


Relevance:

100.00%

Publisher:

Abstract:

This article considers the role of accounting in organisational decision making. It challenges the supposedly rational nature of decisions made in organisations through accounting models, and the problems of predicting the future through such models. The use of accounting in this manner is evaluated from an epochal postmodern stance. Issues raised by chaos theory and the uncertainty principle are used to demonstrate problems with the predictive ability of accounting models. The authors argue that any consideration of the predictive value of accounting needs to change to incorporate a recognition of the turbulent external environment, if it is to be of use for organisational decision making. Thus it is argued that the role of accounting as a mechanism for creating knowledge about the future is fundamentally flawed. We take this as a starting point to examine the real purpose of accounting's predictive techniques, arguing from their ritualistic role in myth creation to the cultural benefits of such flawed techniques.

Abstract:

Whereas the competitive advantage of firms can arise from size and position within their industry, as well as from physical assets, the pattern of competition in advanced economies has increasingly come to favour those firms that can mobilise knowledge and technological skills to create novelty in their products. At the same time, regions are attracting growing attention as an economic unit of analysis, with firms increasingly locating their functions in select regions within the global space. This article introduces the concept of knowledge competitiveness, defined as an economy's knowledge capacity, capability and sustainability, and the extent to which this knowledge is translated into economic value and transferred into the wealth of its citizens. The article discusses how the knowledge competitiveness of regions is measured and introduces the World Knowledge Competitiveness Index, the first composite and relative measure of the knowledge competitiveness of the globe's best-performing regions.
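
The general notion of a composite index can be made concrete with a small sketch. This is not the WKCI methodology (which the abstract does not specify); it only illustrates the common pattern of min-max normalising each indicator across regions and taking a weighted average. The indicator names, values and equal weights are invented.

```python
# Illustrative composite-index sketch (NOT the actual WKCI methodology):
# min-max normalise each knowledge indicator across regions, then take a
# weighted average of the normalised scores. All data here are invented.

def composite_index(regions, weights=None):
    """regions: dict mapping region name -> dict of indicator -> raw value."""
    indicators = sorted({k for vals in regions.values() for k in vals})
    if weights is None:
        weights = {ind: 1.0 / len(indicators) for ind in indicators}
    # Min-max normalise each indicator to [0, 1] across all regions.
    lo = {ind: min(r[ind] for r in regions.values()) for ind in indicators}
    hi = {ind: max(r[ind] for r in regions.values()) for ind in indicators}
    scores = {}
    for name, vals in regions.items():
        scores[name] = sum(
            weights[ind] * (vals[ind] - lo[ind]) / (hi[ind] - lo[ind])
            for ind in indicators
        )
    return scores

regions = {
    "A": {"rd_spend": 3.0, "patents": 120, "grads": 40},
    "B": {"rd_spend": 1.0, "patents": 30, "grads": 20},
    "C": {"rd_spend": 2.0, "patents": 60, "grads": 35},
}
scores = composite_index(regions)
```

A region that leads on every indicator scores 1.0 and one that trails on every indicator scores 0.0, which is what makes the index a relative measure across the regions compared.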

Abstract:

This review compares the results of studies that have investigated the impact of lutein and zeaxanthin supplementation on macular pigment optical density (MPOD) with those that have investigated the reliability of the techniques used to measure it. The review focuses on studies that used heterochromatic flicker photometry to measure MPOD, as this is the only technique currently available commercially to clinicians. Through a multi-staged, systematic approach, we identified peer-reviewed articles that reported on supplementation with lutein and/or zeaxanthin and/or meso-zeaxanthin, and on MPOD measurement techniques. Twenty-four studies have investigated the repeatability of MPOD measurements using heterochromatic flicker photometry. Of these, 10 provided a coefficient of repeatability, or data from which the coefficient could be calculated, with values ranging from 0.06 to 0.58. The lowest coefficient of repeatability assessed on naïve subjects alone was 0.08. These values tell us that, at best, only changes greater than 0.08 can be considered clinically significant and, at worst, only changes greater than 0.58. Six studies assessed the effect of supplementation with up to 20 mg/day lutein on MPOD measured using heterochromatic flicker photometry; the mean increase in MPOD ranged from 0.025 to 0.09. It therefore seems reasonable to conclude that the chance of eliciting an increase in MPOD during six months of daily supplementation with between 10 and 20 mg lutein that is large enough to be detected by heterochromatic flicker photometry in an individual patient is small.
Commercially available heterochromatic flicker photometers for MPOD assessment in the clinical environment appear to demonstrate particularly poor coefficients of repeatability. Clinicians should exercise caution when considering the purchase of these instruments for monitoring MPOD responses to supplementation in individual patients.
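
The coefficient of repeatability quoted above can, in principle, be reproduced from paired test-retest readings. The sketch below assumes the common Bland-Altman definition (1.96 times the standard deviation of the within-subject differences); the MPOD readings are invented illustrative numbers, not data from the reviewed studies.

```python
# Sketch of a Bland-Altman coefficient of repeatability (CoR) for paired
# test-retest MPOD readings: CoR = 1.96 * SD of the within-subject
# differences. A measured change larger than the CoR can be considered a
# real change rather than measurement noise. Readings below are invented.
import statistics

def coefficient_of_repeatability(test, retest):
    diffs = [a - b for a, b in zip(test, retest)]
    return 1.96 * statistics.stdev(diffs)

test   = [0.42, 0.35, 0.51, 0.28, 0.44, 0.39]
retest = [0.45, 0.33, 0.49, 0.31, 0.40, 0.41]
cor = coefficient_of_repeatability(test, retest)
```

With these synthetic readings the CoR comes out near 0.06, i.e. at the good end of the 0.06-0.58 range reported in the review.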

Abstract:

Whilst research on work group diversity has proliferated in recent years, relatively little attention has been paid to the precise definition of diversity or its measurement. One of the few studies to do so is Harrison and Klein's (2007) typology, which defined three types of diversity (separation, variety and disparity) and suggested possible indices with which they should be measured. However, their typology is limited by its association of diversity types with variable measurement, by a lack of clarity over the meaning of variety, and by the absence of clear guidance about which diversity index should be employed. In this thesis I develop an extended version of the typology, comprising four diversity types (separation, range, spread and disparity), and propose specific indices to be used for each type of diversity with each variable type (ratio, interval, ordinal and nominal). Indices are chosen or derived from first principles based on the precise definition of the diversity type. I then test the usefulness of these indices in predicting outcomes of diversity compared with other indices, using both an extensive simulated data set (to estimate the effects of mis-specification of diversity type or index) and eight real data sets (to examine whether the proposed indices produce the strongest relationships with hypothesised outcomes). The analyses lead to the conclusion that the indices proposed in the typology are at least as good as, and usually better than, other indices in terms of both effect sizes and the power to find significant results, and thus provide evidence to support the typology. Implications for theory and methodology are discussed.
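
As a concrete illustration of the three original diversity types, the sketch below implements indices commonly associated with Harrison and Klein's typology: the standard deviation for separation, Blau's index for variety and the coefficient of variation for disparity. These are plausible choices from the wider literature, not necessarily the specific indices this thesis proposes, and the team data are invented.

```python
# Hedged sketch of three classic work-group diversity indices: separation
# (standard deviation of attitudes), variety (Blau's index over nominal
# categories) and disparity (coefficient of variation of a resource such
# as pay). Team data below are invented.
import statistics
from collections import Counter

def separation(values):
    """Spread of positions on an attitude scale: population SD."""
    return statistics.pstdev(values)

def variety(categories):
    """Blau's index 1 - sum(p_k^2) over nominal category proportions."""
    n = len(categories)
    return 1 - sum((c / n) ** 2 for c in Counter(categories).values())

def disparity(values):
    """Concentration of a valued resource: coefficient of variation."""
    return statistics.pstdev(values) / statistics.mean(values)

team_opinions = [1, 1, 7, 7]              # maximal separation on a 1-7 scale
team_roles = ["eng", "eng", "sales", "hr"]
team_pay = [30, 30, 30, 120]              # one member paid far more
```

Note how each index is tied to a variable type: separation assumes at least interval data, variety assumes nominal data, and disparity assumes ratio data, which is exactly the index-to-variable-type mapping question the thesis addresses.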

Abstract:

Through the application of novel signal processing techniques we are able to measure physical measurands with both high accuracy and low noise susceptibility. The first interrogation scheme is based upon a CCD spectrometer. We compare different algorithms for resolving the Bragg wavelength from a low-resolution discrete representation of the reflected spectrum, and present optimal processing methods for providing a high-integrity measurement from the reflection image. Our second sensing scheme uses a novel network of sensors to measure the distributive strain response of a mechanical system. Using neural network processing methods, we demonstrate the measurement capabilities of a scalable, low-cost fibre Bragg grating sensor network. This network has been shown to be comparable in performance with existing fibre Bragg grating sensing techniques, at a greatly reduced implementation cost.
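
One of the simplest algorithms of the kind compared above is the intensity-weighted centroid of the sampled reflection peak. The sketch below applies it to a synthetic Gaussian peak on a coarse wavelength grid; it is only one plausible algorithm and the numbers are invented, not the paper's data.

```python
# Centroid algorithm for resolving a Bragg wavelength from a low-resolution
# discrete reflection spectrum (one of several possible algorithms; data
# here are a synthetic Gaussian peak, not measured spectra).
import math

def bragg_centroid(wavelengths_nm, intensities):
    """Intensity-weighted mean wavelength of the sampled reflection peak."""
    total = sum(intensities)
    return sum(w, ) if False else sum(
        w * i for w, i in zip(wavelengths_nm, intensities)) / total

# Synthetic Gaussian peak centred at 1550.1 nm, sampled at 0.2 nm pitch.
grid = [1548.0 + 0.2 * k for k in range(21)]        # 1548.0 .. 1552.0 nm
peak = [math.exp(-((w - 1550.1) ** 2) / (2 * 0.3 ** 2)) for w in grid]
wavelength = bragg_centroid(grid, peak)
```

Although the grid pitch is 0.2 nm, the centroid recovers the sub-sample peak position at 1550.1 nm, which is why such algorithms matter for low-resolution spectrometer interrogation.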

Abstract:

This thesis is concerned with various aspects of air pollution due to smell: the impact it has on communities exposed to it, the means by which it may be controlled, and the manner in which a local authority may investigate the problems it causes. The approach is a practical one, drawing on examples occurring within a local authority's experience; for that reason the research is anecdotal and is not a comprehensive treatise on the full range of options available. Odour pollution is not yet a well-organised discipline and might be considered esoteric, as it is necessary to incorporate elements of both science and the humanities. It has been necessary to range widely across a number of aspects of the subject, so discussion is often restricted, but many references have been included to enable a reader to pursue a particular point in greater depth. In a 'fuzzy' subject there is often a yawning gap separating theory and practice; thus case studies have been used to illustrate the interplay of various disciplines in the resolution of a problem. The essence of any science is observation and measurement. Observations have been made of the spread of odour pollution through a community, and of the relevant meteorological data, so that a mathematical model could be constructed and its predictions checked. The model has been used to explore the results of some options for odour control. Measurements of odour perception and human behaviour seldom have the precision and accuracy of the physical sciences. However, methods of social research enabled individual perception of odour pollution to be quantified and an insight gained into the reaction of a community exposed to it. Odours have four attributes that can be measured and that together provide a complete description of their perception.
No objective techniques of measurement have yet been developed, but in this thesis simple, structured procedures of subjective assessment have been improvised, and their use enabled the functioning of the components of an odour control system to be assessed. Such data enabled the action of the system to be communicated in terms understood by a non-specialist audience.
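
The classic mathematical model for the spread of an emission through a community, and a plausible form for the model mentioned above, is the Gaussian plume. The sketch below is a minimal version; the emission rate, wind speed and dispersion coefficients are invented illustrative numbers, not the thesis's calibrated values.

```python
# Minimal Gaussian plume sketch of the kind of odour dispersion model the
# thesis describes. All numeric inputs are invented for illustration; in a
# real model sigma_y and sigma_z grow with downwind distance and stability.
import math

def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration.
    Q: emission rate, u: wind speed, y: crosswind offset, z: receptor
    height, H: effective source height, sigma_y/sigma_z: dispersion widths."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))  # ground reflection
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level concentration on and off the plume centreline.
c_centre = plume_concentration(Q=1.0, u=3.0, y=0.0, z=0.0, H=10.0,
                               sigma_y=35.0, sigma_z=18.0)
c_offset = plume_concentration(Q=1.0, u=3.0, y=70.0, z=0.0, H=10.0,
                               sigma_y=35.0, sigma_z=18.0)
```

The model's prediction that concentration (and hence odour exposure) falls off as a Gaussian in the crosswind direction is exactly the kind of prediction that can be checked against community observations and meteorological records.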

Abstract:

Methods of dynamic modelling and analysis of structures, for example the finite element method, are well developed. However, it is generally agreed that accurate modelling of complex structures is difficult, and for critical applications it is necessary to validate or update the theoretical models using data measured from actual structures. Techniques for identifying the parameters of linear dynamic models using vibration test data have attracted considerable interest recently. However, no method has received general acceptance, owing to a number of difficulties: (i) the incomplete number of vibration modes that can be excited and measured, (ii) the incomplete number of coordinates that can be measured, (iii) inaccuracy in the experimental data, and (iv) inaccuracy in the model structure. This thesis reports on a new approach to updating the parameters of a finite element model, as well as of a lumped parameter model with a diagonal mass matrix. The structure and its theoretical model are equally perturbed by adding mass or stiffness, and the incomplete set of eigen-data is measured. The parameters are then identified by iterative updating of the initial estimates, by sensitivity analysis, using eigenvalues, or both eigenvalues and eigenvectors, of the structure before and after perturbation. It is shown that, with a suitable choice of the perturbing coordinates, exact parameters can be identified if the data and the model structure are exact. The theoretical basis of the technique is presented. To cope with measurement errors and possible inaccuracies in the model structure, a well-known Bayesian approach is used to minimise the least-squares difference between the updated and the initial parameters. The eigen-data of the structure with added mass or stiffness are also determined using the frequency response data of the unmodified structure by a structural modification technique; thus, mass or stiffness does not have to be added physically.
The mass-stiffness addition technique is demonstrated by simulation examples and laboratory experiments on beams and an H-frame.
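
The core idea of sensitivity-based iterative updating can be illustrated on a toy problem: adjust one stiffness parameter of a two-degree-of-freedom lumped model until a model eigenvalue matches a "measured" one. This is a deliberately simplified sketch with invented values, not the thesis's full Bayesian mass/stiffness-perturbation scheme.

```python
# Toy sensitivity-based model updating: iterate a Newton-style correction
# on one stiffness parameter of a 2-DOF mass-spring model (unit masses,
# springs k1 ground-to-m1 and k2 between masses) until the model's first
# eigenvalue matches a simulated "measured" eigenvalue.
import math

def first_eigenvalue(k1, k2, m1=1.0, m2=1.0):
    """Smallest eigenvalue of M^-1 K from the 2x2 characteristic polynomial."""
    a = (k1 + k2) / m1
    b = -k2 / m1
    c = -k2 / m2
    d = k2 / m2
    tr, det = a + d, a * d - b * c
    return (tr - math.sqrt(tr ** 2 - 4 * det)) / 2

lam_measured = first_eigenvalue(4.0, 2.0)   # pretend k1 = 4 is the true value
k1 = 2.5                                     # initial (wrong) estimate
for _ in range(20):
    lam = first_eigenvalue(k1, 2.0)
    h = 1e-6                                 # finite-difference sensitivity
    sens = (first_eigenvalue(k1 + h, 2.0) - lam) / h
    k1 += (lam_measured - lam) / sens        # iterative sensitivity update
```

With exact data and an exact model structure the iteration recovers the true stiffness, mirroring the exact-identifiability result stated in the abstract; with noisy data this is where the Bayesian weighting would enter.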

Abstract:

This thesis was concerned with investigating methods of improving the potential of the IOP pulse as a clinically useful measure. There were three principal sections to the work. 1. Optimisation of measurement and analysis of the IOP pulse. A literature review, covering the years 1960-2002 and other relevant scientific publications, provided a knowledge base on the IOP pulse. Initial studies investigated suitable instrumentation and measurement techniques. Fourier transformation was identified as a promising method of analysing the IOP pulse, and this technique was developed. 2. Investigation of ocular and systemic variables that affect IOP pulse measurements. In order to recognise clinically important changes in IOP pulse measurement, studies were performed to identify influencing factors. Fourier analysis was tested against traditional parameters to assess its ability to detect differences in the IOP pulse. In addition, it had been speculated that the waveform components of the IOP pulse contained vascular characteristics analogous to those found in arterial pulse waves; validation studies to test this hypothesis were attempted. 3. The nature of the intraocular pressure pulse in health and disease and its relation to systemic cardiovascular variables. Fourier analysis and traditional parameters were applied to IOP pulse measurements taken on diseased and healthy eyes. Only the derived parameter, pulsatile ocular blood flow (POBF), detected differences in diseased groups. The use of an ocular pressure-volume relationship may have improved the POBF measure's variance in comparison to measurement of the pulse's amplitude or Fourier components. Finally, the importance of the driving force of pulsatile blood flow, the arterial pressure pulse, is highlighted.
A method of combining the measurements of pulsatile blood flow and pulsatile blood pressure to create a measure of ocular vascular impedance is described, along with its advantages for future studies.
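
The Fourier-analysis step described above amounts to decomposing the periodic IOP pulse into harmonic amplitudes. The sketch below does this with a plain discrete Fourier transform on a synthetic pulse; the waveform (mean 15 mmHg, fundamental amplitude 2, second harmonic 0.5) is invented, not real IOP data.

```python
# Sketch of Fourier analysis of an IOP pulse: extract single-sided harmonic
# amplitudes from one period of a synthetic pulse via a plain DFT. The
# waveform parameters are invented illustrative values.
import cmath, math

def harmonic_amplitudes(signal, n_harmonics):
    """Amplitudes of harmonics 1..n_harmonics of a one-period signal."""
    n = len(signal)
    amps = []
    for k in range(1, n_harmonics + 1):
        coeff = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(signal))
        amps.append(2 * abs(coeff) / n)      # single-sided amplitude
    return amps

# Synthetic pulse: mean 15 mmHg, fundamental amplitude 2, 2nd harmonic 0.5.
n = 256
pulse = [15 + 2 * math.sin(2 * math.pi * i / n)
         + 0.5 * math.sin(4 * math.pi * i / n) for i in range(n)]
amps = harmonic_amplitudes(pulse, 3)
```

The analysis cleanly separates the harmonic content from the mean pressure, which is what allows waveform components to be compared across subjects independently of mean IOP.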

Abstract:

Intraocular light scatter is high in certain subject groups, e.g. the elderly, due to increased optical media turbidity, which scatters and attenuates light travelling towards the retina. This causes reduced retinal contrast, especially in the presence of glare light. Such subjects have depressed contrast sensitivity functions (CSFs). Currently available clinical tests do not effectively reflect this visual disability. Intraocular light scatter may be quantified by measuring the CSF with and without glare light and calculating Light Scatter Factors (LSFs). To record the CSF on clinically available equipment (Nicolet CS2000), several psychophysical measurement techniques were investigated, and the 60 sec Method of Increasing Contrast was selected as the most appropriate. It was hypothesised that intraocular light scatter due to particles of different dimensions could be identified by glare sources at wide (30°) and narrow (3.5°) angles. CSFs and LSFs were determined for: (i) subjects in young, intermediate and old age groups; (ii) subjects during recovery from large amounts of induced corneal oedema; (iii) a clinical sample of contact lens (CL) wearers with a group of matched controls. The CSF was attenuated at all measured spatial frequencies in the intermediate and old groups compared with the young group. High LSF values were found only in the old group (over 60 years). It was concluded that CSF attenuation in the intermediate group was due to reduced pupil size, media absorption and/or neural factors; in the old group, the additional factor was a high level of intraocular light scatter of lenticular origin. The rate of reduction of the LSF for the 3.5° glare angle was steeper than that for the 30° angle following induced corneal oedema. This supported the hypothesis, as it was anticipated that epithelial oedema would recover more rapidly than stromal oedema. CSFs and LSFs were markedly abnormal in the CL wearers.
The analytical details and the value of these investigative techniques in contact lens research are discussed.
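
The LSF computation can be sketched under one plausible definition, assumed here rather than taken from the thesis: the ratio of contrast sensitivity without glare to contrast sensitivity with glare at each spatial frequency, so that more intraocular scatter yields a larger factor. The CSF values below are invented.

```python
# Hedged sketch of a Light Scatter Factor: assumed (not the thesis's exact
# definition) to be the ratio of contrast sensitivity measured without
# glare to that measured with glare, per spatial frequency. Invented data.
def light_scatter_factors(cs_no_glare, cs_glare):
    return {f: cs_no_glare[f] / cs_glare[f] for f in cs_no_glare}

# Contrast sensitivity at three spatial frequencies (cycles per degree).
cs_no_glare = {1: 120.0, 6: 200.0, 18: 40.0}
cs_glare    = {1: 60.0, 6: 125.0, 18: 32.0}
lsf = light_scatter_factors(cs_no_glare, cs_glare)
```

Under this definition an eye with negligible scatter gives factors near 1 at every frequency, while glare-induced loss of sensitivity pushes the factor above 1, quantifying the scatter rather than the underlying CSF shape.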

Abstract:

Off-highway motive plant equipment is costly in capital outlay and maintenance. To reduce these overheads and increase site safety and work rate, a technique for assessing and limiting the velocity of such equipment is required. Due to the extreme environmental conditions met on such sites, conventional velocity measurement techniques are inappropriate. Ogden Electronics Limited were formed specifically to manufacture a motive plant safety system incorporating a speed sensor and sanction unit; to date, the only such commercial unit available. However, problems plague the reliability, accuracy and mass production of this unit. This project assesses the company's existing product and, in conjunction with an appreciation of the company's history and structure, concludes that this unit is unsuited to its intended application. Means of improving the measurement accuracy and longevity of this unit, commensurate with the company's limited resources and experience, are proposed, both for immediate retrofit and for longer-term use. This information is presented in the form of a number of internal reports for the company. The off-highway environment is examined and, in conjunction with an evaluation of means of obtaining a returned signal, comparisons of processing techniques, and on-site gathering of previously unavailable data, preliminary designs for an alternative product are drafted. Theoretical aspects are covered by a literature review of ground-pointing radar, vehicular radar, and velocity measuring systems. This review establishes and collates the body of knowledge in areas previously considered unrelated. Based upon this work, a new design is proposed which is suitable for incorporation into the existing company product range. Following production engineering of the design, five units were constructed, tested and evaluated on-site.
After extended field trials, this design has shown itself to possess greater accuracy, reliability and versatility than the existing sensor, at a lower unit cost.
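
The physics behind a ground-pointing vehicle radar of the kind reviewed above is the two-way Doppler relationship: a sensor tilted at an angle to the ground sees a frequency shift proportional to ground speed. The sketch below uses an illustrative 24 GHz carrier and 45° tilt; these are assumed values, not the product's specification.

```python
# Doppler ground-speed sketch for a tilted, ground-pointing vehicle radar:
# f_shift = 2 * v * f_carrier * cos(theta) / c, inverted to recover speed.
# Carrier frequency and tilt angle are illustrative assumptions.
import math

C = 299_792_458.0  # speed of light, m/s

def doppler_shift(speed_mps, carrier_hz, theta_deg):
    """Two-way Doppler shift for a sensor tilted theta from the horizontal."""
    return 2 * speed_mps * carrier_hz * math.cos(math.radians(theta_deg)) / C

def speed_from_shift(shift_hz, carrier_hz, theta_deg):
    return shift_hz * C / (2 * carrier_hz * math.cos(math.radians(theta_deg)))

shift = doppler_shift(5.0, 24.125e9, 45.0)   # 5 m/s plant vehicle, 24 GHz
recovered = speed_from_shift(shift, 24.125e9, 45.0)
```

At these values a 5 m/s vehicle produces a shift of only a few hundred hertz, which is why signal processing of the returned signal, and hence the processing-technique comparisons in the project, matters so much for accuracy.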

Abstract:

Purpose: To compare graticule and image capture assessment of the lower tear film meniscus height (TMH). Methods: Lower tear film meniscus height measures were taken in the right eyes of 55 healthy subjects at two study visits separated by 6 months. Two images of the TMH were captured in each subject with a digital camera attached to a slit-lamp biomicroscope and stored in a computer for later analysis. Using the better of the two images, the TMH was quantified by manually drawing a line across the tear meniscus profile, following which the TMH was measured in pixels and converted into millimetres, where one pixel corresponded to 0.0018 mm. Additionally, graticule measures were carried out by direct observation using a calibrated graticule inserted into the same slit-lamp eyepiece. The graticule was calibrated so that actual readings, in 0.03 mm increments, could be made with a 40× ocular. Results: Smaller values of TMH were found in this study compared to previous studies. TMH, as measured with the image capture technique (0.13 ± 0.04 mm), was significantly greater (by approximately 0.01 ± 0.05 mm, p = 0.03) than that measured with the graticule technique (0.12 ± 0.05 mm). No bias was found across the range sampled. Repeatability of the TMH measurements taken at the two study visits showed that graticule measures were significantly different (0.02 ± 0.05 mm, p = 0.01) yet highly correlated (r = 0.52, p < 0.0001), whereas image capture measures were similar (0.01 ± 0.03 mm, p = 0.16) and also highly correlated (r = 0.56, p < 0.0001). Conclusions: Although graticule and image analysis techniques showed similar mean values for TMH, the image capture technique was more repeatable than the graticule technique; this can be attributed to the higher measurement resolution of image capture (0.0018 mm) compared to the graticule (0.03 mm). © 2006 British Contact Lens Association.
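
The pixel-to-millimetre conversion in the image-capture method is simple enough to state directly, using the calibration figure given in the abstract (0.0018 mm per pixel); the 72-pixel example below is invented for illustration.

```python
# Pixel-to-millimetre conversion for the image-capture TMH measure, using
# the calibration stated in the abstract. The example pixel count is
# invented; only the scale factor comes from the text.
MM_PER_PIXEL = 0.0018   # stated calibration: one pixel = 0.0018 mm

def tmh_mm(pixels):
    return pixels * MM_PER_PIXEL

height = tmh_mm(72)     # e.g. a meniscus spanning 72 pixels -> ~0.13 mm
```

One graticule increment (0.03 mm) spans about 17 image pixels, which makes concrete the resolution advantage credited with the image-capture technique's better repeatability.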

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infra-red radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. To select the best inscription parameters, combinations of different parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica, and their use to characterise many properties of an OCT system (resolution, distortion, sensitivity decay, scan linearity) was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and to improve the quality of the OCT images. Characterisation methods include measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive: standard central processing unit (CPU) based processing can take several minutes to a few hours, making data processing a significant bottleneck. An alternative is expensive hardware-based processing such as field-programmable gate arrays (FPGAs). Recently, however, graphics processing unit (GPU) based data processing methods have been developed to minimise this processing and rendering time.
These techniques include the standard processing methods: a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier-domain optical coherence tomography (FD-OCT) system, which currently processes and renders data in real time; its throughput is presently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and fine tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding has led to several pieces of research that are not only relevant to OCT but have broader importance; for example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the making of phase masks, waveguides and microfluidic channels, and the acceleration of data processing with GPUs is also useful in other fields.
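
The standard FD-OCT processing chain mentioned above can be sketched in a few lines: subtract the background spectrum, apodise, and Fourier-transform the interference signal so that a reflector at depth appears as a peak in the A-scan. The sketch uses synthetic single-reflector data and a plain DFT; a real pipeline would also resample from wavelength to wavenumber and, as in the thesis, run the transform on a GPU.

```python
# Minimal sketch of standard FD-OCT A-scan generation: background
# subtraction, Hann apodisation, then DFT magnitude. Synthetic data: one
# reflector producing fringes at 10 cycles across the sampled spectrum.
import math, cmath

n = 128
background = [100.0] * n
spectrum = [100.0 + 20.0 * math.cos(2 * math.pi * 10 * i / n)
            for i in range(n)]

def dft_mag(sig):
    m = len(sig)
    return [abs(sum(sig[i] * cmath.exp(-2j * math.pi * f * i / m)
                    for i in range(m))) for f in range(m // 2)]

def a_scan(spectrum, background):
    sig = [s - b for s, b in zip(spectrum, background)]
    m = len(sig)
    hann = [0.5 - 0.5 * math.cos(2 * math.pi * i / (m - 1)) for i in range(m)]
    return dft_mag([s * w for s, w in zip(sig, hann)])

scan = a_scan(spectrum, background)
depth_bin = max(range(len(scan)), key=lambda f: scan[f])
```

Because every A-scan repeats the same independent transform, the workload parallelises naturally across GPU threads, which is why GPU implementations reach camera-limited real-time rates.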

Abstract:

Congenital nystagmus (CN) is an ocular-motor disorder characterised by involuntary, conjugated ocular oscillations, and its pathogenesis is still unknown. The pathology is defined as "congenital" because its onset occurs at birth or in the first months of life. Visual acuity in CN subjects is often diminished by the continuous nystagmus oscillations, mainly on the horizontal plane, which disturb image fixation on the retina. However, during short periods in which eye velocity slows down while the target image is placed onto the fovea (called foveation intervals), the image of a given target can still be stable, allowing a subject to reach a higher visual acuity. In CN subjects, visual acuity is usually assessed both with typical measurement techniques (e.g. the Landolt C test) and with eye movement recording in different gaze positions. Offline study of eye movement recordings allows physicians to analyse the main features of the nystagmus, such as waveform shape, amplitude and frequency, and to compute estimated visual acuity predictors. These analytical functions estimate the best corrected visual acuity using foveation time and foveation position variability, hence a reliable estimation of these two parameters is a fundamental factor in assessing visual acuity. This work aims to enhance foveation time estimation in CN eye movement recordings by computing a second-order approximation of the slow phase components of the nystagmus oscillations. Nineteen infrared-oculographic eye-movement recordings from 10 CN subjects were acquired, and the visual acuity assessed with an acuity predictor was compared to that measured in primary position. Results suggest that visual acuity measurements based on foveation time estimates obtained from interpolated data are closer to the values obtained during Landolt C tests. © 2010 IEEE.
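
The second-order approximation of a slow phase amounts to fitting a quadratic to the eye-position samples and then reading foveation time off the fitted velocity. The sketch below shows that idea with a least-squares quadratic fit; the slow-phase samples, sampling rate and velocity threshold are invented illustrative values, not the paper's data or criteria.

```python
# Sketch of a second-order (quadratic) approximation of a nystagmus slow
# phase: least-squares fit of position samples, then count the time the
# fitted eye velocity stays below a (hypothetical) foveation threshold.
def fit_quadratic(ts, ys):
    """Least-squares fit y ~ a*t^2 + b*t + c via the 3x3 normal equations."""
    n = len(ts)
    def s(p):
        return sum(t ** p for t in ts)
    def sy(p):
        return sum(y * t ** p for t, y in zip(ts, ys))
    A = [[s(4), s(3), s(2)],
         [s(3), s(2), s(1)],
         [s(2), s(1), n]]
    rhs = [sy(2), sy(1), sy(0)]
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det3(A)
    coeffs = []
    for j in range(3):           # Cramer's rule, one column at a time
        m = [row[:] for row in A]
        for i in range(3):
            m[i][j] = rhs[i]
        coeffs.append(det3(m) / d)
    return coeffs                # a, b, c

# Synthetic slow-phase samples (degrees) over 0.2 s at 500 Hz.
ts = [i / 500 for i in range(100)]
ys = [3.0 * t * t - 1.2 * t + 0.4 for t in ts]
a, b, c = fit_quadratic(ts, ys)
# Fitted eye velocity v(t) = 2*a*t + b (deg/s); count samples with |v|
# below an invented 0.5 deg/s foveation threshold.
foveation_time = sum(1 / 500 for t in ts if abs(2 * a * t + b) < 0.5)
```

Fitting before thresholding is what makes the foveation-time estimate robust to sample-level noise, which is the improvement in predictor reliability the paper is after.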

Abstract:

This paper details a method of estimating the uncertainty of dimensional measurement for a three-dimensional coordinate measurement machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these points was a fringe-counting interferometer, with a multilateration-like technique employed to establish three-dimensional coordinates; this is an extension of the established technique of comparing measured lengths with calibrated lengths. Specifically, a distributed coordinate measurement device was tested, consisting of a network of Rotary-Laser Automatic Theodolites (R-LATs); this system is known commercially as indoor GPS (iGPS). The method was found to be practical and was used to estimate that the uncertainty of measurement for the basic iGPS system is approximately 1 mm at a 95% confidence level throughout a measurement volume of approximately 10 m × 10 m × 1.5 m. © 2010 IOP Publishing Ltd.
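
The comparison step at the heart of the method can be sketched simply: compute the Euclidean error between each measured coordinate and its calibrated reference, and summarise the error distribution at the 95% level. This is a simplified illustration with invented coordinates, not the paper's uncertainty-evaluation procedure.

```python
# Sketch of comparing measured 3-D coordinates against calibrated reference
# points: per-point Euclidean errors, summarised by a simple 95th-percentile
# expanded-uncertainty estimate. Coordinates (metres) are invented.
import math

def point_errors(measured, reference):
    return [math.dist(m, r) for m, r in zip(measured, reference)]

def percentile95(errors):
    s = sorted(errors)
    idx = max(0, math.ceil(0.95 * len(s)) - 1)
    return s[idx]

reference = [(0.0, 0.0, 0.0), (5.0, 0.0, 1.0),
             (0.0, 8.0, 0.5), (9.0, 9.0, 1.2)]
measured  = [(0.0004, -0.0003, 0.0002), (5.0006, 0.0002, 1.0001),
             (-0.0002, 8.0007, 0.5003), (9.0009, 8.9996, 1.2004)]
errors = point_errors(measured, reference)
u95 = percentile95(errors)
```

In practice many more reference points spread through the 10 m × 10 m × 1.5 m volume would be needed for a defensible 95% statement, but the per-point comparison is the same.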

Abstract:

Market orientation (MO) and marketing performance measurement (MPM) are two of the most widespread strategic marketing concepts among practitioners. However, some have questioned the benefits of extensive investments in MO and MPM. More importantly, little is known about which combinations of MO and MPM are optimal in ensuring high business performance. To address this research gap, the authors analyze a unique data set of 628 firms with a novel method of configurational analysis: fuzzy-set qualitative comparative analysis. In line with prior research, the authors find that MO is an important determinant of business performance. However, to reap its benefits, managers need to complement it with appropriate MPM, the level and focus of which vary across firms. For example, whereas large firms and market leaders generally benefit from comprehensive MPM, small firms may benefit from measuring marketing performance only selectively or by focusing on particular dimensions of marketing performance. The study also finds that many of the highest-performing firms do not follow any of the particular best practices identified.