53 results for Non-contact mapping
Abstract:
BACKGROUND: Retinal vessel oxygen saturation measurements have attracted much attention in recent years as a potential diagnostic parameter in a number of ocular and systemic pathologies. This interest has been heightened by the ability to measure oxygen saturation in vivo using a photographic technique. METHODS: Retinal vessel oxygen saturation in venules and arterioles of 279 retinal vessels of 12 healthy Caucasian participants (mean age: 30 +/- 6 years) was measured consecutively three times to evaluate short-term variation and regional variability of retinal vessel oxygen saturation using dual-wavelength technology (Oxymetry Modul, Imedos, Germany). All subjects underwent standard optometric assessment, including non-contact intra-ocular pressure measurement, and had their systemic blood pressure measured. RESULTS: Vessels were grouped as either near-macula or peripheral, depending on their location. Peripheral arterioles and venules exhibited significantly lower oxygen saturation than their near-macula counterparts (arterioles: 94.7% (SD 3.9) vs. 99.7% (SD 3.2); venules: 65.1% (SD 7.2) vs. 90.3% (SD 6.7)). Both arterioles and venules, whether main branches or vessels feeding and draining the retina near the macula and periphery, showed low short-term variability of oxygen saturation (arterioles: COV 1.2-1.8%; venules: COV 2.9-4.9%). CONCLUSIONS: Retinal arterioles and venules exhibit low short-term variation of oxygen saturation in healthy subjects. Regional differences in oxygen saturation could be a potentially useful marker for risk stratification and diagnosis of area-specific retinal pathology such as age-related macular degeneration and diabetic maculopathy.
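The short-term variability above is quantified by the coefficient of variation (COV = SD/mean x 100%). A minimal sketch of that calculation, using made-up repeated readings rather than the study's data:

```python
import numpy as np

# Three consecutive oxygen-saturation readings (%) for one vessel;
# the values are illustrative, not taken from the study.
readings = np.array([94.1, 95.0, 94.6])

mean = readings.mean()
sd = readings.std(ddof=1)      # sample standard deviation
cov = 100.0 * sd / mean        # coefficient of variation, %

print(f"mean = {mean:.1f}%  SD = {sd:.2f}  COV = {cov:.2f}%")
```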
Abstract:
The initial aim of this project was to develop a non-contact, fibre-optic displacement sensor to operate in the harsh environment of a 'Light Gas Gun' (LGG), which can 'fire' small particles at velocities ranging from 1 to 8.4 km/s. The LGG is used extensively in aerospace research to analyse the effects of high-speed impacts on materials. Ideally, the measurement should be made close to the centre of the impact, to minimise corruption of the data by edge effects, and the sensor should survive the impact. A further requirement is operation at a stand-off distance of ~8 cm. For these reasons we chose to develop a pseudo-confocal intensity sensor, which demonstrated resolution comparable with conventional PVDF sensors, combined with high survivability and low cost. A second sensor was developed based on fibre Bragg gratings (FBGs); although this requires contact with the target, its low weight and very small contact area had minimal effect on the dynamics of the target. The FBG was mounted either on the surface of the target or tangentially between the target and a fixed location. The output signals from the FBG were interrogated in time by a new method. Measurements were made on composite and aluminium plates in the LGG and in low-speed drop tests, with the particle momentum for the drop tests chosen to be similar to that of the particles used in the LGG.
Abstract:
To assess the impact of light scatter, similar to that introduced by cataract, on retinal vessel blood oxygen saturation measurements, using poly-bead solutions of varying concentrations. Eight healthy, young, non-smoking individuals were enrolled in this study. All subjects underwent digital blood pressure measurement, assessment of non-contact intraocular pressure, pupil dilation and retinal vessel oximetry using dual-wavelength photography (Oximetry Module, Imedos Systems, Germany). To simulate light scatter, cells comprising a plastic collar and two plano lenses were filled with solutions of differing concentrations (0.001, 0.002 and 0.004%) of polystyrene microspheres (Polysciences Inc., USA). The adopted light scatter model showed an artifactual increase in venous optical density ratio (p = 0.036), with the 0.004% condition producing significantly higher venous optical density ratio values than images taken without a cell in place. Spectrophotometric analysis, and thus oximetry of the retinal vessels, is altered by artificial light scatter.
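The optical density ratio (ODR) reported above is the standard dual-wavelength oximetry quantity. A hedged sketch of its computation follows, with illustrative intensities and placeholder calibration constants; the Imedos module's actual wavelengths and calibration are not stated in the abstract:

```python
import numpy as np

def optical_density(i_vessel, i_background):
    """Vessel optical density from intensities inside and beside the vessel."""
    return np.log10(i_background / i_vessel)

# Illustrative pixel intensities at an oxygen-sensitive and an isosbestic
# wavelength; not measured values.
od_sensitive = optical_density(i_vessel=105.0, i_background=140.0)
od_isosbestic = optical_density(i_vessel=60.0, i_background=150.0)

odr = od_sensitive / od_isosbestic     # optical density ratio (ODR)

# Saturation is commonly modelled as linear in ODR, SO2 = a + b*ODR;
# a and b below are placeholders standing in for instrument calibration.
a, b = 125.0, -95.0
so2 = a + b * odr
print(f"ODR = {odr:.3f}   SO2 ~ {so2:.1f}%")
```

With a negative slope b, the artifactual rise in venous ODR reported above would read as a spurious fall in computed venous saturation.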
Abstract:
Single-crystal Mo3Si specimens were grown and tested at room temperature using established nanoindentation techniques at various crystallographic orientations. The indentation modulus and hardness were obtained at loads that were large enough to determine bulk properties, yet small enough to avoid cracking the specimens. From the indentation modulus results, anisotropic elastic constants were determined. As the load was initially increased to approximately 1.5 mN, the hardness exhibited a sudden drop that corresponded to a jump in displacement. The resolved shear stress determined from this initial yielding was 10-15% of the shear modulus, but 3-4 times the value obtained from the bulk hardness. Non-contact atomic force microscopy images in the vicinity of indents revealed features consistent with {100}<010> slip.
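Pop-in events like the one at ~1.5 mN are often analysed with a Hertzian spherical-contact estimate of the maximum shear stress beneath the tip. A rough sketch of that estimate follows; the tip radius and reduced modulus are assumed values (neither is given in the abstract), and it is an assumption that the authors used this particular approach:

```python
import math

# Hertzian estimate of the maximum shear stress under a spherical tip at
# the pop-in load. Tip radius R and reduced modulus E* are assumed for
# illustration; they are not given in the abstract.
P = 1.5e-3        # pop-in load, N (from the abstract)
R = 500e-9        # indenter tip radius, m (assumed)
E_star = 250e9    # reduced modulus, Pa (assumed)

# Maximum contact pressure, then maximum shear stress (Poisson's ratio ~0.3).
p0 = (6.0 * P * E_star**2 / (math.pi**3 * R**2)) ** (1.0 / 3.0)
tau_max = 0.31 * p0

print(f"tau_max ~ {tau_max / 1e9:.1f} GPa")
```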
Abstract:
PURPOSE: To assess the impact of human crystalline lens opacification and yellowing, similar to that observed in patients with cataracts, on retinal vessel blood oxygen saturation measurements, using custom-manufactured soft contact lenses. METHODS: Ten healthy, non-smoking individuals were enrolled in this study. All subjects underwent digital blood pressure measurement, assessment of non-contact intra-ocular pressure, pupil dilation and retinal vessel oximetry using dual-wavelength photography (Oximetry Module, Imedos Systems). To simulate lens changes, three different contact lenses were inserted: one to simulate opacities, followed by two more lenses to simulate different levels of lens yellowing (Cantor & Nissel). RESULTS: The measurements obtained showed opposing changes in arterial and venous oxygen saturation and optical density ratio across conditions, resulting in a statistically significant difference in the arterial-minus-venous oxygen saturation value (p = 0.003). However, this difference was significant only for the 'opacity' condition and not for the 'yellowing' conditions. CONCLUSION: Lenticular changes such as cataracts can affect spectrophotometric analysis, in particular dual-wavelength retinal vessel oximetry. Hence, lenticular assessment and cataract grading should be considered when assessing elderly individuals and patient groups who develop cataract earlier in life, such as those with diabetes mellitus.
Abstract:
Surface quality is important in engineering, and a vital aspect of it is surface roughness, which plays an important role in the wear resistance, ductility, and tensile and fatigue strength of machined parts. This paper reports on the development of a geometrical model for surface roughness prediction when face milling with square inserts. The model is based on a geometrical analysis that recreates the tool trail left on the machined surface. The model has been validated with experimental data obtained from high-speed milling of aluminium alloy (Al 7075-T7351) over a wide range of cutting speeds, feeds per tooth and axial depths of cut, and for two values of tool nose radius (0.8 mm and 2.5 mm), using the Taguchi method as the design of experiments. The experimental roughness was obtained by measuring the milled surfaces with a non-contact profilometer. The developed model can be used for any combination of workpiece material and tool, provided tool flank wear is not considered, and is suitable for any tool diameter, number of teeth and tool nose radius. The results show that the developed model achieved excellent performance, with almost 98% accuracy in predicting the surface roughness compared to the experimental data.
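The paper's full model reconstructs the tool trail geometrically. As a much simpler illustration, the classical nose-radius approximation below gives the leading-order geometric roughness; the feed values are assumed, and only the two nose radii come from the abstract:

```python
def theoretical_rt(feed_per_tooth_mm, nose_radius_mm):
    """Classical peak-to-valley roughness left by a round-nosed tooth,
    Rt ~ f_z^2 / (8 * r_e). Ignores runout, vibration and flank wear."""
    return feed_per_tooth_mm**2 / (8.0 * nose_radius_mm)

# Illustrative sweep over the paper's two insert nose radii.
for r_e in (0.8, 2.5):                  # nose radius, mm (from the abstract)
    for f_z in (0.05, 0.10, 0.20):      # feed per tooth, mm (assumed values)
        rt_um = 1000.0 * theoretical_rt(f_z, r_e)   # convert mm to um
        print(f"r_e = {r_e} mm  f_z = {f_z} mm/tooth  Rt ~ {rt_um:.2f} um")
```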
Abstract:
Thermal effects in uncontrolled factory environments are often the largest source of uncertainty in large-volume dimensional metrology. As the standard temperature for metrology of 20°C cannot be achieved practically or economically in many manufacturing facilities, the characterisation and modelling of temperature offer a way to improve the uncertainty of dimensional measurement and to quantify thermal variability in large assemblies. Technologies that currently exist for temperature measurement in the range 0-50°C are presented, alongside a discussion of their usefulness for monitoring temperatures in a manufacturing context. Particular aspects of production where each technology could play a role are highlighted, as are practical considerations for deployment. Contact sensors such as platinum resistance thermometers come closest to the desired accuracy, calculated to be ~0.02°C under the most challenging measurement conditions. Non-contact solutions would be most practical in the light controlled factory (LCF), and semi-invasive sensors appear least useful, but all of the technologies can play some role during the initial development of thermal variability models.
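A figure like ~0.02°C follows from linear thermal expansion, ΔL = αLΔT. A minimal worked example, with an illustrative part size and thermal error budget rather than the paper's actual values:

```python
# How tightly must temperature be known for a given length uncertainty?
# delta_L = alpha * L * delta_T  =>  delta_T = delta_L / (alpha * L).
alpha_al = 23e-6      # /°C, thermal expansion of aluminium
L = 5.0               # m, size of a large assembly (assumed)
delta_L = 2.5e-6      # m, allowable thermal contribution (assumed, 2.5 um)

delta_T = delta_L / (alpha_al * L)
print(f"required temperature accuracy ~ {delta_T:.3f} °C")   # ~0.022 °C
```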
Abstract:
It has been argued that a single two-dimensional visualization plot may not be sufficient to capture all of the interesting aspects of complex data sets, and therefore a hierarchical visualization system is desirable. In this paper we extend an existing locally linear hierarchical visualization system, PhiVis (Bishop98a), in several directions: (1) we allow for non-linear projection manifolds, with the Generative Topographic Mapping as the basic building block; (2) we introduce a general formulation of hierarchical probabilistic models consisting of local probabilistic models organized in a hierarchical tree, and derive general training equations that hold regardless of the position of a model in the tree; (3) using tools from differential geometry, we derive expressions for local directional curvatures of the projection manifold. Like PhiVis, our system is statistically principled and is built interactively in a top-down fashion using the EM algorithm. It enables the user to interactively highlight those data in the parent visualization plot which are captured by a child model. We also incorporate into our system a hierarchical, locally selective representation of magnification factors and directional curvatures of the projection manifolds. Such information is important for further refinement of the hierarchical visualization plot, as well as for controlling the amount of regularization imposed on the local models. We demonstrate the principle of the approach on a toy data set and apply our system to two more complex 12- and 19-dimensional data sets.
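A hedged sketch of the central training step (notation mine, following standard hierarchical mixture-model treatments): EM responsibilities chain multiplicatively down the tree, so each local model is fitted to data weighted by its cumulative responsibility.

```latex
% Cumulative responsibility of child model c (with parent t) for point x_n,
% with R_root(x_n) = 1; each local model is then trained by EM on data
% weighted by R_c(x_n).
R_c(\mathbf{x}_n) \;=\; R_t(\mathbf{x}_n)\,
  \frac{\pi_{c|t}\, p_c(\mathbf{x}_n)}
       {\sum_{c'} \pi_{c'|t}\, p_{c'}(\mathbf{x}_n)}
```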
Abstract:
The primary objective of this research was to determine the potential of fluorescence spectroscopy as a method for analysing surface deposition on contact lenses. To achieve this, it was first necessary to ascertain whether fluorescence analysis could detect and distinguish between protein and lipid deposited on a lens surface. In conjunction with this, it was important to determine the specific excitation wavelengths at which these deposited species were detected with the greatest sensitivity. Experimental observations showed that an excitation wavelength of 360 nm would detect lipid deposited on a lens surface, and an excitation wavelength of 280 nm would detect and distinguish between protein and lipid deposited on a contact lens. It was also very important to determine whether clean, unspoilt lenses showed significant levels of fluorescence themselves. Fluorescence spectra recorded from a variety of unworn contact lenses at excitation wavelengths of 360 nm and 280 nm indicated that most contact lens materials do not themselves fluoresce to any great extent. Following these initial experiments, various clinical and laboratory-based studies were performed using fluorescence spectroscopy to analyse contact lens deposition levels. The clinical studies enabled contact lenses with known wear backgrounds to be rapidly and individually analysed following discontinuation of wear. Deposition levels in the early stages of lens wear were determined for various lens materials, and the effect of surfactant cleaning on deposition levels was also investigated. The laboratory-based studies involved comparing some of the in vivo results with those of identical lenses that had been spoilt using an in vitro method. Finally, an examination of lysozyme migration into and out of stored ionic, high-water-content contact lenses was made.
Abstract:
Latent variable models represent the probability density of data in a space of several dimensions in terms of a smaller number of latent, or hidden, variables. A familiar example is factor analysis, which is based on a linear transformation between the latent space and the data space. In this paper we introduce a form of non-linear latent variable model called the Generative Topographic Mapping (GTM), for which the parameters of the model can be determined using the EM algorithm. GTM provides a principled alternative to the widely used Self-Organizing Map (SOM) of Kohonen (1982), and overcomes most of the significant limitations of the SOM. We demonstrate the performance of the GTM algorithm on a toy problem and on simulated data from flow diagnostics for a multi-phase oil pipeline.
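A minimal NumPy sketch of the GTM's EM training loop, as a toy illustration of the model described above rather than the authors' implementation; a 1-D latent space, equal mixture weights and a fixed RBF width are assumed for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

X = rng.normal(size=(200, 3))                 # toy data, N x D
K, M = 20, 5                                  # latent grid points, RBF centres
z = np.linspace(-1, 1, K)[:, None]            # latent grid, K x 1
mu = np.linspace(-1, 1, M)[:, None]           # RBF centres, M x 1
sigma = 2.0 / (M - 1)
Phi = np.exp(-(z - mu.T) ** 2 / (2 * sigma**2))   # K x M basis matrix

W = rng.normal(scale=0.1, size=(M, X.shape[1]))   # mapping weights
beta = 1.0                                        # inverse noise variance

for _ in range(30):                               # EM iterations
    Y = Phi @ W                                   # mixture centres, K x D
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # sq. dists, N x K
    logR = -0.5 * beta * d2
    R = np.exp(logR - logR.max(1, keepdims=True))
    R /= R.sum(1, keepdims=True)                  # E-step: responsibilities
    G = np.diag(R.sum(0))
    W = np.linalg.solve(Phi.T @ G @ Phi + 1e-3 * np.eye(M),
                        Phi.T @ (R.T @ X))        # M-step: weights (regularised)
    beta = X.size / (R * d2).sum()                # M-step: noise precision

print("posterior-mean latent coordinates:", (R @ z).ravel()[:5])
```

The grid of latent points plays the role of the SOM's nodes, but here the updates all follow from maximizing a likelihood, which is what makes the model "principled".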
Abstract:
This thesis describes the Generative Topographic Mapping (GTM), a non-linear latent variable model intended for modelling continuous, intrinsically low-dimensional probability distributions embedded in high-dimensional spaces. It can be seen as a non-linear form of principal component analysis or factor analysis. It also provides a principled alternative to the self-organizing map, a widely established neural network model for unsupervised learning, resolving many of its associated theoretical problems. An important potential application of the GTM is visualization of high-dimensional data. Since the GTM is non-linear, the relationship between data and its visual representation may be far from trivial, but a better understanding of this relationship can be gained by computing the so-called magnification factor. In essence, the magnification factor relates the distances between data points as they appear when visualized to the actual distances between those data points. There are two principal limitations of the basic GTM model. The computational effort required grows exponentially with the intrinsic dimensionality of the density model; however, if the intended application is visualization, this will typically not be a problem. The other limitation is the inherent structure of the GTM, which makes it most suitable for modelling moderately curved probability distributions of approximately rectangular shape. When the target distribution is very different from that, the aim of maintaining an 'interpretable' structure suitable for visualizing data may conflict with the aim of providing a good density model. The fact that the GTM is a probabilistic model means that results from probability theory and statistics can be used to address problems such as model complexity. Furthermore, this framework provides solid ground for extending the GTM to wider contexts than that of this thesis.
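Concretely (a standard definition from the GTM literature, not specific to this thesis), the magnification factor is the local volume expansion of the mapping y(z) from latent to data space:

```latex
% For the GTM mapping y(z) with Jacobian J_{ij} = \partial y_i / \partial z_j,
% the local magnification (volume expansion from latent to data space) is
\frac{dV'}{dV} \;=\; \sqrt{\det\left(J^{\mathsf{T}} J\right)}
```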