922 results for measurement techniques


Relevance: 70.00%

Abstract:

In dimensional metrology, the largest source of measurement uncertainty is often thermal variation. Dimensional measurements are currently scaled linearly to the ideal metrology condition of 20 °C, using ambient temperature measurements and coefficients of thermal expansion. This scaling is particularly difficult to implement with confidence in large volumes, where the temperature is unlikely to be uniform and thermal gradients result. A number of well-established computational methods are used in the design phase of product development to predict thermal and gravitational effects, and these could be used to a greater extent in metrology. This paper outlines the theory of how physical measurements of dimension and temperature can be combined more comprehensively throughout the product lifecycle, from design through to the manufacturing phase. The Hybrid Metrology concept is also introduced: an approach to metrology that promises to improve product and equipment integrity in future manufacturing environments. The Hybrid Metrology System combines various state-of-the-art physical dimensional and temperature measurement techniques with established computational methods to better predict thermal and gravitational effects.
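The linear scaling to 20 °C described above can be sketched in a few lines (the function name and the steel CTE value are illustrative, not taken from the paper):

```python
def scale_to_reference(length_measured, temp_measured, cte, temp_ref=20.0):
    """Scale a dimensional measurement to the 20 degC reference temperature.

    length_measured: length observed at temp_measured (any length unit)
    cte: coefficient of thermal expansion of the part (1/degC)
    """
    # Linear expansion model: L(T) = L_ref * (1 + cte * (T - temp_ref))
    return length_measured / (1.0 + cte * (temp_measured - temp_ref))

# Example: a 500 mm steel part (CTE ~ 11.7e-6 /degC) measured at 23 degC
l20 = scale_to_reference(500.0, 23.0, 11.7e-6)
```

In a large volume with thermal gradients, the single ambient temperature and CTE would be replaced by per-region values, which is where the computational methods discussed in the paper come in.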

Relevance: 70.00%

Abstract:

Premium intraocular lenses (IOLs) such as toric IOLs, multifocal IOLs (MIOLs) and accommodating IOLs (AIOLs) can provide better refractive and visual outcomes than standard monofocal designs, leading to greater levels of post-operative spectacle independence. The principal theme of this thesis is the development of new assessment techniques that can help to improve future premium IOL design. IOLs designed to correct astigmatism form the focus of the first part of the thesis. A novel toric IOL design was devised to decrease the effect of toric rotation on patient visual acuity, but was found to have neither a beneficial nor a detrimental impact on visual acuity retention. IOL tilt, like rotation, may curtail visual performance; however, current IOL tilt measurement techniques require specialist equipment not readily available in most ophthalmological clinics. A new method was therefore proposed that applies Pythagoras' theorem to digital images of the symmetry of the IOL optic in order to calculate tilt, and it was shown to be both accurate and highly repeatable. A literature review revealed little information on the relationship between IOL tilt, decentration and rotation, so this was examined. A poor correlation between these factors was found, indicating that they occur independently of each other. Next, presbyopia-correcting IOLs were investigated. The light distribution of different MIOLs and an AIOL was assessed using perimetry, to establish whether this could be used to inform optimal IOL design. The anticipated differences in threshold sensitivity between IOLs were, however, not found, and perimetry was concluded to be ineffective in mapping the retinal projection of blur. The observed difference between subjective and objective measures of accommodation, arising from the influence of pseudoaccommodative factors, was explored next, to establish how much additional objective power would be required to restore the eye's focus with AIOLs.
Blur tolerance was found to be the key contributor to the ocular depth of focus, with an approximate dioptric influence of 0.60D. Our understanding of MIOLs may be limited by the need for subjective defocus curves, which are lengthy and do not permit important additional measures to be undertaken. The use of aberrometry to provide faster objective defocus curves was examined. Although subjective and objective measures related well, the peaks of the MIOL defocus curve profile were not evident with objective prediction of acuity, indicating a need for further refinement of visual quality metrics based on ocular aberrations. The experiments detailed in the thesis evaluate methods to improve visual performance with toric IOLs. They also investigate new techniques to allow more rapid post-operative assessment of premium IOLs, which could allow greater insights to be obtained into several aspects of visual quality, in order to optimise future IOL design and ultimately enhance patient satisfaction.
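The abstract does not give the exact geometry of the Pythagorean tilt calculation; one plausible sketch, assuming tilt is inferred from the foreshortening of the circular optic in a digital image, is:

```python
import math

def tilt_from_foreshortening(true_diameter, imaged_width):
    """Estimate tilt (degrees) of a circular IOL optic from its foreshortened
    width in a digital image. Hypothetical reconstruction: a disc of diameter d
    tilted by theta projects to width w = d*cos(theta), so by Pythagoras the
    out-of-plane component is sqrt(d**2 - w**2)."""
    out_of_plane = math.sqrt(true_diameter**2 - imaged_width**2)
    return math.degrees(math.atan2(out_of_plane, imaged_width))

# A 6.0 mm optic imaged at 5.8 mm width corresponds to ~14.8 degrees of tilt
tilt = tilt_from_foreshortening(6.0, 5.8)
```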

Relevance: 70.00%

Abstract:

Recent advances in mobile phone cameras have poised them to take over compact hand-held cameras as the consumer's preferred camera option. Along with advances in the number of pixels, motion blur removal, face-tracking and noise reduction algorithms play significant roles in the internal processing of these devices. An undesired effect of severe noise reduction is the loss of texture (i.e. low-contrast fine details) from the original scene. Current established methods for resolution measurement fail to accurately portray the texture loss incurred in a camera system, so the development of an accurate objective method to identify the texture preservation or texture reproduction capability of a camera device is important. The ‘Dead Leaves’ target has been used extensively as a method to measure the modulation transfer function (MTF) of cameras that employ highly non-linear noise-reduction methods. This stochastic model consists of a series of overlapping circles with radii r distributed as r⁻³ and with uniformly distributed gray levels, which gives an accurate model of occlusion in a natural setting and hence mimics a natural scene. The target can be used to model texture transfer through a camera system when a natural scene is captured. In the first part of our study we identify various factors that affect the MTF measured using the ‘Dead Leaves’ chart, including variations in illumination, distance, exposure time and ISO sensitivity, among others. We discuss the main differences between this method and existing resolution measurement techniques and identify its advantages. In the second part of the study, we propose an improvement to the current texture MTF measurement algorithm. High-frequency residual noise in the processed image contains the same frequency content as fine texture detail and is sometimes reported as such, leading to inaccurate results.
A wavelet thresholding based denoising technique is utilized for modeling the noise present in the final captured image. This updated noise model is then used for calculating an accurate texture MTF. We present comparative results for both algorithms under various image capture conditions.
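A minimal dead-leaves target can be rendered directly from the definition above (overlapping, occluding discs with an r⁻³ radius distribution); this sketch uses plain Python lists and illustrative parameters:

```python
import random

def dead_leaves(size=256, n=5000, rmin=2.0, rmax=64.0, seed=0):
    """Render a simple 'dead leaves' target: overlapping discs with radii
    drawn from an r**-3 power law and uniformly distributed gray levels.
    Later discs occlude earlier ones, mimicking natural occlusion."""
    rng = random.Random(seed)
    img = [[0.5] * size for _ in range(size)]
    for _ in range(n):
        # Inverse-transform sample of p(r) ~ r**-3 on [rmin, rmax]
        u = rng.random()
        r = (rmin**-2 + u * (rmax**-2 - rmin**-2)) ** -0.5
        cx, cy, g = rng.uniform(0, size), rng.uniform(0, size), rng.random()
        x0, x1 = max(0, int(cx - r)), min(size, int(cx + r) + 1)
        y0, y1 = max(0, int(cy - r)), min(size, int(cy + r) + 1)
        for y in range(y0, y1):
            for x in range(x0, x1):
                if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
                    img[y][x] = g  # later leaves overwrite (occlude) earlier ones
    return img
```

The texture MTF is then estimated by comparing the power spectrum of the captured chart with that of the known target; that step is omitted here.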

Relevance: 60.00%

Abstract:

We report on an intercomparison of six different hygroscopicity tandem differential mobility analysers (HTDMAs). These HTDMAs are used worldwide in laboratories and in field campaigns to measure the water uptake of aerosol particles, but they had never been intercompared. After an examination of the instruments' different designs, with their advantages and drawbacks, the methods for calibration, validation and analysis are presented. Measurements of nebulised ammonium sulphate as well as of secondary organic aerosol generated in a smog chamber were performed. Agreement and discrepancies between the instruments and with theory are discussed, and final recommendations for a standard instrument are given as a benchmark for laboratory or field experiments to ensure high-quality HTDMA data.

Relevance: 60.00%

Abstract:

Protein-energy wasting (PEW) is commonly seen in patients with chronic kidney disease (CKD). The condition is characterised by chronic, systemic low-grade inflammation, which affects nutritional status by a variety of mechanisms, including reducing appetite and food intake and increasing muscle catabolism. PEW is linked with co-morbidities such as cardiovascular disease, and is associated with lower quality of life, increased hospitalisations and a 6-fold increase in risk of death [1]. Significant gender differences have been found in the severity and effects of several markers of PEW. There have been limited studies testing the ability of anti-inflammatory agents or nutritional interventions to reduce the effects of PEW in dialysis patients. This thesis makes a significant contribution to the understanding of PEW in dialysis patients. It advances understanding of measurement techniques for two of the key components, appetite and inflammation, and explores the effect of fish oil, an anti-inflammatory agent, on markers of PEW in dialysis patients. The first part of the thesis consists of two methodological studies conducted using baseline data. The first study aims to validate retrospective ratings of hunger, desire to eat and fullness on visual analog scales (VAS) (paper-and-pen and electronic) as a new method of measuring appetite in dialysis patients. The second methodological study aims to assess the ability of a variety of methods available in routine practice to detect the presence of inflammation. The second part of the thesis aims to explore the effect of 12 weeks of supplementation with 2 g per day of eicosapentaenoic acid (EPA), a long-chain fatty acid found in fish oil, on markers of PEW. A combination of biomarkers and psychomarkers of appetite and inflammation are the main outcomes explored, with nutritional status, dietary intake and quality of life included as secondary outcomes.
A lead-in phase of 3 months prior to baseline was used so that each person acted as their own historical control. The study also examines whether there are gender differences in response to the treatment. Being an exploratory study, an important part of the work is to test the feasibility of the intervention; thus the level of adherence and factors associated with adherence are also presented. The studies were conducted at the hemodialysis unit of the Wesley Hospital. Participants met the following criteria: adult, stage 5 CKD on hemodialysis for at least 3 months, not expected to receive a transplant or switch to another dialysis modality during the study, and absence of intellectual impairment or mental illness impairing the ability to follow instructions or complete the intervention. A range of intermediate, clinical and patient-centred outcome measures were collected at baseline and 12 weeks. Inflammation was measured using five biomarkers: C-reactive protein (CRP), interleukin-6 (IL6), intercellular adhesion molecule (sICAM-1), vascular cell adhesion molecule (sVCAM-1) and white cell count (WCC). Subjective appetite was measured using the first question from the Appetite and Dietary Assessment (ADAT) tool and VAS measurements of hunger, desire to eat and fullness. A novel feature of the study was the assessment of the appetite peptides leptin, ghrelin and peptide YY as biomarkers of appetite. Nutritional status/inflammation was assessed using the Malnutrition-Inflammation Score (MIS) and the Patient-Generated Subjective Global Assessment (PG-SGA). Dietary intake was measured using 3-day records. Quality of life was measured using the Kidney Disease Quality of Life Short Form version 1.3 (KDQOL-SF™ v1.3 © RAND University), which combines the Short-Form 36 (SF36) with a kidney-disease-specific module [2]. A smaller range of these variables was available for analysis during the control phase (CRP, ADAT, dietary intake and nutritional status).
Statistical analysis was carried out using SPSS version 14 (SPSS Inc, Chicago IL, USA). Analysis for the first part of the thesis involved descriptive and bivariate statistics, as well as Bland-Altman plots to assess agreement between methods and sensitivity analysis/ROC curves to test the ability of methods to predict the presence of inflammation. The unadjusted (paired t-tests) and adjusted (linear mixed model) change over time is presented for the main outcome variables of inflammation and appetite. Results are shown for the whole group, followed by analyses according to gender and adherence to treatment. Due to the exploratory nature of the study, trends and clinical significance were considered as important as statistical significance. Twenty-eight patients (mean age 61±17 y, 50% male, dialysis vintage 19.5 (4–101) months) underwent baseline assessment. Seven of the 28 patients (25%) reported sub-optimal appetite (self-reported as fair, poor or very poor) despite all being well nourished (100% SGA A). Using the VAS, ratings of hunger, but not desire to eat or fullness, were significantly (p<0.05) associated with a range of relevant clinical variables, including age (r=-0.376), comorbidities (r=-0.380), nutritional status (PG-SGA score, r=-0.451), inflammatory markers (CRP r=-0.383; sICAM-1 r=-0.387) and seven domains of quality of life. Patients expressed a preference for the paper-and-pen method of administering the VAS. None of the tools (appetite, MIS, PG-SGA, albumin or iron) showed an acceptable ability to detect patients who were inflamed. It is recommended that CRP be tested more frequently as a matter of course rather than seeking alternative methods of measuring inflammation. Twenty-seven patients completed the 12-week intervention. Twenty patients were considered adherent based on changes in % plasma EPA, which rose from 1.3 (0.94)% to 5.2 (1.1)%, p<0.001, in this group. The major barriers to adherence were forgetting to take the tablets, as well as their size.
At 12 weeks, inflammatory markers remained steady apart from the white cell count, which decreased (7.6(2.5) vs 7.0(2.2) ×10⁹/L, p=0.058), and sVCAM-1, which increased (1685(654) vs 2249(925) ng/mL, p=0.001). Subjective appetite measured using the VAS increased (51 mm to 57 mm, +12%) and there was a trend towards a reduction in peptide YY (660(31) vs 600(30) pg/mL, p=0.078). Some gender differences were apparent, with the following adjusted changes between baseline and week 12: CRP (males -3% vs females +17%, p=0.19), IL6 (males +17% vs females +48%, p=0.77), sICAM-1 (males -5% vs females +11%, p=0.07), sVCAM-1 (males +54% vs females +19%, p=0.08) and hunger ratings (males +20% vs females -5%, p=0.18). On balance, males experienced maintenance of or a reduction in three inflammatory markers and an improvement in hunger ratings, and therefore appeared to have responded better to the intervention. Compared with those who did not adhere, adherent patients maintained weight (mean(SE) change: +0.5(1.6) vs -0.8(1.2) kg, p=0.052) and fat-free mass (-0.1(1.6) vs -1.8(1.8) kg, p=0.045). There was no difference in change between the intervention and control phases for CRP, appetite, nutritional status or dietary intake. The thesis makes a significant contribution to the evidence base for understanding PEW in dialysis patients. It has advanced knowledge of methods of assessing inflammation and appetite. Retrospective ratings of hunger on a VAS appear to be a valid method of assessing appetite, although samples that include patients with very poor appetite are required to confirm this. Supplementation with fish oil appeared to improve subjective appetite and dampen the inflammatory response. The effectiveness of the intervention is influenced by gender and adherence: males appear to be more responsive in the primary outcome variables than females, and the quality of response improves with better adherence.
These results provide evidence to support future interventions aimed at reducing the effects of PEW in dialysis patients.
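The sensitivity/ROC analysis mentioned above can be illustrated with a small AUC computation (the scores are hypothetical, not the study's data):

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve for a marker's ability to detect inflamed
    patients, computed as the Mann-Whitney U statistic: the probability that
    a randomly chosen inflamed patient scores higher than a non-inflamed one.
    Ties count as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical MIS scores for inflamed vs non-inflamed patients
auc = roc_auc([7, 9, 6, 8], [5, 6, 4, 3])
```

An AUC near 0.5 corresponds to the "no acceptable ability to detect inflamed patients" finding reported for the screening tools above.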

Relevance: 60.00%

Abstract:

This study assessed the reliability and validity of a palm-top-based electronic appetite rating system (EARS) in relation to the traditional paper and pen method. Twenty healthy subjects [10 male (M) and 10 female (F)] — mean age M=31 years (S.D.=8), F=27 years (S.D.=5); mean BMI M=24 (S.D.=2), F=21 (S.D.=5) — participated in a 4-day protocol. Measurements were made on days 1 and 4. Subjects were given paper and an EARS to log hourly subjective motivation to eat during waking hours. Food intake and meal times were fixed. Subjects were given a maintenance diet (comprising 40% fat, 47% carbohydrate and 13% protein by energy) calculated at 1.6×Resting Metabolic Rate (RMR), as three isoenergetic meals. Bland and Altman's test for bias between two measurement techniques found significant differences between EARS and paper and pen for two of eight responses (hunger and fullness). Regression analysis confirmed that there were no day, sex or order effects between ratings obtained using either technique. For 15 subjects, there was no significant difference between results, with a linear relationship between the two methods that explained most of the variance (r² ranged from 62.6% to 98.6%). The slope for all subjects was less than 1, which was partly explained by a tendency for bias at the extreme end of results on the EARS technique. These data suggest that the EARS is a useful and reliable technique for real-time data collection in appetite research but that it should not be used interchangeably with paper and pen techniques.
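Bland and Altman's test for bias, used above to compare the EARS with paper and pen, amounts to computing the mean difference and limits of agreement between paired ratings; a sketch with hypothetical ratings:

```python
import statistics

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired measurements from two
    techniques (e.g. EARS vs paper-and-pen appetite ratings).
    Returns the mean difference (bias) and the 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the paired differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical hunger ratings (mm on a 100 mm VAS) from the two methods
ears = [42, 55, 63, 30, 71, 48]
paper = [40, 58, 60, 33, 69, 50]
bias, (low, high) = bland_altman(ears, paper)
```

The full test also plots each pair's difference against its mean to reveal bias that grows at the extremes, as reported for the EARS above.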

Relevance: 60.00%

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. 
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual-energy X-ray photon absorptiometry techniques. Monte Carlo models of dual-energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual-energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence indicate the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have: (1) demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements; (2) demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique; (3) demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; and (4) provided a knowledge base for input to decisions about the development (or otherwise) of a physical prototype DPA(+) imaging system. The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
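The two-component decomposition underlying DEXA can be illustrated as a 2×2 linear solve; the attenuation coefficients below are placeholders, not values from the thesis:

```python
def dexa_decompose(att_low, att_high, mu_bone, mu_soft):
    """Two-component dual-energy decomposition: solve the 2x2 system
        att_E = mu_bone[E] * t_bone + mu_soft[E] * t_soft
    for the areal densities t_bone, t_soft (g/cm^2), given the measured
    log-attenuations ln(I0/I) at the low and high energies and the mass
    attenuation coefficients (cm^2/g) of each tissue at each energy.
    Coefficients here are illustrative; real values come from tables or
    calibration phantoms."""
    (mbl, mbh), (msl, msh) = mu_bone, mu_soft
    det = mbl * msh - mbh * msl  # determinant of the 2x2 coefficient matrix
    t_bone = (att_low * msh - att_high * msl) / det
    t_soft = (mbl * att_high - mbh * att_low) / det
    return t_bone, t_soft
```

The DPA(+) extension adds a third equation (the path-length measurement), turning this into a 3×3 solve for bone mineral, fat and lean soft tissue.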

Relevance: 60.00%

Abstract:

Recent studies have detected a dominant accumulation mode (~100 nm) in the Sea Spray Aerosol (SSA) number distribution, and there is evidence to suggest that particles in this mode are composed primarily of organics. To investigate this hypothesis we conducted experiments on NaCl, artificial SSA and natural SSA particles with a Volatility-Hygroscopicity Tandem Differential Mobility Analyser (VH-TDMA). NaCl particles were atomiser-generated, and a bubble generator was constructed to produce artificial and natural SSA particles. Natural seawater samples for use in the bubble generator were collected from biologically active, terrestrially affected coastal water in Moreton Bay, Australia. Differences in the VH-TDMA-measured volatility curves of artificial and natural SSA particles were used to investigate and quantify the organic fraction of natural SSA particles. Hygroscopic Growth Factor (HGF) data, also obtained by the VH-TDMA, were used to confirm the conclusions drawn from the volatility data. Both datasets indicated that the organic fraction of our natural SSA particles evaporated in the VH-TDMA over the temperature range 170–200 °C. The organic volume fraction for 71–77 nm natural SSA particles was 8±6%. The organic volume fraction did not vary significantly with water residence time in the bubble generator (40 s to 24 h) or with SSA particle diameter in the range 38–173 nm. At room temperature we measured shape- and Kelvin-corrected HGFs at 90% RH of 2.46±0.02 for NaCl, 2.35±0.02 for artificial SSA and 2.26±0.02 for natural SSA particles. Overall, these results suggest that the natural accumulation-mode SSA particles produced in these experiments contained only a minor organic fraction, which had little effect on hygroscopic growth. Our measurement of 8±6% is an order of magnitude below two previous measurements of the organic fraction in SSA particles of comparable sizes. We stress that our results were obtained using coastal seawater and cannot necessarily be applied on a regional or global ocean scale. Nevertheless, considering the order-of-magnitude discrepancy between this and previous studies, further research with independent measurement techniques and a variety of different seawaters is required to better quantify how much organic material is present in accumulation-mode SSA.
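One standard way (not necessarily the authors' volatility-based method) to turn the quoted growth factors into an organic volume fraction is the ZSR volume-weighted mixing rule:

```python
def organic_volume_fraction(gf_mixed, gf_inorg, gf_org=1.0):
    """Infer the organic volume fraction eps of an internally mixed particle
    from hygroscopic growth factors using the ZSR mixing rule:
        GF_mix**3 = eps * GF_org**3 + (1 - eps) * GF_inorg**3
    gf_org defaults to 1.0, i.e. a nearly non-hygroscopic organic component
    (an assumption, not a value from the study)."""
    return (gf_mixed**3 - gf_inorg**3) / (gf_org**3 - gf_inorg**3)

# Natural SSA (2.26) vs artificial, purely inorganic SSA (2.35), as above
eps = organic_volume_fraction(2.26, 2.35)
```

With the HGFs above this gives roughly 0.12, the same order of magnitude as the 8±6% obtained from the volatility data.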

Relevance: 60.00%

Abstract:

A major challenge in modern photonics and nano-optics is the diffraction limit of light, which does not allow field localisation into regions with dimensions smaller than half the wavelength. Localisation of light into nanoscale regions (beyond its diffraction limit) has applications ranging from the design of optical sensors and measurement techniques with resolutions as high as a few nanometres, to the effective delivery of optical energy into targeted nanoscale regions such as quantum dots, nano-electronic and nano-optical devices. This field has become a major research direction over the last decade. The use of strongly localised surface plasmons in metallic nanostructures is one of the most promising approaches to overcoming this problem. The aim of this thesis is therefore to investigate the linear and non-linear propagation of surface plasmons in metallic nanostructures, focusing on two main areas of plasmonic research: plasmon nanofocusing and plasmon nanoguiding. Plasmon nanofocusing – The main aim of plasmon nanofocusing research is to focus plasmon energy into nanoscale regions using metallic nanostructures and at the same time achieve strong local field enhancement. Various structures have been proposed and analysed for nanofocusing purposes, such as sharp metal wedges, tapered metal films on dielectric substrates, tapered metal rods, and dielectric V-grooves in metals. However, a number of important practical issues related to nanofocusing in these structures remain unclear. One of the main aims of this thesis is therefore to address two of the most important of these issues: the coupling efficiency and the heating effects of surface plasmons in metallic nanostructures. The method of analysis developed throughout this thesis is a general treatment that can be applied to a diversity of nanofocusing structures, with results shown here for the specific case of sharp metal wedges.
Based on the geometrical optics approximation, it is demonstrated that the coupling efficiency from plasmons generated with a metal grating into the nanofocused symmetric or quasi-symmetric modes may vary between ~50% and ~100%, depending on the structural parameters. Optimal conditions for nanofocusing, with a view to minimising coupling and dissipative losses, are also determined and discussed. It is shown that the temperature near the tip of a metal wedge heated by nanosecond plasmonic pulses can increase by several hundred degrees Celsius. This temperature increase is expected to lead to nonlinear effects, self-influence of the focused plasmon, and ultimately self-destruction of the metal tip. The thesis also investigates a different type of nanofocusing structure, consisting of a tapered high-index dielectric layer resting on a metal surface. It is shown that the nanofocusing mechanism in this structure differs somewhat from the structures considered thus far: for example, the surface plasmon experiences significant back-reflection and mode transformation at a cut-off thickness, and the reflected plasmon shows negative-refraction properties that have not been observed in other nanofocusing structures considered to date. Plasmon nanoguiding – Guiding surface plasmons using metallic nanostructures is important for the development of highly integrated optical components and circuits, which are expected to have superior performance compared with their electronic-based counterparts. A number of different plasmonic waveguides have been considered over the last decade, including the recently proposed gap and trench plasmon waveguides, which have proven difficult to fabricate.
This thesis therefore proposes and analyses four different modified gap and trench plasmon waveguides that are expected to be easier to fabricate while offering improved propagation characteristics of the guided mode. In particular, it is demonstrated that the guided modes are significantly screened by the extended metal at the bottom of the structure. This is important for the design of highly integrated optics, as it provides the opportunity to place two waveguides close together without significant cross-talk. The thesis also investigates the use of plasmonic nanowires to construct a Fabry-Pérot resonator/interferometer. It is shown that the resonance effect can be achieved with the appropriate resonator length and gap width. Typical quality factors of the Fabry-Pérot cavity are determined and explained in terms of radiative and dissipative losses. The possibility of using a nanowire resonator in the design of plasmonic filters with close to ~100% transmission is also demonstrated. It is expected that the results obtained in this thesis will play a vital role in the development of high-resolution near-field microscopy and spectroscopy, new measurement techniques and devices for single-molecule detection, highly integrated optical devices, and nanobiotechnology devices for the diagnostics of living cells.
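As a rough illustration of the Fabry-Pérot resonance condition mentioned above, the Airy transmission of an idealised cavity can be computed as follows (all parameter values are hypothetical, and the dissipative losses discussed in the thesis are ignored here):

```python
import math

def fabry_perot_transmission(wavelength_nm, cavity_length_nm, n_eff, reflectance):
    """Airy transmission of an idealised lossless Fabry-Perot resonator,
    as a rough sketch of the nanowire resonator discussed above.
    n_eff is the effective index of the guided (plasmonic) mode."""
    # Round-trip phase accumulated in a cavity of effective index n_eff
    delta = 4.0 * math.pi * n_eff * cavity_length_nm / wavelength_nm
    r = reflectance
    return (1.0 - r) ** 2 / ((1.0 - r) ** 2 + 4.0 * r * math.sin(delta / 2.0) ** 2)

# On resonance (round-trip phase a multiple of 2*pi) transmission approaches 1,
# consistent with the near-100% transmission filters described above.
t_res = fabry_perot_transmission(800.0, 200.0, 2.0, 0.9)
```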

Relevance: 60.00%

Abstract:

Purpose: The prevalence of refractive errors in children has been extensively researched. Comparisons between studies can, however, be compromised by differences between accommodation control methods and the techniques used for measuring refractive error. The aim of this study was to compare spherical refractive error results obtained at baseline and using two different accommodation control methods – extended optical fogging and cycloplegia – for two measurement techniques – autorefraction and retinoscopy. Methods: Participants comprised twenty-five school children aged between 6 and 13 years (mean age: 9.52 ± 2.06 years). The refractive error of one eye was measured at baseline and again under two different accommodation control conditions: extended optical fogging (+2.00 DS for 20 minutes) and cycloplegia (1% cyclopentolate). Autorefraction and retinoscopy were both used to measure the most plus spherical power for each condition. Results: A significant interaction was demonstrated between measurement technique and accommodation control method (p = 0.036), with significant differences in spherical power evident between accommodation control methods for each measurement technique (p < 0.005). For retinoscopy, refractive errors were significantly more positive under cycloplegia than under optical fogging, which were in turn significantly more positive than baseline; for autorefraction, there were significant differences only between cycloplegia and extended optical fogging and between cycloplegia and baseline. Conclusions: Determination of refractive error under cycloplegia elicits more plus than extended optical fogging as a method of relaxing accommodation. These findings support the use of cycloplegic refraction rather than extended optical fogging as a means of controlling accommodation in population-based refractive error studies in children.

Relevance: 60.00%

Abstract:

In order to provide realistic data for air pollution inventories and source apportionment at airports, the morphology and composition of ultrafine particles (UFPs) in aircraft engine exhaust were measured and characterized. For this purpose, two independent measurement techniques were employed to collect emissions during normal takeoff and landing operations at Brisbane Airport, Australia. PM1 emissions in the airfield were collected on filters and analyzed using the particle-induced X-ray emission (PIXE) technique. Morphological and compositional analyses of individual ultrafine particles in aircraft plumes were performed on silicon nitride membrane grids using transmission electron microscopy (TEM) combined with energy-dispersive X-ray microanalysis (EDX). TEM results showed that the deposited particles were in the range of 5 to 100 nm in diameter, had semisolid spherical shapes and were dominant in the nucleation mode (18–20 nm). The EDX analysis showed that the main elements in the nucleation-mode particles were C, O, S and Cl. The PIXE analysis of the airfield samples was generally in agreement with the EDX analysis in detecting S, Cl, K, Fe and Si in the particles. The results of this study provide important scientific information on the toxicity of aircraft exhaust and its impact on local air quality.


Accurate three-dimensional representations of cultural heritage sites are highly valuable for scientific study, conservation and educational purposes. In addition to their archival value, 3D models enable efficient and precise measurement of relevant natural and architectural features. Many cultural heritage sites are large and complex, consisting of multiple structures spatially distributed over tens of thousands of square metres. Effectively digitising such geometrically complex locations requires measurements to be acquired from a variety of viewpoints. While several technologies exist for capturing the 3D structure of objects and environments, none are ideally suited to complex, large-scale sites, mainly because of their limited coverage or acquisition efficiency. We explore the use of a recently developed handheld mobile mapping system called Zebedee in cultural heritage applications. The Zebedee system efficiently maps an environment in three dimensions by continually acquiring data as an operator carries the device through the site. The system was deployed at the former Peel Island Lazaret, a culturally significant site in Queensland, Australia, consisting of dozens of buildings of various sizes spread across an area of approximately 400 × 250 m. With the Zebedee system, the site was scanned in half a day, and a detailed 3D point cloud model (with over 520 million points) was generated from the 3.6 hours of acquired data in 2.6 hours of processing. We present results demonstrating that Zebedee captured both site context and building detail with accuracy comparable to manual measurement techniques, and with greatly increased efficiency and scope. The scan allowed us to record derelict buildings that previously could not be measured because of the scale and complexity of the site. The resulting 3D model captures both interior and exterior features of buildings, including structure, materials and the contents of rooms.
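Point clouds at this scale (over 520 million points) are typically thinned before interactive viewing or analysis. A minimal sketch of voxel-grid downsampling, a generic technique for this purpose (nothing here is specific to Zebedee's actual processing pipeline; the voxel size and points are illustrative):

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.05):
    """Thin a point cloud by averaging all points falling in each cubic
    voxel of side `voxel` metres; one representative point per occupied voxel."""
    bins = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        bins[key].append((x, y, z))
    # Average each voxel's points coordinate-wise
    return [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in bins.values()
    ]

cloud = [(0.01, 0.01, 0.0), (0.02, 0.03, 0.0), (1.0, 1.0, 1.0)]
thinned = voxel_downsample(cloud, voxel=0.05)  # first two points merge
```

The voxel size trades file size against preserved detail; coarse voxels suit site-context overviews, fine voxels suit building detail.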


Purpose: To provide a comprehensive overview of research examining the impact of astigmatism on clinical and functional measures of vision, the short- and longer-term adaptations to astigmatism that occur in the visual system, and the currently available clinical options for the management of patients with astigmatism. Recent findings: The presence of astigmatism can lead to substantial reductions in visual performance in a variety of clinical vision measures and functional visual tasks. Recent evidence demonstrates that astigmatic blur results in short-term adaptations in the visual system that appear to reduce the perceived impact of astigmatism on vision. In the longer term, uncorrected astigmatism in childhood can also significantly impact visual development, resulting in amblyopia. Astigmatism is also associated with the development of spherical refractive errors. Although the clinical correction of small magnitudes of astigmatism is relatively straightforward, the precise, reliable correction of astigmatism (particularly high astigmatism) can be challenging. A wide variety of refractive corrections are now available for the patient with astigmatism, including spectacle, contact lens and surgical options. Conclusion: Astigmatism is one of the most common refractive errors managed in clinical ophthalmic practice. The significant visual and functional impacts of astigmatism emphasise the importance of its reliable clinical management. With continued improvements in ocular measurement techniques and developments in a range of different refractive correction technologies, the future promises the potential for more precise and comprehensive correction options for astigmatic patients.
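As one concrete example of the routine arithmetic behind astigmatic corrections, a sphero-cylindrical prescription can be transposed between plus- and minus-cylinder notation: the new sphere is the old sphere plus the cylinder, the cylinder's sign flips, and the axis rotates by 90° (modulo 180°). A minimal sketch (the function name and example values are illustrative, not from the text):

```python
def transpose(sphere, cyl, axis):
    """Transpose a sphero-cylindrical Rx between plus- and minus-cylinder
    form. Axis is expressed in the conventional 1-180 degree range."""
    new_axis = (axis + 90) % 180
    return (round(sphere + cyl, 2), round(-cyl, 2), new_axis if new_axis else 180)

# -2.00 / -1.50 x 180 in minus-cylinder form becomes -3.50 / +1.50 x 90
rx = transpose(-2.00, -1.50, 180)
```

Transposing twice returns the original prescription, a handy sanity check when converting between notations.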


This research measured particle and gaseous emissions from ships and trains operating within the Port of Brisbane, and explored their influence on ambient air composition at a downwind suburban measurement site. The ship and train emission factor investigations resulted in the development of novel measurement techniques that permit the quantification of particle and gaseous emission factors from samples collected in post-emission exhaust plumes. The urban influence phase of the project produced a new approach to identifying the influence of ship emissions on ambient air quality.
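Plume-capture studies commonly estimate fuel-based emission factors by ratioing the excess (above-background) pollutant signal to the excess CO2 in the same plume, then scaling by the fuel's CO2 emission index. The sketch below illustrates that generic approach only; the function, its inputs and the assumed emission index (~3160 g CO2 per kg of hydrocarbon fuel) are hypothetical and are not taken from this study:

```python
def emission_factor(particle_ts, co2_ts, n_bg, co2_bg, ei_co2=3.16e3):
    """Plume-integrated particle emission factor (particles per kg fuel).

    particle_ts: particle number concentrations in the plume (particles/cm^3)
    co2_ts:      CO2 mass concentrations at the same timestamps (g/m^3)
    n_bg, co2_bg: background levels to subtract
    ei_co2:      g of CO2 emitted per kg fuel burned (assumed value)
    """
    excess_n = sum(n - n_bg for n in particle_ts)      # particles/cm^3, summed
    excess_co2 = sum(c - co2_bg for c in co2_ts)       # g/m^3, summed
    # 1e6 converts particles/cm^3 to particles/m^3 so the ratio is per g CO2
    return (excess_n * 1e6 / excess_co2) * ei_co2

# Hypothetical three-sample plume transect
ef = emission_factor([5e4, 8e4, 6e4], [0.9, 1.1, 1.0], n_bg=1e4, co2_bg=0.8)
```

Because the background is subtracted from both numerator and denominator, the ratio is insensitive to plume dilution between the stack and the sampling point, which is what makes post-emission plume sampling workable.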


The surface properties of solid-state pharmaceuticals are of critical importance. Processing modifies the surfaces and affects surface roughness, which influences the performance of the final dosage form on many different levels. Surface roughness has an effect on, for example, the properties of powders, tablet compression and tablet coating. The overall goal of this research was to understand the surface structures of pharmaceutical materials. In this context, the specific purpose was to compare four different analysis techniques (optical microscopy, scanning electron microscopy, laser profilometry and atomic force microscopy) in various pharmaceutical applications in which the surfaces span quite different roughness scales. This was done by comparing the imaging and roughness analysis techniques using powder compacts, coated tablets and crystal surfaces as model surfaces. Optical microscopy proved to be a very efficient technique, as it yielded information that SEM and AFM imaging could not provide. Roughness measurements complemented the image data and gave quantitative information about height differences. AFM roughness data represent the roughness of only a small part of the surface, so other methods, such as laser profilometry, are needed to provide a larger-scale description of the surface. The newly developed roughness analysis method visualised surface roughness through detailed roughness maps, which showed local variations in surface roughness values. The method was able to provide a picture of the surface heterogeneity and of the scale of the roughness. In the coating study, the laser profilometry results showed that the increase in surface roughness was largest during the first 30 minutes of coating, when the surface was not yet fully covered with coating. The SEM images and the energy-dispersive X-ray analysis results showed that the surface was fully covered with coating within 15 to 30 minutes. The combination of the different measurement techniques made it possible to follow the change in surface roughness and the development of the polymer coating. The optical imaging techniques gave a good overview of processes affecting the whole crystal surface, but they lacked the resolution to resolve small, nanometre-scale processes. AFM was used to visualise the nanoscale effects of cleaving and to reveal the full surface heterogeneity underlying the optical images. Ethanol washing altered the small (nanoscale) structure to some extent, but its effect at larger scales was small. Water washing caused total reformation of the surface structure at all levels.
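The local roughness maps described above can be illustrated with a small sketch: RMS roughness (Sq, the root-mean-square deviation of heights from their mean) computed over non-overlapping windows of a height map, so that smooth and rough regions get distinct local values. The grid, window size and height values below are hypothetical:

```python
from statistics import mean

def sq(heights):
    """RMS roughness (Sq) of a flat list of surface heights."""
    m = mean(heights)
    return mean([(z - m) ** 2 for z in heights]) ** 0.5

def roughness_map(height_map, win=2):
    """Local Sq over non-overlapping win x win windows of a 2D height map."""
    rows, cols = len(height_map), len(height_map[0])
    out = []
    for r in range(0, rows - win + 1, win):
        out.append([
            sq([height_map[r + i][c + j] for i in range(win) for j in range(win)])
            for c in range(0, cols - win + 1, win)
        ])
    return out

# Hypothetical 4x4 height map (um): smooth upper-left, rough lower-right
heights = [
    [1.0, 1.0, 1.0, 1.2],
    [1.0, 1.0, 1.1, 0.9],
    [1.0, 1.1, 2.0, 0.5],
    [1.1, 0.9, 0.4, 1.8],
]
local_sq = roughness_map(heights)  # 2x2 map of local roughness values
```

A single global Sq would average these regions together; the windowed map is what exposes the surface heterogeneity the abstract describes.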