20 results for Electronic data processing -- Quality control

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even when it is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can then be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
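A minimal sketch of the first idea above: estimating a reading at an uninstrumented location from nearby stations using inverse-distance-weighted interpolation, one simple instance of the interpolation methods the abstract refers to. The station coordinates, values and power parameter are illustrative assumptions, not data from the described system.

```python
import numpy as np

def idw_estimate(query_xy, station_xy, station_values, power=2.0):
    """Inverse-distance-weighted estimate of a reading at an uninstrumented point."""
    d = np.linalg.norm(station_xy - query_xy, axis=1)
    if np.any(d < 1e-9):                        # query sits on a station: return its value
        return float(station_values[d.argmin()])
    w = 1.0 / d ** power                        # nearer stations get more weight
    return float(np.sum(w * station_values) / np.sum(w))

# Three hypothetical stations (x, y in km) reporting temperature in deg C.
stations = np.array([[0.0, 0.0], [4.0, 1.0], [1.0, 5.0]])
temps = np.array([11.2, 12.6, 10.4])
print(f"estimated temperature: {idw_estimate(np.array([2.0, 2.0]), stations, temps):.1f} C")
```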

Relevance: 100.00%

Abstract:

Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even when it is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can then be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
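A minimal sketch of the second idea: attaching a quantified uncertainty to the interpolated value and serialising it for the user. The variance model and the JSON layout below are illustrative assumptions only loosely inspired by UncertML's notion of exchanging distribution parameters; this is not the actual UncertML schema.

```python
import json
import numpy as np

def idw_with_uncertainty(query_xy, station_xy, values, power=2.0):
    """Return an inverse-distance-weighted estimate plus a crude weighted variance."""
    d = np.linalg.norm(station_xy - query_xy, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    w /= w.sum()
    mean = float(np.sum(w * values))
    variance = float(np.sum(w * (values - mean) ** 2))   # weighted spread of neighbours
    return mean, variance

stations = np.array([[0.0, 0.0], [4.0, 1.0], [1.0, 5.0]])
temps = np.array([11.2, 12.6, 10.4])
mean, var = idw_with_uncertainty(np.array([2.0, 2.0]), stations, temps)
print(json.dumps({"quantity": "air_temperature", "units": "Cel",
                  "distribution": {"type": "Normal", "mean": round(mean, 2),
                                   "variance": round(var, 3)}}))
```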

Relevance: 100.00%

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). However, graphics processing unit (GPU) based data processing methods have recently been developed to minimise this data processing and rendering time. These techniques include standard-processing methods, a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the fabrication of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
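A minimal sketch of the standard FD-OCT processing chain mentioned above (background subtraction, spectral windowing, FFT, log-magnitude A-scan), assuming raw spectra arrive as a (num_ascans, num_pixels) array; this is an illustration, not the thesis code. Swapping NumPy for a GPU array library such as CuPy, which mirrors much of the NumPy API, is one common route to the kind of GPU acceleration described.

```python
import numpy as np

def process_ascans(raw_spectra: np.ndarray) -> np.ndarray:
    """Convert raw interference spectra into log-magnitude A-scans."""
    background = raw_spectra.mean(axis=0)           # estimate the DC/background term
    signal = raw_spectra - background                # remove it from every spectrum
    window = np.hanning(signal.shape[1])             # suppress FFT side lobes
    depth_profiles = np.fft.fft(signal * window, axis=1)
    half = depth_profiles.shape[1] // 2              # keep the positive-depth half
    return 20.0 * np.log10(np.abs(depth_profiles[:, :half]) + 1e-12)

# Example: 1000 A-scans of 2048 spectral samples of synthetic noise.
ascans = process_ascans(np.random.rand(1000, 2048))
print(ascans.shape)  # (1000, 1024)
```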

Relevance: 100.00%

Abstract:

This paper presents a framework for considering quality control of volunteered geographic information (VGI). Different issues need to be considered during the conception, acquisition and post-acquisition phases of VGI creation. This includes items such as collecting metadata on the volunteer, providing suitable training, giving corrective feedback during the mapping process and using control data, among others. Two examples of VGI data collection are then considered with respect to this quality control framework, i.e. VGI data collection by National Mapping Agencies and by the most recent Geo-Wiki tool, a game called Cropland Capture. Although good practices are beginning to emerge, there is still the need for the development and sharing of best practice, especially if VGI is to be integrated with authoritative map products or used for calibration and/or validation of land cover in the future.
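A minimal sketch of one of the quality-control ideas listed above: scoring a volunteer's classifications against hidden control points of known class so that corrective feedback can be triggered. The data structures and the 80% threshold are illustrative assumptions, not part of the paper or of the Cropland Capture game.

```python
from typing import Dict

def volunteer_accuracy(volunteer_labels: Dict[int, str],
                       control_labels: Dict[int, str]) -> float:
    """Fraction of control points the volunteer classified correctly."""
    scored = [pid for pid in control_labels if pid in volunteer_labels]
    if not scored:
        return 0.0
    correct = sum(volunteer_labels[pid] == control_labels[pid] for pid in scored)
    return correct / len(scored)

control = {101: "cropland", 102: "not cropland", 103: "cropland"}
submitted = {101: "cropland", 102: "cropland", 103: "cropland"}
acc = volunteer_accuracy(submitted, control)
print(f"accuracy on control data: {acc:.2f}")
if acc < 0.8:
    print("flag contributions for review / provide corrective feedback")
```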

Relevance: 100.00%

Abstract:

Photonic technologies for data processing in the optical domain are expected to play a major role in future high-speed communications. Nonlinear effects in optical fibres have many attractive features and great, but not yet fully explored, potential for optical signal processing. Here we provide an overview of our recent advances in developing novel techniques and approaches to all-optical processing based on fibre nonlinearities.
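A minimal split-step Fourier sketch of pulse propagation under the nonlinear Schrödinger equation, the basic model behind the fibre nonlinearities mentioned above. The parameter values and the lossless, second-order-dispersion-only model are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def split_step(A, dt, length_km, steps, beta2, gamma):
    """Propagate a complex envelope A(t) over a fibre of the given length."""
    dz = length_km / steps
    omega = 2 * np.pi * np.fft.fftfreq(A.size, d=dt)      # angular frequency grid
    linear = np.exp(0.5j * beta2 * omega**2 * dz)          # dispersion over one step
    for _ in range(steps):
        A = np.fft.ifft(np.fft.fft(A) * linear)            # linear (dispersive) half
        A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)     # nonlinear (Kerr) half
    return A

# Launch a fundamental soliton (peak power chosen so dispersion balances Kerr);
# its peak power should be essentially unchanged after propagation.
T0, beta2, gamma = 10.0, -20.0, 1.3            # ps, ps^2/km, 1/(W km)
t = np.linspace(-200, 200, 4096)
A0 = np.sqrt(abs(beta2) / (gamma * T0**2)) / np.cosh(t / T0)
A1 = split_step(A0, t[1] - t[0], length_km=50.0, steps=2000, beta2=beta2, gamma=gamma)
print(round(np.max(np.abs(A0)**2), 4), round(np.max(np.abs(A1)**2), 4))
```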

Relevance: 100.00%

Abstract:

The measurement of 8-oxo-7,8-dihydro-2'-deoxyguanosine is an increasingly popular marker of in vivo oxidative damage to DNA. A random-sequence 21-mer oligonucleotide 5'-TCA GXC GTA CGT GAT CTC AGT-3', in which X was 8-oxo-guanine (8-oxo-G), was purified, and accurate determination of the oxidised base was confirmed by a ³²P-end labelling strategy. The lyophilised material was analysed for its absolute content of 8-oxo-dG by several major laboratories in Europe and one in Japan. Most laboratories using HPLC-ECD underestimated the level of the lesion, while those using GC-MS-SIM overestimated it. HPLC-ECD measured the target value with the greatest accuracy. The results also suggest that none of the procedures can accurately quantitate levels of 1 in 10⁶ 8-oxo-(d)G in DNA.

Relevance: 100.00%

Abstract:

We present the first experimental implementation of a recently designed quasi-lossless fiber span with strongly reduced signal power excursion. The resulting fiber waveguide medium can be advantageously used both in lightwave communications and in all-optical nonlinear data processing.
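A toy numerical illustration of what "signal power excursion" means: the peak-to-trough variation (in dB) of signal power along a span. The simple bidirectional distributed-gain profile below is an illustrative assumption for comparison against a passive span, not the quasi-lossless design reported in the paper.

```python
import numpy as np

alpha = 0.2 / 4.343            # fibre loss, 0.2 dB/km converted to 1/km
L, N = 80.0, 8000              # span length (km) and integration steps
z = np.linspace(0.0, L, N)
dz = z[1] - z[0]

def excursion_db(gain_per_km):
    """Integrate dP/dz = (g(z) - alpha) P and return max-min power in dB."""
    p = np.empty(N)
    p[0] = 1.0
    for i in range(1, N):
        p[i] = p[i - 1] * np.exp((gain_per_km[i - 1] - alpha) * dz)
    p_db = 10 * np.log10(p)
    return p_db.max() - p_db.min()

passive = np.zeros(N)                                        # no distributed gain
profile = np.exp(-alpha * z) + np.exp(-alpha * (L - z))      # bidirectional pumping shape
distributed = profile * (alpha * L / np.trapz(profile, z))   # scaled for net zero gain
print(f"passive span excursion:     {excursion_db(passive):.1f} dB")   # ~16 dB over 80 km
print(f"distributed-gain excursion: {excursion_db(distributed):.1f} dB")
```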

Relevance: 100.00%

Abstract:

We present the first experimental implementation of a recently designed quasi-lossless fibre span with strongly reduced signal power excursion. The resulting fibre waveguide medium can be advantageously used both in lightwave communications and in all-optical nonlinear data processing.

Relevance: 100.00%

Abstract:

Objectives - Powdered and granulated particulate materials make up most of the ingredients of pharmaceuticals and are often at risk of undergoing unwanted agglomeration, or caking, during transport or storage. This is particularly acute when bulk powders are exposed to extreme swings in temperature and relative humidity, which is now common as drugs are produced and administered in increasingly hostile climates and are stored for longer periods of time prior to use. This study explores the possibility of using a uniaxial unconfined compression test to compare the strength of caked agglomerates exposed to different temperatures and relative humidities. This is part of a longer-term study to construct a protocol to predict the caking tendency of a new bulk material from individual particle properties. The main challenge is to develop techniques that provide repeatable results yet are presented simply enough to be useful to a wide range of industries. Methods - Powdered sucrose, a major pharmaceutical ingredient, was poured into a split die and exposed to cycles of high and low relative humidity at room temperature. The typical ranges were 20–30% for the lower value and 70–80% for the higher value. The outer die casing was then removed and the resultant agglomerate was subjected to an unconfined compression test using a plunger fitted to a Zwick compression tester. The force against displacement was logged so that the dynamics of failure as well as the failure load of the sample could be recorded. The experimental matrix included varying the number of cycles, the difference between the maximum and minimum relative humidity, the height and diameter of the samples, and the particle size. Results - Trends showed that the tensile strength of the agglomerates increased with the number of cycles and also with more extreme swings in relative humidity. This agrees with previous work on alternative methods of measuring the tensile strength of sugar agglomerates formed from humidity cycling (Leaper et al 2003). Conclusions - The results show that, at the very least, the uniaxial tester is a good comparative tester for examining the caking tendency of powdered materials, with a simple arrangement and operation that are compatible with the requirements of industry. However, further work is required to continue to optimize the height/diameter ratio during tests.
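A minimal sketch of how a failure load and an unconfined strength could be pulled from a logged force-against-displacement trace like the one described above. Treating the global force peak as the failure point and the sample cross-section as a circle are illustrative assumptions, not the study's protocol.

```python
import numpy as np

def unconfined_strength(force_n: np.ndarray, diameter_mm: float) -> tuple:
    """Return (failure load in N, unconfined strength in kPa) from a force trace."""
    failure_load = float(force_n.max())                    # peak force before collapse
    area_m2 = np.pi * (diameter_mm / 2 / 1000.0) ** 2      # circular cross-section
    return failure_load, failure_load / area_m2 / 1000.0   # Pa -> kPa

# Example with a synthetic trace: ramp up to ~42 N, then the agglomerate fails.
force = np.concatenate([np.linspace(0, 42, 200), np.linspace(42, 5, 50)])
load, strength = unconfined_strength(force, diameter_mm=25.0)
print(f"failure load: {load:.1f} N, unconfined strength: {strength:.0f} kPa")
```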

Relevance: 100.00%

Abstract:

We report the impact of cascaded reconfigurable optical add-drop multiplexer (ROADM) induced penalties on coherently-detected 28 Gbaud polarization-multiplexed m-ary quadrature amplitude modulation (PM m-ary QAM) WDM channels. We investigate the interplay between different higher-order modulation channels and the effect of filter shapes and bandwidth of (de)multiplexers on the transmission performance, in a segment of a pan-European optical network with a maximum optical path of 4,560 km (57 × 80 km spans). We verify that if the link capacities are assigned assuming that digital back-propagation (DBP) is available, 25% of the network connections fail using electronic dispersion compensation alone. However, the majority of such links can indeed be restored by employing single-channel digital back-propagation with fewer than 15 steps for the whole link, facilitating practical application of DBP. We find that higher-order channels are the most sensitive to nonlinear fiber impairments and filtering effects; however, these formats are less prone to ROADM-induced penalties due to the reduced maximum number of hops. Furthermore, we demonstrate that a minimum filter Gaussian order of 3 and bandwidth of 35 GHz enable negligible excess penalty for any modulation order.
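A small sketch of the bandwidth-narrowing effect behind cascaded ROADM penalties, assuming each (de)multiplexer is modelled as a super-Gaussian power response of a given order and 3-dB bandwidth. The order-3, 35 GHz values echo the figures quoted above, but this is an illustration of filter cascading, not the paper's simulation.

```python
import numpy as np

def supergauss_power(f_ghz, bandwidth_ghz=35.0, order=3):
    """Power transfer of one filter, -3 dB at +/- bandwidth/2."""
    return np.exp(-np.log(2.0) * (2.0 * f_ghz / bandwidth_ghz) ** (2 * order))

def cascaded_3db_bandwidth(n_hops, bandwidth_ghz=35.0, order=3):
    """Effective 3-dB bandwidth after n_hops identical filters in cascade."""
    f = np.linspace(0.0, 40.0, 100001)
    total = supergauss_power(f, bandwidth_ghz, order) ** n_hops
    return 2.0 * f[np.searchsorted(-total, -0.5)]   # first frequency where power < 0.5

for hops in (1, 5, 10, 20):
    print(f"{hops:2d} hops -> effective 3-dB bandwidth "
          f"{cascaded_3db_bandwidth(hops):.1f} GHz")
```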

Relevance: 100.00%

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance: 100.00%

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance: 100.00%

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT