46 results for Artificial intelligence -- Data processing

in Aston University Research Archive


Abstract:

Machine breakdowns are one of the main sources of disruption and throughput fluctuation in highly automated production facilities. One element in reducing this disruption is ensuring that the maintenance team responds correctly to machine failures. It is, however, difficult to determine the current practice employed by the maintenance team, let alone suggest improvements to it. 'Knowledge-based improvement' is a methodology that aims to address this issue by (a) eliciting knowledge on current practice, (b) evaluating that practice and (c) looking for improvements. The methodology, based on visual interactive simulation and artificial intelligence methods, and its application to a Ford engine assembly facility are described. Copyright © 2002 Society of Automotive Engineers, Inc.
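As a minimal sketch of the elicitation-and-representation step, the rule table below stands in for the AI methods used in the study; every machine name, scenario attribute and response here is hypothetical, invented purely for illustration:

```python
from collections import Counter, defaultdict

# Hypothetical elicited data: for each simulated breakdown replayed in the
# visual interactive simulation, the scenario attributes and the response
# the maintenance expert chose. (Illustrative values only.)
elicited = [
    ({"machine": "head_mill", "buffer_ahead": "empty"}, "repair_first"),
    ({"machine": "head_mill", "buffer_ahead": "full"},  "repair_first"),
    ({"machine": "washer",    "buffer_ahead": "full"},  "defer"),
    ({"machine": "washer",    "buffer_ahead": "empty"}, "repair_first"),
    ({"machine": "washer",    "buffer_ahead": "full"},  "defer"),
]

def induce_rules(cases):
    """Represent current practice as the most frequent response recorded
    for each combination of scenario attributes (a one-level rule table)."""
    table = defaultdict(Counter)
    for attrs, response in cases:
        key = tuple(sorted(attrs.items()))
        table[key][response] += 1
    return {key: counts.most_common(1)[0][0] for key, counts in table.items()}

rules = induce_rules(elicited)

def predict(attrs):
    # Fall back to the commonest overall response for unseen scenarios.
    return rules.get(tuple(sorted(attrs.items())), "repair_first")
```

Linking a learned representation like this back to the simulation is what allows alternative response strategies to be replayed and compared against current practice.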

Abstract:

The performance of most operations systems is significantly affected by the interaction of human decision-makers. A methodology, based on the use of visual interactive simulation (VIS) and artificial intelligence (AI), is described that aims to identify and improve human decision-making in operations systems. The methodology, known as 'knowledge-based improvement' (KBI), elicits knowledge from a decision-maker via a VIS and then uses AI methods to represent decision-making. By linking the VIS and AI representation, it is possible to predict the performance of the operations system under different decision-making strategies and to search for improved strategies. The KBI methodology is applied to the decision-making surrounding unplanned maintenance operations at a Ford Motor Company engine assembly plant.

Abstract:

Photonic technologies for data processing in the optical domain are expected to play a major role in future high-speed communications. Nonlinear effects in optical fibres have many attractive features and great, but not yet fully explored, potential for optical signal processing. Here we provide an overview of our recent advances in developing novel techniques and approaches to all-optical processing based on fibre nonlinearities.
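As an illustration of the kind of fibre nonlinearity being exploited, the sketch below propagates a Gaussian pulse under the nonlinear Schrödinger equation using the standard split-step Fourier method; all parameter values are generic textbook figures, not taken from the work described:

```python
import numpy as np

# Time grid and corresponding angular-frequency grid.
T = np.linspace(-20e-12, 20e-12, 1024)            # time, s
dt = T[1] - T[0]
w = 2 * np.pi * np.fft.fftfreq(T.size, d=dt)      # angular frequency, rad/s

beta2 = -20e-27       # group-velocity dispersion, s^2/m (anomalous regime)
gamma = 1.3e-3        # Kerr nonlinearity, 1/(W*m)
dz, steps = 100.0, 100                            # 100 m steps over 10 km

A = np.exp(-(T / 2e-12) ** 2).astype(complex)     # ~1 W peak Gaussian pulse
A0 = A.copy()                                     # keep the input for comparison
half_disp = np.exp(0.5j * beta2 * w ** 2 * (dz / 2))  # half-step dispersion operator

for _ in range(steps):
    A = np.fft.ifft(half_disp * np.fft.fft(A))        # linear half step
    A *= np.exp(1j * gamma * np.abs(A) ** 2 * dz)     # nonlinear (SPM) full step
    A = np.fft.ifft(half_disp * np.fft.fft(A))        # linear half step
```

Because every step is a pure phase multiplication in either the time or frequency domain, the scheme conserves pulse energy exactly, which is a useful sanity check on any implementation.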

Abstract:

We present the first experimental implementation of a recently designed quasi-lossless fiber span with strongly reduced signal power excursion. The resulting fiber waveguide medium can be advantageously used both in lightwave communications and in all-optical nonlinear data processing.
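A back-of-the-envelope comparison shows why a reduced signal power excursion matters; the 0.2 dB/km loss figure, 80 km span length and cosine-shaped residual ripple below are assumptions chosen for illustration, not the measured profile of the span described:

```python
import numpy as np

alpha_db_km = 0.2                  # typical single-mode fibre loss, dB/km
L = np.linspace(0.0, 80.0, 801)    # position along an 80 km span, km

# Passive span: signal power decays monotonically along the fibre.
passive_db = -alpha_db_km * L
excursion_passive = passive_db.max() - passive_db.min()   # 16 dB excursion

# Idealised quasi-lossless span: distributed gain (e.g. bidirectional Raman
# pumping) nearly cancels the loss; the residual variation is modelled here
# as a small cosine ripple purely for illustration.
quasi_db = 0.5 * np.cos(2 * np.pi * L / L[-1])
excursion_quasi = quasi_db.max() - quasi_db.min()         # ~1 dB excursion
```

Keeping the signal power nearly constant along the span is what makes such a medium attractive for nonlinear processing schemes, whose behaviour depends directly on the local signal power.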

Abstract:

We compare two methods for predicting inflation rates in Europe. One method uses a standard back-propagation neural network and the other uses an evolutionary approach, in which both the network weights and the network architecture are evolved. Results indicate that back propagation produces superior results. However, the evolved network still produces reasonable results, with the advantage that the experimental set-up is minimal. Also of interest is the fact that the Divisia measure of money is superior to the simple-sum measure as a predictive tool.
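A minimal back-propagation set-up of the kind compared in the study can be sketched as follows; the noisy AR(1) "inflation" series, network size and learning rate are all illustrative stand-ins, not the paper's data or settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the data: a noisy AR(1) series; the task is to predict
# the next value from the two previous ones.
series = [2.0]
for _ in range(200):
    series.append(0.8 * series[-1] + 0.4 + rng.normal(0.0, 0.05))
series = np.asarray(series)
X = np.column_stack([series[:-2], series[1:-1]])   # two lagged inputs
y = series[2:]

# One hidden layer trained by plain back propagation (batch gradient descent).
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr, n = 0.01, len(y)
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                      # forward pass
    pred = (h @ W2 + b2).ravel()
    err = pred - y                                # gradient of 0.5*MSE w.r.t. pred
    gW2 = h.T @ err[:, None] / n
    gb2 = err.mean(keepdims=True)
    dh = (err[:, None] @ W2.T) * (1.0 - h ** 2)   # backprop through tanh
    gW1 = X.T @ dh / n
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((pred - y) ** 2))
```

The evolutionary alternative discussed in the abstract would replace the gradient updates with mutation and selection over both the weights and the architecture, at the cost of more function evaluations but with far less manual tuning of the training set-up.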

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infra-red radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. To select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica, and their use in characterising many properties of an OCT system (resolution, distortion, sensitivity decay, scan linearity) was demonstrated. Quantitative methods were developed both to support the characterisation of an OCT system collecting images from phantoms and to improve the quality of the OCT images. The characterisation methods include measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion.

Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing may take several minutes to a few hours for acquired data, making data processing a significant bottleneck. One alternative is expensive hardware-based processing such as field programmable gate arrays (FPGAs); more recently, however, graphics processing unit (GPU) based methods have been developed to minimise processing and rendering time. These techniques include the standard-processing methods, a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system, which currently processes and renders data in real time; its throughput is at present limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and fine tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms.

The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several pieces of research that are not only relevant to OCT but have broader importance: for example, an extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the fabrication of phase masks, waveguides and microfluidic channels, and the acceleration of data processing with GPUs is also useful in other fields.
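The standard-processing chain mentioned above (background subtraction, windowing, and an FFT that turns each captured interference spectrum into an A-scan) can be sketched on the CPU with NumPy; a GPU implementation runs the same per-spectrum steps in parallel. The synthetic on-bin spectrum below stands in for real detector data:

```python
import numpy as np

n = 2048                                    # pixels per captured spectrum
k = np.arange(n)                            # wavenumber sample index
depth_px = 300                              # reflector depth bin (synthetic)
# Synthetic interference spectrum: DC background plus one fringe pattern.
spectrum = 1.0 + 0.3 * np.cos(2 * np.pi * depth_px * k / n)

def a_scan(raw):
    """Convert one raw spectrum into a depth profile (A-scan)."""
    dc = raw.mean()                                   # background (DC) estimate
    windowed = (raw - dc) * np.hanning(raw.size)      # window to suppress sidelobes
    return np.abs(np.fft.rfft(windowed))              # depth-profile magnitude

profile = a_scan(spectrum)
peak = int(np.argmax(profile))              # brightest depth bin
```

Because each spectrum is processed independently, the whole B-scan maps naturally onto GPU threads, which is why GPU pipelines of this kind can keep up with the camera capture rate.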

Abstract:

Yorick Wilks is a central figure in the fields of Natural Language Processing and Artificial Intelligence. His influence extends to many areas and includes contributions to Machine Translation, word sense disambiguation, dialogue modelling and Information Extraction. This book celebrates the work of Yorick Wilks with a selection of his papers, intended to reflect the range and depth of his work. The volume accompanies a Festschrift celebrating his contribution to the fields of Computational Linguistics and Artificial Intelligence. The papers include early work carried out at Cambridge University, descriptions of groundbreaking work on Machine Translation and Preference Semantics, as well as more recent work on belief modelling and computational semantics. The selected papers reflect Yorick's contribution to both practical and theoretical aspects of automatic language processing.

Abstract:

This thesis describes the development of a complete data visualisation system for large tabular databases, such as those commonly found in a business environment. A state-of-the-art 'cyberspace cell' data visualisation technique was investigated and a powerful visualisation system using it was implemented. Although allowing databases to be explored and conclusions drawn, it had several drawbacks, the majority of which were due to the three-dimensional nature of the visualisation. A novel two-dimensional generic visualisation system, known as MADEN, was then developed and implemented, based upon a 2-D matrix of 'density plots'. MADEN allows an entire high-dimensional database to be visualised in one window, while permitting close analysis in 'enlargement' windows. Selections of records can be made and examined, and dependencies between fields can be investigated in detail. MADEN was used as a tool for investigating and assessing many data processing algorithms, firstly data-reducing (clustering) methods, then dimensionality-reducing techniques. These included a new 'directed' form of principal components analysis, several novel applications of artificial neural networks, and discriminant analysis techniques which illustrated how groups within a database can be separated. To illustrate the power of the system, MADEN was used to explore customer databases from two financial institutions, resulting in a number of discoveries which would be of interest to a marketing manager. Finally, the database of results from the 1992 UK Research Assessment Exercise was analysed. Using MADEN allowed both universities and disciplines to be graphically compared, and supplied some startling revelations, including empirical evidence of the 'Oxbridge factor'.
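The idea of a matrix of pairwise 'density plots' can be sketched as follows; MADEN itself is not described at code level here, so the field names and data below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "customer database" with three numeric fields (illustrative only).
age = rng.uniform(18, 80, 1000)
income = 200 * age + rng.normal(0, 2000, 1000)   # dependent on age
balance = rng.exponential(500, 1000)             # independent field
fields = {"age": age, "income": income, "balance": balance}

def density_matrix(fields, bins=16):
    """Build a 2-D histogram (one 'density plot') for every ordered pair of
    distinct fields; laid out as a grid, these give a one-window overview of
    all pairwise dependencies in the database."""
    names = list(fields)
    return {
        (a, b): np.histogram2d(fields[a], fields[b], bins=bins)[0]
        for a in names for b in names if a != b
    }

plots = density_matrix(fields)
```

A dependency such as the age-income relationship above shows up as a diagonal ridge in its cell of the matrix, while an independent pair produces a diffuse, structureless plot; selections of records can then be re-histogrammed to examine subgroups in detail.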

Abstract:

This collection of papers records a series of studies, carried out over a period of some 50 years, on two aspects of river pollution control: the prevention of pollution by sewage, through biological filtration, and the monitoring of river pollution by biological surveillance. The earlier studies were carried out to develop methods of controlling the flies which bred in the filters and caused serious nuisance, and a possible public health hazard, when they dispersed to surrounding villages. Although the application of insecticides proved effective as a palliative measure, it was considered ecologically unsound as a long-term solution because it produced only a temporary disturbance of the ecological balance. Subsequent investigations showed that the fly populations in filters were largely determined by the amount of food available to the grazing larval stage in the form of filter film. It was also established that the winter deterioration in filter performance was due to the excessive accumulation of film. Further investigations were therefore carried out to determine the factors responsible for the accumulation of film in different types of filter. Methods of filtration that were thought to control film accumulation by increasing the flushing action of the sewage were found instead to control fungal film by creating nutrient-limiting conditions. In some filters, increasing the hydraulic flushing reduced the grazing fauna population in the surface layers and resulted in an increase in film. The results of these investigations were successfully applied in modifying filters and in the design of a Double Filtration process. These studies on biological filters led to the conclusion that filters should be designed and operated as ecological systems and not merely as hydraulic ones. Studies on the effects of sewage effluents on Birmingham streams confirmed the findings of earlier workers, justifying their claim for the use of biological methods in detecting and assessing river pollution.
Further ecological studies showed the sensitivity of benthic riffle communities to organic pollution. Using experimental channels and laboratory studies, the different environmental conditions associated with organic pollution were investigated. The degree and duration of oxygen depletion during the dark hours were found to be a critical factor, and the relative tolerance of different taxa to other pollutants, such as ammonia, differed. Although colonisation samplers proved of value in sampling difficult sites, the invertebrate data they generated were not suitable for processing with any of the commonly used biotic indices. Several of the papers, written by request for presentation at conferences and similar events, presented the biological viewpoint on river pollution and water quality issues of the time and advocated the use of biological methods. The information and experience gained in these investigations were used as the "domain expert" in the development of artificial intelligence systems for the biological surveillance of river water quality.