935 results for Industrial automation techniques


Relevance: 30.00%

Abstract:

Detect and Avoid (DAA) technology is widely acknowledged as a critical enabler for unsegregated Remotely Piloted Aircraft (RPA) operations, particularly Beyond Visual Line of Sight (BVLOS). Image-based DAA in the visible spectrum is a promising technological option for addressing the challenges DAA presents. Two impediments to progress for this approach are the scarcity of video footage available to train and test algorithms, and the lack of testing regimes and specifications that facilitate repeatable, statistically valid performance assessment. This paper makes three key contributions towards addressing these impediments. First, we detail our progress towards the creation of a large hybrid collision and near-collision encounter database. Second, we explore the suitability of techniques employed by the biometric research community (Speaker Verification and Language Identification) for DAA performance optimisation and assessment. These techniques include Detection Error Trade-off (DET) curves, Equal Error Rates (EER), and the Detection Cost Function (DCF). Finally, the hybrid database and the speech-based techniques are combined and employed in the assessment of a contemporary image-based DAA system, which includes stabilisation, morphological filtering and a Hidden Markov Model (HMM) temporal filter.
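As a hedged illustration of the speech-community metrics named above (DET, EER, DCF), the following minimal Python sketch computes an approximate EER from two hypothetical score sets; the score arrays, Gaussian toy data, and threshold sweep are assumptions for illustration, not the paper's evaluation code.

    import numpy as np

    def det_points(target_scores, nontarget_scores):
        # Sweep thresholds over all observed scores; at each threshold the
        # miss rate counts targets below it and the false-alarm rate counts
        # non-targets at or above it (the two axes of a DET curve).
        thresholds = np.sort(np.concatenate([target_scores, nontarget_scores]))
        miss = np.array([(target_scores < t).mean() for t in thresholds])
        fa = np.array([(nontarget_scores >= t).mean() for t in thresholds])
        return miss, fa

    def equal_error_rate(target_scores, nontarget_scores):
        # EER is the operating point where miss and false-alarm rates meet.
        # (The DCF generalises this: C_miss*P_tgt*P_miss + C_fa*(1-P_tgt)*P_fa.)
        miss, fa = det_points(target_scores, nontarget_scores)
        idx = np.argmin(np.abs(miss - fa))
        return (miss[idx] + fa[idx]) / 2.0

    # Hypothetical frame scores: higher means "intruder aircraft present".
    rng = np.random.default_rng(0)
    targets = rng.normal(1.0, 1.0, 1000)      # frames containing an aircraft
    nontargets = rng.normal(-1.0, 1.0, 1000)  # clutter-only frames
    print(f"EER ~ {equal_error_rate(targets, nontargets):.3f}")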

Relevance: 30.00%

Abstract:

This paper presents a statistical aircraft trajectory clustering approach aimed at discriminating between typical manned and expected unmanned traffic patterns. First, a resampled version of each trajectory is modelled using a mixture of von Mises distributions (circular statistics). Second, the remodelled trajectories are globally aligned using tools from bioinformatics. Third, the alignment scores are used to cluster the trajectories using an iterative k-medoids approach and an appropriate distance function. The approach is then evaluated using synthetically generated unmanned aircraft flights combined with real air traffic position reports taken over a sector of Northern Queensland, Australia. Results suggest that the technique is useful in distinguishing between expected unmanned and manned aircraft traffic behaviour, as well as in identifying some common conventional air traffic patterns.
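The clustering step lends itself to a short sketch. Below is a minimal iterative k-medoids loop in Python, assuming a precomputed pairwise distance matrix D derived from the alignment scores; the matrix, function name, and convergence policy are illustrative assumptions, not the paper's exact procedure.

    import numpy as np

    def k_medoids(D, k, n_iter=100, seed=0):
        # D: precomputed n x n distance matrix (e.g., derived from global
        # alignment scores between remodelled trajectories).
        n = D.shape[0]
        rng = np.random.default_rng(seed)
        medoids = rng.choice(n, size=k, replace=False)
        for _ in range(n_iter):
            # Assignment step: each trajectory joins its nearest medoid.
            labels = np.argmin(D[:, medoids], axis=1)
            # Update step: within each cluster, pick the member minimising
            # the total distance to all other cluster members.
            new_medoids = medoids.copy()
            for j in range(k):
                members = np.where(labels == j)[0]
                if members.size:
                    costs = D[np.ix_(members, members)].sum(axis=1)
                    new_medoids[j] = members[np.argmin(costs)]
            if np.array_equal(new_medoids, medoids):
                break  # medoids stable: converged
            medoids = new_medoids
        return medoids, labels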

Relevance: 30.00%

Abstract:

This paper focuses on optimisation algorithms inspired by swarm intelligence for land cover classification from high-resolution multispectral satellite images. Amongst the many benefits and uses of remote sensing, one of the most important has been its use in solving the problem of land cover mapping. As the frontiers of space technology advance, the knowledge derived from satellite data has also grown in sophistication. Image classification forms the core of the solution to the land cover mapping problem, yet no single classifier has proved able to satisfactorily classify all the basic land cover classes of an urban region. In both supervised and unsupervised classification methods, evolutionary algorithms are not exploited to their full potential. This work tackles land cover mapping with Ant Colony Optimisation (ACO) and Particle Swarm Optimisation (PSO), arguably the most popular algorithms in this category. We present the results of classification techniques using swarm intelligence for the problem of land cover mapping in an urban region. High-resolution QuickBird data has been used for the experiments.
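As a sketch of one of the two algorithms, here is a generic particle swarm optimiser in Python; encoding positions as spectral cluster centres and minimising an intra-cluster distance objective are assumptions about how PSO might be applied to classification, not the paper's exact formulation.

    import numpy as np

    def pso_minimise(f, dim, n_particles=30, n_iter=200,
                     w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 255.0), seed=0):
        # Generic PSO: particles fly through the search space, pulled toward
        # their own best position (pbest) and the swarm's best (g). For land
        # cover mapping, a position could encode cluster centres in spectral
        # space and f an intra-cluster distance objective (an assumption).
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        x = rng.uniform(lo, hi, (n_particles, dim))   # positions
        v = np.zeros_like(x)                          # velocities
        pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
        g = pbest[np.argmin(pbest_val)]               # global best
        for _ in range(n_iter):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            vals = np.array([f(p) for p in x])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            g = pbest[np.argmin(pbest_val)]
        return g, pbest_val.min()

    # Toy usage: recover a 4-dimensional point near grey level 100.
    best, val = pso_minimise(lambda p: ((p - 100.0) ** 2).sum(), dim=4)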

Relevance: 30.00%

Abstract:

Light scattering, or the scattering and absorption of electromagnetic waves, is an important tool in all remote-sensing observations. In astronomy, the light scattered or absorbed by a distant object can be the only source of information. In Solar-system studies, light-scattering methods are employed when interpreting observations of atmosphereless bodies such as asteroids, of planetary atmospheres, and of cometary or interplanetary dust. Our Earth is constantly monitored from artificial satellites at different wavelengths. In remote sensing of the Earth, light-scattering methods are not the only source of information, since there is always the possibility of making in situ measurements; satellite-based remote sensing is, however, superior in speed and coverage, provided the scattered signal can be reliably interpreted. The optical properties of many industrial products play a key role in their quality. Especially for products such as paint and paper, the ability to obscure the background and to reflect light is of utmost importance. High-grade papers are evaluated based on their brightness, opacity, color, and gloss. In product development, there is a need for computer-based simulation methods that can predict optical properties and can therefore be used to optimize quality while reducing material costs. With paper, for instance, pilot experiments on an actual paper machine can be very time- and resource-consuming. The light-scattering methods presented in this thesis solve rigorously the interaction of light with materials having wavelength-scale structures. These methods are computationally demanding, so their speed and accuracy play a key role. Different implementations of the discrete-dipole approximation are compared in the thesis, and the results provide practical guidelines for choosing a suitable code. In addition, a novel method is presented for the numerical computation of orientation-averaged light-scattering properties of a particle, and the method is compared against existing techniques. Simulation of light scattering for various targets, and the problems that can arise from the finite size of the model target, are discussed in the thesis. Scattering by single particles and small clusters is considered, as well as scattering in particulate media and in continuous media with porosity or surface roughness. Various techniques for modeling the scattering media are presented, and the results are applied to optimizing the structure of paper. The same methods can, however, be applied in light-scattering studies of Solar-system regoliths or cometary dust, or in any remote-sensing problem involving light scattering in random media with wavelength-scale structures.
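For a flavour of one computational step, the Python sketch below performs brute-force orientation averaging by Monte Carlo sampling over uniformly distributed orientations; the single-orientation solver scatter_fn is a hypothetical stand-in (e.g., a discrete-dipole-approximation code), and this brute-force baseline is what a dedicated orientation-averaging method would be compared against, not the thesis's novel method itself.

    import numpy as np

    def orientation_average(scatter_fn, n_samples=500, seed=0):
        # Monte Carlo averaging over particle orientations given as
        # (alpha, beta, gamma) Euler angles. scatter_fn is assumed to
        # return a scalar quantity (e.g., a cross-section) for one
        # orientation; in practice it would wrap a DDA solver call.
        rng = np.random.default_rng(seed)
        alpha = rng.uniform(0, 2 * np.pi, n_samples)
        gamma = rng.uniform(0, 2 * np.pi, n_samples)
        # Drawing cos(beta) uniformly on [-1, 1] gives orientations
        # distributed uniformly over the sphere.
        beta = np.arccos(rng.uniform(-1, 1, n_samples))
        values = [scatter_fn(a, b, g) for a, b, g in zip(alpha, beta, gamma)]
        return np.mean(values)

    # Toy usage with a made-up orientation-dependent "cross-section".
    avg = orientation_average(lambda a, b, g: 1.0 + np.cos(b) ** 2)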

Relevance: 30.00%

Abstract:

The prime focus of this study is the design of a 50 mm internal diameter diaphragmless shock tube that can be used in an industrial facility for repeated shock wave loading. The instantaneous rise in pressure and temperature of a medium can be exploited in a variety of industrial applications. We designed, fabricated and tested three different shock wave generators, of which one employs a highly elastic rubber membrane and the other two use a fast-acting pneumatic valve instead of conventional metal diaphragms. The valve opening speed is obtained with the help of a high-speed camera: for the shock generation systems with a pneumatic cylinder it ranges from 0.325 to 1.15 m/s, while it is around 8.3 m/s for the rubber membrane. Experiments are conducted using the three diaphragmless systems and the results are analyzed carefully to obtain a relation between the opening speed of the valve and the amount of gas actually utilized in generating the shock wave in each system. The rubber membrane is not suitable for industrial applications because it needs to be replaced regularly and cannot withstand high driver pressures. The maximum shock Mach number obtained using the new diaphragmless system with the pneumatic valve is 2.125 +/- 0.2%. This system shows much promise for automation in an industrial environment.
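For context, the driver pressure needed to reach a given shock Mach number is usually estimated from the classical one-dimensional shock-tube relation for an ideally bursting diaphragm, a textbook result rather than anything stated in this abstract; subscripts 4 and 1 denote driver and driven gas, gamma the ratio of specific heats, and a the speed of sound:

    \[
    \frac{p_4}{p_1} = \left[ 1 + \frac{2\gamma_1}{\gamma_1 + 1}\left( M_s^2 - 1 \right) \right]
    \left[ 1 - \frac{\gamma_4 - 1}{\gamma_1 + 1}\,\frac{a_1}{a_4}\left( M_s - \frac{1}{M_s} \right) \right]^{-2\gamma_4/(\gamma_4 - 1)}
    \]

A slowly opening valve departs from this ideal, which is why the relation between opening speed and the amount of driver gas actually used is the quantity of interest here.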

Relevance: 30.00%

Abstract:

An understanding of application I/O access patterns is useful in several situations. First, gaining insight into what applications are doing with their data at a semantic level helps in designing efficient storage systems. Second, it helps create benchmarks that mimic realistic application behavior closely. Third, it enables autonomic systems, as the information obtained can be used to adapt the system in a closed loop. All these use cases require the ability to extract the application-level semantics of I/O operations. Methods such as modifying application code to associate I/O operations with semantic tags are intrusive. It is well known that network file system traces are an important source of information that can be obtained non-intrusively and analyzed either online or offline. These traces are a sequence of primitive file system operations and their parameters. Simple counting, statistical analysis or deterministic search techniques are inadequate for discovering application-level semantics in the general case, because of the inherent variation and noise in realistic traces. In this paper, we describe a trace analysis methodology based on Profile Hidden Markov Models. We show that the methodology has powerful discriminatory capabilities that enable it to recognize applications based on the patterns in the traces, and to mark out regions in a long trace that encapsulate sets of primitive operations that represent higher-level application actions. It is robust enough that it can work around discrepancies between training and target traces such as in length and interleaving with other operations. We demonstrate the feasibility of recognizing patterns based on a small sampling of the trace, enabling faster trace analysis. Preliminary experiments show that the method is capable of learning accurate profile models on live traces in an online setting. We present a detailed evaluation of this methodology in a UNIX environment using NFS traces of selected commonly used applications such as compilations as well as on industrial-strength benchmarks such as TPC-C and Postmark, and discuss its capabilities and limitations in the context of the use cases mentioned above.
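The recognition idea reduces to scoring an operation sequence under per-application HMMs and picking the best-scoring model. Below is a minimal Python sketch of the forward-algorithm scoring step, with a hypothetical integer coding of NFS operations and made-up model parameters; real profile HMMs have a richer match/insert/delete topology than this plain HMM.

    import numpy as np
    from scipy.special import logsumexp

    def sequence_loglik(obs, pi, A, B):
        # Forward algorithm: log P(obs | HMM) with initial probs pi,
        # transition matrix A, and emission matrix B. obs is a list of
        # integer-coded file system operations (0=lookup, 1=read, 2=write,
        # a hypothetical coding for illustration).
        log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
        alpha = log_pi + log_B[:, obs[0]]
        for o in obs[1:]:
            # alpha[j] <- log sum_i exp(alpha[i] + log_A[i, j]) + log_B[j, o]
            alpha = log_B[:, o] + logsumexp(alpha[:, None] + log_A, axis=0)
        return logsumexp(alpha)

    # Hypothetical 2-state model for one application; in practice each
    # application's model would be trained from labelled NFS traces, and a
    # trace is attributed to the model with the highest log-likelihood.
    compile_model = (np.array([0.9, 0.1]),
                     np.array([[0.8, 0.2], [0.3, 0.7]]),
                     np.array([[0.6, 0.3, 0.1], [0.1, 0.2, 0.7]]))
    trace = [0, 1, 1, 2, 2, 2]
    print(sequence_loglik(trace, *compile_model))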

Relevance: 30.00%

Abstract:

Given a parametrized n-dimensional SQL query template and a choice of query optimizer, a plan diagram is a color-coded pictorial enumeration of the execution plan choices of the optimizer over the query parameter space. These diagrams have proved to be a powerful metaphor for the analysis and redesign of modern optimizers, and are gaining currency in diverse industrial and academic institutions. However, their utility is adversely impacted by the impractically large computational overheads incurred when standard brute-force exhaustive approaches are used for producing fine-grained diagrams on high-dimensional query templates. In this paper, we investigate strategies for efficiently producing close approximations to complex plan diagrams. Our techniques are customized to the features available in the optimizer's API, ranging from the generic optimizers that provide only the optimal plan for a query, to those that also support costing of sub-optimal plans and enumerating rank-ordered lists of plans. The techniques collectively feature both random and grid sampling, as well as inference techniques based on nearest-neighbor classifiers, parametric query optimization and plan cost monotonicity. Extensive experimentation with a representative set of TPC-H and TPC-DS-based query templates on industrial-strength optimizers indicates that our techniques are capable of delivering 90% accurate diagrams while incurring less than 15% of the computational overheads of the exhaustive approach. In fact, for full-featured optimizers, we can guarantee zero error with less than 10% overheads. These approximation techniques have been implemented in the publicly available Picasso optimizer visualization tool.
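The core low-overhead strategy, namely sampling a subset of the query parameter grid, calling the optimizer only there, and inferring the rest, can be sketched as follows in Python; optimal_plan is a hypothetical stand-in for an optimizer API call, and the 2-D grid, sampling fraction, and 1-nearest-neighbour inference are simplifying assumptions (the paper additionally exploits plan costing, rank-ordered plan lists, and plan cost monotonicity).

    import numpy as np

    def approximate_plan_diagram(optimal_plan, resolution=50,
                                 sample_frac=0.1, seed=0):
        # Query the optimizer on a random sample of grid points in 2-D
        # selectivity space, then colour every unsampled point with the
        # plan of its nearest sampled neighbour.
        rng = np.random.default_rng(seed)
        grid = np.array([(x, y) for x in range(resolution)
                                for y in range(resolution)])
        n_sample = int(sample_frac * len(grid))
        sampled_idx = rng.choice(len(grid), size=n_sample, replace=False)
        sampled_pts = grid[sampled_idx]
        sampled_plans = np.array([optimal_plan(x / resolution, y / resolution)
                                  for x, y in sampled_pts])
        # 1-nearest-neighbour inference for all grid points.
        d2 = ((grid[:, None, :] - sampled_pts[None, :, :]) ** 2).sum(axis=2)
        return sampled_plans[np.argmin(d2, axis=1)].reshape(resolution,
                                                            resolution)

    # Toy usage with a fake optimizer that has three plan regions.
    diagram = approximate_plan_diagram(
        lambda sx, sy: int(sx > 0.3) + 2 * int(sy > 0.6))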

Relevance: 30.00%

Abstract:

This paper presents a novel algorithm for compression of single-lead electrocardiogram (ECG) signals. The method is based on pole-zero modelling of the Discrete Cosine Transformed (DCT) signal. An extension is proposed to the well-known Steiglitz-McBride algorithm to model the higher-frequency components of the input signal more accurately. This is achieved by weighting the error function minimized by the algorithm to estimate the model parameters. The data compression achieved by the parametric model is further enhanced by Differential Pulse Code Modulation (DPCM) of the model parameters. The method accomplishes a compression ratio in the range of 1:20 to 1:40, which far exceeds those achieved by most current methods.
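To make the pipeline concrete, here is a hedged Python sketch of only the DCT and DPCM stages; the central pole-zero (Steiglitz-McBride) modelling step is deliberately omitted, and the retained-coefficient fraction and quantiser step are illustrative parameters, so this simplified sketch will not reach the paper's 1:20 to 1:40 ratios.

    import numpy as np
    from scipy.fft import dct, idct

    def compress_ecg(x, keep=0.05, step=0.01):
        # Transform the signal to the DCT domain and keep only the
        # low-frequency coefficients (where ECG energy concentrates);
        # the paper instead fits a pole-zero model to this spectrum.
        c = dct(x, norm='ortho')
        n_keep = max(1, int(keep * len(c)))
        kept = c[:n_keep]
        # DPCM stage: quantise successive differences of the parameters.
        diffs = np.diff(kept, prepend=0.0)
        codes = np.round(diffs / step).astype(int)
        return codes, len(x)

    def decompress_ecg(codes, n, step=0.01):
        # Invert DPCM (cumulative sum), zero-pad, and inverse-DCT.
        kept = np.cumsum(codes * step)
        c = np.zeros(n)
        c[:len(kept)] = kept
        return idct(c, norm='ortho')

    x = np.sin(np.linspace(0, 8 * np.pi, 2000))  # stand-in for an ECG lead
    codes, n = compress_ecg(x)
    xr = decompress_ecg(codes, n)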

Relevance: 30.00%

Abstract:

Advanced bus-clamping pulse width modulation (ABCPWM) techniques are advantageous in terms of line current distortion and inverter switching loss in voltage source inverter-fed applications. However, the PWM waveforms corresponding to these techniques are not amenable to carrier-based generation. The modulation process in ABCPWM methods is analyzed here from a “per-phase” perspective. It is shown that three sets of descendant modulating functions (or modified modulating functions) can be generated from the three-phase sinusoidal signals. Each set of modified modulating functions can be used to produce the PWM waveform of a given phase in a computationally efficient manner. Theoretical results and experimental investigations on a 5 hp motor drive are presented.
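For a feel of the modified-modulating-function idea, the Python sketch below generates conventional 60-degree bus-clamping modulating waves by common-mode injection; this illustrates the clamping building block only, not the specific descendant functions of the advanced methods analysed in the paper.

    import numpy as np

    def bus_clamped_modulating(m, theta):
        # Conventional bus-clamping: at every instant, add a common-mode
        # (zero-sequence) offset so the phase with the largest absolute
        # value is clamped to the positive or negative DC bus, leaving
        # the line-to-line (differential) voltages unchanged.
        a = m * np.cos(theta)
        b = m * np.cos(theta - 2 * np.pi / 3)
        c = m * np.cos(theta + 2 * np.pi / 3)
        phases = np.stack([a, b, c])
        idx = np.arange(len(theta))
        k = np.argmax(np.abs(phases), axis=0)        # phase to clamp
        clamp_target = np.sign(phases[k, idx])       # +1 or -1 bus
        offset = clamp_target - phases[k, idx]       # common-mode injection
        return phases + offset                       # modified modulating waves

    theta = np.linspace(0, 2 * np.pi, 1000)
    mod = bus_clamped_modulating(0.9, theta)  # compare against a triangle carrier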

Relevance: 30.00%

Abstract:

The problem addressed in this paper is sound, scalable, demand-driven null-dereference verification for Java programs. Our approach consists conceptually of a base analysis, plus two major extensions for enhanced precision. The base analysis is a dataflow analysis wherein we propagate formulas in the backward direction from a given dereference, and compute a necessary condition at the entry of the program for the dereference to be potentially unsafe. The extensions are motivated by the presence of certain "difficult" constructs in real programs, e.g., virtual calls with too many candidate targets, and library method calls, which need excessive analysis time to be analyzed fully. The base analysis is hence configured to skip such a difficult construct when it is encountered, by dropping all information tracked so far that could potentially be affected by the construct. Our extensions are essentially more precise ways to account for the effect of these constructs on the information being tracked, without requiring full analysis of the constructs. The first extension is a novel scheme to transmit formulas along certain kinds of def-use edges, while the second is based on manually constructed backward-direction summary functions of library methods. We have implemented our approach and applied it to a set of real-life benchmarks. The base analysis is on average able to declare about 84% of dereferences in each benchmark as safe, while the two extensions push this number up to 91%.
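A toy version of the backward propagation conveys the base analysis's flavour. The Python sketch below pushes a nullness condition backwards through straight-line copy statements, dropping to an "unknown" verdict at a skipped construct; the statement forms and names are invented for illustration, and real analyses work over a control-flow graph with formulas on access paths.

    # Toy backward propagation of a nullness condition, weakest-precondition
    # style, over straight-line three-address statements.
    def backward_null_condition(stmts, deref_var):
        # Start at the dereference: it is unsafe iff deref_var may be null.
        cond = {deref_var}   # variables whose nullness would imply the bug
        for op, dst, src in reversed(stmts):
            if dst in cond:
                cond.discard(dst)
                if op == "copy":        # dst = src: nullness flows from src
                    cond.add(src)
                elif op == "new":       # dst = new T(): never null
                    pass                # condition via dst becomes unsatisfiable
                elif op == "unknown":   # skipped construct: drop tracked info
                    return None         # conservatively "may be unsafe"
        return cond  # necessary condition at entry (empty set => safe)

    stmts = [("new", "a", None), ("copy", "b", "a"), ("copy", "c", "b")]
    print(backward_null_condition(stmts, "c"))  # set() => dereference of c is safe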

Relevance: 30.00%

Abstract:

Computer Assisted Assessment (CAA) has existed for several years now. While some forms of CAA do not require sophisticated text understanding (e.g., multiple choice questions), there are also student answers that consist of free text and require analysis of the text in the answer. Research on the latter has to date concentrated on two main sub-tasks: (i) grading of essays, done mainly by checking the style, grammatical correctness, and coherence of the essay, and (ii) assessment of short free-text answers. In this paper, we present a structured view of relevant research in automated assessment techniques for short free-text answers. We review papers spanning the last 15 years of research, with emphasis on recent work. Our objectives are twofold. First, we present the survey in a structured way, segregating information on datasets, problem formulations, techniques, and evaluation measures. Second, we discuss some potential future directions in this domain which we hope will be helpful to researchers.
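As a concrete (and deliberately simplistic) example of the kind of short-answer technique this literature surveys, a lexical-overlap baseline scores a student answer by its TF-IDF cosine similarity to a model answer; the Python sketch below uses scikit-learn, and the threshold and example texts are assumptions, not drawn from any surveyed paper.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def grade(reference_answer, student_answer, threshold=0.5):
        # Vectorise both texts with TF-IDF and compare them by cosine
        # similarity; threshold is an illustrative pass/fail cut-off.
        vec = TfidfVectorizer().fit([reference_answer, student_answer])
        X = vec.transform([reference_answer, student_answer])
        sim = cosine_similarity(X[0], X[1])[0, 0]
        return sim, sim >= threshold

    print(grade("Photosynthesis converts light energy into chemical energy",
                "Plants turn light into chemical energy by photosynthesis"))

Such lexical baselines are brittle to paraphrase, which is one motivation for the semantic techniques surveyed in the paper.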

Relevance: 30.00%

Abstract:

The advent of molecular biology has had a dramatic impact on all aspects of biology, not least applied microbial ecology. Microbiological testing of water has traditionally depended largely on culture techniques. Growing understanding that only a small proportion of microbial species are culturable, and that many microorganisms may attain a viable but non-culturable state, has promoted the development of novel approaches to monitoring pathogens in the environment. This has been paralleled by an increased awareness of the surprising genetic diversity of natural microbial populations. By targeting gene sequences that are specific for particular microorganisms, for example genes that encode diagnostic enzymes, or species-specific domains of conserved genes such as 16S ribosomal RNA coding sequences (rrn genes), the problems of culture can be avoided. Technical developments, notably in the area of in vitro amplification of DNA using the polymerase chain reaction (PCR), now permit routine detection and identification of specific microorganisms, even when present in very low numbers. Although the techniques of molecular biology have provided some very powerful tools for environmental microbiology, it should not be forgotten that these have their own drawbacks and biases in sampling. For example, molecular techniques are dependent on efficient lysis and recovery of nucleic acids from both vegetative forms and spores of microbial species that may differ radically when growing in the laboratory compared with the natural environment. Furthermore, PCR amplification can introduce its own bias depending on the nature of the oligonucleotide primers utilised. However, despite these potential caveats, it seems likely that a molecular biological approach, particularly with its potential for automation, will provide the mainstay of diagnostic technology for the foreseeable future.