50 results for robot automation


Relevance: 10.00%

Abstract:

Evolution strategies are a class of general optimisation algorithms applicable to functions that are multimodal, nondifferentiable, or even discontinuous. Although recombination operators have been introduced into evolution strategies, the primary search operator is still mutation. Classical evolution strategies rely on Gaussian mutations. A new mutation operator based on the Cauchy distribution is proposed in this paper. It is shown empirically that the new evolution strategy based on Cauchy mutation outperforms the classical evolution strategy on most of the 23 benchmark problems tested in this paper. The paper also shows empirically that changing the order of mutating the objective variables and mutating the strategy parameters does not alter this conclusion significantly, and that Cauchy mutations with different scaling parameters still outperform the Gaussian mutation with self-adaptation. However, the advantage of Cauchy mutations disappears when recombination is used in evolution strategies. It is argued that the search step size plays an important role in determining the performance of evolution strategies. The large step size of recombination plays a role similar to that of Cauchy mutation.
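The difference between the two mutation operators can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the authors' implementation: the function names are hypothetical and self-adaptation of the strategy parameters is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mutate(x, sigma=1.0):
    # Classical ES mutation: add zero-mean Gaussian noise to each variable.
    return x + sigma * rng.standard_normal(x.shape)

def cauchy_mutate(x, scale=1.0):
    # Proposed alternative: Cauchy noise has much heavier tails, so it
    # produces occasional long jumps that can escape local optima.
    return x + scale * rng.standard_cauchy(x.shape)

parent = np.zeros(5)
gaussian_child = gaussian_mutate(parent)
cauchy_child = cauchy_mutate(parent)
```

Because the Cauchy distribution has no finite variance, the Cauchy variant occasionally takes very large steps, which is the intuition behind its advantage on multimodal problems.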

Relevance: 10.00%

Abstract:

Coset enumeration is one of the most important procedures for investigating finitely presented groups. We present a practical parallel procedure for coset enumeration on shared-memory processors. The shared-memory architecture is particularly interesting because such parallel computation is both faster and cheaper. The lower cost comes when the program requires large amounts of memory: additional CPUs allow us to reduce the time for which the expensive memory is in use. Rather than report on a suite of test cases, we take a single, typical case and analyze the performance factors in depth. The parallelization is achieved through a master-slave architecture. This results in an interesting phenomenon, whereby the CPU time is divided into a sequential and a parallel portion, and the parallel part demonstrates a speedup that is linear in the number of processors. We describe an early version in which only 40% of the program was parallelized, and we describe how this was modified to achieve 90% parallelization using 15 slave processors and a master. In the latter case, a sequential time of 158 seconds was reduced to 29 seconds using 15 slaves.
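The reported figures are broadly consistent with Amdahl's law; the following sketch (the helper name is ours, not from the paper) checks the arithmetic:

```python
def amdahl_time(t_seq, parallel_frac, workers):
    """Wall time when parallel_frac of a t_seq-second sequential run
    is split evenly across the given number of workers; the remaining
    (1 - parallel_frac) fraction stays sequential."""
    return t_seq * ((1.0 - parallel_frac) + parallel_frac / workers)

# Reported figures: 158 s sequential run, 90% parallelized, 15 slaves.
predicted = amdahl_time(158.0, 0.90, 15)  # roughly 25.3 s
```

The predicted 25.3 s is close to the measured 29 s; the gap is plausibly master/slave coordination overhead.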

Relevance: 10.00%

Abstract:

In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipeline implementations in signal processing. Here, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound on the highest order of stable filters is given. Then the minimum-order stable linear predictor is obtained by solving an optimization problem. The popular genetic algorithm approach is adopted, since it is a heuristic probabilistic optimization technique that has been widely used in engineering design. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
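The stability constraint underlying such designs is the standard pole condition for recursive filters. A minimal NumPy sketch of that check (not the paper's genetic algorithm, whose details are not given in the abstract):

```python
import numpy as np

def is_stable(denominator):
    """A recursive (IIR) filter with transfer-function denominator
    coefficients [1, a1, ..., aN] is stable iff every pole, i.e. every
    root of the denominator polynomial, lies strictly inside the unit
    circle."""
    poles = np.roots(denominator)
    return bool(np.all(np.abs(poles) < 1.0))

stable_example = is_stable([1.0, -0.5])    # single pole at z = 0.5
unstable_example = is_stable([1.0, -1.5])  # single pole at z = 1.5
```

A genetic algorithm for this problem would use a check like this inside its fitness function to penalize unstable candidate filters.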

Relevance: 10.00%

Abstract:

This paper is concerned with the use of scientific visualization methods for the analysis of feedforward neural networks (NNs). Inevitably, the kinds of data associated with the design and implementation of neural networks are of very high dimensionality, presenting a major challenge for visualization. A method is described using the well-known statistical technique of principal component analysis (PCA). This is found to be an effective and useful method of visualizing the learning trajectories of many learning algorithms such as back-propagation and can also be used to provide insight into the learning process and the nature of the error surface.
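A minimal sketch of the idea, assuming the network's weight snapshots from successive training steps are stacked row-wise into a matrix (the names, shapes and toy data below are illustrative, not from the paper):

```python
import numpy as np

def pca_project(weights, k=2):
    """Project a T x D matrix of weight snapshots (one row per training
    step) onto its top-k principal components, giving a T x k learning
    trajectory that can be plotted."""
    X = weights - weights.mean(axis=0)            # centre the trajectory
    # SVD of the centred data yields the principal directions as rows of Vt.
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:k].T

# Toy trajectory: 100 snapshots of a 50-dimensional weight vector,
# generated as a random walk to mimic drifting weights during training.
rng = np.random.default_rng(1)
trajectory = np.cumsum(rng.standard_normal((100, 50)), axis=0)
path_2d = pca_project(trajectory)  # ready to plot as a 2-D learning path
```

Plotting `path_2d` for different runs of an algorithm such as back-propagation is the kind of visualization the paper describes.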

Relevance: 10.00%

Abstract:

We discuss quantum error correction for errors that occur at random times as described by a conditional Poisson process. We show how a class of such errors, detected spontaneous emission, can be corrected by continuous closed-loop feedback.

Relevance: 10.00%

Abstract:

Vertical direct chill (VDC) casting of aluminium alloys is a mature process that has evolved over many decades through gradual change to both equipment design and casting practice. Today, air-pressurised, continuous-lubrication, hot top mould systems with advanced station automation are the process of choice for producing extrusion billet. Specific sets of operating parameters are employed on these stations for each alloy and size combination to produce optimal billet quality. The designs and parameters are largely derived from past experience and accumulated know-how. Recent experimental work at the University of Queensland has concentrated on understanding the way in which the surface properties of liquid aluminium alloys, e.g., surface tension, wetting angle and oxide skin strength, influence the size and shape of the naturally-stable meniscus for a given alloy, temperature and atmosphere. The wide range of alloy- and condition-dependent values measured has led to consideration of how these properties affect the stability of the enforced molten metal meniscus within the hot top mould cavity. The actual shape and position of the enforced meniscus is controlled by parameters such as the upstream conduction distance (UCD) from sub-mould cooling and the molten metal head. The degree of deviation of this actual meniscus from the predicted stable meniscus is considered to be a key driver of surface defect formation. This paper reports on liquid alloy property results and proposes how this knowledge might be used to better design VDC mould systems and casting practices.

Relevance: 10.00%

Abstract:

The anisotropic norm of a linear discrete time-invariant system measures the sensitivity of the system output to stationary Gaussian input disturbances of bounded mean anisotropy. Mean anisotropy characterizes the degree of predictability (or colouredness) and spatial non-roundness of the noise. The anisotropic norm falls between the H-2 and H-infinity norms and accommodates their loss of performance when the probability structure of the input disturbances is not exactly known. This paper develops a method for numerical computation of the anisotropic norm which involves linked Riccati and Lyapunov equations and an associated equation of special type.

Relevance: 10.00%

Abstract:

Uncontrolled systems ẋ ∈ Ax, where A is a non-empty compact set of matrices, and controlled systems ẋ ∈ Ax + Bu are considered. Higher-order systems 0 ∈ Px − Du, where P and D are sets of differential polynomials, are also studied. It is shown that, under natural conditions commonly occurring in robust control theory, with some mild additional restrictions, asymptotic stability of the differential inclusions is guaranteed. The main results are variants of small-gain theorems, and the principal technique used is the Krasnosel'skii-Pokrovskii principle of absence of bounded solutions.

Relevance: 10.00%

Abstract:

Any given n × n matrix A is shown to be a restriction, to the A-invariant subspace, of a nonnegative N × N matrix B of spectral radius ρ(B) arbitrarily close to ρ(A). A difference inclusion x(k+1) ∈ Ax(k), where A is a compact set of matrices, is asymptotically stable if and only if A can be extended to a set B of nonnegative matrices B with ‖B‖₁ < 1 or ‖B‖∞ < 1. Similar results are derived for differential inclusions.

Relevance: 10.00%

Abstract:

The question is examined of estimating the norms of perturbations of a linear stable dynamic system under which the perturbed system remains stable, in the situation where the perturbation has a fixed structure.

Relevance: 10.00%

Abstract:

The QU-GENE Computing Cluster (QCC) is a hardware and software solution to the automation and speedup of large QU-GENE (QUantitative GENEtics) simulation experiments that are designed to examine the properties of genetic models, particularly those that involve factorial combinations of treatment levels. QCC automates the management of the distribution of components of the simulation experiments among the networked single-processor computers to achieve the speedup.
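The master-worker dispatch pattern described here can be sketched in Python. Everything below is a hypothetical stand-in: `run_cell`, its parameters and the treatment levels are illustrative, and a thread pool plays the role of QCC's networked single-processor machines.

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

def run_cell(params):
    # Stand-in for one QU-GENE simulation run; the real QCC would
    # dispatch the job to a networked single-processor computer.
    heritability, pop_size = params
    return heritability * pop_size  # dummy "result" for illustration

# Factorial combination of treatment levels, as in a QCC experiment.
grid = list(itertools.product([0.2, 0.5, 0.8], [100, 200, 500]))

# The master hands one cell of the factorial design to each free worker.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_cell, grid))
```

The speedup comes from the fact that the cells of a factorial simulation experiment are independent, so they can be farmed out with no inter-worker communication.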

Relevance: 10.00%

Abstract:

Pasminco Century Mine has developed a geophysical logging system to provide new data for ore mining/grade control and the generation of Short Term Models for mine planning. Previous work indicated the applicability of petrophysical logging for lithology prediction; however, the automation of the method was not considered reliable enough for the development of a mining model. A test survey was undertaken using two diamond-drilled control holes and eight percussion holes. All holes were logged with natural gamma, magnetic susceptibility and density. Calibration of the LogTrans auto-interpretation software using only natural gamma and magnetic susceptibility indicated that both lithology and stratigraphy could be predicted. Development of a capability to enforce stratigraphic order within LogTrans increased the reliability and accuracy of interpretations. After the completion of a feasibility program, Century Mine has invested in a dedicated logging vehicle to log blast holes as well as for use in in-fill drilling programs. Future refinement of the system may lead to the development of GPS-controlled excavators for mining ore.

Relevance: 10.00%

Abstract:

The focus of rapid diagnosis of infectious diseases of children in the last decade has shifted from variations of the conventional laboratory techniques of antigen detection, microscopy and culture to that of molecular diagnosis of infectious agents. Pediatricians will need to be able to interpret the use, limitations and results of molecular diagnostic techniques as they are increasingly integrated into routine clinical microbiology laboratory protocols. PCR is the best known and most successfully implemented diagnostic molecular technology to date. It can detect specific infectious agents and determine their virulence and antimicrobial genotypes with greater speed, sensitivity and specificity than conventional microbiology methods. Inherent technical limitations of PCR are present, although they are reduced in laboratories that follow suitable validation and quality control procedures. Variations of PCR together with advances in nucleic acid amplification technology have broadened its diagnostic capabilities in clinical infectious disease to now rival and even surpass traditional methods in some situations. Automation of all components of PCR is now possible. The completion of the genome sequencing projects for significant microbial pathogens, in combination with PCR and DNA chip technology, will revolutionize the diagnosis and management of infectious diseases.

Relevance: 10.00%

Abstract:

In this paper we establish a foundation for understanding the instrumentation needs of complex dynamic systems if ecological interface design (EID)-based interfaces are to be robust in the face of instrumentation failures. EID-based interfaces often include configural displays which reveal the higher-order properties of complex systems. However, concerns have been expressed that such displays might be misleading when instrumentation is unreliable or unavailable. Rasmussen's abstraction hierarchy (AH) formalism can be extended to include representations of sensors near the functions or properties about which they provide information, resulting in what we call a sensor-annotated abstraction hierarchy. Sensor-annotated AHs help the analyst determine the impact of different instrumentation engineering policies on higher-order system information by showing how the data provided from individual sensors propagates within and across levels of abstraction in the AH. The use of sensor-annotated AHs with a configural display is illustrated with a simple water reservoir example. We argue that if EID is to be effectively employed in the design of interfaces for complex systems, then the information needs of the human operator need to be considered at the earliest stages of system development while instrumentation requirements are being formulated. In this way, Rasmussen's AH promotes a formative approach to instrumentation engineering. (C) 2002 Elsevier Science Ltd. All rights reserved.
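The propagation idea behind a sensor-annotated abstraction hierarchy can be sketched as a small data structure. The property and sensor names below are illustrative stand-ins, not the paper's actual water-reservoir model:

```python
def affected_properties(failed_sensor, ah):
    """Return every higher-order property whose value can no longer be
    trusted once the given sensor has failed, i.e. propagate the failure
    upward through the annotated hierarchy."""
    return sorted(prop for prop, node in ah.items()
                  if failed_sensor in node["depends_on"])

# Hypothetical sensor-annotated hierarchy for a water-reservoir display:
# each property records the sensors from which its value is derived.
reservoir_ah = {
    "mass_balance": {"depends_on": ["level_sensor", "inflow_meter", "outflow_meter"]},
    "inflow_rate":  {"depends_on": ["inflow_meter"]},
    "outflow_rate": {"depends_on": ["outflow_meter"]},
}

tainted = affected_properties("inflow_meter", reservoir_ah)
# tainted == ["inflow_rate", "mass_balance"]
```

This is the analysis the paper performs on the abstraction hierarchy itself: a configural display can flag `tainted` properties rather than silently rendering them from bad data.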