898 results for design automation of robots


Relevance:

100.00%

Publisher:

Abstract:

This paper presents a method for the indirect orientation of aerial images using ground control lines extracted from airborne laser system (ALS) data. This data-integration strategy has shown good potential for the automation of photogrammetric tasks, including the indirect orientation of images. The most important characteristic of the proposed approach is that the exterior orientation parameters (EOP) of a single image or of multiple images can be computed automatically, with a space resection procedure, from data derived from different sensors. The suggested method works as follows. First, straight lines are automatically extracted from the digital aerial image (s) and from the intensity image derived from an ALS dataset (S). Then, the correspondence between s and S is determined automatically. A line-based coplanarity model that establishes the relationship between straight lines in object space and in image space is used to estimate the EOP with an iterated extended Kalman filter (IEKF). Implementation and testing of the method employed data from different sensors. Experiments conducted to assess the proposed method showed that the accuracy of the estimated EOP is a function of the ALS positional accuracy.
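The coplanarity model itself is not reproducible from the abstract, but the iterated-EKF update it relies on can be sketched generically. Below is a minimal sketch of an IEKF measurement update, using hypothetical range measurements to two beacons in place of the line observations; all names and values are illustrative, not the paper's model.

```python
import numpy as np

def iekf_update(x0, P, z, h, H_jac, R, n_iter=10):
    """One iterated-EKF measurement update (Gauss-Newton form).

    x0, P : prior state and covariance
    z     : measurement vector
    h     : nonlinear measurement function h(x)
    H_jac : function returning the Jacobian of h at x
    R     : measurement noise covariance
    """
    x = x0.copy()
    for _ in range(n_iter):
        H = H_jac(x)                       # relinearise about the iterate
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x0 + K @ (z - h(x) - H @ (x0 - x))
    P_new = (np.eye(len(x0)) - K @ H) @ P
    return x, P_new

# Toy example: estimate a 2-D position from ranges to two hypothetical
# beacons (stand-ins for the real line observations).
beacons = np.array([[0.0, 0.0], [10.0, 0.0]])
def h(x):
    return np.linalg.norm(beacons - x, axis=1)
def H_jac(x):
    return (x - beacons) / h(x)[:, None]

true_x = np.array([3.0, 4.0])
z = h(true_x)                    # noiseless measurements for clarity
x0 = np.array([2.0, 6.0])        # rough prior
P = np.eye(2) * 4.0
R = np.eye(2) * 1e-4
x_est, _ = iekf_update(x0, P, z, h, H_jac, R)
print(x_est)                     # close to (3, 4)
```

With a vague prior and small R, the iteration behaves like Gauss-Newton on the measurement equations, which is why it recovers the true position from a poor starting point.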

Relevance:

100.00%

Publisher:

Abstract:

Human intestinal parasites constitute a problem in most tropical countries, causing death or physical and mental disorders. Their diagnosis usually relies on the visual analysis of microscopy images, with error rates that may range from moderate to high. The problem has been addressed via computational image analysis, but only for a few species and for images free of fecal impurities. In routine practice, fecal impurities are a real challenge for automatic image analysis. We have circumvented this problem with a method that can segment and classify, from bright-field microscopy images with fecal impurities, the 15 most common species of protozoan cysts, helminth eggs, and larvae in Brazil. Our approach exploits ellipse matching and the image foresting transform for image segmentation, multiple object descriptors and their optimum combination by genetic programming for object representation, and the optimum-path forest classifier for object recognition. The results indicate that our method is a promising step toward the full automation of enteroparasitosis diagnosis. © 2012 IEEE.
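The full pipeline (ellipse matching, image foresting transform, genetic programming) is beyond a short example, but the optimum-path forest (OPF) classifier mentioned at the end can be sketched in simplified form: prototypes come from inter-class edges of a minimum spanning tree, path cost is the maximum edge weight, and a test sample takes the label of the training sample offering the cheapest path. The 2-D points below are hypothetical stand-ins for the real object descriptors.

```python
import math
from heapq import heappush, heappop

def train_opf(X, y):
    """Simplified OPF training on a complete graph of training samples."""
    n = len(X)
    dist = lambda a, b: math.dist(X[a], X[b])
    # Prim's algorithm: minimum spanning tree over the complete graph
    in_tree, parent, best = [False] * n, [None] * n, [math.inf] * n
    best[0] = 0.0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        for v in range(n):
            if not in_tree[v] and dist(u, v) < best[v]:
                best[v], parent[v] = dist(u, v), u
    # prototypes: endpoints of MST edges that join different classes
    protos = set()
    for v in range(n):
        if parent[v] is not None and y[v] != y[parent[v]]:
            protos.update((v, parent[v]))
    # propagate optimum-path costs (path cost = maximum edge weight)
    cost = [0.0 if i in protos else math.inf for i in range(n)]
    heap = [(0.0, i) for i in protos]
    while heap:
        c, u = heappop(heap)
        if c > cost[u]:
            continue
        for v in range(n):
            nc = max(c, dist(u, v))
            if nc < cost[v]:
                cost[v] = nc
                heappush(heap, (nc, v))
    return cost

def classify_opf(X, y, cost, t):
    """Label of the training sample with the cheapest path to t."""
    s = min(range(len(X)), key=lambda i: max(cost[i], math.dist(X[i], t)))
    return y[s]

# Two toy 2-D clusters standing in for real object descriptors
X = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
y = ["cyst", "cyst", "cyst", "egg", "egg", "egg"]
cost = train_opf(X, y)
print(classify_opf(X, y, cost, (0.5, 0.5)))  # "cyst"
print(classify_opf(X, y, cost, (5.5, 5.5)))  # "egg"
```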

Relevance:

100.00%

Publisher:

Abstract:

Faced with an imminent restructuring of the electric power system, over the past few years many countries have invested in a new paradigm known as the Smart Grid. This paradigm targets the optimization and automation of the electric power network using advanced information and communication technologies. Among the main communication protocols for Smart Grids is the DNP3 protocol, which provides secure data transmission at moderate rates. IEEE 802.15.4 is another communication protocol widely used in Smart Grids, especially in the so-called Home Area Network (HAN). Many Smart Grid applications therefore depend on the interaction of these two protocols. This paper proposes modeling, in the traditional network simulator NS-2, the integration of the DNP3 protocol and the IEEE 802.15.4 wireless standard for low-cost simulations of Smart Grid applications.
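The paper's NS-2 model is not reproduced here; as a rough illustration of the kind of question such a simulation answers, the toy sketch below counts retransmissions when a hypothetical DNP3 master polls an outstation over a lossy 802.15.4-like link. All parameters (loss probability, retry budget) are assumptions, not values from the paper.

```python
import random

def simulate_polls(n_polls=1000, loss_prob=0.1, max_retries=3, seed=42):
    """Toy poll/response model: each poll is retried until both the
    request and the reply survive the lossy link, or the retry budget
    is exhausted. Returns (answered, retransmissions, failed)."""
    rng = random.Random(seed)
    answered = retransmissions = failed = 0
    for _ in range(n_polls):
        for attempt in range(1 + max_retries):
            # request and reply must both survive the lossy link
            if rng.random() > loss_prob and rng.random() > loss_prob:
                answered += 1
                retransmissions += attempt
                break
        else:
            failed += 1
    return answered, retransmissions, failed

answered, retx, failed = simulate_polls()
print(f"{answered} polls answered, {retx} retransmissions, {failed} failed")
```

A real NS-2 study would of course model frame formats, CSMA/CA backoff, and DNP3 timeouts rather than a single loss probability.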

Relevance:

100.00%

Publisher:

Abstract:

2-Methylisoborneol (MIB) and geosmin (GSM) are by-products of algae decomposition and, depending on their concentration, can be toxic; at lower concentrations they give an unpleasant taste and odor to water. For water treatment companies it is important to constantly monitor their presence in the distributed water and so avoid customer complaints. Lower-cost and easy-to-read instrumentation would be very promising in this regard. In this study, we evaluate the potential of an electronic tongue (ET) system, based on non-specific polymeric sensors and impedance measurements, for monitoring MIB and GSM in water samples. Principal component analysis (PCA) applied to the generated data matrix indicated that this ET was capable of discriminating the two contaminants, with remarkable reproducibility, in either distilled or tap water, at concentrations as low as 25 ng L-1. Nonetheless, this analysis methodology was rather qualitative and laborious, and the outputs it provided were largely subjective. Moreover, data analysis based on PCA severely restricts automation of the measuring system and its use by non-specialized operators. To circumvent these drawbacks, a fuzzy controller was designed to perform sample classification quantitatively while providing outputs as simpler data charts. For instance, the ET together with this fuzzy controller quantified MIB and GSM samples in distilled and tap water with a 100% hit rate, and the hit rate could be read directly from the plot. The low cost of these polymeric sensors, allied to the special features of the fuzzy controller (ease of programming and numerical outputs), provided the initial requirements for developing an automated ET system to monitor odorant species in water production and distribution. (C) 2012 Elsevier B.V. All rights reserved.
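As a rough sketch of the PCA step described above, the following applies PCA (via SVD) to a hypothetical data matrix standing in for the impedance readings of the sensor array; the fuzzy controller itself is not reproduced, and all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the ET data matrix: rows are samples, columns are
# impedance readings from three sensors (values are illustrative).
mib = rng.normal(loc=[1.0, 0.2, 0.5], scale=0.05, size=(20, 3))
gsm = rng.normal(loc=[0.4, 0.9, 0.1], scale=0.05, size=(20, 3))
X = np.vstack([mib, gsm])

# PCA by SVD of the mean-centred matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T               # principal-component scores

# The two contaminants should separate along the first component
pc1_mib, pc1_gsm = scores[:20, 0], scores[20:, 0]
print(pc1_mib.mean(), pc1_gsm.mean())
```

Reading cluster separation off a score plot like this is exactly the qualitative, operator-dependent step the abstract's fuzzy controller was designed to replace.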

Relevance:

100.00%

Publisher:

Abstract:

In the collective imagination, a robot is a human-like machine, like the androids of science fiction. However, the robots encountered most frequently are machines that do work that is too dangerous, boring, or onerous for humans. Most of the robots in the world are of this type. They can be found in the automotive, medical, manufacturing, and space industries. A robot, then, is a system that contains sensors, control systems, manipulators, power supplies, and software, all working together to perform a task. The development and use of such systems is an active area of research, and one of the main problems is the development of interaction skills with the surrounding environment, which include the ability to grasp objects. To perform this task the robot needs to sense the environment and acquire information about the object: the physical attributes that may influence a grasp. Humans solve this grasping problem easily thanks to their past experience, which is why many researchers approach it from a machine learning perspective, finding a grasp for an object using information about already known objects. But humans can select the best grasp from a vast repertoire, considering not only the physical attributes of the object to grasp but also the effect they want to obtain. This is why, in our case, the study of robot manipulation focuses on grasping and on integrating symbolic tasks with data gained through sensors. The learning model is based on a Bayesian network that encodes the statistical dependencies between the data collected by the sensors and the symbolic task. This data representation has several advantages: it takes into account the uncertainty of the real world, allowing sensor noise to be handled; it encodes a notion of causality; and it provides a unified network for learning.
Since the network currently implemented is based on human expert knowledge, it is very interesting to implement an automated method to learn its structure: as more tasks and object features are introduced in the future, a complex network design based only on human expert knowledge can become unreliable. Since structure learning algorithms present some weaknesses, the goal of this thesis is to analyze the real data used in the network modeled by the human expert, implement a feasible structure learning approach, and compare the results with the network designed by the expert, in order to possibly enhance it.
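The abstract does not specify which structure learning algorithm is used; as a minimal illustration of one common first step, the sketch below ranks candidate edges by empirical mutual information between discrete variables. The variables ("shape", "grasp", "colour") and the data are hypothetical.

```python
from collections import Counter
from itertools import combinations
from math import log2

def mutual_information(xs, ys):
    """Empirical mutual information (bits) between two discrete variables."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy dataset: "grasp" depends deterministically on "shape", while
# "colour" is independent of both (all hypothetical variables).
data = {
    "shape":  ["box", "ball", "box", "ball", "box", "ball", "box", "ball"],
    "grasp":  ["top", "side", "top", "side", "top", "side", "top", "side"],
    "colour": ["red", "red", "blue", "blue", "red", "red", "blue", "blue"],
}

scores = {(a, b): mutual_information(data[a], data[b])
          for a, b in combinations(data, 2)}
for pair, mi in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(pair, round(mi, 3))    # shape-grasp ranks first
```

Real score-based structure learners (e.g. hill climbing over BIC) go further, but high pairwise dependence is the usual signal that two nodes should be connected.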

Relevance:

100.00%

Publisher:

Abstract:

From the late 1980s, the automation of sequencing techniques and the spread of computers gave rise to a flourishing number of new molecular structures and sequences and to a proliferation of new databases in which to store them. Presented here are three computational approaches able to analyse this massive amount of publicly available data in order to answer important biological questions. The first strategy studies the incorrect assignment of the first AUG codon in a messenger RNA (mRNA) due to the incomplete determination of its 5' end sequence. Using an automated expressed sequence tag (EST)-based approach, an extension of the mRNA 5' coding region was identified in 477 human loci, out of all known human mRNAs analysed. Proof-of-concept confirmation was obtained by in vitro cloning and sequencing for GNB2L1, QARS and TDP2, and the consequences for functional studies are discussed. The second approach analyses codon bias, the phenomenon whereby distinct synonymous codons are used with different frequencies, and, after integration with a gene expression profile, estimates the total number of codons present across all expressed mRNAs (named here the "codonome value") in a given biological condition. Systematic analyses across different pathological and normal human tissues and multiple species show a surprisingly tight correlation between the codon bias and the codonome bias. The third approach studies the expression of genes implicated in human autism spectrum disorder (ASD). ASD-implicated genes sharing microRNA response elements (MREs) for the same microRNA are co-expressed in brain samples from healthy and ASD-affected individuals. The differential expression of a recently identified long non-coding RNA, which has four MREs for the same microRNA, could disrupt the equilibrium of this network, but further analyses and experiments are needed.
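The "codonome value" can be illustrated with a toy computation: count codon frequencies per coding sequence and weight them by expression level. The sequences and expression values below are hypothetical, and the weighting scheme is only a guess at what the abstract implies.

```python
from collections import Counter

def codon_usage(cds):
    """Count codon frequencies in an in-frame coding sequence."""
    assert len(cds) % 3 == 0, "CDS length must be a multiple of 3"
    return Counter(cds[i:i + 3] for i in range(0, len(cds), 3))

def codonome(transcripts):
    """Total codon counts across expressed mRNAs, each weighted by its
    expression level (a toy version of the 'codonome value')."""
    total = Counter()
    for cds, expression in transcripts:
        for codon, n in codon_usage(cds).items():
            total[codon] += n * expression
    return total

# Hypothetical mini-transcriptome: (coding sequence, expression level)
transcripts = [
    ("ATGGCTGCTTAA", 10),  # Met-Ala-Ala-stop, highly expressed
    ("ATGGCATAA", 2),      # Met-Ala-stop, lowly expressed
]
total = codonome(transcripts)
print(total)
```

Note that the two alanine codons GCT and GCA end up with very different totals, which is the codon-bias signal the expression weighting exposes.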

Relevance:

100.00%

Publisher:

Abstract:

In distributed systems like clouds or service-oriented frameworks, applications are typically assembled by deploying and connecting a large number of heterogeneous software components, spanning from fine-grained packages to coarse-grained complex services. The complexity of such systems requires a rich set of techniques and tools to support the automation of their deployment process. By relying on a formal model of components, a technique is devised for computing the sequence of actions that deploys a desired configuration. An efficient algorithm, working in polynomial time, is described and proven to be sound and complete. Finally, a prototype tool implementing the proposed algorithm has been developed. Experimental results support the adoption of this novel approach in real-life scenarios.
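The paper's algorithm operates on a richer component model, but when dependencies form an acyclic graph, a valid deployment sequence can be sketched with a topological sort (Kahn's algorithm). The component names below are hypothetical.

```python
from collections import deque

def deployment_order(deps):
    """Kahn's topological sort. `deps` maps each component to the list
    of components it requires; returns a deployment sequence in which
    every component appears after its dependencies, or raises if the
    dependency graph is cyclic."""
    nodes = set(deps)
    for reqs in deps.values():
        nodes |= set(reqs)
    indegree = {c: len(deps.get(c, [])) for c in nodes}
    dependents = {c: [] for c in nodes}
    for c, reqs in deps.items():
        for r in reqs:
            dependents[r].append(c)
    ready = deque(sorted(c for c in nodes if indegree[c] == 0))
    order = []
    while ready:
        c = ready.popleft()
        order.append(c)
        for d in dependents[c]:
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.append(d)
    if len(order) != len(nodes):
        raise ValueError("cyclic dependencies: no deployment order exists")
    return order

# Hypothetical cloud application; names and dependencies are illustrative
deps = {
    "webapp":     ["app-server", "database"],
    "app-server": ["runtime"],
    "database":   [],
    "runtime":    [],
}
order = deployment_order(deps)
print(order)
```

Each pass over the ready queue touches every edge once, so the sketch runs in time linear in components plus dependencies, consistent with the polynomial bound the paper proves for its richer setting.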

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, the author presents a query language for an RDF (Resource Description Framework) database and discusses its applications in the context of the HELM project (the Hypertextual Electronic Library of Mathematics). This language aims at meeting the main requirements coming from the RDF community. In particular it includes: a human-readable textual syntax and a machine-processable XML (Extensible Markup Language) syntax, both for queries and for query results; a rigorously exposed formal semantics; a graph-oriented RDF data access model capable of exploring an entire RDF graph (including both RDF Models and RDF Schemata); a full set of Boolean operators to compose the query constraints; fully customizable and highly structured query results having a 4-dimensional geometry; and some constructions taken from ordinary programming languages that simplify the formulation of complex queries. The HELM project aims at integrating modern tools for the automation of formal reasoning with the most recent electronic publishing technologies, in order to create and maintain a hypertextual, distributed virtual library of formal mathematical knowledge. In the spirit of the Semantic Web, the documents of this library include RDF metadata describing their structure and content in a machine-understandable form. Using the author's query engine, HELM exploits this information to implement functionalities for the interactive and automatic retrieval of documents on the basis of content-aware requests that take into account the mathematical nature of these documents.
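As a minimal sketch of the graph-oriented data access such a language builds on, the following matches a single triple pattern with variables against a toy RDF-like store. The metadata triples and property names are hypothetical, not actual HELM vocabulary.

```python
def query(triples, pattern):
    """Match one (subject, predicate, object) pattern against a set of
    RDF-like triples; terms starting with '?' are variables. Returns a
    list of variable bindings, one per matching triple."""
    results = []
    for triple in triples:
        binding = {}
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                if binding.get(term, value) != value:
                    break                 # variable bound to another value
                binding[term] = value
            elif term != value:
                break                     # constant term does not match
        else:
            results.append(binding)
    return results

# Hypothetical metadata triples in the spirit of HELM's RDF annotations
triples = [
    ("doc:lemma1", "dc:title",    "Commutativity of addition"),
    ("doc:lemma1", "helm:refers", "doc:nat"),
    ("doc:thm7",   "helm:refers", "doc:nat"),
    ("doc:thm7",   "dc:title",    "Induction principle"),
]
res = query(triples, ("?doc", "helm:refers", "doc:nat"))
print(res)   # bindings for the two documents that refer to doc:nat
```

A real RDF query engine composes many such patterns with joins and Boolean constraints; this shows only the elementary matching step.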

Relevance:

100.00%

Publisher:

Abstract:

Telomeres have emerged as crucial cellular elements in aging and in various diseases, including cancer. To measure the average length of telomere repeats in cells, we describe protocols that use fluorescence in situ hybridization (FISH) with labeled peptide nucleic acid (PNA) probes specific for telomere repeats, in combination with fluorescence measurements by flow cytometry (flow FISH). Flow FISH analysis can be performed on commercially available flow cytometers and has the unique advantage, over other methods for measuring telomere length, of providing multi-parameter information on the length of telomere repeats in thousands of individual cells. The accuracy and reproducibility of the measurements are augmented by the automation of most pipetting (aspiration and dispensing) steps and by including in every tube an internal standard (control cells) with a known telomere length. The basic protocol for the analysis of nucleated blood cells from 22 different individuals takes about 12 h spread over 2-3 days.
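The normalisation against the internal standard can be sketched as a simple ratio: the sample's background-corrected telomere fluorescence relative to that of control cells with a known telomere length. All numbers below are illustrative, not calibration data from the protocol.

```python
def relative_telomere_length(sample_fl, control_fl,
                             sample_bg, control_bg, control_kb):
    """Estimate a sample's telomere length from flow FISH fluorescence,
    normalised against internal-control cells of known telomere length.
    Backgrounds are the autofluorescence of unstained cells."""
    specific_sample = sample_fl - sample_bg
    specific_control = control_fl - control_bg
    return control_kb * specific_sample / specific_control

# Hypothetical readings: sample cells measured together with control
# cells assumed to carry 24 kb telomeres (illustrative values only).
length = relative_telomere_length(sample_fl=820.0, control_fl=1540.0,
                                  sample_bg=40.0, control_bg=40.0,
                                  control_kb=24.0)
print(round(length, 1), "kb")
```

Running the control in every tube is what makes results comparable across tubes and instruments: instrument drift cancels out of the ratio.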

Relevance:

100.00%

Publisher:

Abstract:

Short description: Great future potential can be seen in the automation of intralogistics order-picking processes. An elementary component of this automation process is the use of industrial robots, which must be equipped with a suitable end effector, the gripper. Robots can work faster, more precisely, and with more endurance than human order pickers, and thus contribute decisively to increased efficiency. An essential challenge in this development step toward substituting manual order picking is the design and provision of a suitable gripping system. At the Lehrstuhl für Maschinenelemente und Technische Logistik (Chair of Machine Elements and Technical Logistics) of Helmut-Schmidt-Universität, drawing on experience from a previous research project, the method of cluster analysis was used for the first time to examine grasp objects for the development of a bionic universal gripper for the order picking of drugstore articles. This treatise describes a contribution to the development of this gripper, using the example of commercially available drugstore articles that are currently picked manually. These are clustered with respect to the object features relevant to grasping, and the resulting insights are expressed in the form of design features. After an analysis and definition of the grasp-relevant features of the objects, an object database is created. Using suitable methods, this database is prepared and reduced. The grasp objects, or rather their feature values, are then subjected to a hierarchical cluster analysis, in which the boundaries of the resulting clusters are defined and analyzed by means of the associated grasp objects. Finally, certain gripper-specific features are checked and evaluated for their applicability within the clusters.
These considerations make it possible to reliably identify special requirements on the gripper that derive directly from the properties of the grasp objects, and to take them into account in the design.
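The hierarchical cluster analysis described above can be sketched in miniature: single-linkage agglomerative clustering over hypothetical grasp-relevant features. In practice the features would first be normalised, and the thesis's actual feature set and linkage choice may differ.

```python
import math

def single_linkage(points, n_clusters):
    """Naive agglomerative clustering with single linkage: repeatedly
    merge the two clusters whose closest members are nearest, until
    n_clusters remain."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(math.dist(a, b)
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)   # merge the closest pair
    return clusters

# Hypothetical grasp-relevant features: (width in cm, mass in g)
items = [(4.0, 90), (4.5, 100), (5.0, 95),        # small rigid packages
         (12.0, 400), (12.5, 420), (11.5, 380)]   # large heavy bottles
clusters = single_linkage(items, n_clusters=2)
for c in clusters:
    print(sorted(c))
```

Each resulting cluster can then be inspected for the gripper requirements its members share, which is the step the thesis turns into design features.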

Relevance:

100.00%

Publisher:

Abstract:

In this paper we follow a theory-based approach to study the assimilation of compliance software in highly regulated multinational enterprises. These relatively new software products support the automation of controls associated with mandatory compliance requirements. We use institutional theory and success-factor theory to explain the assimilation of compliance software. A framework for analyzing the assimilation of Access Control Systems (ACS), a special type of compliance software, is developed and used to reflect on the experiences obtained in four in-depth case studies. One result is that coercive, mimetic, and normative pressures significantly affect ACS assimilation. Quality aspects, on the other hand, have only a moderate impact at the beginning of the assimilation process; in later phases their impact may increase as performance and improvement objectives become more relevant. In addition, it turns out that the position of the enterprises and compatibility heavily influence the assimilation process.

Relevance:

100.00%

Publisher:

Abstract:

Two-dimensional (2D) crystallisation of membrane proteins reconstitutes them into their native environment, the lipid bilayer. Electron crystallography allows the structural analysis of these regular protein–lipid arrays up to atomic resolution. The crystal quality depends on the protein purity, its stability, and the crystallisation conditions. The basics of 2D crystallisation and various recent advances are reviewed, and electron crystallography approaches are summarised. Progress in 2D crystallisation, sample preparation, image detectors, and the automation of the data acquisition and processing pipeline makes 2D electron crystallography particularly attractive for the structural analysis of membrane proteins that are too small for single-particle analysis and too unstable to form three-dimensional (3D) crystals.

Relevance:

100.00%

Publisher:

Abstract:

Mining in the Iberian Pyrite Belt (IPB), the biggest VMS metallogenetic province known in the world to date, faces a deep crisis in spite of the huge reserves still known after ≈5,000 years of production. This is due to several factors, such as the difficult processing of the complex Cu-Pb-Zn-Ag-Au ores, the exhaustion of the oxidation-zone orebodies (the richest in gold, in gossan), the scarce demand for sulphuric acid on the world market, and harder environmental regulations. Of these factors, only the first and the last can be addressed by local ore geologists. A reactivation of mining can therefore only be achieved by improved and more efficient ore processing, under the constraint of strict environmental controls. Digital image analysis of the ores, coupled with reflected-light microscopy, provides a quantified and reliable mineralogical and textural characterization of the ores. The automation of the procedure for the first time furnishes process engineers with real-time information, to improve the process and to prevent or control pollution; it can be applied to metallurgical tailings as well. This is shown by some examples from the IPB.
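One elementary building block of such digital image analysis is segmenting a mineral phase by its reflectance and reporting its area fraction. The sketch below does this on a synthetic image; the threshold and reflectance values are illustrative, not calibrated optical data, and the real procedure involves far richer mineralogical and textural measures.

```python
import numpy as np

# Toy stand-in for a reflected-light image: a bright mineral grain on a
# darker matrix (reflectances and the 0.40 threshold are illustrative).
rng = np.random.default_rng(1)
image = rng.normal(0.25, 0.03, size=(64, 64))            # dark matrix
image[16:32, 16:48] = rng.normal(0.55, 0.03, size=(16, 32))  # bright grain

mask = image > 0.40              # segment the bright phase by reflectance
area_fraction = mask.mean()      # modal abundance by area
print(f"bright phase: {area_fraction:.1%} of the field")
```

Run on a calibrated image stream, a measurement like this is the kind of real-time, quantified feedback the abstract argues process engineers need.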