925 results for Automation and robotics
Abstract:
The conjugate gradient method is one of the most popular optimization methods for solving large systems of linear equations. In a system identification problem, for example, where a very large impulse response is involved, a particular strategy is needed that reduces the delay while improving the convergence time. In this paper we propose a new scheme that combines frequency-domain adaptive filtering with a conjugate gradient technique in order to implement a high-order multichannel adaptive filter that is delayless and guarantees a very short convergence time.
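For reference, the sketch below implements the classical conjugate gradient iteration for a symmetric positive-definite linear system, the building block the proposed scheme combines with frequency-domain adaptive filtering; the multichannel and delayless aspects of the paper are not reproduced, and the problem size and tolerance are illustrative assumptions.

# Minimal conjugate gradient sketch for A x = b with A symmetric positive definite.
# Sizes and tolerance are illustrative, not values from the paper.
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=None):
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Toy usage with a random well-conditioned SPD matrix
rng = np.random.default_rng(0)
M = rng.standard_normal((64, 64))
A = M @ M.T + 64 * np.eye(64)
b = rng.standard_normal(64)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))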
Abstract:
In recent years, we have witnessed great changes in the industrial environment as a result of the innovations introduced by Industry 4.0, especially the integration of the Internet of Things, Automation and Robotics into manufacturing. The project presented in this thesis lies within this innovation context and describes the implementation of an Image Recognition application focused on the automotive field. The project aims at helping the supply chain operator perform an effective and efficient check of the homologation tags present on vehicles. The user's contribution consists in taking a picture of the tag; the application then, exploiting Amazon Web Services, automatically returns the result of the check on the correctness of the tag, its correct positioning within the vehicle, and the presence of faults or defects on the tag. To implement this application we combined two IoT platforms widely used in the industrial field: Amazon Web Services (AWS) and ThingWorx. AWS exploits Convolutional Neural Networks to perform Text Detection and Image Recognition, while PTC ThingWorx manages the user interface and the data manipulation.
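The thesis does not detail the AWS calls involved; as a hedged illustration only, the following Python sketch assumes Amazon Rekognition's detect_text API (via boto3) as the text-detection service and checks a tag photo against expected lines. The function name, confidence threshold and expected strings are hypothetical.

# Hedged sketch: text detection on a tag photo with boto3 / Amazon Rekognition.
# The exact AWS service used in the thesis is not specified; Rekognition is assumed here.
import boto3

def check_tag_text(image_bytes, expected_lines, min_confidence=90.0):
    client = boto3.client("rekognition")
    response = client.detect_text(Image={"Bytes": image_bytes})
    detected = {
        d["DetectedText"].strip()
        for d in response["TextDetections"]
        if d["Type"] == "LINE" and d["Confidence"] >= min_confidence
    }
    # Report which expected homologation-tag lines were found in the picture
    return {line: (line in detected) for line in expected_lines}

# Example usage (file path and expected strings are placeholders)
# with open("tag_photo.jpg", "rb") as f:
#     print(check_tag_text(f.read(), ["e11*2007/46*0123", "TYPE APPROVAL"]))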
Abstract:
In recent times, a significant research effort has been focused on how deformable linear objects (DLOs) can be manipulated for real-world applications such as the assembly of wiring harnesses in the automotive and aerospace sectors. This remains an open topic because of the difficulty of accurately modelling the behaviour of these objects and of simulating tasks involving their manipulation across a variety of different scenarios. These problems have led to the development of data-driven techniques in which machine learning is exploited to obtain reliable solutions. However, this approach makes the solution difficult to extend, since the learning must be replicated almost from scratch whenever the scenario changes. It follows that some model-based methodology must be introduced to generalize the results and reduce the training effort accordingly. The objective of this thesis is to develop a solution for DLO manipulation to assemble a wiring harness for the automotive sector, based on the adaptation of a base trajectory by means of reinforcement learning methods. The idea is to create trajectory planning software capable of solving the proposed task while reducing, where possible, the learning time, which runs in real time, while still offering suitable performance and reliability. The solution has been implemented on a collaborative 7-DOF Panda robot at the Laboratory of Automation and Robotics of the University of Bologna. Experimental results are reported showing that the robot is capable of optimizing the manipulation of the DLOs, gaining experience as the task is repeated, while also showing a high success rate from the very beginning of the learning phase.
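The thesis's actual reinforcement learning algorithm is not reproduced here; purely as an illustrative sketch of "adapting a base trajectory from a task reward", the snippet below uses a simple (1+1) evolution-strategy update. The robot interface (execute_on_robot), waypoint count, reward and noise level are all hypothetical placeholders.

# Minimal sketch: learn additive corrections to a base trajectory from a scalar reward.
import numpy as np

def execute_on_robot(waypoints):
    """Placeholder: run the trajectory on the robot and return a scalar reward."""
    target = np.linspace(0.0, 1.0, len(waypoints))[:, None] * np.ones((1, 3))
    return -np.sum((waypoints - target) ** 2)   # toy reward: closeness to a 'good' path

base = np.zeros((10, 3))          # base Cartesian trajectory, 10 waypoints in XYZ
offsets = np.zeros_like(base)     # learned additive corrections
sigma = 0.05                      # exploration noise, illustrative value
best_reward = execute_on_robot(base + offsets)

for episode in range(50):
    candidate = offsets + sigma * np.random.randn(*offsets.shape)
    reward = execute_on_robot(base + candidate)
    if reward > best_reward:      # keep the perturbation only if it improved the task
        offsets, best_reward = candidate, reward

print("best reward after adaptation:", best_reward)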
Abstract:
A 10-year experience with our automated molecular diagnostic platform, which carries out 91 different real-time PCRs, is described. Progress and future perspectives in molecular diagnostic microbiology are reviewed: why automation is important; how our platform was implemented; how homemade PCRs were developed; and the advantages and disadvantages of homemade PCRs, including the critical aspects of troubleshooting and the need to further reduce the turnaround time for specific samples, at least in defined clinical settings such as emergencies. The future of molecular diagnosis depends on automation and, in a novel perspective, it is now time to fully acknowledge the true contribution of molecular diagnostics and to reconsider the indications for PCR, also using these tests as first-line assays.
Abstract:
Horticultural science, linked with basic studies in biology, chemistry, physics and engineering, has laid the foundation for advances in applied knowledge which are at the heart of commercial, environmental and social horticulture. In few disciplines is science more rapidly translated into applicable technologies than in the huge range of man’s activities embraced within horticulture, which are discussed in this Trilogy. This chapter surveys the origins of horticultural science, which developed as an integral part of the 16th-century “Scientific Revolution”. It identifies early discoveries during the latter part of the 19th and early 20th centuries which rationalized the control of plant growth, flowering and fruiting and the media in which crops could be cultivated. The products of these discoveries formed the basis on which huge current industries of worldwide significance are founded in fruit, vegetable and ornamental production. More recent examples of the application of horticultural science are used to explain how the integration of plant breeding, crop selection and astute marketing, highlighted by the New Zealand industry, has retained and expanded the viability of production that supplies huge volumes of fruit to the world’s markets. This is followed by an examination of science applied to tissue and cell culture as an example of technologies which have already produced massive industrial applications but hold the prospect of generating even greater advances in the future. Finally, examples are given of nascent scientific discoveries which hold the prospect of generating horticultural industries with considerable future impact. These include systems modeling and biology, nanotechnology, robotics, automation and electronics, genetics and plant breeding, and the more efficient and effective use of resources and the employment of benign microbes. In conclusion, there is an estimation of the value of horticultural science to society.
Abstract:
Autonomous robots must be able to learn and maintain models of their environments. In this context, the present work considers techniques for the classification and extraction of features from images, combined with artificial neural networks, for use in the mapping and localization system of the mobile robot of the Laboratory of Automation and Evolutive Computer (LACE). To do this, the robot uses a sensorial system composed of ultrasound sensors and a catadioptric vision system formed by a camera and a conical mirror. The mapping system is composed of three modules, two of which are presented in this paper: the classifier module and the characterizer module. The first module uses a hierarchical neural network to perform the classification; the second uses techniques for extracting image attributes and recognizing invariant patterns extracted from the set of place images. The neural network of the classifier module is structured in two layers, reason and intuition, and is trained to classify each place explored by the robot into one of four predefined classes. The final result of the exploration is the construction of a topological map of the explored environment. Results obtained through simulation of both modules of the mapping system are presented in this paper. © 2008 IEEE.
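As a loose illustration only (not the two-layer "reason/intuition" network described above), the sketch below trains a small feed-forward classifier on synthetic place descriptors for four classes; the feature dimensionality and data are invented.

# Illustrative sketch: classifying place feature vectors into four classes with a small
# neural network. The LACE architecture and real image/sonar features are not reproduced.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_features = 16                                   # assumed size of the place descriptor
X = rng.standard_normal((200, n_features))        # synthetic place descriptors
y = rng.integers(0, 4, size=200)                  # four predefined place classes

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))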
Abstract:
Multiple-robot, single-operator scenarios pose a challenge in terms of human factors. Two relevant issues are maintaining the situational awareness and managing the workload of operators. In order to address these problems, this work analyses the management of information and commands in multi-robot missions. Regarding information, this paper proposes a selection based on mission and operator states. Regarding commands, this work reflects on the levels of automation and the methods of commanding.
Abstract:
Cybercrime and related malicious activity in our increasingly digital world has become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with various challenges. Some of the most prominent of these challenges include the increasing amounts of data and the diversity of digital evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are an interesting specimen, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not aided by the fact that digital forensics today still involves manual, time-consuming tasks within the processes of identifying evidence, performing evidence acquisition and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, industry-standard tools are largely evidence-oriented, have limited support for evidence integration and only automate certain precursory tasks, such as indexing and text searching.

In this study, efficiency, in the form of reducing the time and human labour expended, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture is designed for an automated system that performs digital forensics in highly networked mobile and cloud environments. Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements are proposed in order to further automate it by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves machine-to-machine communication, supporting several digital investigation processes enabled by the architecture by harnessing the properties of various peer-to-peer overlays.

Remote evidence acquisition helps to improve the efficiency (time and effort involved) of digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition and that a multi-TCP-connection paradigm is required. The automated integration, correlation and reasoning over multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing the need for time-consuming manual correlation. Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication, thereby enabling automation and reducing the need for manual human intervention.
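As a rough sketch of the multi-connection acquisition idea reported in the experiments (not the LEIA protocol itself), the snippet below pulls an evidence image in byte ranges over several parallel HTTP/TCP connections and reassembles it by offset; the URL, chunk size and worker count are illustrative assumptions.

# Sketch: acquire an evidence image over multiple parallel TCP (HTTP range) connections.
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def fetch_range(url, start, end):
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return start, resp.read()

def acquire(url, total_size, chunk=1 << 20, workers=4):
    ranges = [(s, min(s + chunk, total_size) - 1) for s in range(0, total_size, chunk)]
    buf = bytearray(total_size)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for start, data in pool.map(lambda r: fetch_range(url, *r), ranges):
            buf[start:start + len(data)] = data   # reassemble in order by offset
    return bytes(buf)

# Usage (placeholder URL and size):
# image = acquire("http://evidence-server.example/device.img", total_size=8 * (1 << 20))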
Abstract:
Power distribution automation and control are important tools in the current restructured electricity markets. Unfortunately, due to their stochastic nature, distribution system faults are hardly avoidable. This paper proposes a novel fault diagnosis scheme for power distribution systems, composed of three different processes: fault detection and classification, fault location, and fault section determination. The fault detection and classification technique is wavelet based. The fault-location technique is impedance based and uses local voltage and current fundamental phasors. The fault section determination method is based on an artificial neural network and uses the local current and voltage signals to estimate the faulted section. The proposed hybrid scheme was validated through Alternative Transients Program/Electromagnetic Transients Program (ATP/EMTP) simulations and was implemented as embedded software. It is currently used as a fault diagnosis tool by a southern Brazilian power distribution company.
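As a hedged illustration of the wavelet-based detection step only, the sketch below flags a disturbance when the energy of the finest-scale detail coefficients of a current window exceeds a threshold; the wavelet family, sampling rate, synthetic signal and threshold are illustrative choices, not the paper's settings.

# Sketch: wavelet-based disturbance detection on a current window (requires PyWavelets).
import numpy as np
import pywt

def detect_fault(current_window, wavelet="db4", threshold=0.5):
    coeffs = pywt.wavedec(current_window, wavelet, level=1)
    d1 = coeffs[-1]                       # finest-scale detail coefficients
    energy = np.sum(d1 ** 2)
    return energy > threshold, energy

# Synthetic 60 Hz current with a high-frequency transient injected mid-window
fs, f0 = 3840, 60
t = np.arange(0, 0.1, 1 / fs)
i_load = np.sin(2 * np.pi * f0 * t)
i_fault = i_load.copy()
i_fault[192:] += 0.8 * np.sin(2 * np.pi * 1200 * t[192:])
print(detect_fault(i_load), detect_fault(i_fault))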
Abstract:
Despite the frequent use of stepping motors in robotics, automation, and a variety of precision instruments, they can hardly be found in rotational viscometers. This paper proposes the use of a stepping motor to drive a conventional constant-shear-rate laboratory rotational viscometer, avoiding the use of a velocity sensor and a gearbox and thus simplifying the instrument design. To investigate this driving technique, a commercial rotating viscometer has been adapted to be driven by a bipolar stepping motor, which is controlled via a personal computer. Special circuitry has been added to microstep the stepping motor at selectable step sizes and to condition the torque signal. Tests have been carried out using the prototype to produce flow curves for two standard Newtonian fluids (920 and 12 560 mPa·s, both at 25 °C). The flow curves have been obtained by employing several distinct microstep sizes within the shear rate range of 50–500 s⁻¹. The results indicate the feasibility of the proposed driving technique.
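As a back-of-the-envelope sketch of the driving principle, the snippet below converts a target shear rate into a microstep pulse frequency; the linear shear-rate constant, 200 full steps per revolution and the microstep factor are illustrative assumptions rather than the adapted instrument's actual parameters.

# Sketch: choose a microstep pulse frequency for a target shear rate.
import math

def step_frequency(shear_rate, k=10.0, full_steps_per_rev=200, microsteps=16):
    """Return the microstep pulse frequency (Hz) for a target shear rate (1/s).

    k is an assumed linear instrument constant: shear rate = k * angular speed (rad/s).
    """
    omega = shear_rate / k                      # spindle angular speed, rad/s
    rev_per_s = omega / (2 * math.pi)
    return rev_per_s * full_steps_per_rev * microsteps

for gamma_dot in (50, 200, 500):                # within the shear-rate range explored above
    print(gamma_dot, "1/s ->", round(step_frequency(gamma_dot), 1), "pulses/s")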
Abstract:
This paper presents a study carried out in order to evaluate students' perception of the development and use of remote Control and Automation education kits developed by two universities. Three projects, based on real-world environments, were implemented and can be operated both locally and remotely. Students implemented the kits using their theoretical and practical knowledge, with the teachers acting as catalysts in the learning process. When the kits became operational, end-user students got acquainted with them in the curricular units of their courses. It is the authors' belief that successful results were achieved not only in the learning progress in the Automation and Control fields (hard skills) but also in the development of the students' soft skills, leading to encouraging and rewarding goals, motivating their future decisions and promoting synergies in their work. The design of experimental learning kits by students, under teacher supervision, for future use in course curricula by end-user students is an advantageous and rewarding experience.
Abstract:
In this paper we present VERITAS, a tool focused on verification, one of the most important processes in knowledge maintenance during the development of knowledge-based systems (KBS). The verification and validation (V&V) process is part of a wider process denominated knowledge maintenance, in which an enterprise systematically gathers, organizes, shares, and analyzes knowledge to accomplish its goals and mission. The V&V process establishes whether the software requirements specifications have been correctly and completely fulfilled. The methodologies proposed in software engineering have proved to be inadequate for KBS validation and verification, since KBS present some particular characteristics. VERITAS is an automatic tool developed for KBS verification which is able to detect a large number of knowledge anomalies. It addresses many relevant aspects found in real applications, such as the use of rule-triggering selection mechanisms and temporal reasoning.
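As a toy illustration of one class of knowledge anomaly such verification tools look for, the sketch below models rules as (premises, conclusion) pairs and reports circular rule chains via a depth-first search; the rule format is an assumption for illustration, not VERITAS's internal representation.

# Sketch: detect circular rule chains in a small rule base.
from collections import defaultdict

rules = [
    ({"a"}, "b"),
    ({"b"}, "c"),
    ({"c"}, "a"),       # closes a circular chain a -> b -> c -> a
    ({"d"}, "e"),
]

graph = defaultdict(set)
for premises, conclusion in rules:
    for p in premises:
        graph[p].add(conclusion)

def has_cycle(graph):
    WHITE, GREY, BLACK = 0, 1, 2
    colour = defaultdict(int)
    def visit(node):
        colour[node] = GREY
        for succ in graph[node]:
            if colour[succ] == GREY or (colour[succ] == WHITE and visit(succ)):
                return True
        colour[node] = BLACK
        return False
    return any(colour[n] == WHITE and visit(n) for n in list(graph))

print("circularity detected:", has_cycle(graph))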
Abstract:
This paper focuses on a novel formalization for assessing the five-parameter model of a photovoltaic cell. An optimization procedure is used as a feasibility problem to find the parameters, tuned at the open-circuit, maximum-power, and short-circuit points, in order to assess the data needed for plotting the I-V curve. A comparison with experimental results is presented for two monocrystalline PV modules.
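For context, the single-diode five-parameter model referred to above is I = Iph - I0*(exp((V + I*Rs)/(a*Ns*Vt)) - 1) - (V + I*Rs)/Rsh; the sketch below solves this implicit equation numerically at a few voltages. The parameter values are generic illustrative numbers for a 60-cell module, not the fitted values reported in the paper.

# Sketch: evaluate the single-diode five-parameter I-V relation (requires SciPy).
import numpy as np
from scipy.optimize import fsolve

Iph, I0, Rs, Rsh, a = 8.2, 1e-7, 0.30, 300.0, 1.3   # the five parameters (assumed values)
Ns, Vt = 60, 0.02585                                 # cells in series, thermal voltage at 25 C

def current(V):
    f = lambda I: Iph - I0 * (np.exp((V + I * Rs) / (a * Ns * Vt)) - 1) - (V + I * Rs) / Rsh - I
    return fsolve(f, Iph)[0]        # implicit equation solved numerically

for V in (0.0, 10.0, 20.0, 30.0):   # sample points along the I-V curve
    print(f"V = {V:5.1f} V  ->  I = {current(V):5.2f} A")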
Abstract:
This paper proposes a stochastic mixed-integer linear approach to deal with a short-term unit commitment problem under uncertainty in a deregulated electricity market that includes day-ahead bidding and bilateral contracts. The proposed approach considers the typical operating constraints of the thermal units and a spinning reserve. The uncertainty is due to the electricity prices, which are modeled by a scenario set, keeping the computation acceptable. Moreover, emission allowances are considered in order to account for environmental constraints. A case study is presented to illustrate the usefulness of the proposed approach, and an assessment of the cost of the spinning reserve is obtained by comparing the situations with and without spinning reserve.
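As a toy sketch of the stochastic structure only (scenario-independent commitment, per-scenario dispatch, an expected-profit objective and a spinning-reserve requirement), the snippet below builds a small model with the PuLP package; unit data, prices, probabilities and the reserve level are invented, and emissions and bilateral contracts are omitted.

# Toy two-stage stochastic unit commitment sketch (requires PuLP).
import pulp

hours = range(3)
scenarios = {"low": {"prob": 0.5, "price": [30, 35, 40]},
             "high": {"prob": 0.5, "price": [50, 60, 70]}}
units = {"U1": {"pmin": 10, "pmax": 50, "cost": 25},
         "U2": {"pmin": 20, "pmax": 80, "cost": 45}}
reserve = 20   # spinning reserve requirement (MW) in every hour

prob = pulp.LpProblem("toy_stochastic_uc", pulp.LpMaximize)
u = {(g, t): pulp.LpVariable(f"u_{g}_{t}", cat="Binary") for g in units for t in hours}
p = {(g, t, s): pulp.LpVariable(f"p_{g}_{t}_{s}", lowBound=0)
     for g in units for t in hours for s in scenarios}

# Objective: expected profit over the price scenarios
prob += pulp.lpSum(scenarios[s]["prob"] * (scenarios[s]["price"][t] - units[g]["cost"]) * p[g, t, s]
                   for g in units for t in hours for s in scenarios)

# Dispatch limits tied to the (scenario-independent) commitment decisions
for g in units:
    for t in hours:
        for s in scenarios:
            prob += p[g, t, s] >= units[g]["pmin"] * u[g, t]
            prob += p[g, t, s] <= units[g]["pmax"] * u[g, t]

# Spinning reserve: committed headroom above the dispatch in every hour and scenario
for t in hours:
    for s in scenarios:
        prob += pulp.lpSum(units[g]["pmax"] * u[g, t] - p[g, t, s] for g in units) >= reserve

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({k: int(v.value()) for k, v in u.items()})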
Abstract:
Fractional-order calculus (FOC) is as old as integer-order calculus, although until recently its application was exclusively in mathematics. Many real systems are better described by FOC differential equations, as it is a well-suited tool for analyzing problems of fractal dimension, with long-term “memory” and chaotic behavior. These characteristics have attracted engineers' interest in recent years, and FOC is now a tool used in almost every area of science. This paper introduces the fundamentals of FOC and some applications in system identification, control, mechatronics, and robotics, where it is a promising research field.
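As a small worked example of FOC in practice, the sketch below computes a fractional derivative numerically with the Grünwald-Letnikov definition and checks it against the known half-derivative of f(t) = t, which is t^0.5 / Gamma(1.5); the step size and order are illustrative choices.

# Sketch: Grünwald-Letnikov approximation of a fractional derivative.
import numpy as np
from math import gamma

def gl_fractional_derivative(f_samples, alpha, h):
    n = len(f_samples)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):                       # recurrence for (-1)^k * C(alpha, k)
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    out = np.empty(n)
    for i in range(n):
        out[i] = np.dot(w[: i + 1], f_samples[i::-1]) / h ** alpha
    return out

h, alpha = 0.001, 0.5
t = np.arange(0, 1 + h, h)
numeric = gl_fractional_derivative(t, alpha, h)       # half-derivative of f(t) = t
exact = t ** 0.5 / gamma(1.5)
print("max error:", np.max(np.abs(numeric - exact)[1:]))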