898 results for design automation of robots
Abstract:
Mobile robots provide a versatile platform for research; however, they can also provide an interesting educational platform for public exhibition at museums. In general, museums require exhibits that are both eye-catching and exciting to the public whilst requiring a minimum of maintenance time from museum technicians. In many cases it is simply not possible to continuously change batteries, and some method of supplying continuous power is required. A powered flooring system is described that is capable of providing power continuously to a group of robots. Three different museum exhibit applications are described. All three robot exhibits are of a similar basic design, although the exhibits are very different in appearance and behaviour. The durability and versatility of the robots also make them extremely good candidates for long-duration experiments such as those required by evolutionary robotics.
Abstract:
Cybernetics is a broad subject, encompassing many aspects of electrical, electronic, and computer engineering, which suffers from a lack of understanding on the part of potential applicants and teachers when recruiting students. However, once the engineering values, fascinating science, and pathways to rewarding, diverse careers are communicated, appropriate students can be very interested in enrolling. At the University of Reading, Reading, U.K., a key route for outreach to prospective students has been achieved through the use of robots in interactive talks at schools, competitions (often funded by Public Understanding of Science projects), a collectable fortnightly magazine, exhibitions in museums, open days at the University, and appearances in the media. This paper identifies the interactive engagement, anthropomorphic acceptability, and inspirational nature of robots as being key to their successful use in outreach activities. The statistical results presented show that the continued popularity of degrees at Reading in cybernetics, electronic engineering, and robotics over the last 20 years is in part due to the outreach activities to schools and the general public.
Abstract:
Purpose: Increasing costs of health care, fuelled by demand for high-quality, cost-effective healthcare, have driven hospitals to streamline their patient care delivery systems. One such systematic approach is the adoption of Clinical Pathways (CP) as a tool to increase the quality of healthcare delivery. However, most organizations still rely on paper-based pathway guidelines or specifications, which have limitations in process management and as a result can influence patient safety outcomes. In this paper, we present a method for generating clinical pathways based on organizational semiotics by capturing knowledge from the syntactic, semantic and pragmatic levels to the social level. Design/methodology/approach: The proposed modeling approach to the generation of CPs adopts organizational semiotics and enables the generation of semantically rich representations of CP knowledge. The Semantic Analysis Method (SAM) is applied to explicitly represent the semantics of the concepts, their relationships and patterns of behavior in terms of an ontology chart. The Norm Analysis Method (NAM) is adopted to identify and formally specify patterns of behavior and rules that govern the actions identified on the ontology chart. Information collected during semantic and norm analysis is integrated to guide the generation of CPs using best practice represented in BPMN, thus enabling the automation of CPs. Findings: This research confirms the necessity of taking social aspects into consideration when designing information systems and automating CPs. The complexity of healthcare processes can best be tackled by analyzing stakeholders, whom we treat as social agents, their goals and their patterns of action within the agent network. Originality/value: Current modeling methods describe CPs from a structural aspect comprising activities, properties and interrelationships. However, these methods lack a mechanism to describe possible patterns of human behavior and the conditions under which the behavior will occur. To overcome this weakness, a semiotic approach to the generation of clinical pathways is introduced. The CP generated from SAM together with norms enriches the knowledge representation of the domain through ontology modeling, which allows the recognition of human responsibilities and obligations and, more importantly, the ultimate power of decision making in exceptional circumstances.
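Norms of the kind produced by a norm analysis map naturally onto condition-action rules. The sketch below is purely illustrative (the rule, threshold and field names are invented, not taken from the paper) and only shows how such a norm could be made executable:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Norm:
    """A behavioural norm: when <condition> holds, <agent> is obliged to do <action>."""
    agent: str
    condition: Callable[[dict], bool]
    action: str

# Hypothetical norm, e.g. "if systolic blood pressure exceeds 180, the nurse
# must notify the attending physician" (illustrative only).
notify_physician = Norm(
    agent="nurse",
    condition=lambda patient: patient["systolic_bp"] > 180,
    action="notify attending physician",
)

def due_actions(norms, patient):
    """Return the obligations triggered by the current patient state."""
    return [(n.agent, n.action) for n in norms if n.condition(patient)]

print(due_actions([notify_physician], {"systolic_bp": 195}))
# -> [('nurse', 'notify attending physician')]
```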
Abstract:
The financial crisis of 2007–2009 and the resultant pressures exerted on policymakers to prevent future crises have precipitated coordinated regulatory responses globally. A key focus of the new wave of regulation is to ensure the removal of practices now deemed problematic with new controls for conducting transactions and maintaining holdings. There is increasing pressure on organizations to retire manual processes and adopt core systems, such as Investment Management Systems (IMS). These systems facilitate trading and ensure transactions are compliant by transcribing regulatory requirements into automated rules and applying them to trades. The motivation of this study is to explore the extent to which such systems may enable the alteration of previously embedded practices. We researched implementations of an IMS at eight global financial organizations and found that overall the IMS encourages responsible trading through surveillance, monitoring and the automation of regulatory rules and that such systems are likely to become further embedded within financial organizations. We found evidence that some older practices persisted. Our study suggests that the institutionalization of technology-induced compliant behaviour is still uncertain.
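The study does not describe the rule engine internals, but the idea of transcribing regulatory requirements into automated pre-trade checks can be sketched roughly as follows (the rule names, thresholds and data layout are invented for illustration):

```python
# Minimal sketch of automated pre-trade compliance checks (illustrative;
# rules and thresholds are invented, not taken from the study).

def max_issuer_exposure(portfolio, trade, limit=0.05):
    """Reject trades that would push a single issuer above 5% of the portfolio."""
    value = portfolio["holdings"].get(trade["issuer"], 0.0) + trade["value"]
    return value / portfolio["total_value"] <= limit

def no_restricted_issuers(portfolio, trade, restricted=("ISSUER_X",)):
    """Reject trades in issuers on a restricted list."""
    return trade["issuer"] not in restricted

RULES = [max_issuer_exposure, no_restricted_issuers]

def check_trade(portfolio, trade):
    """Return the names of violated rules; an empty list means the trade is compliant."""
    return [rule.__name__ for rule in RULES if not rule(portfolio, trade)]

portfolio = {"total_value": 1_000_000.0, "holdings": {"ISSUER_A": 40_000.0}}
trade = {"issuer": "ISSUER_A", "value": 20_000.0}
print(check_trade(portfolio, trade))  # -> ['max_issuer_exposure']
```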
Abstract:
Temperature, pressure, gas stoichiometry, and residence time were varied to control the yield and product distribution of the palladium-catalyzed aminocarbonylation of aromatic bromides in both a silicon microreactor and a packed-bed tubular reactor. Automation of the system set points and product sampling enabled facile and repeatable reaction analysis with minimal operator supervision. It was observed that the reaction was divided into two temperature regimes. An automated system was used to screen steady-state conditions for offline analysis by gas chromatography to fit a reaction rate model. Additionally, a transient temperature ramp method utilizing online infrared analysis was used, leading to a more rapid determination of the reaction activation energy in the lower temperature regime. The entire reaction spanning both regimes was modeled in good agreement with the experimental data.
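An activation energy extracted from rate constants measured at several temperatures typically comes from an Arrhenius fit. The following minimal sketch uses made-up rate data (not the paper's measurements) simply to show the shape of such a fit:

```python
# Illustrative Arrhenius fit of rate constants vs. temperature
# (synthetic data; not the measurements reported in the paper).
import numpy as np

R = 8.314  # J/(mol*K)
T = np.array([373.0, 393.0, 413.0, 433.0])       # temperatures, K
k = np.array([0.8e-3, 2.4e-3, 6.5e-3, 16.0e-3])  # observed rate constants, 1/s

# ln k = ln A - Ea/(R*T): a straight line in 1/T.
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R        # activation energy, J/mol
A = np.exp(intercept)  # pre-exponential factor, 1/s
print(f"Ea ~ {Ea / 1000:.1f} kJ/mol, A ~ {A:.2e} 1/s")
```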
Abstract:
Sociable robots are embodied agents that are part of a heterogeneous society of robots and humans. They should be able to recognize human beings and each other, and to engage in social interactions. The use of a robotic architecture may strongly reduce the time and effort required to construct a sociable robot. Such an architecture must have structures and mechanisms to allow social interaction, behavior control and learning from the environment. Learning processes described in the science of Behavior Analysis may lead to the development of promising methods and structures for constructing robots able to behave socially and to learn through interactions with the environment by a process of contingency learning. In this paper, we present a robotic architecture inspired by Behavior Analysis. Methods and structures of the proposed architecture, including a hybrid knowledge representation, are presented and discussed. The architecture has been evaluated in the context of a nontrivial real problem: the learning of shared attention, employing an interactive robotic head. The learning capabilities of this architecture have been analyzed by observing the robot interacting with a human and the environment. The obtained results show that the robotic architecture is able to produce appropriate behavior and to learn from social interaction.
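The contingency-learning idea, that a behavior followed by a reinforcing consequence becomes more probable, can be illustrated with a very small operant-style update rule. This is a sketch only; the action names and update rule are invented and the architecture in the paper is far richer:

```python
# Sketch of contingency (operant) learning: actions followed by a reinforcing
# consequence become more likely to be selected again. Purely illustrative.
import random

actions = ["look_at_caregiver", "follow_gaze", "look_away"]
strength = {a: 1.0 for a in actions}  # response strengths

def select_action():
    total = sum(strength.values())
    return random.choices(actions, weights=[strength[a] / total for a in actions])[0]

def reinforce(action, reward, rate=0.5):
    """Strengthen (or weaken) a response according to its consequence."""
    strength[action] = max(0.1, strength[action] + rate * reward)

for _ in range(200):
    a = select_action()
    # Hypothetical contingency: following the human's gaze is reinforced.
    reinforce(a, reward=1.0 if a == "follow_gaze" else -0.2)

print(max(strength, key=strength.get))  # most probable response after learning
```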
Abstract:
The main objective of this degree project is to implement an Application Availability Monitoring (AAM) system named Softek EnView for Fujitsu Services. The aim of implementing the AAM system is to proactively identify end user performance problems, such as application and site performance, before the actual end users experience them. No matter how well applications and sites are designed and no matter how well they meet business requirements, they are useless to the end users if the performance is slow and/or unreliable. It is important for the customers to find out whether end user problems are caused by the network or by application malfunction. Softek EnView is comprised of the following components: Robot, Monitor, Reporter, Collector and Repository. The implemented system, however, is designed to use only some of these EnView elements: Robot, Reporter and Repository. Robots can be placed at any key user location and are dedicated to customers, which means that when the number of customers increases, the number of Robots increases as well. To make the AAM system ideal for the company to use, it was integrated with Fujitsu Services' centralised monitoring system, BMC PATROL Enterprise Manager (PEM). That was in fact the reason for deciding to drop the EnView Monitor element. After the system was fully implemented, the AAM system was ready for production. Transactions were (and are) written and deployed on Robots to simulate typical end user actions. These transactions are configured to run at certain intervals, which are defined together with customers. While they are driven against customers' applications automatically, the transactions continuously collect availability and response time data. In case of a failure in a transaction, the Robot immediately quits the transaction and writes detailed information to a log file about what went wrong and which element failed while going through an application. An alert is then generated by a BMC PATROL Agent based on this data and is sent to the BMC PEM. Fujitsu Services' monitoring room receives the alert and reacts to it according to the incident management process in ITIL, alerting system specialists on critical incidents to resolve problems. As a result of the data gathered by the Robots, weekly reports, which contain detailed statistics and trend analyses of the ongoing quality of IT services, are provided for the customers.
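The Robot's behaviour, running a scripted transaction on a schedule, recording response times, and raising an alert on failure, can be sketched roughly as below. The URL, interval, and alert hook are placeholders, not the actual EnView or BMC PATROL configuration:

```python
# Rough sketch of a synthetic-transaction robot: run a scripted check on a
# schedule, log response times, and raise an alert on failure.
import logging
import time
import urllib.request

logging.basicConfig(filename="robot.log", level=logging.INFO)

def run_transaction(url="https://example.com/login"):
    """Run one scripted step and return its response time, or None on failure."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            if resp.status != 200:
                raise RuntimeError(f"unexpected HTTP status {resp.status}")
    except Exception as exc:
        logging.error("Transaction failed at %s: %s", url, exc)
        return None
    elapsed = time.monotonic() - start
    logging.info("Transaction OK in %.2f s", elapsed)
    return elapsed

def send_alert(message):
    # Placeholder for forwarding an alert to a central monitoring system.
    logging.critical("ALERT: %s", message)

if __name__ == "__main__":
    while True:
        if run_transaction() is None:
            send_alert("Synthetic transaction failed")
        time.sleep(300)  # interval agreed with the customer
```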
Abstract:
The motivation for this thesis work is the need to improve the reliability of equipment and the quality of service to railway passengers, as well as the requirement for cost-effective and efficient condition maintenance management for rail transportation. This thesis work develops a fusion of various machine vision analysis methods to achieve high performance in the automation of wooden rail track inspection. Condition monitoring in rail transport is done manually by a human operator, where people rely on inference systems and assumptions to develop conclusions. The use of condition monitoring allows maintenance to be scheduled, or other actions to be taken to avoid the consequences of failure, before the failure occurs. Manual or automated condition monitoring of materials in fields of public transportation such as railways, aerial navigation and traffic safety, where safety is of prime importance, needs non-destructive testing (NDT). In general, wooden railway sleeper inspection is done manually by a human operator, who moves along the rail sleeper and gathers information through visual and sound analysis to examine the presence of cracks. Human inspectors working on the lines visually inspect wooden rails to judge the quality of the rail sleepers. In this project work, a machine vision system is developed based on the manual visual analysis procedure, using digital cameras and image processing software to perform similar inspections. Manual inspection requires much effort, is sometimes error prone, and can be difficult even for a human operator because of frequent changes in the inspected material. The machine vision system developed classifies the condition of the material by examining individual pixels of images, processing them and attempting to develop conclusions with the assistance of knowledge bases and features. A pattern recognition approach is developed based on the methodological knowledge from the manual procedure, and is realised through a non-destructive testing method that identifies the flaws found in manually performed condition monitoring of sleepers. In this method, a test vehicle is designed to capture sleeper images in a manner similar to visual inspection by a human operator, and the raw data for the pattern recognition approach are provided by the captured images of the wooden sleepers. The data from the NDT method were further processed and appropriate features were extracted. The collection of data by the NDT method aims to achieve high accuracy and reliable classification results. A key idea is to use an unsupervised classifier, based on the features extracted from the method, to discriminate the condition of wooden sleepers into either good or bad. A self-organising map is used as the classifier for the wooden sleeper classification. In order to achieve greater integration, the data collected by the machine vision system were made to interface with one another through a strategy called fusion. Data fusion was examined at two different levels, namely sensor-level fusion and feature-level fusion. As the goal was to reduce the impact of human error on classifying rail sleepers as good or bad, the results obtained by feature-level fusion, compared with the actual classification, were satisfactory.
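A self-organising map trained on image-derived features and then labelled good/bad per node can be sketched in a few lines. The two synthetic features and the tiny 1-D map below are a simplification of the thesis pipeline, used only to show the mechanism:

```python
# Minimal self-organising map sketch for two-class condition grading
# (synthetic crack-related features; a simplification of the thesis pipeline).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic feature vectors: [edge density, dark-area ratio]
good = rng.normal([0.2, 0.1], 0.05, size=(50, 2))
bad = rng.normal([0.7, 0.6], 0.05, size=(50, 2))
data = np.vstack([good, bad])

# 1-D SOM with 4 nodes.
weights = rng.random((4, 2))
for epoch in range(100):
    lr = 0.5 * (1 - epoch / 100)
    for x in rng.permutation(data):
        bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
        # Update the best-matching unit and its immediate neighbours.
        for j in range(4):
            if abs(j - bmu) <= 1:
                weights[j] += lr * (x - weights[j])

# Label each node by majority vote of the training samples it wins.
labels = []
for j in range(4):
    wins = [i for i, x in enumerate(data)
            if int(np.argmin(np.linalg.norm(weights - x, axis=1))) == j]
    labels.append("bad" if np.mean([i >= 50 for i in wins] or [0]) > 0.5 else "good")

sample = np.array([0.65, 0.55])
print(labels[int(np.argmin(np.linalg.norm(weights - sample, axis=1)))])  # -> 'bad'
```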
Abstract:
This thesis project is part of the all-round automation of production of the concentrating solar PV/T system Absolicon X10. ABSOLICON Solar Concentrator AB has invented and started production of the prospective concentrated solar system Absolicon X10. The aims of this thesis project are designing, assembling, calibrating and putting into operation an automatic measurement system intended to evaluate the shape of the concentrating parabolic reflectors. On the basis of the requirements of the company administration and the needs of the real production process, the operating conditions for the Laser-testing rig were formulated, and the basic concept of using laser radiation was defined. In the first step, the overall design of the whole system was made and its division into parts was defined. After preliminary simulations, the function and operating conditions of all parts were formulated. In the next steps, the detailed design of all parts was carried out. Most components were ordered from respective companies, and some of the mechanical components were made in the company's workshop. All parts of the Laser-testing rig were assembled and tested. The software that controls the Laser-testing rig was created in LabVIEW; to tune and test the software, a special simulator was designed and assembled. When all parts were assembled into the complete system, the Laser-testing rig was tested, calibrated and tuned. In the workshop of Absolicon AB, trial measurements were conducted, and the Laser-testing rig was installed in the production line at the plant in Soleftea.
Abstract:
The development of robots has shown itself to be a very complex interdisciplinary research field. The predominant procedure for these developments in recent decades has been based on the assumption that each robot is a fully personalized project, with the direct embedding of hardware and software technologies in robot parts with no level of abstraction. Although this methodology has brought countless benefits to robotics research, it has also imposed major drawbacks: (i) the difficulty of reusing hardware and software parts in new robots or new versions; (ii) the difficulty of comparing the performance of different robot parts; and (iii) the difficulty of adapting development needs, at the hardware and software levels, to the expertise of local groups. Large advances might be reached, for example, if physical parts of a robot could be reused in a different robot constructed with other technologies by another researcher or group. This paper proposes a framework for robots, TORP (The Open Robot Project), that aims to put forward a standardization in all dimensions (electrical, mechanical and computational) of a shared robot development model. This architecture is based on the dissociation between the robot and its parts, and between the robot parts and their technologies. In this paper, the first specification for a TORP family and the first humanoid robot constructed following the TORP specification set are presented, as well as the advances proposed for their improvement.
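The dissociation between a robot and its parts, and between parts and their technologies, is essentially an interface contract. The following tiny sketch is hypothetical (it is not the actual TORP specification) and only illustrates the design idea:

```python
# Hypothetical sketch of part/technology dissociation via interfaces
# (not the actual TORP specification).
from abc import ABC, abstractmethod

class DistanceSensor(ABC):
    """Robot-facing contract: any technology can sit behind it."""
    @abstractmethod
    def read_m(self) -> float: ...

class UltrasonicSensor(DistanceSensor):
    def read_m(self) -> float:
        return 1.25  # would wrap a vendor-specific driver here

class LidarSensor(DistanceSensor):
    def read_m(self) -> float:
        return 1.27  # different technology, same contract

class Robot:
    """The robot depends only on the contract, never on the technology."""
    def __init__(self, sensor: DistanceSensor):
        self.sensor = sensor
    def obstacle_ahead(self, threshold=0.5) -> bool:
        return self.sensor.read_m() < threshold

print(Robot(UltrasonicSensor()).obstacle_ahead())
print(Robot(LidarSensor()).obstacle_ahead())
```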
Abstract:
This article addresses philosophical problems concerning the nature of intentionality and mental representation. The first part presents a brief history of these problems, quickly reviewing some episodes from classical and contemporary philosophy. The second part examines the Chinese Room Argument formulated by J. Searle. The third part develops some arguments intended to show the inadequacy of the functionalist model of mind for the construction of robots. The conclusion (fourth part) points out some alternatives to the traditional functionalist model, such as connectionism.
Abstract:
This paper describes the implementation of a multi-interface module (I2M) for the automation of industrial processes, based on the IEEE 1451 standard. Process automation with I2M can communicate either through wires or using wireless communication, without any hardware or software changes. FPGA resources were used to implement the I2M functions, with a NIOS II processor and a ZigBee communication system (IEEE 802.15), as well as the RS232 serial standard. Part of the project was done in the SOPC Builder environment, which gave the designer flexibility and speed in implementing the NIOS II-based microprocessor system. To test the I2M implementation, a didactic Industrial Hydraulic Module (MHI-01) was used to simulate two industrial processes to be controlled by the proposed system.
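The key property claimed above, switching between wired and wireless links without hardware or software changes, amounts to hiding the transport behind a common interface. The following loose, high-level sketch conveys that idea in Python rather than the NIOS II firmware, with invented class and method names:

```python
# Loose, high-level sketch of a transport-independent interface, analogous to
# the I2M's wired/wireless interchangeability (names invented; the actual
# module is NIOS II firmware on an FPGA).
from abc import ABC, abstractmethod

class Transport(ABC):
    @abstractmethod
    def send(self, payload: bytes) -> None: ...

class SerialTransport(Transport):          # e.g. RS232
    def send(self, payload: bytes) -> None:
        print("RS232  ->", payload.hex())

class ZigBeeTransport(Transport):          # e.g. ZigBee radio
    def send(self, payload: bytes) -> None:
        print("ZigBee ->", payload.hex())

class TransducerInterfaceModule:
    """The application layer never changes when the transport is swapped."""
    def __init__(self, transport: Transport):
        self.transport = transport
    def report_sample(self, channel: int, value: int) -> None:
        self.transport.send(bytes([channel]) + value.to_bytes(2, "big"))

for transport in (SerialTransport(), ZigBeeTransport()):
    TransducerInterfaceModule(transport).report_sample(channel=1, value=512)
```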
Abstract:
In this article, an implementation of structural health monitoring process automation based on vibration measurements is proposed. The work presents an alternative approach whose intent is to exploit the capability of model updating techniques, associated with neural networks, to be used in the automation of fault detection. The updating procedure supplies a reliable model which makes it possible to simulate any damage condition in order to establish a direct correlation between faults and deviations in the response of the model. The ability of the neural networks to recognize, from a known signature, changes in the actual data of a model in real time is explored to investigate changes in the actual operating conditions of the system. The learning of the network is performed using a compressed spectrum signal created for each specific type of fault. Different fault conditions for a frame structure are evaluated using simulated data as well as measured experimental data.
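The core mechanism, training a network on compressed spectral signatures of simulated fault conditions and then assigning measured responses to the closest known condition, can be sketched with a small classifier. The spectra below are synthetic, and scikit-learn's MLPClassifier stands in for the model and network used in the article:

```python
# Sketch: classify fault conditions from compressed spectrum signatures.
# Synthetic data; MLPClassifier is a stand-in for the article's network.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
conditions = ["healthy", "fault_A", "fault_B"]

def compressed_spectrum(condition, n_bins=16):
    """Synthetic stand-in for a compressed vibration-spectrum signature."""
    base = {"healthy": 1.0, "fault_A": 0.7, "fault_B": 1.4}[condition]
    spectrum = base * np.linspace(1.0, 0.2, n_bins)
    return spectrum + rng.normal(0.0, 0.02, n_bins)

X = np.array([compressed_spectrum(c) for c in conditions for _ in range(40)])
y = np.array([c for c in conditions for _ in range(40)])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)

# A "measured" response is assigned to the closest known signature class.
print(clf.predict([compressed_spectrum("fault_B")]))
```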
Abstract:
The constant increase in digital systems complexity definitely demands the automation of the corresponding synthesis process. This paper presents a computational environment designed to produce both software and hardware implementations of a system. The tool for code generation has been named ACG8051. As for the hardware synthesis, a larger environment has been produced, consisting of four programs, namely PIPE2TAB, AGPS, TABELA and TAB2VHDL. ACG8051 and PIPE2TAB use place/transition net descriptions from PIPE as inputs. ACG8051 is aimed at generating assembly code for the 8051 micro-controller. PIPE2TAB produces a tabular version of a Mealy-type finite state machine of the system; its output is fed into AGPS, which is used for state allocation. The resulting digital system is then input to TABELA, which minimizes the control functions and outputs of the digital system. Finally, the output generated by TABELA is fed to TAB2VHDL, which produces a VHDL description of the system at the register transfer level. Thus, we present here a set of tools designed to take a high-level description of a digital system, represented by a place/transition net, and produce as output both assembly code that can be run immediately on an 8051 micro-controller and a VHDL description that can be used to implement the hardware parts directly, either on an FPGA or as an ASIC.
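The tabular Mealy-machine view produced along this flow can be pictured as a simple (state, input) to (next state, output) table. The toy machine below is an invented example, not the tools' actual file format, and only illustrates the idea:

```python
# Toy tabular Mealy machine, illustrating the kind of state/input ->
# (next state, output) table that the hardware-synthesis flow works on.
# The machine and its encoding are invented, not the tools' format.
TABLE = {
    # (state, input): (next_state, output)
    ("S0", 0): ("S0", 0),
    ("S0", 1): ("S1", 0),
    ("S1", 0): ("S0", 0),
    ("S1", 1): ("S1", 1),   # output 1 on the second consecutive 1
}

def run(inputs, state="S0"):
    outputs = []
    for bit in inputs:
        state, out = TABLE[(state, bit)]
        outputs.append(out)
    return outputs

print(run([1, 1, 0, 1, 1, 1]))  # -> [0, 1, 0, 0, 1, 1]
```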