898 results for design automation of robots


Relevance:

30.00%

Publisher:

Abstract:

In this paper, we study binary differential equations a(x, y)dy² + 2b(x, y)dxdy + c(x, y)dx² = 0, where a, b, and c are real analytic functions. Following the geometric approach of Bruce and Tari in their work on the multiplicity of implicit differential equations, we introduce a definition of the index for this class of equations that coincides with the classical definition of Hopf for positive binary differential equations. Our results also apply to implicit differential equations F(x, y, p) = 0, where F is an analytic function, p = dy/dx, F_p = 0, and F_pp ≠ 0 at the singular point. For these equations, we relate the index of the equation at the singular point to the index of the gradient of F and the index of the 1-form ω = dy - p dx defined on the singular surface F = 0.
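For readability, the objects named in this abstract can be written out explicitly; the subscript notation F_p and F_pp for the partial derivatives of F with respect to p is assumed from context, and nothing beyond the abstract is added.

```latex
% Restatement of the equations named in the abstract (subscript notation assumed).
\[
  a(x,y)\,dy^{2} + 2\,b(x,y)\,dx\,dy + c(x,y)\,dx^{2} = 0
\]
\[
  F(x,y,p) = 0, \qquad p = \frac{dy}{dx}, \qquad
  F_{p} = 0, \quad F_{pp} \neq 0 \ \text{at the singular point}
\]
\[
  \omega = dy - p\,dx \quad \text{on the singular surface } F = 0
\]
```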

Relevance:

30.00%

Publisher:

Abstract:

The issue of how children learn the meaning of words is fundamental to developmental psychology. Recent attempts to develop or evolve efficient communication protocols among interacting robots or virtual agents have also brought that issue to a central place in more applied research fields, such as computational linguistics and neural networks. An attractive approach to learning an object-word mapping is so-called cross-situational learning. This learning scenario is based on the intuitive notion that a learner can determine the meaning of a word by finding something in common across all observed uses of that word. Here we show how the deterministic Neural Modeling Fields (NMF) categorization mechanism can be used by the learner as an efficient algorithm to infer the correct object-word mapping. To achieve that, we first reduce the original on-line learning problem to a batch learning problem in which the inputs to the NMF mechanism are all possible object-word associations that could be inferred from the cross-situational learning scenario. Since many of those associations are incorrect, they are treated as clutter or noise and discarded automatically by a clutter detector model included in our NMF implementation. With these two key ingredients - batch learning and clutter detection - the NMF mechanism was capable of perfectly inferring the correct object-word mapping. © 2009 Elsevier Ltd. All rights reserved.
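The cross-situational idea can be illustrated with a minimal co-occurrence baseline; this is only a sketch of the learning scenario, not the NMF mechanism with clutter detection described in the abstract, and the scenes and words below are invented.

```python
def cross_situational_map(scenes):
    """Infer an object-word mapping from (objects, words) scenes.

    Each scene is a pair (set_of_objects, set_of_words). A word is mapped to
    the object that co-occurs with it in every scene where the word appears;
    co-occurrences that do not survive all scenes act as clutter and are dropped.
    """
    candidates = {}  # word -> set of objects still compatible with it
    for objects, words in scenes:
        for w in words:
            if w not in candidates:
                candidates[w] = set(objects)
            else:
                candidates[w] &= set(objects)  # keep only the common objects
    # A word is resolved when exactly one candidate object remains.
    return {w: next(iter(objs)) for w, objs in candidates.items() if len(objs) == 1}

# Toy example: three observed scenes (objects present, words heard).
scenes = [
    ({"ball", "dog"}, {"bola", "cachorro"}),
    ({"ball", "cat"}, {"bola", "gato"}),
    ({"dog", "cat"}, {"cachorro", "gato"}),
]
print(cross_situational_map(scenes))
# {'bola': 'ball', 'cachorro': 'dog', 'gato': 'cat'}
```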

Relevance:

30.00%

Publisher:

Abstract:

This work describes the development and optimization of a sequential injection method to automate the determination of paraquat by square-wave voltammetry employing a hanging mercury drop electrode. Automation by sequential injection enhanced the sampling throughput, improving the sensitivity and precision of the measurements as a consequence of the highly reproducible and efficient conditions of mass transport of the analyte toward the electrode surface. For instance, 212 analyses can be made per hour if the sample/standard solution is prepared off-line and the sequential injection system is used only to inject the solution into the flow cell. In-line sample conditioning reduces the sampling frequency to 44 h⁻¹. Experiments were performed in 0.10 M NaCl, which was the carrier solution, using a frequency of 200 Hz, a pulse height of 25 mV, a potential step of 2 mV, and a flow rate of 100 μL s⁻¹. For a concentration range between 0.010 and 0.25 mg L⁻¹, the current (ip, μA) read at the potential corresponding to the peak maximum fitted the following linear equation with the paraquat concentration (mg L⁻¹): ip = (-20.5 ± 0.3)Cparaquat - (0.02 ± 0.03). The limits of detection and quantification were 2.0 and 7.0 μg L⁻¹, respectively. The accuracy of the method was evaluated by recovery studies using spiked water samples, which were also analyzed by molecular absorption spectrophotometry after reduction of paraquat with sodium dithionite in an alkaline medium. No evidence of statistically significant differences between the two methods was observed at the 95% confidence level.
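Using the mean calibration coefficients reported above, a measured peak current can be converted back into a paraquat concentration; the sketch below is only illustrative and the example reading is hypothetical.

```python
# Convert a measured peak current into a paraquat concentration using the
# calibration reported in the abstract: ip = -20.5 * C - 0.02 (ip in uA, C in mg/L).
# Illustrative sketch only; the example reading below is hypothetical.

SLOPE = -20.5      # uA per (mg/L), mean value from the abstract
INTERCEPT = -0.02  # uA, mean value from the abstract
LOQ = 0.007        # mg/L (7.0 ug/L)

def paraquat_concentration(peak_current_uA: float) -> float:
    """Invert the linear calibration to obtain the concentration in mg/L."""
    return (peak_current_uA - INTERCEPT) / SLOPE

ip = -2.10  # example reading in uA (hypothetical)
conc = paraquat_concentration(ip)
print(f"{conc:.3f} mg/L", "(above LOQ)" if conc >= LOQ else "(below LOQ)")
```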

Relevance:

30.00%

Publisher:

Abstract:

A method for the direct analysis, with minimal sample pretreatment, of the antidepressant drugs fluoxetine, imipramine, desipramine, amitriptyline, and nortriptyline in biofluids was developed, with a total run time of 8 min. The setup consists of two HPLC pumps, an injection valve, a capillary RAM-ADS-C18 pre-column and a capillary analytical C18 column connected by means of a six-port valve in backflush mode. Detection was performed with ESI-MS/MS and only 1 μL of sample was injected. Validation was adequately carried out using FLU-d5 as the internal standard. Calibration curves were constructed over a linear range of 1-250 ng mL⁻¹ in plasma, with the limit of quantification (LOQ) determined as 1 ng mL⁻¹ for all the analytes. With the described approach it was possible to reach a quantified mass sensitivity of 0.3 pg for each analyte (equivalent to 1.1-1.3 fmol), translating to a lower sample consumption (on the order of 10³ less sample than conventional methods). © 2008 Elsevier B.V. All rights reserved.
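The reported mass sensitivity can be sanity-checked with a unit conversion from picograms to femtomoles; the molecular weights below are approximate free-base values supplied here for illustration and are not taken from the paper.

```python
# Sanity check of the reported mass sensitivity: 0.3 pg of each analyte expressed
# in femtomoles. Molecular weights are approximate free-base values assumed for
# this illustration, not figures from the paper.
MW_G_PER_MOL = {
    "fluoxetine": 309.3,
    "imipramine": 280.4,
    "desipramine": 266.4,
    "amitriptyline": 277.4,
    "nortriptyline": 263.4,
}

mass_pg = 0.3
for analyte, mw in MW_G_PER_MOL.items():
    fmol = mass_pg / mw * 1000  # pg divided by g/mol gives pmol; x1000 gives fmol
    print(f"{analyte}: {fmol:.2f} fmol")
```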

Relevance:

30.00%

Publisher:

Abstract:

Expressing contractual agreements electronically potentially allows agents to automatically perform functions surrounding contract use: establishment, fulfilment, renegotiation etc. For such automation to be used for real business concerns, there needs to be a high level of trust in the agent-based system. While there has been much research on simulating trust between agents, there are areas where such trust is harder to establish. In particular, contract proposals may come from parties that an agent has had no prior interaction with and, in competitive business-to-business environments, little reputation information may be available. In human practice, trust in a proposed contract is determined in part from the content of the proposal itself, and the similarity of the content to that of prior contracts, executed to varying degrees of success. In this paper, we argue that such analysis is also appropriate in automated systems, and to provide it we need systems to record salient details of prior contract use and algorithms for assessing proposals on their content. We use provenance technology to provide the former and detail algorithms for measuring contract success and similarity for the latter, applying them to an aerospace case study.
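One simple way to realize this kind of content-based assessment, sketched below with invented contract terms and success scores (and not the algorithms defined in the paper), is to weight the recorded success of prior contracts by their similarity to the new proposal.

```python
def jaccard(a: set, b: set) -> float:
    """Similarity between two sets of contract terms (0..1)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def content_based_trust(proposal_terms, prior_contracts):
    """Similarity-weighted average of prior contract success scores.

    prior_contracts: list of (terms: set, success: float in 0..1), e.g. taken
    from a provenance store of past contract executions.
    """
    weights = [jaccard(proposal_terms, terms) for terms, _ in prior_contracts]
    if sum(weights) == 0:
        return None  # nothing comparable on record
    return sum(w * s for w, (_, s) in zip(weights, prior_contracts)) / sum(weights)

# Hypothetical records: contract content is simplified to sets of clause identifiers.
history = [
    ({"delivery_90d", "penalty_5pct", "payment_net30"}, 0.9),
    ({"delivery_30d", "penalty_10pct", "payment_net60"}, 0.4),
]
proposal = {"delivery_90d", "penalty_5pct", "payment_net60"}
print(content_based_trust(proposal, history))  # ~0.76: dominated by the similar, successful contract
```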

Relevance:

30.00%

Publisher:

Abstract:

Drinking water utilities in urban areas are focused on finding smart solutions to new challenges in their real-time operation: limited water resources, intensive energy requirements, a growing population, a costly and ageing infrastructure, increasingly stringent regulations, and increased attention to the environmental impact of water use. Such challenges force water managers to monitor and control not only water supply and distribution, but also consumer demand. This paper presents and discusses novel methodologies and procedures towards an integrated water resource management system based on advanced ICT for automation and telecommunications, aimed at largely improving the efficiency of drinking water networks (DWN) in terms of water use, energy consumption, water loss minimization, and water quality guarantees. In particular, the paper addresses the first results of the European project EFFINET (FP7-ICT2011-8-318556), devoted to the monitoring and control of the DWN in Barcelona (Spain). Results are split into two levels according to different management objectives: (i) the monitoring level is concerned with all the aspects involved in observing the current state of the system and detecting/diagnosing abnormal situations; it is achieved through sensors and communications technology, together with mathematical models; (ii) the control level is concerned with computing the best suitable and admissible control strategies for the network actuators so as to optimize a given set of operational goals related to the performance of the overall system. This level covers network control (optimal management of water and energy) and demand management (smart metering, efficient supply). The consideration of the Barcelona DWN as the case study will make it possible to prove the general applicability of the proposed integrated ICT solutions and their effectiveness in the management of DWNs, with considerable savings in electricity costs and reduced water loss while ensuring the high European standards of water quality for citizens.
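As a toy illustration of the kind of optimization performed at the control level (not the EFFINET formulation; tariffs, capacities and demand are invented), consider scheduling how much water to pump in each tariff period so that daily demand is met at minimum energy cost.

```python
# Toy pump-scheduling linear program: meet a daily demand at minimum electricity
# cost, given per-period tariffs and a pump capacity. Numbers are invented and the
# model is deliberately minimal; it only illustrates the control-level optimization.
from scipy.optimize import linprog

tariff = [0.06, 0.12, 0.09]        # cost per m3 pumped in each period (night/day/evening)
capacity = [500.0, 500.0, 500.0]   # max m3 the pumps can move per period
demand_total = 900.0               # m3 that must be supplied over the day

res = linprog(
    c=tariff,                                      # minimize total pumping cost
    A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand_total],   # pumped volumes must cover demand
    bounds=[(0.0, cap) for cap in capacity],
    method="highs",
)
print(res.x, res.fun)  # pumps run at full capacity in the cheapest periods
```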

Relevance:

30.00%

Publisher:

Abstract:

The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues that were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised.

The core of the proposed solution is based on the joint application of two homonymous technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment that aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework that includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration.

The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the object-oriented framework foundations allowed a series of improvements that were not available in previous approaches:

- Object-oriented frameworks are extensible by design, and this also holds for the implemented sets of design data primitives and design tool building blocks. Both the design representation model and the software modules dealing with it can therefore be upgraded or adapted to a particular design methodology, and such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of the consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the resulting event to the semantic model, which evaluates whether a state change is possible and, if so, triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his or her design views. The information about each interaction is encapsulated in an event object, which may be propagated to the design semantics, and thus to other views, according to the consistency policy in use. Furthermore, the use of event pools allows late synchronization between view and semantics when no network connection is available between them (an illustrative sketch of this mechanism is given below).
- The use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. Connecting to remote tools and services through a look-up protocol also completely abstracts the network location of such resources, allowing resources to be added and removed at runtime.
- The implemented CAD Framework is entirely based on Java technology, so it relies on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.

All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained, event-based collaboration, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experience among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches.

Three case studies were conducted to validate the proposed approach, each focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform based on reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions (design representation primitives and tool blocks) are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study addresses the integration of multimedia metadata into the design data model, explored in the frame of an online educational and training platform.
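The inversion of control and the event pool mentioned above can be sketched as follows; Cave2 itself is implemented in Java, so this Python fragment is only a language-agnostic illustration with invented class and attribute names.

```python
# Minimal sketch of the inversion of control described above: views forward user
# interactions as event objects to the semantic model, which decides whether the
# state change is allowed and, if so, refreshes every registered view. An event
# pool buffers events while a view is disconnected (late synchronization).
# Illustrative only; not the Java implementation of Cave2.
from dataclasses import dataclass

@dataclass
class DesignEvent:
    attribute: str
    new_value: object

class SemanticModel:
    def __init__(self):
        self.state, self.views = {}, []

    def register(self, view):
        self.views.append(view)

    def propose(self, event: DesignEvent) -> bool:
        """Evaluate an event coming from a view; apply and broadcast if valid."""
        if event.new_value is None:        # placeholder consistency rule
            return False
        self.state[event.attribute] = event.new_value
        for view in self.views:
            view.refresh(event)            # multi-view consistency
        return True

class View:
    def __init__(self, name, model: SemanticModel):
        self.name, self.model = name, model
        self.pending = []                  # event pool for offline operation
        self.online = True
        model.register(self)

    def user_input(self, attribute, value):
        event = DesignEvent(attribute, value)
        if self.online:
            self.model.propose(event)      # control is inverted: the model decides
        else:
            self.pending.append(event)     # buffered for late synchronization

    def synchronize(self):
        self.online = True
        while self.pending:
            self.model.propose(self.pending.pop(0))

    def refresh(self, event: DesignEvent):
        print(f"[{self.name}] {event.attribute} -> {event.new_value}")

model = SemanticModel()
schematic, layout = View("schematic", model), View("layout", model)
schematic.user_input("cell_height", 12)    # both views are refreshed
```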

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, the development of intelligent agents aims to be more refined, using improved architectures and reasoning mechanisms. Revising the beliefs of an agent is also an important subject, owing to the consistency that agents should maintain over their knowledge. In this work we propose deliberative and argumentative agents using Lego Mindstorms robots: Argumentative NXT BDI-like Agents. These agents are built using the notions of the BDI model and are capable of reasoning using the DeLP formalism. They update their knowledge base with their perceptions and revise it when necessary. Two variations are presented: the Single Argumentative NXT BDI-like Agent and the MAS Argumentative NXT BDI-like Agent.
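A skeleton of the perceive-revise-deliberate-act cycle such agents follow is sketched below; the belief revision and deliberation steps are naive stand-ins, not the DeLP formalism used in the paper, and all names are illustrative.

```python
# Skeleton of a BDI-like control loop: perceive, revise beliefs, deliberate over
# desires, then act. The revision and deliberation steps are simplistic stubs
# (the agents in the paper use DeLP, which is not reproduced here).
class BDILikeAgent:
    def __init__(self, desires):
        self.beliefs = set()
        self.desires = desires          # e.g. ["reach_target", "avoid_obstacle"]
        self.intention = None

    def perceive(self, percepts):
        return set(percepts)            # e.g. readings from the NXT sensors

    def revise(self, percepts):
        # Naive revision: drop beliefs contradicted by fresh percepts, add the rest.
        self.beliefs = {b for b in self.beliefs if ("not_" + b) not in percepts}
        self.beliefs |= percepts

    def deliberate(self):
        # Pick the first desire supported by current beliefs (a stand-in for
        # defeasible argumentation).
        for desire in self.desires:
            if desire + "_possible" in self.beliefs:
                return desire
        return None

    def step(self, raw_percepts):
        self.revise(self.perceive(raw_percepts))
        self.intention = self.deliberate()
        return self.intention           # would be mapped to motor commands

agent = BDILikeAgent(["reach_target", "avoid_obstacle"])
print(agent.step({"reach_target_possible", "obstacle_ahead"}))  # -> reach_target
```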

Relevance:

30.00%

Publisher:

Abstract:

SOUZA, Anderson A. S. ; SANTANA, André M. ; BRITTO, Ricardo S. ; GONÇALVES, Luiz Marcos G. ; MEDEIROS, Adelardo A. D. Representation of Odometry Errors on Occupancy Grids. In: INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, 5., 2008, Funchal, Portugal. Proceedings... Funchal, Portugal: ICINCO, 2008.

Relevance:

30.00%

Publisher:

Abstract:

The aim of the study was to evaluate the production of four tomato cultivars of determinate growth, in the field, with and without ground cover. The cultivars examined were 'AP-533', 'AP-529', 'Hypeel 513' and 'PS-41816', cultivated with a spacing of 1 m between rows and 0.4 m between plants. A split-plot experimental design was used, in which the plots corresponded to cultivation with and without black polyethylene ground cover and the subplots to the cultivars, with eight replications. The eight central plants in each plot were evaluated for the following characteristics: fresh weight of fruit per plant and number of fruit per plant, separated by size. For classification, the fruit from the same plants were divided into classes based on diameter: class I, 3 cm; class II, 4 cm; class III, 4.5 cm; and class IV, 5.5 cm. The results showed no significant differences in production among the cultivars studied. Cultivars 'AP-533' and 'Hypeel 513' demonstrated greater production per plant when ground cover was used, yielding 1474 and 1404 g/plant, respectively. When black polyethylene ground cover was not used, these cultivars produced 1258 and 1271 g of fruit per plant, respectively. The cultivars 'AP-529' and 'PS-41816' showed a greater yield of fruit when cultivated without ground cover, with 1548 and 1663 g/plant, respectively. The cultivar 'Hypeel 513' had the highest percentage of fruit of the largest dimensions. An interaction was observed between 'AP-529' and ground cover, where the production of this cultivar was lower with black polyethylene ground cover than on bare ground. It is concluded that there are no significant differences in the cultivation of the tomato hybrids 'AP-533', 'Hypeel 513' and 'PS-41816' with and without ground cover, but that the cultivar 'AP-529' should preferentially be planted in soil without black polyethylene cover.

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Informatics and automation are important tools for the reduction of work, errors and costs in a hospital pharmacy. OBJECTIVES: To describe the structuring and functioning of a computerized system for the dispensing of medications and to assess its effect on nursing and pharmacy services during the period from 1997 to 2003. MATERIALS AND METHODS: In this descriptive and retrospective study, we analyzed documents addressing the structuring and implementation of the computerized medication dispensing system. In addition, we analyzed the perceptions of nurses, pharmacists and pharmacy assistants who participated in the structuring phase of the system, interviewing them about the effect of computerization on administrative aspects (e.g., requisition of medications, presentation of the dispensed medication and system operation). RESULTS: The major advantages provided by the new system were 1) the elimination of manual transcription of prescribed medications, 2) increased speed, 3) better identification of the doses prescribed by physicians, 4) medication labels containing all necessary identification and 5) the practicality and safety of optical bar code-based verification of the requested and dispensed medications. CONCLUSIONS: The great majority of the interviewees considered the computerized medication supply system to be of good quality. Analysis of the data provided information that could contribute to the expansion and refinement of the system, support studies on the utilization of medications and offer new perspectives on work and productivity.

Relevance:

30.00%

Publisher:

Abstract:

This work presents the design, simulation, and analysis of two optical interconnection networks for a Dataflow parallel computer architecture. To verify the performance of the optical interconnection networks on the Dataflow architecture, we analyzed the load balancing among the processors during the execution of parallel programs. Load balancing is a very important parameter because it is directly associated with the degree of dataflow parallelism. This article shows that optical interconnection networks designed with simple optical devices can efficiently meet the communication requirements of a high-performance dataflow system.
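Load balancing of the kind analyzed here is often summarized by a single imbalance ratio; the metric below is a common illustrative choice, not necessarily the one used in the article.

```python
# One simple way to quantify load balancing: the ratio between the most loaded
# processor and the average load. A ratio of 1.0 means perfect balance.
def load_imbalance(work_per_processor):
    mean = sum(work_per_processor) / len(work_per_processor)
    return max(work_per_processor) / mean if mean else float("inf")

# Hypothetical work counts executed by 4 dataflow processing elements.
print(load_imbalance([1050, 980, 1010, 960]))  # -> 1.05, i.e. well balanced
```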

Relevance:

30.00%

Publisher:

Abstract:

This work aims to develop an intelligent system for detecting burning in the tangential surface grinding process through the use of a multilayer perceptron neural network, trained to generalize the process and, consequently, obtain the burning threshold. In general, the occurrence of burning in the grinding process can be detected by the DPO and FKS parameters; however, these parameters are not effective under the machining conditions used in this work. The acoustic emission signal and the electric power of the grinding wheel drive motor are the input variables, and the output variable is the occurrence of burning. In the experimental work, one type of steel (quenched ABNT 1045) and one type of grinding wheel, designated TARGA, model ART 3TG80.3 NVHB, were employed.
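A minimal sketch of such a classifier is given below: a multilayer perceptron fed with acoustic emission and drive power features deciding whether burning occurred. The data, feature values and layer sizes are invented and do not reproduce the network trained in this work.

```python
# Illustrative sketch of the kind of classifier described above: an MLP fed with
# acoustic-emission and drive-motor power features to decide whether grinding burn
# occurred. Synthetic data only; not the network trained in the work.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic feature vectors per grinding pass: [acoustic emission RMS, spindle power].
normal = rng.normal(loc=[0.4, 1.0], scale=0.05, size=(100, 2))
burn = rng.normal(loc=[0.8, 1.6], scale=0.05, size=(100, 2))
X = np.vstack([normal, burn])
y = np.array([0] * 100 + [1] * 100)       # 0 = no burn, 1 = burn

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X, y)

print(clf.predict([[0.45, 1.05], [0.82, 1.55]]))  # expected: [0 1]
```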

Relevance:

30.00%

Publisher:

Abstract:

This research investigated the underuse of technological tools by innovative organizations that are acknowledged for their use of and familiarity with new technologies. The research analyzed 58 institutional repositories (IRs) from 43 educational and research institutions that are internationally renowned for excellence. The core aspect of the analysis was the use of IRs for publishing and handling evidence in order to legitimize and add value to scientific research. The following items were analyzed: (a) the logical structuring of the scientific communication published in the IRs; (b) the metadata describing the scientific communication, based on the terms of the DCMI protocol used; (c) the availability of software functions that facilitate the querying and publication of evidence. Results show that the introduction of IRs did not add value to the quality of research in terms of associating and publishing evidence that could back it up. A strong tendency to replicate the traditional library model of physical collections was observed. It was concluded that merely possessing good technological tools is not sufficient for fostering innovation and strategic gains in organizations, even when their implementation takes place in highly promising and favorable environments.

Relevance:

30.00%

Publisher:

Abstract:

The prevalence and dissemination of Salmonella in a Brazilian poultry slaughterhouse were evaluated by three rapid detection systems (SS/SV(TM), VICAM; OSRT(TM), Unipath/Oxoid; and REVEAL(TM), Neogen), in addition to the conventional procedure. The carcasses were sampled after bleeding (P1), defeathering (P2), evisceration (P3), washing (P4), chilling (P5) and in the packaged end-product (P6). In the first set of carcasses, the Salmonella incidence determined by the conventional method was 38.3%, and 22.5% by SS/SV(TM). In the set used for the evaluation of OSRT(TM), the number of positive samples was the same as that detected by the conventional culture procedure (49.0%). In the third set, the positivity by the conventional procedure was 33.3%, and 5.0% by REVEAL(TM). The differences between methods in the first and third sets of carcasses were statistically significant (P < 0.05). The positivity for Salmonella in carcasses at P1 to P6, as determined by at least one of the methods, was 47.5%, 47.5%, 32.5%, 30.0%, 30.0% and 37.7%, respectively.