929 results for Autonomous Robotic Systems. Autonomous Sailboats. Software Architecture


Relevance:

50.00%

Publisher:

Abstract:

The maintenance and evolution of software systems has become a highly critical task over recent years due to the diversity and high demand of functionalities, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to avoid the deterioration of their quality during their evolution. This thesis proposes an automated approach for analyzing the variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation, in which the scenarios are chosen and the target releases prepared; (ii) dynamic analysis, in which the performance of scenarios and methods is determined by computing their execution times; (iii) variation analysis, in which the results of the dynamic analysis for different releases are processed and compared; and (iv) repository mining, in which issues and commits associated with the detected performance variation are identified. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected such elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases (seven of each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a regression model for performance was developed in order to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release date and the day of the week turned out to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
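
As an illustration of the variation-analysis phase described above, the sketch below compares execution-time samples of one scenario across two releases and flags a large shift. It is a minimal stand-in, not the thesis's framework; the function names, threshold and data are assumptions.

```python
# Illustrative sketch of a "variation analysis" step: compare execution-time
# samples of the same scenario across two releases and flag large shifts.
# This is NOT the thesis's framework; thresholds and structure are assumptions.
from dataclasses import dataclass
from statistics import mean

@dataclass
class ScenarioVariation:
    scenario: str
    old_mean_ms: float
    new_mean_ms: float
    change_pct: float
    degraded: bool

def analyze_variation(old_runs_ms, new_runs_ms, scenario, threshold_pct=10.0):
    """Compare execution-time samples (milliseconds) from two releases."""
    old_mean, new_mean = mean(old_runs_ms), mean(new_runs_ms)
    change_pct = 100.0 * (new_mean - old_mean) / old_mean
    return ScenarioVariation(
        scenario=scenario,
        old_mean_ms=old_mean,
        new_mean_ms=new_mean,
        change_pct=change_pct,
        degraded=change_pct > threshold_pct,  # noticeably slower than before
    )

# Example: a scenario that got roughly 25% slower between releases.
result = analyze_variation([40.1, 41.3, 39.8], [50.2, 51.0, 49.7], "login")
print(result)
```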

Relevance:

50.00%

Publisher:

Abstract:

This thesis reports on a novel method to build a 3-D model of the above-water portion of icebergs using surface imaging. The goal is to work towards the automation of iceberg surveys, allowing an Autonomous Surface Craft (ASC) to acquire shape and size information. After data and images are collected, the core software algorithm proceeds in three parts: occluding contour finding, volume intersection, and parameter estimation. A software module is designed that could be used on the ASC to perform automatic and fast processing of above-water surface image data to determine iceberg shape and size. The resolution of the method is calculated using data from the iceberg database of the Program of Energy Research and Development (PERD). The method was investigated using data from field trials conducted through the summer of 2014, in which eight icebergs were surveyed during three expeditions. The results were analyzed to determine iceberg characteristics. Limitations of the method, including its accuracy, are addressed. A surface imaging system and a LIDAR system were developed in 2015 to profile the above-water portion of icebergs.
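
Volume intersection from occluding contours is commonly implemented as shape-from-silhouette (voxel carving). The sketch below shows that general idea under idealized assumptions (known camera projections, clean silhouettes); it is not the thesis's actual module.

```python
# Minimal shape-from-silhouette (volume intersection) sketch, assuming ideal
# silhouettes and known camera projections. Illustrative only; the thesis's
# occluding-contour and parameter-estimation steps are more involved.
import numpy as np

def carve_voxels(voxel_centers, views):
    """Keep voxels whose projection falls inside every view's silhouette.

    voxel_centers: (N, 3) array of candidate 3-D points.
    views: list of (project, silhouette) pairs, where project maps (N, 3)
           world points to (N, 2) pixel coordinates and silhouette is a
           2-D boolean mask (True = iceberg).
    """
    keep = np.ones(len(voxel_centers), dtype=bool)
    for project, silhouette in views:
        uv = np.asarray(project(voxel_centers)).astype(int)   # (N, 2) pixels
        h, w = silhouette.shape
        inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & \
                 (uv[:, 1] >= 0) & (uv[:, 1] < h)
        hit = np.zeros(len(voxel_centers), dtype=bool)
        hit[inside] = silhouette[uv[inside, 1], uv[inside, 0]]
        keep &= hit                                            # intersect the viewing cones
    return voxel_centers[keep]

# Above-water volume estimate = number of kept voxels * voxel volume.
```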

Relevance:

50.00%

Publisher:

Abstract:

In this paper, a surgical robotic device for cochlear implantation surgery is described that is able to discriminate tissue interfaces and other controlling parameters ahead of a drill tip. The advantage in surgery is that tissues at interfaces can be preserved. The smart tool is able to control its interaction with the flexing tissue, avoiding penetration and controlling the extent of protrusion with respect to the real-time position of the tissue. To interpret drilling conditions, and the conditions leading up to breakthrough at a tissue interface, the sensing scheme used enables discrimination between the variety of conditions posed by the drilling environment. The result is a robust, fully autonomous system able to respond to tissue type, behaviour and deflection in real time. The paper describes the robotic tool, which has been designed for the surgical environment and has been used in the operating room.
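
As a toy illustration of signal-based discrimination near a tissue interface, the sketch below flags an imminent breakthrough when the feed force drops sharply after a rise. The paper's actual sensing scheme, signals and thresholds are not reproduced here; everything below is an assumption for illustration.

```python
# Toy threshold-based breakthrough detection from drilling signals. The
# paper's sensing scheme is more sophisticated; signal names and thresholds
# below are assumptions.
def detect_breakthrough(force_history, window=5, drop_ratio=0.4):
    """Flag imminent breakthrough when the feed force, after rising, drops
    sharply relative to its peak (thinning bone ahead of the drill tip
    supports progressively less load)."""
    if len(force_history) < window:
        return False
    peak = max(force_history)
    return peak > 0 and force_history[-1] < (1.0 - drop_ratio) * peak

# Example: force builds up, then collapses as the far cortex thins.
samples = []
for f in [0.5, 0.9, 1.4, 1.8, 2.1, 2.2, 1.1]:
    samples.append(f)
    if detect_breakthrough(samples):
        print(f"Breakthrough condition detected at force {f} N: stop feed / retract")
```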

Relevance:

50.00%

Publisher:

Abstract:

MEDEIROS, Adelardo A. D. A survey of control architectures for autonomous mobile robots. J. Braz. Comp. Soc., Campinas, v. 4, n. 3, Apr. 1998. Available at: Accessed on: 27 Sept. 2010.

Relevance:

50.00%

Publisher:

Abstract:

In this paper, various techniques related to large-scale systems are presented. First, an explanation of large-scale systems and their differences from traditional systems is given. Next, possible specifications and requirements for hardware and software are listed. Finally, examples of large-scale systems are presented.

Relevance:

50.00%

Publisher:

Abstract:

The present paper describes a system for the construction of visual maps ("mosaics") and motion estimation for a set of AUVs (Autonomous Underwater Vehicles). The robots are equipped with a down-looking camera, which is used to estimate their motion with respect to the seafloor and to build an online mosaic. As the mosaic increases in size, a systematic bias is introduced in its alignment, resulting in an erroneous output. The theoretical concepts associated with the use of an Augmented State Kalman Filter (ASKF) were applied to optimally estimate both the visual map and the fleet positions.
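
The sketch below shows the core ASKF idea in a simplified 2-D form: the state stacks the vehicle position with stored mosaic-frame positions, augmentation adds a new frame, and an image-registration measurement corrects accumulated drift. The models, noise values and the 2-D reduction are illustrative assumptions, not the paper's formulation.

```python
# Simplified Augmented State Kalman Filter (ASKF) sketch in 2-D.
import numpy as np

class AugmentedStateKF:
    def __init__(self, x0, P0):
        self.x = np.asarray(x0, dtype=float)   # [vehicle_xy, frame1_xy, ...]
        self.P = np.asarray(P0, dtype=float)

    def predict(self, u, Q):
        """Dead-reckoning step: only the vehicle block (first 2 states) moves."""
        self.x[:2] += u
        self.P[:2, :2] += Q

    def augment(self):
        """Store the current vehicle position as a new mosaic-frame state."""
        n = self.x.size
        J = np.vstack([np.eye(n), np.eye(2, n)])   # copy the vehicle block
        self.x = J @ self.x
        self.P = J @ self.P @ J.T

    def update_registration(self, frame_idx, z, R):
        """z = measured (frame - vehicle) offset from image registration."""
        n = self.x.size
        H = np.zeros((2, n))
        H[:, 2 * frame_idx + 2:2 * frame_idx + 4] = np.eye(2)   # frame block
        H[:, :2] = -np.eye(2)                                    # vehicle block
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x += K @ (z - H @ self.x)
        self.P = (np.eye(n) - K @ H) @ self.P
```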

Relevance:

50.00%

Publisher:

Abstract:

Recent developments in automation, robotics and artificial intelligence have pushed these technologies into wider use, and driverless transport systems are nowadays already state of the art on certain legs of transportation. This has prompted the maritime industry to join the advancement. The case organisation, the AAWA initiative, is a joint industry-academia research consortium with the objective of developing readiness for the first commercial autonomous solutions, exploiting state-of-the-art autonomous and remote technology. The initiative develops both autonomous and remote operation technology for navigation, machinery, and all on-board operating systems. The aim of this study is to develop a model with which to estimate and forecast the operational costs, and thus enable comparisons between manned and autonomous cargo vessels. The building process of the model is also described and discussed. Furthermore, the model's aim is to track and identify the critical success factors of the chosen ship design, and to enable monitoring and tracking of the incurred operational costs as the life cycle of the vessel progresses. The study adopts the constructive research approach, as the aim is to develop a construct to meet the needs of a case organisation. Data was collected through discussions and meetings with consortium members and researchers, as well as through written and internal communications material. The model itself is built using activity-based life cycle costing, which enables both realistic cost estimation and forecasting, as well as the identification of critical success factors, thanks to the process orientation adopted from activity-based costing and the statistical nature of Monte Carlo simulation techniques. As the model was able to meet the multiple aims set for it, and the case organisation was satisfied with it, it can be argued that activity-based life cycle costing is a suitable method for conducting cost estimation and forecasting for autonomous cargo vessels. The model was able to perform the cost analysis and forecasting, as well as to trace the critical success factors. Later on, it also enabled, albeit hypothetically, monitoring and tracking of the incurred costs. By collecting costs in this way, it was argued, the activity-based LCC model is able to facilitate learning from, and continuous improvement of, the autonomous vessel. As for the building process of the model, an individual approach was chosen, while still using the implementation and model-building steps presented in the existing literature. This was due to two factors: the nature of the model and, perhaps even more importantly, the nature of the case organisation. Furthermore, the loosely organised network structure means that knowing the case organisation and its aims is of great importance when conducting constructive research.
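
A minimal sketch of activity-based life cycle costing combined with Monte Carlo simulation is given below to show how such a model produces cost distributions. The activities, volumes and cost ranges are invented placeholders, not the AAWA model's actual data.

```python
# Hedged sketch of activity-based life cycle costing with Monte Carlo
# simulation. All activities, driver volumes and cost distributions below
# are illustrative assumptions.
import random

# Each activity: (annual driver volume, (low, high) cost per driver unit, EUR).
ACTIVITIES = {
    "remote monitoring":   (8760, (8.0, 15.0)),        # hours/year
    "planned maintenance": (12,   (4000.0, 9000.0)),   # events/year
    "port calls":          (40,   (1500.0, 3000.0)),   # calls/year
    "software updates":    (6,    (2000.0, 5000.0)),   # updates/year
}

def simulate_life_cycle_cost(years=25, runs=10_000):
    """Return simulated total life cycle costs, one value per Monte Carlo run."""
    totals = []
    for _ in range(runs):
        total = 0.0
        for volume, (lo, hi) in ACTIVITIES.values():
            total += years * volume * random.uniform(lo, hi)
        totals.append(total)
    return totals

costs = sorted(simulate_life_cycle_cost())
print(f"median: {costs[len(costs)//2]:,.0f} EUR, "
      f"P90: {costs[int(0.9*len(costs))]:,.0f} EUR")
```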

Relevance:

50.00%

Publisher:

Abstract:

Miniaturization of power generators to the MEMS scale, based on the hydrogen-air fuel cell, is the object of this research. The micro fuel cell approach has been adopted for its advantages of both high power and high energy density. On-board hydrogen production/storage and an efficient control scheme that facilitates integration with a fuel cell membrane electrode assembly (MEA) are key elements for micro energy conversion. Millimeter-scale reactors (ca. 10 µL) have been developed for hydrogen production through hydrolysis of CaH2 and LiAlH4, yielding volumetric energy densities of the order of 200 Whr/L. Passive microfluidic control schemes have been implemented to facilitate delivery and self-regulation, and at the same time to eliminate bulky auxiliaries that run on parasitic power. One technique uses surface tension to pump water in a microchannel for hydrolysis and is self-regulated, based on load, by back pressure from accumulated hydrogen acting on a gas-liquid microvalve. This control scheme improves the uniformity of power delivery during long periods of lower power demand, with fast switching to the mass-transport regime on the order of seconds, thus providing a peak power density of up to 391.85 W/L. Another method takes advantage of water recovery by backward transport through the MEA of water vapor generated in the cathode half-cell reaction. This regulation-free scheme increases the available reactor volume to yield an energy density of 313 Whr/L, and provides a peak power density of 104 W/L. Prototype devices have been tested for a range of duty periods from 2-24 hours, with multiple switching of power demand in order to establish operation across multiple regimes. Issues identified as critical to the realization of the integrated power MEMS include the effects of water transport and byproduct hydrate swelling on hydrogen production in the micro reactor, and of ambient relative humidity on fuel cell performance.
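
For scale, a hedged back-of-envelope estimate of the ideal electrical energy from a 10 µL CaH2 hydrolysis reactor is sketched below. The stoichiometry and constants are standard; the packing fraction and cell voltage are assumptions, and the reported reactor-level figure of about 200 Whr/L is necessarily well below this ideal bound, since the device must also accommodate water, byproducts and packaging.

```python
# Back-of-envelope sketch (not from the thesis): ideal electrical energy from
# a 10-microliter CaH2 hydrolysis reactor feeding a H2-air fuel cell.
F = 96485.0          # C/mol, Faraday constant
M_CAH2 = 42.10       # g/mol, CaH2 molar mass
RHO_CAH2 = 1.70      # g/cm^3, bulk solid (powder packs less densely)

reactor_volume_uL = 10.0
packing_fraction = 1.0        # ASSUMPTION: reactor filled with solid CaH2
cell_voltage = 0.7            # V, ASSUMPTION: average operating cell voltage

mass_g = reactor_volume_uL * 1e-3 * RHO_CAH2 * packing_fraction   # uL -> cm^3
mol_cah2 = mass_g / M_CAH2
mol_h2 = 2.0 * mol_cah2                        # CaH2 + 2 H2O -> Ca(OH)2 + 2 H2
energy_J = mol_h2 * 2.0 * F * cell_voltage     # 2 electrons per H2 oxidized
energy_Wh = energy_J / 3600.0

print(f"H2 yield: {mol_h2*1e3:.2f} mmol")
print(f"Ideal energy: {energy_Wh*1e3:.1f} mWh "
      f"({energy_Wh / (reactor_volume_uL*1e-6):.0f} Whr/L ideal upper bound)")
```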

Relevance:

50.00%

Publisher:

Abstract:

This thesis reports on an investigation of the feasibility and usefulness of incorporating dynamic management facilities for managing sensed context data in a distributed context-aware mobile application. The investigation focuses on reducing the work required to integrate new sensed context streams into an existing context-aware architecture. Current architectures require integration work for new streams and new contexts that are encountered. This means of operation is acceptable for current fixed architectures. However, as systems become more mobile the number of discoverable streams increases. Without the ability to discover and use these new streams, the functionality of any given device will be limited to the streams that it knows how to decode. The integration of new streams requires that the sensed context data be understood by the current application. If the new source provides data of a type that an application currently requires, then the new source should be connected to the application without any prior knowledge of the new source. If the type is similar and can be converted, then this stream too should be appropriated by the application. Such applications are based on portable devices (phones, PDAs) for semi-autonomous services that use data from sensors connected to the devices, plus data exchanged with other such devices and remote servers. Such applications must handle input from a variety of sensors, refining the data locally and managing its communication from the device in volatile and unpredictable network conditions. The choice to focus on locally connected sensory input allows for the introduction of privacy and access controls, and this local control can determine how the information is communicated to others. This investigation focuses on the evaluation of three approaches to sensor data management. The first system is characterised by its static management based on the pre-pended metadata. This was the reference system: developed for a mobile system, it processed the data based on the attached metadata, and the code that performed the processing was static. The second system was developed to move away from the static processing and introduce greater freedom in handling the data stream, which resulted in a heavyweight approach. The approach focused on pushing the processing of the data into a number of networked nodes rather than the monolithic design of the previous system. By creating a separate communication channel for the metadata, it is possible to be more flexible with the amount and type of data transmitted. The final system pulled the benefits of the other systems together: by providing a small management class that loads a separate handler based on the incoming data, dynamism was maximised whilst maintaining ease of code understanding. The three systems were then compared to highlight their ability to dynamically manage new sensed context. The evaluation took two approaches. The first is a quantitative analysis of the code to understand the complexity of the three systems, carried out by evaluating what changes to each system were involved for the new context. The second takes a qualitative view of the work required by the software engineer to reconfigure the systems to provide support for a new data stream. The evaluation highlights the various scenarios to which the three systems are most suited. There is always a trade-off in the development of a system, and the three approaches highlight this fact. A statically bound system can be quick to develop but may need to be completely re-written if the requirements move too far. Alternatively, a highly dynamic system may be able to cope with new requirements, but the developer time to create such a system may be greater than the creation of several simpler systems.
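
A minimal sketch of the third approach's idea, a small manager class that loads a handler based on incoming stream metadata, is given below. The class and method names are assumptions, not the thesis's code.

```python
# Illustrative sketch: a small manager picks a handler for each sensed-context
# stream based on its declared type, so a new stream type only needs a handler
# registration rather than a rewrite of the pipeline. Names are assumptions.
from typing import Any, Callable, Dict

class ContextStreamManager:
    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[bytes], Any]] = {}

    def register(self, stream_type: str, handler: Callable[[bytes], Any]) -> None:
        """Add support for a new sensed-context stream type at runtime."""
        self._handlers[stream_type] = handler

    def dispatch(self, metadata: Dict[str, str], payload: bytes) -> Any:
        """Route an incoming sample to the handler for its declared type."""
        handler = self._handlers.get(metadata.get("type", ""))
        if handler is None:
            raise LookupError(f"no handler for stream metadata {metadata!r}")
        return handler(payload)

# Usage: a newly discovered GPS stream only needs a handler registration.
manager = ContextStreamManager()
manager.register("gps", lambda raw: tuple(map(float, raw.decode().split(","))))
print(manager.dispatch({"type": "gps"}, b"57.15,-2.09"))
```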

Relevance:

50.00%

Publisher:

Abstract:

As unmanned autonomous vehicles (UAVs) are being widely utilized in military and civil applications, concerns are growing about mission safety and how to integrate different phases of mission design. One important barrier to a cost-effective and timely safety certification process for UAVs is the lack of a systematic approach for bridging the gap between understanding high-level commander/pilot intent and implementing that intent through low-level UAV behaviors. In this thesis we demonstrate an entire systems design process for a representative UAV mission, beginning from an operational concept and requirements and ending with a simulation framework for segments of the mission design, such as path planning and decision making in collision avoidance. We divided this complex system into sub-systems: path planning, collision detection and collision avoidance. We then developed software modules for each sub-system.
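
As an example of the kind of computation a collision-detection module performs, the sketch below flags a conflict using the closest point of approach between two constant-velocity UAVs. It is illustrative only; the safety radius and look-ahead horizon are assumptions.

```python
# Illustrative collision-detection sketch: closest point of approach (CPA)
# between two constant-velocity UAVs. A conflict is flagged if the minimum
# separation falls below a safety radius within the look-ahead horizon.
import numpy as np

def cpa_conflict(p1, v1, p2, v2, safety_radius=50.0, horizon=60.0):
    """p*, v* are 3-D position (m) and velocity (m/s) vectors."""
    dp = np.asarray(p2, float) - np.asarray(p1, float)
    dv = np.asarray(v2, float) - np.asarray(v1, float)
    denom = float(dv @ dv)
    t_cpa = 0.0 if denom < 1e-9 else float(-(dp @ dv) / denom)
    t_cpa = min(max(t_cpa, 0.0), horizon)          # clamp to look-ahead window
    min_sep = float(np.linalg.norm(dp + dv * t_cpa))
    return min_sep < safety_radius, t_cpa, min_sep

# Two UAVs on converging tracks: conflict in 30 s with zero miss distance.
conflict, t, sep = cpa_conflict([0, 0, 100], [20, 0, 0], [600, 300, 100], [0, -10, 0])
print(conflict, round(t, 1), round(sep, 1))
```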

Relevance:

50.00%

Publisher:

Abstract:

There are few regulatory restrictions on the use of fully autonomous unmanned aerial systems in unpopulated farming areas of Australia. The combination of fully autonomous aerial and ground systems would provide efficient and cost-effective retrieval of soil and vegetation data for use in precision agriculture. The aerial system will survey the site and collect spectral imagery to analyse plant density, stress and nutrition. The ground sensors will collect soil moisture content readings throughout the site. The data from both systems will be collated at a central base station. The base station will also house the aerial system and provide an interface to it.
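
One common way to analyse plant density and stress from spectral imagery is a vegetation index such as NDVI. The abstract does not name a specific index, so the sketch below is an illustrative assumption rather than the system's actual processing.

```python
# NDVI = (NIR - Red) / (NIR + Red), a standard vegetation index; higher values
# indicate denser, healthier canopy. Illustrative only.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Compute NDVI in [-1, 1] from near-infrared and red reflectance bands."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom > 0)   # avoid divide-by-zero
    return out

# Example: healthy vegetation reflects strongly in NIR relative to red.
print(ndvi(np.array([[0.50, 0.40]]), np.array([[0.08, 0.30]])))  # roughly [[0.72, 0.14]]
```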

Relevance:

50.00%

Publisher:

Abstract:

To exploit the full potential of radio measurements of cosmic-ray air showers at MHz frequencies, a detector timing synchronization within 1 ns is needed. Large distributed radio detector arrays such as the Auger Engineering Radio Array (AERA) rely on timing via the Global Positioning System (GPS) for the synchronization of individual detector station clocks. Unfortunately, GPS timing is expected to have an accuracy no better than about 5 ns. In practice, in particular in AERA, the GPS clocks exhibit drifts on the order of tens of ns. We developed a technique to correct for the GPS drifts, and an independent method is used to cross-check that this correction indeed reaches nanosecond-scale timing accuracy. First, we operate a "beacon transmitter" which emits defined sine waves detected by AERA antennas and recorded within the physics data. The relative phasing of these sine waves can be used to correct for GPS clock drifts. In addition to this, we observe radio pulses emitted by commercial airplanes, the positions of which we determine in real time from Automatic Dependent Surveillance-Broadcast (ADS-B) messages intercepted with a software-defined radio. From the known source location and the measured arrival times of the pulses we determine relative timing offsets between radio detector stations. We demonstrate with a combined analysis that the two methods give a consistent timing calibration with an accuracy of 2 ns or better. Consequently, the beacon method alone can be used in the future to continuously determine and correct for GPS clock drifts in each individual event measured by AERA.
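
The sketch below illustrates the core of the beacon method: the phase of a known-frequency sine wave in a station's recorded trace measures that station's apparent clock offset (modulo one beacon period), so tracking the phase over time tracks the GPS clock drift. The sampling rate and beacon frequency used here are illustrative, not AERA's actual values.

```python
# Illustrative sketch (not the AERA analysis code) of beacon-phase timing.
import numpy as np

def beacon_clock_offset(trace, fs, f_beacon):
    """Return the apparent clock offset in seconds (modulo one beacon period)
    from the phase of the beacon line in a sampled voltage trace."""
    t = np.arange(len(trace)) / fs
    z = np.sum(trace * np.exp(-2j * np.pi * f_beacon * t))   # single-bin DFT
    phase = np.angle(z)                                       # radians
    return phase / (2 * np.pi * f_beacon)                     # seconds

# Example: a trace delayed by 5 ns shifts the beacon phase accordingly.
fs, f_b, delay = 200e6, 45e6, 5e-9       # illustrative values, not AERA's
t = np.arange(2048) / fs
ref = beacon_clock_offset(np.sin(2 * np.pi * f_b * t), fs, f_b)
delayed = beacon_clock_offset(np.sin(2 * np.pi * f_b * (t - delay)), fs, f_b)
print((ref - delayed) * 1e9, "ns")       # approximately 5 ns
```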

Relevance:

50.00%

Publisher:

Abstract:

In this paper, by using a novel approach, we first prove a new generalization of the discrete-type Halanay inequality. Based on our new generalized inequality, a novel criterion for the exponential stability of a certain class of nonlinear non-autonomous difference equations is proposed. Numerical examples are given to illustrate the effectiveness of the obtained results.
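
For context, one commonly cited form of the discrete-type Halanay inequality that such results build on is sketched below; this is a standard statement, not the paper's generalization, whose hypotheses are broader.

```latex
% A commonly cited discrete-type Halanay inequality, stated for context only.
Let $\{x_n\}_{n \ge -k}$ be nonnegative and suppose that, for constants
$a, b \ge 0$ with $a + b < 1$ and an integer delay $k \ge 0$,
\[
  x_{n+1} \;\le\; a\,x_n \;+\; b \max_{n-k \le j \le n} x_j, \qquad n \ge 0 .
\]
Then
\[
  x_n \;\le\; \Bigl(\max_{-k \le j \le 0} x_j\Bigr)\,\lambda_0^{\,n}, \qquad n \ge 0,
\]
where $\lambda_0 \in (0,1)$ is a root of $\lambda^{k+1} = a\,\lambda^{k} + b$,
so the sequence decays exponentially.
```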