933 results for Educational Robotics. Low Cost Robot. Audio Interface. Smartphones
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-07
Abstract:
This dissertation proposes the development of a low-cost irrigation system for golf courses. The system retrieves the weather forecast and also measures a set of values (temperature, humidity, wind speed) that determine when and how much to irrigate. Golf courses consume large amounts of water daily, which is the main criticism raised by environmental organizations. This dissertation incorporates low-cost wireless communication, eliminating the cabling that would otherwise be required for communication between the various devices distributed across the golf course. The developed system aims to reduce the waste of water resources during irrigation: it is an intelligent system that may be adopted not only by golf course managers but also for residential and municipal gardens. To keep the system low cost, a message-forwarding algorithm was developed that allows the use of inexpensive wireless communication equipment. The entire irrigation system is controlled and monitored through an interface developed in Microsoft Visual Basic.
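The decision logic described above — combining the weather forecast with local sensor readings to decide when and how much to irrigate — can be sketched as follows. All thresholds and scaling factors here are hypothetical placeholders, not values from the dissertation:

```python
def irrigation_minutes(temp_c, humidity_pct, wind_ms, rain_forecast_mm,
                       base_minutes=20):
    """Toy controller: scale a base watering time by weather conditions.

    Every threshold below is an illustrative placeholder, not a value
    taken from the dissertation.
    """
    if rain_forecast_mm >= 5:          # enough rain expected: skip watering
        return 0
    minutes = base_minutes
    if temp_c > 30:                    # hot day: water longer
        minutes *= 1.5
    if humidity_pct > 80:              # humid air: less evaporation
        minutes *= 0.5
    if wind_ms > 8:                    # strong wind: spray drifts, water less
        minutes *= 0.7
    return round(minutes)
```

For example, a hot dry day with no forecast rain (`irrigation_minutes(32, 40, 2, 0)`) extends the base watering time, while a forecast of heavy rain suppresses watering entirely.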
Abstract:
Security Onion is a Network Security Monitoring (NSM) platform that provides multiple Intrusion Detection Systems (IDS), including Host IDS (HIDS) and Network IDS (NIDS). Many types of data can be acquired using Security Onion for analysis, including data related to hosts, the network, sessions, assets, alerts and protocols. Security Onion can be implemented as a standalone deployment, with server and sensor included, or with a master server and multiple sensors, allowing the system to be scaled as required. Many interfaces and tools are available for managing the system and analysing data, such as Sguil, Snorby, Squert and Enterprise Log Search and Archive (ELSA). These interfaces can be used for analysis of alerts and captured events, which can then be exported for further analysis in Network Forensic Analysis Tools (NFAT) such as NetworkMiner, CapME or Xplico. The Security Onion platform also provides various methods of management, such as Secure Shell (SSH) for managing the server and sensors, and Web-client remote access. All of this, together with the ability to replay and analyse sample malicious traffic, makes Security Onion a suitable low-cost alternative for network security management. In this paper, we present a feature and functionality review of Security Onion in terms of data types, configuration, interfaces, tools and system management.
Abstract:
In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (the Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are increasingly used to derive value from this big data. A large portion of this data is stored and processed in the Cloud due to the advantages the Cloud provides, such as scalability, elasticity, availability, low cost of ownership and overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully.
I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks whose computation and execution models limit the user program to directly access the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading it onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
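The progressive-analytics idea described for NOW! — reporting early answers computed from samples of increasing size — can be illustrated with a minimal sketch. The query here is a simple average and the sample schedule is arbitrary; NOW!'s actual progress semantics and provenance tracking are far richer:

```python
import random

def progressive_mean(data, sample_sizes, seed=0):
    """Yield (sample_size, estimate) pairs over progressively larger samples.

    Illustrates progressive sampling only; a fixed seed makes the
    sample sequence repeatable, echoing "repeatable semantics".
    """
    rng = random.Random(seed)
    for n in sample_sizes:
        sample = rng.sample(data, n)          # sample without replacement
        yield n, sum(sample) / n

data = list(range(1000))                      # true mean is 499.5
estimates = dict(progressive_mean(data, [10, 100, 1000]))
```

Each early estimate costs only a fraction of the full scan; the final sample covers the entire dataset, so the last estimate is exact.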
Abstract:
This article introduces the genre of the digital audio game and discusses selected play-interaction solutions implemented in the Audio Game Hub, a prototype designed and evaluated in 2014 and 2015 at the Gamification Lab at Leuphana University Lüneburg. The Audio Game Hub constitutes a set of familiar playful activities (aiming at a target, reflex-based reaction to sound signals, labyrinth exploration) and casual games (e.g. Tetris, Memory) adapted to the digital medium and converted into the audio sphere, where the player is guided predominantly or solely by sound. The authors discuss the design questions raised at early stages of the project and confront them with the results of user-experience testing performed on two groups of sighted and one group of visually impaired gamers.
Abstract:
Oporto Airport, located in the city of Porto, is crucial because it is the only airport in the northern region of Portugal. Over the last two decades the airport has seen increases in passenger numbers, sales revenue and accumulated investment, particularly since low-cost carriers (LCCs) began operating there in 2004. To determine whether these changes had an impact on the airport's competitiveness, the main aim is to analyse the evolution of its technical efficiency and compare the results before and after the introduction of the LCCs. The methodology uses Data Envelopment Analysis (DEA). Results show that the efficiency of Oporto Airport increased markedly after the introduction of LCCs in 2004. The main conclusions suggest the importance of LCCs in the airport's increased efficiency and a potential relation with tourism development in the region, although more robust studies are needed.
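Data Envelopment Analysis, as used in the study, scores each decision-making unit (DMU) by solving a small linear program. A minimal input-oriented, constant-returns-to-scale (CCR) sketch follows; the three-unit dataset is a toy example, not the paper's airport data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(inputs, outputs, k):
    """Input-oriented CCR efficiency of DMU k.

    inputs:  (n_dmus, n_inputs) array;  outputs: (n_dmus, n_outputs) array.
    Solves: min theta  s.t.  X.T @ lam <= theta * x_k,  Y.T @ lam >= y_k.
    """
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n = X.shape[0]
    c = np.zeros(n + 1)          # decision vars: [theta, lam_1..lam_n]
    c[0] = 1.0                   # minimize theta
    # input rows:  sum_j lam_j x_ij - theta x_ik <= 0
    A_in = np.hstack([-X[k][:, None], X.T])
    # output rows: -sum_j lam_j y_rj <= -y_rk
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(X.shape[1]), -Y[k]]),
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

# toy data: one input (operating cost), one output (passengers)
X = [[2.0], [4.0], [3.0]]
Y = [[2.0], [2.0], [3.0]]
effs = [dea_ccr_efficiency(X, Y, k) for k in range(3)]
```

Units on the efficient frontier score 1.0; the dominated unit (same output for double the input) scores 0.5.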
Abstract:
SANTANA, André M.; SOUZA, Anderson A. S.; BRITTO, Ricardo S.; ALSINA, Pablo J.; MEDEIROS, Adelardo A. D. Localization of a mobile robot based on odometry and natural landmarks using extended Kalman Filter. In: INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, 5., 2008, Funchal, Portugal. Proceedings... Funchal, Portugal: ICINCO, 2008.
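The cited approach fuses odometry (prediction) with natural-landmark observations (correction) in an extended Kalman filter. A deliberately simplified one-dimensional sketch of that predict/update cycle follows; all noise values are hypothetical, and the paper's actual model is multidimensional and nonlinear:

```python
def kf_predict(x, p, u, q):
    """Predict: apply odometry displacement u, inflating variance by q."""
    return x + u, p + q

def kf_update(x, p, z, r):
    """Update: correct with a landmark-derived position measurement z
    of variance r, shrinking the state variance."""
    k = p / (p + r)                         # Kalman gain
    return x + k * (z - x), (1 - k) * p

# robot starts at 0 with variance 1; odometry reports a +1 step
x, p = 0.0, 1.0
x, p = kf_predict(x, p, u=1.0, q=0.5)       # prediction drifts and widens
x_pred, p_pred = x, p
x, p = kf_update(x, p, z=1.2, r=0.5)        # landmark pulls estimate to 1.15
```

The update always reduces the variance relative to the prediction, which is why landmark corrections bound the growth of odometry error.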
Abstract:
This thesis presents approximation algorithms for some NP-Hard combinatorial optimization problems on graphs and networks; in particular, we study problems related to Network Design. Under the widely-believed complexity-theoretic assumption that P is not equal to NP, there are no efficient (i.e., polynomial-time) algorithms that solve these problems exactly. Hence, if one desires efficient algorithms for such problems, it is necessary to consider approximate solutions: An approximation algorithm for an NP-Hard problem is a polynomial time algorithm which, for any instance of the problem, finds a solution whose value is guaranteed to be within a multiplicative factor of the value of an optimal solution to that instance. We attempt to design algorithms for which this factor, referred to as the approximation ratio of the algorithm, is as small as possible. The field of Network Design comprises a large class of problems that deal with constructing networks of low cost and/or high capacity, routing data through existing networks, and many related issues. In this thesis, we focus chiefly on designing fault-tolerant networks. Two vertices u,v in a network are said to be k-edge-connected if deleting any set of k − 1 edges leaves u and v connected; similarly, they are k-vertex connected if deleting any set of k − 1 other vertices or edges leaves u and v connected. We focus on building networks that are highly connected, meaning that even if a small number of edges and nodes fail, the remaining nodes will still be able to communicate. A brief description of some of our results is given below. We study the problem of building 2-vertex-connected networks that are large and have low cost. Given an n-node graph with costs on its edges and any integer k, we give an O(log n log k) approximation for the problem of finding a minimum-cost 2-vertex-connected subgraph containing at least k nodes. 
We also give an algorithm of similar approximation ratio for maximizing the number of nodes in a 2-vertex-connected subgraph subject to a budget constraint on the total cost of its edges. Our algorithms are based on a pruning process that, given a 2-vertex-connected graph, finds a 2-vertex-connected subgraph of any desired size and of density comparable to the input graph, where the density of a graph is the ratio of its cost to the number of vertices it contains. This pruning algorithm is simple and efficient, and is likely to find additional applications. Recent breakthroughs on vertex-connectivity have made use of algorithms for element-connectivity problems. We develop an algorithm that, given a graph with some vertices marked as terminals, significantly simplifies the graph while preserving the pairwise element-connectivity of all terminals; in fact, the resulting graph is bipartite. We believe that our simplification/reduction algorithm will be a useful tool in many settings. We illustrate its applicability by giving algorithms to find many trees that each span a given terminal set, while being disjoint on edges and non-terminal vertices; such problems have applications in VLSI design and other areas. We also use this reduction algorithm to analyze simple algorithms for single-sink network design problems with high vertex-connectivity requirements; we give an O(k log n)-approximation for the problem of k-connecting a given set of terminals to a common sink. We study similar problems in which different types of links, of varying capacities and costs, can be used to connect nodes; assuming there are economies of scale, we give algorithms to construct low-cost networks with sufficient capacity or bandwidth to simultaneously support flow from each terminal to the common sink along many vertex-disjoint paths. We further investigate capacitated network design, where edges may have arbitrary costs and capacities. 
Given a connectivity requirement R_uv for each pair of vertices u,v, the goal is to find a low-cost network which, for each uv, can support a flow of R_uv units of traffic between u and v. We study several special cases of this problem, giving both algorithmic and hardness results. In addition to Network Design, we consider certain Traveling Salesperson-like problems, where the goal is to find short walks that visit many distinct vertices. We give a (2 + epsilon)-approximation for Orienteering in undirected graphs, achieving the best known approximation ratio, and the first approximation algorithm for Orienteering in directed graphs. We also give improved algorithms for Orienteering with time windows, in which vertices must be visited between specified release times and deadlines, and other related problems. These problems are motivated by applications in the fields of vehicle routing, delivery and transportation of goods, and robot path planning.
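Orienteering, mentioned above, asks for a walk of total length at most a budget that visits as many distinct vertices as possible. A brute-force reference implementation on a tiny instance makes the problem concrete; this is exhaustive search for illustration only, not the thesis's (2 + epsilon)-approximation:

```python
from itertools import permutations

def orienteering_brute(dist, start, budget):
    """Max number of distinct vertices reachable on a walk from `start`
    of total length <= budget (exhaustive search; tiny graphs only)."""
    n = len(dist)
    best = 1                                  # the start vertex itself
    others = [v for v in range(n) if v != start]
    for r in range(1, n):
        for order in permutations(others, r):
            cost, cur = 0, start
            for v in order:
                cost += dist[cur][v]
                cur = v
            if cost <= budget:
                best = max(best, 1 + r)
    return best

# 4 vertices on a line with unit spacing: 0 - 1 - 2 - 3
dist = [[abs(i - j) for j in range(4)] for i in range(4)]
```

On this path graph, a budget of 2 starting at vertex 0 reaches vertices 1 and 2, for three vertices in total; a budget of 3 covers all four.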
Abstract:
Energy efficiency is currently a central concern, with significant effort devoted to building ever more efficient equipment. A significant share of global energy consumption, as well as of the emission of harmful and greenhouse gases, is associated with heating and cooling, both domestic and industrial, so it is important to develop more efficient technologies in this domain. The main objective of this work is the development of an interface module for a lambda probe for combustion monitoring in biomass boilers. This module measures the oxygen concentration at the boiler outlet, making it possible to adjust the combustion parameters dynamically so as to maximize efficiency and minimize the emission of pollutant gases. The developed module is low cost and presents a very simple interface, easing its incorporation into existing equipment. The results obtained proved consistent with the theoretical values provided by the manufacturer of the probe used, so the work can be considered successful.
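For context on how such a probe maps voltage to oxygen concentration: a zirconia lambda sensor follows the Nernst relation, so the exhaust O2 partial pressure can be recovered from the cell voltage. A sketch under textbook assumptions (reference air at 20.9% O2, fixed cell temperature); the constants and calibration here are illustrative, not the module's actual implementation:

```python
import math

R, F = 8.314, 96485.0            # gas constant (J/mol/K), Faraday constant (C/mol)

def o2_fraction(voltage, temp_k=1000.0, p_ref=0.209):
    """Invert the Nernst equation  E = (R*T / 4F) * ln(p_ref / p_exh)
    to get the exhaust O2 fraction from the probe voltage (volts)."""
    return p_ref * math.exp(-4 * F * voltage / (R * temp_k))
```

Zero voltage means the exhaust matches the reference air; higher voltage means less oxygen in the exhaust, i.e. a richer mixture.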
Abstract:
The main task is to analyze the state of the art of grating-coupler production and low-cost polymer substrates, and then to recommend a new or adapted process for producing metallic gratings on polymer sheets, based on a Failure Mode and Effects Analysis (FMEA). To this end, the thesis is divided into four chapters. After the introductory first chapter, the second provides details about the state of the art in optical technology platforms, with a focus on polymers and the features that matter for the intended application, such as flexibility, low cost and roll-to-roll compatibility. It then defines diffraction gratings and their specifications, and closes with an explanation of the adhesion mechanisms of inorganic materials on polymer substrates. The third chapter discusses the processing of grating couplers: it introduces the basic fabrication methods and details a selection of current fabrication schemes found in the literature, with an assessment of their potential use for the desired application. The last chapter is an FMEA of the retained fabrication process, called Flip and Fuse, carried out to check its capability to realize the grating structure.
Abstract:
Brain activity can be monitored by electroencephalography and used as a bioelectric indicator. This article shows how a low-cost, easily accessible device can be used to develop applications based on brain-computer interfaces (BCI). The results obtained show that the MindWave device can indeed be used to acquire signals related to brain activity across different tasks and under the influence of different stimuli. The use of the wavelet transform is also proposed for conditioning the EEG signals, with the goal of applying artificial-intelligence algorithms and pattern-recognition techniques to distinguish brain responses.
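As an illustration of the proposed wavelet conditioning step, a single-level Haar transform splits an EEG-like signal into a smooth approximation band and a detail band. Real pipelines would use deeper decompositions and clinically chosen wavelets; this sketch uses the orthonormal Haar basis only for simplicity:

```python
import math

def haar_dwt(signal):
    """One level of the orthonormal Haar wavelet transform.
    Returns (approximation, detail); len(signal) must be even."""
    s = 1 / math.sqrt(2)
    approx = [(a + b) * s for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt (perfect reconstruction)."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) * s, (a - d) * s]
    return out
```

Because the transform is orthonormal, it preserves signal energy and reconstructs exactly; features for pattern recognition are then typically computed per band.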
Abstract:
This work describes preliminary results of a two-modality imaging system aimed at the early detection of breast cancer. The first technique is based on compounding conventional echographic images taken at regular angular intervals around the imaged breast. The other modality obtains tomographic images of propagation velocity using the same circular geometry. For this study, a low-cost prototype has been built. It is based on a pair of opposed 128-element, 3.2 MHz array transducers that are mechanically moved around tissue-mimicking phantoms. Compounded images over 360 degrees provide improved resolution, clutter reduction and artifact suppression, and reinforce the visualization of internal structures. However, refraction at the skin interface must be corrected for an accurate image compounding process; this is achieved by estimating the interface geometry and then computing the internal ray paths. Sound-velocity tomographic images from time-of-flight projections have also been obtained. Two reconstruction methods, Filtered Back Projection (FBP) and 2D Ordered Subset Expectation Maximization (2D OSEM), were used as a first attempt at tomographic reconstruction. These methods yield usable images in short computational times that can serve as initial estimates in subsequent, more complex methods of ultrasound image reconstruction. These images may be effective in differentiating malignant and benign masses and are very promising for breast cancer screening. (C) 2015 The Authors. Published by Elsevier B.V.
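The sound-velocity tomography step reconstructs a slowness (1/velocity) map from time-of-flight measurements along known ray paths. The FBP and OSEM reconstructions in the abstract are beyond a short sketch, but the underlying algebraic problem — path lengths × slowness = travel times — can be shown with a simple Kaczmarz (ART) iteration on a toy two-pixel model; the numbers are invented:

```python
def kaczmarz(rays, times, n_unknowns, sweeps=200):
    """Solve rays @ s = times for the slowness vector s by
    cyclically projecting onto each ray equation (ART/Kaczmarz)."""
    s = [0.0] * n_unknowns
    for _ in range(sweeps):
        for a, t in zip(rays, times):
            norm = sum(v * v for v in a)
            resid = (t - sum(v * x for v, x in zip(a, s))) / norm
            s = [x + resid * v for x, v in zip(s, a)]
    return s

# two pixels; each row lists the path length through each pixel,
# times are the corresponding times of flight (consistent system)
rays  = [[1.0, 0.0],     # ray through pixel 1 only
         [0.0, 1.0],     # ray through pixel 2 only
         [1.0, 1.0]]     # ray crossing both pixels
times = [0.65, 0.70, 1.35]
slowness = kaczmarz(rays, times, 2)
```

For a consistent system the iteration converges to the true slowness values, which can then be inverted to a velocity image.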
Abstract:
Strawberries harvested for processing as frozen fruit are currently de-calyxed manually in the field. This process requires removing the stem cap with its green leaves (i.e. the calyx) and has many disadvantages when performed by hand: it requires maintaining cutting-tool sanitation, and it increases labor time and the exposure of de-capped strawberries before in-plant processing. This leads to labor inefficiency and decreased harvest yield. By moving the calyx-removal process from the fields to the processing plants, this new practice would reduce field labor and improve management and logistics, while increasing annual yield. As labor prices continue to increase, the strawberry industry has shown great interest in the development and implementation of an automated calyx-removal system. In response, this dissertation describes the design, operation, and performance of a full-scale automatic vision-guided intelligent de-calyxing (AVID) prototype machine. The AVID machine utilizes commercially available equipment to produce a relatively low-cost automated de-calyxing system that can be retrofitted into existing food processing facilities. This dissertation is divided into five sections. The first two comprise a machine overview and a 12-week processing-plant pilot study. Results of the pilot study indicate the AVID machine is able to de-calyx grade-1-with-cap conical strawberries at roughly 66 percent output weight yield at a throughput of 10,000 pounds per hour. The remaining three sections describe in detail the three main components of the machine: a strawberry loading and orientation conveyor, a machine vision system for calyx identification, and a synchronized multi-waterjet knife calyx-removal system. In short, the loading system utilizes rotational energy to orient conical strawberries. The machine vision system determines cut locations through RGB real-time feature extraction.
The high-speed multi-waterjet knife system uses direct drive actuation to locate 30,000 psi cutting streams to precise coordinates for calyx removal. Based on the observations and studies performed within this dissertation, the AVID machine is seen to be a viable option for automated high-throughput strawberry calyx removal. A summary of future tasks and further improvements is discussed at the end.
Abstract:
The objective of the work described in this dissertation is the development of new wireless passive force-monitoring platforms for applications in the medical field, specifically the monitoring of lower-limb prosthetics. The developed sensors consist of stress-sensitive, magnetically soft amorphous metallic glass materials. The first technology is based on magnetoelastic resonance: when exposed to an AC excitation field along with a constant DC bias field, the magnetoelastic material vibrates mechanically, and may reach resonance if the field frequency matches the mechanical resonant frequency of the material. The work presented illustrates that an applied load pins portions of the strip, effectively decreasing the strip length, which results in an increase in the resonant frequency. The developed technology is deployed in a prototype lower-limb prosthetic sleeve for monitoring the forces experienced by the distal end of the residuum. This work also reports on the development of a magnetoharmonic force sensor comprised of the same material. According to the Villari effect, a load applied to the material results in a change in the permeability of the magnetic sensor, which is visualized as an increase in the higher-order harmonic fields of the material. Specifically, by applying a constant low-frequency AC field and sweeping the applied DC biasing field, the higher-order harmonic components of the magnetic response can be visualized. This sensor technology was also instrumented onto a lower-limb prosthetic as a proof of deployment; however, the magnetoharmonic sensor exhibited complications with sensor positioning and a need to tailor the interface mechanics between the sensing material and the surface being monitored. The novelty of these two technologies lies in their wireless passive nature, which allows for long-term monitoring over the lifetime of a given device. Additionally, the developed technologies are low cost. Recommendations for future work include improving the system for real-time monitoring, useful for data collection outside of a clinical setting.
Abstract:
Prediction of the low-risk concentration of diflubenzuron for aquatic organisms, and evaluation of clay and gravel in reducing its toxicity. Diflubenzuron is an insecticide that, besides its use in agriculture, has been widely employed in fish farming, even though its use in that activity is prohibited. The compound is not included in the Brazilian legislation that establishes maximum permissible limits in water bodies for the protection of aquatic communities. In the present work, based on the toxicity of diflubenzuron to non-target organisms, the hazardous concentration for only 5% of species (HC5) was calculated. This parameter was estimated at approximately 7 × 10⁻⁶ mg L⁻¹. The low value is due to the extremely high toxicity of diflubenzuron to daphnids and to the large variation in sensitivity among the species tested. Two relatively low-cost, inert materials proved efficient in removing diflubenzuron toxicity from solutions containing the compound; among them, expanded clay reduced the toxicity of a diflubenzuron solution by approximately 50%. The results may contribute to Brazilian public policies on the establishment of maximum permissible limits for xenobiotics in the aquatic compartment, and to the search for inert, low-cost materials with the potential to remove xenobiotics from aquaculture or agricultural effluents.
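The HC5 figure in the abstract comes from a species sensitivity distribution. A common sketch of the calculation: fit a log-normal distribution to per-species toxicity endpoints and take its 5th percentile. The endpoints below are made up, and the thesis's species set and fitting details are not reproduced here:

```python
import math

def hc5(toxicities, z05=-1.6449):
    """5th percentile of a log-normal distribution fitted to toxicity
    endpoints (mg/L), using the standard-normal 5% quantile z05."""
    logs = [math.log(t) for t in toxicities]
    n = len(logs)
    mu = sum(logs) / n
    var = sum((x - mu) ** 2 for x in logs) / (n - 1)   # sample variance
    return math.exp(mu + z05 * math.sqrt(var))

# hypothetical endpoints: one very sensitive daphnid drags the HC5 down,
# mirroring the abstract's explanation of its low value
endpoints = [0.0001, 0.01, 1.0, 10.0]
value = hc5(endpoints)
```

With a highly sensitive species and wide inter-species spread, the fitted 5th percentile falls below even the most sensitive measured endpoint, just as the abstract describes.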