64 results for Localization real-world challenges
at Instituto Politécnico do Porto, Portugal
Abstract:
Localization is a fundamental task in Cyber-Physical Systems (CPS), where data is tightly coupled with the environment and the location where it is generated. The research literature on localization has reached a critical mass, and several surveys have also emerged. This review paper contributes to the state of the art by proposing a new and holistic taxonomy of the fundamental concepts of localization in CPS, based on a comprehensive analysis of previous research works and surveys. The main objective is to pave the way towards a deep understanding of the main localization techniques and to unify their descriptions. Furthermore, this review paper provides a complete overview of the most relevant localization and geolocation techniques. We also present the most important metrics for measuring the accuracy of localization approaches, defined as the gap between the real location and its estimate. Finally, we present open issues and research challenges pertaining to localization. We believe that this review paper will serve as an important and complete reference on localization techniques in CPS for researchers and practitioners, providing added value as compared to previous surveys.
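As a minimal illustration of the accuracy metric mentioned above (the gap between the real location and its estimate), the following Python sketch computes the per-point Euclidean localization error and its root mean square; the positions and values are invented for the example and are not taken from the paper.

```python
import numpy as np

def localization_errors(true_pos, est_pos):
    """Euclidean distance between each real location and its estimate."""
    true_pos = np.asarray(true_pos, dtype=float)
    est_pos = np.asarray(est_pos, dtype=float)
    return np.linalg.norm(true_pos - est_pos, axis=1)

def rmse(true_pos, est_pos):
    """Root-mean-square localization error over all points."""
    err = localization_errors(true_pos, est_pos)
    return float(np.sqrt(np.mean(err ** 2)))

# Illustrative ground-truth and estimated 2-D positions (metres).
truth = [(0.0, 0.0), (5.0, 5.0), (10.0, 0.0)]
estimates = [(0.4, -0.3), (5.5, 4.8), (9.2, 0.6)]
print(localization_errors(truth, estimates))  # per-point error
print(rmse(truth, estimates))                 # aggregate accuracy metric
```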
Abstract:
The idea behind creating this special issue on real-world applications of intelligent tutoring systems was to bring together in a single publication some of the most important examples of success in the use of ITS technology. This will serve as a reference for all researchers working in the area. It will also be an important resource for industry, showing the maturity of ITS technology and creating an atmosphere favourable to funding new ITS projects. At the same time, it will be valuable to academic groups, motivating students to pursue new ideas for ITS and promoting new academic research work in the area.
Abstract:
In a real-world multiagent system, where the agents are faced with partial, incomplete and intrinsically dynamic knowledge, conflicts are inevitable. Frequently, different agents have goals or beliefs that cannot hold simultaneously. Conflict resolution methodologies have to be adopted to overcome such undesirable occurrences. In this paper we investigate the application of distributed belief revision techniques as the support for conflict resolution in the analysis of the validity of the candidate beams to be produced in the CERN particle accelerators. This CERN multiagent system contains a higher-hierarchy agent, the Specialist agent, which makes use of meta-knowledge (on how the conflicting beliefs have been produced by the other agents) in order to detect which beliefs should be abandoned. Upon solving a conflict, the Specialist instructs the involved agents to revise their beliefs accordingly. Conflicts in the problem domain are mapped into conflicting beliefs of the distributed belief revision system, where they can be handled by proven formal methods. This technique builds on well-established concepts and combines them in a new way to solve important problems. We find this approach generally applicable in several domains.
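The sketch below is only a toy illustration of the general idea described in the abstract: a higher-level agent inspects meta-knowledge about how conflicting beliefs were produced and decides which one to keep. The class, the provenance fields and the ranking rule are hypothetical and are not the paper's actual belief revision system.

```python
from dataclasses import dataclass

@dataclass
class Belief:
    agent: str                 # which agent produced the belief
    statement: str             # e.g. "beam_42_valid"
    value: bool
    source_reliability: float  # hypothetical meta-knowledge: trust in the data source
    derivation_depth: int      # hypothetical meta-knowledge: inference steps used

def resolve_conflict(beliefs):
    """Given mutually inconsistent beliefs about the same statement, keep the one
    with the strongest provenance and ask the other agents to retract theirs
    (a toy stand-in for the Specialist agent's decision)."""
    keep = max(beliefs, key=lambda b: (b.source_reliability, -b.derivation_depth))
    to_retract = [b for b in beliefs if b is not keep]
    return keep, to_retract

# Two agents disagree on the validity of a candidate beam.
b1 = Belief("agent_A", "beam_42_valid", True,  source_reliability=0.9, derivation_depth=2)
b2 = Belief("agent_B", "beam_42_valid", False, source_reliability=0.6, derivation_depth=5)
kept, retract = resolve_conflict([b1, b2])
print("keep:", kept.agent, kept.value)
print("retract:", [b.agent for b in retract])
```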
Abstract:
This report describes the development of a test-bed application for the ART-WiSe framework, with the aim of providing a means to access, validate and demonstrate that architecture. The chosen application is a kind of pursuit-evasion game in which a remotely controlled robot, navigating through an area covered by a wireless sensor network (WSN), is detected and continuously tracked by the WSN. A centralized control station then takes the appropriate actions for a pursuit robot to chase and "capture" the intruder. This kind of application imposes stringent timing requirements on the underlying communication infrastructure. It also involves interesting research problems in WSNs such as tracking, localization, cooperation between nodes, energy concerns and mobility. Additionally, it can easily be ported to a real-world application; surveillance and search-and-rescue operations are two examples where this kind of functionality can be applied. This is still a first approach to the test-bed application, and this development effort will be continuously pushed forward until all the envisaged objectives for the ART-WiSe architecture are accomplished.
Abstract:
The goal of this paper is to discuss the benefits and challenges of building an inter-continental network of remote laboratories supported and used by both European and Latin American Institutions of Higher Education. Since remote experimentation, understood as the ability to carry out real-world experiments through a simple Web browser, is already a proven solution for the educational community as a supplement to on-site practical lab work (and in some cases, namely for distance learning courses, a replacement for that work), the purpose is not to discuss its technical, pedagogical, or economic strengths, but rather to raise and try to answer some questions about the underlying benefits and challenges of establishing a peer-to-peer network of remote labs. Ultimately, we regard such a network as a constructive mechanism to help students gain the working and social skills often valued by multinational/global companies, while also providing awareness of local cultural aspects.
Abstract:
The field of computer simulation has grown rapidly since its emergence and is currently one of the most widely used management science and operational research techniques. Its principle is based on replicating the operation of processes or systems over periods of time, making it an indispensable methodology for solving a wide variety of real-world problems, regardless of their complexity. Among its many application areas, across the most diverse fields, the one that stands out most is its use in production systems, where the range of available applications is very broad. It has been applied to solve problems in production systems because it allows companies to adjust and plan their operations and systems quickly, effectively and carefully, thus enabling rapid adaptation to the constantly changing needs of the global economy. Simulation applications and packages have followed technological trends, and the use of object-oriented technologies in their development is notable. This study was based, in a first phase, on gathering information supporting the concepts of modelling and simulation, as well as their application to real-time production systems. It then focused on the development of a prototype application for simulating manufacturing environments in real time. The development of this tool was aimed at possible pedagogical purposes and academic use; it is capable of simulating a model of a production system and is also equipped with animation. Without excluding the possibility of integrating other modules, or even other platforms, particular care was taken to ensure that its implementation relied on object-oriented development methodologies.
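To make the simulation principle concrete (replicating the operation of a process over time), here is a minimal discrete-event sketch of a single-machine production step with a FIFO queue. It is an illustration only, written in plain Python rather than the object-oriented prototype described in the thesis, and all parameters are invented.

```python
import random

def simulate_machine(n_jobs=20, mean_interarrival=4.0, mean_service=3.0, seed=1):
    """Tiny discrete-event simulation of one machine with a FIFO queue:
    jobs arrive at random, wait if the machine is busy, then are processed."""
    random.seed(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_jobs):
        t += random.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    machine_free_at = 0.0
    total_wait = 0.0
    for arrival in arrivals:
        start = max(arrival, machine_free_at)       # wait if the machine is busy
        service = random.expovariate(1.0 / mean_service)
        machine_free_at = start + service
        total_wait += start - arrival

    return total_wait / n_jobs                      # average waiting time per job

print(f"average waiting time: {simulate_machine():.2f} time units")
```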
Abstract:
Multi-criteria decision analysis (MCDA) has been one of the fastest-growing areas of operations research during the last decades. The academic attention devoted to MCDA has motivated the development of a great variety of approaches and methods within the field. These methods distinguish themselves in terms of procedures, theoretical assumptions and type of decision addressed. This diversity poses challenges to the process of selecting the most suitable method for a specific real-world decision problem. In this paper we present a case study of a real-world decision problem arising in the painting sector of an automobile plant. We tackle the problem by resorting to the well-known AHP method and to the MCDA method proposed by Pereira and Fontes (2012) (MMASSI). By relying on two, rather than one, MCDA methods we expect to improve the confidence and robustness of the obtained results. The contributions of this paper are twofold: first, we intend to investigate the contrasts and similarities of the results obtained by distinct MCDA approaches (AHP and MMASSI); second, we expect to enrich the literature of the field with a real-world MCDA case study of a complex decision-making problem, since there is a paucity of applied research addressing real decision problems faced by organizations.
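For readers unfamiliar with AHP, the sketch below shows the standard priority computation: weights are taken from the principal eigenvector of a pairwise comparison matrix, together with the consistency ratio. The matrix is purely illustrative (it is not the plant's data), and the MMASSI method is not reproduced here.

```python
import numpy as np

def ahp_priorities(pairwise):
    """AHP priority weights (normalised principal eigenvector) and consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    n = A.shape[0]
    lam_max = eigvals.real[k]
    ci = (lam_max - n) / (n - 1)                    # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)    # random index (standard table)
    return w, ci / ri                               # weights, consistency ratio

# Illustrative 3-criteria comparison matrix on Saaty's 1-9 scale.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_priorities(A)
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```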
Abstract:
Mobile applications for location-based services (LBS) provide services to the user based on their geographical location. This type of service began to appear back in the 1990s and, as the number of mobile devices grew exponentially, the offer increased considerably. There are several areas of practical applicability, but the focus of this thesis is the search for and location of points of interest (POIs). Through the sensors that mobile devices currently provide, it becomes possible to locate the user's position and present the points of interest situated around them. However, this information alone sometimes proves insufficient, since those points of interest are initially unknown to the user. Through the coolplaces service, a project dedicated to the search for and sharing of POIs, we can create our own network of friends and places, thus benefiting from the contextual information of a given POI. Technological innovations have also enabled the emergence of Augmented Reality applications on mobile devices, that is, applications capable of overlaying virtual images on views of the real world. Considering the visualization of POIs in a given environment, if we regard Augmented Reality as an enhancer of the user's interaction with the real world, we quickly identify the potential of combining these concepts in a single application. Therefore, the work developed in this thesis constitutes a study on the implementation and development of an Augmented Reality module for the mobile application of the coolplaces service, making use of the technology available on the market in order to provide an innovative experience and add value to that application.
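As a small illustration of the core LBS step (finding the POIs around the user's position), the sketch below filters POIs by great-circle distance. It is not the coolplaces API; the POI names, coordinates and radius are made up for the example.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS-84 coordinates."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_pois(user_lat, user_lon, pois, radius_km=1.0):
    """Return the POIs within radius_km of the user's position, nearest first."""
    hits = [(haversine_km(user_lat, user_lon, p["lat"], p["lon"]), p) for p in pois]
    return [p for d, p in sorted(hits, key=lambda h: h[0]) if d <= radius_km]

# Illustrative POIs around Porto (names and coordinates invented for the example).
pois = [
    {"name": "Cafe A", "lat": 41.1500, "lon": -8.6100},
    {"name": "Museum B", "lat": 41.1610, "lon": -8.6030},
]
print([p["name"] for p in nearby_pois(41.1496, -8.6109, pois)])
```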
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to prove theoretical results or to test a developing hardware/product. Although this is an interesting solution (a low-cost, easy and fast way to carry out some coursework), it has major disadvantages. As everything is currently being done with/in a computer, students are losing the "feel" for the real values of the magnitudes. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used consists of simulated or random numbers, but real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that allows it to be used by different courses in which computers are central to the teaching/learning process, in order to give students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light, wind speed, which are either connected to a central server that students can access over Ethernet or connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use the sensor values in their different courses and, consequently, in different types of software such as numerical analysis tools, spreadsheets, or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools: sensors that are not available at a given school can be used by getting the values from other places that share them. Another remark is that students in the more advanced years, with (theoretically) more know-how, can build new sensor pieces in the courses related to electronics development and expand the framework further. The final solution is very interesting: low cost, simple to develop, and flexible in its use of resources, since the same materials serve several courses, bringing real-world data into the students' computer work.
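A minimal sketch of how a student program might pull one real reading from the central server is shown below. The hostname, port and line format are hypothetical, since the abstract does not specify the framework's actual protocol; only the general idea (fetch a sensor value over the network and use it as course data) is illustrated.

```python
import socket

def read_temperature(host="sensors.example.edu", port=5000, timeout=5.0):
    """Fetch one reading from a (hypothetical) sensor server that answers a
    plain-text request with a single line such as 'temperature 21.4'."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(b"GET temperature\n")
        line = s.makefile().readline().strip()
    name, value = line.split()
    return float(value)

# The reading could then feed a spreadsheet export, a numerical-analysis
# exercise, or any other course assignment that needs real data.
if __name__ == "__main__":
    try:
        print("room temperature:", read_temperature(), "C")
    except OSError as exc:
        print("sensor server not reachable in this sketch:", exc)
```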
Abstract:
Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as the optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed, due to the interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources even more difficult and a challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance deviation estimator and a scheduling algorithm to guide the resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill the contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
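The toy scoring rule below only illustrates the flavour of an interference- and power-aware placement decision (prefer hosts with spare capacity, penalise co-locating workloads of the same type, penalise waking an idle host). It is not the paper's estimator or scheduling algorithm; the weights and data structures are invented.

```python
def placement_score(host, vm, interference_penalty=0.5, power_weight=0.3):
    """Toy score for placing a VM on a host: reward leftover CPU, penalise
    co-locating workloads of the same type (interference proxy) and penalise
    powering on an idle host (energy proxy)."""
    if host["free_cpu"] < vm["cpu"]:
        return float("-inf")                          # does not fit
    score = host["free_cpu"] - vm["cpu"]              # leftover capacity
    same_type = sum(1 for w in host["vms"] if w["type"] == vm["type"])
    score -= interference_penalty * same_type         # estimated interference
    if not host["vms"]:
        score -= power_weight * host["idle_power"]    # cost of waking an idle host
    return score

hosts = [
    {"name": "h1", "free_cpu": 8,  "idle_power": 100, "vms": [{"type": "cpu", "cpu": 4}]},
    {"name": "h2", "free_cpu": 16, "idle_power": 100, "vms": []},
]
vm = {"type": "cpu", "cpu": 4}
best = max(hosts, key=lambda h: placement_score(h, vm))
print("place on:", best["name"])
```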
Abstract:
The best places to locate Gas Supply Units (GSUs) in a natural gas system, and their optimal allocation to loads, are the key factors in organizing an efficient upstream gas infrastructure. The number of GSUs and their optimal location in a gas network constitute a decision problem that can be formulated as a linear programming problem. Our emphasis is on the formulation and use of a suitable location model, reflecting real-world operations and constraints of a natural gas system. This paper presents a heuristic model, based on a Lagrangian approach, developed for finding the optimal GSU locations in a natural gas network, minimizing expenses and maximizing throughput and security of supply. The location model is applied to the Iberian high-pressure natural gas network, a system modelled with 65 demand nodes. These nodes are linked by physical and virtual pipelines, i.e. road trucks carrying gas in liquefied form. The location model results show the best places to locate the GSUs, with the optimal demand allocation and the most economical gas transport mode: by pipeline or by road truck.
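To make the linear programming formulation concrete, here is a minimal facility-location sketch of the kind the abstract describes, using the open-source pulp package. The sites, demand nodes and costs are invented for illustration; the paper's Lagrangian heuristic, throughput and security-of-supply terms are not reproduced.

```python
import pulp

# Candidate GSU sites, demand nodes and illustrative costs (not the Iberian network data).
sites = ["S1", "S2", "S3"]
nodes = ["N1", "N2", "N3", "N4"]
open_cost = {"S1": 100, "S2": 120, "S3": 90}
supply_cost = {(s, n): c
               for s, row in zip(sites, [[4, 6, 9, 7], [5, 3, 4, 6], [8, 7, 3, 2]])
               for n, c in zip(nodes, row)}

prob = pulp.LpProblem("gsu_location", pulp.LpMinimize)
y = {s: pulp.LpVariable(f"open_{s}", cat="Binary") for s in sites}            # open site s
x = {(s, n): pulp.LpVariable(f"assign_{s}_{n}", cat="Binary")                  # s serves n
     for s in sites for n in nodes}

# Minimise opening costs plus supply costs.
prob += pulp.lpSum(open_cost[s] * y[s] for s in sites) + \
        pulp.lpSum(supply_cost[s, n] * x[s, n] for s in sites for n in nodes)
for n in nodes:
    prob += pulp.lpSum(x[s, n] for s in sites) == 1        # every demand node is served
for s in sites:
    for n in nodes:
        prob += x[s, n] <= y[s]                             # only open sites may serve

prob.solve(pulp.PULP_CBC_CMD(msg=0))
print("open sites:", [s for s in sites if y[s].value() > 0.5])
```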
Abstract:
This chapter addresses the resolution of dynamic scheduling by means of meta-heuristics and multi-agent systems. Scheduling is an important aspect of automation in manufacturing systems. Several contributions have been proposed, but the problem is far from being solved satisfactorily, especially when scheduling concerns real-world applications. The proposed multi-agent scheduling system assumes the existence of several resource agents (decision-making entities based on meta-heuristics) distributed inside the manufacturing system, which interact with other agents in order to obtain optimal or near-optimal global performance.
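As a rough sketch of what one resource agent's meta-heuristic might look like, the code below runs simulated annealing over the agent's local job sequence to reduce total tardiness. The jobs, the objective and the cooling schedule are invented for illustration; the inter-agent interaction discussed in the chapter is not shown.

```python
import math
import random

def total_tardiness(sequence, jobs):
    """Sum of tardiness for a job sequence processed on a single resource."""
    t, tard = 0, 0
    for j in sequence:
        t += jobs[j]["proc"]
        tard += max(0, t - jobs[j]["due"])
    return tard

def anneal_schedule(jobs, iters=5000, t0=10.0, cooling=0.999, seed=0):
    """Simulated annealing over the local dispatch sequence of one resource agent."""
    random.seed(seed)
    seq = list(jobs)
    best = cur = total_tardiness(seq, jobs)
    best_seq, temp = seq[:], t0
    for _ in range(iters):
        i, k = random.sample(range(len(seq)), 2)
        seq[i], seq[k] = seq[k], seq[i]                    # swap two jobs
        cand = total_tardiness(seq, jobs)
        if cand <= cur or random.random() < math.exp((cur - cand) / temp):
            cur = cand
            if cur < best:
                best, best_seq = cur, seq[:]
        else:
            seq[i], seq[k] = seq[k], seq[i]                # undo the swap
        temp *= cooling
    return best_seq, best

jobs = {"J1": {"proc": 4, "due": 5}, "J2": {"proc": 2, "due": 3}, "J3": {"proc": 6, "due": 8}}
print(anneal_schedule(jobs))
```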
Abstract:
One of the most difficult problems that researchers face when experimenting with complex systems in real-world applications is the Facility Layout Design Problem. It deals with the design and location of production lines, machinery and equipment, inventory storage and shipping facilities. In this work we intend to address this problem through the use of Constraint Logic Programming (CLP) technology. The use of Genetic Algorithms (GA) as an optimisation technique in a CLP environment is also addressed. The approach aims at implementing genetic algorithm operators following the CLP paradigm.
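For reference, these are the standard permutation-based GA operators (order crossover and swap mutation) that such an approach typically relies on, shown here in plain Python as an illustration only, not the CLP implementation discussed in the work. The facility names are invented.

```python
import random

def order_crossover(parent1, parent2, rng=random):
    """Order crossover (OX) for layout permutations: copy a slice from parent1,
    then fill the remaining positions in the order the missing facilities
    appear in parent2."""
    n = len(parent1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b] = parent1[a:b]
    rest = [f for f in parent2 if f not in child]
    for i in range(n):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def swap_mutation(layout, rate=0.2, rng=random):
    """With probability `rate`, swap the positions of two facilities."""
    layout = layout[:]
    if rng.random() < rate:
        i, k = rng.sample(range(len(layout)), 2)
        layout[i], layout[k] = layout[k], layout[i]
    return layout

random.seed(0)
p1 = ["press", "lathe", "mill", "paint", "storage"]
p2 = ["storage", "paint", "press", "mill", "lathe"]
print(swap_mutation(order_crossover(p1, p2)))
```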
Abstract:
This paper analyzes DNA information using entropy and phase plane concepts. First, the DNA code is converted into a numerical format by means of histograms that capture DNA sequences of length ranging from one up to ten bases. This strategy measures dynamical evolutions over 4 up to 4^10 signal states. The resulting histograms are analyzed using three distinct entropy formulations, namely the Shannon, Rényi and Tsallis definitions. Charts of entropy versus sequence length are produced for a set of twenty-four species, characterizing 486 chromosomes. The information is synthesized and visualized by adapting phase plane concepts, leading to a categorical representation of chromosomes and species.
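The sketch below shows the standard Shannon, Rényi and Tsallis entropy definitions applied to histograms of length-k DNA words, which is the kind of computation the abstract describes. The short sequence and the k range are illustrative only; a real analysis would read whole chromosomes from FASTA files and sweep k = 1..10.

```python
import math
from collections import Counter

def kmer_probabilities(seq, k):
    """Relative frequencies of the length-k words occurring in a DNA sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return [c / total for c in counts.values()]

def shannon(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha=2.0):
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, q=2.0):
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

# Illustrative sequence; real chromosomes would come from FASTA files.
seq = "ATGCGTACGTTAGCATCGATCGGCTAGCTAGGATCCGATATGCGTAACGT"
for k in range(1, 4):
    p = kmer_probabilities(seq, k)
    print(k, round(shannon(p), 3), round(renyi(p), 3), round(tsallis(p), 3))
```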