15 results for Shadow and Highlight Invariant Algorithm.
at Instituto Politécnico do Porto, Portugal
Abstract:
The Casa da Música Foundation, responsible for managing the Casa da Música do Porto building, needs to obtain statistical data on the number of the building's visitors. This information is a valuable tool for producing periodic reports on the success of this cultural institution. For this reason, it was necessary to develop a system capable of returning the number of visitors for a requested period of time. This is a complex task due to the building's unique architectural design, characterized by very large doors and halls, and the sudden surges of people passing through them in the moments preceding and following the different activities held in the building. To arrive at a technical solution, several image-processing methods for people detection with still cameras were first studied. The next step was the development of a real-time algorithm, using the OpenCV libraries and computer vision concepts, to count individuals with the desired accuracy. This algorithm incorporates the scientific and technical knowledge acquired in the study of those methods. The themes developed in this thesis span the fields of background maintenance, shadow and highlight detection, and blob detection and tracking. As a complement to the work, a graphical interface was also built to support the development, testing and tuning of the proposed system. Furthermore, the system was tested to validate the proposed techniques under a limited set of circumstances. The results revealed that the algorithm successfully counts the number of people in complex environments with reliable accuracy.
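A minimal sketch of such a pipeline, assuming OpenCV's MOG2 background subtractor as the background-maintenance stage (the thesis' exact models and thresholds are not given in the abstract); OpenCV marks shadow pixels as 127, so they can be discarded before blob detection:

```python
import cv2

# detectShadows=True marks shadow pixels with 127 instead of 255,
# so shadows can be removed by thresholding before counting blobs.
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

def count_blobs(frame, min_area=500):
    mask = subtractor.apply(frame)
    # Keep only confident foreground (255); drop shadows (127).
    _, fg = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    # Morphological opening removes speckle noise before contour extraction.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Each sufficiently large connected blob is counted as a person candidate.
    return sum(1 for c in contours if cv2.contourArea(c) > min_area)
```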
Abstract:
Genetic Algorithms (GAs) are adaptive heuristic search algorithms based on the evolutionary ideas of natural selection and genetics. The basic concept of GAs is to simulate the processes in natural systems necessary for evolution, specifically those that follow the principle of survival of the fittest first laid down by Charles Darwin. Particle Swarm Optimization (PSO), on the other hand, is a population-based stochastic optimization technique inspired by the social behavior of bird flocking or fish schooling. PSO shares many similarities with evolutionary computation techniques such as GAs: the system is initialized with a population of random solutions and searches for optima by updating generations. However, unlike GAs, PSO has no evolution operators such as crossover and mutation. In PSO, the potential solutions, called particles, fly through the problem space by following the current optimum particles. PSO is attractive because it has few parameters to adjust. This paper presents a hybridization of a GA and a PSO algorithm (crossing the two algorithms). The resulting algorithm is applied to the synthesis of combinational logic circuits. This combination makes it possible to take advantage of the best features of each algorithm.
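As a rough illustration of this kind of hybridization, the sketch below injects GA-style crossover and mutation into a standard PSO velocity/position update. The operators, rates and fitness interface are illustrative assumptions; the paper's actual crossover scheme for logic-circuit synthesis is not reproduced here.

```python
import random

def pso_ga_step(positions, velocities, pbest, gbest, fitness,
                w=0.7, c1=1.5, c2=1.5, pc=0.3, pm=0.05):
    """One generation of a hybrid GA-PSO (fitness is maximized)."""
    n, dim = len(positions), len(positions[0])
    for i in range(n):
        # Standard PSO velocity and position update.
        for d in range(dim):
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * random.random() * (pbest[i][d] - positions[i][d])
                                + c2 * random.random() * (gbest[d] - positions[i][d]))
            positions[i][d] += velocities[i][d]
        # GA-style arithmetic crossover with a random partner.
        if random.random() < pc:
            j = random.randrange(n)
            positions[i] = [(a + b) / 2 for a, b in zip(positions[i], positions[j])]
        # GA-style Gaussian mutation.
        for d in range(dim):
            if random.random() < pm:
                positions[i][d] += random.gauss(0, 1)
        # Update personal and global bests.
        if fitness(positions[i]) > fitness(pbest[i]):
            pbest[i] = positions[i][:]
            if fitness(pbest[i]) > fitness(gbest):
                gbest[:] = pbest[i]
    return positions, velocities, pbest, gbest
```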
Abstract:
This paper presents the legal background that supports the dissemination of, and access to, documents from the European institutions, namely the Parliament, the Council and the European Commission. Currently, this legal framework is complemented by a set of Internet tools, which are analyzed with regard to the types of official documents and the search options available. Some statistical data on access to European information, published in the institutions' annual reports, are also evaluated. The relationship between shadow and light in the transparency of access to administrative documents, and the marketing issues of political communication, are underlined. A neo-institutional approach, the concept of reputation in public organizations and a systemic perspective are used as the theoretical background.
Abstract:
Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality-of-service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as the optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed, due to interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources even more difficult, and a challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance-deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill the contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
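A hedged sketch of the kind of scheduling decision this implies: candidate hosts are filtered by capacity and scored by a weighted combination of estimated interference-induced performance deviation and power increase. All names, dictionary fields and the weight alpha are illustrative assumptions, not the paper's actual mechanism.

```python
def pick_host(hosts, vm, deviation_estimate, power_increase, alpha=0.5):
    """Return the feasible host minimizing a weighted interference/power score.

    Assumes at least one host has enough free CPU and network capacity.
    """
    feasible = [h for h in hosts
                if h["free_cpu"] >= vm["cpu"] and h["free_net"] >= vm["net"]]
    # Lower score = less expected interference and smaller power increase.
    return min(feasible,
               key=lambda h: alpha * deviation_estimate(h, vm)
                             + (1 - alpha) * power_increase(h, vm))
```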
Abstract:
This paper presents a simulator for electric vehicles in the context of smart grids and distribution networks. It aims to support network operators' planning and operations, but can be used by other entities for related studies. The paper describes the parameters supported by the current version of the Electric Vehicle Scenario Simulator (EVeSSi) tool and its current algorithm. EVeSSi enables the definition of electric vehicle scenarios on distribution networks using a built-in movement engine. The scenarios created with EVeSSi can be used by external tools (e.g., power flow analysis) for specific studies, for instance of grid impacts. Two scenarios are briefly presented to illustrate the simulator's capabilities.
Abstract:
The oceans remain a major source of natural compounds with potential in pharmacology. In particular, during the last few decades, marine cyanobacteria have been in focus as producers of interesting bioactive compounds, especially for the treatment of cancer. In this study, the anticancer potential of extracts from twenty-eight marine cyanobacteria strains, belonging to the underexplored picoplanktonic genera Cyanobium, Synechocystis and Synechococcus and the filamentous genera Nodosilinea, Leptolyngbya, Pseudanabaena and Romeria, was assessed in eight human tumor cell lines. First, a crude extract was obtained by dichloromethane:methanol extraction, and from it three fractions were separated by Si column chromatography. The crude extract and fractions were tested in eight human cancer cell lines for cell viability/toxicity, assessed with the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) and lactate dehydrogenase release (LDH) assays. Of the strains, 8.9% revealed strong cytotoxicity, 17.8% showed moderate cytotoxicity, and 14.3% of the assays showed low toxicity. The results revealed that the studied genera of marine cyanobacteria are a promising source of novel compounds with potential anticancer activity, and highlight the interest in also exploring the smaller filamentous and picoplanktonic genera of cyanobacteria.
Abstract:
Consider the problem of designing an algorithm for acquiring sensor readings. Consider specifically the problem of obtaining an approximate representation of sensor readings where (i) sensor readings originate from different sensor nodes, (ii) the number of sensor nodes is very large, (iii) all sensor nodes are deployed in a small area (a dense network) and (iv) all sensor nodes communicate over a communication medium where at most one node can transmit at a time (a single broadcast domain). We present an efficient algorithm for this problem with two desirable properties: (i) it obtains an interpolation based on all sensor readings and (ii) it is scalable, that is, its time complexity is independent of the number of sensor nodes. Achieving these two properties is possible thanks to the close interlinking of the information-processing algorithm, the communication system and a model of the physical world.
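As one concrete (assumed) form of the interpolation step, the sketch below uses inverse-distance weighting over the k readings actually obtained from the channel; the medium-access scheme that selects those readings in time independent of the number of nodes is not modeled here.

```python
def interpolate(point, readings, p=2):
    """Inverse-distance-weighted estimate at `point`.

    readings: list of ((x, y), value) pairs obtained from the channel.
    """
    num = den = 0.0
    for (x, y), value in readings:
        d2 = (x - point[0]) ** 2 + (y - point[1]) ** 2
        if d2 == 0:
            return value  # exact sample at the query point
        w = 1.0 / d2 ** (p / 2)  # weight decays with distance^p
        num += w * value
        den += w
    return num / den
```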
Abstract:
Doctoral Thesis.
Abstract:
In this study, a new waste management solution for thermoset glass fibre reinforced polymer (GFRP) based products was assessed. A mechanical recycling approach, reducing GFRP waste to powdered and fibrous materials, was applied, and the prospective added value of the obtained recyclates was experimentally investigated as raw material for polyester-based mortars. Different GFRP waste admixed mortar formulations were analyzed, varying the content of the GFRP powder and fibre waste mix between 4% and 12% by weight. The effect of incorporating a silane coupling agent was also assessed. Design of experiments and data treatment were accomplished through a full factorial design and analysis of variance (ANOVA). The added value of the potential recycling solution was assessed by comparing the flexural and compressive loading capacity of the GFRP waste admixed mortars with that of unmodified polymer mortars. The key findings of this study show a viable technological option for improving the quality of polyester-based mortars and highlight a potentially cost-effective waste management solution for thermoset composite materials in the production of sustainable concrete-polymer based products.
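For illustration, a full factorial design simply crosses every level of every factor. The sketch below assumes two factors drawn from the abstract (waste content within the stated 4-12 wt% range, and presence of the silane coupling agent); the level choices are hypothetical.

```python
from itertools import product

waste_content = [4, 8, 12]   # GFRP waste, % by weight (illustrative levels)
silane = [False, True]       # with / without silane coupling agent

# Full factorial: one run per combination of factor levels.
runs = list(product(waste_content, silane))
# Each (content, silane) pair defines one mortar formulation to test
# for flexural and compressive strength, later analyzed with ANOVA.
```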
Abstract:
Consider the problem of assigning implicit-deadline sporadic tasks on a heterogeneous multiprocessor platform comprising a constant number (denoted by t) of distinct types of processors; such a platform is referred to as a t-type platform. We present two algorithms, LPGIM and LPGNM, each providing the following guarantee. For a given t-type platform and a task set, if there exists a task assignment such that tasks can be scheduled to meet their deadlines by allowing them to migrate only between processors of the same type (intra-migrative), then: (i) LPGIM succeeds in finding such an assignment, where the same restriction on task migration applies (intra-migrative), but given a platform in which only one processor of each type is 1 + α·(t−1)/t times faster, and (ii) LPGNM succeeds in finding a task assignment where tasks are not allowed to migrate between processors (non-migrative), but given a platform in which every processor is 1 + α times faster. The parameter α is a property of the task set; it is the maximum of all task utilizations that are no greater than one. To the best of our knowledge, for t-type heterogeneous multiprocessors: (i) for the problem of intra-migrative task assignment, no previous algorithm exists with a proven bound, and hence our algorithm, LPGIM, is the first of its kind, and (ii) for the problem of non-migrative task assignment, our algorithm, LPGNM, has superior performance compared to the state of the art.
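Written out, the two speed-up factors stated in the abstract are:

```latex
\[
  \text{LPGIM (intra-migrative): } 1 + \alpha\,\frac{t-1}{t},
  \qquad
  \text{LPGNM (non-migrative): } 1 + \alpha,
\]
\[
  \text{where } \alpha = \max\{\, u_i : u_i \le 1 \,\}
  \text{ over the task utilizations } u_i .
\]
```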
Abstract:
Human intervention in the operation of remotely operated underwater vehicles (ROVs) is a necessary requirement to guarantee mission success and equipment integrity. However, their teleoperation is not easy, which makes assisted driving of these vehicles relevant. This dissertation proposes a solution to this problem for ROVs with 3 DOF (surge, heave and yaw). Two distinct approaches are proposed. The first is an Image Based Visual Servoing (IBVS) control system relying exclusively on a camera (a sensor already present in this type of system) in order to significantly improve the teleoperation of a small ROV. The second is a kinematic control system for the vehicle's horizontal plane, together with a maneuver algorithm that provides the ROV with lateral motion through a sawtooth trajectory. It was demonstrated in real operating scenarios that the system proposed in the first approach allows the operator of a 3-DOF ROV to execute tasks of some complexity (stabilization) using only high-level commands, thus drastically improving the teleoperation and inspection quality of the vehicle in question. An ROV simulator was also developed in MATLAB for the validation and evaluation of the maneuvers, in which the system proposed in the second approach was successfully validated.
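A minimal sketch of the classical IBVS control law that such a system typically builds on, v = -λ L⁺ e, assuming point features, a known interaction matrix L and an illustrative gain; the dissertation's actual feature set and controller details are not given in the abstract.

```python
import numpy as np

def ibvs_velocity(L, s, s_star, lam=0.5):
    """Camera velocity screw from the image-feature error e = s - s_star.

    L: interaction (image Jacobian) matrix, shape (k, 6)
    s, s_star: current and desired feature vectors, shape (k,)
    """
    e = s - s_star
    # Classical proportional IBVS law using the pseudo-inverse of L.
    return -lam * np.linalg.pinv(L) @ e
```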
Abstract:
The success of organizations is only possible by meeting or exceeding market expectations. The basis for success therefore lies in the efficiency with which each organization fulfils its mission, which is only possible through communications and production/service processes that add value to the goods it provides. Aware of these principles, organizations have been adopting methodologies aimed at eliminating waste in their businesses. This work stems from the need of a technology-based industrial organization, TRIDEC, to identify whether its business contains waste originating in the relationship between its two industrial units, located in the Netherlands and in Portugal. The work was made possible by a partnership between students of Avans University, in the Netherlands, and of ISEP (the author of this work), and was carried out in two phases. The first phase was led by the Dutch students and consisted of an analysis of the communication flows between the two units. Those results are presented in the Dutch colleagues' report and highlight the need for a stronger relationship of commitment between the two organizations. The second phase focused on the Portuguese unit, analyzing its performance with an emphasis on Production, and culminated in the calculation of the OEE (Overall Equipment Effectiveness) for the Machining Section.
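For reference, OEE is conventionally computed as availability × performance × quality. The sketch below uses that standard formula with illustrative figures, not TRIDEC's actual data.

```python
def oee(run_time, planned_time, ideal_cycle, units_produced, good_units):
    """Overall Equipment Effectiveness = availability * performance * quality."""
    availability = run_time / planned_time
    performance = (ideal_cycle * units_produced) / run_time
    quality = good_units / units_produced
    return availability * performance * quality

# Example: 400 min of run time out of 480 planned, 1.0 min ideal cycle time,
# 350 units produced, 340 good -> OEE of about 0.71.
print(oee(400, 480, 1.0, 350, 340))
```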
Abstract:
Over the last few years, 3D scanners have seen increasing use in a wide range of areas. From Medicine to Archaeology, and across various types of industry, applications of these systems can be identified. This growing use is due, among other factors, to the increase in computational resources, the simplicity and diversity of the existing techniques, and the advantages of 3D scanners over other systems. These advantages are evident in areas such as Forensic Medicine, where photography, traditionally used to document objects and evidence, reduces the acquired information to two dimensions. Despite the advantages of 3D scanners, one drawback is their high price. This work aimed to develop an inexpensive and effective structured-light 3D scanner, together with a set of algorithms to control the scanner, to reconstruct the surfaces of the analyzed structures, and to validate the results obtained. The implemented 3D scanner consists of an off-the-shelf camera and video projector, and a rotating platform developed in this work. The purpose of the rotating platform is to automate the scanner so as to reduce user interaction. The algorithms were developed using open-source software packages and free tools. The 3D scanner was used to acquire 3D information from a skull, and the surface-reconstruction algorithm produced virtual surfaces of the skull. Using the validation algorithm, the obtained surfaces were compared with a surface of the same skull obtained by computed tomography (CT). The validation algorithm provided a map of distances between corresponding regions on the two surfaces, which made it possible to quantify the quality of the obtained surfaces. Based on the work carried out and the results obtained, it can be stated that a functional basis for 3D surface scanning of structures was created, ready for future development and showing that it is possible to obtain alternatives to commercial methods using limited financial resources.
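A minimal sketch of the validation step, assuming both surfaces are available as point clouds and using a nearest-neighbor distance map (scipy is an assumed dependency; the thesis' actual correspondence method may differ).

```python
import numpy as np
from scipy.spatial import cKDTree

def distance_map(scan_points, reference_points):
    """Nearest-neighbor distance from each scanned point to the CT surface.

    scan_points, reference_points: arrays of shape (n, 3) and (m, 3).
    """
    tree = cKDTree(reference_points)
    distances, _ = tree.query(scan_points)
    # Summarize e.g. with distances.mean() or distances.max() to
    # quantify the quality of the reconstructed surface.
    return distances
```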
Abstract:
Wireless Underground Networks (WUN) are formed by nodes that communicate with each other over wireless links, with the soil as the propagation medium. The localization systems most widely used today have disadvantages in terms of precision and cost. This thesis proposes a precision localization solution that relies on wireless underground networks and a Wi-Fi-based positioning algorithm. The goal is to estimate the location of objects using low-cost Wi-Fi devices. The experimental results show that the localization error is below 0.40 m, and that this solution is viable, for example, for locating players on a football pitch or locating an object in an agricultural field.
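A hedged sketch of RSSI-based positioning of the kind described, assuming a log-distance path-loss model and linear least-squares trilateration; the calibration constants (reference RSSI, path-loss exponent) are illustrative assumptions, not the thesis' through-soil calibration.

```python
import numpy as np

def rssi_to_distance(rssi, rssi_at_1m=-40.0, path_loss_exp=2.5):
    """Invert the log-distance path-loss model to estimate range in meters."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * path_loss_exp))

def trilaterate(anchors, distances):
    """Linear least-squares position fix from >= 3 anchor/distance pairs.

    anchors: list of (x, y) anchor coordinates; distances: matching ranges.
    """
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    # Subtracting the first circle equation from each other one
    # linearizes the system in the unknown position (x, y).
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos  # estimated (x, y)
```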