894 results for Large-scale enterprises
Abstract:
The hidden-node problem has been shown to be a major source of Quality-of-Service (QoS) degradation in Wireless Sensor Networks (WSNs), due to factors such as the limited communication range of sensor nodes, link asymmetry and the characteristics of the physical environment. In wireless contention-based Medium Access Control protocols, if two nodes that are not visible to each other transmit to a third node that is visible to both, there will be a collision, usually called a hidden-node or blind collision. This problem greatly affects network throughput, energy efficiency and message transfer delays, and can be particularly dramatic in large-scale WSNs. This paper tackles the hidden-node problem in WSNs and proposes H-NAMe, a simple yet efficient distributed mechanism to overcome it. H-NAMe relies on a grouping strategy that splits each cluster of a WSN into disjoint groups of non-hidden nodes, and then scales to multiple clusters via a cluster-grouping strategy that guarantees no transmission interference between overlapping clusters. We also show that the H-NAMe mechanism can easily be applied to the IEEE 802.15.4/ZigBee protocols with only minor add-ons, while ensuring backward compatibility with the standard specifications. We demonstrate the feasibility of H-NAMe in an experimental test-bed, showing that it increases network throughput and transmission success probability up to twice the values obtained without H-NAMe. We believe that these results will be quite useful in efficiently enabling IEEE 802.15.4/ZigBee as a WSN protocol.
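The grouping idea can be illustrated with a rough sketch (not the authors' H-NAMe algorithm; the topology and function names are hypothetical): greedily partition a cluster's nodes into groups whose members all hear one another, so that no intra-group transmission can suffer a hidden-node collision.

```python
# Sketch of a greedy non-hidden grouping (illustrative, not H-NAMe itself):
# place each node in the first group all of whose members it can hear,
# otherwise open a new group. Visibility is assumed symmetric.

def group_non_hidden(nodes, visible):
    """nodes: iterable of node ids; visible(a, b) -> True if a and b hear each other."""
    groups = []
    for n in nodes:
        for g in groups:
            if all(visible(n, m) for m in g):  # n hears every current member
                g.append(n)
                break
        else:
            groups.append([n])                 # n is hidden from some member: new group
    return groups

# Toy topology: 1-2 and 2-3 hear each other, but 1 and 3 are mutually hidden.
links = {(1, 2), (2, 3)}
vis = lambda a, b: (a, b) in links or (b, a) in links
print(group_non_hidden([1, 2, 3], vis))  # [[1, 2], [3]]
```

This first-fit scheme is essentially a clique partitioning of the visibility graph; within a group, carrier sensing works for every pair, which is the property the grouping strategy exploits.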
Abstract:
In this paper we focus on large-scale, dense Cyber-Physical Systems and discuss methods that tightly integrate communication and computing with the underlying physical environment. We present the Physical Dynamic Priority Dominance ((PD)²) protocol, which exemplifies a key mechanism for devising low time-complexity communication protocols for large-scale networked sensor systems. We show that, using this mechanism, one can compute aggregate quantities such as the maximum or minimum of sensor readings with a time complexity equivalent to essentially one message exchange. We also illustrate the use of this mechanism in the more complex task of computing interpolations of smooth as well as non-smooth sensor data in very low time complexity.
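The dominance mechanism can be sketched as a binary-countdown arbitration (an illustrative simulation under assumed wired-OR channel semantics, not the authors' exact (PD)² protocol): every node transmits its reading bit by bit, most significant bit first, and withdraws the first time it stays silent while some other node transmits; after k bit slots the channel has spelled out the maximum.

```python
# Hedged sketch of dominance-based MAX over a shared wired-OR channel:
# each bit slot carries the OR of the transmitting nodes' current bits,
# and nodes whose bit is 0 withdraw whenever the slot reads 1.

def dominance_max(readings, k=8):
    alive = list(range(len(readings)))
    result = 0
    for bit in reversed(range(k)):                              # MSB first
        channel = any((readings[i] >> bit) & 1 for i in alive)  # wired-OR slot
        if channel:
            result |= 1 << bit
            alive = [i for i in alive if (readings[i] >> bit) & 1]  # losers drop out
    return result

print(dominance_max([17, 42, 5]))  # 42
```

The cost is one k-slot exchange regardless of the number of nodes, which is the "essentially one message exchange" property claimed above.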
Abstract:
Dissertation presented at the Universidade do Minho to obtain the degree of Doctor in Information Systems and Technologies (Information Systems Engineering and Management)
Abstract:
Given the current complexity of communication protocols, implementing all of their layers inside the operating-system kernel is too cumbersome, and it forgoes capabilities that are only available to user-space processes. However, building protocols as user-space processes must not impair the responsiveness of the communication. This paper therefore presents a layer of a communication protocol which, due to its complexity, was implemented in a user-space process; the lower layers of the protocol are, for responsiveness reasons, implemented in the kernel. The protocol was developed to support large-scale power-line communication (PLC) with timing requirements.
Abstract:
The emergence of smartphones with Wireless LAN (WiFi) network interfaces brought new challenges to application developers. The expected increase in user connectivity will raise user expectations, for example regarding the performance of background applications. Unfortunately, studies of the new patterns of user mobility and connectivity brought about by smartphones are still too few and too narrow to support this claim. This paper contributes preliminary results from a large-scale study of the usage patterns of about 49,000 devices and 31,000 users who accessed at least one access point of the eduroam WiFi network on the campuses of the Lisbon Polytechnic Institute. The results confirm that the growing number of smartphones has significantly changed the pattern of use, with an impact on traffic volume and user connection times.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Environmental Engineering
Abstract:
Renewable energy sources (RES) have unique characteristics that grant them preference in energy and environmental policies. However, since renewable resources are barely controllable and sometimes unpredictable, integrating high shares of renewable sources into power systems raises several challenges. To mitigate this problem, this paper presents a decision-making methodology for renewable investments. The model computes the optimal renewable generation mix from the available technologies (hydro, wind and photovoltaic) that integrates a given share of renewable sources while minimizing residual demand variability, thereby stabilizing thermal power generation. The model also includes a spatial optimization of wind farms to identify the best distribution of wind capacity. The methodology is applied to the Portuguese power system.
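The core optimization can be illustrated with a toy version (hypothetical profiles and numbers, not the paper's model or data): exhaustively search capacity mixes of hydro, wind and PV that meet a target renewable share, and keep the mix minimizing the variance of residual demand.

```python
# Toy residual-demand-variance minimization over a discrete capacity grid.
# All profiles and loads below are invented for illustration.

from statistics import pvariance

demand = [30, 28, 27, 29, 35, 42, 46, 44]           # hypothetical hourly load
profiles = {                                         # per-unit hourly output
    "hydro": [0.6] * 8,
    "wind":  [0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.4, 0.5],
    "pv":    [0.0, 0.0, 0.1, 0.4, 0.8, 0.9, 0.6, 0.2],
}
target = 0.4 * sum(demand)                           # 40% renewable share
sums = {t: sum(p) for t, p in profiles.items()}      # energy per unit capacity

best = None
for h in range(41):
    for w in range(41):
        for p in range(41):
            caps = {"hydro": h, "wind": w, "pv": p}
            if abs(sum(caps[t] * sums[t] for t in caps) - target) > 1.0:
                continue                             # share constraint (+/- 1 unit)
            residual = [demand[i] - sum(caps[t] * profiles[t][i] for t in caps)
                        for i in range(8)]
            var = pvariance(residual)                # variability of thermal duty
            if best is None or var < best[0]:
                best = (var, caps)

print(best)
```

A real model would add costs, spatial wind-siting variables and many more periods, but the objective, flattening what thermal plants must follow, is the same.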
Abstract:
An experimental and Finite Element study was performed on the bending behaviour of wood beams of the Pinus pinaster species repaired with adhesively-bonded carbon–epoxy patches after sustaining damage by cross-grain failure. This damage is characterized by crack growth at a small angle to the beam's longitudinal axis, due to misalignment between the wood fibres and the beam axis. Cross-grain failure can occur on a large scale in a wood member when trees that have grown spirally or with a pronounced taper are cut for lumber. Three patch lengths were tested. The simulations include the possibility of cohesive fracture of the adhesive layer, failure within the wood beam along two propagation planes, and patch interlaminar failure, by means of cohesive zone modelling. The respective cohesive properties were estimated either by an inverse method or from the literature. The comparison with the tests validated the proposed methodology, opening good prospects for reducing the design-stage costs of these repairs by cutting down on extensive experimentation.
Abstract:
Dissertation presented to obtain a Master's degree in Computer Science
Abstract:
Dynamically reconfigurable SRAM-based field-programmable gate arrays (FPGAs) enable the implementation of reconfigurable computing systems where several applications may be run simultaneously, sharing the available resources according to their own immediate functional requirements. To exclude malfunctioning due to faulty elements, the reliability of all FPGA resources must be guaranteed. Since resource allocation takes place asynchronously, an online structural test scheme is the only way of ensuring reliable system operation. On the other hand, this test scheme should not disturb the operation of the circuit, otherwise availability would be compromised. System performance is also influenced by the efficiency of the management strategies that must be able to dynamically allocate enough resources when requested by each application. As those resources are allocated and later released, many small free resource blocks are created, which are left unused due to performance and routing restrictions. To avoid wasting logic resources, the FPGA logic space must be defragmented regularly. This paper presents a non-intrusive active replication procedure that supports the proposed test methodology and the implementation of defragmentation strategies, assuring both the availability of resources and their perfect working condition, without disturbing system operation.
Abstract:
Electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs), which obtain their fuel from the grid by charging a battery, are set to be introduced into the mass market and are expected to contribute to reducing oil consumption. This research studies the potential impacts on electric utilities of the large-scale adoption of plug-in electric vehicles, from the perspective of electricity demand, fossil-fuel use, CO2 emissions and energy costs. Simulations were applied to the Portuguese case study to determine the optimal recharge profile and EV penetration under an energy-oriented, an emissions-oriented and a cost-oriented objective: leveling of load profiles, minimization of daily emissions and minimization of daily wholesale costs, respectively. Almost all solutions point to off-peak recharging, and a 50% reduction in daily wholesale costs is observed when moving from a peak-recharge to an off-peak-recharge scenario with 2 million EVs in 2020. A further 15% improvement in daily total wholesale costs is observed under the cost-minimization objective compared with the off-peak scenario.
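The load-leveling objective amounts to valley filling; a minimal sketch (hypothetical load curve, not the paper's optimization model) spreads the EV charging energy over the hours that currently carry the least load.

```python
# Greedy valley-filling sketch: repeatedly add one increment of EV charging
# to the lowest-load hour, which yields the flat off-peak recharge profile.

def valley_fill(base_load, ev_energy, step=1.0):
    load = list(base_load)
    remaining = ev_energy
    while remaining > 0:
        i = min(range(len(load)), key=load.__getitem__)  # lowest-load hour now
        inc = min(step, remaining)
        load[i] += inc
        remaining -= inc
    return load

base = [50, 45, 40, 42, 60, 70, 65, 55]  # hypothetical hourly demand
print(valley_fill(base, 30))
```

As long as the EV energy fits in the valleys, the peak is untouched, which is why the off-peak scenarios above dominate the peak-recharge one on wholesale cost.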
Abstract:
This paper presents a new approach (MM-GAV-FBI) to the problem of scheduling projects subject to resource constraints and with multiple execution modes per activity, known in the literature as the MRCPSP. Each project has a set of activities with defined technological precedences and a set of limited resources, and each activity may have more than one execution mode. Projects are scheduled using a schedule generation scheme (SGS) integrated with a metaheuristic. The metaheuristic is based on the genetic algorithm paradigm: activity priorities are obtained from a genetic algorithm, using a chromosome representation based on random keys. The SGS generates non-delay schedules. After a solution is obtained, a local improvement step is applied. The goal of the approach is to find the best schedule, that is, the one with the shortest possible duration that satisfies the activity precedences and the resource constraints. The proposed approach is tested on a set of problems taken from the specialist literature, and the computational results are compared with those of other approaches. The computational results validate the good performance of the approach, not only in terms of solution quality but also in terms of running time.
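The random-key decoding combined with a serial schedule generation scheme can be sketched on a single-mode, single-resource toy instance (illustrative data and names; the multi-mode approach described above is more elaborate): each gene in [0, 1) is an activity priority, and the SGS builds a precedence- and resource-feasible schedule from it.

```python
# Minimal random-key + serial SGS sketch (toy instance, single mode,
# one renewable resource). Higher key = higher scheduling priority.

import random

durations = {1: 3, 2: 2, 3: 4, 4: 2}            # activity -> duration
demands   = {1: 2, 2: 3, 3: 2, 4: 1}            # activity -> resource demand
preds     = {1: [], 2: [1], 3: [1], 4: [2, 3]}  # technological precedences
capacity  = 4                                    # renewable resource capacity

def serial_sgs(keys):
    """Schedule eligible activities in decreasing random-key priority."""
    start, finish = {}, {}
    while len(start) < len(durations):
        eligible = [a for a in durations
                    if a not in start and all(p in finish for p in preds[a])]
        a = max(eligible, key=lambda x: keys[x])        # highest priority first
        t = max([finish[p] for p in preds[a]], default=0)
        # shift right until the resource fits over the whole duration
        while any(sum(demands[b] for b in start if start[b] <= u < finish[b])
                  + demands[a] > capacity for u in range(t, t + durations[a])):
            t += 1
        start[a], finish[a] = t, t + durations[a]
    return max(finish.values())                          # makespan

random.seed(0)                                           # reproducible chromosome
keys = {a: random.random() for a in durations}           # random-key chromosome
print(serial_sgs(keys))
```

In the full approach the genetic algorithm evolves these key vectors, and a forward-backward local improvement is applied to each decoded schedule.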
Abstract:
With the growing integration of variable renewable energy in electric power systems, new challenges arise in the way these systems are operated, owing to the difficulty of forecasting and controlling the output of these sources. The challenges are even greater for a small, isolated grid that cannot be interconnected to a continental network, since such a grid is more fragile and is therefore subject to much tighter operating criteria intended to guarantee its security and stability. Consequently, measures must be adopted that mitigate the impacts of variability and make renewable energy more predictable. It is in this context that electrical energy storage technologies arise. This document presents an in-depth study of the power generation system of Madeira Island and its specific features, analysing the technical feasibility of introducing large-scale batteries into the system. To carry out this analysis, a simulation tool was built in Matlab to quantify the impact of introducing batteries, both on the integration of wind energy and on the reduction of thermal generation. The tool also provides a graphical analysis of the aggregate daily generation diagram, as well as of the evolution of battery power and energy.
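The kind of dispatch simulation described can be sketched in a few lines (illustrative numbers and a Python rendering, not the Matlab tool or Madeira's data): the battery charges on wind that would otherwise be curtailed and discharges to displace thermal generation.

```python
# Toy battery-dispatch simulation for an isolated grid: surplus wind charges
# the battery (up to its power and capacity limits), deficits are covered by
# the battery first and thermal generation last.

def simulate(demand, wind, capacity, power):
    soc, thermal, curtailed = 0.0, [], []
    for d, w in zip(demand, wind):
        surplus = w - d
        if surplus > 0:                              # excess wind: charge
            charge = min(surplus, power, capacity - soc)
            soc += charge
            curtailed.append(surplus - charge)       # what still must be spilled
            thermal.append(0.0)
        else:                                        # deficit: battery, then thermal
            discharge = min(-surplus, power, soc)
            soc -= discharge
            thermal.append(-surplus - discharge)
            curtailed.append(0.0)
    return thermal, curtailed

thermal, spilled = simulate(demand=[5, 5, 6, 7], wind=[8, 7, 3, 2],
                            capacity=4, power=3)
print(thermal, spilled)  # [0.0, 0.0, 0.0, 4.0] [0, 1.0, 0.0, 0.0]
```

Even this toy run shows the two effects the study quantifies: wind curtailment drops (only 1 unit spilled instead of 5) and thermal output is reduced and shifted.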
Abstract:
Cerebrovascular accidents (strokes) are the main cause of death and disability in Portugal. This research therefore seeks to identify and quantify the factors that contribute to the occurrence of a stroke (by type and with sequelae), the length of hospital stay, and potential therapeutic referral options for the patient after a stroke. By identifying and quantifying these factors it becomes possible to draw a profile of the at-risk patient and to act on it through preventive measures, so as to minimise the personal, social and financial impact of the problem. To this end, a sample of patients admitted in 2012 to the Stroke Unit of the Centro Hospitalar do Tâmega e Sousa was analysed. Of the cases analysed, 87.8% were ischaemic strokes and 12.2% haemorrhagic, and 58.9% of all cases presented sequelae. Hypertension, diabetes mellitus and high cholesterol stand out as clinical antecedents with a high risk factor, and smoking and alcoholism play a major role in amplifying them. It is concluded that prevention of stroke and other cardiovascular diseases is important from school age onwards, with particular attention to the period before 36 years of age, when a marked rise in occurrences begins. Investment in prevention and in citizens' medical surveillance during this period is crucial and can greatly reduce the associated medium- to long-term costs for all parties involved.
Abstract:
IEEE Electron Device Letters, vol. 29, no. 9.