934 results for Virtual Environments
Abstract:
Power Systems (PS) have been affected by the substantial penetration of Distributed Generation (DG) and by operation in competitive environments. Future PS will have to deal with the large-scale integration of DG and other distributed energy resources (DER), such as storage, and provide market agents with the means to ensure flexible and secure operation. Virtual power players (VPP) can aggregate a diversity of players, namely generators and consumers, and a diversity of energy resources, including electricity generation based on several technologies, storage and demand response. This paper proposes an artificial neural network (ANN) based methodology to support VPP resource scheduling. The trained network achieves good scheduling results while requiring modest computational means. A test case with real data is presented.
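A minimal sketch of the general idea, assuming hypothetical inputs (load forecast, wind forecast, market price) and outputs (dispatch per resource); the paper's actual network topology and training data are not reproduced here. The appeal, as in the abstract, is that once trained the schedule is obtained with a single cheap forward pass.

# Hypothetical sketch: train a small ANN to map forecast conditions to a
# resource schedule, in the spirit of an ANN-based VPP scheduling aid.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic training data: [load forecast (MW), wind forecast (MW), price (EUR/MWh)]
X = rng.uniform([50, 0, 20], [150, 60, 90], size=(500, 3))
# Synthetic "target" schedule: [conventional DG (MW), wind used (MW), storage (MW)]
y = np.column_stack([
    np.clip(X[:, 0] - X[:, 1], 0, None),   # DG covers load not met by wind
    np.minimum(X[:, 1], X[:, 0]),          # wind dispatched up to the load
    0.1 * X[:, 2],                         # storage use grows with price (toy rule)
])

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X, y)

# Scheduling a new scenario is now a single forward pass.
print(model.predict([[120.0, 35.0, 60.0]]))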
Abstract:
Nowadays, there is a growing environmental concern about where the energy that we use comes from, bringing attention to renewable energies. However, the use and trade of renewable energies in the market can be complicated because of the lack of generation guarantees, mainly in wind farms. This lack of guarantees is usually addressed by using reserve generation. The aggregation of DG plants gives rise to a new concept: the Virtual Power Producer (VPP). VPPs can reinforce the importance of wind generation technologies, making them valuable in electricity markets. This paper presents some results obtained with a simulation tool (ViProd) developed to support VPPs in the analysis of their operation and management methods and of the effects of their strategies.
Abstract:
This paper presents a proposal for an architecture for developing systems that interact with Ambient Intelligence (AmI) environments. This architecture has been proposed as a consequence of a methodology for the inclusion of Artificial Intelligence in AmI environments (ISyRAmI - Intelligent Systems Research for Ambient Intelligence). The ISyRAmI architecture considers several modules. The first is related to the acquisition of data, information and even knowledge. This data/information/knowledge concerns the AmI environment and can be acquired in different ways (from raw sensors, from the web, from experts). The second module is related to the storage, conversion and handling of the data/information/knowledge, acknowledging that incorrectness, incompleteness and uncertainty may be present. The third module is related to intelligent operation on the data/information/knowledge of the AmI environment; here we include knowledge discovery systems, expert systems, planning, multi-agent systems, simulation, optimization, etc. The last module is related to actuation in the AmI environment, by means of automation, robots, intelligent agents and users.
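A minimal sketch of the four ISyRAmI modules as a pipeline, using hypothetical class names, readings and rules; the paper does not prescribe this interface.

# Hypothetical sketch of the four ISyRAmI modules wired as a simple pipeline.
from dataclasses import dataclass, field
from typing import Any

@dataclass
class Acquisition:
    """Module 1: acquire data/information/knowledge (sensors, web, experts)."""
    def read(self) -> dict[str, Any]:
        return {"temperature": 19.5, "presence": True}   # placeholder readings

@dataclass
class Storage:
    """Module 2: store, convert and handle possibly uncertain data."""
    facts: list[dict[str, Any]] = field(default_factory=list)
    def update(self, sample: dict[str, Any]) -> None:
        self.facts.append(sample)

@dataclass
class Reasoning:
    """Module 3: intelligent operation (rules, planning, optimisation, ...)."""
    def decide(self, facts: list[dict[str, Any]]) -> str:
        latest = facts[-1]
        return "heating_on" if latest["presence"] and latest["temperature"] < 20 else "idle"

@dataclass
class Actuation:
    """Module 4: act on the environment (automation, robots, agents, users)."""
    def execute(self, action: str) -> None:
        print(f"actuating: {action}")

acq, store, reason, act = Acquisition(), Storage(), Reasoning(), Actuation()
store.update(acq.read())
act.execute(reason.decide(store.facts))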
Abstract:
Cloud computing is increasingly being adopted in different scenarios, like social networking, business applications, scientific experiments, etc. Relying on virtualization technology, the construction of these computing environments targets improvements in the infrastructure, such as power efficiency and fulfillment of users' SLA specifications. The methodology usually applied is to pack all the virtual machines onto appropriate physical servers. However, failures in these networked computing systems can have a substantial negative impact on system performance, deviating the system from its initial objectives. In this work, we propose adapted algorithms to dynamically map virtual machines to physical hosts, in order to improve cloud infrastructure power efficiency with low impact on users' required performance. Our decision-making algorithms leverage proactive fault-tolerance techniques to deal with system failures, allied with virtual machine technology to share node resources in an accurate and controlled manner. The results indicate that our algorithms perform better in terms of power efficiency and SLA fulfillment in the face of cloud infrastructure failures.
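A minimal sketch, under assumed data structures, of a power-aware placement heuristic that consolidates VMs onto as few hosts as possible while skipping hosts flagged by a proactive failure predictor; the paper's actual mapping algorithms are not reproduced here.

# Hypothetical sketch: power-aware, failure-aware first-fit-decreasing VM placement.
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    capacity: float            # normalised CPU capacity
    likely_to_fail: bool       # output of an assumed proactive failure predictor
    used: float = 0.0
    vms: list[str] = field(default_factory=list)

def place(vms: dict[str, float], hosts: list[Host]) -> dict[str, str]:
    """Map each VM to a host, consolidating load so idle hosts can be powered down."""
    mapping = {}
    # Largest VMs first tends to pack hosts more tightly (first-fit decreasing).
    for vm, demand in sorted(vms.items(), key=lambda kv: kv[1], reverse=True):
        # Prefer already-used, healthy hosts so spare hosts can stay asleep.
        candidates = sorted(
            (h for h in hosts if not h.likely_to_fail and h.used + demand <= h.capacity),
            key=lambda h: (h.used == 0.0, h.capacity - h.used),
        )
        if not candidates:
            raise RuntimeError(f"no healthy host can fit {vm}")
        chosen = candidates[0]
        chosen.used += demand
        chosen.vms.append(vm)
        mapping[vm] = chosen.name
    return mapping

hosts = [Host("h1", 1.0, False), Host("h2", 1.0, True), Host("h3", 1.0, False)]
print(place({"vm1": 0.5, "vm2": 0.4, "vm3": 0.3}, hosts))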
Abstract:
Collaborative work plays an important role in today's organizations, especially in areas where decisions must be made. Any decision that involves a group of decision makers is, by itself, complex, and such decisions have become increasingly common in recent years. In this work we present the VirtualECare project, an intelligent multi-agent system able to monitor, interact with and serve its customers, who are normally in need of care services. In recent years there has been a substantial increase in the number of people in need of intensive care, especially among the elderly, a phenomenon related to population ageing. However, this is no longer exclusive to the elderly, as conditions like obesity, diabetes and hypertension have been increasing among young adults. This is a new reality that needs to be dealt with by the health sector, particularly the public one. Given these scenarios, finding new and cost-effective ways to deliver health care is of particular importance, especially when we believe patients should not be removed from their natural "habitat". Following this line of thinking, the VirtualECare project is presented, along with similar projects that preceded it. Recently we have also witnessed a growing interest in combining the advances of the information society - computing, telecommunications and presentation - in order to create Group Decision Support Systems (GDSS). Indeed, the new economy, along with increased competition in today's complex business environments, leads companies to seek complementarities in order to increase competitiveness and reduce risks. Under these scenarios, planning takes a major role in a company's life. However, effective planning depends on the generation and analysis of ideas (innovative or not) and, as a result, the idea generation and management processes are crucial. Our objective is to apply the GDSS presented above to a new area. We believe that the use of GDSS in the healthcare arena will allow professionals to achieve better results in the analysis of a patient's Electronic Clinical Profile (ECP). This is vital given the explosion of knowledge and skills, together with the need to use limited resources and obtain better results.
Abstract:
Postgraduate Programme in Computer Science - IBILCE
Abstract:
Master's dissertation, Business Management (MBA), 16 July 2013, Universidade dos Açores.
Abstract:
Dissertation presented to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the master's degree in Audiovisual and Multimedia.
Abstract:
This article reports on the development of a virtual teaching model under way at the Universidade dos Açores. After being adopted for teaching courses in the area of Curriculum Theory and Development in e-learning and b-learning modes, the model was, in the 2014/15 academic year, extended to the teaching of other courses. Besides describing the model and explaining its evolution, the article highlights its adoption in the particular context of a course whose online component was taught under especially challenging circumstances. In this regard, it explains the process used to evaluate the experience, discusses its results and suggests avenues for improvement. This evaluation is part of a curriculum design research process, the methodology that has been used to study the development of the model.
Abstract:
Recently, 3D immersive environments have been used in several domains, such as business, educational and leisure activities, among others, largely due to the expansion of Second Life. The purpose of this concept is to offer users alternative access, from a computer connected to the Internet, to facilities that exist in the real world. A practical application is their use in remote laboratories, with the aim of remotely controlling measurement instruments from within an immersive environment. To this end, the environment must allow the construction of a virtual laboratory and the corresponding virtual instruments. This type of solution is feasible because there are devices with remote access interfaces, and 3D environments developed in programming languages that provide code libraries for computer network protocols. The goal of this work is to develop a methodology for remote access to measurement instruments in electricity and electronics laboratories using 3D immersive environments. As a case study, the instrument used is a multimeter, controlled remotely from a reproduction in a virtual world built in the Open Wonderland 3D environment. In a first phase, this virtual reproduction will only make available for measurement a limited set of the electrical variables that the selected multimeter can measure.
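A minimal sketch of the kind of remote query the virtual instrument could issue to a real one, assuming a LAN-enabled multimeter that accepts SCPI text commands over a raw TCP socket on port 5025; the instrument, address and command are illustrative, not those of the thesis (whose virtual world runs on Open Wonderland).

# Hypothetical sketch: query a network-enabled multimeter with an SCPI command,
# as the virtual multimeter might do when the user requests a reading.
import socket

INSTRUMENT_ADDR = ("192.0.2.10", 5025)   # assumed instrument address and raw-SCPI port

def read_dc_voltage() -> float:
    with socket.create_connection(INSTRUMENT_ADDR, timeout=5) as sock:
        sock.sendall(b"MEAS:VOLT:DC?\n")          # common SCPI measurement query
        reply = sock.makefile().readline()        # e.g. "+1.234567E+00\n"
        return float(reply)

if __name__ == "__main__":
    print(f"DC voltage: {read_dc_voltage():.3f} V")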
Abstract:
This dissertation presents a solution to the problem of three-dimensional modelling of underground galleries. The work employs techniques from the field of mobile robotics to obtain an autonomous mobile modelling system capable of operating in unstructured environments without access to global positioning systems, namely GPS. A mobile, autonomous modelling system can be quite advantageous, since it constitutes a fast and simple method for monitoring the structures and creating highly detailed virtual representations of the galleries. The modelling system moves inside the tunnels to collect sensory information about the geometry of the structure. Organising these data to build a coherent model requires exact knowledge of the path followed by the system, so the problem of localising the sensor platform has to be solved. The formulation of an autonomous localisation system has to overcome obstacles that are particularly pronounced in underground environments, such as structural monotony and the aforementioned absence of global positioning systems. In this context, the concept of SLAM (Simultaneous Localization and Mapping) was adopted to determine the localisation of the sensor platform in six degrees of freedom. Following the traditional approach, the core of the SLAM algorithm is the Extended Kalman Filter (EKF). The proposed system incorporates advanced state-of-the-art methods, namely Inverse Depth Parametrization and the 1-Point RANSAC outlier rejection method. The most important contribution of the proposed method to the state of the art is the fusion of visual and inertial information. The localisation algorithm was tested on real data acquired inside a road tunnel. The results show that, by fusing inertial measurements with visual information, we can avoid the scale-factor degeneration phenomenon common in localisation applications based on purely monocular systems. We also showed that correcting an inertial localisation system with visual information is effective, since it suppresses the trajectory drift that characterises dead-reckoning systems. Based on the estimated localisation, the modelling algorithm organises the acquired geometric data in three-dimensional space, producing a point-cloud model that is subsequently converted into a triangular mesh, thus achieving a more realistic representation of the original scene.
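For reference, the generic EKF prediction and update cycle that forms the core of such a SLAM system; the specific state vector, motion and measurement models of the dissertation (inverse depth features, inertial inputs) are not reproduced here.

\begin{aligned}
\hat{x}_{k|k-1} &= f(\hat{x}_{k-1|k-1}, u_k) \\
P_{k|k-1} &= F_k P_{k-1|k-1} F_k^\top + Q_k \\
K_k &= P_{k|k-1} H_k^\top \left( H_k P_{k|k-1} H_k^\top + R_k \right)^{-1} \\
\hat{x}_{k|k} &= \hat{x}_{k|k-1} + K_k \left( z_k - h(\hat{x}_{k|k-1}) \right) \\
P_{k|k} &= (I - K_k H_k) P_{k|k-1}
\end{aligned}

where $f$ and $h$ are the motion and measurement models, $F_k$ and $H_k$ their Jacobians, $u_k$ the (inertial) control input, $z_k$ the (visual) measurement, and $Q_k$, $R_k$ the process and measurement noise covariances.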
Abstract:
Doctoral thesis, Marine Sciences, speciality in Marine Biology, 18 December 2015, Universidade dos Açores.
Abstract:
Nanotechnology is an important emerging industry with a projected annual market of around one trillion dollars by 2015. It involves the control of atoms and molecules to create new materials with a variety of useful functions. Although these nano-scale materials offer advantages, questions related to their impact on the environment and human health must be addressed too, so that potential risks can be limited at early stages of development. At this time, the occupational health risks associated with manufacturing and using nanoparticles are not yet clearly understood. However, workers may be exposed to nanoparticles through inhalation at levels that can greatly exceed ambient concentrations. Current workplace exposure limits are based on particle mass, but this criterion may not be adequate here, as nanoparticles are characterized by a very large surface area, which has been pointed out as the distinctive characteristic that can turn an inert substance into one that interacts very differently with biological fluids and cells. Therefore, assessing human exposure based on the mass concentration of particles, which is widely adopted for particles over 1 μm, may not work in this particular case. In fact, nanoparticles have far more surface area than an equivalent mass of larger particles, which increases the chance that they react with body tissues. Thus, it has been claimed that surface area should be used for nanoparticle exposure and dosing, and assessing exposure based on the measurement of particle surface area is of increasing interest. It is well known that lung deposition is the most efficient way for airborne particles to enter the body and cause adverse health effects. If nanoparticles can deposit in the lung and remain there, have an active surface chemistry and interact with the body, then there is potential for exposure. It has been shown that surface area plays an important role in the toxicity of nanoparticles and is the metric that best correlates with particle-induced adverse health effects; the potential for adverse health effects appears to be directly proportional to particle surface area. The objective of this study is to identify and validate methods and tools for measuring nanoparticles during the production, manipulation and use of nanomaterials.
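A short worked example of why mass-based limits under-represent nanoparticle exposure, assuming spherical particles of density $\rho$: for a sphere of diameter $d$, the surface area is $\pi d^2$ and the mass is $\rho \pi d^3 / 6$, so the specific surface area is

$$ \frac{S}{m} = \frac{\pi d^2}{\rho \pi d^3 / 6} = \frac{6}{\rho d}, $$

which grows as the particle shrinks: at equal mass concentration, 20 nm particles expose $1000\,\mathrm{nm} / 20\,\mathrm{nm} = 50$ times more surface area than 1 μm particles.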
Abstract:
A classical application of biosignal analysis has been the psychophysiological detection of deception, also known as the polygraph test, which is currently part of standard practice in law enforcement agencies and several other institutions worldwide. Although its validity is far from gathering consensus, the underlying psychophysiological principles are still an interesting add-on for more informal applications. In this paper we present an experimental off-the-person hardware setup, propose a set of feature extraction criteria and provide a comparison of two classification approaches, targeting the detection of deception in the context of a role-playing interactive multimedia environment. Our work is primarily targeted at recreational use in the context of a science exhibition, where the main goal is to present basic concepts related to knowledge discovery, biosignal analysis and psychophysiology in an educational way, using techniques that are simple enough to be understood by children of different ages. Nonetheless, this setting will also allow us to build a significant data corpus, annotated with ground-truth information and collected with non-intrusive sensors, enabling more advanced research on the topic. Experimental results have shown interesting findings and provided useful guidelines for future work.
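A minimal sketch, with synthetic data, of the general recipe the abstract describes: extract a few per-window features from a biosignal and compare two off-the-shelf classifiers. The sensors, features and classifiers actually used in the paper are not reproduced here.

# Hypothetical sketch: window-level features from a 1-D biosignal, then compare
# two simple classifiers on a synthetic "truthful vs. deceptive" labelling task.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def features(window: np.ndarray) -> list[float]:
    # Simple illustrative descriptors of a signal window.
    return [window.mean(), window.std(), np.abs(np.diff(window)).mean()]

# Synthetic corpus: "deceptive" windows get an extra drift plus noise bursts.
windows, labels = [], []
for label in (0, 1):
    for _ in range(100):
        w = rng.normal(0.0, 1.0, 256).cumsum() * 0.01
        if label == 1:
            w += np.linspace(0, 0.5, 256) + rng.normal(0, 0.05, 256)
        windows.append(features(w))
        labels.append(label)

X, y = np.array(windows), np.array(labels)
for clf in (KNeighborsClassifier(n_neighbors=5), SVC(kernel="rbf")):
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(type(clf).__name__, f"accuracy ~ {score:.2f}")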
Abstract:
Conference: 2nd Experiment@ International Conference (exp.at), Univ Coimbra, Coimbra, Portugal, Sep 18-20, 2013