800 results for programming environments


Relevance: 20.00%

Publisher:

Abstract:

In the energy management of small power systems operating in isolation, the economic scheduling of the generation units is a crucial problem; applying the right timing can maximize the performance of the supply. The optimal operation of a wind turbine, a solar unit, a fuel cell and a storage battery is determined by a mixed-integer linear program implemented in the General Algebraic Modeling System (GAMS). A Virtual Power Producer (VPP) can operate the generation units optimally, ensuring the good functioning of the equipment, including maintenance, operation cost, and generation measurement and control. A central system control allows a VPP to manage optimal generation and the associated load control. The application of the methodology to a real case study at Budapest Tech demonstrates the effectiveness of this method in solving the optimal isolated dispatch of a DC micro-grid renewable energy park. The problem converged in 0.09 s and 30 iterations.
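The commitment decisions in such a model are binary: which units run in each period. A minimal sketch of that idea, brute-forcing the binary commitments for a single period (unit names, capacities and costs are assumed for illustration; the paper's actual model is a multi-period MILP with storage, solved in GAMS):

```python
from itertools import product

# Toy single-period isolated dispatch: brute-force the binary unit
# commitments, then dispatch committed units in merit order.
UNITS = {                      # name: (capacity kW, cost per kWh) -- assumed
    "wind":      (30, 0.00),
    "solar":     (20, 0.00),
    "fuel_cell": (50, 0.12),
}

def dispatch(load_kw):
    """Return (cost, commitment tuple) of the cheapest feasible plan."""
    best = None
    for on in product((0, 1), repeat=len(UNITS)):
        committed = [(cap, c) for (cap, c), o in zip(UNITS.values(), on) if o]
        if sum(cap for cap, _ in committed) < load_kw:
            continue                      # cannot cover the load
        remaining, cost = load_kw, 0.0
        for cap, c in sorted(committed, key=lambda u: u[1]):  # merit order
            gen = min(cap, remaining)
            cost += gen * c
            remaining -= gen
        if best is None or cost < best[0]:
            best = (cost, on)
    return best

cost, on = dispatch(60)
print(dict(zip(UNITS, on)), f"cost = {cost:.2f}")  # all units on, cost 1.20
```

A real MILP additionally couples periods through the battery's state of charge, which is what makes a solver like CPLEX under GAMS worthwhile.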

Relevance: 20.00%

Publisher:

Abstract:

In the energy management of a small power system, the scheduling of the generation units is a crucial problem for which adequate methodologies can maximize the performance of the energy supply. This paper proposes an innovative methodology for distributed energy resources management. The optimal operation of distributed generation, demand response and storage resources is formulated as a mixed-integer linear programming (MILP) model and solved by a deterministic CPLEX-based optimization technique implemented in the General Algebraic Modeling System (GAMS). The paper presents a vision for the grids of the future, focusing on conceptual and operational aspects of electrical grids characterized by an intensive penetration of DG, in the scope of competitive environments, using artificial intelligence methodologies to attain the envisaged goals. These concepts are implemented in a computational framework which includes both grid and market simulation.

Relevance: 20.00%

Publisher:

Abstract:

Electricity market players operating in a liberalized environment require access to adequate decision support tools that allow them to consider all business opportunities and make strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players; for this, decision support tools must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is included in this paper.
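A genetic algorithm for this kind of dispatch evolves candidate reserve allocations and penalizes those that fail to meet the requirement. A minimal, elitist sketch with assumed capacities, prices and reserve requirement (this is not MASCEM's actual implementation):

```python
import random

# Toy genetic algorithm for reserve dispatch: minimize procurement cost
# subject to meeting a reserve requirement, enforced via a penalty term.
random.seed(1)
CAPS   = [20.0, 35.0, 50.0]   # offered reserve capacity per unit (MW) -- assumed
PRICES = [12.0,  8.0, 15.0]   # price per MW of reserve -- assumed
DEMAND = 60.0                 # reserve requirement (MW) -- assumed

def fitness(alloc):           # lower is better
    cost = sum(a * p for a, p in zip(alloc, PRICES))
    shortfall = max(0.0, DEMAND - sum(alloc))
    return cost + 1e4 * shortfall   # heavy penalty for unmet reserve

def mutate(alloc):
    i = random.randrange(len(alloc))
    child = list(alloc)
    child[i] = min(CAPS[i], max(0.0, child[i] + random.uniform(-5.0, 5.0)))
    return child

# elitist GA: keep the best half, refill with mutants of survivors
pop = [[random.uniform(0.0, c) for c in CAPS] for _ in range(40)]
for _ in range(300):
    pop.sort(key=fitness)
    pop = pop[:20] + [mutate(random.choice(pop[:20])) for _ in range(20)]

best = min(pop, key=fitness)
print([round(a, 1) for a in best], "-> cost", round(fitness(best), 1))
```

The LP approach would instead solve the same cost-minimization exactly; the GA trades optimality guarantees for the ability to handle non-linear market rules.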

Relevance: 20.00%

Publisher:

Abstract:

This paper presents the proposal of an architecture for developing systems that interact with Ambient Intelligence (AmI) environments. This architecture results from a methodology for the inclusion of Artificial Intelligence in AmI environments (ISyRAmI - Intelligent Systems Research for Ambient Intelligence). The ISyRAmI architecture comprises several modules. The first is related to the acquisition of data, information and even knowledge. This data/information/knowledge concerns the AmI environment and can be acquired in different ways (from raw sensors, from the web, from experts). The second module is related to the storage, conversion and handling of the data/information/knowledge, understanding that incorrectness, incompleteness and uncertainty are present in it. The third module is related to intelligent operation on the data/information/knowledge of the AmI environment, including knowledge discovery systems, expert systems, planning, multi-agent systems, simulation, optimization, etc. The last module is related to actuation in the AmI environment, by means of automation, robots, intelligent agents and users.

Relevance: 20.00%

Publisher:

Abstract:

Recently, 3D immersive environments have been used in several domains, such as business, educational and leisure activities, largely due to the expansion of Second Life. The purpose of this concept is to offer users alternative access to facilities that exist in the real world, from a computer connected to the Internet. A practical application is their use in remote laboratories, to remotely control measurement instruments from within an immersive environment. To this end, the environment must allow the construction of a virtual laboratory and its (likewise virtual) instruments. This type of solution is feasible because devices with remote-access interfaces exist, and because 3D environments are developed in programming languages that provide code libraries for computer network protocols. The aim of this work is to develop a methodology for remote access to measurement instruments in electricity and electronics laboratories, using immersive 3D environments. As a case study, the instrument used is a multimeter, controlled remotely through a reproduction in a virtual world built in the Open Wonderland 3D environment. In a first phase, only a limited set of the electrical variables measurable with the selected multimeter will be made available for measurement in that virtual reproduction.

Relevance: 20.00%

Publisher:

Abstract:

II European Conference on Curriculum Studies, "Curriculum Studies: Policies, Perspectives and Practices". Porto, FPCEUP, October 16th-17th.

Relevance: 20.00%

Publisher:

Abstract:

Doctoral thesis, Marine Sciences, specialty of Marine Biology, 18 December 2015, Universidade dos Açores.

Relevance: 20.00%

Publisher:

Abstract:

Nanotechnology is an important emerging industry, with a projected annual market of around one trillion dollars by 2015. It involves the control of atoms and molecules to create new materials with a variety of useful functions. Although there are advantages in the utilization of these nano-scale materials, questions related to their impact on the environment and human health must also be addressed, so that potential risks can be limited at early stages of development. At this time, occupational health risks associated with the manufacturing and use of nanoparticles are not yet clearly understood. However, workers may be exposed to nanoparticles through inhalation at levels that can greatly exceed ambient concentrations. Current workplace exposure limits are based on particle mass, but this criterion may not be adequate here, as nanoparticles are characterized by a very large surface area, which has been pointed out as the distinctive characteristic that could even turn an inert substance into one exhibiting very different interactions with biological fluids and cells. Therefore, assessing human exposure based on the mass concentration of particles, which is widely adopted for particles over 1 μm, would not seem to work in this particular case. In fact, nanoparticles have far more surface area for the equivalent mass of larger particles, which increases the chance that they may react with body tissues. Thus, it has been claimed that surface area should be used for nanoparticle exposure and dosing, and assessing exposure based on the measurement of particle surface area is of increasing interest. It is well known that lung deposition is the most efficient way for airborne particles to enter the body and cause adverse health effects. If nanoparticles can deposit in the lung and remain there, have an active surface chemistry and interact with the body, then there is potential for exposure.
It has been shown that surface area plays an important role in the toxicity of nanoparticles and that this is the metric that best correlates with particle-induced adverse health effects; the potential for adverse health effects seems to be directly proportional to particle surface area. The objective of this study is to identify and validate methods and tools for measuring nanoparticles during the production, manipulation and use of nanomaterials.
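The scaling behind this argument is easy to check: for a sphere, specific surface area = surface/mass = (πd²)/(ρπd³/6) = 6/(ρd), so surface area per unit mass grows as diameter shrinks. A small illustration (the silica density used here is an assumed example value):

```python
# Specific surface area of a spherical particle: SSA = 6 / (rho * d),
# since surface/mass = (pi d^2) / (rho * pi d^3 / 6).
def specific_surface_area(diameter_m, density_kg_m3):
    """Surface area per unit mass, in m^2 per gram."""
    return 6.0 / (density_kg_m3 * diameter_m) / 1000.0   # m^2/kg -> m^2/g

RHO = 2200.0                                  # kg/m^3, ~amorphous silica (assumed)
micro = specific_surface_area(5e-6, RHO)      # 5 um particle
nano = specific_surface_area(50e-9, RHO)      # 50 nm particle
print(f"5 um:  {micro:.2f} m^2/g")            # 0.55 m^2/g
print(f"50 nm: {nano:.1f} m^2/g")             # 54.5 m^2/g
print(f"ratio: {nano / micro:.0f}x")          # 100x: SSA scales as 1/d
```

A 100-fold drop in diameter thus gives a 100-fold increase in surface area per unit mass, which is why mass-based exposure limits can understate nanoparticle dose.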

Relevance: 20.00%

Publisher:

Abstract:

The complexity of computer systems has been increasing, and the use of computer systems and online services is now part of our daily working tools. In this context, the Internet plays a prominent role in universities, allowing students and teachers to interact more easily. The Internet and Web-based education offer remote access to any information, regardless of location or time. As a consequence, anyone with an Internet connection can obtain information on a given topic from the leading experts, gaining significant advantages. Remote laboratories are a highly valued solution for interconnecting technology and human resources in settings that may be separated in time or space. The creation of this type of laboratory, and its real usefulness, is only possible because emerging communication technologies have contributed very significantly to improving their availability at a distance. Remote laboratories become indispensable for engineering research involving scarce or large-scale resources. Based on this concept, a remote laboratory was developed for engineering students who need to test digital circuits on a configurable hardware development board, allowing more efficient use of this resource. The work consisted of creating a low-cost remote laboratory based on open source programming languages, using as processing unit an ASUS router running the OpenWrt firmware, a Linux distribution for embedded systems. This remote laboratory allows digital circuits to be tested on a configurable hardware development board in real time, using the JTAG interface.
The laboratory developed has the particularity of using a router as its processing unit. Using a router as a server is a very unusual solution in the implementation of remote laboratories. Compared with a normal computer, this router has much lower processing capacity and memory, although the tests carried out showed that its performance fully met expectations.

Relevance: 20.00%

Publisher:

Abstract:

A classical application of biosignal analysis has been the psychophysiological detection of deception, also known as the polygraph test, which is currently part of the standard practices of law enforcement agencies and several other institutions worldwide. Although its validity is far from gathering consensus, the underlying psychophysiological principles are still an interesting add-on for more informal applications. In this paper we present an experimental off-the-person hardware setup, propose a set of feature extraction criteria, and provide a comparison of two classification approaches, targeting the detection of deception in the context of a role-playing interactive multimedia environment. Our work is primarily targeted at recreational use in the context of a science exhibition, where the main goal is to present basic concepts related to knowledge discovery, biosignal analysis and psychophysiology in an educational way, using techniques that are simple enough to be understood by children of different ages. Nonetheless, this setting will also allow us to build a significant data corpus, annotated with ground-truth information and collected with non-intrusive sensors, enabling more advanced research on the topic. Experimental results have shown interesting findings and provided useful guidelines for future work. Pattern Recognition

Relevance: 20.00%

Publisher:

Abstract:

As polycyclic aromatic hydrocarbons (PAHs) have a negative impact on human health due to their mutagenic and/or carcinogenic properties, the objective of this work was to study the influence of tobacco smoke on the levels and phase distribution of PAHs and to evaluate the associated health risks. Air samples were collected at two homes; 18 PAHs (the 16 PAHs considered by U.S. EPA as priority pollutants, dibenzo[a,l]pyrene and benzo[j]fluoranthene) were determined in the gas phase and associated with thoracic (PM10) and respirable (PM2.5) particles. At the home influenced by tobacco smoke the total concentration of the 18 PAHs in air ranged from 28.3 to 106 ng m⁻³ (mean of 66.7 ± 25.4 ng m⁻³), ∑PAHs being 95% higher than at the non-smoking one, where the values ranged from 17.9 to 62.0 ng m⁻³ (mean of 34.5 ± 16.5 ng m⁻³). On average 74% and 78% of ∑PAHs were present in the gas phase at the smoking and non-smoking homes, respectively, demonstrating that adequate assessment of PAHs in air requires evaluation of PAHs in both gas and particulate phases. When influenced by tobacco smoke, the health risk values due to the exposure to PM10 were 3.5 to 3.6 times higher. The lifetime lung cancer risks were 4.1 × 10⁻³ and 1.7 × 10⁻³ for the smoking and non-smoking homes, considerably exceeding the health-based guideline level at both homes, also due to the contribution of outdoor traffic emissions. The results showed that evaluation of benzo[a]pyrene alone would probably underestimate the carcinogenic potential of the studied PAH mixtures; in total, ten carcinogenic PAHs represented 36% and 32% of the gaseous ∑PAHs, and in the particulate phase they accounted for 75% and 71% of ∑PAHs at the smoking and non-smoking homes, respectively.

Relevance: 20.00%

Publisher:

Abstract:

This dissertation presents the work carried out within the Thesis/Dissertation (TEDI) course unit of the Master's in Electrical and Computer Engineering - Specialization in Automation and Systems, in partnership with Live Simply, a home automation company that decided to invest in innovation and in the development of value-added services and products to consolidate its market position. In this context, two assets were identified for Live Simply: on the one hand, a technical support tool integrating and simplifying the design, configuration and management phases of home automation installations; on the other hand, an interface to the installation allowing the customer to consult and change the state of the actuators in real time. After analysing the available technologies, the solutions to adopt were selected (programming languages, database servers and development environments), and the system architecture was defined, detailing the design, configuration and installation management modules, the database structure, and the installation control hardware. The software modules were then developed, and the hardware module was configured and programmed. Finally, an exhaustive set of tests on the different modules demonstrated the correct operation of the tool and the adequacy of the technologies employed. The technical support tool produced integrates the design, configuration and management phases of home automation installations, improving the performance of technicians and the response to customers. The interface offered to the installation owner is a friendly, easy-to-use Web interface that allows the state of the installation to be consulted and modified in real time.

Relevance: 20.00%

Publisher:

Abstract:

In recent years several countries have set up policies that allow exchange of kidneys between two or more incompatible patient–donor pairs. These policies lead to what is commonly known as kidney exchange programs. The underlying optimization problems can be formulated as integer programming models. Previously proposed models for kidney exchange programs have exponential numbers of constraints or variables, which makes them fairly difficult to solve when the problem size is large. In this work we propose two compact formulations for the problem, explain how these formulations can be adapted to address some problem variants, and provide results on the dominance of some models over others. Finally we present a systematic comparison between our models and two previously proposed ones via thorough computational analysis. Results show that compact formulations have advantages over non-compact ones when the problem size is large.
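The combinatorial core of such models is selecting vertex-disjoint exchange cycles of bounded length in the pair-compatibility digraph so as to maximize the number of transplants. A brute-force sketch of that core (illustrative only; the paper's formulations are integer programs, and brute force does not scale to real instances):

```python
from itertools import permutations

# Brute-force kidney exchange: pick vertex-disjoint cycles of length
# <= max_cycle in the pair-compatibility digraph, maximizing transplants.
def best_exchange(arcs, n_pairs, max_cycle=3):
    # enumerate simple cycles, keeping one canonical rotation of each
    cycles = []
    for k in range(2, max_cycle + 1):
        for perm in permutations(range(n_pairs), k):
            if perm[0] != min(perm):
                continue   # skip non-canonical rotations of the same cycle
            if all((perm[i], perm[(i + 1) % k]) in arcs for i in range(k)):
                cycles.append(perm)

    best = []
    def search(i, used, chosen):
        nonlocal best
        if i == len(cycles):
            if sum(map(len, chosen)) > sum(map(len, best)):
                best = list(chosen)
            return
        search(i + 1, used, chosen)              # skip cycle i
        if used.isdisjoint(cycles[i]):           # or take it, if disjoint
            search(i + 1, used | set(cycles[i]), chosen + [cycles[i]])
    search(0, frozenset(), [])
    return best

# arc (a, b): the donor of pair a is compatible with the patient of pair b
arcs = {(0, 1), (1, 0), (2, 3), (3, 4), (4, 2)}
plan = best_exchange(arcs, 5)
print(plan, "->", sum(map(len, plan)), "transplants")  # a 2-cycle and a 3-cycle: 5
```

The integer programming models in the paper encode exactly this choice with binary variables; the compact formulations keep the variable and constraint counts polynomial rather than enumerating cycles explicitly.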

Relevance: 20.00%

Publisher:

Abstract:

In the last two decades there has been a proliferation of programming exercise formats, which hinders interoperability in automatic assessment. In the absence of a widely accepted standard, a pragmatic solution is to convert content among the existing formats. BabeLO is a programming exercise converter providing services to a network of heterogeneous e-learning systems such as contest management systems, programming exercise authoring tools, evaluation engines and repositories of learning objects. Its main feature is the use of a pivotal format to achieve greater extensibility. This approach simplifies the extension to other formats, requiring only conversion to and from the pivotal format. This paper starts with an analysis of programming exercise formats representative of the existing diversity. This analysis sets the context for the proposed approach to exercise conversion and for the description of the pivotal data format. The abstract service definition is the basis for the design of BabeLO, its components and its web service interface. The paper includes a report on the use of BabeLO in two concrete scenarios: to relocate exercises to a different repository, and to use an evaluation engine in a network of heterogeneous systems.
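The pivotal-format idea reduces the number of converters from quadratic to linear in the number of formats: each format contributes only a reader to and a writer from the pivot. A minimal sketch (format names and fields are hypothetical, not BabeLO's actual schema):

```python
# Conversion through a pivotal format: each exercise format registers a
# reader (format -> pivot) and a writer (pivot -> format), so N formats
# need 2N functions instead of N*(N-1) pairwise converters.
readers = {}   # format name -> function(source) -> pivot dict
writers = {}   # format name -> function(pivot dict) -> target

def register(fmt, reader, writer):
    readers[fmt], writers[fmt] = reader, writer

def convert(exercise, src_fmt, dst_fmt):
    """Convert src -> pivot -> dst; no pairwise converter is needed."""
    return writers[dst_fmt](readers[src_fmt](exercise))

# two toy formats: "plain" ("title|statement" strings) and "record" (dicts)
register("plain",
         lambda s: dict(zip(("title", "statement"), s.split("|", 1))),
         lambda p: f"{p['title']}|{p['statement']}")
register("record", dict, dict)

print(convert("Sum|Add two numbers.", "plain", "record"))
# {'title': 'Sum', 'statement': 'Add two numbers.'}
```

Adding a third format is a single extra `register` call, after which `convert` works between it and every existing format through the pivot.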

Relevance: 20.00%

Publisher:

Abstract:

E-learning frameworks are conceptual tools to organize networks of e-learning services. Most frameworks cover areas that go beyond the scope of e-learning, from course to financial management, and neglect the typical everyday activities of teachers and students at schools, such as the creation, delivery, resolution and evaluation of assignments. This paper presents the Ensemble framework - an e-learning framework exclusively focused on the teaching-learning process through the coordination of pedagogical services. The framework presents abstract data, integration and evaluation models based on content and communications specifications. These specifications should underpin the implementation of networks in specialized domains with complex evaluations. In this paper we specialize the framework for two such domains: computer programming and computer-aided design (CAD). For each domain we highlight two Ensemble hotspots: data and evaluation procedures. In the former we formally describe the exercise and present possible extensions; in the latter, we describe the automatic evaluation procedures.