781 results for Play-based programs


Relevância:

30.00%

Resumo:

The future scenarios for the operation of smart grids are likely to include a large diversity of players of different types and sizes. With control and decision making decentralized over the network, intelligence should also be decentralized so that every player is able to act in the market environment. In this new context, aggregator players that enable medium, small, and even micro-size players to act in a competitive environment will be very relevant. Virtual Power Players (VPP) and single players must optimize their energy resource management in order to accomplish their goals. This is relatively easy for larger players, which have the financial means to access adequate decision support tools for their optimal resource scheduling. Smaller players, however, have difficulty accessing such tools, so they must be offered alternative methods to support their decisions. This paper presents a methodology, based on Artificial Neural Networks (ANN), intended to support smaller players' resource scheduling. The methodology uses a training set built from energy resource scheduling solutions obtained with a reference optimization methodology, in this case mixed-integer non-linear programming (MINLP). The trained network achieves good scheduling results while requiring modest computational means.
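The idea of replacing a heavy optimizer with a trained network can be sketched as follows. Everything below is illustrative: the "reference optimizer" is a made-up closed-form rule standing in for the MINLP model, and the network is a deliberately tiny one-hidden-layer regressor.

```python
# Minimal sketch: train a small neural network on (load -> schedule) pairs
# labeled by a reference optimizer. The "MINLP" is replaced here by a
# hypothetical closed-form optimum purely for illustration.
import numpy as np

rng = np.random.default_rng(0)

def reference_optimizer(load):
    # Stand-in for the MINLP reference: a known nonlinear dispatch rule.
    return np.tanh(load) * 0.8 + 0.1

# Build the training set from "optimizer" solutions.
X = rng.uniform(0.0, 2.0, size=(200, 1))
y = reference_optimizer(X)

# One hidden layer, trained by plain batch gradient descent.
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

losses, lr = [], 0.05
for _ in range(2000):
    H, pred = forward(X)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation through the two layers
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

In the paper's setting the training pairs would instead come from stored MINLP scheduling solutions; once trained, evaluating the network is just two small matrix products, which is what makes it affordable for small players.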

Relevância:

30.00%

Resumo:

The growing importance and influence of new resources connected to power systems has caused many changes in their operation. Environmental policies and several well-known advantages have made renewable-based energy resources widely disseminated. These resources, including Distributed Generation (DG), are being connected at lower voltage levels, where Demand Response (DR) must also be considered. These changes increase the complexity of system operation due to both new operational constraints and the amounts of data to be processed. Virtual Power Players (VPP) are entities able to manage these resources. Addressing these issues, this paper proposes a methodology to support VPP actions when they act as a Curtailment Service Provider (CSP) that provides DR capacity to a DR program declared by the Independent System Operator (ISO) or by the VPP itself. The amount of DR capacity that the CSP can assure is determined using data mining techniques applied to a database obtained from a large set of operation scenarios. The paper includes a case study based on 27,000 scenarios considering a diversity of distributed resources in a 33-bus distribution network.
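One simple way to read "assured capacity from a scenario database" is as a conservative quantile over scenarios. The sketch below is our own illustration, not the paper's data-mining method, and the scenario values are synthetic:

```python
# Illustrative sketch (not the paper's method): estimate the DR capacity a
# CSP can "assure" as a conservative percentile over simulated scenarios.
import random
import statistics

random.seed(42)

# Hypothetical database: available DR capacity (MW) realized in each scenario.
scenarios = [max(0.0, random.gauss(mu=10.0, sigma=2.5)) for _ in range(27_000)]

def assured_capacity(db, confidence=0.95):
    """Capacity deliverable in at least `confidence` of the scenarios,
    i.e. the (1 - confidence) quantile of the database."""
    ranked = sorted(db)
    idx = int((1.0 - confidence) * (len(ranked) - 1))
    return ranked[idx]

firm = assured_capacity(scenarios)  # well below the mean, by construction
```

The paper's approach extracts this figure with data mining over richly featured scenarios; the quantile view only conveys why a large scenario set makes the assured figure deliberately conservative.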

Relevância:

30.00%

Resumo:

In recent decades, all over the world, competition in the electric power sector has deeply changed the way this sector's agents play their roles. In most countries, electricity deregulation was conducted in stages, beginning with the clients at higher voltage levels and with larger electricity consumption, and later extended to all electricity consumers. The sector's liberalization and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required. However, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, electricity needs to be delivered to the electrical system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses the safety limits, then cheap but distant generation might have to be replaced by more expensive but closer generation to reduce the excess power flows. In a congested area, the optimal price of electricity rises to the marginal cost of the local generation, or to the level needed to ration demand to the amount of available electricity. Even without congestion, some power is lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to a generation site. Locational marginal prices (LMP), resulting from bidding competition, represent electrical and economic values at nodes or in areas and may provide economic indicator signals to market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks.
To test our methodology, we used an LMP database from the California Independent System Operator for 2009 to identify economical zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California's high-voltage wholesale power grid.) To group the buses into typical classes, each representing a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
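A minimal k-means pass over one-dimensional LMP data shows the zoning idea; the prices below are invented, not CAISO's, and a real two-step clustering would also use the separate LMP components:

```python
# Sketch of zone identification by k-means on bus LMPs (synthetic data).
# Buses with similar prices end up in the same "economical zone".
import random

random.seed(1)

# Hypothetical bus LMPs ($/MWh): two price zones around 35 and 55.
lmps = [random.gauss(35, 2) for _ in range(50)] + \
       [random.gauss(55, 2) for _ in range(50)]

def kmeans_1d(data, k, iters=50):
    centers = random.sample(data, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            i = min(range(k), key=lambda c: abs(x - centers[c]))
            clusters[i].append(x)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers), clusters

zone_centers, _ = kmeans_1d(lmps, k=2)  # one representative price per zone
```

Each recovered center acts as the typical price of its zone, which is the kind of aggregate signal the article mines for expansion-planning insight.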

Relevância:

30.00%

Resumo:

Group decision making plays an important role in organizations, especially in the present-day economy that demands high-quality, yet quick, decisions. Group decision-support systems (GDSSs) are interactive computer-based environments that support concerted, coordinated team efforts toward the completion of joint tasks. The need for collaborative work in organizations has led to the development of a set of general collaborative computer-supported technologies and of specific GDSSs that support groups distributed in time and space across various domains. However, each person is unique and reacts differently to the same arguments; many times a disagreement arises because of the way we began arguing, not because of the content itself. Nevertheless, emotion, mood, and personality factors have not yet been addressed in GDSSs, despite how strongly they influence results. Our group's previous work considered the roles that emotion and mood play in decision making. In this article, we reformulate these factors and include personality as well. Thus, this work incorporates personality, emotion, and mood in the negotiation process of an argument-based group decision-making process. Our main goal is to improve the negotiation process through argumentation using the affective characteristics of the involved participants. Each participant agent represents a group decision member; this representation lets us simulate people with different personalities. The discussion between group members (agents) is conducted through the exchange of persuasive arguments. Although our multiagent architecture model includes two types of agents, the facilitator and the participant, this article focuses on the emotional, personality, and argumentation components of the participant agent.
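As a toy illustration of the last point (our own simplification, not the article's model), a participant agent can expose personality and mood as numeric traits that shift the strength an argument needs in order to persuade it:

```python
# Hypothetical participant agent: acceptance of a persuasive argument
# depends on an agreeableness trait and the agent's current mood.
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    agreeableness: float  # 0..1, personality trait (assumed scale)
    mood: float           # -1 (negative) .. 1 (positive)

    def accepts(self, argument_strength: float) -> bool:
        # Invented rule: positive mood and an agreeable personality
        # lower the strength needed for the argument to persuade.
        threshold = 0.8 - 0.3 * self.agreeableness - 0.2 * self.mood
        return argument_strength >= threshold

ana = Participant("Ana", agreeableness=0.9, mood=0.5)   # easily persuaded
bob = Participant("Bob", agreeableness=0.1, mood=-0.5)  # hard to persuade
```

The same argument then succeeds with one simulated member and fails with another, which is the behavioral variety the article's negotiation process exploits.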

Relevância:

30.00%

Resumo:

In Portugal, at the end of the 1990s, positive-discrimination strategies began to be developed to guarantee the completion of compulsory schooling and to fight school and social exclusion, particularly on the peripheries of large cities. It is in this context that the schools and Educational Territories of Priority Intervention (TEIP) emerged, through Despacho nº147-B/ME/96 of 1 August 1996. This program, initially inspired by the French priority intervention zones, called for the involvement of several local partners (teachers, students, non-teaching staff, parents' associations, local authorities, cultural associations, and recreational associations) in the preparation of the Educational Project. Under the XVII Constitutional Government, the program was redefined to include new strands: the need for a school-specific educational project; external consultancy; and periodic evaluation of results in different domains (failure and dropout rates, attendance, behavior, participation, organizational innovations, educational partnerships established under the Program, etc.). It was also extended to the whole national territory, currently involving one hundred and five school clusters. With this presentation we aim to analyze whether the new policies and guidelines in the field of priority education have contributed to the emergence of new pedagogical, organizational, and community-involvement strategies. Our analysis will be based on the following elements: (i) the national report of the TEIP program (2010-2011); (ii) reports from the external school evaluation program; (iii) interviews with local actors, namely TEIP project coordinators and consultants.
The conclusions of our communication will focus on the role of local actors in the development of the TEIP Program and on this program's impact on improving academic results, reducing indiscipline and school violence, and building life paths that counter tendencies toward school and social exclusion.

Relevância:

30.00%

Resumo:

Master's dissertation in Electrical Engineering – Electric Power Systems (Mestrado em Engenharia Electrotécnica – Sistemas Eléctricos de Energia)

Relevância:

30.00%

Resumo:

OBJECTIVE: A cross-sectional population-based study was conducted to assess, in active smokers, the relationship of the number of cigarettes smoked and other characteristics to salivary cotinine concentrations. METHODS: A random sample of active smokers aged 15 years or older was selected using a stepwise cluster sample strategy in the year 2000 in Rio de Janeiro, Brazil. The study included 401 subjects. Salivary cotinine concentration was determined using gas chromatography with nitrogen-phosphorus detection. A standard questionnaire was used to collect demographic and smoking behavioral data. The relation between the number of cigarettes smoked in the last 24h and cotinine level was examined by means of robust locally weighted regression, a nonparametric fitting technique. RESULTS: Significantly (p<0.05) higher adjusted mean cotinine levels were found in subjects smoking their first cigarette within five minutes after waking up, and in those smoking 1-20 cigarettes in the last 24h who reported inhaling more than half the time. Among those smoking 1-20 cigarettes, the slope was significantly higher for subjects waiting more than five minutes before smoking their first cigarette after waking up, and for those smoking "light" cigarettes, when compared with their counterparts. These heterogeneities became negligible and non-significant when subjects with cotinine >40 ng/mL per cigarette were excluded. CONCLUSIONS: A positive association was found between higher cotinine levels and both self-reported smoking within five minutes of waking up and inhaling more than half the time; these behaviors can be markers of dependence and higher nicotine intake. Salivary cotinine proved to be a useful biomarker of recent smoking and can be used in epidemiological studies and smoking cessation programs.
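The smoothing technique mentioned above can be sketched as a sequence of tricube-weighted local linear fits; the data and bandwidth below are invented for illustration, and the study's actual robust fitting procedure is not reproduced:

```python
# Illustrative LOWESS-style smoother: at each point, fit a weighted
# least-squares line using tricube weights over the nearest neighbours.
def lowess(xs, ys, frac=0.5):
    n = len(xs)
    r = max(2, int(frac * n))  # neighbourhood size
    fitted = []
    for x0 in xs:
        dists = sorted(abs(x - x0) for x in xs)
        h = dists[r - 1] or 1e-12            # local bandwidth
        w = [max(0.0, 1 - (abs(x - x0) / h) ** 3) ** 3 for x in xs]
        # Weighted least-squares line: solve the normal equations by hand.
        sw = sum(w)
        swx = sum(wi * x for wi, x in zip(w, xs))
        swy = sum(wi * y for wi, y in zip(w, ys))
        swxx = sum(wi * x * x for wi, x in zip(w, xs))
        swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        denom = sw * swxx - swx ** 2
        if abs(denom) < 1e-12:
            fitted.append(swy / sw)
            continue
        b = (sw * swxy - swx * swy) / denom
        a = (swy - b * swx) / sw
        fitted.append(a + b * x0)
    return fitted

# Hypothetical data: cotinine roughly linear in cigarettes smoked, plus noise.
xs = list(range(1, 21))
ys = [15.0 * x + (3.0 if x % 2 else -3.0) for x in xs]
smooth = lowess(xs, ys)
```

The local fits recover the underlying trend without assuming a single global functional form, which is why the technique suits dose-response curves like cotinine versus cigarettes smoked.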

Relevância:

30.00%

Resumo:

Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve as signature variability, the number of endmembers, and the signal-to-noise ratio increase. Even so, some endmembers are always incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
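The dependence induced by the sum-to-one constraint is easy to verify numerically. The sketch below samples uniform (Dirichlet(1,1,1)) abundance vectors for three hypothetical endmembers and estimates the correlation between two fractions, which theory puts at -0.5 rather than the 0 that independence would require:

```python
# Numerical check: abundance fractions constrained to sum to one
# cannot be statistically independent (simulated data, 3 endmembers).
import random

random.seed(7)

def dirichlet3():
    # Dirichlet(1,1,1) sample via normalized exponentials.
    g = [random.expovariate(1.0) for _ in range(3)]
    s = sum(g)
    return [x / s for x in g]

samples = [dirichlet3() for _ in range(5000)]
a1 = [s[0] for s in samples]
a2 = [s[1] for s in samples]

def corr(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((x - mu) * (y - mv) for x, y in zip(u, v)) / n
    su = (sum((x - mu) ** 2 for x in u) / n) ** 0.5
    sv = (sum((y - mv) ** 2 for y in v) / n) ** 0.5
    return cov / (su * sv)

rho = corr(a1, a2)  # theory for Dirichlet(1,1,1): rho = -1/(k-1) = -0.5
```

A markedly negative correlation between fractions is exactly the violation of assumption 2 that degrades ICA/IFA on abundance sources.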

Relevância:

30.00%

Resumo:

Chapter in book proceedings with peer review: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.

Relevância:

30.00%

Resumo:

23rd SPACE AGM and Conference, 9-12 May 2012. Conference theme: The Role of Professional Higher Education: Responsibility and Reflection. Venue: Mikkeli University of Applied Sciences, Mikkeli, Finland.

Relevância:

30.00%

Resumo:

Introduction: Hearing loss has a marked impact on the development and academic progress of a child. In several developed countries, early detection is part of the national health plan through universal neonatal hearing screening (UNHS) and also through school hearing screening programs (SHSP), but only a few countries have published national data and revised protocols. Currently in Portugal, UNHS is implemented in the main district hospitals, but the SHSP is not, and there are still no concrete data or published studies on the national situation. Objectives: To study the incidence of hearing loss and otological problems in school communities in the north of the country, with 2550 participants between 3 and 17 years old. Methods: Statistical data were collected within the schools using a standard hearing screening protocol. All participants were evaluated with the same protocol: an audiological anamnesis, otoscopy, and an audiometric screening exam (500, 1000, 2000, and 4000 Hz). Results: Various otological problems were identified, and the audiometric screening revealed auditory thresholds indicating unilateral or bilateral hearing loss in about 5.7% of the cases. Conclusions: The study demonstrated that school hearing screening should take place as early as possible and be part of primary health care, in order to identify children and direct them to appropriate rehabilitation, education, and follow-up, thus reducing the high costs of late treatment.
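The screening decision described in the Methods can be sketched as a simple pass/refer rule; the 20 dB HL pass level is a common choice assumed here for illustration, not taken from the study's protocol:

```python
# Hedged sketch of a pure-tone screening decision at the frequencies used
# in the study; the pass level is an assumed, common choice.
SCREEN_FREQS_HZ = (500, 1000, 2000, 4000)
PASS_LEVEL_DB_HL = 20

def screening_result(thresholds):
    """thresholds: dict freq -> {'left': dB HL, 'right': dB HL}.
    Returns 'pass', 'refer unilateral', or 'refer bilateral'."""
    fails = {ear: any(thresholds[f][ear] > PASS_LEVEL_DB_HL
                      for f in SCREEN_FREQS_HZ)
             for ear in ('left', 'right')}
    if not any(fails.values()):
        return 'pass'
    return 'refer bilateral' if all(fails.values()) else 'refer unilateral'

child = {f: {'left': 10, 'right': 10} for f in SCREEN_FREQS_HZ}
child[4000]['right'] = 35  # elevated threshold in one ear only
```

A child failing in one ear only would be flagged as a unilateral referral, the category the study reports alongside bilateral losses.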

Relevância:

30.00%

Resumo:

This study focused on the development of a sensitive enzymatic biosensor for the determination of the pesticide pirimicarb, based on the immobilization of laccase on composite carbon paste electrodes. A multi-walled carbon nanotube (MWCNT) paste electrode modified by dispersion of laccase (3%, w/w) within the optimum composite matrix (60:40%, w/w, MWCNTs and paraffin binder) showed the best performance, with excellent electron-transfer kinetics and catalytic effects related to the redox process of the substrate 4-aminophenol. No metal or anti-interference membrane was added. Based on the inhibition of laccase activity, pirimicarb can be determined in the range 9.90×10⁻⁷ to 1.15×10⁻⁵ mol L⁻¹ using 4-aminophenol as substrate at the optimum pH of 5.0, with acceptable repeatability and reproducibility (relative standard deviations lower than 5%). The limit of detection obtained was 1.8×10⁻⁷ mol L⁻¹ (0.04 mg kg⁻¹ on a fresh-weight vegetable basis). The high activity and catalytic properties of the laccase-based biosensor are retained for about one month. The optimized electroanalytical protocol, coupled to the QuEChERS sample-preparation methodology, was applied to tomato and lettuce samples spiked at three levels; recoveries ranging from 91.0 ± 0.1% to 101.0 ± 0.3% were attained. No significant effects on the pirimicarb electroanalysis were observed in the presence of pro-vitamin A, vitamins B1 and C, and glucose in the vegetable extracts. The proposed biosensor-based pesticide residue methodology fulfills all requisites for use in the implementation of food safety programs.
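As a back-of-the-envelope illustration of where a detection limit like the one reported can come from, the common 3σ/slope rule applied to hypothetical blank readings and a hypothetical calibration slope looks like this (none of the numbers are the paper's raw data):

```python
# Illustrative LOD calculation: LOD = 3 * stdev(blank) / calibration slope.
# Blank signals and slope are invented for the sketch.
blank_signals = [0.0102, 0.0098, 0.0101, 0.0097, 0.0100, 0.0102]  # µA

def stdev(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

slope = 3.5e3  # µA per (mol/L), hypothetical calibration slope
lod = 3 * stdev(blank_signals) / slope  # mol/L
```

The smaller the blank noise and the steeper the calibration, the lower the attainable detection limit, which is why the catalytic MWCNT matrix matters for sensitivity.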

Relevância:

30.00%

Resumo:

Future distribution systems will have to deal with an intensive penetration of distributed energy resources, ensuring reliable and secure operation according to the smart grid paradigm. SCADA (Supervisory Control and Data Acquisition) is an essential infrastructure for this evolution. This paper proposes a new conceptual design of an intelligent SCADA with a decentralized, flexible, and intelligent approach, adaptive to the context (context awareness). This SCADA model is used to support the energy resource management undertaken by a distribution network operator (DNO). Resource management considers all the involved costs, power flows, and electricity prices, allowing the use of network reconfiguration and load curtailment. Locational Marginal Prices (LMP) are evaluated and used in specific situations to apply Demand Response (DR) programs on a global or a local basis. The paper includes a case study using a 114-bus distribution network and load demand based on real data.
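A conceptual sketch (our own simplification, not the paper's decision logic) of using LMPs to trigger local DR might look like this; the trigger price and bus data are invented:

```python
# Hypothetical rule: apply a local DR program at any bus whose locational
# marginal price exceeds a trigger threshold.
LMP_TRIGGER = 80.0  # $/MWh, invented trigger price

def dr_actions(bus_lmps, trigger=LMP_TRIGGER):
    """Return the buses where a local DR program should be applied."""
    return sorted(bus for bus, price in bus_lmps.items() if price > trigger)

lmps = {1: 42.0, 7: 95.5, 12: 81.2, 30: 60.3}
buses_to_curtail = dr_actions(lmps)  # → [7, 12]
```

In the paper's design this decision would sit inside the context-aware SCADA, combined with reconfiguration options and cost evaluation rather than a single threshold.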

Relevância:

30.00%

Resumo:

After a historical introduction, the bulk of the thesis concerns the study of a declarative semantics for logic programs. The main original contributions are:

- WFSX (Well-Founded Semantics with eXplicit negation), a new semantics for logic programs with explicit negation (i.e. extended logic programs), which compares favourably in its properties with other extant semantics.
- A generic characterization schema that facilitates comparisons among a diversity of semantics of extended logic programs, including WFSX.
- An autoepistemic logic and a default logic corresponding to WFSX, which solve existing problems of the classical approaches to autoepistemic and default logics, and clarify the meaning of explicit negation in logic programs.
- A framework for defining a spectrum of semantics of extended logic programs based on the abduction of negative hypotheses. This framework allows for the characterization of different levels of scepticism/credulity, consensuality, and argumentation. One of the semantics of abduction coincides with WFSX.
- O-semantics, a semantics that uniquely adds more CWA hypotheses to WFSX. The techniques used for doing so are also applicable to the well-founded semantics of normal logic programs.
- Two approaches for dealing with the contradiction that may appear once explicit negation is introduced into logic programs, together with a proof of their equivalence. One approach avoids contradiction and is based on restrictions in the adoption of abductive hypotheses; the other removes contradiction and is based on a transformation of contradictory programs into noncontradictory ones, guided by the reasons for the contradiction.

Relevância:

30.00%

Resumo:

The Maxwell equations play a fundamental role in electromagnetic theory and lead to models useful in physics and engineering. This formalism involves integer-order differential calculus, but electromagnetic diffusion points toward the adoption of a fractional calculus approach. This study addresses the skin effect and develops a new method for implementing fractional-order inductive elements. Two genetic algorithms are adopted, one for the numerical evaluation of the system and another for parameter identification, both with good results.
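A minimal genetic algorithm of the kind used for parameter identification can be sketched as follows; the quadratic fitness and all settings are illustrative stand-ins, not the paper's skin-effect model:

```python
# Minimal GA sketch: evolve a population of candidate parameter values to
# minimize a fitness function (here an invented quadratic).
import random

random.seed(3)

def fitness(theta):
    return (theta - 1.75) ** 2  # unknown "true" parameter at 1.75

def evolve(pop_size=30, generations=60, mut=0.1):
    pop = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2                 # arithmetic crossover
            child += random.gauss(0, mut)       # Gaussian mutation
            children.append(child)
        pop = parents + children                # elitist survivors + offspring
    return min(pop, key=fitness)

best = evolve()
```

In the fractional-order setting the fitness would compare the element's simulated response against the target skin-effect behaviour, with the GA searching the fractional parameters.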