18 results for PROPOSED APPROACH
at Instituto Politécnico do Porto, Portugal
Abstract:
Energy systems worldwide are complex and challenging environments. Multi-agent simulation platforms are increasing at a high rate, as they have proven to be a good option for studying many issues related to these systems, as well as the players that act in this domain. In this scope, the authors’ research group has developed a multi-agent system, MASCEM (Multi-Agent System for Competitive Electricity Markets), which simulates electricity markets. MASCEM is integrated with ALBidS (Adaptive Learning Strategic Bidding System), which works as a decision support system for market players. ALBidS allows MASCEM’s negotiating market players to take the best possible advantage of the market context. However, it is still necessary to adequately optimize the players’ portfolio investment. For this purpose, this paper proposes a market portfolio optimization method, based on particle swarm optimization, which provides the best investment profile for a market player, considering the different markets the player is acting on at each moment and depending on different negotiation contexts, such as the peak and off-peak periods of the day and the type of day (business day, weekend, holiday, etc.). The proposed approach is tested and validated using real electricity market data from the Iberian operator, OMIE.
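A minimal Python sketch of the kind of particle swarm search described above. The three markets, the expected prices and the linear profit function are hypothetical stand-ins; the paper's actual objective and its handling of OMIE data are not detailed in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical expected prices (EUR/MWh) per market for one negotiation
# context (e.g. business day, peak period); real values would come from OMIE.
expected_price = np.array([45.0, 42.5, 47.0])   # day-ahead, intraday, bilateral
n_markets = expected_price.size

def profit(weights):
    """Expected profit of selling 1 MWh split across the markets."""
    return float(expected_price @ weights)

def normalize(x):
    """Project a particle onto non-negative allocations summing to 1."""
    x = np.clip(x, 0.0, None)
    s = x.sum()
    return x / s if s > 0 else np.full_like(x, 1.0 / x.size)

# Standard PSO loop (maximization)
n_particles, n_iter, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
pos = np.array([normalize(rng.random(n_markets)) for _ in range(n_particles)])
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([profit(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.array([normalize(p) for p in pos + vel])
    vals = np.array([profit(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("best allocation:", gbest.round(3), "expected profit:", round(profit(gbest), 2))
```

With a linear objective the optimum trivially concentrates on the best-priced market; the paper's per-context, multi-period profile is where the method earns its keep.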
Abstract:
Smart Grids (SGs) have emerged as the new paradigm for power system operation and management, being designed to include large amounts of distributed energy resources. This new paradigm requires new Energy Resource Management (ERM) methodologies considering different operation strategies and the existence of new management players, such as several types of aggregators. This paper proposes a methodology to facilitate coalition formation among distributed generation units, originating Virtual Power Players (VPPs), based on a game theory approach. The proposed approach consists in analysing the classifications attributed by each VPP to the distributed generation units, as well as the contracts previously established by each player. The proposed classification model is based on fourteen parameters, including technical, economic, and behavioural ones. Depending on each VPP’s strategy, size, and goals, each parameter has a different importance. VPPs can also manage other types of energy resources, such as storage units, electric vehicles, demand response programs, or even parts of the MV and LV distribution networks. A case study with twelve VPPs with different characteristics and one hundred and fifty real distributed generation units is included in the paper.
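As a rough illustration of a classification model with VPP-dependent parameter importance, the sketch below scores DG units by a weighted sum over fourteen parameters. The unit scores, weight vectors and VPP names are hypothetical; the paper's actual fourteen parameters are not listed in the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical scores of 5 DG units on the fourteen parameters (0..1 scale);
# the abstract groups the parameters as technical, economic and behavioural.
unit_scores = rng.random((5, 14))

# Each VPP weighs the parameters differently according to its strategy, size
# and goals; these weight vectors are purely illustrative.
vpp_weights = {
    "technical_vpp": np.r_[np.full(6, 0.12), np.full(4, 0.05), np.full(4, 0.02)],
    "economic_vpp":  np.r_[np.full(6, 0.04), np.full(4, 0.15), np.full(4, 0.04)],
}

for vpp, w in vpp_weights.items():
    w = w / w.sum()                        # normalize importance weights
    classification = unit_scores @ w       # one score per DG unit
    ranking = np.argsort(-classification)  # best-classified units first
    print(vpp, "prefers units", ranking.tolist())
```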
Abstract:
In recent years, the use of several new resources in power systems, such as distributed generation, demand response and, more recently, electric vehicles, has increased significantly. Power systems aim at lowering operational costs, which requires adequate energy resource management. In this context, load consumption management plays an important role, making it necessary to use optimization strategies to adjust consumption to the supply profile. These optimization strategies can be integrated in demand response programs. Controlling the energy consumption of an intelligent house aims at optimizing the load consumption. This paper presents a genetic algorithm approach to manage the consumption of a residential house, making use of a SCADA system developed by the authors. Consumption management is performed by reducing or curtailing loads to keep the power consumption at, or below, a specified energy consumption limit. This limit is determined according to the consumer’s strategy, taking into account the renewable-based microgeneration, energy price, supplier solicitations, and consumer preferences. The proposed approach is compared with a mixed-integer non-linear approach.
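A compact genetic algorithm sketch in the spirit of the abstract: a binary chromosome selects which loads stay on, maximizing the kept consumer preference while penalizing consumption above the limit. The load powers, preference weights and GA settings are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical house loads: power (kW) and a consumer-preference weight.
power = np.array([2.0, 1.5, 1.0, 0.8, 0.5, 0.3])
pref  = np.array([0.9, 0.4, 0.8, 0.3, 0.6, 0.2])
limit = 3.0   # kW, set from micro-generation, price and supplier requests

def fitness(chrom):
    """Kept preference, heavily penalized if the consumption limit is exceeded."""
    excess = max(0.0, float(power @ chrom) - limit)
    return float(pref @ chrom) - 100.0 * excess

pop = rng.integers(0, 2, size=(30, power.size))
for _ in range(200):
    fit = np.array([fitness(c) for c in pop])
    # tournament selection between random pairs
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(fit[idx[:, 0]] >= fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # one-point crossover
    cut = rng.integers(1, power.size, size=len(pop) // 2)
    children = parents.copy()
    for i, c in enumerate(cut):
        children[2*i, c:], children[2*i+1, c:] = parents[2*i+1, c:], parents[2*i, c:]
    # bit-flip mutation
    flip = rng.random(children.shape) < 0.05
    pop = np.where(flip, 1 - children, children)

best = pop[np.argmax([fitness(c) for c in pop])]
print("loads kept on:", best, "total power:", float(power @ best), "kW")
```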
Abstract:
This paper presents a methodology to address reactive power compensation using the Evolutionary Particle Swarm Optimization (EPSO) technique, programmed in the MATLAB environment. The main objective is to find the best operating point that minimizes power losses through reactive power compensation, subject to all operational constraints, namely the full AC power flow equations and the active and reactive power generation limits. The methodology has been tested on the IEEE 14-bus test system, demonstrating the ability and effectiveness of the proposed approach in handling the reactive power compensation problem.
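The EPSO core, replicating each particle with Gaussian mutation of its strategic parameters and then selecting the best replica, can be sketched as below. The loss function is a toy stand-in; a real implementation would evaluate the full AC power flow with all constraints:

```python
import numpy as np

rng = np.random.default_rng(3)

def losses(q):
    """Toy stand-in for active power losses (MW) as a function of reactive
    compensation (Mvar) at two buses; a real run solves the AC power flow."""
    return 1.0 + 0.02 * (q[0] - 6.0) ** 2 + 0.03 * (q[1] - 3.5) ** 2

lo, hi = 0.0, 10.0                      # compensation limits (Mvar)
n, iters = 12, 150
pos = rng.uniform(lo, hi, (n, 2))
vel = np.zeros((n, 2))
weights = rng.random((n, 3))            # strategic parameters: inertia, memory, cooperation
pbest = pos.copy()
pbest_val = np.array([losses(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    for i in range(n):
        # replicate the particle: original weights vs. Gaussian-mutated weights
        candidates = (weights[i],
                      np.clip(weights[i] * (1 + 0.2 * rng.standard_normal(3)), 0.0, 2.0))
        offspring = []
        for w in candidates:
            v = w[0] * vel[i] + w[1] * (pbest[i] - pos[i]) + w[2] * (gbest - pos[i])
            x = np.clip(pos[i] + v, lo, hi)
            offspring.append((losses(x), x, v, w))
        val, x, v, w = min(offspring, key=lambda t: t[0])   # selection keeps the best replica
        pos[i], vel[i], weights[i] = x, v, w
        if val < pbest_val[i]:
            pbest[i], pbest_val[i] = x, val
    gbest = pbest[pbest_val.argmin()].copy()

print("compensation (Mvar):", gbest.round(2), "losses (MW):", round(losses(gbest), 4))
```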
Abstract:
In the last two decades, there has been a proliferation of programming exercise formats that hinders interoperability in automatic assessment. In the absence of a widely accepted standard, a pragmatic solution is to convert content among the existing formats. BabeLO is a programming exercise converter providing services to a network of heterogeneous e-learning systems, such as contest management systems, programming exercise authoring tools, evaluation engines and repositories of learning objects. Its main feature is the use of a pivotal format to achieve greater extensibility: supporting a new format requires only conversion to and from the pivotal format. This paper starts with an analysis of programming exercise formats representative of the existing diversity. This analysis sets the context for the proposed approach to exercise conversion and for the description of the pivotal data format. The abstract service definition is the basis for the design of BabeLO, its components and its web service interface. The paper includes a report on the use of BabeLO in two concrete scenarios: relocating exercises to a different repository, and using an evaluation engine in a network of heterogeneous systems.
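The pivotal-format idea can be illustrated with a small sketch: each format implements only conversions to and from the pivot, so n formats need n plugins instead of n(n-1) pairwise converters. The class and field names below are hypothetical, not BabeLO's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class PivotExercise:
    """Hypothetical pivotal format: the common model all formats map through."""
    title: str
    statement: str
    tests: list = field(default_factory=list)   # (stdin, expected_stdout) pairs

class FormatPlugin:
    """Each supported exercise format implements only these two conversions."""
    def to_pivot(self, raw: dict) -> PivotExercise: ...
    def from_pivot(self, ex: PivotExercise) -> dict: ...

class FormatA(FormatPlugin):
    def to_pivot(self, raw):
        return PivotExercise(raw["name"], raw["text"], raw.get("cases", []))
    def from_pivot(self, ex):
        return {"name": ex.title, "text": ex.statement, "cases": ex.tests}

class FormatB(FormatPlugin):
    def to_pivot(self, raw):
        return PivotExercise(raw["title"], raw["body"], raw.get("io", []))
    def from_pivot(self, ex):
        return {"title": ex.title, "body": ex.statement, "io": ex.tests}

def convert(raw, src: FormatPlugin, dst: FormatPlugin) -> dict:
    """Any-to-any conversion routed through the pivotal format."""
    return dst.from_pivot(src.to_pivot(raw))

print(convert({"name": "Sum", "text": "Add two ints", "cases": [("1 2", "3")]},
              FormatA(), FormatB()))
```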
Abstract:
High-level parallel languages offer a simple way for application programmers to specify parallelism in a form that easily scales with problem size, leaving the scheduling of the tasks onto processors to be performed at runtime. Therefore, if the underlying system cannot efficiently execute those applications on the available cores, the benefits will be lost. In this paper, we consider how to schedule highly heterogeneous parallel applications that require real-time performance guarantees on multicore processors. The paper proposes a novel scheduling approach that combines the global Earliest Deadline First (EDF) scheduler with a priority-aware work-stealing load balancing scheme, which enables parallel real-time tasks to be executed on more than one processor at a given time instant. Experimental results demonstrate the better scalability and lower scheduling overhead of the proposed approach compared to an existing real-time deadline-oriented scheduling class for the Linux kernel.
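A toy illustration of priority-aware stealing: an idle core picks the victim holding the earliest-deadline (highest-priority) work instead of a random one, so subtasks of one parallel task can run on several cores at the same instant. This ignores the deque mechanics and schedulability analysis of a real scheduler; all task names and deadlines are made up:

```python
import heapq

# Each core keeps a local queue of subtasks ordered by absolute deadline (EDF).
cores = {0: [], 1: [], 2: [], 3: []}

def spawn(core, deadline, name):
    heapq.heappush(cores[core], (deadline, name))

def steal(thief):
    """Priority-aware victim selection: steal from the core whose pending
    work has the earliest deadline, not from a random victim."""
    victims = [(q[0][0], c) for c, q in cores.items() if c != thief and q]
    if not victims:
        return None
    _, victim = min(victims)
    return heapq.heappop(cores[victim])

# Parallel task tau1 (deadline 10) forks subtasks on core 0; tau2 on core 1.
for s in ("tau1.a", "tau1.b", "tau1.c"):
    spawn(0, 10, s)
spawn(1, 25, "tau2.a")

# Idle cores 2 and 3 both pick up tau1 subtasks first, executing the
# parallel task on more than one processor at the same time instant.
print(steal(2), steal(3))   # -> (10, 'tau1.a') (10, 'tau1.b')
```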
Abstract:
It has been shown that, in practice, at least two general scenarios of data structuring are possible: (a) a self-similar (SS) scenario, when the measured data form an SS structure, and (b) a quasi-periodic (QP) scenario, when the repeated (strongly correlated) data form random sequences that are almost periodic with respect to each other. In the second case it becomes possible to describe their behavior and to express a part of their randomness quantitatively in terms of the deterministic amplitude–frequency response belonging to the generalized Prony spectrum. This possibility allows us to re-examine the conventional concept of measurements and opens a new way for the description of a wide set of different data. In particular, this concerns complex systems for which no ‘best-fit’ model pretending to describe the measured data is available, but where there is a clear need to describe these data in terms of a reduced number of quantitative parameters. The possibilities of the proposed approach and of the detection algorithm for QP processes were demonstrated on actual data: spectroscopic data recorded for pure water and acoustic data for a test hole. The suggested methodology allows revising the accepted classification of different incommensurable and self-affine spatial structures and finding an accurate interpretation of generalized Prony spectroscopy, which includes Fourier spectroscopy as a partial case.
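The generalized Prony spectrum itself is beyond an abstract-length sketch, but the classical Prony method it generalizes (and which contains Fourier analysis as a partial case) can be illustrated: fit uniform samples with a sum of damped exponentials via linear prediction. The test signal below is synthetic and noiseless:

```python
import numpy as np

# Model N uniform samples as y[n] = sum_k c_k z_k^n (p complex exponentials).
n = np.arange(200)
y = 1.5 * 0.99**n * np.cos(0.3 * n) + 0.8 * 0.97**n * np.cos(0.8 * n + 1.0)

p = 4  # model order: two real damped cosines = four complex exponentials
# 1) linear prediction: y[n] = a1*y[n-1] + ... + ap*y[n-p]
A = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
a, *_ = np.linalg.lstsq(A, y[p:], rcond=None)
# 2) roots of the prediction polynomial give the complex frequencies z_k
z = np.roots(np.r_[1.0, -a])
# 3) amplitudes c_k from a linear Vandermonde fit
V = np.vander(z, len(y), increasing=True).T
c, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)

print("damping factors |z_k|:", np.sort(np.abs(z)).round(4))
print("frequencies |arg z_k|:", np.sort(np.abs(np.angle(z))).round(3))
print("amplitudes |c_k|:", np.sort(np.abs(c)).round(3))
```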
Abstract:
Preemptions account for a non-negligible overhead during system execution. There has been a substantial amount of research on estimating the delay incurred due to the loss of working sets in the processor state (caches, registers, TLBs), and some on avoiding preemptions or limiting the preemption cost. We present an algorithm to reduce preemptions by further delaying the start of execution of high-priority tasks in fixed-priority scheduling. Our approach takes advantage of the floating non-preemptive regions model and exploits the fact that, during the schedule, the relative task phasing differs from the worst-case scenario in terms of admissible preemption deferral. Furthermore, approximations to reduce the complexity of the proposed approach are presented. A substantial set of experiments demonstrates that the approach and its approximations improve over existing work, in particular for high-utilisation systems, where savings of up to 22% in the number of preemptions are attained.
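A deliberately tiny trace showing the mechanism: if the admissible deferral covers the running job's remaining work when a higher-priority job arrives, the preemption disappears entirely. The two-task scenario, the time quanta and all numbers are hypothetical:

```python
def run(deferral):
    """Toy single-CPU trace returning the number of preemptions.
    Low-priority job: released at t=0, needs 5 units.
    High-priority job: released at t=4, needs 2 units.
    With a non-preemptive deferral budget >= 1, the low job's last unit
    finishes before the high job starts and the preemption vanishes."""
    t, low_left, preempts = 0, 5, 0
    while low_left > 0:
        if t == 4:                          # high-priority release
            if low_left <= deferral:        # deferral covers the remainder
                t += low_left
                low_left = 0
            else:                           # preempt after the allowed region
                grant = min(deferral, low_left)
                t += grant
                low_left -= grant
                preempts += 1
                t += 2                      # high-priority job executes
            continue
        t += 1
        low_left -= 1
    return preempts

print(run(deferral=0), run(deferral=1))    # -> 1 0
```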
Abstract:
This article describes a finite element-based formulation for the statistical analysis of the response of stochastic structural composite systems whose material properties are described by random fields. A first-order technique is used to obtain the second-order statistics of the structural response, considering means and variances of the displacement and stress fields of plate or shell composite structures. The propagation of uncertainties depends on sensitivities, taken as measures of variation effects. The adjoint variable method is used to obtain the sensitivity matrix. This method is appropriate for composite structures due to the large number of random input parameters. Dominant effects on the stochastic characteristics are studied by analyzing the influence of different random parameters. In particular, a study of the influence of anisotropy on the uncertainty propagation of angle-ply composites is carried out based on the proposed approach.
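As a worked illustration of the first-order technique mentioned above (the standard first-order second-moment formulation; the paper's exact notation is not given in the abstract), the response statistics follow from the sensitivity matrix of the response u with respect to the random parameters p, with mean input and input covariance:

```latex
u(\mathbf{p}) \approx u(\bar{\mathbf{p}}) + S\,(\mathbf{p}-\bar{\mathbf{p}}),
\qquad
S_{ij} = \left.\frac{\partial u_i}{\partial p_j}\right|_{\mathbf{p}=\bar{\mathbf{p}}},
\qquad
\mathrm{E}[\mathbf{u}] \approx u(\bar{\mathbf{p}}),
\qquad
\mathrm{Cov}[\mathbf{u}] \approx S\,C_{\mathbf{p}}\,S^{\mathsf{T}}.
```

The adjoint variable method computes the rows of S at a cost that scales with the number of responses rather than the number of parameters, which is why it suits composites with many random inputs.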
Abstract:
This paper proposes a dynamic scheduler that supports the coexistence of guaranteed and non-guaranteed bandwidth servers to efficiently handle overloads of soft tasks by making additional capacity available from two sources: (i) residual capacity, allocated but left unused when jobs complete in less than their budgeted execution time; and (ii) capacity stolen from inactive non-isolated servers used to schedule best-effort jobs. The effectiveness of the proposed approach in reducing the mean tardiness of periodic jobs is demonstrated through extensive simulations. The achieved results become even more significant when tasks’ computation times have a large variance.
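The two capacity sources can be sketched as simple bookkeeping: draw from residual capacity first, then steal budget from inactive non-isolated servers. The Server class and all values are illustrative, not the paper's actual server model:

```python
class Server:
    def __init__(self, name, budget, isolated, active=True):
        self.name, self.budget = name, budget
        self.isolated, self.active = isolated, active

def extra_capacity(servers, residual_pool, need):
    """Cover `need` units of overload: residual capacity first, then
    capacity stolen from inactive non-isolated servers."""
    got = min(residual_pool, need)
    for s in servers:
        if got >= need:
            break
        if not s.active and not s.isolated and s.budget > 0:
            take = min(s.budget, need - got)
            s.budget -= take
            got += take
    return got

servers = [Server("guaranteed", 4, isolated=True),
           Server("best_effort", 3, isolated=False, active=False)]
residual = 2   # a job completed 2 units under its budgeted execution time

# A soft task overruns by 4 units: 2 come from residual capacity,
# 2 are stolen from the inactive best-effort server.
print(extra_capacity(servers, residual, need=4))   # -> 4
```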
Abstract:
A QoS adaptation to dynamically changing system conditions that takes into consideration the user’s constraints on the stability of service provisioning is presented. The goal is to allow the system to make QoS adaptation decisions in response to fluctuations in task traffic flow, under the control of the user. We pay special attention to the case where monitoring the stability period and the resource load variation of Service Level Agreements for different types of services is used to dynamically adapt future stability periods, according to a feedback control scheme. The system’s adaptation behaviour can be configured according to a desired confidence level on future resource usage. The viability of the proposed approach is validated by preliminary experiments.
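One plausible reading of such a feedback scheme, sketched below: the next stability period is adapted proportionally to how far the observed load variation deviates from a tolerated level, scaled by the configured confidence level. The gain, bounds and update rule are assumptions, not the paper's actual controller:

```python
def next_stability_period(current, observed_variation, confidence,
                          target_variation=0.05, lo=1.0, hi=600.0):
    """Proportional feedback: shrink the next SLA stability period when the
    resource load varies more than tolerated, grow it when the load is steady.
    `confidence` in (0, 1] scales how aggressively the system reacts."""
    error = target_variation - observed_variation
    factor = 1.0 + confidence * error / target_variation
    factor = min(2.0, max(0.5, factor))        # bound the per-step adaptation
    return min(hi, max(lo, current * factor))

period = 60.0                                  # seconds
for variation in (0.02, 0.04, 0.12):           # observed load variation per window
    period = next_stability_period(period, variation, confidence=0.9)
    print(round(period, 1))                    # -> 92.4, 109.0, 54.5
```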
Abstract:
A simple procedure to measure the cohesive laws of bonded joints under mode I loading using the double cantilever beam test is proposed. The method only requires recording the applied load–displacement data and measuring the crack opening displacement at the crack tip in the course of the test. The strain energy release rate is obtained by a procedure involving the Timoshenko beam theory, the specimen’s compliance and the equivalent crack concept. Following the proposed approach, the influence of the fracture process zone is taken into account, which is fundamental for an accurate estimation of the failure process details. The cohesive law is obtained by differentiating the strain energy release rate as a function of the crack opening displacement. The model was validated numerically considering three representative cohesive laws. Numerical simulations using finite element analysis including cohesive zone modeling were performed. The good agreement between the input and resulting laws for all the cases considered validates the model. An experimental confirmation was also performed by comparing the numerical and experimental load–displacement curves. The numerical curves were obtained by adjusting typical cohesive laws to the ones measured experimentally following the proposed approach, using finite element analysis with cohesive zone modeling. Once again, good agreement was obtained, demonstrating the good performance of the proposed methodology.
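The two relations the procedure rests on can be written compactly (the standard Irwin–Kies compliance equation and the J-integral-type differentiation; symbols: P applied load, C compliance, b specimen width, a_e equivalent crack length, w crack-tip opening displacement):

```latex
G_{\mathrm{I}} = \frac{P^{2}}{2b}\,\frac{\mathrm{d}C}{\mathrm{d}a}\bigg|_{a=a_{e}},
\qquad
\sigma(w) = \frac{\mathrm{d}G_{\mathrm{I}}(w)}{\mathrm{d}w}.
```

Evaluating dC/da at the equivalent crack length a_e, rather than at the measured crack length, is what folds the fracture process zone into the estimate.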
Abstract:
The elastic behavior of demand consumption, jointly used with other available resources such as distributed generation (DG), can play a crucial role in the success of smart grids. The intensive use of Distributed Energy Resources (DER) and the technical and contractual constraints result in large-scale non-linear optimization problems that require computational intelligence methods to be solved. This paper proposes a Particle Swarm Optimization (PSO) based methodology to support the minimization of the operation costs of a virtual power player that manages the resources in a distribution network and the network itself. The resources include the DER available in the considered time period and the energy that can be bought from external energy suppliers. Network constraints are considered. The proposed approach uses Gaussian mutation of the strategic parameters and contextual self-parameterization of the maximum and minimum particle velocities. The case study considers a real 937-bus distribution network with 20,310 consumers and 548 distributed generators. The obtained solutions are compared with those of a deterministic approach, of PSO without mutation, and of Evolutionary PSO, the latter two using self-parameterization.
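The two PSO mechanisms named above can be sketched on a toy cost function: Gaussian mutation perturbs each particle's strategic parameters every iteration, and the velocity limits are re-parameterized as the search progresses. The objective and all constants are illustrative; the real problem is a large constrained network operation cost model:

```python
import numpy as np

rng = np.random.default_rng(5)

def cost(x):
    """Toy stand-in for the operation cost of a resource schedule x."""
    return float(np.sum((x - 3.0) ** 2))

dim, n, iters = 5, 15, 200
lo, hi = 0.0, 10.0
pos = rng.uniform(lo, hi, (n, dim))
vel = np.zeros((n, dim))
params = np.tile([0.7, 1.5, 1.5], (n, 1))   # strategic parameters per particle
pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for it in range(iters):
    # Gaussian mutation of the strategic parameters (inertia, c1, c2)
    params = np.clip(params + 0.1 * rng.standard_normal(params.shape), 0.1, 2.5)
    # self-parameterized velocity limits: shrink the admissible range over time
    vmax = 0.5 * (hi - lo) * (1.0 - it / iters) + 1e-3
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = (params[:, [0]] * vel
           + params[:, [1]] * r1 * (pbest - pos)
           + params[:, [2]] * r2 * (gbest - pos))
    vel = np.clip(vel, -vmax, vmax)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([cost(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print("minimum found near:", gbest.round(2))
```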
Abstract:
In today’s healthcare paradigm, optimal sedation during anesthesia plays an important role both in patient welfare and in the socio-economic context. For the closed-loop control of general anesthesia, two drugs have proven to have stable, rapid onset times: propofol and remifentanil. The effect of these drugs is reflected in the bispectral index, a measure derived from the EEG signal. In this paper, wavelet time–frequency analysis is used to extract useful information from the clinical signals, since they are time-varying and mark important changes in the patient’s response to drug dose. Model-based predictive control algorithms are employed to regulate the depth of sedation by manipulating these two drugs. The results of identification from real data and the simulation of closed-loop control performance suggest that the proposed approach can bring an improvement of 9% in overall robustness and may be suitable for clinical practice.
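A heavily simplified stand-in for the control layer: a first-order effect model and a finite-horizon predictive controller solved in closed form for a single drug. The paper identifies time-varying models from wavelet features and manipulates two drugs; the dynamics and every value below are illustrative only:

```python
import numpy as np

a, b, awake = 0.9, -0.5, 95.0   # hypothetical dynamics: the drug lowers the index
H, lam = 10, 1.0                # prediction horizon, control-effort weight
target = 50.0                   # desired bispectral index level

def mpc_step(e0):
    """Constant infusion u over the horizon minimizing
    sum_k (target - e[k])^2 + lam*u^2,
    for e[k+1] = a*e[k] + (1 - a)*awake + b*u (solved in closed form)."""
    k = np.arange(1, H + 1)
    free = a**k * e0 + (1 - a**k) * awake    # response with no drug
    forced = b * (1 - a**k) / (1 - a)        # response per unit of infusion
    return float(forced @ (target - free)) / (forced @ forced + lam)

e = 90.0                        # awake patient (high index)
for _ in range(6):
    u = max(0.0, mpc_step(e))   # infusion rates are non-negative
    e = a * e + (1 - a) * awake + b * u
    print("u =", round(u, 1), " index =", round(e, 1))
```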
Abstract:
Electricity markets are not only a new reality but an evolving one, as the involved players and rules change at a relatively high rate. Multi-agent simulation combined with Artificial Intelligence techniques may result in sophisticated and very helpful tools. This paper presents a new methodology for the management of coalitions in electricity markets. This approach is tested using the multi-agent market simulator MASCEM (Multi-Agent Simulator of Competitive Electricity Markets), taking advantage of its ability to provide the means to model and simulate Virtual Power Players (VPPs). VPPs are represented as coalitions of agents with the capability of negotiating both in the market and internally with their members, in order to combine and manage the members’ individual characteristics and goals with the strategy and objectives of the VPP itself. A case study using real data from the Iberian Electricity Market is performed to validate and illustrate the proposed approach.
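A toy sketch of a VPP acting as a coalition: internal negotiation dispatches the cheapest members first, and the resulting marginal cost shapes the coalition's market bid. The member names, costs and the bidding rule are hypothetical, not MASCEM's actual agent model:

```python
class Member:
    def __init__(self, name, capacity_mw, cost):
        self.name, self.capacity_mw, self.cost = name, capacity_mw, cost

class VPP:
    def __init__(self, members):
        self.members = members

    def internal_negotiation(self, demand_mw):
        """Dispatch cheapest members first; return (dispatch, marginal cost)."""
        dispatch, remaining, marginal = {}, demand_mw, 0.0
        for m in sorted(self.members, key=lambda m: m.cost):
            if remaining <= 0:
                break
            q = min(m.capacity_mw, remaining)
            dispatch[m.name], remaining, marginal = q, remaining - q, m.cost
        return dispatch, marginal

    def market_bid(self, demand_mw, margin=1.1):
        """Act in the market as a single agent on behalf of the coalition."""
        dispatch, marginal = self.internal_negotiation(demand_mw)
        return {"quantity_mw": sum(dispatch.values()), "price": marginal * margin}

vpp = VPP([Member("wind", 20, 18.0), Member("chp", 15, 35.0), Member("pv", 5, 22.0)])
print(vpp.market_bid(30))   # -> {'quantity_mw': 30, 'price': 38.5}
```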