65 results for water use optimization
at Instituto Politécnico do Porto, Portugal
Abstract:
In real optimization problems, the analytical expression of the objective function and its derivatives are usually unknown, or are too complex to work with. In these cases it becomes essential to use optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods, or derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, the choice of the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem. In this problem a step is accepted if it reduces either the objective function or the constraint violation, which makes filter methods less parameter dependent than a penalty function. In this work, we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of the simplex method and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
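A filter can be read as a list of non-dominated (objective value, constraint violation) pairs: a trial point is accepted only if no stored pair is at least as good in both measures. The snippet below is a minimal, hypothetical sketch of that acceptance rule in Python (the names `Filter`, `acceptable` and `add` are ours, and the paper's actual implementation is in Java):

```python
class Filter:
    """Minimal filter of (objective f, constraint violation h) pairs."""

    def __init__(self):
        self.entries = []  # non-dominated (f, h) pairs seen so far

    def acceptable(self, f, h):
        # A trial point is rejected if some stored entry dominates it,
        # i.e. is no worse in both the objective and the violation.
        return all(not (fe <= f and he <= h) for fe, he in self.entries)

    def add(self, f, h):
        # Keep only entries not dominated by the new pair, then store it.
        self.entries = [(fe, he) for fe, he in self.entries
                        if not (f <= fe and h <= he)]
        self.entries.append((f, h))


# Usage: accept a simplex trial point only if the filter allows it.
flt = Filter()
flt.add(10.0, 2.0)                # current iterate
print(flt.acceptable(9.0, 3.0))   # True: better objective, worse violation
print(flt.acceptable(11.0, 2.5))  # False: dominated by (10.0, 2.0)
```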
Abstract:
The filter method is a technique for solving nonlinear programming problems. The filter algorithm has two phases in each iteration: the first reduces a measure of infeasibility, while the second reduces the objective function value. In real optimization problems, the objective function is usually not differentiable or its derivatives are unknown. In these cases it becomes essential to use optimization methods where the calculation of the derivatives or the verification of their existence is not necessary: direct search methods, or derivative-free methods, are examples of such techniques. In this work we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of simplex and filter methods. This method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers.
Abstract:
The concept of demand response has been drawing attention to active participation in the economic operation of power systems, namely in the context of recent electricity markets and smart grid models and implementations. In these competitive contexts, aggregators are necessary to make the participation of small-size consumers and generation units possible. The methodology proposed in the present paper addresses demand shifting between periods, considering multi-period demand response events, with a focus on the impact in the subsequent periods. A Virtual Power Player operates the network, aggregating the available resources and minimizing the operation costs. The illustrative case study included is based on a scenario of 218 consumers and includes generation sources.
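To make the demand-shifting idea concrete, the toy linear program below shifts energy reduced during an event period into the subsequent periods while minimizing a simple supply cost. It is only an illustrative sketch under assumed data (prices, base loads and the shiftable limit are made up), not the paper's actual VPP model:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 periods, a DR event in period 1, prices in monetary units per kWh.
price = np.array([0.10, 0.20, 0.12, 0.08])
base_load = np.array([50.0, 60.0, 55.0, 45.0])
event_period = 1
max_reduction = 20.0            # kWh that can be shifted out of the event period

# Variables x = [r, s2, s3]: reduction in period 1, extra load in periods 2 and 3.
# Served-energy cost is sum_t price[t] * (base_load[t] - reduction[t] + shifted_in[t]);
# the constant base cost is dropped from the objective.
c = np.array([-price[event_period], price[2], price[3]])

# Energy conservation: everything shifted out must reappear later: -r + s2 + s3 = 0.
A_eq = np.array([[-1.0, 1.0, 1.0]])
b_eq = np.array([0.0])
bounds = [(0, max_reduction), (0, None), (0, None)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
r, s2, s3 = res.x
print(f"shift {r:.1f} kWh out of period {event_period}: "
      f"{s2:.1f} kWh to period 2, {s3:.1f} kWh to period 3")
```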
Abstract:
Demand response programs and models have been developed and implemented to improve the performance of electricity markets, taking full advantage of smart grids. Studying and addressing consumers’ flexibility and network operation scenarios makes it possible to design improved demand response models and programs. The methodology proposed in the present paper addresses the definition of demand response programs that consider demand shifting between periods, regarding the occurrence of multi-period demand response events. The optimization model focuses on minimizing the network and resource operation costs for a Virtual Power Player. Quantum Particle Swarm Optimization has been used to obtain the solutions of the optimization model, which is applied to a large set of operation scenarios. The implemented case study illustrates the use of the proposed methodology to support the decisions of the Virtual Power Player concerning the duration of each demand response event.
Abstract:
In recent years, the use of several new resources in power systems, such as distributed generation, demand response and, more recently, electric vehicles, has significantly increased. Power systems aim at lowering operational costs, which requires adequate energy resources management. In this context, load consumption management plays an important role, and optimization strategies are needed to adjust the consumption to the supply profile. These optimization strategies can be integrated in demand response programs. Controlling the energy consumption of an intelligent house aims to optimize the load consumption. This paper presents a genetic algorithm approach to manage the consumption of a residential house, making use of a SCADA system developed by the authors. Consumption management is done by reducing or curtailing loads to keep the power consumption at, or below, a specified energy consumption limit. This limit is determined according to the consumer strategy, taking into account the renewable-based micro generation, energy price, supplier solicitations, and consumers’ preferences. The proposed approach is compared with a mixed-integer non-linear approach.
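A genetic algorithm for this kind of load management typically encodes each candidate as per-load curtailment decisions and scores it by the comfort lost plus a penalty whenever the consumption limit is exceeded. The sketch below only illustrates that idea; the load data, weights and penalty factor are invented, not the paper's model:

```python
import random

# Hypothetical controllable loads: (power in kW, discomfort weight if curtailed).
LOADS = [(1.5, 3.0), (2.0, 1.0), (0.8, 2.0), (1.2, 0.5)]
LIMIT = 3.0        # kW consumption limit set by the consumer strategy
PENALTY = 100.0    # weight on exceeding the limit

def fitness(chromosome):
    """chromosome[i] = 1 keeps load i on, 0 curtails it. Lower is better."""
    consumption = sum(p for (p, _), keep in zip(LOADS, chromosome) if keep)
    discomfort = sum(w for (_, w), keep in zip(LOADS, chromosome) if not keep)
    excess = max(0.0, consumption - LIMIT)
    return discomfort + PENALTY * excess

# Tiny GA loop (selection of the best half plus bit-flip mutation),
# just to show the scoring function in use.
population = [[random.randint(0, 1) for _ in LOADS] for _ in range(20)]
for _ in range(50):
    population.sort(key=fitness)
    parents = population[:10]
    children = [[g ^ (random.random() < 0.1) for g in p] for p in parents]
    population = parents + children

best = min(population, key=fitness)
print("best schedule:", best, "fitness:", fitness(best))
```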
Abstract:
The demand response concept has been gaining increasing importance, while the success of several recent implementations makes the benefits of this resource unquestionable. This happens in a power systems operation environment that also considers an intensive use of distributed generation. However, more adequate approaches and models are needed to address the aggregation of small-size consumers and producers, while taking these resources’ goals into account. The present paper focuses on the management of demand response programs and distributed generation resources by a Virtual Power Player that aims to minimize its operation costs, taking the consumption shifting constraints into account. The impact of the consumption shifting on the distributed generation resources schedule is also considered. The methodology is applied to three scenarios based on 218 consumers and 4 types of distributed generation, in a time frame of 96 periods.
Abstract:
Ecological Water Quality - Water Treatment and Reuse
Abstract:
Preferential flow and transport through macropores affect plant water use efficiency and enhance leaching of agrochemicals and the transport of colloids, thereby increasing the risk for contamination of groundwater resources. The effects of soil compaction, expressed in terms of bulk density (BD), and organic carbon (OC) content on preferential flow and transport were investigated using 150 undisturbed soil cores sampled from 15 × 15 m grids on two field sites. Both fields had loamy textures, but one site had significantly higher OC content. Leaching experiments were conducted in each core by applying a constant irrigation rate of 10 mm h−1 with a pulse application of tritium tracer. Five percent tritium mass arrival times and apparent dispersivities were derived from each of the tracer breakthrough curves and correlated with texture, OC content, and BD to assess the spatial distribution of preferential flow and transport across the investigated fields. Soils from both fields showed strong positive correlations between BD and preferential flow. Interestingly, the relationships between BD and tracer transport characteristics were markedly different for the two fields, although the relationship between BD and macroporosity was nearly identical. The difference was likely caused by the higher contents of fines and OC at one of the fields leading to stronger aggregation, smaller matrix permeability, and a more pronounced pipe-like pore system with well-aligned macropores.
Abstract:
The deterioration of water quality by Cyanobacteria causes outbreaks and epidemics associated with harmful diseases in humans and animals because of the toxins that they release. Microcystin-LR is one of the most widely studied hepatotoxins, and the World Health Organization recommends a maximum value of 1 μg L−1 in drinking water. Highly specific recognition molecules, such as molecularly imprinted polymers, have been developed to quantify microcystins in waters for human use and have shown great potential in the analysis of these kinds of samples. The obtained results were auspicious, with the detection limit found, 1.5 μg L−1, being of the same order of magnitude as the guideline limit recommended by the WHO. This technology is very promising because the sensors are stable and specific, and the technology is inexpensive and allows for rapid on-site monitoring.
Abstract:
Seven pyrethroids (bifenthrin, fenpropathrin, λ-cyhalothrin, permethrin, α-cypermethrin, fenvalerate, and deltamethrin) were extracted from water using C18 solid-phase extraction disks, followed by gas chromatography with an electron capture detector (GC-ECD) analysis. The limits of detection in water samples ranged from 0.5 ng L−1 (fenpropathrin) to 110 ng L−1 (permethrin), applying the calibration graph. The effects of different numbers of (re)utilizations of the same disks (up to four times with several concentrations) on the recoveries of the pyrethroids were considered. The recoveries were all between 70 and 120% after four utilizations of the same disk. There was no difference between these recoveries at a confidence level of 95%.
Abstract:
Trihalomethanes (THMs) are widely referred to and studied as disinfection by-products (DBPs). The THMs that are most commonly detected are chloroform (TCM), bromodichloromethane (BDCM), chlorodibromomethane (CDBM), and bromoform (TBM). Several studies regarding the determination of THMs in swimming pool water and air samples have been published. This paper reviews the most recent work in this field, with a special focus on water and air sampling, sample preparation and analytical determination methods. An experimental study has been developed in order to optimize the headspace solid-phase microextraction (HS-SPME) conditions of TCM, BDCM, CDBM and TBM from water samples using a 2³ factorial design. An extraction temperature of 45 °C, for 25 min, and a desorption time of 5 min were found to be the best conditions. Analysis was performed by gas chromatography with an electron capture detector (GC-ECD). The method was successfully applied to a set of 27 swimming pool water samples collected in the Oporto area (Portugal). TCM was the only THM detected, with levels between 4.5 and 406.5 μg L−1. Four of the samples exceeded the guideline value for total THMs in swimming pool water (100 μg L−1) indicated by the Portuguese Health Authority.
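In a 2³ factorial design, every combination of the high and low levels of the three factors (here, extraction temperature, extraction time and desorption time) is tested. The short snippet below simply enumerates such a design matrix; the level values are illustrative placeholders around the reported optimum, not the paper's actual design:

```python
from itertools import product

# Hypothetical low/high levels for the three HS-SPME factors.
factors = {
    "extraction_temp_C": (35, 45),
    "extraction_time_min": (15, 25),
    "desorption_time_min": (3, 5),
}

# Full 2^3 factorial design: 8 runs, one per combination of factor levels.
names = list(factors)
for run, levels in enumerate(product(*factors.values()), start=1):
    print(f"run {run}: " + ", ".join(f"{n}={v}" for n, v in zip(names, levels)))
```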
Abstract:
This study aims to optimize the water quality monitoring of a polluted watercourse (Leça River, Portugal) through principal component analysis (PCA) and cluster analysis (CA). These statistical methodologies were applied to physicochemical, bacteriological and ecotoxicological data (with the marine bacterium Vibrio fischeri and the green alga Chlorella vulgaris) obtained from the analysis of water samples collected monthly at seven monitoring sites during five campaigns (February, May, June, August, and September 2006). The results of some variables were assigned to water quality classes according to national guidelines. Chemical and bacteriological quality data led to classifying the Leça River water quality as “bad” or “very bad”. PCA and CA identified monitoring sites with similar pollution patterns, distinguishing site 1 (located in the upstream stretch of the river) from all other sampling sites downstream. Ecotoxicity results corroborated this classification, revealing differences in space and time. The present study includes not only physical, chemical and bacteriological but also ecotoxicological parameters, which opens new perspectives in river water characterization. Moreover, the application of PCA and CA is very useful to optimize water quality monitoring networks, defining the minimum number of sites and their location. Thus, these tools can support appropriate management decisions.
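The workflow described, standardizing the site-by-variable matrix, projecting it with PCA and then grouping sites by cluster analysis, can be sketched as below. The data are random placeholders, and the choice of two components and two clusters is an assumption, not the study's configuration:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
# Placeholder matrix: 7 monitoring sites x 10 measured variables
# (e.g. averaged over the sampling campaigns).
X = rng.normal(size=(7, 10))

Z = StandardScaler().fit_transform(X)          # put variables on comparable scales
scores = PCA(n_components=2).fit_transform(Z)  # project sites onto two components
labels = AgglomerativeClustering(n_clusters=2).fit_predict(scores)

for site, label in enumerate(labels, start=1):
    print(f"site {site}: cluster {label}")
```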
Abstract:
Environmental nanoremediation of various contaminants has been reported in several recent studies. In this paper, the state of the art on the use of nanoparticles in soil and groundwater remediation processes is presented. There is a substantive body of evidence on the growing and successful application of nanoremediation in a diversity of soil and groundwater contamination contexts, particularly for heavy metals, other inorganic contaminants, organic contaminants and emerging contaminants, such as pharmaceutical and personal care products. This review confirms the effectiveness of nanoparticles in the remediation of contaminated media and the prevalent use of iron-based nanoparticles.
Abstract:
This paper addresses the problem of energy resources management using modern metaheuristic approaches, namely Particle Swarm Optimization (PSO), New Particle Swarm Optimization (NPSO) and Evolutionary Particle Swarm Optimization (EPSO). The problem addressed in this research paper is intended for aggregators operating in a smart grid context, dealing with Distributed Generation (DG) and gridable vehicles intelligently managed on a multi-period basis according to their users’ profiles and requirements. The aggregator can also purchase additional energy from external suppliers. The paper includes a case study considering a 30 kV distribution network with one substation, 180 buses and 90 load points. The distribution network in the case study considers an intense penetration of DG, including 116 units from several technologies, and one external supplier. A scenario of 6000 EVs for the given network is simulated over 24 periods, corresponding to one day. The results of the application of the PSO approaches to this case study are discussed in detail in the paper.
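All three variants share the canonical PSO core: each particle's velocity is pulled toward its own best position and the swarm's best, and the position is then updated. The minimal sketch below shows that core on a toy quadratic objective; it is a generic PSO, not the NPSO/EPSO variants or the paper's resource-management formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    # Toy stand-in for the real operation-cost function.
    return np.sum((x - 3.0) ** 2, axis=-1)

n_particles, dim, iters = 20, 5, 100
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights

x = rng.uniform(-10, 10, (n_particles, dim))   # positions
v = np.zeros_like(x)                           # velocities
pbest, pbest_val = x.copy(), objective(x)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = objective(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best cost:", pbest_val.min(), "at", gbest.round(2))
```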
Abstract:
The performance of metaheuristics is highly dependent on their parameters, which need to be tuned. Parameter tuning may allow larger flexibility and robustness, but it requires a careful initialization. The process of defining which parameter settings should be used is not obvious: the values depend mainly on the problem, the instance to be solved, the search time available for solving the problem, and the required solution quality. This paper presents a learning module proposal for the autonomous parameterization of metaheuristics, integrated in a Multi-Agent System for the resolution of dynamic scheduling problems. The proposed learning module is inspired by the Autonomic Computing Self-Optimization concept, which establishes that systems must continuously and proactively improve their performance. The learning is implemented with Case-based Reasoning, which uses previous similar data to solve new cases, under the assumption that similar cases have similar solutions. After a literature review on the topics used, both the AutoDynAgents system and the Self-Optimization module are described. Finally, a computational study is presented in which the proposed module is evaluated, the obtained results are compared with previous ones, conclusions are drawn, and future work is outlined. It is expected that this proposal can be a significant contribution to the self-parameterization of metaheuristics and to the resolution of scheduling problems in dynamic environments.
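Case-based Reasoning here boils down to retrieving the stored case whose problem description is most similar to the new scheduling instance and reusing its parameter setting. The snippet below is a hedged sketch of that retrieve-and-reuse step; the case features and parameter names are illustrative, not those of AutoDynAgents:

```python
import math

# Each stored case: a problem description and the parameters that worked for it.
case_base = [
    {"features": (50, 5, 0.8), "params": {"pop_size": 40, "mutation": 0.05}},
    {"features": (200, 10, 0.3), "params": {"pop_size": 100, "mutation": 0.15}},
    {"features": (120, 8, 0.5), "params": {"pop_size": 70, "mutation": 0.10}},
]

def similarity(a, b):
    # Inverse Euclidean distance (feature scaling omitted for brevity).
    return 1.0 / (1.0 + math.dist(a, b))

def retrieve_parameters(new_features):
    """Reuse the parameters of the most similar previously solved case."""
    best = max(case_base, key=lambda c: similarity(c["features"], new_features))
    return best["params"]

# New instance described by (number of jobs, machines, due-date tightness).
print(retrieve_parameters((130, 9, 0.45)))
```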