863 results for Short-term Exercise
Abstract:
This paper proposes an energy resources management methodology based on three distinct time horizons: day-ahead scheduling, hour-ahead scheduling, and real-time scheduling. Each scheduling process requires updating the generation and consumption operating conditions as well as the status of storage units and electric vehicles. Besides the new operating conditions, more accurate forecasts of wind generation and consumption are needed, obtained from short-term and very short-term forecasting methods. A case study considering a distribution network with intensive use of distributed generation and electric vehicles is presented.
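As an illustration of the three-stage re-scheduling idea summarized above, the following minimal sketch re-runs a toy dispatch as the forecast anticipation shrinks from day-ahead to five minutes ahead. The forecast model, the dispatch rule and all numbers are assumptions for illustration, not the paper's method.

```python
# Toy rolling re-scheduling: day-ahead, hour-ahead and 5-minute-ahead stages,
# each using a forecast whose error shrinks with the anticipation time.
import random

def forecast_wind(hour, anticipation_h):
    """Synthetic forecast: error grows with the anticipation time (assumed model)."""
    true_value = 50 + 20 * random.random()            # MW, synthetic "truth"
    error = random.gauss(0, 0.5 * anticipation_h)     # longer lead time -> larger error
    return max(true_value + error, 0.0)

def dispatch(load_mw, wind_mw):
    """Toy dispatch rule: use wind first, cover the remainder with conventional units."""
    return {"wind": round(min(wind_mw, load_mw), 1),
            "conventional": round(max(load_mw - wind_mw, 0.0), 1)}

load = 80.0                                            # MW, kept constant for brevity
day_ahead = [dispatch(load, forecast_wind(h, 24)) for h in range(24)]
for h in range(24):
    hour_ahead = dispatch(load, forecast_wind(h, 1))       # hour-ahead update
    real_time = dispatch(load, forecast_wind(h, 5 / 60))   # 5-minute-ahead update
    print(h, day_ahead[h], hour_ahead, real_time)
```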
Abstract:
Distributed Energy Resources (DER) scheduling in smart grids presents a new challenge to system operators. The increase of new resources, such as storage systems and demand response programs, results in additional computational effort for optimization problems. On the other hand, since natural resources, such as wind and sun, can only be precisely forecasted with small anticipation, short-term scheduling is especially relevant, requiring very good performance on large-dimension problems. Traditional techniques such as Mixed-Integer Non-Linear Programming (MINLP) do not cope well with large-scale problems. This type of problem can be appropriately addressed by metaheuristic approaches. This paper proposes a new methodology called Signaled Particle Swarm Optimization (SiPSO) to address the energy resources management problem in the scope of smart grids, with intensive use of DER. The proposed methodology's performance is illustrated by a case study with 99 distributed generators, 208 loads, and 27 storage units. The results are compared with those obtained with other methodologies, namely MINLP, Genetic Algorithm, original Particle Swarm Optimization (PSO), Evolutionary PSO, and New PSO. SiPSO performance is superior to that of the other tested PSO variants, demonstrating its adequacy for solving large-dimension problems which require a decision in a short period of time.
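As a hedged illustration of the metaheuristic search this abstract refers to, the sketch below runs a plain (original) PSO on a toy dispatch-cost function with a power-balance penalty. The signaling mechanism that distinguishes SiPSO is not described in the abstract and is therefore not reproduced; all costs, bounds and parameters are made up.

```python
# Plain particle swarm optimization over generator set-points (toy problem).
import random

DEMAND = 60.0                                     # MW to be balanced (assumed)

def cost(x):
    gen_cost = sum(0.1 * xi ** 2 + 2.0 * xi for xi in x)   # toy quadratic cost
    return gen_cost + 1e3 * abs(sum(x) - DEMAND)           # power-balance penalty

def pso(dim=10, particles=30, iters=200, lo=0.0, hi=10.0):
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
    V = [[0.0] * dim for _ in range(particles)]
    pbest = [x[:] for x in X]                     # best position seen by each particle
    gbest = min(pbest, key=cost)[:]               # best position seen by the swarm
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * r1 * (pbest[i][d] - X[i][d])
                           + 1.5 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            if cost(X[i]) < cost(pbest[i]):
                pbest[i] = X[i][:]
        gbest = min(pbest, key=cost)[:]
    return gbest, cost(gbest)

best, best_cost = pso()
print(round(best_cost, 2))
```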
Abstract:
Distribution systems are the first to experience the benefits of smart grids. The smart grid concept impacts the internal legislation and standards in grid-connected and isolated distribution systems. Demand side management, the main feature of smart grids, acquires a clear meaning in low voltage distribution systems. In these networks, various coordination procedures are required between domestic, commercial and industrial consumers, producers and the system operator. Obviously, a technical basis for bidirectional communication is a prerequisite for developing such coordination procedures. The main coordination is required when the operator tries to dispatch the producers according to their own preferences without neglecting its inherent responsibility. Maintenance decisions are first determined by generating companies; the operator then has to check, and possibly modify, them for final approval. In this paper, generation scheduling from the viewpoint of a distribution system operator (DSO) is formulated. The traditional task of the DSO is securing network reliability and quality. The effectiveness of the proposed method is assessed by applying it to 6-bus and 9-bus distribution systems.
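The coordination step described above (producers propose maintenance outages, the operator checks and possibly shifts them) can be sketched as follows. The greedy shifting rule, the reserve-margin criterion and all numbers are illustrative assumptions, not the paper's formulation.

```python
# Toy maintenance approval: shift each proposed outage week forward until the
# weekly reserve margin stays above a threshold (assumes a feasible week exists).
capacity = {"G1": 50, "G2": 40, "G3": 30}          # MW per unit
proposed = {"G1": 2, "G2": 2, "G3": 5}             # preferred outage week per unit
peak_load = [70, 60, 55, 75, 50, 58]               # MW, weekly peak demand
RESERVE = 10                                       # MW, minimum margin

def margin(week, outages):
    out = sum(capacity[g] for g, w in outages.items() if w == week)
    return sum(capacity.values()) - out - peak_load[week]

approved = {}
for unit, week in proposed.items():
    w = week
    while margin(w, {**approved, unit: w}) < RESERVE:
        w = (w + 1) % len(peak_load)               # shift to the next week
    approved[unit] = w
print(approved)                                    # {'G1': 2, 'G2': 4, 'G3': 5}
```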
Abstract:
The large increase of distributed energy resources, including distributed generation, storage systems and demand response, especially in distribution networks, makes the management of the available resources a more complex and crucial process. With wind-based generation gaining relevance in the generation mix, and given that wind forecasting accuracy drops rapidly as the forecast anticipation time increases, short-term and very short-term re-scheduling is required so that the final implemented solution achieves the lowest possible operation costs. This paper proposes a methodology for energy resource scheduling in smart grids, considering day-ahead, hour-ahead and five-minutes-ahead scheduling. The short-term scheduling, undertaken five minutes ahead, takes advantage of the high accuracy of very short-term wind forecasting, providing the user with more efficient scheduling solutions. The proposed method uses a Genetic Algorithm based approach for optimization that is able to cope with the hard execution time constraint of short-term scheduling. Realistic power system simulation, based on PSCAD, is used to validate the obtained solutions. The paper includes a case study with a 33-bus distribution network with high penetration of distributed energy resources implemented in PSCAD.
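To illustrate the kind of genetic-algorithm optimization referred to above, the sketch below evolves generator set-points for a single period against a toy cost with a power-balance penalty. The chromosome encoding, operators and parameters are assumptions; the paper's actual GA and the PSCAD validation are not reproduced.

```python
# Minimal GA for a one-period resource-scheduling toy problem.
import random

DEMAND, N_GEN, POP, GENS = 60.0, 8, 40, 150

def fitness(ind):
    cost = sum(0.1 * p ** 2 + 2.0 * p for p in ind)    # toy quadratic generation cost
    return cost + 1e3 * abs(sum(ind) - DEMAND)         # power-balance penalty

def crossover(a, b):
    cut = random.randrange(1, N_GEN)                   # one-point crossover
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.1):
    return [min(max(p + random.gauss(0, 1), 0), 15) if random.random() < rate else p
            for p in ind]

pop = [[random.uniform(0, 15) for _ in range(N_GEN)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    elite = pop[:POP // 2]                             # truncation selection
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - len(elite))]

best = min(pop, key=fitness)
print(round(fitness(best), 2), [round(p, 1) for p in best])
```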
Abstract:
Short-term risk management is highly dependent on previously established long-term contractual decisions, on the agent's risk aversion factor, and on short-term price forecast accuracy. To address this problem, this paper provides a different approach for short-term risk management in electricity markets. Based on long-term contractual decisions, and making use of a price range forecast method developed by the authors, the short-term risk management tool presented here aims to find the optimal spot market strategies that a producer should adopt for a specific day as a function of the producer's risk aversion factor, with the objective of maximizing profits while simultaneously hedging against market price volatility. Due to the complexity of the optimization problem, the authors make use of Particle Swarm Optimization (PSO) to find the optimal solution. Results from realistic data, namely from the OMEL electricity market, are presented and discussed in detail.
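A hedged sketch of the kind of risk-weighted decision described above: choose how much energy to offer on the spot market so that expected profit is balanced against exposure to price volatility. The price scenarios, the variance penalty used as the risk measure and the exhaustive search are illustrative assumptions, not the authors' PSO-based method.

```python
# Toy spot-market quantity decision under a risk-aversion factor.
from statistics import mean, pvariance

price_scenarios = [38.0, 42.0, 55.0, 61.0, 47.0]   # EUR/MWh, made-up spot forecasts
contract_price, contract_qty = 50.0, 80.0          # long-term contract already fixed
production_cost, capacity = 30.0, 120.0            # EUR/MWh, MW
risk_aversion = 0.005                              # agent's risk-aversion factor (assumed)

def objective(spot_qty):
    profits = [contract_qty * (contract_price - production_cost)
               + spot_qty * (p - production_cost) for p in price_scenarios]
    return mean(profits) - risk_aversion * pvariance(profits)

best_qty = max(range(int(capacity - contract_qty) + 1), key=objective)
print(best_qty, round(objective(best_qty), 1))
```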
Abstract:
The large increase of Distributed Generation (DG) in Power Systems (PS), and especially in distribution networks, makes the management of distributed generation resources an increasingly important issue. Beyond DG, other resources such as storage systems and demand response must be managed in order to obtain a more efficient and "green" operation of PS. More players that operate these kinds of resources, such as aggregators or Virtual Power Players (VPP), will appear. This paper proposes a new methodology to solve the distribution network short-term scheduling problem in the Smart Grid context. This methodology is based on a Genetic Algorithm (GA) approach for energy resource scheduling optimization and on PSCAD software to obtain realistic results for power system simulation. The paper includes a case study with 99 distributed generators, 208 loads and 27 storage units. The GA results for the determination of the economic dispatch, considering the generation forecast, storage management and load curtailment in each period (one hour), are compared with the ones obtained with a Mixed-Integer Non-Linear Programming (MINLP) approach.
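For orientation, a generic single-period objective of the kind such scheduling formulations typically minimize is sketched below; the symbols, cost terms and the single balance constraint are assumptions, not the paper's exact GA or MINLP model.

```latex
\min \;
  \sum_{i \in \mathrm{DG}} c^{\mathrm{DG}}_{i}\,P^{\mathrm{DG}}_{i}
+ \sum_{s \in \mathrm{ST}} c^{\mathrm{dis}}_{s}\,P^{\mathrm{dis}}_{s}
+ \sum_{l \in \mathrm{L}}  c^{\mathrm{cut}}_{l}\,P^{\mathrm{cut}}_{l}
\quad \text{s.t.} \quad
  \sum_{i} P^{\mathrm{DG}}_{i}
+ \sum_{s} \bigl(P^{\mathrm{dis}}_{s} - P^{\mathrm{ch}}_{s}\bigr)
= \sum_{l} \bigl(P^{\mathrm{load}}_{l} - P^{\mathrm{cut}}_{l}\bigr)
```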
Abstract:
In the proposed model, the independent system operator (ISO) provides the opportunity for maintenance outage rescheduling of generating units before each short-term (ST) time interval. Long-term (LT) scheduling for 1 or 2 years in advance is essential for the ISO and the generation companies (GENCOs) to decide their LT strategies; however, it cannot be followed exactly and requires slight adjustments. The Cournot-Nash equilibrium is used to characterize the decision-making procedure of an individual GENCO for ST intervals, considering effective coordination with LT plans. Random inputs, such as the parameters of the demand function of loads, hourly demand during the following ST time interval and the expected generation pattern of the rivals, are included as scenarios in the stochastic mixed-integer program defined to model the payoff-maximizing objective of a GENCO. Scenario reduction algorithms are used to deal with the computational burden. Two reliability test systems were chosen to illustrate the effectiveness of the proposed model for the ST decision-making process for future planned outages from the point of view of a GENCO.
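The equilibrium concept mentioned above can be illustrated with a minimal two-firm Cournot best-response iteration under a linear inverse demand curve. The demand and cost coefficients are made up, and the paper's stochastic mixed-integer program and scenario reduction are not reproduced.

```python
# Two-firm Cournot game solved by alternating best responses.
a, b = 100.0, 1.0            # inverse demand: price = a - b * (q1 + q2)
c1, c2 = 20.0, 30.0          # marginal costs of the two GENCOs

q1 = q2 = 0.0
for _ in range(100):         # iterate best responses until they settle
    q1 = max((a - c1 - b * q2) / (2 * b), 0.0)
    q2 = max((a - c2 - b * q1) / (2 * b), 0.0)

price = a - b * (q1 + q2)
print(round(q1, 2), round(q2, 2), round(price, 2))   # 30.0 20.0 50.0
```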
Abstract:
In the current context of serious climate change, where the increased frequency of some extreme events can raise the number of periods prone to high-intensity forest fires, the National Forest Authority often implements, in several Portuguese forest areas, a regular set of measures to control the amount of available fuel mass (PNDFCI, 2008). In the present work we present a preliminary analysis of the consequences of implementing prescribed fire measures to control fuel mass on soil recovery, in particular in terms of water retention capacity, organic matter content, pH and iron content. This work is included in a larger study (Meira-Castro, 2009(a); Meira-Castro, 2009(b)). According to the established praxis for data collection, embodied in multidimensional matrices of n columns (variables under analysis) by p lines (areas sampled at different depths), and considering the quantitative nature of the data in this study, we chose a methodological approach based on multivariate statistical analysis, in particular Principal Component Analysis (PCA) (Góis, 2004). The experiments were carried out on a soil cover over a natural site of andalusitic schist in Gramelas, Caminha, NW Portugal, which had remained untouched by prescribed burnings for four years and was subjected to prescribed fire in March 2008. The soil samples were collected from five different plots at six different time periods. The adopted methodological option allowed us to identify the most relevant relational structures within the n variables, within the p samples, and in both sets at the same time (Garcia-Pereira, 1990). Consequently, in addition to the traditional outputs produced by PCA, we analyzed the influence of both sampling depths and geomorphological environments on the behavior of all variables involved.
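As a hedged illustration of the PCA step described above, the sketch below standardizes a small samples-by-variables matrix and extracts component scores and explained variance. The data are synthetic; the study's actual variables (water retention, organic matter, pH, iron) and sampling design are not reproduced.

```python
# Minimal PCA on a standardized samples-by-variables matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))                   # 30 samples x 4 soil variables (synthetic)
Xs = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize each variable

cov = np.cov(Xs, rowvar=False)                 # covariance of standardized data (~correlation)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]              # components sorted by explained variance
explained = eigvals[order] / eigvals.sum()
scores = Xs @ eigvecs[:, order]                # sample coordinates on the principal components

print("explained variance ratio:", np.round(explained, 2))
print("first two PC scores of sample 0:", np.round(scores[0, :2], 2))
```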
Abstract:
Load forecasting has gradually become a major field of research in the electricity industry. Load forecasting is extremely important for the electric sector in a deregulated environment, as it provides useful support to power system management. Accurate power load forecasting models are required for the operation and planning of a utility company, and they have received increasing attention from researchers in this field. Many mathematical methods have been developed for load forecasting. This work aims to develop and implement a method for short-term load forecasting (STLF) based on Holt-Winters exponential smoothing and an artificial neural network (ANN). One of the main contributions of this paper is the application of the Holt-Winters exponential smoothing approach to the forecasting problem and, as an evaluation of past forecasting work, data mining techniques are also applied to short-term load forecasting. Both the ANN and Holt-Winters exponential smoothing approaches are compared and evaluated.
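To make the first of the two approaches concrete, the sketch below is a self-contained additive Holt-Winters (triple exponential smoothing) forecaster applied to a synthetic hourly load series with daily seasonality. The series, the initialization and the smoothing constants are assumptions; the paper's ANN counterpart is not shown.

```python
# Additive Holt-Winters short-term load forecast on a synthetic hourly series.
import math

m = 24                                               # seasonal period: hours per day
load = [100 + 20 * math.sin(2 * math.pi * (t % m) / m) + 0.05 * t
        for t in range(7 * m)]                       # one synthetic week of hourly load

alpha, beta, gamma = 0.3, 0.05, 0.2                  # smoothing constants (assumed)
level = sum(load[:m]) / m
trend = 0.0
season = [load[t] - level for t in range(m)]         # initial seasonal indices

for t in range(m, len(load)):
    prev_level = level
    s = season[t % m]
    level = alpha * (load[t] - s) + (1 - alpha) * (level + trend)
    trend = beta * (level - prev_level) + (1 - beta) * trend
    season[t % m] = gamma * (load[t] - level) + (1 - gamma) * s

T = len(load)
forecast = [level + h * trend + season[(T + h - 1) % m] for h in range(1, m + 1)]
print([round(f, 1) for f in forecast[:6]])           # next six hours
```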
Abstract:
Wind speed forecasting has become an important field of research to support the electricity industry, mainly due to the increasing use of distributed energy sources largely based on renewables. This type of electricity generation is highly dependent on the variability of weather conditions, particularly wind speed. Therefore, accurate wind power forecasting models are required for the operation and planning of wind plants and power systems. A Support Vector Machine (SVM) model for short-term wind speed forecasting is proposed, and its performance is evaluated and compared with several artificial neural network (ANN) based approaches. A case study based on a real database covering 3 years of data, used to predict wind speed at 5-minute intervals, is presented.
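As a hedged sketch of a support-vector-regression forecaster of the kind described above, the example below predicts wind speed from its previous observations using scikit-learn's SVR. The synthetic series, the lag structure and the hyper-parameters are assumptions, not the paper's configuration.

```python
# SVR wind-speed forecasting from lagged observations (synthetic data).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
speed = 8 + 2 * np.sin(np.arange(2000) / 50) + rng.normal(0, 0.5, 2000)  # m/s, synthetic

lags = 6                                               # previous 6 observations as features
X = np.column_stack([speed[i:len(speed) - lags + i] for i in range(lags)])
y = speed[lags:]

split = 1500                                           # simple train/test split in time
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[:split], y[:split])
pred = model.predict(X[split:])
print(f"test MAE: {np.mean(np.abs(pred - y[split:])):.2f} m/s")
```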
Abstract:
This work explores the use of fluorescent probes to evaluate the responses of the green alga Pseudokirchneriella subcapitata to the action of three nominal concentrations of Cd(II), Cr(VI), Cu(II) and Zn(II) for a short time (6 h). The toxic effect of the metals on algal cells was monitored using the fluorochromes SYTOX Green (SG, membrane integrity), fluorescein diacetate (FDA, esterase activity) and rhodamine 123 (Rh123, mitochondrial membrane potential). The impact of metals on chlorophyll a (Chl a) autofluorescence was also evaluated. Esterase activity was the most sensitive parameter. At the concentrations studied, all metals induced the loss of esterase activity. SG could be used to effectively detect the loss of membrane integrity in algal cells exposed to 0.32 or 1.3 μmol L−1 Cu(II). Rh123 revealed a decrease in the mitochondrial membrane potential of algal cells exposed to 0.32 and 1.3 μmol L−1 Cu(II), indicating that mitochondrial activity was compromised. Chl a autofluorescence was also affected by the presence of Cr(VI) and Cu(II), suggesting perturbation of photosynthesis. In conclusion, the fluorescence-based approach was useful for detecting the disturbance of specific cellular characteristics. Fluorescent probes are a useful diagnostic tool for the assessment of the impact of toxicants on specific targets of P. subcapitata algal cells.
Abstract:
Evidence in the literature suggests a negative relationship between the volume of medical procedures and mortality rates in the health care sector. In general, high-volume hospitals appear to achieve lower mortality rates, although considerable variation exists. However, most studies focus on US hospitals, which face different incentives than hospitals in a National Health Service (NHS). In order to add to the literature, this study aims to understand what happens in an NHS. Results reveal a statistically significant correlation between volume of procedures and better outcomes for the following medical procedures: cerebral infarction, respiratory infections, circulatory disorders with AMI, bowel procedures, cirrhosis, and hip and femur procedures. The effect is explained by the practice-makes-perfect hypothesis through static effects of scale, with little evidence of learning-by-doing. The centralization of those medical procedures is recommended, given that this policy would save a considerable number of lives (a 12% reduction in deaths from cerebral infarction).
Abstract:
BACKGROUND: Highway maintenance workers are constantly and simultaneously exposed to traffic-related particle and noise emissions, and both have been linked to increased cardiovascular morbidity and mortality in population-based epidemiology studies. OBJECTIVES: We aimed to investigate short-term health effects related to particle and noise exposure. METHODS: We monitored 18 maintenance workers during as many as five 24-hour periods, for a total of 50 observation days. We measured their exposure to fine particulate matter (PM2.5), ultrafine particles, and noise, as well as the cardiopulmonary health endpoints blood pressure, pro-inflammatory and pro-thrombotic markers in the blood, lung function, and fractional exhaled nitric oxide (FeNO), measured approximately 15 hours post-work. Heart rate variability was assessed during a sleep period approximately 10 hours post-work. RESULTS: PM2.5 exposure was significantly associated with C-reactive protein and serum amyloid A, and negatively associated with tumor necrosis factor α. None of the particle metrics were significantly associated with von Willebrand factor or tissue factor expression. PM2.5 and work noise were associated with markers of increased heart rate variability, and with increased HF and LF power. Systolic and diastolic blood pressure on the following morning were significantly associated with noise exposure after work, and non-significantly associated with PM2.5. We observed no significant associations between any of the exposures and lung function or FeNO. CONCLUSIONS: Our findings suggest that exposure to particles and noise during highway maintenance work might pose a cardiovascular health risk. Actions to reduce these exposures could lead to better health for this population of workers.
Abstract:
INTRODUCTION: Although long-term video-EEG monitoring (LVEM) is routinely used to investigate paroxysmal events, short-term video-EEG monitoring (SVEM) lasting <24 h is increasingly recognized as a cost-effective tool. Since relatively few studies have addressed the yield of SVEM among different diagnostic groups, we undertook the present study to investigate this aspect. METHODS: We retrospectively analyzed 226 consecutive SVEM recordings over 6 years. All patients were referred because routine EEGs were inconclusive. Patients were classified into 3 suspected diagnostic groups: (1) epileptic seizures, (2) psychogenic nonepileptic seizures (PNESs), and (3) other or undetermined diagnoses. We assessed recording lengths, interictal epileptiform discharges, epileptic seizures, PNESs, and the definitive diagnoses obtained after SVEM. RESULTS: The mean age was 34 (±18.7) years, and the median recording length was 18.6 h. Among the 226 patients, 127 were referred for suspected epilepsy: 73 had a diagnosis of epilepsy, none had a diagnosis of PNESs, and 54 had other or undetermined diagnoses post-SVEM. Of the 24 patients with pre-SVEM suspected PNESs, 1 had epilepsy, 12 had PNESs, and 11 had other or undetermined diagnoses. Of the 75 patients with other diagnoses pre-SVEM, 17 had epilepsy, 11 had PNESs, and 47 had other or undetermined diagnoses. After SVEM, 15 patients had definite diagnoses other than epilepsy or PNESs, while in 96 patients the diagnosis remained unclear. Overall, a definitive diagnosis could be reached in 129/226 (57%) patients. CONCLUSIONS: This study demonstrates that in nearly 3 out of 5 patients without a definitive diagnosis after routine EEG, SVEM allowed a diagnosis to be reached. This procedure should be encouraged in this setting, given its time-effectiveness compared with LVEM.
Abstract:
Traditional psychometric theory and practice classify people according to broad ability dimensions but do not examine how these mental processes occur. Hunt and Lansman (1975) proposed a 'distributed memory' model of cognitive processes, with emphasis on how to describe individual differences, based on the assumption that each individual possesses the same components; it is in the quality of these components that individual differences arise. Carroll (1974) expands Hunt's model to include a production system (after Newell and Simon, 1973) and a response system. He developed a framework of factor analytic (FA) factors for the purpose of describing how individual differences may arise from them. This scheme is to be used in the analysis of psychometric tests. Recent advances in the field of information processing are examined and include: 1) Hunt's development of differences between subjects designated as high or low verbal, 2) Miller's pursuit of the magic number seven, plus or minus two, 3) Ferguson's examination of transfer and abilities, and 4) Brown's discoveries concerning strategy teaching and retardates. In order to examine possible sources of individual differences arising from cognitive tasks, traditional psychometric tests were searched for a suitable perceptual task which could be varied slightly and administered to gauge learning effects produced by controlling independent variables. It also had to be suitable for analysis using Carroll's framework. The Coding Task (a symbol substitution test) found in the Performance Scale of the WISC was chosen. Two experiments were devised to test the following hypotheses: 1) High verbals should be able to complete significantly more items on the Symbol Substitution Task than low verbals (Hunt and Lansman, 1975). 2) Having previous practice on a task, where the strategies involved in the task may be identified, increases the amount of output on a similar task (Carroll, 1974). 3) There should be a substantial decrease in the amount of output as the load on STM is increased (Miller, 1956). 4) Repeated measures should produce an increase in output over trials and, where individual differences in previously acquired abilities are involved, these should differentiate individuals over trials (Ferguson, 1956). 5) Teaching slow learners a rehearsal strategy would improve their learning such that it would resemble that of normals on the same task (Brown, 1974). In the first experiment, 60 subjects were divided into high and low verbal groups, and further divided randomly into a practice group and a nonpractice group. Five subjects in each group were assigned randomly to work on a five-, seven- or nine-digit code throughout the experiment. The practice group was given three trials of two minutes each on the practice code (designed to eliminate transfer effects due to symbol similarity) and then three trials of two minutes each on the actual SST task. The nonpractice group was given three trials of two minutes each on the same actual SST task. Results were analyzed using a four-way analysis of variance. In the second experiment, 18 slow learners were divided randomly into two groups, one group receiving planned strategy practice, the other receiving random practice. Both groups worked on the actual code to be used later in the actual task. Within each group, subjects were randomly assigned to work on a five-, seven- or nine-digit code throughout. Both practice and actual tests consisted of three trials of two minutes each.
Results were analyzed using a three-way analysis of variance. It was found in the first experiment that 1) high or low verbal ability by itself did not produce significantly different results; however, in interaction with the other independent variables, a difference in performance was noted. 2) The previous practice variable was significant over all segments of the experiment; those who received previous practice scored significantly higher than those without it. 3) Increasing the size of the load on STM severely restricts performance. 4) The effect of repeated trials proved to be beneficial; generally, gains were made on each successive trial within each group. 5) In the second experiment, slow learners who were allowed to practice randomly performed better on the actual task than subjects who were taught the code by means of a planned strategy. Upon analysis using the Carroll scheme, individual differences were noted in the ability to develop strategies of storing, searching and retrieving items from STM, and in adopting the necessary rehearsals for retention in STM. While these strategies may benefit some, it was found that for others they may be harmful. Temporal aspects and perceptual speed were also found to be sources of variance within individuals. Generally, it was found that the largest single factor influencing learning on this task was the repeated measures. What enables gains to be made varies with individuals. Environmental factors, specific abilities, strategy development, previous learning, the amount of load on STM, and perceptual and temporal parameters all influence learning, and these have serious implications for educational programs.