930 results for Power Flow Control, Radial Distribution System, Distributed Generator (DG)


Relevance:

100.00%

Publisher:

Abstract:

This paper presents Reinforcement Learning (RL) approaches to the Economic Dispatch problem. The Economic Dispatch problem is first formulated as a multi-stage decision-making problem, and two variants of RL algorithms are then presented. A third algorithm, which takes transmission losses into consideration, is also explained. The efficiency and flexibility of the proposed algorithms are demonstrated on different representative systems: a three-generator system with a given generation cost table, the IEEE 30-bus system with quadratic cost functions, a 10-generator system with piecewise quadratic cost functions, and a 20-generator system considering transmission losses. A comparison of the computation times of the different algorithms is also carried out.
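The multi-stage formulation can be sketched with a tabular RL toy: the state is (generator index, remaining demand), an action is a discretized output level, and the cost-to-go is learned with a Q-learning-style update. The generator coefficients, discretization and unmet-demand penalty below are illustrative assumptions, not the paper's test systems:

```python
import random

random.seed(0)
COST = [(0.020, 2.00), (0.0175, 1.75), (0.0625, 1.00)]  # hypothetical a*P^2 + b*P coefficients
LEVELS = list(range(0, 101, 10))                        # discretized output levels (MW)
DEMAND = 150                                            # total demand to be met (MW)

def stage_cost(g, p):
    a, b = COST[g]
    return a * p * p + b * p

def q_table_entry(Q, g, remaining):
    feasible = [p for p in LEVELS if p <= remaining] or [0]
    return Q.setdefault((g, remaining), {p: 0.0 for p in feasible})

def train(episodes=30000, alpha=0.2, eps=0.2):
    Q = {}
    for _ in range(episodes):
        remaining = DEMAND
        for g in range(len(COST)):
            qs = q_table_entry(Q, g, remaining)
            acts = list(qs)
            p = random.choice(acts) if random.random() < eps else min(acts, key=qs.get)
            c = stage_cost(g, p)
            if g == len(COST) - 1:
                target = c + 100.0 * (remaining - p)    # penalise unmet demand at the last stage
            else:
                nxt = q_table_entry(Q, g + 1, remaining - p)
                target = c + min(nxt.values())          # learned cost-to-go of the next stage
            qs[p] += alpha * (target - qs[p])
            remaining -= p
    return Q

def dispatch(Q):
    schedule, remaining = [], DEMAND
    for g in range(len(COST)):
        qs = q_table_entry(Q, g, remaining)
        p = min(qs, key=qs.get)                         # greedy w.r.t. learned cost-to-go
        schedule.append(p)
        remaining -= p
    return schedule

schedule = dispatch(train())
```

Because the penalty per unserved MW dwarfs the largest incremental generation cost, the greedy schedule learned this way meets the demand exactly on the discretized grid.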

Relevance:

100.00%

Publisher:

Abstract:

The study of variable stars is an important topic in modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. This huge amount of data requires many automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series data and hence belongs to the interdisciplinary field of Astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various processes. In some cases the variation is due to internal thermonuclear processes; such stars are known as intrinsic variables. In other cases it is due to external processes, such as eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospheric stars. Pulsating variables can in turn be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena. Most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star.
One way to identify the type of a variable star and classify it is for an expert to visually inspect the phased light curve. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages: observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, along with some other derived parameters. Of these, period is the most important, since wrong periods lead to sparse light curves and misleading information. Time series analysis is the application of mathematical and statistical tests to data in order to quantify the variation, understand the nature of the time-varying phenomenon, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of large gaps. For ground-based observations this is due to the day/night cycle and varying weather conditions, while observations from space may suffer from the impact of cosmic ray particles. Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation.
The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. Many period search algorithms exist for astronomical time series analysis; they can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model, such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and the Significant Spectrum (SigSpec) method by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong period detection can have several causes, such as power leakage to other frequencies, which is due to the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data remains a difficult problem for huge databases subjected to automation. As Matthew Templeton (AAVSO) states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state that "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".
It would be beneficial for the variable star astronomical community if basic parameters such as period, amplitude and phase could be obtained more accurately when huge time series databases are subjected to automation. In the present thesis, the theories behind four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases such as the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
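The period search and phase folding described above can be illustrated with a small sketch using `scipy.signal.lombscargle` on synthetic, unevenly sampled data; the period, amplitude, noise level and sampling below are invented for illustration, not survey data:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(42)
true_period = 2.5                                  # days (synthetic)
t = np.sort(rng.uniform(0.0, 100.0, 300))          # unevenly spaced observation times
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / true_period) + rng.normal(0.0, 0.02, t.size)

# Scan a grid of trial periods; lombscargle expects angular frequencies.
periods = np.linspace(0.5, 10.0, 5000)
power = lombscargle(t, mag - mag.mean(), 2 * np.pi / periods)
best_period = periods[np.argmax(power)]

# Fold the light curve on the detected period to obtain the phased light curve.
phase = (t / best_period) % 1.0
```

Plotting `mag` against `phase` would then reveal the characteristic phased light curve shape, provided the detected period is correct.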

Relevance:

100.00%

Publisher:

Abstract:

In this work, various computer models, calculation procedures and methods are developed to support the integration of large amounts of wind power into the electricity supply system. The computational model for simulating the simultaneously fed-in wind energy generates aggregate time series for arbitrarily composed groups of wind turbines, based on measured wind and power data from the recent past. This model provides important base data for analysing wind power feed-in, including for future scenarios. For studying the effects of wind power feed-in from large-area turbine clusters in the gigawatt range, various statistical analyses and illustrative representations are developed. The model developed in this work for calculating the currently fed-in wind power from online-measured power data of representative wind farms provides valuable information for the power and frequency control performed by grid operators. The associated procedures for determining the representative sites and for verifying their representativeness form the basis for an accurate mapping of wind power feed-in over larger supply areas, based on only a few power measurements at wind farms. A further valuable tool for the optimal integration of wind energy into the electricity supply is provided by the forecast models, which determine the wind power feed-in to be expected in the short to medium term. In this work, building on previous research, two models based on artificial neural networks are presented that provide the expected time profile of wind power for grid regions and control zones, using measured power data or forecast meteorological parameters.
The software integration of the model for calculating the currently fed-in wind power and of the models for short-term and day-ahead forecasting offers an attractive complete solution for incorporating wind power into the control rooms of grid operators. The interfaces developed and the modular structure of the program allow simple and fast implementation in arbitrary system environments. Based on the performance of the online and forecast models, operating strategies are discussed for wind farms aggregated into clusters in the gigawatt range, intended to enable an integration of the planned offshore wind farms that is optimal from ecological and economic points of view as well as with respect to security of supply.

Relevance:

100.00%

Publisher:

Abstract:

This thesis is divided into two parts. The first part presents and studies telegraph processes, Poisson processes with a telegraph compensator, and telegraph processes with jumps. The study presented in this first part includes the computation of the distribution of each process, their means and variances, and their moment generating functions, among other properties. Using these properties, the second part studies option pricing models based on jump-telegraph processes. This part describes how to compute the risk-neutral measures, establishes the no-arbitrage condition for this type of model, and finally computes the prices of European call and put options.
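As a minimal illustration of the first part, the symmetric telegraph process (velocity flipping between ±c at the events of a rate-λ Poisson process) can be simulated directly; its known mean for an initial +c velocity, E[X_T] = (c/2λ)(1 − e^(−2λT)), gives a quick sanity check. The parameters are arbitrary, and this sketch covers only the jump-free telegraph process, not the jump-telegraph pricing model:

```python
import math
import random

def telegraph_endpoint(c, lam, T, rng):
    """One sample of X_T for a telegraph process started at x=0 with velocity +c;
    the velocity flips sign at each event of a rate-lam Poisson process."""
    t, x, v = 0.0, 0.0, c
    while True:
        tau = rng.expovariate(lam)           # exponential holding time between flips
        if t + tau >= T:
            return x + v * (T - t)           # drift at current velocity until T
        x += v * tau
        t += tau
        v = -v                               # velocity reversal

rng = random.Random(1)
c, lam, T, n = 1.0, 2.0, 10.0, 20000
samples = [telegraph_endpoint(c, lam, T, rng) for _ in range(n)]
mc_mean = sum(samples) / n
exact_mean = (c / (2 * lam)) * (1 - math.exp(-2 * lam * T))   # Goldstein-Kac mean
```

The Monte Carlo mean should land close to the closed-form value, and every path is confined to the light cone |X_T| ≤ cT, reflecting the finite propagation speed that distinguishes the telegraph process from Brownian motion.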

Relevance:

100.00%

Publisher:

Abstract:

This research aims to explain the geopolitical control of natural resources in Colombia in the period 2002-2011. It analyses and explains how, on the basis of the territorial control and the extraction regime developed by Drummond Company Inc. in the Department of Cesar, the company's geopolitical power index can be established. The investigation proceeds from the postulates of Michael Klare, concerning the development of an international race for the world's energy resources, and those of Kepa Sodupe, concerning the possibility of reconfiguring the methods for calculating power. The results make it possible to identify, in a documented way, the geopolitical power index of a foreign multinational in Colombian territory.

Relevance:

100.00%

Publisher:

Abstract:

Extending previous studies, a full-circle investigation of the ring current has been made using Cluster 4-spacecraft observations near perigee, at times when the Cluster array had relatively small separations and nearly regular tetrahedral configurations, and when the Dst index was greater than −30 nT (non-storm conditions). These observations result in direct estimations of the near-equatorial current density at all magnetic local times (MLT) for the first time, with sufficient accuracy for the following observations. The results confirm that the ring current flows westward and show that the in situ average measured current density (sampled in the radial range accessed by Cluster, 4–4.5 RE) is asymmetric in MLT, ranging from 9 to 27 nA m−2. The direction of the current is shown to be very well ordered over the whole range of MLT. Both of these results are in line with previous studies of partial ring extent. The magnitude of the current density, however, reveals a distinct asymmetry: growing from 10 to 27 nA m−2 as azimuth decreases from about 12:00 MLT to 03:00 MLT, and falling less steadily from 20 to 10 nA m−2 as azimuth decreases from 24:00 to 12:00 MLT. This result has not been reported before, and we suggest it could reflect a number of effects. Firstly, we argue it is consistent with the operation of region-2 field-aligned currents (FACs), which are expected to flow upward into the ring current around 09:00 MLT and downward out of the ring current around 14:00 MLT. Secondly, we note that it is also consistent with a possible asymmetry in the radial distribution profile of the current density (resulting in a higher peak at 4–4.5 RE). We note that part of the enhanced current could reflect an increase in mean AE activity during the periods in which Cluster samples those MLT.
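The four-spacecraft current estimate rests on assuming the magnetic field varies linearly across the tetrahedron, so that its gradient, and hence ∇×B = μ0 J, can be reconstructed from four point measurements. A minimal sketch of that idea (illustrative geometry and units only, not the mission's curlometer code):

```python
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability (SI)

def current_density(positions, B):
    """Estimate J = curl(B)/mu0 from four point measurements of B, assuming
    B varies linearly across the tetrahedron spanned by the positions.

    positions, B: arrays of shape (4, 3) in SI units (m, T)."""
    dr = positions[1:] - positions[0]        # 3x3 matrix of separations
    dB = B[1:] - B[0]                        # 3x3 matrix of field differences
    G = np.linalg.solve(dr, dB)              # gradient matrix: G[i, j] = dB_j/dx_i
    curl = np.array([G[1, 2] - G[2, 1],      # components of curl(B)
                     G[2, 0] - G[0, 2],
                     G[0, 1] - G[1, 0]])
    return curl / MU0
```

For an exactly linear field the estimate is exact; for a real field the finite spacecraft separation introduces truncation errors, which is why the small, nearly regular tetrahedral configurations mentioned above matter.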

Relevance:

100.00%

Publisher:

Abstract:

An investigation is presented of a quasi-stationary convective system (QSCS) which occurred over the UK Southwest Peninsula on 21 July 2010. This system was remarkably similar in its location and structure to one which caused devastating flash flooding in the coastal village of Boscastle, Cornwall on 16 August 2004. However, in the 2010 case rainfall accumulations were around four times smaller and no flooding was recorded. The more extreme nature of the Boscastle case is shown to be related to three factors: (1) higher rain rates, associated with a warmer and moister tropospheric column and deeper convective clouds; (2) a more stationary system, due to slower evolution of the large-scale flow; and (3) distribution of the heaviest precipitation over fewer river catchments. Overall, however, the synoptic setting of the two events was broadly similar, suggesting that such conditions favour the development of QSCSs over the Southwest Peninsula. A numerical simulation of the July 2010 event was performed using a 1.5-km grid length configuration of the Met Office Unified Model. This reveals that convection was repeatedly initiated through lifting of low-level air parcels along a quasi-stationary coastal convergence line. Sensitivity tests are used to show that this convergence line was a sea breeze front which temporarily stalled along the coastline due to the retarding influence of an offshore-directed background wind component. Several deficiencies are noted in the 1.5-km model’s representation of the storm system, including delayed convective initiation; however, significant improvements are observed when the grid length is reduced to 500 m. These result in part from an improved representation of the convergence line, which enhances the associated low-level ascent allowing air parcels to more readily reach their level of free convection. The implications of this finding for forecasting convective precipitation are discussed.

Relevance:

100.00%

Publisher:

Abstract:

The quality control, validation and verification of the European Flood Alert System (EFAS) are described. EFAS is designed as a flood early warning system at the pan-European scale, to complement national systems and provide flood warnings more than 2 days before a flood. On average, 20–30 alerts per year are sent out to the EFAS partner network, which consists of 24 national hydrological authorities responsible for transnational river basins. Quality control of the system includes the evaluation of hits, misses and false alarms, showing that more than 50% of EFAS alerts are hits. Furthermore, the skill of both the meteorological and the hydrological forecasts is evaluated, and results are included here for a 10-year period. Finally, end-user needs and feedback are systematically analysed, and suggested improvements, such as real-time river discharge updating, are currently being implemented.
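Hits, misses and false alarms are usually condensed into standard contingency-table scores; a short sketch of the common ones (the counts in the usage comment are invented, not EFAS statistics):

```python
def verification_scores(hits, misses, false_alarms):
    """Standard categorical verification scores from a 2x2 contingency table."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

# e.g. verification_scores(25, 10, 15) for a hypothetical alert record
scores = verification_scores(25, 10, 15)
```

The critical success index is often the most informative single number for rare events such as floods, since it ignores the (typically overwhelming) count of correct negatives.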

Relevance:

100.00%

Publisher:

Abstract:

In Sweden, there are about 0.5 million single-family houses that are heated by electricity alone, and rising electricity costs are forcing conversion to other heating sources such as heat pumps and wood pellet heating systems. Pellet heating systems for single-family houses are currently a strongly growing market. A future shortage of wood fuel is possible even in Sweden, and combining wood pellet heating with solar heating will help to conserve bio-fuel resources. The objectives of this thesis are to investigate how electrically heated single-family houses can be converted to pellet and solar heating systems, and how the annual efficiency and solar gains can be increased in such systems. The possible reduction of CO emissions by combining pellet heating with solar heating has also been investigated. Systems with pellet stoves (both with and without a water jacket), pellet boilers and solar heating have been simulated. Different system concepts have been compared in order to identify the most promising solutions. Modifications in system design and control strategies have been carried out in order to increase the system efficiency and the solar gains. Possibilities for increasing the solar gains have been limited to the investigation of DHW-units for hot water production and the use of hot water for heating dishwashers and washing machines via a heat exchanger instead of electricity (heat-fed appliances). Computer models of pellet stoves, boilers, DHW-units and heat-fed appliances have been developed, and the parameters for the models have been identified from measurements on real components. The agreement between the models and the measurements has been checked. The systems with wood pellet stoves have been simulated in three different multi-zone buildings, modelled in detail with heat distribution through door openings between the zones. For the other simulations, either a single-zone house model or a load file has been used.
Simulations were carried out for Stockholm, Sweden, and, for the simulations with heat-fed machines, also for Miami, USA. The foremost result of this thesis is an increased understanding of the dynamic operation of combined pellet and solar heating systems for single-family houses. The results show that electricity savings and annual system efficiency are strongly affected by the system design and the control strategy. Large reductions in pellet consumption are possible by combining pellet boilers with solar heating (a reduction larger than the solar gains, if the system is properly designed). In addition, large reductions in carbon monoxide emissions are possible. To achieve these reductions, the hot water production and the connection of the radiator circuit must be moved to a well-insulated, solar-heated buffer store, so that the boiler can be turned off during the periods when the solar collectors cover the heating demand. The amount of electricity replaced by systems with pellet stoves depends strongly on the house plan, the system design, whether internal doors are open or closed, and the comfort requirements. Proper system design and control strategies are crucial to obtain high electricity savings and high comfort with pellet stove systems. The investigated technologies for increasing the solar gains (DHW-units and heat-fed appliances) do increase the solar gains significantly, but for the heat-fed appliances market introduction is difficult due to the limited financial savings and the need for a new heat distribution system. The applications closest to market introduction could be communal laundries and use in sunny climates, where the dominant part of the heat can be covered by solar heating. The DHW-unit is economical, but competes with the internal finned-tube heat exchanger, which is the totally dominating technology for hot water preparation in solar combisystems for single-family houses.
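The key control idea, keeping the boiler off while the solar-charged buffer store can carry the load, can be sketched as a toy hourly energy balance; the store size, starting charge and profiles below are invented and far simpler than the thesis simulations:

```python
def pellet_energy_needed(demand, solar, store_capacity=10.0, store_start=5.0):
    """Toy hourly energy balance (kWh): the buffer store is charged by solar;
    the pellet boiler only supplies what the store cannot cover."""
    store, pellet = store_start, 0.0
    for d, s in zip(demand, solar):
        store = min(store + s, store_capacity)   # solar charge, capped by store size
        if store >= d:
            store -= d                           # store covers the whole load
        else:
            pellet += d - store                  # boiler covers the shortfall
            store = 0.0
    return pellet

# With no solar, the boiler supplies everything beyond the initial store content;
# with solar matching demand hour by hour, it stays off entirely.
```

Even this toy shows why the pellet reduction can exceed the solar gains: every hour the store carries the load is also an hour of avoided boiler standby and cycling losses in a real system.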

Relevance:

100.00%

Publisher:

Abstract:

This thesis is about new digital moving image recording technologies and how they augment the distribution of creativity and the flexibility of moving image production systems, but also impose constraints on how images flow through the production system. The central concept developed in this thesis is 'creative space', which links quality and efficiency in moving image production to time for creative work, the capacity of digital tools, user skills and the constitution of digital moving image material. The empirical evidence of this thesis is primarily based on semi-structured interviews conducted with Swedish film and TV production representatives. The thesis highlights the importance of pre-production technical planning and proposes a design management support tool (MI-FLOW) as a way to leverage the functional workflows that are a prerequisite for efficient and cost-effective moving image production.

Relevance:

100.00%

Publisher:

Abstract:

The best corporate governance practices published in the primers of the Brazilian Securities and Exchange Commission and the Brazilian Corporate Governance Institute promote board independence as much as possible, as a way to increase the effectiveness of the governance mechanism (Sanzovo, 2010). This paper therefore aims at understanding whether what the managerial literature portrays as self-evident (stricter governance, better performance) can be observed in actual evidence. The question answered is: do companies with a stricter control and monitoring system perform better than others? The method applied consists of comparing 116 companies with respect to the level of independence between the top management team and the board of directors, measured by four parameters (the percentage of independent outsiders on the board, the separation of CEO and chairman, the adoption of contingent compensation, and the percentage of institutional investors in the ownership structure), and with respect to their financial return, measured as return on assets (ROA) from the latest quarterly earnings release of 2012. From the 534 companies listed on the Sao Paulo Stock Exchange (Bovespa), 116 were selected due to their level of corporate governance: the title "Novo Mercado" designates the superior governance level of companies listed on Bovespa, which have to follow specific criteria to assure shareholder protection (BM&F, 2011). Regression analyses were conducted in order to reveal the level of correlation between the selected variables. The results were the following: the regression of ROA on the four parameters gave a multiple R of 10.26%; a second regression, measuring the correlation between the independence level of the top management team vis-à-vis the board of directors (namely, relative CEO power) and ROA, led to a multiple R of 5.45%.
Understanding that the scale is a simplification of reality, the second part of the analysis transforms all four parameters into dummy variables, eliminating what could be seen as an arbitrary scale. The final result of this analysis is a multiple R of 28.44%, which implies that the combination of the variables is still not enough to capture the complex reality of organizations. Nonetheless, an important finding can be taken from this paper: two variables (the percentage of outside directors and the percentage of institutional investor ownership) are significant in the regression, with p-values lower than 10% and negative coefficients. In other words, contrary to what the literature often portrays as self-evident (stricter governance leads to higher performance), this paper provides evidence suggesting that strengthening the formal governance structure through outside directors on the board and ownership by institutional investors might actually lead to worse performance. The section on limitations and suggestions for future research presents some reasons why, although supported by a strong theoretical background, this paper faced some challenging methodological assumptions, precluding categorical statements about the relation between the level of governance, measured by the four selected parameters, and financial return in terms of return on assets.
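The dummy-variable regression can be reproduced in outline with ordinary least squares; the data below are simulated (the actual Bovespa sample is not reproduced in this abstract), so only the mechanics, not the reported 28.44% multiple R, carry over:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 116                                   # sample size used in the paper
# Hypothetical 0/1 governance dummies: outside-director majority, CEO/chair
# separation, contingent compensation, high institutional ownership.
X = rng.integers(0, 2, size=(n, 4)).astype(float)
# Simulated ROA with negative effects for the first and fourth dummies
# (an assumption mirroring the paper's sign finding, not its data).
roa = rng.normal(0.05, 0.03, n) - 0.02 * X[:, 0] - 0.02 * X[:, 3]

A = np.column_stack([np.ones(n), X])      # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, roa, rcond=None)
resid = roa - A @ coef
r2 = 1.0 - (resid @ resid) / ((roa - roa.mean()) @ (roa - roa.mean()))
multiple_r = np.sqrt(r2)                  # the "multiple R" reported in the paper
```

With dummies, each coefficient reads directly as the average ROA difference associated with that governance attribute, which is what makes the negative signs interpretable.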

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to develop a laboratory method for time response evaluation of electronically controlled spray equipment using Programmable Logic Controllers (PLCs). For that purpose, a PLC-controlled digital drive inverter was set up to drive an asynchronous electric motor linked to a centrifugal pump on an experimental sprayer equipped with electronic flow control. The PLC was operated via RS232 serial communication from a PC. A user program was written to control the motor by adjusting the following system variables, all related to the motor speed: time stopped, ramp-up and ramp-down times, time running at a given constant speed, and ramp-down time to stop the motor. This set-up was used in conjunction with a data acquisition system to perform laboratory tests on an electronically controlled sprayer. The time response for pressure stabilization was measured while changing the pump speed by ±20%. The results showed that, for a 0.2 s ramp time increasing the motor speed, for example, an AgLogix Flow Control system (Midwest Technologies Inc.) took 22 s on average to readjust the pressure. When decreasing the motor speed, this time response was down to 8 s. The general results also showed that this kind of methodology could ease the definition of standards for tests on electronically controlled application equipment.
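The reported stabilisation times can be quantified with a simple settling-time criterion: the first instant after which the measured pressure stays within a tolerance band around its final value. A sketch of that criterion (the first-order response used to exercise it below is synthetic, not the sprayer data):

```python
def settling_time(t, y, tol=0.02):
    """First time after which y stays within a fractional tolerance band
    (tol * final value) around its final value; None if it never settles."""
    final = y[-1]
    band = abs(final) * tol
    for i in range(len(y)):
        if all(abs(v - final) <= band for v in y[i:]):
            return t[i]
    return None
```

Applied to logged pressure samples after each commanded speed step, this yields one settling time per step, which can then be averaged the way the 22 s and 8 s figures above were.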

Relevance:

100.00%

Publisher:

Abstract:

The optimized allocation of protective devices at strategic points of the circuit improves the quality of the energy supply and the system reliability indices. This paper presents a nonlinear integer programming (NLIP) model with binary variables to deal with the problem of protective device allocation in the main feeder and all branches of an overhead distribution circuit, in order to improve the reliability indices and to provide customers with service of high quality and reliability. The constraints of the problem take into account technical and economic limitations, such as coordination problems of serial protective devices, available equipment, the importance of the feeder and the circuit topology. The use of genetic algorithms (GAs) is proposed to solve this problem, using a binary representation that indicates whether (1) or not (0) protective devices (reclosers, sectionalizers and fuses) are allocated at predefined points of the circuit. Results are presented for a real circuit (134 buses), with the possibility of protective device allocation at 29 points. The ability of the algorithm to find good solutions while significantly improving the reliability indicators is also shown. (C) 2003 Elsevier B.V. All rights reserved.
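The binary encoding lends itself to a compact GA sketch: a chromosome is a 0/1 vector over the candidate points, and fitness trades avoided interruption cost against device cost under an equipment limit. All numbers below (benefits, cost, limit) are invented; the paper's NLIP objective and coordination constraints are far richer:

```python
import random

rng = random.Random(7)
N_POINTS = 29                  # candidate allocation points, as in the paper's circuit
MAX_DEVICES = 8                # illustrative equipment-availability constraint
DEVICE_COST = 2.5
benefit = [rng.uniform(1.0, 5.0) for _ in range(N_POINTS)]   # hypothetical benefit per point

def fitness(chrom):
    if sum(chrom) > MAX_DEVICES:
        return float("-inf")                     # infeasible: too many devices
    gain = sum(b for b, g in zip(benefit, chrom) if g)
    return gain - DEVICE_COST * sum(chrom)

def evolve(pop_size=40, generations=100, p_mut=0.02):
    pop = [[1 if rng.random() < 0.15 else 0 for _ in range(N_POINTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, N_POINTS)     # one-point crossover
            child = [g ^ (rng.random() < p_mut)  # bit-flip mutation
                     for g in a[:cut] + b[cut:]]
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

In the paper's setting the fitness evaluation would instead run a reliability calculation over the 134-bus circuit, but the encoding, operators and constraint handling work the same way.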

Relevance:

100.00%

Publisher:

Abstract:

This work describes a methodology for power factor control and correction of unbalanced currents in four-wire electric circuits. The methodology is based on the insertion of two compensation networks in parallel with the load: one wye with grounded neutral and another in delta. The mathematical development was proposed in previous work [3]. In this paper, however, the methodology is adapted to accept different power factors for the system to be compensated. In addition, the determination of the compensation susceptances is based on the instantaneous values of the load currents. The results are obtained using the MatLab-Simulink environment.
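The basic relation behind susceptance-based compensation, namely that a shunt susceptance B absorbs ΔQ = B·V², can be shown in a single-phase sketch; the paper's two-network, four-wire scheme additionally rebalances currents between phases, which this toy omits:

```python
import math

def compensation_susceptance(p_load, q_load, v_rms, target_pf=1.0):
    """Shunt susceptance (siemens) that moves an inductive load drawing
    p_load (W) and q_load (var) at v_rms (V) to the target power factor."""
    q_target = p_load * math.tan(math.acos(target_pf))
    return (q_load - q_target) / v_rms ** 2      # capacitive susceptance needed

# e.g. a 1 kW, 750 var load at 230 V corrected to unity power factor
b = compensation_susceptance(1000.0, 750.0, 230.0)
```

Relaxing `target_pf` below 1.0, as the adapted methodology in the paper allows, simply leaves some residual reactive power at the load and shrinks the required susceptance.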

Relevance:

100.00%

Publisher:

Abstract:

This letter presents an alternative approach for reducing total real power losses by using a continuation method. Results for two simple test systems and for the IEEE 57-bus system show that this procedure results in a larger voltage stability margin. Besides, the reduction of real power losses obtained with this procedure leads to significant monetary savings and, simultaneously, to voltage profile improvement. A comparison between the solution of an optimal power flow and the proposed method shows that the latter can provide near-optimal results, and so it can be a reasonable alternative for power system voltage stability enhancement.
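The link between voltage profile and real power losses can be seen from a two-bus sketch: series losses scale as I²R = R(P² + Q²)/V², so raising the receiving-end voltage magnitude lowers losses for the same delivered power. The per-unit values below are invented, and the letter's continuation method itself is not reproduced here:

```python
def line_losses(p_recv, q_recv, v_recv, r_line):
    """Series real power loss (p.u.) on a two-bus feeder, from I = S / V
    evaluated at the receiving end."""
    return r_line * (p_recv ** 2 + q_recv ** 2) / v_recv ** 2

loss_nominal = line_losses(1.0, 0.5, 1.00, 0.05)    # nominal receiving-end voltage
loss_improved = line_losses(1.0, 0.5, 1.05, 0.05)   # improved voltage profile
```

This is why a procedure that simultaneously flattens the voltage profile and enlarges the stability margin also tends to cut losses, as reported above.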