947 results for Distribution generation
Abstract:
In this thesis, various schemes using custom power devices for power quality improvement in low-voltage distribution networks are studied. Customer-operated distributed generators make a typical network non-radial and affect power quality. A scheme considering different DSTATCOM algorithms is proposed for power circulation and islanded operation of the system. To compensate for reactive power overflow and facilitate unity power factor, a UPQC is introduced. Stochastic analysis is carried out for different scenarios to obtain a comprehensive picture of a real-life distribution network. The combined operation of a static compensator and a voltage regulator is tested for optimum quality and stability of the system.
Abstract:
The global efforts to reduce carbon emissions from power generation have favoured renewable energy resources such as wind and solar in recent years. Power generation from renewable energy resources has become attractive because of the various incentives provided by government policies supporting green power. Among the available renewable energy resources, power generation from wind has seen tremendous growth in the last decade. This article discusses the various advantages of the emerging offshore wind technology and the considerations related to its construction. The conventional configuration of an offshore wind farm is based on alternating current internal links. With recent advances in commercialised converters, voltage source converter based high voltage direct current (HVDC) links for offshore wind farms are gaining popularity. The planning and construction phases of offshore wind farms, including related environmental issues, are discussed here.
Abstract:
This paper addresses the impact of the chosen bottle-point method when conducting ion exchange equilibrium experiments. As an illustration, potassium ion exchange with a strong acid cation resin was investigated because of its relevance to the treatment of various industrial effluents and groundwater. The “constant mass” bottle-point method was shown to be problematic in that, depending upon the resin mass used, the equilibrium isotherm profiles were different. Indeed, application of common equilibrium isotherm models revealed that the optimal fit could be with either the Freundlich or the Temkin equation, depending upon the conditions employed. It could be inferred that the resin surface was heterogeneous in character, but precise conclusions regarding the variation in the heat of sorption were not possible. Estimation of the maximum potassium loading was also inconsistent when employing the “constant mass” method. The “constant concentration” bottle-point method showed that the Freundlich model was a good representation of the exchange process, and the isotherms recorded were relatively consistent compared with the “constant mass” approach. Unification of all the equilibrium isotherm data acquired was achieved by use of the Langmuir-Vageler expression. The maximum loading of potassium ions was predicted to be at least 116.5 g/kg resin.
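As a hedged illustration of the curve-fitting step implied by this abstract, the Python sketch below fits Freundlich and Temkin isotherms to hypothetical potassium-exchange equilibrium data with SciPy. The concentration and loading values, initial guesses and parameter names are assumptions for demonstration only; they are not the paper's data or code.

```python
# Illustrative sketch (not from the paper): fitting Freundlich and Temkin
# isotherms to hypothetical K+ exchange equilibrium data with SciPy.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical equilibrium data: solution concentration Ce (mg/L) and
# resin loading qe (g/kg); real values would come from bottle-point tests.
Ce = np.array([10.0, 25.0, 50.0, 100.0, 200.0, 400.0])
qe = np.array([22.0, 41.0, 60.0, 78.0, 95.0, 110.0])

def freundlich(c, kf, n):
    # qe = Kf * Ce^(1/n)
    return kf * c ** (1.0 / n)

def temkin(c, a, b):
    # qe = B * ln(A * Ce), with B = RT/bT lumped into a single constant
    return b * np.log(a * c)

for name, model, p0 in [("Freundlich", freundlich, (1.0, 2.0)),
                        ("Temkin", temkin, (1.0, 10.0))]:
    params, _ = curve_fit(model, Ce, qe, p0=p0, maxfev=10000)
    residuals = qe - model(Ce, *params)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((qe - qe.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    print(f"{name}: params={np.round(params, 3)}, R^2={r2:.3f}")
```

Comparing the resulting goodness-of-fit values for each model is one simple way to judge which isotherm best represents a given bottle-point data set.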
Abstract:
Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods that have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on PNSD data. This study applies several clustering techniques (K-means, PAM, CLARA and SOM) to PNSD data in order to identify and apply the optimum technique to PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on a Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performance of the four clustering techniques was compared using the Dunn index and silhouette width validation values; the K-means technique was found to have the highest performance, with five clusters being the optimum. Therefore, five clusters were identified within the data using the K-means technique. The diurnal occurrence of each cluster was used, together with other air quality parameters, temporal trends and the physical properties of each cluster, to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins: regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. These two techniques can greatly assist researchers in analysing PNSD data for characterisation and source apportionment purposes.
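As a hedged sketch of the cluster-validation idea described above, the following Python example selects the number of K-means clusters for a PNSD-like matrix using the silhouette width from scikit-learn. The synthetic data, size-bin count and range of cluster numbers are assumptions for illustration, not the study's data or implementation.

```python
# Illustrative sketch (not the study's code): choosing the number of K-means
# clusters for particle number size distribution (PNSD) data by silhouette width.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

# Hypothetical PNSD matrix: rows are hourly observations, columns are size bins.
rng = np.random.default_rng(0)
pnsd = rng.lognormal(mean=2.0, sigma=0.5, size=(500, 32))

X = StandardScaler().fit_transform(pnsd)

best_k, best_score = None, -1.0
for k in range(2, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)
    print(f"k={k}: silhouette={score:.3f}")
    if score > best_score:
        best_k, best_score = k, score

print(f"Selected k={best_k} (highest silhouette width)")
```

The same loop could be repeated for PAM, CLARA or SOM solutions, with the Dunn index computed alongside the silhouette width, to mirror the kind of comparison the study reports.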
Abstract:
An expanding education market, targeted through ‘bridging material’ enabling cineliteracies, has the potential to offer Australian producers increased distribution opportunities, educators targeted teaching aids and students enhanced learning outcomes. For Australian documentary producers, the key to unlocking the potential of the education sector is engaging with its curriculum-based requirements at the earliest stages of pre-production. Two key mechanisms can lead to effective educational engagement: the established area of study guides produced in association with the Australian Teachers of Media (ATOM) and the emerging area of philanthropic funding coordinated by the Documentary Australia Foundation (DAF). DAF has acted as a key financial and cultural philanthropic bridge between individuals, foundations, corporations and the Australian documentary sector for over 14 years. DAF does not make or commission films but, through the management and receipt of grants and donations, provides ‘expertise, information, guidance and resources to help each sector work together to achieve their goals’. The DAF application process also requires film-makers to detail their ‘Education and Outreach Strategy’ for each film, with 582 films registered and 39 completed as of June 2014. These education strategies, which range from detailed to cursory efforts, offer valuable insights into the Australian documentary sector's historical and current expectations of education as a receptive and dynamic audience for quality factual content. A recurring film-maker education strategy found in the DAF data is an engagement with ATOM to create a study guide for their film. This study guide then acts as a ‘bridging material’ between content and the education audience. The frequency of this effort suggests these study guides enable greater educator engagement with content and increased interest in, and distribution of, the film to educators. The paper ‘Education paths for documentary distribution: DAF, ATOM and the study guides that bind them’ addresses issues arising from the changing needs of the education sector and the impact that targeting ‘cineliteracy’ outcomes may have on Australian documentary distribution.
Abstract:
Overvoltage and overloading due to high PV penetration are the main power quality concerns for future distribution power systems. This paper proposes a distributed control coordination strategy to manage multiple PVs within a network and overcome these issues. PV reactive power is used to deal with overvoltages, and PV active power curtailment is regulated to avoid overloading. The proposed control structure shares the required contribution fairly among PVs, in proportion to their ratings. This approach is examined on a practical distribution network with multiple PVs.
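As a hedged illustration of the proportional-sharing idea in this abstract, the sketch below allocates a required reactive-power contribution among PV inverters in proportion to their ratings, capped by each inverter's remaining capability. The feeder, unit ratings and numbers are hypothetical; this is not the paper's controller.

```python
# Illustrative sketch (not the paper's controller): splitting a required
# reactive-power contribution among PV inverters in proportion to rating.
from dataclasses import dataclass

@dataclass
class PVUnit:
    name: str
    s_rating_kva: float   # inverter apparent-power rating
    p_output_kw: float    # current active-power output

def share_reactive_power(units, q_required_kvar):
    """Allocate q_required_kvar among units proportionally to rating,
    capped by each inverter's remaining reactive-power headroom."""
    total_rating = sum(u.s_rating_kva for u in units)
    allocation = {}
    for u in units:
        q_target = q_required_kvar * u.s_rating_kva / total_rating
        q_max = max(0.0, u.s_rating_kva ** 2 - u.p_output_kw ** 2) ** 0.5
        allocation[u.name] = min(q_target, q_max)
    return allocation

# Hypothetical feeder with three PV units and a 12 kvar requirement.
pvs = [PVUnit("PV1", 5.0, 4.0), PVUnit("PV2", 10.0, 7.0), PVUnit("PV3", 15.0, 9.0)]
print(share_reactive_power(pvs, 12.0))
```

The same proportional rule could be applied to active power curtailment for overloading, with the cap taken from each unit's current generation rather than its reactive-power headroom.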
Abstract:
In this chapter, we explore methods for automatically generating game content—and games themselves—adapted to individual players in order to improve their playing experience or achieve a desired effect. This goes beyond notions of mere replayability and involves modeling player needs to maximize their enjoyment, involvement, and interest in the game being played. We identify three main aspects of this process: generation of new content and rule sets, measurement of this content and the player, and adaptation of the game to change player experience. This process forms a feedback loop of constant refinement, as games are continually improved while being played. Framed within this methodology, we present an overview of our recent and ongoing research in this area. This is illustrated by a number of case studies that demonstrate these ideas in action over a variety of game types, including 3D action games, arcade games, platformers, board games, puzzles, and open-world games. We draw together some of the lessons learned from these projects to comment on the difficulties, the benefits, and the potential for personalized gaming via adaptive game design.
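As a hedged illustration of the generate-measure-adapt feedback loop described in this chapter, the following Python sketch wires together placeholder versions of the three stages. The level representation, the simulated player measurement and the difficulty-adjustment rule are invented for demonstration and are not the authors' systems.

```python
# Illustrative sketch (not the chapter's systems): the generate -> measure ->
# adapt feedback loop for player-adaptive content, with placeholder logic.
import random

def generate_level(difficulty):
    """Generate placeholder level parameters from a difficulty value in [0, 1]."""
    return {"enemies": int(3 + difficulty * 12),
            "gap_width": 1.0 + difficulty * 2.0}

def measure_player(level):
    """Stand-in for observing play: returns a simulated completion rate."""
    skill = 0.6  # hypothetical fixed player skill
    challenge = (level["enemies"] / 15 + (level["gap_width"] - 1.0) / 2.0) / 2
    return max(0.0, min(1.0, skill - challenge + random.uniform(-0.1, 0.1) + 0.5))

def adapt(difficulty, completion_rate, target=0.7, step=0.1):
    """Nudge difficulty so the observed completion rate approaches the target."""
    return min(1.0, max(0.0, difficulty + step * (completion_rate - target)))

difficulty = 0.5
for session in range(5):
    level = generate_level(difficulty)
    rate = measure_player(level)
    difficulty = adapt(difficulty, rate)
    print(f"session {session}: level={level}, completion={rate:.2f}, "
          f"next difficulty={difficulty:.2f}")
```

In a real adaptive game, the measurement stage would draw on logged player behaviour or a learned player model rather than a simulated completion rate, and the adaptation could act on generated content and rules rather than a single difficulty scalar.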