861 results for Grid-connected


Relevance:

20.00%

Publisher:

Abstract:

Besides the traditional paradigm of "centralized" power generation, a new concept of "distributed" generation is emerging, in which the user becomes a prosumer. During this transition, Energy Storage Systems (ESS) can provide multiple services and features that are necessary for a higher-quality electrical system and for the optimization of non-programmable Renewable Energy Source (RES) power plants. An ESS prototype was designed, developed and integrated into a renewable energy production system in order to create a smart microgrid and thus manage the energy flow efficiently and intelligently as a function of the power demand. The produced energy can be fed into the grid, supplied directly to the load, or stored in batteries. The microgrid comprises a 7 kW wind turbine (WT) and a 17 kW photovoltaic (PV) plant. The load is given by the electrical utilities of a cheese factory. The ESS consists of two subsystems: a Battery Energy Storage System (BESS) and a Power Control System (PCS). To size the ESS, a Remote Grid Analyzer (RGA) was designed, built and connected to the wind turbine, the photovoltaic plant and the switchboard. Different electrochemical storage technologies were then studied and, taking into account the load requirements of the cheese factory, the most suitable solution was identified in the high-temperature molten-salt Na-NiCl2 battery technology. The data acquired from all electrical utilities provided a detailed load analysis, indicating an optimal storage size of a 30 kW battery system. Moreover, a container was designed and built to house the BESS and PCS, meeting all requirements and safety conditions. Finally, a smart control system was implemented to handle the different applications of the ESS, such as peak shaving or load levelling.
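The dispatch behavior this abstract describes (charge on renewable surplus, discharge to shave demand peaks) can be sketched as follows. Only the 30 kW battery rating comes from the abstract; the capacity, peak threshold and time step are illustrative assumptions, not parameters of the actual prototype.

```python
# Minimal sketch of an ESS peak-shaving / load-levelling controller.
# BATTERY_POWER_KW is from the abstract; the other constants are assumed.

BATTERY_POWER_KW = 30.0     # max charge/discharge power (from the abstract)
CAPACITY_KWH = 60.0         # assumed usable energy capacity
PEAK_THRESHOLD_KW = 20.0    # assumed grid-import limit for peak shaving

def dispatch(load_kw, pv_wind_kw, soc_kwh, dt_h=0.25):
    """Return (battery_kw, grid_kw, new_soc_kwh) for one time step.
    battery_kw > 0 means discharging, < 0 means charging."""
    net = load_kw - pv_wind_kw          # residual demand after renewables
    if net > PEAK_THRESHOLD_KW:         # peak hour: discharge to shave it
        p = min(net - PEAK_THRESHOLD_KW, BATTERY_POWER_KW, soc_kwh / dt_h)
    elif net < 0:                       # renewable surplus: charge
        p = -min(-net, BATTERY_POWER_KW, (CAPACITY_KWH - soc_kwh) / dt_h)
    else:
        p = 0.0
    return p, net - p, soc_kwh - p * dt_h

# A 10 kW surplus charges the battery; a 35 kW residual peak is shaved to 20 kW.
print(dispatch(5.0, 15.0, 30.0))    # charging step
print(dispatch(40.0, 5.0, 30.0))    # discharging step
```

The same loop, fed by measured load and production data such as the RGA provides, is the core of any peak-shaving or load-levelling strategy.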


The CMS experiment at the LHC collected large volumes of data during Run-1, and is using the shutdown period (LS1) to evolve its computing system. Among the possible improvements to the system, there is wide scope for optimizing storage usage at Tier-2 computing centres, which represent, within the Worldwide LHC Computing Grid (WLCG), the core of the resources dedicated to distributed analysis on the Grid. This thesis presents a study of the popularity of CMS data in distributed Grid analysis at the Tier-2s. The goal of the work is to equip the CMS computing system with a tool for systematically assessing the amount of disk space that is written but never accessed at the Tier-2 centres, contributing to the construction of an advanced dynamic data-management system able to adapt elastically to varying operating conditions, removing unnecessary data replicas or adding replicas of the most "popular" data, and thus, ultimately, to increase the overall analysis throughput. Chapter 1 provides an overview of the CMS experiment at the LHC. Chapter 2 describes the CMS Computing Model in general terms, focusing mainly on data management and the related infrastructure. Chapter 3 describes the CMS Popularity Service, giving an overview of the data-popularity services already present in CMS before this work began. Chapter 4 describes the architecture of the toolkit developed for this thesis, laying the groundwork for the following chapter. Chapter 5 presents and discusses the data-popularity studies performed on the data collected through the previously developed infrastructure. Appendix A collects two examples of code created to manage the toolkit through which the data are collected and processed.
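The core metric the thesis targets, disk space written but never accessed, can be sketched as a simple join between replica sizes and access counts. The dataset names, sizes and counts below are invented for illustration; in CMS these would come from the replica catalogues and the Popularity Service records the abstract mentions.

```python
# Sketch of the "written but never accessed" metric. All data are invented.

replicas = {                 # dataset -> size on disk at a Tier-2 (TB)
    "/DatasetA/RECO": 12.0,
    "/DatasetB/AOD":   5.5,
    "/DatasetC/AOD":   8.0,
}
accesses = {                 # dataset -> number of recorded Grid-analysis accesses
    "/DatasetA/RECO": 0,
    "/DatasetB/AOD": 140,
    "/DatasetC/AOD": 0,
}

def unaccessed_space(replicas, accesses):
    """Total TB occupied by replicas with zero recorded accesses."""
    return sum(size for ds, size in replicas.items()
               if accesses.get(ds, 0) == 0)

print(unaccessed_space(replicas, accesses))  # TB that are deletion candidates
```

A dynamic data manager would delete such cold replicas and use the freed space for extra copies of popular datasets, which is exactly the trade-off the thesis quantifies.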


This thesis analyses the main Resource Discovery techniques used in Grid Computing systems, evaluating the principal advantages and disadvantages of each solution. Particular attention is given to agent-based Resource Discovery, which is proposed as an architecture capable of definitively solving the classic problems of these networks. Within the work, each technique presented is also accompanied by a practical implementation: among these are MDS, Chord, and Kang's implementation.
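Of the implementations cited, Chord is a distributed hash table, and its central idea can be sketched briefly: each node and each resource key is hashed onto an identifier ring, and a key is owned by its successor node. Real Chord routes in O(log N) hops via finger tables; this sketch walks the ring linearly, and the node and resource names are invented.

```python
# Minimal sketch of Chord-style successor lookup on an identifier ring.
# Linear walk instead of finger tables; toy 8-bit identifier space.

import hashlib

M = 8                         # identifier space holds 2**M positions

def chord_id(name: str) -> int:
    """Hash a node or resource name onto the identifier ring."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big") % (2 ** M)

def successor(key: int, node_ids: list) -> int:
    """First node clockwise from key (wrapping around the ring)."""
    ring = sorted(node_ids)
    for n in ring:
        if n >= key:
            return n
    return ring[0]            # wrap past position 2**M - 1

nodes = [chord_id(f"node{i}") for i in range(5)]
key = chord_id("gpu-cluster")          # resource being discovered
print(f"key {key} is stored at node {successor(key, nodes)}")
```

The same successor rule is what lets nodes join and leave with only local re-mapping of keys, the property that makes Chord attractive for Grid resource discovery.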


Electric power grids throughout the world suffer from serious inefficiencies associated with under-utilization, due to demand patterns, engineering design and the load-following approaches in use today. These grids consume much of the world's energy and represent a large carbon footprint. From a material-utilization perspective, significant hardware is manufactured and installed for this infrastructure, often to be used at less than 20-40% of its operational capacity for most of its lifetime. These inefficiencies lead engineers to require additional grid support and conventional generation capacity when renewable technologies (such as solar and wind) and electric vehicles are added to the utility demand/supply mix. Using actual data from PJM [PJM 2009], the work shows that consumer load management, real-time price signals, sensors and intelligent demand/supply control offer a compelling path forward to increase the efficient utilization of the world's grids and reduce their carbon footprint. Under-utilization factors from many distribution companies indicate that distribution feeders are often operated at only 70-80% of their peak capacity for a few hours per year, and on average are loaded to less than 30-40% of their capability. By creating strong societal connections between consumers and energy providers, technology can radically change this situation. Through the intelligent deployment of smart sensors, smart electric vehicles and consumer-based load management technology, very high saturations of intermittent renewable energy supplies can be effectively controlled and dispatched to increase the utilization of existing utility distribution, substation, transmission and generation equipment.
Strengthening these relationships among technology, society and consumers requires the rapid dissemination of knowledge (real-time prices, cost and benefit sharing, demand-response requirements) in order to incentivize behaviors that increase the effective use of technological equipment representing one of the largest capital assets modern society has created.
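The mechanism argued for above, real-time price signals steering deferrable loads into off-peak hours, can be sketched in a few lines. The hourly price curve and the four-hour load requirement are invented for illustration, not PJM data.

```python
# Illustrative sketch of price-responsive load control: a deferrable load
# (e.g. EV charging) is scheduled into the cheapest hours of the day,
# which flattens the feeder's load profile. All numbers are invented.

hourly_price = [42, 38, 31, 29, 28, 30, 45, 61,   # $/MWh, hours 0-7
                70, 66, 58, 52, 50, 49, 55, 63,   # hours 8-15
                78, 92, 88, 71, 60, 52, 47, 44]   # hours 16-23

def schedule_deferrable(prices, hours_needed):
    """Pick the cheapest hours of the day to run a deferrable load."""
    ranked = sorted(range(len(prices)), key=lambda h: prices[h])
    return sorted(ranked[:hours_needed])

print(schedule_deferrable(hourly_price, 4))  # overnight hours win
```

Since low prices track low system load, many consumers following this rule shift demand off the peak, which is precisely how utilization of existing feeders and generation rises without new capacity.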


Grid (or sieve) therapy ("Gitter-" oder "Siebtherapie"), spatially fractionated kilo- and megavolt X-ray therapy, was invented in 1909 by Alban Köhler, a radiologist in Wiesbaden, Germany. He tested it on several patients before 1913 using approximately 60-70 kV Hittorf-Crookes tubes. Köhler pushed the X-ray tube's lead-shielded housing against a stiff grid of 1 mm-square iron wires woven 3.0-3.5 mm on center, taped tightly to the skin over a thin chamois. Numerous islets unshielded by iron in the pressure-blanched skin were irradiated with up to about 6 erythema doses (ED). The skin was then thoroughly cleansed, disinfected, and bandaged; delayed punctate necrosis healed in several weeks. Although grid therapy was disparaged or ignored until the 1930s, it has been used successfully since then to shrink bulky malignancies. Also, advanced cancers in rats and mice have been mitigated or ablated using Köhler's concept since the early 1990s by unidirectional or stereotactic exposure to an array of nearly parallel microplanar (25-75 μm-wide) beams of very intense, moderately hard (median energy approximately 100 keV) synchrotron-generated X rays spaced 0.1-0.4 mm on center. Such beams maintain sharp edges at high doses well beneath the skin yet confer little toxicity. They could palliate some otherwise intractable malignancies, perhaps in young children too, with tolerable sequelae. There are plans for such studies in larger animals.


Pumped-storage (PS) systems are used to store electric energy as potential energy for release during peak demand. We investigate the impacts of a planned 1000 MW PS scheme connecting Lago Bianco with Lago di Poschiavo (Switzerland) on temperature and particle mass concentration in both basins. The upper (turbid) basin is a reservoir receiving large amounts of fine particles from the partially glaciated watershed, while the lower basin is a much clearer natural lake. Stratification, temperature and particle concentrations in the two basins were simulated with and without PS for four different hydrological conditions and 27 years of meteorological forcing using the software CE-QUAL-W2. The simulations showed that the PS operations lead to an increase in temperature in both basins during most of the year. The increase is most pronounced (up to 4°C) in the upper hypolimnion of the natural lake toward the end of summer stratification and is partially due to frictional losses in the penstocks, pumps and turbines. The remainder of the warming is from intense coupling to the atmosphere while water resides in the shallower upper reservoir. These impacts are most pronounced during warm and dry years, when the upper reservoir is strongly heated and the effects are least concealed by floods. The exchange of water between the two basins relocates particles from the upper reservoir to the lower lake, where they accumulate during summer in the upper hypolimnion (10 to 20 mg L⁻¹) but also to some extent decrease light availability in the trophic surface layer.
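The frictional contribution to the warming described above admits a quick back-of-the-envelope check: a fraction (1 − η) of the potential energy g·h of each cycled kilogram of water is dissipated as heat. The head and round-trip efficiency below are assumed round numbers for illustration, not figures from the Lago Bianco scheme.

```python
# Back-of-the-envelope estimate of frictional warming per pump-turbine
# cycle. Head and round-trip efficiency are assumed, not plant data.

G = 9.81           # gravitational acceleration, m/s^2
CP_WATER = 4186.0  # specific heat of water, J/(kg K)

def friction_heating_K(head_m, round_trip_eff):
    """Temperature rise of the cycled water if all losses end up as heat."""
    lost_per_kg = G * head_m * (1.0 - round_trip_eff)  # J dissipated per kg
    return lost_per_kg / CP_WATER

dt = friction_heating_K(head_m=570.0, round_trip_eff=0.80)
print(f"~{dt:.2f} K per pump-turbine cycle")
```

A few tenths of a kelvin per cycle, accumulated over many daily cycles into a confined hypolimnion, is consistent with frictional losses supplying part of the multi-degree warming the simulations report.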