875 results for Federal Energy Management Program (U.S.)


Relevance:

100.00%

Publisher:

Abstract:

This paper presents the application of fuzzy theory to support the decision to implement an energy efficiency program in sawmills processing Pinus taeda and Pinus elliottii. The use of a system based on fuzzy theory for the analysis of consumption is justified by the diversity of the rates and factors involved. With fuzzy theory, a reliable system for verifying actual energy efficiency can be built. The indices and factors characteristic of the industrial activity were measured and used as the basis for the fuzzy system. Two systems were developed: a management system and a technological system. The management system involves management practices in energy efficiency, maintenance of plant and equipment, and the presence of qualified staff. The technological system involves the power factor, the load factor, the demand factor, and the specific consumption. The first system's response indicates the potential for increased energy efficiency, and the second indicates the level of energy efficiency in the industry studied. With this tool, energy conservation and energy efficiency programs can be developed for the timber industry, with wide application in an area whose production processes are highly diverse. The systems developed can also be used in other industrial activities, provided that the indices and factors characteristic of the sectors involved are used.
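As a rough illustration of the kind of system described, a Mamdani-style fuzzy evaluation of energy efficiency from two of the named factors might look like the sketch below; the membership functions, breakpoints, and rules are hypothetical, not the paper's actual model.

```python
# Toy Mamdani-style fuzzy system: load factor and power factor in,
# an energy-efficiency score out. All breakpoints and rules are hypothetical.
import numpy as np

def down(x, a, b):
    """Decreasing shoulder membership: 1 below a, 0 above b."""
    return np.interp(x, [a, b], [1.0, 0.0])

def up(x, a, b):
    """Increasing shoulder membership: 0 below a, 1 above b."""
    return np.interp(x, [a, b], [0.0, 1.0])

def efficiency_score(load_factor: float, power_factor: float) -> float:
    # Fuzzify the two inputs.
    lf_low, lf_high = down(load_factor, 0.2, 0.6), up(load_factor, 0.4, 0.8)
    pf_low, pf_high = down(power_factor, 0.80, 0.92), up(power_factor, 0.85, 0.97)

    # Rules: min for AND, max for OR.
    fire_high = min(lf_high, pf_high)  # IF lf high AND pf high THEN eff high
    fire_low = max(lf_low, pf_low)     # IF lf low  OR  pf low  THEN eff low

    # Mamdani aggregation + centroid defuzzification on the output universe.
    u = np.linspace(0.0, 1.0, 101)
    agg = np.maximum(np.minimum(fire_high, up(u, 0.5, 1.0)),
                     np.minimum(fire_low, down(u, 0.0, 0.5)))
    return float((u * agg).sum() / agg.sum())

print(f"efficiency score: {efficiency_score(0.75, 0.95):.2f}")  # closer to 1 = better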

Relevance:

100.00%

Publisher:

Abstract:

Invasive feral swine (Sus scrofa) cause deleterious impacts to ecosystem processes and functioning throughout their worldwide distribution, including forested ecosystems in the United States. Unfortunately, many feral swine damage management programs are conducted in a piecemeal fashion, are not adequately funded, and lack clearly stated or realistic objectives. This review paper identifies damage caused by feral swine to forest resources and presents techniques used to prevent and control feral swine damage. Concluding points related to planning a feral swine damage management program are: (1) the value of using a variety of techniques in an integrated fashion cannot be overstated; (2) there is value in using indices for both feral swine populations and their damage before and after management activities; (3) innovative technologies will increasingly be of value in the pursuit of feral swine damage reduction; and (4) though not appropriate in every situation, there is value in involving the public in feral swine damage management decisions and activities.

Relevance:

100.00%

Publisher:

Abstract:

The Vancouver International Airport (YVR) is the second busiest airport in Canada. YVR is located on Sea Island in the Fraser River Estuary, a world-class wintering and staging area for hundreds of thousands of migratory birds. The Fraser Delta supports Canada's largest wintering populations of waterfowl, shorebirds, and raptors. The large number of aircraft movements and the presence of many birds near YVR pose a wide range of aviation safety hazards. Until the late 1980s, when a full-time Wildlife Control Program (WCP) was initiated, YVR had the highest number of bird strikes of any Canadian commercial airport. Although the risks of bird strikes associated with the operation of YVR are generally well known by airport managers, and a number of risk assessments have been conducted for the Sea Island Conservation Area, no quantitative assessment of the risks of bird strikes has been conducted for airport operations at YVR. Because the goal of all airports is to operate safely, an airport wildlife management program strives to reduce the risk of bird strikes. A risk assessment establishes the current risk of strikes, which can be used as a benchmark to focus wildlife control activities and to assess the effectiveness of the program in reducing bird strike risks. A quantitative risk assessment also documents the process and information used in assessing risk and allows the assessment to be repeated in the future in order to measure the change in risk over time in an objective and comparative manner. This study was undertaken to comply with new Canadian legislation, expected to take effect in 2006, requiring airports in Canada to conduct a risk assessment and develop a wildlife management plan. Although YVR has had a management plan for many years, it took this opportunity to update the plan and conduct a risk assessment.

Relevance:

100.00%

Publisher:

Abstract:

We calculate the relic abundance of mixed axion/neutralino cold dark matter which arises in R-parity conserving supersymmetric (SUSY) models wherein the strong CP problem is solved by the Peccei-Quinn (PQ) mechanism with a concomitant axion/saxion/axino supermultiplet. By numerically solving the coupled Boltzmann equations, we include the combined effects of (1) thermal axino production with cascade decays to a neutralino LSP, (2) thermal saxion production and production via coherent oscillations along with cascade decays and entropy injection, (3) thermal neutralino production and re-annihilation after both axino and saxion decays, (4) gravitino production and decay, and (5) axion production both thermally and via oscillations. For SUSY models with too high a standard neutralino thermal abundance, we find the combined effect of SUSY PQ particles is not enough to lower the neutralino abundance down to its measured value while at the same time respecting bounds on late-decaying neutral particles from BBN. However, models with a standard neutralino underabundance can now be allowed with either neutralino or axion domination of dark matter; furthermore, these models allow the PQ breaking scale f_a to be pushed up into the 10^14 to 10^15 GeV range, which is where it is typically expected to be in string theory models.
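As a schematic of what numerically solving coupled Boltzmann equations involves, the following toy system shows the structure of such a calculation: a decaying axino-like parent feeding a neutralino-like LSP that re-annihilates. This is a sketch only; the rates and units are hypothetical and Hubble expansion is ignored.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy coupled system:
#   dn_parent/dt = -Gamma * n_parent                  (late decays to the LSP)
#   dn_lsp/dt    = +Gamma * n_parent - <sv> * n_lsp^2 (re-annihilation)
GAMMA = 1e-2    # parent decay rate (hypothetical units)
SIGMAV = 1e3    # thermally averaged annihilation cross section (hypothetical)

def rhs(t, y):
    n_parent, n_lsp = y
    return [-GAMMA * n_parent,
            GAMMA * n_parent - SIGMAV * n_lsp**2]

# Implicit (stiff) solver, since the decay and annihilation time scales
# differ by several orders of magnitude.
sol = solve_ivp(rhs, (0.0, 1e4), [1.0, 1e-6],
                method="Radau", rtol=1e-8, atol=1e-14)
print(f"final LSP density (toy units): {sol.y[1, -1]:.3e}")
```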

Relevance:

100.00%

Publisher:

Abstract:

This dissertation describes the development of a demonstrator within the scope of Supply Chain 6 of the Internet of Energy (IoE) project: the Remote Monitoring Emulator, to which I contributed in several sections. IoE is a project of international relevance that aims to establish an interoperability standard for the electric power production and utilization infrastructure, using Smart Space platforms. The future perspectives of IoE concern a platform for electrical power trade-off, the Smart Grid, whose energy is produced by decentralized renewable sources and whose services are exploited primarily according to the Internet of Things philosophy. The main consumers of this kind of smart technology will be Smart Houses (that is to say, buildings controlled by an autonomous system for electrical energy management that is interoperable with the Smart Grid) and Electric Mobility, that is, the smart and automated management of the movement and, above all, the recharging of electric vehicles. It is precisely in this latter case study that the Remote Monitoring Emulator project takes place. It consists of the development of a simulated platform for managing the recharging of an electric vehicle in a city. My personal contribution to this project lies in the development and modeling of the simulation platform and of its counterpart in a mobile application, and in the implementation of a city service prototype. This platform will ultimately make up a demonstrator system exploiting the same device that a real user, inside his vehicle, would use. The main requirements that this platform must satisfy are interoperability, expandability, and adherence to standards, as it needs to communicate with other development groups and to respond effectively to internal changes that can affect IoE.

Relevance:

100.00%

Publisher:

Abstract:

The US penitentiary at Lewisburg, Pennsylvania, was retrofitted in 2008 to offer the country's first federal Special Management Unit (SMU) program of its kind. This model SMU is designed for federal inmates from around the country identified as the most intractably troublesome, and features double-celling of inmates in tiny spaces, in 23- or 24-hour-a-day lockdown, requiring them to pass through a two-year program of readjustment. These spatial tactics, and the philosophy of punishment underlying them, contrast with the modern reform ideals upon which the prison was designed and built in 1932. The SMU represents the latest punitive phase in American penology, one that neither simply eliminates men as in the premodern spectacle, nor creates the docile, rehabilitated bodies of the modern panopticon; rather, it is a late-modern structure that produces only fear, terror, violence, and death. This SMU represents the latest of the late-modern prisons, similar to other supermax facilities in the US but offering its own unique system of punishment as well. While the prison exists within the system of American law and jurisprudence, it also manifests features of Agamben's lawless, camp-like space that emerges during a state of exception, exempt from outside scrutiny, with inmate treatment typically beyond the scope of the law.

Relevance:

100.00%

Publisher:

Abstract:

The past decade has seen the energy consumption of servers and Internet Data Centers (IDCs) skyrocket. A recent survey estimated that worldwide spending on servers and cooling has risen above $30 billion and is likely to exceed spending on new server hardware. The rapid rise in energy consumption poses a serious threat to both energy resources and the environment, which makes green computing not only worthwhile but also necessary. This dissertation tackles the challenges of reducing both the energy consumption of server systems and the costs for Online Service Providers (OSPs). Two distinct subsystems account for most of an IDC's power: the server system, which accounts for 56% of the total power consumption of an IDC, and the cooling and humidification systems, which account for about 30%. The server system dominates the energy consumption of an IDC, and its power draw can vary drastically with data center utilization. In this dissertation, we propose three models to achieve energy efficiency in web server clusters: an energy proportional model, an optimal server allocation and frequency adjustment strategy, and a constrained Markov model. The proposed models combine Dynamic Voltage/Frequency Scaling (DVFS) and Vary-On, Vary-Off (VOVF) mechanisms that work together for greater energy savings, and corresponding strategies are proposed to deal with the transition overheads. We further extend server energy management to the IDC's cost management, helping OSPs to conserve and manage their own electricity costs and lower their carbon emissions. We have developed an optimal energy-aware load dispatching strategy that periodically maps more requests to the locations with lower electricity prices. A carbon emission limit is imposed, and the volatility of the carbon offset market is also considered. Two energy efficient strategies are applied to the server system and the cooling system, respectively. With the rapid development of cloud services, we also carry out research to reduce server energy in cloud computing environments. In this work, we propose a new live virtual machine (VM) placement scheme that can effectively map VMs to Physical Machines (PMs) with substantial energy savings in a heterogeneous server cluster. A VM/PM mapping probability matrix is constructed, in which each VM request is assigned a probability of running on each PM. The VM/PM mapping probability matrix takes into account resource limitations, VM operation overheads, and server reliability as well as energy efficiency. The evolution of Internet Data Centers and the increasing demands of web services raise great challenges for improving the energy efficiency of IDCs. We also identify several potential areas for future research in each chapter.
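The flavor of the price-aware load dispatching idea can be sketched as follows; this is a toy greedy version with hypothetical site names, prices, and capacities, not the dissertation's actual optimization:

```python
# Toy price-aware dispatching: shift requests toward locations with cheaper
# electricity, subject to per-site capacity. All names and numbers are
# hypothetical.

SITES = {            # site -> (electricity price $/MWh, capacity in requests/s)
    "virginia": (42.0, 5000),
    "oregon":   (31.0, 3000),
    "dublin":   (55.0, 4000),
}

def dispatch(total_rps: float) -> dict[str, float]:
    """Greedily fill the cheapest sites first, respecting capacity."""
    plan, remaining = {}, total_rps
    for site, (price, cap) in sorted(SITES.items(), key=lambda kv: kv[1][0]):
        share = min(cap, remaining)
        plan[site] = share
        remaining -= share
    if remaining > 0:
        raise RuntimeError("demand exceeds total capacity")
    return plan

print(dispatch(9000))  # {'oregon': 3000, 'virginia': 5000, 'dublin': 1000}
```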

Relevance:

100.00%

Publisher:

Abstract:

This thesis focuses on the impact of the American shale gas boom on the European natural gas market. The study presents different tests in order to analyze the dynamics of natural gas prices in the U.S., U.K., and German natural gas markets. The question of cointegration between these markets is analyzed using several tests: more specifically, the ADF test is used to check for the presence of a unit root, and the error correction model test and the Johansen cointegration procedure are applied in order to accept or reject the hypothesis of an integrated market. The results suggest no evidence of cointegration between these markets. There is currently no evidence of an impact of the U.S. shale gas boom on the European market.
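For readers unfamiliar with these tests, a minimal sketch using statsmodels on synthetic random-walk series might look like the following (synthetic data only; the thesis applies the tests to the actual U.S., U.K., and German price series):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
n = 500
# Two synthetic, independent random-walk "price" series (not cointegrated).
us = np.cumsum(rng.normal(size=n))
uk = np.cumsum(rng.normal(size=n))

# ADF test: the null hypothesis is a unit root (non-stationary series).
stat, pvalue, *_ = adfuller(us)
print(f"ADF statistic {stat:.3f}, p-value {pvalue:.3f}")

# Johansen procedure: compare trace statistics against critical values.
data = pd.DataFrame({"us": us, "uk": uk})
res = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:", res.lr1)
print("95% critical values:", res.cvt[:, 1])  # columns: 90%, 95%, 99%
```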

Relevance:

100.00%

Publisher:

Abstract:

STUDY DESIGN Technical note and case series. OBJECTIVE To introduce an innovative minimally invasive surgical procedure that reduces surgery time and blood loss in the management of U-shaped sacrum fractures. SUMMARY OF BACKGROUND Despite their rarity, U-shaped fractures can cause severe neurological deficits and surgical management difficulties. Owing to the nature of the injury, which normally occurs in multiply injured patients after a fall from height, a jump, or a road traffic accident, U-shaped fractures create a spinopelvic dissociation and hence are highly unstable. In the past, time-consuming open procedures such as large posterior constructs or shortening osteotomies, with or without decompression, were the method of choice, sacrificing spinal mobility. Insufficient restoration of the sacrococcygeal angle and pelvic incidence with conventional techniques may have adverse long-term effects in these patients. METHODS In a consecutive series of 3 patients, percutaneous reduction of the fracture was achieved with Schanz pins inserted in either the pedicles of L5 or the S1 body and in the posterior superior iliac crest. The Schanz pins act as levers, allowing good manipulation of the fracture. The reduction is secured by a temporary external fixator to permit optimal restoration of pelvic incidence and sacral kyphosis. Insertion of 2 transsacral screws allows fixation of the restored spinopelvic alignment. RESULTS Anatomic alignment of the sacrum was possible in each case. Surgery time ranged from 90 to 155 minutes, and blood loss was <50 mL in all 3 cases. Two patients had very good long-term results regarding maintenance of pelvic incidence and sacrococcygeal angle. One patient with previous cauda equina decompression had loss of correction after 6 months. CONCLUSIONS Percutaneous reduction and transsacral screw fixation offer a less invasive method for treating U-shaped fractures. This can be advantageous in the treatment of patients with multiple injuries.

Relevance:

100.00%

Publisher:

Abstract:

A process evaluation of the Houston Childhood Lead Poisoning Prevention Program, 1992-1995, was conducted. The Program's goal is to reduce lead poisoning prevalence. The study proposed to determine to what extent the Program was implemented as planned by measuring how well Program services were actually: (1) received by the intended target population; (2) delivered to children with elevated blood lead levels; (3) delivered in compliance with the Centers for Disease Control and Prevention and Program guidelines and timetables; and (4) able to reduce lead poisoning prevalence among those rescreened. Utilizing a program monitoring design, the Program's pre-collected computer records were reviewed. The study sample consisted of 820 children whose blood lead levels were above 15 micrograms per deciliter, representing approximately 2.9% of the 28,406 screened over this period. Three blood lead levels from each participant were examined: the initial elevated result, the confirmatory result, and the next rescreen result after the elevated confirmatory level. Results showed that the Program screened approximately 18% (28,406 of 161,569) of Houston's children under age 6 years for lead poisoning. Based on chi-square tests of significance, results also showed that lead-poisoned participants were more likely to be younger than 3 years, male, and Hispanic, compared to those not lead poisoned. The age, gender, and ethnic differences observed were statistically significant (p = .01, p = .00, p = .00). Four of the six Program services (medical evaluations, rescreening, environmental inspections, and confirmation) had satisfactory delivery completion rates of 71%-98%. Delivery timetable compliance rates for three of the six services examined (outreach contacts, home visits, and environmental inspections) were below 32%. However, dangerously elevated blood lead levels fell, and lead poisoning prevalence dropped from 3.3% at initial screening to 1.2% among those rescreened after intervention. From a public health perspective, reductions in lead poisoning prevalence are very meaningful. Based on these findings, the following are recommendations for future research: (1) integrate Program database files by utilizing a database management program; (2) target services at Hispanic male children under age 3 years living in the highest-risk neighborhoods; (3) increase resources to improve tracking and documentation of service delivery and to provide more non-medical case management and environmental services; and (4) share the evaluation methodology and findings with Centers for Disease Control and Prevention administrators; the implications may be relevant to other program managers conducting such assessments.
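For illustration, the kind of chi-square comparison described might be computed as follows (the counts below are hypothetical, not the Program's data):

```python
# Hypothetical 2x2 contingency table: rows are lead-poisoned / not poisoned,
# columns are under 3 years / 3 years and older.
from scipy.stats import chi2_contingency

table = [[310, 510],
         [9200, 18386]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
```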

Relevance:

100.00%

Publisher:

Abstract:

The progressive depletion of fossil fuels and their high share of the energy supply of modern society mean that they will soon have to be replaced by renewable fuels. But the dispersion and intermittency of renewable energy production also require reducing the costs of hydrogen as an energy storage medium and carrier. It is necessary to develop technologies for hydrogen production from all renewable sources, energy storage technologies, and technologies for energy production from hydrogen in fuel cells and in cogeneration and trigeneration systems. In order to propel this technological development, in which hydrogen plays a key role as a store of renewable energy, the National Centre of Hydrogen and Fuel Cell Technology Experimentation in Spain is equipped with installations for scientific and technological design, development, verification, certification, approval, testing, and measurement; more importantly, the facility ensures continuous operation 24 hours a day, 365 days a year. At the same time, the system is scalable, so as to allow the continuous adaptation of new technologies as they are developed and incorporated into the assembly, verifying their integration while checking the validity of their development. The transformation sector can be said to be the heart of the system because, without neglecting the other sectors, it must prove the validity of hydrogen as an energy carrier and storage medium; important efforts are needed to demonstrate the suitability of fuel cells or internal combustion systems for recovering the energy stored in hydrogen at prices competitive with conventional systems. The multiple roles to be met by fuel cells under different operating conditions require many different sizes and applications. The fourth area, focused on integration, is an essential complement within the installation. Not only the electricity produced must be integrated, but also the hydrogen used and the heat generated in the process of using hydrogen energy. Energy management in its three forms (chemical as hydrogen, electrical, and thermal) is complicated and requires extremes of logic and artificial intelligence to ensure maximum energy efficiency while achieving optimum utilization. The centre has been conceived to verify development and approval across the entire production system and, ultimately, to serve as a demonstrator facilitating the simultaneous evolution of hydrogen production, storage, and distribution technologies and of fuel cells.

Relevance:

100.00%

Publisher:

Abstract:

Using photocatalysis for energy applications depends, more than for environmental purposes or selective chemical synthesis, on converting as much of the solar spectrum as possible; the best photocatalyst, titania, is far from this. Many efforts are being pursued to better use that spectrum in photocatalysis, by doping titania or using other materials (mainly oxides, nitrides and sulphides) to obtain a lower bandgap, even if this means decreasing the chemical potential of the electron-hole pairs. Here we introduce an alternative scheme, using an idea recently proposed for photovoltaics: intermediate band (IB) materials. It consists in introducing into the gap of a semiconductor an intermediate level which, acting like a stepping stone, allows an electron to jump from the valence band to the conduction band in two steps, each one absorbing one sub-bandgap photon. For this the IB must be partially filled, to allow both sub-bandgap transitions to proceed at comparable rates; it must be made of delocalized states to minimize nonradiative recombination; and it should not communicate electronically with the outer world. For photovoltaic use the optimum efficiency so achievable, over 1.5 times that given by a normal semiconductor, is obtained with an overall bandgap around 2.0 eV (which would be near-optimal also for water photosplitting). Note that this scheme differs from the doping principle usually considered in photocatalysis, which just tries to decrease the bandgap; its aim is to keep the full-bandgap chemical potential while also using lower energy photons. In the past we have proposed several IB materials based on extensively doping known semiconductors with light transition metals, checking first of all with quantum calculations that the desired IB structure results. Subsequently we have synthesized two of them in powder form: the thiospinel In2S3 and the layered compound SnS2 (having bandgaps of 2.0 and 2.2 eV, respectively), in which the octahedral cation is substituted at a ~10% level with vanadium, and we have verified that this substitution introduces into the absorption spectrum the sub-bandgap features predicted by the calculations. With these materials we have verified, using a simple reaction (formic acid oxidation), that the photocatalytic spectral response is indeed extended to longer wavelengths, being able to use even 700 nm photons, without largely degrading the response for above-bandgap photons (i.e. strong recombination is not induced) [3b, 4]. These materials are thus promising for efficient photoevolution of hydrogen from water; work on this is being pursued, the results of which will be presented.
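To make the sub-bandgap arithmetic concrete, here is a quick worked check (the 0.8/1.2 eV split of the gap is a hypothetical illustration, not a measured IB position): a 700 nm photon carries less energy than the 2.0 eV bandgap, so it can only be used via two IB-mediated steps whose energies sum to the full gap.

```latex
E_{700\,\mathrm{nm}} = \frac{hc}{\lambda}
  \approx \frac{1240\ \mathrm{eV\,nm}}{700\ \mathrm{nm}}
  \approx 1.77\ \mathrm{eV} \;<\; E_g = 2.0\ \mathrm{eV},
\qquad
\underbrace{0.8\ \mathrm{eV}}_{\mathrm{VB}\to\mathrm{IB}}
  + \underbrace{1.2\ \mathrm{eV}}_{\mathrm{IB}\to\mathrm{CB}} = E_g .
```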

Relevance:

100.00%

Publisher:

Abstract:

The Photovoltaic (PV) Module Reliability Workshop was held in Golden, Colorado, from Feb. 28 to March 1, 2012. The objective was to share information to improve PV module reliability, because such improvements reduce the cost of solar electricity and give investors confidence in the technology. NREL led the workshop, which was sponsored by the U.S. Department of Energy (DOE) Solar Energy Technologies Program (Solar Program).

Relevance:

100.00%

Publisher:

Abstract:

The worldwide "hyper-connection" of any object around us is the challenge that promises to cover the paradigm of the Internet of Things. If the Internet has colonized the daily life of more than 2000 million1 people around the globe, the Internet of Things faces of connecting more than 100000 million2 "things" by 2020. The underlying Internet of Things’ technologies are the cornerstone that promises to solve interrelated global problems such as exponential population growth, energy management in cities, and environmental sustainability in the average and long term. On the one hand, this Project has the goal of knowledge acquisition about prototyping technologies available in the market for the Internet of Things. On the other hand, the Project focuses on the development of a system for devices management within a Wireless Sensor and Actuator Network to offer some services accessible from the Internet. To accomplish the objectives, the Project will begin with a detailed analysis of various “open source” hardware platforms to encourage creative development of applications, and automatically extract information from the environment around them for transmission to external systems. In addition, web platforms that enable mass storage with the philosophy of the Internet of Things will be studied. The project will culminate in the proposal and specification of a service-oriented software architecture for embedded systems that allows communication between devices on the network, and the data transmission to external systems. Furthermore, it abstracts the complexities of hardware to application developers. RESUMEN. La “hiper-conexión” a nivel mundial de cualquier objeto que nos rodea es el desafío al que promete dar cobertura el paradigma de la Internet de las Cosas. Si la Internet ha colonizado el día a día de más de 2000 millones1 de personas en todo el planeta, la Internet de las Cosas plantea el reto de conectar a más de 100000 millones2 de “cosas” para el año 2020. Las tecnologías subyacentes de la Internet de las Cosas son la piedra angular que prometen dar solución a problemas globales interrelacionados como el crecimiento exponencial de la población, la gestión de la energía en las ciudades o la sostenibilidad del medioambiente a largo plazo. Este Proyecto Fin de Carrera tiene como principales objetivos por un lado, la adquisición de conocimientos acerca de las tecnologías para prototipos disponibles en el mercado para la Internet de las Cosas, y por otro lado el desarrollo de un sistema para la gestión de dispositivos de una red inalámbrica de sensores que ofrezcan unos servicios accesibles desde la Internet. Con el fin de abordar los objetivos marcados, el proyecto comenzará con un análisis detallado de varias plataformas hardware de tipo “open source” que estimulen el desarrollo creativo de aplicaciones y que permitan extraer de forma automática información del medio que les rodea para transmitirlo a sistemas externos para su posterior procesamiento. Por otro lado, se estudiarán plataformas web identificadas con la filosofía de la Internet de las Cosas que permitan el almacenamiento masivo de datos que diferentes plataformas hardware transfieren a través de la Internet. 
El Proyecto culminará con la propuesta y la especificación una arquitectura software orientada a servicios para sistemas empotrados que permita la comunicación entre los dispositivos de la red y la transmisión de datos a sistemas externos, así como facilitar el desarrollo de aplicaciones a los programadores mediante la abstracción de la complejidad del hardware.
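As a minimal sketch of the service-oriented idea (hypothetical names, Python standard library only, and merely illustrative of the architecture the Project specifies): a device abstraction hides the hardware, and a small HTTP service exposes its readings to the Internet.

```python
# Toy device-as-a-service: a sensor abstraction behind an HTTP endpoint.
import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

class TemperatureSensor:
    """Stands in for real hardware; a device driver would replace read()."""
    def read(self) -> float:
        return 20.0 + random.random() * 5.0  # simulated degrees Celsius

SENSOR = TemperatureSensor()

class DeviceService(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/temperature":
            body = json.dumps({"celsius": SENSOR.read()}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    # Any Internet client can now GET http://<host>:8080/temperature
    HTTPServer(("0.0.0.0", 8080), DeviceService).serve_forever()
```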

Relevance:

100.00%

Publisher:

Abstract:

This Thesis addresses the efficiency problems of electrical grids from the consumption point of view. In particular, such efficiency is improved by means of aggregated consumption smoothing.
This objective of consumption smoothing entails two major improvements in the use of electrical grids: (i) in the short term, a better use of the existing infrastructure and (ii) in the long term, a reduction of the infrastructure required to supply the same energy needs. In addition, this Thesis faces a new energy paradigm, where the presence of distributed generation is widespread over the electrical grids, in particular Photovoltaic (PV) generation. This kind of energy source affects the operation of the grid by increasing its variability, which implies that a high penetration rate of photovoltaic electricity is pernicious for electrical grid stability. This Thesis seeks to smooth the aggregated consumption considering this energy source. Therefore, not only is the efficiency of the electrical grid improved, but the penetration of photovoltaic electricity into the grid can also be increased. This proposal brings great benefits in the economic, social, and environmental fields.

The actions that influence the way consumers use electricity in order to achieve energy savings or higher efficiency in energy use are called Demand-Side Management (DSM). This Thesis proposes two different DSM algorithms to meet the aggregated consumption smoothing objective. The difference between the two DSM algorithms lies in the framework in which they take place: the local framework and the grid framework. Depending on the DSM framework, the energy goal and the procedure to reach this goal are different.

In the local framework, the DSM algorithm only uses local information. It does not take into account other consumers or the aggregated consumption of the electrical grid. Although this statement may differ from the general definition of DSM, it makes sense in local facilities equipped with Distributed Energy Resources (DERs). In this case, the DSM is focused on the maximization of local energy use, reducing grid dependence. The proposed DSM algorithm significantly improves the self-consumption of the local PV generator. Simulated and real experiments show that self-consumption serves as an important energy management strategy, reducing electricity transport and encouraging the user to control his energy behavior. However, despite all the advantages of increased self-consumption, it does not contribute to the smoothing of the aggregated consumption. The effects of the local facilities on the electrical grid are studied when the DSM algorithm is focused on self-consumption maximization. This approach may have undesirable effects, increasing the variability of the aggregated consumption instead of reducing it. This effect occurs because the algorithm considers only local variables in the local framework. The results suggest that coordination between these facilities is required. Through this coordination, the consumption should be modified by taking into account other elements of the grid and seeking to smooth the aggregated consumption.

In the grid framework, the DSM algorithm takes into account both local and grid information. This Thesis develops a self-organized algorithm to manage the consumption of an electrical grid in a distributed way. The goal of this algorithm is the smoothing of the aggregated consumption, as in classical DSM implementations. The distributed approach means that the DSM is performed from the consumers' side without following direct commands issued by a central entity.
Therefore, this Thesis proposes a parallel management structure rather than a hierarchical one as in classical electrical grids. This implies that a coordination mechanism between facilities is required, and this Thesis seeks to minimize the amount of information necessary for that coordination. To achieve this objective, two collective coordination techniques have been used: coupled oscillators and swarm intelligence (a toy illustration of the former follows below). The combination of these techniques to coordinate a system with the characteristics of the electrical grid is itself a novel approach; this coordination objective is therefore a contribution not only to the energy management field but to the field of collective systems as well. Results show that the proposed DSM algorithm reduces the difference between the maximums and minimums of the electrical grid in proportion to the amount of energy controlled by the algorithm. Thus, the greater the amount of energy controlled by the algorithm, the greater the improvement in the efficiency of the electrical grid. In addition to the advantages resulting from the smoothing of the aggregated consumption, other advantages arise from the distributed approach followed in this Thesis. These advantages are summarized in the following features of the proposed DSM algorithm:

• Robustness: in a centralized system, a failure or breakage of the central node causes a malfunction of the whole system. Managing a grid from a distributed point of view implies that there is no central control node, so a failure in any facility does not affect the overall operation of the grid.

• Data privacy: the use of a distributed topology means that there is no central node holding sensitive information about all consumers. This Thesis goes a step further: the proposed DSM algorithm does not use specific information about consumer behaviors, and the coordination between facilities is completely anonymous.

• Scalability: the proposed DSM algorithm operates with any number of facilities, allowing the incorporation of new facilities without affecting its operation.

• Low cost: the proposed DSM algorithm adapts to current grids without topological requirements. In addition, every facility calculates its own management with low computational requirements, so a central node with high computational power is not required.

• Quick deployment: the scalability and low cost of the proposed DSM algorithm allow a quick deployment. No complex deployment schedule is required for this system.
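The flavor of the coupled-oscillator coordination can be illustrated with a toy simulation (hypothetical parameters, not the thesis's actual algorithm): with repulsive Kuramoto coupling, facility phases spread apart, so deferrable loads fire at staggered times and the aggregated consumption flattens.

```python
# Toy desynchronization of deferrable loads via repulsive Kuramoto coupling.
import numpy as np

rng = np.random.default_rng(1)
N, STEPS, DT = 50, 2000, 0.01
K = -0.5                       # negative coupling -> phases repel (spread out)
theta = rng.uniform(0, 2 * np.pi, N)
omega = np.full(N, 2 * np.pi)  # one load cycle per unit time for every facility

for _ in range(STEPS):
    # Kuramoto update: dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)
    coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta += DT * (omega + K / N * coupling)

# Each facility draws power while its phase is inside a narrow window;
# staggered phases yield a nearly flat aggregate profile.
t = np.linspace(0, 2 * np.pi, 360)
load = sum(np.where(np.cos(t - th) > 0.95, 1.0, 0.0) for th in theta)
print(f"aggregate peak {load.max():.0f}, mean {load.mean():.2f}")
```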