908 results for security of electricity supply
Abstract:
Electricity spot prices have always been a demanding data set for time series analysis, mostly because of the non-storability of electricity. This feature, which sets electric power apart from other commodities, causes pronounced price spikes. Moreover, the last several years in the financial world suggest that 'spiky' behaviour of time series is no longer an exception but rather a regular phenomenon. The purpose of this paper is to seek patterns and relations within electricity price outliers and to verify how they affect the overall statistics of the data. The study uses techniques such as the classical Box-Jenkins approach, DFT-based series smoothing and GARCH models. The results obtained for two geographically different price series show that the patterns in the occurrence of outliers are not straightforward. Additionally, there seems to be no rule that would predict the appearance of a spike from volatility, while the reverse effect is quite prominent. It is concluded that spikes cannot be predicted from the price series alone; some geographical and meteorological variables probably need to be included in the modelling.
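As a rough illustration of the kind of tools mentioned above (a spike-flagging heuristic and a GARCH(1,1) volatility fit), the sketch below uses pandas and the arch package in Python; the threshold, window length and model orders are illustrative assumptions, not the paper's actual specification.

    import numpy as np
    import pandas as pd
    from arch import arch_model

    def flag_spikes(prices: pd.Series, k: float = 3.0) -> pd.Series:
        """Flag observations lying more than k standard deviations away
        from a rolling weekly median as price spikes (one common heuristic)."""
        med = prices.rolling(168, center=True, min_periods=24).median()
        resid = prices - med
        return (resid.abs() > k * resid.std()).astype(int)

    def fit_garch(prices: pd.Series):
        """Fit an AR(1)-GARCH(1,1) model to log-returns of the price series.
        Assumes strictly positive prices."""
        returns = 100 * np.log(prices / prices.shift(1)).dropna()
        model = arch_model(returns, vol="GARCH", p=1, q=1, mean="AR", lags=1)
        return model.fit(disp="off")

    # Usage (with an hourly spot price series `prices`):
    # spikes = flag_spikes(prices)
    # res = fit_garch(prices)
    # print(res.summary())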
Abstract:
Deregulation of the electricity sector opened electricity sale and production to competitive forces, while the network businesses, electricity transmission and distribution, were recognised as natural monopolies. Deregulation was accompanied by efficiency-oriented thinking across the whole electricity supply industry. For electricity distribution this meant a transition from a public service towards a profit-driven business guided by economic regulation. Regulation is the primary means to enforce societal and other goals in the regulated monopoly sector. The design of economic regulation is concerned with two main attributes: the end-customer price and the quality of electricity distribution services. Regulation limits the costs of the regulated company but also defines the desired quality of monopoly services. The characteristics of the regulatory framework and the incentives it provides are therefore decisive for the electricity distribution sector. Regulation is not a static factor; changes in regulatory practices cause discontinuity points, which in turn generate risks. A variety of social and environmental concerns, together with technological advancements, have emphasised the relevance of quality regulation, which is expected to lead to the large-scale replacement of overhead lines with underground cables. The electricity network construction business is therefore currently witnessing revolutionary changes in its competitive landscape. In a business characterised by high statutory involvement and a high level of sunk costs, recognising and understanding the regulatory risks becomes a key success factor. In response, electricity distribution companies have turned to outsourcing to attain efficiency and quality goals. This doctoral thesis addresses the impacts of regulatory risks on electricity network construction, which is a commonly outsourced activity in the electricity distribution sector. The chosen research approach is characterised as action-analytical research, because regulatory risks depend greatly on the individual nature of the regulatory regime applied in the electricity distribution sector. The main contribution of this doctoral thesis is a concept for recognising and managing the business risks stemming from economic regulation. The degree of outsourcing in the sector is expected to increase in the years to come. The results of the research provide new knowledge for managing regulatory risks when outsourcing services.
Abstract:
In the Russian Wholesale Market, electricity and capacity are traded separately. Capacity is a special good whose sale obliges suppliers to keep their generating equipment ready to produce the quantity of electricity indicated by the System Operator. Capacity trading was established to maintain reliable and uninterrupted delivery of electricity in the wholesale market. The price of capacity reflects the ongoing investment in the construction, modernisation and maintenance of power plants. The sale of capacity thus creates favourable conditions for attracting investment in the energy sector, because it assures investors that their investments will be recovered.
Abstract:
Objective: The paper analyses the supply and utilisation of hemodynamic services in Rio de Janeiro, Brazil. Methods: This is an exploratory study using data obtained from official Brazilian databases. Supply was analysed from 1999 to 2009, and utilisation from 2008 to October 2012. Results: Purchases of hemodynamic equipment have grown since 1999. The private sector concentrates most of the supply, but it has been reducing its availability to SUS. The ratio between population and equipment in Brazil exceeds that of some rich countries. On the supply side, in 2009 there was a supply rate of 1.4 units per million inhabitants in RJ state, higher than the Brazilian rate of 3.4, but the rates are similar for public patients. Conclusion: Interventional cardiology procedures have increased in the state, but unevenly: public hospitals in Rio de Janeiro have mostly reduced their output, while private ones have increased theirs. As a result, SUS users perform their procedures at great distances.
Abstract:
Ambitious energy targets set by the EU put pressure on increasing the share of renewable electricity supply in this and the coming decades, and some EU member countries have therefore boosted renewable generation capacity by implementing national subsidy schemes. In this study, two different change approaches for increasing renewable energy supply and improving the self-sufficiency of supply are assessed with respect to their impacts on the power system, the electricity market and electricity generation costs in Finland. The study finds that current electricity generation costs are high compared with earning opportunities from a present-day investor's perspective. In addition, expected consumption growth and price forecasts do not encourage investment in new generation capacity. The revolutionary transition path is driven by administrative and political interventions to achieve the energy targets. The evolutionary transition path is driven by market-based mechanisms, such as the market itself and the emission trading scheme. The study finds that in the revolutionary transition path the operation of market-based mechanisms is distorted to some extent, and this path is likely to require more public financial resources than the evolutionary transition path. In the evolutionary transition path the energy targets are not achieved as quickly, but market-based mechanisms function better and the investment environment remains more stable than in the revolutionary transition path.
Abstract:
Time series of hourly electricity spot prices have peculiar properties. Electricity is by its nature difficult to store and has to be available on demand. There are many reasons for wanting to understand correlations in price movements, for example risk management. The analysis in this thesis has been applied to New Zealand nodal electricity prices: offer prices (from 29 May 2002 to 31 March 2009) and final prices (from 1 January 1999 to 31 March 2009). The paper reviews natural factors, such as the location of a node and the type of generation at a node, that affect the correlation between nodal prices. It was found that the geographical factor affects the correlation between nodes more than the others, and the correlated nodes were therefore visualised. For the offer prices, however, a clear separation of correlated and uncorrelated nodes was not obtained. It is concluded that the location factor most strongly affects the correlation of electricity nodal prices; the problems in visualisation are probably associated with power losses when power is transmitted over long distances.
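The core computation described here, pairwise correlation of nodal price series followed by grouping of strongly correlated nodes, can be sketched as follows; the data layout, distance threshold and average-linkage clustering are illustrative assumptions rather than the thesis's actual procedure.

    import pandas as pd
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    def correlate_nodes(prices: pd.DataFrame) -> pd.DataFrame:
        """Pairwise Pearson correlation between nodal price series.
        `prices` has one column per node and one row per trading period."""
        return prices.corr()

    def group_nodes(corr: pd.DataFrame, threshold: float = 0.4) -> dict:
        """Group nodes whose price series are strongly correlated,
        using 1 - correlation as a distance for hierarchical clustering."""
        dist = 1.0 - corr
        condensed = squareform(dist.values, checks=False)
        tree = linkage(condensed, method="average")
        labels = fcluster(tree, t=threshold, criterion="distance")
        return dict(zip(corr.columns, labels))

    # Usage: groups = group_nodes(correlate_nodes(nodal_prices))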
Abstract:
The aim of this work was to examine how changes in the regulatory methods for electricity distribution network operations affect the finances of Loiste Sähköverkko Oy during the fourth and fifth regulatory periods. For the analysis, a financial model was built that models the network company's finances up to 2040. The model captured the effect of all incentives except the innovation and security-of-supply incentives. The basic principle of the financial model was that whatever cannot be covered by distribution revenues is financed with debt, once a minimum cash flow level and an investment level have been chosen. Four different network scenarios were examined with the financial model: a scenario following the development plan, an accelerated development-plan scenario, a cabling-focused scenario and a maintenance-focused scenario. The development of the network's value in each scenario was modelled with Loiste Sähköverkko Oy's investment model and described for the financial modelling by the development of the replacement value, the present (depreciated) value, investments and straight-line depreciation up to 2029. Based on the results, in the development-plan scenario the amount of debt remains reasonable and allows a reasonable cash flow at the end of the review period. In the accelerated development-plan scenario and the cabling-focused scenario the amount of debt grows significantly, which may increase business risks, but these scenarios also allow a higher cash flow at the end of the review period. In the maintenance-focused scenario the amount of debt is low, but the cash flow also remains low until the end of the review period.
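The financing rule described above (any shortfall below a chosen minimum cash flow is covered with new debt) can be sketched per modelled year roughly as follows; the function and parameter names are hypothetical, and the real model naturally includes the regulatory incentives, network value and depreciation described in the abstract.

    def yearly_financing(revenue, opex, capex, interest_rate, debt, min_cash_flow):
        """One modelled year: distribution revenue covers operating costs,
        interest and investments; any shortfall below the chosen minimum
        cash flow is financed with new debt. Returns (cash flow, new debt level)."""
        cash_flow = revenue - opex - interest_rate * debt - capex
        new_debt = max(0.0, min_cash_flow - cash_flow)
        return cash_flow + new_debt, debt + new_debt

    # Example: roll the rule forward over a scenario of yearly figures (MEUR).
    debt = 0.0
    for year in range(2024, 2041):
        cash_flow, debt = yearly_financing(
            revenue=30.0, opex=12.0, capex=15.0,
            interest_rate=0.04, debt=debt, min_cash_flow=1.0)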
Abstract:
This work studies the regulation of a closed distribution network from the perspective of one Finnish industrial electricity network and compares it with the existing regulatory model of the Energy Authority for distribution networks. The work examines how well the regulatory model developed by the network company works and how it fulfils the characteristics of a good regulatory model. The purpose is not to propose a new regulatory model but to study and evaluate existing ones. The work reviews the existing regulatory model of the third regulatory period as well as the changes proposed for the coming regulatory periods. Regulation is also given background through general economic theory and theories of regulation. Against this background, the work examines the special characteristics of industrial electricity networks and, further, the high operational requirements of the studied industrial network and the reasons for such requirements. The tariff model used by the network company is also described, as it is closely tied to the regulation. The work demonstrated that the model works in this kind of operating environment, where the requirement for operational reliability is very high. The findings also highlight the long-term orientation and predictability of the model. Calculation examples are used to assess the reasonableness of connection pricing for the customer and the functioning of the model in general. These analyses showed that, at a general level, connection pricing is reasonable for the customer and allows more efficient use of the customer's capital than operating without the network company. As areas for development, the work highlights the treatment of harm caused by interruptions and the fact that there is currently no functioning metric for this kind of network environment with which to assess how well the network company performs.
Abstract:
The electricity distribution sector will face significant changes in the future. Increasing reliability demands will call for major network investments. At the same time, electricity end-use is undergoing profound changes. The changes include future energy technologies and other advances in the field. New technologies such as microgeneration and electric vehicles will have different kinds of impacts on electricity distribution network loads. In addition, smart metering provides more accurate electricity consumption data and opportunities to develop sophisticated load modelling and forecasting approaches. Thus, there are both demands and opportunities to develop a new type of long-term forecasting methodology for electricity distribution. The work concentrates on the technical and economic perspectives of electricity distribution. The doctoral dissertation proposes a methodology to forecast electricity consumption in the distribution networks. The forecasting process consists of a spatial analysis, clustering, end-use modelling, scenarios and simulation methods, and the load forecasts are based on the application of automatic meter reading (AMR) data. The developed long-term forecasting process produces power-based load forecasts. By applying these results, it is possible to forecast the impacts of changes on electrical energy in the network, and further, on the distribution system operator’s revenue. These results are applicable to distribution network and business planning. This doctoral dissertation includes a case study, which tests the forecasting process in practice. For the case study, the most prominent future energy technologies are chosen, and their impacts on the electrical energy and power on the network are analysed. The most relevant topics related to changes in the operating environment, namely energy efficiency, microgeneration, electric vehicles, energy storages and demand response, are discussed in more detail. The study shows that changes in electricity end-use may have radical impacts both on electrical energy and power in the distribution networks and on the distribution revenue. These changes will probably pose challenges for distribution system operators. The study suggests solutions for the distribution system operators on how they can prepare for the changing conditions. It is concluded that a new type of load forecasting methodology is needed, because the previous methods are no longer able to produce adequate forecasts.
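One step of such a forecasting process, clustering customers by their AMR load profiles before end-use modelling, might look roughly like the following sketch; the array layout, normalisation and choice of k-means are assumptions for illustration, not the dissertation's actual method.

    import numpy as np
    from sklearn.cluster import KMeans

    def cluster_load_profiles(amr_data: np.ndarray, n_clusters: int = 8):
        """Cluster customers by their normalised average daily load profile.
        `amr_data` has shape (customers, hours) of hourly AMR readings and is
        assumed to cover whole days with non-zero consumption per customer."""
        daily = amr_data.reshape(amr_data.shape[0], -1, 24).mean(axis=1)
        norm = daily / daily.sum(axis=1, keepdims=True)
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
        labels = km.fit_predict(norm)
        return labels, km.cluster_centers_

    # Usage: labels, typical_profiles = cluster_load_profiles(readings)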
Abstract:
The vast majority of our contemporary society owns a mobile phone, which has led to a dramatic rise in the number of networked computers in recent years. Security issues in these computers have followed the same trend, and nearly everyone is now affected by them. How could the situation be improved? For software engineers, an obvious answer is to build software with security in mind. The problem is how to define secure software and how to measure security. This thesis divides the problem into three research questions. First, how can we measure the security of software? Second, what types of tools are available for measuring security? And finally, what do these tools reveal about the security of software? Measuring tools of this kind are commonly called metrics. The thesis is focused on the perspective of software engineers in the software design phase. Focusing on the design phase means that code-level semantics and programming-language specifics are not discussed in this work. Organizational policy, management issues and the software development process are also out of scope. The first two research problems were studied through a literature review, and the third through a case study. The target of the case study was a Java-based email server, Apache James, whose changelog, security issue history and source code were available. The research revealed that there is a consensus on the terminology of software security. Security verification activities are commonly divided into evaluation and assurance. The focus of this work was on assurance, which means verifying one's own work. There are 34 metrics available for security measurement, of which five are evaluation metrics and 29 are assurance metrics. We found, however, that the general quality of these metrics was not good. Only three metrics in the design category passed the inspection criteria and could be used in the case study. The metrics claim to give quantitative information on the security of software, but in practice they were limited to comparing different versions of the same software. Apart from being relative, the metrics were unable to detect security issues or point out problems in the design, and interpreting their results was difficult. In conclusion, the general state of software security metrics leaves a lot to be desired. The metrics studied had both theoretical and practical issues and are not suitable for daily engineering workflows. They nevertheless provide a basis for further research, since they point out the areas where security metrics need to improve if verification of security at the design level is to be achieved.
Abstract:
Biodiesel production from microalgae is attractive for several reasons. The first chapter reviews a range of arguments for and against the use of microalgae for biofuel production. Algae can be cultivated on non-arable land, with non-potable water and basic nutrients. Moreover, the biomass produced by algae is considerably greater than that of vascular plants. Several species have a lipid content in the form of triacylglycerols (TAGs) that can amount to 30-40% of the dry weight of the biomass, considerably higher than the oil content of the seed crops currently used for first-generation biodiesel. However, practical and inexpensive biofuel production from microalgae requires overcoming several obstacles, including the development of efficient low-cost cultivation systems, low-energy harvesting techniques, and environmentally benign, inexpensive oil extraction and conversion methods. The second chapter explores one of the important questions raised in the first chapter: the selection of a strain for cultivation. A collection of freshwater microalgae strains native to Quebec was established and examined for its physiological diversity. The collection consists of one hundred strains, which proved very heterogeneous in terms of growth when cultivated at 10±2 °C or 22±2 °C on a secondary effluent from a municipal wastewater treatment plant (WW) or on the defined medium Bold's Basal Medium (BBM). Scatter plots were used to study the physiological diversity within the collection, with several interesting results. There was appreciable dispersion in growth rates across the different media types, independently of temperature. Interestingly, considering that all isolates had initially been enriched on BBM, the distribution was rather symmetric around the iso-growth line, suggesting that enrichment on BBM did not bias the strains towards growth on this medium relative to WW. Likewise, considering that the isolates had first been enriched at 22°C, it is rather surprising that the distribution of specific growth rates is also symmetric around the iso-growth line, with roughly equal numbers of isolates on either side. Thus, enrichment at 22°C does not appear to bias the cells towards growth at that temperature rather than at 10°C. The scatter plots obtained when the lipid percentage of cultures grown on BBM was compared with that of cultures grown on WW at either 10°C or 22°C make it clear that lipid production is favoured by cultivation on WW at both temperatures, and that lipid production does not appear to be particularly favoured by either temperature. When the collection was examined for differences between sampling sites, statistical analysis showed roughly the same degree of physiological diversity in the samples from the two sites. The third chapter continues the evaluation of algae cultivation in Quebec.
The use of industrial wastes rich in mineral nutrients and carbon sources to increase final microalgal biomass and lipid yield at low cost is an important strategy for making algal biofuel technology viable. Using strains from the Université de Montréal microalgae collection, this work shows for the first time that microalgae strains can grow in the presence of xylose, the main carbon source found in wastewater from pulp and paper mills, with a 2.8-fold increase in growth rate compared with photoautotrophic growth, reaching up to µ = 1.1/day. In the presence of glycerol, growth rates reached values as high as µ = 1.52/day. Lipid production increased by up to 370% with glycerol and 180% with xylose for strain LB1H10, showing that this strain is suitable for further biofuel development under mixotrophic culture. Adding xylose to algal cultures showed some unexpected effects. The fourth chapter of this work sought to understand these effects on microalgal growth and lipid production. Four native wild strains were observed daily, before and after xylose addition, by flow cytometry. With some Chlorella strains, xylose addition induced a rapid increase in lipid accumulation (up to 3.3-fold) during the first six to twelve hours. At later times, the cells showed a decrease in chlorophyll content, size and number. In contrast, the only member of the Scenedesmaceae family was able to take advantage of this carbon source under mixotrophic or heterotrophic culture without apparent negative effects. These results suggest that xylose could be used before harvest to stimulate an increase in the lipid content of the algal culture, either in a continuous or a two-stage culture system, while enabling bioremediation of pulp and paper mill wastewater. The fifth chapter addresses another important industrial waste: carbon dioxide and greenhouse gases. More than half of the carbon dioxide emitted into the atmosphere each day is released by stationary processes, either for electricity generation or for industrial manufacturing. The CO2 released by these sources could be mitigated through bioremediation with microalgae, a putative biofuel feedstock. Nevertheless, every smokestack emits a different gas, and the selection of algal strains is vital. This work therefore proposes using the conditions of a particular site for bioprospecting algal strains to be used in the bioremediation process. The results show that a simple enrichment step during isolation can select strains that were on average 43.2% better at producing biomass than strains isolated by traditional methods. The strains isolated in this work were able to assimilate carbon dioxide at an above-average rate compared with recent results in the literature.
Abstract:
The study is a close scrutiny of the process of investigation of offences in India, along with an analysis of the powers and functions of the investigating agency. Offences that are prejudicial to the sovereignty, integrity and security of the nation, or to its friendly relations with foreign states, are generally called offences against national security. Being prejudicial to the very existence of the nation and its legal system, such offences are particularly grave. As early as 1971, the Law Commission of India pointed out the need to treat offences relating to national security, and their perpetrators, on a totally different procedural footing. It recommended that all offences in this category be brought under the purview of a single enactment so that they can be confronted effectively. The discrepancies in and inadequacies of the criminal justice system in India, insofar as they relate to the investigation of offences against national security, are examined and reforms are suggested. The quality of criminal justice is closely linked with the caliber of the prosecution system, and many acquittals in the courts can be ascribed not only to poor investigation but also to poor prosecution.
Abstract:
The Internet has become a vital part of day-to-day life, owing to the revolutionary changes it has brought about in various fields. Dependence on the Internet as an information highway and knowledge bank is increasing exponentially, so that going back is beyond imagination. Critical information is also transferred through the Internet. This widespread use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a vital need for information security. The Internet has also become an active field for crackers and intruders, and the whole development in this area can become null and void if fool-proof security of the data, without any chance of it being adulterated, is not ensured. It is hence a challenge for the professional community to develop systems that ensure the security of data sent through the Internet. Stream ciphers, hash functions and message authentication codes play vital roles in providing security services such as confidentiality, integrity and authentication of data sent through the Internet. Several popular and dependable techniques have been in wide use for quite a long time, but this long-term exposure makes them vulnerable to successful or near-successful attacks. Hence there is a need to develop new algorithms with better security. Studies were therefore conducted on the various types of algorithms used in this area, with a focus on identifying the properties that impart security. Using the insight derived from these studies, new algorithms were designed. Their performance was then studied, followed by the modifications necessary to yield an improved system consisting of a new stream cipher algorithm MAJE4, a new hash code JERIM-320 and a new message authentication code MACJER-320. Detailed analysis and comparison with existing popular schemes were also carried out to establish the security levels. The Secure Sockets Layer (SSL) / Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet. The cryptographic algorithms RC4 and HMAC have been used to achieve security services such as confidentiality and authentication in SSL/TLS, but recent attacks on RC4 and HMAC have raised questions about the reliability of these algorithms. MAJE4 and MACJER-320 have therefore been proposed as substitutes for them. Detailed studies on the performance of these new algorithms were carried out, and it has been observed that they are dependable alternatives.
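The abstract does not give the constructions of MAJE4 or MACJER-320, so the sketch below only illustrates the role a message authentication code plays in the scenario described, using the standard HMAC construction available in Python's hmac and hashlib modules.

    import hmac
    import hashlib

    def make_tag(key: bytes, message: bytes) -> bytes:
        """Compute an HMAC-SHA256 authentication tag for `message`."""
        return hmac.new(key, message, hashlib.sha256).digest()

    def verify_tag(key: bytes, message: bytes, tag: bytes) -> bool:
        """Constant-time check that `tag` authenticates `message`."""
        return hmac.compare_digest(make_tag(key, message), tag)

    key = b"shared secret key"
    msg = b"transfer 100 units to account 42"
    tag = make_tag(key, msg)
    assert verify_tag(key, msg, tag)            # authentic message accepted
    assert not verify_tag(key, msg + b"0", tag)  # tampered message rejected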
Abstract:
Coordination among supply chain members is essential for better supply chain performance, and an effective way to improve coordination is to implement proper coordination mechanisms. The primary objective of this research is to study the performance of a multi-level supply chain while using selected coordination mechanisms separately, and in combination, under lost sale and backorder cases. The coordination mechanisms used in this study are price discount, delay in payment and different types of information sharing. Mathematical modelling and simulation modelling are used to analyse the performance of the supply chain under these mechanisms. Initially, a three-level supply chain consisting of a supplier, a manufacturer and a retailer was used to study the combined effect of price discount and delay in payment on supply chain profit using mathematical modelling. This study showed that implementing the individual mechanisms improves the performance of the supply chain compared with no coordination, and that when more than one mechanism is used in combination, performance in most cases improves further. The three-level supply chain considered in the mathematical modelling was then extended to a three-level network supply chain consisting of four retailers, two wholesalers and a manufacturer with an infinite part supplier. The performance of this network supply chain was analysed under both lost sale and backorder cases using simulation modelling with the same mechanisms, price discount and delay in payment. This study also showed that the performance of the supply chain improves significantly when the mechanisms are used in combination, as obtained earlier. It was found that the effect (increase in profit) of delay in payment, and of the combination of price discount and delay in payment, on supply chain profit is relatively high in the lost sale case. Sensitivity analysis showed that the retailer's order cost plays a major role in the performance of the supply chain, as it determines the order quantities of the other players, and that supply chain profit changes proportionally with a change in the rate of return of any player. In the case of price discount, the elasticity of demand is an important factor in improving the performance of the supply chain. It was also found that a change in the permissible delay in payment given by the seller to the buyer affects supply chain profit more than the delay in payment availed by the buyer from the seller. In continuation of the above, the performance of a four-level supply chain consisting of a manufacturer, a wholesaler, a distributor and a retailer, with information sharing as the coordination mechanism, was studied under lost sale and backorder cases using a simulation game with live players. In this study, the best performance was obtained when 'demand and supply chain performance' information was shared, compared with the other seven types of information sharing, including the traditional method. The study also revealed that the effect of information sharing on supply chain performance is higher in the lost sale case than in the backorder case. The in-depth analysis in this part of the study showed that a lack of information sharing does not always result in the bullwhip effect; instead, it produced a large increase in lost-sales cost or backorder cost, which is also unfavourable for the supply chain. The overall analysis quantified the extent of improvement in supply chain performance under the different cases, and the sensitivity analysis revealed useful insights about the decision variables that will help supply chain management practitioners take appropriate decisions.
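As a rough, hypothetical illustration of the kind of profit accounting involved when a price discount is combined with a permissible delay in payment, the sketch below computes a retailer's annual profit with standard EOQ-style ordering and holding costs plus interest earned on revenue during the credit period; the functional form and parameter names are ours for illustration, not the thesis model.

    def retailer_annual_profit(price, unit_cost, discount, delay_days,
                               annual_demand, order_qty, order_cost,
                               holding_rate, interest_rate):
        """Illustrative annual retailer profit under a supplier price
        discount and a permissible delay in payment (credit period)."""
        cost = unit_cost * (1 - discount)                 # discounted purchase price
        purchasing = cost * annual_demand
        ordering = order_cost * annual_demand / order_qty  # D/Q orders per year
        holding = holding_rate * cost * order_qty / 2      # average stock Q/2
        credit_gain = interest_rate * price * annual_demand * delay_days / 365
        return price * annual_demand - purchasing - ordering - holding + credit_gain

    # Compare no coordination against combined discount + delay in payment:
    base = retailer_annual_profit(10, 6, 0.00, 0,  5000, 500, 50, 0.2, 0.1)
    both = retailer_annual_profit(10, 6, 0.05, 30, 5000, 500, 50, 0.2, 0.1)
    print(base, both)  # the combined mechanisms yield the higher profit here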
Abstract:
The principal objective of this paper is to develop a methodology for formulating a master plan for renewable-energy-based electricity generation in The Gambia, Africa. Such a master plan aims to develop and promote renewable sources of energy as an alternative to conventional forms of energy for generating electricity in the country. A tailor-made methodology for preparing a 20-year renewable energy master plan focused on electricity generation is proposed, and it is followed and verified throughout the dissertation as it is applied to The Gambia. The main inputs to the proposed master plan are (i) an energy demand analysis and forecast over 20 years and (ii) a resource assessment for different renewable energy alternatives, including their related power supply options. The energy demand forecast is based on a mix of top-down and bottom-up methodologies, and its results are important data on future requirements for (primary) energy sources. The electricity forecast is separated into projections at the sent-out level and at the end-user level. On the supply side, solar, wind and biomass are investigated as energy sources in terms of their technical potential and economic benefits for The Gambia; other criteria, i.e. environmental and social ones, are not considered in the evaluation. Diverse supply options are proposed and technically designed on the basis of the assessed renewable energy potential. This process includes the evaluation of the different available conversion technologies and concludes with the dimensioning of power supply solutions, taking into consideration technologies that are applicable and appropriate under the special conditions of The Gambia. Balancing these two inputs (demand and supply) gives a quantitative indication of the substitution potential of renewable generation alternatives in a primarily fossil-fuel-based electricity generation system, as well as of the fuel savings due to the deployment of renewable resources. The identified renewable energy supply options are then ranked according to the outcome of an economic analysis. Based on this ranking, and other considerations, a 20-year investment plan, broken down into five-year investment periods, is prepared; it consists of individual renewable energy projects for electricity generation, basically on-grid renewable energy applications. Finally, a priority project from the master plan portfolio is selected for deeper analysis. Since solar PV is the most relevant proposed technology, a PV power plant integrated into the fossil-fuel-powered main electrical system of The Gambia is chosen as the priority project. Its economic competitiveness is analysed under current conditions, together with a sensitivity analysis with respect to future oil and new-technology market conditions.
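The economic ranking step described above could be sketched, for example, as a simple levelised cost of electricity (LCOE) comparison; the formula is the standard ratio of discounted costs to discounted energy, and the candidate figures below are placeholders, not data from the master plan.

    def lcoe(capex, annual_opex, annual_energy_kwh, lifetime_years, discount_rate):
        """Levelised cost of electricity: discounted lifetime costs
        divided by discounted lifetime energy production."""
        costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                            for t in range(1, lifetime_years + 1))
        energy = sum(annual_energy_kwh / (1 + discount_rate) ** t
                     for t in range(1, lifetime_years + 1))
        return costs / energy

    # Hypothetical candidate options (USD capex, USD/yr opex, kWh/yr), ranked by LCOE:
    options = {
        "solar_pv": lcoe(1_500_000, 20_000, 1_800_000, 20, 0.08),
        "wind":     lcoe(2_000_000, 50_000, 2_200_000, 20, 0.08),
        "biomass":  lcoe(2_500_000, 120_000, 3_000_000, 20, 0.08),
    }
    for name, value in sorted(options.items(), key=lambda kv: kv[1]):
        print(f"{name}: {value:.3f} USD/kWh")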