920 results for Distributed Control Problems
Abstract:
The purpose of this work is to compile the measurement problems of the pulp process and the possible measurement techniques for solving them. The main emphasis is on online measurement techniques. The work consists of three parts. The first part is a literature review presenting the basic measurements and control needs of a modern pulp process. It covers the entire fibre line from wood handling to bleaching, together with the chemical recovery cycle: the evaporation plant, recovery boiler, causticizing plant and lime kiln. In the second part, the measurement problems and possible measurement techniques are compiled into a "roadmap". The information was gathered by visiting three Finnish pulp mills and by interviewing equipment and measurement technology experts. Based on the interviews, there appears to be a need for a better understanding of process chemistry, which is why concentration measurements were selected as the topic for further study. The last part presents possible measurement techniques for solving the concentration measurement problems. The selected techniques are near-infrared (NIR) spectroscopy, Fourier transform infrared (FTIR) spectroscopy, online capillary electrophoresis (CE) and laser-induced plasma emission spectroscopy (LIPS). All of the techniques can be used as online-coupled process development tools. Development costs were estimated for an online device connected to process control. They range from zero person-years for the FTIR technique to five person-years for a CE device, depending on the maturity of the technique and its readiness for solving a given problem. The last part of the work also assesses the techno-economic feasibility of solving one measurement problem: washing loss measurement. Lignin content would describe the true washing loss better than the current measurements, which are based on either sodium or COD washing loss. Lignin content can be measured by UV absorption. A CE device could also be used for washing loss measurement, at least in the process development phase.
The economic analysis is based on many simplifications and is not directly suitable for supporting investment decisions. A better measurement and control system could stabilize the operation of the washing plant. An investment in a stabilizing system is profitable if the actual operating point is far enough from the cost minimum, or if washer operation fluctuates, i.e. the standard deviation of the washing loss is large. A measurement and control system costing €50,000 reaches a payback time of less than 0.5 years in unstable operation if the COD washing loss varies between 5.2 and 11.6 kg/odt around a setpoint of 8.4 kg/odt. The dilution factor then varies between 1.7 and 3.6 m3/odt around a setpoint of 2.5 m3/odt.
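The payback reasoning can be sketched as a simple undiscounted calculation. The €50,000 investment is the figure quoted above; the annual savings value in the example is a hypothetical placeholder, since the savings achievable from stabilised washer operation are not stated here.

```python
# Simple undiscounted payback-period estimate for a washing control investment.
# The 50,000 EUR investment is from the abstract; the annual savings figure
# used below is a hypothetical placeholder, not a reported value.

def payback_years(investment_eur: float, annual_savings_eur: float) -> float:
    """Undiscounted payback time in years."""
    if annual_savings_eur <= 0:
        raise ValueError("annual savings must be positive")
    return investment_eur / annual_savings_eur

# A payback under 0.5 years implies annual savings above 100,000 EUR:
print(payback_years(50_000, 120_000))  # about 0.42 years
```

A discounted analysis would lengthen the payback slightly, but at sub-year horizons the undiscounted figure is a reasonable first check.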
Abstract:
Reinsurance is one of the tools that an insurer can use to mitigate underwriting risk and thereby control its solvency. In this paper, we focus on proportional reinsurance arrangements and examine several optimization and decision problems of the insurer with respect to the reinsurance strategy. To this end, we use as decision tools not only the probability of ruin but also the deficit at ruin, given that ruin occurs. The discounted penalty function (Gerber & Shiu, 1998) is employed to calculate, as particular cases, the probability of ruin and the moments and distribution function of the deficit at ruin.
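As an illustration of these decision tools, the ruin probability and the mean deficit at ruin can be estimated by Monte Carlo simulation of a compound-Poisson surplus process under proportional reinsurance. This is a minimal sketch, not the paper's model: claims are assumed exponential, and premiums are simply scaled by the retained share, ignoring any reinsurance loading.

```python
import random

def simulate_ruin(u, premium_rate, claim_rate, mean_claim, retention,
                  horizon=50.0, n_paths=3000, seed=1):
    """Estimate the finite-horizon ruin probability and the mean deficit at
    ruin for a surplus process U(t) = u + c*t - S(t), where the insurer
    retains a proportional share `retention` of premiums and claims."""
    rng = random.Random(seed)
    ruins, total_deficit = 0, 0.0
    for _ in range(n_paths):
        t, surplus = 0.0, u
        while True:
            dt = rng.expovariate(claim_rate)               # time to next claim
            t += dt
            if t > horizon:
                break                                      # survived the horizon
            surplus += retention * premium_rate * dt       # premium income
            surplus -= retention * rng.expovariate(1.0 / mean_claim)  # retained claim
            if surplus < 0.0:
                ruins += 1
                total_deficit += -surplus                  # deficit at ruin
                break
    p_ruin = ruins / n_paths
    mean_deficit = total_deficit / ruins if ruins else 0.0
    return p_ruin, mean_deficit

# With no initial capital and a 20% premium loading, ruin is likely;
# the deficit at ruin is positive whenever ruin occurs.
p, d = simulate_ruin(u=0.0, premium_rate=1.2, claim_rate=1.0,
                     mean_claim=1.0, retention=1.0)
```

Lowering `retention` in this simplified setting merely rescales the process; in the paper, the interesting trade-offs come from the loading structure and are analysed through the discounted penalty function rather than by simulation.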
Abstract:
This Master's thesis examines policy-based network access control (NAC) solutions from an architectural point of view. The NAC solutions of the Trusted Computing Group, Microsoft Corporation, Juniper Networks and Cisco Systems are reviewed. NAC consists of a set of new and existing technologies that, based on a predefined policy, help control the network connections of devices attempting to access a protected network. In addition to user authentication, NAC can restrict network access based on device-specific properties, such as virus definitions and operating system updates, and can, within certain limits, remediate deficiencies in these in order to grant network access. NAC is a relatively new concept that lacks a precise definition. As a result, products with incomplete feature sets are currently sold under the NAC label. Standardization to guarantee interoperability between NAC components from different vendors is under way, and on this basis the solutions can be divided into those following open standards and those following vendor-specific standards. The NAC solutions presented comply with standards either to a limited extent or not at all. None of the reviewed solutions is a complete NAC, but Juniper Networks' solution emerges as the most promising candidate for further development and research at TietoEnator Processing & Networks Oy. One key problem in the NAC concept is the possibly falsified security check result that a workstation reports to the network, on which access control is partly based. One solution to this problem, among others, could be the TPM chip already found in current computers, which guarantees the correctness and integrity of the reported information.
Abstract:
Industry's growing need for higher productivity places new demands on mechanisms connected to electric motors, because these can easily lead to vibration problems due to fast dynamics. Furthermore, the nonlinear effects caused by a motor frequently reduce servo stability, which diminishes the controller's ability to predict and maintain speed. Hence, the flexibility of a mechanism and its control has become an important area of research. The basic approach in control system engineering is to assume that the mechanism connected to a motor is rigid, so that vibrations in the tool mechanism, reel, gripper or any apparatus connected to the motor are not taken into account. This can reduce the ability of the machine system to carry out its assignment and shorten the lifetime of the equipment. Hence, it is usually more important to know how the mechanism, in other words the load on the motor, behaves. A nonlinear load control method for a permanent magnet linear synchronous motor is developed and implemented in this thesis. The purpose of the controller is to drive a flexible load to the desired velocity reference as fast as possible and without awkward oscillations. The control method is based on an adaptive backstepping algorithm, with its stability ensured by the Lyapunov stability theorem. As a reference controller for the backstepping method, a hybrid neural controller is introduced in which the linear motor itself is controlled by a conventional PI velocity controller and the vibration of the associated flexible mechanism is suppressed from an outer control loop using a compensation signal from a multilayer perceptron network. To avoid the local minimum problem entailed in neural networks, the initial weights are searched for offline by means of a differential evolution algorithm. The states of the mechanical system needed by the controllers are estimated using a Kalman filter.
The theoretical results obtained from the control design are validated with a lumped mass model of the mechanism. Generalization of the mechanism allows the methods derived here to be widely implemented in machine automation. The control algorithms are first designed in a specially introduced nonlinear simulation model and then implemented in the physical linear motor using a DSP (Digital Signal Processor) application. The measurements prove that both controllers are capable of suppressing vibration, but that the backstepping method is superior in its response accuracy and stability properties.
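The state estimation step mentioned above can be sketched with a generic discrete Kalman filter. This is a minimal illustration, not the thesis implementation: the model below is a simple position-velocity system, whereas the thesis filter would use the matrices of the lumped two-mass mechanism.

```python
import numpy as np

def kalman_step(x, P, z, A, C, Q, R):
    """One predict/update cycle of a discrete Kalman filter."""
    x = A @ x                        # predict state
    P = A @ P @ A.T + Q              # predict covariance
    S = C @ P @ C.T + R              # innovation covariance
    K = P @ C.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - C @ x)          # correct with the measurement
    P = (np.eye(len(x)) - K @ C) @ P
    return x, P

# Illustrative 2-state model (position, velocity) with sample time dt;
# a two-mass mechanism model would replace these matrices.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])           # only position is measured
Q = 1e-6 * np.eye(2)                 # small process noise
R = np.array([[1e-4]])               # measurement noise (std 0.01)

rng = np.random.default_rng(0)
x_est, P = np.zeros(2), np.eye(2)
pos = 0.0
for _ in range(200):
    pos += 0.5 * dt                  # true velocity is 0.5
    z = np.array([pos + rng.normal(0.0, 0.01)])
    x_est, P = kalman_step(x_est, P, z, A, C, Q, R)
# x_est[1] should now be close to the true velocity 0.5
```

The filter estimates the unmeasured velocity from noisy position samples alone, which is exactly the role it plays when the controllers need full state information from limited sensing.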
Abstract:
Snow cover is an important control in mountain environments, and a shift of the snow-free period triggered by climate warming can strongly impact ecosystem dynamics. Changing snow patterns can have severe effects on alpine plant distribution and diversity. It thus becomes urgent to provide spatially explicit assessments of snow cover changes that can be incorporated into correlative or empirical species distribution models (SDMs). Here, we provide the first comparison of two physically based snow distribution models (PREVAH and SnowModel) for producing snow cover maps (SCMs) at a fine spatial resolution in a mountain landscape in Austria. SCMs were evaluated with SPOT-HRVIR images, and the predictions of snow water equivalent from the two models with ground measurements. Finally, SCMs of the two models were compared under a climate warming scenario for the end of the century. The predictive performances of PREVAH and SnowModel were similar when validated against the SPOT images. However, the tendency to overestimate snow cover was slightly lower with SnowModel during the accumulation period, whereas it was lower with PREVAH during the melting period. The rate of true positives during the melting period was on average two times higher with SnowModel, with a lower overestimation of snow water equivalent. Our results support recommending SnowModel for use in SDMs because it better captures persisting snow patches at the end of the snow season, which is important when modelling the response of species to long-lasting snow cover and evaluating whether they might survive under climate change.
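The kind of pixel-wise evaluation described here (true-positive rate and overestimation of snow cover against satellite-derived maps) can be sketched as follows; the function and the toy pixel vectors are illustrative, not the study's actual skill scores.

```python
def confusion_rates(predicted, observed):
    """True-positive rate and overestimation rate for binary snow maps.
    `predicted` and `observed` are equal-length sequences of 0/1 pixels
    (1 = snow). Illustrative of a SPOT-based evaluation, not the study's
    exact metrics."""
    tp = sum(1 for p, o in zip(predicted, observed) if p == 1 and o == 1)
    fp = sum(1 for p, o in zip(predicted, observed) if p == 1 and o == 0)
    snow_pixels = sum(observed)
    tpr = tp / snow_pixels if snow_pixels else 0.0
    over = fp / len(observed)  # fraction of pixels wrongly mapped as snow
    return tpr, over

pred = [1, 1, 1, 0, 1, 0, 0, 1]
obs  = [1, 1, 0, 0, 1, 0, 1, 0]
print(confusion_rates(pred, obs))  # (0.75, 0.25)
```

Comparing these two rates per season is enough to reproduce the qualitative finding above: one model can score higher true positives while another overestimates less, depending on the period.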
Abstract:
This work describes different possibilities for improving the protection and control system of a primary distribution substation. The condition and main problems of Russian power networks from the reliability point of view are described. The work studies the technologies currently used in Russia for the protection of distribution networks, together with their disadvantages. The majority of medium voltage networks (6-35 kV) have an isolated neutral point. There is still no protection available on the market that can estimate the distance to the fault in the case of an earth fault. The thesis analyses methods of earth fault distance calculation. On the basis of computer simulation, the influence of various factors on calculation accuracy is studied. Practical implementation of the method presupposes the use of a digital relay. The application of digital relays brings numerous opportunities, which are described in this work. The advantages of a system implemented on the basis of the IEC 61850 standard are also examined. Finally, the suitability of modern digital relays from the GOST standard point of view is analysed.
Abstract:
Previous genetic studies have demonstrated that natal homing shapes the stock structure of marine turtle nesting populations. However, widespread sharing of common haplotypes based on short segments of the mitochondrial control region often limits resolution of the demographic connectivity of populations. Recent studies employing longer control region sequences to resolve haplotype sharing have focused on regional assessments of genetic structure and phylogeography. Here we synthesize available control region sequences for loggerhead turtles from the Mediterranean Sea, Atlantic, and western Indian Ocean basins. These data represent six of the nine globally significant regional management units (RMUs) for the species and include novel sequence data from Brazil, Cape Verde, South Africa and Oman. Genetic tests of differentiation among 42 rookeries represented by short sequences (380 bp haplotypes from 3,486 samples) and 40 rookeries represented by long sequences (~800 bp haplotypes from 3,434 samples) supported the distinction of the six RMUs analyzed as well as recognition of at least 18 demographically independent management units (MUs) with respect to female natal homing. A total of 59 haplotypes were resolved. These haplotypes belonged to two highly divergent global lineages, with haplogroup I represented primarily by CC-A1, CC-A4, and CC-A11 variants and haplogroup II represented by CC-A2 and derived variants. Geographic distribution patterns of haplogroup II haplotypes and the nested position of CC-A11.6 from Oman among the Atlantic haplotypes invoke recent colonization of the Indian Ocean from the Atlantic for both global lineages. The haplotypes we confirmed for western Indian Ocean RMUs allow reinterpretation of previous mixed stock analysis and further suggest that contemporary migratory connectivity between the Indian and Atlantic Oceans occurs on a broader scale than previously hypothesized. 
This study represents a valuable model for conducting comprehensive international cooperative data management and research in marine ecology.
Abstract:
Simulation has traditionally been used for analyzing the behavior of complex real-world problems. Even though only some features of a problem are considered, simulation time tends to become quite high even for common simulation problems. Parallel and distributed simulation is a viable technique for accelerating such simulations. The success of parallel simulation depends heavily on the combination of the simulation application, algorithm and environment. In this thesis a conservative parallel simulation algorithm is applied to the simulation of a cellular network application in a distributed workstation environment. The thesis presents a distributed simulation environment, Diworse, which is based on the use of networked workstations. The distributed environment is considered especially hard for conservative simulation algorithms due to the high cost of communication. In this thesis, however, the distributed environment is shown to be a viable alternative if the amount of communication is kept reasonable. The novel ideas of multiple message simulation and channel reduction enable efficient use of this environment for the simulation of a cellular network application. The distribution of the simulation is based on a modification of the well-known Chandy-Misra deadlock avoidance algorithm with null messages. The basic Chandy-Misra algorithm is modified by using the null message cancellation and multiple message simulation techniques. The modifications reduce the number of null messages and the time required for their execution, thus reducing the overall simulation time. In null message cancellation, an arriving null message cancels other unprocessed null messages, reducing their processing time. Multiple message simulation forms groups of messages, simulating several messages before releasing the newly created messages.
If the message population in the simulation is sufficient, no additional delay is caused by this operation. A new technique for taking the simulation application into account is also presented. Performance is improved by establishing a neighborhood for the simulation elements. The neighborhood concept is based on a channel reduction technique, where the properties of the application exclusively determine which connections are necessary when a certain accuracy of simulation results is required. The distributed simulation is also analyzed in order to determine the effect of the different elements in the implemented simulation environment. This analysis is performed using critical path analysis, which allows determination of a lower bound for the simulation time. In this thesis, critical times are computed for both sequential and parallel traces. The analysis based on sequential traces reveals the parallel properties of the application, whereas the analysis based on parallel traces reveals the properties of the environment and the distribution.
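The null message cancellation idea can be sketched with a channel queue in which a newly arriving null message discards older unprocessed nulls it supersedes. This is a simplified illustration of the concept, not the Diworse implementation.

```python
import heapq

def enqueue(channel, timestamp, is_null):
    """Insert a message into a channel queue (a heap of (timestamp, is_null)
    tuples), applying null message cancellation: a null message only carries
    a lower-bound time, so an arriving null makes older unprocessed nulls
    redundant and they are dropped before the new message is queued."""
    if is_null:
        channel[:] = [(t, n) for (t, n) in channel
                      if not (n and t <= timestamp)]
        heapq.heapify(channel)  # restore heap order after filtering
    heapq.heappush(channel, (timestamp, is_null))

channel = []
enqueue(channel, 5.0, True)    # null message at t=5
enqueue(channel, 8.0, True)    # null at t=8 cancels the one at t=5
enqueue(channel, 6.0, False)   # a real event message is never cancelled
print(sorted(channel))  # [(6.0, False), (8.0, True)]
```

Fewer queued nulls means fewer messages to process when the logical process next advances, which is exactly where the reported reduction in simulation time comes from.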
Abstract:
This thesis was done as a complementary part of the active magnetic bearing (AMB) control software development project at Lappeenranta University of Technology. The main focus of the thesis is to examine the idea of a real-time operating system (RTOS) framework that operates in a dedicated digital signal processor (DSP) environment. General-purpose real-time operating systems do not necessarily provide a sufficient platform for periodic control algorithms. In addition, the application program interfaces found in real-time operating systems are commonly non-existent or provided as chip-support libraries, which hinders platform-independent software development. Hence, two alternative real-time operating systems and additional periodic extension software are examined together with the framework design to find solutions to the research problems. The research is carried out by tracing the selected real-time operating systems, formulating requirements for the system, and designing the real-time operating system framework (OSFW). The OSFW is formed by programming the framework and conjoining the outcome with the RTOS and the periodic extension. The system is tested, and the functionality of the software is evaluated in the theoretical context of Rate Monotonic Scheduling (RMS) theory. The performance of the OSFW and the substance of the approach are discussed in relation to the research theme. The findings of the thesis demonstrate that the real-time operating system framework built here is a viable groundwork solution for periodic control applications.
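The RMS theory used for the evaluation can be illustrated with the classic Liu & Layland sufficient utilisation test; the task set below is a made-up example, not one from the thesis.

```python
def rms_schedulable(tasks):
    """Sufficient Rate Monotonic schedulability test (Liu & Layland):
    a periodic task set is schedulable under RMS if the total utilisation
    U = sum(C_i / T_i) does not exceed n * (2**(1/n) - 1).
    `tasks` is a list of (computation_time, period) pairs."""
    n = len(tasks)
    u = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return u <= bound, u, bound

# Three hypothetical periodic control tasks: (C, T) in the same time unit.
ok, u, bound = rms_schedulable([(1, 4), (1, 5), (2, 10)])
print(ok, round(u, 3), round(bound, 3))  # True 0.65 0.78
```

The test is only sufficient: a task set above the bound may still be schedulable, which an exact response-time analysis would confirm.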
Abstract:
Synchronous machines with an AC converter are used mainly in large drives, for example in ship propulsion and in rolling mill drives in the steel industry. These motors are used because of their high efficiency, high overload capacity and good performance in the field weakening region. Present-day drives for electrically excited synchronous motors are equipped with position sensors, and most will be equipped with position sensors in the future as well. Such drives with good dynamics are mainly used in the metal industry. Drives without a position sensor could be used e.g. in ship propulsion and in large pump and blower drives; nowadays, these drives are equipped with a position sensor, too. The tendency is to avoid a position sensor if possible, since a sensor reduces the reliability of the drive and increases costs (the latter is not very significant for large drives). A new control technique for a synchronous motor drive is the combination of Direct Flux Linkage Control (DFLC), based on a voltage model, with a supervising method (e.g. a current model). This combination is called the Direct Torque Control (DTC) method. In the case of a position sensorless drive, DTC can be implemented by using other supervising methods that keep the stator flux linkage origin centered. In this thesis, a method for observing the drift of the real stator flux linkage in a DTC drive is introduced. It is also shown how this method can be used as a supervising method that keeps the stator flux linkage origin centered in DTC. In the position sensorless case, a synchronous motor can be started up with DTC control when the method presented in this thesis for determining the initial rotor position is used. The load characteristics of such a drive are, however, not very good at low rotational speeds.
Furthermore, continuous operation at zero speed and at low rotational speeds is not possible, which is partly due to problems related to the flux linkage estimate. For operation in the low speed region, a stator current control method based on the DFLC modulator (DMCC) is presented. With the DMCC, it is possible to start up and operate a synchronous motor at zero speed and at low rotational speeds in general. The DMCC is necessary in situations where high torque (e.g. nominal torque) is required at the starting moment, or if the motor runs for several seconds at zero speed or in the low speed range (up to 2 Hz). The behaviour of the described methods is shown with test results, which are presented for a direct flux linkage and torque controlled test drive with a 14.5 kVA, four-pole salient pole synchronous motor with a damper winding and electric excitation. The static accuracy of the drive is verified by measuring the torque in static load operation, and the dynamics of the drive are demonstrated in load transient tests. The performance of the drive concept presented in this work is sufficient e.g. for ship propulsion and for large pump drives. Furthermore, the developed methods are almost independent of the machine parameters.
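The voltage-model flux linkage estimation underlying DFLC can be sketched as a discrete integration of u - R*i. This one-axis forward-Euler sketch is illustrative only; it omits the drift correction that the thesis actually contributes, which is precisely what keeps such an open integrator usable in practice.

```python
def flux_linkage(voltage, current, resistance, dt, psi0=0.0):
    """Voltage-model stator flux linkage estimate,
    psi = integral(u - R*i) dt, computed by forward Euler.
    One axis of the two-axis estimator; without a supervising (drift
    correction) method, measurement offsets accumulate in psi."""
    psi = psi0
    history = []
    for u, i in zip(voltage, current):
        psi += (u - resistance * i) * dt
        history.append(psi)
    return history

# A constant net EMF of 1 V integrates to a linear ramp:
psi = flux_linkage([1.5] * 4, [1.0] * 4, resistance=0.5, dt=0.1)
print([round(p, 2) for p in psi])  # [0.1, 0.2, 0.3, 0.4]
```

At low speed the EMF term u - R*i becomes small relative to offsets, which is why pure voltage-model estimation fails there and a current-based method such as the DMCC is needed.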
Abstract:
There is evidence that virtual reality (VR) pain distraction is effective at improving pain-related outcomes. However, more research is needed to investigate VR environments with other pain-related goals. The main aim of this study was to compare the differential effects of two VR environments on a set of pain-related and cognitive variables during a cold pressor experiment. One of these environments aimed to distract attention away from pain (VRD), whereas the other was designed to enhance pain control (VRC). Participants were 77 psychology students, who were randomly assigned to one of the following three conditions during the cold pressor experiment: (a) VRD, (b) VRC, or (c) Non-VR (control condition). Data were collected regarding both pain-related variables (intensity, tolerance, threshold, time perception, and pain sensitivity range) and cognitive variables (self-efficacy and catastrophizing). Results showed that in comparison with the control condition, the VRC intervention significantly increased pain tolerance, the pain sensitivity range, and the degree of time underestimation. It also increased self-efficacy in tolerating pain and led to a reduction in reported helplessness. The VRD intervention significantly increased the pain threshold and pain tolerance in comparison with the control condition, but it did not affect any of the cognitive variables. Overall, the intervention designed to enhance control seems to have a greater effect on the cognitive variables assessed. Although these results need to be replicated in further studies, the findings suggest that the VRC intervention has considerable potential in terms of increasing self-efficacy and modifying the negative thoughts that commonly accompany pain problems.
Abstract:
A typical regulation problem in the field of industrial automation is controlling the linear infeed speed of wire onto a spool: as more thickness accumulates, the same spool rotation speed produces a notably higher linear wire infeed speed, and this mismatch must be compensated automatically to achieve a constant infeed speed. This speed regulation problem is very common, and difficult to control, in industries that wind some kind of material, such as cable, wire, paper, sheet metal, tubes, etc. The two main challenges and objectives are, first, regulating the rotation speed of the spool to achieve a constant linear speed of the incoming wire, and second, achieving a uniform distribution of each layer of wire by guiding the wire feed onto the spool. The development consists of the automation and control of an automatic winding machine through the configuration and programming of PLCs, servomotors and encoders. Finally, a practical assembly is built on a test bench to verify and simulate its correct operation, which must solve these speed regulation problems. As final conclusions, the objectives were achieved, along with a methodology for regulating spool rotation speeds using pulse-driven servomotor drives, and at the knowledge level I have mastered the applications of this type of drive applied to mechanical constructions.
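The core of the regulation problem is the kinematic relation v = ω·r with a radius that grows as layers accumulate. A small sketch, using hypothetical dimensions and assuming the coil radius grows by one wire diameter per completed layer:

```python
import math

def rotation_speed_rpm(line_speed, core_radius, wire_diameter, layers_wound):
    """Rotational speed needed to keep a constant wire infeed speed as the
    coil radius grows by one wire diameter per completed layer.
    Units: line_speed in m/s, radii and diameters in m. Illustrative of the
    regulation problem; the real rig closes the loop with encoders."""
    radius = core_radius + layers_wound * wire_diameter
    omega = line_speed / radius               # rad/s, from v = omega * r
    return omega * 60.0 / (2.0 * math.pi)     # convert to rpm

# The required rpm drops as layers accumulate at the same line speed:
print(round(rotation_speed_rpm(1.0, 0.05, 0.002, 0), 1))   # 191.0
print(round(rotation_speed_rpm(1.0, 0.05, 0.002, 25), 1))  # 95.5
```

In this example the spool must turn at roughly half the initial speed once 25 layers are wound, which is the mismatch the servomotor controller has to compensate continuously.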
Abstract:
The marine environment is certainly one of the most complex systems to study, not only because of the challenges posed by the nature of the waters, but especially due to the interactions of physical, chemical and biological processes that control the cycles of the elements. Together with analytical chemists, oceanographers have been making a great effort in the advancement of knowledge of the distribution patterns of trace elements and processes that determine their biogeochemical cycles and influences on the climate of the planet. The international academic community is now in prime position to perform the first study on a global scale for observation of trace elements and their isotopes in the marine environment (GEOTRACES) and to evaluate the effects of major global changes associated with the influences of megacities distributed around the globe. This action can only be performed due to the development of highly sensitive detection methods and the use of clean sampling and handling techniques, together with a joint international program working toward the clear objective of expanding the frontiers of the biogeochemistry of the oceans and related topics, including climate change issues and ocean acidification associated with alterations in the carbon cycle. It is expected that the oceanographic data produced this coming decade will allow a better understanding of biogeochemical cycles, and especially the assessment of changes in trace elements and contaminants in the oceans due to anthropogenic influences, as well as its effects on ecosystems and climate. Computational models are to be constructed to simulate the conditions and processes of the modern oceans and to allow predictions. The environmental changes arising from human activity since the 18th century (also called the Anthropocene) have made the Earth System even more complex. 
Anthropogenic activities have altered both terrestrial and marine ecosystems, and the legacy of these impacts on the oceans includes: a) pollution of the marine environment by solid waste, including plastics; b) pollution by chemical and medical substances (including those for veterinary use) such as hormones, antibiotics, and legal and illegal drugs, leading to possible endocrine disruption of marine organisms; and c) ocean acidification, the collateral effect of anthropogenic emissions of CO2 into the atmosphere, irreversible on the human lifetime scale. Unfortunately, the anthropogenic alteration of the hydrosphere by inputs of plastics, metals, hydrocarbons, contaminants of emerging concern and even formerly "exotic" trace elements, such as rare earth elements, is likely to accelerate in the near future. These emerging contaminants will likely soon present difficulties for studies in pristine environments. All this knowledge brings with it a great responsibility: helping to envisage viable adaptation and mitigation solutions to the problems identified. The greatest challenge currently faced by Brazil is to create a framework project to develop education, science and technology applied to oceanography and related areas. This framework would strengthen the present working groups and enhance capacity building, allowing broader Brazilian participation in joint international actions and scientific programs. Recently, the establishment of the National Institutes of Science and Technology (INCTs) for marine science and the creation of the National Institute of Oceanographic and Hydrological Research represent an exemplary start. However, the participation of the Brazilian academic community in the latest advances at the frontier of chemical oceanography is extremely limited, largely due to: i. absence of physical infrastructure for the preparation and processing of field samples at the ultra-trace level; ii.
limited access to oceanographic cruises, due to the small number of Brazilian vessels and/or the absence of "clean" laboratories on board; iii. restricted international cooperation; iv. the limited analytical capacity of Brazilian institutions for the analysis of trace elements in seawater; v. the high cost of ultrapure reagents associated with processing a large number of samples; and vi. the lack of qualified technical staff. Advances in knowledge and analytical capabilities, together with the increasing availability of analytical resources, offer favorable conditions for chemical oceanography to grow. The Brazilian academic community is maturing and willing to play a role in strengthening marine science research programs by connecting them with educational and technological initiatives in order to preserve the oceans and to promote the development of society.
Abstract:
Electricity distribution network operation (NO) models are challenged, as they are expected to continue to undergo changes during the coming decades in the fairly developed and regulated Nordic electricity market. Network asset managers must adapt to competitive techno-economic business models for the operation of increasingly intelligent distribution networks. Factors driving the changes toward new business models within network operation include: increased investments in distributed automation (DA), regulative frameworks for annual profit limits and quality through outage cost, increasing end-customer demands, climatic changes and the increasing use of data system tools, such as the Distribution Management System (DMS). The doctoral thesis addresses the questions of a) whether there exist conditions and qualifications for competitive markets within electricity distribution network operation and b) if so, what the limitations and required business mechanisms are. The thesis aims to provide an analytical business framework, primarily for electric utilities, for evaluating and developing dedicated network operation models to meet future market dynamics within network operation. In the thesis, the generic build-up of a business model is addressed through the strategic business hierarchy levels of mission, vision and strategy for defining the strategic direction of the business, followed by the planning, management and process execution levels of enterprise strategy execution. Research questions within electricity distribution network operation are addressed at the specified hierarchy levels. The results of the research represent interdisciplinary findings in the areas of electrical engineering and production economics.
The main scientific contributions include further development of extended transaction cost economics (TCE) for governance decisions within electricity networks and validation of the usability of the methodology for the electricity distribution industry. Moreover, DMS benefit evaluations in the thesis, based on outage cost calculations, propose theoretical maximum benefits of DMS applications equalling roughly 25% of the annual outage costs and 10% of the respective operative costs in the case electric utility. Hence, the annual measurable theoretical benefits from the use of DMS applications are considerable. The theoretical results of the thesis are generally validated by surveys and questionnaires.
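The DMS benefit proportions can be turned into a back-of-the-envelope estimate. The cost figures below are hypothetical, and whether the two percentages are alternative measures of the same benefit or additive components is not resolved here, so both are reported separately.

```python
def dms_benefit_estimates(annual_outage_cost, annual_operative_cost):
    """Two expressions of the theoretical maximum annual DMS benefit,
    using the proportions stated in the thesis: ~25% of annual outage
    costs and ~10% of the respective operative costs. The input figures
    are hypothetical placeholders for a case utility."""
    return 0.25 * annual_outage_cost, 0.10 * annual_operative_cost

print(dms_benefit_estimates(1_000_000, 2_500_000))  # (250000.0, 250000.0)
```

With these placeholder figures, both views suggest an annual theoretical benefit in the hundreds of thousands of euros, which illustrates why the thesis calls the benefits considerable.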
Abstract:
In this thesis a control system for an intelligent low voltage energy grid is presented, focusing on a control system created using a multi-agent approach, which makes it versatile and easy to expand according to future needs. The control system is capable of forecasting future energy consumption and of making decisions on its own, without human interaction, when encountering problems. The control system is part of the St. Petersburg State Polytechnic University's smart grid project, which aims to create a smart grid for the university's own use. The smart grid concept is also interesting for consumers, as it brings new possibilities to control their own energy consumption and save money. Smart grids make it possible to monitor energy consumption in real time and to change one's habits to save money. The intelligent grid also makes it possible to integrate renewable energy sources into global or local energy production much better than current systems do. Consumers can also sell their surplus power to the global grid if they want.