57 results for Housing, Single family -- Design and construction

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

This study examines Smart Grids and distributed generation connected to a single-family house. The distributed generation comprises a small wind power plant and solar panels. The study is conducted from the consumer's point of view and is divided into two parts: a theoretical part and a research part. The theoretical part defines distributed generation, wind power, solar energy and Smart Grids, examines what Smart Grids will enable, and reviews new Smart Grid technology. The research part introduces wind and solar conditions in two countries, Finland and Germany. Based on these conditions, the annual electricity production of the wind power plant and the solar panels is calculated, and the costs of generating electricity from wind and solar energy are derived from the annual production figures. The study also deals with feed-in tariffs, which are support systems for renewable energy sources, and examines whether it is more cost-effective for consumers to use the produced electricity themselves or to sell it to the grid. Finally, figures are compiled for both countries, combining the calculated cost of generating electricity from the wind power plant and the solar panels, retail and wholesale prices, and feed-in tariffs. In Finland, selling the produced electricity to the grid is not cost-effective until support systems are in place. In Germany, selling the electricity produced by solar panels to the grid is cost-effective because of the feed-in tariff, whereas electricity produced from wind is more cost-effective to use on site because the retail price is higher than the cost of the wind-generated electricity.
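The self-consumption versus feed-in comparison described in the abstract can be pictured with a minimal sketch. All investment figures, prices and tariffs below are illustrative placeholders, not values from the thesis.

```python
# Minimal sketch of the self-consumption vs. feed-in comparison described above.
# All numbers are illustrative placeholders, not results from the thesis.

def cost_per_kwh(investment_eur, annual_production_kwh, lifetime_years, annual_om_eur=0.0):
    """Simple levelized cost: total lifetime cost divided by lifetime production."""
    total_cost = investment_eur + annual_om_eur * lifetime_years
    return total_cost / (annual_production_kwh * lifetime_years)

def best_use(generation_cost, retail_price, feed_in_tariff):
    """Self-consumption saves the retail price; feeding in earns the tariff."""
    self_use_margin = retail_price - generation_cost
    feed_in_margin = feed_in_tariff - generation_cost
    return "self-consumption" if self_use_margin >= feed_in_margin else "sell to grid"

solar_cost = cost_per_kwh(investment_eur=8000, annual_production_kwh=4000, lifetime_years=25)
print(f"solar generation cost ~{solar_cost:.3f} EUR/kWh")
print("better option:", best_use(solar_cost, retail_price=0.30, feed_in_tariff=0.12))
```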

Relevance:

100.00%

Publisher:

Abstract:

The need for high performance, high precision, and energy saving in rotating machinery demands an alternative solution to traditional bearings. Because of their contactless operation principle, rotating machines employing active magnetic bearings (AMBs) provide many advantages over traditional ones. Advantages such as contamination-free operation, low maintenance costs, high rotational speeds, low parasitic losses, programmable stiffness and damping, and vibration isolation come at the expense of high cost and a complex technical solution. These properties make AMBs appropriate primarily for specific and highly demanding applications. High-performance and high-precision control requires model-based control methods and accurate models of the flexible rotor. In turn, complex models lead to high-order controllers and a considerable computational burden. Fortunately, in the last few years advancements in signal processing devices have provided a new perspective on the real-time control of AMBs. The design and real-time digital implementation of high-order LQ controllers, with a focus on fast execution times, are the subjects of this work. In particular, control design and implementation in field programmable gate array (FPGA) circuits are investigated. In the optimal design, the physical constraints of the system guide the selection of the weighting matrices, and the plant model is complemented by augmenting appropriate disturbance models. Compensation of the force-field nonlinearities is proposed for decreasing the uncertainty of the actuator, and a disturbance-observer-based unbalance compensation for cancelling the magnetic force vibrations or the vibrations in the measured positions is presented. The theoretical studies are verified by practical experiments on a custom-built laboratory test rig, which uses a prototyping control platform developed in the scope of this work. To sum up, the work takes a step towards an embedded single-chip FPGA-based controller for AMBs.
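As a rough illustration of the LQ design step mentioned above (not the controller developed in the thesis), a discrete-time LQ gain can be computed from a state-space model and weighting matrices. The two-state plant and the matrices Q and R below are placeholders, not the rotor/AMB model of the work.

```python
# Illustrative discrete-time LQ gain computation; the 2-state model and the
# weighting matrices Q and R are placeholders, not the thesis rotor/AMB model.
import numpy as np
from scipy.linalg import solve_discrete_are

A = np.array([[1.0, 0.01],
              [0.0, 1.0]])      # simple sampled double-integrator-like plant
B = np.array([[0.0],
              [0.01]])
Q = np.diag([100.0, 1.0])       # state weighting (choice guided by physical limits)
R = np.array([[0.1]])           # control-effort weighting

P = solve_discrete_are(A, B, Q, R)                     # Riccati solution
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)      # state-feedback gain, u = -K x
print("LQ gain K =", K)
```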

Relevance:

100.00%

Publisher:

Abstract:

This paper is a literature review that describes the state of the art in the construction of permanent magnet generators and motors and discusses the current and possible applications of these machines in industry. Permanent magnet machines are a well-known class of rotating and linear electric machines that have been used in industrial applications for many years. Particular interest in permanent magnet generators is connected with wind turbines, which are becoming increasingly popular. Geared and direct-driven permanent magnet generators are described, and a classification of direct-driven permanent magnet generators is given. Design aspects of permanent magnet generators are presented, with emphasis on generator designs for wind turbines. Dynamics and vibration problems of permanent magnet generators covered in the literature are presented, and the application of the finite element method to the solution of mechanical problems in the field of permanent magnet generators is discussed.

Relevance:

100.00%

Publisher:

Abstract:

Particulate nanostructures are increasingly used for analytical purposes. Such particles are often generated by chemical synthesis from non-renewable raw materials. Generating uniform nanoscale particles is challenging, and particle surfaces must be modified to make the particles biocompatible and water-soluble. Usually nanoparticles are functionalized with binding molecules (e.g., antibodies or their fragments) and, if needed, a label substance. Overall, producing nanoparticles for use in bioaffinity assays is a multistep process requiring several manufacturing and purification steps. This study describes a biological method of generating functionalized protein-based nanoparticles with specific binding activity on the particle surface and label activity inside the particles. Traditional chemical bioconjugation of the particle and the specific binding molecules is replaced with genetic fusion of the binding-molecule gene and the particle backbone gene. The particle shell and binding moieties are synthesized as a single entity from generic raw materials by bacteria, and fermentation is combined with a simple purification method based on inclusion bodies; the label activity is introduced during the purification. The process results in particles that are ready to use as reagents in bioaffinity assays. Apoferritin was used as the particle body, and the system was demonstrated with three different binding moieties: a small protein, a peptide, and a single-chain Fv antibody fragment, which represents a complex protein containing a disulfide bridge. When a label was needed, Eu3+ was used. The results showed that the production system yielded pure protein preparations and that the particles were of homogeneous size when visualized with transmission electron microscopy. The passively introduced label was stably associated with the particles, and the binding molecules genetically fused to the particle specifically bound their target molecules. The functionality of the particles in bioaffinity assays was successfully demonstrated with two types of assay: as labels and in a particle-enhanced agglutination assay. This biological production procedure has many advantages that make the process especially suited for applications with frequent and recurring requirements for homogeneous functional particles. The production process for ready, functional and water-soluble particles follows the principles of “green chemistry” and is upscalable, fast and cost-effective.

Relevance:

100.00%

Publisher:

Abstract:

In the theoretical part, different polymerisation catalysts are introduced and the phenomena related to mixing in a stirred tank reactor are presented; the advantages and challenges of scale-up are also discussed. The aim of the applied part was to design and implement an intermediate-sized reactor useful for scale-up studies. The reactor setup was tested by preparing one batch of Ziegler–Natta polypropylene catalyst. The catalyst preparation with the designed equipment setup succeeded, and the catalyst was analysed so that its properties could be compared with those of a typical Ziegler–Natta polypropylene catalyst. The total titanium content of the catalyst was slightly higher than in a typical Ziegler–Natta polypropylene catalyst, but the magnesium and aluminium contents were at the normal level. By adjusting the siphon tube and adding one washing step, the titanium content of the catalyst could be decreased. The particle size of the catalyst was small, but the activity was in the normal range; the size of the catalyst particles could be increased by decreasing the stirring speed. During the test run it was noticed that some improvements to the designed equipment setup could be made. For example, more valves need to be added to the chemical feed line to ensure inert conditions during catalyst preparation, and the nitrogen supply for the reactor needs to be separated from the other nitrogen line so that the reactor pressure can be kept at the desired level during catalyst preparation. The proposals for improvements are presented in the applied part; once they have been implemented, the equipment setup is ready for start-up. A computational fluid dynamics model of the designed reactor was provided in cooperation with Lappeenranta University of Technology. The experiments showed that adequate mixing with one impeller requires a stirring speed of 600 rpm. The computational fluid dynamics model with two impellers showed no difference in mixing efficiency whether the upper impeller pumped downwards or upwards.
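Mixing intensity in a stirred tank is commonly characterised by the impeller Reynolds number, Re = ρ N D² / μ, which is one way to interpret a stirring-speed requirement such as the 600 rpm mentioned above. The fluid properties and impeller diameter in the sketch below are placeholder values, not parameters from the thesis.

```python
# Impeller Reynolds number for a stirred tank: Re = rho * N * D^2 / mu.
# The fluid properties and impeller diameter are illustrative placeholders.

def impeller_reynolds(rho_kg_m3, n_rpm, d_m, mu_pa_s):
    n_rps = n_rpm / 60.0                      # impeller speed in revolutions per second
    return rho_kg_m3 * n_rps * d_m**2 / mu_pa_s

re = impeller_reynolds(rho_kg_m3=900.0, n_rpm=600, d_m=0.08, mu_pa_s=0.001)
regime = "turbulent" if re > 1e4 else "transitional/laminar"
print(f"Re = {re:.0f} ({regime})")
```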

Relevance:

100.00%

Publisher:

Abstract:

The significance of services as business and human activities has increased dramatically throughout the world in the last three decades. Becoming a more competitive and efficient service provider while still being able to provide unique value opportunities for customers requires new knowledge and ideas. Part of this knowledge is created and utilized in daily activities in every service organization, but not all of it, and therefore information awareness is an emerging phenomenon in the service context. Terms like big data and the Internet of things are not only modern buzzwords; they also describe urgent requirements for a new type of competences and solutions. When the amount of information increases and the systems processing it become more efficient and intelligent, human understanding and objectives may become separated from the automated processes and technological innovations. This is an important challenge and the core driver for this dissertation: what kind of information is created, possessed and utilized in the service context, and even more importantly, what information exists but is not acknowledged or used? The focus of this dissertation is on the relationship between service design and service operations. Reframing this relationship means viewing the service system from an architectural perspective. The selected perspective allows the relationship between design activities and operational activities to be analysed as an information system while maintaining a tight connection to existing service research contributions and approaches. This innovative approach is supported by a research methodology that relies on design science theory. The methodological process supports the construction of a new design artifact based on existing theoretical knowledge, the creation of new innovations, and the testing of the design artifact components in real service contexts. The relationship between design and operations is analysed in health care and social care service systems. Existing contributions in service research tend to abstract services and service systems as value-creation, working or interactive systems; this dissertation adds an important information-processing-system perspective to the research. The main contribution focuses on the following argument: only part of the service information system is automated and computerized, whereas a significant part of information processing is embedded in human activities, communication and ad hoc reactions. The results indicate that the relationship between service design and service operations is more complex and dynamic than existing scientific and managerial models tend to assume. Both activities create, utilize, mix and share information, making service information management a necessary but relatively unknown managerial task. At the architectural level, service-system-specific elements seem to disappear, but access to more general information elements and processes can be found. While this dissertation focuses on conceptual-level design artifact construction, the results also have very practical implications for service providers. The personal, visual and hidden activities of a service, and more importantly all changes that take place in any service system, also have an information dimension. Making this information dimension visible and prioritizing the processed information based on service dimensions is likely to provide new opportunities to improve activities and to offer a new type of service potential for customers.

Relevance:

100.00%

Publisher:

Abstract:

The European electricity sector has undergone major upheavals during the last decade. Since the opening of the electricity markets, utilities operating monopoly businesses have been forced to improve their productivity, and outsourcing of maintenance and construction functions has been sought as a solution. Outsourcing is, however, a new practice in this sector. The aim of this thesis is to identify the reasons a Danish electricity network company had for outsourcing its maintenance and construction functions, and to identify the benefits obtainable from it and the risks involved. The study is carried out using the literature, the available due diligence and other reports and analyses, and interviews with the parties involved in the case; consultations with experts in the electricity network field are also used. The study shows that the fundamental drivers for outsourcing the maintenance and construction functions were the pressures created by legislative changes and the liberalised electricity markets. In a municipal organisation, efficiency can be improved by outsourcing some functions to a privately owned service provider. Other expected benefits of the outsourcing were lower costs, a more streamlined organisation, and divesting the inefficient parts of the network company before its sale.

Relevance:

100.00%

Publisher:

Abstract:

This paper analyzes the possibilities of integrating cost information and engineering design, with special emphasis on the potential of the activity-based costing (ABC) method. Today, the problem of cost estimation in engineering design is that there are two separate extremes of knowledge. At one extreme, engineers model the technical parameters behind costs in great detail but do not get appropriate cost information into their elegant models. At the other extreme, accounting professionals are stuck with traditional cost accounting methods driven by the procedures and cycles of financial accounting. Therefore, in many cases the cost information needs of various decision-making groups, for example design engineers, are not served satisfactorily. This paper studies whether the activity-based costing (ABC) method could offer a compromise between the two extremes. Recognizing activities and activity chains as well as activity and cost drivers could be especially beneficial for design engineers, and knowing accurate and reliable product costs of existing products helps when doing variant design. However, ABC is not at its best if the cost system becomes too complicated, which is why a comprehensive ABC cost information system with detailed cost information for the use of design engineers should be examined critically. ABC is at its best when considering issues such as which activities drive costs, the cost of product complexity, the allocation of indirect costs to products, the relationships between processes and costs, and the cost of excess capacity.
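As a generic illustration of the ABC idea discussed above (not an example from the paper), indirect costs can be traced to products through activity driver rates. All activities, drivers and figures in the sketch below are invented.

```python
# Generic activity-based costing sketch: indirect cost pools are traced to
# products via activity drivers. Every activity, driver and figure is invented.

activity_costs = {"machine setups": 20000.0, "design changes": 12000.0}
total_driver_volume = {"machine setups": 200, "design changes": 60}   # setups, change orders

# Driver rates: cost per unit of activity driver
rates = {a: activity_costs[a] / total_driver_volume[a] for a in activity_costs}

# Driver consumption of one product variant
product_usage = {"machine setups": 12, "design changes": 5}

indirect_cost = sum(rates[a] * product_usage[a] for a in product_usage)
print({a: round(rates[a], 2) for a in rates})
print(f"indirect cost traced to the variant: {indirect_cost:.2f}")
```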

Relevance:

100.00%

Publisher:

Abstract:

Key management has a fundamental role in secure communications. Designing and testing key management protocols is tricky: these protocols must work flawlessly despite any misuse. The main objective of this work was to design and implement a tool that helps to specify such a protocol and makes it possible to test it while it is still under development. The tool generates compile-ready Java code from a key management protocol model. A modelling method for these protocols, based on the Unified Modeling Language (UML), was also developed. The protocol is modelled, exported as XMI and read by the code generator tool, which produces Java code that can be executed with test software immediately after compilation.
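The model-to-code step described above can be pictured with a minimal sketch: read class names from a simplified XMI-like document and emit Java class skeletons. The element and attribute names and the generated skeleton below are simplified placeholders, not the format or output of the thesis tool.

```python
# Minimal model-to-code sketch: read class names from a simplified XMI-like
# document and emit Java class skeletons. The element/attribute names and the
# generated code are placeholders, not the thesis tool's format or output.
import xml.etree.ElementTree as ET

XMI_SAMPLE = """
<Model>
  <Class name="KeyExchangeInitiator"/>
  <Class name="KeyExchangeResponder"/>
</Model>
"""

def generate_java(xmi_text: str) -> dict:
    root = ET.fromstring(xmi_text)
    sources = {}
    for cls in root.iter("Class"):
        name = cls.get("name")
        sources[f"{name}.java"] = (
            f"public class {name} {{\n"
            f"    // TODO: generated protocol state machine goes here\n"
            f"}}\n"
        )
    return sources

for filename, source in generate_java(XMI_SAMPLE).items():
    print(f"// {filename}\n{source}")
```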

Relevance:

100.00%

Publisher:

Abstract:

This master's thesis examines the WAP Push framework. The WAP standards define how Internet-type services, accessible from various mobile terminals, are implemented in an efficient and network-technology-independent way. WAP is based on the Internet but takes into account the limitations and special characteristics of small terminals and mobile networks. The WAP Push framework defines network-initiated delivery of service content. The theoretical part of the thesis reviews the general WAP architecture and the WAP protocol stack, using the architecture and protocol stack of the wired Internet as points of comparison, and on this basis introduces the WAP Push framework. The practical part describes the design and development of a WAP Push proxy gateway, a central network element in the WAP Push framework. The WAP Push proxy gateway connects the Internet and the mobile network in a way that hides the technology differences from the content provider on the Internet side.

Relevance:

100.00%

Publisher:

Abstract:

This master's thesis presents the implementation of a local area network (LAN) card based on the networking architecture of the Symbian operating system. The main focus was on implementing drivers for a wireless LAN (WLAN) card. Each component of the Symbian operating system's networking architecture was studied carefully, with emphasis on its possible reuse for a wireless connection; the reuse of the existing Ethernet card drivers was also considered in detail. During the thesis project, example source code for a WLAN card driver was produced; this driver was written for the NOKIA DTN-20 WLAN card. It was also observed that most of the Symbian operating system's networking architecture can be used for a WLAN-based connection without modifications. The presented driver also reuses certain parts of the existing Ethernet card drivers, such as the Logical Device Driver (LDD).

Relevance:

100.00%

Publisher:

Abstract:

The networking and digitalization of audio equipment has created a need for control protocols, which offer new services to customers and ensure that the equipment operates correctly. The control protocols used in computer networks are not directly applicable, since embedded systems have resource and cost limitations. In this master's thesis, the design and implementation of new loudspeaker control network protocols are presented. The protocol stack was required to be reliable, to have short response times, to configure the network automatically and to support the dynamic addition and removal of loudspeakers. The implemented protocol stack was also required to be as efficient and lightweight as possible, because the network nodes are fairly simple and lack processing power. The protocol stack was thoroughly tested, validated and verified: the protocols were formally described using LOTOS (Language of Temporal Ordering Specification) and verified using reachability analysis, and a prototype of the loudspeaker network was built and used for testing the operation and performance of the control protocols. The implemented control protocol stack met the design specifications and proved to be highly reliable and efficient.
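Reachability analysis of the kind mentioned above can be pictured as an exhaustive search over a protocol's state space, checking every reachable state for unwanted situations such as deadlock. The toy request/acknowledge model below is invented for illustration and is unrelated to the thesis protocols.

```python
# Toy reachability analysis: enumerate all reachable states of a tiny
# request/acknowledge protocol and flag any reachable deadlock state.
# The model is invented for illustration only.
from collections import deque

def transitions(state):
    sender, receiver = state
    if sender == "idle":
        yield ("wait_ack", receiver)               # sender transmits a request
    if receiver == "idle" and sender == "wait_ack":
        yield (sender, "acked")                    # receiver handles it and acknowledges
    if sender == "wait_ack" and receiver == "acked":
        yield ("idle", "idle")                     # ack received, both return to idle

start = ("idle", "idle")
seen, queue = {start}, deque([start])
while queue:
    state = queue.popleft()
    successors = list(transitions(state))
    if not successors:
        print("deadlock reachable at", state)
    for nxt in successors:
        if nxt not in seen:
            seen.add(nxt)
            queue.append(nxt)

print("reachable states:", sorted(seen))
```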

Relevance:

100.00%

Publisher:

Abstract:

Healthcare today uses the possibilities of information technology (IT) to improve the quality of care, to reduce care-related costs, and to simplify and clarify physicians' workflows. The information systems that form the core of every IT solution must be developed to meet numerous requirements, one of which is the ability to integrate seamlessly with other information systems. System integration remains a challenging task, even though several standards have been developed for it. This thesis describes the interfacing solution of a newly developed medical information system. The requirements placed on such an application are discussed, and the way in which these requirements are met is presented. The interfacing solution is divided into two parts: the information system interface and the interfacing engine. The former comprises the basic functionality needed for receiving data from and sending data to other systems, while the latter provides support for the standards used in the production environment. The design of both parts is presented in detail in this thesis. The problem was solved with a modular and generic design. This approach is shown to be a robust and flexible solution that can address a wide range of requirements placed on an interfacing solution. It is also shown how, thanks to its flexibility, the solution can easily be adapted to requirements that were not identified in advance, thereby providing a basis for future needs as well.
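The two-part split described above, a generic interface layer plus a pluggable engine for concrete standards, can be sketched as a handler registry. The class and method names below are hypothetical and not taken from the thesis.

```python
# Hypothetical sketch of a modular interfacing design: a generic interface layer
# passes raw messages to a pluggable "interfacing engine" that dispatches each
# message to the handler registered for its standard. Names are invented.
from typing import Callable, Dict

class InterfacingEngine:
    """Dispatches inbound messages to the handler registered for their standard."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[str], str]] = {}

    def register(self, standard: str, handler: Callable[[str], str]) -> None:
        self._handlers[standard] = handler

    def process(self, standard: str, message: str) -> str:
        if standard not in self._handlers:
            raise ValueError(f"no handler registered for standard '{standard}'")
        return self._handlers[standard](message)

engine = InterfacingEngine()
engine.register("demo-standard", lambda msg: msg.upper())   # placeholder handler
print(engine.process("demo-standard", "patient record received"))
```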