21 results for source and sink
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
In this master's thesis I examine source-text and target-text orientation in drama translation. The objects of study were the vocabulary, syntax, stagecraft, imagery, wordplay, metre and style of the translations. The purpose of the study was to determine whether the theoretical shift of emphasis from source-text orientation to target-text orientation is visible in Finnish drama translation. The assumption was that the shift would be reflected in the translation strategies used. The theoretical part of the study first discusses source-text- and target-text-oriented translation theories. Two source-text-oriented theories are presented first: Catford's (1965) formal correspondence and Nida's (1964) dynamic equivalence. Of the target-text-oriented theories, the theoretical views of Toury (1980) and Newmark (1981) are discussed, as well as the skopos theory introduced by Reiss and Vermeer (1986). The principles of foreignisation and domestication are presented briefly. The theoretical part also deals with drama translation, William Shakespeare's language and the translation problems related to it. In addition, I briefly present the translation of Shakespeare in Finland and the four translators of Julius Caesar. The material of the study consisted of four Finnish translations of Shakespeare's play Julius Caesar: Paavo Cajander's translation was published in 1883, Eeva-Liisa Manner's in 1983, Lauri Sipari's in 2006 and Jarkko Laine's in 2007. In the analysis, the translations were compared with the source text and with each other, and the translators' translation solutions were compared. The results were in line with the assumption. Source-text-oriented translation strategies were used less in the newer translations than in the older ones. The target-text-oriented strategies differed considerably from each other, and the newest translation can be called an adaptation. In further research, the material should be extended to cover other Finnish translations of Shakespeare's plays. Translations from different periods should be compared with one another in order to reliably describe the change in the use of source-text- and target-text-oriented translation strategies and to map the strategies typical of each period.
Abstract:
Cyanobacteria are a diverse group of oxygenic photosynthetic bacteria that inhabit a wide range of environments. They are versatile and multifaceted organisms with great potential for different biotechnological applications. For example, cyanobacteria produce molecular hydrogen (H2), which is one of the most important alternatives for clean and sustainable energy. Apart from being beneficial, cyanobacteria also possess harmful characteristics and may become a threat to human health and other living organisms, as they are able to form surface blooms that produce a variety of toxic or bioactive compounds. The University of Helsinki Culture Collection (UHCC) maintains around 1,000 cyanobacterial strains representing a large number of genera and species isolated from the Baltic Sea and Finnish lakes. The culture collection covers different life forms such as unicellular and filamentous, N2-fixing and non-N2-fixing strains, and planktonic and benthic cyanobacteria. In this thesis, the UHCC has been screened to identify potential strains for sustainable biohydrogen production and also for strains that produce compounds modifying the bioenergetic pathways of other cyanobacteria or terrestrial plants. Among the 400 cyanobacterial strains screened so far, ten were identified as high H2-producing strains. The enzyme systems involved in the H2 metabolism of cyanobacteria were analyzed using the Southern hybridization approach. This revealed the presence of the enzyme nitrogenase in all strains tested, while none of them appeared to contain alternative nitrogenases. All the strains tested, except for two Calothrix strains, XSPORK 36C and XSPORK 11A, were suggested to contain both uptake and bidirectional hydrogenases. Moreover, 55 methanol extracts of various cyanobacterial strains were screened to identify potent bioactive compounds affecting the photosynthetic apparatus of the model cyanobacterium Synechocystis PCC 6803. The extract from Nostoc XPORK 14A was the only one that modified the photosynthetic machinery and dark respiration. The compound responsible for this effect was identified, purified, and named M22. M22 demonstrated a dual-action mechanism: production of reactive oxygen species (ROS) under illumination and an unknown mechanism that also prevailed in the dark. During summer, the Baltic Sea is occupied by toxic blooms of Nodularia spumigena (hereafter referred to as N. spumigena), which produces a hepatotoxin called nodularin. Long-term exposure of the terrestrial plant spinach to nodularin was studied. Such treatment resulted in inhibition of growth and chlorosis of the leaves. Moreover, the activity and amount of mitochondrial electron transfer complexes increased in the leaves exposed to nodularin-containing extract, indicating upregulation of respiratory reactions, whereas no marked changes were detected in the structure or function of the photosynthetic machinery. Nodularin-exposed plants suffered from oxidative stress, evidenced by oxidative modifications of various proteins. Plants initiated strategies to combat the stress by increasing the levels of alpha-tocopherol, mitochondrial alternative oxidase (AOX), and mitochondrial ascorbate peroxidase (mAPX).
Abstract:
The aim of this master's thesis is to study how an Agile method (Scrum) and open source software are utilized to produce software for a flagship product in a complex production environment. The empirical case and the artefacts used are taken from the Nokia MeeGo N9 product program and from the related software program, called Harmattan. The single research case is analysed using a qualitative method. Grounded Theory principles are utilized, first, to identify all the related concepts from the artefacts. Second, these concepts are analysed and finally categorized into a core category and six supporting categories. The result is formulated as a description of how these software practices can operate in circumstances where the accountable software development teams and the related context accept an open source software nature as a part of the business vision and the whole organization supports Agile methods.
Abstract:
In the present dissertation, multilingual thesauri were approached as cultural products and the focus was twofold: on the empirical level the focus was placed on the translatability of certain British-English social science indexing terms into the Finnish language and culture at the concept, term and indexing term levels; on the theoretical level the focus was placed on the aim of translation and on the concept of equivalence. In accordance with modern communicative and dynamic translation theories the interest was in the human dimension. The study is qualitative. In this study, equivalence was understood in a similar way to how dynamic, functional equivalence is commonly understood in translation studies. Translating was seen as a decision-making process, where a translator often has different kinds of possibilities to choose from in order to fulfil the function of the translation. Accordingly, and as a starting point for the construction of the empirical part, the function of the source text was considered to be the same as or similar to the function of the target text, that is, a functional thesaurus in both the source and target context. Further, the study approached the challenges of multilingual thesaurus construction from the perspectives of semantics and pragmatics. In semantic analysis the focus was on what the words conventionally mean and in pragmatics on the ‘invisible’ meaning - or how we recognise what is meant even when it is not actually said (or written). Languages and the ideas expressed by languages are created mainly in accordance with the expressional needs of the surrounding culture, and thesauri were considered to reflect several subcultures and consequently the discourses which represent them. The research material consisted of different kinds of potential discourses: dictionaries, database records, thesauri, Finnish versus British social science researchers, Finnish versus British indexers, simulated indexing tasks with five articles, and Finnish versus British thesaurus constructors. In practice, the professional background of the two last-mentioned groups was rather similar. It became clear that all the material types had their own characteristics, although naturally not entirely separate from each other. It is further noteworthy that the different types and origins of research material were not used to represent true comparison pairs, and that the aim of triangulation of methods and material was to gain a holistic view. The general research questions were: 1. Can differences be found between Finnish and British discourses regarding family roles as thesaurus terms, and if so, what kinds of differences and what are the implications for multilingual thesaurus construction? 2. What is pragmatic indexing term equivalence? The first question studied how the same topic (family roles) was represented in different contexts and by different users, and further focused on how the possible differences were handled in multilingual thesaurus construction. The second question was based on the findings of the previous one and addressed the final question of what kinds of factors should be considered when defining translation equivalence in multilingual thesaurus construction. The study used multiple cases and several data collection and analysis methods, aiming at theoretical replication and complementarity.
The empirical material and analysis consisted of focused interviews (with Finnish and British social scientists, thesaurus constructors and indexers), simulated indexing tasks with Finnish and British indexers, semantic component analysis of dictionary definitions and translations, co-word analysis of datasets retrieved from databases, and discourse analysis of thesauri. As a terminological starting point, the topic and case of family roles was selected. The results were clear: 1) It was possible to identify different discourses. There were also subdiscourses. For example, within the group of social scientists the orientation to qualitative versus quantitative research had an impact on the way they reacted to the studied words and discourses, and indexers placed more emphasis on information seekers, whereas thesaurus constructors approached the construction problems from a more material-based perspective. The differences between the different specialist groups, i.e. the social scientists, the indexers and the thesaurus constructors, were often greater than between the different geo-cultural groups, i.e. Finnish versus British. The differences occurred as a result of different translation aims, diverging expectations for multilingual thesauri and a variety of practices. For multilingual thesaurus construction this means severe challenges. The clearly ambiguous concept of the multilingual thesaurus as well as the different construction and translation strategies should be considered more precisely in order to shed light on focus and equivalence types, which are clearly not self-evident. The research also revealed the close connection between the aims of multilingual thesauri and pragmatic indexing term equivalence. 2) Pragmatic indexing term equivalence is very much context-dependent. Although thesaurus term equivalence is defined and standardised in the field of library and information science (LIS), it is not understood in one established way, and the current LIS tools do not provide adequate analytical tools for constructing and studying different kinds of multilingual thesauri or their indexing term equivalence. The tools provided by translation science were more practical and theoretical, and especially the division of different meanings of a word provided a useful tool in analysing pragmatic equivalence, which often differs from the ideal model represented in the thesaurus construction literature. The study thus showed that the variety of different discourses should be acknowledged, that there is a need for operationalisation of new types of multilingual thesauri, and that the factors influencing pragmatic indexing term equivalence should be discussed more precisely than is traditionally done.
Abstract:
Open source and open source software development have been interesting phenomena during the past decade. Traditional business models do not apply to open source, where the actual product is free. However, it is possible to do business with open source, even successfully, but the question is: how? The aim of this study is to find the key factors for successfully making a business out of commercial open source software development. This is achieved by finding the factors that influence open source projects, finding the relations between those factors, and finding out why some factors explain success more than others. The literature review concentrates first on the background of open innovation, open source and open source software. Then business models, critical success factors and success measures are examined. Based on the existing literature, a framework was created. The framework contains categorized success factors that influence software projects in general as well as open source software projects. The main categories of success factors in software business are community management, technology management, project management and market management. In order to find out which of the factors based on the existing literature are the most critical, empirical research was done by conducting unstructured personal interviews. The main finding based on the interviews is that the critical success factors in open source software business do not differ from those in traditional software business, or in fact from those in any other business. Some factors in the framework emerged in the interviews as key factors: establishing and communicating hierarchy (community management), localization (technology management), good license know-how and IPR management (project management), and effective market management (market management). The interviewees also mentioned critical success factors that are not listed in the framework: low price, a good product and good business model development.
Abstract:
Mass-produced paper electronics (large-area organic printed electronics on paper-based substrates, “throw-away electronics”) has the potential to introduce the use of flexible electronic applications in everyday life. While paper manufacturing and printing have a long history, they were not developed with electronic applications in mind. Modifications to paper substrates and printing processes are required in order to obtain working electronic devices. This should be done while maintaining the high throughput of conventional printing techniques and the low cost and recyclability of paper. An understanding of the interactions between the functional materials, the printing process and the substrate is required for the successful manufacturing of advanced devices on paper. Based on this understanding, a recyclable, multilayer-coated paper-based substrate that combines adequate barrier and printability properties for printed electronics and sensor applications was developed in this work. In this multilayer structure, a thin top-coating consisting of mineral pigments is coated on top of a dispersion-coated barrier layer. The top-coating provides well-controlled sorption properties through controlled thickness and porosity, thus enabling the printability of functional materials to be optimized. The penetration of ink solvents and functional materials stops at the barrier layer, which not only improves the performance of the functional material but also eliminates the potential fiber swelling and de-bonding that can occur when the solvents are allowed to penetrate into the base paper. The multilayer-coated paper under consideration in the current work consists of a pre-coating and a smoothing layer on which the barrier layer is deposited. Coated fine paper may also be used directly as the base paper, ensuring a smooth base for the barrier layer. The top layer is thin and smooth, consisting of mineral pigments such as kaolin, precipitated calcium carbonate, silica or blends of these. All the materials in the coating structure have been chosen in order to maintain the recyclability and sustainability of the substrate. The substrate can be coated in steps, sequentially layer by layer, which requires a detailed understanding and tuning of the wetting properties and topography of the barrier layer versus the surface tension of the top-coating. A cost-competitive method for industrial-scale production is the curtain coating technique, which allows extremely thin top-coatings to be applied simultaneously with a closed and sealed barrier layer. Understanding the interactions between functional materials formulated and applied on paper as inks makes it possible to create a paper-based substrate that can be used to manufacture printed electronics-based devices and sensors on paper. The multitude of functional materials and their complex interactions make it challenging to draw general conclusions in this topic area. Inevitably, the results become partially specific to the device chosen and the materials needed in its manufacturing. Based on the results, it is clear that for inks based on dissolved or small-size functional materials, a barrier layer is beneficial and ensures the functionality of the printed material in a device. The required active barrier lifetime depends on the solvents or analytes used and their volatility. High-aspect-ratio mineral pigments, which create tortuous pathways and physical barriers within the barrier layer, limit the penetration of the solvents used in functional inks.
The surface pore volume and pore size can be optimized for a given printing process and ink through the choice of pigment type and coating layer thickness. However, when manufacturing multilayer functional devices, such as transistors, which consist of several printed layers, compromises have to be made. For example, while a thick and porous top-coating is preferable for printing source and drain electrodes with a silver particle ink, a thinner and less absorbing surface is required to form a functional semiconducting layer. With the multilayer coating structure concept developed in this work, it was possible to make the paper substrate suitable for printed functionality. The possibility of printing functional devices, such as transistors, sensors and pixels, in a roll-to-roll process on paper is demonstrated, which may enable the introduction of paper for use in disposable “one-time use” or “throw-away” electronics and sensors, such as lab-on-strip devices for various analyses, consumer packages equipped with product quality sensors, or remote tracking devices.
Abstract:
Eutrophication caused by anthropogenic nutrient pollution has become one of the most severe threats to water bodies. Nutrients enter water bodies from atmospheric precipitation, industrial and domestic wastewaters and surface runoff from agricultural and forest areas. As point pollution has been significantly reduced in developed countries in recent decades, agricultural non-point sources have been increasingly identified as the largest source of nutrient loading in water bodies. In this study, Lake Säkylän Pyhäjärvi and its catchment are studied as an example of a long-term, voluntary-based, co-operative model of lake and catchment management. Lake Pyhäjärvi is located in the centre of an intensive agricultural area in southwestern Finland. More than 20 professional fishermen operate in the lake area, and the lake is used as a drinking water source and for various recreational activities. Lake Pyhäjärvi is a good example of a large and shallow lake that suffers from eutrophication and is subject to measures to improve this undesired state under changing conditions. Climate change is one of the most important challenges faced by Lake Pyhäjärvi and other water bodies. The results show that climatic variation affects the amounts of runoff and nutrient loading and their timing during the year. The findings from the study area concerning warm winters and their influences on nutrient loading are in accordance with the IPCC scenarios of future climate change. In addition to nutrient reduction measures, the restoration of food chains (biomanipulation) is a key method in water quality management. The food-web structure in Lake Pyhäjärvi has, however, become disturbed due to mild winters, short ice cover and low fish catch. Ice cover that enables winter seining is extremely important to the water quality and ecosystem of Lake Pyhäjärvi, as the vendace stock is one of the key factors affecting the food web and the state of the lake. New methods for the reduction of nutrient loading and the treatment of runoff waters from agriculture, such as sand filters, were tested in field conditions. The results confirm that the filter technique is an applicable method for nutrient reduction, but further development is needed. The ability of sand filters to absorb nutrients can be improved with nutrient binding compounds, such as lime. Long-term hydrological, chemical and biological research and monitoring data on Lake Pyhäjärvi and its catchment provide a basis for water protection measures and improve our understanding of the complicated physical, chemical and biological interactions between the terrestrial and aquatic realms. In addition to measurements carried out in field conditions, Lake Pyhäjärvi and its catchment were studied using various modelling methods. In the calibration and validation of models, long-term and wide-ranging time series data proved to be valuable. Collaboration between researchers, modellers and local water managers further improves the reliability and usefulness of models. Lake Pyhäjärvi and its catchment can also be regarded as a good research laboratory from the point of view of the Baltic Sea. The main problem in both of them is eutrophication caused by excess nutrients, and nutrient loading has to be reduced – especially from agriculture. Mitigation measures are also similar in both cases.
Abstract:
Ontology matching is an important task when data from multiple data sources is integrated. Problems of ontology matching have been studied widely in the research literature, and many different solutions and approaches have also been proposed in commercial software tools. In this survey, well-known approaches to ontology matching, and its subtype schema matching, are reviewed and compared. The aim of this report is to summarize the knowledge about the state-of-the-art solutions from the research literature, discuss how the methods work in different application domains, and analyze the pros and cons of different open source and academic tools in the commercial world.
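The survey does not reproduce any single matching algorithm, but the simplest family it covers, name-based element matching, can be illustrated with a short sketch. Everything below (the element names, the similarity measure and the threshold) is a hypothetical illustration, not taken from the reviewed tools.

```python
# Minimal sketch of a name-based schema matcher, the simplest family of
# techniques covered in ontology/schema matching surveys. Element names from
# two schemas are paired by string similarity; real tools combine this with
# structural, instance-based and semantic matchers.
from difflib import SequenceMatcher

def label_similarity(a: str, b: str) -> float:
    """Normalized string similarity between two element labels."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_schemas(source, target, threshold=0.7):
    """Return (source element, target element, score) pairs above threshold."""
    matches = []
    for s in source:
        best = max(target, key=lambda t: label_similarity(s, t))
        score = label_similarity(s, best)
        if score >= threshold:
            matches.append((s, best, round(score, 2)))
    return matches

# Example with two hypothetical product schemas.
print(match_schemas(["product_name", "unit_price", "vendor"],
                    ["productName", "price_per_unit", "supplier"]))
```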
Abstract:
In this study, a model for the unsteady dynamic behaviour of a once-through counter-flow boiler that uses an organic working fluid is presented. The boiler is a compact waste-heat boiler without a furnace and it has a preheater, a vaporiser and a superheater. The relative lengths of the boiler parts vary with the operating conditions, since they are all parts of a single tube. The present research is part of a study on the unsteady dynamics of an organic Rankine cycle power plant and it will become part of a dynamic process model. The boiler model is presented using a selected example case that uses toluene as the process fluid and flue gas from natural gas combustion as the heat source. The dynamic behaviour of the boiler means transition from the steady initial state towards another steady state that corresponds to the changed process conditions. The solution method chosen was to find such a pressure of the process fluid that the mass of the process fluid in the boiler equals the mass calculated using the mass flows into and out of the boiler during a time step, using the finite difference method. A special method for the fast calculation of the thermal properties was used, because most of the calculation time is spent in calculating the fluid properties. The boiler was divided into elements. The values of the thermodynamic properties and mass flows were calculated in the nodes that connect the elements. Dynamic behaviour was limited to the process fluid and the tube wall, and the heat source was regarded as steady. The elements that connect the preheater to the vaporiser and the vaporiser to the superheater were treated in a special way that takes into account a flexible change from one part to the other. The model consists of the calculation of the steady-state initial distribution of the variables in the nodes, and the calculation of these nodal values in a dynamic state. The initial state of the boiler was obtained from a steady process model that is not a part of the boiler model. The known boundary values that may vary during the dynamic calculation were the inlet temperatures and mass flow rates of both the heat source and the process fluid. A brief examination of the oscillation around a steady state, the so-called Ledinegg instability, was carried out. This examination showed that the pressure drop in the boiler is a third-degree polynomial of the mass flow rate, and the stability criterion is a second-degree polynomial of the enthalpy change in the preheater. The numerical examination showed that oscillations did not exist in the example case. The dynamic boiler model was analysed for linear and step changes of the entering fluid temperatures and flow rates. The problem in verifying the correctness of the achieved results was that there was no possibility to compare them with measurements. This is why the only way was to determine whether the obtained results were intuitively reasonable and whether the results changed logically when the boundary conditions were changed. The numerical stability was checked in a test run in which there was no change in input values. The differences compared with the initial values were so small that the effects of numerical oscillations were negligible. The heat-source-side tests showed that the model gives results that are logical in the directions of the changes, and the order of magnitude of the timescale of the changes is also as expected.
The results of the tests on the process fluid side showed that the model gives reasonable results both for temperature changes that cause small alterations in the process state and for mass flow rate changes causing very great alterations. The test runs showed that the dynamic model has no problems in calculating cases in which the temperature of the entering heat source suddenly drops below that of the tube wall or the process fluid.
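As a rough illustration of the solution method described above (finding, at each time step, the process-fluid pressure at which the stored fluid mass matches the mass balance over the step), the sketch below uses simple bisection. The function mass_in_boiler is a hypothetical stand-in for the thesis's nodal calculation; the actual property routines and discretization are not reproduced here.

```python
# Sketch of the pressure-search step: at each time step, find the process-fluid
# pressure p at which the fluid mass held in the boiler equals the inventory
# implied by the inlet and outlet mass flows. mass_in_boiler(p) stands in for
# summing density (from fast property routines) times element volume over the
# preheater, vaporiser and superheater nodes.

def solve_pressure(mass_prev, m_dot_in, m_dot_out, dt,
                   mass_in_boiler, p_lo, p_hi, tol=1e-6):
    """Bisection on pressure so that stored mass matches the mass balance."""
    target = mass_prev + (m_dot_in - m_dot_out) * dt  # mass balance over dt
    for _ in range(200):
        p_mid = 0.5 * (p_lo + p_hi)
        residual = mass_in_boiler(p_mid) - target
        if abs(residual) < tol:
            return p_mid
        # Assumes stored mass increases monotonically with pressure
        # (higher pressure -> denser fluid), so bisection brackets the root.
        if residual > 0.0:
            p_hi = p_mid
        else:
            p_lo = p_mid
    return 0.5 * (p_lo + p_hi)
```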
Abstract:
The aim of the study was to examine the knowledge of the individual and the organization and how it is increased. The purpose was to find factors that connect individual and organizational knowledge. The research methodology was an empirical descriptive study and the research method was qualitative, based on ten thematic interviews. The result of the study was that knowledge workers increase their own and the organization's knowledge by similar means, and little difference was perceived between the two. Learning by experience at work, through trial and error, was reported as the most important method of increasing one's own knowledge. Books were valued particularly highly for increasing knowledge, surpassing even the internet as a source of information. Work colleagues were the third important source of knowledge. The two most important prerequisites for increasing knowledge were the knowledge base acquired during studies and one's own interest and motivation. In increasing organizational knowledge, the most important factors were documentation, learning from mistakes, internal communication, information systems and an open organizational culture. Based on the study, a three-stage model of a knowledge-growth cycle emerged, whose elements are prerequisites, sources and methods. At the centre of the cycle is learning at work, which is the most important factor in increasing knowledge. The case company's performance as a learning organization showed that the company should invest particularly in supporting and managing learning. The documentation of knowledge and its conversion into an explicit form, as well as the company's processes, also need clarification. The openness and atmosphere of the organization proved to be good, which in turn supports the growth of knowledge.
Abstract:
The objectives of the study were to introduce current issues in chilled and frozen food packaging, prioritize certain customer needs, and review brand owners’ opinions and ways of acting. The packaging industry and packaged food markets were reviewed. Interviews with food industry brand owners were used as data sources. The analytic hierarchy process was used to prioritize the customer needs. Food packaging, special features of ready meals, chilled and frozen food packaging, and the packaging industry and packaged food markets in Europe are approached using literature and market reviews and forecasts. In the empirical part, the customer needs for paperboard trays are prioritized. The most important features for brand owners are related to product safety and environmental issues. Paperboard trays have benefits compared to competing packaging solutions: paperboard is recyclable, comes from a renewable source and has good printability. Emerging issues in packaging technology are biodegradable packages, elderly-friendly features such as ease of opening and re-closing possibility, brand protection, and tamper-evident solutions.
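The abstract names the analytic hierarchy process (AHP) as the prioritization method. A minimal sketch of the AHP step is given below; the priority-vector and consistency calculations are standard AHP machinery, while the example criteria and judgements are hypothetical and not the study's data.

```python
# Minimal AHP sketch: derive priority weights for customer needs from a
# pairwise comparison matrix. Example matrix and criterion names are
# hypothetical illustrations only.
import numpy as np

def ahp_priorities(pairwise: np.ndarray) -> np.ndarray:
    """Priority vector via the row geometric mean approximation."""
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[0])
    return gm / gm.sum()

def consistency_ratio(pairwise: np.ndarray) -> float:
    """Saaty's consistency ratio; values below about 0.1 are usually accepted."""
    n = pairwise.shape[0]
    w = ahp_priorities(pairwise)
    lambda_max = float(np.mean((pairwise @ w) / w))
    ci = (lambda_max - n) / (n - 1)
    random_index = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # excerpt of Saaty's RI table
    return ci / random_index

# Hypothetical judgements: product safety vs. environmental issues vs. printability.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_priorities(A).round(3), round(consistency_ratio(A), 3))
```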
Abstract:
The performance of a grid-connected photovoltaic system working with a DC-Boost stage is investigated. The DC-Boost converter topology is derived from a three-phase half-controlled bridge and is controlled by sliding mode control. Since the grid-connected photovoltaic system includes solar cells as a DC source and an inverter for the grid connection, these are also within the scope of this research. The advantages of using MPPT are analyzed. The system is simulated in the Matlab-Simulink™ environment.
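The abstract analyses the advantages of MPPT but does not state which tracking algorithm is used. Purely as an illustration, the sketch below implements the widely used perturb-and-observe scheme; measure_pv and set_duty are hypothetical callbacks for the array measurement and the converter duty cycle, and the step size is arbitrary.

```python
# Illustrative perturb-and-observe MPPT loop (a generic sketch, not the
# thesis's implementation). The duty cycle is nudged in one direction; if the
# measured PV power falls, the perturbation direction is reversed.

def perturb_and_observe(measure_pv, set_duty, d0=0.5, step=0.01, iterations=100):
    duty = d0
    v, i = measure_pv()
    p_prev = v * i
    direction = 1
    for _ in range(iterations):
        duty = min(max(duty + direction * step, 0.0), 1.0)
        set_duty(duty)
        v, i = measure_pv()
        p = v * i
        if p < p_prev:          # power fell: reverse the perturbation
            direction = -direction
        p_prev = p
    return duty
```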
Abstract:
The use of information technology is important for the growth of a micro-enterprise. Using an action research approach, the thesis sought the best software to support customer relationship management for the sole trader KaunisSinä, which sells cosmetics by direct sales, according to the goals and constraints set by the part-time entrepreneur behind the business. For the software selection, software acquisition methods were studied for commercial off-the-shelf software, open source software and custom-built software. Based on a survey of the entrepreneur's working practices, criteria were formed for comparing and selecting the software. The weighted average method was used in the comparison. Open source customer relationship management software with suitable features is available on the market. The selection is a compromise between the functionality and features offered by the software and the working practices established in the company. The company therefore has to partly adapt its practices to the software.
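The weighted average comparison mentioned above can be sketched as follows; the criteria, weights, candidate names and scores are hypothetical illustrations, not the thesis's evaluation data.

```python
# Minimal sketch of a weighted average software comparison. Each candidate is
# scored per criterion (here 1-5); the candidate with the highest weighted
# average is selected.
weights = {"customer register": 0.35, "ease of use": 0.25,
           "reporting": 0.20, "cost": 0.20}            # weights sum to 1.0

scores = {
    "Candidate A": {"customer register": 4, "ease of use": 3, "reporting": 4, "cost": 5},
    "Candidate B": {"customer register": 4, "ease of use": 4, "reporting": 3, "cost": 5},
    "Candidate C": {"customer register": 5, "ease of use": 4, "reporting": 3, "cost": 2},
}

def weighted_average(candidate: dict) -> float:
    return sum(weights[c] * candidate[c] for c in weights)

for name, s in scores.items():
    print(f"{name}: {weighted_average(s):.2f}")
```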
Abstract:
Global climate change caused by the accelerating greenhouse effect is one of the greatest challenges of our time. The intensification of the greenhouse effect is a consequence of human activity, but nature's own processes also act as sources and sinks of greenhouse gases. This thesis examines the magnitude and significance of natural greenhouse gas sources and sinks at the regional level. In addition, regional measures are proposed for maintaining and increasing greenhouse gas sinks. The region of Central Finland was used as an example, and the calculation was targeted at the year 2008. In 2008, natural processes in Central Finland acted as a net sink of 1,813,701 tonnes of carbon dioxide equivalent. Tree stands sequestered the most greenhouse gases, with a sink of 3,019,360 tonnes of carbon dioxide equivalent. Carbon dioxide, methane and nitrous oxide were included in the calculation. The calculation was carried out, where applicable, according to the principles of national greenhouse gas reporting and with emission and sink factors based on the research literature. This thesis shows that natural sinks play an important role in mitigating climate change. Measures aimed at preserving and increasing sinks can influence atmospheric greenhouse gas concentrations cost-effectively at both the global and the regional level.
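The aggregation of the three gases into carbon dioxide equivalents can be sketched as below. The global warming potentials used (CH4 = 21, N2O = 310) are the IPCC Second Assessment Report factors commonly applied in national inventory reporting around 2008; the thesis's exact factors and the example fluxes are assumptions for illustration.

```python
# Sketch of aggregating CO2, CH4 and N2O fluxes into CO2-equivalents.
# GWP factors assumed from SAR-era national reporting practice; example
# regional fluxes are hypothetical (sinks are negative).
GWP = {"CO2": 1, "CH4": 21, "N2O": 310}

def co2_equivalent(fluxes_t: dict) -> float:
    """Net flux in tonnes of CO2-eq; a negative total denotes a net sink."""
    return sum(GWP[gas] * tonnes for gas, tonnes in fluxes_t.items())

example = {"CO2": -2_500_000, "CH4": 30_000, "N2O": 150}  # tonnes per year
print(f"{co2_equivalent(example):,.0f} t CO2-eq")
```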
Abstract:
The thesis examines various e-portfolio systems, their features and functionality, and compares selected e-portfolio systems with each other from the perspective of compatibility and interoperability in a university setting. The goal is to select the most versatile and student-friendly of the e-portfolio systems. The system must support the student's lifelong learning. An e-portfolio is a collection of one's own work in electronic form, and it is managed with an e-portfolio system. Portfolio specifications play an important role in e-portfolio systems, as they allow an e-portfolio to be transferred as such to another system through interfaces. The thesis does not focus on the personal learning environment, which is a broader concept than the e-portfolio. The thesis compared open source, web-based e-portfolio systems and examined the export and import possibilities they offer for portfolios. In addition, the systems' ability to integrate with other systems, such as virtual learning environments, was examined. As a result of the comparison, the most versatile and student-friendly e-portfolio system is Mahara and the most common portfolio specification is Leap2A.