23 results for Architecture and climate
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Summary: Applying cultivation zones and growth models in the study of climate change: the Mackenzie River basin, Canada
Abstract:
Summary: Development of atmospheric, weather and climate scenarios for northern regions
Abstract:
The aim of this dissertation is to bridge and synthesize the different streams of literature addressing ecosystem architecture through a multiple-lens perspective. In addition, the structural properties of, and the processes to design and manage, the architecture are examined. With this approach, the oft-neglected actor-structure duality is addressed, and both position and structure, and action and process, come under scrutiny. Further, the developed framework and empirical evidence offer valuable insights into how firms collectively create value and individually appropriate value. The dissertation is divided into two parts. The first part comprises a literature review as well as the conclusions of the whole study, and the second part includes six research publications. The dissertation is based on three different reasoning logics: abduction, induction and deduction; related qualitative and quantitative methodologies are utilized in the empirical examination of the phenomenon in the information and communication technology industry. The results suggest, firstly, that there are endogenous and exogenous structural properties of the ecosystem architecture. Of these, the former can be more easily influenced by a particular actor, whereas the latter are taken more or less for granted. Secondly, the exogenous ecosystem design properties influence the value creation potential of the ecosystem, whereas the endogenous ecosystem design properties influence the value appropriation potential of a particular actor in the ecosystem. Thirdly, the study suggests that there is a relationship between endogenous and exogenous structural properties, in that the endogenous properties can be leveraged to create and reconfigure the exogenous properties, whereas the exogenous properties pose opportunities and restrictions on the use of the endogenous ones. In addition, the study suggests that there are different emergent and engineered processes to design and manage ecosystem architecture and to influence both its endogenous and exogenous structural properties. This study makes three main contributions. First, on the conceptual level, it brings coherence and direction to the fast-growing body of literature on novel inter-organizational arrangements, such as ecosystems. It does this by bridging and synthesizing three different streams of literature, namely the boundary, design and orchestration conceptions. Secondly, it sets out a framework that enhances our understanding of the structural properties of ecosystem architecture, of the processes to design and manage ecosystem architecture, and of their influence on the value creation potential of the ecosystem and the value capture potential of a particular firm. Thirdly, it offers empirical evidence of these structural properties and processes.
Abstract:
Global climate change and intentional climate modification, i.e. geoengineering, involve various ethical problems which are entangled in a complex ensemble of questions regarding the future of the biosphere. The possibility of catastrophic effects of climate change, also referred to as a “climate emergency”, has led to the emergence of the idea of modifying atmospheric conditions through geoengineering. The novel issue of weather ethics is a subdivision of climate ethics, concerned with the ethical and political questions surrounding weather and climate control and modification on a restricted spatio-temporal scale. The objective of geoengineering is to counterbalance, in various ways and on a large scale, the adverse effects of climate change and its diverse corollaries. This dissertation argues that there are ethical grounds for holding that, at present, large-scale interventions in the climate system are ethically questionable. Justifying the pursuit of geoengineering merely by weighing its pros and cons is inadequate. Moral judgement can still be elaborated in cases where decisions have to be made urgently and the selection of desirable choices is severely limited. The changes needed to avoid severe negative impacts of climate change require a commitment to mitigation as well as social change, because technical solutions alone cannot resolve the issue of climate change. The quantitative emphasis of consumerism should shift to a qualitative focus on the aspiration for simplicity, in order to move towards the objective of the continued existence of humankind and a flourishing, vital biosphere.
Abstract:
This master's thesis examines the WAP Push framework. The WAP standards define how Internet-type services, accessible from a variety of mobile terminals, are implemented in an efficient and network-technology-independent way. WAP builds on the Internet but takes into account the limitations and special characteristics of small terminals and mobile networks. The WAP Push framework defines network-initiated delivery of service content. The theoretical part of the thesis reviews the general WAP architecture and the WAP protocol stack, using the architecture and protocol stack of the wired Internet as points of comparison. Building on this foundation, the WAP Push framework is then examined. The practical part describes the design and development of a WAP Push proxy server, a central network element of the WAP Push framework. The proxy server connects the Internet and the mobile network in a way that hides the technology differences from the content provider on the Internet side.
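The network-initiated delivery described in this abstract can be pictured with a minimal sketch: a push initiator on the Internet side submits content to a push proxy gateway, which forwards it towards the mobile terminal. The gateway URL, recipient header and payload below are hypothetical placeholders for illustration only and do not reproduce the exact Push Access Protocol message format used in real WAP Push deployments.

```python
# Minimal illustrative sketch of network-initiated content delivery:
# a push initiator submits content to a hypothetical push proxy gateway,
# which hides the mobile-network side from the Internet-side provider.
# The URL, recipient header and payload are placeholders, not real PAP syntax.
import urllib.request

PPG_URL = "http://ppg.example.com/push"     # hypothetical gateway endpoint
RECIPIENT = "+358401234567"                 # hypothetical terminal address

payload = "<si><indication href='http://example.com/news'>New content</indication></si>"

request = urllib.request.Request(
    PPG_URL,
    data=payload.encode("utf-8"),
    headers={
        "Content-Type": "text/xml",         # simplified; real PAP uses multipart
        "X-Recipient": RECIPIENT,           # illustrative custom header
    },
    method="POST",
)

# The push initiator never talks to the mobile network directly; the proxy
# gateway mediates. (This call would fail against the fictional host above.)
with urllib.request.urlopen(request) as response:
    print(response.status)
```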
Abstract:
What is presence? This thesis defines presence as the willingness of a given person, device or service to communicate. Numerous applications that distribute presence information exist today, each using a different protocol to carry out the task. Only recently have application developers recognized the need for a single application capable of supporting several presence protocols. The Session Initiation Protocol (SIP) can distribute presence information in addition to its other capabilities. Whereas other protocols are used only for real-time messaging and the delivery of presence information, SIP is capable of much more: it was originally designed to initiate, modify and terminate multimedia sessions between parties. The implementation of the architecture uses two basic features of the Symbian operating system: the client-server structure and the contact database. The client-server structure separates the client from the protocol, providing the foundation for an extensible multi-protocol architecture, while the contact database acts as the store for presence information. The outcome of the work is a presence client running on the Symbian operating system.
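The client-server separation and protocol-independent design described above can be sketched roughly as follows. The class and method names are hypothetical and only illustrate how a presence client could be decoupled from individual presence protocols behind a common interface; the actual thesis implementation targets Symbian OS and its contact database.

```python
# Rough sketch of an extensible multi-protocol presence architecture: the
# server side hides the protocol behind a common interface, and concrete
# protocol plug-ins (e.g. SIP) are registered behind it. All names are
# hypothetical illustrations, not the thesis's Symbian implementation.
from abc import ABC, abstractmethod


class PresenceProtocol(ABC):
    """Common interface every presence protocol plug-in implements."""

    @abstractmethod
    def publish(self, user: str, status: str) -> None: ...

    @abstractmethod
    def subscribe(self, contact: str) -> str: ...


class SipPresence(PresenceProtocol):
    """Placeholder SIP-based plug-in (no real signalling here)."""

    def publish(self, user: str, status: str) -> None:
        print(f"SIP PUBLISH: {user} is now '{status}'")

    def subscribe(self, contact: str) -> str:
        return f"SIP SUBSCRIBE sent for {contact}"


class PresenceServer:
    """Server side of the client-server split: hides the protocol and keeps
    a simple in-memory stand-in for the contact database."""

    def __init__(self, protocol: PresenceProtocol):
        self._protocol = protocol
        self._contacts: dict[str, str] = {}

    def set_status(self, user: str, status: str) -> None:
        self._contacts[user] = status
        self._protocol.publish(user, status)


server = PresenceServer(SipPresence())
server.set_status("alice", "available")
```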
Abstract:
Organizational culture, trust and innovativeness have each been studied extensively, but so far little holistic research combining these concepts has been conducted, or at least reported in scientific publications. This study addressed the relationships between organizational culture, trust and innovativeness in different types of organizational culture. The aim was to examine the effect of organizational culture on trust (lateral, vertical and institutional trust based on competence, benevolence and integrity), on the innovation climate, and on the effectiveness of innovation activity. The connections between organizational culture, trust and innovativeness were examined in four types of organizational culture (clan, adhocracy, hierarchy and market cultures), based on the competing values framework. The empirical part of the study was carried out as a postal and Internet-based survey in 40 organizational units and analysed with statistical methods. At a general level, the study found that the levels of trust and innovativeness vary across different types of organizational culture. More specifically, trust and innovativeness were high in clan and adhocracy cultures, and these cultures also had a positive effect on the effectiveness of innovation activity. The roles of institutional trust and support for innovation were particularly important, as they acted as mediators in the relationship between organizational culture and innovativeness. In contrast, trust and innovativeness had no effect, or a negative effect, in hierarchy and market cultures. The study also demonstrated the importance of institutional organizational trust, which has previously received very little research attention, for the innovativeness of organizations.
Abstract:
Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces provide web users with online access to myriads of databases on the Web. In order to obtain some information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results, a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is a huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: Though the term deep Web was coined in 2000, which is sufficiently long ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web conducted so far are predominantly based on the study of deep web sites in English. One can therefore expect that the findings of these surveys may be biased, especially owing to the steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from this national segment of the Web. Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions do not hold true, mostly because of the large scale of the deep Web: indeed, for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources.
Unlike almost all other existing approaches to the deep Web, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user, all the more so because the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and not feasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. In this way, the automation of querying and retrieving data behind search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. Besides, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
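As a rough illustration of what querying a web database through its search interface involves, the sketch below represents a search form as a small data model and submits one query with Python's standard library. The form URL and field names are hypothetical, and the snippet is not the thesis's form query language or the I-Crawler, only the general idea of programmatic form filling.

```python
# Illustrative sketch only: a tiny data model for a search interface and a
# programmatic form submission. The URL and field names are hypothetical.
import urllib.parse
import urllib.request
from dataclasses import dataclass, field


@dataclass
class SearchInterface:
    """Minimal representation of a web search form."""
    action_url: str                                        # submission target
    method: str = "GET"                                    # GET or POST
    fields: dict[str, str] = field(default_factory=dict)   # field name -> label


iface = SearchInterface(
    action_url="http://books.example.com/search",
    fields={"q": "Title keywords", "lang": "Language"},
)


def query(interface: SearchInterface, values: dict[str, str]) -> str:
    """Fill the form fields with the given values and fetch the result page."""
    params = urllib.parse.urlencode(values)
    if interface.method.upper() == "GET":
        url = f"{interface.action_url}?{params}"
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8", errors="replace")
    with urllib.request.urlopen(interface.action_url, data=params.encode()) as resp:
        return resp.read().decode("utf-8", errors="replace")


# Example usage (the host is fictional, so this call would fail if actually run):
# html = query(iface, {"q": "climate", "lang": "fi"})
```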
Abstract:
This study considered the current situation of the biofuels markets in Finland. Specific to the Finnish energy system are the facts that industry consumes more than half of the total primary energy, that combined heat and power production is widely applied, and that solid biomass fuels account for a high share of total energy consumption. Wood is the most important source of bioenergy in Finland, representing 21% of the total energy consumption in 2006. Almost 80% of wood-based energy is recovered from industrial by-products and residues. Finland has committed itself to keeping its greenhouse gas emissions at, or below, the 1990 level during the period 2008–2012. The energy and climate policy carried out in recent years has been based on the National Energy and Climate Strategy introduced in 2005. Finnish energy policy aims to achieve this target, and a variety of measures are taken to promote the use of renewable energy sources and especially wood fuels. In 2007, the government started to prepare a new long-term (up to the year 2050) climate and energy strategy that will meet the EU's new targets for the reduction of greenhouse gas emissions and the promotion of renewable energy sources. The new strategy will be introduced during 2008. International biofuels trade is of substantial importance for the utilisation of bioenergy in Finland. In 2006, the total international trade in solid and liquid biofuels was approximately 64 PJ, of which imports accounted for 61 PJ. Most of the import is indirect and takes place within the forest industry's raw wood imports. In 2006, as much as 24% of wood energy was based on foreign-origin wood. Wood pellets and tall oil form the majority of the biofuel export streams. The indirect import of wood fuels increased by almost 10% in 2004–2006, while direct trade in solid and liquid biofuels remained almost constant.
Abstract:
The purpose of this thesis was to investigate how to create and improve category-level purchasing visibility for corporate procurement by utilizing financial information. The thesis was part of the global category-driven spend analysis project of Konecranes Plc. While building a general understanding of category-driven corporate spend visibility, the IT architecture and the purchasing parameters needed for spend analysis were described. In the case part of the study, three manufacturing plants from the Konecranes Standard Lifting, Heavy Lifting and Services business areas were examined. This included investigating the operative IT system architecture and the processes needed for building corporate spend visibility. The key finding of this study was the identification of the processes needed for gathering purchasing data elements when creating corporate spend visibility in a fragmented source-system environment. As an outcome of the study, a roadmap presenting further development areas was introduced for Konecranes.
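The core of category-driven spend visibility is aggregating purchase data from several source systems into category-level totals. The sketch below is a minimal, purely hypothetical illustration of that aggregation step; the record fields, category names and amounts are invented and do not reflect Konecranes data or systems.

```python
# Illustrative sketch: combining purchase lines from several (hypothetical)
# source systems and aggregating spend per category to build category-level
# spend visibility. All field names, categories and amounts are made up.
from collections import defaultdict

# Purchase lines as they might arrive from different ERP instances.
erp_plant_a = [
    {"supplier": "Steel Oy", "category": "Raw materials", "amount_eur": 120_000},
    {"supplier": "Motors AB", "category": "Components", "amount_eur": 45_000},
]
erp_plant_b = [
    {"supplier": "Steel Oy", "category": "Raw materials", "amount_eur": 80_000},
    {"supplier": "Logistics Ltd", "category": "Freight", "amount_eur": 15_000},
]

spend_by_category: dict[str, float] = defaultdict(float)
for line in erp_plant_a + erp_plant_b:
    spend_by_category[line["category"]] += line["amount_eur"]

# Print categories in descending order of spend.
for category, total in sorted(spend_by_category.items(), key=lambda x: -x[1]):
    print(f"{category:15s} {total:>10,.0f} EUR")
```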
Abstract:
Mathematical models often contain parameters that need to be calibrated from measured data. The emergence of efficient Markov Chain Monte Carlo (MCMC) methods has made the Bayesian approach a standard tool in quantifying the uncertainty in the parameters. With MCMC, the parameter estimation problem can be solved in a fully statistical manner, and the whole distribution of the parameters can be explored, instead of obtaining point estimates and using, e.g., Gaussian approximations. In this thesis, MCMC methods are applied to parameter estimation problems in chemical reaction engineering, population ecology, and climate modeling. Motivated by the climate model experiments, the methods are developed further to make them more suitable for problems where the model is computationally intensive. After the parameters are estimated, one can start to use the model for various tasks. Two such tasks are studied in this thesis: optimal design of experiments, where the task is to design the next measurements so that the parameter uncertainty is minimized, and model-based optimization, where a model-based quantity, such as the product yield in a chemical reaction model, is optimized. In this thesis, novel ways to perform these tasks are developed, based on the output of MCMC parameter estimation. A separate topic is dynamical state estimation, where the task is to estimate the dynamically changing model state, instead of static parameters. For example, in numerical weather prediction, an estimate of the state of the atmosphere must constantly be updated based on the recently obtained measurements. In this thesis, a novel hybrid state estimation method is developed, which combines elements from deterministic and random sampling methods.
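For readers unfamiliar with MCMC-based parameter estimation, the following is a minimal random-walk Metropolis sketch for a generic model with Gaussian measurement noise. The model, synthetic data and tuning constants are invented for illustration; the snippet shows only the basic idea, not the adaptive or computationally efficient methods developed in the thesis.

```python
# Minimal random-walk Metropolis sketch for Bayesian parameter estimation.
# Model, data and tuning constants are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a simple exponential decay model y = a * exp(-b * t).
t = np.linspace(0.0, 5.0, 30)
true_params = np.array([2.0, 0.7])
y_obs = true_params[0] * np.exp(-true_params[1] * t) + rng.normal(0, 0.05, t.size)
sigma = 0.05  # assumed known measurement noise


def log_posterior(params: np.ndarray) -> float:
    a, b = params
    if a <= 0 or b <= 0:          # flat prior restricted to the positive quadrant
        return -np.inf
    residuals = y_obs - a * np.exp(-b * t)
    return -0.5 * np.sum((residuals / sigma) ** 2)


n_steps, step = 20_000, 0.05
chain = np.empty((n_steps, 2))
current = np.array([1.0, 1.0])
current_lp = log_posterior(current)

for i in range(n_steps):
    proposal = current + step * rng.standard_normal(2)   # random-walk proposal
    proposal_lp = log_posterior(proposal)
    if np.log(rng.random()) < proposal_lp - current_lp:  # Metropolis accept/reject
        current, current_lp = proposal, proposal_lp
    chain[i] = current

print("posterior means:", chain[n_steps // 2:].mean(axis=0))  # discard burn-in
```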
Abstract:
The necessity of integrating EC (Electronic Commerce) with enterprise systems follows from the integrated nature of enterprise systems. The proven benefits of EC in providing competitive advantages force organizations to adopt EC and integrate it with their enterprise systems. Integration is a complex task: it must facilitate a seamless flow of information and data between different systems within and across enterprises. Different systems run on different platforms; thus, to integrate systems with different platforms and infrastructures, integration technologies such as middleware, SOA (Service-Oriented Architecture), ESB (Enterprise Service Bus), JCA (J2EE Connector Architecture), and B2B (Business-to-Business) integration standards are required. Major software vendors, such as Oracle, IBM, Microsoft, and SAP, offer various solutions to EC and enterprise systems integration problems. There is limited literature covering the integration of EC and enterprise systems in detail: most studies in this area have focused on the factors that influence the adoption of EC by enterprises, or provide only limited information about a specific platform or integration methodology. Therefore, this thesis covers the technical details of EC and enterprise systems integration, addressing both adoption factors and integration solutions. In this study, a large body of literature was reviewed and different solutions were investigated. Different enterprise integration approaches as well as the most popular integration technologies were examined. Moreover, various methodologies for integrating EC and enterprise systems were studied in detail and different solutions were examined. The factors influencing the adoption of EC in enterprises were studied on the basis of previous literature and categorized into technical, social, managerial, financial, and human resource factors. Integration technologies were categorized according to three levels of integration: data, application, and process. In addition, different integration approaches were identified and categorized based on their communication and platform. Different EC integration solutions were also investigated and categorized based on the identified integration approaches. By considering these different aspects of integration, this study is a valuable resource for architects, developers, and system integrators seeking to adopt EC and integrate it with enterprise systems.
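The application-level integration approach discussed in this abstract (loosely coupled systems communicating through a mediating bus rather than calling each other directly) can be sketched minimally as follows. The bus, topic names and handlers are hypothetical and only illustrate the pattern, not any vendor's ESB product or the solutions examined in the thesis.

```python
# Minimal sketch of application-level integration through a message bus,
# in the spirit of ESB/SOA-style loose coupling. The bus, topics and
# handlers are hypothetical illustrations, not a vendor product.
from collections import defaultdict
from typing import Callable


class MessageBus:
    """Tiny in-process publish/subscribe mediator."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)


bus = MessageBus()

# The web shop (EC system) and the back-end systems never call each other
# directly; each side only knows the bus and the agreed message format.
bus.subscribe("order.created", lambda msg: print("ERP: create sales order", msg["id"]))
bus.subscribe("order.created", lambda msg: print("CRM: log customer activity", msg["id"]))

bus.publish("order.created", {"id": 1001, "customer": "ACME", "total_eur": 250.0})
```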