27 results for Multi Domain Information Model

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance: 100.00%

Abstract:

Digital business ecosystems (DBE) are becoming an increasingly popular concept for modelling and building distributed systems in heterogeneous, decentralized and open environments. Information and communication technology (ICT) enabled business solutions have created an opportunity for automated business relations and transactions. The deployment of ICT in business-to-business (B2B) integration seeks to improve competitiveness by establishing real-time information and offering better information visibility to business ecosystem actors. The flows of products, components and raw materials in supply chains are traditionally studied in logistics research. In this study, we expand the research to cover the parallel service and information flows as information logistics integration. In this thesis, we show how better integration and automation of information flows enhance the speed of processes and thus provide cost savings and other benefits for organizations. Investments in DBE are intended to add value through business automation and are key decisions in building up information logistics integration. Business solutions that build on automation are important sources of value in networks that promote and support business relations and transactions. Value is created through improved productivity and effectiveness when new, more efficient collaboration methods are discovered and integrated into the DBE. Organizations, business networks and collaborations, even with competitors, form DBEs in which information logistics integration has a significant role as a value driver. However, traditional economic and computing theories do not treat digital business ecosystems as a separate form of organization, and they do not provide conceptual frameworks for exploring digital business ecosystems as value drivers; combined internal management and external coordination mechanisms for information logistics integration are not part of current practice in a company's strategic process. In this thesis, we have developed and tested a framework for exploring digital business ecosystems and a coordination model for digital business ecosystem integration; moreover, we have analysed the value of information logistics integration. The research is based on a case study and on mixed methods, in which we use the Delphi method and Internet-based tools for idea generation and development. We conducted many interviews with key experts, which we recorded, transcribed and coded to find success factors. Quantitative analyses were based on Monte Carlo simulation, which sought cost savings, and Real Option Valuation, which sought an optimal investment program at the ecosystem level. This study provides valuable knowledge regarding information logistics integration by utilizing a suitable business process information model for collaboration. The information model is based on business process scenarios and on detailed transactions for the mapping and automation of product, service and information flows. The research results illustrate the current gap in understanding information logistics integration in a digital business ecosystem. Based on the success factors, we were able to illustrate how specific coordination mechanisms related to network management and orchestration could be designed. We also pointed out the potential of information logistics integration in value creation. With the help of global standardization experts, we designed the core information model for B2B integration.
We built this quantitative analysis by using a Monte Carlo-based simulation model and the Real Option Value model. This research covers relevant new research disciplines, such as information logistics integration and digital business ecosystems, in which the current literature is still in need of improvement. The research was carried out with high-level experts and managers responsible for global business network B2B integration. However, the research was dominated by one industry domain, and therefore a more comprehensive exploration should be undertaken to cover a larger population of business sectors. Based on this research, a new quantitative survey could provide new possibilities to examine information logistics integration in digital business ecosystems. The value activities indicate that further studies should continue, especially with regard to the collaboration issues in integration, focusing on a user-centric approach. We should better understand how real-time information supports customer value creation by embedding the information into the lifetime value of products and services. The aim of this research was to build competitive advantage through B2B integration to support a real-time economy. For practitioners, this research created several tools and concepts to improve value activities, information logistics integration design, and management and orchestration models. Based on the results, the companies were able to better understand the formation of the digital business ecosystem and the importance of joint efforts in collaboration. However, the challenge of incorporating this new knowledge into strategic processes in a multi-stakeholder environment remains. This challenge has been noted, and new projects have been established in pursuit of a real-time economy.
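The abstract names Monte Carlo simulation as the tool for the cost-savings analysis. As a loose illustration of that kind of analysis only (not the thesis's actual model: the distributions, volumes and cost figures below are invented), a minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_trials = 100_000

# Illustrative assumptions: annual B2B transaction volume and the
# per-transaction saving from automated information flows are uncertain,
# so both are drawn from distributions rather than fixed.
volume = rng.normal(loc=500_000, scale=50_000, size=n_trials)           # transactions/year
saving = rng.triangular(left=0.5, mode=1.2, right=3.0, size=n_trials)   # EUR saved/transaction
integration_cost = 400_000  # hypothetical one-off integration investment, EUR

net_benefit = volume * saving - integration_cost

print(f"mean annual net benefit: {net_benefit.mean():,.0f} EUR")
print(f"5th-95th percentile: {np.percentile(net_benefit, [5, 95])}")
print(f"P(net benefit < 0): {(net_benefit < 0).mean():.2%}")
```

Sampling the net benefit rather than computing a single point estimate is what lets this kind of analysis report a downside probability alongside the expected saving.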

Relevance: 100.00%

Abstract:

In the literature, CO2 liquefaction is well studied with steady-state modelling. Steady-state modelling gives an overview of the process, but it does not give information about process behaviour during transients. In this master's thesis, three dynamic models of CO2 liquefaction were built and tested: a straight multi-stage compression model and two compression-liquid-pumping models, one with and one without cold energy recovery. The models were made with the Apros software and were also used to verify that Apros is capable of modelling phase changes and the supercritical state of CO2. The models were verified against compressor manufacturer's data and simulation results presented in the literature. Of the models made in this thesis, the straight compression model was found to be the most energy-efficient and the fastest to react to transients. Apros was also found to be a capable tool for dynamic liquefaction modelling.
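The Apros models themselves are not reproducible here, but the specific-work comparison that underlies such energy-efficiency studies can be sketched with textbook compression thermodynamics. A minimal sketch, assuming ideal-gas isentropic stages with perfect intercooling; the property values are rough illustrative assumptions, not data from the thesis:

```python
import math

# Illustrative ideal-gas estimate of specific compression work for CO2,
# n_stages stages with equal pressure ratios and intercooling back to T_in.
R = 188.9      # J/(kg K), specific gas constant of CO2
kappa = 1.28   # heat capacity ratio of CO2 (rough ideal-gas assumption)
T_in = 293.15  # K, suction / intercooler outlet temperature
p_in, p_out = 1.0e5, 60.0e5  # Pa, overall pressure rise (invented)

def specific_work(n_stages: int) -> float:
    """Isentropic specific work (J/kg) with perfect intercooling."""
    ratio = (p_out / p_in) ** (1.0 / n_stages)  # equal per-stage pressure ratio
    per_stage = (kappa * R * T_in / (kappa - 1.0)) * (
        ratio ** ((kappa - 1.0) / kappa) - 1.0
    )
    return n_stages * per_stage

for n in (1, 2, 3, 4):
    print(f"{n} stage(s): {specific_work(n) / 1e3:.1f} kJ/kg")
```

The diminishing reduction in work as stages are added is the classic trade-off such dynamic studies then weigh against transient response and equipment cost.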

Relevance: 100.00%

Abstract:

The main objective of this Master's Thesis was to examine the city brand image as perceived by tourists and residents. This was done by first examining the contribution of city attributes and marketing communications to the formation of brand attitudes, and then discovering how brand attitudes influence the city brand image. The impact of brand attitudes and city brand image on behavioral intention was also reviewed. The empirical part of the thesis was conducted with a quantitative method through an online survey. The sample (n = 492) consisted of tourists and residents of the case city. The data were analyzed statistically with the SPSS program. Brand attitudes, based on the main attributes, were calculated with a multi-attribute attitude model. The results confirmed that exposure to marketing communications has a direct and positive influence on brand attitudes, especially offline marketing communications. The findings revealed that brand attitudes have a direct impact on the perception of the city brand image, and that brand attitudes and brand image dimensions had a direct impact on tourists' and residents' behavioral intention. The findings provide important information for city marketers: they increase marketers' understanding of how the target population perceives the city brand image and how it affects their future behavior. This thesis reveals the perception of the current city brand image and gives guidance on what to emphasize in city branding to increase the city's attractiveness in conjunction with its economic development. Furthermore, the created framework can also be utilized in future research.
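The multi-attribute attitude model mentioned above is commonly the Fishbein formulation, in which an attitude score is the sum of belief ratings weighted by attribute evaluations. A minimal sketch under that assumption; the attributes and scores below are invented for illustration, not survey data from the thesis:

```python
# Fishbein multi-attribute attitude: A = sum_i(b_i * e_i), where b_i is the
# belief that the city performs well on attribute i and e_i is how much the
# respondent values attribute i. Attributes and ratings are illustrative.
beliefs = {"safety": 6, "culture": 5, "transport": 4, "nature": 7}       # 1-7 scale
evaluations = {"safety": 0.9, "culture": 0.7, "transport": 0.5, "nature": 0.8}

attitude = sum(beliefs[a] * evaluations[a] for a in beliefs)
print(f"brand attitude score: {attitude:.1f}")
```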

Relevance: 100.00%

Abstract:

This thesis aims to find an effective way of conducting a target audience analysis (TAA) in the cyber domain. Two main focal points are addressed: the nature of the cyber domain and the method of the TAA. For the cyber domain, the objective is to identify the opportunities, restrictions and caveats that result from its digital and temporal nature; this is the environment in which the TAA method is examined in this study. As the TAA is an important step of any psychological operation and critical to its success, the method used must cover all the main aspects affecting the choice of a proper target audience. The first part of the research was done by sending an open-ended questionnaire to operators in the field of information warfare both in Finland and abroad. As the results were inconclusive, the research was completed by assessing the applicability of the United States Army Field Manual FM 3-05.301 in the cyber domain via a theory-based content analysis. FM 3-05.301 was chosen because it presents a complete method of the TAA process. The findings were tested against the results of the questionnaire and new scientific research in the field of psychology. The cyber domain was found to be "fast and vast", volatile and uncontrollable. Although governed by laws to some extent, the cyber domain is unpredictable by nature and cannot be controlled to any reasonable degree. The anonymity and lack of verification often present in digital channels mean that anyone can have an opinion, and any message sent may change or even become counterproductive to its original purpose. The TAA method of FM 3-05.301 is applicable in the cyber domain, although some parts of the method are outdated and should be updated if the method is used in that environment. The target audience categories of step two of the process were replaced by new groups that exist in the digital environment. The accessibility assessment (step eight) was also redefined, as in digital media the mere existence of a written text is typically not enough to convey the intended message to the target audience. The scientific studies made in computer science, psychology and sociology on the behavior of people in social media (and in the cyber domain overall) call for a more extensive remake of the TAA process; this, however, falls outside the scope of this work. It is thus suggested that further research be carried out in search of computer-assisted methods and a more thorough TAA process, utilizing the latest findings on human behavior.
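A TAA process of this kind ultimately ranks candidate audiences against criteria such as effectiveness and accessibility. Purely as an illustration of such a ranking step (the criteria, weights and audience names below are invented, not FM 3-05.301's actual worksheet), a weighted-scoring pass might look like:

```python
# Hypothetical weighted scoring of candidate target audiences against
# TAA-style criteria. All names, weights and ratings are illustrative.
weights = {"effectiveness": 0.4, "accessibility": 0.35, "susceptibility": 0.25}

candidates = {
    "forum moderators":    {"effectiveness": 8, "accessibility": 6, "susceptibility": 5},
    "casual lurkers":      {"effectiveness": 4, "accessibility": 9, "susceptibility": 7},
    "influential posters": {"effectiveness": 9, "accessibility": 7, "susceptibility": 4},
}

def score(ratings: dict) -> float:
    """Weighted sum of a candidate audience's criterion ratings."""
    return sum(weights[c] * ratings[c] for c in weights)

for name, ratings in sorted(candidates.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(ratings):.2f}")
```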

Relevance: 100.00%

Abstract:

One challenge in data assimilation (DA) methods is how the error covariance of the model state is computed. Ensemble methods have been proposed for producing error covariance estimates, as the error is propagated in time using the non-linear model. Variational methods, on the other hand, use the concepts of control theory, whereby the state estimate is optimized from both the background and the measurements. Numerical optimization schemes are applied, which avoids the memory storage and huge matrix inversions needed by classical Kalman filter methods. The Variational Ensemble Kalman Filter (VEnKF), a method inspired by the Variational Kalman Filter (VKF), enjoys the benefits of both ensemble methods and variational methods. It avoids the filter inbreeding problems that emerge when the ensemble spread underestimates the true error covariance; in VEnKF this is tackled by resampling the ensemble every time measurements are available. One advantage of VEnKF over VKF is that it needs neither tangent linear code nor adjoint code. In this thesis, VEnKF has been applied to a two-dimensional shallow water model simulating a dam-break experiment. The model is a public code, with water height measurements recorded at seven stations along the mid-line of the 21.2 m long, 1.4 m wide flume. Because the data were too sparse to assimilate the 30 171-element model state vector, we chose to interpolate the data both in time and in space. The results of the assimilation were compared with those of a pure simulation. We found that the results produced by the VEnKF were more realistic, without the numerical artifacts present in the pure simulation. Creating wrapper code for a model and a DA scheme can be challenging, especially when the two were designed independently or are poorly documented. In this thesis we have presented a non-intrusive approach to coupling the model and a DA scheme: an external program is used to send and receive information between the model and the DA procedure using files. The advantage of this method is that the changes needed in the model code are minimal, only a few lines that facilitate input and output. Apart from making the coupling simple, the approach can be employed even if the two codes are written in different programming languages, because the communication does not go through code. The non-intrusive approach accommodates parallel computing simply by telling the control program to wait until all the processes have ended before the DA procedure is invoked. It is worth mentioning the overhead caused by the approach, as at every assimilation cycle both the model and the DA procedure have to be initialized; nonetheless, the method can be an ideal approach for a benchmark platform for testing DA methods. The non-intrusive VEnKF has been applied to the multi-purpose hydrodynamic model COHERENS to assimilate Total Suspended Matter (TSM) in Lake Säkylän Pyhäjärvi. The lake has an area of 154 km² and an average depth of 5.4 m. Turbidity and chlorophyll-a concentrations from MERIS satellite images were available for seven days between May 16 and July 6, 2009; the effect of the organic matter was computationally eliminated to obtain TSM data. Because of the computational demands of both COHERENS and VEnKF, we chose to use a 1 km grid resolution. The results of the VEnKF were compared with the measurements recorded at an automatic station located in the north-western part of the lake; however, due to the sparsity of the TSM data in both time and space, a good match could not be obtained. The use of multiple automatic stations with real-time data is important to avoid the temporal sparsity problem; together with DA, this would, for instance, help in better understanding environmental hazard variables. We found that using a very large ensemble does not necessarily improve the results, because there is a limit beyond which additional ensemble members add very little to the performance. The successful implementation of the non-intrusive VEnKF and the ensemble size limit for performance lead towards the emerging area of Reduced Order Modelling (ROM), in which running the full-blown model is avoided in order to save computational resources. When ROM is combined with the non-intrusive DA approach, it might result in a cheaper algorithm that relaxes the computational challenges existing in the fields of modelling and DA.
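The non-intrusive coupling described above is essentially a control loop that alternates an external model run with a DA update, communicating only through files. A minimal sketch of that idea, with a hypothetical model executable and invented file names; the analysis step here is a trivial nudging placeholder, not the VEnKF itself:

```python
import subprocess
import numpy as np

# Sketch of a non-intrusive, file-based coupling loop. The model runs as an
# external process that reads its initial state from one file and writes its
# forecast to another; the DA step never touches the model code.
N_CYCLES = 10

def da_update(forecast: np.ndarray, obs: np.ndarray) -> np.ndarray:
    """Placeholder analysis step: nudge the observed components toward the
    measurements just to close the loop (stand-in for the VEnKF analysis)."""
    analysis = forecast.copy()
    analysis[: obs.size] += 0.5 * (obs - forecast[: obs.size])
    return analysis

for cycle in range(N_CYCLES):
    # Forecast step: hypothetical model executable advances one window.
    subprocess.run(["./model", "state_in.npy", "state_out.npy"], check=True)
    forecast = np.load("state_out.npy")                # read model output
    obs = np.load(f"obs_{cycle}.npy")                  # this cycle's measurements
    np.save("state_in.npy", da_update(forecast, obs))  # analysis seeds next run
```

Because all communication goes through files, the model and the DA procedure can be written in different languages, exactly as the abstract notes; the price is the re-initialization overhead at every cycle.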

Relevance: 100.00%

Abstract:

The mathematical foundations of orthogonal M-band multiresolution analysis are presented in detail. The definition of Coifman wavelets is generalized to a dilation factor M and to a non-zero centre of the vanishing moments. The approximation of a function from its sample points by means of wavelets is considered, and in particular an asymptotic error estimate of the approximation is presented for Coifman wavelets. Necessary and sufficient conditions leading to generalized Coifman wavelets are shown for the scaling filter. The density of the multiresolution analysis is proved directly from the definition of the Lebesgue integral, using the partition-of-unity property. The proof is sufficient as such in the space L2(R^d), without using properties or conditions in the Fourier domain. Mallat's algorithm is derived for M-band wavelets and multidimensional signals, and a recursive form of the algorithm is also presented. A differential evolution algorithm is used to solve the coefficient values of the scaling filters associated with Coifman wavelets for several scaling functions. Approximation and image compression examples are presented to illustrate the methods. The differential evolution algorithm is also used to find a scaling filter optimized for reference images. The filter found is regular and highly symmetric.
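One analysis level of Mallat's algorithm for an M-band filter bank convolves the signal with each of the M filters and downsamples by the dilation factor M. A small NumPy sketch of that step; the 2-band Haar pair below is only a stand-in for the Coifman scaling filters that the thesis solves with differential evolution:

```python
import numpy as np

def analysis_level(signal: np.ndarray, filters: list, M: int) -> list:
    """One level of Mallat's algorithm: filter with each of the M band
    filters, then downsample by the dilation factor M."""
    return [np.convolve(signal, h)[::M] for h in filters]

# Stand-in 2-band (Haar) filter pair, used here purely for illustration.
h0 = np.array([1.0, 1.0]) / np.sqrt(2)   # low-pass (scaling) filter
h1 = np.array([1.0, -1.0]) / np.sqrt(2)  # high-pass (wavelet) filter

x = np.sin(np.linspace(0.0, 4.0 * np.pi, 64))
approx, detail = analysis_level(x, [h0, h1], M=2)
print(approx.shape, detail.shape)  # coarse approximation and detail bands
```

The recursive form mentioned in the abstract corresponds to feeding the low-pass (approximation) output back into `analysis_level` for the next decomposition level.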

Relevance: 100.00%

Abstract:

The purpose of this study is to determine the possibilities for quality development at Lappeenranta University of Technology (LTKK). The research problem is approached by surveying, from the literature, the general prerequisites of quality development and how operating in the publicly funded sector affects them. Quality development and quality management methods originating from industry are not, by themselves, suited to the knowledge-intensive academic world. Knowledge management brings a new dimension to the quality development of universities. Organizations must be seen as multidimensional knowledge environments with mechanical, organic and dynamic features. Each of these knowledge environments has its own principles, according to which its activities are most effectively managed, and its own criteria, on which its quality is based. The study shows that LTKK's management has a positive attitude towards quality development and that LTKK has many areas whose quality can be developed. Although LTKK values innovativeness, which is the quality criterion of a dynamic environment, the development proposals supported almost exclusively the quality of an organic environment, whose criterion is controlled development. The greatest challenges in quality development are perhaps identifying the goals of a dynamic environment, changing staff attitudes and increasing a sense of community.

Relevance: 100.00%

Abstract:

Globalization is a trend that is visible in all areas of today's business world. Pressure for cost reduction, changes in the market situation and available economies of scale have made the business environment more global than ever. To respond to the new situation, companies are establishing global strategies. In this thesis, the global competitive advantages available in the electrical machine industry are studied in the context of gaining them through global technology transfers. In the theory part, the establishment of a global strategy and competitive advantage is considered in connection with global sourcing and supply chain management. Additionally, market development in the 21st century and its impact on global strategies is studied. In practice, global manufacturing is enabled by technology transfer projects. Smooth and fast project implementation enables faster and more flexible production ramp-up; once production is started, the available competitive advantages can be realized. In this thesis, the present state of technology transfer projects and the risks and advantages related to global manufacturing are analysed. The analysis of implemented technology transfer projects indicates that project implementation is at a good level. For further development of project execution, ten minor suggestions are presented together with two major ones: higher-level standardization and the development of a product information model to better support global manufacturing.

Relevance: 100.00%

Abstract:

Configuration management is often seen as an enabler for the main IT Service Management (ITSM) processes such as Incident and Problem Management. A decent level of quality of IT configuration data is required in order to carry out the routines of these processes. This case study examines the state of configuration management in a multinational organization and aims at identifying methods for its improvement. The author spent five months with the company in order to collect different sources of evidence and to make observations. The main source of data for this study is interviews with key employees of the assigned organization who are involved in the ITSM processes. The study concludes that the maturity level of the existing configuration management process is repeatable but intuitive, and outlines the principal requirements for its improvement. A match between the requirements identified in the organization and the requirements stated in the ISO/IEC 20000 standard indicates the possibility of adopting ITIL guidelines as a method for configuration management process improvement. The outcome of the study is a set of recommendations for improvement covering the process, the information model and the information system for configuration management in the case organization.
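In configuration management terms, the information model is essentially a typed record for each configuration item (CI) plus its relationships to other items. A minimal generic sketch of such a model; the types, fields and identifiers are illustrative assumptions, not the case organization's schema:

```python
from dataclasses import dataclass, field

# Minimal, generic configuration-item model: each CI has a type, an owner,
# free-form attributes, and typed relationships to other CIs.
@dataclass
class ConfigurationItem:
    ci_id: str
    ci_type: str                      # e.g. "server", "application", "service"
    owner: str
    attributes: dict = field(default_factory=dict)
    relations: list = field(default_factory=list)  # (relation, target ci_id)

    def relate(self, relation: str, target: "ConfigurationItem") -> None:
        self.relations.append((relation, target.ci_id))

db = ConfigurationItem("CI-001", "database", "dba-team", {"version": "14"})
app = ConfigurationItem("CI-002", "application", "app-team")
app.relate("depends_on", db)   # Incident management can now trace impact
print(app.relations)           # [('depends_on', 'CI-001')]
```

It is exactly these relationships that let Incident and Problem management trace which services are affected when a single CI fails.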

Relevance: 100.00%

Abstract:

The aim of this study was to examine how a building information model can be utilized in a construction company's production planning and construction phases, and what such utilization requires of the information content of the model. A further aim was to identify the "bottlenecks" of production control at which operations could be made more efficient with the help of building information models. The theoretical background of the study is the role of intangible assets in creating competitive advantage for a company, knowledge management, and the use of technology in knowledge management. The study examined the management of construction production, the utilization of building information modelling in construction, and the means of ensuring the usability of building information models at a general level. The target company's production control and the current state of utilizing building information models in the construction phase were also reviewed. As a result, it can be stated that the basic prerequisite for utilizing building information models in the production planning and construction phases is the correctness of the models and the consistency of the information content between the models and the traditional design documents. In addition, planned working methods, functioning information distribution channels and the ability to utilize information and communication technology are needed. Based on this study, building information models do not appear to have created a need for new kinds of roles in the production organization. Building information models are believed to have positive effects on change management during the construction phase, and the members of the production organization had positive expectations of utilizing building information models in production control. The models are expected to support the comprehension of the entities formed by separate designs, the cooperation of the parties in the construction phase, the coordination of work, and the planning of logistics and the observation of its effects.

Relevance: 50.00%

Abstract:

Fluent health information flow is critical for clinical decision-making. However, a considerable part of this information is free-form text, and inabilities to utilize it create risks to patient safety and cost-effective hospital administration. Methods for automated processing of clinical text are emerging. The aim of this doctoral dissertation is to study machine learning and clinical text in order to support health information flow. First, by analyzing the content of authentic patient records, the aim is to specify clinical needs in order to guide the development of machine learning applications. The contributions are a model of the ideal information flow, a model of the problems and challenges in reality, and a road map for the technology development. Second, by developing applications for practical cases, the aim is to concretize ways to support health information flow. Altogether five machine learning applications for three practical cases are described: the first two applications are binary classification and regression related to the practical case of topic labeling and relevance ranking; the third and fourth applications are supervised and unsupervised multi-class classification for the practical case of topic segmentation and labeling. These four applications are tested with Finnish intensive care patient records. The fifth application is multi-label classification for the practical task of diagnosis coding; it is tested with English radiology reports. The performance of all these applications is promising. Third, the aim is to study how the quality of machine learning applications can be reliably evaluated. The associations between performance evaluation measures and methods are addressed, and a new hold-out method is introduced. This method contributes not only to processing time but also to evaluation diversity and quality. The main conclusion is that developing machine learning applications for text requires interdisciplinary, international collaboration. Practical cases are very different, and hence the development must begin from genuine user needs and domain expertise. The technological expertise must cover linguistics, machine learning and information systems. Finally, the methods must be evaluated both statistically and through authentic user feedback.
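The dissertation's own applications are not reproduced here, but the supervised text classification it describes follows a standard pattern. A minimal scikit-learn sketch of binary topic labeling with a hold-out evaluation split; the corpus below is an invented toy stand-in, while the real work used Finnish intensive care records and English radiology reports:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Toy stand-in corpus: note snippets labeled by topic relevance (invented).
texts = [
    "patient breathing stable, oxygen saturation normal",
    "administered antibiotics for suspected infection",
    "family visited in the afternoon",
    "ventilator settings adjusted overnight",
    "paperwork forwarded to the ward clerk",
    "fever persists, blood cultures taken",
]
labels = [1, 1, 0, 1, 0, 1]  # 1 = clinically relevant topic, 0 = not

# Hold-out evaluation: train on one part, measure on unseen notes.
X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.33, random_state=0, stratify=labels
)
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(X_train, y_train)
print("hold-out F1:", f1_score(y_test, model.predict(X_test)))
```

The same pipeline shape extends to the multi-class and multi-label cases by swapping the classifier and label encoding; the hold-out split is what keeps the reported score honest about unseen data.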

Relevance: 40.00%

Abstract:

Software development tools use information produced from the developer's source code. This information is utilized in different phases of a software project and for different purposes. In modern software projects, the amount of information used can grow very large. Software tools have their own information models and access mechanisms. The amount of information, together with the separate tool-specific information models, makes it very difficult to build a flexible tool environment, especially for a domain-specific software development process. In this work, basic information metamodels are analysed for the Unified Modeling Language, the Python programming language and the C++ programming language. The level of meta-information is restricted to the structural level; executable structures are left out. A ModelBase metamodel is composed from the analysed existing metamodels. This metamodel can be used in the future for the development of software tools.
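As a loose illustration of what a merged structural metamodel can look like, the sketch below represents classes, attributes and operations uniformly, whatever the source notation. The element kinds and fields are invented for illustration and are not the thesis's ModelBase definition:

```python
from dataclasses import dataclass, field

# Tiny structural metamodel: every model element has a kind, a name, a source
# notation and children, so UML classes, Python classes and C++ classes all
# map onto the same structure. Purely illustrative.
@dataclass
class Element:
    kind: str                 # "class", "attribute", "operation", ...
    name: str
    language: str             # source notation: "UML", "Python", "C++"
    children: list = field(default_factory=list)

    def add(self, child: "Element") -> "Element":
        self.children.append(child)
        return child

account = Element("class", "Account", "C++")
account.add(Element("attribute", "balance", "C++"))
account.add(Element("operation", "deposit", "C++"))
print([(c.kind, c.name) for c in account.children])
```

A shared structure of this kind is what lets a tool environment query "all classes and their attributes" without caring which language or tool each element originally came from.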

Relevance: 40.00%

Abstract:

1. Introduction

"The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49. These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases, the EDD, into Finnish copyright legislation in 1998. Now, in the year 2005, after more than half a decade of domestic implementation, it is still uncertain what the proper meaning and construction is of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection, both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular, and currently peculiarly European, new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD, and second, to realise the potential and risks inherent in the new legislation in its economic, cultural and societal dimensions.

2. Subject-matter of the study: basic issues

The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches, either qualitatively or quantitatively, the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer with markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form, or even free by means of public lending libraries providing access to the information online. This also makes it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while production and distribution costs can be kept at a minimum thanks to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side for authors and publishers is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed. The fear of illegal copying can lead to stark technical protection that in turn can dampen the demand for information goods and services and, furthermore, efficiently hamper the right of access to materials lawfully available in electronic form, and thus weaken the possibility of access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy.

3. Particular issues in the digital economy and information networks

All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the mobile Internet, peer-to-peer networks and local and wide area networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables, previously protected in part by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, and importantly, numerous databases are compiled in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded therein constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not immediately obvious example of this is a database consisting of the physical coordinates of a selected group of customers, used for marketing purposes through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: Has the collection and securing of the validity of this information required an essential investment? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, a database comprises works, information and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are the materials regarded as arranged in a systematic or methodical way? Only when the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and of making digital materials available to the public seem to fit ill, or lead to interpretations that are at variance with the analogue domain as regards the lawful and illegal uses of information. This may well interfere with, or rework, the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services.

4. International sphere

After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically. Consequently, guidelines on the correct interpretation of the Directive importing practical, business-oriented solutions may well have application at the European level. This underlines the exigency of a thorough analysis of the implications of the meaning and potential scope of database protection in Finland and the European Union. This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union stance, with a direct negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, at least not yet having a sui generis database regime or its kin, while both the political and academic discourse on the matter abounds.

5. The objectives of the study

The above-mentioned background, with its several open issues, calls for a detailed study of the following questions: What is a database-at-law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation? How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context? What opportunities and threats does the current protection present to creators, users and society as a whole, including the commercial and cultural implications? Finally, there is the difficult question of the relation between database protection and the protection of factual information as such.

6. Disposition

The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part introduces the political and rational background and the subsequent legislative evolution of European database protection, reflected against the international backdrop on the issue; an introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and sui generis right facets in detail, together with the emergent application of the machinery in real-life societal and, particularly, commercial contexts. Furthermore, a general outline of copyright, relevant in the context of copyright databases, is provided; for purposes of further comparison, a chapter on the precursor of the sui generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impacts of the database protection system and attempts to scrutinize the implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue of the IPR protection of information per se, a new tenet in the domain of copyright and related rights.