864 results for Data mining methods


Relevance:

80.00%

Publisher:

Abstract:

The aim of this Master's thesis was to clarify and unify the case company's customer service processes by describing and developing them. The case company was a regional economic development company that offers its customers knowledge-intensive expert and advisory services. The intended end result of the thesis was to create unified customer service process models for the case company. Through a literature review, the study answered the questions of how customer service processes can be developed and with which methods processes can be described. After this, it was studied how the case company can develop its customer service processes by applying these methods. It turned out that developing a service process in practice requires productization of the service, which can be divided into three areas: 1) defining and standardizing the service, 2) making the service and the expertise tangible and concrete, and 3) systematizing and standardizing processes and methods. Expert services are difficult or impossible to standardize as such, but they can be productized modularly. The most suitable tool for systematizing service processes is describing them with the service blueprinting method. The study is a qualitative, descriptive case study. In the empirical part of the study, the data collection methods were a semi-structured survey, theme interviews, participant observation, and document analysis. As a result of analyzing these materials, a unified model of the general customer service process was developed for the case company, and the customer service processes of the most central services were described with the service blueprinting method.

Relevance:

80.00%

Publisher:

Abstract:

Ion mobility spectrometry (IMS) is a straightforward, low-cost method for the fast and sensitive determination of organic and inorganic analytes. Originally this portable technique was applied to the determination of gas-phase compounds in security and military use. Nowadays, IMS has received increasing attention in environmental and biological analysis and in food quality determination. This thesis consists of a literature review of suitable sample preparation and introduction methods for liquid matrices applicable to IMS, from its early development stages to date. Thermal desorption, solid phase microextraction (SPME) and membrane extraction were examined in experimental investigations of hazardous aquatic pollutants and potential pollutants. The effect of different natural waters on the extraction efficiency was also studied, and the IMS data processing methods utilised are discussed. Parameters such as extraction and desorption temperatures, extraction time, SPME fibre depth, SPME fibre type and salt addition were examined for the studied sample preparation and introduction methods. The critical parameters observed were the extracting material and the temperature. The extraction methods proved time- and cost-effective because sampling could be performed in single-step procedures and from different natural water matrices within a few minutes. Based on these experimental and theoretical studies, the most suitable method to test in the automated monitoring system is membrane extraction. In the future, an IMS-based early warning system for monitoring water pollutants could help ensure the safe supply of drinking water. IMS can also be utilised for monitoring natural waters in cases of environmental leakage or chemical accidents. When combined with sophisticated sample introduction methods, IMS has the potential for both on-line and on-site identification of analytes in different water matrices.

Relevance:

80.00%

Publisher:

Abstract:

In recent decades, business intelligence (BI) has gained momentum in real-world practice. At the same time, business intelligence has evolved into an important research subject of Information Systems (IS) within the decision support domain. Today's growing competitive pressure in business has led to increased needs for real-time analytics, i.e., so-called real-time BI or operational BI. This is especially true of the electricity production, transmission, distribution, and retail business, since the laws of physics dictate that electricity as a commodity is nearly impossible to store economically, and therefore demand and supply must constantly be kept in balance. The current power sector is subject to complex changes, innovation opportunities, and technical and regulatory constraints. These range from the low-carbon transition, renewable energy sources (RES) development, and market design to new technologies (e.g., smart metering, smart grids, electric vehicles) and new independent power producers (e.g., commercial buildings or households with rooftop solar panel installations, a.k.a. Distributed Generation). Among them, the ongoing deployment of Advanced Metering Infrastructure (AMI) has profound impacts on the electricity retail market. From the viewpoint of BI research, AMI is enabling real-time or near real-time analytics in the electricity retail business. Following the Design Science Research (DSR) paradigm in the IS field, this research presents four aspects of BI for efficient pricing in a competitive electricity retail market: (i) visual data-mining based descriptive analytics, namely electricity consumption profiling, for pricing decision-making support; (ii) a real-time BI enterprise architecture for enhancing management's capacity for real-time decision-making; (iii) prescriptive analytics through agent-based modeling for price-responsive demand simulation; (iv) a visual data-mining application for electricity distribution benchmarking. Even though this study is written from the perspective of the European electricity industry, particularly Finland and Estonia, the BI approaches investigated can: (i) provide managerial implications to support the utility's pricing decision-making; (ii) add empirical knowledge to the landscape of BI research; (iii) be transferred to a wide body of practice in the power sector and the BI research community.
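
As an illustration of the first aspect, descriptive analytics through consumption profiling can be sketched as clustering of normalized load profiles. The following is a minimal Python sketch under assumed data (a hypothetical hourly_consumption.csv with one 24-hour profile per customer); it is not the dissertation's actual implementation.

    # Illustrative sketch: cluster customers by the shape of their daily load profiles.
    import pandas as pd
    from sklearn.preprocessing import normalize
    from sklearn.cluster import KMeans

    readings = pd.read_csv("hourly_consumption.csv")              # hypothetical file: customer_id, hour_0 ... hour_23
    profiles = readings.drop(columns=["customer_id"]).to_numpy()  # one 24-hour load profile per customer
    profiles = normalize(profiles)                                # compare profile shape, not consumption volume

    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(profiles)
    readings["profile_cluster"] = kmeans.labels_                  # consumption segment per customer
    print(readings.groupby("profile_cluster").size())             # segment sizes to support pricing decisions

The number of clusters and the normalization step are illustrative choices; in practice they would be selected based on the pricing question at hand.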

Relevance:

80.00%

Publisher:

Abstract:

The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data is growing rapidly, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we studied ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is a method commonly used to make the original incomplete data complete, thus making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets, the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as gene microarray studies. Such networks are typically very large and highly connected, so there is a need for fast algorithms for producing visually pleasant layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within the regular force-directed graph layout algorithm.
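
A minimal sketch of baseline k-NN imputation with scikit-learn is shown below. It illustrates plain k-NN imputation only, not the thesis's approach of guiding imputation with curated external biological information, and the toy expression matrix is invented.

    # Baseline k-NN imputation of missing microarray values (toy data).
    import numpy as np
    from sklearn.impute import KNNImputer

    # Rows = genes, columns = arrays, NaN = missing spot.
    expression = np.array([
        [2.1, 1.9, np.nan, 2.3],
        [0.4, np.nan, 0.5, 0.6],
        [1.0, 1.1, 0.9, np.nan],
        [2.0, 1.8, 2.2, 2.4],
    ])

    imputer = KNNImputer(n_neighbors=2)        # average the 2 most similar rows
    completed = imputer.fit_transform(expression)
    print(completed)                           # same matrix with NaNs replaced by estimates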

Relevance:

80.00%

Publisher:

Abstract:

The visualization of measurement data is important in the fields of engineering for research analysis and presentation purposes. A suitable scientific visualization method is needed when handling measurement data. Visualization methods and techniques are presented throughout this work: they form the basis of scientific visualization, from the abstract visualization process to the applied techniques suited to each situation. This work also proposes a visualization tool built with the MATLAB® software. The tool was designed to be as general as possible in order to cover most needs in terms of measurement data visualization. It offers possibilities for both static and dynamic visualization of the data.
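
The thesis's tool is implemented in MATLAB; purely to illustrate the static versus dynamic distinction, the following is a minimal matplotlib analogue with synthetic measurement data, not the tool itself.

    # Illustrative static vs. dynamic views of a measured signal (synthetic data).
    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation

    t = np.linspace(0, 10, 500)
    signal = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(t.size)

    # Static view: the whole measurement at once.
    fig, ax = plt.subplots()
    ax.plot(t, signal)
    ax.set(xlabel="time [s]", ylabel="amplitude", title="Static view")

    # Dynamic view: replay the measurement sample by sample.
    fig2, ax2 = plt.subplots()
    line, = ax2.plot([], [])
    ax2.set(xlim=(0, 10), ylim=(-1.5, 1.5), xlabel="time [s]", ylabel="amplitude")

    def update(frame):
        line.set_data(t[:frame], signal[:frame])
        return line,

    anim = FuncAnimation(fig2, update, frames=t.size, interval=20, blit=True)
    plt.show()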

Relevance:

80.00%

Publisher:

Abstract:

The purpose of the thesis is to study how mathematics is experienced and used in preschool children's activities and how preschool teachers frame their teaching of mathematical content. The studies include analyses of children's actions in different activities from a mathematical perspective, as well as of preschool teachers' intentions with, and teaching of, mathematics. Preschool teachers' understanding of the knowledge required in this area is also scrutinised. The theoretical points of departure are variation theory and sociocultural theory. With variation theory, the focus is directed towards how mathematical content is dealt with in teaching situations where preschool teachers have chosen the learning objects. The sociocultural perspective has been chosen because children's mathematical learning in play often takes place in interaction with others and in the encounter with culturally mediated concepts. The theoretical framework also includes didactical points of departure. The study is qualitative, with videography and phenomenography as methodological research approaches. Video observations and interviews with preschool teachers were used as data collection methods. The results show that in children's play mathematics consists of volume, geometrical shapes, gravity, quantity and positioning. The situations also include size, patterns, proportions, counting and the creation of pairs. The preschool teachers' intention in planning and staging their goal-oriented work is that all children should be given the opportunity to discern a mathematical content. This also includes making learning objects visible in here-and-now situations. Variation and a clear focus on the mathematical content are important in this context. One of the study's knowledge contributions concerns the didactics of mathematics in the preschool. This relates to the teaching of mathematics and includes the knowledge that preschool teachers regard as essential for their teaching: theoretical and practical knowledge about children and children's learning, as well as didactical issues and strategies. The conclusion is that preschool teachers need a basic knowledge of mathematics and of the didactics of mathematics.

Relevance:

80.00%

Publisher:

Abstract:

cDNA microarray is an innovative technology that facilitates the analysis of the expression of thousands of genes simultaneously. The utilization of this rapidly evolving methodology requires a combination of expertise from the biological, mathematical and statistical sciences. In this review, we attempt to provide an overview of the principles of cDNA microarray technology, the practical concerns of the analytical processing of the data obtained, the correlation of this methodology with other data analysis methods such as immunohistochemistry in tissue microarrays, and the application of cDNA microarrays in distinct areas of the basic and clinical sciences.

Relevance:

80.00%

Publisher:

Abstract:

The use of recovered paper as a raw material in the paper and board industry has increased substantially during recent decades. At the same time, growing environmental awareness has raised interest in recycling and a more sustainable way of living, at least in high-income countries. This paper combines these topics and explores how economic, demographic and environmental factors affected the recovery and utilization of recycled paper between 1992 and 2010 in a sample of 70 countries. The study updates and extends previous research on the topic using panel data and panel data estimation methods. The results confirm the role of economic determinants but also indicate that concern for the environment affects the recovery of recycled paper, particularly in high-income countries. Moreover, the motives for recycling appear to depend on the income level of a country, which future policies should take into account.
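
As a rough illustration of the kind of panel data estimation referred to, the sketch below fits a two-way fixed-effects regression in Python with clustered standard errors; the file name, variable names and specification are hypothetical and not the paper's actual model.

    # Sketch of a country/year fixed-effects panel regression (hypothetical data and variables).
    import pandas as pd
    import statsmodels.formula.api as smf

    panel = pd.read_csv("paper_recovery_panel.csv")  # hypothetical columns: country, year, recovery_rate, gdp_per_capita, urban_share

    model = smf.ols(
        "recovery_rate ~ gdp_per_capita + urban_share + C(country) + C(year)",
        data=panel,
    ).fit(cov_type="cluster", cov_kwds={"groups": panel["country"]})
    print(model.summary())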

Relevance:

80.00%

Publisher:

Abstract:

This doctoral dissertation explores the contribution of environmental management practices, the so-called clean development mechanism (CDM) projects, and foreign direct investment (FDI) to achieving sustainable development in developing countries, particularly in Sub-Saharan Africa. Because the climate change caused by greenhouse gas emissions is one of the most serious global environmental challenges, the main focus is on the causal links between carbon dioxide (CO2) emissions, energy consumption, and economic development in Sub-Saharan Africa. In addition, the dissertation investigates the factors that have affected the distribution of CDM projects in developing countries and the relationships between FDI and other macroeconomic variables of interest. The main contribution of the dissertation is empirical. One of the publications uses cross-sectional data and Tobit and Poisson regressions. Three of the studies use time-series data and vector autoregressive and vector error correction models, while two publications use panel data and panel data estimation methods. One of the publications thus uses both time-series and panel data. The concept of Granger causality is utilized in four of the publications. The results indicate that there are significant differences in the Granger causality relationships between CO2 emissions, energy consumption, economic growth, and FDI across countries. The causality relationships also appear to change over time. Furthermore, the results support the environmental Kuznets curve hypothesis, but only for some of the countries. As to CDM activities, past emission levels, institutional quality, and the size of the host country appear to be among the significant determinants of the distribution of CDM projects. FDI and exports are also found to be significant determinants of economic growth.
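
A minimal sketch of a pairwise Granger causality test with statsmodels is given below; the file and column names are hypothetical, and the dissertation's actual analyses rely on full VAR/VECM systems estimated per country rather than this simplified bivariate test.

    # Does energy consumption Granger-cause CO2 emissions? (hypothetical data)
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    series = pd.read_csv("country_series.csv")  # hypothetical columns: co2, energy, gdp

    # First column = variable being predicted, second column = candidate cause.
    results = grangercausalitytests(series[["co2", "energy"]].dropna(), maxlag=3)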

Relevance:

80.00%

Publisher:

Abstract:

Presentation by Kristiina Hormia-Poutanen at the 25th Anniversary Conference of the National Repository Library of Finland, Kuopio, 22nd of May 2015.

Relevance:

80.00%

Publisher:

Abstract:

Processing and refining of materials. Presentation at the Liikearkistopäivät (Business Archives Days) 2015.

Relevance:

80.00%

Publisher:

Abstract:

Nursing education research has confirmed its place in the discipline of nursing and caring sciences as one of its most central research areas. However, an extensive and systematic analysis of nursing education research has been lacking both nationally and internationally. The aim of this study was to describe the focus of nursing education research in Finnish doctoral dissertations in the field of nursing and caring sciences between the years 1990–2014. In addition, the characteristics (i.e. methods, study informants, and the reporting of validity, reliability and research ethics) of the dissertations were described. International reviews (N=39) focusing on nursing education research were also analysed as background literature. A literature review was carried out. Altogether 51 (=N) Finnish doctoral dissertations of nursing and caring sciences in the field of nursing education research were included in the final analysis. The data for this research were collected from the open publication lists of each university offering education in nursing and caring sciences in Finland. The dissertations were published in 1990–2014. The data were analysed by content analysis, both deductively and inductively. This study consists of a scientific article manuscript and a background literature review. Nursing education research has focused both nationally and internationally on four main areas: structural factors in nursing education, nurse teacherhood, teaching activities, and learning and learning outcomes in nursing education. In Finland, the most central focus area was learning (84.3%), whereas nurse teacherhood and structural factors in nursing education were studied the least. Students were the predominant study informant group, followed by nursing staff including nurse mentors, with nurse educators only third. Surveys and interviews were the most common data collection methods. The findings showed many similarities with international reviews of nursing education research. Finnish nursing education research has been very student-centred, yet studies focusing on the education of other nursing-based professions or on different levels of education are rare. Future research on nurse teacherhood, curricula and structural factors in nursing education is recommended. There is also a need for experimental designs. In addition, nursing education research should focus on the central phenomena of nursing education and working life. All in all, more nursing education research is needed: nursing education dissertations cover only 12.3% of all the dissertations of nursing and caring sciences in Finland.

Relevance:

80.00%

Publisher:

Abstract:

The case company in this study is a large industrial engineering company whose business is largely based on delivering a wide range of engineering projects. The aim of this study is to create and develop a fairly simple Excel-based tool for the sales department. The tool's main function is to estimate and visualize the profitability of various small projects. The study also aims to identify other possible, more long-term solutions for tackling the problem in the future. The study is highly constructive and descriptive, as it focuses on the development task and on the creation of a new operating model. The developed tool focuses on estimating the profitability of the small orders of the selected project portfolio currently in the bidding phase (prospects) and will help the case company in the monthly reporting of sales figures. The tool analyses the profitability of a given project by calculating its fixed and variable costs and, from these, the gross margin and operating profit. The bidding phase of small projects has not been fully covered by the existing tools within the case company. The project portfolio tool can be taken into use immediately within the case company, and it will provide a fairly accurate estimate of the profitability figures of recently sold small projects.
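
The underlying calculation is simple: gross margin is revenue minus variable costs, and operating profit is gross margin minus fixed costs. The sketch below expresses that logic in Python with invented figures; the case company's actual tool is built in Excel.

    # Profitability logic behind the tool (all numbers invented).
    def project_profitability(revenue, variable_costs, fixed_costs):
        gross_margin = revenue - variable_costs
        operating_profit = gross_margin - fixed_costs
        return {
            "gross_margin": gross_margin,
            "gross_margin_pct": 100 * gross_margin / revenue,
            "operating_profit": operating_profit,
            "operating_profit_pct": 100 * operating_profit / revenue,
        }

    # Example small project in the bidding phase.
    print(project_profitability(revenue=250_000, variable_costs=175_000, fixed_costs=40_000))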

Relevance:

80.00%

Publisher:

Abstract:

By leveraging cloud services, companies and organizations can significantly improve their efficiency as well as build novel business opportunities. Cloud computing offers various advantages to companies, but it also carries risks. The advantages offered by service providers mostly concern efficiency and reliability, while the risks of cloud computing mostly concern security. Security problems in the cloud still demand significant attention. As in any area of computing, they cannot be fully eliminated; however, novel solutions can be used by service providers to mitigate the potential threats to a large extent. Looking at the security problem from a high-level perspective, there are two focus directions: security problems that threaten the service user's security and privacy on one side, and security problems that threaten the service provider's security and privacy on the other. Both kinds of threats should mostly be detected and mitigated by service providers. Looking more closely at the problem, mitigating security problems that target providers can protect both the service provider and the user. However, the research community mostly focuses on providing solutions to protect cloud users. A significant research effort has been put into protecting cloud tenants against external attacks. However, attacks that originate from elastic, on-demand and legitimate cloud resources should still be taken seriously. The cloud-based botnet, or botcloud, is one of the prevalent cases of cloud resource misuse. Unfortunately, some of the cloud's essential characteristics enable criminals to form reliable and low-cost botclouds in a short time. In this paper, we present a system that helps to detect distributed infected Virtual Machines (VMs) acting as elements of botclouds. Based on a set of botnet-related system-level symptoms, our system groups VMs; this grouping helps to separate infected VMs from others and narrows down the target group under inspection. Our system takes advantage of Virtual Machine Introspection (VMI) and data mining techniques.
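
As a hedged sketch of the grouping step, the example below clusters VMs by a few invented system-level symptom features; the paper's actual system collects such features through VMI, and the feature set and clustering choice here are illustrative assumptions.

    # Group VMs by symptom features so that botnet-like machines fall into the same cluster.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import DBSCAN

    # Rows = VMs; columns = e.g. outbound connections/min, DNS failures/min, CPU steal %.
    features = np.array([
        [12,   3, 1.0],
        [900, 45, 2.0],   # botnet-like activity
        [15,   2, 0.8],
        [880, 50, 2.2],   # botnet-like activity
        [10,   4, 1.1],
    ])

    scaled = StandardScaler().fit_transform(features)
    labels = DBSCAN(eps=0.9, min_samples=2).fit_predict(scaled)
    print(labels)  # VMs sharing a label form a candidate group for closer inspection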

Relevance:

80.00%

Publisher:

Abstract:

A company seeking competitive advantage must be able to refine information and use it to identify new future opportunities. To form images of the future, the company must know its operating environment and be sensitive to change trends and other signals from that environment. The vital environmental signals relate to competitors, technological development, changes in values, global population trends, or even environmental change. Spatial relationships are fundamental building blocks for conceptualizing our world. Pitney (2015) has estimated that 80% of all business data contains some reference to location. Despite this, spatial data is still poorly utilized in support of companies' strategic decisions. The development of technologies, fast data transfer and the integration of positioning techniques into various devices have made it possible that services and solutions utilizing spatial data will increasingly be seen in the business field. The aim of the study was to find out whether location intelligence can support strategic decision-making and, if so, how. The work was carried out using the constructive research method, which aims to solve a relevant problem. The constructive research was conducted in close cooperation with three SMEs, and six persons responsible for strategy were interviewed. As a result of the study, it was found that location intelligence can be used to support strategic decision-making on several levels. In the simplest map solution, the desired data is brought onto a map to create a visual presentation that makes drawing conclusions easier. A second-level map solution contains both location and attribute data combined from different sources; this second-level solution is often descriptive analytics that enables the analysis of various phenomena. The third and highest-level map solution offers predictive analytics and models of the future. In this case, intelligence is coded into the application, where the relationships between pieces of information have been defined using either data mining or statistical analyses. As a conclusion of the study, location intelligence can provide added value in support of strategic decision-making if it is useful for the company to understand the geographical differences in various phenomena, customer needs, competitors and market changes. At its best, a location intelligence solution offers reliable analysis in which information is passed unchanged from one decision-maker to another, and the reasons that led to a conclusion can be revisited whenever needed.
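
As an illustration of the simplest level of map solution described above, the sketch below plots invented customer locations sized by sales using matplotlib; it is not the solution constructed in the study.

    # First-level map solution sketch: put business data on a map for visual inspection.
    import matplotlib.pyplot as plt

    customers = [
        # (longitude, latitude, annual_sales_eur) — invented figures
        (24.94, 60.17, 120_000),   # Helsinki area
        (25.66, 60.98, 45_000),    # Lahti area
        (27.27, 61.69, 30_000),    # Mikkeli area
        (24.66, 60.21, 80_000),    # Espoo area
    ]

    lon, lat, sales = zip(*customers)
    plt.scatter(lon, lat, s=[v / 500 for v in sales], alpha=0.6)
    plt.xlabel("longitude")
    plt.ylabel("latitude")
    plt.title("Customer locations sized by annual sales")
    plt.show()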