15 results for LONGITUDINAL DATA-ANALYSIS

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

The variety of fuels used in power boilers is widening, and new boiler constructions and operating models have to be developed. This research and development is done in small pilot plants, where a faster analysis of the boiler mass and heat balance is needed so that the right decisions can be identified and made already during the test run. The barrier to determining the boiler balance during test runs is the long process of chemically analysing the collected input and output matter samples. The present work concentrates on finding a way to determine the boiler balance without chemical analyses and on optimising the test rig for the best possible accuracy of the boiler's heat and mass balance. The purpose of this work was to create an automatic boiler balance calculation method for the 4 MW CFB/BFB pilot boiler of Kvaerner Pulping Oy located in Messukylä, Tampere. The calculation was created on the data management computer of the pilot plant's automation system. It is implemented in a Microsoft Excel environment, which provides a good base and functions for handling large databases and calculations without elaborate programming. The pilot plant's automation system was reconstructed and updated by Metso Automation Oy in 2001, and the new MetsoDNA system has good data management properties, which are necessary for large calculations such as the boiler balance calculation. Two possible methods for calculating the boiler balance during a test run were found: either the fuel flow is determined and used to calculate the boiler's mass balance, or the unburned carbon loss is estimated and the mass balance of the boiler is calculated on the basis of the boiler's heat balance. Both methods have their own weaknesses, so they were constructed in parallel in the calculation and the choice of method was left to the user. The user also needs to define the fuels used and some solid mass flows that are not measured automatically by the automation system. A sensitivity analysis showed that the most essential values for an accurate boiler balance determination are the flue gas oxygen content, the boiler's measured heat output and the lower heating value of the fuel. The theoretical part of this work concentrates on the error management of these measurements and analyses, and on measurement accuracy and boiler balance calculation in theory. The empirical part concentrates on the creation of the balance calculation for the boiler in question and on describing the working environment.
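A minimal sketch of the heat-balance route mentioned above, assuming steady state, a single fuel and an assumed fixed boiler efficiency; the function names (estimate_fuel_flow, excess_air_ratio) and numbers are illustrative and not part of the thesis's Excel implementation:

```python
# Hedged sketch: fuel mass flow from the measured heat output and the fuel's lower
# heating value, plus excess air from flue gas O2. The fixed-efficiency assumption
# and all names are illustrative only.

def estimate_fuel_flow(heat_output_kw: float, lhv_kj_per_kg: float,
                       efficiency: float = 0.88) -> float:
    """Fuel mass flow [kg/s] from a simple steady-state heat balance."""
    return heat_output_kw / (efficiency * lhv_kj_per_kg)

def excess_air_ratio(flue_gas_o2_pct: float) -> float:
    """Approximate excess air ratio (lambda) from dry flue gas O2 content [%]."""
    return 20.9 / (20.9 - flue_gas_o2_pct)

if __name__ == "__main__":
    m_fuel = estimate_fuel_flow(heat_output_kw=4000.0, lhv_kj_per_kg=18500.0)
    lam = excess_air_ratio(flue_gas_o2_pct=4.0)
    print(f"Estimated fuel flow: {m_fuel:.3f} kg/s, excess air ratio: {lam:.2f}")
```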

Relevance:

100.00%

Publisher:

Abstract:

Identification of low-dimensional structures and main sources of variation from multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. Thus, the objective of this thesis is to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model in which ridges of the density estimated from the data are considered as relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically using Gaussian kernels, which allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge finding methods are adapted to two different applications. The first one is the extraction of curvilinear structures from noisy data mixed with background clutter. The second one is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most of the earlier approaches are inadequate. Examples include identification of faults from seismic data and identification of filaments from cosmological data. Applicability of the nonlinear PCA to climate analysis and reconstruction of periodic patterns from noisy time series data are also demonstrated. Other contributions of the thesis include the development of an efficient semidefinite optimization method for embedding graphs into the Euclidean space. The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but also has potential applications in graph theory and various areas of physics, chemistry and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated when the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, which is a typical problem in visual object tracking.
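A minimal sketch of projecting a point onto a ridge of a Gaussian kernel density, using a subspace-constrained mean-shift style step for illustration only; the thesis's trust region Newton method and ridge-tracing machinery are not reproduced here, and all names, data and parameters are assumptions:

```python
# Hedged sketch: gradient and Hessian of a Gaussian KDE and one subspace-constrained
# mean-shift style step toward a 1-D density ridge. Illustrative only; this is not
# the thesis's trust-region Newton method.
import numpy as np

def scms_step(x, data, h):
    """One subspace-constrained mean-shift step toward a density ridge.

    x    : (d,) current point
    data : (n, d) sample defining the kernel density estimate
    h    : kernel bandwidth
    """
    diffs = data - x                                      # (n, d)
    w = np.exp(-0.5 * np.sum(diffs**2, axis=1) / h**2)    # Gaussian kernel weights
    # Gradient and Hessian of the KDE at x (up to a common positive constant).
    grad = (w[:, None] * diffs).sum(axis=0) / h**2
    hess = (np.einsum('n,ni,nj->ij', w, diffs, diffs) / h**4
            - w.sum() * np.eye(x.size) / h**2)
    # Ridge tangent = eigenvector of the Hessian with the largest eigenvalue;
    # step only in the orthogonal ("across-ridge") directions, using the mean shift.
    _, eigvecs = np.linalg.eigh(hess)                     # eigenvalues in ascending order
    v_perp = eigvecs[:, :-1]                              # d-1 across-ridge directions
    shift = (w[:, None] * data).sum(axis=0) / w.sum() - x # ordinary mean-shift vector
    return x + v_perp @ (v_perp.T @ shift)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = rng.uniform(0.0, 2.0 * np.pi, 500)
    data = np.c_[np.cos(t), np.sin(t)] + 0.1 * rng.normal(size=(500, 2))  # noisy circle
    x = np.array([0.5, 0.5])
    for _ in range(100):
        x = scms_step(x, data, h=0.3)
    print("Projected point:", x, "norm (expected close to 1):", np.linalg.norm(x))
```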

Relevance:

100.00%

Publisher:

Abstract:

The recent rapid development of biotechnological approaches has enabled the production of large whole-genome-level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering to study and process biological data. The need is also increasing for tools that can be used by biological researchers themselves, who may not have a strong statistical or computational background, which requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.

Relevance:

100.00%

Publisher:

Abstract:

This research concerns the Urban Living Idea Contest conducted by Creator Space™ of BASF SE during its 150th anniversary in 2015. The main objectives of the thesis are to provide a comprehensive analysis of the Urban Living Idea Contest (ULIC) and to propose a number of improvement suggestions for future years. More than 4,000 data points were collected and analyzed to investigate the functionality of the different elements of the contest. Furthermore, a set of improvement suggestions was proposed to BASF SE. The novelty of this thesis lies in the data collection and the original analysis of the contest, which identified its critical elements as well as the areas that could be improved. The author of this research was a member of the organizing team and was involved in the decision-making process from the beginning until the end of the ULIC.

Relevance:

90.00%

Publisher:

Abstract:

This doctoral study, in the field of management and organizations, examines how the managers and administrative officers of customer-owned cooperatives respond to the institutional and competitive pressures their organizations face, and how they seek to influence the institutions of their organizational field in order to improve their firms' competitive position in the banking sector. The decision-making of customer-owned cooperatives is examined through three critical cases, which are analyzed using institutional organization theory. Applying this theoretical base to the study of strategic decision-making makes it possible to examine the dialogue between the various actors of customer-owned cooperatives and institutions more broadly than before. The longitudinal data (1939-2005) consist of a total of 57 interviews and extensive historical archival material. The key contribution of the work is linking the decision-making of customer-owned cooperatives to institutional organization theory, particularly to questions of legitimacy and of influencing institutions. The thesis argues that a customer-owned cooperative operating as a group in the banking sector seeks legitimacy not only from the organizational field but also from the local community, which creates a tension in conducting business and in strategic management.

Relevance:

90.00%

Publisher:

Abstract:

The main objective of this dissertation is to create new knowledge on an administrative innovation, its adoption, diffusion and, finally, its effectiveness. In this dissertation the administrative innovation is approached through a widely utilized management philosophy, namely the total quality management (TQM) strategy. TQM operationalizes a self-assessment procedure, which is based on continual improvement principles and on measuring the improvements. This dissertation also captures the theme of change management, as it analyzes the adoption and diffusion of the administrative innovation. It identifies innovation characteristics as well as organisational and individual factors explaining the adoption and implementation. As a special feature, this study also explores the effectiveness of the innovation based on objective data. For studying the administrative innovation (the TQM model), a multinational Case Company provides versatile ground for a deep, longitudinal analysis. The Case Company started the adoption systematically in the mid-1980s in some of its units; as part of its strategic planning today, the procedure is in use throughout the entire global company. The empirical story begins from the innovation adoption decision made in the Case Company over 22 years ago. In order to capture the right atmosphere and the background leading to the adoption decision, key informants from that time were interviewed, since the main target was to clarify the dynamics of how an administrative innovation develops. In addition, archival material was collected and studied; the available memos and data relating to the innovation, its adoption and later its implementation amounted to 20,500 pages of documents. A survey was furthermore conducted at the end of 2006, focusing on questions related to the innovation, organization and leadership characteristics, and the response rate was 54%. For measuring the effectiveness of the innovation implementation, the needed longitudinal objective performance data were collected. These data included the profit-unit-level experience of TQM, the development of the self-assessment scores per profit unit and performance data per profit unit measured with profitability, productivity and customer satisfaction. The data covered the years 1995-2006. As a result, the prerequisites for the successful adoption of an administrative innovation were defined, such as top management involvement, the support of change agents and effective tools for implementation and measurement. The factors with the greatest effect on the depth of the implementation were the timing of the adoption and formalization. The results also indicated that the TQM model does have an effect on company performance measured with profitability, productivity and customer satisfaction. Consequently, this thesis contributes to the present literature (i) by taking into its scope an administrative innovation and focusing on the whole innovation implementation process, from adoption through diffusion to its consequences, (ii) by grouping the multifaceted factors affecting innovation adoption and diffusion into individual, organizational and environmental factors, with a strong emphasis on the role of individual change agents, and (iii) by measuring the depth and consistency of the administrative innovation. This deep analysis was possible due to the availability of longitudinal data with triangulation possibilities.

Relevance:

90.00%

Publisher:

Abstract:

Recent years have produced great advances in instrumentation technology. The amount of available data has been increasing due to the simplicity, speed and accuracy of current spectroscopic instruments. Most of these data are, however, meaningless without proper analysis, which has been one of the reasons for the growing success of multivariate handling of such data. Industrial data are commonly not designed data; in other words, there is no exact experimental design, but rather the data have been collected as a routine procedure during an industrial process. This places certain demands on the multivariate modeling, as the selection of samples and variables can have an enormous effect. Common approaches to the modeling of industrial data are PCA (principal component analysis) and PLS (projection to latent structures, or partial least squares), but there are also other methods that should be considered. The more advanced methods include multi-block modeling and nonlinear modeling. In this thesis it is shown that the results of data analysis vary according to the modeling approach used, thus making the selection of the modeling approach dependent on the purpose of the model. If the model is intended to provide accurate predictions, the approach should differ from the case where the purpose of modeling is mostly to obtain information about the variables and the process. For industrial applicability it is essential that the methods are robust and sufficiently simple to apply. In this way the methods and the results can be compared and an approach selected that is suitable for the intended purpose. Differences between data analysis methods are compared in this thesis with data from different fields of industry. In the first two papers, the multi-block method is considered for data originating from the oil and fertilizer industries, and the results are compared to those from PLS and priority PLS. The third paper considers the applicability of multivariate models to process control for a reactive crystallization process. In the fourth paper, nonlinear modeling is examined with a data set from the oil industry. The response has a nonlinear relation to the descriptor matrix, and the results are compared between linear modeling, polynomial PLS and nonlinear modeling using nonlinear score vectors.
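A minimal sketch contrasting the two basic latent-variable approaches named above, assuming scikit-learn and synthetic "process" data; the thesis's multi-block, priority PLS and nonlinear variants are not reproduced, and all variable names and hyperparameters are illustrative:

```python
# Hedged sketch: PCA (descriptive, unsupervised) versus PLS (predictive latent
# variables) on synthetic data with two hidden process factors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 200, 10
latent = rng.normal(size=(n, 2))                        # two hidden process factors
X = latent @ rng.normal(size=(2, p)) + 0.3 * rng.normal(size=(n, p))
y = latent[:, 0] - 0.5 * latent[:, 1] + 0.1 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

pca = PCA(n_components=2).fit(X_tr)                     # describes X only
pls = PLSRegression(n_components=2).fit(X_tr, y_tr)     # latent variables predictive of y

print("PCA explained variance ratio:", pca.explained_variance_ratio_.round(3))
print("PLS test R^2:", round(pls.score(X_te, y_te), 3))
```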

Relevance:

90.00%

Publisher:

Abstract:

Raw measurement data do not always immediately convey useful information, but applying mathematical and statistical analysis tools to the measurement data can improve the situation. Data analysis can offer benefits such as acquiring meaningful insight from the dataset, basing critical decisions on the findings, and ruling out human bias through proper statistical treatment. In this thesis we analyze data from an industrial mineral processing plant with the aim of studying the possibility of forecasting the quality of the final product, given by one variable, with a model based on the other variables. For the study, mathematical tools such as Qlucore Omics Explorer (QOE) and Sparse Bayesian regression (SB) are used. Linear regression is then used to build a model based on a subset of variables that have the most significant weights in the SB model. The results obtained from QOE show that the variable representing the desired final product does not correlate with the other variables. For SB and linear regression, the results show that both the SB and linear regression models built on 1-day averaged data seriously underestimate the variance of the true data, whereas the two models built on 1-month averaged data are reliable and able to explain a larger proportion of the variability in the available data, making them suitable for prediction purposes. However, it is concluded that no single model can fit the whole available dataset well. It is therefore proposed as future work to build piecewise nonlinear regression models if the same dataset is used, or for the plant to provide another dataset, collected more systematically than the present data, for further analysis.
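A minimal sketch of the two-stage route described above, with scikit-learn's ARDRegression standing in for the Sparse Bayesian model and synthetic data in place of the plant measurements; the retained subset size and all names are illustrative, and the Qlucore Omics Explorer step is not reproduced:

```python
# Hedged sketch: sparse Bayesian (ARD) regression followed by ordinary linear
# regression on the variables with the largest ARD weights. Synthetic data only.
import numpy as np
from sklearn.linear_model import ARDRegression, LinearRegression

rng = np.random.default_rng(1)
n, p = 300, 15
X = rng.normal(size=(n, p))
true_coef = np.zeros(p)
true_coef[[0, 3, 7]] = [2.0, -1.5, 1.0]                  # only three variables matter
y = X @ true_coef + 0.5 * rng.normal(size=n)             # stand-in for product quality

ard = ARDRegression().fit(X, y)                          # sparse Bayesian fit
top = np.argsort(np.abs(ard.coef_))[-3:]                 # keep 3 largest weights (illustrative)
ols = LinearRegression().fit(X[:, top], y)               # simpler model on that subset

print("Selected variable indices:", sorted(top.tolist()))
print("Subset-model R^2:", round(ols.score(X[:, top], y), 3))
```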

Relevance:

90.00%

Publisher:

Abstract:

Nursing education aims to select students who are suited to the field, motivated, and successful in both theoretical and clinical studies. The purpose of this follow-up study was to compare the competence and study motivation of nursing students selected by an aptitude test and by a written examination. The aim was to make development proposals concerning student selection for nursing education on the basis of the results. The target group consisted of nursing students (N=626) admitted to nursing education (nursing, public health nursing, midwifery) at one university of applied sciences between autumn 2002 and autumn 2004 using two different entrance examination methods. Two cohorts were formed on the basis of the selection method: aptitude test (VAL1, N=368) and written examination (VAL2, N=258). The follow-up data were collected from the students' study register and with two structured instruments measuring the students' self-assessed nursing competence (OSAA instrument) and study motivation (MOTI instrument). Data were collected during the students' third semester (first measurement, 2004-2006, VAL1 n=234, VAL2 n=126) and at graduation (second measurement, 2006-2009, VAL1 n=149, VAL2 n=108). The response rate was 75.0% for the first measurement and 92.4% for the second. The data were analyzed using multivariate methods suitable for longitudinal research. Despite minor differences, the two selection methods selected students who were very similar in competence and study motivation. Students selected by the aptitude test perceived stronger group support at graduation than those selected by the written examination. The competence of the students selected by the written examination, as measured by third-semester grades, was better than that of the students selected by the aptitude test. Study orientation, work experience in health care, basic education and application priority were most significantly associated with the students' competence and study motivation. The selection method explained most of the observed differences in competence and study motivation, although the explained proportions remained low. The development proposals concern the development and regular evaluation of selection methods, as well as defining motivation for the field and improving its measurement. Suggested topics for further research include testing different selection methods and further developing the instruments used in this study.

Relevance:

90.00%

Publisher:

Abstract:

Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

90.00%

Publisher:

Abstract:

Studying the testis is complex because the tissue has a very heterogeneous cell composition and its structure changes dynamically during development. In the reproductive field, the cell composition is traditionally studied by morphometric methods such as immunohistochemistry and immunofluorescence. These techniques provide accurate quantitative information about cell composition, cell-cell associations and localization of the cells of interest. However, the sample preparation, processing, staining and data analysis are laborious and may take several working days. Flow cytometry protocols coupled with DNA stains have played an important role in providing quantitative information on testicular cell populations in ex vivo and in vitro studies. Nevertheless, the addition of specific cell markers, such as intracellular antibodies, would allow more specific identification of cells of crucial interest during spermatogenesis. For this study, adult Sprague-Dawley rats were used for optimization of the flow cytometry protocol. Specific steps within the protocol were optimized to obtain a single-cell suspension representative of the cell composition of the starting material. The fixation and permeabilization procedures were optimized to be compatible with DNA stains and fluorescent intracellular antibodies. Optimization was achieved by quantitative analysis of specific parameters such as the recovery of meiotic cells, the amount of debris and comparison of the proportions of the various cell populations with already published data. As a result, a new and fast flow cytometry method coupled with DNA staining and intracellular antigen detection was developed. This new technique is suitable for analysis of population behavior and of specific cells during postnatal testis development and spermatogenesis in rodents. The rapid protocol recapitulated the known vimentin and γH2AX protein expression patterns during rodent testis ontogenesis. Moreover, the assay was applicable to phenotype characterization of the SCRbKO and E2F1KO mouse models.

Relevance:

90.00%

Publisher:

Abstract:

This doctoral study conducts an empirical analysis of the impact of Word-of-Mouth (WOM) on marketing-relevant outcomes, such as attitudes and consumer choice, during a high-involvement and complex service decision. Due to its importance to decision-making, WOM has attracted interest from academia and practitioners for decades. Consumers are known to discuss products and services with one another. These discussions help consumers form an evaluative opinion, as WOM reduces perceived risk, simplifies complexity, and increases consumers' confidence in decision-making. These discussions are also highly impactful, as WOM is a trustworthy source of information that is independent of the company or brand. In responding to the calls for more research on what happens after WOM information is received, and how it affects marketing-relevant outcomes, this dissertation extends prior WOM literature by investigating how consumers process information in a high-involvement service domain, in particular higher education. Further, the dissertation studies how the form of WOM influences consumer choice. The research contributes to the WOM and services marketing literature by developing and empirically testing a framework for information processing and by studying the long-term effects of WOM. The results of the dissertation are presented in five research publications, which are based on longitudinal data. The research leads to the development of a proposed theoretical framework for the processing of WOM, based on theories from social psychology. The framework is specifically focused on service decisions, as it takes into account evaluation difficulty through the complex nature of the choice criteria associated with service purchase decisions. Further, other gaps in the current WOM literature are addressed by, for example, examining how the source of WOM and service values affect the processing mechanism. The research also provides implications for managers aiming to trigger favorable WOM through marketing efforts such as advertising and testimonials. The results provide suggestions on how to design these marketing efforts by taking into account the mechanism through which information is processed, or the form of social influence.

Relevance:

90.00%

Publisher:

Abstract:

In this thesis, the process of building software for transport accessibility analysis is described. The goal was to create software that is easy to distribute and simple to use for users without a particular background in geographical data analysis. It was shown that existing tools do not suit this particular task due to their complex interfaces or significant rendering times. The goal was accomplished by applying modern approaches to building web applications, such as maps based on vector tiles, the FLUX architecture design pattern and module bundling. It was found that vector tiles have considerable advantages over image-based tiles, such as faster rendering and real-time styling.

Relevance:

90.00%

Publisher:

Abstract:

This study analyzes a young Finnish micro-sized firm that is attempting to reach internationalization readiness in the pre-internationalization stage. The purpose of this research is to analyze and better understand how a young firm reaches internationalization readiness in the pre-internationalization stage. Small-firm internationalization is a widely researched topic, yet little emphasis has been placed on the specific antecedents that help a firm reach internationalization readiness in the pre-internationalization stage. The contribution of this research is thus twofold. First, the research contributes to known theories of firm internationalization. Second, it further extends knowledge of how firms reach internationalization readiness specifically in the pre-internationalization stage. The theoretical background of the research involves the traditional stage theory (the Uppsala model), pre-internationalization stage theory, international entrepreneurship theory and dynamic capabilities theory. With the help of these four relevant theories, empirical data were collected. The research method utilized in this study was a qualitative single case study combined with a critical realist philosophy. The data analysis was conducted using abduction in order to allow freedom in the analysis of the research findings. The empirical data were collected through semi-structured, face-to-face interviews, and the key respondents were the two managers of the case company. The findings revealed four important themes in the case company's path toward reaching internationalization readiness in the pre-internationalization stage. Extensive knowledge of the home market and of the target market were the two most important themes. The next most relevant theme was the managers' previous international business experience, and the final theme affecting the firm's ability to reach internationalization readiness was the firm's specific resources. Even though the research findings are case-specific, the research insights and explanations have the potential to be transferred to similar firms and contexts. Future research should therefore aim toward more longitudinal studies in which the context is emphasized, covering a variety of firms in similar stages of internationalization and similar contexts. Future studies of this kind would be of great benefit to academia.