118 results for Process analysis
Abstract:
The main objective of the study was to form a strategic process model and a project management tool to support future IFRS change implementation projects. The research results were designed on the basis of the theoretical framework of Total Quality Management, leaning on the facts collected during the empirical case study of the IAS 17 change. The usage of a process-oriented approach in IFRS standard change implementation after the initial IFRS implementation is rationalized with the following arguments: 1) well-designed process tools lead to the optimization of resources, and 2) with the help of process stages and related tasks it is easy to ensure an efficient way of working and managing the project, as well as to make sure that all necessary stakeholders are included in the change process. The research follows a qualitative approach and the analysis is descriptive in format. The first part of the study is a literature review and the latter part was conducted as a case study. The data was collected in the case company through interviews and observation. The main findings are a process model for the IFRS standard change process and a checklist-formatted management tool for upcoming IFRS standard change projects. The process flow follows the main cornerstones of the IASB’s standard-setting process, and the management tool has been divided into stages accordingly.
Abstract:
Preparative liquid chromatography is one of the most selective separation techniques in the fine chemical, pharmaceutical, and food industries. Several process concepts have been developed and applied to improve the performance of classical batch chromatography. The most powerful approaches include various single-column recycling schemes, counter-current and cross-current multi-column setups, and hybrid processes where chromatography is coupled with other unit operations such as crystallization, a chemical reactor, and/or a solvent removal unit. To fully utilize the potential of stand-alone and integrated chromatographic processes, efficient methods for selecting the best process alternative as well as the optimal operating conditions are needed. In this thesis, a unified method is developed for the analysis and design of the following single-column fixed bed processes and corresponding cross-current schemes: (1) batch chromatography, (2) batch chromatography with an integrated solvent removal unit, (3) mixed-recycle steady state recycling chromatography (SSR), and (4) mixed-recycle steady state recycling chromatography with solvent removal from the fresh feed, the recycle fraction, or the column feed (SSR–SR). The method is based on the equilibrium theory of chromatography, with the assumption of negligible mass transfer resistance and axial dispersion. The design criteria are given in a general, dimensionless form that is formally analogous to the form applied widely in the so-called triangle theory of counter-current multi-column chromatography. Analytical design equations are derived for binary systems that follow the competitive Langmuir adsorption isotherm model. For this purpose, the existing analytical solution of the ideal model of chromatography for binary Langmuir mixtures is completed by deriving the missing explicit equations for the height and location of the pure first-component shock in the case of a small feed pulse. It is thus shown that the entire chromatographic cycle at the column outlet can be expressed in closed form. The developed design method allows predicting the feasible range of operating parameters that lead to the desired product purities. It can be applied to the calculation of first estimates of optimal operating conditions, the analysis of process robustness, and the early-stage evaluation of different process alternatives. The design method is utilized to analyse the possibility of enhancing the performance of conventional SSR chromatography by integrating it with a solvent removal unit. It is shown that the amount of fresh feed processed during a chromatographic cycle, and thus the productivity of the SSR process, can be improved by removing solvent. The maximum solvent removal capacity depends on the location of the solvent removal unit and on physical solvent removal constraints such as solubility, viscosity, and/or osmotic pressure limits. Usually, the most flexible option is to remove solvent from the column feed. The applicability of the equilibrium design to real, non-ideal separation problems is evaluated by means of numerical simulations. Due to the assumption of infinite column efficiency, the developed design method is most applicable to high-performance systems where thermodynamic effects are predominant, while significant deviations are observed under highly non-ideal conditions. The findings based on the equilibrium theory are applied to develop a shortcut approach for the design of chromatographic separation processes under strongly non-ideal conditions with significant dispersive effects. The method is based on a simple procedure applied to a single conventional chromatogram. The applicability of the approach to the design of batch and counter-current simulated moving bed processes is evaluated with case studies. It is shown that the shortcut approach performs better the higher the column efficiency and the lower the purity constraints.
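For reference, the competitive Langmuir isotherm underlying the design equations above has the standard two-component form (notation assumed here, since the abstract does not fix it: \(q_i\) is the adsorbed-phase and \(c_i\) the fluid-phase concentration of component \(i\), with \(a_i\) and \(b_i\) the isotherm parameters):

\[ q_i = \frac{a_i\, c_i}{1 + b_1 c_1 + b_2 c_2}, \qquad i = 1, 2. \]

Equilibrium-theory design methods of the kind described exploit the fact that, under this isotherm, the wave interactions of a binary feed pulse in the ideal model admit an analytical solution.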
Abstract:
The aim of this thesis was to develop the category planning process in a case company operating in the construction industry. As research interest in the benefits of category management and planning has only recently started to emerge, the theoretical background was derived from the literature on subjects related to category planning, i.e. procurement strategy, purchasing portfolio models, information flow management, and cost analysis. The background for the development of the category planning process was derived from the retail industry, where category planning has been researched more extensively. The empirical study was executed with a mixed-method approach: quantitative data on the categories was analyzed, and qualitative data was gathered through semi-structured interviews and discussions within the case company. As a result, the category planning process was critically analyzed and development proposals were put forward for improving the process description. Additionally, a tool was developed based on the empirical study to support the category planning process of the case company.
Abstract:
The target of this thesis is to evaluate a bid, project, and resource management IT tool for the service delivery process via a proof-of-concept (POC) project, in order to assess whether the tested software is an appropriate tool for the Case Company’s business requirements. The literature suggests that IT project implementation is still a grey area in scientific research. IT projects also have a notably high rate of failure, one significant reason being insufficient planning. To tackle this risk, the Case Company decided to perform a POC project, which involved a hands-on testing period of the assessed system. End users from the business side feel that the current, highly tailored project management tool is inflexible, difficult to use, and sets unnecessary limitations on the business. Semi-structured interviews and a survey form are used to collect information about current business practices and the business requirements related to the IT tool. For the POC project, a project group involving members from each of the Case Company’s four business divisions was established to perform the hands-on testing. Based on the data acquired during the interviews and the hands-on testing period, a target state was defined and a gap analysis was carried out by comparing the features provided by the current tool and the tested tool against the target state. These, together with the current state description, are the most important results of the thesis.
Abstract:
Laser additive manufacturing (LAM), also known as 3D printing, is a powder bed fusion (PBF) type of additive manufacturing (AM) technology used to manufacture metal parts layer by layer with the assistance of a laser beam. The development of the technology from building mere prototype parts to functional parts is due to its design flexibility, as well as the possibility of manufacturing tailored and optimised components in terms of performance and the strength-to-weight ratio of the final parts. The study of energy and raw material consumption in LAM is essential, as it might facilitate the adoption and usage of the technique in manufacturing industries. The objective of this thesis was to determine the environmental and economic impacts of LAM and to conduct a life cycle inventory (LCI) of CNC machining and LAM in terms of energy and raw material consumption at the production phase. The literature overview in this thesis covers sustainability issues in manufacturing industries with a focus on environmental and economic aspects; life cycle assessment and its applicability in the manufacturing industry were also studied. The UPLCI-CO2PE! Initiative was identified as the most widely applied existing methodology for conducting LCI analysis in a discrete manufacturing process like LAM. Much of the reviewed literature focused on PBF of polymeric materials and only a few studies considered metallic materials. The studies that included metallic materials had only measured the input and output energy or materials of the process and compared different AM systems, without comparing them to any competitive process; nor did any include the effect of process variation when building metallic parts with LAM. In this thesis, experimental tests were carried out to produce dissimilar samples with CNC machining and LAM. The test samples were designed to include part complexity and weight reductions. A PUMA 2500Y lathe was used for the CNC machining, whereas a modified research machine representing the EOSINT M-series was used for the LAM. The raw materials used for making the test pieces were stainless steel 316L bar (CNC-machined parts) and stainless steel 316L powder (LAM-built parts). An analysis of the power, time, and energy consumed in each of the manufacturing processes at the production phase showed that LAM uses more energy than CNC machining. The high energy consumption was a result of the long duration of production. The energy consumption profiles in CNC machining showed fluctuations with high and low power ranges, while the LAM energy usage within each specific mode (standby, heating, process, sawing) remained relatively constant throughout production. CNC machining was limited in terms of manufacturing freedom, as it was not possible to manufacture all the designed samples by machining, and the sample that could be machined required a large amount of material to be removed as waste. The planning phase in LAM was shorter than in CNC machining, as the latter required many preparation steps. The specific energy consumption (SEC) of LAM was estimated based on the practical results and an assumed platform utilisation. The estimates showed that the SEC could be reduced by placing more parts in one build than in the empirical tests of this thesis (six parts).
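As a rough sketch of the platform-utilisation effect noted above (an illustrative model with assumed notation, not the thesis’s actual calculation): if a build carries a fixed energy overhead \(E_{\text{fixed}}\) for heating, atmosphere control, and standby, plus a per-part melting energy \(E_{\text{part}}\), then the specific energy consumption for a build of \(n\) identical parts of mass \(m\) is

\[ \mathrm{SEC}(n) = \frac{E_{\text{fixed}} + n\,E_{\text{part}}}{n\,m}, \]

which decreases towards \(E_{\text{part}}/m\) as \(n\) grows, consistent with the finding that the SEC falls when more parts share one build.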
Abstract:
Process management refers to improving the key functions of a company. The main functions of the case company, i.e. project management, procurement, finance, and human resources, use their own separate systems. The case company is in the process of changing its software, and the different functions will use the same system in the future. This software change causes changes in some of the company’s processes, one of which is the project cash flow forecasting process. Cash flow forecasting ensures the sufficiency of money and prepares for possible changes in the future, which helps to ensure the company’s viability. The purpose of the research is to describe a new project cash flow forecasting process. In addition, the aim is to analyze the impacts of the process change on the project control department’s workload and resources through process measurement, and how these impacts should be taken into account in the department’s future operations. The research is based on process management. Processes, their descriptions, and the way process management uses this information are discussed in the theory part of the research, which is based on literature and articles. Project cash flow and the benefits of forecasting are also discussed. After this, the as-is and to-be project cash flow forecasting processes are described by utilizing information obtained from the theoretical part as well as the know-how of the project control department’s personnel. Written descriptions and cross-functional flowcharts are used for the descriptions. The process measurement is based on interviews with the personnel, mainly cost controllers and department managers. The process change and the integration of the two processes will free up work time for other things, for example the analysis of costs. In addition, the quality of the cash flow information will improve compared to the as-is process. When analyzing the department’s other main processes, the department’s roles and their responsibilities should be reviewed and redesigned. This way, there will be an opportunity to achieve the best possible efficiency and cost savings.
Abstract:
The aim of this Master’s Thesis is to find applicable methods from the process management literature for improving reporting and internal control in a multinational corporation. The method of analysis is qualitative and the research is conducted as a case study. Empirical data collection is carried out through interviews and participant observation. The theoretical framework is built around reporting and guidance between the parent company and a subsidiary, searching for means to improve them through process thinking and applicable frameworks. In the thesis, the intercompany reporting process of the case company is modelled, and its weak points, risks, and development targets are identified. The framework of critical success factors in process improvement is utilized in assessing the development targets. Internal control is also analyzed with the tools of process thinking. As a result of this thesis, suggestions for actions to improve the reporting process and internal control are made to the case company, the most essential of which are ensuring top management’s awareness of and commitment to improvement, creating guidelines and tools for internal control, and creating and implementing an improved intercompany reporting process.
Abstract:
The main objective of this study was to establish the bases for formulating an innovation model in an existing organization, based on cases. Innovation processes can be analyzed based on their needs and on their emphasis on business model development or R&D. The research was conducted within one company in the energy sector, utilizing its projects as cases for the study. Development in this field of business is typically slow, although the case company has put emphasis on its innovation efforts. The analysis was done by identifying the needs of the cases and comparing them. The results show that, because of the variance in the needs of the cases, the applicability of innovation process models varies. It was discovered that by dividing the process into two phases, a uniform model could be composed that would fulfill the needs of the cases as well as potential future projects.
Abstract:
The shift towards a knowledge-based economy has inevitably prompted the evolution of patent exploitation. Nowadays, a patent is more than just a prevention tool for a company to block its competitors from developing rival technologies; it lies at the very heart of the company’s strategy for value creation and is therefore strategically exploited for economic profit and competitive advantage. Along with the evolution of patent exploitation, the demand for reliable and systematic patent valuation has also reached an unprecedented level. However, most of the quantitative approaches in use to assess patents, which arguably fall into four categories, are based solely on conventional discounted cash flow analysis, whose usability and reliability in the context of patent valuation are greatly limited by five practical issues: market illiquidity, poor data availability, discriminatory cash-flow estimations, and the incapability to account for changing risk and for managerial flexibility. This dissertation attempts to overcome these impeding barriers by rationalizing the use of two techniques, namely fuzzy set theory (aimed at the first three issues) and real option analysis (aimed at the last two). It commences with an investigation into the nature of the uncertainties inherent in patent cash flow estimation and claims that two levels of uncertainty must be properly accounted for. Further investigation reveals that both levels of uncertainty fall under the categorization of subjective uncertainty, which differs from objective uncertainty originating from inherent randomness: uncertainties labelled as subjective are highly related to the behavioural aspects of decision making and are usually witnessed whenever human judgement, evaluation, or reasoning is crucial to the system under consideration and there is a lack of complete knowledge of its variables. Having clarified their nature, the application of fuzzy set theory to modelling patent-related uncertain quantities is readily justified. The application of real option analysis to patent valuation is prompted by the fact that both the patent application process and the subsequent patent exploitation (or commercialization) are subject to a wide range of decisions at multiple successive stages. In other words, both patent applicants and patentees are faced with a large variety of courses of action as to how their patent applications and granted patents can be managed. Since they have the right to run their projects actively, this flexibility has value and thus must be properly accounted for. Accordingly, this dissertation provides an explicit identification of the types of managerial flexibility inherent in patent-related decision-making problems and in patent valuation, and a discussion of how they can be interpreted in terms of real options. Additionally, the use of the proposed techniques in practical applications is demonstrated by three models based on fuzzy real option analysis. In particular, the pay-off method and the extended fuzzy Black-Scholes model are employed to investigate the profitability of a patent application project for a new process for the preparation of a gypsum-fibre composite and to justify the subsequent patent commercialization decision, respectively; a fuzzy binomial model is designed to reveal the economic potential of a patent licensing opportunity.
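For orientation, the crisp Black-Scholes call value that the extended fuzzy model builds on is sketched below; in fuzzy extensions the inputs (underlying value, volatility, and so on) become fuzzy numbers evaluated over their α-cuts. This is a minimal sketch of the crisp core only, not the dissertation’s fuzzy model; the function name, parameter names, and example numbers are illustrative assumptions.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(S, K, r, sigma, T):
    """Crisp Black-Scholes value of a European call. In the real-option
    reading, S is the present value of expected commercialization cash
    flows, K the commercialization cost, r the risk-free rate, sigma
    the volatility of the cash flow value, and T the deferral horizon."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Example (hypothetical figures): cash flows worth 1.2 MEUR, commercialization
# cost 1.0 MEUR, 5 % rate, 40 % volatility, 3-year deferral window.
print(black_scholes_call(1.2, 1.0, 0.05, 0.40, 3.0))
```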
Abstract:
The goal of this thesis is to study user-driven innovations and user involvement throughout the innovation process in the context of B2B companies. Significant emphasis in the analysis is put on the late stages of the innovation process and the commercialization of innovations. The thesis includes a detailed review of the theoretical concepts and underlying frameworks of the innovation process, lead users, and user-driven innovations. The empirical part of the thesis consists of interviews with four companies from the ICT industry, followed by a comprehensive analysis and comparison of the results. The presented findings indicate common challenges that ICT companies face when shifting towards the innovation-by-users paradigm. The linkages and connections between the current situation and the theoretical frameworks, presented in the discussion part of the thesis, allow practical managerial implications to be drawn. The results of the research highlight valuable insights into and challenges of user interactions within the innovation process, as well as the output- and participation-related benefits for companies and users. The research points out the current state of user involvement techniques and the tools used for user interactions, and suggests possibilities for improvement in the future.
Abstract:
The aim of this research was to develop a piping stress analysis guideline to be widely used in Neste Jacobs Oy’s domestic and foreign projects. The company’s former guideline for performing stress analysis was partial and lacked important features, which were to be fixed through this research. The development of the guideline was based on literature research and on gathering existing knowledge from experts in piping engineering. The case study method was utilized by performing stress analysis on an existing project with the help of the new guideline. Piping components, piping engineering in the process industry, and piping stress analysis were studied in the theory section of this research. The existing piping standards were also studied and compared with one another. By utilizing the theory found in the literature and the vast experience and know-how collected from the company’s employees, a new guideline for stress analysis was developed, to be widely used in various projects. The purpose of the guideline was to clarify certain issues, such as which piping has to be analyzed, how different material values are determined, and how the results are to be reported. As a result, an extensive and comprehensive guideline for stress analysis was created. The new guideline defines formerly unclear points more clearly and sets clear parameters for performing calculations. The guideline is meant to be used by both new and experienced analysts, and with its aid the calculation process was unified throughout the whole company’s organization. A case study was used to exhibit how the guideline is utilized in practice and how it benefits the calculation process.
Abstract:
This thesis introduces heat demand forecasting models generated using data mining algorithms. The forecast spans one full day and can be used to regulate the heat consumption of buildings. For training the data mining models, two years of heat consumption data from a case building and weather measurement data from the Finnish Meteorological Institute are used. The thesis utilizes Microsoft SQL Server Analysis Services data mining tools to generate the data mining models and the CRISP-DM process framework to implement the research. The results show that the built models can predict heat demand at best with a mean absolute percentage error of 3.8% for the 24-h profile and 5.9% for the full day. A deployment model for integrating the generated data mining models into an existing building energy management system is also discussed.
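The error metric reported above is taken here to be the standard mean absolute percentage error (MAPE); a minimal sketch of its computation under that assumption (function name and example values are hypothetical):

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error in percent, as commonly used for
    evaluating forecasts such as 24-h heat demand profiles. Assumes no
    zero values in the actual series."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100.0)

# Example: hourly heat demand in kW versus a model forecast.
print(mape([120.0, 95.0, 80.0], [115.0, 101.0, 78.0]))
```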
Abstract:
This Master’s thesis evaluates competitors in the market for welding quality management software. The competitive field is new, and there is no precise information on what kinds of competitors are in the market. Welding quality management software helps companies guarantee high quality: it ensures high quality by verifying that the welder is qualified and follows the welding instructions and the given parameters. In addition, the software collects all data from the welding process and generates the required documents from it. The theoretical part of the thesis consists of a literature review of solution business, competitor analysis and competitive forces theory, and welding quality management. The empirical part is a qualitative study in which competing welding quality management software products are examined and their users interviewed. As a result, the thesis produces a new competitor analysis model for welding quality management software. With the model, the software products can be rated based on the primary and secondary features they offer. Second, the thesis analyses the current competitive situation by applying the newly developed competitor analysis model.
Abstract:
An exchange traded fund (ETF) is a financial instrument that tracks some predetermined index. Since their initial establishment in 1993, ETFs have grown in importance in the field of passive investing. The main reason for the growth of the ETF industry is that ETFs combine the benefits of stock investing and mutual fund investing. Although ETFs resemble mutual funds in many ways, there are also many differences, and ETFs differ not only from mutual funds but also among each other. ETFs can be divided into two categories, i.e. market capitalisation ETFs and fundamental (or strategic) ETFs, and further into subcategories depending on their fundament basis. ETFs are a useful tool for diversification, especially for a long-term investor. Although the economic importance of ETFs has risen drastically during the past 25 years, the differences and risk-return characteristics of fundamental ETFs have so far remained a rather unstudied area; in fact, no previous research comparing market capitalisation and fundamental ETFs was found during the research process. For its part, this thesis seeks to fill this research gap. The studied data consist of 50 market capitalisation ETFs and 50 fundamental ETFs. The fundaments on which the indices tracked by the fundamental ETFs are based were neither limited nor segregated into subsections; the two types of ETFs were studied at an aggregate level as two different research groups. The dataset ranges from June 2006 to December 2014, with 103 monthly observations, and was gathered using the Bloomberg Terminal. The analysis was conducted as an econometric performance analysis. In addition to other econometric measures, the methods used in the performance analysis included modified Value-at-Risk, the modified Sharpe ratio, and the Treynor ratio. The results supported the hypothesis that passive market capitalisation ETFs outperform active fundamental ETFs in terms of risk-adjusted returns, though the difference is rather small. Nevertheless, when taking into account the higher overall trading costs of the fundamental ETFs, the underperformance gap widens. According to the research results, market capitalisation ETFs are a recommendable diversification instrument for a long-term investor. In addition to better risk-adjusted returns, passive ETFs are more transparent and the bases of their underlying indices are simpler than those of fundamental ETFs. ETFs are still a young financial innovation and hence data is scarcely available. In future research, it would be valuable to study the differences in risk-adjusted returns also between the subsections of fundamental ETFs.
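The risk-adjusted measures named above can be sketched as follows. This is a hedged illustration using the common Cornish-Fisher form of modified VaR; the function names, the 95 % confidence level, and the sample-statistics conventions are assumptions, not details taken from the thesis.

```python
import numpy as np
from scipy.stats import norm, skew, kurtosis

def modified_var(returns, alpha=0.05):
    """Cornish-Fisher modified Value-at-Risk: adjusts the normal
    quantile for the skewness and excess kurtosis of the returns."""
    r = np.asarray(returns, dtype=float)
    mu, sigma = r.mean(), r.std(ddof=1)
    s, k = skew(r), kurtosis(r)           # k = excess kurtosis
    z = norm.ppf(alpha)                   # e.g. -1.645 at the 95 % level
    z_cf = (z + (z**2 - 1) * s / 6
              + (z**3 - 3 * z) * k / 24
              - (2 * z**3 - 5 * z) * s**2 / 36)
    return -(mu + z_cf * sigma)           # loss reported as a positive number

def modified_sharpe(returns, rf=0.0, alpha=0.05):
    """Modified Sharpe ratio: excess return per unit of modified VaR."""
    r = np.asarray(returns, dtype=float)
    return (r.mean() - rf) / modified_var(r, alpha)

def treynor(returns, beta, rf=0.0):
    """Treynor ratio: excess return per unit of systematic risk (beta)."""
    return (np.mean(returns) - rf) / beta
```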
Abstract:
The recent rapid development of biotechnological approaches has enabled the production of large whole genome level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering to study and process biological data. The need is also increasing for tools that can be used by the biological researchers themselves who may not have a strong statistical or computational background, which requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.