Abstract:
Internal logistics involves storage at many different stages. Raw materials, semi-finished and finished products have to be stored on the company's premises, and attention must be paid to the quality of warehousing at every stage. Warehousing quality can be managed with a quality management system based on the ISO 9000 standard, whose main areas are management responsibility, resource management, product realization, and measurement, analysis and improvement. The most important measure of quality in logistics and warehousing is customer satisfaction. Ultimately, customers decide whether a product or service meets the requirements, so fulfilling the requirements defined by the customer is of primary importance. Other warehousing metrics include performance indicators, covering quality and cost measures, as well as the ability to renew. In the empirical part of the work, potential quality deficiencies in intermediate storage at the Rautpohja unit of Metso Paper Oy were surveyed with various metrics. Based on the results, an operating model was developed, the parties requiring instructions were identified, and a proposal for organizing follow-up monitoring was made. The work also examines how responsibility for intermediate storage is divided.
Abstract:
In the liberalized electricity markets that have emerged in many countries around the world, electricity distribution companies operate under competitive conditions. Accurate information about customers' energy consumption therefore plays an essential role in the budgeting of a distribution company and in the correct planning and operation of the distribution network. This master's thesis focuses on the potential benefits of automatic meter reading (AMR) systems for electric utilities and residential customers. The major benefits of AMR illustrated in the thesis are distribution network management, power quality monitoring, load modelling, and detection of illegal electricity usage. Using power system state estimation as an example, it is shown that even partial installation of AMR on the customer side leads to more accurate data about voltage and power levels throughout the network. The thesis also describes the present state of AMR integration in Russia.
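The state estimation argument can be illustrated with a toy example: adding AMR measurements to a weighted least squares (WLS) estimator reduces the variance of the estimated state. The sketch below is not from the thesis; the measurement set, matrix entries and standard deviations are invented purely for illustration.

```python
import numpy as np

# Minimal weighted least squares (WLS) state estimation sketch.
# The "state" is two unknown bus voltages; each row of H describes one
# (linear) measurement of them. All numbers are invented for illustration.

def wls_covariance(H, sigmas):
    """Covariance of the WLS state estimate, Cov = (H^T R^-1 H)^-1."""
    R_inv = np.diag(1.0 / np.asarray(sigmas) ** 2)
    return np.linalg.inv(H.T @ R_inv @ H)

# Base case: only substation-level measurements (both buses seen together).
H_base = np.array([[1.0, 0.0],
                   [1.0, 1.0]])
sigma_base = [0.02, 0.05]

# With AMR: an extra, fairly accurate measurement directly at bus 2.
H_amr = np.vstack([H_base, [0.0, 1.0]])
sigma_amr = sigma_base + [0.01]

cov_base = wls_covariance(H_base, sigma_base)
cov_amr = wls_covariance(H_amr, sigma_amr)

print("state variances without AMR:", np.diag(cov_base))
print("state variances with AMR:   ", np.diag(cov_amr))
# The diagonal entries shrink once the AMR measurement is added,
# i.e. the whole state is estimated more accurately.
```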
Abstract:
Traditionally, researchers have considered the innovation process to be gender neutral. Recently, however, some studies have begun to take gender diversity into account as a determinant of firms' innovation. This paper analyses how the effect of gender diversity on innovation output at the firm level depends on team size. Using the Spanish PITEC (Panel de Innovación Tecnológica) from 2007 to 2012 for innovative manufacturing and service firms, we estimate a multivariate probit model to analyse how gender diversity, both in R&D teams and in the total workforce, affects product, process, marketing and organizational innovations. Our results show that gender-diverse teams increase the probability of innovating, and that this capacity is positively related to team size. Gender diversity, in both the R&D department and the total workforce, has a larger positive impact on the probability of carrying out product and organizational innovations in larger teams than in smaller ones. This effect is less clear-cut for marketing and process innovation, where the impact is only significant for micro and small firms. Finally, size effects are of greater importance when we distinguish between the manufacturing and service sectors. JEL Codes: O30, O31, J16
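As a rough illustration of the kind of specification described here, the sketch below fits a probit regression with an interaction between gender diversity and team size using statsmodels. It deliberately simplifies: separate univariate probits (one per innovation type) ignore the cross-equation error correlations of a true multivariate probit, and all data and variable names are synthetic, not the PITEC panel.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for PITEC-style firm data (not the real panel).
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "gender_div": rng.uniform(0, 1, n),    # share of women in the R&D team
    "team_size": rng.integers(2, 60, n),   # R&D team headcount
})
# Latent propensity: diversity matters more in larger teams (interaction term).
latent = (-0.8 + 1.0 * df.gender_div + 0.01 * df.team_size
          + 0.03 * df.gender_div * df.team_size + rng.normal(0, 1, n))
df["product_innov"] = (latent > 0).astype(int)

X = sm.add_constant(pd.DataFrame({
    "gender_div": df.gender_div,
    "team_size": df.team_size,
    "div_x_size": df.gender_div * df.team_size,
}))

# One univariate probit per innovation type; a multivariate probit would
# additionally model the correlation between the four innovation outcomes.
probit = sm.Probit(df["product_innov"], X).fit(disp=0)
print(probit.summary())
```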
Abstract:
Validation and verification operations encounter various challenges in the product development process. Requirements for a faster development cycle place new demands on the component development process. Verification and validation usually represent the largest activities, consuming up to 40–50 % of R&D resources. This research studies validation and verification as part of the case company's component development process. The target is to define a framework that can be used to improve the evaluation and development of validation and verification capability in display module development projects. The definition and background of validation and verification are studied, together with theories of project management, systems thinking, organisational learning and causality. The framework and key findings of the research are presented, and a feedback system based on the framework is defined and implemented at the case company. The research is divided into a theory part, conducted as a literature review, and an empirical part, carried out as a case study using the constructive and design research methods. A framework for capability evaluation and development was defined and developed as a result of this research. A key finding was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution; as an additional result, some minor changes to the validation and verification process were proposed. A few concerns about the validity and reliability of the study are raised, the most important being the selected research method and the selected model itself: the end state can be normative, since the researcher may set expectations for the results already in the initial state of the study. Finally, the reliability and validity of the work are discussed.
Abstract:
Chemical-looping combustion (CLC) is a novel combustion technology with inherent separation of the greenhouse gas CO2. The technique typically employs a dual fluidized bed system in which a metal oxide serves as a solid oxygen carrier that transfers oxygen from the combustion air to the fuel. The oxygen carrier loops between the air reactor, where it is oxidized by air, and the fuel reactor, where it is reduced by the fuel. Hence, air is not mixed with the fuel, and the outgoing CO2 is not diluted by nitrogen, which makes it possible to collect the CO2 from the flue gases after the water vapor has been condensed. CLC has been proposed as a promising and energy-efficient carbon capture technology, since it can increase power station efficiency while imposing only a low energy penalty for the carbon capture. The outcome of a comprehensive literature study on the current status of CLC development is presented in this thesis. In addition, a steady-state model of the CLC process, based on the conservation equations of mass and energy, was developed. The model was used to determine the process conditions and to calculate the reactor dimensions of a 100 MWth CLC system with bunsenite (NiO) as the oxygen carrier and methane (CH4) as the fuel. This study was carried out in the Oxygen Carriers and Their Industrial Applications research project (2008–2011), funded by the Tekes Functional Material programme. I would like to thank Tekes and the participating companies for funding and all project partners for good and pleasant cooperation.
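For reference, the overall looping chemistry with NiO as the oxygen carrier and methane as the fuel can be written as the following pair of reactions; this is standard CLC stoichiometry rather than a detail quoted from the thesis:

```latex
% Fuel reactor: the carrier is reduced by methane
\mathrm{4\,NiO + CH_4 \;\rightarrow\; 4\,Ni + CO_2 + 2\,H_2O}
% Air reactor: the reduced carrier is re-oxidised by the combustion air
\mathrm{2\,Ni + O_2 \;\rightarrow\; 2\,NiO}
```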
Abstract:
Software integration is a stage in the software development process in which separate components are assembled into a single product. It is important to manage the risks involved and to be able to integrate smoothly, because software cannot be released without integrating it first. Furthermore, it has been shown that the integration and testing phase can make up 40 % of the overall project costs. These issues can be mitigated by using a software engineering practice called continuous integration. This thesis presents how continuous integration was introduced to the author's employer organisation. This includes studying how the continuous integration process works and creating the technical basis for using the process on future projects. The implemented system supports software written in the C and C++ programming languages on the Linux platform, but the general concepts can be applied to any programming language and platform by selecting the appropriate tools. The results demonstrate in detail which issues need to be solved when the process is adopted in a corporate environment. Additionally, they provide an implementation and a process description suited to the organisation. The results show that continuous integration can reduce the risks involved in a software process and increase the quality of the product as well.
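As a hedged sketch of what a minimal CI job for a C/C++ project on Linux might run on every commit: the build system, directory names and tool choices below are assumptions for illustration, not a description of the organisation's actual setup.

```python
#!/usr/bin/env python3
"""Minimal continuous-integration step: configure, build and test the project.

A CI server would invoke a script like this on every pushed commit; the
cmake/ctest commands assume a CMake-based C/C++ project, which is only an
illustrative choice.
"""
import subprocess
import sys

STEPS = [
    ["cmake", "-S", ".", "-B", "build"],                       # configure
    ["cmake", "--build", "build", "--parallel"],               # compile
    ["ctest", "--test-dir", "build", "--output-on-failure"],   # run unit tests
]

def main() -> int:
    for cmd in STEPS:
        print("+", " ".join(cmd), flush=True)
        result = subprocess.run(cmd)
        if result.returncode != 0:
            # A failing step marks the commit as broken; the CI server can
            # then notify the team immediately instead of at integration time.
            return result.returncode
    return 0

if __name__ == "__main__":
    sys.exit(main())
```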
Abstract:
Especially in global enterprises, key data is fragmented across multiple Enterprise Resource Planning (ERP) systems, and is therefore inconsistent, fragmented and redundant across the various systems. Master Data Management (MDM) is a concept that creates cross-references between customers, suppliers and business units, and enables corporate hierarchies and structures. The overall goal of MDM is the ability to create an enterprise-wide consistent data model that enables analysing and reporting customer and supplier data. The goal of the study was to define the properties and success factors of a master data system. The theoretical background was based on the literature, and the case consisted of enterprise-specific needs and demands. The theoretical part presents the concept, background and principles of MDM and then the phases of system planning and implementation. The case part consists of the background, a definition of the as-is situation, the definition of the project and the evaluation criteria, and concludes with the key results of the thesis. The final chapter, Conclusions, combines common principles with the results of the case. The case part ended up dividing the important factors of the system into success factors, technical requirements and business benefits. To clarify the project and secure funding for it, the business benefits have to be defined and their realization monitored. The thesis identified six success factors for the MDM system: a well-defined business case; data management and monitoring; data models and structures defined and maintained; customer and supplier data governance, delivery and quality; commitment; and continuous communication with the business. Technical requirements emerged several times during the thesis and therefore cannot be ignored in the project. The Conclusions chapter reviews these factors at a general level. The success factors and technical requirements are related to the essentials of MDM: governance, action and quality. The chapter can be used as guidance in a master data management project.
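The cross-referencing idea at the core of MDM can be pictured with a small sketch: records describing the same customer in different ERP systems are mapped onto one enterprise-wide master identifier. The structures, field names and matching rule below are illustrative only and are not taken from the thesis.

```python
from dataclasses import dataclass

# Illustrative only: a tiny cross-reference model mapping per-ERP customer
# records onto a single enterprise-wide master ("golden") record.

@dataclass
class SourceRecord:
    system: str      # which ERP the record comes from
    local_id: str    # identifier inside that ERP
    name: str

@dataclass
class MasterRecord:
    master_id: str
    golden_name: str
    sources: list    # SourceRecords cross-referenced to this master

erp_records = [
    SourceRecord("ERP-Europe", "C-1001", "Acme Oy"),
    SourceRecord("ERP-Americas", "CUST-77", "ACME OY"),
]

# A real MDM hub would apply matching and survivorship rules; here we simply
# normalise the name to decide that both records describe the same customer.
master = MasterRecord(
    master_id="MDM-000001",
    golden_name=erp_records[0].name,
    sources=[r for r in erp_records
             if r.name.lower() == erp_records[0].name.lower()],
)

print(master.master_id, "->", [(s.system, s.local_id) for s in master.sources])
```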
Abstract:
Strategic development of distribution networks plays a key role in the asset management of electricity distribution companies. Owing to the capital-intensive nature of the field and the long time span of operations, the significance of a strategy is emphasised. A well-devised strategy combines awareness of the challenges posed by the operating environment with the future targets of the distribution company. Economic regulation, ageing infrastructure, scarcity of resources and tightening supply requirements, together with the challenges created by climate change, put pressure on the strategy work. On the other hand, technology development related to network automation and underground cabling helps to answer these challenges. This dissertation aims at developing process knowledge and establishing a methodological framework by which key issues related to network development can be addressed. Moreover, the work develops tools by which the effects of changes in the operating environment on the distribution business can be analysed in the strategy work. To this end, the work discusses certain characteristics of the distribution business and describes the strategy process at a general level. Further, the work defines the subtasks in the strategy process and presents the key elements in the strategy work and long-term network planning. The work delineates the factors having either a direct or an indirect effect on strategic planning and on development needs in the networks; in particular, outage costs constitute an important part of the economic regulation of the distribution business, reliability thus being a key driver in network planning. The dissertation describes the methodology and tools applied to cost and reliability analyses in the strategy work. The work focuses on determining the techno-economic feasibility of different network development technologies; these feasibility surveys are linked to the economic regulation model of the distribution business, in particular from the viewpoint of reliability of electricity supply and allowed return. The work introduces the asset management system developed for research purposes and to support the strategy work, the calculation elements of the system and the initial data used in the network analysis. The key elements of this asset management system are utilised in the dissertation. Finally, the study addresses the stages of strategic decision-making and the compilation of investment strategies. Further, the work illustrates the implementation of strategic planning in an actual distribution company environment.
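Because outage costs are singled out as an important part of the regulation model, a small worked example may help: the customer harm caused by an interruption is typically valued per interrupted kW and per undelivered kWh. The unit prices and interruption data below are purely illustrative assumptions, not values used in the dissertation or in any specific regulation model.

```python
# Illustrative outage (customer interruption) cost calculation.
# Unit prices (EUR/kW, EUR/kWh) and the interruption list are invented;
# real regulation models define their own values and additional terms
# (planned vs. unplanned interruptions, reclosings, etc.).

UNIT_COST_PER_KW = 1.1    # cost per interrupted kW   (assumed value)
UNIT_COST_PER_KWH = 11.0  # cost per undelivered kWh  (assumed value)

# (interrupted power in kW, duration in hours) for each interruption
interruptions = [
    (250.0, 0.5),
    (120.0, 2.0),
    (600.0, 0.25),
]

def outage_cost(power_kw: float, duration_h: float) -> float:
    """Cost of one interruption: a power term plus an energy term."""
    return power_kw * UNIT_COST_PER_KW + power_kw * duration_h * UNIT_COST_PER_KWH

total = sum(outage_cost(p, t) for p, t in interruptions)
print(f"total outage cost: {total:.0f} EUR")
```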
Abstract:
Two food products (powders) were obtained from whole guava fruits by hot-air drying or lyophilisation. The powders were characterised by sensory and thermal analyses (TGA-DSC), infrared spectroscopy (IR), X-ray diffraction (XRD) and scanning electron microscopy (SEM). Thermal, morphological and structural characterisations showed similar behaviour for the two solids. TGA-DSC and IR showed the presence of pectin as the main constituent of the solids. A semi-crystalline profile was evidenced by XRD, and lamellar/spherical morphologies were observed by SEM. Sensory analyses revealed an aroma highly reminiscent of guava. These value-added food products are an alternative for processing guava and avoiding losses during postharvest handling.
Abstract:
A company's capability to map out its cost position compared to other market players is important for competitive decision making. One aspect of cost position is direct product cost, which illustrates the cost efficiency of a company's product designs. If a company can evaluate and compare its own and other market players' direct product costs, it can make better decisions in product development and management, manufacturing, sourcing, etc. The main objective of this thesis was to develop a cost evaluation process for competitors' products. This objective includes a process description and an analysis tool for cost evaluations. Additionally, process implementation is discussed. The main result of the thesis is a process description consisting of a sixteen-step process and an Excel-based analysis tool. Since the literature in this field is quite limited, the solution proposal was combined from many different theoretical concepts. It includes influences from reverse engineering, product cost assessment, benchmarking and cost-based decision making. The solution proposal will lead to more systematic and standardized cost position analyses and result in better cost transparency in decision making.
Abstract:
Calcium oxide looping is a carbon dioxide sequestration technique that utilizes the partially reversible reaction between limestone and carbon dioxide in two interconnected fluidised beds, the carbonator and the calciner. Flue gases from a combustor are fed into the carbonator, where calcium oxide reacts with the carbon dioxide in the gases at a temperature of 650 ºC. The calcium oxide is transformed into calcium carbonate, which is circulated into the regenerative calciner, where the calcium carbonate is converted back into calcium oxide and a stream of pure carbon dioxide at a higher temperature of 950 ºC. Calcium oxide looping has been shown to have a low impact on the overall process efficiency and could easily be retrofitted into existing power plants. This master's thesis was done as part of the EU-funded project CaOling, contributing to the Lappeenranta University of Technology deliverable, reactor modelling and scale-up tools. The thesis concentrates on creating the first model frame and finding the physically relevant phenomena governing the process.
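The partially reversible reaction referred to above is the carbonation/calcination equilibrium of lime; written out (standard chemistry, not a detail taken from the thesis):

```latex
% Carbonator (about 650 °C): CO2 is captured from the flue gas, exothermically
\mathrm{CaO(s) + CO_2(g) \;\rightarrow\; CaCO_3(s)} \qquad \Delta H < 0
% Calciner (about 950 °C): the sorbent is regenerated, releasing pure CO2, endothermically
\mathrm{CaCO_3(s) \;\rightarrow\; CaO(s) + CO_2(g)} \qquad \Delta H > 0
```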
Abstract:
Transitional flow past a three-dimensional circular cylinder is a widely studied phenomenon, since this problem is of interest with respect to many technical applications. In the present work, the numerical simulation of flow past a circular cylinder is performed using a commercial CFD code (ANSYS Fluent 12.1) with large eddy simulation (LES) and RANS (κ-ε and Shear-Stress Transport (SST) κ-ω models) approaches. The turbulent flow at ReD = 1000 and 3900 is simulated to investigate the force coefficients, Strouhal number, flow separation angle, pressure distribution on the cylinder and the complex three-dimensional vortex shedding in the cylinder wake region. The numerical results extracted from these simulations are in good agreement with the experimental data (Zdravkovich, 1997). Moreover, grid refinement and the influence of the time step have been examined. Numerical calculation of turbulent cross-flow in a staggered tube bundle continues to attract interest due to its importance in engineering applications as well as the fact that this complex flow represents a challenging problem for CFD. In the present work, time-dependent simulations using the κ-ε, κ-ω and SST models are performed in two dimensions for a subcritical flow through a staggered tube bundle. The predicted turbulence statistics (mean and r.m.s. velocities) are in good agreement with the experimental data (S. Balabani, 1996). Turbulent quantities such as the turbulent kinetic energy and its dissipation rate are predicted using the RANS models and compared with each other. The sensitivity to grid and time-step size has been analyzed. A sensitivity study of the model constants has been carried out using the κ-ε model. It has been observed that the turbulence statistics and turbulent quantities are very sensitive to the model constants.
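Two of the quantities mentioned here are simple to compute once the flow conditions and the vortex-shedding frequency are known: the Reynolds number Re_D = U·D/ν and the Strouhal number St = f·D/U. The sketch below uses invented example values purely to show the arithmetic; they are not results from the thesis.

```python
# Reynolds and Strouhal numbers for flow past a circular cylinder.
# The numerical inputs below are invented example values, not simulation
# results from the thesis.

def reynolds_number(velocity: float, diameter: float, kinematic_viscosity: float) -> float:
    """Re_D = U * D / nu."""
    return velocity * diameter / kinematic_viscosity

def strouhal_number(shedding_frequency: float, diameter: float, velocity: float) -> float:
    """St = f * D / U, with f the vortex-shedding frequency."""
    return shedding_frequency * diameter / velocity

U = 0.39        # free-stream velocity, m/s                     (assumed)
D = 0.01        # cylinder diameter, m                          (assumed)
nu = 1.0e-6     # kinematic viscosity of water, m^2/s
f_shed = 8.2    # shedding frequency from the lift signal, Hz   (assumed)

print(f"Re_D = {reynolds_number(U, D, nu):.0f}")      # -> 3900
print(f"St   = {strouhal_number(f_shed, D, U):.3f}")  # -> ~0.21, typical of the subcritical regime
```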
Abstract:
The main outcome of this master's thesis is an innovative solution that can support the choice of a business process modeling methodology. Potential users of the tool are people with a background in business process modeling and the possibility to collect the required information about an organization's business processes. The thesis establishes the importance of business process modeling in the implementation of an organization's strategic goals by showing the place of the concept within Business Process Management (BPM) and its particular case, Business Process Reengineering (BPR). In order to support the theoretical outcomes of the thesis, a case study of the Northern Dimension Research Centre (NORDI) at Lappeenranta University of Technology was conducted. The case illustrates several things: how to apply business process modeling methodologies in practice; in which ways business process models can be useful for BPM and BPR initiatives; and how to apply the proposed innovative solution to the choice of a business process modeling methodology.
Abstract:
The developing energy markets and rising energy system costs have sparked the need to find new forms of energy production and to increase the self-sufficiency of energy production. One alternative is gasification, whose principles have been known for decades, but only recently has the technology become a true alternative. However, in order to meet the requirements of modern energy production methods, it is necessary to study the phenomenon thoroughly. To understand the gasification process better and to optimize it from the viewpoint of ecology and energy efficiency, effective and reliable modeling tools for gasifiers need to be developed. The main aims of this work have been to understand gasification as a process and, furthermore, to develop an existing three-dimensional circulating fluidized bed modeling tool for the modeling of gasification. The model is applied to two gasification processes of 12 and 50 MWth. The results of the modeling and the measurements have been compared and reviewed. The work was done in co-operation with Lappeenranta University of Technology and Foster Wheeler Energia Oy.
Abstract:
Agile software development has grown in popularity since the agile manifesto was declared in 2001. However, there is a strong belief that agile methods are not suitable for embedded, critical or real-time software development, even though multiple studies and cases show otherwise. This thesis presents a custom agile process that can be used in embedded software development. The presumed unfitness of agile methods for embedded software development has mainly been based on the feeling that these methods provide no real control, no strict discipline and less rigorous engineering practices. One starting point is therefore to provide a light process with a disciplined approach to embedded software development. Agile software development has gained popularity because there are still big issues in software development as a whole: projects fail due to schedule slips, budget overruns or failure to meet business needs. This does not change when talking about embedded software development. These issues remain valid, with several new ones arising from the quite complex and demanding domain in which embedded software developers work. These issues are another starting point for this thesis. The thesis is based heavily on Feature Driven Development (FDD), a software development methodology that can be seen as a runner-up to the most popular agile methodologies. FDD as such is quite process oriented and lacks a few practices commonly considered extremely important in agile development methodologies. In order for FDD to gain acceptance in the software development community, it needs to be modified and enhanced. This thesis presents an improved custom agile process that can be used in embedded software development projects with a size varying from 10 to 500 persons. The process is based on Feature Driven Development and, in suitable parts, on Extreme Programming, Scrum and Agile Modeling. Finally, the thesis shows how the new process responds to the common issues in embedded software development. The process of creating the new process is evaluated in a retrospective, and guidelines for such process creation work are introduced. These emphasize agility also in process development, through early and frequent deliveries and the teamwork needed to create a suitable process.