939 results for monitoring process mean and variance
Abstract:
Quality is strengthening its position in business as companies compete in international markets on both price and quality. This trend has given rise to several quality programmes that are actively used in implementing companies' Total Quality Management (TQM). Quality management covers all of a company's functions and also sets requirements for developing and improving the company's support functions. Among these is the subject of this study, information technology management (IT). The aim of the thesis was to describe the current state of the IT process. The process description produced in the thesis is based on process management theory and the quality award criteria used by the target company. Thematic interviews were used as the research method for establishing the current state of the process. Customers of the IT process were interviewed to determine the current state of the process and the requirements set for it. The process analysis, the identification of the most important sub-processes and the discovery of areas for improvement are the central results of this thesis. The thesis focused on identifying weaknesses and improvement targets of the IT process as a basis for continuous improvement, rather than on radical process re-engineering. The thesis presents the principles of TQM, quality tools, and the terminology, principles and systematic implementation of process management. The work also gives a picture of how TQM and process management intertwine in a company's quality work.
Abstract:
Innovation is the word of this decade. By common definitions, a company's product or service does not qualify as an innovation without a positive sales impact and a meaningful market share. The research problem of this master's thesis is to find out what the innovation process of complex new consumer products and services looks like in the new innovation paradigm. The objective is to answer two research questions: 1) What are the critical success factors a company should address when implementing the paradigm change in mass-market consumer business with complex products and services? 2) What process or framework could a firm follow? The research problem is examined from the points of view of one company's innovation creation process, networking, and organizational change management challenges. A special focus is to examine the research problem from the perspective of an existing company entering a new business area. An innovation process management framework for complex new consumer products and services in the new innovation paradigm was created with the support of several existing innovation theories. The new process framework includes the critical innovation process elements companies should take into consideration in their daily activities when implementing innovation in a new business. The case company's location-based business implementation activities are studied through the new innovation process framework. This case study showed how important it is to manage the process, to follow how the target market and its competition develop during the company's own innovation process, to make decisions at the right time, and to plan and implement organizational change management from the beginning as one activity in the innovation process. In the end, this master's thesis showed that every company needs to create its own innovation process master plan with milestones and activities. One plan does not fit all, but all companies can start their planning from the new innovation process introduced in this master's thesis.
Abstract:
In many industrial applications, accurate and fast surface reconstruction is essential for quality control. Variation in surface finishing parameters, such as surface roughness, can reflect defects in a manufacturing process, non-optimal product operational efficiency, and reduced life expectancy of the product. This thesis considers reconstruction and analysis of high-frequency variation, that is, roughness, on planar surfaces. Standard roughness measures in industry are calculated from surface topography. A fast and non-contact method to obtain surface topography is to apply photometric stereo in the estimation of surface gradients and to reconstruct the surface by integrating the gradient fields. Alternatively, visual methods, such as statistical measures, fractal dimension and distance transforms, can be used to characterize surface roughness directly from gray-scale images. In this thesis, the accuracy of distance transforms, statistical measures, and fractal dimension is evaluated in the estimation of surface roughness from gray-scale images and topographies. The results are contrasted with standard industry roughness measures. In distance transforms, the key idea is that distance values calculated along a highly varying surface are greater than distances calculated along a smoother surface. Statistical measures and fractal dimension are common surface roughness measures. In the experiments, skewness and variance of the brightness distribution, fractal dimension, and distance transforms exhibited strong linear correlations to standard industry roughness measures. One of the key strengths of the photometric stereo method is the acquisition of higher-frequency variation of surfaces. In this thesis, the reconstruction of planar high-frequency varying surfaces is studied in the presence of imaging noise and blur.
Two Wiener filter-based methods are proposed, of which one is optimal in the sense of surface power spectral density, given the spectral properties of the imaging noise and blur. Experiments show that the proposed methods preserve the inherent high-frequency variation in the reconstructed surfaces, whereas traditional reconstruction methods typically handle incorrect measurements by smoothing, which dampens the high-frequency variation.
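The gradient-field integration step mentioned above can be sketched with the classic Frankot–Chellappa projection, a least-squares FFT-based integrator; the thesis's Wiener filter-based variants refine this idea for noise and blur, but are not reproduced here. The grid size and the test surface below are illustrative assumptions.

```python
import numpy as np

def frankot_chellappa(p, q):
    """Integrate gradient fields p = dz/dx, q = dz/dy into a surface z
    by projecting them onto the nearest integrable field in the
    least-squares sense, computed in the Fourier domain."""
    rows, cols = p.shape
    # Angular frequency grids matching numpy's FFT bin ordering
    wx = np.fft.fftfreq(cols) * 2.0 * np.pi
    wy = np.fft.fftfreq(rows) * 2.0 * np.pi
    u, v = np.meshgrid(wx, wy)
    P = np.fft.fft2(p)
    Q = np.fft.fft2(q)
    denom = u**2 + v**2
    denom[0, 0] = 1.0          # avoid division by zero at DC
    Z = (-1j * u * P - 1j * v * Q) / denom
    Z[0, 0] = 0.0              # surface height is recovered up to a constant
    return np.real(np.fft.ifft2(Z))
```

For a periodic, noise-free gradient field the projection recovers the surface exactly (up to a constant offset); the Wiener-filtered variants matter precisely when the gradients are corrupted by imaging noise and blur.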
Abstract:
The aim of this thesis is to study how to manage the front end of the offering planning process. This includes actual process development and methods to gather and analyze information to achieve the best outcome in customer-oriented product offering. The study is carried out in two parts: a theoretical part and a company-related part. A theoretical framework is created by introducing different types of approaches to managing product planning processes. Products are seen as platforms and are broken down into subsystems to show the different parts of the development. With the help of matrix-based approaches, product platform related information is gathered and analyzed. In this kind of analysis, business/market drivers and customer/competitor information are connected with product subsystems. This makes it possible to study product gaps/needs and possible future ideas/scenarios in different customer segments. The company-related part consists of offering planning process development in a real company environment. The process formation includes documents and tools that guide planning from information gathering to prioritization and decision making.
Abstract:
Rosin is a natural product from pine forests and is used as a raw material in resinate syntheses. Resinates are polyvalent metal salts of rosin acids, and especially Ca- and Ca/Mg-resinates find wide application in the printing ink industry. In this thesis, analytical methods were applied to increase general knowledge of resinate chemistry, and the reaction kinetics was studied in order to model the non-linear solution viscosity increase during resinate syntheses by the fusion method. Solution viscosity in toluene is an important quality factor for resinates to be used in printing inks. The concept of critical resinate concentration, c_crit, was introduced to define an abrupt change in the dependence of viscosity on resinate concentration in the solution. The concept was then used to explain the non-linear solution viscosity increase during resinate syntheses. A semi-empirical model with two estimated parameters was derived for the viscosity increase on the basis of apparent reaction kinetics. The model was used to control the viscosity and to predict the total reaction time of the resinate process. The kinetic data from the complex reaction media was obtained by acid value titration and by FTIR spectroscopic analyses using a conventional calibration method to measure the resinate concentration and the concentration of free rosin acids. A multivariate calibration method was successfully applied to make partial least squares (PLS) models for monitoring acid value and solution viscosity in both the mid-infrared (MIR) and near-infrared (NIR) regions during the syntheses. The calibration models can be used for on-line resinate process monitoring. In kinetic studies, two main reaction steps were observed during the syntheses. First, a fast irreversible resination reaction occurs at 235 °C, and then a slow thermal decarboxylation of rosin acids starts to take place at 265 °C.
Rosin oil is formed during the decarboxylation reaction step, causing significant mass loss as the rosin oil evaporates from the system while the viscosity increases to the target level. The mass balance of the syntheses was determined based on the resinate concentration increase during the decarboxylation reaction step. A mechanistic study of the decarboxylation reaction was based on the observation that resinate molecules are partly solvated by rosin acids during the syntheses. Different decarboxylation mechanisms were proposed for the free and the solvating rosin acids. The deduced kinetic model supported the analytical data of the syntheses over a wide resinate concentration region, over a wide range of viscosity values and at different reaction temperatures. In addition, the application of the kinetic model to the modified resinate syntheses gave a good fit. A novel synthesis method with the addition of decarboxylated rosin (i.e. rosin oil) to the reaction mixture was introduced. The conversion of rosin acid to resinate was increased to the level necessary to obtain the target viscosity for the product at 235 °C. Because the reaction temperature is lower than in the traditional fusion synthesis at 265 °C, thermal decarboxylation is avoided. As a consequence, the mass yield of the resinate syntheses can be increased from ca. 70% to almost 100% by recycling the added rosin oil.
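The idea of a two-parameter model used to predict the total reaction time can be illustrated in a few lines. The thesis's actual semi-empirical model form is not reproduced here; a simple exponential growth law eta(t) = a·exp(b·t) with invented viscosity-time data merely sketches how two parameters are estimated and then used to predict when a target viscosity is reached.

```python
import numpy as np

# Invented viscosity measurements over reaction time (hypothetical units);
# not data from the thesis.
t_data = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])              # reaction time
eta_data = np.array([50.0, 82.0, 136.0, 224.0, 369.0, 609.0])  # viscosity

# Log-linearize the assumed model: ln(eta) = ln(a) + b*t,
# then estimate both parameters by ordinary least squares.
b, ln_a = np.polyfit(t_data, np.log(eta_data), 1)
a = np.exp(ln_a)

# Predict the total reaction time needed to reach a target viscosity.
eta_target = 1000.0
t_total = np.log(eta_target / a) / b
```

With on-line FTIR/PLS monitoring providing the concentration and viscosity data, this kind of fitted model is what allows the viscosity to be controlled and the end point of the synthesis to be predicted in advance.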
Abstract:
There is an increasing interest in the use of breath analysis for monitoring human physiology and exposure to toxic substances or environmental pollutants. This review focuses on the current status of the sampling procedures, collection devices and sample-enrichment methodologies used for exhaled breath-vapor analysis. We discuss the different parameters affecting each of the above steps, taking into account the requirements for breath analysis in exposure assessments and the need to analyze target compounds at sub-ppbv levels. Finally, we summarize the practical applications of exposure analysis in the past two decades.
Abstract:
This thesis deals with improving international airport baggage supply chain management (SCM) by means of information technology and a new baggage handling system. The study focuses on supply chain visibility in practice and suggests different ways to improve supply chain performance through information sharing. A further objective is to define how radio frequency identification (RFID) and enterprise resource planning (ERP) can make processes more transparent. In order to get full benefit from the processes, effective business process management and monitoring, as well as the key performance indicators, must be defined, implemented and visualized, e.g. through dashboard views for different roles. As an outcome of the research, the need for information technology systems and more advanced technologies, e.g. RFID, in supply chain management is evident. A sophisticated ERP is crucial in boosting SCM business processes and profitability. This would also benefit dynamic decision making in airport and airline supply chain management. In the long term, economic aspects support the actions I have suggested in order to make production more flexible in reacting to quick changes.
Abstract:
In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of means and a paired t-test.
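The computations behind the first and third exercises can be sketched as follows: a calibration-line regression with typical quality parameters, and a paired t statistic. All data below is invented for illustration; in the system each student receives a randomly generated data set satisfying the exercise constraints, and the grading is done in Matlab rather than Python.

```python
import numpy as np

# Exercise 1: linear calibration and quality parameters of the method
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])          # standard concentrations
signal = np.array([0.10, 2.15, 4.05, 6.20, 8.05, 10.15])  # instrument response

n = len(conc)
slope, intercept = np.polyfit(conc, signal, 1)
resid = signal - (slope * conc + intercept)
s_yx = np.sqrt(np.sum(resid**2) / (n - 2))   # residual standard deviation
r = np.corrcoef(conc, signal)[0, 1]          # correlation coefficient
lod = 3.3 * s_yx / slope                     # limit of detection (one convention)
loq = 10.0 * s_yx / slope                    # limit of quantification

# Exercise 3: paired t statistic for two methods applied to the same samples
m1 = np.array([10.2, 12.1, 9.8, 11.5, 10.9])
m2 = np.array([10.0, 12.4, 9.9, 11.2, 11.1])
d = m1 - m2
t_paired = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
```

Automatic evaluation then reduces to comparing each student's submitted values of slope, r, LOD, and t against the values the system computes from that student's personal data set, within a tolerance.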
Abstract:
The overriding aim of this drama educational case study is to deepen the understanding of meaning making in a creative intercultural youth theatre process and to examine it in the context of the 10th European Children's Theatre Encounter. The research task is to give a theoretical description of some key features of a creative drama process as the basis for theory about meaning making in physical theatre. The first task is to illuminate the culture-historical connections of the multilayered practice of the EDERED association. The second task is to analyse and interpret theatrical meaning making. The ethnographical research site is regarded as a theatrical event. The analysis of the theatrical event is divided into four segments: cultural contexts, contextual theatricality, theatrical playing and playing culture. These segments are connected with four research questions: What are the cultural contexts of a creative drama process? How can the organisation of the Encounter, genres, aesthetic codes and perception of codes be seen to influence the lived experiences of the participants? What are some of the key phases and characteristics in a creative practice? What kind of cultural learning can be interpreted from the performance texts? The interpretative question concerns identity and community (re)construction: how are the categories 'community' and 'child' constructed in the Encounter culture? In this drama educational case study the research material (transcribed interviews, coded questionnaire answers, participant drawings, videotaped process text and performance texts) is examined in a multi-method analysis within the meta-theoretical framework of Dewey's naturalistic pragmatism. A three-dimensional research interest, combining lived experiences, social contexts and cultural-aesthetic practices compared with drama-educational practices, required the methodological project of cultural studies.
Furthermore, the critical interpretation of cultural texts is divided into three levels of analysis, called description, structural analysis and theoretical interpretation. Dialogic validity (truthfulness, self-reflexivity and polyvocality) is combined with contextual validity (sensitivity to social context and awareness of historicity) and with deconstructive validity (awareness of the social discourses). My research suggests that it is possible, by means of physical theatre, to construct symbolic worlds where questions about intercultural identity and multilingual community are examined and where provisional answers are constructed in social interaction.
Abstract:
In the present dissertation, multilingual thesauri were approached as cultural products and the focus was twofold: on the empirical level, the focus was placed on the translatability of certain British-English social science indexing terms into the Finnish language and culture at the concept, term and indexing-term levels; on the theoretical level, the focus was placed on the aim of translation and on the concept of equivalence. In accordance with modern communicative and dynamic translation theories, the interest was in the human dimension. The study is qualitative. In this study, equivalence was understood in a similar way to how dynamic, functional equivalence is commonly understood in translation studies. Translating was seen as a decision-making process, where a translator often has different kinds of possibilities to choose from in order to fulfil the function of the translation. Accordingly, and as a starting point for the construction of the empirical part, the function of the source text was considered to be the same as or similar to the function of the target text, that is, a functional thesaurus in both the source and the target context. Further, the study approached the challenges of multilingual thesaurus construction from the perspectives of semantics and pragmatics. In semantic analysis the focus was on what the words conventionally mean, and in pragmatics on the 'invisible' meaning, that is, how we recognise what is meant even when it is not actually said (or written). Languages, and the ideas expressed by languages, are created mainly in accordance with the expressional needs of the surrounding culture, and thesauri were considered to reflect several subcultures and consequently the discourses which represent them.
The research material consisted of different kinds of potential discourses: dictionaries, database records and thesauri, Finnish versus British social science researchers, Finnish versus British indexers, simulated indexing tasks with five articles, and Finnish versus British thesaurus constructors. In practice, the professional background of the two last-mentioned groups was rather similar. It became clear that each material type had its own characteristics, although naturally they were not entirely separate from each other. It is further noteworthy that the different types and origins of research material were not used as true comparison pairs, and that the aim of the triangulation of methods and material was to gain a holistic view. The general research questions were: 1. Can differences be found between Finnish and British discourses regarding family roles as thesaurus terms, and if so, what kinds of differences, and what are the implications for multilingual thesaurus construction? 2. What is pragmatic indexing term equivalence? The first question studied how the same topic (family roles) was represented in different contexts and by different users, and further focused on how the possible differences were handled in multilingual thesaurus construction. The second question built on the findings of the first and addressed the final question of what kinds of factors should be considered when defining translation equivalence in multilingual thesaurus construction. The study used multiple cases and several data collection and analysis methods, aiming at theoretical replication and complementarity.
The empirical material and analysis consisted of focused interviews (with Finnish and British social scientists, thesaurus constructors and indexers), simulated indexing tasks with Finnish and British indexers, semantic component analysis of dictionary definitions and translations, co-word analysis of datasets retrieved from databases, and discourse analysis of thesauri. As a terminological starting point, the topic and case of family roles was selected. The results were clear: 1) It was possible to identify different discourses. There were also subdiscourses. For example, within the group of social scientists the orientation towards qualitative versus quantitative research had an impact on the way they reacted to the studied words and discourses, and indexers placed more emphasis on information seekers, whereas thesaurus constructors approached the construction problems from a more material-based angle. The differences between the specialist groups, i.e. the social scientists, the indexers and the thesaurus constructors, were often greater than those between the geo-cultural groups, i.e. Finnish versus British. The differences arose from different translation aims, diverging expectations for multilingual thesauri and a variety of practices. For multilingual thesaurus construction this poses severe challenges. The clearly ambiguous concept of a multilingual thesaurus, as well as the different construction and translation strategies, should be considered more precisely in order to shed light on focus and equivalence types, which are clearly not self-evident. The research also revealed the close connection between the aims of multilingual thesauri and pragmatic indexing term equivalence. 2) Pragmatic indexing term equivalence is very much context-dependent.
Although thesaurus term equivalence is defined and standardised in the field of library and information science (LIS), it is not understood in one established way, and the current LIS tools are inadequate for both constructing and studying different kinds of multilingual thesauri and their indexing term equivalence. The tools provided by translation science were both practical and theoretical, and especially the division of the different meanings of a word provided a useful tool for analysing pragmatic equivalence, which often differs from the ideal model represented in the thesaurus construction literature. The study thus showed that the variety of different discourses should be acknowledged, that there is a need to operationalise new types of multilingual thesauri, and that the factors influencing pragmatic indexing term equivalence should be discussed more precisely than is traditionally done.
Abstract:
The importance of Information Technology (IT) in the business environment is continuously growing. This stimulates the increase in the size, complexity and number of IT projects and raises the need for IT Project Portfolio Management (IT PPM). While it has been actively discussed for the last few decades, IT PPM has a short history of practical implementation. This creates inconsistency in the views of different authors and provides an opportunity for additional research. As a first step, this research explores the existing studies and brings together the views of different authors on IT PPM. As a result, a high-level IT PPM Process Cycle and a set of Key Success Factors for IT PPM are proposed. The IT PPM Process Cycle gives an overview of the main elements of the IT PPM process, while the set of Key Success Factors provides a number of factors that should be considered during implementation. As a second step, both theoretical deliverables are empirically tested through a case study and a survey conducted in a large multinational company. The case study is used to analyze the studied company's process framework against the developed IT PPM Process Cycle. Subsequently, a survey was conducted among subject matter experts of the same company to evaluate the importance and relevance of the proposed Key Success Factors. Finally, this thesis concludes with the findings made during the case study and provides an empirically tested selection of factors to be taken into account. These two deliverables can be used by both academics and practitioners to close gaps in the existing literature and assist in IT PPM implementation.
Abstract:
The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process, the handling of invoices. Efficient invoice processing can have an impact on organizations' working capital management and thus give companies better readiness to face the challenges related to cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling process of four different organizations. The invoice data was collected from each organization's invoice processing system. The sample included all the invoices the organizations had processed during the year 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included looking into the longest lead times between process steps and the impact of manual process steps on cycle time. The processing of invoices from maverick purchases was also examined. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results of the research indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices.
It should be noted, however, that the manual tasks related to turning a paper invoice into electronic format through scanning are ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automated steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps had a prolonging impact on the process. Regarding invoices from maverick purchases, the evidence shows that these invoices were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions on how to improve the process included increasing invoice matching, reducing manual steps, and leveraging different value-added services such as an invoice validation service, mobile solutions and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
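The core cycle-time comparison described above can be sketched with plain event data: group invoice records by type and compare lead times between the entry and approval timestamps. The records, field names and dates below are invented; real process-mining input would be the event log exported from each organization's invoice processing system.

```python
from collections import defaultdict
from datetime import datetime
from statistics import median

# Hypothetical event records: (invoice_id, type, entered, approved)
invoices = [
    ("INV-1", "e-invoice", "2012-03-01", "2012-03-09"),
    ("INV-2", "scanned",   "2012-03-01", "2012-03-05"),
    ("INV-3", "e-invoice", "2012-03-02", "2012-03-12"),
    ("INV-4", "scanned",   "2012-03-03", "2012-03-08"),
]

# Lead time in days per invoice, grouped by invoice type
lead_times = defaultdict(list)
for _, kind, start, end in invoices:
    days = (datetime.fromisoformat(end) - datetime.fromisoformat(start)).days
    lead_times[kind].append(days)

# Median cycle time per invoice type
medians = {kind: median(v) for kind, v in lead_times.items()}
```

With the invented sample, the scanned invoices show the shorter median cycle time, mirroring the direction of the research finding; the same grouping logic extends to lead times between any pair of process steps.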
Abstract:
The dynamics of the tree community and 30 tree populations were examined in an area of tropical semideciduous forest located on the margin of the Rio Grande, SE Brazil, based on surveys done in 1990 and 1997 in three 0.18 ha plots. The main purpose was to assess whether variations in dynamics were related to topography and the effects of a catastrophic flood in 1992. Rates of mortality and recruitment of trees and gain and loss of basal area in two topographic sites, lower (flooded) and upper (non-flooded), were obtained. Projected trajectories of mean and accelerated growth in diameter were obtained for each species. In both topographic sites, mortality rates surpassed recruitment rates, gain rates of basal area surpassed loss rates, and size distributions changed, with declining proportions of smaller trees. These overall changes were possibly related to increased underground water supply after the 1992 flood as well as to a c. 250-year-old process of primary succession on abandoned gold mines. Possible effects of the 1992 flood showed up in the higher proportions of dead trees in the flooded sites and faster growth rates in the flood-free sites. Species of different regeneration guilds showed particular trends with respect to their demographic changes and diameter growth patterns. Nevertheless, patterns of population dynamics differed between topographic sites for only two species.
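The mortality and recruitment rates mentioned above are conventionally annualized with exponential-model formulas when two censuses are separated by several years (here 1990 and 1997, t = 7). A sketch with invented tree counts, not the plot data from the study:

```python
# Annual mortality rate from initial count n0 and survivors after `years`
def annual_mortality(n0, survivors, years):
    return 1.0 - (survivors / n0) ** (1.0 / years)

# Annual recruitment rate from final count nt and recruits gained over `years`
def annual_recruitment(nt, recruits, years):
    return 1.0 - (1.0 - recruits / nt) ** (1.0 / years)

# Illustrative counts for one plot (invented)
m = annual_mortality(n0=500, survivors=430, years=7)    # roughly 2.1 % per year
r = annual_recruitment(nt=460, recruits=30, years=7)    # roughly 1.0 % per year
```

A pattern like this, with mortality exceeding recruitment, is the kind of signal the study reports for both topographic sites after the 1992 flood.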
Abstract:
The report presents the results of the commercialization project Container logistic services for forest bioenergy. The project promotes new business that is emerging around overall container logistic services in the bioenergy sector. The results assess the European markets of container logistics for biomass, the enablers for new business creation and the required service bundles for the concept. We also demonstrate the customer value of the container logistic services for different market segments. The concept analysis is based on concept mapping, the quality function deployment (QFD) process and business network analysis. The business network analysis assesses the key stakeholders and their mutual connections. The performance of the roadside chipping chain is analysed through logistic cost simulation, an RFID system demonstration and freezing tests. The EU has set a renewable energy target of 20 % for 2020, of which biomass could account for two-thirds. In Europe, the production of wood fuels was 132.9 million solid-m3 in 2012 and the production of wood chips and particles was 69.0 million solid-m3. The wood-based chip and particle flows suitable for container transportation provide a market of 180.6 million loose-m3, which means 4.5 million container loads per year. The intermodal logistics of trucks and trains are promising for the composite containers because the biomass does not freeze onto the inner surfaces during unloading. The overall service concept includes several packages: container rental, container maintenance, terminal services, an RFID tracking service, and a simulation and ERP-integration service. Container rental and maintenance would give transportation entrepreneurs a way to increase capacity without high investment costs. The RFID concept would lead to better work planning, improving profitability throughout the logistic chain, and the simulation supports fuel supply optimization.
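The market sizing above can be checked with simple arithmetic. The per-container capacity of about 40 loose-m3 is inferred here from the report's own totals (180.6 million loose-m3 versus 4.5 million loads), not stated explicitly in the abstract.

```python
# Back-of-the-envelope check of the container-load figure:
# total loose volume divided by an inferred ~40 loose-m3 per container.
market_loose_m3 = 180.6e6          # annual chip/particle volume, loose-m3
container_capacity_m3 = 40.0       # inferred, not stated in the report
loads_per_year = market_loose_m3 / container_capacity_m3
```

The quotient lands at roughly 4.5 million loads per year, consistent with the figure quoted in the report.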
Abstract:
This thesis studies the possibility of using lean tools and methods in a quotation process carried out in an office environment. The aim of the study was to find out and test the relevant lean tools and methods that can help to balance and standardize the quotation process, and to reduce the variance in quotation lead times and quality. Seminal works, research papers and guidebooks related to the topic were used as the basis for the theory development. Based on the literature review and the case company's own lean experience, the applicable lean tools and methods were selected to be tested by a sales support team. Leveling production through product categorization and value stream mapping was a key method used to balance the quotation process. The 5S method was started concurrently to standardize the work. The results of the testing period showed that lean tools and methods are applicable in an office process, and the selected tools and methods helped to balance and standardize the quotation process. The case company's sales support team decided to implement a new lean-based quotation process model.