943 results for Paper-based


Relevance:

30.00%

Publisher:

Abstract:

This study was conducted for a specific paper production line at an international forest industry company in Finland. Its main purpose was to examine the current state of customer knowledge and its sharing at the case production line, identify the related problems and, finally, propose improvement actions. The study comprises theoretical and empirical parts. The theoretical part presents knowledge management and information sharing, as well as customer knowledge management. Empirical data from the case production line were collected using survey questionnaires. The results are analysed in the discussion and conclusions, and the study ends with a summary that includes recommendations. The main challenges identified were the amount and quality of customer knowledge and the gaining and transfer of that knowledge. The proposed solutions involve moving towards a more dynamic operating environment and, in the area of customer knowledge management, drawing especially on communities of creation.

Relevance:

30.00%

Publisher:

Abstract:

B2B document handling is moving very rapidly from paper to electronic networks and the electronic domain. Moving, handling and transforming large electronic business documents places heavy demands on the systems that process them. This paper explores new technologies such as SOA, event-driven systems and the ESB, and a scalable, event-driven enterprise service bus is created to demonstrate these new approaches to message handling. The end result is a small but fully functional messaging system with several different components. As this was the first larger in-house Java project, we also developed our own set of best practices for Java development, covering configuration, tools, code repositories, class naming and much more.

Relevance:

30.00%

Publisher:

Abstract:

In a paper machine, it is undesirable for the boundary layer flows on the fabric and roll surfaces to travel into the closing nips and create overpressure. In this thesis, the aerodynamic behaviour of grooved and smooth rolls is compared in order to understand the nip flow phenomena, which are the main reason why vacuum and grooved roll constructions are designed. A common method of removing the boundary layer flow from the closing nip is to use a vacuum roll construction. The downside of vacuum rolls is their high operational cost, caused by pressure losses in the vacuum roll shell. The deep grooved roll has the same goal: to create a pressure difference over the paper web and keep the paper attached to the roll or fabric surface in the drying pocket of the paper machine. A literature review revealed that the aerodynamic functionality of the grooved roll is not well known. In this thesis, the aerodynamic functionality of the grooved roll in interaction with a permeable or impermeable wall is studied by varying the groove properties. Computational fluid dynamics simulations are used as the research tool; the simulations were performed with commercial fluid dynamics software, ANSYS Fluent. Simulation results from 3- and 2-dimensional fluid dynamics models are compared with laboratory-scale measurements made with a grooved roll simulator designed for this research. The variables in the comparison are the paper or fabric wrap angle, the surface velocities, the groove geometry and the wall permeability. Present-day computational and modelling resources limit grooved roll fluid dynamics simulations at the paper machine scale. Based on the analysis of the aerodynamic functionality of the grooved roll, a grooved roll simulation tool is proposed. The smooth roll simulations show that the closing nip pressure does not depend on the length of boundary layer development.
An increase in surface velocity affects the pressure distribution in the closing and opening nips. The 3D grooved roll model reveals the aerodynamic functionality of the grooved roll. With an optimal groove size, it is possible to avoid closing nip overpressure and to keep the web attached to the fabric surface over the wrap angle. Groove flow friction and minor losses play different roles as the wrap angle changes. The proposed 2D grooved roll simulation tool replicates the grooved roll's aerodynamic behaviour with reasonable accuracy. With a small wrap angle, the chosen approach to calculating groove friction losses predicts the pressure distribution correctly. With a large wrap angle, the groove friction loss produces excessively large pressure gradients, and the method of calculating air flow friction losses in the groove has to be reconsidered. The aerodynamic functionality of the grooved roll is based on minor and viscous losses in the closing and opening nips as well as in the grooves. The proposed 2D grooved roll model is a simplification intended to reduce computational and modelling effort; the simulation tool makes it possible to simulate complex constructions at the paper machine scale. In order to use the grooved roll as a replacement for the vacuum roll, its properties have to be considered on the basis of the web handling application.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis was to analyse the background of an activity-based costing system used in a domestic forest industry company. The reports produced by the system have not been reliable, which has reduced its use. The study began by examining the theory of activity-based costing; since the system produces management accounting information, that theory is also introduced briefly. Next, the possible sources of error were examined, their significance was evaluated, and waste handling was chosen for further study. The problem with waste handling is that the current model includes no waste compensation. When a paper or board machine produces waste, the waste can be used as raw material in the process; at the moment, however, the product being produced at the time receives no compensation for it. Compensation has not been possible because the quantity of process waste has not been known. As a result of the study, a calculatory model was introduced that derives the quantity of process waste from mill system data, which in turn makes it possible to adopt waste compensation in the future.
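A hypothetical sketch of the waste-compensation idea the abstract describes: the quantity of process waste is derived from mill system data as the difference between gross and net production, and the producing grade is then credited with the raw material value of that waste. All function names and figures here are illustrative assumptions, not details from the thesis.

```python
def waste_quantity(gross_tonnes, net_tonnes):
    """Process waste in tonnes, derived from mill production data."""
    return gross_tonnes - net_tonnes

def waste_compensation(gross_tonnes, net_tonnes, raw_material_value_per_tonne):
    """Credit for process waste reused as raw material in the process."""
    return waste_quantity(gross_tonnes, net_tonnes) * raw_material_value_per_tonne

# Illustrative month: 1000 t gross, 950 t saleable, waste valued at 120 EUR/t.
credit = waste_compensation(gross_tonnes=1000.0, net_tonnes=950.0,
                            raw_material_value_per_tonne=120.0)  # 6000.0 EUR
```

The point of such a model is only that the waste quantity becomes a computable figure; once it is known, the costing system can credit it to the producing product.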

Relevance:

30.00%

Publisher:

Abstract:

Background: Assessing the costs of treating disease is necessary to demonstrate cost-effectiveness and to estimate the budget impact of new interventions and therapeutic innovations. However, there are few comprehensive studies on resource use and costs associated with lung cancer patients in clinical practice, in Spain or internationally. The aim of this paper was to assess the hospital costs associated with lung cancer diagnosis and treatment by histology, type of cost and stage at diagnosis in the Spanish National Health Service. Methods: A retrospective, descriptive analysis of resource use and a direct medical cost analysis were performed. Resource utilisation data were collected from patient files at nine teaching hospitals. From a hospital budget impact perspective, the aggregate and mean costs per patient were calculated over the first three years following diagnosis or up to death, and were analysed by histology, stage at diagnosis and cost type. Results: A total of 232 cases of lung cancer were analysed, of which 74.1% corresponded to non-small cell lung cancer (NSCLC) and 11.2% to small cell lung cancer (SCLC); 14.7% had no cytohistologic confirmation. The mean cost per patient in NSCLC ranged from 13,218 Euros in Stage III to 16,120 Euros in Stage II. The main cost components were chemotherapy (29.5%) and surgery (22.8%). Advanced disease stages were associated with a decrease in the relative weight of surgical and inpatient care costs but an increase in chemotherapy costs. In SCLC patients, the mean cost per patient was 15,418 Euros for limited disease and 12,482 Euros for extensive disease. The main cost components were chemotherapy (36.1%) and other inpatient costs (28.7%). In both groups, the Kruskal-Wallis test did not show statistically significant differences in mean cost per patient between stages.
Conclusions: This study provides the costs of lung cancer treatment based on patient file reviews, with chemotherapy and surgery accounting for the major cost components. This cost analysis is a baseline study that will provide a useful source of information for future studies on the cost-effectiveness and budget impact of different therapeutic innovations in Spain.

Relevance:

30.00%

Publisher:

Abstract:

This paper stresses the importance of developing mathematical thinking in young children on the basis of everyday contexts, since these are meaningful learning situations with an interdisciplinary, globalised focus. The first part sets out the frame of reference that lays the theoretical foundations for these kinds of educational practices. The second part gives some teaching orientations for work based on everyday contexts. The paper concludes with the presentation of the activity 'We're off to the cinema to learn mathematics!'

Relevance:

30.00%

Publisher:

Abstract:

This paper reports how laboratory projects (LP) coupled with inquiry-based learning (IBL) were implemented in a practical inorganic chemistry course. Students successfully synthesised several coordination compounds according to the topics proposed by the LP-IBL combination, and the chemistry of a number of metals was studied. Qualitative data were collected from written reports, oral presentations, lab-notebook reviews and personal discussions with the students during an experimental course with second-year undergraduate students at the Universidad Nacional de Colombia over the last 5 years. Combining LP and IBL produced positive skill development: the students displayed conceptual, practical, interpretational, constructional (questions, explanations, hypotheses), communicational, environmental and application abilities throughout the experimental course.

Relevance:

30.00%

Publisher:

Abstract:

Dirt counting and dirt particle characterisation of pulp samples is an important part of quality control in pulp and paper production, and there is a clear need for an automatic image analysis system for dirt particle characterisation across different pulp samples. Existing image analysis systems, however, use a single threshold to segment the dirt particles in different pulp samples, which limits their precision; an automatic system that overcomes this deficiency would therefore be very useful. In this study, a modified Niblack thresholding method is proposed, which selects the threshold based on the number of segmented particles; Kittler thresholding is also used. Both thresholding methods determine the dirt count of different pulp samples accurately compared with visual inspection and the Digital Optical Measuring and Analysis System (DOMAS). In addition, the minimum resolution needed for scanner image acquisition is defined. Among the dirt particle features examined, curl differs enough between particle types to discriminate bark from fibre bundles in different pulp samples. Three classifiers, k-Nearest Neighbour, Linear Discriminant Analysis and Multi-layer Perceptron, are used to categorise the dirt particles. Linear Discriminant Analysis and Multi-layer Perceptron are the most accurate in classifying the particles segmented by Kittler thresholding with morphological processing. The results show that the dirt particles are successfully categorised as bark or fibre bundles.
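A minimal sketch of the classic Niblack local thresholding rule (T = local mean + k * local standard deviation) that the abstract's modified method builds on. The particle-count adaptation described in the thesis is not reproduced here; the window size, k value and the toy image are illustrative assumptions.

```python
# Classic Niblack thresholding: each pixel is compared against a threshold
# computed from its local window; with k < 0, dark specks (dirt) on a bright
# background fall below the threshold and are marked.
import statistics

def niblack_threshold(img, window=3, k=-0.2):
    """Return a binary map: 1 where pixel < local Niblack threshold."""
    h, w = len(img), len(img[0])
    r = window // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - r), min(h, y + r + 1))
                    for xx in range(max(0, x - r), min(w, x + r + 1))]
            t = statistics.fmean(vals) + k * statistics.pstdev(vals)
            out[y][x] = 1 if img[y][x] < t else 0
    return out

# Tiny synthetic "pulp image": bright background with one dark speck.
img = [[200, 200, 200, 200],
       [200,  40, 200, 200],
       [200, 200, 200, 200]]
binary = niblack_threshold(img)
```

Because the threshold is local, the method adapts to uneven background brightness, which is exactly where a single global threshold loses precision across different pulp samples.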

Relevance:

30.00%

Publisher:

Abstract:

The topic of this thesis is the simulation of a combination of several control and data assimilation methods intended for controlling paper quality in a paper machine. Paper making is a very complex process, and the information obtained from the web is sparse: a paper web scanner can only measure a zigzag path on the web. An assimilation method is therefore needed to produce estimates of the Machine Direction (MD) and Cross Direction (CD) profiles of the web, on which quality control is based. There is an increasing need for intelligent methods to assist in data assimilation, and the target of this thesis is to study how such methods affect paper web quality. The work is based on a paper web simulator developed in the TEKES-funded MASI NoTes project; the simulator is a valuable tool for comparing different assimilation methods. The thesis compares four assimilation methods: a first-order Bayesian model estimator, an ARMA model based on a higher-order Bayesian estimator, a Fourier-transform-based Kalman filter estimator and a simple block estimator. The last can be considered close to current operational methods. Of these, the Bayesian, ARMA and Kalman estimators all seem to have advantages over the commercial one, with the Kalman and ARMA estimators showing the best overall performance.
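To illustrate the kind of recursion behind the Kalman estimator mentioned above, here is a minimal scalar sketch: estimating a constant basis weight from noisy scanner samples. The thesis uses a Fourier-transform-based Kalman filter over full CD profiles; this shows only the underlying predict/update recursion, and all numbers (target weight, noise levels) are invented for illustration.

```python
# Scalar Kalman filter for a constant state: each noisy measurement pulls the
# estimate toward itself with a gain that shrinks as confidence grows.
import random

def kalman_constant(measurements, meas_var, init_mean=0.0, init_var=1e6):
    """Recursively estimate a constant state from noisy measurements."""
    x, p = init_mean, init_var
    for z in measurements:
        k = p / (p + meas_var)      # Kalman gain
        x = x + k * (z - x)         # state update
        p = (1 - k) * p             # variance update
    return x, p

random.seed(1)
true_weight = 80.0                  # g/m^2, hypothetical target
zs = [true_weight + random.gauss(0, 0.5) for _ in range(200)]
est, var = kalman_constant(zs, meas_var=0.25)
```

With a vague prior, the recursion converges to the sample mean, and the posterior variance shrinks roughly as meas_var / n; the real filters in the thesis add a dynamic model for how the profile evolves between scans.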

Relevance:

30.00%

Publisher:

Abstract:

The theory part of this Master's thesis introduces fibres with high tensile strength and elongation used in the production of paper or board. Strong speciality papers are made of bleached softwood long-fibre pulp. The aim of the thesis is to find new fibres suitable for paper making that increase tensile strength, elongation, or both. The study describes how fibres bond and what kinds of fibres give the strongest bonds in the fibre matrix. The fibres used in the manufacture of non-wovens are long and elastic, longer than softwood cellulose fibres. The end applications of non-wovens and speciality papers are often the same, for instance wet napkins or filter media. The study examines which fibres are used in non-wovens and whether the same fibres could be added to cellulose pulp as reinforcement fibres: what blending them into cellulose would require, how they would bond with cellulose, and whether binding agents or thermal bonding, such as hot calendering, would be necessary. The following fibres are presented: viscose, polyester, nylon, polyethylene, polypropylene and bicomponent fibres. In the empirical part of the study, the most suitable new fibres are selected for making handsheets in the laboratory. The test fibres, viscose (Tencel), polypropylene and polyethylene, are blended with long-fibre cellulose. Based on the technical values measured from the sheets, the study proposes how to continue trials on a paper machine with viscose, polyester, bicomponent and polypropylene fibres.

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this research is to create a performance measurement system for the accounting services of a large paper industry company. The thesis compares several performance measurement frameworks, of which two are presented and compared in more detail. The Performance Prism is the framework used in this research; it uses success maps to determine objectives, and its target areas are divided into five groups: stakeholder satisfaction, stakeholder contribution, strategy, processes and capabilities. The creation of the measurement system began by identifying the stakeholders and defining their objectives, on the basis of which a success map was created. Measures were then derived from the objectives and the success map, and the data needed for each measure was defined. The final measurement system contains just over 40 measures, each with a specific target level and owner. The number of measures is fairly large, but as this is the first version of the measurement system, the amount is acceptable.

Relevance:

30.00%

Publisher:

Abstract:

The properties of the paper surface play a crucial role in ensuring suitable quality and runnability in various converting and finishing operations, such as printing. Plasma surface modification makes it possible to modify the surface chemistry of paper without altering the bulk material properties. This also makes it possible to investigate the role of surface chemistry alone in printability, without influencing the porous structure of the pigment-coated paper. Since the porous structure of a pigment coating controls both ink setting and optical properties, surface chemical changes created by plasma modification have the potential to decouple these two effects and to permit better optimisation of them both. The aim of this work was to understand the effects of plasma surface modification on paper properties and how it influences printability in the sheet-fed offset process; the objective was to broaden the fundamental understanding of the role of surface chemistry in offset printing. The effects of changing the hydrophilicity/hydrophobicity and the surface chemical composition, by means of plasma activation and plasma coatings, on the properties of coated paper, on ink-paper interactions and on sheet-fed offset print quality were investigated, as was the durability of the plasma surface modification. Nowadays, a typical sheet-fed offset press also contains units for surface finishing, for example UV-varnishing, so the role of surface chemistry in the absorption of UV-varnish into highly permeable and porous pigment-coated paper was also investigated. With plasma activation it was possible to increase the surface energy and hydrophilicity of the paper. Both polar and dispersion interactions were found to increase, although the change was greater for the polar interactions, due to induced oxygen-containing molecular groups.
The results indicated that plasma activation takes place particularly in high-molecular-weight components, such as the dispersion chemicals used to stabilise the pigment and latex particles. The surface composition, such as the pigment and binder type, was found to influence the response to plasma activation. The general trend was that pilot-scale treatment modified the surface chemistry without altering the physical coating structure, whereas excessive laboratory-scale treatment increased the surface roughness and reduced the surface strength, which led to micro-picking in printing. It was shown that pilot-scale plasma activation, in combination with appropriate ink oils, makes it possible to adjust the ink-setting rate. The ink-setting rate decreased with linseed-oil-based inks, probably due to increased acid-base interactions between the polar groups in the oil and the plasma-treated paper surface, while with mineral-oil-based inks the ink setting accelerated due to plasma activation. Hydrophobic plasma coatings were able to reduce or even prevent the absorption of dampening water into pigment-coated paper, even when the dampening water was applied under nip pressure. A uniform hydrophobic plasma coating with sufficient chemical affinity with the ink gave improved print quality in terms of higher print density and lower print mottle. It was also shown that a fluorocarbon plasma coating reduced the free wetting of the UV-varnish into the highly permeable and porous pigment coating; however, when the UV-varnish was applied under nip pressure, which leads to forced wetting, the role of the surface chemical composition appears to be much smaller. The surface energy and wettability decayed during the first weeks of storage after plasma activation, after which they levelled off.
However, the oxygen/carbon elemental ratio did not decrease as a function of time, indicating that the ageing could be caused by a re-orientation of polar groups or by contamination of the surface. The plasma coatings appeared to be more stable the higher their hydrophobicity, probably due to fewer interactions with oxygen and water vapour in the air.

Relevance:

30.00%

Publisher:

Abstract:

This work is devoted to the analysis of signal variation in Cross-Direction and Machine-Direction measurements from a paper web; the data come from a real paper machine. The goal of the work is to reconstruct the basis weight structure of the paper and to predict its behaviour into the future. The resulting synthetic data are needed for the simulation of the paper web. The main tool used for describing the basis weight variation in the Cross-Direction is the Empirical Orthogonal Functions (EOF) algorithm, which is closely related to Principal Component Analysis (PCA). Signal forecasting in time is based on time-series analysis; the two principal mathematical procedures used in the work are Autoregressive Moving Average (ARMA) modelling and the Ornstein-Uhlenbeck (OU) process.
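As a small illustration of the second ingredient named above, here is an Ornstein-Uhlenbeck process simulated with the standard Euler-Maruyama discretisation, as might be used to forecast the time evolution of a single EOF coefficient. The parameter values (theta, mu, sigma, step size) are illustrative assumptions, not values from the work.

```python
# OU process: dX = theta*(mu - X)*dt + sigma*dW. The drift pulls the signal
# back toward the mean mu, while the noise term keeps it fluctuating -- a
# natural model for a basis weight component wandering around its target.
import math
import random

def simulate_ou(theta, mu, sigma, x0, dt, n, rng):
    """Simulate n Euler-Maruyama steps of an OU process starting at x0."""
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        dw = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment
        xs.append(x + theta * (mu - x) * dt + sigma * dw)
    return xs

rng = random.Random(0)
path = simulate_ou(theta=2.0, mu=80.0, sigma=0.5, x0=70.0, dt=0.01, n=2000, rng=rng)
```

Starting off-target at 70, the path reverts toward the mean of 80 on a timescale of 1/theta and then fluctuates around it with stationary standard deviation sigma / sqrt(2*theta).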

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this economic geographical dissertation is to study and describe how competitiveness in the Finnish paper industry developed during 2001–2008, years in which the industry faced economically challenging times. The dissertation attempts to fill the gap between theoretical and empirical discussions of economic geographical issues in the paper industry. The main research questions are: How did supply chain costs and margins develop during 2001–2008? How do sales prices, transportation, and fixed and variable costs correlate with gross margins in a spatial context? The research object of this case study is a typical large Finnish paper mill that exports over 90 % of its production. The longitudinal economic research data were obtained from the case mill's economic control system, and correlation (R²) analysis was used as the main research method. The time series data cover monthly economic and manufacturing observations from the mill from 2001 to 2008. The study reveals the development of prices, costs and transportation at the case mill, and it shows how economic variables correlate with the paper mill's gross margins in various European markets. The research methods of economic geography offer perspectives that pay attention to spatial (market) heterogeneity; such research has been quite scarce in the Finnish tradition of economic geography and supply chain management, and this case study gives new insight into that tradition and its applications. As a concrete empirical result, the dissertation finds that the competitive advantages of the Finnish paper industry were significantly weakened during 2001–2008 by low paper prices, costly manufacturing and expensive transportation.
The statistical analysis revealed, as a new finding, that in several important markets transport costs lower gross margins as much as decreasing paper prices do. Paper companies should continuously pay attention to lowering manufacturing and transport costs to achieve better economic performance. A mill located far from its markets clearly suffers an economic penalty in paper manufacturing, as paper demand is decreasing and oversupply is pressing paper prices down. Market and economic forecasting in the paper industry is therefore advantageous at the country and product levels, while simultaneously taking the specific economic geographical dimensions into account.
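The dissertation's main method, correlation (R²) analysis, can be sketched as follows: the coefficient of determination between a cost variable and the gross margin measures how much of the margin variation the cost explains. The monthly figures below are invented for illustration, not data from the case mill.

```python
# R^2 of the least-squares line of ys on xs, computed from first principles:
# R^2 = (covariance)^2 / (variance_x * variance_y) for simple regression.
def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# Hypothetical monthly observations (EUR/tonne): as transport cost rises,
# gross margin falls.
transport_cost = [52.0, 55.0, 61.0, 58.0, 64.0, 70.0]
gross_margin = [120.0, 118.0, 109.0, 112.0, 104.0, 96.0]
r2 = r_squared(transport_cost, gross_margin)
```

An R² near 1 for a market, as in this constructed example, would indicate that transport costs track margin losses closely there, which is the kind of spatial pattern the dissertation reports.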

Relevance:

30.00%

Publisher:

Abstract:

Case-based reasoning (CBR) is a recent approach to problem solving and learning that has received a lot of attention over the last few years. In this work, the CBR methodology is used to reduce the time and resources spent on carrying out experiments to determine the viscosity of a new slurry. The aims of the work are: to develop a CBR system that supports decision making about the type of slurry behaviour, to collect a sufficient volume of qualitative data for the case base, and to calculate the viscosity of Newtonian slurries. The literature review first covers the types of fluid flow and Newtonian and non-Newtonian slurries, together with some physical properties of suspensions. Its second part provides an overview of the case-based reasoning field: different models and stages of CBR cycles, and the benefits and disadvantages of the methodology. A brief review of CBR tools is also given. Finally, some results of the work and opportunities for modernising the system are presented. The decision support system for slurry viscosity determination was implemented in MS Office Excel. The designed system consists of three parts: a workspace, the case base, and a section for calculating the viscosity of Newtonian slurries. The first and second sections handle Newtonian and Bingham fluids; in the last section, the apparent viscosity can be calculated for Newtonian slurries.
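For the viscosity-calculation part described above, a standard textbook model for dilute suspensions is Einstein's relation, mu = mu_carrier * (1 + 2.5 * phi). The abstract does not state which formula the Excel system implements, so this sketch should be read only as an example of the kind of calculation involved; the numbers are illustrative.

```python
# Einstein's relation for the viscosity of a dilute Newtonian suspension:
# valid only at low solids volume fractions, hence the guard below.
def einstein_viscosity(mu_carrier, phi):
    """Suspension viscosity (Pa*s) for carrier viscosity mu_carrier and
    solids volume fraction phi (dilute limit)."""
    if not 0.0 <= phi < 0.1:
        raise ValueError("Einstein's relation is only valid for dilute slurries")
    return mu_carrier * (1.0 + 2.5 * phi)

# Water carrier (~0.001 Pa*s) with 5 % solids by volume:
mu = einstein_viscosity(0.001, 0.05)  # 0.001125 Pa*s
```

In a CBR setting, such a closed-form result for known cases can seed the case base, so that new slurries are assessed by retrieving and adapting the most similar stored case instead of running a fresh experiment.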