38 results for Orthodontic anchorage techniques
Abstract:
The objective of this Master's thesis was to determine VOC emissions from veneer drying in softwood plywood manufacturing. Emissions from the plywood industry have become an important issue because of tightening regulations worldwide. The thesis investigates the quality and quantity of the VOCs released in softwood veneer drying, and one of the main objectives was to identify suitable cleaning techniques for these emissions. The introduction presents veneer drying machines, the mechanical and chemical properties of wood, VOC control techniques and the applicable VOC limits. Plywood mills previously took little interest in VOC emissions, but today mills worldwide must consider emission reduction. The thesis covers the measurement of emissions from a softwood veneer dryer, the analysis of the measured results and a review of the findings. Different air conditions inside the dryer were taken into account when planning the measurements, and the measured emissions were compared with the established legal limits. The main results are the softwood veneer dryer's emissions under different air conditions. Emission control techniques for these emissions were also surveyed as a basis for further, more specific research.
Abstract:
Selling is a much maligned, often under-valued subject whose inadequate showing in business schools is in inverse proportion to the many job opportunities it offers and the importance of salespeople in bringing income to companies. The purpose of this research is to increase understanding of customer-oriented selling and to examine the influence of a customer-oriented philosophy on the selling process, the applicability of selling techniques to this philosophy, and their importance to salespeople. The empirical section of the study is twofold. First, the data for the qualitative part were collected through five thematic interviews with sales consultants and case company representatives. The findings indicate that customer-oriented selling requires activity from salespeople. In the customer-oriented personal selling process, salespeople invest time in the pre-planning, needs analysis and benefit demonstration stages. However, the findings suggest that salespeople today must also have the basic capability to execute the traditional sales process, and that the balance between the traditional and consultative selling processes shifts as the relationship between salesperson and customer lengthens. The study also proposes that selling techniques still belong to the customer-oriented selling process, although their role may be modest. The thesis mapped 75 selling techniques, and the quantitative part of the study explored which techniques salespeople in the direct selling industry consider important when selling to new and existing customers. The response rate of the survey was 69.5%.
Abstract:
The goal of the study was to analyse orthodontic care in Finnish health centres, with special reference to the delivery, outcome and costs of treatment. Public orthodontic care was studied by means of two questionnaires sent to the chief dental officers of all health centres (n = 276) and to all specialist orthodontists in Finland (n = 146). The orthodontists named the large regional variation as the most important factor requiring improvement. Orthodontic practices and outcomes were studied in eight Finnish municipal health centres representing early and late timing of treatment. A random sample of 16- and 18-year-olds (n = 1109) living in these municipalities was examined for acceptability of occlusion with the Occlusal Morphology and Function Index (OMFI). Only minor differences in acceptability of occlusion were found between the two timing groups, although the percentage of subjects with acceptable morphology was higher among untreated than among treated adolescents. The costs of orthodontic care were estimated for the adolescents with a treatment history: mean appliance costs were higher in the late timing group, and mean visit costs higher in the early timing group. The cost-effectiveness of orthodontic services differed among the health centres but was almost equal in the two timing groups. National guidelines and the delegation of orthodontic tasks were suggested as tools for reducing the variation among health centres. In the eight health centres, considerable variation was found in acceptability of occlusion and in the cost-effectiveness of services; cost-effectiveness was not directly connected with the timing of treatment.
Abstract:
Visual data mining (VDM) tools employ information visualization techniques to represent large amounts of high-dimensional data graphically and to involve the user in exploring the data at different levels of detail. Users look for outliers, patterns and models – in the form of clusters, classes, trends and relationships – in different categories of data, e.g., financial or business information. The focus of this thesis is the evaluation of multidimensional visualization techniques, especially from the business user's perspective. We address three research problems. The first is the evaluation of projection-based visualizations with respect to their effectiveness in preserving the original distances between data points and the clustering structure of the data. Here we propose the use of existing clustering validity measures and illustrate their usefulness in evaluating five visualization techniques: Principal Component Analysis (PCA), Sammon's Mapping, the Self-Organizing Map (SOM), Radial Coordinate Visualization and Star Coordinates. The second problem concerns evaluating different visualization techniques as to their effectiveness in visual data mining of business data. For this purpose, we propose an inquiry evaluation technique and apply it to nine visualization techniques: Multiple Line Graphs, Permutation Matrix, Survey Plot, Scatter Plot Matrix, Parallel Coordinates, Treemap, PCA, Sammon's Mapping and the SOM. The third problem is the evaluation of the quality of use of VDM tools. We provide a conceptual framework for evaluating the quality of use of VDM tools and apply it to the evaluation of the SOM, using an inquiry technique with a questionnaire developed from the proposed framework. The contributions of the thesis consist of three new evaluation techniques and the results obtained by applying them.
The thesis provides a systematic approach to the evaluation of visualization techniques. First, we performed and described the evaluations in a systematic way, highlighting the evaluation activities and their inputs and outputs. Second, we integrated the evaluation studies into the broad framework of usability evaluation. The results of the evaluations are intended to help developers and researchers of visualization systems select appropriate visualization techniques in specific situations. They also contribute to the understanding of the strengths and limitations of the evaluated techniques and, further, to their improvement.
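The first evaluation problem, preserving original inter-point distances in a 2-D projection, can be illustrated with a minimal sketch. The SVD-based PCA, the use of Sammon's stress as the distance-preservation measure, and the random data below are all illustrative assumptions, not the thesis's actual tooling:

```python
import numpy as np

def pairwise_dists(X):
    # Euclidean distance matrix between all rows of X
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def pca_project(X, k=2):
    # Project centred data onto the top-k principal components via SVD
    Xc = X - X.mean(0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def sammon_stress(X, Y):
    # Normalized Sammon stress in [0, 1]; lower = better distance preservation
    D, d = pairwise_dists(X), pairwise_dists(Y)
    mask = D > 0  # skip the zero diagonal
    return ((D[mask] - d[mask]) ** 2 / D[mask]).sum() / D[mask].sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))          # 50 points in 10-D
Y = pca_project(X)                     # their 2-D PCA view
print(f"Sammon stress of the PCA view: {sammon_stress(X, Y):.3f}")
```

The same stress value can be computed for any projection (Sammon's Mapping, SOM prototypes, Star Coordinates), which is what makes a single numeric criterion useful for comparing techniques.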
Abstract:
The potential for enhancing the energy efficiency of industrial pumping processes is estimated to be, in some cases, up to 50%. One way to further define this potential is to implement techniques in accordance with the definition of best available techniques (BAT) in pumping applications. These techniques fall into three main categories: design, control method and maintenance, and the distribution system. The theory part of this thesis first addresses the definition of best available techniques and its applicability to pumping processes. Next, the theory of pumping with different pump types is covered, with the main emphasis on centrifugal pumps. The other components of a pumping process are treated by presenting different control methods and the use of electric motors, variable speed drives and the distribution system. The last part of the theory deals with industrial pumping processes in water distribution, sewage and power plant applications, some of which are used as example cases in the empirical part. For the empirical part of this study, four case studies of typical pumping processes were selected from older Master's theses. First, the original results were analyzed by studying the distribution of energy consumption between the different system components, and, using the definition of BAT in pumping, possible ways to improve energy efficiency were evaluated. The goal was that the results would make it possible to identify the characteristic energy consumption of these and similar pumping processes. This data would then make it easier to focus energy efficiency actions where they are likely to be most applicable, both technically and economically.
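The distribution of energy consumption between system components follows from how pumping power depends on flow, head and component efficiencies. A minimal sketch of these textbook relations (the flow, head and efficiency figures below are illustrative, not values from the case studies):

```python
def hydraulic_power_kw(flow_m3_h, head_m, density=1000.0, g=9.81):
    # Useful (hydraulic) power: P = rho * g * Q * H, Q converted from m3/h to m3/s
    q = flow_m3_h / 3600.0
    return density * g * q * head_m / 1000.0

def input_power_kw(p_hyd_kw, pump_eff, motor_eff=1.0, vsd_eff=1.0):
    # Electrical input power: each component efficiency divides the useful power
    return p_hyd_kw / (pump_eff * motor_eff * vsd_eff)

# Illustrative duty point: 500 m3/h against 30 m of head
p_hyd = hydraulic_power_kw(500.0, 30.0)        # ~40.9 kW of useful power
p_in = input_power_kw(p_hyd, pump_eff=0.75, motor_eff=0.95, vsd_eff=0.97)
losses = p_in - p_hyd                          # power lost in pump, motor and drive
```

Comparing `p_hyd` with `p_in` for each component chain is one simple way to see where the savings potential of a pumping process lies.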
Abstract:
This book is dedicated to celebrating the 60th birthday of Professor Rainer Huopalahti. Professor Rainer "Repe" Huopalahti has had, and is in fact still enjoying, a distinguished career in the analysis of food and food-related flavor compounds. One will find it hard to make any progress in this particular field without valid and innovative sample handling techniques, and this is an area to which Professor Huopalahti has made great contributions. The title and front cover of this book honor Professor Huopalahti's early steps in science. His PhD thesis, published in 1985, is entitled "Composition and content of aroma compounds in the dill herb, Anethum graveolens L., affected by different factors". At the time, the thesis introduced new technology for the sample handling and analysis of the flavoring compounds of dill. Sample handling is an essential task in just about every analysis. If one is working with minor compounds in a sample, or trying to detect trace levels of analytes, one aim of sample handling may be to increase the sensitivity of the analytical method. If, on the other hand, one is working with a challenging matrix such as those found in biological samples, one aim is to increase the selectivity. Quite often, however, the aim is to increase both selectivity and sensitivity. This book provides good and representative examples of the necessity of valid sample handling and of its role in the analytical method. The contributors are leading Finnish scientists in the field of organic instrumental analytical chemistry. Some of them are also Repe's personal friends and former students from the Department of Biochemistry and Food Chemistry at the University of Turku. Importantly, the authors all know Repe in one way or another and are well aware of his achievements in the field of analytical chemistry.
The editorial team had a great time during the planning phase and the "hard-work editorial phase" of the book. For example, we came up with many ideas on how to publish it. After many long discussions, we decided to produce a limited edition as an "old school" hardcover book – and to acknowledge more modern ways of disseminating knowledge by publishing an internet version of the book on the web pages of the University of Turku. Downloading the book from the web page for personal use is free of charge. We believe and hope that the book will be read with great interest by scientists working in the fascinating field of organic instrumental analytical chemistry. We decided to publish the book in English for two main reasons. First, we believe that in the near future more and more teaching in Finnish universities will be delivered in English; to facilitate this process and encourage students to develop good language skills, we chose to publish in English. Second, we believe that the book will also interest scientists outside Finland – particularly in the other member states of the European Union. The editorial team thanks all the authors for their willingness to contribute to this book – and to adhere to a very strict schedule. We also want to thank the various individuals and enterprises who financially supported the book project. Without that support, it would not have been possible to publish the hardcover book.
Abstract:
This dissertation is the result of a need to research and develop a didactics for textile studies in the Folk Art programme at the Institutt for folkekultur, Høgskolen i Telemark, Norway. Folk art is a relatively young field of study at the college, which was established in 1984. The research question of the dissertation is how to develop a research-based didactics in which the fundamental principles that characterize traditional folk art are preserved. In working on the dissertation I have sought to clarify the research question from different perspectives. The research focuses on communication and working methods in the light of various theories, and a hermeneutic approach was chosen for the didactic understanding. The first discussion theme focuses on communication and dialogue in the transmission of textile folk art. Aesthetic, theoretical, practical and social aspects are all embedded in the curriculum of the programme and form the basis for communication and working methods. In the creative and copying processes, linguistic tools have been developed for both the social and the aesthetic side, in which the theoretical and practical factors are integrated. Encounters with examples of textile traditions, as well as the practical shaping of textiles, have led to reflection and dialogue involving contemplation, correspondence and imagination. The second theme discussed is the transmission of the traditional visual formal language. Here attention is directed at what has happened, formally, to a group of traditional design elements in textiles during institutional transmission over a longer time perspective. The results show that many traditional design elements have disappeared from institutional production. Design elements may have been given a more naturalistic form, or they have been transferred to traditional textile techniques other than those from which they originated. The rhombus form in institutional production is executed in fewer variations and combinations than in traditional production.
The conclusion of this theme is that the rules of play and the spatiality of the formal component in the selection of traditional textiles have not been carried forward in all groups of institutional products. This result may have consequences for the design of a future didactics for the subject. Working methods and experience in the transmission of textile traditions constitute the third theme discussed in the dissertation. Copying and creative processes are the working methods used in the programme today, and they form the basis for discussions of transmission and experience. The conclusion is that in the creative process the tradition corrects the design, whereas in the copying process a personal sense of style and improvisation are the outcomes. Those who take part in the processes bring themselves and their historical anchoring into the play, in which visual and verbal language result from an integration of traditions. Together, the three discussion themes form the basis on which a didactics for folk art and textiles can become an intersubjective understanding of what a didactics for the transmission of folk art, with an emphasis on textiles, can be in education today. Inner and outer dialogue in creative and copying processes embraces aesthetic, material and technical factors considered in relation to the rules of play, the room for play, spatiality and interplay in the encounter with textile traditions. Together this constitutes a research-based didactics for the subject, in which the overarching intention is accord between tradition and play.
Abstract:
The use of intensity-modulated radiotherapy (IMRT) has increased extensively in modern radiotherapy (RT) over the past two decades. Radiation dose distributions can be delivered with higher conformality with IMRT than with conventional 3D conformal radiotherapy (3D-CRT). Higher conformality and target coverage increase the probability of tumour control and decrease normal tissue complications. The primary goal of this work is to improve and evaluate the accuracy, efficiency and delivery techniques of RT treatments using IMRT. This study evaluated the dosimetric limitations and possibilities of IMRT in small volumes (treatments of head-and-neck, prostate and lung cancer) and large volumes (primitive neuroectodermal tumours). The dose coverage of target volumes and the sparing of critical organs were increased with IMRT compared to 3D-CRT. The developed split-field IMRT technique was found to be a safe and accurate method for craniospinal irradiation. By using IMRT for simultaneous integrated boosting of biologically defined target volumes in localized prostate cancer, high doses were achievable with only a small increase in treatment complexity. Biological plan optimization increased the probability of uncomplicated control on average by 28% compared with standard IMRT delivery. Unfortunately, IMRT also has drawbacks. In IMRT, beam modulation is realized by splitting a large radiation field into small apertures; the smaller the beam apertures, the larger the rebuild-up and rebuild-down effects at tissue interfaces. The limitations of using IMRT with small apertures in the treatment of small lung tumours were investigated with dosimetric film measurements. The results confirmed that the peripheral doses of small lung tumours decreased as the effective field size was decreased, and the studied calculation algorithms were not able to model this dose deficiency accurately.
The use of small sliding-window apertures of 2 mm and 4 mm decreased the tumour peripheral dose by 6% compared with the 3D-CRT treatment plan. A direct aperture based optimization (DABO) technique was examined as a way to decrease treatment complexity. The DABO IMRT technique achieved treatment plans equivalent to those of conventional fluence-based IMRT optimization for concave head-and-neck target volumes. With DABO, the effective field sizes were increased and the number of monitor units (MUs) was reduced by a factor of two. The optimality of a treatment plan and the therapeutic ratio can be further enhanced by dose painting based on regional radiosensitivities imaged with functional imaging methods.
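The "probability of uncomplicated control" optimized above combines tumour control probability (TCP) and normal tissue complication probability (NTCP). A hedged sketch of one common textbook formulation, P+ = TCP × (1 − NTCP) with logistic dose-response curves; the D50 and γ50 parameters below are purely illustrative, not the models or values used in the thesis:

```python
def tcp(d, d50=60.0, gamma50=2.0):
    # Logistic (logit) dose-response: probability of tumour control at dose d (Gy).
    # d50 is the dose giving 50% response; gamma50 the normalized slope at d50.
    return 1.0 / (1.0 + (d50 / d) ** (4.0 * gamma50))

def ntcp(d, d50=75.0, gamma50=3.0):
    # Same functional form for the normal tissue complication probability
    return 1.0 / (1.0 + (d50 / d) ** (4.0 * gamma50))

def p_uncomplicated(d):
    # P+ : control achieved AND no complication (independence assumed)
    return tcp(d) * (1.0 - ntcp(d))

# Scan prescription doses for the one maximizing P+ under these toy curves
best_dose = max(range(40, 91), key=p_uncomplicated)
```

Because TCP rises at lower doses than NTCP, P+ peaks at an intermediate dose; biological plan optimization shifts the dose distribution toward that peak instead of toward a purely geometric objective.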
Abstract:
Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need to manage this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. They are used to define processes in a way that allows easy management of processes, for example process dissemination, process tailoring and process enactment. Process modeling languages are usually used as tools for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles representing dissertation research done on process modeling over a period of approximately five years. The research follows the classical engineering research discipline: the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed.
The research applies a variety of research techniques, ranging from literature surveys to qualitative studies among software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, such as lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. It presents a lightweight modeling technique that software development teams can use to quickly analyze their work practices in a more objective manner, and it shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study among Finnish software practitioners verifies the conclusions of the other studies in the dissertation: although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work.
This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These solutions are shown to be feasible through several case studies in which the modeling techniques are used, for example, to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment, and as the amount of data rapidly grows there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses that need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make the original incomplete data complete, and thus easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the imputation. Second, we studied the effect of missing value imputation on downstream data analysis methods such as clustering, comparing multiple recent imputation algorithms on 8 publicly available microarray data sets. We observed that missing value imputation is indeed a rational way to improve the quality of biological data, and the research revealed differences between the clustering results obtained with different imputation methods. On most data sets the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as those using gene microarray techniques.
Such networks are typically very large and highly connected, so there is a need for fast algorithms for producing visually pleasing layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within a regular force-directed graph layout algorithm.
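The k-NN imputation mentioned above can be sketched in a few lines. This is a deliberately simplified variant that estimates each missing entry as the mean of the corresponding entries in the k nearest fully observed rows; the actual algorithms and parameters studied in the thesis may differ:

```python
import numpy as np

def knn_impute(X, k=3):
    # Fill NaN entries row by row using the k nearest complete rows,
    # with nearness measured only over the row's observed features.
    X = X.astype(float).copy()
    for i, row in enumerate(X):
        miss = np.isnan(row)
        if not miss.any():
            continue
        obs = ~miss
        # candidate neighbours: other rows with no missing values at all
        cand = [j for j in range(len(X)) if j != i and not np.isnan(X[j]).any()]
        dists = [np.linalg.norm(X[j, obs] - row[obs]) for j in cand]
        nearest = [cand[t] for t in np.argsort(dists)[:k]]
        # impute each missing entry with the neighbours' mean for that column
        row[miss] = X[nearest][:, miss].mean(0)
    return X
```

In a real expression matrix the rows would be genes and the columns arrays; the thesis's guided variant additionally constrains the neighbour choice using curated external biological information.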
Abstract:
Switching power supplies are usually implemented with control circuitry that uses a constant clock frequency to turn the power semiconductor switches on and off. A drawback of this customary operating principle is that the switching frequency and its harmonics are present in both the conducted and radiated EMI spectra of the power converter. Various variable-frequency techniques have been introduced during the last decade to overcome this EMC problem. The main objective of this study was to compare the EMI and steady-state performance of a switch-mode power supply (SMPS) under different spread-spectrum/variable-frequency methods. Another goal was to find suitable tools for variable-frequency EMI analysis. This thesis can be divided into three main parts. First, some aspects of spectral estimation and measurement are presented. Second, selected spread-spectrum generation techniques are presented with simulations and background information. Finally, simulations and prototype measurements of the EMC and steady-state performance are carried out. A combination of the autocorrelation function, the Welch spectrum estimate and the spectrogram was used as a substitute for ordinary Fourier methods in the EMC analysis. It was also shown that the switching function can be used in preliminary EMC analysis of an SMPS, and that the spectrum and autocorrelation sequence of the switching function correlate with the final EMI spectrum. This work is based on numerous simulations and measurements made with a prototype, all using a boost DC/DC converter. Four variable-frequency modulation techniques in six different configurations were analyzed, and their EMI performance was compared to constant-frequency operation. Output voltage and input current waveforms were also analyzed in the time domain to see the effect of spread-spectrum operation on these quantities.
According to the results presented in this work, spread-spectrum modulation can be utilized in a power converter for EMI mitigation. The steady-state voltage measurements show that variable-frequency operation of the SMPS affects the voltage ripple, but the ripple measured from the prototype is still acceptable for some applications. Both current and voltage ripple can be controlled with proper main-circuit and controller design.
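The idea that the switching function's spectrum predicts the EMI spectrum, and that spreading the switching frequency lowers the spectral peaks, can be illustrated with a small sketch: a hand-rolled Welch estimate applied to a constant-frequency versus a frequency-modulated 0/1 switching function. The triangular modulation scheme, the 1 kHz switching frequency and all other parameters below are illustrative assumptions, not the prototype's:

```python
import numpy as np

def switching_function(n, fs, f_sw, dev=0.0, f_mod=50.0):
    # 0/1 gate signal sampled at fs; dev > 0 adds triangular frequency
    # modulation of +/- dev*f_sw (a simple periodic spread-spectrum scheme)
    t = np.arange(n) / fs
    tri = 2.0 * np.abs(2.0 * ((t * f_mod) % 1.0) - 1.0) - 1.0  # triangle in [-1, 1]
    f_inst = f_sw * (1.0 + dev * tri)
    phase = 2.0 * np.pi * np.cumsum(f_inst) / fs
    return (np.sin(phase) >= 0.0).astype(float)

def welch_psd(x, fs, nperseg=256):
    # Welch estimate: averaged periodograms of Hann-windowed,
    # 50%-overlapping, mean-removed segments
    win = np.hanning(nperseg)
    step = nperseg // 2
    segs = [x[i:i + nperseg] for i in range(0, len(x) - nperseg + 1, step)]
    psd = np.mean([np.abs(np.fft.rfft((s - s.mean()) * win)) ** 2 for s in segs], axis=0)
    psd /= fs * (win ** 2).sum()
    return np.fft.rfftfreq(nperseg, 1.0 / fs), psd

fs = 25600.0
fixed = switching_function(8192, fs, 1000.0)             # constant 1 kHz switching
spread = switching_function(8192, fs, 1000.0, dev=0.2)   # 1 kHz +/- 20%
f, p_fix = welch_psd(fixed, fs)
_, p_mod = welch_psd(spread, fs)
# The modulated signal carries the same power, but its peak PSD is lower
# because the fundamental and harmonics are smeared over neighbouring bins.
```

The same comparison of peak levels, here between `p_fix` and `p_mod`, is what an EMC measurement with a quasi-peak detector would show for the converter itself.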