Abstract:
Visual data mining (VDM) tools employ information visualization techniques to represent large amounts of high-dimensional data graphically and to involve the user in exploring the data at different levels of detail. Users look for outliers, patterns and models (in the form of clusters, classes, trends and relationships) in different categories of data, e.g., financial and business information. The focus of this thesis is the evaluation of multidimensional visualization techniques, especially from the business user's perspective. We address three research problems. The first is the evaluation of projection-based visualizations with respect to how well they preserve the original distances between data points and the clustering structure of the data. Here we propose the use of existing clustering validity measures and illustrate their usefulness in evaluating five visualization techniques: Principal Components Analysis (PCA), Sammon's Mapping, the Self-Organizing Map (SOM), Radial Coordinate Visualization and Star Coordinates. The second problem concerns evaluating different visualization techniques as to their effectiveness in visual data mining of business data. For this purpose, we propose an inquiry evaluation technique and apply it to nine visualization techniques: Multiple Line Graphs, Permutation Matrix, Survey Plot, Scatter Plot Matrix, Parallel Coordinates, Treemap, PCA, Sammon's Mapping and the SOM. The third problem is the evaluation of the quality of use of VDM tools. We provide a conceptual framework for evaluating the quality of use of VDM tools and apply it to the evaluation of the SOM, using an inquiry technique with a questionnaire developed from the proposed framework. The contributions of the thesis are three new evaluation techniques and the results obtained by applying them. The thesis provides a systematic approach to the evaluation of visualization techniques: first, we performed and described the evaluations in a systematic way, highlighting the evaluation activities and their inputs and outputs; second, we integrated the evaluation studies into the broad framework of usability evaluation. The results of the evaluations are intended to help developers and researchers of visualization systems select appropriate visualization techniques in specific situations. They also contribute to understanding the strengths and limitations of the evaluated visualization techniques and, further, to the improvement of these techniques.
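The first of these evaluations, using clustering validity measures to judge how faithfully a projection preserves cluster structure, can be sketched briefly. The following Python example is only an illustration of the general idea, not the thesis's actual protocol: the dataset, the use of K-Means to obtain reference labels, and the choice of the silhouette coefficient as the validity measure are all assumptions.

```python
# Sketch: compare a clustering validity measure (silhouette score)
# computed in the original high-dimensional space vs. in a 2D PCA
# projection. A projection that preserves the clustering structure
# should yield a similar score in both spaces.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

X, _ = load_iris(return_X_y=True)

# Cluster in the original space to obtain reference labels.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Project to two dimensions, as a scatter-plot visualization would.
X_2d = PCA(n_components=2).fit_transform(X)

print("silhouette, original space :", silhouette_score(X, labels))
print("silhouette, 2D projection  :", silhouette_score(X_2d, labels))
```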
Abstract:
The potential for enhancing the energy efficiency of industrial pumping processes is estimated to be, in some cases, up to 50 %. One way to define this potential further is to implement techniques that follow the definition of best available techniques (BAT) in pumping applications. These techniques fall into three main categories: design, control method and maintenance, and the distribution system. The theory part of this thesis first addresses the definition of best available techniques and its applicability to pumping processes. Next, the theory of pumping with different pump types is covered, with the main emphasis on centrifugal pumps. The other components of a pumping process are then treated by presenting different control methods, the use of an electric motor, the variable speed drive and the distribution system. The last part of the theory deals with industrial pumping processes in water distribution, sewage water and power plant applications, some of which are used as example cases in the empirical part. For the empirical part of this study, four case studies on typical pumping processes were selected from older Master's theses. The original results were first analyzed by studying the distribution of energy consumption between the different system components, and possible ways to improve energy efficiency were then evaluated using the definition of BAT in pumping. The goal of this study was that the results would make it possible to identify the characteristic energy consumption of these and similar pumping processes. With this data it would then be easier to focus energy efficiency actions where they are likely to be the most applicable, both technically and economically.
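For context, the energy figures discussed in such case studies rest on the textbook relation between flow, head and power. The sketch below shows the standard calculation; all numerical values are illustrative assumptions, not data from the thesis.

```python
# Sketch: textbook pumping-power relations used when assessing
# energy efficiency. All numbers are illustrative assumptions.
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def hydraulic_power_kw(flow_m3_s: float, head_m: float) -> float:
    """Hydraulic (useful) power: P_h = rho * g * Q * H."""
    return RHO * G * flow_m3_s * head_m / 1000.0

def shaft_power_kw(flow_m3_s: float, head_m: float, pump_eff: float) -> float:
    """Power drawn at the pump shaft: P = P_h / eta."""
    return hydraulic_power_kw(flow_m3_s, head_m) / pump_eff

# Example: 0.05 m^3/s against a 30 m head at 70 % pump efficiency.
q, h, eta = 0.05, 30.0, 0.70
p = shaft_power_kw(q, h, eta)
print(f"shaft power: {p:.1f} kW")
# Specific energy (kWh per pumped m^3), a common efficiency indicator:
print(f"specific energy: {p / (q * 3600):.3f} kWh/m^3")
```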
Abstract:
This study evaluated establishment methods for a mixture of herbaceous forage legumes [Centrosema acutifolium, Clitoria ternatea, Pueraria phaseoloides, Stylosanthes Campo Grande (Stylosanthes capitata + S. macrocephala), Calopogonium mucunoides, Lablab purpureus, Arachis pintoi, and Aeschynomene villosa] under the shade of a Eucalyptus grandis plantation submitted to thinning (40%) 8 years after planting in Anhembi, São Paulo (22°40'S, 48°10'W, altitude of 455 m). The experiment started in December 2008 and compared the following four types of seed incorporation by light disc harrowing: (1) broadcast sowing without seed incorporation; (2) disc harrowing before planting; (3) disc harrowing after planting; and (4) disc harrowing before and after planting. Ninety days after planting, the number of legume plants/m² and the percentage of ground cover by the plants varied between the treatments tested; however, the treatments had no effect on the dry matter accumulation of the forage legumes. Disc harrowing before planting yielded superior results compared to no disc harrowing and to disc harrowing after planting. At the end of the experimental period, the plots contained Arachis, Centrosema, Stylosanthes, and Pueraria. The dry matter accumulated by Centrosema corresponded to 73% of the total dry matter yield of the plots. The participation of Arachis, Centrosema and Stylosanthes in the final dry matter composition of the plots varied according to the establishment method. The advantages of using species mixtures rather than monocultures in the understory of forest plantations are discussed.
Abstract:
This book is dedicated to celebrating the 60th birthday of Professor Rainer Huopalahti. Professor Rainer "Repe" Huopalahti has had, and is in fact still enjoying, a distinguished career in the analysis of food and food-related flavor compounds. It is hard to make any progress in this particular field without a valid and innovative sample handling technique, and this is a field in which Professor Huopalahti has made great contributions. The title and the front cover of this book honor Professor Huopalahti's early steps in science. His PhD thesis, published in 1985, is entitled "Composition and content of aroma compounds in the dill herb, Anethum graveolens L., affected by different factors". At that time, the thesis introduced new technology applied to sample handling and the analysis of the flavoring compounds of dill. Sample handling is an essential task in just about every analysis. If one is working with minor compounds in a sample or trying to detect trace levels of analytes, one of the aims of sample handling may be to increase the sensitivity of the analytical method. On the other hand, if one is working with a challenging matrix, such as the kind found in biological samples, one of the aims is to increase the selectivity. Quite often, however, the aim is to increase both the selectivity and the sensitivity. This book provides good and representative examples of the necessity of valid sample handling and of the role of sample handling in the analytical method. The contributors to the book are leading Finnish scientists in the field of organic instrumental analytical chemistry. Some of them are also Repe's personal friends and former students from the University of Turku, Department of Biochemistry and Food Chemistry. Importantly, the authors all know Repe in one way or another and are well aware of his achievements in the field of analytical chemistry. The editorial team had a great time during the planning phase and during the "hard work" editorial phase of the book. For example, we came up with many ideas on how to publish the book. After many long discussions, we decided to have a limited edition as an "old school" hardcover book, and to acknowledge more modern ways of disseminating knowledge by publishing an internet version of the book on the webpages of the University of Turku. Downloading the book from the webpage for personal use is free of charge. We believe and hope that the book will be read with great interest by scientists working in the fascinating field of organic instrumental analytical chemistry. We decided to publish the book in English for two main reasons. First, we believe that in the near future more and more teaching in Finnish universities will be delivered in English; to facilitate this process and encourage students to develop good language skills, we decided to publish the book in English. Second, we believe that the book will also interest scientists outside Finland, particularly in the other member states of the European Union. The editorial team thanks all the authors for their willingness to contribute to this book and to adhere to the very strict schedule. We also want to thank the various individuals and enterprises who financially supported the book project. Without that support, it would not have been possible to publish the hardcover book.
Abstract:
The aim of this study was to group, by similarity, the temporal profiles of the 10-day composite NDVI product obtained by the SPOT Vegetation sensor for municipalities with high soybean production in the state of Paraná, Brazil, in the 2005/2006 cropping season. Data mining is a valuable tool for extracting knowledge from a database and identifying valid, new, potentially useful and understandable patterns. Clusters were therefore generated with the K-Means, MAXVER and DBSCAN algorithms, as implemented in the WEKA software package. The clusters were created from the average temporal NDVI profiles of the 277 municipalities with high soybean production in the state. The best results were obtained with the K-Means algorithm, which grouped the municipalities into six clusters over the period from the beginning of October until the end of March, equivalent to the crop's vegetative cycle. Half of the generated clusters presented a spectro-temporal pattern characteristic of soybean and lay mostly within the soybean belt of Paraná, showing that the proposed methodology gave good results for the identification of homogeneous areas. These results will be useful for the creation of regional soybean "masks" to estimate the planted area of this crop.
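The clustering step can be sketched as follows. This Python example uses scikit-learn rather than WEKA, and the synthetic profile matrix is an assumption, but the logic (one average NDVI time series per municipality, grouped into six clusters by K-Means) mirrors the approach described above.

```python
# Sketch: K-Means clustering of average NDVI temporal profiles.
# Rows = municipalities, columns = 10-day NDVI composites from
# October to March (18 periods). The data here are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_municipalities, n_periods = 277, 18

# Synthetic stand-in for the real profiles: a bell-shaped crop
# cycle plus noise, with varying amplitude per municipality.
t = np.linspace(0.0, 1.0, n_periods)
base = np.exp(-((t - 0.55) ** 2) / 0.05)  # vegetative-cycle peak
amplitude = rng.uniform(0.3, 0.9, (n_municipalities, 1))
profiles = amplitude * base + rng.normal(0.0, 0.03, (n_municipalities, n_periods))

# Group the municipalities into six clusters, as in the study.
km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(profiles)
for k in range(6):
    print(f"cluster {k}: {np.sum(km.labels_ == k)} municipalities")
```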
Abstract:
The use of intensity-modulated radiotherapy (IMRT) has increased extensively in modern radiotherapy (RT) treatments over the past two decades. With IMRT, radiation dose distributions can be delivered with higher conformality than with conventional 3D-conformal radiotherapy (3D-CRT). Higher conformality and target coverage increase the probability of tumour control and decrease normal tissue complications. The primary goal of this work is to improve and evaluate the accuracy, efficiency and delivery techniques of RT treatments using IMRT. This study evaluated the dosimetric limitations and possibilities of IMRT in small volumes (treatments of head-and-neck, prostate and lung cancer) and large volumes (primitive neuroectodermal tumours). The dose coverage of target volumes and the sparing of critical organs were increased with IMRT compared to 3D-CRT. The developed split-field IMRT technique was found to be a safe and accurate method for craniospinal irradiations. Using IMRT for simultaneous integrated boosting of biologically defined target volumes of localized prostate cancer, high doses were achievable with only a small increase in treatment complexity. Biological plan optimization increased the probability of uncomplicated control on average by 28% compared to standard IMRT delivery. IMRT also carries some drawbacks. In IMRT the beam modulation is realized by splitting a large radiation field into small apertures. The smaller the beam apertures, the larger the rebuild-up and rebuild-down effects at tissue interfaces. The limitations of using IMRT with small apertures in the treatment of small lung tumours were investigated with dosimetric film measurements. The results confirmed that the peripheral doses of small lung tumours decreased as the effective field size decreased. The studied calculation algorithms were not able to model this dose deficiency accurately. The use of small sliding-window apertures of 2 mm and 4 mm decreased the tumour peripheral dose by 6% compared to a 3D-CRT treatment plan. A direct aperture based optimization (DABO) technique was examined as a solution to decrease treatment complexity. The DABO IMRT technique achieved treatment plans equivalent to those of conventional fluence-based IMRT optimization in concave head-and-neck target volumes. With DABO the effective field sizes were increased and the number of MUs was reduced by a factor of two. The optimality of a treatment plan and the therapeutic ratio can be further enhanced by using dose painting based on regional radiosensitivities imaged with functional imaging methods.
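For reference, the "probability of uncomplicated control" mentioned above is conventionally formed from the tumour control probability (TCP) and the normal tissue complication probability (NTCP). Assuming the two endpoints are independent, the standard expression is

P+ = TCP × (1 − NTCP)

so that, for illustrative values of TCP = 0.80 and NTCP = 0.10 (not figures from the thesis), P+ = 0.80 × 0.90 = 0.72; the specific biological models used in the thesis are not detailed in the abstract.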
Abstract:
Acquired chest wall defects present a challenging problem for thoracic surgeons. Many such defects can be repaired with local and regional musculocutaneous flaps, but larger defects compromising the skeletal structure require increasingly sophisticated reconstructive techniques. The following discussion reviews the options for repairing acquired chest wall defects based on the literature. The authors searched PubMed (www.pubmed.com) and found citations from January 1996 to February 2008. On reading the titles and abstracts, most of the citations were discarded because they focused on congenital chest wall defects or were case reports. However, many papers were found describing the outcomes of large series of patients with acquired chest wall deformities. A review of the recent literature shows that repair of chest wall defects with soft tissues, when possible, remains the treatment of choice. Large chest wall defects require skeletal reconstruction to prevent paradoxical respiration. The selection of the most appropriate flap is primarily dictated by the location and size of the defect. It is important to transfer tissue with good vitality, so understanding the vascular supply is imperative. Autogenous grafts have been used in the past for skeletal reconstruction, but a combination of synthetic materials with musculocutaneous flaps has been used more recently. Based on the literature, the use of prosthetic material in chest wall reconstruction does not significantly increase the risk of wound infection.
Abstract:
Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need to manage this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way that allows easy management of processes, for example process dissemination, process tailoring and process enactment. Process modeling languages are usually used as tools for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles, which represent the dissertation research done on process modeling over a period of approximately five years. The research follows the classical engineering research discipline, where the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed. The research applies a variety of research techniques, ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, such as lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. The dissertation also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work.
However, the potential of these techniques intrigues practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work. This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies where the modeling techniques are used, e.g., to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.
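SPEM describes processes in terms of method-content elements such as roles, tasks and work products that are assembled into processes. The toy sketch below is a simplified Python stand-in for that structure, intended only as an illustration (real SPEM models are UML/XMI based, and all names here are invented).

```python
# Sketch: a toy stand-in for SPEM-style method content, where a
# process is assembled from tasks that link roles to work products.
# Simplified illustration only; not a real SPEM serialization.
from dataclasses import dataclass, field

@dataclass
class Role:
    name: str

@dataclass
class WorkProduct:
    name: str

@dataclass
class Task:
    name: str
    performer: Role
    outputs: list = field(default_factory=list)

@dataclass
class Process:
    name: str
    tasks: list = field(default_factory=list)

developer = Role("Developer")
spec = WorkProduct("Design specification")
design = Task("Design component", developer, [spec])
process = Process("Mini development process", [design])

for task in process.tasks:
    outs = ", ".join(w.name for w in task.outputs)
    print(f"{task.performer.name} performs '{task.name}' -> {outs}")
```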
Abstract:
For oral rehabilitation with implant-supported prostheses, procedures are often required to create the bone volume needed for the installation of the implants. Bone grafts from intraoral or extraoral donor sites therefore represent a very favorable option. This study aimed to review the literature on the subject, seeking to discuss parameters for the indications, advantages and complications of autogenous bone grafting techniques.
Abstract:
Connectivity depends on rates of dispersal between communities. For marine soft-sediment communities, continued small-scale dispersal as post-larvae and as adults can be as important in maintaining community composition as the initial recruitment to the substrate by pelagic larvae. In this thesis, the post-larval dispersal strategies of benthic invertebrates, as well as the mechanisms by which communities are connected, were investigated. Such knowledge of dispersal is scarce, owing to the difficulty of measuring dispersal directly in nature, and dispersal has not previously been quantified in the Baltic Sea. Different trap types were used underwater to capture dispersing invertebrates at different sites, while waves and currents were measured in parallel. Local community composition was found to change predictably under varying rates of dispersal and physical connectivity (waves and currents). This response was, however, dependent on the dispersal-related traits of taxa. Actively dispersing taxa are relatively better at maintaining their position, as they are less dependent on hydrodynamic conditions for dispersal and less prone to passive transport by currents. Taxa also dispersed in relative proportions distinctly different from the resident community composition, and a significant proportion (40 %) of taxa were found to lack a planktonic larval life-stage. Community assembly was re-started in a large-scale manipulative field experiment over one year across several sites, which revealed how patterns of community composition (α-, β- and γ-diversity) change depending on rates of dispersal. The results also demonstrated that, in response to small-scale disturbance, initial recruitment was by nearby dominant species, after which other species arrived from successively further away. At later stages of assembly, the number of coexisting species increased beyond what was expected purely from local niche requirements (species sorting), transferring regional differences in community composition (β-diversity) to the local scale (α-diversity, mass effect). The findings of this thesis complement more theoretical studies in metacommunity ecology by demonstrating that understanding how and when individuals disperse relative to the underlying environmental heterogeneity is key to interpreting how patterns of diversity change across spatial scales. Such information from nature is critical when predicting responses to, for example, different types of disturbances or management actions in conservation.
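The diversity components referred to above follow the standard partition of regional (γ) diversity into local (α) and among-site (β) components. A minimal sketch of Whittaker's multiplicative partition, with an assumed site-by-species incidence matrix rather than the thesis's data, is:

```python
# Sketch: Whittaker's multiplicative diversity partition,
# beta = gamma / alpha, for site-by-species presence data.
# The incidence matrix below is an illustrative assumption.
import numpy as np

# Rows = sites, columns = taxa (1 = present).
sites = np.array([[1, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 1, 1]])

alpha = sites.sum(axis=1).mean()       # mean local richness
gamma = (sites.sum(axis=0) > 0).sum()  # regional richness
beta = gamma / alpha                   # turnover among sites
print(f"alpha={alpha:.2f}, gamma={gamma}, beta={beta:.2f}")
```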
Abstract:
Forty-one wild house mice (Mus musculus) were trapped in an urban area near railways in the city of Santa Fe, Argentina. Both kidneys from each mouse were removed for bacteriological and histological examination. One kidney was inoculated into Fletcher semi-solid medium and the isolates were serologically typed. The other kidney was examined microscopically after hematoxylin-eosin, silver impregnation and immunohistochemical staining. Leptospires, all belonging to the Ballum serogroup, were isolated from 16 (39%) of the 41 samples. The presence of the agent was recorded in 18 (44%) and 19 (46%) of the 41 silver-impregnated and immunohistochemically stained samples, respectively. Additionally, leptospires were detected in high numbers on the apical surface of epithelial cells and in the lumen of medullary tubules, and were less frequently seen on the apical surface of epithelial cells or in the lumen of cortical tubules, which represents an unusual finding in carrier animals. Microscopic lesions consisting of focal mononuclear interstitial nephritis, glomerular shrinkage and desquamation of tubular epithelial cells were observed in 13 of 19 infected and in 10 of 22 non-infected mice; differences in the presence of lesions between infected and non-infected animals were not statistically significant (P=0.14). The three techniques (culture, silver impregnation and immunohistochemistry) showed high agreement (κ ≥ 0.85) and no significant differences between them were detected (P>0.05). In addition, an unusual location of leptospires in the kidneys of carrier animals was reported, but a relationship between lesions and the presence of leptospires could not be established.
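The agreement statistic quoted above is Cohen's kappa. As a quick illustration of how such pairwise agreement between two detection techniques is computed (with example counts chosen for illustration, not taken from the study's cross-tabulations):

```python
# Sketch: Cohen's kappa for agreement between two detection
# techniques applied to the same 41 kidneys. The counts are
# illustrative assumptions, not the study's data.
def cohens_kappa(both_pos: int, both_neg: int, only_a: int, only_b: int) -> float:
    n = both_pos + both_neg + only_a + only_b
    observed = (both_pos + both_neg) / n
    # Expected chance agreement from the marginal frequencies.
    a_pos = (both_pos + only_a) / n
    b_pos = (both_pos + only_b) / n
    expected = a_pos * b_pos + (1 - a_pos) * (1 - b_pos)
    return (observed - expected) / (1 - expected)

# e.g. technique A positive in 16 kidneys, technique B in 19,
# with 16 positive by both and 22 negative by both.
print(f"kappa = {cohens_kappa(16, 22, 0, 3):.2f}")
```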
Abstract:
This thesis analyses news articles from three websites from a linguistic point of view. The aim is to determine whether the reporting of the websites BBC, CNN and Fox News exhibits political bias or partisanship, and how these manifest in practice in the language of the news articles. Drawing on critical discourse analysis, the thesis presents background on each news site (for example, its structure and funding) as well as background on media discourse and politics, which ensures that Norman Fairclough's three-stage method is applied as thoroughly as possible. The news sites are analysed using functional grammar and other linguistic tools suited to critical discourse analysis. The headlines of the whole dataset (404 articles) are analysed first, after which nine complete articles on three different topics are analysed, one article per topic from each website. The primary analytical tools are those of the textual metafunction of systemic functional grammar (thematic structure); tools of the ideational metafunction (transitivity), referential identity chains and lexical analysis are also employed. The approach is fundamentally comparative, which makes the results easier to observe and justify. Based on previous research and common perceptions, the hypothesis is that CNN reports favourably towards the Democratic Party and Fox News towards the Republican Party. The findings ranged from results supporting the hypothesis and results contradicting it to results that were not sufficiently supported in either direction. The strongest results, however, support the hypothesis, so the thesis concludes that the reporting is not impartial, at least on these three websites. In addition, for a few topics the reporting is framed so consistently from a particular perspective that naturalization of ideas, as described in naturalization theory, may be taking place. Based on the success of the methods used in the thesis, it is recommended that the analytical tools of the textual metafunction be used more widely. It is also recommended that a meta-analysis be considered in order to determine which analytical methods are best suited to which kinds of data.
Abstract:
When modeling machines in their natural working environment, collisions become a very important feature in terms of simulation accuracy. As simulations expand to include the operating environment, the need for a general collision model able to handle a wide variety of cases has become central in the development of simulation environments. With the addition of the operating environment, the challenges for the collision modeling method also change: more simultaneous contacts with more objects occur in more complicated situations, which makes the real-time requirement more difficult to meet. Common problems in current collision modeling methods include, for example, dependency on geometry shape or mesh density, computational cost that grows exponentially with the number of contacts, the lack of a proper friction model, and failures in certain configurations such as closed kinematic loops. All of these problems mean that current modeling methods will fail in certain situations. A method that never fails in any situation is not very realistic, but improvements can be made over the current methods.
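As a minimal illustration of the kind of contact handling being discussed, the sketch below shows a generic sphere-sphere test with an impulse-based normal response; this is a textbook example, not the method developed in the thesis, and it deliberately omits friction, the very aspect the abstract flags as lacking in many methods.

```python
# Sketch: elementary sphere-sphere contact detection and an
# impulse-based normal response (no friction). A generic textbook
# illustration of collision modeling, not the thesis's method.
import numpy as np

def resolve_contact(p1, v1, m1, r1, p2, v2, m2, r2, restitution=0.5):
    """Detect overlap of two spheres and apply a normal impulse."""
    d = p2 - p1
    dist = np.linalg.norm(d)
    if dist >= r1 + r2:            # no contact
        return v1, v2
    n = d / dist                   # contact normal, sphere 1 -> 2
    v_rel = np.dot(v2 - v1, n)     # relative normal velocity
    if v_rel > 0:                  # already separating
        return v1, v2
    # Impulse magnitude for two bodies with inverse masses 1/m.
    j = -(1 + restitution) * v_rel / (1 / m1 + 1 / m2)
    return v1 - (j / m1) * n, v2 + (j / m2) * n

v1, v2 = resolve_contact(np.zeros(3), np.array([1.0, 0, 0]), 2.0, 0.5,
                         np.array([0.9, 0, 0]), np.zeros(3), 1.0, 0.5)
print(v1, v2)  # post-impact velocities of the two spheres
```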