976 results for Meta-heuristic techniques
Abstract:
The current study aims to determine the best method for rapid and efficient extraction of flavonoids from Alpinia zerumbet. Dried leaves were extracted with distilled water and 70% ethanol using shaking maceration, ultrasonic, microwave, and stirring extraction methods. Rutin and kaempferol-3-O-glucuronide were detected by TLC and reversed-phase HPLC. Ethanol 70% was more efficient than water for flavonoid extraction. No significant yield variation was observed among the ultrasonic, microwave, and stirring methods using 70% ethanol (11 to 14%). The relative concentrations of rutin and kaempferol-3-O-glucuronide were highest with the ultrasonic (1.5 and 5.62 mg g-1 dried leaves, respectively) and microwave (1.0 and 6.64 mg g-1 dried leaves) methods using ethanol. Rapid and simplified extraction procedures optimize phytochemical work and the acquisition of secondary metabolites.
Abstract:
"Meta-estética e ética francesa do sentido" é uma análise de alguns conceitos formadores do chamado "pós-estruturalismo" e de certos aspectos de sua seqüência histórica. Através de fontes textuais de pensadores que ocupam um momento significativo da produção filosófica internacional, com Jacques Derrida, Gilles Deleuze, Michel Serres e Jean-Luc Nancy, dos anos sessenta até os noventa, o texto coloca em perspectiva conceitos nevrálgicos e estratégicos ressituados nas suas implicações críticas. A manifestação das convergências ligando os pensamentos desses quatro filósofos permite ressaltar os bastidores especulativos de uma "condição poética do pensamento" (Alain Badiou) delineando os contornos de uma meta-estética do sentido que é ao mesmo tempo uma ética. Essa fusão, bem sintetizada na fórmula de Michel Serres, que diz que "a moral é a física", é determinada pelas elaborações, as experimentações e as conquistas realizadas na filosofia derridiana da desconstrução, na filosofia deleuziana do conceito, na filosofia serresiana da física e na filosofia nancyana da arealidade. Os processos em jogo nesses sistemas tentam descobrir nos estratos aporéticos do pensamento as chances de induzir uma cosmologia paradoxal e inaudita.
Abstract:
I argue that Aristotle's study of poiêtikê technê should be understood as a study of a meta-philosophical vocabulary. I further argue that its main advantage is that it makes explicit a conceptual contiguity between a set of problems related to action theory, rationality, and collective cognition, and that it lends indirect intelligibility to the sharing of dispositions in human communities.
Abstract:
Clinical communication and professionalism are among the core medical competencies, and their assessment must therefore be ensured. In this context, the objective structured clinical examination (OSCE) plays a fundamental role. Objectives: To describe the steps in designing an OSCE, the assessment of the quality of its stations, and medical students' perception of taking it. Method: The study comprises the administration of a four-station OSCE to 16 medical students, an analysis of its psychometric quality, and the application of a satisfaction questionnaire. Results: For the students, the OSCE is the method that best assesses and teaches these competencies, whereas multiple-choice tests lie at the opposite pole with respect to assessment. Regarding the quality of the stations, two showed good reliability, one became satisfactory after adjustment, and one proved inconsistent. Conclusion: Even though the students rated them well, some stations showed flaws. Analysis of the OSCE is fundamental to its validity and measurability, especially for high-stakes OSCEs.
Abstract:
Visual data mining (VDM) tools employ information visualization techniques in order to represent large amounts of high-dimensional data graphically and to involve the user in exploring data at different levels of detail. The users look for outliers, patterns and models – in the form of clusters, classes, trends, and relationships – in different categories of data, e.g., financial or business information. The focus of this thesis is the evaluation of multidimensional visualization techniques, especially from the business user's perspective. We address three research problems. The first problem is the evaluation of projection-based visualizations with respect to their effectiveness in preserving the original distances between data points and the clustering structure of the data. In this respect, we propose the use of existing clustering validity measures. We illustrate their usefulness in evaluating five visualization techniques: Principal Components Analysis (PCA), Sammon's Mapping, the Self-Organizing Map (SOM), Radial Coordinate Visualization and Star Coordinates. The second problem concerns evaluating different visualization techniques as to their effectiveness in visual data mining of business data. For this purpose, we propose an inquiry evaluation technique and conduct an evaluation of nine visualization techniques: Multiple Line Graphs, Permutation Matrix, Survey Plot, Scatter Plot Matrix, Parallel Coordinates, Treemap, PCA, Sammon's Mapping and the SOM. The third problem is the evaluation of the quality of use of VDM tools. We provide a conceptual framework for evaluating the quality of use of VDM tools and apply it to the evaluation of the SOM. In the evaluation, we use an inquiry technique for which we developed a questionnaire based on the proposed framework. The contributions of the thesis consist of three new evaluation techniques and the results obtained by applying them. The thesis provides a systematic approach to the evaluation of various visualization techniques. First, we performed and described the evaluations in a systematic way, highlighting the evaluation activities and their inputs and outputs. Second, we integrated the evaluation studies into the broad framework of usability evaluation. The results of the evaluations are intended to help developers and researchers of visualization systems select appropriate visualization techniques in specific situations. They also contribute to the understanding of the strengths and limitations of the evaluated visualization techniques and, further, to the improvement of these techniques.
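As a hedged illustration of the first evaluation idea above (using a clustering validity measure to judge how well a projection preserves cluster structure), the sketch below compares silhouette scores computed in the original space and in a 2-D PCA projection. The dataset, the choice of silhouette as the validity measure, and the cluster count are illustrative assumptions, not the thesis's exact protocol.

```python
# A minimal sketch: judge a 2-D PCA projection by whether a clustering
# validity measure (here: silhouette score) gives similar values in the
# original space and in the projected space.  The measure, dataset and
# cluster count are illustrative assumptions, not the thesis's protocol.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

X = load_iris().data
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

X_2d = PCA(n_components=2).fit_transform(X)

# If the projection preserves the clustering structure, the two scores
# should be close; a large drop suggests the projection distorts it.
print("silhouette, original space :", silhouette_score(X, labels))
print("silhouette, PCA projection :", silhouette_score(X_2d, labels))
```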
Abstract:
Systems biology is a new, emerging and rapidly developing, multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology, which is to comprehend the function of complex biological systems. Systems biology combines various methods that originate from scientific disciplines such as molecular biology, chemistry, engineering sciences, mathematics, computer science and systems theory. Systems biology, unlike "traditional" biology, focuses on high-level concepts such as: network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization, concurrency, and many others. The very terminology of systems biology is "foreign" to "traditional" biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes, tightly linked to experimental practice. The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods, originating in the fields of computer science and mathematics, for the construction and analysis of computational models in systems biology. In particular, the research is set up in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton. The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We note, however, that although applied to certain case studies, the presented methods are not limited to them and can be utilized in the analysis of other biological mechanisms as well as complex systems in general. The full range of developed and applied modelling techniques, as well as model analysis methodologies, constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies and the discussions concerning their potentials and limitations point to the difficulties and challenges one encounters in the computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.
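As a toy illustration of the kind of ODE-based computational modelling the thesis builds on, the following sketch integrates a minimal two-species mass-action system with SciPy. The reaction network and rate constants are invented for illustration; this is not the thesis's heat shock response or filament assembly model.

```python
# Toy illustration of ODE-based modelling in systems biology: a minimal
# mass-action system  A -> B  (rate k1)  and  B -> A  (rate k2).
# The network and rate constants are invented for illustration only;
# they are not the thesis's heat shock or filament assembly models.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 0.5, 0.1  # hypothetical rate constants

def rhs(t, y):
    a, b = y
    return [-k1 * a + k2 * b, k1 * a - k2 * b]

sol = solve_ivp(rhs, t_span=(0.0, 20.0), y0=[1.0, 0.0],
                t_eval=np.linspace(0.0, 20.0, 5))
for t, a, b in zip(sol.t, *sol.y):
    print(f"t={t:5.1f}  A={a:.3f}  B={b:.3f}")
```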
Abstract:
The potential for enhancing the energy efficiency of industrial pumping processes is estimated to be, in some cases, up to 50%. One way to further define this potential is to implement techniques in accordance with the definition of best available techniques (BAT) in pumping applications. These techniques fall into three main categories: design, control method and maintenance, and distribution system. The theory part of this thesis first discusses the definition of best available techniques and its applicability to pumping processes. Next, the theory of pumping with different pump types is covered, with the main emphasis on centrifugal pumps. Other components needed in a pumping process are dealt with by presenting different control methods, the use of an electric motor, a variable speed drive, and the distribution system. The last part of the theory concerns industrial pumping processes in water distribution, sewage water, and power plant applications, some of which are used in the empirical part as example cases. For the empirical part of this study, four case studies on typical pumping processes from older Master's theses were selected. First, the original results were analyzed by studying the distribution of energy consumption between different system components, and possible ways to improve energy efficiency were evaluated using the definition of BAT in pumping. The goal of this study was that the results would make it possible to identify the characteristic energy consumption of these and similar pumping processes. This data would then make it easier to focus energy efficiency actions where they might be most applicable, both technically and economically.
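To make the energy-saving argument behind variable speed drives concrete, here is a small sketch based on the standard pump affinity laws (flow scales with speed, head with speed squared, shaft power with speed cubed). The example duty point and speeds are arbitrary, and the laws assume an unchanged impeller and negligible static head.

```python
# Pump affinity laws: for the same impeller, flow scales with speed,
# head with speed squared and shaft power with speed cubed.  This is
# why trimming speed with a variable speed drive (instead of throttling)
# can cut pumping energy so sharply.  Example figures are made up.
def affinity(flow, head, power, speed_ratio):
    """Rescale a duty point when speed changes by `speed_ratio` (n2/n1)."""
    return (flow * speed_ratio,
            head * speed_ratio ** 2,
            power * speed_ratio ** 3)

q1, h1, p1 = 100.0, 40.0, 15.0   # m3/h, m, kW at full speed (made up)
for ratio in (1.0, 0.9, 0.8, 0.7):
    q, h, p = affinity(q1, h1, p1, ratio)
    print(f"speed {ratio:.0%}: flow {q:6.1f} m3/h, head {h:5.1f} m, "
          f"power {p:5.2f} kW ({p / p1:.0%} of full-speed power)")
```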
Abstract:
This study evaluated establishment methods for a mixture of herbaceous forage legumes [Centrosema acutifolium, Clitoria ternatea, Pueraria phaseoloides, Stylosanthes Campo Grande (Stylosanthes capitata + S. macrocephala), Calopogonium mucunoides, Lablab purpureus, Arachis pintoi, and Aeschynomene villosa] under the shade of a Eucalyptus grandis plantation subjected to thinning (40%) 8 years after planting in Anhembi, São Paulo (22°40'S, 48°10'W, altitude of 455 m). The experiment started in December 2008 and compared four types of seed incorporation by light disc harrowing: (1) broadcast sowing without seed incorporation; (2) disc harrowing before planting; (3) disc harrowing after planting; and (4) disc harrowing before and after planting. Ninety days after planting, the number of legume plants/m2 and the percentage of ground cover by the plants varied between the treatments tested; however, the treatments had no effect on the dry matter accumulation of the forage legumes. Disc harrowing before planting yielded superior results compared to the treatments without disc harrowing and with disc harrowing after planting. At the end of the experimental period, the plots contained Arachis, Centrosema, Stylosanthes, and Pueraria. The dry matter accumulated by Centrosema corresponded to 73% of the total dry matter yield of the plots. The participation of Arachis, Centrosema and Stylosanthes in the final dry matter composition of the plots varied according to establishment method. The advantages of using species mixtures rather than monocultures in the understory of forest plantations are discussed.
Abstract:
This book is dedicated to celebrating the 60th birthday of Professor Rainer Huopalahti. Professor Rainer "Repe" Huopalahti has had, and in fact is still enjoying, a distinguished career in the analysis of food and food-related flavor compounds. One will find it hard to make any progress in this particular field without a valid and innovative sample handling technique, and this is a field in which Professor Huopalahti has made great contributions. The title and the front cover of this book honor Professor Huopalahti's early steps in science. His PhD thesis, published in 1985, is entitled "Composition and content of aroma compounds in the dill herb, Anethum graveolens L., affected by different factors". At the time, the thesis introduced new technology applied to the sample handling and analysis of the flavoring compounds of dill. Sample handling is an essential task in just about every analysis. If one is working with minor compounds in a sample or trying to detect trace levels of the analytes, one of the aims of sample handling may be to increase the sensitivity of the analytical method. On the other hand, if one is working with a challenging matrix such as the kind found in biological samples, one of the aims is to increase the selectivity. However, quite often the aim is to increase both the selectivity and the sensitivity. This book provides good and representative examples of the necessity of valid sample handling and of the role of sample handling in the analytical method. The contributors to the book are leading Finnish scientists in the field of organic instrumental analytical chemistry. Some of them are also Repe's personal friends and former students from the University of Turku, Department of Biochemistry and Food Chemistry. Importantly, the authors all know Repe in one way or another and are well aware of his achievements in the field of analytical chemistry. The editorial team had a great time during the planning phase and during the "hard work editorial phase" of the book. For example, we came up with many ideas on how to publish the book. After many long discussions, we decided to have a limited edition as an "old school hard cover book" and to acknowledge more modern ways of disseminating knowledge by publishing an internet version of the book on the webpages of the University of Turku. Downloading the book from the webpage for personal use is free of charge. We believe and hope that the book will be read with great interest by scientists working in the fascinating field of organic instrumental analytical chemistry. We decided to publish our book in English for two main reasons. First, we believe that in the near future more and more teaching in Finnish universities will be delivered in English; to facilitate this process and encourage students to develop good language skills, we decided to publish the book in English. Second, we believe that the book will also interest scientists outside Finland, particularly in the other member states of the European Union. The editorial team thanks all the authors for their willingness to contribute to this book and to adhere to the very strict schedule. We also want to thank the various individuals and enterprises who financially supported the book project. Without that support, it would not have been possible to publish the hardcover book.
Abstract:
The aim of this study was to group, by similarity, the temporal profiles of 10-day composite NDVI products obtained by the SPOT Vegetation sensor for municipalities with high soybean production in the state of Paraná, Brazil, in the 2005/2006 cropping season. Data mining is a valuable tool for extracting knowledge from a database by identifying valid, new, potentially useful and understandable patterns. Therefore, clusters were generated using the K-Means, MAXVER and DBSCAN algorithms implemented in the WEKA software package. Clusters were created based on the average NDVI temporal profiles of the 277 municipalities with high soybean production in the state. The best results were obtained with the K-Means algorithm, which grouped the municipalities into six clusters over the period from the beginning of October until the end of March, equivalent to the crop's vegetative cycle. Half of the generated clusters presented the spectro-temporal pattern characteristic of soybean and lay mostly within the soybean belt of Paraná, showing that the proposed methodology yields good results for identifying homogeneous areas. These results will be useful for creating regional soybean "masks" to estimate the planted area of this crop.
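As a hedged sketch of the clustering step described above, the code below groups synthetic NDVI temporal profiles into six clusters with K-Means. The abstract's analysis was done in WEKA on real SPOT Vegetation composites, so the scikit-learn call, the synthetic bell-shaped profiles, and the 18-composite season length are stand-in assumptions for illustration.

```python
# Sketch of the clustering step: group NDVI temporal profiles into six
# clusters with K-Means.  The original work used WEKA on real SPOT
# Vegetation 10-day composites; here the profiles are synthetic and
# scikit-learn stands in for WEKA, purely for illustration.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_municipalities, n_composites = 277, 18   # ~Oct-Mar at 10-day steps

# Synthetic bell-shaped NDVI curves mimicking a crop vegetative cycle,
# with per-municipality shifts in timing and amplitude.
t = np.linspace(0.0, 1.0, n_composites)
peak = rng.uniform(0.4, 0.6, size=(n_municipalities, 1))
amp = rng.uniform(0.5, 0.9, size=(n_municipalities, 1))
profiles = 0.2 + amp * np.exp(-((t - peak) ** 2) / 0.02)
profiles += rng.normal(0.0, 0.02, profiles.shape)  # sensor-like noise

kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(profiles)
for c in range(6):
    print(f"cluster {c}: {np.sum(kmeans.labels_ == c)} municipalities")
```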
Abstract:
The use of intensity-modulated radiotherapy (IMRT) has increased extensively in modern radiotherapy (RT) treatments over the past two decades. Radiation dose distributions can be delivered with higher conformality with IMRT than with conventional 3D conformal radiotherapy (3D-CRT). Higher conformality and target coverage increase the probability of tumour control and decrease normal tissue complications. The primary goal of this work is to improve and evaluate the accuracy, efficiency and delivery techniques of RT treatments using IMRT. This study evaluated the dosimetric limitations and possibilities of IMRT in small volumes (head-and-neck, prostate and lung cancer treatments) and large volumes (primitive neuroectodermal tumours). The dose coverage of target volumes and the sparing of critical organs were improved with IMRT compared to 3D-CRT. The developed split-field IMRT technique was found to be a safe and accurate method for craniospinal irradiation. By using IMRT for simultaneous integrated boosting of biologically defined target volumes in localized prostate cancer, high doses were achievable with only a small increase in treatment complexity. Biological plan optimization increased the probability of uncomplicated control on average by 28% compared to standard IMRT delivery. However, IMRT also carries some drawbacks. In IMRT, beam modulation is realized by splitting a large radiation field into small apertures. The smaller the beam apertures, the larger the rebuild-up and rebuild-down effects at tissue interfaces. The limitations of using IMRT with small apertures in the treatment of small lung tumours were investigated with dosimetric film measurements. The results confirmed that the peripheral doses of small lung tumours decreased as the effective field size decreased. The calculation algorithms studied were not able to model the dose deficiency of the tumours accurately. The use of small sliding-window apertures of 2 mm and 4 mm decreased the tumour peripheral dose by 6% compared to the 3D-CRT treatment plan. A direct aperture-based optimization (DABO) technique was examined as a way to decrease treatment complexity. The DABO IMRT technique achieved treatment plans equivalent to those of conventional fluence-based IMRT optimization in concave head-and-neck target volumes. With DABO, the effective field sizes were increased and the number of MUs was reduced by a factor of two. The optimality of a treatment plan and the therapeutic ratio can be further enhanced by dose painting based on regional radiosensitivities imaged with functional imaging methods.
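For readers unfamiliar with the "probability of uncomplicated control" mentioned above, one common textbook formulation, assuming tumour control and normal tissue complications are statistically independent, is P+ = TCP × (1 − NTCP). The sketch below simply evaluates that expression with made-up probabilities; it is not the biological plan-optimization model used in the thesis.

```python
# Probability of uncomplicated tumour control, in one common textbook
# form that assumes tumour control and complications are independent:
#   P+ = TCP * (1 - NTCP)
# The probabilities below are made up for illustration; this is not the
# biological plan-optimization model used in the thesis.
def p_plus(tcp: float, ntcp: float) -> float:
    return tcp * (1.0 - ntcp)

for tcp, ntcp in [(0.70, 0.10), (0.80, 0.15), (0.85, 0.25)]:
    print(f"TCP={tcp:.2f}, NTCP={ntcp:.2f} -> P+={p_plus(tcp, ntcp):.3f}")
```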
Abstract:
This thesis belongs to the field of string algorithmics. A string S is a common subsequence of the strings X[1..m] and Y[1..n] if it can be formed by deleting 0..m characters from X and 0..n characters from Y at arbitrary positions. If no common subsequence of X and Y is longer than S, then S is a longest common subsequence (LCS) of X and Y. This work concentrates on computing the LCS of two strings, but the problem also generalizes to more than two strings. The LCS problem has applications not only in computer science but also in bioinformatics; the best known include text and image compression, file version control, pattern recognition, and comparative structural analysis of DNA and protein chains. Solving the problem is made difficult by the dependence of the algorithms on several parameters of the input strings: besides the input lengths, these include the size of the input alphabet, the character distribution of the inputs, the ratio of the LCS length to the length of the shorter input string, and the number of matching character pairs. It is therefore hard to develop an algorithm that performs efficiently on all instances of the problem. On the one hand, the thesis is meant to serve as a handbook that, after describing the basic concepts of the problem, surveys previously developed exact LCS algorithms. Their treatment is grouped by processing model into algorithms that proceed one row, one contour, or one diagonal at a time, and those that process in several directions. Besides the exact methods, heuristic methods that compute an upper or lower bound for the LCS length are presented; their results can be used either as such or to steer the execution of an exact algorithm. This part is based on articles published by our research group, which are the first to treat exact methods enhanced with heuristics. On the other hand, the work contains a fairly extensive empirical part whose goal has been to improve the running time and memory usage of existing exact algorithms. This goal has been pursued at the implementation level by introducing data structures that support the algorithms' processing models well, and by curbing fruitless computation through an improved ability to observe and exploit the intermediate results obtained during execution. As a general conclusion, heuristic preprocessing of exact LCS algorithms almost systematically reduces their running time and, in particular, their memory requirements. Moreover, the data structure used by an algorithm has a decisive effect on the efficiency of the computation: the more local the search and update operations are, the more efficient the computation performed by the algorithm.
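As a concrete baseline for the problem the thesis studies, here is the classic O(mn) dynamic-programming solution for the LCS of two strings. This is the textbook algorithm, not one of the thesis's specialized exact or heuristic variants.

```python
# Textbook O(mn) dynamic programme for the longest common subsequence
# (LCS) of X[1..m] and Y[1..n]; this is the baseline that specialized
# exact and heuristic LCS algorithms improve upon.
def lcs(x: str, y: str) -> str:
    m, n = len(x), len(y)
    # d[i][j] = LCS length of x[:i] and y[:j]
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                d[i][j] = d[i - 1][j - 1] + 1
            else:
                d[i][j] = max(d[i - 1][j], d[i][j - 1])
    # Backtrack to recover one LCS (there may be several).
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1]); i -= 1; j -= 1
        elif d[i - 1][j] >= d[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs("HUMAN", "CHIMPANZEE"))  # prints "HMAN"
```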
Abstract:
OBJECTIVE: To analyze and compare the various surgical procedures described for the treatment of pilonidal disease. METHOD: Thirty-four studies published in indexed journals were selected, totaling 8698 operated patients. A meta-analysis was performed to compare the seven main surgical techniques described in the literature with respect to recurrence and postoperative healing time. RESULTS: Of all patients studied, recurrence occurred in 230 patients (2.6%). Postoperative healing time was significantly longer in the excision-without-suture group. Recurrence rates were statistically similar for excision without suture, marsupialization, incision and curettage, excision and flap, and the Karydakis technique. The methods with the highest recurrence rates (statistically significant, p<0.001) were excision with primary suture and the Bascom method. CONCLUSIONS: This study concludes that recurrence results are statistically similar across all methods, except for excision with primary suture and the Bascom technique, which showed more frequent recurrence. Healing time was longer in patients operated on with the excision-without-primary-suture technique.
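As a hedged sketch of the kind of comparison behind the recurrence figures, the code below runs a chi-square test on a 2×2 table of recurrences versus non-recurrences for two techniques. The counts are placeholders, not the paper's data.

```python
# Sketch of the kind of recurrence comparison behind a meta-analysis:
# chi-square test on a 2x2 table (recurrence vs. no recurrence) for two
# surgical techniques.  The counts below are placeholders, NOT the
# study's data (the paper pools 8698 patients across 34 studies).
from scipy.stats import chi2_contingency

#                 recurrence  no recurrence
table = [[40, 960],    # hypothetical technique A
         [10, 990]]    # hypothetical technique B

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4g}")
```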
Abstract:
OBJECTIVE: To assess whether Brazil's "Lei Seca" (dry law) met its goal three years after enactment. METHODS: A retrospective study of patients with craniofacial fractures who underwent surgical treatment at a university hospital in two periods: before (2005 to 2008) and after (2008 to 2011) the law's implementation. RESULTS: 265 patients (220 men and 45 women) were operated on in this period, 149 (56%) before the law and 116 (44%) after it, indicating a reduction in the number of injuries (p=0.04). The 19-to-40 age group predominated in both periods. The main causes of trauma were automobile accidents, physical assaults, and falls. Alcohol abuse was identified in 15.4% of patients before the law and 19% after it. The mandible and the zygomaticomaxillary complex were the most affected bones. CONCLUSION: The reduction in the number of polytrauma patients operated on fell short of what was expected and desired.
Abstract:
Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need to manage this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. They are used to define processes in a way that allows easy management of processes, for example process dissemination, process tailoring and process enactment. Process modeling languages are usually used as tools for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles representing the dissertation research done on process modeling over a period of approximately five years. The research follows the classical engineering research discipline, in which the current situation is analyzed, a potentially better solution is developed and, finally, its implications are analyzed. The research applies a variety of research techniques, ranging from literature surveys to qualitative studies conducted among software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, such as lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique that software development teams can use to quickly analyze their work practices in a more objective manner. It also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study conducted among Finnish software practitioners confirms the conclusions of the other studies in the dissertation: although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work. This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies in which the modeling techniques are used, e.g., to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.