Abstract:
The core aim of machine learning is to make a computer program learn from experience. Learning from data is usually defined as the task of learning regularities or patterns in data in order to extract useful information, or to learn the underlying concept. An important sub-field of machine learning is multi-view learning, where the task is to learn from multiple data sets or views describing the same underlying concept. A typical example of such a scenario would be to study a biological concept using several biological measurements, such as gene expression, protein expression and metabolic profiles, or to classify web pages based on their content and the contents of their hyperlinks. In this thesis, novel problem formulations and methods for multi-view learning are presented. The contributions include a linear data fusion approach for exploratory data analysis, a new measure for evaluating different kinds of representations of textual data, and an extension of multi-view learning to novel scenarios where the correspondence of samples between the different views or data sets is not known in advance. In order to infer the one-to-one correspondence of samples between two views, the novel concept of multi-view matching is proposed. The matching algorithm is completely data-driven and is demonstrated in several applications, such as matching metabolites between humans and mice, and matching sentences between documents in two languages.
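The abstract does not describe the matching procedure itself; as a rough, hypothetical illustration of data-driven one-to-one matching between two views, the sketch below pairs samples by minimum-cost assignment on pairwise distances, assuming the two views have already been mapped into a comparable feature space (all names and parameters are illustrative, not the thesis algorithm).

```python
# Illustrative sketch only: one-to-one matching of samples between two views
# by minimum-cost assignment on pairwise distances in a shared feature space.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.optimize import linear_sum_assignment

def match_views(X, Y):
    """Return index pairs (i, j) matching rows of X to rows of Y."""
    cost = cdist(X, Y, metric="euclidean")    # pairwise distance matrix
    rows, cols = linear_sum_assignment(cost)  # Hungarian-algorithm assignment
    return list(zip(rows, cols))

# Toy example: Y is a shuffled, noisy copy of X; the matching should
# largely recover the hidden permutation.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
perm = rng.permutation(20)
Y = X[perm] + 0.01 * rng.normal(size=(20, 5))
pairs = match_views(X, Y)
recovered = sum(1 for i, j in pairs if perm[j] == i)
print(f"correctly matched {recovered}/20 samples")
```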
Abstract:
Nucleation is the first step of a phase transition, in which small nuclei of the new phase start appearing in the metastable old phase, such as the appearance of small liquid clusters in a supersaturated vapor. Nucleation is important in various industrial and natural processes, including atmospheric new particle formation: between 20% and 80% of the atmospheric particle concentration is due to nucleation. These atmospheric aerosol particles have a significant effect on both climate and human health. Simulation methods are often applied when studying phenomena that are difficult or even impossible to measure, or when trying to distinguish between the merits of various theoretical approaches. Such simulation methods include, among others, molecular dynamics and Monte Carlo simulations. In this work, molecular dynamics simulations of the homogeneous nucleation of Lennard-Jones argon were performed; homogeneous means that the nucleation does not occur on a pre-existing surface. The simulations include runs in which the starting configuration is a supersaturated vapor and the nucleation event is observed during the simulation (direct simulations), as well as simulations of a cluster in equilibrium with a surrounding vapor (indirect simulations). The latter type is a necessity when the conditions prevent a nucleation event from occurring within a reasonable timeframe in the direct simulations. The effect of various temperature control schemes on the nucleation rate (the rate of appearance of clusters that are equally likely to grow to macroscopic size or to evaporate) was studied and found to be relatively small. The method used to extract the nucleation rate was also found to be of minor importance. The cluster sizes from the direct and indirect simulations were used in conjunction with the nucleation theorem to calculate formation free energies for the clusters in the indirect simulations. The results agreed with density functional theory but were higher than values from Monte Carlo simulations. The formation energies were also used to calculate the surface tension of the clusters. Comparing the sizes of the clusters in the direct and indirect simulations showed that the direct-simulation clusters have more atoms between the liquid-like core of the cluster and the surrounding vapor. Finally, the performance of various nucleation theories in predicting the simulated nucleation rates was investigated, and the results, among other things, once again highlighted the inadequacy of the classical nucleation theory commonly employed in nucleation studies.
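For context only (these are textbook relations, not results from the thesis), the classical nucleation theory formation free energy and the first nucleation theorem referred to above can be written as follows.

```latex
% Textbook CNT expressions (background only, not results from the thesis).
% Formation free energy of an n-molecule spherical cluster at supersaturation S,
% with planar surface tension \sigma and liquid molecular volume v_l:
\[
  \Delta G(n) \;=\; -\,n\,k_{\mathrm{B}}T\,\ln S \;+\; \sigma\,(36\pi)^{1/3}\,v_{l}^{2/3}\,n^{2/3}.
\]
% First nucleation theorem, relating the supersaturation dependence of the
% nucleation rate J to the excess number of molecules in the critical cluster:
\[
  \left(\frac{\partial \ln J}{\partial \ln S}\right)_{T} \;\approx\; \Delta n^{*} + 1.
\]
```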
Abstract:
The first-line medication for mild to moderate Alzheimer's disease (AD) is based on cholinesterase inhibitors, which prolong the effect of the neurotransmitter acetylcholine in cholinergic nerve synapses and thereby relieve the symptoms of the disease. Indications that cholinesterases are involved in disease-modifying processes have increased interest in this research area. Drug discovery and development is a long and expensive process that takes on average 13.5 years and costs approximately 0.9 billion US dollars. Drug attrition in the clinical phases is common for several reasons, e.g. poor bioavailability of compounds leading to low efficacy, or toxic effects. Thus, improvements in the early drug discovery process are needed to create highly potent, non-toxic compounds with predicted drug-like properties. Nature has been a good source for the discovery of new medicines, accounting for around half of the new drugs approved for the market during the last three decades. These compounds are direct isolates from nature, their synthetic derivatives, or natural mimics. Synthetic chemistry is an alternative way to produce compounds for drug discovery purposes. Both sources have pros and cons. The screening of new bioactive compounds in vitro is based on assaying compound libraries against targets. The assay set-up has to be adapted and validated for each screen to produce high-quality data. Depending on the size of the library, miniaturization and automation are often required to reduce solvent and compound consumption and to speed up the process. In this contribution, natural extract, natural pure compound and synthetic compound libraries were assessed as sources of new bioactive compounds. The libraries were screened primarily for acetylcholinesterase inhibitory effects and secondarily for butyrylcholinesterase inhibitory effects. To enable screening of the libraries, two assays were evaluated as screening tools and adapted to be compatible with the special features of each library. The assays were validated to ensure high-quality data. Cholinesterase inhibitors with various potencies and selectivities were found in both the natural product and the synthetic compound libraries, which indicates that the two sources complement each other. It is acknowledged that natural compounds differ structurally from the compounds in synthetic compound libraries, which further supports the view that they are complementary, especially if high structural diversity is a criterion for selecting compounds for a library.
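The abstract does not state which validation metrics or readout calculations were used; as a hedged illustration of common choices in inhibitor screening campaigns, the sketch below computes a plate Z'-factor (Zhang et al. 1999) and the percent inhibition of a hypothetical compound well. All numbers and layouts are placeholders.

```python
# Illustrative sketch of two standard screening-assay calculations
# (hypothetical example, not the metrics reported in the thesis).
import numpy as np

def percent_inhibition(signal, pos_ctrl_mean, neg_ctrl_mean):
    """Percent inhibition relative to uninhibited (neg) and fully inhibited (pos) controls."""
    return 100.0 * (neg_ctrl_mean - signal) / (neg_ctrl_mean - pos_ctrl_mean)

def z_prime(pos_ctrl, neg_ctrl):
    """Z'-factor; values above 0.5 are usually taken to indicate a robust assay."""
    pos, neg = np.asarray(pos_ctrl), np.asarray(neg_ctrl)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Hypothetical plate controls (arbitrary signal units)
neg = [1.02, 0.98, 1.05, 1.01]   # enzyme activity without inhibitor
pos = [0.11, 0.09, 0.12, 0.10]   # fully inhibited / background
print("Z' =", round(z_prime(pos, neg), 2))
print("inhibition % =", round(percent_inhibition(0.55, np.mean(pos), np.mean(neg)), 1))
```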
Abstract:
FTIR spectroscopy (Fourier transform infrared spectroscopy) is a rapid analytical method. In Fourier instruments, the use of an interferometer allows the entire infrared frequency range to be measured in a few seconds. Using an FTIR spectrometer equipped with an ATR accessory requires hardly any sample preparation, which also makes the method easy to use. The ATR accessory also enables the analysis of many different kinds of samples: an infrared spectrum can be measured even from samples for which traditional sample preparation methods cannot be used. The information obtained by FTIR spectroscopy is often combined with multivariate statistical analyses. Cluster analysis can be used to group the spectral data based on similarity; in hierarchical cluster analysis, the similarity between objects is determined by calculating the distance between them. Principal component analysis reduces the dimensionality of the data and creates new, uncorrelated principal components, which should retain as much of the variation in the original data as possible. The application possibilities of FTIR spectroscopy and multivariate methods have been studied extensively. In the food industry, its suitability for quality control, for example, has been investigated, and the method has also been used to identify the chemical compositions of essential oils and to detect chemotypes of oil plants. In this study, the use of the method for classifying extract samples of suoputki (marsh hog's fennel, Peucedanum palustre) was evaluated. FTIR spectra of extract samples from different plant parts were compared with FTIR spectra measured from selected pure compounds. The characteristic absorption bands of the pure compounds were identified from their FTIR spectra, and the wavenumber regions of the intense bands in the furanocoumarin spectra were selected for the multivariate analyses. Multivariate analyses were also performed on the fingerprint region of the IR spectrum, the wavenumber range 1785–725 cm⁻¹. The aim was to classify the extract samples according to their collection site and coumarin content. Grouping by collection site was observed, and in the analyses based on the selected band wavenumber regions this was mainly explained by the coumarin contents: in these analyses the extract samples mostly clustered and separated according to their total coumarin contents. Grouping by collection site was also observed in the analyses of the 1785–725 cm⁻¹ range, but this was not explained by the coumarin contents; these groupings were possibly influenced by similar contents of other compounds in the samples. Other wavenumber regions were also used in the analyses, but the results differed little from the earlier ones. Multivariate analyses of second-derivative spectra from the fingerprint region did not noticeably change the results either. In further studies, the method used here could be developed further, for example by examining narrower, carefully selected wavenumber regions of the second-derivative spectra in the multivariate analyses.
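As a purely illustrative sketch of the kind of workflow described above (not the software, parameters or data used in the study), the following combines SNV normalisation, PCA and hierarchical (Ward) clustering on the 1785–725 cm⁻¹ fingerprint window of a hypothetical spectral matrix.

```python
# Illustrative sketch: hierarchical clustering of ATR-FTIR spectra restricted
# to the fingerprint region 1785-725 cm-1 (placeholder data throughout).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
wavenumbers = np.linspace(4000, 650, 1738)            # hypothetical axis, cm-1
spectra = rng.random((12, wavenumbers.size))          # placeholder spectra (samples x points)

# Select the fingerprint window and normalise each spectrum (SNV).
mask = (wavenumbers <= 1785) & (wavenumbers >= 725)
X = spectra[:, mask]
X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

# PCA to reduce dimensionality, then hierarchical (Ward) clustering on the scores.
scores = PCA(n_components=3).fit_transform(X)
Z = linkage(scores, method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")
print("cluster labels:", labels)
```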
Abstract:
Modern smart phones often come with a significant amount of computational power and an integrated digital camera, making them an ideal platform for intelligent assistants. This work is restricted to retail environments, where users could be provided with, for example, navigation instructions to desired products or information about special offers in their close proximity. Such applications usually require information about the user's current location in the domain environment, which in our case is a retail store. We propose a vision-based positioning approach that recognizes the products the user's mobile phone camera is currently pointing at. The products are associated with locations within the store, which enables us to locate the user when the phone camera is pointed at a group of products. The first step of our method is to extract meaningful features from digital images. We use the Scale-Invariant Feature Transform (SIFT) algorithm, which extracts features that are highly distinctive in the sense that they can be correctly matched against a large database of features from many images. We collect a comprehensive set of images from all meaningful locations within our domain and extract the SIFT features from each of these images. As the SIFT features are high-dimensional and comparing individual features is therefore infeasible, we apply the Bags of Keypoints method, which creates a generic representation, a visual category, from all the features extracted from images taken at a specific location. The category of an unseen image can be deduced by extracting its SIFT features and choosing the category that best fits them. We have applied the proposed method in a Finnish supermarket. We treat grocery shelves as categories, which is a sufficient level of accuracy to help users navigate or to provide useful information about nearby products. We achieve 40% accuracy, which is quite low for commercial applications but significantly outperforms the random-guess baseline. Our results suggest that the classification accuracy could be increased with a deeper analysis of the domain and by combining existing positioning methods with ours.
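As a rough sketch of the bag-of-keypoints pipeline summarised above (the vocabulary size, classifier choice and function names are illustrative assumptions, not the thesis configuration), the following outlines SIFT extraction, visual-vocabulary construction and shelf classification. It requires opencv-python (4.4 or later) and scikit-learn.

```python
# Illustrative bag-of-keypoints positioning sketch (assumed parameters).
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

def sift_descriptors(image_path):
    """Extract 128-D SIFT descriptors from one image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, desc = cv2.SIFT_create().detectAndCompute(img, None)
    return desc if desc is not None else np.empty((0, 128), np.float32)

def bow_histogram(desc, vocabulary):
    """Quantise descriptors against the visual vocabulary and return a
    normalised word-count histogram (the image's bag of keypoints)."""
    words = vocabulary.predict(desc)
    hist = np.bincount(words, minlength=vocabulary.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

def train_locator(image_paths, shelf_labels, vocab_size=500):
    """Build the visual vocabulary and train a shelf classifier."""
    all_desc = np.vstack([sift_descriptors(p) for p in image_paths])
    vocabulary = KMeans(n_clusters=vocab_size, random_state=0).fit(all_desc)
    X = np.array([bow_histogram(sift_descriptors(p), vocabulary) for p in image_paths])
    classifier = LinearSVC().fit(X, shelf_labels)
    return vocabulary, classifier

def locate(image_path, vocabulary, classifier):
    """Predict which shelf (location) a query photo was most likely taken at."""
    return classifier.predict([bow_histogram(sift_descriptors(image_path), vocabulary)])[0]
```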
Abstract:
Metabolomics is a rapidly growing research field that studies the response of biological systems to environmental factors, disease states and genetic modifications. It aims to measure the complete set of endogenous metabolites, i.e. the metabolome, in a biological sample such as plasma or cells. Because metabolites are the intermediates and end products of biochemical reactions, metabolite compositions and metabolite levels in biological samples can provide a wealth of information on ongoing processes in a living system. Due to the complexity of the metabolome, metabolomic analysis poses a challenge to analytical chemistry. Adequate sample preparation is critical for accurate and reproducible analysis, and the analytical techniques must have high resolution and sensitivity to allow detection of as many metabolites as possible. Furthermore, as the information contained in the metabolome is immense, the data sets collected in metabolomic studies are very large. In order to extract the relevant information from such large data sets, efficient data processing and multivariate data analysis methods are needed. In the research presented in this thesis, metabolomics was used to study the mechanisms of polymeric gene delivery to retinal pigment epithelial (RPE) cells. The aim of the study was to detect differences in metabolomic fingerprints between transfected cells and non-transfected controls, and thereafter to identify the metabolites responsible for the discrimination. The plasmid pCMV-β was introduced into RPE cells using the vector polyethyleneimine (PEI). The samples were analyzed using high performance liquid chromatography (HPLC) and ultra performance liquid chromatography (UPLC) coupled to a triple quadrupole (QqQ) mass spectrometer (MS). The software MZmine was used for raw data processing, and principal component analysis (PCA) was used for statistical data analysis. The results revealed differences in metabolomic fingerprints between transfected cells and non-transfected controls. However, reliable fingerprinting data could not be obtained because of the low repeatability of the analyses, and therefore no attempts were made to identify the metabolites responsible for the discrimination between sample groups. The repeatability and accuracy of the analyses can be influenced by protocol optimization; in this study, however, optimization of the analytical methods was hindered by the very small number of samples available for analysis. In conclusion, this study demonstrates that obtaining reliable fingerprinting data is technically demanding, and the protocols need to be thoroughly optimized in order to approach the goal of gaining information on the mechanisms of gene delivery.
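As a minimal, hypothetical illustration of the fingerprinting-style data analysis step (the normalisation and scaling choices are assumptions, not the protocol used in the study, and the data here are synthetic placeholders standing in for an MZmine peak-area export):

```python
# Minimal sketch of fingerprint-style analysis on a metabolomic peak table
# (synthetic placeholder data; choices of normalisation and scaling assumed).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
peak_areas = rng.lognormal(mean=8.0, sigma=1.0, size=(12, 300))  # samples x features
groups = np.array(["transfected"] * 6 + ["control"] * 6)

X = peak_areas / peak_areas.sum(axis=1, keepdims=True)  # total-signal normalisation
X = np.log1p(X)                                         # variance-stabilising transform
X = (X - X.mean(axis=0)) / X.std(axis=0)                # autoscaling per feature

scores = PCA(n_components=2).fit_transform(X)           # fingerprint overview
for label in np.unique(groups):
    sel = groups == label
    print(label, "mean PC1 score:", round(float(scores[sel, 0].mean()), 3))
```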
Abstract:
Human activities extract and displace different substances and materials from the earth's crust, thus causing various environmental problems, such as climate change, acidification and eutrophication. As these problems have become more complicated, more holistic measures that consider the origins and sources of pollutants have been called for. Industrial ecology is a field of science that forms a comprehensive framework for studying the interactions between modern technological society and the environment. Industrial ecology considers humans and their technologies to be part of the natural environment, not separate from it. Industrial operations form natural systems that must also function as such within the constraints set by the biosphere. Industrial symbiosis (IS) is a central concept of industrial ecology. Industrial symbiosis studies examine the physical flows of materials and energy in local industrial systems. In an ideal IS, waste material and energy are exchanged by the actors of the system, thereby reducing the consumption of virgin material and energy inputs and the generation of waste and emissions. Companies are seen as part of chains of suppliers and consumers that resemble those of natural ecosystems. The aim of this study was to analyse the environmental performance of an industrial symbiosis based on pulp and paper production, taking life cycle impacts into account as well. Life Cycle Assessment (LCA) is a tool for quantitatively and systematically evaluating the environmental aspects of a product, technology or service throughout its whole life cycle. In addition, the Natural Step Sustainability Principles formed a conceptual framework for assessing the environmental performance of the case study symbiosis (Paper I). The environmental performance of the case study symbiosis was compared to four counterfactual reference scenarios in which the actors of the symbiosis operated on their own. The research methods used were process-based life cycle assessment (LCA) (Papers II and III) and hybrid LCA, which combines process and input-output LCA (Paper IV). The results showed that the environmental impacts caused by the extraction and processing of the materials and energy used by the symbiosis were considerable. If only the direct emissions and resource use of the symbiosis had been considered, less than half of the total environmental impacts of the system would have been taken into account. When the results were compared with the counterfactual reference scenarios, the net environmental impacts of the symbiosis were smaller than those of the reference scenarios. The reduction in environmental impacts was mainly due to changes in the way energy was produced. However, the results are sensitive to how the reference scenarios are defined. LCA is a useful tool for assessing the overall environmental performance of industrial symbioses. It is recommended that, in addition to the direct effects, the upstream impacts be taken into account when assessing the environmental performance of industrial symbioses. Industrial symbiosis should be seen as part of the process of improving the environmental performance of a system. In some cases it may be more efficient, from an environmental point of view, to focus on supply chain management instead.
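For background only (this standard environmentally extended input-output formulation is not taken from the thesis papers), the input-output part of hybrid LCA rests on the Leontief inverse:

```latex
% Standard environmentally extended input-output relations (background notation):
% A is the inter-industry coefficient matrix, y the final demand attributed to
% the studied system, and B the matrix of emission or resource-use intensities
% per unit of sectoral output.
\[
  x \;=\; (I - A)^{-1} y, \qquad m \;=\; B\,x \;=\; B\,(I - A)^{-1} y,
\]
% where x is the total sectoral output required to satisfy y, and m the
% resulting vector of life-cycle emissions and resource uses.
```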
Abstract:
The literature review of the thesis examined the baking properties of oats, enzyme-active baking, and the use of rye malt in low-gluten baking. The experimental part investigated the effect of an extract prepared from rye malt sourdough on the viscosity of oat dough and on the properties of oat bread. The aim of the work was to develop a rye-flavoured oat bread with good taste and texture. The enzymes of the rye malt were allowed to break down the rye prolamins harmful to coeliac patients during the sourdough process, and the extract was then separated from the sourdough by centrifugation. Both an enzyme-active extract and a heat-inactivated extract were used in the baking trials. The extract was used to replace 15, 25 and 30% of the dough water (of dough weight). Baking was carried out at miniature scale as pan baking, using 20 g dough pieces. The viscosity of the doughs was measured in order to follow the hydrolysis of beta-glucan, and the rye flavour was assessed by a trained panel. Adding the acidic extract lowered the dough pH from about 5.8 to about 4.4. Addition of the enzyme-active extract decreased the dough viscosity, whereas the inactivated extract increased it. The bread crumb became denser, and the measured crumb hardness increased with increasing extract addition. The extract addition improved the taste and aroma of the breads, and with the extract the pores of the breads were smaller and more evenly distributed in the bread matrix. When the extract was used in inactivated form, the crumbliness of the breads increased. With the technology developed in this study, it was possible to produce a good-quality, rye-flavoured oat bread even without inactivating the enzymes of the extract by boiling. This was apparently due to the low pH of the dough, which inhibits alpha-amylase, and to the high gelatinisation temperature of oat starch, so that the enzymes were inactivated during baking before the starch became susceptible to excessive degradation. This makes it possible to use the extract as part of the baking process without inactivation. Sourdough fermentation was found to work well as part of gluten-free baking, as it improved the colour, taste and texture of the bread. The mould-free time of the bread was also improved even by a small extract addition. It appears that this technology makes it possible to bring the long-missed rye flavour to the range of low-gluten oat breads. Based on calculations and on previous results, it could be concluded that a prolamin content of 63.5 mg/kg in the bread is achievable, and further development would probably lead to even better results.
Abstract:
Research on reading has been successful in revealing how attention guides eye movements when people read single sentences or text paragraphs in simplified and strictly controlled experimental conditions. However, less is known about reading processes in more naturalistic and applied settings, such as reading Web pages. This thesis investigates online reading processes by recording participants' eye movements. The thesis consists of four experimental studies that examine how the location of stimuli presented outside the currently fixated region (Studies I and III), text format (Study II), animation and abrupt onset of online advertisements (Study III), and the phase of an online information search task (Study IV) affect written language processing. Furthermore, the studies investigate how the goal of the reading task affects attention allocation during reading, by comparing reading for comprehension with free browsing and by varying the difficulty of an information search task. The results show that text format affects the reading process: vertical text (one word per line) is read at a slower rate than standard horizontal text, and mean fixation durations are longer for vertical than for horizontal text. Furthermore, animated online ads and abrupt ad onsets capture online readers' attention, direct their gaze toward the ads, and disrupt the reading process. Compared to a reading-for-comprehension task, online ads are attended to more in a free browsing task. Moreover, in both tasks abrupt ad onsets result in rather immediate fixations toward the ads; this effect is enhanced when the ad is presented in the proximity of the text being read. In addition, reading processes vary as Web users proceed through online information search tasks, for example when they are searching for a specific keyword, looking for an answer to a question, or trying to find the subjectively most interesting topic. A scanning type of behavior is typical at the beginning of the tasks, after which participants tend to switch to a more careful reading state before finishing the tasks in states referred to as decision states. Finally, the results also provide evidence that left-to-right readers extract more parafoveal information to the right of the fixated word than to the left, suggesting that learning biases attentional orienting toward the reading direction.
Abstract:
The forest industry generates large amounts of surplus material every year, such as tree bark and branches. This surplus is mainly used for energy production, but new applications are needed. Bark has been found to be a potential source of many bioactive compounds that could be used, for example, in the pharmaceutical and chemical industries and in pest control in agriculture, forestry and horticulture. This study is part of the ForestSpeCs project funded by the European Union, which aims to investigate alternative uses for the surplus materials of the forest industry. The suitability of substances extracted from the bark of ten selected, industrially important northern tree species (Abies nephrolepis, Betula pendula, Larix decidua, L. gmelinii, L. sibirica, Picea abies, P. ajanensis, P. pumila, Pinus sylvestris, Populus tremula) as antifeedants was tested with larvae of the large white (Pieris brassicae L.) and Spodoptera littoralis Boisduval, and in part with the mustard leaf beetle (Phaedon cochleariae Fabricius) and the alder leaf beetle (Agelastica alni L.). The extracts were prepared in collaboration with the project groups or independently, using various methods. The tests were carried out under laboratory conditions using a leaf-disc choice bioassay, with both crude extracts and individual compounds isolated from them. Feeding deterrence indices (FDI) were calculated from the measurements. Based on the results, nearly all of the tested extracts affected the feeding behaviour of the target insect at least to some extent. Slightly more than half of the 46 extracts tested on Pieris brassicae caused over 50% feeding deterrence, i.e. the larvae preferred control leaves over extract-treated ones. For Spodoptera littoralis, only seven of the 56 tested extracts caused more than 50% feeding deterrence, and three extracts significantly increased the consumption of treated discs. Larvae and adults of the alder leaf beetle particularly avoided leaves treated with abietic acid, and the extracts tested on the mustard leaf beetle also performed promisingly. It is possible to extract biologically active compounds from the bark of the tested tree species, but finding the right concentrations and effective extraction methods for pest control requires further research. The quality and quantity of the compounds in bark vary with many factors, such as environment and genetics, and insect tolerance also varies greatly between species and even between individuals. In the future, however, pest control agents prepared from the extracts could be incorporated, for example, into integrated pest management alongside other methods.
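A commonly used form of the feeding deterrence index for two-choice leaf-disc bioassays is given below for reference; the abstract does not state the exact formula applied in the thesis.

```latex
% A commonly used feeding deterrence index for two-choice leaf-disc assays
% (the exact formula used in the thesis is not specified in the abstract):
\[
  \mathrm{FDI}\,(\%) \;=\; \frac{C - T}{C + T} \times 100,
\]
% where C and T are the amounts of control and treated leaf disc consumed;
% FDI = 100 indicates complete deterrence, 0 no effect, and negative values
% feeding stimulation.
```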