22 results for "synchrotron-based techniques"

in Helda - Digital Repository of the University of Helsinki


Relevance: 90.00%

Abstract:

Megasphaera cerevisiae, Pectinatus cerevisiiphilus, Pectinatus frisingensis, Selenomonas lacticifex, Zymophilus paucivorans and Zymophilus raffinosivorans are strictly anaerobic Gram-stain-negative bacteria that are able to spoil beer by producing off-flavours and turbidity. They have only been isolated from the beer production chain. The species are phylogenetically affiliated with the Sporomusa sub-branch in the class "Clostridia". Routine cultivation methods for the detection of strictly anaerobic bacteria in breweries are time-consuming and do not allow species identification. The main aim of this study was to utilise DNA-based techniques in order to improve detection and identification of the Sporomusa sub-branch beer-spoilage bacteria and to increase understanding of their biodiversity, evolution and natural sources. Practical PCR-based assays were developed for monitoring M. cerevisiae, Pectinatus species and the group of Sporomusa sub-branch beer spoilers throughout the beer production process. The developed assays reliably differentiated the target bacteria from other brewery-related microbes. Contaminant detection in process samples (10–1,000 cfu/ml) could be accomplished in 2–8 h. Low levels of viable cells in finished beer (≤10 cfu/100 ml) were usually detected after 1–3 d of culture enrichment. The time saving compared to cultivation methods was up to 6 d. Based on a polyphasic approach, this study revealed the existence of three new anaerobic spoilage species in the beer production chain, i.e. Megasphaera paucivorans, Megasphaera sueciensis and Pectinatus haikarae. The description of these species enabled the establishment of phenotypic and DNA-based methods for their detection and identification. The 16S rRNA gene-based phylogenetic analysis of the Sporomusa sub-branch showed that the genus Selenomonas originates from several ancestors and will require reclassification. Moreover, Z. 
raffinosivorans were found to be in fact members of the genus Propionispira. This relationship implies that they were carried to breweries along with plant material. The brewery-related Megasphaera species formed a distinct sub-group that did not include any sequences from other sources, suggesting that M. cerevisiae, M. paucivorans and M. sueciensis may be uniquely adapted to the brewery ecosystem. M. cerevisiae was also shown to exhibit remarkable resistance against many brewery-related stress conditions. This may partly explain why it is a brewery contaminant. This study showed that DNA-based techniques provide useful tools for obtaining more rapid and specific information about the presence and identity of the strictly anaerobic spoilage bacteria in the beer production chain than is possible using cultivation methods. This should ensure financial benefits to the industry and better product quality to customers. In addition, DNA-based analyses provided new insight into the biodiversity as well as natural sources and relations of the Sporomusa sub-branch bacteria. The data can be exploited for taxonomic classification of these bacteria and for surveillance and control of contaminations.
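Species-specific PCR assays of this kind hinge on primer specificity, which can be pre-screened in silico. The sketch below is a toy illustration of that idea only, not the thesis assay: the template sequence and primers are invented, and a real screen would tolerate mismatches and also check non-target genomes.

```python
# Toy in-silico PCR check (hypothetical sequences, not the thesis assay):
# verify that both primers bind the target and predict the amplicon length.

def revcomp(s: str) -> str:
    """Reverse complement of a DNA sequence."""
    return s.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def amplicon_length(template: str, fwd: str, rev: str):
    """Predicted amplicon length, or None if a primer does not bind."""
    i = template.find(fwd)                 # forward primer site (5' end)
    j = template.find(revcomp(rev))        # reverse primer site (3' end)
    if i == -1 or j == -1 or j + len(rev) <= i:
        return None
    return j + len(rev) - i

# Invented 16S-like target fragment and primer pair.
target = "GGATCCACGTTGACCTTAGGCATCGTACGATCGGATTACAGGTTCCGAAGCTT"
fwd, rev = "ACGTTGACCTT", "AAGCTTCGGAA"
print(amplicon_length(target, fwd, rev))   # predicted product size in bp
```

A non-binding primer would return None, flagging the pair as unusable for that target.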

Relevance: 80.00%

Abstract:

To obtain data on phytoplankton dynamics with improved spatial and temporal resolution, and at reduced cost, traditional phytoplankton monitoring methods have been supplemented with optical approaches. In this thesis, I have explored various fluorescence-based techniques for the detection of phytoplankton abundance, taxonomy and physiology in the Baltic Sea. In the algal cultures used in this thesis, nitrogen availability and light conditions caused changes in pigmentation, and consequently in the light absorption and fluorescence properties of cells. In the Baltic Sea, physical environmental factors (e.g. mixing depth, irradiance and temperature) and the related seasonal succession in the phytoplankton community explained a large part of the seasonal variability in the magnitude and shape of chlorophyll a (Chla)-specific absorption. The variability in Chla-specific fluorescence was related to the abundance of cyanobacteria, the size structure of the phytoplankton community, and the absorption characteristics of phytoplankton. Cyanobacteria show very low Chla-specific fluorescence, so in the presence of eukaryotic species Chla fluorescence describes cyanobacteria poorly. During cyanobacterial blooms in the Baltic Sea, phycocyanin fluorescence explained a large part of the variability in Chla concentrations. Thus, both Chla and phycocyanin fluorescence were required to predict Chla concentration. Phycobilins are the major light-harvesting pigments of cyanobacteria. In the open Baltic Sea, small picoplanktonic cyanobacteria were the main source of the phycoerythrin fluorescence and absorption signal. Large filamentous cyanobacteria, which form harmful blooms, were the main source of the phycocyanin fluorescence signal, and typically their biomass and phycocyanin fluorescence were linearly related. Using phycocyanin fluorescence, the dynamics of cyanobacterial blooms can be detected at a high spatial and seasonal resolution not possible with other methods. 
Various taxonomic phytoplankton pigment groups can be separated by spectral fluorescence. I compared multivariate calibration methods for the retrieval of phytoplankton biomass in different taxonomic groups. The partial least squares regression method gave the closest predictions for all taxonomic groups, and the accuracy was adequate for phytoplankton bloom detection. Variable fluorescence has been proposed as a tool to study the physiological state of phytoplankton. My results from the Baltic Sea emphasize that variable fluorescence alone cannot be used to detect nutrient limitation of phytoplankton. However, when combined with active nutrient-manipulation experiments and other nutrient limitation indices, variable fluorescence provided valuable information on the physiological responses of the phytoplankton community. This thesis found a severe limitation of a commercial fast repetition rate fluorometer, which could not detect the variable fluorescence of phycoerythrin-lacking cyanobacteria. For these species, Photosystem II absorption of blue light is very low, and the fluorometer excitation light did not saturate Photosystem II during a measurement. This thesis encourages the use of various in vivo fluorescence methods for the detection of bulk phytoplankton biomass, the biomass of cyanobacteria, the chemotaxonomy of the phytoplankton community, and phytoplankton physiology. Fluorescence methods can support traditional phytoplankton monitoring by providing continuous measurements of phytoplankton, and thereby strengthen the understanding of the links between biological, chemical and physical processes in aquatic ecosystems.

Relevance: 80.00%

Abstract:

Atmospheric aerosol particles have a strong impact on the global climate. A deep understanding of the physical and chemical processes affecting the atmospheric aerosol-climate system is crucial in order to describe those processes properly in global climate models. Besides their climatic effects, aerosol particles can also impair visibility and harm human health. Nucleation is a fundamental step in atmospheric new particle formation. However, details of the atmospheric nucleation mechanisms have remained unresolved. The main reason for this has been the lack of instruments capable of measuring neutral newly formed particles in the size range below 3 nm in diameter. This thesis aims to extend the detectable particle size range towards the close-to-molecular sizes (~1 nm) of freshly nucleated clusters, and to obtain by direct measurement the concentrations of sub-3 nm particles in the atmospheric environment and under well-defined laboratory conditions. In the work presented in this thesis, new methods and instruments for sub-3 nm particle detection were developed and tested. The selected approach comprises four different condensation-based techniques and one electrical detection scheme. All of them are capable of detecting particles with diameters well below 3 nm, some even down to ~1 nm. The developed techniques and instruments were deployed in field measurements as well as in laboratory nucleation experiments. Ambient air studies showed that in a boreal forest environment a persistent population of 1-2 nm particles or clusters exists. The observation was made using four different instruments, demonstrating a consistent capability for the direct measurement of atmospheric nucleation. The results from the laboratory experiments showed that sulphuric acid is a key species in atmospheric nucleation. The mismatch between the earlier laboratory data and ambient observations on the dependency of the nucleation rate on sulphuric acid concentration was explained. 
The reason was shown to be the inefficient growth of the nucleated clusters combined with the insufficient detection efficiency of the particle counters used in the previous experiments. Even though the exact molecular steps of nucleation remain an open question, the instrumental techniques developed in this work, as well as their application in laboratory and ambient studies, opened a new view into atmospheric nucleation and prepared the way for investigating nucleation processes with more suitable tools.
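The dependence of the nucleation rate J on sulphuric acid concentration is commonly summarized as a power law, J = k[H2SO4]^n, and the exponent n is what laboratory and ambient datasets are compared on. A minimal sketch of estimating n by a log-log fit, using synthetic values (the rate constant, exponent and concentration range below are assumptions for illustration, not results from this work):

```python
# Sketch: fit the power-law exponent n in J = k * [H2SO4]**n
# from synthetic rate data via linear regression in log-log space.
import numpy as np

rng = np.random.default_rng(1)
h2so4 = np.logspace(6, 7, 12)          # molecules cm^-3 (assumed range)
true_n, true_k = 1.8, 1e-11            # assumed "true" values
J = true_k * h2so4**true_n * rng.lognormal(0, 0.1, h2so4.size)

# log10 J = n * log10 C + log10 k  ->  slope is the exponent n.
n_fit, logk_fit = np.polyfit(np.log10(h2so4), np.log10(J), 1)
print(f"fitted exponent n ~ {n_fit:.2f}")
```

A low fitted exponent from ambient data versus a high one from chamber data is the kind of mismatch the thesis attributes to cluster growth and counter detection efficiency.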

Relevance: 80.00%

Abstract:

In this thesis we deal with the concept of risk. The objective is to bring together and draw conclusions from normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency, however, behaves differently in bear and bull markets: it is strongly positive in rising markets, whereas bear markets are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is at times non-stationary. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that a lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we can conclude that volatility is not easily estimated, even from high-frequency data. It is not well behaved in terms of either stability or dependency over time. Based on these observations, we recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe. 
In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
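The third essay's point about microstructure-induced bias can be illustrated with a small simulation, here under assumed AR(1) return dynamics rather than the essay's actual setup: with positively autocorrelated returns, the sum of squared returns (realized variance) understates the long-run variance of the price process.

```python
# Sketch: realized variance from autocorrelated returns is biased.
# AR(1) dynamics and parameter values are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(2)
n, sigma, phi = 10_000, 0.01, 0.3     # ticks, innovation vol, AR(1) coeff

# AR(1) returns: r_t = phi * r_{t-1} + e_t.
e = rng.normal(0, sigma, n)
r = np.empty(n)
r[0] = e[0]
for t in range(1, n):
    r[t] = phi * r[t - 1] + e[t]

rv = np.sum(r**2)                          # realized variance estimate
# Long-run variance of the cumulative return (large-n AR(1) approximation).
lr_var = n * sigma**2 / (1 - phi)**2
print(f"RV / long-run variance = {rv / lr_var:.2f}")   # below 1 for phi > 0
```

The theoretical ratio is (1 - phi)/(1 + phi), about 0.54 for phi = 0.3, so naive realized variance misses nearly half of the long-run variance in this stylized case.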

Relevance: 30.00%

Abstract:

Miniaturization of analytical instrumentation is attracting growing interest in response to the explosive demand for rapid yet sensitive analytical methods and low-cost, highly automated instruments for pharmaceutical and bioanalyses and environmental monitoring. Microfabrication technology, in particular, has enabled the fabrication of low-cost microdevices with a high degree of integrated functions, such as sample preparation, chemical reaction, separation, and detection, on a single microchip. These miniaturized total chemical analysis systems (microTAS or lab-on-a-chip) can also be arrayed for parallel analyses in order to accelerate sample throughput. Other motivations include reduced sample consumption and waste production as well as increased speed of analysis. One of the most promising hyphenated techniques in analytical chemistry is the combination of a microfluidic separation chip and a mass spectrometer (MS). In this work, emerging polymer microfabrication techniques, ultraviolet lithography in particular, were exploited to develop a capillary electrophoresis (CE) separation chip that incorporates a monolithically integrated electrospray ionization (ESI) emitter for efficient coupling with MS. The epoxy photoresist SU-8 was adopted as the structural material and characterized with respect to the physicochemical properties relevant to chip-based CE and ESI/MS, namely surface charge, surface interactions, heat transfer, and solvent compatibility. As a result, SU-8 was found to be a favorable material to substitute for the more commonly used glass and silicon in microfluidic applications. In addition, infrared (IR) thermography was introduced as a direct, non-intrusive method to examine heat transfer and thermal gradients during microchip CE. The IR data were validated through numerical modeling. 
The analytical performance of the SU-8-based microchips was established for qualitative and quantitative CE-ESI/MS analysis of small drug compounds, peptides, and proteins. The CE separation efficiency was found to be similar to that of commercial glass microchips and conventional CE systems. Typical analysis times were only 30-90 s per sample, indicating feasibility for high-throughput analysis. Moreover, a mass detection limit at the low-attomole level, as low as ~10^5 molecules, was achieved utilizing MS detection. The SU-8 microchips developed in this work could also be mass produced at low cost and with nearly identical performance from chip to chip. Before this work, attempts to combine CE separation with ESI in a chip-based system, amenable to batch fabrication and capable of high, reproducible analytical performance, had not been successful. Thus, the CE-ESI chip developed in this work is a substantial step toward lab-on-a-chip technology.
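As a quick check of the quoted figure, the attomole-to-molecules conversion follows directly from Avogadro's number:

```python
# Worked arithmetic: a low-attomole amount corresponds to ~10^5 molecules.
AVOGADRO = 6.022e23        # molecules per mole
attomole = 1e-18           # moles in one attomole

molecules_per_amol = AVOGADRO * attomole
print(f"1 amol = {molecules_per_amol:.3g} molecules")
# One attomole is ~6e5 molecules, so a "low-attomole" limit
# (a fraction of an attomole) is on the order of 10^5 molecules.
```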

Relevance: 30.00%

Abstract:

In this dissertation, I present an overall methodological framework for studying linguistic alternations, focusing specifically on lexical variation in denoting a single meaning, that is, synonymy. As a practical example, I employ the synonymous set of the four most common Finnish verbs denoting THINK, namely ajatella, miettiä, pohtia and harkita ‘think, reflect, ponder, consider’. As a continuation of previous work, I describe in considerable detail the extension of statistical methods from dichotomous linguistic settings (e.g., Gries 2003; Bresnan et al. 2007) to polytomous ones, that is, settings concerning more than two possible alternative outcomes. The applied statistical methods are arranged into a succession of stages of increasing complexity, proceeding from univariate via bivariate to multivariate techniques. As the central multivariate method, I argue for the use of polytomous logistic regression and demonstrate its practical implementation for the studied phenomenon, thus extending the work of Bresnan et al. (2007), who applied simple (binary) logistic regression to a dichotomous structural alternation in English. The results of the various statistical analyses confirm that a wide range of contextual features across different categories are indeed associated with the use and selection of the selected THINK lexemes; however, a substantial part of these features is not exemplified in current Finnish lexicographical descriptions. The multivariate analysis results indicate that the semantic classifications of syntactic argument types are on average the most distinctive feature category, followed by overall semantic characterizations of the verb chains, and then syntactic argument types alone, with morphological features pertaining to the verb chain and extra-linguistic features relegated to the last position. 
In terms of overall performance of the multivariate analysis and modeling, the prediction accuracy seems to reach a ceiling at a Recall rate of roughly two-thirds of the sentences in the research corpus. The analysis of these results suggests a limit to what can be explained and determined within the immediate sentential context and applying the conventional descriptive and analytical apparatus based on currently available linguistic theories and models. The results also support Bresnan’s (2007) and others’ (e.g., Bod et al. 2003) probabilistic view of the relationship between linguistic usage and the underlying linguistic system, in which only a minority of linguistic choices are categorical, given the known context – represented as a feature cluster – that can be analytically grasped and identified. Instead, most contexts exhibit degrees of variation as to their outcomes, resulting in proportionate choices over longer stretches of usage in texts or speech.
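Polytomous logistic regression of the kind argued for here can be sketched with a standard multinomial model. The example below is entirely synthetic and hypothetical: the binary "contextual features" and their weights are invented, with four outcome classes merely standing in for the four THINK verbs; it is not the dissertation's model or data.

```python
# Sketch: multinomial (polytomous) logistic regression on synthetic data,
# with four classes standing in for ajatella / miettia / pohtia / harkita.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000

# Invented binary contextual features (e.g. argument type, clause polarity).
X = rng.integers(0, 2, size=(n, 6)).astype(float)

# Class-specific weights induce a probabilistic, non-categorical mapping:
# most feature clusters only make one verb more likely, not certain.
W = rng.normal(0, 2, size=(4, 6))
logits = X @ W.T
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(4, p=p) for p in probs])

model = LogisticRegression(max_iter=1000).fit(X, y)
acc = model.score(X, y)
print(f"in-sample accuracy: {acc:.2f}")
```

Because the generating process is itself probabilistic, accuracy plateaus well below 1.0 even with the true feature set, mirroring the prediction ceiling reported above.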

Relevance: 30.00%

Abstract:

Standards have been set to regulate microbial and preservative contents in order to ensure that foods are safe for the consumer. In the case of a food-related disease outbreak, it is crucial to be able to quickly and accurately detect and identify the cause of the disease. In addition, for everyday control of food microbial and preservative contents, the detection methods must be easily performed for numerous food samples. In the present study, quicker alternative methods were investigated for the identification of bacteria by DNA fingerprinting. A flow cytometry method was developed as an alternative to pulsed-field gel electrophoresis, the "gold standard" method. DNA fragment sizing by an ultrasensitive flow cytometer was able to discriminate species and strains in a reproducible manner comparable to pulsed-field gel electrophoresis. The new method was hundreds of times faster and 200,000 times more sensitive. Additionally, another DNA fingerprinting identification method was developed based on single-enzyme amplified fragment length polymorphism (SE-AFLP). This method allowed the differentiation of genera, species, and strains of pathogenic bacteria of Bacillus, Staphylococcus, Yersinia, and Escherichia coli. The fingerprinting patterns obtained by SE-AFLP were simpler and easier to analyze than those obtained by traditional amplified fragment length polymorphism with double-enzyme digestion. Nisin (E234) is added as a preservative to different types of foods, especially dairy products, around the world. Various detection methods exist for nisin, but they lack sensitivity, speed or specificity. In the present study, a sensitive nisin-induced green fluorescent protein (GFPuv) bioassay was developed using the Lactococcus lactis two-component signal system NisRK and the nisin-inducible nisA promoter. The bioassay was extremely sensitive, with a detection limit of 10 pg/ml in culture supernatant. 
In addition, it was suitable for quantification in various food matrices, such as milk, salad dressings, processed cheese, liquid eggs, and canned tomatoes. Wine has good antimicrobial properties owing to its alcohol concentration, low pH, and organic content, and is therefore often assumed to be microbially safe to consume. Another aim of this thesis was to study the microbiota of wines returned by customers complaining of food-poisoning symptoms. By partial 16S rRNA gene sequence analysis, ribotyping, and a boar spermatozoa motility assay, one of the wines was found to contain Bacillus simplex BAC91, which produced a heat-stable substance toxic to the mitochondria of sperm cells. The antibacterial activity of wine was tested on the vegetative cells and spores of B. simplex BAC91, the B. cereus type strain ATCC 14579, and cereulide-producing B. cereus F4810/72. Although the vegetative cells and spores of B. simplex BAC91 were sensitive to the antimicrobial effects of wine, the spores of B. cereus strains ATCC 14579 and F4810/72 stayed viable for at least 4 months. According to these results, Bacillus spp., and more specifically their spores, can pose a risk to the wine consumer.

Relevance: 30.00%

Abstract:

Determination of testosterone and related compounds in body fluids is of utmost importance in doping control and the diagnosis of many diseases. Capillary electromigration techniques are a relatively new approach for steroid research. Owing to their electrical neutrality, however, separation of steroids by capillary electromigration techniques requires the use of charged electrolyte additives that interact with the steroids either specifically or non-specifically. The analysis of testosterone and related steroids by non-specific micellar electrokinetic chromatography (MEKC) was investigated in this study. The partial filling (PF) technique was employed, being suitable for detection by both ultraviolet spectrophotometry (UV) and electrospray ionization mass spectrometry (ESI-MS). Efficient, quantitative PF-MEKC UV methods for steroid standards were developed through the use of optimized pseudostationary phases comprising surfactants and cyclodextrins. PF-MEKC UV proved to be a more sensitive, efficient and repeatable method for the steroids than PF-MEKC ESI-MS. It was discovered that in PF-MEKC analyses of electrically neutral steroids, ESI-MS interfacing sets significant limitations not only on the chemistry affecting the ionization and detection processes, but also on the separation. The new PF-MEKC UV method was successfully employed in the determination of testosterone in male urine samples after microscale immunoaffinity solid-phase extraction (IA-SPE). The IA-SPE method, relying on specific interactions between testosterone and a recombinant anti-testosterone Fab fragment, is the first such method described for testosterone. Finally, new data for interactions between steroids and human and bovine serum albumins were obtained through the use of affinity capillary electrophoresis. A new algorithm for the calculation of association constants between proteins and neutral ligands is introduced.
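Although the thesis introduces its own algorithm, a common baseline for estimating a 1:1 association constant from affinity CE data is to fit the mobility shift to the binding isotherm dmu = dmu_max * K[L] / (1 + K[L]). Below is a synthetic sketch of that baseline only; the K value, ligand concentrations and noise level are assumptions, not data from this work.

```python
# Sketch: estimate a 1:1 association constant K from affinity-CE-style
# mobility shifts by fitting the standard binding isotherm. Synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def isotherm(L, K, dmu_max):
    """Mobility shift for 1:1 binding at ligand concentration L."""
    return dmu_max * K * L / (1.0 + K * L)

rng = np.random.default_rng(5)
K_true, dmu_max_true = 2.0e4, 1.0                    # assumed values (K in 1/M)
L = np.array([0, 5, 10, 25, 50, 100, 200]) * 1e-6    # ligand conc., M

# "Measured" normalized mobility shifts with small noise.
dmu = isotherm(L, K_true, dmu_max_true) + rng.normal(0, 0.01, L.size)

(K_fit, dmu_fit), _ = curve_fit(isotherm, L, dmu, p0=(1e4, 0.5))
print(f"K ~ {K_fit:.3g} 1/M")
```

For neutral ligands such as steroids, it is the protein additive that shifts mobility, which is one reason a dedicated algorithm (as in the thesis) is needed for the general case.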

Relevance: 30.00%

Abstract:

Novel miniaturized mass spectrometric ionization techniques based on atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) were studied and evaluated in the analysis of environmental samples and biosamples. The three analytical systems investigated here were gas chromatography-microchip atmospheric pressure chemical ionization-mass spectrometry (GC-µAPCI-MS) and gas chromatography-microchip atmospheric pressure photoionization-mass spectrometry (GC-µAPPI-MS), where sample pretreatment and chromatographic separation precede ionization, and desorption atmospheric pressure photoionization-mass spectrometry (DAPPI-MS), where the samples are analyzed either as such or after minimal pretreatment. The gas chromatography-microchip atmospheric pressure ionization-mass spectrometry (GC-µAPI-MS) instrumentation was used in the analysis of polychlorinated biphenyls (PCBs) in negative ion mode and 2-quinolinone-derived selective androgen receptor modulators (SARMs) in positive ion mode. The analytical characteristics (i.e., limits of detection, linear ranges, and repeatabilities) of the methods were evaluated with PCB standards and SARMs in urine. All methods showed good analytical characteristics and potential for quantitative environmental analysis or bioanalysis. Desorption and ionization mechanisms in DAPPI were studied. Desorption was found to be a thermal process, with the efficiency strongly depending on the thermal conductivity of the sampling surface; the size and polarity of the analyte probably also play a role. In positive ion mode, the ionization is dependent on the ionization energy and proton affinity of the analyte and the spray solvent, while in negative ion mode the ionization mechanism is determined by the electron affinity and gas-phase acidity of the analyte and the spray solvent. 
DAPPI-MS was tested in the fast screening analysis of environmental, food, and forensic samples, and the results demonstrated the feasibility of DAPPI-MS for rapid screening analysis of authentic samples.

Relevance: 30.00%

Abstract:

Radioactive particles from three locations were investigated for elemental composition, oxidation states of matrix elements, and origin. The instrumental techniques applied to the task were scanning electron microscopy, X-ray and gamma-ray spectrometry, secondary ion mass spectrometry, and synchrotron radiation-based microanalytical techniques comprising X-ray fluorescence spectrometry, X-ray fluorescence tomography, and X-ray absorption near-edge structure spectroscopy. Uranium-containing low-activity particles collected from Irish Sea sediments were characterized in terms of the composition and distribution of matrix elements and the oxidation states of uranium. Indications of the origin were obtained from the intensity ratios and the presence of thorium, uranium, and plutonium. Uranium in the particles was found to exist mostly as U(IV). Studies on plutonium particles from Runit Island (Marshall Islands) soil indicated that the samples were weapon fuel fragments originating from two separate detonations: a safety test and a low-yield test. The plutonium in the particles was found to be of similar age. The distribution and oxidation states of uranium and plutonium in the matrix of weapon fuel particles from Thule (Greenland) sediments were investigated. The variations in intensity ratios observed with different techniques indicated more than one origin. Uranium in the particle matrices was mostly U(IV), but plutonium existed in some particles mainly as Pu(IV), and in others mainly as oxidized Pu(VI). The results demonstrated that the various techniques were effectively applied in the characterization of environmental radioactive particles. An on-line method was developed for separating americium from environmental samples. The procedure utilizes extraction chromatography to separate americium from the light lanthanides, and cation exchange to concentrate americium before the final separation in an ion chromatography column. 
The separated radiochemically pure americium fraction is measured by alpha spectrometry. The method was tested with certified sediment and soil samples and found to be applicable for the analysis of environmental samples containing a wide range of Am-241 activity. Proceeding from the on-line method developed for americium, a method was also developed for separating plutonium and americium. Plutonium is reduced to Pu(III), and separated together with Am(III) throughout the procedure. Pu(III) and Am(III) are eluted from the ion chromatography column as anionic dipicolinate and oxalate complexes, respectively, and measured by alpha spectrometry.

Relevance: 30.00%

Abstract:

The commodity plastics used in our everyday lives are based on polyolefin resins, and they find a wide variety of applications in several areas. Most of the production is carried out in catalyzed low-pressure processes. As a consequence, the polymerization of ethene and α-olefins has been one of the focus areas of catalyst research in both industry and academia. An enormous amount of effort has been dedicated to fine-tuning the processes, obtaining better control of the polymerization, and producing tailored polymer structures. The literature review of the thesis concentrates on the use of Group IV metal complexes as catalysts for the polymerization of ethene and branched α-olefins. More precisely, the review focuses on the use of complexes bearing [O,O] and [O,N] type ligands, which have gained considerable interest. Effects of the ligand framework as well as the mechanical and fluxional behaviour of the complexes are discussed. The experimental part consists mainly of the development of new Group IV metal complexes bearing [O,O] and [O,N] ligands and their use as catalyst precursors in ethene polymerization. Part of the experimental work deals with the use of high-throughput techniques in tailoring the properties of new polymer materials synthesized using Group IV complexes as catalysts. It is known that by changing the steric and electronic properties of the ligand framework it is possible to fine-tune the catalyst and to gain control over the polymerization reaction. This is why the complex structures in this thesis were designed so that the ligand frameworks could be fairly easily modified. Altogether, 14 complexes were synthesised and used as catalysts in ethene polymerizations. It was found that the ligand framework did have an impact within the studied catalyst families. 
The activities of the catalysts were affected by changes in the complex structure, and effects on the produced polymers were also observed: molecular weights and molecular weight distributions depended on the catalyst structure used. Some catalysts also produced bi- or multimodal polymers. During the last decade, high-throughput techniques developed in the pharmaceutical industry have been adopted in polyolefin research in order to speed up the screening and optimization of catalyst candidates. These methods can now be regarded as established and suitable for academia and industry alike. Such high-throughput techniques were used in tailoring poly(4-methyl-1-pentene) polymers synthesized using Group IV metal complexes as catalysts. The work done in this thesis represents the first successful example in which high-throughput synthesis techniques are combined with high-throughput mechanical testing techniques to speed up the discovery process for new polymer materials.

Relevance: 30.00%

Abstract:

In this study, novel methodologies for the determination of antioxidative compounds in herbs and beverages were developed. Antioxidants are compounds that can reduce, delay or inhibit oxidative events. They are a part of the human defense system and are obtained through the diet. Antioxidants are naturally present in several types of foods, e.g. in fruits, beverages, vegetables and herbs. Antioxidants can also be added to foods during manufacturing to suppress lipid oxidation and formation of free radicals under conditions of cooking or storage and to reduce the concentration of free radicals in vivo after food ingestion. There is growing interest in natural antioxidants, and effective compounds have already been identified from antioxidant classes such as carotenoids, essential oils, flavonoids and phenolic acids. The wide variety of sample matrices and analytes presents quite a challenge for the development of analytical techniques. Growing demands have been placed on sample pretreatment. In this study, three novel extraction techniques, namely supercritical fluid extraction (SFE), pressurised hot water extraction (PHWE) and dynamic sonication-assisted extraction (DSAE) were studied. SFE was used for the extraction of lycopene from tomato skins and PHWE was used in the extraction of phenolic compounds from sage. DSAE was applied to the extraction of phenolic acids from Lamiaceae herbs. In the development of extraction methodologies, the main parameters of the extraction were studied and the recoveries were compared to those achieved by conventional extraction techniques. In addition, the stability of lycopene was also followed under different storage conditions. For the separation of the antioxidative compounds in the extracts, liquid chromatographic methods (LC) were utilised. 
Two novel LC techniques, namely ultra performance liquid chromatography (UPLC) and comprehensive two-dimensional liquid chromatography (LCxLC), were studied and compared with conventional high performance liquid chromatography (HPLC) for the separation of antioxidants in beverages and Lamiaceae herbs. In LCxLC, the selection of LC modes, column dimensions and flow rates was studied and optimised to obtain efficient separation of the target compounds. In addition, the separation powers of HPLC, UPLC, HPLCxHPLC and HPLCxUPLC were compared. To exploit the benefits of an integrated system, in which sample preparation and final separation are performed in a closed unit, dynamic sonication-assisted extraction was coupled on-line to a liquid chromatograph via a solid-phase trap. The increased sensitivity was utilised in the extraction of phenolic acids from Lamiaceae herbs. The results were compared with those achieved by the LCxLC system.
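The gain in separation power from coupling two LC dimensions can be sketched with textbook arithmetic (illustrative numbers, not measured values from this work): in the ideal orthogonal case, the peak capacity of comprehensive LCxLC is the product of the capacities of the two dimensions, which is why it can resolve far more compounds than a single column.

```python
# Back-of-the-envelope peak-capacity comparison for 1D HPLC vs ideal LCxLC.
# All parameter values are assumed, plausible figures for illustration.

def peak_capacity_1d(gradient_time, peak_width):
    """Approximate 1D gradient peak capacity: n ~ 1 + t_g / w."""
    return 1 + gradient_time / peak_width

# Assumed: a 60 min HPLC gradient with 0.5 min wide peaks, and fast 1 min
# second-dimension gradients with 2 s (2/60 min) wide peaks.
n_hplc = peak_capacity_1d(60, 0.5)      # ~121
n_d2 = peak_capacity_1d(1, 2 / 60)      # ~31
n_lcxlc = n_hplc * n_d2                 # ideal orthogonal product rule
print(n_hplc, n_d2, n_lcxlc)
```

In practice the two dimensions are never fully orthogonal and undersampling of first-dimension peaks reduces the effective capacity, so the product is an upper bound rather than an achieved figure.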

Relevância:

30.00%

Publicador:

Resumo:

Multi- and intralake datasets of fossil midge assemblages in surface sediments of small shallow lakes in Finland were studied to determine the most important environmental factors explaining trends in midge distribution and abundance. The aim was to develop palaeoenvironmental calibration models for the most important environmental variables for the purpose of reconstructing past environmental conditions. The developed models were applied to three high-resolution fossil midge stratigraphies from southern and eastern Finland to interpret environmental variability over the past 2000 years, with special focus on the Medieval Climate Anomaly (MCA), the Little Ice Age (LIA) and recent anthropogenic changes. The midge-based results were compared with physical properties of the sediment, historical evidence and environmental reconstructions based on diatoms (Bacillariophyta), cladocerans (Crustacea: Cladocera) and tree rings. The results showed that the most important environmental factor controlling midge distribution and abundance along a latitudinal gradient in Finland was the mean July air temperature (TJul). However, when the dataset was environmentally screened to include only pristine lakes, water depth at the sampling site became more important. Furthermore, when the dataset was geographically scaled to southern Finland, hypolimnetic oxygen conditions became the dominant environmental factor. The results from an intralake dataset from eastern Finland showed that the most important environmental factors controlling midge distribution within a lake basin were river contribution, water depth and submerged vegetation patterns. In addition, the results of the intralake dataset showed that the fossil midge assemblages represent fauna that lived in close proximity to the sampling sites, thus enabling the exploration of within-lake gradients in midge assemblages. 
Importantly, this within-lake heterogeneity in midge assemblages may affect midge-based temperature estimations, because samples taken from the deepest point of a lake basin may infer considerably colder temperatures than expected, as shown by the present test results. Therefore, it is suggested here that samples in fossil midge studies of shallow boreal lakes should be taken from the sublittoral, where the assemblages are most representative of the whole-lake fauna. Transfer functions between midge assemblages and the environmental forcing factors that were significantly related to the assemblages, including mean air TJul, water depth, hypolimnetic oxygen, stream flow and distance to littoral vegetation, were developed using weighted averaging (WA) and weighted averaging-partial least squares (WA-PLS) techniques, which outperformed all the other numerical approaches tested. Application of the models in downcore studies showed mostly consistent trends. Based on the present results, which agree with previous studies and historical evidence, the Medieval Climate Anomaly between ca. 800 and 1300 AD in eastern Finland was characterized by warm temperatures and dry summers, but probably humid winters. The Little Ice Age (LIA) prevailed in southern Finland from ca. 1550 to 1850 AD, with the coldest conditions occurring at ca. 1700 AD, whereas in eastern Finland the cold conditions prevailed over a longer period, from ca. 1300 until 1900 AD. The recent climatic warming was clearly represented in all of the temperature reconstructions. In terms of long-term climatology, the present results support the concept that the North Atlantic Oscillation (NAO) index correlates positively with winter precipitation and annual temperature and negatively with summer precipitation in eastern Finland.
In general, the results indicate a relatively warm climate with dry summers but snowy winters during the MCA and a cool climate with rainy summers and dry winters during the LIA. The results of the present reconstructions and the forthcoming applications of the models can be used in assessments of long-term environmental dynamics to refine the understanding of past environmental reference conditions and natural variability required by environmental scientists, ecologists and policy makers to make decisions concerning the presently occurring global, regional and local changes. The developed midge-based models for temperature, hypolimnetic oxygen, water depth, littoral vegetation shift and stream flow, presented in this thesis, are open for scientific use on request.
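The weighted-averaging (WA) transfer functions described above can be sketched in a few lines (toy data, not the thesis calibration set, and without the deshrinking step real WA models apply): in calibration, each taxon's temperature optimum is its abundance-weighted mean of the observed July temperatures, and in reconstruction, a fossil sample's inferred temperature is the abundance-weighted mean of the taxon optima.

```python
# Minimal WA transfer function sketch. Two hypothetical midge taxa:
# taxon 0 prefers cold lakes, taxon 1 warm lakes.

def wa_optima(abundances, temps):
    """abundances[i][k]: abundance of taxon k in calibration lake i."""
    n_taxa = len(abundances[0])
    optima = []
    for k in range(n_taxa):
        num = sum(a[k] * t for a, t in zip(abundances, temps))
        den = sum(a[k] for a in abundances)
        optima.append(num / den)
    return optima

def wa_reconstruct(sample, optima):
    """Inferred temperature for one fossil sample (simple WA, no deshrinking)."""
    return sum(a * u for a, u in zip(sample, optima)) / sum(sample)

# Three calibration lakes with known mean July air temperatures.
calib = [[8, 2], [5, 5], [1, 9]]
temps = [10.0, 13.0, 16.0]
opt = wa_optima(calib, temps)
# A fossil sample dominated by the warm-preferring taxon infers a warm TJul.
print(wa_reconstruct([2, 8], opt))
```

Averaging twice (once in calibration, once in reconstruction) shrinks inferred values toward the calibration mean, which is exactly why production WA models add a deshrinking regression, and why WA-PLS, which also uses the residual structure, often performs better.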

Relevância:

30.00%

Publicador:

Resumo:

This thesis, which consists of an introduction and four peer-reviewed original publications, studies the problems of haplotype inference (haplotyping) and local alignment significance. The problems studied here belong to the broad area of bioinformatics and computational biology. The presented solutions are computationally fast and accurate, which makes them practical in high-throughput sequence data analysis. Haplotype inference is a computational problem in which the goal is to estimate haplotypes from a sample of genotypes as accurately as possible. This problem is important because the direct measurement of haplotypes is difficult, whereas genotypes are easier to quantify. Haplotypes are the key players when studying, for example, the genetic causes of diseases. In this thesis, three methods are presented for the haplotype inference problem, referred to as HaploParser, HIT and BACH. HaploParser is based on a combinatorial mosaic model and hierarchical parsing that together mimic recombinations and point mutations in a biologically plausible way. In this mosaic model, the current population is assumed to have evolved from a small founder population; thus, the haplotypes of the current population are recombinations of the (implicit) founder haplotypes with some point mutations. HIT (Haplotype Inference Technique) uses a hidden Markov model for haplotypes, and efficient algorithms are presented to learn this model from genotype data. The model structure of HIT is analogous to the mosaic model of HaploParser with founder haplotypes; it can therefore be seen as a probabilistic model of recombinations and point mutations. BACH (Bayesian Context-based Haplotyping) utilizes a context tree weighting algorithm to efficiently sum over all variable-length Markov chains to evaluate the posterior probability of a haplotype configuration. Algorithms are presented that find haplotype configurations with high posterior probability.
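Why haplotype inference needs a model at all can be seen from a toy enumeration (this is the underlying combinatorics, not any of the thesis methods): encoding each genotype site as 0 (homozygous reference), 2 (homozygous alternative) or 1 (heterozygous), each heterozygous site can be phased two ways, so h heterozygous sites leave 2^(h-1) compatible unordered haplotype pairs, and a population model (mosaic, HMM or Bayesian) must choose among them.

```python
# Enumerate all unordered haplotype pairs compatible with one genotype.
from itertools import product

def compatible_pairs(genotype):
    het = [i for i, g in enumerate(genotype) if g == 1]
    pairs = set()
    for phase in product([0, 1], repeat=len(het)):
        h1 = [g // 2 for g in genotype]  # homozygous sites: 0 -> 0, 2 -> 1
        h2 = list(h1)
        for idx, p in zip(het, phase):
            h1[idx], h2[idx] = p, 1 - p   # phase each heterozygous site
        pairs.add(frozenset([tuple(h1), tuple(h2)]))  # unordered pair
    return pairs

# Genotype with 3 heterozygous sites -> 2^(3-1) = 4 possible phasings.
print(len(compatible_pairs([1, 0, 1, 2, 1])))
```

The exponential growth in h is what makes naive enumeration hopeless for real marker maps and motivates the structured models (founder mosaics, HMMs, variable-length Markov chains) compared in the thesis.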
BACH is the most accurate method presented in this thesis and has performance comparable to the best available software for haplotype inference. Local alignment significance is a computational problem in which one asks whether the local similarities between two sequences are due to the sequences being related or arise just by chance. Similarity of sequences is measured by their best local alignment score, from which a p-value is computed. This p-value is the probability that two sequences picked from the null model have an equally good or better best local alignment score. Local alignment significance is used routinely, for example, in homology searches. In this thesis, a general framework is sketched that allows one to compute a tight upper bound for the p-value of a local pairwise alignment score. Unlike previous methods, the presented framework is not affected by so-called edge effects and can handle gaps (deletions and insertions) without troublesome sampling and curve fitting.
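For context, the classical baseline that local-alignment significance methods are measured against is the Karlin-Altschul (Gumbel) approximation: for ungapped local alignment of sequences of lengths m and n, P(score >= S) ~ 1 - exp(-K * m * n * exp(-lambda * S)). A minimal sketch, with K and lambda set to illustrative values rather than parameters fitted for any particular scoring matrix:

```python
import math

def karlin_altschul_pvalue(score, m, n, K=0.13, lam=0.32):
    """Gumbel-tail p-value for an ungapped local alignment score.

    K and lam are assumed, illustrative statistical parameters; in real
    use they depend on the scoring system and residue frequencies.
    """
    expected_hits = K * m * n * math.exp(-lam * score)  # the E-value
    return 1.0 - math.exp(-expected_hits)

# The same raw score is far less surprising between long sequences:
p_short = karlin_altschul_pvalue(40, 200, 200)
p_long = karlin_altschul_pvalue(40, 100_000, 100_000)
print(p_short, p_long)
```

The m * n factor is also where edge effects enter: near sequence ends an alignment cannot extend fully, so the effective search space is smaller than m * n, and the thesis framework's tight upper bound avoids the ad hoc corrections this otherwise requires.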

Relevância:

30.00%

Publicador:

Resumo:

One major reason for the global decline of biodiversity is habitat loss and fragmentation. Conservation areas can be designed to reduce biodiversity loss, but as resources are limited, conservation efforts need to be prioritized in order to achieve the best possible outcomes. The field of systematic conservation planning developed as a response to opportunistic approaches to conservation that often resulted in a biased representation of biological diversity. The last two decades have seen the development of increasingly sophisticated methods that account for information about biodiversity conservation goals (benefits), economic considerations (costs) and socio-political constraints. In this thesis I focus on two general topics related to systematic conservation planning. First, I address two aspects of the question of how biodiversity features should be valued. (i) I investigate the extremely important but often neglected issue of differential prioritization of species for conservation. Species prioritization can be based on various criteria and is always goal-dependent, but it can also be implemented in a scientifically more rigorous way than is the usual practice. (ii) I introduce a novel framework for conservation prioritization based on continuous benefit functions, which convert increasing levels of biodiversity feature representation into increasing conservation value following the principle that more is better. Traditional target-based systematic conservation planning is a special case of this approach, in which a step function is used as the benefit function. We have further expanded the benefit function framework for area prioritization to address issues such as protected area size and habitat vulnerability. In the second part of the thesis I address the application of community-level modelling strategies to conservation prioritization.
One of the most serious issues in systematic conservation planning currently is not a deficiency of methodology for selection and design, but simply the lack of data. Community-level modelling offers a surrogate strategy that makes conservation planning more feasible in data-poor regions. We have reviewed the available community-level approaches to conservation planning. These range from simple classification techniques to sophisticated modelling and selection strategies. We have also developed a general and novel community-level approach to conservation prioritization that significantly improves on previously available methods. This thesis introduces further degrees of realism into conservation planning methodology. The benefit-function-based conservation prioritization framework largely circumvents the problematic phase of target setting and, by allowing trade-offs between species representations, provides a more flexible and, hopefully, more attractive approach for conservation practitioners. The community-level approach seems highly promising and should prove valuable for conservation planning, especially in data-poor regions. Future work should focus on integrating prioritization methods so that they deal with the multiple interacting factors that influence the prioritization process, and on further testing and refining the community-level strategies using real, large datasets.
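The contrast between target-based planning and continuous benefit functions described above can be sketched in a few lines (toy data and a hypothetical square-root benefit curve, not the thesis software): the conservation value of a candidate reserve network is the sum over species of f(representation), where target-based planning uses a step function for f and the continuous framework uses a curve with diminishing returns.

```python
import math

def step_benefit(rep, target=1.0):
    """Target-based planning: full credit at the target, none below it."""
    return 1.0 if rep >= target else 0.0

def concave_benefit(rep):
    """One possible continuous benefit function: more is better,
    with diminishing returns (assumed shape for illustration)."""
    return math.sqrt(rep)

def solution_value(reps, benefit):
    """reps: per-species representation levels achieved by a reserve set."""
    return sum(benefit(r) for r in reps)

# Two candidate reserve networks covering three species:
a = [1.0, 1.0, 0.0]   # meets targets for two species, ignores the third
b = [0.6, 0.6, 0.6]   # spreads representation across all species
print(solution_value(a, step_benefit), solution_value(a, concave_benefit))
print(solution_value(b, step_benefit), solution_value(b, concave_benefit))
```

Under the step function network b scores zero despite protecting something of every species, while the concave function rewards it; this is the trade-off flexibility, and the avoidance of arbitrary target setting, that the benefit-function framework provides.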