Development of Sample Pretreatment and Liquid Chromatographic Techniques for Antioxidative Compounds
Abstract:
In this study, novel methodologies for the determination of antioxidative compounds in herbs and beverages were developed. Antioxidants are compounds that can reduce, delay or inhibit oxidative events. They are part of the human defense system and are obtained through the diet. Antioxidants are naturally present in several types of foods, e.g. in fruits, beverages, vegetables and herbs. They can also be added to foods during manufacturing to suppress lipid oxidation and the formation of free radicals during cooking or storage, and to reduce the concentration of free radicals in vivo after food ingestion. There is growing interest in natural antioxidants, and effective compounds have already been identified in antioxidant classes such as carotenoids, essential oils, flavonoids and phenolic acids. The wide variety of sample matrices and analytes presents a considerable challenge for the development of analytical techniques, and growing demands have been placed on sample pretreatment. In this study, three novel extraction techniques, namely supercritical fluid extraction (SFE), pressurised hot water extraction (PHWE) and dynamic sonication-assisted extraction (DSAE), were studied. SFE was used for the extraction of lycopene from tomato skins, and PHWE was used in the extraction of phenolic compounds from sage. DSAE was applied to the extraction of phenolic acids from Lamiaceae herbs. In the development of the extraction methodologies, the main parameters of each extraction were studied and the recoveries were compared to those achieved by conventional extraction techniques. In addition, the stability of lycopene was followed under different storage conditions. For the separation of the antioxidative compounds in the extracts, liquid chromatographic (LC) methods were utilised.
Two novel LC techniques, namely ultra performance liquid chromatography (UPLC) and comprehensive two-dimensional liquid chromatography (LCxLC), were studied and compared with conventional high performance liquid chromatography (HPLC) for the separation of antioxidants in beverages and Lamiaceae herbs. In LCxLC, the selection of the LC mode, column dimensions and flow rates was studied and optimised to obtain efficient separation of the target compounds. In addition, the separation powers of HPLC, UPLC, HPLCxHPLC and HPLCxUPLC were compared. To exploit the benefits of an integrated system, in which sample preparation and final separation are performed in a closed unit, dynamic sonication-assisted extraction was coupled on-line to a liquid chromatograph via a solid-phase trap. The increased sensitivity was utilised in the extraction of phenolic acids from Lamiaceae herbs. The results were compared to those achieved by the LCxLC system.
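As a rough illustration of why the separation powers of one-dimensional and comprehensive two-dimensional LC were compared: in ideal LCxLC, the total peak capacity is the product of the peak capacities of the two dimensions. The sketch below uses standard chromatographic approximations with purely illustrative numbers, not values from this study.

```python
# Theoretical peak capacity of comprehensive two-dimensional LC (LCxLC):
# n_total ~ n1 * n2, the product of the two dimensions' peak capacities.
# Gradient times and peak widths below are illustrative assumptions.

def peak_capacity_1d(gradient_time_s: float, mean_peak_width_s: float) -> float:
    """Approximate 1D peak capacity: n ~ 1 + t_gradient / w_peak."""
    return 1.0 + gradient_time_s / mean_peak_width_s

def peak_capacity_2d(n1: float, n2: float) -> float:
    """Ideal LCxLC peak capacity (assumes orthogonal dimensions, no undersampling)."""
    return n1 * n2

n_hplc = peak_capacity_1d(gradient_time_s=3600, mean_peak_width_s=30)  # 121
n_uplc = peak_capacity_1d(gradient_time_s=60, mean_peak_width_s=2)     # 31
print(f"HPLC alone:      {n_hplc:.0f}")
print(f"HPLCxUPLC ideal: {peak_capacity_2d(n_hplc, n_uplc):.0f}")
```

In practice, undersampling of the first dimension and imperfect orthogonality reduce the product value considerably.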
Abstract:
Multi- and intralake datasets of fossil midge assemblages in surface sediments of small shallow lakes in Finland were studied to determine the most important environmental factors explaining trends in midge distribution and abundance. The aim was to develop palaeoenvironmental calibration models for the most important environmental variables for the purpose of reconstructing past environmental conditions. The developed models were applied to three high-resolution fossil midge stratigraphies from southern and eastern Finland to interpret environmental variability over the past 2000 years, with special focus on the Medieval Climate Anomaly (MCA), the Little Ice Age (LIA) and recent anthropogenic changes. The midge-based results were compared with physical properties of the sediment, historical evidence and environmental reconstructions based on diatoms (Bacillariophyta), cladocerans (Crustacea: Cladocera) and tree rings. The results showed that the most important environmental factor controlling midge distribution and abundance along a latitudinal gradient in Finland was the mean July air temperature (TJul). However, when the dataset was environmentally screened to include only pristine lakes, water depth at the sampling site became more important. Furthermore, when the dataset was geographically scaled to southern Finland, hypolimnetic oxygen conditions became the dominant environmental factor. The results from an intralake dataset from eastern Finland showed that the most important environmental factors controlling midge distribution within a lake basin were river contribution, water depth and submerged vegetation patterns. In addition, the results of the intralake dataset showed that the fossil midge assemblages represent fauna that lived in close proximity to the sampling sites, thus enabling the exploration of within-lake gradients in midge assemblages. 
Importantly, this within-lake heterogeneity in midge assemblages may have effects on midge-based temperature estimations, because samples taken from the deepest point of a lake basin may infer considerably colder temperatures than expected, as shown by the present test results. Therefore, it is suggested here that samples in fossil midge studies involving shallow boreal lakes should be taken from the sublittoral, where the assemblages are most representative of the whole-lake fauna. Transfer functions between midge assemblages and the environmental forcing factors that were significantly related to the assemblages, including mean air TJul, water depth, hypolimnetic oxygen, stream flow and distance to littoral vegetation, were developed using weighted averaging (WA) and weighted averaging-partial least squares (WA-PLS) techniques, which outperformed all the other tested numerical approaches. Application of the models in downcore studies showed mostly consistent trends. Based on the present results, which agreed with previous studies and historical evidence, the Medieval Climate Anomaly between ca. 800 and 1300 AD in eastern Finland was characterized by warm temperature conditions and dry summers, but probably humid winters. The Little Ice Age (LIA) prevailed in southern Finland from ca. 1550 to 1850 AD, with the coldest conditions occurring at ca. 1700 AD, whereas in eastern Finland the cold conditions prevailed over a longer period, from ca. 1300 until 1900 AD. The recent climatic warming was clearly represented in all of the temperature reconstructions. In terms of long-term climatology, the present results provide support for the concept that the North Atlantic Oscillation (NAO) index correlates positively with winter precipitation and annual temperature and negatively with summer precipitation in eastern Finland.
In general, the results indicate a relatively warm climate with dry summers but snowy winters during the MCA and a cool climate with rainy summers and dry winters during the LIA. The results of the present reconstructions and the forthcoming applications of the models can be used in assessments of long-term environmental dynamics to refine the understanding of past environmental reference conditions and natural variability required by environmental scientists, ecologists and policy makers to make decisions concerning the presently occurring global, regional and local changes. The developed midge-based models for temperature, hypolimnetic oxygen, water depth, littoral vegetation shift and stream flow, presented in this thesis, are open for scientific use on request.
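The weighted-averaging (WA) technique used for the transfer functions above can be sketched in a few lines: each taxon's optimum for an environmental variable is estimated as the abundance-weighted mean of that variable over the calibration lakes, and a fossil assemblage's inferred value is the abundance-weighted mean of those optima. The toy numbers below are illustrative, not data from this work.

```python
import numpy as np

# Minimal weighted-averaging (WA) transfer function sketch.
# Rows of Y are calibration lakes, columns are midge taxa,
# entries are relative abundances (illustrative values).
Y = np.array([[0.7, 0.3, 0.0],
              [0.4, 0.4, 0.2],
              [0.1, 0.3, 0.6]])
t_jul = np.array([11.0, 13.0, 15.0])  # observed mean July air temperature per lake

# Step 1: each taxon's temperature optimum = abundance-weighted mean of TJul.
optima = (Y.T @ t_jul) / Y.sum(axis=0)

# Step 2: infer a temperature for a fossil assemblage as the
# abundance-weighted mean of the taxon optima.
fossil = np.array([0.5, 0.4, 0.1])
t_inferred = (fossil @ optima) / fossil.sum()
print(optima, t_inferred)
```

A full WA calibration additionally applies a deshrinking regression and is assessed by cross-validation; both are omitted here for brevity.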
Abstract:
In Finland, peat harvesting sites are utilized down almost to the mineral soil. The properties of the mineral subsoil are therefore likely to have a considerable influence on the suitability of a site for the various after-use forms. The aims of this study were to identify the chemical and physical properties of mineral subsoils that may limit the after-use of cut-over peatlands, to define a minimum practice for mineral subsoil studies and to describe the role of different geological areas. The future percentages of the different after-use forms were predicted, which also made it possible to predict carbon accumulation in this future situation. The mineral subsoils of 54 different peat production areas were studied. Their general features and grain size distributions were analysed, along with pH, electrical conductivity, organic matter, water-soluble nutrients (P, NO3-N, NH4-N, S and Fe) and exchangeable nutrients (Ca, Mg and K). In some cases, other elements were also analysed. In an additional case study, carbon accumulation effectiveness before the intervention was evaluated on three sites in the Oulu area (representing sites typically considered for peat production). Areas with relatively sulphur-rich mineral subsoil and pool-forming areas with very fine and compact mineral subsoil together covered approximately one fifth of all areas. These areas were unsuitable for commercial use and were recommended, for example, for mire regeneration. Another approximately one fifth of the areas included very coarse or very fine sediments. Commercial use of these areas would demand special techniques, such as using the remaining peat layer to compensate for properties missing from the mineral subsoil. A single after-use form was seldom suitable for an entire released peat production area. Three typical distribution patterns (models) of different mineral subsoils within individual peatlands were found. 57% of the studied cut-over peatlands were well suited for forestry.
In a conservative calculation, 26% of the areas were clearly suitable for agriculture, horticulture or energy crop production. If till without large boulders was included, the percentage of areas suitable for field crop production would be 42%. 9-14% of all areas were well suited for mire regeneration or bird sanctuaries, but all areas were considered possible for mire regeneration with correct techniques. A further 11% was recommended for mire regeneration to avoid disturbing the mineral subsoil, so in total 20-25% of the areas would be used for rewetting. High sulphur concentrations and acidity were typical of the areas below the highest shoreline of the ancient Litorina Sea and of the Lake Ladoga-Bothnian Bay zone. Differences related to nutrient status were also detected: in coarse sediments, natural nutrient concentrations were clearly higher in the Lake Ladoga-Bothnian Bay zone and in the areas of Svecokarelian schists and gneisses than in the granitoid area of central Finland and in the Archaean gneiss areas. Based on this study, the recommended minimum analysis for after-use planning comprises pH, sulphur content and the percentage of fine material (<0.06 mm). Nutrient capacity could be analysed using the natural concentrations of calcium, magnesium and potassium. Carbon accumulation scenarios were developed based on the land-use predictions. These scenarios were calculated for the areas in peat production and the areas released from peat production (59 300 ha + 15 671 ha). Carbon accumulation in the scenarios varied between 0.074 and 0.152 million t C a-1. In the three peatlands considered for peat production, the long-term carbon accumulation rates varied between 13 and 24 g C m-2 a-1. The natural annual carbon accumulation had been decreasing towards the time of possible intervention.
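The unit conversion behind per-area rates (g C m-2 a-1) and area-wide totals (t C a-1) like those above can be sketched as follows; the rate used is illustrative, not a result of the study.

```python
# Converting a per-area carbon accumulation rate to an annual total:
# 1 ha = 10^4 m^2 and 1 t = 10^6 g, so g C m^-2 a^-1 times hectares,
# divided by 100, gives t C a^-1. The 20 g rate below is illustrative.

def total_accumulation_t_per_year(rate_g_per_m2: float, area_ha: float) -> float:
    """Scale a per-square-metre rate (g C m^-2 a^-1) over an area in hectares."""
    return rate_g_per_m2 * area_ha * 1e4 / 1e6  # m^2 per ha, then g -> t

area_ha = 59_300 + 15_671  # areas in and released from peat production
print(total_accumulation_t_per_year(20.0, area_ha))  # ~15,000 t C a^-1
```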
Abstract:
Topic detection and tracking (TDT) is an area of information retrieval research that focuses on news events. The problems TDT deals with include segmenting news text into cohesive stories, detecting something new and previously unreported, tracking the development of a previously reported event, and grouping together news stories that discuss the same event. The performance of traditional information retrieval techniques based on full-text similarity has remained inadequate for online production systems; in particular, it has been difficult to make the distinction between same and similar events. In this work, we explore ways of representing and comparing news documents in order to detect new events and track their development. First, however, we put forward a conceptual analysis of the notions of topic and event. The purpose is to clarify the terminology and align it with the process of news-making and the tradition of story-telling. Second, we present a framework for document similarity that is based on semantic classes, i.e., groups of words with similar meaning. We adopt people, organizations, and locations as semantic classes in addition to general terms. As each semantic class can be assigned its own similarity measure, document similarity can make use of ontologies, e.g., geographical taxonomies. The documents are compared class-wise, and the outcome is a weighted combination of the class-wise similarities. Third, we incorporate temporal information into document similarity. We formalize the natural language temporal expressions occurring in the text and use them to anchor the rest of the terms onto a time-line. When comparing documents for event-based similarity, we look not only at matching terms but also at how near their anchors are on the time-line. Fourth, we experiment with an adaptive variant of the semantic class similarity system.
The news stream reflects changes in the real world, and in order to keep up, the system has to change its behavior based on its contents. We put forward two strategies for rebuilding the topic representations and report experimental results. We run experiments with three annotated TDT corpora. The use of semantic classes increased the effectiveness of topic tracking by 10-30% depending on the experimental setup. The gain in spotting new events remained lower, around 3-4%. Anchoring the text to a time-line based on the temporal expressions gave a further 10% increase in the effectiveness of topic tracking. The gains in detecting new events, again, remained smaller. The adaptive systems did not improve the tracking results.
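The class-wise document comparison described above can be sketched as follows. The class names, weights and term vectors are illustrative assumptions; each class here simply uses cosine similarity, although the framework allows a different measure (e.g. an ontology-based one) per class.

```python
# Sketch of class-wise document similarity: documents are split into semantic
# classes (here: general terms, people, locations), each class is compared
# with its own measure, and the results are combined with weights.

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity of two sparse term-frequency vectors."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (sum(v * v for v in a.values()) ** 0.5) * \
          (sum(v * v for v in b.values()) ** 0.5)
    return num / den if den else 0.0

def class_similarity(doc1: dict, doc2: dict, weights: dict) -> float:
    """Weighted combination of per-class similarities."""
    return sum(w * cosine(doc1.get(c, {}), doc2.get(c, {}))
               for c, w in weights.items())

d1 = {"terms": {"election": 2, "vote": 1},
      "people": {"smith": 1}, "locations": {"helsinki": 1}}
d2 = {"terms": {"election": 1, "result": 1},
      "people": {"smith": 1}, "locations": {"helsinki": 1}}
weights = {"terms": 0.5, "people": 0.3, "locations": 0.2}
print(class_similarity(d1, d2, weights))
```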
Abstract:
The Minimum Description Length (MDL) principle is a general, well-founded theoretical formalization of statistical modeling. The most important notion of MDL is the stochastic complexity, which can be interpreted as the shortest description length of a given sample of data relative to a model class. The exact definition of the stochastic complexity has gone through several evolutionary steps. The latest instantiation is based on the so-called Normalized Maximum Likelihood (NML) distribution, which has been shown to possess several important theoretical properties. However, applications of this modern version of MDL have been quite rare because of computational complexity problems: for discrete data, the definition of NML involves an exponential sum, and in the case of continuous data, a multi-dimensional integral that is usually infeasible to evaluate or even approximate accurately. In this doctoral dissertation, we present mathematical techniques for computing NML efficiently for some model families involving discrete data. We also show how these techniques can be used to apply MDL in two practical applications: histogram density estimation and clustering of multi-dimensional data.
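To make the "exponential sum" concrete, the following sketch evaluates the NML distribution for the simplest discrete case, the Bernoulli model class, by brute force; efficient computation for richer model families is precisely what the dissertation addresses. This is a generic textbook construction, not the dissertation's own algorithm.

```python
from math import comb, log

# The exponential sum in the discrete NML definition, for the Bernoulli
# model class: the normalizer sums the maximized likelihood over all 2^n
# binary sequences of length n, grouped by their number of ones.

def bernoulli_nml(k: int, n: int) -> float:
    """NML probability of one particular binary sequence with k ones."""
    def max_lik(j: int) -> float:
        # likelihood of a sequence with j ones under its own ML parameter j/n
        if j in (0, n):
            return 1.0
        p = j / n
        return p**j * (1 - p)**(n - j)
    normalizer = sum(comb(n, j) * max_lik(j) for j in range(n + 1))
    return max_lik(k) / normalizer

# stochastic complexity of a sequence = -log of its NML probability
print(-log(bernoulli_nml(3, 10)))
```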
Abstract:
This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, such data are traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, this data has been largely unavailable and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a new, large ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text-mining and decision-tree based method for the automatic conversion of human-readable free-text microarray data annotations into a categorised format. Data comparability, and the minimisation of the systematic measurement errors that are characteristic of each laboratory in this large cross-laboratory integrated dataset, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology.
A preface and motivation for the construction and analysis of a global map of human gene expression are given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression on a global level.
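A minimal sketch of the first exploration step named above, principal component analysis of a samples-by-genes matrix, assuming random stand-in data rather than the integrated GEO/ArrayExpress dataset:

```python
import numpy as np

# PCA via SVD on a toy expression matrix (rows = samples, columns = genes).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))        # stand-in for a samples x genes matrix
Xc = X - X.mean(axis=0)              # centre each gene across samples
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = U[:, :2] * S[:2]               # project samples onto the first two PCs
explained = S**2 / (S**2).sum()      # fraction of variance per component
print(pcs.shape, explained[:2])
```

In the thesis's setting, the resulting low-dimensional coordinates would then be inspected for grouping by cell type, disease state and other ontology categories.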
Abstract:
The metabolism of an organism consists of a network of biochemical reactions that transform small molecules, or metabolites, into others in order to produce energy and building blocks for essential macromolecules. The goal of metabolic flux analysis is to uncover the rates, or fluxes, of those biochemical reactions. In a steady state, the sum of the fluxes that produce an internal metabolite is equal to the sum of the fluxes that consume the same molecule. Thus the steady state imposes linear balance constraints on the fluxes. In general, the balance constraints imposed by the steady state are not sufficient to uncover all the fluxes of a metabolic network: the fluxes through cycles and alternative pathways between the same source and target metabolites remain unknown. More information about the fluxes can be obtained from isotopic labelling experiments, where a cell population is fed with labelled nutrients, such as glucose containing 13C atoms. Labels are then transferred by biochemical reactions to other metabolites. The relative abundances of different labelling patterns in internal metabolites depend on the fluxes of the pathways producing them, and thus contain information about the fluxes that cannot be uncovered from the balance constraints derived from the steady state. The field of research that estimates the fluxes by utilizing the measured constraints on the relative abundances of different labelling patterns induced by 13C-labelled nutrients is called 13C metabolic flux analysis. There are two approaches to 13C metabolic flux analysis. In the optimization approach, a non-linear optimization task is constructed in which candidate fluxes are iteratively generated until they fit the measured abundances of different labelling patterns.
In the direct approach, the linear balance constraints given by the steady state are augmented with linear constraints derived from the abundances of different labelling patterns of metabolites. Thus, mathematically involved non-linear optimization methods that can get stuck in local optima can be avoided. On the other hand, the direct approach may require more measurement data than the optimization approach to obtain the same flux information. Furthermore, the optimization framework can easily be applied regardless of the labelling measurement technology and with all network topologies. In this thesis we present a formal computational framework for direct 13C metabolic flux analysis. The aim of our study is to construct as many linear constraints on the fluxes from the 13C labelling measurements as possible, using only computational methods that avoid non-linear techniques and are independent of the type of measurement data, the labelling of external nutrients and the topology of the metabolic network. The presented framework is the first representative of the direct approach to 13C metabolic flux analysis that is free from restricting assumptions about these parameters. In our framework, measurement data is first propagated from the measured metabolites to other metabolites. The propagation is facilitated by a flow analysis of metabolite fragments in the network. New linear constraints on the fluxes are then derived from the propagated data by applying techniques of linear algebra. Based on the results of the fragment flow analysis, we also present an experiment planning method that selects the sets of metabolites whose relative abundances of different labelling patterns are most useful for 13C metabolic flux analysis. Furthermore, we give computational tools for processing raw 13C labelling data produced by tandem mass spectrometry into a form suitable for 13C metabolic flux analysis.
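The steady-state balance constraints central to both approaches can be sketched numerically: for a stoichiometric matrix S, feasible flux vectors v satisfy S v = 0, and the dimension of the nullspace counts the degrees of freedom, such as alternative pathways, that balance constraints alone leave unresolved. The toy network below is an illustration, not one from the thesis.

```python
import numpy as np

# Stoichiometric matrix S: rows = internal metabolites, columns = reactions.
# Toy network with two alternative routes from A to B:
#   v1: -> A,  v2: A -> B,  v3: A -> B (alternative pathway),  v4: B ->
S = np.array([
    [1, -1, -1,  0],   # metabolite A: produced by v1, consumed by v2 and v3
    [0,  1,  1, -1],   # metabolite B: produced by v2 and v3, consumed by v4
])

# Nullspace dimension via SVD: number of flux degrees of freedom that the
# steady-state constraints S v = 0 cannot resolve on their own.
_, s, _ = np.linalg.svd(S.astype(float))
null_dim = S.shape[1] - int(np.sum(s > 1e-10))
print(null_dim)  # overall throughput and the v2/v3 split remain free
```

Here the balance constraints fix neither the total throughput nor how flux splits between v2 and v3; resolving such splits is exactly what the 13C labelling constraints add.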
Abstract:
Certain software products employing digital techniques for the encryption of data are subject to export controls in the EU Member States pursuant to Community law and the relevant laws of the Member States. These controls are agreed globally in the framework of the so-called Wassenaar Arrangement. Wassenaar is an informal non-proliferation regime aimed at promoting international stability and responsibility in transfers of strategic (dual-use) products and technology. This thesis covers the provisions of Wassenaar, Community export control laws and the export control laws of Finland, Sweden, Germany, France and the United Kingdom. The thesis consists of five chapters. The first chapter discusses the rationale of export control laws and the impact they have on global trade. The rationale is originally defence-related: in general, to prevent potential adversaries of the participating States from having the same tools, and in the particular case of cryptographic software, to enable signals intelligence efforts. As the use of cryptography in civilian contexts has mushroomed, export restrictions increasingly have negative effects on civilian trade; information security solutions may also be too weak because of export restrictions on cryptography. The second chapter covers the OECD's Cryptography Policy, which had a significant effect on its member nations' national cryptography policies and legislation. The OECD is a significant organization because it acts as a meeting forum for the most important industrialized nations. The third chapter covers the Wassenaar Arrangement. The Arrangement is covered from the viewpoint of international law and politics. The Wassenaar control list provisions affecting cryptographic software transfers are also covered in detail. Control lists in the EU and in the Member States are usually directly copied from the Wassenaar control lists. Controls agreed in its framework set only a minimum level for the participating States.
However, Wassenaar countries can adopt stricter controls. The fourth chapter covers Community export control law. Export controls are viewed in Community law as falling within the domain of the Common Commercial Policy pursuant to Article 133 of the EC Treaty. Therefore the Community has exclusive competence in export matters, save where a national measure is authorized by the Community or falls under the foreign or security policy derogations established in Community law. The Member States still have a considerable amount of power in the domain of the Common Foreign and Security Policy, and they are able to maintain national export controls because export control laws are not fully harmonized. This can have detrimental effects on the functioning of the internal market and on common export policies. In 1995 the EU adopted Dual-Use Regulation 3381/94/EC, which sets common rules for exports in the Member States. The provisions of this regulation receive detailed coverage in this chapter. The fifth chapter covers national legislation and export authorization practices in five Member States: Finland, Sweden, Germany, France and the United Kingdom. The export control laws of these Member States are covered where the national laws differ from the uniform approach of the Community's acquis communautaire. Keywords: export control, encryption, software, dual-use, license, foreign trade, e-commerce, Internet
Abstract:
One major reason for the global decline of biodiversity is habitat loss and fragmentation. Conservation areas can be designed to reduce biodiversity loss, but as resources are limited, conservation efforts need to be prioritized in order to achieve the best possible outcomes. The field of systematic conservation planning developed as a response to opportunistic approaches to conservation that often resulted in a biased representation of biological diversity. The last two decades have seen the development of increasingly sophisticated methods that account for information about biodiversity conservation goals (benefits), economic considerations (costs) and socio-political constraints. In this thesis I focus on two general topics related to systematic conservation planning. First, I address two aspects of the question of how biodiversity features should be valued. (i) I investigate the extremely important but often neglected issue of differential prioritization of species for conservation. Species prioritization can be based on various criteria and is always goal-dependent, but it can also be implemented in a scientifically more rigorous way than is the usual practice. (ii) I introduce a novel framework for conservation prioritization, which is based on continuous benefit functions that convert increasing levels of biodiversity feature representation into increasing conservation value, using the principle that more is better. Traditional target-based systematic conservation planning is a special case of this approach, in which a step function is used as the benefit function. We have further expanded the benefit function framework for area prioritization to address issues such as protected area size and habitat vulnerability. In the second part of the thesis I address the application of community-level modelling strategies to conservation prioritization.
One of the most serious issues in systematic conservation planning currently is not a deficiency of methodology for selection and design, but simply the lack of data. Community-level modelling offers a surrogate strategy that makes conservation planning more feasible in data-poor regions. We have reviewed the available community-level approaches to conservation planning, which range from simplistic classification techniques to sophisticated modelling and selection strategies. We have also developed a general and novel community-level approach to conservation prioritization that significantly improves on previously available methods. This thesis introduces further degrees of realism into conservation planning methodology. The benefit-function-based conservation prioritization framework largely circumvents the problematic phase of target setting and, by allowing trade-offs between species representation, provides a more flexible and hopefully more attractive approach for conservation practitioners. The community-level approach seems highly promising and should prove valuable for conservation planning, especially in data-poor regions. Future work should focus on integrating prioritization methods to deal with the multiple aspects that in combination influence the prioritization process, and on further testing and refining the community-level strategies using real, large datasets.
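The contrast between target-based and benefit-function-based prioritization described above can be sketched as follows; the function shapes and parameters are illustrative assumptions, not the thesis's actual formulations.

```python
# Target-based planning assigns full conservation value once a feature's
# representation reaches its target and none below it (a step function).
# A continuous, concave benefit function instead rewards every increment
# of representation with diminishing returns ("more is better").

def step_benefit(representation: float, target: float) -> float:
    """Traditional target-based planning: all-or-nothing at the target."""
    return 1.0 if representation >= target else 0.0

def concave_benefit(representation: float, half_saturation: float) -> float:
    """Continuous alternative: saturating value with diminishing returns."""
    return representation / (representation + half_saturation)

for r in (0.1, 0.3, 0.6):
    print(r, step_benefit(r, target=0.5),
          round(concave_benefit(r, half_saturation=0.2), 3))
```

Under the step function, raising representation from 0.1 to 0.3 earns nothing, whereas the concave function credits the improvement, which is what enables trade-offs between species.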
Abstract:
Viruses are submicroscopic, infectious agents that are obligate intracellular parasites. They adopt various strategies for their parasitic replication and proliferation in infected cells. The nucleic acid genome of a virus contains information that redirects the molecular machinery of the cell to the replication and production of new virions. Viruses that replicate in the cytoplasm and are unable to use the nuclear transcription machinery of the host cell have developed their own transcription and capping systems. This thesis describes the replication strategies of two distantly related viruses, hepatitis E virus (HEV) and Semliki Forest virus (SFV), which belong to the alphavirus-like superfamily of positive-strand RNA viruses. We have demonstrated that HEV and SFV share a unique cap formation pathway specific to the alphavirus-like superfamily. The capping enzyme first acts as a methyltransferase, catalyzing the transfer of a methyl group from S-adenosylmethionine to GTP to yield m7GTP. It then transfers the methylated guanosine to the end of the viral mRNA. Both reactions are virus-specific and differ from those described for the host cell. Therefore, these capping reactions offer attractive targets for the development of antiviral drugs. It has also been shown that the replication of SFV and HEV takes place in association with cellular membranes. The origin of these membranes and the intracellular localization of the components of the replication complex were studied by modern microscopy techniques. It was demonstrated that SFV replicates in cytoplasmic membranes derived from endosomes and lysosomes. According to our studies, the site of HEV replication seems to be the intermediate compartment, which mediates the traffic between the endoplasmic reticulum and the Golgi complex. As a result of this work, a unique mechanism of cap formation for the hepatitis E virus replicase has been characterized.
It represents a novel target for the development of specific inhibitors against viral replication.