13 results for analytical tools
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Lipoproteins are a heterogeneous population of blood plasma particles composed of apolipoproteins and lipids. Lipoproteins transport exogenous and endogenous triglycerides and cholesterol from sites of absorption and formation to sites of storage and usage. Three major classes of lipoproteins are distinguished according to their density: high-density (HDL), low-density (LDL) and very low-density lipoproteins (VLDL). While HDLs contain mainly apolipoproteins of lower molecular weight, the two other classes contain apolipoprotein B and apolipoprotein (a) together with triglycerides and cholesterol. HDL concentrations were found to be inversely related to coronary heart disease, and LDL/VLDL concentrations directly related. Although many studies have been published in this area, few have concentrated on the exact protein composition of lipoprotein particles. Lipoproteins were separated by density gradient ultracentrifugation into different subclasses. Native gel electrophoresis revealed different gel migration behaviour of the particles, with less dense particles having higher apparent hydrodynamic radii than denser particles. Apolipoprotein composition profiles were measured by matrix-assisted laser desorption/ionization mass spectrometry on a macromizer instrument equipped with the recently introduced cryodetector technology, and revealed differences in apolipoprotein composition between HDL subclasses. By combining these profiles with protein identifications from native and denaturing polyacrylamide gels by liquid chromatography-tandem mass spectrometry, we comprehensively characterized the exact protein composition of the different lipoprotein particles. We concluded that the differential display of protein weight information acquired by macromizer mass spectrometry is an excellent tool for revealing structural variations of different lipoprotein particles, and hence the foundation is laid for the screening of cardiovascular disease risk factors associated with lipoproteins.
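To make the "differential display" idea concrete, the following sketch compares two apolipoprotein mass profiles by matching peaks within a mass tolerance and reporting intensity differences. It is a minimal illustration only: the peak lists, masses, and tolerance are invented for the example and are not data from the study.

    # Minimal sketch of a differential display of two MALDI-MS protein-weight
    # profiles. All peak lists and the matching tolerance are hypothetical.
    TOLERANCE_DA = 2.0  # assumed mass-matching window

    def intensity_at(mass, profile):
        """Intensity of the closest peak within tolerance, else 0."""
        hits = [(abs(mass - m), i) for m, i in profile if abs(mass - m) <= TOLERANCE_DA]
        return min(hits)[1] if hits else 0.0

    def differential_display(profile_a, profile_b):
        """(mass, intensity_a, intensity_b) for every mass seen in either profile."""
        masses = sorted({m for m, _ in profile_a} | {m for m, _ in profile_b})
        return [(m, intensity_at(m, profile_a), intensity_at(m, profile_b)) for m in masses]

    # Invented HDL subclass profiles as (average mass in Da, relative intensity).
    hdl2 = [(28100, 1.00), (17400, 0.35), (11200, 0.10)]
    hdl3 = [(28100, 1.00), (17400, 0.15), (8900, 0.25)]
    for mass, a, b in differential_display(hdl2, hdl3):
        print(f"{mass:>6} Da  HDL2={a:.2f}  HDL3={b:.2f}  delta={a - b:+.2f}")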
Abstract:
Societies develop ways of making decisions regarding collective problems, thereby creating norms, rules, and institutions; this is what governance is about. In policy research, governance has become an important focus of attention; but debates show a lack of clarity at the conceptual level and a confusion between the use of the concept for prescriptive and analytical purposes. The present article is based on the hypothesis that using a clarified, non-normative governance perspective in policy research can contribute to an improved understanding of political processes, including formal and unrecognised ones, those embedded in larger and smaller social systems, as well as both vertical and horizontal political arrangements. The paper is the result of a collaborative engagement with the concept of governance within several networks, leading to the development of the Governance Analytical Framework (GAF). The GAF is a practical methodology for investigating governance processes, based on five analytical tools: problems, actors, social norms, processes, and nodal points. Besides describing the conceptual sources and analytical purpose of these five tools, the paper presents examples of how the GAF can be operationalised.
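As one possible way to operationalise the five GAF tools in software, the sketch below encodes them as a simple data structure; every class and field name here is an invented illustration, not part of the GAF as published.

    # Hypothetical encoding of the five GAF analytical tools (problems, actors,
    # social norms, processes, nodal points) for a governance case study.
    from dataclasses import dataclass, field

    @dataclass
    class NodalPoint:
        """An interface where problems, actors, norms and processes converge."""
        name: str
        actors: list = field(default_factory=list)
        norms: list = field(default_factory=list)

    @dataclass
    class GovernanceCase:
        problem: str
        actors: list
        social_norms: list
        processes: list
        nodal_points: list

    case = GovernanceCase(
        problem="allocation of scarce irrigation water",  # invented example
        actors=["farmers' association", "municipal council"],
        social_norms=["customary water-sharing rules"],
        processes=["seasonal negotiation rounds"],
        nodal_points=[NodalPoint("water users' assembly",
                                 actors=["farmers' association"],
                                 norms=["customary water-sharing rules"])],
    )
    print(case.problem, "->", [p.name for p in case.nodal_points])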
Abstract:
Metabolite identification and metabolite profiling are of major importance in the pharmaceutical and clinical context. However, highly polar and ionic substances are rarely included, as suitable analytical tools are missing. In this study, we present a new method for the determination of urinary sulfates, sulfonates, phosphates and other anions of strong acids. The method comprises a CE separation using an acidic BGE (pH
Abstract:
This in vitro study investigated the erosion-inhibiting properties of dental rinses during erosion in the presence of the salivary pellicle. The erosion inhibition by a Sn/F-containing dental rinse (800 ppm Sn²⁺, 500 ppm F⁻, pH = 4.5) was compared with a fluoridated solution (500 ppm F⁻, pH = 4.5) and water (control). Calcium release and enamel softening were significantly reduced among enamel samples exposed to the Sn/F rinse (group SF) compared to those treated with the fluoride solution (group F) and the control (p < 0.05). SEM showed slightly etched enamel interfaces in group SF, whereas the erosion was more pronounced in group F and even more severe in the control group. In conclusion, the Sn/F combination provided the best inhibition of erosion among the tested solutions. This study demonstrates the application of different analytical tools for comparative erosion quantification. A strong correlation (r² ≥ 0.783) was shown between calcium release and enamel softening during demineralization.
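For readers who want to reproduce this kind of correlation check, the sketch below computes a Pearson r² between paired calcium-release and softening measurements; the numbers are invented placeholders, not the study's data.

    # Pearson r^2 between two demineralization measures (invented values).
    calcium = [1.2, 2.8, 4.1, 5.9, 7.4]         # hypothetical calcium release
    softening = [14.0, 31.0, 47.0, 60.0, 82.0]  # hypothetical hardness loss

    n = len(calcium)
    mx, my = sum(calcium) / n, sum(softening) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(calcium, softening))
    var_x = sum((x - mx) ** 2 for x in calcium)
    var_y = sum((y - my) ** 2 for y in softening)
    r = cov / (var_x * var_y) ** 0.5
    print(f"r^2 = {r ** 2:.3f}")  # values >= 0.783 would match the reported strength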
Abstract:
The brain is a complex neural network with a hierarchical organization, and the mapping of its elements and connections is an important step towards the understanding of its function. Recent developments in diffusion-weighted imaging have provided the opportunity to reconstruct the whole-brain structural network in vivo at a large scale and to study the brain's structural substrate in a framework that is close to the current understanding of brain function. However, methods to construct the connectome are still under development and should be carefully evaluated. To this end, the first two studies included in my thesis aimed at improving the analytical tools specific to the methodology of brain structural networks. The first of these papers assessed the repeatability of the most common global and local network metrics used in the literature to characterize the connectome, while in the second paper the validity of further metrics based on the concept of communicability was evaluated. Communicability is a broader measure of connectivity which also accounts for parallel and indirect connections. These additional paths may be important for reorganizational mechanisms in the presence of lesions as well as for enhancing integration in the network. These studies showed good to excellent repeatability of global network metrics when the same methodological pipeline was applied, but more variability was detected when considering local network metrics or when using different thresholding strategies. In addition, communicability metrics were found to add insight into the integration properties of the network by detecting subsets of nodes that were highly interconnected or vulnerable to lesions. The other two studies used methods based on diffusion-weighted imaging to obtain knowledge concerning the relationship between functional and structural connectivity and the etiology of schizophrenia. The third study integrated functional oscillations measured using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) as well as diffusion-weighted imaging data. The multimodal approach revealed a positive relationship between individual fluctuations of the EEG alpha-frequency and diffusion properties of specific connections of two resting-state networks. Finally, in the fourth study diffusion-weighted imaging was used to probe for a relationship between the underlying white matter tissue structure and season of birth in schizophrenia patients. The results are in line with the neurodevelopmental hypothesis of early pathological mechanisms as the origin of schizophrenia. The different analytical approaches selected in these studies also provide arguments for discussing the current limitations in the analysis of brain structural networks. To sum up, the first two studies presented in this thesis illustrated the potential of brain structural network analysis to provide useful information on features of brain functional segregation and integration using reliable network metrics. In the other two studies alternative approaches were presented. The joint discussion of the four studies enabled us to highlight the benefits and possibilities of connectome analysis as well as some current limitations.
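Communicability, the metric evaluated in the second paper, has a standard graph-theoretic form: for an adjacency matrix A, the communicability between nodes p and q is the matrix-exponential entry (e^A)_pq, which sums walks of all lengths and therefore also credits parallel and indirect connections. A minimal sketch on a toy graph (not connectome data) using NetworkX:

    # Communicability vs. shortest path on a toy graph (not real connectome data).
    import networkx as nx

    G = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 4), (0, 4), (1, 3)])

    # nx.communicability returns (e^A)_pq for every node pair: a walk-based
    # connectivity measure that, unlike shortest-path metrics, also counts
    # parallel and indirect routes between the two nodes.
    comm = nx.communicability(G)
    print(f"communicability(0, 3) = {comm[0][3]:.3f}")
    print(f"shortest path length(0, 3) = {nx.shortest_path_length(G, 0, 3)}")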
Abstract:
Foot-and-mouth disease (FMD) is a highly contagious disease that caused several large outbreaks in Europe in the last century. The last important outbreak in Switzerland took place in 1965/66 and affected more than 900 premises; more than 50,000 animals were slaughtered. Large-scale emergency vaccination of the cattle and pig population was applied to control the epidemic. In recent years, many studies have used infectious disease models to assess the impact of different disease control measures, including models developed for diseases exotic to the specific region of interest. Often, the absence of real outbreak data makes a validation of such models impossible. This study aimed to evaluate whether a spatial, stochastic simulation model (the Davis Animal Disease Simulation model) can predict the course of a Swiss FMD epidemic based on the available historic input data on population structure, contact rates, epidemiology of the virus, and quality of the vaccine. In addition, the potential outcome of the 1965/66 FMD epidemic without application of vaccination was investigated. Comparing the model outcomes to reality, only the largest 10% of the simulated outbreaks approximated the number of animals culled; moreover, the simulation model greatly overestimated the number of culled premises. While the model could not reproduce the duration of the 1965/66 epidemic well, it was able to accurately estimate the size of the area infected. Without application of vaccination, the model predicted a much higher mean number of culled animals than with vaccination, demonstrating that vaccination was likely crucial in controlling the Swiss FMD outbreak of 1965/66. The study demonstrated the feasibility of analyzing historical outbreak data with modern analytical tools. However, it also confirmed that epidemics predicted by even a most carefully parameterized model cannot integrate all eventualities of a real epidemic. Therefore, decision makers need to be aware that infectious disease models are useful tools to support the decision-making process, but their results are not as valuable as real observations and should always be interpreted with caution.
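To illustrate the general class of model involved (and only that; this is not the Davis Animal Disease Simulation model, and all parameter values are invented), a deliberately minimal stochastic chain-binomial simulation of between-premises spread might look like this:

    # Minimal stochastic SIR-type chain-binomial sketch of between-premises
    # spread. Illustrative only; all parameters are invented.
    import random

    random.seed(1)
    N = 1000        # number of premises
    beta = 0.3      # assumed daily transmission parameter
    gamma = 0.1     # assumed daily removal (culling) probability

    s, i, r, day = N - 1, 1, 0, 0
    while i > 0:
        p_inf = 1 - (1 - beta / N) ** i   # per-premises daily infection prob.
        new_inf = sum(random.random() < p_inf for _ in range(s))
        new_rem = sum(random.random() < gamma for _ in range(i))
        s, i, r = s - new_inf, i + new_inf - new_rem, r + new_rem
        day += 1
    print(f"simulated outbreak: {day} days, {r} premises affected")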
Abstract:
Global transcriptomic and proteomic profiling platforms have yielded important insights into the complex response to ionizing radiation (IR). Nonetheless, little is known about the ways in which small cellular metabolite concentrations change in response to IR. Here, a metabolomics approach using ultraperformance liquid chromatography coupled with electrospray time-of-flight mass spectrometry was used to profile, over time, the hydrophilic metabolome of TK6 cells exposed to IR doses ranging from 0.5 to 8.0 Gy. Multivariate data analysis of the positive ions revealed dose- and time-dependent clustering of the irradiated cells and identified certain constituents of the water-soluble metabolome as being significantly depleted as early as 1 h after IR. Tandem mass spectrometry was used to confirm metabolite identity. Many of the depleted metabolites are associated with oxidative stress and DNA repair pathways. Included are reduced glutathione, adenosine monophosphate, nicotinamide adenine dinucleotide, and spermine. Similar measurements were performed with a transformed fibroblast cell line, BJ, and it was found that a subset of the identified TK6 metabolites were effective in IR dose discrimination. The GEDI (Gene Expression Dynamics Inspector) algorithm, which is based on self-organizing maps, was used to visualize dynamic global changes in the TK6 metabolome that resulted from IR. It revealed dose-dependent clustering of ions sharing the same trends in concentration change across radiation doses. "Radiation metabolomics," the application of metabolomic analysis to the field of radiobiology, promises to increase our understanding of cellular responses to stressors such as radiation.
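As a sketch of the multivariate step (not the study's actual pipeline or data), the example below runs a principal component analysis on a small synthetic dose-by-metabolite intensity matrix to show how dose-dependent clustering can emerge:

    # PCA on a synthetic metabolite-intensity matrix; the dose-dependent
    # depletion imposed on the first ten ions is invented for illustration.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    doses = [0.0, 0.5, 2.0, 8.0]  # Gy, spanning the study's dose range
    X, labels = [], []
    for d in doses:
        block = rng.normal(100.0, 5.0, size=(3, 50))  # 3 replicates x 50 ions
        block[:, :10] -= 8.0 * d                      # invented depletion effect
        X.append(block)
        labels += [d] * 3
    X = np.vstack(X)

    scores = PCA(n_components=2).fit_transform(X)
    for d, (pc1, pc2) in zip(labels, scores):
        print(f"dose {d:>4} Gy -> PC1={pc1:8.2f}  PC2={pc2:7.2f}")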
Abstract:
Experts working on behalf of international development organisations need better tools to assist land managers in developing countries in maintaining their livelihoods, as climate change puts pressure on the ecosystem services that they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine the likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.
Abstract:
Oligonucleotides comprising unnatural building blocks, which interfere with the translation machinery, have gained increased attention for the treatment of gene-related diseases (e.g. antisense, RNAi). Owing to structural modifications, synthetic oligonucleotides exhibit increased biostability and bioavailability upon administration. Consequently, classical enzyme-based sequencing methods are not applicable to their sequence elucidation and verification. Tandem mass spectrometry is the method of choice for performing such tasks, since gas-phase dissociation is not restricted to natural nucleic acids. However, tandem mass spectrometric analysis can generate product ion spectra of tremendous complexity, as the number of possible fragments grows rapidly with increasing sequence length. The fact that structural modifications affect the dissociation pathways greatly increases the variety of analytically valuable fragment ions. The gas-phase dissociation of oligonucleotides is characterized by the cleavage of one of the four bonds along the phosphodiester chain, by the accompanying loss of nucleobases, and by the generation of internal fragments due to secondary backbone cleavage. For example, an 18-mer oligonucleotide yields a total of 272,920 theoretical fragment ions. In contrast to the processing of peptide product ion spectra, which nowadays is highly automated, there is a lack of tools assisting the interpretation of oligonucleotide data. The existing web-based and stand-alone software applications are primarily designed for the sequence analysis of natural nucleic acids and do not account for chemical modifications and adducts. Consequently, we developed a software tool to support the interpretation of mass spectrometric data of natural and modified nucleic acids and their adducts with chemotherapeutic agents.
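The combinatorial growth mentioned above can be made tangible with a back-of-envelope count using the common a/b/c/d and w/x/y/z backbone-cleavage nomenclature; the simplified sketch below deliberately ignores base losses, charge states and adducts, so it does not reproduce the 272,920 figure, which includes such variants.

    # Back-of-envelope count of theoretical backbone fragments for an n-mer
    # oligonucleotide (a/b/c/d and w/x/y/z ions); base losses, charge states
    # and adducts are deliberately ignored in this simplified sketch.
    def fragment_counts(n):
        linkages = n - 1
        terminal = 8 * linkages  # 4 bond types x 2 fragment ends per linkage
        # Internal fragments arise from two cleavages at distinct linkages,
        # with 4 x 4 bond-type combinations per linkage pair.
        internal = 16 * linkages * (linkages - 1) // 2
        return terminal, internal

    for n in (6, 12, 18):
        terminal, internal = fragment_counts(n)
        print(f"{n}-mer: {terminal} terminal + {internal} internal fragments")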
Abstract:
This chapter summarises the metabolomic strategies currently used in plant science and describes the corresponding methods. The metabolite profiling and fingerprinting of plant tissues through MS- and/or NMR-based approaches, and the subsequent identification of biomarkers, are detailed. Strategies for the microisolation and de novo identification of unknown biomarkers are also discussed. The various approaches are illustrated by a metabolomic study of the maize response to herbivory. A review of recent metabolomic studies performed on seed and crop plant tissues involving various analytical strategies is provided.
Abstract:
Because land degradation is intrinsically complex and involves decisions by many agencies and individuals, land degradation mapping should be used as a learning tool through which managers, experts and stakeholders can re-examine their views within a wider semantic context. In this paper, we introduce an analytical framework for mapping land degradation, developed by the World Overview of Conservation Approaches and Technologies (WOCAT) programme, which aims to produce thematic maps that serve as a useful tool and include effective information on land degradation and conservation status. This methodology would thus provide an important basis for decision-making to launch rehabilitation/remediation actions in high-priority intervention areas. As land degradation mapping is a problem-solving task that aims to provide clear information, this study entails the implementation of the WOCAT mapping tool, which integrates a set of indicators to appraise the severity of land degradation across a representative watershed. This work therefore focuses on the use of the most relevant indicators for measuring the impacts of different degradation processes in the El Mkhachbiya catchment, situated in northwestern Tunisia, and on the actions taken to deal with them, based on an analysis of the operating modes and issues of degradation in different land use systems. The study aims to provide a database for the surveillance and monitoring of land degradation, in order to support stakeholders in making appropriate choices and in judging guidelines and suitable recommendations to remedy the situation and promote sustainable development. The approach is illustrated through a case study of an urban watershed in northwestern Tunisia. Results showed that the main land degradation drivers in the study area were related to natural processes, which were exacerbated by human activities. The output of this analytical framework thus enabled better communication of land degradation issues and concerns in a way that is relevant for policymakers.
Abstract:
Environmental quality monitoring of water resources is challenged with providing the basis for safeguarding the environment against adverse biological effects of anthropogenic chemical contamination from diffuse and point sources. While current regulatory efforts focus on monitoring and assessing a few legacy chemicals, many more anthropogenic chemicals can be detected simultaneously in our aquatic resources. However, exposure to chemical mixtures does not necessarily translate into adverse biological effects, nor does it clearly show whether mitigation measures are needed. Thus, the question of which mixtures are present, and which have associated combined effects, becomes central for defining adequate monitoring and assessment strategies. Here we describe the vision of the international, EU-funded project SOLUTIONS, in which three routes are explored to link the occurrence of chemical mixtures at specific sites to the assessment of adverse biological combination effects. First, multi-residue target and non-target screening techniques covering a broader range of anticipated chemicals co-occurring in the environment are being developed. By improving sensitivity and detection limits for known bioactive compounds of concern, new analytical chemistry data for multiple components can be obtained and used to characterise priority mixtures. This information on chemical occurrence will be used to predict mixture toxicity and to derive combined effect estimates suitable for advancing environmental quality standards. Secondly, bioanalytical tools will be explored to provide aggregate bioactivity measures integrating all components that produce common (adverse) outcomes, even for mixtures of varying composition. The ambition is to provide comprehensive arrays of effect-based tools and trait-based field observations that link multiple chemical exposures to various environmental protection goals more directly, and to provide improved in situ observations for the impact assessment of mixtures. Thirdly, effect-directed analysis (EDA) will be applied to identify the major drivers of mixture toxicity. Refinements of EDA include the use of statistical approaches with monitoring information to guide experimental EDA studies. These three approaches will be explored in case studies in the Danube and Rhine river basins as well as in rivers of the Iberian Peninsula. The synthesis of findings will be organised to provide guidance for future solution-oriented environmental monitoring and to explore more systematic ways of assessing mixture exposures and combination effects in future water quality monitoring.
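The abstract does not specify how SOLUTIONS derives its combined effect estimates, but a widely used default for predicting mixture toxicity from occurrence data is concentration addition, sketched below with invented compound names, concentrations and EC50 values:

    # Concentration addition (CA) sketch: each component's toxic unit (TU) is
    # its concentration divided by its EC50; the TU sum approximates mixture
    # potency. All names and values are invented for the example.
    components = {
        "compound_A": (0.8, 12.0),  # (concentration, EC50), same units
        "compound_B": (3.5, 40.0),
        "compound_C": (0.1, 0.9),
    }

    toxic_units = {name: c / ec50 for name, (c, ec50) in components.items()}
    tu_sum = sum(toxic_units.values())
    for name, tu in sorted(toxic_units.items(), key=lambda kv: -kv[1]):
        print(f"{name}: TU = {tu:.3f} ({100 * tu / tu_sum:.0f}% of predicted effect)")
    print(f"sum of toxic units = {tu_sum:.3f} (a sum near 1 flags effects at EC50 level)")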
Abstract:
We describe the case of a patient with a T-lymphoblastic lymphoma whose disseminated mucormycosis was diagnosed with delay; we address the diagnostic and therapeutic decision-making process and review the diagnostic workup of patients with potential invasive fungal disease (IFD). The diagnosis was delayed despite a suggestive radiological presentation of the patient's pulmonary lesion. The uncommon risk profile (T-lymphoblastic lymphoma, short neutropenic phases) wrongly led to a low level of suspicion. The diagnosis was also hampered by the lack of indirect markers for infections caused by Mucorales, the low sensitivity of both fungal culture and panfungal PCR, and the limited availability of species-specific PCR. A high level of suspicion of IFD is needed, and aggressive diagnostic procedures should be promptly initiated even in apparently low-risk patients with uncommon presentations. The extent of the analytical workup should be decided on a case-by-case basis. Diagnostic tests such as the galactomannan and β-D-glucan tests and/or PCR on biological material followed by sequencing should be chosen according to their availability and after evaluation of their specificity and sensitivity. In high-risk patients, preemptive therapy with a broad-spectrum mould-active antifungal agent should be started before definitive diagnostic findings become available.