Abstract:
In this thesis, acceleration of energetic particles at collisionless shock waves in space plasmas is studied using numerical simulations, with an emphasis on physical conditions applicable to the solar corona. The thesis consists of four research articles and an introductory part that summarises the main findings reached in the articles and discusses them with respect to the theory of diffusive shock acceleration and to observations. This thesis gives a brief review of the observational properties of solar energetic particles and discusses a few open questions that are currently under active research. For example, in a few large gradual solar energetic particle events the heavy ion abundance ratios and average charge states show characteristics at high energies that are typically associated with flare-accelerated particles, i.e. impulsive events. The role of flare-accelerated particles in these and other gradual events has been widely discussed in the scientific community, and it has been questioned whether and how the observed features can be explained in terms of diffusive shock acceleration at shock waves driven by coronal mass ejections. The most extreme solar energetic particle events are the so-called ground level enhancements, where particles receive energies so high that they can penetrate all the way through Earth's atmosphere and increase radiation levels at the surface. It is not known what conditions are required for acceleration to GeV/nuc energies, and the presence of both very fast coronal mass ejections and X-class solar flares makes it difficult to determine the roles of these two accelerators in ground level enhancements. The theory of diffusive shock acceleration is reviewed and its predictions are discussed with respect to the observed particle characteristics. We discuss how shock waves can be modeled and describe in detail the numerical model developed by the author. The main part of this thesis consists of the four scientific articles that are based on results of the numerical shock acceleration model developed by the author. The novel feature of this model is that it can handle complex magnetic geometries, which are found, for example, near active regions in the solar corona. We show that, according to our simulations, diffusive shock acceleration can explain the observed variations in abundance ratios and average charge states, provided that suitable seed particles and magnetic geometry are available for the acceleration process in the solar corona. We also derive an injection threshold for diffusive shock acceleration that agrees very well with our simulation results and is valid under weakly turbulent conditions. Finally, we show that diffusive shock acceleration can produce GeV/nuc energies under suitable coronal conditions, which include the presence of energetic seed particles, a favourable magnetic geometry, and an enhanced level of ambient turbulence.
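For context, the standard steady-state, test-particle result of diffusive shock acceleration (a textbook prediction of the theory reviewed here, not a formula quoted from the abstract) ties the accelerated-particle momentum spectrum to the gas compression ratio r of the shock:

\[
f(p) \;\propto\; p^{-q}, \qquad q = \frac{3r}{r-1},
\]

so that a strong shock with r approaching 4 yields q approaching 4; departures from this simple power law under realistic coronal geometries, seed populations and turbulence levels are what the author's numerical model is designed to explore.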
Abstract:
The use of animals in scientific experiments tends to arouse strong emotional reactions among the general public, the most essential concern being the pain and suffering inflicted on the animals. It is felt that suffering inflicted on other beings, including animals, is not morally acceptable. Is the work of a researcher who uses animals morally acceptable and beneficial for humans and animals? May such a researcher decide for him/herself what animal experiments to perform, or should some outsider have the right to decide what kind of experiments a researcher can or cannot perform? The research material comprises the legislation of Finland and that of some member and non-member states of the European Union, together with European Union directives and the pertinent preparatory parliamentary documents. The author has likewise studied the vast literature on animal rights, both pro and contra writings and opinions. The opinions of philosophers on the moral and legal rights of animals are markedly conflicting. Some strongly support the existence of such rights, while others reject the notion entirely, claiming that the question is only one of the moral principles of man himself, which imply that animals must be treated in a humane manner. Speaking of animal rights tends only to muddle ideas, in philosophical considerations on the one hand and in legal analyses on the other. The development of legislation in Finland and some other member states of the European Union has in principle been similar. In Finland, the positive laws on animal experiments nowadays comply with EU directive 86/609/EEC. However, there are marked differences between member states in the way they have in practice implemented the principles of the directive. No essential changes in the actual performance of animal experiments have been discernible during the decades over which legislation has been developed in the different countries. Self-regulation within the scientific community has been markedly more effective than legislative procedures. Legal regulation has nevertheless clearly influenced the quality of breeding and the living conditions of laboratory animals, cages, for example, now being larger than before. The EU Parliament and Council adopted a new directive on animal experiments in September 2010, which must be implemented in national legislation by January 1, 2013.
Abstract:
The purpose of this study was to deepen the understanding of market segmentation theory by studying the evolution of the concept and by identifying the antecedents and consequences of the theory. The research method was influenced by content analysis and meta-analysis. The evolution of market segmentation theory was studied as a reflection of the evolution of marketing theory. According to this study, the theory of market segmentation has its roots in microeconomics and has been influenced by different disciplines, such as motivation research and buyer behaviour theory. Furthermore, this study suggests that the evolution of market segmentation theory can be divided into four major eras: the era of foundations, development and blossoming, stillness and stagnation, and the era of re-emergence. Market segmentation theory emerged in the mid-1950s and flourished between the mid-1950s and the late 1970s. During the 1980s the theory lost the interest of the scientific community and no significant contributions were made. Now, towards the dawn of the new millennium, new approaches have emerged and market segmentation has gained renewed attention.
Abstract:
Gene expression is one of the most critical factors influencing the phenotype of a cell. As a result of several technological advances, measuring gene expression levels has become one of the most common molecular biological measurements used to study the behaviour of cells. The scientific community has produced an enormous and constantly increasing collection of gene expression data from various human cells, both from healthy and pathological conditions. However, while each of these studies is informative and enlightening in its own context and research setup, diverging methods and terminologies make it very challenging to integrate existing gene expression data into a more comprehensive view of human transcriptome function. On the other hand, bioinformatic science advances only through data integration and synthesis. The aim of this study was to develop biological and mathematical methods to overcome these challenges, to construct an integrated database of the human transcriptome, and to demonstrate its usage. The methods developed in this study can be divided into two distinct parts. First, the biological and medical annotation of the existing gene expression measurements needed to be encoded with systematic vocabularies. No single existing biomedical ontology or vocabulary was suitable for this purpose, so new annotation terminology was developed as a part of this work. The second part was to develop mathematical methods for correcting the noise and systematic differences/errors in the data caused by the various array generations. Additionally, there was a need to develop suitable computational methods for sample collection and archiving, unique sample identification, database structures, data retrieval and visualization. Bioinformatic methods were developed to analyse gene expression levels and putative functional associations of human genes using the integrated gene expression data. A method to interpret individual gene expression profiles across all the healthy and pathological tissues of the reference database was also developed. As a result of this work, 9783 human gene expression samples measured with Affymetrix microarrays were integrated to form a unique human transcriptome resource, GeneSapiens. This makes it possible to analyse the expression levels of 17330 genes across 175 types of healthy and pathological human tissues. Applying this resource to interpret individual gene expression measurements allowed identification of the tissue of origin with 92.0% accuracy among 44 healthy tissue types. A systematic analysis of the transcriptional activity levels of 459 kinase genes was performed across 44 healthy and 55 pathological tissue types, and a genome-wide analysis of kinase gene co-expression networks was carried out. This analysis revealed biologically and medically interesting data on putative kinase gene functions in health and disease. Finally, we developed a method for alignment of gene expression profiles (AGEP) to analyse individual patient samples and pinpoint gene- and pathway-specific changes in the test sample relative to the reference transcriptome database. We also showed how large-scale gene expression data resources can be used to quantitatively characterize changes in the transcriptomic program of differentiating stem cells. Taken together, these studies demonstrate the power of systematic bioinformatic analyses to infer biological and medical insights from existing published datasets as well as to facilitate the interpretation of new molecular profiling data from individual patients.
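The abstract does not specify the classifier behind the 92.0% tissue-of-origin figure; the minimal nearest-centroid sketch below only illustrates the general idea of matching a new expression profile against per-tissue reference profiles (the function name, the Pearson-correlation scoring and the toy data are assumptions, not the actual GeneSapiens/AGEP implementation):

```python
import numpy as np

def tissue_of_origin(sample, reference_profiles):
    """Assign a sample to the reference tissue whose mean expression
    profile it correlates with most strongly (Pearson correlation).

    sample             : 1-D array of expression values, one per gene
    reference_profiles : dict mapping tissue name -> 1-D array (same gene order)
    """
    best_tissue, best_r = None, -np.inf
    for tissue, centroid in reference_profiles.items():
        # Pearson correlation between the sample and the tissue centroid
        r = np.corrcoef(sample, centroid)[0, 1]
        if r > best_r:
            best_tissue, best_r = tissue, r
    return best_tissue, best_r

# Toy usage with made-up numbers (three genes, two tissues)
reference = {
    "liver":  np.array([9.1, 2.0, 4.5]),
    "kidney": np.array([3.2, 7.8, 5.1]),
}
print(tissue_of_origin(np.array([8.7, 2.4, 4.9]), reference))  # -> ('liver', r)
```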
Abstract:
‘Madras triple helix’ was the name assigned by the scientific community in the West to the molecular model proposed for the fibrous protein collagen by G N Ramachandran’s group at the University of Madras. As mentioned jocularly in a recent retrospective of this work by Sasisekharan and Yathindra [1], the term was possibly coined due to the difficulty Western scientists had in pronouncing the Indian names of Ramachandran and his associates. The unravelling of the precise nature of the collagen structure indeed makes for a fascinating story and, as succinctly put by Dickerson [2]: “... to trace the evolution of the structure of collagen is to trace the evolution of fibrous protein crystallography in miniature”. This article is a brief review highlighting the pioneering contributions made by G N Ramachandran in elucidating the correct structure of this important molecule, and is a sincere tribute by the author to her mentor, doctoral thesis supervisor and major source of inspiration for embarking on a career in biophysics.
Abstract:
With the immense growth in the number of available protein structures, fast and accurate structure comparison has become essential. We propose an efficient method for structure comparison based on a structural alphabet. Protein Blocks (PBs) is a widely used structural alphabet with 16 pentapeptide conformations that can fairly approximate a complete protein chain. A 3D structure can thus be translated into a 1D sequence of PBs. With a simple Needleman-Wunsch approach and a raw PB substitution matrix, PB-based structural alignments were already better than many popular methods. The iPBA web server presents an improved alignment approach using (i) specialized PB substitution matrices (SM) and (ii) an anchor-based alignment methodology. With these developments, the quality of ∼88% of alignments was improved. iPBA alignments were also better than DALI, MUSTANG and GANGSTA+ in >80% of the cases. The web server is designed for both pairwise comparisons and database searches. Outputs are given as sequence alignments and superposed 3D structures displayed using PyMol and Jmol. A local alignment option for detecting sub-structural similarity is also embedded. As a fast and efficient 'sequence-based' structure comparison tool, we believe that it will be quite useful to the scientific community. iPBA can be accessed at http://www.dsimb.inserm.fr/dsimb_tools/ipba/.
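Because the abstract reduces structure comparison to aligning 1D strings over the 16-letter PB alphabet, the core step can be sketched with a plain Needleman-Wunsch global alignment (a minimal sketch: the flat match/mismatch/gap scores stand in for the specialized PB substitution matrices and anchor-based refinements actually used by iPBA):

```python
def needleman_wunsch(seq1, seq2, match=2, mismatch=-1, gap=-2):
    """Global alignment of two Protein Block (PB) strings with a flat
    scoring scheme; returns (score, aligned_seq1, aligned_seq2)."""
    n, m = len(seq1), len(seq2)
    # DP matrix of scores, initialised with gap penalties along the borders
    F = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        F[i][0] = i * gap
    for j in range(1, m + 1):
        F[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if seq1[i - 1] == seq2[j - 1] else mismatch
            F[i][j] = max(F[i - 1][j - 1] + s,   # align both letters
                          F[i - 1][j] + gap,     # gap in seq2
                          F[i][j - 1] + gap)     # gap in seq1
    # Traceback from the bottom-right corner
    a1, a2, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and F[i][j] == F[i - 1][j - 1] + (
                match if seq1[i - 1] == seq2[j - 1] else mismatch):
            a1.append(seq1[i - 1]); a2.append(seq2[j - 1]); i -= 1; j -= 1
        elif i > 0 and F[i][j] == F[i - 1][j] + gap:
            a1.append(seq1[i - 1]); a2.append('-'); i -= 1
        else:
            a1.append('-'); a2.append(seq2[j - 1]); j -= 1
    return F[n][m], ''.join(reversed(a1)), ''.join(reversed(a2))

# Toy PB strings (letters a-p denote the 16 pentapeptide conformations)
print(needleman_wunsch("mmmmnopacd", "mmmnopgacd"))
```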
Abstract:
Roctest Group believes in the importance of maintaining close contact with the scientific community active in fields close to our domains of activity, in particular smart structures, structural engineering, sensing and fiber optic sensors. These contacts allow Roctest SMARTEC Telemac to remain at the forefront of scientific progress and to contribute to the diffusion of the monitoring culture worldwide. Our research and development team contributes actively to the research community, attending conferences and regularly publishing in the scientific literature. We support academic research by participating in joint research projects and by regularly welcoming graduate and undergraduate students for internships and exchange programs.
Abstract:
About a third of the human population is estimated to be infected with Mycobacterium tuberculosis. The emergence of drug resistant strains and the protracted treatment strategies have compelled the scientific community to identify newer drug targets and to develop newer vaccines. In the host macrophages, the bacterium survives within an environment rich in reactive nitrogen and oxygen species capable of damaging its genome. Therefore, for its successful persistence in the host, the pathogen must possess robust DNA repair mechanisms. Analysis of the M. tuberculosis genome sequence revealed that it lacks the mismatch repair pathway, suggesting a greater role for other DNA repair pathways such as the nucleotide excision repair and base excision repair pathways. In this article, we summarize the outcome of research involving these two repair pathways in mycobacteria, focusing primarily on our own efforts. Our findings, using the Mycobacterium smegmatis model, suggest that deficiency of various DNA repair functions, singly or in combination, severely compromises DNA repair capacity and attenuates growth under conditions typically encountered in macrophages. (C) 2011 Elsevier Ireland Ltd. All rights reserved.
Abstract:
A computational pipeline, PocketAnnotate, for functional annotation of proteins at the level of binding sites has been proposed in this study. The pipeline integrates three in-house algorithms for site-based function annotation: PocketDepth, for prediction of binding sites in protein structures; PocketMatch, for rapid comparison of binding sites; and PocketAlign, to obtain a detailed alignment between a pair of binding sites. A novel scheme has been developed to rapidly generate a database of non-redundant binding sites. For a given input protein structure, putative ligand-binding sites are identified, matched in real time against the database, and the query substructure is aligned with the promising hits to obtain a set of possible ligands that the given protein could bind. The input can be either whole protein structures or merely the substructures corresponding to possible binding sites. Structure-based function annotation at the level of binding sites thus achieved could prove very useful in cases where no obvious functional inference can be obtained from purely sequence- or fold-level analyses. An attempt has also been made to analyse proteins of no known function from the Protein Data Bank. PocketAnnotate would be a valuable tool for the scientific community and contribute towards structure-based functional inference. The web server can be freely accessed at http://proline.biochem.iisc.ernet.in/pocketannotate/.
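The pipeline is described only at a high level; the sketch below merely illustrates the general pattern of matching a query binding site against a database of known sites and ranking the hits. Everything in it (the BindingSite class, the residue-composition similarity, the toy database) is a hypothetical illustration, not the PocketDepth/PocketMatch/PocketAlign algorithms or the PocketAnnotate API:

```python
from dataclasses import dataclass
from math import sqrt

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

@dataclass
class BindingSite:
    name: str        # e.g. a PDB id plus the bound ligand
    residues: str    # one-letter codes of residues lining the pocket

def composition(site: BindingSite) -> list:
    """Normalised amino-acid composition vector of a pocket."""
    counts = [site.residues.count(aa) for aa in AMINO_ACIDS]
    total = sum(counts) or 1
    return [c / total for c in counts]

def cosine(u, v) -> float:
    """Cosine similarity between two composition vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def annotate(query: BindingSite, database: list, top_n: int = 3):
    """Rank database sites by similarity to the query pocket and return
    the best hits, i.e. ligands the query protein might plausibly bind."""
    scored = [(cosine(composition(query), composition(s)), s.name) for s in database]
    return sorted(scored, reverse=True)[:top_n]

# Toy database of pockets and a query pocket (residue strings are made up)
db = [BindingSite("1abc_ATP", "GKTGSGKST"), BindingSite("2xyz_HEM", "FHLYWFHC")]
print(annotate(BindingSite("query", "GKSGTGKTT"), db))
```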
Abstract:
The sensing of carbon dioxide (CO2) at room temperature, which has potential applications in environmental monitoring, healthcare, mining, biotechnology, the food industry, etc., is a challenge for the scientific community due to the relative inertness of CO2. Here, we propose a novel gas sensor based on a clad-etched Fiber Bragg Grating (FBG) with a polyallylamine-amino-carbon nanotube coating on the surface of the core for detecting CO2 gas concentrations at room temperature, at ppm levels over a wide range (1000-4000 ppm). The limit of detection of the polyallylamine-amino-carbon nanotube coated core FBG was found to be about 75 ppm. In this approach, when CO2 gas molecules interact with the polyallylamine-amino-carbon nanotube coated FBG, the effective refractive index of the fiber core changes, resulting in a shift of the Bragg wavelength. The experimental data show a linear response of the Bragg wavelength shift to increasing CO2 gas concentration. Besides being reproducible and repeatable, the technique is fast, compact, and highly sensitive. (C) 2013 AIP Publishing LLC.
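The sensing mechanism described above follows directly from the standard Bragg condition for an FBG (a textbook relation, not a formula quoted from the article):

\[
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda
\qquad\Rightarrow\qquad
\Delta\lambda_B \approx 2\, \Lambda\, \Delta n_{\mathrm{eff}} ,
\]

where Lambda is the grating period and n_eff the effective refractive index of the core mode; adsorption of CO2 on the coating perturbs n_eff, which shifts the reflected wavelength lambda_B (assuming the grating period itself is essentially unchanged).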
Abstract:
Detection of explosives, especially trinitrotoluene (TNT), is of utmost importance due to its highly explosive nature and environmental hazard. Therefore, the detection of TNT has been a matter of great concern to the scientific community worldwide. Herein, a new aggregation-induced phosphorescent emission (AIPE)-active iridium(III) bis(2-(2,4-difluorophenyl)pyridinato-NC2') (2-(2-pyridyl)benzimidazolato-N,N') complex [FIrPyBiz] has been developed and serves as a molecular probe for the detection of TNT in the vapor phase, solid phase, and aqueous media. In addition, phosphorescent test strips have been constructed by impregnating Whatman filter paper with aggregates of FIrPyBiz for trace detection of TNT in contact mode, with detection limits in nanograms, by taking advantage of the excited-state interaction of the AIPE-active phosphorescent iridium(III) complex with TNT and the associated photophysical properties.
Missing (in-situ) snow cover data hampers climate change and runoff studies in the Greater Himalayas
Abstract:
The Himalayas presently hold the largest ice masses outside the polar regions and thus (temporarily) store important freshwater resources. In contrast to the attention given to glaciers, the role of runoff from snow cover has received comparably little attention in the past, although (i) its contribution is thought to be at least as important as, or even more important than, that of ice melt in many Himalayan catchments, and (ii) climate change is expected to have widespread and significant consequences for snowmelt runoff. Here, we show that change assessment of snowmelt runoff and its timing is not as straightforward as often postulated, mainly because larger partial pressures of H2O, CO2, CH4, and other greenhouse gases might increase the net long-wave input for snowmelt quite significantly in a future atmosphere. In addition, changes in the short-wave energy balance, such as pollution of the snow cover by black carbon, or in the sensible and latent heat contributions to snowmelt are likely to alter future snowmelt and runoff characteristics as well. For the assessment of snow cover extent and depletion, but also for its monitoring over the extremely large areas of the Himalayas, remote sensing has been used in the past and is likely to become even more important in the future. However, for the calibration and validation of remotely-sensed data, and even more so in light of possible changes in the snow-cover energy balance, we strongly call for more in-situ measurements across the Himalayas, in particular daily data on new snow and snow-cover water equivalent, or the respective energy balance components. Moreover, the data should be made accessible to the scientific community, so that climate change impacts on Himalayan snow cover, and possible consequences thereof for runoff, can be estimated more accurately. (C) 2013 Elsevier B.V. All rights reserved.
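The energy-balance terms alluded to above can be made explicit with the standard snowpack energy balance (a textbook formulation, not an equation taken from the article):

\[
Q_M = SW_{\downarrow}\,(1-\alpha) \;+\; LW_{\downarrow} - LW_{\uparrow} \;+\; Q_H + Q_E + Q_G + Q_R ,
\]

where Q_M is the energy available for melt, alpha the snow albedo (lowered, for example, by black carbon deposition), LW_down the incoming long-wave radiation (raised by higher greenhouse-gas partial pressures), and Q_H, Q_E, Q_G and Q_R the sensible, latent, ground and rain heat fluxes.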
Abstract:
Streptococcus pneumoniae causes pneumonia, septicemia and meningitis. S. pneumoniae is responsible for significant mortality both in children and in the elderly. In recent years, the number of whole-genome sequences available for various S. pneumoniae strains has increased manifold, and there is an urgent need to provide organism-specific annotations to the scientific community. This prompted us to develop the Streptococcus pneumoniae Genome Database (SPGDB) to integrate and analyze the completely sequenced and available S. pneumoniae genome sequences. Further, links to several tools are provided to compare the pool of gene and protein sequences, and protein structures, across different strains of S. pneumoniae. SPGDB aids in the analysis of phenotypic variations as well as in extensive genomic and evolutionary studies with reference to S. pneumoniae. (C) 2014 Elsevier Inc. All rights reserved.
Abstract:
Network theory has become a method of choice through which biological data are smoothly integrated to gain insights into complex biological problems. Understanding protein structure, folding, and function is an important problem that is being extensively investigated using the network approach. Since the sequence uniquely determines the structure, this review focuses on networks of non-covalently connected amino acid side chains in proteins. Questions in structural biology are addressed within the framework of such a formalism. While general applications are mentioned in this review, challenging problems that have demanded the attention of the scientific community for a long time, such as allostery and protein folding, are considered in greater detail. Our aim has been to explore these important problems through the eyes of networks. Various methods of constructing protein structure networks (PSNs) are consolidated. They include methods based on geometry, edges weighted by different schemes, and also bipartite networks of protein-nucleic acid complexes. A number of network metrics that elegantly capture the general features, as well as specific features related to phenomena such as allostery and protein model validation, are described. Additionally, the integration of network theory with ensembles of equilibrium structures of a single protein, or with large numbers of structures from the data bank, is presented as a way to perceive complex phenomena from a network perspective. Finally, we briefly discuss the capabilities, limitations, and scope for further exploration of protein structure networks.
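One of the geometry-based construction schemes surveyed in the review can be illustrated with a short sketch that links residues whose representative atoms lie within a distance cutoff (a minimal illustration with made-up coordinates; the 6.5 Å cutoff and the single-point-per-residue simplification are common conventions, not specifics of this review):

```python
import numpy as np
import networkx as nx

def build_psn(coords, cutoff=6.5):
    """Build a protein structure network: one node per residue, an edge
    between residues whose representative atoms (e.g. C-beta) are closer
    than `cutoff` angstroms.

    coords : (n_residues, 3) array of representative-atom coordinates
    """
    g = nx.Graph()
    n = len(coords)
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 2, n):          # skip covalently adjacent residues
            if np.linalg.norm(coords[i] - coords[j]) < cutoff:
                g.add_edge(i, j)
    return g

# Toy example: a few made-up residue coordinates (angstroms)
coords = np.array([[0, 0, 0], [3.8, 0, 0], [7.6, 0, 0], [3.8, 5.0, 0.5]])
psn = build_psn(coords)
print(psn.edges())                          # non-covalent contacts
print(nx.betweenness_centrality(psn))       # a typical network metric
```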
Abstract:
This work arises from an analysis of the Warnock Report and shows how the Report influenced different legislations, while also describing its impact and the new way in which, from then on, ordinary citizens came to see and understand certain issues. Each of the highly controversial topics raised in the Report will be analysed, among which the following can be listed: the beginning of human life, assisted fertilization treatments, anonymous donors, cryopreservation of embryos and surrogacy. The Warnock Report was taken as a reference for the drafting of various legal norms around the world. Hence its significance, which to this day marks a before and an after in the scientific community in general and in civil society in particular.