999 results for bioinformatics applications
Abstract:
Despite the variety of available Web service registries aimed specifically at the Life Sciences, their scope is usually restricted to a limited set of well-defined service types. While dedicated registries are generally tied to a particular format, general-purpose ones adhere more closely to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common types of Web services, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. Notably, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web service descriptions alongside the traditional WSDL-based ones. The registry provides a Web-based interface for Web service registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or via SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license.
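The SPARQL access mentioned above can be sketched as a plain HTTP GET request. A minimal sketch follows; note that the exact endpoint path (`/sparql`) and the WSDL 2.0 RDF vocabulary prefix are assumptions for illustration, not details taken from the abstract, and should be checked against the BioSWR documentation.

```python
from urllib.parse import urlencode

# Hypothetical SPARQL endpoint path on the BioSWR server; the actual
# path may differ -- verify against the registry documentation.
ENDPOINT = "http://inb.bsc.es/BioSWR/sparql"

# List registered services described in WSDL 2.0 RDF form. The wsdl:
# prefix follows the W3C "WSDL 2.0: RDF Mapping" vocabulary.
QUERY = """
PREFIX wsdl: <http://www.w3.org/ns/wsdl-rdf#>
SELECT ?service WHERE { ?service a wsdl:Service } LIMIT 10
"""

def sparql_get_url(endpoint: str, query: str) -> str:
    """Build a SPARQL-protocol GET request URL for the given query."""
    params = {"query": query, "format": "application/sparql-results+json"}
    return endpoint + "?" + urlencode(params)

url = sparql_get_url(ENDPOINT, QUERY)
# The URL can then be fetched with urllib.request.urlopen(url).
```

The same query string could equally be sent by POST, which the SPARQL 1.1 Protocol also allows for long queries.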
Abstract:
Very large molecular systems can be calculated with the so-called CNDOL approximate Hamiltonians, which have been developed by avoiding oversimplifications and using only a priori parameters and formulas from the simpler NDO methods. A new diagonal monoelectronic term named CNDOL/21 shows great consistency and easier SCF convergence when used together with an appropriate function for charge repulsion energies derived from traditional formulas. It is possible to obtain reliable a priori molecular orbitals and electron excitation properties after configuration interaction of singly excited determinants, maintaining interpretative possibilities even though it is a simplified Hamiltonian. Tests with some unequivocal gas-phase maxima of simple molecules (benzene, furfural, acetaldehyde, hexyl alcohol, methyl amine, 2,5-dimethyl-2,4-hexadiene, and ethyl sulfide) confirm the general quality of this approach in comparison with other methods. Calculations of large systems, such as porphine in the gas phase and a model of the complete retinal binding pocket in rhodopsin with 622 basis functions on 280 atoms at the quantum mechanical level, show reliability, giving a first allowed transition at 483 nm, very close to the known experimental value of 500 nm for the "dark state." In this important case, our model assigns a central role in this excitation to a charge transfer from the neighboring Glu(-) counterion to the retinaldehyde polyene chain. Tests with gas-phase maxima of some important molecules corroborate the reliability of the CNDOL/2 Hamiltonians.
Abstract:
Periodicity: Monthly
Abstract:
As the world's energy demand increases, a durable way to control it is to improve the energy efficiency of processes. It has been estimated that pumping applications have significant potential for energy savings through equipment or control system changes. For many pumping applications, using a variable speed drive (VSD) as the process control element is the most energy-efficient solution. The main target of this study is to examine the energy efficiency of the drive system that moves the pump. On a larger scale, the purpose is to examine how different manufacturers' variable speed drives function as control devices for a pumping process. The idea is to compare the drives from a typical pump user's point of view. What matters to the pump user is the efficiency gained in the process and the ease of use of the VSD, so some attention is also given to evaluating the user-friendliness of the VSDs. The VSDs are also compared on the basis of their life-cycle energy costs in different kinds of pumping cases. The comparison is made between the ACS800 from ABB, the VLT AQUA Drive from Danfoss, the NX drive from Vacon and the Micromaster 430 from Siemens. The efficiencies were measured in the power electronics laboratory at Lappeenranta University of Technology with a system consisting of a variable speed drive, an induction motor with a DC machine, two power analyzers and a torque transducer. The efficiencies were measured as a function of load at different frequencies. According to the measurement results, the differences between the measured system efficiencies in the actual working area of pumping are on average a few percentage points. When efficiencies are examined over the whole range of loads and frequencies, the differences grow larger: at low frequencies and loads, the difference between the most and least efficient systems is at most about ten percentage points. At most of the tested points, ABB's drive seems to have slightly better efficiency than the other drives.
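The life-cycle energy cost comparison described above reduces to a simple calculation: the electrical input power is the shaft demand divided by the combined drive-plus-motor efficiency, so a few percentage points of efficiency difference translate directly into the energy bill. A minimal sketch, with purely illustrative figures (the power, operating hours, price and efficiencies below are invented, not measured values from the thesis):

```python
def lifecycle_energy_cost(shaft_power_kw, system_efficiency,
                          hours_per_year, price_per_kwh, years):
    """Electric energy cost of a pumping system over its service life.

    Input electrical power = mechanical shaft demand / system efficiency,
    so a lower-efficiency drive draws more energy for the same pumping work.
    """
    input_power_kw = shaft_power_kw / system_efficiency
    return input_power_kw * hours_per_year * price_per_kwh * years

# Illustrative comparison: two drive systems differing by 3 percentage points.
cost_a = lifecycle_energy_cost(15.0, 0.85, 4000, 0.10, 10)
cost_b = lifecycle_energy_cost(15.0, 0.82, 4000, 0.10, 10)
savings = cost_b - cost_a  # what the more efficient system avoids paying
```

Over a ten-year life the hypothetical 3-point efficiency gap amounts to roughly 2600 currency units here, which is why the thesis weighs life-cycle energy cost rather than purchase price alone.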
Abstract:
Brown packaging linerboard, made entirely from recovered pulp, was subjected to deinking flotation to evaluate possible improvements in its chemical, optical and mechanical properties. The increasing rate of recovered paper utilisation, along with the tendency towards lower basis weights in packaging paper production, has created a growing need for secondary fibers of improved quality. To attain better-quality fibers, flotation deinking of brown grades is being considered, along with the addition of primary fibers to the recovered paper furnish. Numerous studies in which flotation technology was used to treat brown grades support this idea. Most of them show that fiber quality improves after flotation deinking, resulting in better mechanical properties of the deinked handsheets and lower amounts of chemical contaminants. With regard to food and human health safety, packaging paper has to meet specific requirements to be classified as suitable for direct contact with foods. Recycled paper and board may contain many potential contaminants which, especially in the case of direct food contact, may migrate from the packaging material into foodstuffs. In this work, the linerboard sample selected for deinking was made from recycled fibers not previously submitted to chemical deinking flotation; the original sample therefore contained many non-cellulosic components as well as printing ink residues. The studied linerboard sample was a type of packaging paper used in contact with food products that are usually peeled before use, e.g. fruits and vegetables. The decrease in the amount of chemical contaminants after deinking flotation was evaluated, along with the changes in the mechanical and optical properties of the deinked handsheets. Food contact analysis was performed on the original paper samples as well as on the filter pads and handsheets made before and after deinking flotation. It consisted of migration tests for brightening agents, colorants, PCPs, formaldehyde and metals. Microbiological tests were also performed to determine the possible transfer of antimicrobial constituents.
Abstract:
Recent advances in machine learning increasingly enable the automatic construction of various types of computer-assisted methods that have been difficult or laborious to program by human experts. The tasks for which such tools are needed arise in many areas, here especially in the fields of bioinformatics and natural language processing. Machine learning methods may not work satisfactorily if they are not appropriately tailored to the task in question; however, their learning performance can often be improved by exploiting deeper insight into the application domain or the learning problem at hand. This thesis considers developing kernel-based learning algorithms that incorporate this kind of prior knowledge of the task in an advantageous way. Moreover, computationally efficient algorithms for training the learning machines for specific tasks are presented. In the context of kernel-based learning methods, prior knowledge is often incorporated by designing appropriate kernel functions; another well-known way is to develop cost functions that fit the task under consideration. For disambiguation tasks in natural language, we develop kernel functions that take account of positional information and the mutual similarities of words. It is shown that the use of this information significantly improves the disambiguation performance of the learning machine. Further, we design a new cost function better suited to information retrieval and to more general ranking problems than the cost functions designed for regression and classification. We also consider other applications of the kernel-based learning algorithms, such as text categorization and pattern recognition in differential display. We develop computationally efficient algorithms for training the considered learning machines with the proposed kernel functions. We also design a fast cross-validation algorithm for regularized least-squares type learning algorithms. Further, an efficient version of the regularized least-squares algorithm that can be used with the new cost function for preference learning and ranking tasks is proposed. In summary, we demonstrate that incorporating prior knowledge is possible and beneficial, and that novel advanced kernels and cost functions can be used efficiently in learning algorithms.
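The "fast cross-validation for regularized least-squares" idea mentioned in the abstract is usually based on a well-known closed-form identity: for a linear smoother such as ridge regression, the leave-one-out residual is the training residual inflated by 1/(1 - H_ii), where H is the hat matrix, so no model ever needs to be retrained. A minimal sketch of that standard identity (not the thesis's specific algorithm) in the primal, linear-kernel setting:

```python
import numpy as np

def rls_loocv_residuals(X, y, lam):
    """Closed-form leave-one-out residuals of regularized least squares.

    Uses the identity e_i^{loo} = (y_i - yhat_i) / (1 - H_ii), where
    H = X (X^T X + lam I)^{-1} X^T is the hat (smoother) matrix.
    One factorization replaces n separate retrainings.
    """
    n, d = X.shape
    A = X.T @ X + lam * np.eye(d)
    H = X @ np.linalg.solve(A, X.T)        # hat matrix, shape (n, n)
    yhat = H @ y                           # in-sample predictions
    return (y - yhat) / (1.0 - np.diag(H))

# Synthetic data for a sanity check against the naive held-out refit.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=30)
fast = rls_loocv_residuals(X, y, lam=1.0)
```

The identity is exact for ridge regression, which is what makes selecting the regularization parameter by full leave-one-out cross-validation cheap.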
Abstract:
In this thesis, theoretical and technological aspects of fiber Bragg gratings (FBGs) are considered. The fabrication of uniform and chirped fiber Bragg gratings using the phase-mask technique has been exploited throughout this study. Different requirements of FBG inscription were considered and implemented experimentally to find an economical and effective procedure. Hydrogen loading was used to enhance the photosensitivity of the fiber; the minimum loading times for uniform and chirped fiber Bragg gratings were determined to be 3 days and 7 days, respectively, at T = 50°C and a hydrogen pressure of 140 bar. Post-inscription annealing was considered to avoid the excess losses induced by hydrogen, and the wavelength evolution during annealing was measured. The strain and temperature sensor application of FBGs was also considered: the wavelength shifts caused by tension and temperature were studied for both uniform and chirped fiber Bragg gratings.
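The sensing principle behind the strain and temperature measurements above is the Bragg condition and its first-order perturbation. A minimal sketch using typical literature coefficients for silica fiber (the photo-elastic, thermal-expansion and thermo-optic values below are textbook figures, not results from this thesis):

```python
def bragg_wavelength(n_eff, period_nm):
    """Bragg condition: lambda_B = 2 * n_eff * Lambda (grating period)."""
    return 2.0 * n_eff * period_nm

def wavelength_shift_nm(lam_b_nm, strain=0.0, delta_t=0.0,
                        p_e=0.22, alpha=0.55e-6, xi=6.7e-6):
    """First-order FBG wavelength shift from strain and temperature:

        d(lambda)/lambda = (1 - p_e) * strain + (alpha + xi) * dT

    p_e (effective photo-elastic coefficient), alpha (thermal expansion)
    and xi (thermo-optic coefficient) are typical values for silica fiber.
    """
    return lam_b_nm * ((1.0 - p_e) * strain + (alpha + xi) * delta_t)

lam_b = bragg_wavelength(1.447, 535.6)            # ~1550 nm grating
shift = wavelength_shift_nm(lam_b, delta_t=10.0)  # shift for +10 K
```

With these typical coefficients a 1550 nm grating shifts by roughly 11 pm per kelvin and about 1.2 pm per microstrain, which is the scale of the wavelength shifts tracked in such sensor experiments.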
Abstract:
In recent years, technological advances have allowed manufacturers to implement dual-energy computed tomography (DECT) on clinical scanners. With its unique ability to differentiate basis materials by their atomic number, DECT has opened new perspectives in imaging. DECT has been used successfully in musculoskeletal imaging, with applications ranging from the detection, characterization, and quantification of crystal and iron deposits to the simulation of noncalcium images (improving the visualization of bone marrow lesions) or noniodine images. Furthermore, the data acquired with DECT can be postprocessed to generate monoenergetic images at varying kiloelectron-volt levels, providing new methods for image contrast optimization as well as metal artifact reduction. The first part of this article reviews the basic principles and technical aspects of DECT, including radiation dose considerations. The second part focuses on applications of DECT in musculoskeletal imaging, including gout and other crystal-induced arthropathies, virtual noncalcium images for the study of bone marrow lesions, the study of collagenous structures, applications in computed tomography arthrography, and the detection of hemosiderin and metal particles.
Abstract:
Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular the frequentist and the Bayesian, have promoted radically different solutions for deciding on the plausibility of competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel large debates in the literature. More recently, a controversial discussion was initiated by the editorial decision of a scientific journal [1] to refuse any paper submitted for publication containing null hypothesis testing procedures. Since the large majority of papers published in forensic journals propose the evaluation of statistical evidence based on so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.
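To make the p-value concept at the center of this debate concrete, here is a minimal sketch of a two-sided permutation test (the data values are invented for illustration). The p-value is the probability, under the null hypothesis of exchangeable groups, of a mean difference at least as extreme as the observed one; it is not the probability that the null hypothesis is true, which is precisely the misreading the debate turns on.

```python
import random
import statistics

def permutation_p_value(a, b, n_perm=10000, seed=0):
    """Two-sided permutation-test p-value for a difference in means.

    Counts how often a random relabeling of the pooled data produces a
    mean difference at least as extreme as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = list(a) + list(b)
    n_a = len(a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n_a])
                   - statistics.mean(pooled[n_a:]))
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction avoids p = 0

p = permutation_p_value([2.1, 2.4, 2.3, 2.6], [1.2, 1.4, 1.1, 1.5])
```

For these clearly separated toy groups the test returns a small p-value (around 0.03, since only the most extreme relabelings reproduce the observed gap); a frequentist would reject the null at the 5% level, while the Bayesian critique is that this number alone says nothing about the posterior plausibility of either hypothesis.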
Abstract:
The SIB Swiss Institute of Bioinformatics (www.isb-sib.ch) provides world-class bioinformatics databases, software tools, services and training to the international life science community in academia and industry. These solutions allow life scientists to turn the exponentially growing amount of data into knowledge. Here, we provide an overview of SIB's resources and competence areas, with a strong focus on curated databases and SIB's most popular and widely used resources. In particular, SIB's Bioinformatics resource portal ExPASy features over 150 resources, including UniProtKB/Swiss-Prot, ENZYME, PROSITE, neXtProt, STRING, UniCarbKB, SugarBindDB, SwissRegulon, EPD, arrayMap, Bgee, SWISS-MODEL Repository, OMA, OrthoDB and other databases, which are briefly described in this article.
Abstract:
The multidimensional process of physical, psychological, and social change produced by population ageing affects not only the quality of life of elderly people but also that of our societies. Some dimensions of ageing grow and expand over time (e.g. knowledge of world events, or experience in particular situations), while others decline (e.g. reaction time, physical and psychological strength, or other functional abilities, with reduced speed and increased tiredness). Information and Communication Technologies (ICTs) can help the elderly overcome possible limitations due to ageing. As a particular case, biometrics can enable new algorithms for the early detection of cognitive impairments by processing continuous speech, handwriting or other challenged abilities. Among all possibilities, digital applications (Apps) for mobile phones or tablets can allow the dissemination of such tools. In this article, after presenting and discussing the process of population ageing and its social implications, we explore how ICTs, through different Apps, can lead to new solutions for facing this major demographic challenge.
Abstract:
Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improving human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a spectrum of scientific disciplines. The corpus of documentation of these resources is fragmented across the Web, with much redundancy, and has lacked a common standard of information. The outcome is that scientists must often struggle to find, understand, compare and use the best resources for the task at hand. Here we present a community-driven curation effort, supported by ELIXIR, the European infrastructure for biological information, that aspires to a comprehensive and consistent registry of information about bioinformatics resources. The sustainable upkeep of this Tools and Data Services Registry is assured by a curation effort driven by and tailored to local needs, and shared amongst a network of engaged partners. As of November 2015, the registry includes 1785 resources, with depositions from 126 individual registrations, including 52 institutional providers and 74 individuals. With community support, the registry can become a standard for the dissemination of information about bioinformatics resources: we welcome everyone to join us in this common endeavour. The registry is freely available at https://bio.tools.
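The registry at https://bio.tools is also queryable programmatically. A minimal sketch of building and issuing a search request; the `/api/tool` endpoint and its `q`/`page`/`format` parameters are assumptions based on the registry's public REST interface and should be verified against the current bio.tools API documentation.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def search_url(query, page=1):
    """Build a query URL for the bio.tools registry search endpoint."""
    params = {"q": query, "page": page, "format": "json"}
    return "https://bio.tools/api/tool/?" + urlencode(params)

def search(query, page=1):
    """Fetch one page of results; return (total hit count, tool names).

    The 'count' and 'list' keys reflect the assumed JSON response shape.
    """
    with urlopen(search_url(query, page)) as resp:
        data = json.load(resp)
    return data.get("count", 0), [t.get("name") for t in data.get("list", [])]

# Example (requires network access):
# count, names = search("multiple sequence alignment")
```

Paging through `page` lets a client harvest the full set of registered resources for local indexing or comparison.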