976 results for Fast methods
Abstract:
The prediction of binding modes (BMs) occurring between a small molecule and a target protein of biological interest has become of great importance for drug development. The overwhelming diversity of needs leaves room for docking approaches addressing specific problems. Nowadays, the universe of docking software ranges from fast and user-friendly programs to algorithmically flexible and accurate approaches. EADock2 is an example of the latter. Its multiobjective scoring function was designed around the CHARMM22 force field and the FACTS solvation model. However, the major drawback of such a software design lies in its computational cost. EADock dihedral space sampling (DSS) is built on the most efficient features of EADock2, namely its hybrid sampling engine and multiobjective scoring function. Its performance is equivalent to that of EADock2 for drug-like ligands, while the CPU time required has been reduced by several orders of magnitude. This huge improvement was achieved through a combination of several innovative features, including an automatic bias of the sampling toward putative binding sites and a very efficient tree-based DSS algorithm. When the top-scoring prediction is considered, 57% of BMs of a test set of 251 complexes were reproduced within 2 Å RMSD of the crystal structure. Up to 70% were reproduced when considering the five top-scoring predictions. The success rate is lower in cross-docking assays but remains comparable to that of the latest version of AutoDock that accounts for protein flexibility. © 2011 Wiley Periodicals, Inc. J Comput Chem, 2011.
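The 2 Å criterion above is the standard success metric in docking evaluation: a predicted pose counts as correct if its RMSD to the crystal structure is at most 2 Å. A minimal sketch of that bookkeeping (function names hypothetical, not from EADock):

```python
import numpy as np

def rmsd(coords_pred, coords_ref):
    """Root-mean-square deviation between two matched sets of atomic coordinates (N x 3)."""
    pred = np.asarray(coords_pred, dtype=float)
    ref = np.asarray(coords_ref, dtype=float)
    return float(np.sqrt(np.mean(np.sum((pred - ref) ** 2, axis=1))))

def success_rate(rmsds, cutoff=2.0):
    """Fraction of predicted binding modes within `cutoff` angstroms of the crystal pose."""
    rmsds = np.asarray(rmsds, dtype=float)
    return float(np.mean(rmsds <= cutoff))
```

Real evaluations must also handle symmetry-equivalent atoms and pose superposition; this only illustrates the cutoff arithmetic.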
Abstract:
BACKGROUND: Solexa/Illumina short-read ultra-high-throughput DNA sequencing technology produces millions of short tags (up to 36 bases) by parallel sequencing-by-synthesis of DNA colonies. The processing and statistical analysis of such high-throughput data pose new challenges; currently, a fair proportion of the tags are routinely discarded due to an inability to match them to a reference sequence, thereby reducing the effective throughput of the technology. RESULTS: We propose a novel base-calling algorithm using model-based clustering and probability theory to identify ambiguous bases and code them with IUPAC symbols. We also select optimal sub-tags using a score based on information content to remove uncertain bases towards the ends of the reads. CONCLUSION: We show that the method improves genome coverage and the number of usable tags by an average of 15% compared with Solexa's data processing pipeline. An R package is provided which allows fast and accurate base calling from Solexa's fluorescence intensity files and the production of informative diagnostic plots.
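The sub-tag idea can be illustrated with a toy version: the information content of one position is 2 bits minus the Shannon entropy of its base-call probabilities, and low-information bases are trimmed from the read ends. This is a hedged sketch of the concept, not the paper's algorithm (the threshold and function names are hypothetical):

```python
import math

def base_information(probs):
    """Information content (bits) of one position: 2 minus the Shannon
    entropy of the four base-call probabilities. 2.0 = certain, 0.0 = uniform."""
    entropy = -sum(p * math.log2(p) for p in probs if p > 0)
    return 2.0 - entropy

def trim_ends(read_info, threshold=1.0):
    """Return (start, end) of the sub-tag obtained by trimming positions with
    information below `threshold` from both ends of the read."""
    start, end = 0, len(read_info)
    while start < end and read_info[start] < threshold:
        start += 1
    while end > start and read_info[end - 1] < threshold:
        end -= 1
    return start, end
```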
Abstract:
The demand for research into the safety, health, and environmental management of nanotechnologies has been recognized for a decade and identified by several landmark reports and studies. It is not the intention of this compendium to report on these, as they are widely available. Nor is it the intention to publish scientific papers and research results, as this task is covered by scientific conferences and the peer-reviewed press. The intention of the compendium is to bring together researchers, create synergy in their work, and establish links and communication between them, mainly during the actual research phase before publication of results. To this end, we find it useful to give emphasis to communicating projects' strategic aims, extensive coverage of specific work objectives and of the methods used in research, strengthening human capacities and laboratory infrastructure, and supporting collaboration for common goals and the joint elaboration of future plans, without compromising scientific publication potential or IP rights. These targets are far from being achieved with the publication in its present shape. We shall continue working, though, and hope, with the assistance of the research community, to make significant progress. We would like to stress that this sector is under development and progressing very fast, which might make some of the statements outdated or even obsolete. Nevertheless, the compendium is intended to provide a basis for the necessary future developments. [Ed.]
Abstract:
A fast method for the identification of recombinant vaccinia viruses directly from individual plaques is described. Plaques are picked, resuspended in PBS-A and processed for PCR using two 'universal' primers. The amplified sequences are analyzed by agarose gel electrophoresis. This procedure allows discrimination between spontaneously arising TK-negative mutants, which do not carry the inserted gene, and the desired TK-negative recombinants resulting from insertional inactivation of the TK gene.
Abstract:
Nowadays, the joint exploitation of images acquired daily by remote sensing instruments and of images available from archives allows a detailed monitoring of the transitions occurring at the surface of the Earth. These modifications of the land cover generate spectral discrepancies that can be detected via the analysis of remote sensing images. Independently of the origin of the images and of the type of surface change, correct processing of such data implies the adoption of flexible, robust, and possibly nonlinear methods to correctly account for the complex statistical relationships characterizing the pixels of the images. This thesis deals with the development and application of advanced statistical methods for multi-temporal optical remote sensing image processing tasks. Three different families of machine learning models have been explored, and fundamental solutions for change detection problems are provided. In the first part, change detection with user supervision has been considered. In a first application, a nonlinear classifier has been applied with the intent of precisely delineating flooded regions from a pair of images. In a second case study, the spatial context of each pixel has been injected into another nonlinear classifier to obtain a precise mapping of new urban structures. In both cases, the user provides the classifier with examples of what they believe has or has not changed. In the second part, a completely automatic and unsupervised method for precise binary detection of changes has been proposed. The technique allows a very accurate mapping without any user intervention, which is particularly useful when the readiness and reaction time of the system are crucial constraints. In the third part, the problem of statistical distributions shifting between acquisitions is studied. Two approaches to transform the pair of bi-temporal images and reduce their differences unrelated to changes in land cover are studied.
The methods align the distributions of the images, so that the pixel-wise comparison can be carried out with higher accuracy. Furthermore, the second method can deal with images from different sensors, regardless of the dimensionality of the data or the spectral information content. This opens the door to possible solutions for a crucial problem in the field: detecting changes when the images have been acquired by two different sensors.
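A classical baseline for aligning the distributions of two acquisitions is histogram matching (quantile mapping). The thesis's methods are more elaborate, but the sketch below shows the basic idea, assuming equal-length flattened images with more than one pixel:

```python
import numpy as np

def histogram_match(source, reference):
    """Map source pixel values so their empirical distribution matches that
    of the reference image (simple quantile mapping)."""
    src = np.asarray(source, dtype=float).ravel()
    ref = np.asarray(reference, dtype=float).ravel()
    src_ranks = np.argsort(np.argsort(src))      # rank of each source pixel
    ref_sorted = np.sort(ref)
    # A pixel at rank r takes the reference value at the same quantile.
    quantiles = src_ranks / (len(src) - 1)
    matched = ref_sorted[np.round(quantiles * (len(ref) - 1)).astype(int)]
    return matched.reshape(np.shape(source))
```

Note that the mapping preserves the ordering of the source pixels while borrowing the reference's value distribution, which is exactly the "reduce differences unrelated to land-cover change" goal.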
Abstract:
Four methods were tested to assess the fire-blight disease response on grafted pear plants. The leaves of the plants were inoculated with Erwinia amylovora suspensions by pricking with clamps, cutting with scissors, local infiltration, and painting a bacterial suspension onto the leaves with a paintbrush. The effects of the inoculation methods were studied in dose-time-response experiments carried out in climate chambers under quarantine conditions. A modified Gompertz model was used to analyze the disease-time relationships and provided information on the rate of infection progression (rg) and the time delay to the start of symptoms (t0). The disease-pathogen-dose relationships were analyzed according to a hyperbolic saturation model in which the median effective dose (ED50) of the pathogen and the maximum disease level (ymax) were determined. Localized infiltration into the leaf mesophyll resulted in the early (short t0) but slow (low rg) development of infection, whereas in leaves pricked with clamps disease symptoms developed late (long t0) but rapidly (high rg). Paintbrush inoculation of the plants resulted in an incubation period of medium length, a moderate rate of infection progression, and low ymax values. In leaves inoculated with scissors, fire-blight symptoms developed early (short t0) and rapidly (high rg), with the lowest ED50 and the highest ymax.
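The two models can be written down compactly. Below is one common parameterization of a Gompertz disease-progress curve and the hyperbolic saturation dose-response model; the paper's modified Gompertz may use a different parameterization, so treat this as illustrative only:

```python
import math

def gompertz(t, y_max, rg, t0):
    """Gompertz disease-progress curve (one common parameterization):
    rg sets the rate of infection progression, t0 the delay before symptoms."""
    return y_max * math.exp(-math.exp(-rg * (t - t0)))

def dose_response(dose, y_max, ed50):
    """Hyperbolic saturation model: disease level as a function of pathogen dose."""
    return y_max * dose / (ed50 + dose)
```

At dose = ED50 the hyperbolic model returns exactly ymax/2, which is the defining property of the median effective dose.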
Abstract:
A short overview is given of the most important analytical body composition methods. The principles, advantages, and limitations of the methods are discussed, also in relation to other fields of research such as energy metabolism. Attention is given to some new developments in body composition research, such as chemical multiple-compartment models, computerized tomography or nuclear magnetic resonance imaging (tissue level), and multifrequency bioelectrical impedance. Possible future directions of body composition research in the light of these new developments are discussed.
Abstract:
A hallmark of schizophrenia pathophysiology is the dysfunction of cortical inhibitory GABA neurons expressing parvalbumin, which are essential for coordinating neuronal synchrony during various sensory and cognitive tasks. The high metabolic requirements of these fast-spiking cells may render them susceptible to redox dysregulation and oxidative stress. Using mice carrying a genetic redox imbalance, we demonstrate that extracellular perineuronal nets, which constitute a specialized polyanionic matrix enwrapping most of these interneurons as they mature, play a critical role in the protection against oxidative stress. These nets limit the effect of genetically impaired antioxidant systems and/or excessive reactive oxygen species produced by severe environmental insults. We observe an inverse relationship between the robustness of the perineuronal nets around parvalbumin cells and the degree of intracellular oxidative stress they display. Enzymatic degradation of the perineuronal nets renders mature parvalbumin cells and fast rhythmic neuronal synchrony more susceptible to oxidative stress. In parallel, parvalbumin cells enwrapped with mature perineuronal nets are better protected than immature parvalbumin cells surrounded by less-condensed perineuronal nets. Although the perineuronal nets act as a protective shield, they are also themselves sensitive to excess oxidative stress. The protection might therefore reflect a balance between the oxidative burden on perineuronal net degradation and the capacity of the system to maintain the nets. Abnormal perineuronal nets, as observed in the postmortem patient brain, may thus underlie the vulnerability and functional impairment of pivotal inhibitory circuits in schizophrenia.
Abstract:
Two common methods of accounting for electric-field-induced perturbations to molecular vibration are analyzed and compared. The first method is based on a perturbation-theoretic treatment and the second on a finite-field treatment. The relationship between the two, which is not immediately apparent, is established by developing an algebraic formalism for the latter. Some of the higher-order terms in this development are documented here for the first time. As well as considering vibrational dipole polarizabilities and hyperpolarizabilities, we also make mention of the vibrational Stark effect.
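The finite-field treatment can be illustrated numerically: expanding E(F) = E0 - mu*F - (1/2)*alpha*F^2 + ..., the dipole moment and polarizability follow from central finite differences of the field-dependent energy. A self-contained sketch (the step size and function names are illustrative, not from the paper):

```python
def finite_field_polarizability(energy, field_step=1e-3):
    """Estimate the dipole moment and polarizability along one axis by central
    finite differences of the field-dependent energy E(F):
        mu    = -dE/dF    ~ -(E(+h) - E(-h)) / (2h)
        alpha = -d2E/dF2  ~ -(E(+h) - 2E(0) + E(-h)) / h**2
    `energy` is a callable returning E at a given field strength."""
    h = field_step
    e_plus, e_zero, e_minus = energy(h), energy(0.0), energy(-h)
    mu = -(e_plus - e_minus) / (2 * h)
    alpha = -(e_plus - 2 * e_zero + e_minus) / h ** 2
    return mu, alpha
```

For an exactly quadratic E(F) the central differences recover mu and alpha to within floating-point roundoff; in practice the step size must balance truncation against numerical noise in the energies.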
Abstract:
BACKGROUND: A number of medical journals have developed policies for accelerated publication of articles judged by the authors, the editors or the peer reviewers to be of special importance. However, the validity of these judgements is unknown. We therefore compared the importance of articles published on a "fast track" with those published in the usual way. METHODS: We identified 12 "case" articles--6 articles from the New England Journal of Medicine that were prereleased on the journal's Web site before publication in print and 6 "fast-tracked" articles from The Lancet. We then identified 12 "control" articles matched to the case articles according to journal, disease or procedure of focus, theme area and year of publication. Forty-two general internists rated the articles, using 10-point scales, on dimensions addressing the articles' importance, ease of applicability and impact on health outcomes. RESULTS: For each dimension, the mean score for the case articles was significantly higher than the mean score for the control articles: importance to clinical practice 7.6 v. 7.1 respectively (p = 0.001), importance from a public health perspective 6.5 v. 6.0 (p < 0.001), contribution to advancement of medical knowledge 6.2 v. 5.8 (p < 0.001), ease of applicability in practice 7.0 v. 6.5 (p < 0.001), potential impact on health outcomes 6.5 v. 5.9 (p < 0.001). Despite these general findings, in 5 of the 12 matched pairs of articles the control article had a higher mean score than the case article across all the dimensions. INTERPRETATION: The accelerated publication practices of 2 leading medical journals targeted articles that, on average, had slightly higher importance scores than similar articles published in the usual way. However, our finding of higher importance scores for control articles in 5 of the 12 matched pairs shows that current journal practices for selecting articles for expedited publication are inconsistent.
Abstract:
A procedure based on quantum molecular similarity measures (QMSM) has been used to compare electron densities obtained from conventional ab initio and density functional methodologies at their respective optimized geometries. This method has been applied to a series of small molecules which have experimentally known properties and molecular bonds of diverse degrees of ionicity and covalency. Results show that in most cases the electron densities obtained from density functional methodologies are of similar quality to post-Hartree-Fock generalized densities. For molecules where the Hartree-Fock methodology yields erroneous results, the density functional methodology is shown to usually yield more accurate densities than those provided by second-order Møller-Plesset perturbation theory.
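A widely used quantum molecular similarity measure is the Carbó index, the normalized overlap of two electron densities; on a common grid it reduces to a cosine similarity between sampled density vectors. A sketch under that discretization (not necessarily the exact measure used in the paper):

```python
import numpy as np

def carbo_index(rho_a, rho_b):
    """Carbó similarity index between two electron densities sampled on the
    same grid: <rho_a, rho_b> / sqrt(<rho_a, rho_a> <rho_b, rho_b>).
    Equals 1.0 for proportional densities, 0.0 for non-overlapping ones."""
    a = np.asarray(rho_a, dtype=float).ravel()
    b = np.asarray(rho_b, dtype=float).ravel()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))
```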
Abstract:
In the present paper we discuss and compare two different energy decomposition schemes: Mayer's Hartree-Fock energy decomposition into diatomic and monoatomic contributions [Chem. Phys. Lett. 382, 265 (2003)], and the Ziegler-Rauk dissociation energy decomposition [Inorg. Chem. 18, 1558 (1979)]. The Ziegler-Rauk scheme is based on a separation of a molecule into fragments, while Mayer's scheme can be used in cases where a fragmentation of the system into clearly separable parts is not possible. In the Mayer scheme, the density of a free atom is deformed to give the one-atom Mulliken density, which subsequently interacts to give rise to the diatomic interaction energy. We give a detailed analysis of the diatomic energy contributions in the Mayer scheme and take a close look at the one-atom Mulliken densities. The Mulliken density ρA has a single large maximum around the nuclear position of atom A, but exhibits slightly negative values in the vicinity of neighboring atoms. The main connecting point between the two analysis schemes is the electrostatic energy. Both decomposition schemes utilize the same electrostatic energy expression but differ in how the fragment densities are defined. In the Mayer scheme, the electrostatic component originates from the interaction of the Mulliken densities, while in the Ziegler-Rauk scheme, the undisturbed fragment densities interact. The values of the electrostatic energy resulting from the two schemes differ significantly but typically have the same order of magnitude. Both methods are useful and complementary, since Mayer's decomposition focuses on the energy of the finally formed molecule, whereas the Ziegler-Rauk scheme describes the bond formation starting from undeformed fragment densities.
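The shared electrostatic expression is the Coulomb interaction between the two fragment charge distributions; the schemes differ only in which densities they plug in. With point charges standing in for the continuous densities, the expression can be sketched as follows (atomic units; the helper and its signature are hypothetical):

```python
def electrostatic_interaction(frag_a, frag_b):
    """Coulomb interaction energy (atomic units) between two fragments, each
    given as a list of (charge, (x, y, z)) point charges -- a point-charge
    stand-in for the continuous fragment densities used by either scheme."""
    energy = 0.0
    for qa, (xa, ya, za) in frag_a:
        for qb, (xb, yb, zb) in frag_b:
            r = ((xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2) ** 0.5
            energy += qa * qb / r
    return energy
```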
Abstract:
CD4 expression in HIV replication is paradoxical: HIV entry requires high cell-surface CD4 densities, but replication requires CD4 down-modulation. However, is CD4 density in HIV+ patients affected over time? Do changes in CD4 density correlate with disease progression? Here, we examined the role of CD4 density for HIV disease progression by longitudinally quantifying CD4 densities on CD4+ T cells and monocytes of ART-naive HIV+ patients with different disease progression rates. This was a retrospective study. We defined three groups of HIV+ patients by their rate of CD4+ T cell loss, calculated by the time between infection and reaching a CD4 level of 200 cells/µl: fast (<7.5 years), intermediate (7.5-12 years), and slow progressors (>12 years). Mathematical modeling permitted us to determine the maximum CD4+ T cell count after HIV seroconversion (defined as "postseroconversion CD4 count") and longitudinal profiles of CD4 count and density. CD4 densities were quantified on CD4+ T cells and monocytes from these patients and from healthy individuals by flow cytometry. Fast progressors had significantly lower postseroconversion CD4 counts than other progressors. CD4 density on T cells was lower in HIV+ patients than in healthy individuals and decreased more rapidly in fast than in slow progressors. Antiretroviral therapy (ART) did not normalize CD4 density. Thus, postseroconversion CD4 counts define individual HIV disease progression rates that may help to identify patients who might benefit most from early ART. Early discrimination of slow and fast progressors suggests that critical events during primary infection define long-term outcome. A more rapid CD4 density decrease in fast progressors might contribute to progressive functional impairments of the immune response in advanced HIV infection. The lack of an effect of ART on CD4 density implies a persistent dysfunctional immune response by uncontrolled HIV infection.
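The progression groups are defined purely by the time from infection to a CD4 count of 200 cells/µl, so the classification itself is a one-liner (cutoffs taken from the study; the function name is hypothetical):

```python
def progression_group(years_to_200):
    """Classify an ART-naive HIV+ patient by the time (in years) from infection
    to a CD4 count of 200 cells/µl: <7.5 fast, 7.5-12 intermediate, >12 slow."""
    if years_to_200 < 7.5:
        return "fast"
    if years_to_200 <= 12:
        return "intermediate"
    return "slow"
```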