45 results for Contour
Abstract:
Negatively charged DNA can be compacted by positively charged dendrimers, and the degree of compaction is a delicate balance between the strength of the electrostatic interaction and the elasticity of DNA. We report various elastic properties of short double-stranded DNA (dsDNA) and the effect of dendrimer binding using fully atomistic molecular dynamics and numerical simulations. In equilibrium at room temperature, the contour length distribution P(L) and the end-to-end distance distribution P(R) are nearly Gaussian; the former gives an estimate of the stretch modulus γ1 of dsDNA in quantitative agreement with the literature value. The bend angle distribution P(θ) of the dsDNA also has a Gaussian form and allows us to extract a persistence length L_p of 43 nm. When the dsDNA is compacted by a positively charged dendrimer, the stretch modulus stays invariant, but the effective bending rigidity estimated from the end-to-end distance distribution decreases dramatically due to neutralization of the dsDNA backbone charge by the dendrimer. We support our observations with numerical solutions of the worm-like-chain (WLC) model as well as with non-equilibrium dsDNA stretching simulations. These results are helpful in understanding dsDNA elasticity at short length scales as well as how the elasticity is modulated when dsDNA binds to a charged object such as a dendrimer or a protein.
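For illustration only (not the paper's code): with a Gaussian P(L) and small-angle bend statistics, the stretch modulus and persistence length follow from standard fluctuation relations, γ1 = kBT·L0/Var(L) and ⟨θ²⟩ ≈ 2l/L_p for a segment of length l. A minimal numpy sketch, assuming contour lengths and bend angles have already been extracted from the trajectory:

```python
import numpy as np

KBT = 4.114  # thermal energy at ~298 K, in pN*nm

def stretch_modulus(contour_lengths):
    """Stretch modulus from Gaussian contour-length fluctuations.

    For a harmonic stretching energy E = (gamma / 2 L0) (L - L0)^2,
    equipartition gives Var(L) = kBT * L0 / gamma.
    """
    L0 = contour_lengths.mean()
    return KBT * L0 / contour_lengths.var()

def persistence_length(bend_angles, segment_length):
    """Persistence length from small bend-angle fluctuations.

    In the small-angle worm-like-chain limit (3D),
    <theta^2> ~ 2 * l / L_p for a segment of length l.
    """
    return 2.0 * segment_length / np.mean(bend_angles**2)
```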
Abstract:
The contour tree is a topological abstraction of a scalar field that captures the evolution of level set connectivity. It is an effective representation for visual exploration and analysis of scientific data. We describe a work-efficient, output-sensitive, and scalable parallel algorithm for computing the contour tree of a scalar field defined on a domain that is represented using either an unstructured mesh or a structured grid. A hybrid implementation of the algorithm using the GPU and a multi-core CPU can compute the contour tree of an input containing 16 million vertices in less than ten seconds, with a speedup factor of up to 13. Experiments based on an implementation in a multi-core CPU environment show near-linear speedup for large data sets.
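The abstract does not spell out the algorithm, but contour tree computation conventionally starts from join and split trees built by a sorted sweep with union-find (Carr et al.); the parallel algorithm referred to above is considerably more involved. A minimal sequential sketch of just the join-tree stage, with a hypothetical `values`/`adjacency` input format:

```python
def join_tree_arcs(values, adjacency):
    """Minimal sequential join-tree construction (sweep + union-find).

    values[i]    : scalar value at vertex i
    adjacency[i] : iterable of neighbours of vertex i
    Returns join-tree arcs as (upper_vertex, lower_vertex) pairs.
    """
    n = len(values)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    lowest = list(range(n))   # lowest processed vertex per component
    processed = [False] * n
    arcs = []
    # sweep vertices from the highest value downwards
    for v in sorted(range(n), key=lambda i: values[i], reverse=True):
        processed[v] = True
        for u in adjacency[v]:
            if processed[u]:
                ru, rv = find(u), find(v)
                if ru != rv:
                    arcs.append((lowest[ru], v))  # components join at v
                    parent[ru] = rv
                    lowest[rv] = v
    return arcs
```

The split tree is obtained by running the same sweep on the negated field, and the contour tree is then assembled by merging the two.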
Abstract:
The moments of the hadronic spectral functions are of interest for the extraction of the strong coupling α_s and other QCD parameters from the hadronic decays of the tau lepton. Motivated by the recent analyses of a large class of moments in the standard fixed-order and contour-improved perturbation theories, we consider the perturbative behavior of these moments in the framework of a QCD nonpower perturbation theory, defined by the technique of series acceleration by conformal mappings, which simultaneously implements renormalization-group summation and has a tame large-order behavior. Two recently proposed models of the Adler function are employed to generate the higher-order coefficients of the perturbation series and to predict the exact values of the moments, required for testing the properties of the perturbative expansions. We show that the contour-improved nonpower perturbation theories and the renormalization-group-summed nonpower perturbation theories have very good convergence properties for a large class of moments of the so-called "reference model," including moments that are poorly described by the standard expansions. The results provide additional support for the plausibility of the description of the Adler function in terms of a small number of dominant renormalons.
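For orientation, the moments analyzed in such studies are weighted contour integrals of the Adler function; a schematic, generic form (notation ours, not taken from the paper) is:

```latex
% Schematic moment of the hadronic spectral function, written as a
% contour integral of the (reduced) Adler function \widehat{D}(s) with
% weight W(s); fixed-order and contour-improved expansions differ in
% how \alpha_s(-s) is expanded along the circle |s| = m_\tau^2.
M_W \;=\; \frac{1}{2\pi i}\oint_{|s|=m_\tau^2}\frac{ds}{s}\,W(s)\,\widehat{D}(s)
```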
Abstract:
Double helical structures of DNA and RNA are mostly determined by base pair stacking interactions, which give them their base sequence-directed features, such as small roll values for the purine-pyrimidine steps. Earlier attempts to characterize stacking interactions were mostly restricted to calculations on fiber diffraction geometries or on ab initio-optimized structures, lacking the variation in geometry needed to comment on the rather unusual large roll values observed at the AU/AU base pair step in crystal structures of RNA double helices. We have generated a stacking energy hyperspace by modeling geometries with variations along the important degrees of freedom, roll and slide, which were chosen via statistical analysis as maximally sequence dependent. Corresponding energy contours were constructed by several quantum chemical methods including dispersion corrections. This analysis established the most suitable methods for stacked base pair systems, despite the limitation that the number of atoms in a base pair step precludes very high levels of theory. All the methods predict a negative roll value and near-zero slide to be most favorable for the purine-pyrimidine steps, in agreement with Calladine's steric clash based rule. Successive base pairs in RNA are always linked by a sugar-phosphate backbone with C3′-endo sugars, and this demands a C1′-C1′ distance of about 5.4 Å along the chains. Adding an energy penalty term for the deviation of the C1′-C1′ distance from this mean value to the recent DFT-D functionals, specifically ωB97X-D, appears to predict a reliable energy contour for the AU/AU step. Such a distance-based penalty improves the energy contours for the other purine-pyrimidine sequences also. (c) 2013 Wiley Periodicals, Inc. Biopolymers 101: 107-120, 2014.
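The distance-based penalty described above can be written in an illustrative form (ours, not the paper's; the force constant k is a hypothetical fitting parameter):

```latex
% Stacking energy with a backbone-strain penalty added to the DFT-D
% result; d is the C1'-C1' distance along each strand and 5.4 Å its
% mean value imposed by the C3'-endo sugar-phosphate backbone.
E_\mathrm{total}(\mathrm{roll},\mathrm{slide})
  \;=\; E^{\mathrm{DFT\text{-}D}}_\mathrm{stack}
  \;+\; k\,\bigl(d_{\mathrm{C1'\!-\!C1'}} - 5.4\,\text{\AA}\bigr)^2
```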
Abstract:
FT-IR (4000-400 cm⁻¹) and FT-Raman (4000-200 cm⁻¹) spectral measurements on solid 2,6-dichlorobenzonitrile (2,6-DCBN) have been carried out. The molecular geometry, harmonic vibrational frequencies, and bonding features in the ground state have been calculated by density functional theory at the B3LYP/6-311++G(d,p) level. A comparison between the calculated and the experimental results covering the molecular structure has been made. The assignments of the fundamental vibrational modes have been made on the basis of the potential energy distribution (PED). To investigate the influence of intermolecular hydrogen bonding on the geometry, the charge distribution, and the vibrational spectrum of 2,6-DCBN, calculations have been performed for the monomer as well as the tetramer. The intermolecular interaction energies, corrected for basis set superposition error (BSSE), have been calculated using the counterpoise method. Based on these results, the correlations between the vibrational modes and the structure of the tetramer have been discussed. The molecular electrostatic potential (MEP) contour map has been plotted in order to predict how different geometries could interact. Natural bond orbital (NBO) analysis has been carried out for the chemical interpretation of hyperconjugative interactions and electron density transfer from occupied (bonding or lone pair) orbitals to unoccupied (antibonding or Rydberg) orbitals. The UV spectrum was measured in methanol solution. The energies and oscillator strengths were calculated by time-dependent density functional theory (TD-DFT) and matched to the experimental findings. The TD-DFT method has also been used to study the hydrogen bonding dynamics theoretically, by monitoring the spectral shifts of some characteristic vibrational modes involved in the formation of hydrogen bonds in the ground and the first excited state. The ¹³C nuclear magnetic resonance (NMR) chemical shifts of the molecule were calculated by the gauge-independent atomic orbital (GIAO) method and compared with experimental results. Standard thermodynamic functions have been obtained, and changes in thermodynamic properties on going from the monomer to the tetramer have been presented. (C) 2013 Elsevier B.V. All rights reserved.
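For reference, the counterpoise correction mentioned above has the standard two-body form below (generic notation; the tetramer calculation would use a pairwise or many-body generalization of it):

```latex
% Counterpoise-corrected interaction energy of a dimer AB: each
% monomer is evaluated in the full dimer basis so that the basis set
% superposition error cancels.
\Delta E^{\mathrm{CP}}_{\mathrm{int}}
  \;=\; E_{AB}^{AB\,\mathrm{basis}}
  \;-\; E_{A}^{AB\,\mathrm{basis}}
  \;-\; E_{B}^{AB\,\mathrm{basis}}
```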
Abstract:
The average time τ_r for one end of a long, self-avoiding polymer to interact for the first time with a flat penetrable surface to which it is attached at the other end is shown here to scale essentially as the square of the chain's contour length N. This result is obtained within the framework of the Wilemski-Fixman approximation to diffusion-limited reactions, in which the reaction time is expressed as a time correlation function of a "sink" term. In the present work, this sink-sink correlation function is calculated using perturbation expansions in the excluded volume and the polymer-surface interactions, with renormalization group methods being used to resum the expansion into a power law form. The quadratic dependence of τ_r on N mirrors the behavior of the average time τ_c of a free random walk to cyclize, but contrasts with the cyclization time of a free self-avoiding walk (SAW), for which τ ~ N^2.2. A simulation study by Cheng and Makarov [J. Phys. Chem. B 114, 3321 (2010)] of the chain-end reaction time of an SAW on a flat impenetrable surface leads to the same N^2.2 behavior, which is surprising given the reduced conformational space a tethered polymer has to explore in order to react. (C) 2014 AIP Publishing LLC.
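In the Wilemski-Fixman approximation invoked here, the reaction time is the time integral of the normalized sink-sink correlation function; the standard closure, in generic notation, reads:

```latex
% Wilemski-Fixman closure: tau_r follows from the equilibrium
% time-correlation function of the sink S evaluated on the chain
% trajectory R(t).
\tau_r \;\approx\; \int_0^\infty
  \left[\frac{\langle S\bigl(R(0)\bigr)\,S\bigl(R(t)\bigr)\rangle}
             {\langle S\rangle^{2}} - 1\right] dt
```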
Abstract:
Monte Carlo modeling of light transport in multilayered tissue (MCML) is modified to incorporate objects of various shapes (sphere, ellipsoid, cylinder, or cuboid) with a refractive-index-mismatched boundary. These geometries would be useful for modeling lymph nodes, tumors, blood vessels, capillaries, bones, the head, and other body parts. Mesh-based Monte Carlo (MMC) has also been used to compare the results from the MCML with embedded objects (MCML-EO). Our simulation assumes a realistic tissue model and can also handle the transmission/reflection at the object-tissue boundary due to the mismatch of the refractive index. A simulation with MCML-EO takes a few seconds, whereas MMC takes nearly an hour for the same geometry and optical properties. Contour plots of the fluence distribution from MCML-EO and MMC correlate well. This study helps one decide which tool to use for modeling light propagation in biological tissue with objects of regular shape embedded in it. For irregular inhomogeneities in the model (tissue), MMC has to be used. If the embedded objects (inhomogeneities) are of regular geometry (shape), then MCML-EO is the better option, as simulations like Raman scattering, fluorescence imaging, and optical coherence tomography are currently possible only with MCML. (C) 2014 Society of Photo-Optical Instrumentation Engineers (SPIE)
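The abstract gives no implementation details, but two ingredients any such photon-transport code shares are exponential free-path sampling and a Fresnel test at a refractive-index mismatch. A minimal sketch of just those two pieces (not the MCML-EO source):

```python
import numpy as np

rng = np.random.default_rng(0)

def photon_step(mu_t):
    """Sample a free path length from the Beer-Lambert distribution
    with total interaction coefficient mu_t (1/length units)."""
    return -np.log(rng.random()) / mu_t

def fresnel_reflectance(n1, n2, cos_i):
    """Unpolarized Fresnel reflectance at a refractive-index mismatch,
    given the cosine of the angle of incidence."""
    sin_t2 = (n1 / n2) ** 2 * (1.0 - cos_i ** 2)
    if sin_t2 >= 1.0:
        return 1.0  # total internal reflection
    cos_t = np.sqrt(1.0 - sin_t2)
    rs = ((n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)) ** 2
    rp = ((n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i)) ** 2
    return 0.5 * (rs + rp)
```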
Abstract:
The complexity of visualizing volumetric data often limits the scope of direct exploration of scalar fields. Isocontour extraction is a popular method for exploring scalar fields because of its simplicity in presenting features in the data. In this paper, we present a novel representation of contours with the aim of studying the similarity relationship between them. The representation maps contours to points in a high-dimensional, transformation-invariant descriptor space. We leverage the power of this representation to design a clustering-based algorithm for detecting symmetric regions in a scalar field. Symmetry detection is a challenging problem because it demands both segmentation of the data and identification of transformation-invariant segments. While the former task can be addressed using topological analysis of scalar fields, the latter requires geometry-based solutions. Our approach combines the two by utilizing the contour tree for segmenting the data and the descriptor space for determining transformation invariance. We discuss two applications, query-driven exploration and asymmetry visualization, that demonstrate the effectiveness of the approach.
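As a rough illustration of the clustering step only (the descriptor construction itself is the paper's contribution and is not reproduced here), one can group contours whose descriptors nearly coincide; the use of DBSCAN and the `eps` threshold are our assumptions:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def symmetric_groups(descriptors, eps=0.1):
    """Group contours whose transformation-invariant descriptors nearly
    coincide; each multi-member cluster is a candidate symmetric region.

    descriptors : (n_contours, n_features) array of descriptor vectors.
    """
    labels = DBSCAN(eps=eps, min_samples=2).fit_predict(descriptors)
    groups = {}
    for i, lab in enumerate(labels):
        if lab != -1:  # -1 marks noise (unmatched contours)
            groups.setdefault(lab, []).append(i)
    return list(groups.values())
```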
Abstract:
It is well known that wrist pulse signals contain information about a person's state of health, and hence diagnosis based on pulse signals has long been important. In this paper, the efficacy of signal processing techniques in extracting useful information from wrist pulse signals is demonstrated using signals recorded under two different experimental conditions, viz. before-lunch and after-lunch conditions. We have used Pearson's product-moment correlation coefficient, which is an effective measure of phase synchronization, to make a statistical analysis of wrist pulse signals. Contour plots and box plots are used to illustrate the various differences. Two-sample t-tests show that the correlations differ in a statistically significant way between the groups. Results show that the correlation coefficient is effective in distinguishing the changes taking place after having lunch. This paper demonstrates the ability of wrist pulse signals to detect changes occurring under the two different conditions. The study assumes importance in view of the limited literature available on the analysis of wrist pulse signals in the case of food intake, and also in view of its potential health care applications.
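A minimal sketch of the kind of analysis described, assuming `signal` is a 1-D numpy array holding a wrist pulse recording; segmenting into equal windows and correlating consecutive segments is our illustrative choice, not necessarily the authors' exact procedure:

```python
import numpy as np
from scipy import stats

def segment_correlations(signal, seg_len):
    """Pearson correlation between consecutive equal-length segments,
    a simple proxy for phase synchronization within a pulse record."""
    segs = [signal[i:i + seg_len]
            for i in range(0, len(signal) - seg_len + 1, seg_len)]
    return np.array([stats.pearsonr(a, b)[0]
                     for a, b in zip(segs, segs[1:])])

# Hypothetical usage: compare before- vs after-lunch recordings with a
# two-sample t-test on the two sets of correlation coefficients.
# t, p = stats.ttest_ind(corrs_before, corrs_after)
```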
Abstract:
Active biological processes like transcription, replication, recombination, DNA repair, and DNA packaging encounter bent DNA. The machineries associated with these processes interact with the DNA at short length scales (<100 base pairs). Thus, the study of the elasticity of DNA at such length scales is very important. We use fully atomistic molecular dynamics (MD) simulations along with various theoretical methods to determine the elastic properties of dsDNA of different lengths and base sequences. We also study DNA elasticity in the nucleosome core particle (NCP), both in the presence and in the absence of salt. We determine the stretch modulus and persistence length of short dsDNA and nucleosomal DNA from the contour length distribution and the bend angle distribution, respectively. For short dsDNA, we find that the stretch modulus increases with ionic strength while the persistence length decreases. Calculated values of the stretch modulus and persistence length for DNA are in quantitative agreement with available experimental data. The trend is opposite for NCP DNA. We find that the presence of the histone core makes the DNA stiffer, making the persistence length 3-4 times higher than that of bare DNA. Similarly, we also find an increase in the stretch modulus for the NCP DNA. Our study reports, for the first time, the elastic properties of DNA when it is wrapped around the histone core in the NCP. We further show that the WLC model is inadequate to describe DNA elasticity at short length scales. Our results provide a deeper understanding of DNA mechanics, and the methods are applicable to most protein-DNA complexes.
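For comparison with the reported inadequacy of the WLC model at short lengths, the standard WLC prediction for the mean-squared end-to-end distance of a chain of contour length L and persistence length L_p is:

```latex
% Worm-like-chain mean-squared end-to-end distance; deviations from
% this form at short lengths (< ~100 bp) are what the abstract refers to.
\langle R^2 \rangle \;=\; 2 L_p L
  \left[\,1 - \frac{L_p}{L}\Bigl(1 - e^{-L/L_p}\Bigr)\right]
```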
Abstract:
In optical character recognition of very old books, recognition accuracy drops mainly due to the merging or breaking of characters. In this paper, we propose the first algorithm to segment merged Kannada characters, using a hypothesis to select the positions to be cut. The method searches for the best possible positions at which to segment by taking into account the support vector machine classifier's recognition score and the validity of the aspect ratio (width-to-height ratio) of the segments between every pair of cut positions. The hypothesis used to select the cut position is based on the fact that a concave surface exists above and below the touching portion. These concave surfaces are identified by tracing the valleys in the top contour of the image, and similarly for the image rotated upside down. The cut positions are then derived as closely matching valleys of the original and the rotated images. Our proposed segmentation algorithm works better than the existing vertical-projection-profile-based segmentation across different font styles, shapes, and sizes. The proposed algorithm has been tested on 1125 different word images, each containing multiple merged characters, from an old Kannada book; 89.6% correct segmentation is achieved, and the character recognition accuracy on merged words is 91.2%. A few points of merge are still missed due to the absence of a matched valley, caused by the specific shapes of the particular characters meeting at the merges.
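A sketch of the valley-matching idea, under our own representational assumptions (a boolean image array, valleys as local maxima of the top-contour row index, and a column tolerance `tol`):

```python
import numpy as np
from scipy.signal import find_peaks

def top_contour(binary_img):
    """Row index of the first foreground pixel in each column (origin
    at top-left); columns without ink get the image height."""
    h, _ = binary_img.shape
    rows = binary_img.argmax(axis=0)
    rows[~binary_img.any(axis=0)] = h
    return rows

def candidate_cuts(binary_img, tol=2):
    """Columns where valleys of the top contour align with valleys of
    the vertically flipped image (i.e. the bottom contour), matching
    the concave surfaces above and below a touching stroke."""
    # a valley in the visual contour = local maximum of the row index
    v_top, _ = find_peaks(top_contour(binary_img))
    v_bot, _ = find_peaks(top_contour(binary_img[::-1]))
    return [c for c in v_top if np.any(np.abs(v_bot - c) <= tol)]
```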
Abstract:
The hot deformation behavior of a Nb-1 wt.% Zr alloy was studied using uniaxial compression tests carried out in vacuum to a true strain of 0.6 in the temperature range of 900 to 1700 °C and the strain rate range of 3 × 10⁻³ to 10 s⁻¹. The optimum regime of hot workability of the Nb-1Zr alloy was determined from contour plots of the strain rate sensitivity (m). A high m of about 0.2 was obtained at 1200-1500 °C for strain rates of 10⁻³ to 10⁻¹ s⁻¹, and at 1600-1700 °C for strain rates of 10⁻¹ to 1 s⁻¹. Microstructures of the deformed samples showed features of dynamic recrystallization within the high strain rate sensitivity domain. Compared to the study on the Nb-1Zr-0.1C alloy, Nb-1Zr showed a lower flow stress and an optimum hot working domain at lower temperatures. In the 1500 to 1700 °C range, the apparent activation energy of deformation for Nb-1Zr was 259 kJ mol⁻¹, the stress exponent 5, and the activation volume about 200 to 700 b³. (C) 2015 Elsevier Ltd. All rights reserved.
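The two quantities underlying the contour plots and the kinetic analysis are standard; in the usual notation:

```latex
% Strain-rate sensitivity (mapped in the contour plots) and the
% kinetic rate equation from which the apparent activation energy Q
% and stress exponent n are extracted (A is a material constant,
% R the gas constant).
m \;=\; \left.\frac{\partial \ln\sigma}{\partial \ln\dot{\varepsilon}}\right|_{T,\varepsilon},
\qquad
\dot{\varepsilon} \;=\; A\,\sigma^{n}\exp\!\left(-\frac{Q}{RT}\right)
```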
Abstract:
Imaging flow cytometry is an emerging technology that combines the statistical power of flow cytometry with the spatial and quantitative morphology of digital microscopy. It allows high-throughput imaging of cells with good spatial resolution while they are in flow. This paper proposes a general framework for the processing/classification of cells imaged using an imaging flow cytometer. Each cell is localized by finding an accurate cell contour. Then, features reflecting cell size, circularity, and complexity are extracted for classification using an SVM. Unlike conventional iterative, semi-automatic segmentation algorithms such as active contours, we propose a non-iterative, fully automatic, graph-based cell localization. In order to evaluate the performance of the proposed framework, we have successfully classified unstained, label-free leukaemia cell lines MOLT, K562, and HL60 from video streams captured using a custom-fabricated, cost-effective microfluidics-based imaging flow cytometer. The proposed system is a significant development in the direction of building a cost-effective cell analysis platform that would facilitate affordable mass screening camps examining cellular morphology for disease diagnosis.

Lay description: In this article, we propose a novel framework for processing the raw data generated using microfluidics-based imaging flow cytometers. Microfluidics microscopy, or microfluidics-based imaging flow cytometry (mIFC), is a recent microscopy paradigm that combines the statistical power of flow cytometry with the spatial and quantitative morphology of digital microscopy, allowing cells to be imaged while they are in flow. In comparison to conventional slide-based imaging systems, mIFC is a nascent technology enabling high-throughput imaging of cells and is yet to take the form of a clinical diagnostic tool. The proposed framework processes the raw data generated by mIFC systems. The framework incorporates several steps: pre-processing of the raw video frames to enhance the contents of the cell, localizing the cell by a novel, fully automatic, non-iterative graph-based algorithm, extraction of different quantitative morphological parameters, and subsequent classification of the cells. In order to evaluate the performance of the proposed framework, we have successfully classified unstained, label-free leukaemia cell lines MOLT, K562, and HL60 from video streams captured using a cost-effective microfluidics-based imaging flow cytometer. The cell lines HL60, K562, and MOLT were obtained from the ATCC (American Type Culture Collection) and were cultured separately in the lab; thus each culture contains cells from its own category alone and thereby provides the ground truth. Each cell is localized by finding a closed cell contour: a directed, weighted graph is defined from the Canny edge image of the cell such that the closed contour lies along the shortest weighted path surrounding the centroid of the cell, from a starting point on a good curve segment to an immediate endpoint. Once the cell is localized, morphological features reflecting the size, shape, and complexity of the cells are extracted and used to develop a support vector machine based classification system. We could classify the cell lines with good accuracy, and the results were quite consistent across different cross-validation experiments.
We hope that imaging flow cytometers equipped with the proposed image processing framework will enable cost-effective, automated, and reliable disease screening in overloaded facilities that cannot afford to hire skilled personnel in large numbers. Such platforms would potentially facilitate screening camps in low-income countries, thereby transforming current health care paradigms by enabling rapid, automated diagnosis of diseases like cancer.
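The generic core of such graph-based localization is a shortest weighted path search; a standard Dijkstra sketch is below. In the paper's setting the nodes would be Canny edge pixels and the weights would favor strong curve segments encircling the centroid; those details are the authors' and are not reproduced here.

```python
import heapq

def dijkstra(graph, start):
    """Shortest weighted paths from start.

    graph : dict mapping node -> iterable of (neighbour, weight) pairs.
    Returns (dist, prev): distances and predecessor links, from which a
    shortest path (or, with a cut edge, a shortest cycle) is read off.
    """
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, ()):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev
```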
Abstract:
Acoustic-feature-based speech (syllable) rate estimation and syllable nuclei detection are important problems in automatic speech recognition (ASR), computer-assisted language learning (CALL), and fluency analysis. A typical solution for both problems consists of two stages. The first stage involves computing a short-time feature contour such that most of the peaks of the contour correspond to syllabic nuclei. In the second stage, the peaks corresponding to the syllable nuclei are detected. In this work, instead of peak detection, we perform a mode-shape classification, which is formulated as a supervised binary classification problem, with mode shapes representing the syllabic nuclei as one class and the remainder as the other. We use the temporal correlation and selected sub-band correlation (TCSSBC) feature contour, and the mode shapes in the TCSSBC feature contour are converted into a set of feature vectors using an interpolation technique. A support vector machine classifier is used for the classification. Experiments are performed separately using the Switchboard, TIMIT, and CTIMIT corpora in a five-fold cross-validation setup. The average correlation coefficients for syllable rate estimation turn out to be 0.6761, 0.6928, and 0.3604 for the three corpora, respectively, which outperform those obtained by the best of the existing peak detection techniques. Similarly, the average F-scores (syllable level) for syllable nuclei detection are 0.8917, 0.8200, and 0.7637 for the three corpora, respectively. (C) 2016 Elsevier B.V. All rights reserved.
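A sketch of the interpolation step described above, with hypothetical inputs (`contour` as a 1-D array and `left`/`right` as the bounding minima of one mode shape); the fixed length `n_points` is our assumption:

```python
import numpy as np
from scipy.interpolate import interp1d
from sklearn.svm import SVC

def mode_shape_vector(contour, left, right, n_points=20):
    """Resample one mode shape (the contour segment between two local
    minima) to a fixed-length feature vector for the SVM."""
    seg = contour[left:right + 1]
    f = interp1d(np.linspace(0.0, 1.0, len(seg)), seg)
    return f(np.linspace(0.0, 1.0, n_points))

# Hypothetical training: X stacks mode-shape vectors, y is 1 for shapes
# containing a syllable nucleus and 0 otherwise.
# clf = SVC(kernel="rbf").fit(X, y)
```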
Abstract:
We perceive objects as containing a variety of attributes: local features, relations between features, internal details, and global properties. But we know little about how they combine. Here, we report a remarkably simple additive rule that governs how these diverse object attributes combine in vision. The perceived dissimilarity between two objects was accurately explained as a sum of (a) spatially tuned local contour-matching processes modulated by part decomposition; (b) differences in internal details, such as texture; (c) differences in emergent attributes, such as symmetry; and (d) differences in global properties, such as orientation or overall configuration of parts. Our results elucidate an enduring question in object vision by showing that the whole object is not a sum of its parts but a sum of its many attributes.
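Schematically (our notation, not the paper's), the additive rule says the perceived dissimilarity decomposes as a weighted sum over attribute channels:

```latex
% Perceived dissimilarity between objects A and B as a weighted sum of
% per-channel differences: local contour matching, internal detail,
% emergent attributes, and global properties.
D(A,B) \;=\; \sum_{k} w_k\, d_k(A,B),
\qquad k \in \{\text{contour},\ \text{detail},\ \text{emergent},\ \text{global}\}
```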