985 results for central algorithm


Relevance: 20.00%

Abstract:

The effect of fluid velocity fluctuations on the dynamics of the particles in a turbulent gas–solid suspension is analysed in the low-Reynolds-number and high-Stokes-number limits, where the particle relaxation time is long compared with the correlation time for the fluid velocity fluctuations, and the drag force on the particles due to the fluid can be expressed by the modified Stokes law. A direct numerical simulation procedure is used for solving the Navier–Stokes equations for the fluid; the particles are modelled as hard spheres which undergo elastic collisions, and a one-way coupling algorithm is used in which the force exerted by the fluid on the particles is incorporated, but not the reverse force exerted by the particles on the fluid. The particle mean and root-mean-square (RMS) fluctuating velocities, as well as the probability distribution function for the particle velocity fluctuations and the distribution of acceleration of the particles in the central region of the Couette flow (where the velocity profile is linear and the RMS velocities are nearly constant), are examined. It is found that the distribution of particle velocities is very different from a Gaussian, especially in the spanwise and wall-normal directions. However, the distribution of the acceleration fluctuation on the particles is found to be close to a Gaussian, though the distribution is highly anisotropic and there is a correlation between the fluctuations in the flow and gradient directions. The non-Gaussian nature of the particle velocity fluctuations is found to be due to inter-particle collisions induced by the large particle velocity fluctuations in the flow direction. It is also found that the acceleration distribution on the particles is in very good agreement with the distribution calculated from the velocity fluctuations in the fluid using the Stokes drag law, indicating that there is very little correlation between the fluid velocity fluctuations and the particle velocity fluctuations in the presence of one-way coupling. All of these results indicate that the effect of the turbulent fluid velocity fluctuations can be accurately represented by an anisotropic Gaussian white noise.
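To make the one-way-coupling step concrete, the sketch below (Python/NumPy) advances particles under a Stokes-type drag towards a mean Couette profile plus an anisotropic Gaussian white-noise fluctuation, which is the representation the abstract concludes is accurate. It is a minimal illustration, not the authors' DNS code: the function names, the linear profile and the parameter values are assumptions, and hard-sphere collisions are omitted.

```python
# Minimal sketch (not the authors' DNS code): one-way-coupled particle update in
# which the fluid velocity fluctuation seen by each particle is modelled as
# anisotropic Gaussian white noise and the drag follows a Stokes-type law.
# All names and parameter values here are illustrative assumptions; elastic
# hard-sphere collisions between particles are omitted.
import numpy as np

def advance_particles(x, v, dt, tau_p, mean_flow, noise_rms, rng):
    """One Euler step for N particles (positions x, velocities v, shape (N, 3)).

    tau_p     : particle relaxation time, assumed long compared with the
                fluid velocity correlation time (high-Stokes-number limit)
    mean_flow : callable returning the local mean fluid velocity, e.g. a
                linear Couette profile u = (gamma_dot * y, 0, 0)
    noise_rms : per-direction RMS of the fluid velocity fluctuations (anisotropic)
    """
    u_mean = mean_flow(x)                                  # deterministic mean flow
    u_fluc = noise_rms * rng.standard_normal(v.shape)      # white-noise fluctuation
    drag_accel = (u_mean + u_fluc - v) / tau_p             # Stokes-type drag
    v_new = v + dt * drag_accel
    x_new = x + dt * v_new
    return x_new, v_new

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=(1000, 3))
v = np.zeros((1000, 3))
couette = lambda x: np.column_stack([10.0 * x[:, 1],
                                     np.zeros(len(x)), np.zeros(len(x))])
x, v = advance_particles(x, v, dt=1e-3, tau_p=0.1, mean_flow=couette,
                         noise_rms=np.array([0.3, 0.1, 0.1]), rng=rng)
```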

Relevance: 20.00%

Abstract:

A fundamental task in bioinformatics involves the transfer of knowledge from one protein molecule onto another by recognizing similarities. Such similarities are identified at different levels: that of sequence, whole fold, or important substructures. Comparison of binding sites is important for understanding functional similarities among proteins and also for understanding drug cross-reactivities. Current methods in the literature have their own merits and demerits, warranting exploration of newer concepts and algorithms, especially for large-scale comparisons and for obtaining accurate residue-wise mappings. Here, we report the development of a new algorithm, PocketAlign, for obtaining structural superpositions of binding sites. The software is available as a web service at http://proline.physics.iisc.ernet.in/pocketalign/. The algorithm encodes shape descriptors in the form of geometric perspectives, supplemented by chemical group classification. The shape descriptor considers several perspectives, with each residue as the focus, and captures the relative distribution of residues around it in a given site. Residue-wise pairings are computed by comparing the set of perspectives of the first site with that of the second, followed by a greedy approach that incrementally combines residue pairings into a mapping. The mappings in different frames are then evaluated by different metrics encoding the extent of alignment of individual geometric perspectives. Different initial seed alignments are computed, each subsequently extended by detecting consequential atomic alignments in a three-dimensional grid, and the best 500 are stored in a database. Alignments are then ranked, and the top-scoring alignments are reported and streamed into PyMOL for visualization and analyses. The method is validated for accuracy and sensitivity and benchmarked against existing methods. An advantage of PocketAlign, compared with some of the existing tools available for binding-site comparison in the literature, is that it explores different schemes for identifying an alignment and thus has a better potential to capture similarities in ligand-recognition abilities. PocketAlign, by finding a detailed alignment of a pair of sites, provides insights into why two sites are similar and which sets of residues and atoms contribute to the similarity.
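As a rough illustration of the greedy pairing step described above, the following sketch (not the PocketAlign implementation; the similarity scores are assumed inputs rather than the paper's perspective metrics) accepts candidate residue pairings in order of decreasing similarity, keeping each residue in at most one pairing.

```python
# Illustrative sketch only (not the PocketAlign implementation): a greedy step
# that turns pairwise perspective-similarity scores into a one-to-one residue
# mapping between two binding sites.  The scores themselves are assumptions.
from typing import Dict, List, Tuple

def greedy_residue_mapping(scores: List[Tuple[float, int, int]]) -> Dict[int, int]:
    """scores: (similarity, residue_index_site1, residue_index_site2) triples."""
    mapping: Dict[int, int] = {}
    used_site2 = set()
    # Consider the best-scoring candidate pairings first and accept each one
    # only if neither residue has already been assigned.
    for sim, r1, r2 in sorted(scores, reverse=True):
        if r1 not in mapping and r2 not in used_site2:
            mapping[r1] = r2
            used_site2.add(r2)
    return mapping

# Toy usage with made-up similarity scores between three residues of each site.
candidate_pairs = [(0.9, 0, 1), (0.8, 0, 2), (0.7, 1, 2), (0.4, 2, 0)]
print(greedy_residue_mapping(candidate_pairs))   # {0: 1, 1: 2, 2: 0}
```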

Relevance: 20.00%

Abstract:

As with 1,2-diphenylethane (dpe), X-ray crystallographic methods measure the central bond in meso-3,4-diphenylhexane-2,5-dione (dphd) as significantly shorter than normal for an sp(3)-sp(3) bond. The same methods measure the benzylic (ethane C-Ph) bonds in dphd as unusually long for sp(3)-sp(2) liaisons. Torsional motions of the phenyl rings about the C-Ph bonds have been proposed as the artifacts behind the result of a 'short' central bond in dpe. While a similar explanation can, presumably, hold for the even 'shorter' central bond in dphd, it cannot account for the 'long' C-Ph bonds. The phenyl groups, departing much from regular hexagonal shape, adopt highly skewed conformations with respect to the plane constituted by the four central atoms. It is thought that the thermal motions of the phenyl rings, conditioned by the potential wells in which they are ensconced in the unit cell, are largely libratory around their normal axes. In what appears to be a straightforward explanation under the 'rigid-body' concept, these libratory motions of the phenyl rings, which account at the same time for the 'short' central bond, are the artifacts behind the 'long' measurement of the C-Ph bonds. These motions could be superimposed on torsional motions analogous to those proposed in the case of dpe. An inspection of the ORTEP diagram from the 298 K data on dphd clearly suggests these possibilities. Supportive evidence for these qualitative explanations, from an analysis of the differences between the mean-square displacements of C(1) and C(7)/C(1a) and C(7a) based on the 'rigid-body' model, is discussed. (C) 2002 Elsevier Science B.V. All rights reserved.
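For reference, the textbook rigid-body libration estimate that underlies this argument (a standard crystallographic correction, not a value taken from the paper) relates the apparent shortening of a distance to the mean-square libration amplitude:

```latex
% Standard rigid-body libration estimate (textbook result, not from the paper):
% an atom at distance d from a libration axis, oscillating with mean-square
% angular amplitude <omega^2> (in rad^2), appears displaced towards the axis,
% so the apparent (X-ray) distance is shortened by roughly
\[
  \Delta d \;\approx\; \tfrac{1}{2}\, d\, \langle \omega^{2} \rangle ,
  \qquad
  d_{\mathrm{true}} \;\approx\; d_{\mathrm{obs}} \;+\; \tfrac{1}{2}\, d_{\mathrm{obs}}\, \langle \omega^{2} \rangle .
\]
```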

Relevance: 20.00%

Abstract:

Motion estimation is one of the most power-hungry operations in video coding. While optimal search methods (e.g., full search) give the best quality, non-optimal methods are often used in order to reduce cost and power. Various algorithms that trade off quality against complexity have been used in practice. Global elimination is an algorithm based on pixel averaging that reduces the complexity of motion search while keeping performance close to that of full search. We propose an adaptive version of the global elimination algorithm that extracts individual macro-block features using the Hadamard transform to optimize the search. The performance achieved is close to that of the full-search method and of global elimination. Operational complexity, and hence power, is reduced by 30% to 45% compared with the global elimination method.
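For context, here is a minimal sketch of the basic global-elimination idea (Python/NumPy); it is not the proposed adaptive Hadamard-based variant, and the sub-block size, number of surviving candidates and search range are illustrative assumptions. A cheap SAD on sub-block averages pre-selects a few candidate motion vectors, and the full SAD is evaluated only for those survivors.

```python
# Sketch of the global-elimination idea (not the paper's adaptive variant):
# a cheap SAD on 4x4 sub-block averages pre-selects candidate motion vectors,
# and full-resolution SAD is computed only for the surviving candidates.
import numpy as np

def subblock_means(block, s=4):
    """Average each s x s sub-block of a 16 x 16 macroblock -> 4 x 4 features."""
    h, w = block.shape
    return block.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def global_elimination_search(cur_mb, ref_frame, top, left, search=8, keep=8):
    """Return the motion vector minimising full SAD among the cheap-SAD survivors."""
    cur_feat = subblock_means(cur_mb)
    candidates = []
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + 16 > ref_frame.shape[0] or x + 16 > ref_frame.shape[1]:
                continue
            ref_mb = ref_frame[y:y + 16, x:x + 16]
            cheap_sad = np.abs(subblock_means(ref_mb) - cur_feat).sum()
            candidates.append((cheap_sad, dy, dx, ref_mb))
    candidates.sort(key=lambda c: c[0])              # keep the cheapest-SAD survivors
    best = min(candidates[:keep],
               key=lambda c: np.abs(c[3].astype(int) - cur_mb.astype(int)).sum())
    return best[1], best[2]
```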

Relevance: 20.00%

Abstract:

Bluetooth is a short-range radio technology operating in the unlicensed industrial-scientific-medical (ISM) band at 2.45 GHz. A scatternet is established by linking several piconets together in an ad hoc fashion to yield a global wireless ad hoc network. This paper proposes a polling policy that aims to achieve increased system throughput and reduced packet delays while providing reasonably good fairness among all traffic flows in a Bluetooth scatternet. Experimental results show that our proposed algorithm improves performance over a well-known existing algorithm.
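Purely as an illustration of the setting (the abstract does not specify the proposed policy), the sketch below models a scatternet as piconets sharing a bridge node and has each master poll its slaves in descending order of backlog; the topology, names and the policy itself are assumptions.

```python
# Purely illustrative sketch, not the paper's polling policy: a scatternet
# represented as piconets that share a bridge node, with each master polling
# its slaves in descending order of queued packets.  All names are assumptions.
from collections import deque

piconets = {
    "P1": {"master": "A", "slaves": ["B", "C", "D"]},
    "P2": {"master": "E", "slaves": ["D", "F"]},   # node D bridges P1 and P2
}
queues = {n: deque() for p in piconets.values() for n in [p["master"]] + p["slaves"]}

def poll_round(piconet):
    """One polling cycle: slaves with more backlogged packets are polled first."""
    order = sorted(piconet["slaves"], key=lambda s: len(queues[s]), reverse=True)
    for slave in order:
        if queues[slave]:
            packet = queues[slave].popleft()       # slave-to-master transfer
            queues[piconet["master"]].append(packet)

queues["B"].extend(["pkt1", "pkt2"]); queues["D"].append("pkt3")
poll_round(piconets["P1"])
```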

Relevance: 20.00%

Abstract:

Use of engineered landfills for the disposal of industrial wastes is currently a common practice. Bentonite is attracting greater attention not only as a capping and lining material in landfills but also as a buffer and backfill material for repositories of high-level nuclear waste around the world. In the design of buffer and backfill materials, it is important to know the swelling pressures of compacted bentonite with different electrolyte solutions. Theoretical studies on swell-pressure behaviour are all based on Diffuse Double Layer (DDL) theory. To establish a relation between the swell pressure and the void ratio of the soil, it is necessary to calculate the mid-plane potential in the diffuse part of the interacting ionic double layers. The difficulty in these calculations is the elliptic integral involved in the relation between the half-space distance and the mid-plane potential. Several investigators have circumvented this problem using indirect methods or cumbersome numerical techniques. In this work, a novel approach is proposed for theoretical estimation of the swell pressures of fine-grained soils from DDL theory. The proposed approach circumvents the complex computations in establishing the relationship between the mid-plane potential and the distance between the plates, in other words, between the swell pressure and the void ratio.
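For reference, the classical Gouy-Chapman (DDL) relations that give rise to the elliptic integral mentioned above are, in the usual nondimensional form, as follows; this is the standard formulation (e.g. van Olphen's), not the authors' new approach.

```latex
% Classical DDL relations referred to in the abstract (standard formulation,
% not the paper's new approach).  Here u is the nondimensional potential,
% z its value at the clay surface, u_d the mid-plane value, K the Debye-Huckel
% parameter, 2d the plate spacing and n the bulk ion concentration.
\[
  K d \;=\; \int_{u_d}^{z} \frac{\mathrm{d}u}{\sqrt{\,2\cosh u \;-\; 2\cosh u_d\,}}
  \qquad\text{(elliptic integral linking half-spacing and mid-plane potential)}
\]
\[
  p \;=\; 2\, n\, k\, T\,\bigl(\cosh u_d \;-\; 1\bigr)
  \qquad\text{(swelling pressure from the mid-plane potential)}
\]
```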

Relevance: 20.00%

Abstract:

Background: Temporal analysis of gene expression data has been limited to identifying genes whose expression varies with time and/or correlations between genes that have similar temporal profiles. Often, the methods do not consider the underlying network constraints that connect the genes. It is becoming increasingly evident that interactions change substantially with time. Thus far, there is no systematic method to relate the temporal changes in gene expression to the dynamics of the interactions between them. Information on interaction dynamics would open up possibilities for discovering new mechanisms of regulation, by providing valuable insight into identifying time-sensitive interactions, and would permit studies on the effect of a genetic perturbation. Results: We present NETGEM, a tractable model rooted in Markov dynamics, for analyzing the dynamics of the interactions between proteins based on the dynamics of the expression changes of the genes that encode them. The model treats the interaction strengths as random variables which are modulated by suitable priors. This approach is necessitated by the extremely small sample size of the datasets relative to the number of interactions. The model is amenable to a linear-time algorithm for efficient inference. Using temporal gene expression data, NETGEM was successful in identifying (i) temporal interactions and determining their strength, (ii) functional categories of the actively interacting partners and (iii) dynamics of interactions in perturbed networks. Conclusions: NETGEM represents an optimal trade-off between model complexity and data requirement. It was able to deduce actively interacting genes and functional categories from temporal gene expression data, and it permits inference by incorporating the information available in perturbed networks. Given that the inputs to NETGEM are only the network and the temporal variation of the nodes, this algorithm promises to have widespread applications beyond biological systems. The source code for NETGEM is available from https://github.com/vjethava/NETGEM.
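As a schematic of the kind of Markov-dynamics inference the abstract describes (not NETGEM's actual model), the sketch below runs a scaled forward pass for a single edge whose hidden interaction strength follows a discrete Markov chain and is observed through a noisy co-expression signal; the states, transition matrix and emission model are all assumptions. Because the pass is linear in the number of time points and can be run independently per edge, inference of this flavour scales linearly with network size.

```python
# Schematic only (not NETGEM's actual model): per-edge hidden Markov chain over
# discrete interaction strengths, observed through a Gaussian co-expression signal.
import numpy as np

def forward_loglik(obs, states, trans, sigma, prior):
    """Scaled forward pass: log P(obs) for one edge, obs[t] ~ Normal(state_t, sigma)."""
    def emit(o):
        return np.exp(-0.5 * ((o - states) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    alpha = prior * emit(obs[0])
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = emit(o) * (trans.T @ alpha)   # predict with the Markov prior, then weight
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Toy usage: three interaction-strength states (repressing / absent / activating),
# a sticky transition matrix, and a short co-expression-change series.
states = np.array([-1.0, 0.0, 1.0])
trans = np.array([[0.90, 0.05, 0.05],
                  [0.05, 0.90, 0.05],
                  [0.05, 0.05, 0.90]])
prior = np.full(3, 1.0 / 3.0)
obs = np.array([0.1, 0.8, 1.1, 0.9])
print(forward_loglik(obs, states, trans, sigma=0.3, prior=prior))
```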

Relevance: 20.00%

Abstract:

Landslides are hazards encountered during the monsoon in the undulating terrain of the Western Ghats; they cause a geomorphic makeover of the earth's surface and result in significant damage to life and property. An attempt is made in this paper to identify landslide-susceptible regions in the downstream Sharavathi river basin using the frequency ratio method, based on field investigations carried out during July-November 2007. In this regard, base layers of spatial data such as topography, land cover, geology and soil were considered and supplemented with field investigations of landslides. Factors that influence landslides were extracted from the spatial database, and the probabilistic frequency-ratio model was computed from these factors. Landslide susceptibility indices were computed and grouped into five classes. Validation of the landslide susceptibility map showed an accuracy of 89%, as 25 of the 28 regions tallied with the field conditions of highly vulnerable landslide regions. The landslide susceptibility map generated for the downstream region would be useful for district officials to implement appropriate mitigation measures to reduce hazards.
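To make the frequency-ratio computation explicit, here is a minimal sketch (the layer names and counts are illustrative, not the study's data): the frequency ratio of a class is the proportion of landslide cells falling in that class divided by the proportion of all cells in that class, and the susceptibility index of a cell is the sum of the ratios of its classes across the factor layers.

```python
# Minimal sketch of the frequency-ratio method described in the abstract.
# Layer names and counts are illustrative assumptions, not the study's data.
def frequency_ratios(class_cell_counts, class_landslide_counts):
    """FR(class) = (% of landslide cells in class) / (% of all cells in class)."""
    total_cells = sum(class_cell_counts.values())
    total_slides = sum(class_landslide_counts.values())
    return {c: (class_landslide_counts.get(c, 0) / total_slides)
               / (class_cell_counts[c] / total_cells)
            for c in class_cell_counts}

# Toy example for a single factor layer (slope classes, in degrees).
slope_cells = {"0-10": 5000, "10-25": 3000, "25-45": 1500, ">45": 500}
slope_slides = {"0-10": 5, "10-25": 20, "25-45": 45, ">45": 30}
fr_slope = frequency_ratios(slope_cells, slope_slides)

# The landslide susceptibility index of a cell is the sum of the FR values of its
# classes over all factor layers; only the slope layer is shown here.
lsi_example = fr_slope["25-45"]
```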