19 results for Clustering and objective measures

at Indian Institute of Science - Bangalore - India


Relevance:

100.00%

Abstract:

In this article, we present a novel application of a quantum clustering (QC) technique to objectively cluster the conformations sampled by molecular dynamics (MD) simulations performed on different ligand-bound structures of a protein. We further portray each conformational population in terms of dynamically stable network parameters, which capture the ligand-induced variations in the ensemble in atomistic detail. The conformational populations thus identified by the QC method and verified by network parameters are evaluated for different ligand-bound states of the protein pyrrolysyl-tRNA synthetase (DhPylRS) from D. hafniense. The ligand/environment-induced redistribution of protein conformational ensembles forms the basis for understanding several important biological phenomena such as allostery and enzyme catalysis. The atomistic-level characterization of each population in the conformational ensemble in terms of the re-orchestrated networks of amino acids is a challenging problem, especially when the changes are minimal at the backbone level. Here we demonstrate that the QC method is sensitive to such subtle changes and is able to cluster MD snapshots that are similar at the side-chain interaction level. Although we have applied these methods to simulation trajectories of a modest time scale (20 ns each), we emphasize that our methodology provides a general approach towards an objective clustering of large-scale MD simulation data and may be applied to probe multistate equilibria at longer time scales, and to problems related to protein folding, for any protein or protein-protein/RNA/DNA complex of interest with a known structure.
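Quantum clustering in this sense commonly follows the Horn-Gottlieb formulation: gradient descent on a Schrödinger-type potential built from a Parzen-window density estimate, with the minima of the potential marking cluster centers. A minimal NumPy sketch of that formulation follows, assuming the MD snapshots have already been reduced to low-dimensional feature vectors; the kernel width `sigma`, step size, and iteration count are illustrative placeholders, not values from the paper.

```python
import numpy as np

def potential_at(query, data, sigma):
    """Horn-Gottlieb quantum clustering: evaluate the Schrodinger-type
    potential V at the query points, given the data set.  Minima of V
    mark candidate cluster centers (V is defined up to a constant)."""
    d2 = ((query[:, None, :] - data[None, :, :]) ** 2).sum(-1)  # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))   # Parzen-window (Gaussian) kernel
    psi = w.sum(axis=1)                    # wavefunction / density estimate
    return (w * d2).sum(axis=1) / (2.0 * sigma ** 2 * psi)

def descend(data, sigma=1.0, steps=300, lr=0.05, eps=1e-4):
    """Slide a replica of every point downhill on V using a central
    finite-difference gradient; replicas that converge to the same
    minimum are assigned to the same (conformational) cluster."""
    x = data.copy()
    for _ in range(steps):
        for d in range(x.shape[1]):
            shift = np.zeros_like(x)
            shift[:, d] = eps
            g = (potential_at(x + shift, data, sigma)
                 - potential_at(x - shift, data, sigma)) / (2 * eps)
            x[:, d] -= lr * g
    return x  # group the converged replicas (e.g. by rounding) to label clusters
```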

Relevance:

100.00%

Abstract:

Isochronal and isothermal ageing experiments have been carried out to determine the influence of a 0.01 at.% addition of a second solute on the clustering rate in the quenched Al-4.4 at.% Zn alloy. The influence of quenching and ageing temperatures has been interpreted to obtain the apparent vacancy formation and vacancy migration energies in the various ternary alloys. Using a vacancy-aided clustering model, the following binding free energies have been evaluated (all ±0.02 eV): Ce, 0.18; Dy, 0.24; Fe, 0.18; Li, 0.25; Mn, 0.27; Nb, 0.18; Pt, 0.23; Sb, 0.21; Si, 0.30; Y, 0.25; and Yb, 0.23. These values refer to the binding between a solute atom and a single vacancy. The values of the vacancy migration energy (c. 0.4 eV) and the experimental activation energy for solute diffusion (c. 1.1 eV) are unaffected by the presence of the ternary atoms in the Al-Zn alloy.
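The abstract does not spell out how the apparent energies are separated; the standard Arrhenius analysis behind quench-and-age experiments of this kind is sketched below as a hedged reconstruction, not the authors' exact model.

```latex
% Quenched-in vacancy concentration, set by the quench temperature T_q,
% and early-stage clustering rate at the ageing temperature T_a:
\begin{align*}
  c_v &\propto \exp\!\left(-\frac{E_f}{k_B T_q}\right), &
  R &\propto c_v \exp\!\left(-\frac{E_m}{k_B T_a}\right),
\end{align*}
% so plots of ln R versus 1/T_q (fixed T_a) and versus 1/T_a (fixed T_q)
% yield the apparent formation energy E_f and migration energy E_m;
% solute--vacancy binding of energy E_b raises the retained vacancy
% concentration and thereby lowers the apparent E_f.
```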

Relevance:

100.00%

Abstract:

Clustering is a process of partitioning a given set of patterns into meaningful groups. The clustering process can be viewed as consisting of three phases: (i) feature selection, (ii) classification, and (iii) description generation. Conventional clustering algorithms implicitly use knowledge about the clustering environment to a large extent in the feature selection phase. This reduces the need for environmental knowledge in the remaining two phases, permitting the use of a simple numerical measure of similarity in the classification phase. Conceptual clustering algorithms proposed by Michalski and Stepp [IEEE Trans. PAMI, PAMI-5, 396–410 (1983)] and Stepp and Michalski [Artif. Intell., pp. 43–69 (1986)] make use of knowledge about the clustering environment, in the form of a set of predefined concepts, to compute the conceptual cohesiveness during the classification phase. Michalski and Stepp [IEEE Trans. PAMI, PAMI-5, 396–410 (1983)] have argued that the results obtained with conceptual clustering algorithms are superior to those of conventional methods of numerical classification. However, this claim was not supported by the experimental results obtained by Dale [IEEE Trans. PAMI, PAMI-7, 241–244 (1985)]. In this paper a theoretical framework, based on an intuitively appealing set of axioms, is developed to characterize the equivalence between conceptual clustering and conventional clustering: it is shown that any classification obtained using conceptual clustering can also be obtained using conventional clustering, and vice versa.
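One direction of that equivalence is easy to make concrete: if the predefined concepts are encoded as indicator features, a conventional numerical clusterer recovers concept-based groupings. The sketch below is illustrative only; the `concepts` predicates are hypothetical, and the paper's axiomatic framework is not reproduced here.

```python
import numpy as np

# Hypothetical predefined concepts, each a predicate over a raw pattern.
concepts = [
    lambda p: p[0] > 0.5,   # e.g. "large"
    lambda p: p[1] < 0.2,   # e.g. "narrow"
]

def concept_space(patterns):
    """Map raw patterns to 0/1 indicator vectors over the concept set.
    Squared Euclidean distance in this space counts concept
    disagreements, so any conventional (numerical) clusterer run on it
    groups patterns by shared concept membership."""
    return np.array([[float(c(p)) for c in concepts] for p in patterns])

rng = np.random.default_rng(0)
patterns = rng.random((20, 2))
X = concept_space(patterns)   # feed X to k-means or any numerical method
```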

Relevance:

100.00%

Abstract:

In this paper, we develop a game-theoretic approach for clustering features in a learning problem. Feature clustering can serve as an important preprocessing step in many problems, such as feature selection and dimensionality reduction. In this approach, we view features as rational players of a coalitional game, where they form coalitions (or clusters) among themselves in order to maximize their individual payoffs. We show how the Nash Stable Partition (NSP), a well-known concept in coalitional game theory, provides a natural way of clustering features. Through this approach, one can obtain desirable properties of the clusters by choosing appropriate payoff functions. For a small number of features, the NSP-based clustering can be found by solving an integer linear program (ILP). However, for a large number of features, the ILP-based approach does not scale well, and hence we propose a hierarchical approach. Interestingly, a key result that we prove on the equivalence between a k-size NSP of a coalitional game and a minimum k-cut of an appropriately constructed graph comes in handy for large-scale problems. In this paper, we use the feature selection problem (in a classification setting) as a running example to illustrate our approach, and we conduct experiments to illustrate its efficacy.
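Exploiting the NSP/minimum-k-cut equivalence in practice amounts to partitioning a feature graph. The sketch below stands in for that step with spectral clustering on a correlation-based affinity matrix; it is not the paper's ILP or hierarchical algorithm, and the affinity choice is an assumption.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def cluster_features(X, k):
    """Partition the columns (features) of data matrix X into k groups.
    Edge weights are absolute Pearson correlations between features;
    spectral clustering serves here as a practical surrogate for
    computing a minimum k-cut of the feature graph."""
    W = np.abs(np.corrcoef(X, rowvar=False))   # feature-feature affinity
    np.fill_diagonal(W, 0.0)                   # no self-loops
    labels = SpectralClustering(n_clusters=k, affinity='precomputed',
                                random_state=0).fit_predict(W)
    return labels   # e.g. keep one representative feature per cluster
```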

Relevance:

100.00%

Abstract:

Pure α-Al2O3 exhibits a very high degree of thermodynamic stability among all metal oxides and forms an inert oxide scale on a range of structural alloys at high temperatures. We report that amorphous Al2O3 thin films sputter-deposited over crystalline Si instead show a surprisingly active interface. On annealing, crystallization begins with nuclei of a phase closely resembling γ-alumina forming almost randomly in the amorphous matrix, and with increasing frequency near the substrate/film interface. This nucleation is marked by the signature appearance of sharp (400) and (440) reflections and the formation of a diffuse diffraction halo, with an outer maximal radius of ≈0.23 nm, enveloping the direct beam. The microstructure then evolves by a cluster-coalescence growth mechanism suggestive of swift nucleation and sluggish diffusional kinetics, while locally the Al ions redistribute slowly from chemisorbed and tetrahedral sites to higher anion-coordinated sites. Chemical-state plots constructed from XPS data and simple calculations of the diffraction patterns from hypothetically distorted lattices suggest that the true origins of the diffuse diffraction halo are probably related to a complex change in the electronic structure spurred by the a-γ transformation rather than to pure structural disorder. Concurrent with crystallization within the film, a substantially thick interfacial reaction zone also builds up at the film/substrate interface, with the excess Al acting as a cationic source. (C) 2015 AIP Publishing LLC.
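The (400) and (440) indices are consistent with the cubic defect-spinel cell usually quoted for γ-alumina. As a quick check, a ≈ 0.791 nm (a common literature value, not taken from this paper) gives the d-spacings below.

```python
# d-spacings of the two sharp reflections, assuming a cubic cell with
# the commonly quoted gamma-Al2O3 lattice parameter (illustrative value).
a_nm = 0.791  # lattice parameter in nm

def d_spacing(h, k, l, a):
    """Cubic lattice: d_hkl = a / sqrt(h^2 + k^2 + l^2)."""
    return a / (h**2 + k**2 + l**2) ** 0.5

print(f"(400): {d_spacing(4, 0, 0, a_nm):.3f} nm")  # ~0.198 nm
print(f"(440): {d_spacing(4, 4, 0, a_nm):.3f} nm")  # ~0.140 nm
```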

Relevance:

100.00%

Abstract:

We address the problem of detecting cells in biological images. The problem is important in many automated image analysis applications. We identify the problem as one of clustering and formulate it within the framework of robust estimation using loss functions. We show how suitable loss functions may be chosen based on a priori knowledge of the noise distribution. Specifically, in the context of biological images, since the measurement noise is not Gaussian, quadratic loss functions yield suboptimal results. We show that by incorporating the Huber loss function, cells can be detected robustly and accurately. To initialize the algorithm, we also propose a seed selection approach. Simulation results show that Huber loss exhibits better performance compared with some standard loss functions. We also provide experimental results on confocal images of yeast cells. The proposed technique exhibits good detection performance even when the signal-to-noise ratio is low.
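The robust-estimation step described here can be pictured as an iteratively reweighted least-squares (IRLS) update under the Huber loss: residuals beyond a threshold are down-weighted rather than squared, so heavy-tailed noise points do not drag the estimate. The single-center sketch below is generic under that assumption and is not the authors' full detection pipeline (which also includes seed selection).

```python
import numpy as np

def huber_weight(r, delta):
    """IRLS weight for the Huber loss: weight 1 for small residuals,
    delta/|r| (linear, outlier-resistant regime) beyond delta."""
    w = np.ones_like(r)
    big = r > delta
    w[big] = delta / r[big]
    return w

def robust_center(points, delta=1.0, iters=50):
    """Estimate one cluster center by iteratively reweighted least
    squares under the Huber loss."""
    c = points.mean(axis=0)                     # seed: ordinary mean
    for _ in range(iters):
        r = np.linalg.norm(points - c, axis=1)  # residual distances
        w = huber_weight(np.maximum(r, 1e-12), delta)
        c = (w[:, None] * points).sum(axis=0) / w.sum()
    return c
```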

Relevance:

100.00%

Abstract:

Non-negative matrix factorization (NMF) [5] is a well-known tool for unsupervised machine learning. It can be viewed as a generalization of K-means clustering, Expectation-Maximization-based clustering, and aspect modeling by Probabilistic Latent Semantic Analysis (PLSA). Specifically, PLSA is related to NMF with a KL-divergence objective function; further, K-means clustering has been shown to be a special case of NMF with a matrix L2-norm-based error function. In this paper, our objective is to analyze the relation between K-means clustering and PLSA by examining the KL-divergence and matrix L2-norm error functions.
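Both objectives are available in scikit-learn's NMF, which makes the contrast easy to reproduce on toy data; the matrix sizes and component count below are arbitrary.

```python
import numpy as np
from sklearn.decomposition import NMF

X = np.abs(np.random.rand(100, 50))   # toy non-negative data matrix

# Frobenius (matrix L2) objective: the K-means-related factorization.
nmf_l2 = NMF(n_components=5, init='nndsvda', random_state=0)
W_l2 = nmf_l2.fit_transform(X)

# Generalized KL-divergence objective: the PLSA-related factorization
# (the multiplicative-update solver is required for non-Frobenius losses).
nmf_kl = NMF(n_components=5, beta_loss='kullback-leibler',
             solver='mu', init='nndsvda', random_state=0)
W_kl = nmf_kl.fit_transform(X)
```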

Relevance:

100.00%

Abstract:

Several statistical downscaling models have been developed in the past couple of decades to assess the hydrologic impacts of climate change by projecting station-scale hydrological variables from large-scale atmospheric variables simulated by general circulation models (GCMs). This paper presents and compares statistical downscaling models that use multiple linear regression (MLR), positive coefficient regression (PCR), stepwise regression (SR), and support vector machine (SVM) techniques for estimating monthly rainfall amounts in the state of Florida. Mean sea-level pressure, air temperature, geopotential height, specific humidity, U wind, and V wind are used as the explanatory variables/predictors in the downscaling models. Data for these variables are obtained from the National Centers for Environmental Prediction-National Center for Atmospheric Research (NCEP-NCAR) reanalysis dataset and from simulations of the Canadian Centre for Climate Modelling and Analysis (CCCma) Coupled Global Climate Model, version 3 (CGCM3). Principal component analysis (PCA) and the fuzzy c-means clustering method (FCM) are used as part of the downscaling models to reduce the dimensionality of the dataset and to identify clusters in the data, respectively. Evaluation of the performance of the models using different error and statistical measures indicates that the SVM-based model performed better than all the other models in reproducing most monthly rainfall statistics at 18 sites. Output from the third-generation CGCM3 model for the A1B scenario was used for future projections. For the projection period 2001-10, MLR was used to relate variables at the GCM and NCEP grid scales. Use of MLR in linking the predictor variables at the two grid scales yielded better reproduction of monthly rainfall statistics at most of the stations (12 out of 18) compared with the spatial interpolation technique used in earlier studies.
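A minimal sketch of the SVM branch of such a downscaling chain follows, assuming monthly NCEP predictor fields have already been extracted at the relevant grid points; the component count and SVR hyperparameters are illustrative, and the FCM clustering step is omitted.

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVR

# Large-scale predictors in, station-scale monthly rainfall out.
model = make_pipeline(
    StandardScaler(),         # predictors have mixed units (hPa, K, m/s)
    PCA(n_components=10),     # dimensionality reduction, as in the paper
    SVR(kernel='rbf', C=10.0, epsilon=0.1),
)
# model.fit(X_ncep_monthly, y_station_rainfall)        # calibration
# rainfall_proj = model.predict(X_gcm_monthly)         # CGCM3 A1B predictors
```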

Relevance:

100.00%

Abstract:

Relatively few studies have addressed water management and adaptation measures in the face of changing water balances due to climate change. The current work studies the impact of climate change on the performance of a multipurpose reservoir and derives adaptive policies for possible future scenarios. The method developed in this work is illustrated with a case study of Hirakud reservoir on the Mahanadi river in Orissa, India, which is a multipurpose reservoir serving flood control, irrigation and power generation. Climate change effects on annual hydropower generation and four performance indices (reliability with respect to the three reservoir functions, viz. hydropower, irrigation and flood control; resiliency; vulnerability; and deficit ratio with respect to hydropower) are studied. Outputs from three general circulation models (GCMs) for three scenarios each are downscaled to monsoon streamflow in the Mahanadi river for two future time slices, 2045-65 and 2075-95. Increased irrigation demands, rule curves dictated by an increased need for flood storage, and downscaled projections of streamflow from the ensemble of GCMs and scenarios are used for projecting future hydrologic scenarios. It is seen that hydropower generation and reliability with respect to hydropower and irrigation are likely to decrease in the future in most scenarios, whereas the deficit ratio and vulnerability are likely to increase as a result of climate change if the standard operating policy (SOP) using current rule curves for flood protection is employed. An optimal monthly operating policy is then derived using stochastic dynamic programming (SDP) as an adaptive policy for mitigating the impacts of climate change on reservoir operation. The objective of this policy is to maximize reliabilities with respect to the multiple reservoir functions of hydropower, irrigation and flood control. In variations to this adaptive policy, increasing weightage is given to maximizing reliability with respect to hydropower for two extreme scenarios. It is seen that by marginally sacrificing reliability with respect to irrigation and flood control, hydropower reliability and generation can be increased for future scenarios. This suggests that reservoir rules for flood control may have to be revised in basins where climate change projects an increasing probability of droughts. However, it is also seen that power generation cannot be restored to current levels, due in part to the large projected increases in irrigation demand. This suggests that future water balance deficits may limit the success of adaptive policy options. (C) 2010 Elsevier Ltd. All rights reserved.
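The SDP step is a backward recursion over discretized storage and inflow classes, swept over the annual cycle until the policy stabilizes. The skeleton below is a sketch under simplifying assumptions (class-arithmetic mass balance, a generic composite reward); the tables `reward` and `p_q` are hypothetical stand-ins for the Hirakud-specific data, not values from the paper.

```python
import numpy as np

def sdp_policy(n_s, n_q, n_m, reward, p_q, sweeps=50):
    """Backward-recursion stochastic dynamic program over a cyclic year.
    reward[m, s, q, a] : benefit in month m, storage class s, inflow
                         class q, release class a (a composite of the
                         hydropower/irrigation/flood-control objectives)
    p_q[m, q, q2]      : monthly inflow-class transition probabilities
    Mass balance is the simple class arithmetic s' = s + q - a,
    clipped to the feasible storage range."""
    n_a = reward.shape[3]
    V = np.zeros((n_s, n_q))
    policy = np.zeros((n_m, n_s, n_q), dtype=int)
    s_idx = np.arange(n_s)[:, None]
    for _ in range(sweeps):                  # sweep years until V stabilizes
        for m in reversed(range(n_m)):
            Q = np.empty((n_s, n_q, n_a))
            EV = V @ p_q[m].T                # E[V(s', q') | q], shape (n_s, n_q)
            for a in range(n_a):
                s_next = np.clip(s_idx + np.arange(n_q)[None, :] - a, 0, n_s - 1)
                Q[:, :, a] = reward[m, :, :, a] + EV[s_next, np.arange(n_q)]
            policy[m] = Q.argmax(axis=2)     # best release class per (s, q)
            V = Q.max(axis=2)
    return policy
```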

Relevance:

100.00%

Abstract:

Nonclassicality in the sense of quantum optics is a prerequisite for entanglement in multimode radiation states. In this work we bring out, in a transparent manner, the possibilities of passing from the former to the latter via the action of classicality-preserving systems like beam splitters. For single-mode states, a complete description of nonclassicality is available via the classical theory of moments, as a set of necessary and sufficient conditions on the photon number distribution. We show that when the mode is coupled to an ancilla in any coherent state, and the system is then acted upon by a beam splitter, these conditions turn exactly into signatures of negativity under partial transpose (NPT) entanglement of the output state. Since the classical moment problem does not generalize to two or more modes, we turn in these cases to other familiar sufficient but not necessary conditions for nonclassicality, namely the Mandel parameter criterion and its extensions. We generalize the Mandel matrix from one-mode states to the two-mode situation, leading to a natural classification of states with varying levels of nonclassicality. For two-mode states we present a single test that can, if successful, simultaneously show nonclassicality as well as NPT entanglement. We also develop a test for NPT entanglement after beam-splitter action on a nonclassical state, tracing carefully the way in which it goes beyond the Mandel nonclassicality test. The result of three-mode beam-splitter action after coupling to an ancilla in the ground state is treated in the same spirit. The concept of genuine tripartite entanglement, and scalar measures of nonclassicality at the Mandel level for two-mode systems, are discussed. Numerous examples illustrating all these concepts are presented.
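For reference, the single-mode Mandel criterion invoked here is the standard one: sub-Poissonian photon statistics signal nonclassicality.

```latex
% Mandel parameter for a single mode with photon-number operator \hat{n}:
Q \;=\; \frac{\langle \hat{n}^{2} \rangle - \langle \hat{n} \rangle^{2}
              - \langle \hat{n} \rangle}{\langle \hat{n} \rangle},
\qquad
Q < 0 \;\Rightarrow\; \text{nonclassical (sufficient, not necessary)}.
```

Coherent states give Q = 0; the paper's Mandel matrix extends this moment-based test to the two-mode setting.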

Relevance:

100.00%

Abstract:

We use the BBGKY hierarchy equations to calculate, perturbatively, the lowest order nonlinear correction to the two-point correlation and the pair velocity for Gaussian initial conditions in a critical density matter-dominated cosmological model. We compare our results with the results obtained using the hydrodynamic equations that neglect pressure and find that the two match, indicating that there are no effects of multistreaming at this order of perturbation. We analytically study the effect of small scales on the large scales by calculating the nonlinear correction for a Dirac delta function initial two-point correlation. We find that the induced two-point correlation has an x^{-6} behavior at large separations. We have considered a class of initial conditions where the initial power spectrum at small k has the form k^{n} with 0 < n ≤ 3 and have numerically calculated the nonlinear correction to the two-point correlation, its average over a sphere and the pair velocity over a large dynamical range. We find that at small separations the effect of the nonlinear term is to enhance the clustering, whereas at intermediate scales it can act to either increase or decrease the clustering. At large scales we find a simple formula that gives a very good fit for the nonlinear correction in terms of the initial function. This formula explicitly exhibits the influence of small scales on large scales and because of this coupling the perturbative treatment breaks down at large scales much before one would expect it to if the nonlinearity were local in real space. We physically interpret this formula in terms of a simple diffusion process. We have also investigated the case n = 0, and we find that it differs from the other cases in certain respects. We investigate a recently proposed scaling property of gravitational clustering, and we find that the lowest order nonlinear terms cause deviations from the scaling relations that are strictly valid in the linear regime. The approximate validity of these relations in the nonlinear regime in N-body simulations cannot be understood at this order of evolution.
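The "average over a sphere" computed here is, in standard notation, the volume-averaged two-point correlation; writing it out (a standard definition, not specific to this paper):

```latex
\bar{\xi}(x) \;=\; \frac{3}{x^{3}} \int_{0}^{x} \xi(y)\, y^{2}\, dy .
```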

Relevance:

100.00%

Abstract:

Over the past few years, studies of cultured neuronal networks have opened up avenues for understanding the ion channels, receptor molecules, and synaptic plasticity that may form the basis of learning and memory. Hippocampal neurons from rats are dissociated and cultured on a surface containing a grid of 64 electrodes. The signals from these 64 electrodes are acquired using a fast data acquisition system, MED64 (Alpha MED Sciences, Japan), at a sampling rate of 20 ksamples/s with a precision of 16 bits per sample. A few minutes of acquired data run to a few hundred megabytes. The data processing for neural analysis is highly compute-intensive because the volume of data is huge. The major processing requirements are noise removal, pattern recovery, pattern matching, clustering, and so on. In order to interface a neuronal colony to the physical world, these computations need to be performed in real time. A single processor, such as a desktop computer, may not be adequate to meet these computational requirements. Parallel computing is a method used to satisfy the real-time computational requirements of a neuronal system that interacts with an external world, while increasing the flexibility and scalability of the application. In this work, we developed a parallel neuronal system using a multi-node digital signal processing (DSP) system. With 8 processors, the system is able to compute and map incoming signals, segmented over a period of 200 ms, into an action in a trained cluster system in real time.
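The data volumes quoted are easy to sanity-check, assuming the 20 ksamples/s rate applies per electrode (the abstract does not say so explicitly):

```python
# Back-of-envelope data rate for the acquisition setup described above.
channels = 64
rate_hz = 20_000          # samples per second per channel (assumed)
bytes_per_sample = 2      # 16-bit precision

mb_per_s = channels * rate_hz * bytes_per_sample / 1e6
print(f"{mb_per_s:.2f} MB/s")            # 2.56 MB/s
print(f"{mb_per_s * 60:.0f} MB/minute")  # ~154 MB/min, so a few minutes
                                         # indeed reach hundreds of MB
```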

Relevance:

100.00%

Abstract:

The impact of disturbance on forest stand density, basal area, dbh-class distribution of density and basal area, species richness, species diversity, and similarity index was assessed by monitoring six one-hectare permanent forest plots over a period of 24 years in the tropical moist forests of Uttara Kannada district, Western Ghats, India. All sites lost trees through removal by people and through mortality, and the loss was greater in sites that are easily accessible and closer to human habitation. In spite of the decrease in tree density, an increase in basal area was observed in some forest plots, which could be on account of the stimulated growth of surviving trees; the decrease in basal area at other sites indicates greater human pressure and overexploitation of trees. The preponderance of lower-girth-class trees and the unimodal reverse 'J-shaped' curve of density distribution observed in the majority of the sites in the benchmark year were indicative of the regenerating status of these forests. The decrease in the number of species at all forest sites was due to the indiscriminate removal of trees by people, without sparing species with only a few individuals, and also to the mortality of trees of rare species. The higher species richness and diversity in the lowest dbh class at most of the sites in the benchmark year indicate the existence of favorable conditions for sylvigenesis. The decrease in the similarity index suggests extirpation of species, favoring invasion and colonization by secondary species. To minimize human pressure on forests and to facilitate regeneration and growth, proper management planning and conservation measures are needed.
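The abstract does not name its diversity and similarity measures; Shannon diversity and the Sørensen index are the conventional choices in such plot inventories, and a minimal sketch under that assumption follows.

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species abundances."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -(p * np.log(p)).sum()

def sorensen_similarity(species_a, species_b):
    """Sorensen index 2c / (a + b) comparing two species lists,
    e.g. the same plot in the benchmark and resurvey years."""
    a, b = set(species_a), set(species_b)
    return 2 * len(a & b) / (len(a) + len(b))
```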