917 results for Combinatorial Grassmannian


Relevance: 10.00%

Abstract:

The goal of the work reported in this paper is to use automated, combinatorial synthesis to generate alternative solutions to be used as stimuli by designers for ideation. FuncSION, a computational synthesis tool that can automatically synthesize solution concepts for mechanical devices by combining building blocks from a library, is used for this purpose. The objectives of FuncSION are to help generate a variety of functional requirements for a given problem and a variety of concepts to fulfill these functions. A distinctive feature of FuncSION is its focus on automated generation of spatial configurations, an aspect rarely addressed by other computational synthesis programs. This paper provides an overview of FuncSION in terms of representation of design problems, representation of building blocks, and rules with which building blocks are combined to generate concepts at three levels of abstraction: topological, spatial, and physical. The paper then provides a detailed account of evaluating FuncSION for its effectiveness in providing stimuli for enhanced ideation.
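
A minimal sketch of the kind of combinatorial synthesis described above, assuming a toy library of building blocks characterised only by input/output motion type and a simple interface-matching rule; the block names and the rule are illustrative and not FuncSION's actual representation:

    from itertools import product

    # Toy building-block library (illustrative, not FuncSION's own):
    # each block maps an input motion type to an output motion type.
    BLOCKS = {
        "gear_pair":   ("rotation", "rotation"),
        "rack_pinion": ("rotation", "translation"),
        "cam":         ("rotation", "translation"),
        "lever":       ("translation", "translation"),
    }

    def synthesize(required_input, required_output, max_len=3):
        """Enumerate block chains whose interfaces compose end to end."""
        solutions = []
        for length in range(1, max_len + 1):
            for chain in product(BLOCKS, repeat=length):
                kinds = [BLOCKS[name] for name in chain]
                if kinds[0][0] != required_input or kinds[-1][1] != required_output:
                    continue
                if all(kinds[i][1] == kinds[i + 1][0] for i in range(length - 1)):
                    solutions.append(chain)
        return solutions

    print(synthesize("rotation", "translation"))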

Relevance: 10.00%

Abstract:

In this paper, we revisit the combinatorial error model of Mazumdar et al. that models errors in high-density magnetic recording caused by lack of knowledge of grain boundaries in the recording medium. We present new upper bounds on the cardinality/rate of binary block codes that correct errors within this model. All our bounds, except for one, are obtained using combinatorial arguments based on hypergraph fractional coverings; the exception is a bound derived via an information-theoretic argument. Our bounds significantly improve upon those previously known in the literature.
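
To fix ideas (this is the textbook packing argument, not the paper's specific bounds): if a binary code C of length n corrects every error pattern allowed by the model, and B(c) denotes the set of channel outputs that codeword c can produce, then the sets B(c) must be pairwise disjoint, giving

\[
|C| \cdot \min_{c \in C} |B(c)| \;\le\; 2^n
\quad\Longrightarrow\quad
\frac{\log_2 |C|}{n} \;\le\; 1 - \frac{\log_2 \min_{c \in C} |B(c)|}{n}.
\]

The hypergraph fractional-covering bounds of the paper refine this kind of counting by replacing the worst-case ball size with a fractional covering (linear-programming) weight.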

Relevance: 10.00%

Abstract:

We apply the objective method of Aldous to the problem of finding the minimum-cost edge cover of the complete graph with random independent and identically distributed edge costs. The limit, as the number of vertices goes to infinity, of the expected minimum cost for this problem is known via a combinatorial approach of Hessler and Wästlund. We provide a proof of this result using the machinery of the objective method and local weak convergence, which was used to prove the ζ(2) limit of the random assignment problem. A proof via the objective method is useful because it provides more information on the nature of the edges incident on a typical root in the minimum-cost edge cover. We further show that a belief propagation algorithm converges asymptotically to the optimal solution. This can be applied to a computational linguistics problem, semantic projection. The belief propagation algorithm yields a near-optimal solution with lower complexity than the best known algorithms designed for optimality in worst-case settings.
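
For orientation, the min-sum belief-propagation (objective-method) recursion used in this line of work has the generic form

\[
m_{v \to u} \;=\; \min_{w \in N(v) \setminus \{u\}} \bigl( c_{vw} - m_{w \to v} \bigr),
\]

where c_{vw} are the edge costs and the decision for an edge compares its cost against the two messages incoming along it. The celebrated instance is the random assignment problem, whose expected minimum cost converges to ζ(2) = π²/6. This is the standard message form, not the exact recursion of the present paper for edge covers.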

Relevance: 10.00%

Abstract:

Aberrant activation of the Notch and Ras pathways has been detected in breast cancers. A synergy between these two pathways has also been shown in breast cell transformation in culture. Yet, the clinical relevance of Notch-Ras cooperation in breast cancer progression remains unexplored. In this study, we show that coordinate hyperactivation of the Notch1 and Ras/MAPK pathways in breast cancer patient specimens, as assessed by IHC for cleaved Notch1 and pErk1/2, respectively, correlated with early relapse to vital organs and poor overall survival. Interestingly, the majority of such Notch1^high/Erk^high cases encompassed the highly aggressive triple-negative breast cancers (TNBC) and were enriched in stem cell markers. We further show that combinatorial inhibition of the Notch1 and Ras/MAPK pathways, using a novel mAb against Notch1 and a MEK inhibitor, respectively, led to a significant reduction in proliferation and survival of breast cancer cells compared with individual inhibition. Combined inhibition also abrogated sphere-forming potential and depleted the putative cancer stem-like cell subpopulation. Most importantly, combinatorial inhibition of the Notch1 and Ras/MAPK pathways completely blocked tumor growth in a panel of breast cancer xenografts, including the TNBCs. Thus, our study identifies coordinate hyperactivation of the Notch1 and Ras/MAPK pathways as novel biomarkers for poor breast cancer outcome. Furthermore, based on our preclinical data, we propose combinatorial targeting of these two pathways as a treatment strategy for highly aggressive breast cancers, particularly the TNBCs, which currently lack any targeted therapeutic option. (C) 2014 AACR.

Relevance: 10.00%

Abstract:

RAGs (recombination activating genes) are responsible for the generation of antigen receptor diversity through the combinatorial joining of different V (variable), D (diversity) and J (joining) gene segments. In addition to this physiological role, in which RAG functions as a sequence-specific nuclease, it can also act as a structure-specific nuclease, leading to genomic instability and cancer. In the present study, we investigate the factors that regulate RAG cleavage on non-B DNA structures. We find that RAG binding and cleavage on heteroduplex DNA depend on the length of the double-stranded flanking region. In addition, the immediate flanking double-stranded region regulates RAG activity in a sequence-dependent manner. Interestingly, the cleavage efficiency of RAGs at the heteroduplex region is influenced by the phasing of the DNA. Thus, our results suggest that the sequence, length and phase positions of the DNA can affect the efficiency of RAG cleavage when it acts as a structure-specific nuclease. These findings provide novel insights into the regulation of the pathological functions of RAGs.

Relevance: 10.00%

Abstract:

In a complete bipartite graph with vertex sets of cardinalities n and n', assign random weights, drawn independently from the exponential distribution with mean 1, to each edge. We show that, as n → ∞ with n' = n/α (up to integer rounding) for any fixed α > 1, the minimum weight of many-to-one matchings converges to a constant (depending on α). Many-to-one matching arises as an optimization step in an algorithm for genome sequencing and as a measure of distance between finite sets. We prove that a belief propagation (BP) algorithm converges asymptotically to the optimal solution. We use the objective method of Aldous to prove our results. We build on previous work on the minimum-weight matching and minimum-weight edge cover problems to extend the objective method and to further the applicability of belief propagation to random combinatorial optimization problems.

Relevance: 10.00%

Abstract:

Let P be a set of points in the plane. A geometric graph G on P is said to be locally Gabriel if, for every edge (u, v) in G, the Euclidean disk with the segment joining u and v as diameter does not contain any points of P that are neighbors of u or v in G. A locally Gabriel graph (LGG) is a generalization of the Gabriel graph and is motivated by applications in wireless networks. Unlike a Gabriel graph, there is no unique LGG on a given point set, since no edge in an LGG is necessarily included or excluded. Thus the edge set of the graph can be customized to optimize certain network parameters depending on the application. The unit distance graph (UDG), introduced by Erdős, is also an LGG. In this paper, we show the following combinatorial bounds on the edge complexity and independent sets of LGGs: (i) for any n, there exists an LGG with [...] edges, improving upon the previous best bound of [...]; (ii) for various subclasses of convex point sets, we show tight linear bounds on the maximum edge complexity of LGGs; (iii) for any LGG on any point set, there exists an independent set of size [...].
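
A direct transcription of the definition above into a checking routine (a sketch; the point and adjacency representations, and the use of the closed disk, are illustrative assumptions):

    from math import dist

    def violates_local_gabriel(u, v, points, neighbors):
        """True if some neighbor of u or v (other than u, v) lies inside the
        closed disk having the segment uv as diameter."""
        center = ((points[u][0] + points[v][0]) / 2.0,
                  (points[u][1] + points[v][1]) / 2.0)
        radius = dist(points[u], points[v]) / 2.0
        return any(dist(points[w], center) <= radius
                   for w in (neighbors[u] | neighbors[v]) - {u, v})

    def is_locally_gabriel(edges, points, neighbors):
        """Check the locally Gabriel condition for every edge of the graph."""
        return not any(violates_local_gabriel(u, v, points, neighbors)
                       for u, v in edges)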

Relevance: 10.00%

Abstract:

In 1987, Kalai proved that stacked spheres of dimension d >= 3 are characterised by the fact that they attain equality in Barnette's celebrated Lower Bound Theorem. This result does not extend to dimension d = 2. In this article, we give a characterisation of stacked 2-spheres using what we call the separation index. Namely, we show that the separation index of a triangulated 2-sphere is maximal if and only if it is stacked. In addition, we prove that, amongst all n-vertex triangulated 2-spheres, the separation index is minimised by some n-vertex flag sphere for n >= 6. Furthermore, we apply this characterisation of stacked 2-spheres to settle the outstanding 3-dimensional case of the Lutz-Sulanke-Swartz conjecture that "tight-neighbourly triangulated manifolds are tight". For dimension d >= 4, the conjecture has already been proved by Effenberger following a result of Novik and Swartz. (C) 2015 Elsevier Inc. All rights reserved.
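
For context, one standard statement of the Lower Bound Theorem referred to here, in terms of the vertex and edge numbers f_0 and f_1 of a triangulated d-sphere, is

\[
f_1 \;\ge\; (d+1)\,f_0 - \binom{d+2}{2},
\]

with equality attained by stacked spheres; Kalai's result is that for d >= 3 equality holds only for stacked spheres. For d = 2 every triangulated sphere has f_1 = 3 f_0 - 6, so equality cannot single out the stacked ones, which is why a different invariant (the separation index) is needed.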

Relevance: 10.00%

Abstract:

We formally extend the CFT techniques introduced in arXiv:1505.00963 to the φ^{2d_0/(d_0-2)} theory in d = d_0 dimensions and use them to compute anomalous dimensions near d_0 = 3, 4 in a unified manner. We also perform a similar analysis of the O(N) model in three dimensions by developing a recursive combinatorial approach for OPE contractions. Our results match precisely with low-loop perturbative computations. Finally, using 3-point correlators in the CFT, we comment on why the φ^3 theory in d_0 = 6 is qualitatively different.
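
The exponent is the classically marginal one: since the free scalar has dimension [φ] = (d-2)/2, the interaction φ^k has dimension k(d-2)/2, which equals d (a marginal coupling) precisely when

\[
k = \frac{2d}{d-2},
\]

giving φ^6 at d_0 = 3, φ^4 at d_0 = 4 and φ^3 at d_0 = 6.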

Relevance: 10.00%

Abstract:

Affine transformations have proven to be very powerful for loop restructuring due to their ability to model a very wide range of transformations. A single multi-dimensional affine function can represent a long and complex sequence of simpler transformations. Existing affine transformation frameworks like the Pluto algorithm, which include a cost function for modern multicore architectures where coarse-grained parallelism and locality are crucial, consider only a sub-space of transformations to avoid a combinatorial explosion in finding the transformations. The ensuing practical tradeoffs lead to the exclusion of certain useful transformations, in particular transformation compositions involving loop reversals and loop skewing by negative factors. In this paper, we propose an approach that addresses this limitation by modeling a much larger space of affine transformations in conjunction with the Pluto algorithm's cost function. We experimentally evaluate both the effect on compilation time and the performance of the generated code. The evaluation shows that our new framework, Pluto+, causes no performance degradation on any of the Polybench benchmarks. For Lattice Boltzmann Method (LBM) codes with periodic boundary conditions, it provides a mean speedup of 1.33x over Pluto. We also show that Pluto+ does not increase compile times significantly. Experimental results on Polybench show that Pluto+ increases overall polyhedral source-to-source optimization time by only 15%. In cases where it improved execution time significantly, it increased polyhedral optimization time by only 2.04x.
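
As generic illustrations of the excluded compositions (not examples taken from the paper), loop reversal and skewing by a negative factor correspond to affine schedule components such as

\[
(i, j) \mapsto (i, -j) \qquad \text{and} \qquad (i, j) \mapsto (i, j - i),
\]

which involve negative coefficients and therefore fall outside a transformation sub-space restricted to non-negative coefficients.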

Relevance: 10.00%

Abstract:

Minimal crystallizations of simply connected PL 4-manifolds are very natural objects. Many of their topological features are reflected in their combinatorial structure, which, in addition, is preserved under the connected sum operation. We present a minimal crystallization of the standard PL K3 surface. In combination with known results, this yields minimal crystallizations of all simply connected PL 4-manifolds of "standard" type, that is, all connected sums of CP^2, S^2 x S^2, and the K3 surface. In particular, we obtain minimal crystallizations of a pair of homeomorphic but non-PL-homeomorphic 4-manifolds. In addition, we give an elementary proof that the minimal 8-vertex crystallization of CP^2 is unique and that its associated pseudotriangulation is related to the 9-vertex combinatorial triangulation of CP^2 by a minimum of four edge contractions.

Relevance: 10.00%

Abstract:

We consider the problem of blind multiuser detection. We adopt a Bayesian approach where unknown parameters are considered random and integrated out. Computing the maximum a posteriori estimate of the input data sequence requires solving a combinatorial optimization problem. We propose to apply the Cross-Entropy method recently introduced by Rubinstein. The performance of Cross-Entropy is compared to that of Markov chain Monte Carlo. For similar Bit Error Rate performance, we demonstrate that Cross-Entropy outperforms a generic Markov chain Monte Carlo method in terms of computation time.
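
A generic cross-entropy loop for a binary combinatorial problem, in the spirit of Rubinstein's method; this is a sketch with an arbitrary objective function and does not reproduce the detection-specific posterior of the paper:

    import numpy as np

    def cross_entropy_binary(score, n_bits, n_samples=200, elite_frac=0.1,
                             smoothing=0.7, n_iters=50, seed=0):
        """Maximize `score` over binary sequences with the cross-entropy method."""
        rng = np.random.default_rng(seed)
        p = np.full(n_bits, 0.5)                 # Bernoulli sampling parameters
        n_elite = max(1, int(elite_frac * n_samples))
        best, best_score = None, -np.inf
        for _ in range(n_iters):
            samples = (rng.random((n_samples, n_bits)) < p).astype(int)
            scores = np.array([score(s) for s in samples])
            if scores.max() > best_score:
                best_score, best = scores.max(), samples[scores.argmax()].copy()
            elite = samples[np.argsort(scores)[-n_elite:]]   # top-scoring samples
            # Smoothed update of the sampling distribution toward the elite set
            p = smoothing * elite.mean(axis=0) + (1 - smoothing) * p
        return best, best_score

    # Toy usage: recover a fixed target sequence
    target = np.array([1, 0, 1, 1, 0, 0, 1, 0])
    print(cross_entropy_binary(lambda s: -int(np.sum(s != target)), len(target)))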

Relevance: 10.00%

Abstract:

Simulated annealing is a popular method for approaching the solution of a global optimization problem. Existing results on its performance apply to discrete combinatorial optimization, where the optimization variables can assume only a finite set of possible values. We introduce a new general formulation of simulated annealing which allows one to guarantee finite-time performance in the optimization of functions of continuous variables. The results hold universally for any optimization problem on a bounded domain and establish a connection between simulated annealing and recent theory on the convergence of Markov chain Monte Carlo methods on continuous domains. This work is inspired by the concept of finite-time learning with known accuracy and confidence developed in statistical learning theory.
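
A minimal annealing loop on a bounded continuous domain, in the general spirit of the formulation described; the cooling schedule and the Gaussian proposal are illustrative choices, not the paper's:

    import math
    import random

    def simulated_annealing(f, lower, upper, n_iters=10000, t0=1.0, seed=0):
        """Minimize f on the box [lower, upper] with a simple annealing schedule."""
        rng = random.Random(seed)
        d = len(lower)
        x = [rng.uniform(lower[i], upper[i]) for i in range(d)]
        fx = f(x)
        best, fbest = list(x), fx
        for k in range(1, n_iters + 1):
            t = t0 / math.log(k + 1)             # logarithmic cooling (illustrative)
            # Gaussian proposal around the current point, clipped to the box
            y = [min(upper[i], max(lower[i],
                     x[i] + rng.gauss(0.0, 0.1 * (upper[i] - lower[i]))))
                 for i in range(d)]
            fy = f(y)
            if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = list(x), fx
        return best, fbest

    # Toy usage: minimize a shifted quadratic on [-5, 5]^2
    print(simulated_annealing(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                              [-5, -5], [5, 5]))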

Relevance: 10.00%

Abstract:

The Linear Ordering Problem (LOP) is a popular combinatorial optimisation problem which has been extensively addressed in the literature. However, in spite of its popularity, little is known about the characteristics of this problem. This paper studies a procedure to extract static information from an instance of the problem and proposes a method to incorporate the obtained knowledge in order to improve the performance of local search-based algorithms. The procedure introduced identifies the positions in which the indexes cannot generate local optima for the insert neighbourhood, and thus cannot generate globally optimal solutions. This information is then used to propose a restricted insert neighbourhood that discards the insert operations which move indexes to positions where optimal solutions cannot be generated. In order to measure the efficiency of the proposed restricted insert neighbourhood, two state-of-the-art algorithms for the LOP that include local search procedures have been modified. The conducted experiments confirm that the restricted versions of the algorithms systematically outperform the classical designs. The statistical tests included in the experimentation report significant differences in all cases, which validates the efficiency of our proposal.
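
The insert move referred to above removes an index from its current position and reinserts it at another; the restricted neighbourhood simply skips destination positions that have been ruled out beforehand. A schematic version, where `allowed` is a hypothetical placeholder for the positions the paper's static analysis does not discard:

    def insert_move(perm, i, j):
        """Remove the element at position i and reinsert it at position j."""
        perm = list(perm)
        perm.insert(j, perm.pop(i))
        return perm

    def restricted_insert_neighbourhood(perm, allowed):
        """Yield insert neighbours, skipping destinations excluded by allowed[item]."""
        n = len(perm)
        for i in range(n):
            for j in range(n):
                if i != j and j in allowed[perm[i]]:
                    yield insert_move(perm, i, j)

    # Toy usage: every destination position allowed for every index
    perm = [2, 0, 3, 1]
    allowed = {item: set(range(len(perm))) for item in perm}
    print(sum(1 for _ in restricted_insert_neighbourhood(perm, allowed)))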

Relevance: 10.00%

Abstract:

Recently, probability models on rankings have been proposed in the field of estimation of distribution algorithms (EDAs) in order to solve permutation-based combinatorial optimisation problems. In particular, distance-based ranking models, such as the Mallows and Generalized Mallows models under the Kendall τ distance, have demonstrated their validity for solving this type of problem. Nevertheless, there are still many directions that deserve further study. In this paper, we extend the use of distance-based ranking models in the framework of EDAs by introducing new distance metrics, namely Cayley and Ulam. In order to analyse the performance of the Mallows and Generalized Mallows EDAs under the Kendall, Cayley and Ulam distances, we run them on a benchmark of 120 instances from four well-known permutation problems. The conducted experiments show that no single metric performs best on all the problems. However, the statistical tests point out that the Mallows-Ulam EDA is the most stable algorithm among the studied proposals.
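
For reference, the distance-based ranking models mentioned here assign, for a central permutation σ_0 and a spread parameter θ > 0, the probability

\[
P(\sigma) = \frac{e^{-\theta\, d(\sigma, \sigma_0)}}{\psi(\theta)},
\]

where d is the Kendall, Cayley or Ulam distance and ψ(θ) is the normalization constant; the Generalized Mallows model replaces the single θ by one parameter per term of the distance decomposition.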