990 results for convex subgraphs


Relevance:

10.00%

Publisher:

Abstract:

We prove that the group of continuous isometries for the Kobayashi or Carathéodory metric of a strongly convex domain in C^n is compact unless the domain is biholomorphic to the ball. A key ingredient, proved using differential-geometric ideas, is that a continuous isometry between a strongly convex domain and the ball has to be biholomorphic or anti-biholomorphic. Combining this with a metric version of Pinchuk's rescaling technique gives the main result.

Relevance:

10.00%

Publisher:

Abstract:

The plastic response of a segment of a simply supported orthotropic spherical shell under a uniform blast loading applied on the convex surface of the shell is presented. The blast is assumed to impart a uniform initial velocity to the shell surface. The material of the shell is orthotropic, obeying a modified Tresca yield hypersurface condition and the associated flow rules. The deformation of the shell is determined during all phases of its motion by considering the motion of plastic hinges in different regimes of flow. Numerical results presented include the permanent deformed configuration of the shell and the total time of shell response for different degrees of orthotropy. Conclusions regarding the plastic behaviour of spherical shells with circumferential and meridional stiffening under uniform blast load are presented.

Relevance:

10.00%

Publisher:

Abstract:

This study develops a real options approach for analyzing the optimal risk adoption policy in an environment where adoption means a switch from one stochastic flow representation to another. We establish that increased volatility need not decelerate investment, as predicted by the standard literature on real options, once the underlying volatility of the state is made endogenous. We prove that for a decision maker with a convex (concave) objective function, increased post-adoption volatility increases (decreases) the expected cumulative present value of the post-adoption profit flow, which consequently decreases (increases) the option value of waiting and, therefore, accelerates (decelerates) current investment.
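The convexity argument above is, at heart, an application of Jensen's inequality; the following display is our own illustrative restatement, with u standing for the objective and x for the post-adoption profit flow (neither symbol is taken from the paper):

\[
u \ \text{convex}, \quad x' \ \text{a mean-preserving spread of} \ x
\;\Longrightarrow\;
\mathbb{E}\left[u(x')\right] \;\geq\; \mathbb{E}\left[u(x)\right].
\]

A mean-preserving increase in post-adoption volatility therefore raises the expected cumulative present value of the post-adoption profit flow, lowering the option value of waiting; for concave u the inequality, and hence the conclusion, is reversed.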

Relevance:

10.00%

Publisher:

Abstract:

Motivated by certain situations in manufacturing systems and communication networks, we consider the problem of maximizing profit in a queueing system with a linear reward and cost structure and a choice of selecting the streams of Poisson arrivals according to an independent Markov chain. We view the system as an MMPP/GI/1 queue and seek to maximize the profit by optimally choosing the stationary probabilities of the modulating Markov chain. We consider two formulations of the optimization problem. The first (which we call the PUT problem) seeks to maximize the profit per unit time, whereas the second considers the maximization of the profit per accepted customer (the PAC problem). In each of these formulations, we explore three separate problems. In the first, the constraints come from bounding the utilization of an infinite-capacity server; in the second, the constraints arise from bounding the mean queue length of the same queue; and in the third, the finite capacity of the buffer is reflected as a set of constraints. In the problems bounding the utilization factor of the queue, the solutions are given by essentially linear programs, while the problems with mean queue length constraints are linear programs if the service times are exponentially distributed. The problems modeling the finite-capacity queue are non-convex programs for which global maxima can be found. There is a rich relationship between the solutions of the PUT and PAC problems. In particular, the PUT solutions always make the server work at a utilization factor that is no less than that of the PAC solutions.
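As a rough illustration of how a utilization-bounded PUT-style formulation reduces to a linear program in the stationary probabilities, the sketch below sets up and solves such an LP with scipy. The arrival rates, reward and cost figures, mean service time, and utilization cap are all made-up placeholders, not values or a formulation taken from the paper.

```python
# Illustrative-only LP for a PUT-style problem: choose the stationary probabilities
# pi_i of the modulating chain to maximize profit per unit time subject to a
# utilization bound. All numbers below are hypothetical.
import numpy as np
from scipy.optimize import linprog

lam = np.array([2.0, 5.0, 9.0])      # Poisson rates of the selectable arrival streams
cost = np.array([0.5, 1.5, 4.0])     # cost rate incurred while each stream is active
r, es, rho_max = 1.0, 0.08, 0.9      # reward per customer, mean service time E[S], utilization cap

# Maximize sum_i pi_i (r*lam_i - cost_i)  ==  minimize its negative.
obj = -(r * lam - cost)
A_ub, b_ub = [lam * es], [rho_max]        # utilization constraint: sum_i pi_i lam_i E[S] <= rho_max
A_eq, b_eq = [np.ones_like(lam)], [1.0]   # the pi_i form a probability distribution

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * len(lam), method="highs")
print("optimal pi:", res.x, " profit per unit time:", -res.fun)
```

Under these assumptions the optimum lies at a vertex of the feasible polytope, consistent with the abstract's remark that the utilization-bounded problems are essentially linear programs.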

Relevance:

10.00%

Publisher:

Abstract:

For active contour modeling (ACM), we propose a novel self-organizing map (SOM)-based approach, called the batch-SOM (BSOM), that attempts to integrate the advantages of SOM- and snake-based ACMs in order to extract the desired contours from images. We employ feature points, in the form of an edge map (as obtained from a standard edge-detection operation), to guide the contour (as in the case of SOM-based ACMs), along with the gradient and intensity variations in a local region, to ensure that the contour does not "leak" through the object boundary in the case of faulty feature points (weak or broken edges). In contrast with the snake-based ACMs, however, we do not use an explicit energy functional (based on gradient or intensity) for controlling the contour movement. We extend the BSOM to handle the extraction of contours of multiple objects by splitting a single contour into as many subcontours as there are objects in the image. The BSOM and its extended version are tested on synthetic binary and gray-level images with both single and multiple objects. We also demonstrate the efficacy of the BSOM on images of objects having both convex and nonconvex boundaries. The results demonstrate the superiority of the BSOM over the compared approaches. Finally, we analyze the limitations of the BSOM.

Relevance:

10.00%

Publisher:

Abstract:

A k-dimensional box is the Cartesian product R_1 x R_2 x ... x R_k, where each R_i is a closed interval on the real line. The boxicity of a graph G, denoted box(G), is the minimum integer k such that G can be represented as the intersection graph of a collection of k-dimensional boxes. A unit cube in k-dimensional space, or a k-cube, is defined as the Cartesian product R_1 x R_2 x ... x R_k where each R_i is a closed interval on the real line of the form [a_i, a_i + 1]. The cubicity of G, denoted cub(G), is the minimum integer k such that G can be represented as the intersection graph of a collection of k-cubes. The threshold dimension of a graph G(V, E) is the smallest integer k such that E can be covered by k threshold spanning subgraphs of G. In this paper we show that there exists no polynomial-time algorithm for approximating the threshold dimension of a graph on n vertices within a factor of O(n^(0.5 - epsilon)) for any epsilon > 0 unless NP = ZPP. From this result we show that there exists no polynomial-time algorithm for approximating the boxicity or the cubicity of a graph on n vertices within a factor of O(n^(0.5 - epsilon)) for any epsilon > 0 unless NP = ZPP. In fact, all these hardness results hold even for a highly structured class of graphs, namely the split graphs. We also show that it is NP-complete to determine whether a given split graph has boxicity at most 3.
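Restating the definitions above in display form (a typeset restatement only, adding nothing beyond the abstract):

\[
B = R_1 \times R_2 \times \cdots \times R_k,\qquad
R_i = [a_i, b_i] \subset \mathbb{R} \ \text{(box)},\qquad
R_i = [a_i, a_i + 1] \ \text{(unit cube)},
\]
\[
\mathrm{box}(G) = \min\{k : G \text{ is an intersection graph of } k\text{-dimensional boxes}\},\qquad
\mathrm{cub}(G) = \min\{k : G \text{ is an intersection graph of } k\text{-cubes}\}.
\]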

Relevance:

10.00%

Publisher:

Abstract:

The study presents a theory of utility models based on aspiration levels, as well as the application of this theory to the planning of timber flow economics. The first part of the study comprises a derivation of the utility-theoretic basis for the application of aspiration levels. Two basic models are dealt with: the additive and the multiplicative. Applied here solely to partial utility functions, aspiration and reservation levels are interpreted as defining piecewise linear functions. The decision-maker's perspective on choice is emphasized through the use of indifference curves. The second part of the study introduces a model for the management of timber flows. The model is based on the assumption that the decision-maker is willing to specify a shape of income flow that differs from that of the capital-theoretic optimum. The utility model comprises four aspiration-based compound utility functions. The theory and the flow model are tested numerically by computations covering three forest holdings. The results show that the additive model is sensitive even to slight changes in relative importances and aspiration levels. This applies particularly to nearly linear production possibility boundaries of monetary variables. The multiplicative model, on the other hand, is stable because it generates strictly convex indifference curves. Due to a higher marginal rate of substitution, the multiplicative model implies a stronger dependence on forest management than the additive function. For income trajectory optimization, a method utilizing an income trajectory index is more efficient than one based on the use of aspiration levels per management period. Smooth trajectories can be attained by squaring the deviations of the feasible trajectories from the desired one.
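As a rough illustration of the two model families discussed above, the additive and multiplicative compound utilities can be written in the following generic forms; these are standard textbook forms, not the paper's exact parameterization, with each partial utility u_i piecewise linear and anchored at the reservation and aspiration levels:

\[
U_{\mathrm{add}}(x) \;=\; \sum_{i=1}^{m} w_i\, u_i(x_i),
\qquad
U_{\mathrm{mult}}(x) \;=\; \prod_{i=1}^{m} u_i(x_i)^{\,w_i},
\qquad
\sum_{i=1}^{m} w_i = 1,\quad u_i \in [0, 1].
\]

Written this way, the multiplicative form yields strictly convex indifference curves in the partial utilities for positive weights, which is consistent with the stability attributed to it in the abstract.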

Relevance:

10.00%

Publisher:

Abstract:

Gene mapping is a systematic search for genes that affect observable characteristics of an organism. In this thesis we offer computational tools to improve the efficiency of (disease) gene-mapping efforts. In the first part of the thesis we propose an efficient simulation procedure for generating realistic genetic data from isolated populations. Simulated data is useful for evaluating hypothesised gene-mapping study designs and computational analysis tools. As an example of such evaluation, we demonstrate how a population-based study design can be a powerful alternative to traditional family-based designs in association-based gene-mapping projects. In the second part of the thesis we consider the prioritisation of a (typically large) set of putative disease-associated genes acquired from an initial gene-mapping analysis. Prioritisation is necessary in order to focus on the most promising candidates. We show how to harness current biomedical knowledge for the prioritisation task by integrating various publicly available biological databases into a weighted biological graph. We then demonstrate how to find and evaluate connections between entities, such as genes and diseases, in this unified schema using graph mining techniques. Finally, in the last part of the thesis, we define the concept of a reliable subgraph and the corresponding subgraph extraction problem. Reliable subgraphs concisely describe strong and independent connections between two given vertices in a random graph, and hence they are especially useful for visualising such connections. We propose novel algorithms for extracting reliable subgraphs from large random graphs. The efficiency and scalability of the proposed graph mining methods are backed by extensive experiments on real data. While our application focus is on genetics, the concepts and algorithms can be applied to other domains as well. We demonstrate this generality by considering coauthor graphs in addition to biological graphs in the experiments.
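The connection-reliability notion behind reliable subgraphs can be illustrated with a small Monte Carlo estimate of the probability that two query vertices are connected in a random graph whose edges exist independently with given probabilities. The graph, probabilities, and vertex names below are invented for illustration; the thesis's own extraction algorithms are not reproduced here.

```python
# Estimate the two-terminal connection probability in a random graph by sampling
# edge realizations and checking connectivity with a breadth-first search.
import random
from collections import deque

edges = {("g1", "p1"): 0.9, ("p1", "d"): 0.6,   # a strong gene -> protein -> disease path
         ("g1", "p2"): 0.4, ("p2", "d"): 0.8}   # an alternative, weaker path

def connected(present_edges, src, dst):
    adj = {}
    for u, v in present_edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, queue = {src}, deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            return True
        for w in adj.get(u, []):
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return False

def connection_probability(edges, src, dst, samples=20000):
    hits = sum(connected([e for e, p in edges.items() if random.random() < p], src, dst)
               for _ in range(samples))
    return hits / samples

print(connection_probability(edges, "g1", "d"))   # roughly 0.69 for these numbers
```

A reliable subgraph, in this spirit, is a small subgraph that preserves as much of this connection probability as possible, which is what makes it useful for visualising the strongest gene-disease links.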

Relevance:

10.00%

Publisher:

Abstract:

We consider functions that map the open unit disc conformally onto the complement of a bounded convex set. We call these functions concave univalent functions. In 1994, Livingston presented a characterization for these functions. In this paper, we observe that there is a minor flaw with this characterization. We obtain certain sharp estimates and the exact set of variability involving Laurent and Taylor coefficients for concave functions. We also present the exact set of variability of the linear combination of certain successive Taylor coefficients of concave functions.

Relevance:

10.00%

Publisher:

Abstract:

A polygon is said to be a weak visibility polygon if every point of the polygon is visible from some point of an internal segment. In this paper we derive properties of shortest paths in weak visibility polygons and present a characterization of weak visibility polygons in terms of shortest paths between vertices. These properties lead to the following efficient algorithms: (i) an O(E) time algorithm for determining whether a simple polygon P is a weak visibility polygon and for computing a visibility chord if one exists, where E is the size of the visibility graph of P, and (ii) an O(n^2) time algorithm for computing the maximum hidden vertex set in an n-sided polygon weakly visible from a convex edge.

Relevance:

10.00%

Publisher:

Abstract:

We consider the ergodic control for a controlled nondegenerate diffusion when m other (m finite) ergodic costs are required to satisfy prescribed bounds. Under a condition on the cost functions that penalizes instability, the existence of an optimal stable Markov control is established by convex analytic arguments.

Relevance:

10.00%

Publisher:

Abstract:

The set of attainable laws of the joint state-control process of a controlled diffusion is analyzed from a convex analytic viewpoint. Various equivalence relations depending on one-dimensional marginals thereof are defined on this set and the corresponding equivalence classes are studied.

Relevance:

10.00%

Publisher:

Abstract:

This paper studies the problem of constructing robust classifiers when the training data is plagued with uncertainty. The problem is posed as a Chance-Constrained Program (CCP) which ensures that the uncertain data points are classified correctly with high probability. Unfortunately, such a CCP turns out to be intractable. The key novelty is in employing Bernstein bounding schemes to relax the CCP into a convex second-order cone program whose solution is guaranteed to satisfy the probabilistic constraint. Prior to this work, only Chebyshev-based relaxations were exploited in learning algorithms. Bernstein bounds employ richer partial information and hence can be far less conservative than Chebyshev bounds. Due to this efficient modeling of uncertainty, the resulting classifiers achieve higher classification margins and hence better generalization. Methodologies for classifying uncertain test data points and error measures for evaluating classifiers robust to uncertain data are discussed. Experimental results on synthetic and real-world datasets show that the proposed classifiers are better equipped to handle data uncertainty and outperform the state of the art in many cases.
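To make the "relax the CCP into a second-order cone program" step concrete, the sketch below solves a small robust classification SOCP with cvxpy. For simplicity it uses the older Chebyshev-style constraint y_i (w'mu_i + b) >= 1 + kappa * ||S_i w||, not the paper's Bernstein relaxation, and the means, covariance factors, labels, and kappa are all made up.

```python
# Toy second-order cone program for classification under ellipsoidal data uncertainty.
import numpy as np
import cvxpy as cp

mu = np.array([[3.0, 1.0], [2.5, -0.5], [4.0, 0.0],      # uncertain positive-class means
               [-3.0, 0.5], [-2.5, -1.0], [-4.0, 0.2]])  # uncertain negative-class means
y = np.array([1, 1, 1, -1, -1, -1])
S = [0.3 * np.eye(2) for _ in range(len(y))]              # per-point covariance square roots
kappa = 1.5                                               # safety factor from the probability bound

w, b = cp.Variable(2), cp.Variable()
# Each constraint pushes the point's whole uncertainty ellipsoid onto the correct
# side of the margin: y_i (w' mu_i + b) >= 1 + kappa * ||S_i w||_2.
constraints = [y[i] * (mu[i] @ w + b) >= 1 + kappa * cp.norm(S[i] @ w, 2)
               for i in range(len(y))]
prob = cp.Problem(cp.Minimize(cp.norm(w, 2)), constraints)
prob.solve()
print("w =", w.value, "b =", b.value)
```

Swapping in a Bernstein-based bound changes how kappa and the uncertainty terms are derived from the available partial moment information, but the resulting optimization stays a cone program of this general shape.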

Relevance:

10.00%

Publisher:

Abstract:

Tutte (1979) proved that the disconnected spanning subgraphs of a graph can be reconstructed from its vertex deck. This result is used to prove that if we can reconstruct a set of connected graphs from the shuffled edge deck (SED), then the vertex reconstruction conjecture is true. It is proved that a set of connected graphs can be reconstructed from the SED when all the graphs in the set are claw-free or all are P4-free. Such a problem is also solved for a large subclass of the class of chordal graphs. This subclass contains maximal outerplanar graphs. Finally, two new conjectures, which imply the edge reconstruction conjecture, are presented. Conjecture 1 demands a construction of a stronger k-edge hypomorphism (to be defined later) from the edge hypomorphism. It is well known that the Nash-Williams theorem applies to a variety of structures. To prove Conjecture 2, we need to incorporate more graph-theoretic information into the Nash-Williams theorem.

Relevance:

10.00%

Publisher:

Abstract:

An angle invariance property based on Hertz's principle of particle dynamics is employed to facilitate surface-ray tracing on nondevelopable hybrid quadric surfaces of revolution (h-QUASORs). This property, when used in conjunction with a Geodesic Constant Method, yields analytical expressions for all the ray parameters required in the UTD formulation. Differential geometrical considerations require that some of the ray parameters (defined heuristically in the UTD for the canonical convex surfaces) be modified before the UTD can be applied to such hybrid surfaces. Mutual coupling results for finite-dimensional slots are presented, as an example, on a satellite launch vehicle modeled by a general paraboloid of revolution and a right circular cylinder.