136 results for SPANNING TREE PROBLEM
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
A minimum cost spanning tree (mcst) problem analyzes the way to efficiently connect individuals to a source when they are located at different places. Once the efficient tree is obtained, the question of how to allocate the total cost among the agents involved naturally defines a conflicting claims situation. For instance, we may take the endowment to be the total cost of the network, while each individual's claim is the maximum amount she can be allocated, that is, her cost of connecting directly to the source. This yields a conflicting claims problem, so claims rules can be applied in order to obtain an allocation of the total cost. Nevertheless, the allocation obtained by using claims rules might not satisfy some appealing properties (in particular, it does not belong to the core of the associated cooperative game). We will define other natural claims problems that arise when we analyze the maximum and minimum amounts that an individual should pay in order to support the minimum cost tree.
Keywords: Minimum cost spanning tree problem, Claims problem, Core
JEL classification: C71, D63, D71.
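As a minimal sketch of the construction described above (our own illustration, not the paper's formal model): build a minimum cost spanning tree, take the endowment E to be the total tree cost, take each agent's claim to be her direct connection cost to the source, and apply one particular claims rule. The network data and the choice of the proportional rule are assumptions made purely for illustration.

```python
# Illustrative sketch (assumed data): allocate the cost of a minimum cost
# spanning tree via a claims rule, as in the abstract above.
import networkx as nx

# Hypothetical network: node 0 is the source, nodes 1-3 are agents.
G = nx.Graph()
G.add_weighted_edges_from([
    (0, 1, 6), (0, 2, 8), (0, 3, 10),    # direct connections to the source
    (1, 2, 3), (2, 3, 4), (1, 3, 9),     # links between agents
])

mst = nx.minimum_spanning_tree(G)
E = mst.size(weight="weight")                       # endowment: total tree cost
claims = {i: G[0][i]["weight"] for i in (1, 2, 3)}  # claim: direct cost to source

# Proportional claims rule (one possible rule): x_i = E * c_i / sum(c).
total_claims = sum(claims.values())
allocation = {i: E * c / total_claims for i, c in claims.items()}
print(E, allocation)   # E = 13; shares proportional to the claims (6, 8, 10)
```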
Abstract:
The geometric characterisation of tree orchards is a high-precision activity comprising the accurate measurement and knowledge of the geometry and structure of the trees. Different types of sensors can be used to perform this characterisation. In this work a terrestrial LIDAR sensor (SICK LMS200) whose emission source was a 905-nm pulsed laser diode was used. Given the known dimensions of the laser beam cross-section (with diameters ranging from 12 mm at the point of emission to 47.2 mm at a distance of 8 m), and the known dimensions of the elements that make up the crops under study (flowers, leaves, fruits, branches, trunks), it was anticipated that, for much of the time, the laser beam would only partially hit a foreground target/object, with the consequent problem of mixed pixels or edge effects. Understanding what happens in such situations was the principal objective of this work. With this in mind, a series of tests were set up to determine the geometry of the emitted beam and to determine the response of the sensor to different beam blockage scenarios. The main conclusions that were drawn from the results obtained were: (i) in a partial beam blockage scenario, the distance value given by the sensor depends more on the blocked radiant power than on the blocked surface area; (ii) there is an area that influences the measurements obtained that is dependent on the percentage of blockage and which ranges from 1.5 to 2.5 m with respect to the foreground target/object. If the laser beam impacts on a second target/object located within this range, this will affect the measurement given by the sensor. To interpret the information obtained from the point clouds provided by the LIDAR sensors, such as the volume occupied and the enclosing area, it is necessary to know the resolution and the process for obtaining this mesh of points and also to be aware of the problem associated with mixed pixels.
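Using only the two beam cross-section diameters quoted above (12 mm at emission, 47.2 mm at 8 m), and assuming (our assumption, not stated in the abstract) that the beam widens linearly with range, the footprint diameter at intermediate distances can be estimated as follows:

```python
# Rough beam-footprint estimate from the two diameters quoted in the abstract,
# under the assumption (ours) of linear beam widening with range.
D0_MM = 12.0      # diameter at the point of emission
D8_MM = 47.2      # diameter at 8 m
RANGE_M = 8.0

def beam_diameter_mm(distance_m: float) -> float:
    """Linearly interpolated beam diameter in mm at the given range."""
    return D0_MM + (D8_MM - D0_MM) * distance_m / RANGE_M

for d in (1.5, 2.5, 4.0, 8.0):
    print(f"{d:>4} m -> {beam_diameter_mm(d):5.1f} mm")
```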
Abstract:
It is known that, in a locally presentable category, localization exists with respect to every set of morphisms, while the statement that localization with respect to every (possibly proper) class of morphisms exists in locally presentable categories is equivalent to a large-cardinal axiom from set theory. One proves similarly, on one hand, that homotopy localization exists with respect to sets of maps in every cofibrantly generated, left proper, simplicial model category M whose underlying category is locally presentable. On the other hand, as we show in this article, the existence of localization with respect to possibly proper classes of maps in a model category M satisfying the above assumptions is implied by a large-cardinal axiom called Vopěnka's principle, although we do not know if the reverse implication holds. We also show that, under the same assumptions on M, every endofunctor of M that is idempotent up to homotopy is equivalent to localization with respect to some class S of maps, and if Vopěnka's principle holds then S can be chosen to be a set. There are examples showing that the latter need not be true if M is not cofibrantly generated. The above assumptions on M are satisfied by simplicial sets and symmetric spectra over simplicial sets, among many other model categories.
Abstract:
Using the continuation method we prove that the circular and the elliptic symmetric periodic orbits of the planar rotating Kepler problem can be continued into periodic orbits of the planar collision restricted 3–body problem. We also continue to this restricted problem the so-called "comet orbits".
Abstract:
We say the endomorphism problem is solvable for an element W in a free group F if it can be decided effectively whether, given U in F, there is an endomorphism Φ of F sending W to U. This work analyzes an approach due to C. Edmunds and improved by C. Sims. We prove that, when W is a two-generator word, this approach yields an algorithm that solves the endomorphism problem in time polynomial in the length of U. This result gives a polynomial-time algorithm for solving, in free groups, two-variable equations in which all the variables occur on one side of the equality and all the constants on the other side.
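To make the setting concrete (this is only the verification step, not the Edmunds–Sims algorithm whose polynomial-time analysis is the paper's contribution): an endomorphism of a free group is determined by the images of the generators, and checking that a given Φ sends W to U amounts to substituting and freely reducing. A minimal sketch over the free group on {a, b}, with the particular Φ and W chosen purely for illustration:

```python
# Free group words over {a, b}: lowercase = generator, uppercase = its inverse.
def free_reduce(word: str) -> str:
    """Cancel adjacent inverse pairs (aA, Aa, bB, Bb) until none remain."""
    out = []
    for ch in word:
        if out and out[-1] == ch.swapcase():
            out.pop()
        else:
            out.append(ch)
    return "".join(out)

def apply_endomorphism(phi: dict, word: str) -> str:
    """Apply the endomorphism given by images of the generators, then reduce."""
    image = []
    for ch in word:
        img = phi[ch.lower()]
        image.append(img if ch.islower() else
                     "".join(c.swapcase() for c in reversed(img)))
    return free_reduce("".join(image))

# Illustrative endomorphism: a -> ab, b -> b.
phi = {"a": "ab", "b": "b"}
W = "abAB"                          # the commutator of a and b
print(apply_endomorphism(phi, W))   # -> 'abAB' (this phi happens to fix W)
```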
Abstract:
The paper is devoted to the study of a type of differential system that usually appears in the study of some Hamiltonian systems with 2 degrees of freedom. We prove the existence of infinitely many periodic orbits on each negative energy level. All these periodic orbits pass near the total collision. Finally, we apply these results to study the existence of periodic orbits in the charged collinear 3–body problem.
Abstract:
The division problem consists of allocating an amount of a perfectly divisible good among a group of n agents with single-peaked preferences. A rule maps preference profiles into n shares of the amount to be allocated. A rule is bribe-proof if no group of agents can compensate another agent to misrepresent his preference and, after an appropriate redistribution of their shares, each obtain a strictly preferred share. We characterize all bribe-proof rules as the class of efficient, strategy-proof, and weak replacement monotonic rules. In addition, we identify the functional form of all bribe-proof and tops-only rules.
Abstract:
The division problem consists of allocating an amount M of a perfectly divisible good among a group of n agents. Sprumont (1991) showed that if agents have single-peaked preferences over their shares, the uniform rule is the unique strategy-proof, efficient, and anonymous rule. Ching and Serizawa (1998) extended this result by showing that the set of single-plateaued preferences is the largest domain, for all possible values of M, admitting a rule (the extended uniform rule) satisfying strategy-proofness, efficiency and symmetry. We identify, for each M and n, a maximal domain of preferences under which the extended uniform rule also satisfies the properties of strategy-proofness, efficiency, continuity, and "tops-onlyness". These domains (called weakly single-plateaued) are strictly larger than the set of single-plateaued preferences. However, their intersection, when M varies from zero to infinity, coincides with the set of single-plateaued preferences.
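A minimal sketch of the classical uniform rule mentioned above, for single-peaked preferences represented by their peaks (the numeric example is ours and chosen only for illustration): when the peaks sum to more than M each agent receives min(peak_i, λ), otherwise max(peak_i, λ), with λ calibrated so that the shares sum to M.

```python
from typing import List

def uniform_rule(M: float, peaks: List[float]) -> List[float]:
    """Uniform rule for the division problem with single-peaked preferences:
    min(peak_i, lam) under excess demand, max(peak_i, lam) under excess supply,
    with lam calibrated so that the shares sum to M."""
    total = sum(peaks)
    if total == M:
        return list(peaks)
    clamp = min if total > M else max   # rationing vs. distributing the excess
    # The sum of clamped shares is nondecreasing in lam, so bisect for lam.
    lo, hi = 0.0, max([M] + peaks)
    for _ in range(100):
        lam = (lo + hi) / 2.0
        if sum(clamp(p, lam) for p in peaks) < M:
            lo = lam
        else:
            hi = lam
    lam = (lo + hi) / 2.0
    return [clamp(p, lam) for p in peaks]

# Excess demand: peaks sum to 12 > M = 9, so each share is min(peak, lam).
print(uniform_rule(9, [2, 4, 6]))   # ≈ [2, 3.5, 3.5]
```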
Abstract:
R.P. Boas has found necessary and sufficient conditions of belonging of function to Lipschitz class. From his findings it turned out, that the conditions on sine and cosine coefficients for belonging of function to Lip α(0 & α & 1) are the same, but for Lip 1 are different. Later his results were generalized by many authors in the viewpoint of generalization of condition on the majorant of modulus of continuity. The aim of this paper is to obtain Boas-type theorems for generalized Lipschitz classes. To define generalized Lipschitz classes we use the concept of modulus of smoothness of fractional order.
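For reference, the standard definitions behind these classes (quoted as textbook material, not from this paper) read roughly as follows, where ω is a majorant and ω_β(f, ·) denotes a modulus of smoothness of (possibly fractional) order β:

```latex
% Classical Lipschitz class, 0 < \alpha \le 1:
\[
  f \in \operatorname{Lip}\alpha
  \quad\Longleftrightarrow\quad
  |f(x+h) - f(x)| \le C\,|h|^{\alpha} \ \text{for all } x, h .
\]
% Generalized Lipschitz class defined through a majorant \omega and a
% modulus of smoothness \omega_{\beta}(f,\cdot) of (fractional) order \beta > 0:
\[
  f \in \operatorname{Lip}(\omega,\beta)
  \quad\Longleftrightarrow\quad
  \omega_{\beta}(f,\delta) \le C\,\omega(\delta), \qquad 0 < \delta \le 1 .
\]
```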
Abstract:
We propose a classification and derive the associated normal forms for rational difference equations with complex coefficients. As an application, we study the global periodicity problem for second order rational difference equations with complex coefficients. We find new necessary conditions as well as some new examples of globally periodic equations.
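A classical instance of the phenomenon studied here (quoted as a well-known example, not as one of the paper's new equations) is the Lyness equation, a second order rational difference equation that is globally periodic of period five for positive initial conditions:

```latex
\[
  x_{n+1} \;=\; \frac{1 + x_n}{x_{n-1}},
  \qquad
  (x_n)_{n\ge 0} \;=\; x_0,\; x_1,\; \frac{1+x_1}{x_0},\;
  \frac{1+x_0+x_1}{x_0 x_1},\; \frac{1+x_0}{x_1},\; x_0,\; x_1,\;\dots
\]
```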
Abstract:
We present experimental and theoretical analyses of data requirements for haplotype inference algorithms. Our experiments include a broad range of problem sizes under two standard models of tree distribution and were designed to yield statistically robust results despite the size of the sample space. Our results validate Gusfield's conjecture that a population size of n log n is required to give (with high probability) sufficient information to deduce the n haplotypes and their complete evolutionary history. We complement these experimental findings with theoretical bounds on the population size. We also analyze the population size required to deduce some fixed fraction of the evolutionary history of a set of n haplotypes and establish linear bounds on the required sample size. These linear bounds are also shown theoretically.
Abstract:
Counter automata are more powerful versions of finite state automata in which addition and subtraction operations are permitted on a set of integer registers, called counters. We show that the word problem of Z^n is accepted by a nondeterministic m-counter automaton if and only if m ≥ n.
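One direction of this statement is easy to visualize (a sketch of ours, in code rather than automaton form): a word over the generators of Z^n and their inverses represents the identity exactly when n counters, incremented or decremented as the word is read, all return to zero.

```python
# Sketch of the 'easy' direction: n counters suffice for the word problem of
# Z^n, one counter per generator.  A word is a sequence of pairs
# (generator index, +1 or -1).
from typing import Iterable, Tuple

def accepts_identity(n: int, word: Iterable[Tuple[int, int]]) -> bool:
    """Return True iff the word represents the identity element of Z^n."""
    counters = [0] * n
    for gen, sign in word:            # read one letter, update one counter
        counters[gen] += sign
    return all(c == 0 for c in counters)

# a * b * a^{-1} * b^{-1} is the identity in Z^2, since the group is abelian:
print(accepts_identity(2, [(0, +1), (1, +1), (0, -1), (1, -1)]))  # True
```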
Abstract:
The Whitehead minimization problem consists in finding a minimum size element in the automorphic orbit of a word, a cyclic word or a finitely generated subgroup in a finite rank free group. We give the first fully polynomial algorithm to solve this problem, that is, an algorithm that is polynomial both in the length of the input word and in the rank of the free group. Earlier algorithms had an exponential dependency in the rank of the free group. It follows that the primitivity problem – to decide whether a word is an element of some basis of the free group – and the free factor problem can also be solved in polynomial time.
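As a small illustration of one basic subroutine underlying such algorithms (not the polynomial-time Whitehead minimization algorithm itself, which is the paper's contribution): cyclically reducing a word over a free group basis, the usual preprocessing step before applying Whitehead automorphisms.

```python
# Cyclic reduction of a word over a free group basis (lowercase = generator,
# uppercase = its inverse).  A standard preprocessing step, nothing more.
def free_reduce(word: str) -> str:
    out = []
    for ch in word:
        if out and out[-1] == ch.swapcase():
            out.pop()                 # cancel an adjacent x x^{-1} pair
        else:
            out.append(ch)
    return "".join(out)

def cyclic_reduce(word: str) -> str:
    w = free_reduce(word)
    while len(w) > 1 and w[0] == w[-1].swapcase():
        w = w[1:-1]                   # strip a conjugating letter from each end
    return w

print(cyclic_reduce("Ababa"))   # -> 'bab': the input is a conjugate of 'bab'
```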
Abstract:
Conflict among member states regarding the distribution of net financial burdens has been allowed to contaminate the entire design of the EU budget with very negative consequences in terms of equity, efficiency and transparency. To get around this problem and pave the way for a substantive budget reform, we propose to decouple distributional negotiations from the rest of the budget process by linking member state net balances in a rigid manner to relative prosperity. This would be achieved through the introduction of a system of compensating horizontal transfers that would take to its logical conclusion the Commission's proposal for a generalized compensation mechanism. We discuss the impact of the proposed scheme on member states' incentives and illustrate its financial implications using revenue and expenditure projections for 2013 that are based on the current Financial Perspectives and Own Resources Decision.