138 results for Strictly Hyperbolic Polynomial
Abstract:
We propose a new procurement procedure which allocates shares of the total amount to be procured depending on the bids of suppliers. Among the properties of the mechanism are: (i) Bidders have an incentive to participate in the procurement procedure, as equilibrium payoffs are strictly positive. (ii) The mechanism makes it possible to vary the extent to which affirmative action objectives, such as promoting local industries, are pursued. (iii) Surprisingly, even while accomplishing affirmative action goals, procurement expenditures might be lower than under a classical auction format. Keywords: Procurement Auction, Affirmative Action. JEL: C72, D44, H57
Abstract:
Assume that the problem Q0 is not solvable in polynomial time. For theories T containing a sufficiently rich part of true arithmetic, we characterize T ∪ {Con_T} as the minimal extension of T proving, for some algorithm, that it decides Q0 as fast as any algorithm B with the property that T proves that B decides Q0. Here, Con_T asserts the consistency of T. Moreover, we characterize problems with an optimal algorithm in terms of arithmetical theories.
Abstract:
In a seminal paper [10], Weitz gave a deterministic fully polynomial approximation scheme for counting exponentially weighted independent sets (which is the same as approximating the partition function of the hard-core model from statistical physics) in graphs of degree at most d, up to the critical activity for the uniqueness of the Gibbs measure on the infinite d-regular tree. More recently Sly [8] (see also [1]) showed that this is optimal in the sense that if there is an FPRAS for the hard-core partition function on graphs of maximum degree d for activities larger than the critical activity on the infinite d-regular tree then NP = RP. In this paper we extend Weitz's approach to derive a deterministic fully polynomial approximation scheme for the partition function of general two-state anti-ferromagnetic spin systems on graphs of maximum degree d, up to the corresponding critical point on the d-regular tree. The main ingredient of our result is a proof that for two-state anti-ferromagnetic spin systems on the d-regular tree, weak spatial mixing implies strong spatial mixing. This in turn uses a message-decay argument which extends a similar approach proposed recently for the hard-core model by Restrepo et al. [7] to the case of general two-state anti-ferromagnetic spin systems.
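For orientation, the hard-core partition function and the uniqueness threshold referred to above take the following standard form (the two-state anti-ferromagnetic systems of the paper include the hard-core model as a special case): Z_G(λ) = Σ_{I independent in G} λ^{|I|}, and the critical activity on the infinite d-regular tree is λ_c(d) = (d−1)^{d−1} / (d−2)^d. Weitz's algorithm thus applies for activities λ < λ_c(d) on graphs of maximum degree d, while Sly's hardness result concerns activities λ > λ_c(d).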
Abstract:
The practicum carried out at the EAP (Equip d'Assessorament Psicopedagògic) allowed me to become acquainted with the activities it performs inside and outside schools, and more specifically with the intervention delivered through the USEE (Unitats de Suport a l'Educació Especial). These units aim to foster the inclusion of pupils with Special Educational Needs (NEE) who hold an official assessment. Through this work we analyse how proposing clear, well-defined criteria improves the distribution of these pupils and, at the same time, makes better use of the support resources of the schools. Likewise, this proposal of criteria can serve as an opportunity to establish new scenarios of cooperation between the support units and the teaching carried out in the ordinary classroom.
Abstract:
Catadioptric sensors are combinations of mirrors and lenses designed to obtain a wide field of view. In this paper we propose a new sensor that has omnidirectional viewing ability and also provides depth information about the nearby surroundings. The sensor is based on a conventional camera coupled with a laser emitter and two hyperbolic mirrors. The mathematical formulation and precise specifications of the intrinsic and extrinsic parameters of the sensor are discussed. Our approach overcomes limitations of existing omnidirectional sensors and eventually leads to reduced production costs.
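As background, a standard single-viewpoint construction for hyperbolic mirrors (not necessarily the exact parameterization adopted in this paper): the mirror is one sheet of the hyperboloid z²/a² − (x² + y²)/b² = 1, whose foci lie at (0, 0, ±c) with c = √(a² + b²). Placing the camera's optical centre at one focus makes all imaged rays appear to emanate from the other focus, which therefore acts as the single effective viewpoint of the omnidirectional system.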
Abstract:
Most network operators have considered reducing Label Switched Router (LSR) label spaces (i.e. the number of labels that can be used) as a means of simplifying management of underlying Virtual Private Networks (VPNs) and, hence, reducing operational expenditure (OPEX). This letter discusses the problem of reducing the label spaces in Multiprotocol Label Switching (MPLS) networks using label merging, better known as MultiPoint-to-Point (MP2P) connections. Because of their origins in IP, MP2P connections have been considered to have tree shapes with Label Switched Paths (LSPs) as branches. Due to this fact, previous works by many authors affirm that the problem of minimizing the label space using MP2P in MPLS, the Merging Problem, cannot be solved optimally with a polynomial algorithm (NP-complete), since it involves a hard decision problem. However, in this letter the Merging Problem is analyzed from the perspective of MPLS, and it is deduced that tree shapes in MP2P connections are irrelevant. By overriding this tree-shape consideration, it is possible to perform label merging in polynomial time. Based on how MPLS signaling works, this letter proposes an algorithm to compute the minimum number of labels using label merging: the Full Label Merging algorithm. In conclusion, we reclassify the Merging Problem as polynomial-solvable instead of NP-complete. In addition, simulation experiments confirm that without the tree-branch selection problem, the label space can be reduced even further.
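As a rough illustration of why merging shrinks label spaces (a simplified counting sketch with a hypothetical data model, not the Full Label Merging algorithm of the letter): if every LSP traversing an LSR consumes its own label, the per-node label count grows with the number of LSPs, whereas merging all LSPs towards the same egress into one MP2P connection lets the LSR reuse a single label per destination.

from collections import defaultdict

def label_spaces(lsps):
    # lsps: iterable of (path, egress) pairs, where path is a sequence of LSR names.
    # Returns per-node label counts without merging and with per-egress (MP2P) merging.
    per_lsp = defaultdict(int)   # one label per LSP traversing the node
    merged = defaultdict(set)    # one label per egress (FEC) seen at the node
    for path, egress in lsps:
        for node in path[:-1]:   # the egress itself terminates the LSP
            per_lsp[node] += 1
            merged[node].add(egress)
    return dict(per_lsp), {node: len(fecs) for node, fecs in merged.items()}

# Three LSPs towards the same egress E: node C needs three labels unmerged, one merged.
example = [(("A", "C", "E"), "E"), (("B", "C", "E"), "E"), (("D", "C", "E"), "E")]
print(label_spaces(example))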
Abstract:
All-optical label swapping (AOLS) forms a key technology towards the implementation of all-optical packet switching nodes (AOPS) for the future optical Internet. The capital expenditure of deploying AOLS increases with the size of the label space (i.e. the number of labels used), since a special optical device is needed for each recognized label on every node. Label space sizes are affected by the way in which demands are routed. For instance, while shortest-path routing leads to the usage of fewer labels but high link utilization, minimum interference routing leads to the opposite. This paper studies all-optical label stacking (AOLStack), which is an extension of the AOLS architecture. AOLStack aims at reducing label spaces while easing the compromise with link utilization. In this paper, an integer linear program is proposed with the objective of analyzing the softening of the aforementioned trade-off due to AOLStack. Furthermore, a heuristic aiming at finding good solutions in polynomial time is proposed as well. Simulation results show that AOLStack either a) reduces the label spaces with a low increase in link utilization or, similarly, b) makes better use of the residual bandwidth to decrease the number of labels even further.
Abstract:
In this paper, we give a new construction of resonant normal forms with a small remainder for near-integrable Hamiltonians at a quasi-periodic frequency. The construction is based on the special case of a periodic frequency, a Diophantine result concerning the approximation of a vector by independent periodic vectors and a technique of composition of periodic averaging. It enables us to deal with non-analytic Hamiltonians, and in this first part we will focus on Gevrey Hamiltonians and derive normal forms with an exponentially small remainder. This extends a result which was known for analytic Hamiltonians, and only in the periodic case for Gevrey Hamiltonians. As applications, we obtain an exponentially large upper bound on the stability time for the evolution of the action variables and an exponentially small upper bound on the splitting of invariant manifolds for hyperbolic tori, generalizing corresponding results for analytic Hamiltonians.
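For orientation only (the exponents below are placeholders rather than the constants established in the paper): results of this kind typically assert that, for a size-ε perturbation of an integrable Gevrey Hamiltonian, the actions satisfy |I(t) − I(0)| ≤ C ε^b for all times |t| ≤ exp(c ε^{−a}), with positive exponents a, b depending on the Gevrey class, and that the splitting of the invariant manifolds of hyperbolic tori is bounded by a quantity of the same exponentially small order exp(−c ε^{−a}).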
Abstract:
This paper is a sequel to ``Normal forms, stability and splitting of invariant manifolds I. Gevrey Hamiltonians", in which we gave a new construction of resonant normal forms with an exponentially small remainder for near-integrable Gevrey Hamiltonians at a quasi-periodic frequency, using a method of periodic approximations. In this second part we focus on finitely differentiable Hamiltonians, and we derive normal forms with a polynomially small remainder. As applications, we obtain a polynomially large upper bound on the stability time for the evolution of the action variables and a polynomially small upper bound on the splitting of invariant manifolds for hyperbolic tori.
Abstract:
This contribution compares existing and newly developed techniques for geometrically representing mean-variance-skewness portfolio frontiers based on the rather widely adopted methodology of polynomial goal programming (PGP) on the one hand and the more recent approach based on the shortage function on the other hand. Moreover, we explain the working of these different methodologies in detail and provide graphical illustrations. Inspired by these illustrations, we prove a generalization of the well-known two-fund separation theorem from traditional mean-variance portfolio theory.
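A commonly used PGP formulation of the mean-variance-skewness problem (the paper's exact specification may differ) first computes the aspiration levels Z_1* (maximal attainable mean) and Z_3* (maximal attainable skewness) with the portfolio variance normalized to one, and then minimizes the deviations from both: min d_1^p + d_3^q subject to w'μ + d_1 = Z_1*, w'M_3(w ⊗ w) + d_3 = Z_3*, w'Σw = 1, Σ_i w_i = 1 and d_1, d_3 ≥ 0, where μ, Σ and M_3 denote the mean vector, covariance matrix and co-skewness tensor, and p, q are investor-chosen preference parameters. The shortage-function approach instead measures the distance from a given portfolio to the mean-variance-skewness frontier along a chosen direction of simultaneous mean and skewness improvement and variance reduction.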
Abstract:
The effect of initial conditions on the speed of propagating fronts in reaction-diffusion equations is examined in the framework of the Hamilton-Jacobi theory. We study the transition between quenched and nonquenched fronts both analytically and numerically for parabolic and hyperbolic reaction-diffusion equations. Nonhomogeneous media are also analyzed, and the effect of algebraic initial conditions is discussed.
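For reference, in a generic form that may differ in detail from the equations actually studied here: the hyperbolic reaction-diffusion equation obtained from a Cattaneo-type flux with relaxation time τ reads τ ∂²ρ/∂t² + ∂ρ/∂t = D ∂²ρ/∂x² + f(ρ) + τ ∂f(ρ)/∂t, which reduces to the parabolic equation as τ → 0. Sufficiently steep (e.g. compactly supported) initial conditions select the classical pulled speed, v = 2 √(D f'(0)) in the parabolic case and, in the commonly quoted hyperbolic counterpart, v = 2 √(D f'(0)) / (1 + τ f'(0)), whereas slowly decaying (e.g. algebraic) initial profiles can select faster fronts.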
Abstract:
Four methods were tested to assess the fire-blight disease response on grafted pear plants. The leaves of the plants were inoculated with Erwinia amylovora suspensions by pricking with clamps, cutting with scissors, local infiltration, and painting a bacterial suspension onto the leaves with a paintbrush. The effects of the inoculation methods were studied in dose-time-response experiments carried out in climate chambers under quarantine conditions. A modified Gompertz model was used to analyze the disease-time relationships and provided information on the rate of infection progression (rg) and the time delay to the start of symptoms (t0). The disease-pathogen-dose relationships were analyzed according to a hyperbolic saturation model in which the median effective dose (ED50) of the pathogen and the maximum disease level (ymax) were determined. Localized infiltration into the leaf mesophyll resulted in the early (short t0) but slow (low rg) development of infection, whereas in leaves pricked with clamps disease symptoms developed late (long t0) but rapidly (high rg). Paintbrush inoculation of the plants resulted in an incubation period of medium length, a moderate rate of infection progression, and low ymax values. In leaves inoculated with scissors, fire-blight symptoms developed early (short t0) and rapidly (high rg), and with the lowest ED50 and the highest ymax.
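For concreteness, two functional forms consistent with the parameters named above (the exact parameterizations used in the study may differ): a modified Gompertz disease-progress curve with rate rg, delay t0 and asymptote ymax, y(t) = ymax · exp(−exp((rg · e / ymax)(t0 − t) + 1)), and a hyperbolic saturation dose-response with median effective dose ED50, y(D) = ymax · D / (ED50 + D).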
Abstract:
A radiative equation of the Cattaneo–Vernotte type is derived from information theory and the radiative transfer equation. The equation thus derived is a radiative analog of the equation that is used for the description of hyperbolic heat conduction. It is shown, without recourse to any phenomenological assumption, that radiative transfer may be included in a natural way in the framework of extended irreversible thermodynamics.
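For reference, the conductive analog alluded to above is the Cattaneo–Vernotte constitutive law τ ∂q/∂t + q = −λ ∇T, which, combined with the energy balance, yields the hyperbolic (telegraph-type) heat equation τ ∂²T/∂t² + ∂T/∂t = α ∇²T with α = λ/(ρc); here τ is the relaxation time of the heat flux q. The radiative equation derived in the paper plays the analogous role for radiative transfer.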
Abstract:
This paper initially identifies the main transformations of the television system caused by digitalization. Its development across several broadcasting platforms is analyzed, as well as the particular obstacles and requirements detected for each of them. Due to its technical characteristics and its historical link to public service, the terrestrial network requires migration strategies different from strictly commercial ones, and public intervention might be needed. The paper focuses on such migration strategies towards DTT and identifies the main issues for public intervention in the areas of the digital scenario: technology, business and market transformation, and the reception field. Moreover, it describes and classifies the challenges that public broadcasters must confront due to digitalization. The paper finally concludes that the leadership of public broadcasters during the migration towards DTT is a useful tool for public policy. The need to foster the digitalization of the terrestrial platform and to achieve certain social and public goals beyond market interests offers an opportunity for public institutions and public broadcasters to work together. That leading role could also be positive for the public service as it faces its necessary redefinition and repositioning within the digital context.
Abstract:
In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from such a pool as sequences of nonoverlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. For additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Polynomial algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm here relies on the simple fact that this highest scoring gene can be stored and updated; this requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model. Instead, the definition of valid gene structures is externally specified in the so-called Gene Model. The Gene Model simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem. In particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
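A minimal sketch of the linear-time idea described above, under simplifying assumptions (a single strand, additive scores, and no frame or Gene Model compatibility checks; the field names are hypothetical): exons are scanned by increasing acceptor position while a second pointer releases exons by increasing donor position, so the best score of any gene ending strictly upstream of the current acceptor can be kept in a single running maximum.

from collections import namedtuple

Exon = namedtuple("Exon", "acceptor donor score")  # hypothetical fields; acceptor < donor

def best_gene_score(exons):
    by_acceptor = sorted(exons, key=lambda e: e.acceptor)
    by_donor = sorted(exons, key=lambda e: e.donor)
    best_ending_at = {}   # best score of a gene ending at a given exon
    best_upstream = 0.0   # best score of any gene ending upstream of the current acceptor
    best_overall = 0.0
    released = 0
    for exon in by_acceptor:
        # release every exon whose donor lies strictly upstream of this acceptor
        while released < len(by_donor) and by_donor[released].donor < exon.acceptor:
            best_upstream = max(best_upstream, best_ending_at[by_donor[released]])
            released += 1
        best_ending_at[exon] = best_upstream + exon.score
        best_overall = max(best_overall, best_ending_at[exon])
    return best_overall

Apart from the initial sorting, every exon is handled a constant number of times, so the assembly itself runs in time linear in the number of predicted exons.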