113 results for Mixed capacitated arc routing problem


Relevance:

20.00%

Publisher:

Abstract:

Guba and Sapir asked, in their joint paper [8], whether the simultaneous conjugacy problem is solvable in diagram groups or, at least, in Thompson's group F. We give an elementary proof that settles the latter question, relying purely on the description of F as the group of piecewise-linear orientation-preserving homeomorphisms of the unit interval. The techniques we develop also allow us to solve the ordinary conjugacy problem and to compute roots and centralizers. Moreover, they generalize to solve the same questions in larger groups of piecewise-linear homeomorphisms.

"See the abstract at the beginning of the document in the attached file."

We prove existence theorems for the Dirichlet problem for hypersurfaces of constant special Lagrangian curvature in Hadamard manifolds. The first results are obtained using the continuity method and approximation, and are then refined using two iterations of the Perron method. The a priori estimates used in the continuity method are valid in any ambient manifold.

"See the abstract at the beginning of the document in the attached file."

We establish existence and non-existence results to the Brezis-Nirenberg type problem involving the square root of the Laplacian in a bounded domain with zero Dirichlet boundary condition.

We construct a new family of semi-discrete numerical schemes for the approximation of the one-dimensional periodic Vlasov-Poisson system. The methods are based on coupling a discontinuous Galerkin approximation of the Vlasov equation with several finite element (conforming, non-conforming, and mixed) approximations of the Poisson problem. We show optimal error estimates for all the proposed methods in the case of smooth, compactly supported initial data. The issue of energy conservation is also analyzed for some of the methods.

A family of nonempty closed convex sets is built from the data of the generalized Nash equilibrium problem (GNEP). The sets are selected iteratively so that the intersection of the selected sets contains solutions of the GNEP. The algorithm introduced by Iusem and Sosa (2003) is adapted to obtain solutions of the GNEP. Finally, some numerical experiments illustrate the numerical behavior of the algorithm.
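The set-selection strategy can be pictured with a minimal sketch, assuming for illustration that the convex sets are half-spaces (the paper builds its sets from the GNEP data; all names below are illustrative): cyclically projecting onto the family drives the iterate toward a point in the intersection of the selected sets.

```python
# Illustrative sketch only: cyclic Euclidean projections onto a family of
# closed convex sets, here half-spaces {y : a.y <= b}. The paper's sets are
# constructed from the GNEP data; half-spaces are an assumption for clarity.
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def project_halfspace(x, a, b):
    """Euclidean projection of x onto the half-space {y : a.y <= b}."""
    viol = dot(a, x) - b
    if viol <= 0:
        return x                       # already inside, nothing to do
    scale = viol / dot(a, a)
    return [xi - scale * ai for xi, ai in zip(x, a)]

def cyclic_projections(x, halfspaces, sweeps=200):
    """Sweep through the sets repeatedly, projecting onto each in turn."""
    for _ in range(sweeps):
        for a, b in halfspaces:
            x = project_halfspace(x, a, b)
    return x

# Two half-spaces whose intersection is {x1 <= 1} ∩ {x2 <= 1}.
sol = cyclic_projections([3.0, 2.0], [([1.0, 0.0], 1.0), ([0.0, 1.0], 1.0)])
```

Starting from a point outside both half-spaces, the iterate lands on the point of the intersection nearest along the projection path.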

This note develops a flexible methodology for splicing economic time series that avoids the extreme assumptions implicit in the procedures most commonly used in the literature. It allows the user to split the required correction to the older of the two series being linked between its levels and its growth rates, on the basis of what the user knows or conjectures about the persistence of the factors that account for the discrepancy between the two series at their linking point. The time profile of the correction is derived from the assumption that the error in the older series reflects its inadequate coverage of emerging sectors or activities that grow faster than the aggregate.
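A minimal sketch of the splicing idea, with illustrative names and a simple linear phase-in that is only an assumption about the paper's actual time profile: a parameter `theta` splits the discrepancy at the linking point between a uniform level adjustment and an adjustment fed in gradually through the growth rates.

```python
# Hypothetical sketch, not the note's formula. The discrepancy d between the
# new and old series at the link period is split by theta in [0, 1]: a share
# theta is treated as a pure level error (applied uniformly to the whole old
# series) and the rest is phased in linearly through the growth rates.
def splice(old, new_at_link, theta):
    """Rescale the old series so it meets new_at_link at its last point."""
    T = len(old) - 1
    d = new_at_link / old[T]                     # discrepancy at the link
    out = []
    for t, x in enumerate(old):
        weight = theta + (1.0 - theta) * (t / T) # exponent of the correction
        out.append(x * d ** weight)
    return out

# theta = 1 reproduces pure retropolation (constant proportional correction);
# theta = 0 leaves the earliest observation untouched.
series = splice([100.0, 110.0, 120.0], new_at_link=132.0, theta=0.0)
```

With `theta = 0` the first observation keeps its original level while the last is pulled exactly onto the new series, so the whole correction flows through the growth rates.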

The division problem consists of allocating a given amount of a homogeneous and perfectly divisible good among a group of agents with single-peaked preferences on the set of their potential shares. A rule proposes a vector of shares for each division problem. The literature has implicitly assumed that agents will find acceptable any share they are assigned. In this paper we consider the division problem when agents' participation is voluntary. Each agent has an idiosyncratic interval of acceptable shares on which his preferences are single-peaked. A rule has to propose to each agent either non-participation or an acceptable share, because otherwise he would opt out, which would require reassigning some of the remaining agents' shares. We study a subclass of efficient and consistent rules and characterize extensions of the uniform rule that deal explicitly with agents' voluntary participation.
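As background, the classical uniform rule that the paper extends can be sketched as follows (a standard construction, not the paper's extended rules; names are illustrative): under excess demand each agent receives the minimum of his peak and a common bound, under excess supply the maximum, with the bound chosen so shares sum to the available amount.

```python
# Sketch of the classical uniform rule for single-peaked preferences.
# peaks[i] is agent i's most preferred share; amount is the total to divide.
def uniform_rule(peaks, amount, iters=100):
    excess_demand = sum(peaks) >= amount
    clamp = min if excess_demand else max   # cap peaks or top them up
    lo, hi = 0.0, max(peaks) + amount       # bracket for the common bound L
    for _ in range(iters):                  # bisection: total is monotone in L
        mid = (lo + hi) / 2.0
        total = sum(clamp(p, mid) for p in peaks)
        if total < amount:
            lo = mid
        else:
            hi = mid
    L = (lo + hi) / 2.0
    return [clamp(p, L) for p in peaks]

# Three agents with peaks 2, 4, 6 dividing 9 units: excess demand, so the
# two larger peaks are capped at a common bound of 3.5.
shares = uniform_rule([2.0, 4.0, 6.0], 9.0)
```

The bisection exploits that the total assigned is monotone in the common bound, so the feasibility condition pins the bound down uniquely.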

Arc@ aims to improve the student's learning process and to make the management of teaching and content more efficient. Regardless of where the teaching material is located, XML-based mechanisms have been established that make it possible to import and export the information from the various existing applications, and to index and organize this information for subsequent transformation and processing.

Current parallel applications running on clusters require an interconnection network to perform communications among all available computing nodes. Communication imbalance can produce network congestion, reducing throughput and increasing latency, and thus degrading overall system performance. On the other hand, parallel applications running on these networks possess representative stages that allow their characterization, as well as repetitive behavior that can be identified on the basis of this characterization. This work presents Predictive and Distributed Routing Balancing (PR-DRB), a new method developed to gradually control network congestion, based on path expansion, traffic distribution, and effective traffic load, in order to maintain low latency values. PR-DRB monitors message latencies on intermediate routers, makes decisions about alternative paths, and records the communication pattern information encountered during congestion situations. Based on the repetitiveness of applications, the best solutions recorded are reapplied when a saved communication pattern reappears. Traffic congestion experiments were conducted to evaluate the performance of the method, and improvements were observed.
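The record-and-reapply idea can be illustrated with a small sketch (not the authors' implementation; all names are hypothetical): cache the best alternative paths keyed by the observed communication pattern, and reuse them when the same pattern recurs.

```python
# Illustrative sketch of PR-DRB's central idea: when congestion is resolved,
# record the alternative paths that resolved it, keyed by the observed
# communication pattern, and reapply them when the pattern reappears.
class PatternCache:
    def __init__(self):
        self.best = {}   # pattern signature -> (alternative paths, latency)

    def signature(self, flows):
        """Hashable, order-independent summary of (src, dst, volume) flows."""
        return tuple(sorted(flows))

    def lookup(self, flows):
        """Return the recorded solution for this pattern, or None."""
        return self.best.get(self.signature(flows))

    def record(self, flows, paths, latency):
        """Keep only the lowest-latency solution seen for each pattern."""
        sig = self.signature(flows)
        prev = self.best.get(sig)
        if prev is None or latency < prev[1]:
            self.best[sig] = (paths, latency)

cache = PatternCache()
flows = [("n0", "n3", 10), ("n1", "n3", 8)]          # hot destination n3
cache.record(flows, {"n0->n3": ["n0", "n2", "n3"]}, latency=4.2)
hit = cache.lookup([("n1", "n3", 8), ("n0", "n3", 10)])  # same pattern, reordered
```

Sorting the flows before hashing makes the lookup insensitive to the order in which congestion reports arrive, so a repeated application phase maps to the same cache entry.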

The accelerated invention of new hardware and software modifies, almost daily, our perception of the world and therefore cultural production, blurring the boundaries between concepts such as art and literature, painting and book, image and text. Although these pairs have always been objects of theoretical discourse, the discussion takes on a growing urgency now that new technologies expose what had been sheltered in the realm of theory. The very way we understand reality is affected by the immediacy of these media. This research analyzes the work of various new-media authors who address the problem of the representation of memory from this contemporary perspective. The research carried out in the doctoral thesis focuses on the representation of memory as articulated in the work of Chris Marker. Of special interest are the author's latest devices, created within the framework of the so-called new technologies and the new exhibition spaces for cinema. The project proposes an analysis of the memory that these discourses suggest through their characteristic themes: the archive, cultural identities, the spectator's contribution, databases, and the technological processing of information. Chris Marker's work has been selected because its modes of production and discourse allow a broad discussion of the so-called new technologies and of the world they represent in the new hybrid space built between the visual arts, literature, and technology.

This paper studies endogenous mergers of complements with mixed bundling, by allowing both for joint and separate consumption. After merger, partner firms decrease the price of the bundled system. Besides, when markets for individual components are sufficiently important, partner firms raise prices of stand-alone products, exploiting their monopoly power in local markets and making substitute 'mix-and-match' composite products less attractive to consumers. Even though these effects favor the profitability of mergers, merging is not always an equilibrium outcome. The reason is that outsiders respond by cutting their prices to retain their market share, and mergers can be unprofitable when competition is intense. From a welfare analysis, we observe that the number of mergers observed in equilibrium may be either excessive (when markets for individual components are important) or suboptimal (when markets for individual components are less important).

Keywords: complements; merger; mixed bundling; separate consumption
JEL classification: L13; L41; D43

This paper studies a dynamic principal-monitor-agent relation in which a strategic principal delegates the task of monitoring the effort of a strategic agent to a third party. The latter we call the monitor, whose type is initially unknown. Through repeated interaction the agent may learn the monitor's type, and we show that this learning damages the principal's payoffs. Compensation is assumed exogenous, limiting to a great extent the provision of incentives. We circumvent this difficulty by introducing costly replacement strategies: the principal replaces the monitor, thus disrupting the agent's learning. We find that even when replacement costs are zero, if the revealed monitor is strictly preferred by both parties, there is a loss in efficiency due to the impossibility of benefitting from that revelation. Nonetheless, these strategies can partially recover the principal's losses. Additionally, we establish upper and lower bounds on the payoffs that the principal and the agent can achieve. Finally, we characterize the equilibrium strategies under public and private monitoring (with communication) for different cost and impatience levels.

In this study I try to explain the systemic problem of the low economic competitiveness of nuclear energy for the production of electricity by carrying out a biophysical analysis of its production process. Given that neither econometric approaches nor one-dimensional methods of energy analysis are effective, I introduce the concept of biophysical explanation as a quantitative analysis capable of handling the inherent ambiguity associated with the concept of energy. In particular, the quantities of energy considered relevant for the assessment can only be measured and aggregated after having agreed on a pre-analytical definition of a grammar characterizing a given set of finite transformations. Using this grammar it becomes possible to provide a biophysical explanation for the low economic competitiveness of nuclear energy in the production of electricity. When comparing the various unit operations of the process of production of electricity with nuclear energy to the analogous unit operations of the process of production of fossil energy, we see that the various phases of the process are the same. The only difference relates to the characteristics of the process associated with the generation of heat, which are completely different in the two systems. Since the cost of production of fossil energy provides the baseline of economic competitiveness of electricity, the (lack of) economic competitiveness of the production of electricity from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. In particular, the analysis focuses on fossil-fuel requirements and labor requirements for those phases that both nuclear plants and fossil energy plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes.
By adopting this approach, it becomes possible to explain the systemic low economic competitiveness of nuclear energy in the production of electricity, because of: (i) its dependence on oil, limiting its possible role as a carbon-free alternative; (ii) the choices made in relation to its fuel cycle, especially whether it includes reprocessing operations or not; (iii) the unavoidable uncertainty in the definition of the characteristics of its process; (iv) its large inertia (lack of flexibility) due to issues of time scale; and (v) its low power level.