14 results for problem representation


Relevance: 20.00%

Abstract:

The lack of stability in some matching problems suggests that alternative solution concepts to the core might be applied to find predictable matchings. We propose the absorbing sets as a solution for the class of roommate problems with strict preferences. This solution, which always exists, either gives the matchings in the core or predicts some other matchings when the core is empty. Furthermore, it satisfies an interesting property of outer stability. We also characterize the absorbing sets, determine their number and, in the case of multiplicity, find that they all share a similar structure.
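As an illustrative aside rather than the paper's formal characterization, an absorbing set can be pictured on the dominance digraph whose nodes are matchings and whose arc u -> v records that v dominates u: the absorbing sets are then the sink strongly connected components, i.e. sets of mutually reachable matchings that no dominance path leaves. A minimal sketch under that picture, assuming the dominance relation is handed over as an edge list:

```python
# Illustrative sketch: absorbing sets of a dominance digraph.
# Nodes are matchings; an edge u -> v means v dominates u.  An absorbing
# set is a sink strongly connected component: its members are mutually
# reachable and no dominance path leaves the set.
import networkx as nx

def absorbing_sets(dominance_edges):
    g = nx.DiGraph(dominance_edges)
    cond = nx.condensation(g)            # DAG of strongly connected components
    sinks = [n for n in cond.nodes if cond.out_degree(n) == 0]
    return [set(cond.nodes[n]["members"]) for n in sinks]

# Toy, hypothetical dominance relation among matchings m1..m4:
print(absorbing_sets([("m1", "m2"), ("m2", "m3"), ("m3", "m2"), ("m1", "m4")]))
# -> e.g. [{'m2', 'm3'}, {'m4'}] (order may vary)
```

In this toy picture an undominated (core) matching shows up as a singleton sink, which is consistent with the solution reducing to the core whenever the core is non-empty.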

Relevance: 20.00%

Abstract:

Contributed to: Fusion of Cultures: XXXVIII Annual Conference on Computer Applications and Quantitative Methods in Archaeology – CAA2010 (Granada, Spain, Apr 6-9, 2010)

Relevance: 20.00%

Abstract:

In this paper we introduce four scenario Cluster based Lagrangian Decomposition (CLD) procedures for obtaining strong lower bounds on the optimal solution value of two-stage stochastic mixed 0-1 problems. At each iteration of the Lagrangian-based procedures, the traditional aim is to obtain the value of the corresponding Lagrangian dual by solving scenario submodels once the nonanticipativity constraints have been dualized. Instead of considering a splitting-variable representation over the set of scenarios, we propose to decompose the model into a set of scenario clusters. We compare the computational performance of four Lagrange multiplier updating procedures, namely the Subgradient Method, the Volume Algorithm, the Progressive Hedging Algorithm and the Dynamic Constrained Cutting Plane scheme, for different numbers of scenario clusters and different dimensions of the original problem. Our computational experience shows that the CLD bound and its computational effort depend on the number of scenario clusters considered. In any case, our results show that the CLD procedures outperform the traditional LD scheme for single scenarios, both in the quality of the bounds and in computational effort. All the procedures have been implemented in an experimental C++ code. A broad computational experience is reported for a testbed of randomly generated instances, using the MIP solvers COIN-OR and CPLEX for the auxiliary mixed 0-1 cluster submodels, the latter solver being run within the open source engine COIN-OR. We also give computational evidence of the model-tightening effect that preprocessing techniques, cut generation and appending, and parallel computing tools have in stochastic integer optimization. Finally, we have observed that the plain use of either solver does not provide the optimal solution of the instances in the testbed within an affordable elapsed time, except for two toy instances. In contrast, the proposed procedures provide strong lower bounds (or the same solution value) in a considerably shorter elapsed time than the quasi-optimal solution obtained by other means for the original stochastic problem.
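To make one of the four updating schemes above concrete, the sketch below shows a plain Subgradient Method step on the multipliers of the dualized nonanticipativity constraints, here written cyclically as x_c - x_{c+1} = 0 between consecutive scenario clusters. It is only a hedged outline under those assumptions: `solve_cluster_submodel` is a hypothetical callback returning each cluster's optimal value and its first-stage copy (as a NumPy vector), and nothing here reproduces the authors' C++ implementation.

```python
# Hedged sketch of a subgradient update for a cluster-based Lagrangian dual.
import numpy as np

def subgradient_cld(solve_cluster_submodel, n_clusters, n_x, iters=100, step0=1.0):
    mu = np.zeros((n_clusters, n_x))                 # multipliers of x_c - x_{c+1} = 0
    best_lower_bound = -np.inf
    for k in range(iters):
        values, copies = zip(*(solve_cluster_submodel(c, mu) for c in range(n_clusters)))
        best_lower_bound = max(best_lower_bound, sum(values))   # dual value = lower bound
        # Subgradient: residuals of the dualized nonanticipativity constraints.
        g = np.array([copies[c] - copies[(c + 1) % n_clusters]
                      for c in range(n_clusters)])
        if not g.any():
            break                                    # all copies agree: dual optimum reached
        mu += (step0 / (k + 1)) * g                  # diminishing step-length rule
    return best_lower_bound
```

The Volume Algorithm, Progressive Hedging and the Dynamic Constrained Cutting Plane scheme compared in the paper differ precisely in how this multiplier update is performed.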

Relevance: 20.00%

Abstract:

The digital management of collections in museums, archives, libraries and galleries is an increasingly important part of cultural heritage studies. This paper describes a representation for folk song metadata, based on the Web Ontology Language (OWL) implementation of the CIDOC Conceptual Reference Model. The OWL representation facilitates encoding and reasoning over a genre ontology, while the CIDOC model enables a representation of complex spatial containment and proximity relations among geographic regions. It is shown how complex queries of folk song metadata, relying on inference and not only retrieval, can be expressed in OWL and solved using a description logic reasoner.
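As a toy illustration of the kind of inference-backed query described above, and not of the paper's actual ontology, the sketch below poses a genre query with rdflib: it retrieves songs whose genre is a given class or any of its subclasses. All names (ex:Ballad, ex:hasGenre, ...) are made up, and the subclass inference is approximated with a SPARQL property path; in the paper the reasoning over the OWL/CIDOC CRM representation is delegated to a description logic reasoner instead.

```python
# Toy sketch: a genre query that needs subclass inference, approximated
# with a SPARQL property path over rdfs:subClassOf.
from rdflib import Graph

TURTLE = """
@prefix ex:   <http://example.org/folk#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
ex:Ballad        rdfs:subClassOf ex:NarrativeSong .
ex:NarrativeSong rdfs:subClassOf ex:FolkSong .
ex:song1 ex:hasGenre ex:Ballad .
ex:song2 ex:hasGenre ex:FolkSong .
"""

g = Graph()
g.parse(data=TURTLE, format="turtle")

QUERY = """
PREFIX ex:   <http://example.org/folk#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?song WHERE {
    ?song ex:hasGenre ?genre .
    ?genre rdfs:subClassOf* ex:FolkSong .   # FolkSong or any transitive subgenre
}
"""
for row in g.query(QUERY):
    print(row.song)        # both ex:song1 and ex:song2 match
```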

Relevance: 20.00%

Abstract:

The Linear Ordering Problem (LOP) is a popular combinatorial optimisation problem which has been extensively addressed in the literature. However, in spite of its popularity, little is known about the characteristics of this problem. This paper studies a procedure to extract static information from an instance of the problem, and proposes a method to incorporate the obtained knowledge in order to improve the performance of local search-based algorithms. The proposed procedure identifies the positions to which indexes cannot be moved to generate local optima for the insert neighbourhood, and hence cannot yield globally optimal solutions either. This information is then used to define a restricted insert neighbourhood that discards the insert operations moving indexes to positions where optimal solutions cannot be generated. In order to measure the efficiency of the proposed restricted insert neighbourhood, two state-of-the-art algorithms for the LOP that include local search procedures have been modified. The experiments conducted confirm that the restricted versions of the algorithms systematically outperform the classical designs. The statistical tests included in the experimentation report significant differences in all cases, which validates the efficiency of our proposal.
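To make the restricted neighbourhood concrete, here is a hedged sketch of the insert move for the LOP with an optional set of forbidden (index, position) pairs. The set itself merely stands in for the paper's static rule deciding which positions cannot yield optima; that rule is not reproduced here.

```python
# Hedged sketch: insert neighbourhood for the Linear Ordering Problem
# (maximize the sum of matrix entries above the diagonal under a
# permutation), with forbidden (index, position) pairs standing in for
# the paper's restriction rule.
import itertools

def lop_value(C, perm):
    return sum(C[perm[i]][perm[j]]
               for i, j in itertools.combinations(range(len(perm)), 2))

def insert(perm, i, j):
    """Remove the index at position i and reinsert it at position j."""
    p = list(perm)
    p.insert(j, p.pop(i))
    return p

def best_insert_neighbour(C, perm, forbidden=frozenset()):
    """Best neighbour under the (possibly restricted) insert neighbourhood."""
    best, best_val = list(perm), lop_value(C, perm)
    for i, j in itertools.permutations(range(len(perm)), 2):
        if (perm[i], j) in forbidden:        # restricted neighbourhood: skip the move
            continue
        cand = insert(perm, i, j)
        val = lop_value(C, cand)
        if val > best_val:
            best, best_val = cand, val
    return best, best_val
```

Pruning moves in this way shrinks the neighbourhood scanned at each step of the local search, consistent with the performance gains reported above.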

Relevance: 20.00%

Abstract:

Feasible tomography schemes for large particle numbers must possess, besides an appropriate data acquisition protocol, an efficient way to reconstruct the density operator from the observed finite data set. Since state reconstruction typically requires the solution of a nonlinear large-scale optimization problem, this is a major challenge in the design of scalable tomography schemes. Here we present an efficient state reconstruction scheme for permutationally invariant quantum state tomography. It works for all common state-of-the-art reconstruction principles, including, in particular, the maximum likelihood and least squares methods, which are the preferred choices in today's experiments. This high efficiency is achieved by greatly reducing the dimensionality of the problem, employing a particular representation of permutationally invariant states known from spin coupling, combined with convex optimization, which has clear advantages regarding speed, control and accuracy in comparison to commonly employed numerical routines. First prototype implementations easily allow the reconstruction of a state of 20 qubits in a few minutes on a standard computer.
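The convex-optimization side can be illustrated with a deliberately small, hedged toy: a constrained least-squares reconstruction of a full 2-qubit density matrix with cvxpy. The scalability claimed above comes from the block-diagonal spin-coupling representation of permutationally invariant states, which this toy omits; the measurement model below is likewise invented for illustration.

```python
# Hedged toy: least-squares state reconstruction as a convex program.
import numpy as np
import cvxpy as cp

d = 4                                    # 2 qubits, full density matrix
rng = np.random.default_rng(0)

# Invented measurement model: noisy expectation values of a few observables.
observables = [np.eye(d)] + [np.diag(rng.standard_normal(d)) for _ in range(6)]
true_rho = np.full((d, d), 1.0 / d)      # a permutation-symmetric toy state
data = [np.real(np.trace(O @ true_rho)) + 1e-3 * rng.standard_normal()
        for O in observables]

rho = cp.Variable((d, d), hermitian=True)
residuals = [cp.real(cp.trace(O @ rho)) - y for O, y in zip(observables, data)]
problem = cp.Problem(
    cp.Minimize(cp.sum_squares(cp.hstack(residuals))),
    [rho >> 0, cp.real(cp.trace(rho)) == 1],   # physicality: PSD, unit trace
)
problem.solve()
print(np.round(rho.value, 3))
```

In the scheme described above, the analogous convex program acts on the much smaller spin-coupling blocks rather than the full density matrix, which is what makes 20-qubit reconstructions feasible.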

Relevance: 20.00%

Abstract:

157 p.

Relevance: 20.00%

Abstract:

The aim of this paper is to propose a new solution for the roommate problem with strict preferences. We introduce the solution of maximum irreversibility and consider almost stable matchings (Abraham et al. [2]) and maximum stable matchings (Ta [30] [32]). We find that almost stable matchings are incompatible with the other two solutions. Hence, to solve the roommate problem we propose matchings that lie in the intersection of the maximum irreversible matchings and the maximum stable matchings, which we call Q-stable matchings. These matchings are core consistent, and we offer an efficient algorithm for computing one of them. The outcome of the algorithm belongs to an absorbing set.
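Almost stable matchings (Abraham et al. [2]) minimise the number of blocking pairs, so a small hedged helper for that count may clarify the objects being compared; it is not the paper's Q-stable algorithm.

```python
# Hedged helper: count the blocking pairs of a roommate matching under
# strict preferences.  A pair {a, b} blocks a matching if each prefers
# the other to their current situation.
def blocking_pairs(matching, prefs):
    """matching: dict agent -> partner (or None); prefs: dict agent -> ranked list."""
    def prefers(a, b):
        current = matching.get(a)
        if current is None:
            return b in prefs[a]                 # any acceptable partner beats being single
        return prefs[a].index(b) < prefs[a].index(current)

    agents = list(prefs)
    return [(a, b) for i, a in enumerate(agents) for b in agents[i + 1:]
            if matching.get(a) != b and prefers(a, b) and prefers(b, a)]

# Toy 4-agent instance with hypothetical preferences:
prefs = {1: [2, 3, 4], 2: [3, 1, 4], 3: [1, 2, 4], 4: [1, 2, 3]}
matching = {1: 2, 2: 1, 3: 4, 4: 3}
print(blocking_pairs(matching, prefs))           # [(2, 3)]
```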

Relevance: 20.00%

Abstract:

Quantum Computing is a relatively young field which simulates the conditions of quantum computation. Moreover, it can be used to estimate which quasiparticles would endure better in a quantum environment. Topological Quantum Computing (TQC) is an approach for reducing the quantum decoherence problem, which is responsible for the appearance of errors in the representation of information. This project tackles specific instances of TQC problems using MOEAs (Multi-objective Optimization Evolutionary Algorithms). A MOEA is a type of algorithm which optimizes two or more objectives of a problem simultaneously, using a population-based approach. We have implemented MOEAs that use probabilistic procedures found in EDAs (Estimation of Distribution Algorithms), since, in general, EDAs find better solutions than ordinary EAs (Evolutionary Algorithms), even though they are more costly. Both EDAs and MOEAs are population-based algorithms. The objective of this project was to use a multi-objective approach in order to find good solutions for several instances of a TQC problem; in particular, the objectives considered were the error approximation and the length of a solution. The tool we used to solve the instances of the problem was the multi-objective framework PISA. Because PISA has little documentation available, we had to reverse-engineer the framework to understand its modules and the way they communicate with each other. Once its functioning was understood, we began working on a module dedicated to the braid problem. Finally, we submitted this module to an exhaustive experimentation phase and collected the results.
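Given the two objectives named above, error approximation and solution (braid) length, both to be minimised, the core multi-objective notion is Pareto dominance. The sketch below only illustrates that relation; the project itself ran EDA-based MOEAs inside the PISA framework, which is not reproduced here.

```python
# Sketch: non-dominated (Pareto) front for two minimisation objectives,
# here interpreted as (error approximation, braid length).
def dominates(a, b):
    """a dominates b: no worse in every objective and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """population: list of (error, braid_length) tuples."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q != p)]

# Hypothetical candidate braids:
candidates = [(0.10, 40), (0.05, 60), (0.10, 55), (0.02, 90), (0.05, 50)]
print(pareto_front(candidates))   # [(0.10, 40), (0.02, 90), (0.05, 50)]
```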

Relevance: 20.00%

Abstract:

Feature-based vocoders, e.g., STRAIGHT, offer a way to manipulate the perceived characteristics of the speech signal in speech transformation and synthesis. For the harmonic model, which provides excellent perceived quality, features for the amplitude parameters already exist (e.g., Line Spectral Frequencies (LSF), Mel-Frequency Cepstral Coefficients (MFCC)). However, because of the wrapping of the phase parameters, phase features are more difficult to design. To randomize the phase of the harmonic model during synthesis, a voicing feature is commonly used, which distinguishes voiced and unvoiced segments. However, voice production allows smooth transitions between voiced and unvoiced states, which sometimes makes the voicing segmentation tricky to estimate. In this article, two phase features are suggested to represent the phase of the harmonic model in a uniform way, without a voicing decision. The synthesis quality of the resulting vocoder has been evaluated, using subjective listening tests, in the context of resynthesis, pitch scaling, and Hidden Markov Model (HMM)-based synthesis. The experiments show that the suggested signal model is comparable to STRAIGHT, or even better in some scenarios. They also reveal some limitations of the harmonic framework itself in the case of high fundamental frequencies.
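The phase-wrapping difficulty mentioned above can be seen with a few lines of NumPy: two nearly identical phases on either side of the +/- pi boundary average to a meaningless value unless they are treated on the unit circle. This only motivates the problem and is not the article's two phase features.

```python
# Illustration of the phase-wrapping issue that makes naive phase
# features unreliable.
import numpy as np

phases = np.array([np.pi - 0.05, -np.pi + 0.05])    # two nearly identical phases

naive_mean = phases.mean()                           # ~0.0: points the opposite way
circular_mean = np.angle(np.exp(1j * phases).sum())  # ~ +/- pi: the sensible answer

print(f"naive mean:    {naive_mean:+.3f} rad")
print(f"circular mean: {circular_mean:+.3f} rad")
```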

Relevance: 20.00%

Abstract:

In traditional teaching, the fundamental concepts of electromagnetic induction are usually analyzed only briefly, with most of the time spent solving problems in a more or less rote manner. However, physics education research has shown that the fundamental concepts of electromagnetic induction theory are barely understood by students. This article proposes an interactive teaching sequence introducing the topic of electromagnetic induction. The sequence has been designed on the basis of contributions from physics education research. Particular attention is paid to the relationship between experimental findings (macroscopic level) and theoretical interpretation (microscopic level). An example of the activities that have been designed is also presented, describing the implementation context and the corresponding findings. After implementing the sequence, a considerable number of students achieved a more satisfactory grasp of the explanatory model of electromagnetic induction. However, difficulties remain in aspects that require a multilevel explanation, referring to deeper structures at which the system description is better defined.

Relevance: 20.00%

Abstract:

The primary objective of this research was to model different types of problems using linear programming and to apply different methods so as to find an adequate solution to them. To achieve this objective, a linear programming problem and its dual were studied and compared. For that, linear programming techniques were presented and an introduction to duality theory was given, analyzing the dual problem and the duality theorems. Then, a general economic interpretation was given, and optimal dual variables such as shadow prices were studied through the following practical case: an aesthetic surgery hospital wanted to organize its monthly waiting list of four types of surgeries so as to maximize its daily income. To solve this practical case, we modelled the linear programming problem following the relationships between the primal problem and its dual. Additionally, we solved the dual problem graphically, and then found the optimal solution of the practical case through its dual, following the theorems of duality theory. Moreover, we studied how complementary slackness can help to solve linear programming problems. To facilitate the solution, the Solver application of Excel and the WinQSB programme were used.
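A hedged sketch of a primal/dual pair of this kind can be reproduced with SciPy instead of Excel Solver or WinQSB; the coefficients below are invented and are not the hospital data from the study. Strong duality shows up as the two optimal values coinciding, and the dual solution gives the shadow prices of the resources.

```python
# Hedged sketch: a small primal/dual LP pair solved with SciPy.
# Primal:  max c'x  s.t.  A x <= b, x >= 0
# Dual:    min b'y  s.t.  A'y >= c, y >= 0
import numpy as np
from scipy.optimize import linprog

c = np.array([300.0, 500.0])                 # income per surgery type (invented)
A = np.array([[1.0, 2.0],                    # operating-theatre hours used
              [3.0, 2.0]])                   # surgeon hours used
b = np.array([40.0, 60.0])                   # hours available per day

primal = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2, method="highs")
dual = linprog(b, A_ub=-A.T, b_ub=-c, bounds=[(0, None)] * 2, method="highs")

print("primal optimum:", -primal.fun)        # 10500.0
print("dual optimum:  ", dual.fun)           # 10500.0 (strong duality)
print("shadow prices: ", dual.x)             # [225., 25.] per extra hour of each resource
```

Complementary slackness can be checked on the output: each positive shadow price corresponds to a binding primal constraint, and each positive primal variable corresponds to a binding dual constraint.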