915 results for constraint rules


Relevance:

20.00%

Publisher:

Abstract:

We present two new constraint qualifications (CQs) that are weaker than the recently introduced relaxed constant positive linear dependence (RCPLD) CQ. RCPLD is based on the assumption that many subsets of the gradients of the active constraints preserve positive linear dependence locally. A major open question was to identify the exact set of gradients whose properties had to be preserved locally and that would still work as a CQ. This is done in the first new CQ, which we call the constant rank of the subspace component (CRSC) CQ. This new CQ also preserves many of the good properties of RCPLD, such as local stability and the validity of an error bound. We also introduce an even weaker CQ, called the constant positive generator (CPG), which can replace RCPLD in the analysis of the global convergence of algorithms. We close this work by extending convergence results of algorithms belonging to all the main classes of nonlinear optimization methods: sequential quadratic programming, augmented Lagrangians, interior point algorithms, and inexact restoration.
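For context, a minimal sketch (standard textbook background, not taken from the paper) of the nonlinear programming setting in which such constraint qualifications are stated:

    \min_{x \in \mathbb{R}^n} f(x)
    \quad \text{s.t.} \quad h_i(x) = 0,\; i = 1,\dots,m,
    \qquad g_j(x) \le 0,\; j = 1,\dots,p.

A constraint qualification is a condition on the gradients of the active constraints at a feasible point x*,

    \{\nabla h_i(x^*)\}_{i=1}^{m} \;\cup\; \{\nabla g_j(x^*) : g_j(x^*) = 0\},

guaranteeing that every local minimizer satisfies the Karush-Kuhn-Tucker conditions; as described in the abstract, RCPLD, CRSC and CPG are progressively weaker requirements on how the (positive) linear dependence or rank of subsets of this family behaves in a neighborhood of x*.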

Relevance:

20.00%

Publisher:

Abstract:

We study general properties of the Landau-gauge Gribov ghost form factor σ(p²) for SU(Nc) Yang-Mills theories in the d-dimensional case. We find a qualitatively different behavior for d = 3, 4 with respect to the d = 2 case. In particular, considering any (sufficiently regular) gluon propagator D(p²) and the one-loop-corrected ghost propagator, we prove in the 2d case that the function σ(p²) blows up in the infrared limit p → 0 as −D(0) ln(p²). Thus, for d = 2, the no-pole condition σ(p²) < 1 (for p² > 0) can be satisfied only if the gluon propagator vanishes at zero momentum, that is, D(0) = 0. On the contrary, in d = 3 and 4, σ(p²) is finite also if D(0) > 0. The same results are obtained by evaluating the ghost propagator G(p²) explicitly at one loop, using fitting forms for D(p²) that describe well the numerical data of the gluon propagator in two, three and four space-time dimensions in the SU(2) case. These evaluations also show that, if one considers the coupling constant g² as a free parameter, the ghost propagator admits a one-parameter family of behaviors (labeled by g²), in agreement with previous works by Boucaud et al. In this case the condition σ(0) ≤ 1 implies g² ≤ gc², where gc² is a "critical" value. Moreover, a free-like ghost propagator in the infrared limit is obtained for any value of g² smaller than gc², while for g² = gc² one finds an infrared-enhanced ghost propagator. Finally, we analyze the Dyson-Schwinger equation for σ(p²) and show that, for infrared-finite ghost-gluon vertices, one can bound the ghost form factor σ(p²). Using these bounds we find again that only in the d = 2 case does one need to impose D(0) = 0 in order to satisfy the no-pole condition. The d = 2 result is also supported by an analysis of the Dyson-Schwinger equation using a spectral representation for the ghost propagator. Thus, if the no-pole condition is imposed, solving the d = 2 Dyson-Schwinger equations cannot lead to a massive behavior for the gluon propagator. These results apply to any Gribov copy inside the so-called first Gribov horizon; i.e., the 2d result D(0) = 0 is not affected by Gribov noise. These findings are also in agreement with lattice data.
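In compact form, the relations discussed above read (the one-loop expression of the ghost propagator in terms of σ is the standard form and is assumed here):

    G(p^2) \simeq \frac{1}{p^2\,[\,1 - \sigma(p^2)\,]}, \qquad \text{no-pole condition: } \sigma(p^2) < 1 \ \ (p^2 > 0),

    d = 2: \quad \sigma(p^2) \sim -\,D(0)\,\ln(p^2) \ \ \text{as } p \to 0 \;\Longrightarrow\; D(0) = 0.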

Relevance:

20.00%

Publisher:

Abstract:

Background: A popular model for gene regulatory networks is the Boolean network model. In this paper, we propose an algorithm to analyze gene regulatory interactions using the Boolean network model and time-series data. The Boolean networks considered are restricted, in the sense that only a subset of all possible Boolean functions is allowed. We exploit some mathematical properties of these restricted Boolean networks in order to avoid a full search. The problem is modeled as a Constraint Satisfaction Problem (CSP), and CSP techniques are used to solve it. Results: We applied the proposed algorithm to two data sets. First, we used an artificial dataset obtained from a model of the budding yeast cell cycle. The second data set is derived from experiments performed using HeLa cells. The results show that some interactions can be fully, or at least partially, determined under the Boolean model considered. Conclusions: The proposed algorithm can be used as a first step in the detection of gene/protein interactions. It is able to infer gene relationships from time-series data of gene expression, and this inference process can be aided by available a priori knowledge.
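As an illustration of the underlying idea (a hedged sketch, not the paper's algorithm: the paper uses CSP techniques precisely to avoid this kind of exhaustive search, and the restricted function class chosen here, AND/OR of possibly negated inputs, is an assumption), the following enumerates the regulator sets consistent with an observed time series:

    from itertools import combinations, product

    def consistent_predictors(series, target, max_inputs=2):
        # series: list of 0/1 state tuples over genes 0..n-1 at successive time points
        # returns the (inputs, signs, op) triples that reproduce the target gene's
        # next value at every observed transition
        n = len(series[0])
        transitions = list(zip(series, series[1:]))
        found = []
        for k in range(1, max_inputs + 1):
            for inputs in combinations(range(n), k):
                for signs in product((1, -1), repeat=k):
                    for op in (all, any):   # the restricted class: AND / OR of literals
                        def f(state, inputs=inputs, signs=signs, op=op):
                            lits = [(state[i] if s > 0 else 1 - state[i]) == 1
                                    for i, s in zip(inputs, signs)]
                            return int(op(lits))
                        if all(f(prev) == nxt[target] for prev, nxt in transitions):
                            found.append((inputs, signs, op.__name__))
        return found

    # toy example: gene 2 behaves like AND(gene 0, NOT gene 1)
    series = [(1, 0, 0), (1, 0, 1), (1, 1, 1), (1, 1, 0), (0, 1, 0)]
    print(consistent_predictors(series, target=2))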

Relevance:

20.00%

Publisher:

Abstract:

We use QCD sum rules to study the recently observed charmonium-like structure Zc⁺(3900) as a tetraquark state. We evaluate the three-point function and extract the coupling constants of the Zc⁺ J/ψ π⁺, Zc⁺ ηc ρ⁺ and Zc⁺ D⁺D̄∗0 vertices and the corresponding decay widths in these channels. The results obtained are in good agreement with the experimental data and support the tetraquark picture of this state.

Relevance:

20.00%

Publisher:

Abstract:

We study, using the QCD sum rule framework, the possible existence of a charmed pentaquark that we call Θc(3250). On the QCD side we work at leading order in αs and consider condensates up to dimension 10. The mass obtained, mΘc = (3.21 ± 0.13) GeV, is compatible with the mass of the structure seen by the BaBar Collaboration in the decay channel B⁻ → p̄ Σc⁺⁺ π⁻π⁻.

Relevance:

20.00%

Publisher:

Abstract:

Conservatism is a central theme of organismic evolution. Related species share characteristics due to their common ancestry. Concerns have been raised among evolutionary biologists as to whether such conservatism is an expression of natural selection or of a constrained ability to adapt. This thesis explores adaptations and constraints within the plant reproductive phase, particularly in relation to the evolution of fleshy fruit types (berries, drupes, etc.) and the seasonal timing of flowering and fruiting. The different studies were arranged along a hierarchy of scale, with general data sets sampled among seed plants at the global scale, through more specific analyses of character evolution within the genus Rhamnus s.l. L. (Rhamnaceae), to descriptive and experimental field studies in a local population of Frangula alnus (Rhamnaceae). Apart from the field study, this thesis is mainly based on comparative methods explicitly incorporating phylogenetic relationships. The comparative study of Rhamnus s.l. species included the reconstruction of phylogenetic hypotheses based on DNA sequences. Among geographically overlapping sister clades, biotic pollination was not correlated with higher species richness when compared to wind-pollinated plants. Among woody plants, clades characterized by fleshy fruit types were more species-rich than their dry-fruited sister clades, suggesting that the fleshy fruit is a key innovation in woody habitats. Moreover, the evolution of fleshy fruits was correlated with a change to more closed (darker) habitats. An independent contrast study within Rhamnus s.l. documented allometric relations between plant and fruit size. As a phylogenetic constraint, however, allometric effects must be considered weak or non-existent, as they did not prevail among different subclades within Rhamnus s.l. Fruit size was correlated with seed size and seed number in F. alnus. This thesis suggests that frugivore selection on fleshy fruits may be important in constraining the upper limit of fruit size when a plant lineage colonizes (darker) habitats where larger seeds are adaptive. Phenological correlations with fruit set, dispersal, and seed size in F. alnus suggested that the evolution of reproductive phenology is constrained by trade-offs and partial interdependences between flowering, fruiting, dispersal, and recruitment phases. Phylogenetic constraints on the evolution of phenology were indicated by a lack of correlation between flowering time and seasonal length within Rhamnus cathartica and F. alnus, respectively. On the other hand, flowering time was correlated with seasonal length among Rhamnus s.l. species. Phenological differences between biotically pollinated and wind-pollinated angiosperms also suggested adaptive change in reproductive phenology.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis is to go through different approaches for proving expressiveness properties of several concurrent languages. We analyse four different calculi, exploiting a different technique for each one. We begin with the analysis of a synchronous language: we explore the expressiveness of a fragment of CCS! (a variant of Milner's CCS where replication is considered instead of recursion) with respect to the existence of faithful encodings (i.e. encodings that respect the behaviour of the encoded model without introducing unnecessary computations) of models of computability strictly less expressive than Turing machines, namely grammars of types 1, 2 and 3 in the Chomsky hierarchy. We then move to asynchronous languages and study full abstraction for two Linda-like languages. Linda can be considered as the asynchronous version of CCS plus a shared memory (a multiset of elements) that is used for storing messages. After having defined a denotational semantics based on traces, we obtain fully abstract semantics for both languages by using suitable abstractions that identify different traces which do not correspond to different behaviours. Since the ability of one of the two variants to recognise multiple occurrences of messages in the store (which accounts for an increase in expressiveness) is reflected in a less complex abstraction, we then study other languages where multiplicity plays a fundamental role. We consider the language CHR (Constraint Handling Rules), a language which uses multi-headed (guarded) rules. We prove that multiple heads augment the expressive power of the language: if we restrict to rules whose head contains at most n atoms, we obtain a hierarchy of languages with increasing expressiveness (i.e. the CHR language allowing at most n atoms in the heads is more expressive than the language allowing at most m atoms, with m < n). Finally, we consider a calculus based on rewriting rules; depending on the shape of the rewriting rules, several dialects of the calculus can be obtained. We analyse the expressive power of some of these dialects by focusing on decidability and undecidability of problems like reachability and coverability.
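To illustrate what a multi-headed rule buys (a hedged toy sketch in Python, not CHR syntax or any of the calculi above): a rule that matches two atoms num(X), num(Y) with X ≤ Y and removes num(Y) computes the minimum of a store, something a single-headed rule, which sees one atom at a time, cannot express directly.

    def min_via_two_headed_rule(values):
        # the constraint store: a multiset of atoms ('num', v)
        store = [('num', v) for v in values]
        fired = True
        while fired:
            fired = False
            for i in range(len(store)):
                for j in range(len(store)):
                    if i == j:
                        continue
                    (_, x), (_, y) = store[i], store[j]
                    if x <= y:            # guard: the rule fires only when X <= Y
                        del store[j]      # num(Y) is consumed, num(X) is kept
                        fired = True
                        break
                if fired:
                    break
        return store                      # a single atom num(min) survives

    print(min_via_two_headed_rule([7, 3, 9, 3]))   # [('num', 3)]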

Relevance:

20.00%

Publisher:

Abstract:

The work presented in this thesis investigates the application of learning techniques aimed at a more efficient execution of a portfolio of constraint solvers. A constraint solver is a program that, given a constraint problem as input, computes a solution using a variety of techniques. Constraint problems are pervasive in real life: organizing train journeys or scheduling the crews of an airline are all constraint problems. A constraint problem is formalized as a Constraint Satisfaction Problem (CSP). A CSP is described by a set of variables, each taking values from a specific domain, and a set of constraints relating the variables and the values they may assume. One technique for optimizing the resolution of such problems is the portfolio approach. This technique, also used in fields such as economics, combines several solvers which together can produce better results than a single-solver approach. In this work we develop a new technique that combines a portfolio of constraint solvers with machine learning. Machine learning is a field of artificial intelligence whose goal is to give machines a form of "intelligence"; a typical application is to learn from past cases of a problem and use them to make future choices, a process also found in human cognition. Specifically, we reason in terms of classification: a classification assigns to a set of input features a discrete output value, such as true or false when an email is classified as spam or not. The learning phase is carried out using part of CPHydra, a portfolio of constraint solvers developed at University College Cork (UCC). Of this portfolio algorithm we use only the characteristics employed to describe certain aspects of one CSP with respect to another; these characteristics are also called features. We then build a series of classifiers based on the specific behaviour of the solvers. The combination of these classifiers with the portfolio approach serves to assess whether CPHydra's features are good and whether the classifiers based on them are reliable. To justify the first result, we perform a comparison with one of the best state-of-the-art portfolios, SATzilla. Once the quality of the features used for classification has been established, we solve the problems by simulating a scheduler. These simulations test different rules built from the previously introduced classifiers, first in a single-processor scenario and then in a multi-processor one. In these experiments we verify that the performance obtained by applying the rules built on the classifiers is better than an execution limited to the best solver of the portfolio. This thesis work was carried out in collaboration with the 4C research centre at University College Cork. An article based on this work has been written and submitted to the International Joint Conference on Artificial Intelligence (IJCAI) 2011.
At the time of submission of the thesis we had not yet been informed of the acceptance of the article; however, the reviewers' comments indicated that the presented method is interesting.
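A hedged sketch of the kind of pipeline described above (hypothetical data layout and solver names; this is not CPHydra's code): one binary classifier per solver predicts from the CSP features whether that solver finishes within the timeout, and the predictions are turned into a naive schedule.

    from sklearn.tree import DecisionTreeClassifier

    def train_per_solver_classifiers(features, solved_within_timeout):
        # features: list of feature vectors, one per training instance
        # solved_within_timeout: dict mapping solver name -> list of 0/1 labels
        classifiers = {}
        for solver, labels in solved_within_timeout.items():
            clf = DecisionTreeClassifier(max_depth=4)
            clf.fit(features, labels)
            classifiers[solver] = clf
        return classifiers

    def schedule(classifiers, instance_features, timeout):
        # split the timeout evenly among the solvers predicted to succeed;
        # fall back to the whole portfolio if no solver is predicted to succeed
        predicted = [s for s, clf in classifiers.items()
                     if clf.predict([instance_features])[0] == 1]
        chosen = predicted or list(classifiers)
        return [(s, timeout / len(chosen)) for s in chosen]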

Relevance:

20.00%

Publisher:

Abstract:

The work presented in this thesis is set in the context of constraint programming, a paradigm for modelling and solving combinatorial search problems that require finding solutions in the presence of constraints. A large class of these problems is naturally formulated in the language of set variables. Since the domain of such variables can be exponential in the number of elements, an explicit representation is often impractical. Recent studies have therefore focused on finding efficient ways to represent these variables. The usual approach is to represent such domains by interval-based approximations (henceforth, representations), specified by a lower and an upper bound under an appropriate ordering relation. The recent evolution of research on constraint programming over sets has clearly shown that combining different representations yields performance that is orders of magnitude better than traditional encoding techniques, and numerous proposals have been made in this direction. These works differ in how consistency between the different representations is maintained and in how constraints are propagated in order to reduce the search space. Unfortunately, no formal tool exists to compare these combinations. The main goal of this work is to provide such a tool, in which we precisely define the notion of combination of representations, bringing out the common aspects that have characterized previous works. In particular, we identify two possible kinds of combination, a strong one and a weak one, defining the notions of bound consistency on constraints and of synchronization between representations. Our study offers some interesting insights into existing combinations, highlighting their limits and revealing a few surprises. We also provide a complexity analysis of the synchronization between minlex, a representation able to propagate lexicographic constraints optimally, and the main existing representations.
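As a concrete illustration of one such representation (a hedged sketch, not the thesis's framework): the classic subset-bounds view of a set variable keeps a required lower bound and a possible upper bound, and a propagator for X ⊆ Y only touches those bounds.

    class SetVar:
        def __init__(self, lb, ub):
            self.lb = set(lb)            # elements that must be in the set
            self.ub = set(ub)            # elements that may be in the set
            assert self.lb <= self.ub, "empty domain"

        def __repr__(self):
            return f"[{sorted(self.lb)} .. {sorted(self.ub)}]"

    def propagate_subset(x, y):
        # enforce bounds consistency on X ⊆ Y: everything required in X is
        # required in Y, and X cannot contain anything impossible for Y
        y.lb |= x.lb
        x.ub &= y.ub
        if not (x.lb <= x.ub and y.lb <= y.ub):
            raise ValueError("inconsistent: X ⊆ Y has no solution")

    x = SetVar(lb={1}, ub={1, 2, 3, 4})
    y = SetVar(lb={2}, ub={1, 2, 3})
    propagate_subset(x, y)
    print(x, y)   # X loses 4 from its upper bound, Y must now contain 1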

Relevance:

20.00%

Publisher:

Abstract:

This doctoral thesis examines the use of liability rules to protect patent entitlements, focusing on a specific type of rule, termed ex post because it is designed and applied ex post by a court or an agency. The research starts from the premise that patents are defined by legal and economic scholarship as exclusive rights; nevertheless, under certain circumstances there are economic as well as other compelling reasons to transform the exclusiveness of patent rights into a right to receive compensation.

Relevance:

20.00%

Publisher:

Abstract:

This work presents hybrid Constraint Programming (CP) and metaheuristic methods for the solution of large-scale optimization problems; it aims at integrating concepts and mechanisms from metaheuristic methods into a CP-based tree search environment in order to exploit the advantages of both approaches. The modeling and solution of large-scale combinatorial optimization problems is a topic which has attracted the interest of many researchers in the Operations Research field; combinatorial optimization problems are widespread in everyday life and the need to solve difficult problems is more and more pressing. Metaheuristic techniques have been developed over the last decades to effectively handle the approximate solution of combinatorial optimization problems; we examine metaheuristics in detail, focusing on the aspects common to different techniques. Each metaheuristic approach has its own peculiarities in designing and guiding the solution process; our work aims at recognizing components which can be extracted from metaheuristic methods and reused in different contexts. In particular we focus on the possibility of porting metaheuristic elements to constraint-programming-based environments, as constraint programming is able to deal with the feasibility issues of optimization problems in a very effective manner. Moreover, CP offers a general paradigm which makes it easy to model any type of problem and solve it with a problem-independent framework, unlike local search and metaheuristic methods, which are highly problem-specific. In this work we describe the implementation of the Local Branching framework, originally developed for Mixed Integer Programming, in a CP-based environment. CP-specific features are used to ease the search process while maintaining the full generality of the approach. We also propose a search strategy called Sliced Neighborhood Search (SNS), which iteratively explores slices of large neighborhoods of an incumbent solution by performing CP-based tree search and incorporates concepts from metaheuristic techniques. SNS can be used as a stand-alone search strategy, but it can alternatively be embedded in existing strategies as an intensification and diversification mechanism; in particular, we show its integration within CP-based local branching. We provide an extensive experimental evaluation of the proposed approaches on instances of the Asymmetric Traveling Salesman Problem and of the Asymmetric Traveling Salesman Problem with Time Windows. The proposed approaches achieve good results on problems of practical size, thus demonstrating the benefit of integrating metaheuristic concepts into CP-based frameworks.
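For reference, the local branching constraint originally proposed for MIP (assuming 0-1 variables x and an incumbent x̄; the CP adaptation in the thesis may differ in detail) restricts the search to a Hamming ball of radius k around the incumbent:

    \Delta(x, \bar{x}) \;=\; \sum_{j \in S} (1 - x_j) \;+\; \sum_{j \notin S} x_j \;\le\; k,
    \qquad S = \{\, j : \bar{x}_j = 1 \,\}.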

Relevance:

20.00%

Publisher:

Abstract:

We investigate the benefits that emerge when the fields of constraint programming and concurrency meet. On one hand, constraints can be used in concurrency theory to increase the conciseness and the expressive power of concurrent languages from a pragmatic point of view. On the other hand, problems modeled using constraints can be solved faster and more efficiently by a concurrent system. We explore both directions, providing two separate lines of contribution. First, we study the expressive power of a concurrent language, namely Constraint Handling Rules, that supports constraints as a primitive construct, and we show which features of this language make it Turing powerful. We then propose a framework for solving constraint problems that is intended to be deployed on a concurrent system. For the development of this framework we used the concurrent language Jolie, following the Service-Oriented paradigm. Based on this experience, we also propose an extension to service-oriented languages to overcome some of their limitations and to improve the development of concurrent applications.

Relevance:

20.00%

Publisher:

Abstract:

This work presents exact algorithms for Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic scheduling problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (implementing scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns time and resource assignment to a set of activities, to be repeated indefinitely, subject to precedence and resource capacity constraints. In this work we present two constraint programming frameworks facing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, where the allocation problem involves unary resources. Instances are described through the Synchronous Data-flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-Flow graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for the computation of a maximum-throughput mapping of applications, specified as SDF graphs, onto multi-core architectures. Results show that the approach can handle realistic instances in terms of size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a Constraint Programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion. Conversely, our technique is based on a non-linear model and tackles the problem as a whole: the period value is inferred from the scheduling decisions. The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on problems of practical size.
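A common textbook form of a cyclic precedence constraint (a hedged sketch assuming start times s, durations d, iteration distances ω and period λ; the modular precedence constraint introduced in the thesis may be formulated differently) is:

    s_j \;+\; \lambda\,\omega_{ij} \;\ge\; s_i + d_i \qquad \text{for every precedence } (i, j),

where, as noted above, treating the period λ as a decision variable rather than a fixed parameter is what makes the model non-linear.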

Relevance:

20.00%

Publisher:

Abstract:

Recent research has shown that a single, arbitrarily efficient algorithm can be significantly outperformed by a portfolio of (possibly on-average slower) algorithms. Within the Constraint Programming (CP) context, a portfolio solver can be seen as a particular constraint solver that exploits the synergy between the constituent solvers of its portfolio to predict which is (or which are) the best solver(s) to run on a new, unseen instance. In this thesis we examine the benefits of portfolio solvers in CP. Although portfolio approaches have been extensively studied for Boolean Satisfiability (SAT) problems, in the more general CP field these techniques have only been marginally studied and used. We conducted this work through the investigation, analysis and construction of several portfolio approaches for solving both satisfaction and optimization problems. We focused in particular on sequential approaches, i.e., single-threaded portfolio solvers always running on the same core. We started from a first empirical evaluation of portfolio approaches for solving Constraint Satisfaction Problems (CSPs), and then improved on it by introducing new data, solvers, features, algorithms, and tools. Afterwards, we addressed the more general Constraint Optimization Problems (COPs) by implementing and testing a number of models for COP portfolio solvers. Finally, we came full circle by developing sunny-cp: a sequential CP portfolio solver that also turned out to be competitive in the MiniZinc Challenge, the reference competition for CP solvers.
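A hedged sketch of a k-nearest-neighbour selection in the spirit of the approaches above (hypothetical data layout; not the actual sunny-cp algorithm): greedily pick the solvers that together solve the largest number of the k training instances closest to the new one.

    import math

    def knn_portfolio(train_features, solved_by, new_features, k=10):
        # train_features: list of numeric feature vectors
        # solved_by: list of sets of solver names, one set per training instance
        neighbours = sorted(range(len(train_features)),
                            key=lambda i: math.dist(train_features[i], new_features))[:k]
        uncovered, selection = set(neighbours), []
        while uncovered:
            candidates = {s for i in uncovered for s in solved_by[i]}
            if not candidates:
                break                     # nobody solves the remaining neighbours
            best = max(candidates,
                       key=lambda s: sum(s in solved_by[i] for i in uncovered))
            selection.append(best)
            uncovered -= {i for i in uncovered if best in solved_by[i]}
        return selection                  # solvers to run on the new instance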