83 results for preconditioning saddle point problems
Abstract:
In 2001 Croatia launched an initiative intended to loosen its high degree of centralisation by expanding the mandates of local units and changing the sources of financing for public functions. However, the initial steps in the decentralisation process were not followed by further fiscal decentralisation measures, and consequently its level has remained practically unchanged. This paper sets out to elaborate the main problems and obstacles to the effective implementation of the decentralisation process and to offer three groups of proposals for decentralisation: (i) dividing mandatory powers among the different local units, (ii) changing the financing arrangements, and (iii) modifying the country's territorial division.
Abstract:
We put together the different conceptual issues involved in measuring inequality of opportunity, discuss how these concepts have been translated into computable measures, and point out the problems and choices researchers face when implementing these measures. Our analysis identifies and suggests several new possibilities to measure inequality of opportunity. The approaches are illustrated with a selective survey of the empirical literature on income inequality of opportunity.
Abstract:
Studies of the EU accession of the East and Central European countries have stressed the importance of neo-liberal institutionalism as an explanation for Member State preferences. In this paper it is argued that Member States' preferences over Turkish EU accession are better explained by power politics and neo-realism. It therefore seems that Turkey's path to the EU differs from that of the East and Central European countries. Turkish accession raises the question of the EU's role in a unipolar world order: whether or not the EU should develop into an independent actor on the world stage. However, when it comes to the interaction among the Member States in deciding when to open accession negotiations with Turkey, the constitutive values of the EU seriously modify the outcome that pure power politics would have led to.
Abstract:
The aim of this paper is to analyse the main agreements on the EU's External Action reached within the European Convention and the IGC, taking into account why, how, and by whom consensus on them was reached. In other words, this paper explores the principles followed in order to improve the instruments of the EU's External Action, such as authority, coherence, visibility, efficiency, and credibility.
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
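The two-stage structure described above (first decide where the zeros occur, then distribute the unit total over the non-zero parts) can be sketched as a small simulator. This is a simplified, illustrative stand-in, not the paper's model: the function name and the presence probabilities are invented, and the second stage uses normalized exponentiated Gaussians as a rough proxy for a full conditional logistic-normal specification.

```python
import math
import random

def sample_zero_composition(p_present, mu, sigma, rng):
    """One draw from a simplified two-stage model of essential zeros.

    Stage 1: an independent Bernoulli incidence vector decides which
    parts are truly zero.
    Stage 2: the unit total is distributed over the non-zero parts by
    normalizing exponentiated Gaussian draws (a logistic-normal-style
    stand-in for the paper's conditional stage).
    """
    D = len(p_present)
    present = [rng.random() < q for q in p_present]
    if not any(present):
        present[rng.randrange(D)] = True  # at least one part carries the unit
    z = [math.exp(rng.gauss(mu, sigma)) if on else 0.0 for on in present]
    total = sum(z)
    return [v / total for v in z]

rng = random.Random(0)
comp = sample_zero_composition([0.9, 0.5, 0.7], mu=0.0, sigma=1.0, rng=rng)
```

Each draw yields an incidence pattern (which parts are zero) plus a composition on the remaining parts, mirroring the incidence matrix / conditional compositional matrix pair described above.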
Abstract:
Globalization involves several facility location problems that need to be handled at large scale. Location Allocation (LA) is a combinatorial problem in which the distances among points in the data space matter. Taking advantage of this distance property of the domain, we exploit the capability of clustering techniques to partition the data space, converting an initial large LA problem into several simpler LA problems. In particular, our motivating problem involves a huge geographical area that can be partitioned under overall conditions. We present different types of clustering techniques and then perform a cluster analysis over our dataset in order to partition it. After that, we solve the LA problem by applying a simulated annealing algorithm to the clustered and non-clustered data in order to work out how profitable the clustering is and which of the presented methods is the most suitable.
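The cluster-then-solve strategy described above can be sketched as follows. This is an illustrative toy, not the paper's setup: it uses a plain k-means partition, reduces each cluster to a 1-median subproblem, and all function names and parameter values are assumptions.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means (Lloyd's algorithm) to partition the data space."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: math.dist(p, centers[j]))
            clusters[i].append(p)
        centers = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return clusters

def anneal_facility(cluster, steps=500, seed=0):
    """Simulated annealing on one cluster: pick the point minimising the
    total distance to all other points (a 1-median subproblem)."""
    rng = random.Random(seed)
    cost = lambda f: sum(math.dist(f, p) for p in cluster)
    current = rng.choice(cluster)
    for t in range(1, steps + 1):
        candidate = rng.choice(cluster)
        delta = cost(candidate) - cost(current)
        temp = 1.0 / t  # simple cooling schedule
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            current = candidate
    return current

random.seed(1)
points = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(60)]
facilities = [anneal_facility(c) for c in kmeans(points, k=3) if c]
```

The point of the decomposition is that each annealing run searches a small cluster instead of the whole instance, which is what makes the original large LA problem tractable.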
Abstract:
In this paper we propose a methodology to determine the most efficient and least costly way of optimizing crew pairing. We develop an optimization algorithm in the Java programming language, using the Eclipse open-source IDE, to solve crew scheduling problems.
Abstract:
Black-box optimization problems (BBOPs) are defined as those optimization problems in which the objective function does not have an algebraic expression, but is the output of a system (usually a computer program). This paper focuses on BBOPs that arise in the field of insurance, and more specifically in reinsurance problems. In this area, the complexity of the models and assumptions used to define the reinsurance rules and conditions produces hard black-box optimization problems that must be solved in order to obtain the optimal output of the reinsurance. Traditional optimization approaches cannot be applied to BBOPs, so new computational paradigms must be applied to solve these problems. In this paper we show the performance of two evolutionary-based techniques, Evolutionary Programming and Particle Swarm Optimization. We provide an analysis of three BBOPs in reinsurance, where the evolutionary-based approaches exhibit excellent behaviour, finding the optimal solution within a fraction of the computational cost used by inspection or enumeration methods.
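As an illustration of applying one of the evolutionary-based techniques mentioned above to a black-box objective, here is a minimal Particle Swarm Optimization sketch. The test function, the parameter values, and the update-rule details are generic textbook choices, not the paper's configuration; the objective is only ever queried through function calls, as it would be with a reinsurance simulator.

```python
import random

def pso(black_box, dim, bounds, n_particles=30, iters=200, seed=0):
    """Minimal PSO minimiser for a black-box objective queried by calls."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [black_box(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia and attraction weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = black_box(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in black box: the sphere function (would be a reinsurance model).
sphere = lambda x: sum(v * v for v in x)
best, best_val = pso(sphere, dim=3, bounds=(-5.0, 5.0))
```

Because the optimizer only needs objective values, the same loop works unchanged whether `black_box` is an analytic test function or an expensive simulation.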
Abstract:
When encountering a set of alternatives displayed in the form of a list, the decision maker usually determines a particular alternative at which she stops checking the remaining ones, and then chooses an alternative from those observed so far. We present a framework in which both decision problems are explicitly modeled, and we axiomatically characterize a stop-and-choose rule which unifies position-biased successive choice and satisficing choice.
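A concrete instance of such a stop-and-choose rule is the classical satisficing rule: scan the list in order, stop at the first alternative that meets an aspiration threshold, and pick the best alternative observed so far. The sketch below is one illustrative member of the class the paper characterizes, not its axiomatization; the threshold-based stopping criterion and all names are assumptions.

```python
def stop_and_choose(alternatives, utility, threshold):
    """Satisficing stop-and-choose: observe alternatives in list order,
    stop as soon as one meets the aspiration threshold, then choose the
    best alternative among those observed so far."""
    observed = []
    for a in alternatives:
        observed.append(a)
        if utility(a) >= threshold:
            break  # stopping decision: first satisfactory alternative
    return max(observed, key=utility)  # choice decision: best observed

choice = stop_and_choose([3, 7, 2, 9, 5], utility=lambda x: x, threshold=6)
```

Note the two separate decision problems the abstract mentions: where to stop (the `break`) and what to choose from the observed prefix (the `max`). Here the scan halts at 7, so the later 9 is never observed and 7 is chosen.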
Abstract:
The longwave emission of planetary atmospheres that contain a condensable gas absorbing in the infrared (i.e., longwave), in equilibrium with its liquid phase at the surface, may exhibit an upper bound. Here we analyze the effect of the atmospheric absorption of sunlight on this radiation limit. We assume that the atmospheric absorption of infrared radiation is independent of wavelength except within the spectral width of the atmospheric window, where it is zero. The temperature profile in radiative equilibrium is obtained analytically as a function of the longwave optical thickness. For illustrative purposes, numerical values for the infrared atmospheric absorption (i.e., greenhouse effect) and the liquid-vapor equilibrium curve of the condensable absorbing gas refer to water. Values for the atmospheric absorption of sunlight (i.e., antigreenhouse effect) span a wide range, since our aim is to provide a qualitative view of their effects. We find that atmospheres with a transparent region in the infrared spectrum do not present an absolute upper bound on the infrared emission. This result may also be found in atmospheres opaque at all infrared wavelengths if the fraction of sunlight absorbed in the atmosphere increases with the longwave opacity.
Abstract:
Purpose: The paper analyzes the investment behaviour of micro-angels, looking both at the criteria used in the selection of their investment projects and at the characteristics of their guidance role during the investment period. Design/Methodology/Approach: The paper focuses on a double-bottom-line movement of French micro-angels clubs that has been operating since 1983. Our primary source of data is an online survey carried out during March 2012, asking members of clubs all over France about different aspects of their procedures. Findings: Our findings suggest that micro-angels are interested in small, socially or environmentally friendly projects with the potential to contribute to the development of their neighbourhood. We find that women are even more interested than men in such projects. Educated micro-angels value entrepreneurial motivation and understanding of the project more than less-educated micro-angels do. We also point out the factors that micro-angels consider important in accompanying enterprises. Here we find that gender makes little difference. However, retired micro-angels value financial diagnosis made jointly with entrepreneurs, while both active micro-angels and educated micro-angels place more value on using their network to help micro-entrepreneurs. Practical/Social Implications: Given the potential benefits of micro-angels investing in and guiding the development of micro-enterprises, social micro-angel investment on a major scale in developing countries could help tackle some of the problems faced by the development of microfinance, such as the over-indebtedness of micro-entrepreneurs. Practitioners and new initiatives would gain from understanding what adaptations need to be made. Originality/Value: We expect to extend venture capital theory to take non-economic motives into account.
Abstract:
In this work we describe the use of bilinear statistical models as a means of factoring shape variability into two components, attributed to inter-subject variation and to the intrinsic dynamics of the human heart. We show that it is feasible to reconstruct the shape of the heart at discrete points in the cardiac cycle: provided we are given a small number of shape instances representing the same heart at different points in the same cycle, we can use the bilinear model to establish this. Using a temporal and a spatial alignment step in the preprocessing of the shapes, around half of the reconstruction errors were on the order of the axial image resolution of 2 mm, and over 90% were within 3.5 mm. From this, we conclude that the dynamics were indeed separated from the inter-subject variability in our dataset.
Abstract:
The present paper aims to identify the effects of the Point System of Selection of immigrants in Quebec. I argue that the distribution of points results in a different composition of immigrant stocks in terms of origin mix, and not in terms of labour skills. To do so, I carry out a longitudinal descriptive analysis of the national composition of immigrants in Quebec and two other significant provinces (Ontario and British Columbia), as well as an analysis of the distribution of points in Quebec and in the rest of Canada.
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process for the initial solution that randomly generates different alternative initial solutions of similar quality, attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic by simply incorporating our biased randomization process with a high-quality pseudo-random number generator.
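The three ILS stages the abstract names (local search, perturbation, acceptance criterion) can be sketched in a bare-bones form. This is not the ILS-ESP algorithm: the adjacent-swap local search, the two-job-swap perturbation, the improvements-only acceptance rule, and the random instance below are generic simplifications for illustration.

```python
import random

def makespan(perm, proc):
    """PFSP makespan: completion time of the last job on the last machine,
    via the standard flowshop recurrence."""
    m = len(proc[0])
    comp = [0] * m  # completion time of the previous job on each machine
    for j in perm:
        prev = 0
        for k in range(m):
            prev = max(prev, comp[k]) + proc[j][k]
            comp[k] = prev
    return comp[-1]

def ils_pfsp(proc, iters=200, seed=0):
    """Bare-bones Iterated Local Search for the PFSP."""
    rng = random.Random(seed)
    n = len(proc)
    best = list(range(n))
    best_val = makespan(best, proc)
    for _ in range(iters):
        cand = best[:]
        i, j = rng.sample(range(n), 2)       # perturbation: swap two jobs
        cand[i], cand[j] = cand[j], cand[i]
        val = makespan(cand, proc)
        improved = True                       # local search: adjacent swaps
        while improved:
            improved = False
            for k in range(n - 1):
                cand[k], cand[k + 1] = cand[k + 1], cand[k]
                new_val = makespan(cand, proc)
                if new_val < val:
                    val, improved = new_val, True
                else:                          # undo a non-improving swap
                    cand[k], cand[k + 1] = cand[k + 1], cand[k]
        if val < best_val:                     # acceptance: improvements only
            best, best_val = cand, val
    return best, best_val

random.seed(2)
proc = [[random.randint(1, 9) for _ in range(3)] for _ in range(6)]  # 6 jobs, 3 machines
perm, value = ils_pfsp(proc)
```

The appeal of the ESP idea is visible even here: the loop has no tunable weights or temperatures, only transparent rules for each of the three stages.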
Abstract:
We present a polyhedral framework for establishing general structural properties of optimal solutions to stochastic scheduling problems in which multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (extending Klimov's), taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.