96 results for Methodological problems
Abstract:
A major obstacle to processing images of the ocean floor comes from the absorption and scattering of light in the aquatic environment. Because natural light is absorbed, underwater vehicles often require artificial light sources attached to them to provide adequate illumination. Unfortunately, these flashlights tend to illuminate the scene nonuniformly and, as the vehicle moves, induce shadows in the scene. For this reason, applying standard computer vision techniques to underwater imaging requires dealing with these lighting problems first. This paper analyses and compares existing methodologies for handling low-contrast, nonuniform illumination in underwater image sequences. The reviewed techniques include: (i) study of the illumination-reflectance model, (ii) local histogram equalization, (iii) homomorphic filtering, and (iv) subtraction of the illumination field. Several experiments on real data have been conducted to compare the different approaches.
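To make one of the reviewed techniques concrete, here is a minimal homomorphic-filtering sketch for a single grayscale frame in Python; the Gaussian high-emphasis filter and all parameter values (cutoff, gains) are illustrative assumptions, not the pipeline evaluated in the paper.

# Minimal homomorphic-filtering sketch (illustrative parameters only).
import numpy as np

def homomorphic_filter(img, cutoff=30.0, gain_low=0.5, gain_high=2.0):
    """Attenuate slow illumination variation, boost reflectance detail.

    img: 2-D uint8 grayscale frame.
    """
    f = img.astype(np.float64) + 1.0                 # avoid log(0)
    log_img = np.log(f)                              # illumination * reflectance -> sum in log domain
    spectrum = np.fft.fftshift(np.fft.fft2(log_img))
    rows, cols = f.shape
    y, x = np.ogrid[:rows, :cols]
    dist2 = (y - rows / 2) ** 2 + (x - cols / 2) ** 2
    # Gaussian high-emphasis filter: low gain on low frequencies
    # (the illumination field), higher gain on high frequencies (detail).
    h = gain_low + (gain_high - gain_low) * (1.0 - np.exp(-dist2 / (2.0 * cutoff ** 2)))
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * h)).real
    out = np.exp(filtered) - 1.0
    return np.clip(out, 0, 255).astype(np.uint8)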
Multigrid preconditioner for nonconforming discretization of elliptic problems with jump coefficients
Abstract:
In this paper, we present a multigrid preconditioner for solving the linear system arising from the piecewise linear nonconforming Crouzeix-Raviart discretization of second order elliptic problems with jump coefficients. The preconditioner uses the standard conforming subspaces as coarse spaces. Numerical tests show both robustness with respect to the jump in the coefficient and near-optimality with respect to the number of degrees of freedom.
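A hedged sketch of the standard two-level setting, in our notation (the paper's exact construction may differ): with V_H a conforming coarse space inside H^1_0(\Omega) and I_H its prolongation into the Crouzeix-Raviart space, an additive two-level preconditioner reads

B = S^{-1} + I_H A_H^{-1} I_H^{\mathsf{T}},

where A_H is the coarse stiffness matrix and S^{-1} a simple smoother (e.g. Jacobi); robustness then means that the condition number \kappa(BA) stays bounded independently of the coefficient jump.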
Abstract:
We construct and analyze non-overlapping Schwarz methods for a preconditioned weakly over-penalized symmetric interior penalty (WOPSIP) method for elliptic problems.
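In standard notation (a generic sketch, not necessarily the operators of the paper), a nonoverlapping additive Schwarz preconditioner for the WOPSIP system A takes the form

B = \sum_{i=1}^{N} R_i^{\mathsf{T}} A_i^{-1} R_i, \qquad A_i = R_i A R_i^{\mathsf{T}},

where R_i restricts global vectors to the degrees of freedom of subdomain \Omega_i and distinct subdomains share no degrees of freedom.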
Abstract:
In this note, we consider claims problems with indivisible goods. Specifically, by recursively applying the P-rights lower bound (Jiménez-Gómez and Marco-Gil (2008)), we ensure the fulfillment of Weak Order Preservation, considered by many authors as a minimal requirement of fairness. Moreover, we retrieve the Discrete Constrained Equal Losses and the Discrete Constrained Equal Awards rules (Herrero and Martínez (2008)). Finally, by the recursive double imposition of a lower and an upper bound, we obtain the average between them. Keywords: Claims problems, Indivisibilities, Order Preservation, Constrained Egalitarian rules, Midpoint. JEL classification: C71, D63, D71.
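As a hedged illustration of one of the rules mentioned (our reading of the Discrete Constrained Equal Awards rule as unit-by-unit egalitarian allocation; the tie-breaking is an assumption for illustration):

def discrete_cea(claims, estate):
    """Discrete Constrained Equal Awards sketch: hand out the estate one
    indivisible unit at a time, always to a claimant with the smallest
    current award among those not yet fully compensated.
    Ties are broken by index here, purely for illustration."""
    awards = [0] * len(claims)
    for _ in range(estate):
        eligible = [i for i in range(len(claims)) if awards[i] < claims[i]]
        if not eligible:
            break                      # estate exceeds the sum of claims
        awards[min(eligible, key=lambda i: awards[i])] += 1
    return awards

# Example: discrete_cea([3, 5, 8], 10) == [3, 4, 3]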
Abstract:
Scarcities of environmental services are no longer merely a remote hypothesis. Consequently, analysis of their inequalities between nations becomes of paramount importance for the achievement of sustainability, in terms either of international policy or of universalist ethical principles of equity. This paper aims, on the one hand, to review methodological aspects of the inequality measurement of certain environmental data and, on the other, to extend the scarce empirical evidence on the international distribution of the Ecological Footprint (EF) by using a longer EF time series. Most of the techniques currently prominent in the literature are reviewed and then tested on EF data, with interesting results. We look in depth at Lorenz dominance analyses and consider the underlying properties of different inequality indices. The indices that best fit environmental inequality measurement are CV2 and GE(2), because of their neutrality property; however, a trade-off may occur when subgroup decompositions are performed. A weighting factor decomposition method is proposed in order to isolate weighting factor changes in inequality growth rates. Finally, the only non-ambiguous way of decomposing inequality by source is the natural decomposition of CV2, which additionally allows the interpretation of marginal term contributions. Empirically, this paper contributes to the environmental inequality measurement of the EF: this inequality has been quite stable, and its change over time is due to per capita vector changes rather than population changes. Almost the entirety of EF inequality is explained by differences in means between the World Bank country groups. This finding suggests that international environmental agreements should be attempted on a regional basis in an attempt to achieve greater consensus between the parties involved. Additionally, source decomposition warns of the dangers of confining CO2 emissions reduction to crop-based energies, because of the implications for basic needs satisfaction. Keywords: ecological footprint; ecological inequality measurement; inequality decomposition.
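For reference, a sketch of the quantities involved, under standard (unweighted) definitions and our own notation: for per capita footprints x_i with mean \mu,

CV^2 = \frac{1}{\mu^2 n} \sum_{i=1}^{n} (x_i - \mu)^2, \qquad GE(2) = \frac{1}{2} CV^2,

and when the total is split by source as x = \sum_k x^{(k)} (cropland, CO2 land, and so on), the natural decomposition reads

CV^2(x) = \sum_k \frac{\mathrm{cov}(x^{(k)}, x)}{\mu^2},

so each source contributes its covariance with the total, which is what makes marginal contributions interpretable.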
Abstract:
In the last few years, many researchers have studied the presence of common dimensions of temperament in subjects with symptoms of anxiety. The aim of this study is to examine the association between temperamental dimensions (high negative affect and activity level) and anxiety problems in clinical preschool children. A total of 38 children, aged 3 to 6 years, from the Infant and Adolescent Mental Health Center of Girona and the Center of Diagnosis and Early Attention of Sabadell and Olot were evaluated by parents and psychologists. Their parents completed several screening scales and, subsequently, clinical child psychopathology professionals carried out diagnostic interviews with the children from the sample who presented signs of anxiety. Findings showed that children with high levels of negative affect and low activity level have pronounced symptoms of anxiety. However, children with anxiety disorders do not present temperament styles different from those of their peers without these pathologies.
Abstract:
In 2001, Croatia launched an initiative intended to loosen its high degree of centralization by expanding the mandates of local units and changing the sources of financing of public functions. However, the initial steps in the decentralization process have not been followed by further fiscal decentralization measures, and consequently its level has remained practically unchanged. This paper sets out to elaborate the main problems and obstacles to the effective implementation of the decentralization process and to offer three groups of proposals for decentralization: (i) the division of mandatory powers among the different local units, (ii) changes in financing, and (iii) modification of the territorial division of the country.
Abstract:
Studies of the EU accession of the East and Central European countries have stressed the importance of neo-liberal institutionalism as an explanation for Member State preferences. In this paper it is argued that Member States' preferences over Turkish EU accession are better explained by power politics and neo-realism. It seems, therefore, that Turkey's way to the EU follows a different path from that of the East and Central European countries. Turkish accession raises the question of the EU's role in a uni-polar world order: whether the EU should develop into an independent actor on the world stage or not. However, when it comes to the interaction among the Member States in deciding when to open accession negotiations with Turkey, the constitutive values of the EU seriously modify the outcome that pure power politics would have led to.
Abstract:
The aim of this paper is to analyse the main agreements on the EU's External Action reached within the European Convention and the IGC, taking into account why, how and by whom the consensus on them was reached. In other words, this paper will explore the principles followed in order to improve the instruments of the EU's External Action, such as authority, coherence, visibility, efficiency and credibility.
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
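In symbols, a hedged sketch of the two-stage structure (our notation, not necessarily the authors'): let z_i \in \{0,1\}^D record which parts of sample i are non-zero and x_i its composition. The first stage models the incidence pattern, e.g. z_{id} \sim \mathrm{Bernoulli}(p_d) independently in the independent-binomial variant, with dependence introduced hierarchically in the second model; the second stage, conditional on z_i, places a logistic normal law on the non-zero subcomposition,

\mathrm{alr}(x_i \mid z_i) \sim N(\mu_{z_i}, \Sigma_{z_i}),

so the data decompose exactly as described: an incidence matrix Z and a conditional compositional matrix X.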
Abstract:
Globalization involves several facility location problems that need to be handled at large scale. Location Allocation (LA) is a combinatorial problem in which the distances among points in the data space matter. Precisely by taking advantage of this distance property of the domain, we exploit the capability of clustering techniques to partition the data space in order to convert an initial large LA problem into several simpler LA problems. In particular, our motivating problem involves a huge geographical area that can be partitioned under overall conditions. We present different types of clustering techniques and then perform a cluster analysis over our dataset in order to partition it. After that, we solve the LA problem by applying a simulated annealing algorithm to the clustered and non-clustered data, in order to work out how profitable the clustering is and which of the presented methods is the most suitable.
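A compressed sketch of the divide-and-solve idea under stated assumptions: scikit-learn's KMeans stands in for the clustering step and a generic simulated annealing loop for the LA solver; the p-median-style cost, the move, and all parameters are illustrative, not the paper's setup.

import numpy as np
from sklearn.cluster import KMeans

def la_cost(points, facilities):
    """Total distance from each demand point to its nearest open facility."""
    d = np.linalg.norm(points[:, None, :] - points[facilities][None, :, :], axis=2)
    return d.min(axis=1).sum()

def anneal_la(points, p, steps=2000, t0=1.0, alpha=0.999, seed=0):
    """Simulated annealing over p facility sites chosen among the points."""
    rng = np.random.default_rng(seed)
    current = rng.choice(len(points), size=p, replace=False)
    cost = la_cost(points, current)
    best, best_cost, t = current.copy(), cost, t0
    for _ in range(steps):
        cand = current.copy()
        cand[rng.integers(p)] = rng.integers(len(points))      # move one facility
        c = la_cost(points, cand)
        if c < cost or rng.random() < np.exp((cost - c) / t):  # Metropolis rule
            current, cost = cand, c
            if c < best_cost:
                best, best_cost = cand.copy(), c
        t *= alpha                                             # geometric cooling
    return best, best_cost

# Partition first, then solve each simpler sub-problem independently.
points = np.random.default_rng(1).random((500, 2))
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(points)
clustered_cost = sum(anneal_la(points[labels == k], p=2)[1] for k in range(5))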
Abstract:
In this paper, we propose a methodology to determine the most efficient and least costly way of performing crew pairing optimization. We develop a methodology based on algorithmic optimization, implemented in the Java programming language on the open-source Eclipse IDE, to solve crew scheduling problems.
Abstract:
Black-box optimization problems (BBOP) are defined as those optimization problems in which the objective function does not have an algebraic expression, but is the output of a system (usually a computer program). This paper is focussed on BBOPs that arise in the field of insurance, and more specifically in reinsurance problems. In this area, the complexity of the models and assumptions considered to define the reinsurance rules and conditions produces hard black-box optimization problems that must be solved in order to obtain the optimal output of the reinsurance. The application of traditional optimization approaches is not possible in BBOP, so new computational paradigms must be applied to solve these problems. In this paper we show the performance of two evolutionary-based techniques (Evolutionary Programming and Particle Swarm Optimization). We provide an analysis of three BBOPs in reinsurance, where the evolutionary-based approaches exhibit excellent behaviour, finding the optimal solution within a fraction of the computational cost used by inspection or enumeration methods.
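A minimal Particle Swarm Optimization sketch in Python; since the real reinsurance objective is the output of a program, a toy quadratic stands in for it, and all swarm parameters are illustrative assumptions:

import numpy as np

def black_box(x):
    """Stand-in for the reinsurance program output (toy objective)."""
    return float(np.sum((x - 0.5) ** 2))

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.random((n_particles, dim))               # positions in [0, 1]^dim
    v = np.zeros_like(x)
    pbest = x.copy()                                 # personal bests
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()         # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0.0, 1.0)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

# pso(black_box, dim=5) should converge near x = (0.5, ..., 0.5).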
Abstract:
When encountering a set of alternatives displayed in the form of a list, the decision maker usually determines a particular alternative, after which she stops checking the remaining ones, and chooses an alternative from those observed so far. We present a framework in which both decision problems are explicitly modeled, and axiomatically characterize a stop-and-choose rule which unifies position-biased successive choice and satisficing choice.
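A toy rendering of such a rule under our own formalization (the threshold-based stopping is an assumption for illustration; the paper's axiomatic rule may differ): scan the list in order, stop at the first alternative that clears an aspiration level, then choose the best alternative observed so far.

def stop_and_choose(alternatives, utility, threshold):
    """Scan in list order; stop once an alternative meets the aspiration
    threshold (or the list ends), then pick the best alternative seen."""
    observed = []
    for a in alternatives:
        observed.append(a)
        if utility(a) >= threshold:
            break
    return max(observed, key=utility)

# A very low threshold yields position-biased successive choice (take the
# first item); an unattainable one yields full-list utility maximization.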