97 results for Agglomeration principle


Relevance:

10.00%

Publisher:

Abstract:

The aim of this book is to survey different Land Use Planning and safety approaches in the vicinity of industrial plants. As the research spans the three broad fields of Land Use Planning, safety and security, the guiding principle is to avoid unnecessary and overly detailed information while retaining what is useful, so as to provide a comprehensive resource applicable for several purposes. In addition, the proposed method, explained in Chapter 7, can open a new field for the future of Land Use Planning in the vicinity of industrial plants.

Relevance:

10.00%

Publisher:

Abstract:

This paper argues that the current Spanish system of regional financing does not adequately respect the principles of equality, autonomy, responsibility and transparency that should guide its design. It also advances a series of recommendations for the reform of the system that can be grouped under two broad headings: guaranteeing the effective application of the constitutional principle of equality, and reinforcing the fiscal responsibility and accountability of regional governments.

Relevance:

10.00%

Publisher:

Abstract:

Quantum molecular similarity (QMS) techniques are used to assess the response of the electron density of various small molecules to the application of a static, uniform electric field. Likewise, QMS is used to analyze the changes in electron density generated by the process of floating a basis set. The results obtained show an interrelation between the floating process, the optimum geometry, and the presence of an external field. Cases involving the Le Chatelier principle are discussed, and insight is provided into the changes in bond critical point properties, self-similarity values and density differences.
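To make the similarity analysis concrete, a common quantum similarity measure is the Carbó index, the normalized overlap of two electron densities. The sketch below is a minimal numerical illustration of that index on a grid, using toy Gaussian "densities" in place of the ab initio (and field-perturbed) densities studied in the work; the function name and grid settings are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of a Carbo-like similarity index between two electron
# densities sampled on a common grid.  The Gaussian "densities" below are
# toy placeholders, not ab initio densities.
import numpy as np

def carbo_index(rho_a, rho_b, dv):
    """Normalized overlap of two densities on the same grid (dv = volume element)."""
    z_ab = np.sum(rho_a * rho_b) * dv          # overlap integral
    z_aa = np.sum(rho_a * rho_a) * dv          # self-similarity of A
    z_bb = np.sum(rho_b * rho_b) * dv          # self-similarity of B
    return z_ab / np.sqrt(z_aa * z_bb)

# Toy example: a spherical Gaussian density versus a slightly displaced copy,
# mimicking a field-free density against a perturbed one.
x, h = np.linspace(-5.0, 5.0, 81, retstep=True)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
rho_0 = np.exp(-(X**2 + Y**2 + Z**2))
rho_1 = np.exp(-((X - 0.2)**2 + Y**2 + Z**2))

print(carbo_index(rho_0, rho_1, h**3))         # close to, but below, 1.0
```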

Relevance:

10.00%

Publisher:

Abstract:

An overview is given of a study showing that the generalized maximum hardness principle (GMHP) and the generalized minimum polarizability principle (GMPP) may not be obeyed, not only in chemical reactions but also in the favorable case of nontotally symmetric vibrations, for which the chemical and external potentials remain approximately constant. A method is introduced that allows an accurate determination of the nontotally symmetric molecular distortions with more marked GMPP or anti-GMPP character through diagonalization of the polarizability Hessian matrix.
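The diagonalization step can be illustrated with a short sketch: given a matrix of second derivatives of the polarizability with respect to the nontotally symmetric displacements, its eigenvectors rank the distortions by how strongly the polarizability changes along them. The matrix below is a random symmetric placeholder, and the sign-based GMPP/anti-GMPP labelling is a simplifying assumption rather than the paper's exact criterion.

```python
# Illustrative sketch: diagonalize a symmetric "polarizability Hessian"
# (second derivatives of the mean polarizability with respect to nontotally
# symmetric displacements) to rank distortion directions.  The matrix here is
# a random symmetric placeholder, not data from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_modes = 6
A = rng.normal(size=(n_modes, n_modes))
hessian = 0.5 * (A + A.T)                     # symmetrize the placeholder

eigvals, eigvecs = np.linalg.eigh(hessian)    # eigenvalues in ascending order
for lam, vec in zip(eigvals, eigvecs.T):
    label = "GMPP-like" if lam < 0 else "anti-GMPP-like"
    print(f"{label:>15s}: eigenvalue {lam:+.3f}, direction {np.round(vec, 2)}")
```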

Relevance:

10.00%

Publisher:

Abstract:

The hypothesis of minimum entropy production is applied to a simple one-dimensional energy balance model and analysed for different values of the radiative forcing due to greenhouse gases. The extremum principle is used to determine the planetary "conductivity" and to avoid the "diffusive" approximation commonly assumed in this type of model. For present conditions the result at minimum radiative entropy production is similar to that obtained with the classical model. Other climatic scenarios show visible differences, with the extremal case behaving better.
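As a rough illustration of the extremum recipe (choose the planetary conductivity at which an entropy-production measure is extremal), the sketch below uses a linear two-box energy balance instead of the one-dimensional latitudinal model, and the material entropy production of the meridional transport as a stand-in measure; all numbers are placeholders, not the paper's radiative entropy production calculation.

```python
# Toy two-box illustration of the extremum recipe: scan a planetary
# "conductivity" k, solve the linearized energy balance for the box
# temperatures, evaluate an entropy-production rate, and locate its extremum.
# The numbers and the entropy measure are placeholders, not the paper's model.
import numpy as np

S1, S2 = 300.0, 160.0        # absorbed solar flux, low/high-latitude box (W m-2)
A, B = 210.0, 2.0            # linearized outgoing longwave: A + B*(T - 273.15)

def box_temperatures(k):
    """Closed-form solution of the linear two-box energy balance."""
    t_sum = 2 * 273.15 + (S1 + S2 - 2 * A) / B      # T1 + T2
    t_dif = (S1 - S2) / (B + 2 * k)                 # T1 - T2
    return 0.5 * (t_sum + t_dif), 0.5 * (t_sum - t_dif)

def transport_entropy_production(k):
    """Entropy produced by moving heat F = k*(T1 - T2) from warm to cold box."""
    T1, T2 = box_temperatures(k)
    F = k * (T1 - T2)
    return F * (1.0 / T2 - 1.0 / T1)

ks = np.linspace(0.1, 10.0, 500)
sigma = np.array([transport_entropy_production(k) for k in ks])
print(f"extremal conductivity ~ {ks[sigma.argmax()]:.2f} W m-2 K-1")
```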

Relevance:

10.00%

Publisher:

Abstract:

When one wishes to implement public policies, there is first a need to compare different courses of action and to value and evaluate them in order to assess their social attractiveness. Recently the concept of well-being has been proposed as a multidimensional proxy for measuring societal prosperity and progress; a key research topic is then how this plurality of dimensions can be measured and evaluated for policy decisions. This paper defends a thesis articulated in the following points:

1. Different metrics are linked to different objectives and values. Using only one measurement unit (on the grounds of the so-called commensurability principle) to incorporate a plurality of dimensions, objectives and values necessarily implies reductionism.

2. Point 1 can be proven as a matter of formal logic by drawing on Geach's work in moral philosophy; this theoretical demonstration is an original contribution of this article. The distinction between predicative and attributive adjectives is formalised and definitions are provided. Predicative adjectives are further distinguished into absolute and relative ones, and the new concepts of set commensurability and rod commensurability are introduced.

3. The existence of a plurality of social actors with an interest in the policy being assessed means that social decisions involve multiple types of values, of which economic efficiency is only one. It is therefore misleading to base social decisions on that value alone.

4. Weak comparability of values, which is grounded in incommensurability, is shown to be the main methodological foundation of policy evaluation in the framework of well-being economics. Incommensurability does not imply incomparability; on the contrary, it is the only rational way to compare societal options under a plurality of policy objectives.

5. Weak comparability can be implemented through multi-criteria evaluation, a formal framework for applied consequentialism under incommensurability. Social Multi-Criteria Evaluation, in particular, allows technical and social incommensurabilities to be considered simultaneously.
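A toy example of weak comparability without a common measurement unit: compare policy options criterion by criterion, each criterion kept in its own unit, and count on how many criteria one option outranks another. The options, criteria and values below are invented for illustration; this is not the Social Multi-Criteria Evaluation procedure itself.

```python
# Minimal illustration of comparing options across incommensurable criteria
# without converting them to a single unit: a simple pairwise (Condorcet-like)
# count of the criteria on which one option does better.  Toy data only.

# Each criterion keeps its own unit; in this toy example higher is better.
options = {
    "policy_A": {"income_eur": 21000, "air_quality_idx": 62, "jobs": 1200},
    "policy_B": {"income_eur": 23000, "air_quality_idx": 48, "jobs": 1500},
    "policy_C": {"income_eur": 20000, "air_quality_idx": 71, "jobs": 1100},
}
criteria = ["income_eur", "air_quality_idx", "jobs"]

def outranks(a, b):
    """Number of criteria on which option a strictly beats option b."""
    return sum(options[a][c] > options[b][c] for c in criteria)

for a in options:
    for b in options:
        if a != b:
            print(f"{a} beats {b} on {outranks(a, b)} of {len(criteria)} criteria")
```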

Relevance:

10.00%

Publisher:

Abstract:

Background: We address the problem of studying recombinational variation in (human) populations. In this paper, our focus is on one computational aspect of the general task: given two networks G1 and G2, with both mutation and recombination events, defined on overlapping sets of extant units, the objective is to compute a consensus network G3 with the minimum number of additional recombinations. We describe a polynomial-time algorithm with a guarantee that the number of computed new recombination events is within ϵ = sz(G1, G2) (where sz is a well-behaved function of the sizes and topologies of G1 and G2) of the optimal number of recombinations. To date, this is the best known result for a network consensus problem.

Results: Although the network consensus problem can be applied to a variety of domains, here we focus on the structure of human populations. With our preliminary analysis of a segment of human Chromosome X data we are able to infer ancient recombinations, population-specific recombinations and more, which also support the widely accepted 'Out of Africa' model. These results have been verified independently using traditional manual procedures. To the best of our knowledge, this is the first recombination-based characterization of human populations.

Conclusion: We show that our mathematical model identifies recombination spots in the individual haplotypes; the aggregate of these spots over a set of haplotypes defines a recombinational landscape that carries enough signal to detect continental as well as population divides based on a short segment of Chromosome X. In particular, we are able to infer ancient recombinations, population-specific recombinations and more, which also support the widely accepted 'Out of Africa' model. The agreement with mutation-based analysis can be viewed as an indirect validation of our results and the model. Since the model in principle gives us more information embedded in the networks, in our future work we plan to investigate more non-traditional questions via the structures computed by our methodology.
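The objective being minimized can be made concrete with a small helper: in a network represented as a DAG, recombination events correspond to nodes with more than one parent. The snippet below only illustrates that count on a made-up network; the consensus algorithm itself is not reproduced here.

```python
# Small helper illustrating the quantity being minimized: in a network given
# as a DAG (node -> list of parents), recombination events correspond to
# nodes with more than one parent.  Toy network, not the paper's data.

def recombination_count(parents):
    """parents: dict mapping each node to the list of its parent nodes."""
    return sum(1 for node, ps in parents.items() if len(ps) > 1)

# Toy network: haplotype "h3" descends from both lineages, i.e. one recombination.
g1 = {
    "root": [],
    "a": ["root"], "b": ["root"],
    "h1": ["a"], "h2": ["b"],
    "h3": ["a", "b"],       # two parents -> recombination node
}
print(recombination_count(g1))   # -> 1
```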

Relevance:

10.00%

Publisher:

Abstract:

We show how to build full-diversity product codes under both iterative encoding and decoding over non-ergodic channels, in the presence of block erasure and block fading. The concept of a rootcheck, or root subcode, is introduced by generalizing the principle recently invented for low-density parity-check codes. We also describe some channel-related graphical properties of the new family of product codes, a family referred to as root product codes.
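For readers unfamiliar with product codes, the sketch below encodes a toy product code with single-parity-check component codes: information bits fill a square array, and each row and column receives a parity bit. It is meant only to make the term concrete; the rootcheck structure that gives the paper's codes full diversity on block-fading channels is not shown.

```python
# Minimal sketch of a product-code encoder with single-parity-check (SPC)
# component codes: information bits fill a k x k array, each row and each
# column gets a parity bit, plus the parity-on-parity corner.  This only
# illustrates what a product code is, not the paper's root product codes.
import numpy as np

def spc_product_encode(info_bits):
    """info_bits: k x k array of 0/1; returns the (k+1) x (k+1) codeword array."""
    k = info_bits.shape[0]
    code = np.zeros((k + 1, k + 1), dtype=int)
    code[:k, :k] = info_bits
    code[:k, k] = info_bits.sum(axis=1) % 2       # row parities
    code[k, :k] = info_bits.sum(axis=0) % 2       # column parities
    code[k, k] = info_bits.sum() % 2              # parity on parities
    return code

u = np.array([[1, 0, 1],
              [0, 1, 1],
              [1, 1, 0]])
print(spc_product_encode(u))
```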

Relevance:

10.00%

Publisher:

Abstract:

The academic debate about the secession of a territory that is part of a liberal democratic state displays an initial contrast. On the one hand, secessionist movements in practice usually legitimize their position with nationalist arguments linked to the principle of national self-determination. On the other hand, we find few defenders in academia of a normative principle of national self-determination. Philosophers, political scientists and jurists usually defend the status quo, and even when they do not, most of them tend to leave the question of secession unresolved or confused. On this issue, liberal-democratic theories show a tendency to be "conservative" with respect to political borders, regardless of the historical and empirical processes by which current states were created. This feature is probably not unrelated to the fact that, from its beginnings, political liberalism has been a theory of the state rather than a theory of the nation.

Relevance:

10.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it is to be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance-criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of the initial solution that generates alternative starting solutions of similar quality by applying biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when parallel computing is used, it is possible to improve the top ILS-based metaheuristic simply by incorporating into it our biased randomization process together with a high-quality pseudo-random number generator.
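A compact sketch of the ILS skeleton for the PFSP is given below. The makespan computation is standard; the reinsertion local search, random-reinsertion perturbation, "accept if not worse" rule and random toy instance are simplified stand-ins for the ILS-ESP components described above, not the authors' implementation.

```python
# Compact Iterated Local Search skeleton for the permutation flowshop problem,
# in the spirit of the parameter-free design described in the abstract.
import random

def makespan(perm, p):
    """p[j][m]: processing time of job j on machine m; perm: job order."""
    n_machines = len(p[0])
    completion = [0.0] * n_machines
    for j in perm:
        completion[0] += p[j][0]
        for m in range(1, n_machines):
            completion[m] = max(completion[m], completion[m - 1]) + p[j][m]
    return completion[-1]

def local_search(perm, p):
    """First-improvement reinsertion local search."""
    best = makespan(perm, p)
    improved = True
    while improved:
        improved = False
        for i in range(len(perm)):
            for k in range(len(perm)):
                if i == k:
                    continue
                cand = perm[:i] + perm[i + 1:]
                cand.insert(k, perm[i])
                cost = makespan(cand, p)
                if cost < best:
                    perm, best, improved = cand, cost, True
                    break
            if improved:
                break
    return perm, best

def ils(p, iterations=200, seed=0):
    random.seed(seed)
    current, cur_cost = local_search(list(range(len(p))), p)
    best, best_cost = current[:], cur_cost
    for _ in range(iterations):
        # perturbation: remove one job at random and reinsert it elsewhere
        cand = current[:]
        j = cand.pop(random.randrange(len(cand)))
        cand.insert(random.randrange(len(cand) + 1), j)
        cand, cost = local_search(cand, p)
        if cost <= cur_cost:                 # acceptance: keep if not worse
            current, cur_cost = cand, cost
        if cost < best_cost:
            best, best_cost = cand[:], cost
    return best, best_cost

# Toy instance: 8 jobs, 4 machines, random processing times.
random.seed(42)
times = [[random.randint(1, 20) for _ in range(4)] for _ in range(8)]
print(ils(times))
```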

Relevance:

10.00%

Publisher:

Abstract:

We formulate performance assessment as a problem of causal analysis and outline a solution based on the missing data principle. The approach is particularly relevant in the context of so-called league tables for educational, health-care and other public-service institutions. The proposed solution avoids comparisons between institutions that have substantially different clientele (intake).
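A toy illustration of the idea: treat the outcomes a student would have obtained at the institutions not attended as missing data, and compare two institutions only over the range of intake scores where their clienteles overlap. The data and the stratified comparison below are invented for illustration and are not the paper's estimator.

```python
# Toy illustration: only compare institutions over the intake range where
# their clienteles overlap (common support), stratum by stratum.
import statistics

# (institution, intake score, observed outcome) -- invented data
records = [
    ("A", 45, 52), ("A", 55, 60), ("A", 65, 71), ("A", 75, 80),
    ("B", 60, 70), ("B", 70, 78), ("B", 80, 88), ("B", 90, 95),
]

def adjusted_difference(inst_a, inst_b, width=10):
    a = [(s, y) for i, s, y in records if i == inst_a]
    b = [(s, y) for i, s, y in records if i == inst_b]
    lo = max(min(s for s, _ in a), min(s for s, _ in b))     # common support
    hi = min(max(s for s, _ in a), max(s for s, _ in b))
    diffs = []
    for start in range(int(lo), int(hi), width):             # intake strata
        ya = [y for s, y in a if start <= s < start + width]
        yb = [y for s, y in b if start <= s < start + width]
        if ya and yb:
            diffs.append(statistics.mean(ya) - statistics.mean(yb))
    return statistics.mean(diffs) if diffs else None          # None: no overlap

print(adjusted_difference("A", "B"))
```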

Relevance:

10.00%

Publisher:

Abstract:

In principle, a country cannot endure negative genuine savings for long periods of time without experiencing declining consumption. Nevertheless, theoreticians envisage two alternatives to explain how an exporter of non-renewable natural resources could experience permanently negative genuine savings and still ensure sustainability. The first alleges that the capital gains arising from the expected improvement in the terms of trade would suffice to compensate for the negative savings of the resource exporter. The second alternative points to technological change as a way to avoid economic collapse. This paper uses data for Venezuela and Mexico to test the first of these two hypotheses empirically. The results presented here show that the terms of trade do not suffice to compensate for the depletion of oil reserves in these two open economies.
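A back-of-the-envelope illustration of the first hypothesis: can a capital gain from an expected improvement in the terms of trade offset the depletion term that drives genuine savings negative? All figures and the simple adjustment below are hypothetical, not the Venezuelan or Mexican data used in the paper.

```python
# Hypothetical numbers only: a crude check of whether a terms-of-trade
# capital gain could offset resource depletion in the savings account.

net_saving     = 2.0    # conventional net saving, % of GDP
resource_rents = 6.0    # value of oil extracted (depletion), % of GDP
genuine_saving = net_saving - resource_rents            # -4.0, negative

expected_price_growth = 0.02   # assumed yearly rise in the export price
resource_exports      = 25.0   # value of oil exports, % of GDP
capital_gain          = expected_price_growth * resource_exports   # 0.5

adjusted_saving = genuine_saving + capital_gain
print(f"genuine saving: {genuine_saving:+.1f}%  "
      f"adjusted for terms of trade: {adjusted_saving:+.1f}%")
```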

Relevance:

10.00%

Publisher:

Abstract:

We compare two methods for visualising contingency tables and develop a method called the ratio map which combines the good properties of both. The first is a biplot based on the logratio approach to compositional data analysis. This approach is founded on the principle of subcompositional coherence, which assures that results are invariant to considering subsets of the composition. The second approach, correspondence analysis, is based on the chi-square approach to contingency table analysis. A cornerstone of correspondence analysis is the principle of distributional equivalence, which assures invariance in the results when rows or columns with identical conditional proportions are merged. Both methods may be described as singular value decompositions of appropriately transformed matrices. Correspondence analysis includes a weighting of the rows and columns proportional to the margins of the table. If this idea of row and column weights is introduced into the logratio biplot, we obtain a method which obeys both principles of subcompositional coherence and distributional equivalence.
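The construction can be sketched in a few lines: log-transform the table of proportions, double-centre it using the row and column masses as weights, and take the singular value decomposition of the weighted, centred matrix. The small strictly positive table below is made up for illustration, and zero cells would require separate treatment; this is a sketch of the general recipe, not the authors' software.

```python
# Sketch of a weighted logratio biplot ("ratio map") construction: log of the
# proportions, double-centring with row/column masses as weights, then SVD.
# The table is invented and strictly positive (no zero cells handled here).
import numpy as np

N = np.array([[25.0,  5.0, 10.0],
              [10.0, 20.0, 15.0],
              [ 5.0, 10.0, 30.0]])

P = N / N.sum()                      # correspondence matrix
r = P.sum(axis=1)                    # row masses
c = P.sum(axis=0)                    # column masses

L = np.log(P)
# double-centring with the masses as weights
L = L - (L @ c)[:, None] - (r @ L)[None, :] + r @ L @ c

S = np.diag(np.sqrt(r)) @ L @ np.diag(np.sqrt(c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

rows = (U / np.sqrt(r)[:, None]) * sv     # row principal coordinates
cols = (Vt.T / np.sqrt(c)[:, None]) * sv  # column principal coordinates
print(np.round(rows[:, :2], 3))
print(np.round(cols[:, :2], 3))
```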

Relevance:

10.00%

Publisher:

Abstract:

A new algorithm called the parameterized expectations approach (PEA) for solving dynamic stochastic models under rational expectations is developed, and its advantages and disadvantages are discussed. This algorithm can, in principle, approximate the true equilibrium arbitrarily well. Also, the algorithm works from the Euler equations, so that the equilibrium does not have to be cast in the form of a planner's problem. Monte Carlo integration and the absence of grids on the state variables mean that the computational cost does not grow exponentially as the number of state variables or exogenous shocks in the economy increases. As an application we analyze an asset pricing model with endogenous production and study its implications for the time dependence of the volatility of stock returns and for the term structure of interest rates. We argue that this model can generate hump-shaped term structures.
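A stylized sketch of the PEA fixed-point loop, written for a textbook stochastic growth model rather than the asset pricing application: the conditional expectation in the Euler equation is approximated by an exponentiated polynomial in the state variables, the economy is simulated under that guess, the realized term is regressed on the same functional form, and the coefficients are updated with damping until they stop changing. The model, functional form and parameter values below are standard illustrative assumptions, not the paper's.

```python
# Stylized sketch of the parameterized expectations approach (PEA) loop for a
# textbook stochastic growth model.  The conditional expectation in the Euler
# equation is approximated by exp(b0 + b1*log k + b2*log theta).
import numpy as np

alpha, beta, delta, gamma, rho, sig = 0.33, 0.95, 0.10, 1.0, 0.90, 0.02
T, damp = 5000, 0.5
rng = np.random.default_rng(1)

log_theta = np.zeros(T)
for t in range(1, T):                              # AR(1) log productivity
    log_theta[t] = rho * log_theta[t - 1] + sig * rng.normal()
theta = np.exp(log_theta)

b = np.zeros(3)                                    # coefficients of the expectation
k_ss = ((1.0 / beta - 1.0 + delta) / alpha) ** (1.0 / (alpha - 1.0))

for it in range(200):
    k = np.empty(T + 1)
    k[0] = k_ss
    c = np.empty(T)
    for t in range(T):                             # simulate given the current guess
        z = b[0] + b[1] * np.log(k[t]) + b[2] * log_theta[t]
        psi = np.exp(np.clip(z, -20.0, 20.0))      # guard against overflow
        c[t] = (beta * psi) ** (-1.0 / gamma)
        k[t + 1] = theta[t] * k[t] ** alpha + (1.0 - delta) * k[t] - c[t]
        k[t + 1] = max(k[t + 1], 1e-3)             # crude feasibility guard
    # realized value of the term inside the conditional expectation
    ret = alpha * theta[1:] * k[1:T] ** (alpha - 1.0) + 1.0 - delta
    y = c[1:] ** (-gamma) * ret
    X = np.column_stack([np.ones(T - 1), np.log(k[:T - 1]), log_theta[:T - 1]])
    b_new, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
    if np.max(np.abs(b_new - b)) < 1e-6:
        break
    b = damp * b_new + (1.0 - damp) * b

print("fitted expectation coefficients:", np.round(b, 3))
```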
