110 results for Mohr-Coulomb Criterion


Relevance:

10.00%

Publisher:

Abstract:

We present a KAM theory for some dissipative systems (geometrically, these are conformally symplectic systems, i.e. systems that transform a symplectic form into a multiple of itself). For systems with n degrees of freedom depending on n parameters we show that it is possible to find solutions with n-dimensional (Diophantine) frequencies by adjusting the parameters. We do not assume that the system is close to integrable, but we use an a-posteriori format. Our unknowns are a parameterization of the solution and a parameter. We show that if there is a sufficiently approximate solution of the invariance equation, which also satisfies some explicit non-degeneracy conditions, then there is a true solution nearby. We present results both in Sobolev norms and in analytic norms. The a-posteriori format has several consequences: A) smooth dependence on the parameters, including the singular limit of zero dissipation; B) estimates on the measure of parameters covered by quasi-periodic solutions; C) convergence of perturbative expansions in analytic systems; D) bootstrap of regularity (i.e., that all tori which are smooth enough are analytic if the map is analytic); E) a numerically efficient criterion for the breakdown of the quasi-periodic solutions. The proof is based on an iterative quadratically convergent method and on suitable estimates on the (analytical and Sobolev) norms of the approximate solution. The iterative step takes advantage of some geometric identities, which give a very useful coordinate system in the neighborhood of invariant (or approximately invariant) tori. This system of coordinates has several other uses: A) it shows that for dissipative conformally symplectic systems the quasi-periodic solutions are attractors; B) it leads to efficient algorithms, which have been implemented elsewhere.
Details of the proof are given mainly for maps, but we also explain the slight modifications needed for flows, and we devote the appendix to presenting explicit algorithms for flows.
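In the a-posteriori format described above, the unknowns are a parameterization K of the torus and a parameter; a schematic statement for maps (with notation assumed for illustration, not taken from the paper) is:

```latex
% Conformally symplectic condition: the map transforms the symplectic
% form \Omega into a multiple of itself (\lambda \neq 1: dissipative case).
f_{\mu}^{*}\,\Omega = \lambda\,\Omega
% Invariance equation: the unknowns are the embedding K of the torus
% and the parameter \mu, for a fixed Diophantine frequency \omega:
f_{\mu}\circ K(\theta) = K(\theta + \omega), \qquad \theta \in \mathbb{T}^{n},
% where \omega satisfies, for some \nu > 0, \tau \ge n:
|\omega \cdot k - m| \ \ge\ \nu\,|k|^{-\tau},
\qquad k \in \mathbb{Z}^{n}\setminus\{0\},\ m \in \mathbb{Z}.
```

An approximate solution is a pair (K, μ) for which the invariance equation holds up to a small error; the theorem then guarantees a true solution nearby.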

Relevance:

10.00%

Publisher:

Abstract:

We formulate a necessary and sufficient condition for polynomials to be dense in a space of continuous functions on the real line, with respect to Bernstein's weighted uniform norm. Equivalently, for a positive finite measure μ on the real line we give a criterion for density of polynomials in Lp(μ).

Relevance:

10.00%

Publisher:

Abstract:

In a bankruptcy situation, not all claimants are affected in the same way. In particular, some depositors may enter into a situation of personal bankruptcy if they lose part of their investments. Events of this kind may lead to a social catastrophe. We propose discrimination among the claimants as a possible solution. Such discrimination is already contemplated in American bankruptcy law (among others), and was practised by Santander Bank, which in the Madoff case reimbursed the deposits only of its retail customers. Moreover, the necessity of discriminating has already been mentioned in different contexts by Young (1988), Bossert (1995), Thomson (2003) and Pulido et al. (2002, 2007), for instance. In this paper, we take a bankruptcy solution as the reference point. Given this initial allocation, we make transfers from richer to poorer claimants with the purpose of distributing not only the personal incurred losses as evenly as possible but also the transfers in a progressive way. The agents are divided into two groups depending on their personal monetary value (wealth, net income, GDP or any other characteristic). Then, we impose a set of axioms that bound the maximal transfer that each net contributor can make and each net receiver can obtain. Finally, we define a value-discriminant solution, and we characterize it by means of the Lorenz criterion. Endogenous convex combinations between solutions are also considered.
Keywords: bankruptcy, discrimination, compensation, rules.
JEL classification: C71, D63, D71.
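The two-step idea (a reference allocation, then capped rich-to-poor transfers) can be sketched as follows. This is an illustrative toy, not the paper's axiomatized solution: the proportional rule stands in for the reference bankruptcy solution, and the cap `alpha` is a hypothetical stand-in for the axioms bounding maximal transfers.

```python
def proportional_rule(estate, claims):
    """Reference allocation: each claimant gets a share proportional to her claim."""
    total = sum(claims)
    return [estate * c / total for c in claims]

def discriminate(estate, claims, is_poor, alpha=0.2):
    """Shift at most a fraction `alpha` of each rich claimant's award to the poor group."""
    awards = proportional_rule(estate, claims)
    pool = 0.0
    for i, poor in enumerate(is_poor):
        if not poor:                          # net contributors give up a capped transfer
            transfer = alpha * awards[i]
            awards[i] -= transfer
            pool += transfer
    n_poor = sum(is_poor)
    for i, poor in enumerate(is_poor):        # net receivers share the pool equally
        if poor:
            awards[i] += pool / n_poor
    return awards

# One poor and two rich claimants; the total paid out still equals the estate.
print(discriminate(100.0, [60.0, 90.0, 50.0], [True, False, False]))
```

A real value-discriminant solution would choose the transfers to satisfy the Lorenz criterion rather than this fixed-fraction rule.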

Relevance:

10.00%

Publisher:

Abstract:

The immobile location-allocation (LA) problem is a type of LA problem that consists in determining the service each facility should offer in order to optimize some criterion (such as the global demand), given the positions of the facilities and the customers. The problem is hard to tackle directly: it is combinatorial (the search space grows exponentially with the number of possible services and the number of facilities) and non-convex, with several local optima, so traditional methods cannot be applied directly. We therefore proposed the use of cluster analysis to convert the initial problem into several smaller sub-problems. In this way, we presented and analyzed the suitability of some clustering methods to partition the LA problem described above. We then explored the use of metaheuristic techniques such as genetic algorithms, simulated annealing and cuckoo search to solve the sub-problems after the clustering analysis.
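The decomposition idea can be sketched in a few lines. Everything here is a hypothetical illustration (the one-pass clustering, the demand-capture rule, and the brute-force sub-solver), not the methods evaluated in the abstract:

```python
from itertools import product
from math import dist

def cluster(facilities, seeds):
    """Assign each facility position to its nearest seed (one k-means-style pass)."""
    groups = [[] for _ in seeds]
    for f in facilities:
        i = min(range(len(seeds)), key=lambda j: dist(f, seeds[j]))
        groups[i].append(f)
    return groups

def captured_demand(assignment, customers):
    """Total demand of customers whose desired service is offered by some facility."""
    offered = set(assignment)
    return sum(d for want, d in customers if want in offered)

def solve_cluster(n_facilities, customers, n_services):
    """Brute-force the best service-per-facility assignment inside one small cluster."""
    return max(product(range(n_services), repeat=n_facilities),
               key=lambda a: captured_demand(a, customers))
```

Brute force is viable only because clustering shrinks each sub-problem; on the full problem one would switch to the metaheuristics mentioned above.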

Relevance:

10.00%

Publisher:

Abstract:

Background: The COSMIN checklist (COnsensus-based Standards for the selection of health status Measurement INstruments) was developed in an international Delphi study to evaluate the methodological quality of studies on measurement properties of health-related patient reported outcomes (HR-PROs). In this paper, we explain our choices for the design requirements and preferred statistical methods for which no evidence is available in the literature or on which the Delphi panel members had substantial discussion. Methods: The issues described in this paper are a reflection of the Delphi process in which 43 panel members participated. Results: The topics discussed are internal consistency (relevance for reflective and formative models, and distinction from unidimensionality), content validity (judging relevance and comprehensiveness), hypotheses testing as an aspect of construct validity (specificity of hypotheses), criterion validity (relevance for PROs), and responsiveness (concept and relation to validity, and (in)appropriate measures). Conclusions: We expect that this paper will contribute to a better understanding of the rationale behind the items, thereby enhancing the acceptance and use of the COSMIN checklist.

Relevance:

10.00%

Publisher:

Abstract:

Background: Spain has recently become an inward migration country. Little is known about the occupational health of immigrant workers. This study aimed to explore the perceptions that immigrant workers in Spain had of their working conditions. Methods: Qualitative, exploratory, descriptive study with criterion sampling. Data were collected between September 2006 and May 2007 through semi-structured focus groups and individual interviews, with a topic guide. Participants were 158 immigrant workers (90 men/68 women) from Colombia (n = 21), Morocco (n = 39), sub-Saharan Africa (n = 29), Romania (n = 44) and Ecuador (n = 25), who were authorised (documented) or unauthorised (undocumented) residents in five medium to large cities in Spain. Results: Participants described poor working conditions, low pay and health hazards. Perception of hazards appeared to be related to gender and job sector; informants were highly segregated into jobs by sex, however, so this issue will need further exploration. Undocumented workers described poorer conditions than documented workers, which they attributed to their documentation status. Documented participants also felt vulnerable because of their immigrant status. Informants believed that deficient language skills, non-transferability of their education and training and, most of all, their immigrant status and economic need left them with little choice but to work under poor conditions. Conclusions: The occupational health needs of immigrant workers must be addressed at the job level, while improving the enforcement of existing health and safety regulations. Consideration should also be given to the roles that documentation status and economic need played in these informants' work experiences, and to how these may influence health outcomes.

Relevance:

10.00%

Publisher:

Abstract:

We design powerful low-density parity-check (LDPC) codes with iterative decoding for the block-fading channel. We first study the case of maximum-likelihood decoding, and show that the design criterion is rather straightforward. Since optimal constructions for maximum-likelihood decoding do not perform well under iterative decoding, we introduce a new family of full-diversity LDPC codes that exhibit near-outage-limit performance under iterative decoding for all block-lengths. This family competes favorably with multiplexed parallel turbo codes for nonergodic channels.

Relevance:

10.00%

Publisher:

Abstract:

In this chapter, after pointing out the different logics that lie behind the familiar ideas of democracy and federalism, I have dealt with the case of plurinational federal democracies. Having put forward a double criterion of an empirical nature with which to identify the existence of minority nations within plurinational democracies (section 2), I suggest three theoretical criteria for the political accommodation of these democracies. In the following section, I show the agonistic nature of the normative discussion of the political accommodation of this kind of democracy, which brings monist and pluralist versions of the demos of the polity into conflict (section 3.1), as well as a number of conclusions resulting from a comparative study of 19 federal and regional democracies along four analytical axes: the uninational/plurinational axis; the unitarianism-federalism axis; the centralisation-decentralisation axis; and the symmetry-asymmetry axis (section 3.2). This analysis reveals shortcomings in the constitutional recognition of national pluralism in federal and regional cases with a large number of federated units/regions with political autonomy, a lower degree of constitutional federalism, and a greater asymmetry in the federated entities or regions of plurinational democracies. It also reveals difficulties in establishing clear formulas in these democracies to encourage a "federalism of trust" based on the participation and protection of national minorities in the shared government of plurinational federations/regional states. Indeed, there is a federal deficit in this kind of polity according to normative liberal-democratic patterns and to what comparative analysis shows. Finally, this chapter advocates the need for greater normative and institutional refinement in plurinational federal democracies.
In order to achieve this, it is necessary to introduce a deeper form of "ethical" pluralism (which displays normative agonistic trends) as well as a more "confederal/asymmetrical" perspective, congruent with the national pluralism of this kind of polity.

Relevance:

10.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process that generates different alternative initial solutions of similar quality, attained by applying biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic simply by incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
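The biased-randomization idea can be sketched as follows: instead of always taking the greedy choice, pick the k-th best candidate with a geometrically decaying probability, so each run yields a different but still high-quality initial solution. The job data and the longest-total-processing-time ordering are illustrative assumptions, not the classical PFSP heuristic used in the article:

```python
import math
import random

def biased_index(n, beta=0.3, rng=random):
    """Quasi-geometric index in [0, n): 0 is most likely, tail decays like (1 - beta)**k."""
    u = 1.0 - rng.random()                       # u in (0, 1], avoids log(0)
    return int(math.log(u) / math.log(1.0 - beta)) % n

def biased_greedy_sequence(proc_times, beta=0.3, seed=None):
    """Build a job sequence by repeatedly picking near the top of the
    longest-total-processing-time order (an illustrative dispatching rule)."""
    rng = random.Random(seed)
    remaining = sorted(range(len(proc_times)), key=lambda j: -sum(proc_times[j]))
    seq = []
    while remaining:
        seq.append(remaining.pop(biased_index(len(remaining), beta, rng)))
    return seq

# Each seed gives a different permutation that still tends to respect the greedy order.
print(biased_greedy_sequence([[3, 2], [5, 4], [1, 1]], seed=0))
```

As beta approaches 1 the rule collapses to the deterministic greedy ordering; smaller beta values diversify the starting points, which is what makes the parallel runs mentioned above worthwhile.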

Relevance:

10.00%

Publisher:

Abstract:

The past four decades have witnessed an explosive growth in the field of network-based facility location modeling. This is not at all surprising, since location policy is one of the most profitable areas of applied systems analysis in regional science, and ample theoretical and applied challenges are offered. Location-allocation models seek the location of facilities and/or services (e.g., schools, hospitals, and warehouses) so as to optimize one or several objectives, generally related to the efficiency of the system or to the allocation of resources. This paper concerns the location of facilities or services in discrete space or networks that are related to the public sector, such as emergency services (ambulances, fire stations, and police units), school systems and postal facilities. The paper is structured as follows: first, we focus on public facility location models that use some type of coverage criterion, with special emphasis on emergency services. The second section examines models based on the P-Median problem and some of the issues faced by planners when implementing this formulation in real-world locational decisions. Finally, the last section examines new trends in public sector facility location modeling.
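The coverage criterion mentioned above can be illustrated with a greedy sketch of the maximal covering idea: open p sites so as to maximise the demand lying within a distance standard. The data and the greedy rule are assumptions for illustration; exact formulations in the literature solve this as an integer programme.

```python
def greedy_max_cover(sites, demands, cover, p):
    """sites: candidate site ids; demands: {node: demand weight};
    cover: {site: set of nodes within the distance standard}; p: sites to open."""
    chosen, covered = [], set()
    for _ in range(p):
        # Pick the site adding the most not-yet-covered demand.
        best = max(sites, key=lambda s: sum(demands[n] for n in cover[s] - covered))
        chosen.append(best)
        covered |= cover[best]
        sites = [s for s in sites if s != best]
    return chosen, sum(demands[n] for n in covered)

demands = {"a": 4, "b": 3, "c": 2}
cover = {0: {"a", "b"}, 1: {"b", "c"}, 2: {"c"}}
print(greedy_max_cover([0, 1, 2], demands, cover, p=2))
```

The greedy heuristic is not optimal in general, which is one reason the exact P-Median and covering formulations discussed in the paper matter in practice.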

Relevance:

10.00%

Publisher:

Abstract:

Optimum experimental designs depend on the design criterion, the model and the design region. The talk will consider the design of experiments for regression models in which there is a single response with the explanatory variables lying in a simplex. One example is experiments on various compositions of glass, such as those considered by Martin, Bursnall, and Stillman (2001). Because of the highly symmetric nature of the simplex, the class of models that are of interest, typically Scheffé polynomials (Scheffé 1958), are rather different from those of standard regression analysis. The optimum designs are also rather different, inheriting a high degree of symmetry from the models. In the talk I hope to discuss a variety of models for such experiments. Then I will discuss constrained mixture experiments, when not all of the simplex is available for experimentation. Other important aspects include mixture experiments with extra non-mixture factors and the blocking of mixture experiments. Much of the material is in Chapter 16 of Atkinson, Donev, and Tobias (2007). If time and my research allow, I hope to finish with a few comments on design when the responses, rather than the explanatory variables, lie in a simplex.

References
Atkinson, A. C., A. N. Donev, and R. D. Tobias (2007). Optimum Experimental Designs, with SAS. Oxford: Oxford University Press.
Martin, R. J., M. C. Bursnall, and E. C. Stillman (2001). Further results on optimal and efficient designs for constrained mixture experiments. In A. C. Atkinson, B. Bogacka, and A. Zhigljavsky (Eds.), Optimal Design 2000, pp. 225–239. Dordrecht: Kluwer.
Scheffé, H. (1958). Experiments with mixtures. Journal of the Royal Statistical Society, Ser. B 20, 344–360.
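For reference, the Scheffé polynomials mentioned above take a canonical form shaped by the mixture constraint; the second-order model, for instance, is:

```latex
% Second-order Scheffé polynomial for q mixture components x_1, ..., x_q:
% no intercept and no pure quadratic terms, since they are absorbed by
% the constraint that the components sum to one.
\eta(x) \;=\; \sum_{i=1}^{q} \beta_i x_i \;+\; \sum_{i<j} \beta_{ij}\, x_i x_j,
\qquad x_i \ge 0, \quad \sum_{i=1}^{q} x_i = 1 .
```

The simplex constraint is what gives both the models and the optimum designs their unusual symmetry.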

Relevance:

10.00%

Publisher:

Abstract:

Scoring rules that elicit an entire belief distribution through the elicitation of point beliefs are time-consuming and demand considerable cognitive effort. Moreover, the results are valid only when agents are risk-neutral or when one uses probabilistic rules. We investigate a class of rules in which the agent has to choose an interval and is rewarded (deterministically) on the basis of the chosen interval and the realization of the random variable. We formulate an efficiency criterion for such rules and present a specific interval scoring rule. For single-peaked beliefs, our rule gives information about both the location and the dispersion of the belief distribution. These results hold for all concave utility functions.
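A hypothetical member of this class of rules might look as follows: the agent reports an interval [low, high] and is paid (deterministically) more when the realization falls inside, less the wider the interval. The linear payoff and its coefficients are illustrative assumptions, not the specific rule proposed in the paper.

```python
def interval_score(low, high, realization, reward=1.0, width_penalty=0.5):
    """Deterministic payoff: a bonus for covering the realization,
    minus a penalty proportional to the interval's width."""
    inside = low <= realization <= high
    return (reward if inside else 0.0) - width_penalty * (high - low)

# A narrow interval covering the realization beats a wide one covering it too,
# so the report reveals both location and (via the chosen width) dispersion.
print(interval_score(0.4, 0.6, 0.5))   # narrow and correct
print(interval_score(0.0, 1.0, 0.5))   # wide and correct, scores lower
```

The tension between the coverage bonus and the width penalty is what makes the reported interval informative about both the location and the dispersion of the belief.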

Relevance:

10.00%

Publisher:

Abstract:

Iterated Local Search has many of the desirable features of a metaheuristic: it is simple, easy to implement, robust, and highly effective. The essential idea of Iterated Local Search lies in focusing the search not on the full space of solutions but on a smaller subspace defined by the solutions that are locally optimal for a given optimization engine. The success of Iterated Local Search lies in the biased sampling of this set of local optima. How effective this approach turns out to be depends mainly on the choice of the local search, the perturbations, and the acceptance criterion. So far, in spite of its conceptual simplicity, it has led to a number of state-of-the-art results without the use of much problem-specific knowledge. But with further work, so that the different modules are well adapted to the problem at hand, Iterated Local Search can often become a competitive or even state-of-the-art algorithm. The purpose of this review is both to give a detailed description of this metaheuristic and to show where it stands in terms of performance.
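The three interchangeable modules named above (local search, perturbation, acceptance criterion) can be captured in a minimal generic skeleton. This is a sketch, with an improvements-only acceptance criterion as one common choice among many; the toy objective is an assumption for illustration.

```python
import random

def iterated_local_search(initial, local_search, perturb, cost, iters=100, rng=None):
    """Generic ILS: walk over local optima via perturbation and re-optimisation."""
    rng = rng or random.Random(0)
    best = current = local_search(initial)
    for _ in range(iters):
        candidate = local_search(perturb(current, rng))
        if cost(candidate) < cost(current):      # acceptance criterion: improvements only
            current = candidate
        if cost(current) < cost(best):
            best = current
    return best

# Toy usage: minimise f(x) = (x - 3)^2 over the integers; the "engine" is a
# +/-1 hill-climb and the perturbation is a random jump of up to 5 units.
f = lambda x: (x - 3) ** 2

def descend(x):
    while f(x - 1) < f(x):
        x -= 1
    while f(x + 1) < f(x):
        x += 1
    return x

print(iterated_local_search(40, descend, lambda x, r: x + r.randint(-5, 5), f))  # → 3
```

Swapping in a restart-tolerant acceptance criterion or a problem-specific perturbation is exactly the kind of adaptation the review discusses.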

Relevance:

10.00%

Publisher:

Abstract:

An important problem in descriptive and prescriptive research in decision making is to identify regions of rationality, i.e., the areas for which heuristics are and are not effective. To map the contours of such regions, we derive probabilities that heuristics identify the best of m alternatives (m > 2) characterized by k attributes or cues (k > 1). The heuristics include a single variable (lexicographic), variations of elimination-by-aspects, equal weighting, hybrids of the preceding, and models exploiting dominance. We use twenty simulated and four empirical datasets for illustration. We further provide an overview by regressing heuristic performance on factors characterizing environments. Overall, sensible heuristics generally yield similar choices in many environments. However, selection of the appropriate heuristic can be important in some regions (e.g., if there is low inter-correlation among attributes/cues). Since our work assumes a hit or miss decision criterion, we conclude by outlining extensions for exploring the effects of different loss functions.
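Two of the heuristic families compared above can be sketched in a few lines each, for choosing the best of m alternatives described by k numeric cues. The cue order and the "higher is better" convention are assumptions of the illustration.

```python
def lexicographic(alternatives):
    """Single-variable / lexicographic: decide on the first cue, break ties on the next."""
    return max(range(len(alternatives)), key=lambda i: tuple(alternatives[i]))

def equal_weighting(alternatives):
    """Equal weighting: sum the (unweighted) cues and take the maximum."""
    return max(range(len(alternatives)), key=lambda i: sum(alternatives[i]))

# With low inter-correlation among cues, the two heuristics can disagree:
alts = [(3, 1, 1), (2, 9, 9), (3, 0, 2)]
print(lexicographic(alts))    # cue 1 ties at 3, cue 2 favours alternative 0
print(equal_weighting(alts))  # 2 + 9 + 9 = 20 favours alternative 1
```

The disagreement in the example mirrors the paper's point: heuristic selection matters most in regions such as low inter-correlation among attributes/cues.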

Relevance:

10.00%

Publisher:

Abstract:

In the fixed design regression model, additional weights are considered for the Nadaraya–Watson and Gasser–Müller kernel estimators. We study their asymptotic behavior and the relationships between the new and classical estimators. For a simple family of weights, and considering the IMSE as global loss criterion, we show some possible theoretical advantages. An empirical study illustrates the performance of the weighted estimators in finite samples.
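A kernel regression estimator with additional weights, in the spirit of the weighted Nadaraya–Watson estimator studied above, can be sketched as follows; the Gaussian kernel and the bandwidth are illustrative choices, and the uniform choice w_i = 1 recovers the classical estimator.

```python
import math

def weighted_nw(x, xs, ys, ws, h=0.5):
    """Weighted Nadaraya-Watson estimate at x for fixed design points xs,
    responses ys, additional weights ws, and bandwidth h."""
    k = [w * math.exp(-0.5 * ((x - xi) / h) ** 2) for xi, w in zip(xs, ws)]
    return sum(ki * yi for ki, yi in zip(k, ys)) / sum(k)

xs = [0.0, 0.5, 1.0]
print(weighted_nw(0.3, xs, [1.0, 1.0, 1.0], [1.0, 1.0, 1.0]))  # constant data -> 1.0
```

The weighted estimator remains a convex combination of the responses, so constant data are reproduced exactly regardless of the weights.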