106 results for Routh-Hurwitz criterion


Relevance:

10.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and, thus, allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic by just incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
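
To make the ILS-ESP ingredients concrete, the sketch below combines a biased-randomized greedy construction, an insertion-based local search, a swap perturbation, and a non-worsening acceptance rule. It is a minimal illustration only: the makespan routine is standard, but the bias distribution, operators, and acceptance rule are assumptions, not the authors' exact design.

```python
# Minimal ILS sketch for the PFSP, loosely in the spirit of ILS-ESP.
import random

def makespan(seq, p):  # p[job][machine] = processing times
    m = len(p[0])
    c = [0.0] * m
    for j in seq:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def biased_randomized_start(p, beta=0.3):
    # NEH-style priority (decreasing total work), then pick from the list
    # with a geometric-like bias so good candidates are favoured, not forced.
    jobs = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = []
    while jobs:
        i = min(int(random.expovariate(beta)), len(jobs) - 1)  # assumed bias
        seq.append(jobs.pop(i))
    return seq

def local_search(seq, p):
    # first-improvement insertion neighbourhood
    best = makespan(seq, p)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            for j in range(len(seq)):
                if i == j:
                    continue
                cand = seq[:]
                cand.insert(j, cand.pop(i))
                val = makespan(cand, p)
                if val < best:
                    seq, best, improved = cand, val, True
    return seq, best

def ils_esp_sketch(p, iters=200):
    seq, best = local_search(biased_randomized_start(p), p)
    for _ in range(iters):
        pert = seq[:]
        i, j = random.sample(range(len(pert)), 2)  # simple swap perturbation
        pert[i], pert[j] = pert[j], pert[i]
        cand, val = local_search(pert, p)
        if val <= best:  # "common sense" acceptance: keep non-worse solutions
            seq, best = cand, val
    return seq, best

if __name__ == "__main__":
    random.seed(1)
    times = [[random.randint(1, 20) for _ in range(4)] for _ in range(8)]
    print(ils_esp_sketch(times))
```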

Relevance:

10.00%

Publisher:

Abstract:

The past four decades have witnessed an explosive growth in the field of network-based facility location modeling. This is not at all surprising since location policy is one of the most profitable areas of applied systems analysis in regional science, and ample theoretical and applied challenges are offered. Location-allocation models seek the location of facilities and/or services (e.g., schools, hospitals, and warehouses) so as to optimize one or several objectives generally related to the efficiency of the system or to the allocation of resources. This paper concerns the location of facilities or services in discrete space or on networks that are related to the public sector, such as emergency services (ambulances, fire stations, and police units), school systems, and postal facilities. The paper is structured as follows: first, we will focus on public facility location models that use some type of coverage criterion, with special emphasis on emergency services. The second section will examine models based on the P-Median problem and some of the issues faced by planners when implementing this formulation in real-world locational decisions. Finally, the last section will examine new trends in public sector facility location modeling.
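
As a concrete anchor for the P-Median discussion, the brute-force sketch below enumerates all ways to open p facilities and allocates each demand node to its closest open site. The instance data are invented, and real instances require heuristics or integer programming.

```python
# P-Median: choose p sites minimizing total demand-weighted distance.
from itertools import combinations

def p_median(dist, demand, p):
    """dist[i][j]: distance from demand node i to candidate site j."""
    n_sites = len(dist[0])
    best_cost, best_sites = float("inf"), None
    for sites in combinations(range(n_sites), p):
        # each demand node is allocated to its closest open facility
        cost = sum(w * min(row[j] for j in sites)
                   for w, row in zip(demand, dist))
        if cost < best_cost:
            best_cost, best_sites = cost, sites
    return best_sites, best_cost

distances = [[2, 7, 9], [5, 3, 8], [6, 4, 1], [8, 5, 2]]
weights = [10, 20, 15, 5]
print(p_median(distances, weights, p=2))
```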

Relevance:

10.00%

Publisher:

Abstract:

Optimum experimental designs depend on the design criterion, the model, and the design region. The talk will consider the design of experiments for regression models in which there is a single response with the explanatory variables lying in a simplex. One example is experiments on various compositions of glass such as those considered by Martin, Bursnall, and Stillman (2001). Because of the highly symmetric nature of the simplex, the class of models that are of interest, typically Scheffé polynomials (Scheffé 1958), are rather different from those of standard regression analysis. The optimum designs are also rather different, inheriting a high degree of symmetry from the models. In the talk I hope to discuss a variety of models for such experiments. Then I will discuss constrained mixture experiments, when not all the simplex is available for experimentation. Other important aspects include mixture experiments with extra non-mixture factors and the blocking of mixture experiments. Much of the material is in Chapter 16 of Atkinson, Donev, and Tobias (2007). If time and my research allow, I would hope to finish with a few comments on design when the responses, rather than the explanatory variables, lie in a simplex.

References

Atkinson, A. C., A. N. Donev, and R. D. Tobias (2007). Optimum Experimental Designs, with SAS. Oxford: Oxford University Press.

Martin, R. J., M. C. Bursnall, and E. C. Stillman (2001). Further results on optimal and efficient designs for constrained mixture experiments. In A. C. Atkinson, B. Bogacka, and A. Zhigljavsky (Eds.), Optimal Design 2000, pp. 225-239. Dordrecht: Kluwer.

Scheffé, H. (1958). Experiments with mixtures. Journal of the Royal Statistical Society, Ser. B 20, 344-360.
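
A toy numerical illustration of the design problem: comparing the D-criterion |X'X| of two six-run designs for a second-order Scheffé polynomial on the three-component simplex. The alternative design is invented; the {3,2} simplex-lattice is the classical optimum for this model.

```python
# D-criterion comparison for a quadratic Scheffé mixture model.
import numpy as np

def scheffe_quadratic(x):
    # second-order Scheffé model: no intercept since x1 + x2 + x3 = 1
    x1, x2, x3 = x
    return [x1, x2, x3, x1 * x2, x1 * x3, x2 * x3]

def d_criterion(design):
    X = np.array([scheffe_quadratic(x) for x in design])
    return np.linalg.det(X.T @ X)   # larger is better

lattice_32 = [(1, 0, 0), (0, 1, 0), (0, 0, 1),          # vertices
              (.5, .5, 0), (.5, 0, .5), (0, .5, .5)]    # edge midpoints
alternative = [(1, 0, 0), (0, 1, 0), (0, 0, 1),         # invented competitor
               (1/3, 1/3, 1/3), (2/3, 1/6, 1/6), (1/6, 1/6, 2/3)]
print(d_criterion(lattice_32), d_criterion(alternative))
```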

Relevance:

10.00%

Publisher:

Abstract:

Scoring rules that elicit an entire belief distribution through the elicitation of point beliefs are time-consuming and demand considerable cognitive effort. Moreover, the results are valid only when agents are risk-neutral or when one uses probabilistic rules. We investigate a class of rules in which the agent has to choose an interval and is rewarded (deterministically) on the basis of the chosen interval and the realization of the random variable. We formulate an efficiency criterion for such rules and present a specific interval scoring rule. For single-peaked beliefs, our rule gives information about both the location and the dispersion of the belief distribution. These results hold for all concave utility functions.
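
The rule class is easy to illustrate. In the sketch below the agent reports an interval [a, b] and receives a deterministic payoff that rewards covering the realization while penalizing interval width; the linear width penalty k is an invented example, not the specific rule proposed in the paper.

```python
# A generic member of the class of deterministic interval scoring rules.
def interval_score(a, b, x, k=0.5):
    hit = 1.0 if a <= x <= b else 0.0
    return hit - k * (b - a)

# A wider interval raises the hit probability but pays a sure width cost,
# so the optimal report reveals both location and dispersion of the belief.
print(interval_score(0.2, 0.6, 0.5))   # hit:  1 - 0.5*0.4 = 0.8
print(interval_score(0.2, 0.6, 0.9))   # miss:    -0.5*0.4 = -0.2
```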

Relevance:

10.00%

Publisher:

Abstract:

Iterated Local Search has many of the desirable features of a metaheuristic: it is simple, easy to implement, robust, and highly effective. The essential idea of Iterated Local Search lies in focusing the search not on the full space of solutions but on a smaller subspace defined by the solutions that are locally optimal for a given optimization engine. The success of Iterated Local Search lies in the biased sampling of this set of local optima. How effective this approach turns out to be depends mainly on the choice of the local search, the perturbations, and the acceptance criterion. So far, in spite of its conceptual simplicity, it has led to a number of state-of-the-art results without the use of too much problem-specific knowledge. But with further work so that the different modules are well adapted to the problem at hand, Iterated Local Search can often become a competitive or even state-of-the-art algorithm. The purpose of this review is both to give a detailed description of this metaheuristic and to show where it stands in terms of performance.
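
The loop itself is short enough to state in full. The skeleton below is a neutral template of the metaheuristic as described, with the local search, perturbation, and acceptance criterion passed in as functions; the toy usage is invented purely to show the control flow.

```python
# Generic Iterated Local Search skeleton.
import random

def iterated_local_search(initial, local_search, perturb, accept, cost, iters=300):
    s = local_search(initial)              # start from a local optimum
    best = s
    for _ in range(iters):
        s_new = local_search(perturb(s))   # biased sampling of local optima
        s = accept(s, s_new)               # acceptance criterion steers the walk
        if cost(s) < cost(best):
            best = s
    return best

# toy usage: minimize a bumpy 1-D function over the integers
random.seed(0)
f = lambda x: (x - 42) ** 2 + 10 * (x % 7)
step = lambda x: min((x - 1, x, x + 1), key=f)   # one-step improvement stands
better = lambda old, new: new if f(new) <= f(old) else old  # in for local search
print(iterated_local_search(0, step, lambda x: x + random.randint(-10, 10),
                            better, f))
```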

Relevance:

10.00%

Publisher:

Abstract:

An important problem in descriptive and prescriptive research in decision making is to identify regions of rationality, i.e., the areas for which heuristics are and are not effective. To map the contours of such regions, we derive probabilities that heuristics identify the best of m alternatives (m > 2) characterized by k attributes or cues (k > 1). The heuristics include a single variable (lexicographic), variations of elimination-by-aspects, equal weighting, hybrids of the preceding, and models exploiting dominance. We use twenty simulated and four empirical datasets for illustration. We further provide an overview by regressing heuristic performance on factors characterizing environments. Overall, sensible heuristics generally yield similar choices in many environments. However, selection of the appropriate heuristic can be important in some regions (e.g., if there is low inter-correlation among attributes/cues). Since our work assumes a hit-or-miss decision criterion, we conclude by outlining extensions for exploring the effects of different loss functions.
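
A small simulation in the spirit of the analysis: how often do a single-variable (lexicographic) rule and equal weighting identify the truly best of m alternatives? The Gaussian cue structure below is an invented stand-in for the simulated and empirical environments studied in the paper.

```python
# Hit rates of two choice heuristics when picking the best of m alternatives.
import random

def choice_hit_rates(m=3, k=3, trials=20000, cue_validity=0.8):
    hits_sv = hits_ew = 0
    for _ in range(trials):
        true_value = [random.gauss(0, 1) for _ in range(m)]
        # each cue = valid signal + noise; identical validity for simplicity
        cues = [[cue_validity * v + (1 - cue_validity) * random.gauss(0, 1)
                 for _ in range(k)] for v in true_value]
        best = max(range(m), key=lambda i: true_value[i])
        sv = max(range(m), key=lambda i: cues[i][0])      # first cue only
        ew = max(range(m), key=lambda i: sum(cues[i]))    # equal weighting
        hits_sv += sv == best
        hits_ew += ew == best
    return hits_sv / trials, hits_ew / trials

random.seed(0)
print(choice_hit_rates())
```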

Relevance:

10.00%

Publisher:

Abstract:

In the fixed design regression model, additional weights are considered for the Nadaraya-Watson and Gasser-Müller kernel estimators. We study their asymptotic behavior and the relationships between new and classical estimators. For a simple family of weights, and considering the IMSE as global loss criterion, we show some possible theoretical advantages. An empirical study illustrates the performance of the weighted estimators in finite samples.
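
A minimal sketch of the idea, assuming a Gaussian kernel and an invented test function: a Nadaraya-Watson estimator with optional extra weights on a fixed design, scored by an empirical IMSE on a grid.

```python
# Weighted Nadaraya-Watson estimator on a fixed design.
import numpy as np

def nw_estimate(x0, x, y, h, w=None):
    w = np.ones_like(y) if w is None else w
    k = np.exp(-0.5 * ((x0 - x) / h) ** 2)        # Gaussian kernel
    return np.sum(w * k * y) / np.sum(w * k)

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 100)                        # fixed design
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)
grid = np.linspace(0.05, 0.95, 50)
fit = np.array([nw_estimate(g, x, y, h=0.05) for g in grid])
imse = np.mean((fit - np.sin(2 * np.pi * grid)) ** 2)  # global loss criterion
print(round(imse, 4))
```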

Relevance:

10.00%

Publisher:

Abstract:

Departures from pure self-interest in economic experiments have recently inspired models of "social preferences". We conduct experiments on simple two-person and three-person games with binary choices that test these theories more directly than the array of games conventionally considered. Our experiments show strong support for the prevalence of "quasi-maximin" preferences: People sacrifice to increase the payoffs for all recipients, but especially for the lowest-payoff recipients. People are also motivated by reciprocity: While people are reluctant to sacrifice to reciprocate good or bad behavior beyond what they would sacrifice for neutral parties, they withdraw willingness to sacrifice to achieve a fair outcome when others are themselves unwilling to sacrifice. Some participants are averse to getting different payoffs than others, but based on our experiments and reinterpretation of previous experiments we argue that behavior that has been presented as "difference aversion" in recent papers is actually a combination of reciprocal and quasi-maximin motivations. We formulate a model in which each player is willing to sacrifice to achieve the quasi-maximin allocation only for those players also believed to be pursuing the quasi-maximin allocation, and may sacrifice to punish unfair players.
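
One common way to write down the quasi-maximin motive is a utility that weighs one's own payoff against a social component mixing total surplus with the worst-off payoff. The functional form and parameter names below follow this general formulation but are stated only as an illustrative assumption, not as the paper's exact model.

```python
# Illustrative quasi-maximin utility: own payoff vs. a social component.
def quasi_maximin_utility(own, payoffs, alpha=0.5, gamma=0.5):
    social = gamma * min(payoffs) + (1 - gamma) * sum(payoffs)
    return (1 - alpha) * own + alpha * social

# sacrificing 1 unit can raise utility if it lifts the minimum payoff enough
print(quasi_maximin_utility(5, [5, 1, 4]))  # keep:      5.25
print(quasi_maximin_utility(4, [4, 3, 4]))  # sacrifice: 5.50
```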

Relevance:

10.00%

Publisher:

Abstract:

A new parametric minimum distance time-domain estimator for ARFIMA processes is introduced in this paper. The proposed estimator minimizes the sum of squared correlations of residuals obtained after filtering a series through ARFIMA parameters. The estimator is easy to compute and is consistent and asymptotically normally distributed for fractionally integrated (FI) processes with an integration order d strictly greater than -0.75. Therefore, it can be applied to both stationary and non-stationary processes. Deterministic components are also allowed in the DGP. Furthermore, as a by-product, the estimation procedure provides an immediate check on the adequacy of the specified model. This is so because the criterion function, when evaluated at the estimated values, coincides with the Box-Pierce goodness-of-fit statistic. Empirical applications and Monte Carlo simulations supporting the analytical results and showing the good performance of the estimator in finite samples are also provided.
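
A stripped-down illustration of the estimation idea, assuming a pure FI(d) filter (no AR or MA terms) and grid search instead of proper optimization: filter the series with candidate parameters, then minimize the sum of squared residual autocorrelations, which is a Box-Pierce-type criterion up to scale.

```python
# Minimum distance estimation of d in an FI(d) process (toy version).
import numpy as np

def frac_diff(x, d):
    # (1-L)^d x_t via binomial weights: pi_0 = 1, pi_k = pi_{k-1}(k-1-d)/k
    w = [1.0]
    for k in range(1, len(x)):
        w.append(w[-1] * (k - 1 - d) / k)
    w = np.array(w)
    return np.array([np.dot(w[:t + 1][::-1], x[:t + 1]) for t in range(len(x))])

def criterion(x, d, nlags=10):
    e = frac_diff(x, d)
    e = e - e.mean()
    acf = [np.dot(e[k:], e[:-k]) / np.dot(e, e) for k in range(1, nlags + 1)]
    return sum(a * a for a in acf)   # ~ Box-Pierce statistic / T

rng = np.random.default_rng(0)
x = frac_diff(rng.normal(size=500), -0.3)   # simulate FI(0.3) by inverse filter
grid = np.arange(-0.4, 0.9, 0.01)
print(round(grid[np.argmin([criterion(x, d) for d in grid])], 2))  # ~ 0.3
```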

Relevance:

10.00%

Publisher:

Abstract:

Coordination games arise very often in studies of industrial organization and international trade. This type of game has multiple strict equilibria, and therefore the identification of testable predictions is very difficult. We study a vertical product differentiation model with two asymmetric players choosing first qualities and then prices. This game has two equilibria for some parameter values. However, we apply the risk dominance criterion suggested by Harsanyi and Selten and show that it always selects the equilibrium where the leader is the firm having some initial advantage. We then perform an experimental analysis to test whether the risk dominance prediction is supported by the behaviour of laboratory agents. We show that the probability that the risk dominance prediction is right depends crucially on the degree of asymmetry of the game. The stronger the asymmetries, the higher the predictive power of the risk dominance criterion.
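
The risk dominance test itself is mechanical for a 2x2 coordination game with two strict equilibria: compare the products of the players' deviation losses. The payoffs below are invented to show the mechanics, not taken from the quality-then-price game in the paper.

```python
# Harsanyi-Selten risk dominance in a 2x2 game with equilibria (A,A), (B,B).
def risk_dominant(u1, u2):
    """u1[r][c], u2[r][c]: row/column payoffs; actions {0: A, 1: B}."""
    # Nash products of deviation losses at each equilibrium
    nash_AA = (u1[0][0] - u1[1][0]) * (u2[0][0] - u2[0][1])
    nash_BB = (u1[1][1] - u1[0][1]) * (u2[1][1] - u2[1][0])
    return "(A,A)" if nash_AA > nash_BB else "(B,B)"

u_row = [[5, 0], [4, 2]]   # stag-hunt-like payoffs: (A,A) payoff-dominates,
u_col = [[5, 4], [0, 2]]   # but (B,B) is the safer equilibrium
print(risk_dominant(u_row, u_col))   # -> (B,B)
```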

Relevance:

10.00%

Publisher:

Abstract:

When can a single variable be more accurate in binary choice than multiple sources of information? We derive analytically the probability that a single variable (SV) will correctly predict one of two choices when both criterion and predictor are continuous variables. We further provide analogous derivations for multiple regression (MR) and equal weighting (EW) and specify the conditions under which the models differ in expected predictive ability. Key factors include variability in cue validities, intercorrelation between predictors, and the ratio of predictors to observations in MR. Theory and simulations are used to illustrate the differential effects of these factors. Results directly address why and when one-reason decision making can be more effective than analyses that use more information. We thus provide analytical backing to intriguing empirical results that, to date, have lacked theoretical justification. There are predictable conditions for which one should expect less to be more.
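
The comparison can be reproduced in miniature. The simulation below pits SV, EW, and MR against each other in predicting which of two alternatives has the higher criterion value; the Gaussian environment, cue validities, and sample sizes are invented for illustration.

```python
# Single variable vs. equal weighting vs. multiple regression in binary choice.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, k = 20, 2000, 5
beta = np.array([1.0, 0.8, 0.6, 0.4, 0.2])        # decreasing cue validities

def make_data(n):
    X = rng.normal(size=(n, k))
    y = X @ beta + rng.normal(size=n)
    return X, y

Xtr, ytr = make_data(n_train)
b_hat = np.linalg.lstsq(Xtr, ytr, rcond=None)[0]  # MR fitted on a small sample

Xa, ya = make_data(n_test)                        # paired choice alternatives
Xb, yb = make_data(n_test)
truth = ya > yb
hit = lambda score_a, score_b: np.mean((score_a > score_b) == truth)
print("SV:", hit(Xa[:, 0], Xb[:, 0]))             # best single cue only
print("EW:", hit(Xa.sum(1), Xb.sum(1)))           # unit weights
print("MR:", hit(Xa @ b_hat, Xb @ b_hat))
```

With few training observations relative to k, the estimation error in MR's weights can erase its theoretical edge, which is exactly the ratio effect the abstract highlights.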

Relevance:

10.00%

Publisher:

Abstract:

Climate science indicates that climate stabilization requires low GHG emissions. Is this consistent with nondecreasing human welfare?

Our welfare or utility index emphasizes education, knowledge, and the environment. We construct and calibrate a multigenerational model with intertemporal links provided by education, physical capital, knowledge and the environment.

We reject discounted utilitarianism and adopt, first, the Pure Sustainability Optimization (or Intergenerational Maximin) criterion, and, second, the Sustainable Growth Optimization criterion, which maximizes the utility of the first generation subject to a given future rate of growth. We apply these criteria to our calibrated model via a novel algorithm inspired by the turnpike property.

The computed paths yield levels of utility higher than the level at reference year 2000 for all generations. They require the doubling of the fraction of labor resources devoted to the creation of knowledge relative to the reference level, whereas the fractions of labor allocated to consumption and leisure are similar to the reference ones. On the other hand, higher growth rates require substantial increases in the fraction of labor devoted to education, together with moderate increases in the fractions of labor devoted to knowledge and the investment in physical capital.
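
A deliberately tiny rendering of the Intergenerational Maximin criterion, with a linear technology, consumption-share utility, and random search standing in for the paper's calibrated model and turnpike-inspired algorithm: choose savings rates that maximize the utility of the worst-off generation.

```python
# Toy intergenerational maximin over savings rates.
import random

def utilities(savings, A=1.5, k0=1.0):
    # generation t consumes (1 - s_t) of output; savings become next capital
    k, u = k0, []
    for s in savings:
        y = A * k
        u.append((1 - s) * y)
        k = s * y
    return u

random.seed(0)
candidates = ([random.uniform(0.01, 0.99) for _ in range(4)]
              for _ in range(50000))
best = max(candidates, key=lambda s: min(utilities(s)))
print([round(s, 2) for s in best], round(min(utilities(best)), 3))
# maximin pushes the consumption path toward equality across generations
```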

Relevance:

10.00%

Publisher:

Abstract:

The criterion, based on thermodynamic theory, that the climatic system tends to extremize some function has motivated several studies. In particular, special attention has been devoted to the possibility that the climate reaches an extremal rate of planetary entropy production. Because both radiative and material effects contribute to total planetary entropy production, climatic simulations obtained at the extremal rates of total, radiative, or material entropy production appear to be of interest in order to elucidate which of the three extremal assumptions behaves most similarly to current data. In the present paper, these results have been obtained by applying a 2-dimensional (2-Dim) horizontal energy balance box-model with a few independent variables (surface temperature, cloud cover, and material heat fluxes). In addition, climatic simulations for current conditions assuming a fixed cloud cover have been obtained. Finally, sensitivity analyses for both variable and fixed cloud models have been carried out.
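
A two-box caricature of the extremal-entropy idea: each box balances absorbed shortwave against linearized outgoing longwave plus a meridional heat flux F, and the F maximizing the material entropy production F(1/T_cold - 1/T_warm) is selected. Parameter values are rough textbook numbers, not those of the paper's 2-Dim model.

```python
# Two-box energy balance model at maximum material entropy production.
import numpy as np

a, b = 204.0, 2.17              # OLR = a + b*T  (W m^-2, T in deg C)
S_trop, S_pole = 300.0, 170.0   # absorbed solar in each box (W m^-2)

def temperatures(F):
    # energy balance: S - (a + b*T) -/+ F = 0 in the warm/cold box
    T_warm = (S_trop - F - a) / b
    T_cold = (S_pole + F - a) / b
    return T_warm + 273.15, T_cold + 273.15   # Kelvin

def material_ep(F):
    Tw, Tc = temperatures(F)
    return F * (1.0 / Tc - 1.0 / Tw)          # W m^-2 K^-1

flux = np.linspace(0.0, 60.0, 601)
best = flux[np.argmax([material_ep(F) for F in flux])]
print(best, temperatures(best))   # flux at the entropy-production maximum
```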

Relevance:

10.00%

Publisher:

Abstract:

Background: The NDI, COM and NPQ are evaluation instruments for disability due to neck pain (NP). There was no Spanish version of the NDI or COM for which psychometric characteristics were known. The objectives of this study were to translate and culturally adapt the Spanish version of the Neck Disability Index Questionnaire (NDI) and the Core Outcome Measure (COM), to validate their use in Spanish-speaking patients with non-specific NP, and to compare their psychometric characteristics with those of the Spanish version of the Northwick Pain Questionnaire (NPQ).

Methods: Translation/re-translation of the English versions of the NDI and the COM was done blindly and independently by a multidisciplinary team. The study was done in 9 primary care centers and 12 specialty services from 9 regions in Spain, with 221 acute, subacute and chronic patients who visited their physician for NP: 54 in the pilot phase and 167 in the validation phase. Neck pain (VAS), referred pain (VAS), disability (NDI, COM and NPQ), catastrophizing (CSQ) and quality of life (SF-12) were measured on their first visit and 14 days later. Patients' self-assessment was used as the external criterion for pain and disability. In the pilot phase, patients' understanding of each item in the NDI and COM was assessed, and on day 1 test-retest reliability was estimated by giving a second NDI and COM in which the name of the questionnaires and the order of the items had been changed.

Results: Comprehensibility of the NDI and COM was good. Minutes needed to fill out the questionnaires [median (P25, P75)]: NDI: 4 (2.2, 10.0); COM: 2.1 (1.0, 4.9). Reliability [ICC (95% CI)]: NDI: 0.88 (0.80, 0.93); COM: 0.85 (0.75, 0.91). Sensitivity to change (effect size for patients having worsened, not changed and improved between days 1 and 15, according to the external criterion for disability): NDI: -0.24, 0.15, 0.66; NPQ: -0.14, 0.06, 0.67; COM: 0.05, 0.19, 0.92. Validity: results of the NDI, NPQ and COM were consistent with the external criterion for disability, whereas only those from the NDI were consistent with the one for pain. Correlations with VAS, CSQ and SF-12 were similar for the NDI and NPQ (absolute values between 0.36 and 0.50 on day 1, between 0.38 and 0.70 on day 15), and slightly lower for the COM (between 0.36 and 0.48 on day 1, and between 0.33 and 0.61 on day 15). Correlation between NDI and NPQ: r = 0.84 on day 1, r = 0.91 on day 15. Correlation between COM and NPQ: r = 0.63 on day 1, r = 0.71 on day 15.

Conclusion: Although most psychometric characteristics of the NDI, NPQ and COM are similar, those of the COM are worse, and its use may lead to patients' evolution seeming more positive than it actually is. The NDI seems to be the best instrument for measuring NP-related disability, since its results are the most consistent with patients' assessment of their own clinical status and evolution. It takes two more minutes to answer the NDI than the COM, but it can be reliably filled out by the patient without assistance.
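
For reference, the two psychometric quantities the validation leans on can be computed in a few lines: a test-retest intraclass correlation (two-way, absolute agreement, single measure) and a standardized effect size for responsiveness. The data below are invented; this is not the study's analysis code.

```python
# Test-retest ICC (Shrout-Fleiss ICC(2,1)) and a responsiveness effect size.
import numpy as np

def icc_agreement(scores):
    """scores: n subjects x 2 administrations (test, retest)."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    ms_rows = k * x.mean(axis=1).var(ddof=1)          # between-subject MS
    ms_cols = n * x.mean(axis=0).var(ddof=1)          # between-session MS
    resid = x - x.mean(1, keepdims=True) - x.mean(0) + x.mean()
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                 + k * (ms_cols - ms_err) / n)

def effect_size(day1, day15):
    # mean change standardized by the baseline SD (one common definition)
    d = np.asarray(day15, dtype=float) - np.asarray(day1, dtype=float)
    return d.mean() / np.asarray(day1, dtype=float).std(ddof=1)

test_retest = [[30, 32], [18, 17], [25, 27], [40, 38], [12, 14], [33, 33]]
print(round(icc_agreement(test_retest), 2))
print(round(effect_size([30, 18, 25, 40], [22, 15, 20, 30]), 2))
```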