70 results for Pareto-Optimal
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Multi-objective optimization algorithms aim at finding Pareto-optimal solutions. Recovering Pareto fronts or Pareto sets from a limited number of function evaluations is a challenging problem. A popular approach in the case of expensive-to-evaluate functions is to appeal to metamodels. Kriging has been shown to be efficient as a basis for sequential multi-objective optimization, notably through infill sampling criteria that balance exploitation and exploration, such as the Expected Hypervolume Improvement. Here we consider Kriging metamodels not only for selecting new points, but as a tool for estimating the whole Pareto front and quantifying how much uncertainty remains on it at any stage of Kriging-based multi-objective optimization algorithms. Our approach relies on the Gaussian process interpretation of Kriging and builds on conditional simulations. Using concepts from random set theory, we propose to adapt the Vorob'ev expectation and deviation to capture the variability of the set of non-dominated points. Numerical experiments illustrate the potential of the proposed workflow, and examples show how Gaussian process simulations and the estimated Vorob'ev deviation can be used to monitor the ability of Kriging-based multi-objective optimization algorithms to accurately learn the Pareto front.
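The coverage-probability machinery behind this abstract can be illustrated schematically. In the sketch below, plain random samples stand in for true Gaussian-process conditional simulations, and thresholding the coverage function at 0.5 is a simplified stand-in for the actual Vorob'ev level (which is tuned so the level set has the expected volume); all names are illustrative, not the paper's code.

```python
import numpy as np

def nondominated_mask(F):
    """Boolean mask of the non-dominated rows of F (all objectives minimized)."""
    n = len(F)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                mask[i] = False
                break
    return mask

# Stand-in for conditional GP simulations: S random bi-objective samples.
rng = np.random.default_rng(0)
S, n_pts = 50, 40
sims = rng.random((S, n_pts, 2))

# Coverage probability of each objective-space grid point: the fraction of
# simulated fronts that dominate it, i.e. that place it in the attained set.
g = np.linspace(0.0, 1.0, 20)
grid = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)
attained = np.zeros(len(grid))
for s in range(S):
    front = sims[s][nondominated_mask(sims[s])]
    attained += np.any(np.all(front[None, :, :] <= grid[:, None, :], axis=2), axis=1)
coverage = attained / S

# Crude Vorob'ev-style summary: threshold the coverage function at 0.5.
# (The actual Vorob'ev level is chosen so the level set has the expected volume.)
vorobev_like = coverage >= 0.5
```

The spread of the simulated fronts around such a thresholded set is what the Vorob'ev deviation quantifies.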
Abstract:
Agents with single-peaked preferences share a resource coming from different suppliers; each agent is connected to only a subset of the suppliers. Examples include workload balancing, sharing earmarked funds, and rationing utilities after a storm. Unlike in the one-supplier model, in a Pareto-optimal allocation agents who get more than their peak from underdemanded suppliers coexist with agents who get less from overdemanded suppliers. Our Egalitarian solution is the Lorenz-dominant Pareto-optimal allocation. It treats agents with equal demands as equally as the connectivity constraints allow. Together, Strategyproofness, Pareto Optimality, and Equal Treatment of Equals characterize our solution.
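The Lorenz-dominance relation that singles out the Egalitarian solution can be checked with a small helper. This is an illustration only, not the paper's mechanism: it assumes the two allocations being compared distribute the same total, which is the setting where Lorenz comparisons are meaningful.

```python
import numpy as np

def lorenz_curve(x):
    """Cumulative share of the total held by the poorest k agents, k = 1..n."""
    x = np.sort(np.asarray(x, dtype=float))
    return np.cumsum(x) / x.sum()

def lorenz_dominates(x, y):
    """True if allocation x is weakly more equal than y in the Lorenz sense,
    i.e. x's Lorenz curve lies weakly above y's at every point."""
    return bool(np.all(lorenz_curve(x) >= lorenz_curve(y) - 1e-12))

# Among allocations of the same total, the more egalitarian one dominates:
lorenz_dominates([3, 3, 3], [1, 3, 5])   # equal split vs. a skewed split
```

In the paper's setting the connectivity constraints generally rule out the perfectly equal split, and the Egalitarian solution is the Lorenz-maximal allocation among the Pareto-optimal feasible ones.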
Abstract:
We construct an empirically informed computational model of fiscal federalism, testing whether horizontal or vertical equalization can solve the fiscal externality problem in an environment in which heterogeneous agents can move and vote. The model expands on the literature by considering the case of progressive local taxation. Although the consequences of progressive taxation under fiscal federalism are well understood, they have not been studied in a context with tax equalization, despite its widespread implementation. The model also expands on the literature by comparing the standard median voter model with a realistic alternative voting mechanism. We find that fiscal federalism with progressive taxation naturally leads to segregation as well as inefficient and inequitable public goods provision, while the alternative voting mechanism generates more efficient, though less equitable, public goods provision. Equalization policy, under both types of voting, is largely undermined by micro-actors' choices. For this reason, the model also does not find the anticipated effects of equalization: vertical equalization discouraging public goods spending among wealthy jurisdictions and horizontal equalization encouraging it among poor ones. Finally, we identify two optimal scenarios, superior to both complete centralization and complete devolution. These scenarios are not only Pareto optimal, but also conform to a Rawlsian view of justice, offering the best possible outcome for the worst-off. Despite offering the best possible outcomes, both scenarios still entail significant economic segregation and inequitable public goods provision. Under the optimal scenarios, agents shift the bulk of revenue collection to the federal government, with few jurisdictions maintaining a small local tax.
Abstract:
This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that is more effective for larger numbers of processors. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points for which the expensive function has been previously evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers, from which many candidate points are generated by random perturbations. Based on surrogate approximation, the best candidate point is subsequently selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are made tabu for a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF-based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
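The non-dominated sorting step described above can be sketched as follows. Here `fvals` is a toy stand-in for the expensive objective, and the center selection simply fills from the best fronts; the paper's random perturbations, surrogate ranking of candidates, and tabu tenure are omitted, so this is a simplified illustration rather than SOP itself.

```python
import numpy as np

def nondominated_sort(F):
    """Partition the rows of F (all objectives minimized) into successive fronts."""
    remaining = list(range(len(F)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# SOP ranks each previously evaluated point on two objectives: its
# (expensive) function value and the negated minimum distance to the other
# evaluated points, so well-spread points are rewarded alongside good ones.
rng = np.random.default_rng(1)
X = rng.random((10, 2))                      # previously evaluated points
fvals = X.sum(axis=1)                        # toy stand-in for the expensive f
dists = np.array([np.delete(np.linalg.norm(X - x, axis=1), i).min()
                  for i, x in enumerate(X)])
F = np.column_stack([fvals, -dists])

fronts = nondominated_sort(F)
P = 4
centers = [i for fr in fronts for i in fr][:P]   # P centers from the best fronts
```

In the full algorithm each of the P centers then spawns candidate points by random perturbation, and the surrogate picks one candidate per center for evaluation on P processors in parallel.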
Abstract:
OBJECTIVES: To determine (1) the optimal sites for mini-implant placement in the maxilla and the mandible based on dimensional mapping of the interradicular spaces and cortical bone thickness and (2) the effect of age and sex on the studied anatomic measurements. MATERIAL AND METHODS: The cone beam computed tomography images of 100 patients (46 males, 54 females), divided into two age groups (13-18 years and 19-27 years), were used. The following interradicular measurements were performed: (1) buccolingual bone thickness; (2) mesiodistal spaces both buccally and palatally/lingually; and (3) buccal and palatal/lingual cortical thicknesses. RESULTS: In the maxilla, the highest buccolingual thickness existed between the first and second molars; the highest mesiodistal buccal/palatal distances were between the second premolar and the first molar. The highest buccal cortical thickness was between the first and second premolars. The highest palatal cortical thickness was between the central and lateral incisors. In the mandible, the highest buccolingual and buccal cortical thicknesses were between the first and second molars. The highest mesiodistal buccal distance was between the second premolar and the first molar. The highest mesiodistal lingual distance was between the first and second premolars. The highest lingual cortical thickness was between the canine and the first premolar. The males and the older age group had significantly higher buccolingual, buccal, and palatal cortical thicknesses at specific sites and levels in the maxilla and the mandible. CONCLUSIONS: A clinical guideline for optimal sites for mini-implant placement is suggested. Sex and age affected the anatomic measurements in certain areas in the maxilla and the mandible.
Abstract:
BACKGROUND: Physiologic data display is essential to decision making in critical care. Current displays echo first-generation hemodynamic monitors dating to the 1970s and have not kept pace with new insights into physiology or the needs of clinicians who must make progressively more complex decisions about their patients. The effectiveness of any redesign must be tested before deployment. Tools that compare current displays with novel presentations of processed physiologic data are required. Regenerating conventional physiologic displays from archived physiologic data is an essential first step. OBJECTIVES: The purposes of the study were to (1) describe the SSSI (single sensor single indicator) paradigm that is currently used for physiologic signal displays, (2) identify and discuss possible extensions and enhancements of the SSSI paradigm, and (3) develop a general approach and a software prototype to construct such "extended SSSI displays" from raw data. RESULTS: We present the Multi Wave Animator (MWA) framework, a set of open-source MATLAB (MathWorks, Inc., Natick, MA, USA) scripts aimed at creating dynamic visualizations (e.g., video files in AVI format) of patient vital signs recorded from bedside (intensive care unit or operating room) monitors. Multi Wave Animator creates animations in which vital signs are displayed to mimic their appearance on current bedside monitors. The source code of MWA is freely available online together with a detailed tutorial and sample data sets.
Abstract:
Expert debate and synthesis of research to inform future management approaches for acute whiplash disorders.
Abstract:
Objective: To compare clinical outcomes after laparoscopic cholecystectomy (LC) for acute cholecystitis performed at various time-points after hospital admission. Background: Symptomatic gallstones represent an important public health problem, with LC the treatment of choice. LC is increasingly offered for acute cholecystitis; however, the optimal time-point for LC in this setting remains a matter of debate. Methods: Analysis was based on the prospective database of the Swiss Association of Laparoscopic and Thoracoscopic Surgery and included patients undergoing emergency LC for acute cholecystitis between 1995 and 2006, grouped according to the time-point of LC since hospital admission (admission day (d0), d1, d2, d3, d4/5, d ≥6). Linear and generalized linear regression models assessed the effect of timing of LC on intra- or postoperative complications, conversion and reoperation rates, and length of postoperative hospital stay. Results: Of 4113 patients, 52.8% were female, and the median age was 59.8 years. Delaying LC resulted in significantly higher conversion rates (from 11.9% at d0 to 27.9% at d ≥6 after admission, P < 0.001), surgical postoperative complications (5.7% to 13%, P < 0.001), and re-operation rates (0.9% to 3%, P = 0.007), with a significantly longer postoperative hospital stay (P < 0.001). Conclusions: Delaying LC for acute cholecystitis has no advantages, resulting in significantly increased conversion and re-operation rates, more postoperative complications, and a longer postoperative hospital stay. This investigation, one of the largest in the literature, provides compelling evidence that acute cholecystitis merits surgery within 48 hours of hospital admission if the impact on the patient and the health care system is to be minimized.